WorldWideScience

Sample records for receptors theory validation

  1. Drive: Theory and Construct Validation.

    Science.gov (United States)

    Siegling, Alex B; Petrides, K V

    2016-01-01

    This article explicates the theory of drive and describes the development and validation of two measures. A representative set of drive facets was derived from an extensive corpus of human attributes (Study 1). Operationalised using an International Personality Item Pool version (the Drive:IPIP), a three-factor model was extracted from the facets in two samples and confirmed on a third sample (Study 2). The multi-item IPIP measure showed congruence with a short form, based on single-item ratings of the facets, and both demonstrated cross-informant reliability. Evidence also supported the measures' convergent, discriminant, concurrent, and incremental validity (Study 3). Based on very promising findings, the authors hope to initiate a stream of research in what is argued to be a rather neglected niche of individual differences and non-cognitive assessment.

  2. Construct Validity: Advances in Theory and Methodology

    OpenAIRE

    Strauss, Milton E.; Smith, Gregory T.

    2009-01-01

    Measures of psychological constructs are validated by testing whether they relate to measures of other constructs as specified by theory. Each test of relations between measures reflects on the validity of both the measures and the theory driving the test. Construct validation concerns the simultaneous process of measure and theory validation. In this chapter, we review the recent history of validation efforts in clinical psychological science that has led to this perspective, and we review f...

  3. A theory of cross-validation error

    OpenAIRE

    Turney, Peter D.

    1994-01-01

    This paper presents a theory of error in cross-validation testing of algorithms for predicting real-valued attributes. The theory justifies the claim that predicting real-valued attributes requires balancing the conflicting demands of simplicity and accuracy. Furthermore, the theory indicates precisely how these conflicting demands must be balanced, in order to minimize cross-validation error. A general theory is presented, then it is developed in detail for linear regression and instance-bas...
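The trade-off between simplicity and accuracy that this abstract describes can be illustrated with a toy k-fold cross-validation experiment. The data, models, and fold scheme below are entirely hypothetical, chosen only to show how cross-validation error penalizes an overly simple model; they are not the paper's own formalism:

```python
import random
import statistics

random.seed(0)

# Synthetic data: y = 2x + Gaussian noise (invented for illustration).
xs = [i / 10 for i in range(40)]
ys = [2 * x + random.gauss(0, 0.5) for x in xs]

def fit_poly(xs, ys, degree):
    """Least-squares fit for degree 0 (constant) or degree 1 (line)."""
    if degree == 0:
        m = statistics.fmean(ys)
        return lambda x: m
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    return lambda x: a + b * x

def cv_error(xs, ys, degree, k=5):
    """Mean squared prediction error estimated by k-fold cross-validation."""
    n = len(xs)
    errors = []
    for fold in range(k):
        test_idx = set(range(fold, n, k))
        tr_x = [x for i, x in enumerate(xs) if i not in test_idx]
        tr_y = [y for i, y in enumerate(ys) if i not in test_idx]
        model = fit_poly(tr_x, tr_y, degree)
        errors += [(model(xs[i]) - ys[i]) ** 2 for i in test_idx]
    return statistics.fmean(errors)

simple = cv_error(xs, ys, degree=0)  # underfits: too simple for the data
linear = cv_error(xs, ys, degree=1)  # matches the generating process
print(simple, linear)
```

On this data the constant model's cross-validation error is dominated by the trend it ignores, so the linear model wins, matching the abstract's point that minimizing cross-validation error requires balancing simplicity against accuracy.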

  4. [The receptor theory of atherosclerosis].

    Science.gov (United States)

    Likhoded, V G; Bondarenko, V M; Gintsburg, A L

    2010-01-01

    Lipopolysaccharides of Gram-negative bacteria can interact with Toll-like receptor 4 (TLR4) and induce atheroma formation. The risk of atherosclerosis is decreased in carriers of TLR4 mutations. Other bacterial ligands and endogenous ligands of TLRs can also be involved in the induction of atherogenesis. The general concept of atherosclerosis pathogenesis is presented. According to this concept, atherogenesis can be initiated by reactions resulting from the interaction of exogenous and endogenous microbial ligands with Toll-like receptors.

  5. Current Concerns in Validity Theory.

    Science.gov (United States)

    Kane, Michael

    Validity is concerned with the clarification and justification of the intended interpretations and uses of observed scores. It has not been easy to formulate a general methodology, or set of principles, for validation, but progress has been made, especially as the field has moved from relatively limited criterion-related models to sophisticated…

  6. Receptor theory and biological constraints on value.

    Science.gov (United States)

    Berns, Gregory S; Capra, C Monica; Noussair, Charles

    2007-05-01

    Modern economic theories of value derive from expected utility theory. Behavioral evidence points strongly toward departures from linear value weighting, which has given rise to alternative formulations that include prospect theory and rank-dependent utility theory. Many of the nonlinear forms for value assumed by these theories can be derived from the assumption that value is signaled by neurotransmitters in the brain, which obey simple laws of molecular movement. From the laws of mass action and receptor occupancy, we show how behaviorally observed forms of nonlinear value functions can arise.
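The claimed link between receptor occupancy and nonlinear value can be sketched with the standard mass-action binding curve, theta(c) = c / (c + Kd). The Kd and concentration values below are illustrative assumptions, not figures from the paper:

```python
# Fractional receptor occupancy from the law of mass action:
#   theta(c) = c / (c + Kd)
# where c is the ligand (signal) concentration and Kd is the
# dissociation constant. Kd = 1.0 is an arbitrary illustrative choice.
def occupancy(c, kd=1.0):
    return c / (c + kd)

# Occupancy is concave and saturating, so equal increments in the
# signal produce ever-smaller increments in the response, the same
# diminishing-sensitivity shape assumed by behavioral value functions.
levels = [1.0, 2.0, 3.0, 4.0, 5.0]
responses = [occupancy(c) for c in levels]
gains = [b - a for a, b in zip(responses, responses[1:])]
print(responses)
print(gains)  # strictly decreasing marginal response
```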

  7. On the validity range of piston theory

    CSIR Research Space (South Africa)

    Meijer, M-C

    2015-06-01

    Full Text Available The basis of linear piston theory in unsteady potential flow is used in this work to develop a quantitative treatment of the validity range of piston theory. In the limit of steady flow, velocity perturbations from Donov’s series expansion...

  8. Network Security Validation Using Game Theory

    Science.gov (United States)

    Papadopoulou, Vicky; Gregoriades, Andreas

    Non-functional requirements (NFR) such as network security have recently gained widespread attention in distributed information systems. Despite their importance, however, there is no systematic approach to validate these requirements given the complexity and uncertainty characterizing modern networks. Traditionally, network security requirements specification has been the result of a reactive process. This, however, limited the immunity property of the distributed systems that depended on these networks. Security requirements specification needs a proactive approach. Networks' infrastructure is constantly under attack by hackers and malicious software that aim to break into computers. To combat these threats, network designers need sophisticated security validation techniques that will guarantee the minimum level of security for their future networks. This paper presents a game-theoretic approach to security requirements validation. An introduction to game theory is presented along with an example that demonstrates the application of the approach.
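A game-theoretic validation of this kind can be sketched as a pure-strategy Nash equilibrium check on an attacker-defender game. The strategies and payoffs below are invented for illustration and are not the paper's example:

```python
# Hypothetical attacker-defender game. Rows: defender strategies;
# columns: attacker strategies. Each cell: (defender payoff, attacker payoff).
payoffs = {
    ("harden", "attack"):  (-1, -2),
    ("harden", "refrain"): (-1,  0),
    ("expose", "attack"):  (-5,  3),
    ("expose", "refrain"): (-3,  0),
}
defender_moves = ["harden", "expose"]
attacker_moves = ["attack", "refrain"]

def is_nash(d, a):
    """Pure-strategy Nash check: no unilateral deviation improves payoff."""
    d_pay, a_pay = payoffs[(d, a)]
    best_d = all(payoffs[(d2, a)][0] <= d_pay for d2 in defender_moves)
    best_a = all(payoffs[(d, a2)][1] <= a_pay for a2 in attacker_moves)
    return best_d and best_a

equilibria = [(d, a) for d in defender_moves for a in attacker_moves
              if is_nash(d, a)]
print(equilibria)
```

With these payoffs the only equilibrium is ("harden", "refrain"): a hardened network deters the rational attacker, which is the kind of security property the game-theoretic validation is meant to establish.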

  9. Functional Validation of Heteromeric Kainate Receptor Models.

    Science.gov (United States)

    Paramo, Teresa; Brown, Patricia M G E; Musgaard, Maria; Bowie, Derek; Biggin, Philip C

    2017-11-21

    Kainate receptors require the presence of external ions for gating. Most work thus far has been performed on homomeric GluK2 but, in vivo, kainate receptors are likely heterotetramers. Agonists bind to the ligand-binding domain (LBD) which is arranged as a dimer of dimers as exemplified in homomeric structures, but no high-resolution structure currently exists of heteromeric kainate receptors. In a full-length heterotetramer, the LBDs could potentially be arranged either as a GluK2 homomer alongside a GluK5 homomer or as two GluK2/K5 heterodimers. We have constructed models of the LBD dimers based on the GluK2 LBD crystal structures and investigated their stability with molecular dynamics simulations. We have then used the models to make predictions about the functional behavior of the full-length GluK2/K5 receptor, which we confirmed via electrophysiological recordings. A key prediction and observation is that lithium ions bind to the dimer interface of GluK2/K5 heteromers and slow their desensitization. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  10. Validation of psychoanalytic theories: towards a conceptualization of references.

    Science.gov (United States)

    Zachrisson, Anders; Zachrisson, Henrik Daae

    2005-10-01

    The authors discuss criteria for the validation of psychoanalytic theories and develop a heuristic and normative model of the references needed for this. Their core question in this paper is: can psychoanalytic theories be validated exclusively from within psychoanalytic theory (internal validation), or are references to sources of knowledge other than psychoanalysis also necessary (external validation)? They discuss aspects of the classic truth criteria, correspondence and coherence, both from the point of view of contemporary psychoanalysis and of contemporary philosophy of science. The authors present arguments for both external and internal validation. Internal validation has to deal with the problems of subjectivity of observations and circularity of reasoning, external validation with the problem of relevance. They recommend a critical attitude towards psychoanalytic theories, which, by carefully scrutinizing weak points and invalidating observations in the theories, reduces the risk of wishful thinking. The authors conclude by sketching a heuristic model of validation. This model combines correspondence and coherence with internal and external validation into a four-leaf model of references for the process of validating psychoanalytic theories.

  11. Theory and Validation for the Collision Module

    DEFF Research Database (Denmark)

    Simonsen, Bo Cerup

    1999-01-01

    This report describes basic modelling principles, the theoretical background and validation examples for the Collision Module for the computer program DAMAGE.

  12. Theory and Validity of Life Satisfaction Scales

    Science.gov (United States)

    Diener, Ed; Inglehart, Ronald; Tay, Louis

    2013-01-01

    National accounts of subjective well-being are being considered and adopted by nations. In order to be useful for policy deliberations, the measures of life satisfaction must be psychometrically sound. The reliability, validity, and sensitivity to change of life satisfaction measures are reviewed. The scales are stable under unchanging conditions,…

  13. Validation of Theory: Exploring and Reframing Popper’s Worlds

    Directory of Open Access Journals (Sweden)

    Steven E. Wallis

    2008-12-01

    Full Text Available Popper’s well-known arguments describe the need for advancing social theory through a process of falsification. Despite Popper’s call, there has been little change in the academic process of theory development and testing. This paper builds on Popper’s lesser-known idea of “three worlds” (physical, emotional/conceptual, and theoretical to investigate the relationship between knowledge, theory, and action. In this paper, I explore his three worlds to identify alternative routes to support the validation of theory. I suggest there are alternative methods for validation, both between, and within, the three worlds and that a combination of validation and falsification methods may be superior to any one method. Integral thinking is also put forward to support the validation process. Rather than repeating the call for full Popperian falsification, this paper recognizes that the current level of social theorizing provides little opportunity for such falsification. Rather than sidestepping the goal of Popperian falsification, the paths suggested here may be seen as providing both validation and falsification as stepping-stones toward the goal of more effective social and organizational theory.

  14. Theory and simulations of adhesion receptor dimerization on membrane surfaces.

    Science.gov (United States)

    Wu, Yinghao; Honig, Barry; Ben-Shaul, Avinoam

    2013-03-19

    The equilibrium constants of trans and cis dimerization of membrane bound (2D) and freely moving (3D) adhesion receptors are expressed and compared using elementary statistical-thermodynamics. Both processes are mediated by the binding of extracellular subdomains whose range of motion in the 2D environment is reduced upon dimerization, defining a thin reaction shell where dimer formation and dissociation take place. We show that the ratio between the 2D and 3D equilibrium constants can be expressed as a product of individual factors describing, respectively, the spatial ranges of motions of the adhesive domains, and their rotational freedom within the reaction shell. The results predicted by the theory are compared to those obtained from a novel, to our knowledge, dynamical simulations methodology, whereby pairs of receptors perform realistic translational, internal, and rotational motions in 2D and 3D. We use cadherins as our model system. The theory and simulations explain how the strength of cis and trans interactions of adhesive receptors are affected both by their presence in the constrained intermembrane space and by the 2D environment of membrane surfaces. Our work provides fundamental insights as to the mechanism of lateral clustering of adhesion receptors after cell-cell contact and, more generally, to the formation of lateral microclusters of proteins on cell surfaces. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  15. A Validation Study of Maslow's Hierarchy of Needs Theory.

    Science.gov (United States)

    Clay, Rex J.

    A study was conducted to expand the body of research that tests the validity of Abraham Maslow's hierarchy of needs theory in a work context where it often serves as a guide for the supervisor's relationships with his subordinates. Data was gathered by questionnaire which tested for a hierarchy of needs among instructors at four community colleges…

  16. Validity Theory: Reform Policies, Accountability Testing, and Consequences

    Science.gov (United States)

    Chalhoub-Deville, Micheline

    2016-01-01

    Educational policies such as Race to the Top in the USA affirm a central role for testing systems in government-driven reform efforts. Such reform policies are often referred to as the global education reform movement (GERM). Changes observed with the GERM style of testing demand socially engaged validity theories that include consequential…

  17. The validity and value of inclusive fitness theory.

    Science.gov (United States)

    Bourke, Andrew F G

    2011-11-22

    Social evolution is a central topic in evolutionary biology, with the evolution of eusociality (societies with altruistic, non-reproductive helpers) representing a long-standing evolutionary conundrum. Recent critiques have questioned the validity of the leading theory for explaining social evolution and eusociality, namely inclusive fitness (kin selection) theory. I review recent and past literature to argue that these critiques do not succeed. Inclusive fitness theory has added fundamental insights to natural selection theory. These are the realization that selection on a gene for social behaviour depends on its effects on co-bearers, the explanation of social behaviours as unalike as altruism and selfishness using the same underlying parameters, and the explanation of within-group conflict in terms of non-coinciding inclusive fitness optima. A proposed alternative theory for eusocial evolution assumes mistakenly that workers' interests are subordinate to the queen's, contains no new elements and fails to make novel predictions. The haplodiploidy hypothesis has yet to be rigorously tested and positive relatedness within diploid eusocial societies supports inclusive fitness theory. The theory has made unique, falsifiable predictions that have been confirmed, and its evidence base is extensive and robust. Hence, inclusive fitness theory deserves to keep its position as the leading theory for social evolution.
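The core quantitative claim of inclusive fitness theory is Hamilton's rule: a gene for altruism is favored when r * b > c, with relatedness r, benefit b to the recipient, and cost c to the actor. A minimal sketch (the numeric b and c values are illustrative assumptions):

```python
# Hamilton's rule from inclusive fitness theory: altruism spreads when
# r * b > c, where r is genetic relatedness to the beneficiary, b the
# fitness benefit conferred, and c the fitness cost to the actor.
def altruism_favored(r, b, c):
    return r * b > c

# Full siblings in a diploid species: r = 0.5.
print(altruism_favored(0.5, b=3.0, c=1.0))    # benefit outweighs cost

# First cousins: r = 0.125, so the same act is no longer favored.
print(altruism_favored(0.125, b=3.0, c=1.0))
```

This is exactly the sense in which "selection on a gene for social behaviour depends on its effects on co-bearers": the same b and c yield opposite verdicts as r changes.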

  18. Life Origination Hydrate Theory (LOH-Theory) and Mitosis and Replication Hydrate Theory (MRH-Theory): three-dimensional PC validation

    Science.gov (United States)

    Kadyshevich, E. A.; Dzyabchenko, A. V.; Ostrovskii, V. E.

    2014-04-01

    Size compatibility of the CH4-hydrate structure II and multi-component DNA fragments is confirmed by three-dimensional simulation; this constitutes a validation of the Life Origination Hydrate Theory (LOH-Theory).

  19. Validation of Theory of Consumption Values Scales for Deal Sites

    DEFF Research Database (Denmark)

    Sudzina, Frantisek

    2016-01-01

    Deal sites became a widely used artefact. But there is still only a limited number of papers investigating their adoption and use. Most of the research published on the topic is qualitative. It is typical for an early stage of investigation of any new artefact. The Theory of Consumption Values explains purchase behavior. The aim of this paper is to validate scales for the Theory of Consumption Values for deal sites. This should pave a way for quantitative investigation of motives for purchasing using deal sites.

  20. Nursing intellectual capital theory: operationalization and empirical validation of concepts.

    Science.gov (United States)

    Covell, Christine L; Sidani, Souraya

    2013-08-01

    To present the operationalization of concepts in the nursing intellectual capital theory and the results of a methodological study aimed at empirically validating the concepts. The nursing intellectual capital theory proposes that the stocks of nursing knowledge in an organization are embedded in two concepts, nursing human capital and nursing structural capital. The theory also proposes that two concepts in the work environment, nurse staffing and employer support for nursing continuing professional development, influence nursing human capital. A cross-sectional design. A systematic three-step process was used to operationalize the concepts of the theory. In 2008, data were collected for 147 inpatient units from administrative departments and unit managers in 6 Canadian hospitals. Exploratory factor analyses were conducted to determine whether the indicator variables accurately reflect their respective concepts. The proposed indicator variables collectively measured the nurse staffing concept. Three indicators were retained to construct the 'nursing human capital: clinical expertise and experience' concept. The nursing structural capital and employer support for nursing continuing professional development concepts were not validated empirically. The nurse staffing and 'nursing human capital: clinical expertise and experience' concepts will be brought forward for further model testing. Refinement of some of the indicator variables of the concepts is indicated. Additional research is required with different sources of data to confirm the findings. © 2012 Blackwell Publishing Ltd.

  1. An Econometric Validation of Malthusian Theory: Evidence in Nigeria

    Directory of Open Access Journals (Sweden)

    Musa Abdullahi Sakanko

    2018-01-01

    Full Text Available Rising population is an asset, provided the skills of the workforce are used to the maximum extent. If not appropriately channelized, it can be a liability for a nation. A skilled and hardworking population can emerge as a foundation for a country's development. This study examines the validity of Malthusian Theory in Nigeria using time series data from 1960 to 2016, employing the ARDL bounds test technique. The result shows that in the long run, population growth and food production move proportionately, while population growth has a depleting effect on food production in the short run, thus validating the incidence of the Malthusian impact in the Nigerian economy in the short run. The researcher recommended that the government strategize plans to further intensify family planning and birth control measures, compulsory western education, and revitalization of the agricultural sector. DOI: 10.150408/sjie.v7i1.6461

  2. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    Science.gov (United States)

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  3. 76 FR 46307 - Proposed collection; Comment Request; A Generic Submission for Theory Development and Validation...

    Science.gov (United States)

    2011-08-02

    .... Formative research in the area of theory development and validation would provide the basis for developing...) Develop and refine integrative theories; (3) Identify and observe theoretical and innovative trends in... Request; A Generic Submission for Theory Development and Validation (NCI) SUMMARY: Under the provisions of...

  4. Weak turbulence theory of Langmuir waves: A reconsideration of validity of quasilinear theory

    International Nuclear Information System (INIS)

    Liang, Y.M.; Diamond, P.H.

    1991-01-01

    The weak turbulence theory of Langmuir waves in a one-dimensional, one-species plasma is discussed. Analytical calculations using the theory of two-point correlation functions show that in the weak turbulence regime τ_ac ≪ min[τ_tr, γ_k^(-1)], the nonlinear enhancement of the mode growth rate relative to the linear Landau mode growth rate γ_k^L is rather weak, and quasilinear theory is reproduced at the lowest order. Hence this work also proves the validity of the quasilinear theory. Here τ_ac ∼ (kΔv_ph)^(-1) is the phase-mixing time or the auto-correlation time, and τ_tr ∼ (k²D_ql)^(-1/3) is the particle decorrelation time or the turbulence trapping time. In particular, the lowest-order nonlinear correction to γ_k^L in the regime τ_ac ≪ τ_tr ≪ γ_k^(-1) is proportional to (1/ω_k τ_tr)γ_k^L. Both corrections are additive, not multiplicative, and are of higher order in the weak turbulence expansion. The smallness of the corrections is due to the fact that the only mechanism for the relaxation of the plasma distribution function in a one-dimensional, one-species plasma is momentum exchange between waves and particles, which is exactly the interaction considered in the quasilinear theory. No like-like particle momentum exchange is allowed due to momentum conservation constraints. Similar calculations are also done for the traveling wave tube, which can be used to test this theory experimentally, especially for the case of the bump-on-tail instability. A comparison of theoretical predictions with experimental results is presented. 3 refs

  5. Cooperative Learning: Improving University Instruction by Basing Practice on Validated Theory

    Science.gov (United States)

    Johnson, David W.; Johnson, Roger T.; Smith, Karl A.

    2014-01-01

    Cooperative learning is an example of how theory validated by research may be applied to instructional practice. The major theoretical base for cooperative learning is social interdependence theory. It provides clear definitions of cooperative, competitive, and individualistic learning. Hundreds of research studies have validated its basic…

  6. Method validation in pharmaceutical analysis: from theory to practical optimization

    Directory of Open Access Journals (Sweden)

    Jaqueline Kaleian Eserian

    2015-01-01

    Full Text Available The validation of analytical methods is required to obtain high-quality data. For the pharmaceutical industry, method validation is crucial to ensure product quality as regards both therapeutic efficacy and patient safety. The most critical step in validating a method is to establish a protocol containing well-defined procedures and criteria. A well planned and organized protocol, such as the one proposed in this paper, results in a rapid and concise method validation procedure for quantitative high performance liquid chromatography (HPLC) analysis. Type: Commentary

  7. Towards Validating Risk Indicators Based on Measurement Theory (Extended version)

    NARCIS (Netherlands)

    Morali, A.; Wieringa, Roelf J.

    Due to the lack of quantitative information and for cost-efficiency, most risk assessment methods use partially ordered values (e.g. high, medium, low) as risk indicators. In practice it is common to validate risk indicators by asking stakeholders whether they make sense. This way of validation is…

  8. Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory

    Science.gov (United States)

    Badjadi, Nour El Imane

    2013-01-01

    The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relationship with essay testing as well as to provide a snapshot of the current understandings of the reliability and validity of essay tests as drawn in recent research studies. Bearing in…

  9. Construct Validity of Measures of Becker's Side Bet Theory.

    Science.gov (United States)

    Shore, Lynn M.; Tetrick, Lois E.; Shore, Ted H.; Barksdale, Kevin

    2000-01-01

    Becker's side bet theory (remaining in a job because of perceived costs of leaving) was tested using data from 327 working business students. Three factors were most consistent with the theory: bureaucratic organization, nonwork-related concerns, and adjustment to social position. Attachment to the organization was significantly linked to tangible…

  10. All the mathematics in the world: logical validity and classical set theory

    Directory of Open Access Journals (Sweden)

    David Charles McCarty

    2017-12-01

    Full Text Available A recognizable topological model construction shows that any consistent principles of classical set theory, including the validity of the law of the excluded third, together with a standard class theory, do not suffice to demonstrate the general validity of the law of the excluded third. This result calls into question the classical mathematician's ability to offer solid justifications for the logical principles he or she favors.

  11. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    Science.gov (United States)

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants are asked to read an interview transcript and identify whether program theory components (Activities and Outcomes) are discussed and to highlight the most relevant passage about that component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.
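The pattern-matching step described here, deciding whether a program-theory component is supported by a transcript, amounts to aggregating many workers' binary judgments. A minimal sketch with invented component names, workers, and a majority-vote threshold (none of which come from the study):

```python
# Hypothetical aggregation of crowdsourced judgments: each worker reads
# one interview transcript and marks which program-theory components
# (Activities and Outcomes) the participant's account supports.
from collections import Counter

components = ["mentoring_sessions", "job_placement", "confidence_gain"]

# One set of flagged components per crowd worker (invented data).
worker_flags = [
    {"mentoring_sessions", "confidence_gain"},
    {"mentoring_sessions"},
    {"mentoring_sessions", "confidence_gain"},
    {"confidence_gain"},
    {"mentoring_sessions", "confidence_gain", "job_placement"},
]

counts = Counter(c for flags in worker_flags for c in flags)
n = len(worker_flags)
support = {c: counts[c] / n for c in components}

# A component counts as supported by this transcript if a majority
# of workers identified it (threshold is an assumption).
supported = [c for c in components if support[c] > 0.5]
print(support)
print(supported)
```

The differentiation the abstract reports corresponds to components clearing the threshold versus those that do not.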

  12. Validating the predictions of case-based decision theory

    OpenAIRE

    Radoc, Benjamin

    2015-01-01

    Real-life decision-makers typically do not know all possible outcomes arising from alternative courses of action. Instead, when people face a problem, they may rely on the recollection of their past personal experience: the situation, the action taken, and the accompanying consequence. In addition, the applicability of a past experience in decision-making may depend on how similar the current problem is to situations encountered previously. Case-based decision theory (CBDT), proposed by Itzha...

  13. Computer arithmetic and validity theory, implementation, and applications

    CERN Document Server

    Kulisch, Ulrich

    2013-01-01

    This is the revised and extended second edition of the successful basic book on computer arithmetic. It is consistent with the newest recent standard developments in the field. The book shows how the arithmetic capability of the computer can be enhanced. The work is motivated by the desire and the need to improve the accuracy of numerical computing and to control the quality of the computed results (validity). The accuracy requirements for the elementary floating-point operations are extended to the customary product spaces of computations including interval spaces. The mathematical properties
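The interval spaces mentioned here can be illustrated with a bare-bones interval type whose operations enclose every possible result. This is an assumed, simplified semantics; the book's treatment is far more careful, in particular about directed rounding:

```python
# Minimal interval arithmetic sketch: each operation returns an
# interval guaranteed to contain the exact result for any operands
# drawn from the input intervals. (Rounding control is omitted.)
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def contains(self, x):
        return self.lo <= x <= self.hi

# An uncertain measurement times an uncertain factor, plus an offset:
# the result interval encloses every value the true computation could take.
a = Interval(1.9, 2.1)
b = Interval(-1.0, 3.0)
c = a * b + Interval(0.5, 0.5)
print(c)
assert c.contains(2.0 * 1.5 + 0.5)  # one admissible exact result
```

This is the sense in which such arithmetic lets a program "control the quality of the computed results": the width of the output interval is an explicit bound on the uncertainty.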

  14. Four tenets of modern validity theory for medical education assessment and evaluation.

    Science.gov (United States)

    Royal, Kenneth D

    2017-01-01

    Validity is considered by many to be the most important criterion for evaluating a set of scores, yet few agree on what exactly the term means. Since the mid-1800s, scholars have been concerned with the notion of validity, but over time, the term has developed a variety of meanings across academic disciplines and contexts. Accordingly, when scholars with different academic backgrounds, many of whom hold deeply entrenched perspectives about validity conceptualizations, converge in the field of medical education assessment, it is a recipe for confusion. It is therefore important to work toward a consensus about validity in the context of medical education assessment. To that end, the purpose of this work was to present four fundamental tenets of modern validity theory in an effort to establish a framework for scholars in the field of medical education assessment to follow when conceptualizing validity, interpreting validity evidence, and reporting research findings.

  15. Interviews with the dead: using meta-life qualitative analysis to validate Hippocrates' theory of humours

    Science.gov (United States)

    Secretion, F; Conjur, G S; Attitude, S P

    1998-01-01

    BACKGROUND: Hippocrates devised his theory of the 4 humours (blood, phlegm, black bile and yellow bile) 24 centuries ago. Since then, medicine has evolved into a complex body of confusing and sometimes contradictory facts. The authors, seeing a need to determine the validity of his theory, hired a psychic. METHODS: The psychic interviewed 4 eminent ancient physicians, including Hippocrates. A randomized double-blind cross-over design was used for this meta-life qualitative analysis. RESULTS: All of the interviewees agreed that the theory of humours is an accurate model to explain disease and personality. INTERPRETATION: Hiring a psychic to conduct after-death interviews with key informants is a useful way to validate scientific theories. PMID:9875254

  16. Systematic Development and Validation of a Theory-Based Questionnaire to Assess Toddler Feeding

    OpenAIRE

    Hurley, Kristen M.; Pepper, M. Reese; Candelaria, Margo; Wang, Yan; Caulfield, Laura E.; Latta, Laura; Hager, Erin R.; Black, Maureen M.

    2013-01-01

    This paper describes the development and validation of a 27-item caregiver-reported questionnaire on toddler feeding. The development of the Toddler Feeding Behavior Questionnaire was based on a theory of interactive feeding that incorporates caregivers’ responses to concerns about their children’s dietary intake, appetite, size, and behaviors rather than relying exclusively on caregiver actions. Content validity included review by an expert panel (n = 7) and testing in a pilot sample (n = 10...

  17. Reliability and validity of advanced theory-of-mind measures in middle childhood and adolescence.

    Science.gov (United States)

    Hayward, Elizabeth O; Homer, Bruce D

    2017-09-01

    Although theory-of-mind (ToM) development is well documented for early childhood, there is increasing research investigating changes in ToM reasoning in middle childhood and adolescence. However, the psychometric properties of most advanced ToM measures for use with older children and adolescents have not been firmly established. We report on the reliability and validity of widely used, conventional measures of advanced ToM with this age group. Notable issues with both reliability and validity of several of the measures were evident in the findings. With regard to construct validity, results do not reveal a clear empirical commonality between tasks, and, after accounting for comprehension, developmental trends were evident in only one of the tasks investigated. Statement of contribution What is already known on this subject? Second-order false belief tasks have acceptable internal consistency. The Eyes Test has poor internal consistency. Validity of advanced theory-of-mind tasks is often based on the ability to distinguish clinical from typical groups. What does this study add? This study examines internal consistency across six widely used advanced theory-of-mind tasks. It investigates validity of tasks based on comprehension of items by typically developing individuals. It further assesses construct validity, or commonality between tasks. © 2017 The British Psychological Society.

  18. Assessing Academic Advising Outcomes Using Social Cognitive Theory: A Validity and Reliability Study

    Science.gov (United States)

    Erlich, Richard J.; Russ-Eft, Darlene F.

    2012-01-01

    The validity and reliability of three instruments, the "Counselor Rubric for Gauging Student Understanding of Academic Planning," micro-analytic questions, and the "Student Survey for Understanding Academic Planning," all based on social cognitive theory, were tested as means to assess self-efficacy and self-regulated learning in college academic…

  19. Preliminary Process Theory does not validate the Comparison Question Test: A comment on Palmatier and Rovner

    NARCIS (Netherlands)

    Ben-Shakar, G.; Gamer, M.; Iacono, W.; Meijer, E.; Verschuere, B.

    2015-01-01

    Palmatier and Rovner (2015) attempt to establish the construct validity of the Comparison Question Test (CQT) by citing extensive research ranging from modern neuroscience to memory and psychophysiology. In this comment we argue that merely citing studies on the preliminary process theory (PPT) of

  20. Academic Self-Esteem and Perceived Validity of Grades: A Test of Self-Verification Theory.

    Science.gov (United States)

    Okun, Morris A.; Fournet, Lee M.

    1993-01-01

    The hypothesis derived from self-verification theory that semester grade point average would be positively related to perceived validity of grade scores among high self-esteem undergraduates and inversely related for low self-esteem students was not supported in a study with 281 undergraduates. (SLD)

  1. Measuring Constructs in Family Science: How Can Item Response Theory Improve Precision and Validity?

    Science.gov (United States)

    Gordon, Rachel A.

    2015-01-01

    This article provides family scientists with an understanding of contemporary measurement perspectives and the ways in which item response theory (IRT) can be used to develop measures with desired evidence of precision and validity for research uses. The article offers a nontechnical introduction to some key features of IRT, including its…

  2. Validation of Triphasic Mixture Theory for a Mimic of Intervertebral Disk Tissue

    NARCIS (Netherlands)

    Oomens, C.W.J.; Heus, de H.J.; Huyghe, J.M.R.J.; Nelissen, J.G.L.; Janssen, J.D.

    1995-01-01

    This paper describes experimental studies on synthetic model materials that mimic the mechanical behavior of intervertebral disk tissue. The results are used to validate the triphasic mixture theory to describe soft charged hydrated materials. Permeability and swelling pressure experiments were used

  3. Antibody Selection for Cancer Target Validation of FSH-Receptor in Immunohistochemical Settings

    Directory of Open Access Journals (Sweden)

    Nina Moeker

    2017-10-01

    Background: The follicle-stimulating hormone (FSH) receptor (FSHR) has been reported to be an attractive target for antibody therapy in human cancer. However, divergent immunohistochemical (IHC) findings have been reported for FSHR expression in tumor tissues, which could be due to the specificity of the antibodies used. Methods: Three frequently used antibodies (sc-7798, sc-13935, and FSHR323) were validated for their suitability in an immunohistochemical study of FSHR expression in different tissues. As quality control, two potential therapeutic anti-hFSHR Ylanthia® antibodies (Y010913, Y010916) were used. The specificity criteria for selection of antibodies were binding to native hFSHR of different sources and no binding to non-related proteins. The ability of the antibodies to stain paraffin-embedded Flp-In Chinese hamster ovary (CHO)/FSHR cells was tested after application of different epitope retrieval methods. Results: Of the five tested anti-hFSHR antibodies, only Y010913, Y010916, and FSHR323 showed specific binding to native, cell-presented hFSHR. Since Ylanthia® antibodies were selected to specifically recognize native FSHR, as required for a potential therapeutic antibody candidate, FSHR323 was the only antibody to detect the receptor in IHC/histochemical settings on transfected cells, and at markedly lower, physiological concentrations (e.g., in Sertoli cells of human testes). The pattern of FSHR323 staining observed for ovarian, prostatic, and renal adenocarcinomas indicated that FSHR was expressed mainly in the peripheral tumor blood vessels. Conclusion: Of all published IHC antibodies tested, only antibody FSHR323 proved suitable for target validation of hFSHR in an IHC setting for cancer. Our studies could not confirm the previously reported FSHR overexpression in ovarian and prostate cancer cells. Instead, specific overexpression in peripheral tumor blood vessels could be confirmed after thorough validation of the antibodies used.

  4. Creation and validation of the barriers to alcohol reduction (BAR) scale using classical test theory and item response theory.

    Science.gov (United States)

    Kunicki, Zachary J; Schick, Melissa R; Spillane, Nichea S; Harlow, Lisa L

    2018-06-01

    Those who binge drink are at increased risk for alcohol-related consequences compared to non-binge drinkers. Research shows individuals may face barriers to reducing their drinking behavior, but few measures exist to assess these barriers. This study created and validated the Barriers to Alcohol Reduction (BAR) scale. Participants were college students (n = 230) who endorsed at least one instance of past-month binge drinking (4+ drinks for women or 5+ drinks for men). Using classical test theory, exploratory structural equation modeling found a two-factor structure of personal/psychosocial barriers and perceived program barriers. The sub-factors and full scale had reasonable internal consistency (coefficient omega = 0.78 for personal/psychosocial barriers, 0.82 for program barriers, and 0.83 for the full measure). The BAR also showed evidence of convergent validity with the Brief Young Adult Alcohol Consequences Questionnaire (r = 0.39). Item Response Theory (IRT) analysis showed that the two factors separately met the unidimensionality assumption and provided further evidence for the severity of the items on the two factors. Results suggest that the BAR measure appears reliable and valid for use in an undergraduate student population of binge drinkers. Future studies may want to re-examine this measure in a more diverse sample.

  5. The Theory of Planned Behavior (TPB) and Pre-Service Teachers' Technology Acceptance: A Validation Study Using Structural Equation Modeling

    Science.gov (United States)

    Teo, Timothy; Tan, Lynde

    2012-01-01

    This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance. It is also interested in examining its validity when used for this purpose. It has found evidence that the TPB is a valid model to explain pre-service…

  6. Validation of Sustainable Development Practices Scale Using the Bayesian Approach to Item Response Theory

    Directory of Open Access Journals (Sweden)

    Martin Hernani Merino

    2014-12-01

    There has been growing recognition of the importance of creating performance measurement tools for the economic, social and environmental management of micro and small enterprises (MSEs). In this context, this study aims to validate an instrument to assess perceptions of sustainable development practices by MSEs by means of a Graded Response Model (GRM) with a Bayesian approach to Item Response Theory (IRT). The results, based on a sample of 506 university students in Peru, suggest that a valid measurement instrument was achieved. At the end of the paper, methodological and managerial contributions are presented.
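The Graded Response Model mentioned above assigns each ordered response category a probability equal to the difference of adjacent cumulative logistic curves. A minimal sketch of that idea, with illustrative item parameters rather than the study's estimates:

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima's graded response model: probability of each ordered
    response category for a respondent with latent trait `theta`, item
    discrimination `a`, and ordered category thresholds.
    (Parameter values below are illustrative, not the paper's.)"""
    def p_star(b):
        # Cumulative probability of responding in this category or higher
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))
    cum = [1.0] + [p_star(b) for b in thresholds] + [0.0]
    # Category probability = difference of adjacent cumulative curves
    return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]

probs = grm_category_probs(theta=0.5, a=1.2, thresholds=[-1.0, 0.0, 1.5])
```

With 3 thresholds the item has 4 categories, and the probabilities necessarily sum to one; the Bayesian layer in the abstract concerns how `a` and the thresholds are estimated, not this likelihood itself.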

  7. A pilot study to validate measures of the theory of reasoned action for organ donation behavior.

    Science.gov (United States)

    Wong, Shui Hung; Chow, Amy Yin Man

    2018-04-01

    The present study made a first attempt at validating the measures generated based on the theory of reasoned action (TRA). A total of 211 university students participated in the study; 95 were included in the exploratory factor analysis and 116 in the confirmatory factor analysis. The TRA measures were established with adequate psychometric properties, internal consistency, and construct validity. Findings also suggested that attitude toward organ donation has both a cognitive and an affective nature, while the subjective norm of the family seems to be important to students' views on organ donation.

  8. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation

    Science.gov (United States)

    Sison Escaño, Mary Clare; Lacdao Arevalo, Ryan; Gyenge, Elod; Kasai, Hideaki

    2014-09-01

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4- on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.

  9. Robust Backlash Estimation for Industrial Drive-Train Systems—Theory and Validation

    DEFF Research Database (Denmark)

    Papageorgiou, Dimitrios; Blanke, Mogens; Niemann, Hans Henrik

    2018-01-01

    Backlash compensation is used in modern machine-tool controls to ensure high-accuracy positioning. When wear of a machine causes the deadzone width to increase, high-accuracy control may be maintained if the deadzone is accurately estimated. Deadzone estimation is also an important parameter to indica......-of-the-art Siemens equipment. The experiments validate the theory and show that expected performance and robustness to parameter uncertainties are both achieved.
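The deadzone nonlinearity that this estimation problem targets can be sketched with the standard textbook model (this is the generic dead-band input-output map, not the paper's estimator):

```python
def deadzone(x, width):
    """Symmetric deadzone nonlinearity with half-width `width`:
    output is zero inside the dead band and shifted linear outside.
    A standard textbook model of backlash-induced dead band."""
    if x > width:
        return x - width
    if x < -width:
        return x + width
    return 0.0
```

Estimating `width` online from drive-train measurements is precisely what lets a controller keep compensating as mechanical wear widens the band.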

  10. Protocol: validation of the INCODE barometer to measure the innovation compe-tence through the Rasch Measurement Theory

    Directory of Open Access Journals (Sweden)

    Lidia Sanchez

    2017-06-01

    This communication presents a protocol showing the different phases that must be followed to validate the INCODE barometer, which is used to measure the innovation competence, with Rasch Measurement Theory. Five phases are stated: dimensionality analysis, individual reliability and validity analysis of items and persons, global reliability and validity analysis, and category analysis.

  11. The Reinforcement Sensitivity Theory of Personality Questionnaire (RST-PQ): Development and validation.

    Science.gov (United States)

    Corr, Philip J; Cooper, Andrew J

    2016-11-01

    We report the development and validation of a questionnaire measure of the revised reinforcement sensitivity theory (rRST) of personality. Starting with qualitative responses to defensive and approach scenarios modeled on typical rodent ethoexperimental situations, exploratory and confirmatory factor analyses (CFAs) revealed a robust 6-factor structure: 2 unitary defensive factors, fight-flight-freeze system (FFFS; related to fear) and the behavioral inhibition system (BIS; related to anxiety); and 4 behavioral approach system (BAS) factors (Reward Interest, Goal-Drive Persistence, Reward Reactivity, and Impulsivity). Theoretically motivated thematic facets were employed to sample the breadth of defensive space, comprising FFFS (Flight, Freeze, and Active Avoidance) and BIS (Motor Planning Interruption, Worry, Obsessive Thoughts, and Behavioral Disengagement). Based on theoretical considerations, and statistically confirmed, a separate scale for Defensive Fight was developed. Validation evidence for the 6-factor structure came from convergent and discriminant validity shown by correlations with existing personality scales. We offer the Reinforcement Sensitivity Theory of Personality Questionnaire to facilitate future research specifically on rRST and, more broadly, on approach-avoidance theories of personality. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. The Rorschach texture response: a construct validation study using attachment theory.

    Science.gov (United States)

    Cassella, Michael J; Viglione, Donald J

    2009-11-01

    In this research, we used attachment theory to explore the construct validity of the Rorschach (Exner, 1974) Texture (T) response as a measure of interpersonal closeness and contact. A total of 40 men and 39 women completed the Rorschach and 2 attachment inventories. Their romantic partners also completed an informant version of the attachment measures. Attachment styles were measured by factor scores involving both self-report and partner report. Results indicate that attachment theory, as a broad conceptual framework, is associated with T. Specifically, T = 1 is most closely associated with a secure attachment style, T > 1 with aspects of the preoccupied style, and T = 0 with aspects of the avoidant style and an absence of secure attachment. Needs for closeness and contact associated with T can be couched within adult attachment theory, but in this study we did not test for problematic aspects of insecure attachment. Gender is a complicating factor and deserves more study.

  13. PIV Measurements for Validation of Self-induction Theory of Vortex Breakdown

    Science.gov (United States)

    Thompson, Brad; Dabiri, Dana

    2005-11-01

    THE PROBLEM: Tail buffeting is a severe operational and maintenance problem in twin-tailed aircraft. It is driven by aerodynamic forces resulting from the concentrated vortices produced at the wing leading edge and their subsequent abrupt breakdown and radial expansion. The expansion leads to large-diameter helical vortices, which impose lateral forces on the tails. Various brute-force, empirical approaches have provided some ad-hoc fixes, but poor understanding of the underlying physics prevents effective design solutions. It is not yet possible to design buffet-free aircraft from first principles. Preliminary work offers a unique explanation for vortex breakdown called the azimuthal vorticity gradient theory (Cain, 2001). This paper will present and establish experimental evidence using DPIV to validate this recent theory. Cain, C. B. (2001). The Self-Induction Theory of Vortex Breakdown. Aeronautics Dept., Seattle, WA, University of Washington. Master's thesis.

  14. Validation of a rapid, non-radioactive method to quantify internalisation of G-protein coupled receptors

    NARCIS (Netherlands)

    Jongsma, Maikel; Florczyk, Urszula M.; Hendriks-Balk, Marieelle C.; Michel, Martin C.; Peters, Stephan L. M.; Alewijnse, Astrid E.

    2007-01-01

    Agonist exposure can cause internalisation of G-protein coupled receptors (GPCRs), which may be a part of desensitisation but also of cellular signaling. Previous methods to study internalisation have been tedious or only poorly quantitative. Therefore, we have developed and validated a quantitative

  15. The validity and scalability of the Theory of Mind Scale with toddlers and preschoolers.

    Science.gov (United States)

    Hiller, Rachel M; Weber, Nathan; Young, Robyn L

    2014-12-01

    Despite the importance of theory of mind (ToM) for typical development, there remain 2 key issues affecting our ability to draw robust conclusions. One is the continued focus on false belief as the sole measure of ToM. The second is the lack of empirically validated measures of ToM as a broad construct. Our key aim was to examine the validity and reliability of the 5-item ToM scale (Peterson, Wellman, & Liu, 2005). In particular, we extended previous research on this scale by assessing its scalability and validity for use with children from 2 years of age. Sixty-eight typically developing children (aged 24 to 61 months) were assessed on the scale's 5 tasks, along with a sixth Sally-Anne false-belief task. Our data replicated the scalability of the 5 tasks as a Rasch scale, but not as a Guttman scale. Guttman analysis showed that a 4-item scale may be more suitable for this age range. Further, the tasks showed good internal consistency and validity for use with children as young as 2 years of age. Overall, the measure provides a valid and reliable tool for the assessment of ToM and, in particular, the longitudinal assessment of this ability as a construct. (c) 2014 APA, all rights reserved.
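Guttman scalability, which this abstract contrasts with Rasch scaling, is commonly summarized by the coefficient of reproducibility. A minimal sketch using the simple mismatch-count definition (response patterns below are illustrative, not the study's data):

```python
def guttman_reproducibility(responses):
    """Guttman's coefficient of reproducibility for binary items ordered
    from easiest to hardest: 1 - errors / total responses. An 'error' is
    any response deviating from the ideal step pattern implied by the
    respondent's total score. (Simple mismatch counting; other error
    definitions, e.g. Goodenough-Edwards, differ slightly.)"""
    errors = 0
    total = 0
    for row in responses:
        score = sum(row)
        # Ideal Guttman pattern: first `score` items passed, rest failed
        ideal = [1] * score + [0] * (len(row) - score)
        errors += sum(1 for r, i in zip(row, ideal) if r != i)
        total += len(row)
    return 1.0 - errors / total

data = [[1, 1, 1, 0, 0],
        [1, 1, 0, 0, 0],
        [1, 0, 1, 0, 0],  # deviates from the ideal step pattern
        [1, 1, 1, 1, 0]]
rep = guttman_reproducibility(data)
```

Values of roughly 0.90 or above are conventionally taken as acceptable Guttman scalability, which is the kind of criterion a 5-item scale can fail while still fitting a probabilistic Rasch model.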

  16. A fundamental special-relativistic theory valid for all real-valued speeds

    Directory of Open Access Journals (Sweden)

    Vedprakash Sewjathan

    1984-01-01

    This paper constitutes a fundamental rederivation of special relativity based on the c-invariance postulate but independent of the assumption ds'^2 = ±ds^2 (Einstein [1], Kittel et al. [2], Recami [3]), the equivalence principle, homogeneity of space-time, isotropy of space, group properties and linearity of space-time transformations, or the coincidence of the origins of inertial space-time frames. The mathematical formalism is simpler than Einstein's [4] and Recami's [3]. Whilst Einstein's subluminal and Recami's superluminal theories are rederived in this paper by further assuming the equivalence principle and "mathematical inverses" [4,3], this paper derives (independent of these assumptions), with physico-mathematical motivation, an alternate singularity-free special-relativistic theory which replaces Einstein's factor [1/(1 - V^2/c^2)]^(1/2) and Recami's extended-relativistic factor [1/(V^2/c^2 - 1)]^(1/2) by [(1 - (V^2/c^2)^n)/(1 - V^2/c^2)]^(1/2), where n equals the value of (m(V)/m_0)^2 as |V| → c. In this theory both Newton's and Einstein's subluminal theories are experimentally valid on account of negligible terms. This theory implies that non-zero rest mass luxons will not be detected as ordinary non-zero rest mass bradyons because of spatial collapse, and non-zero rest mass tachyons are undetectable because they exist in another cosmos, resulting in a supercosmos of matter, with the possibility of infinitely many such supercosmoses, all moving forward in time. Furthermore this theory is not based on any assumption giving rise to the twin paradox controversy. The paper concludes with a discussion of the implications of this theory for general relativity.
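A quick numerical check of why the abstract's replacement factor is singularity-free: writing x = V^2/c^2, the ratio (1 - x^n)/(1 - x) tends to n as x → 1, so the factor tends to √n as |V| → c instead of diverging like Einstein's. A sketch with an illustrative n:

```python
def modified_factor(v_over_c, n):
    """The abstract's singularity-free factor
    [(1 - (V^2/c^2)^n) / (1 - V^2/c^2)]^(1/2), where n is the limiting
    value of (m(V)/m0)^2 as |V| -> c. (n = 4 below is illustrative.)"""
    x = v_over_c ** 2
    return ((1.0 - x ** n) / (1.0 - x)) ** 0.5

# As |V| -> c the factor approaches sqrt(n) rather than diverging
f = modified_factor(0.999999, n=4)
```

The limit follows from the finite geometric sum (1 - x^n)/(1 - x) = 1 + x + ... + x^(n-1), which has n terms each tending to 1.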

  17. Evaluating the validity of the Work Role Functioning Questionnaire (Canadian French version) using classical test theory and item response theory.

    Science.gov (United States)

    Hong, Quan Nha; Coutu, Marie-France; Berbiche, Djamal

    2017-01-01

    The Work Role Functioning Questionnaire (WRFQ) was developed to assess workers' perceived ability to perform job demands and is used to monitor presenteeism. Still, few studies on its validity can be found in the literature. The purpose of this study was to assess the items and factorial composition of the Canadian French version of the WRFQ (WRFQ-CF). Two measurement approaches were used to test the WRFQ-CF: Classical Test Theory (CTT) and non-parametric Item Response Theory (IRT). A total of 352 completed questionnaires were analyzed. Four-factor and three-factor models were tested and showed good fit with 14 items (Root Mean Square Error of Approximation (RMSEA) = 0.06, Standardized Root Mean Square Residual (SRMR) = 0.04, Bentler Comparative Fit Index (CFI) = 0.98) and 17 items (RMSEA = 0.059, SRMR = 0.048, CFI = 0.98), respectively. Using IRT, 13 problematic items were identified, of which 9 were common with CTT. This study tested different models, with fewer problematic items found in a three-factor model. Using non-parametric IRT and CTT for item purification gave complementary results. IRT is still scarcely used and can be an interesting alternative method to enhance the quality of a measurement instrument. More studies are needed on the WRFQ-CF to refine its items and factorial composition.

  18. Assessment of a recombinant androgen receptor binding assay: initial steps towards validation.

    Science.gov (United States)

    Freyberger, Alexius; Weimer, Marc; Tran, Hoai-Son; Ahr, Hans-Jürgen

    2010-08-01

    Despite more than a decade of research in the field of endocrine active compounds with affinity for the androgen receptor (AR), still no validated recombinant AR binding assay is available, although recombinant AR can be obtained from several sources. With funding from the European Union (EU)-sponsored 6th framework project, ReProTect, we developed a model protocol for such an assay based on a simple AR binding assay recently developed at our institution. Important features of the protocol were the use of a rat recombinant fusion protein to thioredoxin containing both the hinge region and ligand binding domain (LBD) of the rat AR (which is identical to the human AR-LBD) and performance in a 96-well plate format. Besides two reference compounds [dihydrotestosterone (DHT), androstenedione] ten test compounds with different affinities for the AR [levonorgestrel, progesterone, prochloraz, 17alpha-methyltestosterone, flutamide, norethynodrel, o,p'-DDT, dibutylphthalate, vinclozolin, linuron] were used to explore the performance of the assay. At least three independent experiments per compound were performed. The AR binding properties of reference and test compounds were well detected, in terms of the relative ranking of binding affinities, there was good agreement with published data obtained from experiments using recombinant AR preparations. Irrespective of the chemical nature of the compound, individual IC(50)-values for a given compound varied by not more than a factor of 2.6. Our data demonstrate that the assay reliably ranked compounds with strong, weak, and no/marginal affinity for the AR with high accuracy. It avoids the manipulation and use of animals, as a recombinant protein is used and thus contributes to the 3R concept. On the whole, this assay is a promising candidate for further validation. Copyright 2009 Elsevier Inc. All rights reserved.

  19. Validation of experimental molecular crystal structures with dispersion-corrected density functional theory calculations

    International Nuclear Information System (INIS)

    Streek, Jacco van de; Neumann, Marcus A.

    2010-01-01

    The accuracy of a dispersion-corrected density functional theory method is validated against 241 experimental organic crystal structures from Acta Cryst. Section E. This paper describes the validation of a dispersion-corrected density functional theory (d-DFT) method for the purpose of assessing the correctness of experimental organic crystal structures and enhancing the information content of purely experimental data. 241 experimental organic crystal structures from the August 2008 issue of Acta Cryst. Section E were energy-minimized in full, including unit-cell parameters. The differences between the experimental and the minimized crystal structures were subjected to statistical analysis. The r.m.s. Cartesian displacement excluding H atoms upon energy minimization with flexible unit-cell parameters is selected as a pertinent indicator of the correctness of a crystal structure. All 241 experimental crystal structures are reproduced very well: the average r.m.s. Cartesian displacement for the 241 crystal structures, including 16 disordered structures, is only 0.095 Å (0.084 Å for the 225 ordered structures). R.m.s. Cartesian displacements above 0.25 Å either indicate incorrect experimental crystal structures or reveal interesting structural features such as exceptionally large temperature effects, incorrectly modelled disorder or symmetry breaking H atoms. After validation, the method is applied to nine examples that are known to be ambiguous or subtly incorrect
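The r.m.s. Cartesian displacement criterion used above is a simple per-atom average. A minimal sketch with hypothetical coordinates (note the paper compares full crystal structures after energy minimization; this sketch assumes the two atom lists are already in a common frame, with H atoms excluded, and omits any overlay step):

```python
import math

def rms_cartesian_displacement(coords_a, coords_b):
    """Root-mean-square Cartesian displacement between two conformations
    of the same structure (atom lists in identical order)."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Two atoms, each displaced by 0.1 along x (units: Angstrom)
d = rms_cartesian_displacement([(0, 0, 0), (1, 0, 0)],
                               [(0.1, 0, 0), (1.1, 0, 0)])
```

Against the paper's statistics, a value like 0.1 Å would sit near the 0.095 Å average, while anything above the 0.25 Å threshold would flag the structure for closer inspection.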

  20. Development and psychometric validation of social cognitive theory scales in an oral health context.

    Science.gov (United States)

    Jones, Kelly; Parker, Eleanor J; Steffens, Margaret A; Logan, Richard M; Brennan, David; Jamieson, Lisa M

    2016-04-01

    This study aimed to develop and evaluate scales reflecting potentially modifiable social cognitive theory-based risk indicators associated with homeless populations' oral health. The scales are referred to as the social cognitive theory risk scales in an oral health context (SCTOH) and are denoted SCTOH(SE), SCTOH(K) and SCTOH(F), respectively. The three SCTOH scales assess the key constructs of social cognitive theory: self-efficacy, knowledge and fatalism. The reliability and validity of the three scales were evaluated in a convenience sample of 248 homeless participants (age range 17-78 years, 79% male) located in a metropolitan setting in Australia. The scales were supported by exploratory factor analysis and established three distinct and internally consistent domains of social cognition: oral health-related self-efficacy, oral health-related knowledge and oral health-related fatalism, with Cronbach's alphas of 0.95 and 0.85 and a Spearman-Brown coefficient of 0.69. Concurrent validity was confirmed by each SCTOH scale's association with oral health status in the expected directions. The three SCTOH scales appear to be internally valid and reliable. If confirmed by further research, these scales could potentially be used for tailored educational and cognitive-behavioural interventions to reduce oral health inequalities among homeless and other vulnerable populations. © 2015 Public Health Association of Australia.
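Internal-consistency figures like the Cronbach's alphas quoted above are computed directly from item scores. A minimal sketch with made-up data (not the study's):

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns (one list per item,
    respondents aligned by index):
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
    Uses population variances consistently."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three 5-point items, four respondents (illustrative scores)
alpha = cronbach_alpha([[4, 3, 5, 2], [4, 2, 5, 3], [5, 3, 4, 2]])
```

The Spearman-Brown coefficient reported for the two-item fatalism scale plays the analogous role where alpha is unreliable for very short scales.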

  1. Validation of a proxy for estrogen receptor status in breast cancer patients using dispensing data.

    Science.gov (United States)

    Srasuebkul, Preeyaporn; Dobbins, Timothy A; Pearson, Sallie-Anne

    2014-06-01

    The aim was to assess the performance of a proxy for estrogen receptor (ER) status in breast cancer patients using dispensing data. We derived our proxy using 167 patients. ER+ patients had evidence of at least one dispensing record for hormone therapy during the lookback period, irrespective of diagnosis date, and ER- patients had no dispensing records for hormone therapy during the period. We validated the proxy against our gold standard, ER status from pathology reports or medical records. We assessed the proxy's performance using three lookback periods: 4.5 years, 2 years, and 1 year. More than half of our cohort (62%) were >50 years old, 54% had stage III/IV breast cancer at recruitment, 46% were diagnosed with breast cancer in 2009, and 23% were diagnosed before 2006. Sensitivity and specificity were high for the 4.5-year lookback period (93%, 95% CI: 86-96%; and 95%, 95% CI: 83-99%, respectively) and remained high for the 2-year lookback period (91%: 84-95%; and 95%: 83-99%). Sensitivity decreased (83%: 75.2-89%) but specificity remained high (95%: 83-99%) using the 1-year lookback period. Our proxy accurately infers ER status in studies of breast cancer treatment based on secondary health data. The proxy is most robust with a minimum lookback period of 2 years, which allows sufficient time for hormone therapy to be dispensed. © 2012 Wiley Publishing Asia Pty Ltd.
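Sensitivity and specificity of such a proxy come from a 2x2 table against the gold standard. A minimal sketch with hypothetical counts (not the study's underlying data) and simple normal-approximation confidence intervals:

```python
import math

def sens_spec_with_ci(tp, fn, tn, fp, z=1.96):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP), each with a
    normal-approximation 95% CI clipped to [0, 1]. Counts are
    hypothetical; published studies often use exact or Wilson CIs."""
    def prop_ci(success, total):
        p = success / total
        half = z * math.sqrt(p * (1 - p) / total)
        return p, max(0.0, p - half), min(1.0, p + half)
    return {"sensitivity": prop_ci(tp, tp + fn),
            "specificity": prop_ci(tn, tn + fp)}

res = sens_spec_with_ci(tp=120, fn=9, tn=36, fp=2)
```

Shrinking the lookback window removes dispensing records for some true ER+ patients, which moves counts from TP to FN and lowers sensitivity while leaving specificity (driven by the ER- column) largely unchanged, exactly the pattern the abstract reports.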

  2. Grid inhomogeneous solvation theory: hydration structure and thermodynamics of the miniature receptor cucurbit[7]uril.

    Science.gov (United States)

    Nguyen, Crystal N; Young, Tom Kurtzman; Gilson, Michael K

    2012-07-28

    The displacement of perturbed water upon binding is believed to play a critical role in the thermodynamics of biomolecular recognition, but it is nontrivial to unambiguously define and answer questions about this process. We address this issue by introducing grid inhomogeneous solvation theory (GIST), which discretizes the equations of inhomogeneous solvation theory (IST) onto a three-dimensional grid situated in the region of interest around a solute molecule or complex. Snapshots from explicit solvent simulations are used to estimate localized solvation entropies, energies, and free energies associated with the grid boxes, or voxels, and properly summing these thermodynamic quantities over voxels yields information about hydration thermodynamics. GIST thus provides a smoothly varying representation of water properties as a function of position, rather than focusing on hydration sites where solvent is present at high density. It therefore accounts for full or partial displacement of water from sites that are highly occupied by water, as well as for partly occupied and water-depleted regions around the solute. GIST can also provide a well-defined estimate of the solvation free energy and therefore enables a rigorous end-states analysis of binding. For example, one may not only use a first GIST calculation to project the thermodynamic consequences of displacing water from the surface of a receptor by a ligand, but also account, in a second GIST calculation, for the thermodynamics of subsequent solvent reorganization around the bound complex. In the present study, a first GIST analysis of the molecular host cucurbit[7]uril is found to yield a rich picture of hydration structure and thermodynamics in and around this miniature receptor. One of the most striking results is the observation of a toroidal region of high water density at the center of the host's nonpolar cavity. Despite its high density, the water in this toroidal region is disfavored energetically and
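The voxel-wise bookkeeping that GIST performs can be illustrated with a minimal sketch: per-voxel thermodynamic densities are summed over a masked region to give a regional total. Everything here (the dictionary layout, the mask, the values) is illustrative; the real method estimates per-voxel energies and entropies from explicit-solvent simulation snapshots:

```python
def region_total(voxel_values, voxel_volume, mask):
    """Sum a per-voxel thermodynamic density (e.g. solvation energy per
    unit volume) over a masked region of a 3D grid, in the spirit of
    GIST's 'properly summing these thermodynamic quantities over
    voxels'. Keys are integer (i, j, k) grid indices."""
    total = 0.0
    for (i, j, k), density in voxel_values.items():
        if (i, j, k) in mask:
            total += density * voxel_volume
    return total

# Hypothetical densities on a 0.5-Angstrom grid (volume 0.125 A^3/voxel)
vals = {(0, 0, 0): 2.0, (0, 0, 1): -1.0, (1, 0, 0): 0.5}
t = region_total(vals, voxel_volume=0.125, mask={(0, 0, 0), (0, 0, 1)})
```

This is what lets GIST compare, say, the water displaced from a binding site (one mask) against the reorganized solvent around the bound complex (another mask) in an end-states analysis.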

  3. Gene-environment interaction between the oxytocin receptor (OXTR) gene and parenting behaviour on children's theory of mind.

    Science.gov (United States)

    Wade, Mark; Hoffmann, Thomas J; Jenkins, Jennifer M

    2015-12-01

    Theory of mind (ToM) is the ability to interpret and understand human behaviour by representing the mental states of others. Like many human capacities, ToM is thought to develop through both complex biological and socialization mechanisms. However, no study has examined the joint effect of genetic and environmental influences on ToM. This study examined how variability in the oxytocin receptor gene (OXTR) and parenting behaviour (two widely studied factors in ToM development) interacted to predict ToM in preschool-aged children. Participants were 301 children who were part of an ongoing longitudinal birth cohort study. ToM was assessed at age 4.5 using a previously validated scale. Parenting was assessed through observations of mothers' cognitively sensitive behaviours. Using a family-based association design, a particular variant (rs11131149) showed a suggestive interaction with maternal cognitive sensitivity on children's ToM (P = 0.019). More copies of the major allele were associated with higher ToM as a function of increasing cognitive sensitivity. A sizeable 26% of the variability in ToM was accounted for by this interaction. This study provides the first empirical evidence of gene-environment interactions on ToM, supporting the notion that genetic factors may be modulated by potent environmental influences early in development. © The Author (2015). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  4. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation

    International Nuclear Information System (INIS)

    Sison Escaño, Mary Clare; Arevalo, Ryan Lacdao; Kasai, Hideaki; Gyenge, Elod

    2014-01-01

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for the development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4− on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements. (topical review)

  5. Optimization and in Vivo Validation of Peptide Vectors Targeting the LDL Receptor.

    Science.gov (United States)

    Jacquot, Guillaume; Lécorché, Pascaline; Malcor, Jean-Daniel; Laurencin, Mathieu; Smirnova, Maria; Varini, Karine; Malicet, Cédric; Gassiot, Fanny; Abouzid, Karima; Faucon, Aude; David, Marion; Gaudin, Nicolas; Masse, Maxime; Ferracci, Géraldine; Dive, Vincent; Cisternino, Salvatore; Khrestchatisky, Michel

    2016-12-05

    Active targeting and delivery to pathophysiological organs of interest is of paramount importance to increase specific accumulation of therapeutic drugs or imaging agents while avoiding systemic side effects. We recently developed a family of new peptide ligands of the human and rodent LDL receptor (LDLR), an attractive cell-surface receptor with high uptake activity and local enrichment in several normal or pathological tissues (Malcor et al., J. Med. Chem. 2012, 55 (5), 2227). Initial chemical optimization of the 15-mer, all-natural-amino-acid compound 1/VH411 (DSGL[CMPRLRGC]cDPR) and structure-activity relationship (SAR) investigation led to the cyclic 8-amino-acid analogue compound 22/VH445 ([cMPRLRGC]c), which specifically binds hLDLR with a KD of 76 nM and has an in vitro blood half-life of ∼3 h. Further introduction of non-natural amino acids led to the identification of compound 60/VH4106 ([(d)-"Pen"M"Thz"RLRGC]c), which showed the lowest KD value (9 nM), i.e., the highest binding affinity. However, this latter analogue displayed the lowest in vitro blood half-life (∼1.9 h). In the present study, we designed a new set of peptide analogues, namely, VH4127 to VH4131, with further improved biological properties. Detailed analysis of the hLDLR-binding kinetics of previous and new analogues showed that the latter all displayed very high on-rates, in the 10^6 M^-1 s^-1 range, and off-rates varying from the low 10^-2 s^-1 to the 10^-1 s^-1 range. Furthermore, all these new analogues showed increased blood half-lives in vitro, reaching ∼7 and 10 h for VH4129 and VH4131, respectively. Interestingly, in cell-based assays using both VH445 and the most balanced optimized analogue VH4127 ([cM"Thz"RLRG"Pen"]c), which shows a KD of 18 nM and a blood half-life of ∼4.3 h, we demonstrate that its higher on-rate correlated with a significant increase in both the extent of cell-surface binding to hLDLR and the endocytosis potential. Finally, intravenous injection of tritium-radiolabeled 3H
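
    The reported kinetic and equilibrium constants are linked by the standard relation K_D = k_off / k_on. A quick consistency check, assuming an on-rate of 10^6 M^-1 s^-1 (the order of magnitude reported above; the exact value is an assumption):

```python
# Assumed on-rate from the reported 10^6 M^-1 s^-1 range; K_D as reported for VH4127.
k_on = 1.0e6         # M^-1 s^-1 (assumption)
K_D = 18e-9          # M (reported)

k_off = K_D * k_on   # s^-1; lands in the low 10^-2 s^-1 range, as reported
residence_time = 1.0 / k_off   # s, mean lifetime of the receptor-ligand complex
print(f"k_off = {k_off:.3g} s^-1, residence time = {residence_time:.0f} s")
```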

  6. Neuroscience, virtual reality and neurorehabilitation: brain repair as a validation of brain theory.

    Science.gov (United States)

    Verschure, Paul F M J

    2011-01-01

    This paper argues that basing cybertherapy approaches on a theoretical understanding of the brain has advantages: on the one hand it provides a rational approach to therapy design, while on the other it allows for a direct validation of brain theory in the clinic. As an example, this paper discusses how the Distributed Adaptive Control (DAC) architecture, a theory of mind, brain and action, has given rise to a new paradigm in neurorehabilitation called the Rehabilitation Gaming System (RGS) and to novel neuroprosthetic systems. The neuroprosthetic system considered is developed to replace the function of cerebellar micro-circuits; it expresses core aspects of the learning systems of DAC and has been successfully tested in in vivo experiments. The virtual-reality-based rehabilitation paradigm of RGS has been validated in the treatment of acute and chronic stroke and has been shown to be more effective than existing methods. RGS provides a foundation for integrated at-home therapy systems that can operate largely autonomously when augmented with appropriate physiological monitoring and diagnostic devices. These examples provide first steps towards a science-based medicine.

  7. A new measurement for the revised reinforcement sensitivity theory: psychometric criteria and genetic validation

    Directory of Open Access Journals (Sweden)

    Martin Reuter

    2015-03-01

    Jeffrey Gray's Reinforcement Sensitivity Theory (RST) represents one of the most influential biologically based personality theories describing individual differences in approach and avoidance tendencies. The most prominent self-report inventory to measure individual differences in approach and avoidance behavior to date is the BIS/BAS scale by Carver & White (1994). As Gray & McNaughton (2000) revised the RST after its initial formulation in the 1970s/80s, and given that the Carver & White measure is based on the initial conceptualization of RST, there is a growing need for self-report inventories measuring individual differences in the revised behavioral inhibition system (BIS), behavioral activation system (BAS) and fight-flight-freeze system (FFFS). Therefore, in this paper we present a new questionnaire measuring individual differences in the revised constructs of the BIS, BAS and FFFS in N = 1814 participants (German sample). An English-translated version of the new measure is also presented and tested in N = 299 English-language participants. A large subgroup of the German participants (N = 1090) also filled in the BIS/BAS scales by Carver & White (1994), and the correlations between these measures are presented. Finally, this same subgroup of participants provided buccal swabs for the investigation of the arginine vasopressin receptor 1a (AVPR1a) gene. Here, a functional genetic polymorphism (rs11174811) in the AVPR1a gene was shown to be associated with individual differences in both the revised BIS and classic BIS dimensions.

  8. Development and validation of receptor occupancy pharmacodynamic assays used in the clinical development of the monoclonal antibody vedolizumab.

    Science.gov (United States)

    Wyant, Tim; Estevam, Jose; Yang, Lili; Rosario, Maria

    2016-03-01

    Vedolizumab is a monoclonal antibody approved for use in ulcerative colitis and Crohn's disease. By specifically binding to α4β7 integrin, vedolizumab prevents trafficking of lymphocytes to the gut, thereby interfering with disease pathology. During the clinical development program, the pharmacodynamic effect of vedolizumab was evaluated by 2 flow cytometry receptor occupancy assays: act-1 (ACT-1) and mucosal addressin cell adhesion molecule-1 (MAdCAM-1). Here we describe the development and validation of these assays. The ACT-1 assay is a receptor occupancy free-site assay that uses a monoclonal antibody with the same binding epitope as vedolizumab to detect free (unbound) sites on α4β7 integrin. The MAdCAM-1 assay uses a soluble version of the natural ligand for α4β7 integrin to detect free sites. The assays were validated using a fit-for-purpose approach throughout the clinical development of vedolizumab. Both the ACT-1 assay and the MAdCAM-1 assay demonstrated acceptable reproducibility and repeatability, and were sufficiently stable to allow for clinical use. During clinical testing the assays demonstrated that vedolizumab was able to saturate peripheral cells at all doses tested. Two pharmacodynamic receptor occupancy assays were thus developed and validated to assess the effect of vedolizumab on peripheral blood cells. The results of these assays demonstrated the practical use of flow cytometry to examine pharmacodynamic response in clinical trials. © 2015 International Clinical Cytometry Society.
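
    A free-site assay of this kind infers occupancy from the drop in free-site signal relative to a pre-dose baseline. A minimal sketch; the helper function and fluorescence values are illustrative, not part of the validated clinical assays:

```python
def receptor_occupancy(free_baseline: float, free_postdose: float) -> float:
    """Percent receptor occupancy from free-site signal (e.g. mean fluorescence
    intensity): occupancy is the fraction of baseline free sites now bound."""
    return 100.0 * (1.0 - free_postdose / free_baseline)

# Illustrative values: near-complete saturation of peripheral-cell receptors.
print(receptor_occupancy(1000.0, 50.0))   # 95.0
```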

  9. Systematic development and validation of a theory-based questionnaire to assess toddler feeding.

    Science.gov (United States)

    Hurley, Kristen M; Pepper, M Reese; Candelaria, Margo; Wang, Yan; Caulfield, Laura E; Latta, Laura; Hager, Erin R; Black, Maureen M

    2013-12-01

    This paper describes the development and validation of a 27-item caregiver-reported questionnaire on toddler feeding. The development of the Toddler Feeding Behavior Questionnaire was based on a theory of interactive feeding that incorporates caregivers' responses to concerns about their children's dietary intake, appetite, size, and behaviors rather than relying exclusively on caregiver actions. Content validity included review by an expert panel (n = 7) and testing in a pilot sample (n = 105) of low-income mothers of toddlers. Construct validity and reliability were assessed among a second sample of low-income mothers of predominantly African-American (70%) toddlers aged 12-32 mo (n = 297) participating in the baseline evaluation of a toddler overweight prevention study. Internal consistency (Cronbach's α: 0.64-0.87) and test-retest (0.57-0.88) reliability were acceptable for most constructs. Exploratory and confirmatory factor analyses revealed 5 theoretically derived constructs of feeding: responsive, forceful/pressuring, restrictive, indulgent, and uninvolved (root mean square error of approximation = 0.047, comparative fit index = 0.90, standardized root mean square residual = 0.06). Statistically significant associations were observed among feeding behaviors, toddler overweight status, perceived toddler fussiness, and maternal mental health. The Toddler Feeding Behavior Questionnaire adds to the field by providing a brief instrument that can be administered in 5 min to examine how caregiver-reported feeding behaviors relate to toddler health and behavior.
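
    The internal-consistency figures quoted above are Cronbach's α, which relates the number of items k to the item and total-score variances: α = k/(k-1) · (1 - Σ var(item)/var(total)). A sketch on toy data, not the questionnaire's actual responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha from a respondents x items matrix of scores."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total score
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Five hypothetical respondents answering a 3-item subscale.
scores = np.array([[3, 4, 3],
                   [2, 2, 1],
                   [4, 5, 5],
                   [1, 2, 2],
                   [5, 4, 5]])
print(round(cronbach_alpha(scores), 3))   # 0.946
```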

  11. Transverse signal decay under the weak field approximation: Theory and validation.

    Science.gov (United States)

    Berman, Avery J L; Pike, G Bruce

    2018-07-01

    To derive an expression for the transverse signal time course from systems in the motional narrowing regime, such as water diffusing in blood. This was validated in silico and experimentally with ex vivo blood samples. A closed-form solution (CFS) for transverse signal decay under any train of refocusing pulses was derived using the weak field approximation. The CFS was validated via simulations of water molecules diffusing in the presence of spherical perturbers, with a range of sizes and under various pulse sequences. The CFS was compared with more conventional fits assuming monoexponential decay, including chemical exchange, using ex vivo blood Carr-Purcell-Meiboom-Gill (CPMG) data. From simulations, the CFS was shown to be valid in the motional narrowing regime and partially into the intermediate dephasing regime, with increased accuracy with increasing CPMG refocusing rate. In theoretical calculations of the CFS, fitting for the transverse relaxation rate (R2) gave excellent agreement with the weak field approximation expression for R2 for CPMG sequences, but diverged for free induction decay. These same results were confirmed in the ex vivo analysis. Transverse signal decay in the motional narrowing regime can be accurately described analytically. This theory has applications in areas such as tissue iron imaging, relaxometry of blood, and contrast agent imaging. Magn Reson Med 80:341-350, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
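
    The conventional monoexponential fit that the closed-form solution is compared against models the echo decay as S(t) = S0·exp(-R2·t), so R2 can be estimated by log-linear regression. A sketch on synthetic, noiseless data (values assumed):

```python
import numpy as np

t = np.array([0.01, 0.02, 0.04, 0.08, 0.16])   # s, echo times (assumed)
R2_true = 12.0                                  # s^-1, synthetic ground truth
signal = np.exp(-R2_true * t)                   # monoexponential decay, S0 = 1

# Fit ln S = ln S0 - R2 * t by linear least squares; the slope gives -R2.
slope, intercept = np.polyfit(t, np.log(signal), 1)
R2_fit = -slope
print(f"R2 = {R2_fit:.1f} s^-1")   # recovers 12.0 on noiseless data
```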

  12. On the validity of the classical hydrodynamic lubrication theory applied to squeeze film dampers

    International Nuclear Information System (INIS)

    Danaila, S; Moraru, L

    2010-01-01

    Squeeze film dampers (SFDs) are devices utilized to control vibrations of the shafts of high-speed rotating machinery. The SFD-squirrel-cage combination is probably the most used system for tuning the stiffness and damping of the supports for rotors installed on ball bearings. Squeeze film dampers are essentially hydrodynamic bearings which contain the ball-bearing housings of ball-bearing-supported shafts. Consequently, the oil film within the SFD is influenced only by the precession and nutation of the shaft; that is, the flow of the oil within the damper is not directly influenced by the spin of the rotor. In the classical theory, however, the flow in the thin film is still governed by the Reynolds equation. In this paper, some of the limits of the classical theory of the SFD are discussed, and theoretical and experimental studies which illustrate the ideas presented herein are reported. The orbits of an unbalanced rotor that is supported by a ball-bearings-SFD-squirrel-cage assembly at one end and by rigidly mounted ball bearings at the other end are computed using the bearing forces provided by the classical short bearing theory. The numerical model also includes the properties of the squirrel cage. The parameters of the squirrel cage were measured, together with the effect of the friction within the assembly. Experimental unbalance responses were also collected for various rotation speeds and unbalances to validate the numerical simulations.

  13. Validation of a rapid, non-radioactive method to quantify internalisation of G-protein coupled receptors.

    Science.gov (United States)

    Jongsma, Maikel; Florczyk, Urszula M; Hendriks-Balk, Mariëlle C; Michel, Martin C; Peters, Stephan L M; Alewijnse, Astrid E

    2007-07-01

    Agonist exposure can cause internalisation of G-protein coupled receptors (GPCRs), which may be a part of desensitisation but also of cellular signaling. Previous methods to study internalisation have been tedious or only poorly quantitative. Therefore, we have developed and validated a quantitative method using a sphingosine-1-phosphate (S1P) receptor as a model. Because of a lack of suitable binding studies, it has been difficult to study S1P receptor internalisation. Using a N-terminal HisG-tag, S1P(1) receptors on the cell membrane can be visualised via immunocytochemistry with a specific anti-HisG antibody. S1P-induced internalisation was concentration dependent and was quantified using a microplate reader, detecting either absorbance, a fluorescent or luminescent signal, depending on the antibodies used. Among those, the fluorescence detection method was the most convenient to use. The relative ease of this method makes it suitable to measure a large number of data points, e.g. to compare the potency and efficacy of receptor ligands.

  14. Bringing loyalty to e-Health: theory validation using three internet-delivered interventions.

    Science.gov (United States)

    Crutzen, Rik; Cyr, Dianne; de Vries, Nanne K

    2011-09-24

    Internet-delivered interventions can effectively change health risk behaviors, but the actual use of these interventions by the target group once they access the website is often very low (high attrition, low adherence). Therefore, it is relevant and necessary to focus on factors related to use of an intervention once people arrive at the intervention website. We focused on user perceptions resulting in e-loyalty (ie, intention to visit an intervention again and to recommend it to others). A background theory for e-loyalty, however, is still lacking for Internet-delivered interventions. The objective of our study was to propose and validate a conceptual model regarding user perceptions and e-loyalty within the field of eHealth. We presented at random 3 primary prevention interventions aimed at the general public and, subsequently, participants completed validated measures regarding user perceptions and e-loyalty. Time on each intervention website was assessed by means of server registrations. Of the 592 people who were invited to participate, 397 initiated the study (response rate: 67%) and 351 (48% female, mean age 43 years, varying in educational level) finished the study (retention rate: 88%). Internal consistency of all measures was high (Cronbach alpha > .87). The findings demonstrate that the user perceptions regarding effectiveness (β range .21-.41) and enjoyment (β range .14-.24) both had a positive effect on e-loyalty, which was mediated by active trust (β range .27-.60). User perceptions and e-loyalty had low correlations with time on the website (r range .04-.18). The consistent pattern of findings speaks in favor of their robustness and contributes to theory validation regarding e-loyalty. The importance of a theory-driven solution to a practice-based problem (ie, low actual use) needs to be stressed in view of the importance of the Internet in terms of intervention development. Longitudinal studies are needed to investigate whether people

  15. Evolutionary game theory for physical and biological scientists. I. Training and validating population dynamics equations.

    Science.gov (United States)

    Liao, David; Tlsty, Thea D

    2014-08-06

    Failure to understand evolutionary dynamics has been hypothesized as limiting our ability to control biological systems. An increasing awareness of similarities between macroscopic ecosystems and cellular tissues has inspired optimism that game theory will provide insights into the progression and control of cancer. To realize this potential, the ability to compare game-theoretic models and experimental measurements of population dynamics should be broadly disseminated. In this tutorial, we present an analysis method that can be used to train parameters in game-theoretic dynamics equations, to validate the resulting equations, and to make predictions that challenge these equations and inform the design of treatment strategies. The data analysis techniques in this tutorial are adapted from the analysis of reaction kinetics using the method of initial rates taught in undergraduate general chemistry courses. Reliance on computer programming is avoided to encourage the adoption of these methods as routine bench activities.
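
    The chemistry analogy can be made concrete: as with the method of initial rates, per-capita growth rates are estimated from early, near-exponential measurements and then used to parameterize the dynamics equations. A sketch with assumed numbers, not data from the tutorial:

```python
import math

def initial_rate(t1: float, n1: float, t2: float, n2: float) -> float:
    """Per-capita growth rate (1/h) from two early-time population counts,
    assuming near-exponential growth N(t) = N0 * exp(r * t)."""
    return (math.log(n2) - math.log(n1)) / (t2 - t1)

# Two hypothetical cell populations counted at t = 0 h and t = 24 h.
r_a = initial_rate(0.0, 1000, 24.0, 4000)   # faster-growing type
r_b = initial_rate(0.0, 1000, 24.0, 2000)   # slower-growing type
print(f"r_a = {r_a:.4f} /h, r_b = {r_b:.4f} /h")
```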

  16. Two theories on the test bench: Internal and external validity of the theories of Ronald Inglehart and Shalom Schwartz

    OpenAIRE

    Datler, Georg; Jagodzinski, Wolfgang; Schmidt, Peter

    2013-01-01

    In the last decades value research has produced a vast number of theoretical concepts. However, it is unclear how the different value theories relate to each other. This study makes a first step toward a systematic comparison of value theories. It focuses on the individual level of the two approaches that are, at present, probably the most prominent in international research - the theory of basic human values of Shalom Schwartz and the postmodernization theory of Ronald Inglehart. Using data ...

  17. Signalling with retinoids in the human lung: validation of new tools for the expression study of retinoid receptors

    International Nuclear Information System (INIS)

    Poulain, Stéphane; Lacomme, Stéphanie; Battaglia-Hsu, Shyue-Fang; Manoir, Stanislas du; Brochin, Lydia; Vignaud, Jean-Michel; Martinet, Nadine

    2009-01-01

    Retinoid receptors are involved in development and cell homeostasis. Alterations of their expression have been observed in lung cancer. However, retinoid chemoprevention trials in populations at risk of developing such tumors have failed. Therefore, the pertinence of new clinical trials using second-generation retinoids requires a better prior understanding of retinoid signalling. With this aim, we extensively validated research tools focused on Retinoic Acid Receptor (RAR) beta, whose major role in lung cancer is documented. Biocomputing was used to assess the genomic organization of RAR beta, and the putative RAR-beta1' promoter features were investigated experimentally. Specific measurements, performed with qRT-PCR SYBR Green assays and a triplex of TaqMan probes, were extensively validated to establish retinoid receptor mRNA reference values for in vivo normal human bronchial cells, lung tumors and cell lines. Finally, a pan-RAR-beta antibody was generated and extensively validated by western blot and immunoprecipitation. No promoter-like activity was found for RAR-beta1'. An increase in RAR-beta2 mRNA marks the normal differentiation of the human bronchial epithelium, while a decrease is observed in most lung cancer cell lines. Accordingly, RAR-beta2 is also, along with RXR beta, down-regulated in lung tumors. When using nuclear extracts of BEAS-2B and normal lung cells, only the RAR-beta2 long protein isoform was recognized by our antibody. Rigorous sample processing and extensive biocomputing were the key factors for this study. mRNA reference values and validated tools can now be used to advance research on retinoid signalling in the lung

  18. Validation of experimental molecular crystal structures with dispersion-corrected density functional theory calculations.

    Science.gov (United States)

    van de Streek, Jacco; Neumann, Marcus A

    2010-10-01

    This paper describes the validation of a dispersion-corrected density functional theory (d-DFT) method for the purpose of assessing the correctness of experimental organic crystal structures and enhancing the information content of purely experimental data. 241 experimental organic crystal structures from the August 2008 issue of Acta Cryst. Section E were energy-minimized in full, including unit-cell parameters. The differences between the experimental and the minimized crystal structures were subjected to statistical analysis. The r.m.s. Cartesian displacement excluding H atoms upon energy minimization with flexible unit-cell parameters is selected as a pertinent indicator of the correctness of a crystal structure. All 241 experimental crystal structures are reproduced very well: the average r.m.s. Cartesian displacement for the 241 crystal structures, including 16 disordered structures, is only 0.095 Å (0.084 Å for the 225 ordered structures). R.m.s. Cartesian displacements above 0.25 Å either indicate incorrect experimental crystal structures or reveal interesting structural features such as exceptionally large temperature effects, incorrectly modelled disorder, or symmetry-breaking H atoms. After validation, the method is applied to nine examples that are known to be ambiguous or subtly incorrect.
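
    The r.m.s. Cartesian displacement used as the correctness indicator is straightforward once experimental and minimized coordinates are in a common frame. A toy computation (coordinates in Å, values assumed; the overlay of the two structures implied by the full method is omitted here):

```python
import numpy as np

# Toy non-H atomic coordinates (Angstrom) before and after energy minimization.
experimental = np.array([[0.00, 0.00, 0.00],
                         [1.50, 0.00, 0.00],
                         [1.50, 1.40, 0.00]])
minimized    = np.array([[0.05, 0.00, 0.00],
                         [1.45, 0.05, 0.00],
                         [1.50, 1.45, 0.05]])

# r.m.s. displacement: sqrt of the mean squared per-atom displacement.
rmsd = np.sqrt(((experimental - minimized) ** 2).sum(axis=1).mean())
print(f"{rmsd:.3f} A")   # well below the 0.25 A threshold discussed above
```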

  19. A discussion on validity of the diffusion theory by Monte Carlo method

    Science.gov (United States)

    Peng, Dong-qing; Li, Hui; Xie, Shusen

    2008-12-01

    Diffusion theory is widely used as a basis for experiments and methods that determine the optical properties of biological tissues. A simple analytical solution can be obtained easily from the diffusion equation after a series of approximations, which invites a misinterpretation: that if several semi-infinite bio-tissues have the same effective attenuation coefficient, the distribution of light fluence within them must also be the same. In order to assess the validity of this assumption, the depth-resolved internal fluence of several semi-infinite biological tissues with the same effective attenuation coefficient was simulated under a wide collimated beam using the Monte Carlo method under different conditions. The influence of the tissue refractive index on the distribution of light fluence was also discussed in detail. Our results showed that when the refractive indices of several bio-tissues with the same effective attenuation coefficient were the same, the depth-resolved internal fluence was the same; otherwise, it was not. A change in the refractive index of a tissue affects the depth distribution of light in the tissue. Therefore, the refractive index is an important optical property of tissue and should be taken into account when using the diffusion approximation theory.
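
    In diffusion theory the effective attenuation coefficient is mu_eff = sqrt(3·mu_a·(mu_a + mu_s')), so two tissues can share mu_eff while having quite different absorption and reduced scattering, which is why equal mu_eff need not imply equal internal fluence. The coefficients below are illustrative, not measured values:

```python
import math

def mu_eff(mu_a: float, mu_s_prime: float) -> float:
    """Effective attenuation coefficient (1/mm) from diffusion theory."""
    return math.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))

m1 = mu_eff(0.0100, 1.0)   # weakly absorbing, strongly scattering tissue
m2 = mu_eff(0.0238, 0.4)   # mu_a tuned so mu_eff approximately matches m1
print(f"{m1:.4f}  {m2:.4f}  1/mm")   # nearly equal attenuation coefficients
```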

  20. Validation of the 3D finite element transport theory code EVENT for shielding applications

    International Nuclear Information System (INIS)

    Warner, Paul; Oliveira, R.E. de

    2000-01-01

    This paper is concerned with the validation of the 3D deterministic neutral-particle transport theory code EVENT for shielding applications. The code is based on the finite element-spherical harmonics (FE-PN) method, which has been extensively developed over the last decade. A general multi-group, anisotropic-scattering formalism enables the code to address realistic steady-state and time-dependent, multi-dimensional coupled neutron/gamma radiation transport problems involving high scattering and deep penetration alike. The powerful geometrical flexibility and competitive computational effort make the code an attractive tool for shielding applications. In recognition of this, EVENT is currently in the process of being adopted by the UK nuclear industry. The theory behind EVENT is described and its numerical implementation is outlined. Numerical results obtained by the code are compared with predictions of the Monte Carlo code MCBEND and also with the results from benchmark shielding experiments. In particular, results are presented for the ASPIS experimental configuration for both neutron and gamma ray calculations using the BUGLE 96 nuclear data library. (author)

  1. Validation of somatostatin receptor scintigraphy in the localization of neuroendocrine tumors

    International Nuclear Information System (INIS)

    Lamberts, S.W.J.; Reubi, J.C.; Krenning, E.P.

    1993-01-01

    Somatostatin analogs are used in the control of hormonal hypersecretion and tumor growth of patients with acromegaly, islet cell carcinomas and carcinoids. Recently we showed that somatostatin receptor positive tumors can be visualized in vivo after the administration of radionuclide-labeled somatostatin analogs. Receptor imaging was positive in 18/21 islet cell tumors, 32/37 carcinoids, 26/28 paragangliomas, 9/14 medullary thyroid carcinomas, and 5/7 small cell lung cancers. Somatostatin receptor imaging is an easy, harmless and painless diagnostic method. It localizes multiple and/or metastatic tumors, predicts the successful control of hormonal hypersecretion by octreotide and seems to be of prognostic value in certain types of cancer. This scintigraphic method might help in patient selection for clinical trials with somatostatin analogs in the treatment of neuroendocrine cancers. (orig.)

  3. Validation of antibodies for neuroanatomical localization of the P2Y11 receptor in macaque brain

    DEFF Research Database (Denmark)

    Dreisig, Karin; Degn, Matilda; Sund, Louise

    2016-01-01

    Focus on the purinergic receptor P2Y11 has increased following the finding of an association between the sleep disorder narcolepsy and a genetic variant in P2RY11 causing decreased gene expression. Narcolepsy is believed to arise from an autoimmune destruction of the hypothalamic neurons that pro...

  4. Theory of mind in remitted bipolar disorder: Younger patients struggle in tasks of higher ecological validity.

    Science.gov (United States)

    Feyerabend, Julia; Lüttke, Stefan; Grosse-Wentrup, Fabienne; Wolter, Sibylla; Hautzinger, Martin; Wolkenstein, Larissa

    2018-04-15

    To date, research concerning Theory of Mind (ToM) in remitted bipolar disorder (rBD) has yielded inconclusive results. This may be a result of methodological shortcomings and the failure to consider relevant third variables. Furthermore, studies using ecologically valid stimuli are rare. This study examines ToM in rBD patients, using ecologically valid stimuli. Additionally, the effects of sad mood induction (MI) as well as of age and gender are considered. The sample comprises N = 44 rBD patients (rBDPs) and N = 40 healthy controls (HCs). ToM decoding is assessed using the Cambridge Mindreading Face-Voice-Battery (CAM) and ToM reasoning using the Movie for the Assessment of Social Cognition (MASC). Both tasks were divided into two parts to conduct one part with and one without MI. While across the whole sample there was no evidence that rBDPs and HCs differed in ToM decoding or reasoning, in the younger subsample (age < 45) rBDPs performed worse than HCs in ToM decoding. While MI negatively influenced reasoning in both groups, gender had no effect. Most patients in this study had a high level of social functioning, limiting the generalizability of the results. As important social steps have to be undertaken before middle-age, the decoding deficits in younger rBDPs might be of particular importance not only for social functioning but also for the course of illness. Furthermore, this age-related deficit may explain the inconclusive findings that have been reported so far. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Validation of the kinetic-turbulent-neoclassical theory for edge intrinsic rotation in DIII-D

    Science.gov (United States)

    Ashourvan, Arash; Grierson, B. A.; Battaglia, D. J.; Haskey, S. R.; Stoltzfus-Dueck, T.

    2018-05-01

    In a recent kinetic model of edge main-ion (deuterium) toroidal velocity, intrinsic rotation results from neoclassical orbits in an inhomogeneous turbulent field [T. Stoltzfus-Dueck, Phys. Rev. Lett. 108, 065002 (2012)]. This model predicts a value for the toroidal velocity that is co-current for a typical inboard X-point plasma at the core-edge boundary (ρ ˜ 0.9). Using this model, the velocity prediction is tested on the DIII-D tokamak for a database of L-mode and H-mode plasmas with nominally low neutral beam torque, including both signs of plasma current. Values for the flux-surface-averaged main-ion rotation velocity in the database are obtained from the impurity carbon rotation by analytically calculating the main-ion-impurity neoclassical offset. The deuterium rotation obtained in this manner has been validated by direct main-ion measurements for a limited number of cases. Key theoretical parameters of ion temperature and turbulent scale length are varied across a wide range in an experimental database of discharges. Using a characteristic electron temperature scale length as a proxy for the turbulent scale length, the predicted main-ion rotation velocity is in general agreement with the experimental measurements at low neutral beam injection (NBI) powers. For balanced, but high-powered, NBI, the net injected torque through the edge can exceed 1 Nm in the counter-current direction. The theory model has been extended to compute the rotation degradation from this counter-current NBI torque by solving a reduced momentum evolution equation for the edge; the revised velocity prediction is found to be in agreement with experiment. Using the theory-modeled, and now tested, velocity to predict the bulk plasma rotation opens a path to more confidently projecting confinement and stability in ITER.

  6. Science of Water Leaks: Validated Theory for Moisture Flow in Microchannels and Nanochannels.

    Science.gov (United States)

    Lei, Wenwen; Fong, Nicole; Yin, Yongbai; Svehla, Martin; McKenzie, David R

    2015-10-27

    Water is ubiquitous; the science of its transport in micro- and nanochannels has applications in electronics, medicine, filtration, packaging, and earth and planetary science. Validated theory for water vapor and two-phase water flows is a "missing link"; completing it enables us to define and quantify flow in a set of four standard leak configurations with dimensions from the nanoscale to the microscale. Here we report the first measurements of water vapor flow rates through four silica microchannels as a function of humidity, including under conditions when air is present as a background gas. An important finding is that the tangential momentum accommodation coefficient (TMAC) is strongly modified by surface layers of adsorbed water molecules, in agreement with previous work on the TMAC for nitrogen molecules impacting a silica surface in the presence of moisture. We measure enhanced flow rates for two-phase flows in silica microchannels driven by capillary filling. For the measurement of flows in nanochannels we use heavy water mass spectrometry. We construct the theory for the flow rates of the dominant modes of water transport through each of the four standard configurations and benchmark it against our new measurements in silica and against previously reported measurements for nanochannels in carbon nanotubes, carbon nanopipes, and porous alumina. The findings show that all behavior can be described by the four standard leak configurations and that measurements of leak behavior made using other molecules, such as helium, are not reliable. Single-phase water vapor flow is overestimated by a helium measurement, while two-phase flows are greatly underestimated for channels larger than 100 nm or for all channels when boundary slip applies, to an extent that depends on the slip length for the liquid-phase flows.
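
    The slip-enhanced liquid flow described above can be sketched with the classical Hagen-Poiseuille rate multiplied by a Navier-slip correction factor (1 + 4b/r) for a cylindrical channel. The formula is a textbook result and the channel geometry, pressure drop, and slip length below are illustrative assumptions, not values from the paper:

```python
import math

def poiseuille_flow_with_slip(radius, length, delta_p, viscosity, slip_length=0.0):
    """Volumetric flow rate (m^3/s) of liquid through a cylindrical channel,
    with a Navier-slip enhancement factor (1 + 4*b/r) for slip length b."""
    no_slip = math.pi * radius**4 * delta_p / (8.0 * viscosity * length)
    return no_slip * (1.0 + 4.0 * slip_length / radius)

# Illustrative numbers (not from the paper): water in a 100 nm radius channel,
# 1 um long, 1 bar pressure drop, with and without a 30 nm slip length.
q_no_slip = poiseuille_flow_with_slip(100e-9, 1e-6, 1e5, 1e-3)
q_slip = poiseuille_flow_with_slip(100e-9, 1e-6, 1e5, 1e-3, slip_length=30e-9)
print(q_slip / q_no_slip)  # enhancement factor 1 + 4*30/100 = 2.2
```

    The enhancement grows as the channel shrinks relative to the slip length, which is one reason helium-based leak measurements misestimate liquid-phase water flow.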

  7. Using G-Theory to Enhance Evidence of Reliability and Validity for Common Uses of the Paulhus Deception Scales.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-01-01

    We applied a new approach to Generalizability theory (G-theory) involving parallel splits and repeated measures to evaluate common uses of the Paulhus Deception Scales based on polytomous and four types of dichotomous scoring. G-theory indices of reliability and validity accounting for specific-factor, transient, and random-response measurement error supported use of polytomous over dichotomous scores as contamination checks; as control, explanatory, and outcome variables; as aspects of construct validation; and as indexes of environmental effects on socially desirable responding. Polytomous scoring also provided results for flagging faking as dependable as those when using dichotomous scoring methods. These findings argue strongly against the nearly exclusive use of dichotomous scoring for the Paulhus Deception Scales in practice and underscore the value of G-theory in demonstrating this. We provide guidelines for applying our G-theory techniques to other objectively scored clinical assessments, for using G-theory to estimate how changes to a measure might improve reliability, and for obtaining software to conduct G-theory analyses free of charge.
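
    As a rough illustration of the G-theory machinery applied above (though not the authors' parallel-splits repeated-measures design), a one-facet persons x items G-study can be sketched as follows. The data are simulated and the variance components come from the standard ANOVA mean-square estimators:

```python
import numpy as np

def g_coefficient(scores):
    """One-facet G-study (persons x items): estimate person and residual
    variance components from ANOVA mean squares, then form the relative
    generalizability coefficient for a k-item composite."""
    n_p, n_i = scores.shape
    grand = scores.mean()
    person_means = scores.mean(axis=1)
    item_means = scores.mean(axis=0)
    ms_p = n_i * np.sum((person_means - grand) ** 2) / (n_p - 1)
    resid = scores - person_means[:, None] - item_means[None, :] + grand
    ms_res = np.sum(resid ** 2) / ((n_p - 1) * (n_i - 1))
    var_p = max((ms_p - ms_res) / n_i, 0.0)   # person variance component
    var_res = ms_res                           # residual (error) component
    return var_p / (var_p + var_res / n_i)     # relative G coefficient

# Simulated responses: 50 persons, 10 items, equal true-score and error variance.
rng = np.random.default_rng(0)
true_score = rng.normal(0, 1, size=(50, 1))
scores = true_score + rng.normal(0, 1, size=(50, 10))
print(round(g_coefficient(scores), 2))
```

    Extending the design with split and occasion facets, as the authors do, lets specific-factor and transient error be separated from random-response error instead of lumped into one residual.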

  8. The Development of the Functional Literacy Experience Scale Based upon Ecological Theory (FLESBUET) and Validity-Reliability Study

    Science.gov (United States)

    Özenç, Emine Gül; Dogan, M. Cihangir

    2014-01-01

    This study aims to perform a validity-reliability test by developing the Functional Literacy Experience Scale based upon Ecological Theory (FLESBUET) for primary education students. The study group includes 209 fifth grade students at Sabri Taskin Primary School in the Kartal District of Istanbul, Turkey during the 2010-2011 academic year.…

  9. Enhancing Validity When Researching the "Other": Insights from Pierre Bourdieu's Theory of Social Science Research Practice

    Science.gov (United States)

    Naidoo, Devika

    2014-01-01

    This article explores aspects of Pierre Bourdieu's theory of social science research practice and discusses their relevance for enhancing validity when researching the "other." Aspects such as: a relational way of thinking about concepts, epistemology and methodology; the rigorous construction of the object of research; and…

  10. An Evaluation of the Cross-Cultural Validity of Holland's Theory: Career Choices by Workers in India.

    Science.gov (United States)

    Leong, Frederick T. L.; Austin, James T.; Sekaran, Uma; Komarraju, Meera

    1998-01-01

    Natives of India (n=172) completed Holland's Vocational Preference Inventory and job satisfaction measures. The inventory did not exhibit high external validity with this population. Congruence, consistency, and differentiation did not predict job or occupational satisfaction, suggesting cross-cultural limits on Holland's theory. (SK)

  11. Minute Impurities Contribute Significantly to Olfactory Receptor Ligand Studies: Tales from Testing the Vibration Theory

    OpenAIRE

    Paoli, M.; Münch, D.; Haase, A.; Skoulakis, E.; Turin, L.; Galizia, C. G.

    2017-01-01

    Several studies have attempted to test the vibrational hypothesis of odorant receptor activation in behavioral and physiological studies using deuterated compounds as odorants. The results have been mixed. Here, we attempted to test how deuterated compounds activate odorant receptors using calcium imaging of the fruit fly antennal lobe. We found specific activation of one area of the antennal lobe corresponding to inputs from a specific receptor. However, upon more detailed analysis, we disco...

  12. Validity of leptin receptor-deficiency (db/db) type 2 diabetes mellitus mice as a model of secondary osteoporosis

    Science.gov (United States)

    Huang, Le; You, Yong-Ke; Zhu, Tracy Y.; Zheng, Li-Zhen; Huang, Xiao-Ru; Chen, Hai-Yong; Yao, Dong; Lan, Hui-Yao; Qin, Ling

    2016-06-01

    This study aimed to evaluate the validity of the leptin receptor-deficient mouse model for secondary osteoporosis associated with type 2 diabetes mellitus (T2DM) at the bone micro-architectural level. Thirty-three 36-week-old male mice were divided into four groups: normal control (db/m) (n = 7), leptin receptor-deficient T2DM (db/db) (n = 8), human C-reactive protein (CRP) transgenic normal control (crp/db/m) (n = 7), and human CRP transgenic T2DM (crp/db/db) (n = 11). Lumbar vertebrae (L5) and bilateral lower limbs were scanned by micro-CT to analyze trabecular and cortical bone quality. Right femora were used for three-point bending to analyze the mechanical properties. Trabecular bone quality at L5 was better in the db/db and crp/db/db groups in terms of bone mineral density (BMD), bone volume fraction, connectivity density, trabecular number and separation (all p < 0.05). Maximum loading and energy yield in the mechanical test were similar among groups, while the elastic modulus in db/db and crp/db/db was significantly lower than in db/m. The leptin receptor-deficient mouse is therefore not a proper model for secondary osteoporosis associated with T2DM.

  13. Discovery and validation of information theory-based transcription factor and cofactor binding site motifs.

    Science.gov (United States)

    Lu, Ruipeng; Mucaki, Eliseos J; Rogan, Peter K

    2017-03-17

    Data from ChIP-seq experiments can derive the genome-wide binding specificities of transcription factors (TFs) and other regulatory proteins. We analyzed 765 ENCODE ChIP-seq peak datasets of 207 human TFs with a novel motif discovery pipeline based on recursive, thresholded entropy minimization. This approach, while obviating the need to compensate for skewed nucleotide composition, distinguishes true binding motifs from noise, quantifies the strengths of individual binding sites based on computed affinity and detects adjacent cofactor binding sites that coordinate with the targets of primary, immunoprecipitated TFs. We obtained contiguous and bipartite information theory-based position weight matrices (iPWMs) for 93 sequence-specific TFs, discovered 23 cofactor motifs for 127 TFs and revealed six high-confidence novel motifs. The reliability and accuracy of these iPWMs were determined via four independent validation methods, including the detection of experimentally proven binding sites, explanation of effects of characterized SNPs, comparison with previously published motifs and statistical analyses. We also predict previously unreported TF coregulatory interactions (e.g. TF complexes). These iPWMs constitute a powerful tool for predicting the effects of sequence variants in known binding sites, performing mutation analysis on regulatory SNPs and predicting previously unrecognized binding sites and target genes. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
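
    A minimal sketch of an information-theory position weight matrix in the spirit of the iPWMs described above (the paper's entropy-minimization pipeline is far more elaborate). A uniform background and a handful of toy aligned sites are assumed; the score of a sequence is the sum of per-position log-ratio weights in bits:

```python
import math

def build_ipwm(sites, pseudocount=0.5):
    """Information-theory PWM: per-position weights log2(f_b / 0.25),
    assuming a uniform ACGT background, built from aligned binding sites."""
    length = len(sites[0])
    pwm = []
    for pos in range(length):
        counts = {b: pseudocount for b in "ACGT"}
        for site in sites:
            counts[site[pos]] += 1
        total = sum(counts.values())
        pwm.append({b: math.log2((counts[b] / total) / 0.25) for b in "ACGT"})
    return pwm

def score(pwm, seq):
    """Individual-site information score in bits: sum of position weights."""
    return sum(col[b] for col, b in zip(pwm, seq))

# Toy aligned sites (hypothetical, not from the ENCODE datasets in the paper).
sites = ["TGACTC", "TGACTC", "TGAGTC", "TGACTG"]
pwm = build_ipwm(sites)
print(score(pwm, "TGACTC") > score(pwm, "AAAAAA"))  # consensus scores higher
```

    Scoring candidate sites in bits is what allows the strengths of individual binding sites, and the effect of an SNP that changes one base, to be quantified directly.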

  14. Experimental Validation of the Electrokinetic Theory and Development of Seismoelectric Interferometry by Cross-Correlation

    Directory of Open Access Journals (Sweden)

    F. C. Schoemaker

    2012-01-01

    We experimentally validate a relatively recent electrokinetic formulation of the streaming potential (SP) coefficient as developed by Pride (1994). The start of our investigation focuses on the streaming potential coefficient, which gives rise to the coupling of mechanical and electromagnetic fields. It is found that the theoretical amplitude values of this dynamic SP coefficient are in good agreement with the normalized experimental results over a wide frequency range, assuming no frequency dependence of the bulk conductivity. By adopting the full set of electrokinetic equations, a full-waveform wave propagation model is formulated. We compare the model predictions, neglecting the interface response and modeling only the coseismic fields, with laboratory measurements of a seismic wave of frequency 500 kHz that generates electromagnetic signals. Agreement is observed between measurement and electrokinetic theory regarding the coseismic electric field. The governing equations are subsequently adopted to study the applicability of seismoelectric interferometry. It is shown that seismic sources at a single boundary location are sufficient to retrieve the 1D seismoelectric responses, both for the coseismic and interface components, in a layered model.
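
    Reduced to its simplest form, interferometry by cross-correlation means that correlating two recordings of the same source isolates their differential traveltime. The toy NumPy sketch below (synthetic pulse, assumed delay) shows only that core idea, not the seismoelectric modeling of the paper:

```python
import numpy as np

# Two receivers record the same pulse with a relative delay; the peak of
# their cross-correlation sits at that delay, the basic mechanism behind
# interferometric retrieval of a response between two receivers.
fs = 1000                                   # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
pulse = np.exp(-((t - 0.3) / 0.01) ** 2)    # source wavelet arriving at 0.3 s
delay_samples = 57                          # assumed extra traveltime, in samples
rec_a = pulse
rec_b = np.roll(pulse, delay_samples)       # same pulse, delayed

xcorr = np.correlate(rec_b, rec_a, mode="full")
lag = np.argmax(xcorr) - (len(rec_a) - 1)   # lag of the correlation peak
print(lag)  # 57
```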

  15. On the validity of the effective field theory approach to SM precision tests

    Energy Technology Data Exchange (ETDEWEB)

    Contino, Roberto [EPFL, Lausanne (Switzerland). Inst. de Theorie des Phenomenes Physiques; CERN, Geneva (Switzerland). Theoretical Physics Dept.; Falkowski, Adam [Paris-11 Univ., 91 - Orsay (France). Lab. de Physique Theorique; Goertz, Florian; Riva, Francesco [CERN, Geneva (Switzerland). Theoretical Physics Dept.; Grojean, Christophe [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2016-09-15

    We discuss the conditions for an effective field theory (EFT) to give an adequate low-energy description of an underlying physics beyond the Standard Model (SM). Starting from the EFT where the SM is extended by dimension-6 operators, experimental data can be used without further assumptions to measure (or set limits on) the EFT parameters. The interpretation of these results requires instead a set of broad assumptions (e.g. power counting rules) on the UV dynamics. This allows one to establish, in a bottom-up approach, the validity range of the EFT description, and to assess the error associated with the truncation of the EFT series. We give a practical prescription on how experimental results could be reported, so that they admit a maximally broad range of theoretical interpretations. Namely, the experimental constraints on dimension-6 operators should be reported as functions of the kinematic variables that set the relevant energy scale of the studied process. This is especially important for hadron collider experiments where collisions probe a wide range of energy scales.

  16. Discriminant content validity: a quantitative methodology for assessing content of theory-based measures, with illustrative applications.

    Science.gov (United States)

    Johnston, Marie; Dixon, Diane; Hart, Jo; Glidewell, Liz; Schröder, Carin; Pollard, Beth

    2014-05-01

    In studies involving theoretical constructs, it is important that measures have good content validity and that there is no contamination of measures by content from other constructs. While reliability and construct validity are routinely reported, to date there has not been a satisfactory, transparent, and systematic method of assessing and reporting content validity. In this paper, we describe a methodology of discriminant content validity (DCV) and illustrate its application in three studies. Discriminant content validity involves six steps: construct definition, item selection, judge identification, judgement format, single-sample test of content validity, and assessment of discriminant items. In three studies, these steps were applied to a measure of illness perceptions (IPQ-R) and control cognitions. The IPQ-R performed well, with most items being purely related to their target construct, although timeline and consequences had small problems. By contrast, the study of control cognitions identified problems in measuring constructs independently. In the final study, direct estimation response formats for theory of planned behaviour constructs were found to have DCV as good as the Likert format. The DCV method allowed quantitative assessment of each item and can therefore inform the content validity of the measures assessed. The methods can be applied to assess content validity before or after collecting data to select the appropriate items to measure theoretical constructs. Further, the data reported for each item in Appendix S1 can be used in item or measure selection. Statement of contribution. What is already known on this subject? There are agreed methods of assessing and reporting construct validity of measures of theoretical constructs, but not their content validity. Content validity is rarely reported in a systematic and transparent manner. What does this study add? The paper proposes discriminant content validity (DCV), a systematic and transparent method

  17. Binding constants of membrane-anchored receptors and ligands: A general theory corroborated by Monte Carlo simulations.

    Science.gov (United States)

    Xu, Guang-Kui; Hu, Jinglei; Lipowsky, Reinhard; Weikl, Thomas R

    2015-12-28

    Adhesion processes of biological membranes that enclose cells and cellular organelles are essential for immune responses, tissue formation, and signaling. These processes depend sensitively on the binding constant K2D of the membrane-anchored receptor and ligand proteins that mediate adhesion, which is difficult to measure in the "two-dimensional" (2D) membrane environment of the proteins. An important problem therefore is to relate K2D to the binding constant K3D of soluble variants of the receptors and ligands that lack the membrane anchors and are free to diffuse in three dimensions (3D). In this article, we present a general theory for the binding constants K2D and K3D of rather stiff proteins whose main degrees of freedom are translation and rotation, along membranes and around anchor points "in 2D," or unconstrained "in 3D." The theory generalizes previous results by describing how K2D depends both on the average separation and thermal nanoscale roughness of the apposing membranes, and on the length and anchoring flexibility of the receptors and ligands. Our theoretical results for the ratio K2D/K3D of the binding constants agree with detailed results from Monte Carlo simulations without any data fitting, which indicates that the theory captures the essential features of the "dimensionality reduction" due to membrane anchoring. In our Monte Carlo simulations, we consider a novel coarse-grained model of biomembrane adhesion in which the membranes are represented as discretized elastic surfaces, and the receptors and ligands as anchored molecules that diffuse continuously along the membranes and rotate at their anchor points.
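
    The "dimensionality reduction" can be illustrated with the simplified relation K2D ≈ K3D/ℓ, where ℓ is an effective confinement length set by membrane roughness and anchoring flexibility. Both the prefactor-free form and the numbers below are assumptions for illustration; the paper derives the full dependence on separation, roughness, and anchor length:

```python
def k2d_from_k3d(k3d_m3, confinement_length_m):
    """Simplified 'dimensionality reduction': K2D ~ K3D / l_eff, where
    l_eff lumps membrane roughness and anchoring flexibility. This
    prefactor-free form is an illustration, not the paper's full result."""
    return k3d_m3 / confinement_length_m

# Hypothetical numbers: a soluble-domain affinity of 1e6 M^-1 (i.e. K3D in
# inverse molar, converted to m^3 per molecule) and ~10 nm confinement.
k3d = 1e6 / 6.022e23 * 1e-3     # 1e6 M^-1 expressed as a volume in m^3
k2d = k2d_from_k3d(k3d, 10e-9)
print(f"{k2d:.2e}")             # K2D as an area in m^2
```

    The smaller the confinement length, i.e. the smoother the membranes and the stiffer the anchoring, the larger the 2D binding constant for the same 3D affinity.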

  18. Content Validity and Psychometric Characteristics of the "Knowledge about Older Patients Quiz" for Nurses Using Item Response Theory.

    Science.gov (United States)

    Dikken, Jeroen; Hoogerduijn, Jita G; Kruitwagen, Cas; Schuurmans, Marieke J

    2016-11-01

    To assess the content validity and psychometric characteristics of the Knowledge about Older Patients Quiz (KOP-Q), which measures nurses' knowledge regarding older hospitalized adults and their certainty regarding this knowledge. Cross-sectional. Content validity: general hospitals. Psychometric characteristics: nursing school and general hospitals in the Netherlands. Content validity: 12 nurse specialists in geriatrics. Psychometric characteristics: 107 first-year and 78 final-year bachelor of nursing students, 148 registered nurses, and 20 nurse specialists in geriatrics. Content validity: the nurse specialists rated each item of the initial KOP-Q (52 items) on relevance. Ratings were used to calculate Item-Content Validity Index and average Scale-Content Validity Index (S-CVI/ave) scores. Items with insufficient content validity were removed. Psychometric characteristics: ratings of students, nurses, and nurse specialists were used to test for differential item functioning (DIF) and unidimensionality before item characteristics (discrimination and difficulty) were examined using item response theory. Finally, norm references were calculated and nomological validity was assessed. Content validity: 43 items remained after assessing content validity (S-CVI/ave = 0.90). Psychometric characteristics: of the 43 items, two demonstrating ceiling effects and 11 distorting ability estimates (DIF) were subsequently excluded. Item characteristics were assessed for the remaining 30 items, all of which demonstrated good discrimination and difficulty parameters. Knowledge was positively correlated with certainty about this knowledge. The final 30-item KOP-Q is a valid, psychometrically sound, comprehensive instrument that can be used to assess the knowledge of nursing students, hospital nurses, and nurse specialists in geriatrics regarding older hospitalized adults. It can identify knowledge and certainty deficits for research purposes or serve as a tool in educational
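
    The discrimination and difficulty parameters mentioned above come from logistic item response models. A minimal two-parameter-logistic (2PL) sketch, with illustrative parameter values rather than KOP-Q estimates:

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter-logistic IRT model: probability that a respondent with
    ability theta answers correctly an item with discrimination a and
    difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# A respondent whose ability equals the item difficulty has a 50% chance;
# higher discrimination a makes the curve steeper around b.
print(irt_2pl(0.0, a=1.5, b=0.0))                               # 0.5
print(irt_2pl(1.0, a=1.5, b=0.0) > irt_2pl(1.0, a=0.5, b=0.0))  # True
```

    Items whose curves differ across groups at equal ability are the DIF items that were flagged and excluded.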

  19. The Validity of Herzberg's Dual-Factor Theory on Job Satisfaction ...

    African Journals Online (AJOL)

    User

    2012-01-24

    Jan 24, 2012 ... But Herzberg's two-factor theory (also called motivation/hygiene theory) has been ... democratic process and, more importantly, it attracts the right calibre of politicians into ..... Journal of Applied Psychology, p. 162. Francis ...

  20. Validation of a Theory of Planned Behavior-Based Questionnaire to Examine Factors Associated With Milk Expression.

    Science.gov (United States)

    Bai, Yeon K; Dinour, Lauren M

    2017-11-01

    A proper assessment of the multidimensional needs of breastfeeding mothers in various settings is crucial to facilitate and support breastfeeding and its exclusivity. The theory of planned behavior (TPB) has been used frequently to measure factors associated with breastfeeding. Full utility of the TPB requires accurate measurement of theory constructs. Research aim: This study aimed to develop and confirm the psychometric properties of an instrument, Milk Expression on Campus, based on the TPB, and to establish the reliability and validity of the instrument. In spring 2015, 218 breastfeeding (current or recent) employees and students at one university campus in northern New Jersey completed the online questionnaire containing demographic and theory-based items. Internal consistency (α) and split-half reliability (r) tests and factor analyses established and confirmed the reliability and construct validity of this instrument. Milk Expression on Campus showed strong and significant reliabilities as a full scale (α = .78, r = .74) and for the theory construct subscales. Validity was confirmed as psychometric properties corresponded to the factors extracted from the scale. Four factors extracted from the direct construct subscales accounted for 79.49% of the total variability. Four distinct factors from the indirect construct subscales accounted for 73.68% of the total variability. Milk Expression on Campus can serve as a model TPB-based instrument to examine factors associated with women's milk expression behavior. The utility of this instrument extends to designing effective promotion programs to foster breastfeeding and milk expression behaviors in diverse settings.
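
    The split-half reliability reported above can be sketched as an odd-even correlation of half scores, stepped up to full length with the Spearman-Brown formula. The data below are simulated, not the study's:

```python
import numpy as np

def split_half_reliability(items):
    """Correlate odd- and even-item half scores, then apply the
    Spearman-Brown correction to estimate full-length reliability."""
    odd = items[:, 0::2].sum(axis=1)
    even = items[:, 1::2].sum(axis=1)
    r_half = np.corrcoef(odd, even)[0, 1]
    return 2 * r_half / (1 + r_half)   # Spearman-Brown prophecy formula

# Simulated questionnaire: 200 respondents, 8 items loading on one construct.
rng = np.random.default_rng(1)
trait = rng.normal(size=(200, 1))                      # latent construct
items = trait + rng.normal(scale=1.0, size=(200, 8))   # noisy item responses
print(round(split_half_reliability(items), 2))
```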

  1. Design, synthesis, and validation of a β-turn mimetic library targeting protein-protein and peptide-receptor interactions.

    Science.gov (United States)

    Whitby, Landon R; Ando, Yoshio; Setola, Vincent; Vogt, Peter K; Roth, Bryan L; Boger, Dale L

    2011-07-06

    The design and synthesis of a β-turn mimetic library as a key component of a small-molecule library targeting the major recognition motifs involved in protein-protein interactions is described. Analysis of a geometric characterization of 10,245 β-turns in the protein data bank (PDB) suggested that trans-pyrrolidine-3,4-dicarboxamide could serve as an effective and synthetically accessible library template. This was confirmed by initially screening select compounds against a series of peptide-activated GPCRs that recognize a β-turn structure in their endogenous ligands. This validation study was highlighted by identification of both nonbasic and basic small molecules with high affinities (Ki = 390 and 23 nM, respectively) for the κ-opioid receptor (KOR). Consistent with the screening capabilities of collaborators and following the design validation, the complete library was assembled as 210 mixtures of 20 compounds, providing a total of 4200 compounds designed to mimic all possible permutations of 3 of the 4 residues in a naturally occurring β-turn. Unique to the design and because of the C2 symmetry of the template, a typical 20 × 20 × 20-mix (8000 compounds prepared as 400 mixtures of 20 compounds) needed to represent 20 variations in the side chains of three amino acid residues reduces to a 210 × 20-mix, thereby simplifying the library synthesis and subsequent screening. The library was prepared using a solution-phase synthetic protocol with liquid-liquid or liquid-solid extractions for purification and conducted on a scale that ensures its long-term availability for screening campaigns. Screening the library against the human opioid receptors (KOR, MOR, and DOR) identified not only the activity of library members expected to mimic the opioid receptor peptide ligands but also additional side-chain combinations that provided enhanced receptor binding selectivities (>100-fold) and affinities (as low as Ki = 80 nM for KOR). A key insight to emerge from
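
    The symmetry reduction described above can be checked directly: with 20 side chains at the two symmetry-equivalent positions of the C2-symmetric template, ordered pairs collapse to unordered ones, and the third varied position supplies the 20-compound mixtures:

```python
from itertools import combinations_with_replacement

side_chains = range(20)  # 20 side-chain variants per position

# Ordered pairs, what a template without symmetry would need: 20 * 20 = 400.
ordered = [(a, b) for a in side_chains for b in side_chains]

# The C2-symmetric template makes (a, b) and (b, a) the same compound,
# leaving C(20, 2) + 20 = 210 distinct combinations.
unordered = list(combinations_with_replacement(side_chains, 2))

print(len(ordered), len(unordered), len(unordered) * 20)  # 400 210 4200
```

    This is exactly why a 20 × 20 × 20-mix of 8000 compounds in 400 mixtures reduces to a 210 × 20-mix of 4200 compounds in 210 mixtures.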

  2. Validity of the linear no-threshold theory of radiation carcinogenesis at low doses

    International Nuclear Information System (INIS)

    Cohen, B.L.

    1999-01-01

    A great deal is known about the cancer risk of high radiation doses from studies of Japanese A-bomb survivors, patients exposed for medical therapy, occupational exposures, etc. But the vast majority of important applications deal with much lower doses, usually accumulated at much lower dose rates, referred to as 'low-level radiation' (LLR). Conventionally, the cancer risk from LLR has been estimated by the use of linear no-threshold theory (LNT). For example, it is assumed that the cancer risk from 0.01 Sv (1 rem) of dose is 0.01 times the risk from 1 Sv (100 rem). In recent years, the former risk estimates have often been reduced by a 'dose and dose rate reduction factor', which is taken to be a factor of 2. But otherwise, the LNT is frequently used for doses as low as one hundred-thousandth of those for which there is direct evidence of cancer induction by radiation. It is the origin of the commonly used expression 'no level of radiation is safe' and the consequent public fear of LLR. The importance of this use of the LNT cannot be exaggerated; it enters many applications in the nuclear industry. The LNT paradigm has also been carried over to chemical carcinogens, leading to severe restrictions on the use of cleaning fluids, organic chemicals, pesticides, etc. If the LNT were abandoned for radiation, it would probably also be abandoned for chemical carcinogens. In view of these facts, it is important to consider the validity of the LNT. That is the purpose of this paper. (author)
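
    The LNT arithmetic described above is a one-line linear scaling, optionally divided by the dose and dose-rate reduction factor of 2. The nominal risk-per-sievert slope below is a placeholder for illustration, not a value from the paper:

```python
def lnt_risk(dose_sv, risk_per_sv, ddref=1.0):
    """Linear no-threshold estimate: risk proportional to dose, optionally
    reduced by a dose and dose-rate reduction factor (DDREF)."""
    return risk_per_sv * dose_sv / ddref

# Under LNT, 0.01 Sv carries exactly 1% of the risk attributed to 1 Sv;
# applying the conventional DDREF of 2 halves the low-dose estimate.
high = lnt_risk(1.0, risk_per_sv=0.05)                  # acute 1 Sv exposure
low = lnt_risk(0.01, risk_per_sv=0.05, ddref=2.0)       # chronic low dose
print(low / high)  # 0.005
```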

  3. Development and validation of an item response theory-based Social Responsiveness Scale short form.

    Science.gov (United States)

    Sturm, Alexandra; Kuhfeld, Megan; Kasari, Connie; McCracken, James T

    2017-09-01

    Research and practice in autism spectrum disorder (ASD) rely on quantitative measures, such as the Social Responsiveness Scale (SRS), for characterization and diagnosis. Like many ASD diagnostic measures, SRS scores are influenced by factors unrelated to ASD core features. This study further interrogates the psychometric properties of the SRS using item response theory (IRT) and demonstrates a strategy for creating a psychometrically sound short form by applying IRT results. Social Responsiveness Scale analyses were conducted on a large sample (N = 21,426) of youth from four ASD databases. Items were subjected to item factor analyses and evaluation of item bias by gender, age, expressive language level, behavior problems, and nonverbal IQ. Item selection based on item psychometric properties, differential item functioning (DIF) analyses, and substantive validity produced a reduced-item SRS short form that was unidimensional in structure, highly reliable (α = .96), and free of gender, age, expressive language, behavior problems, and nonverbal IQ influence. The short form also showed strong relationships with established measures of autism symptom severity (ADOS, ADI-R, Vineland). Degree of association between all measures varied as a function of expressive language. Results identified specific SRS items that are more vulnerable to non-ASD-related traits. The resultant 16-item SRS short form may possess superior psychometric properties compared to the original scale and emerge as a more precise measure of ASD core symptom severity, facilitating research and practice. Future research using IRT is needed to further refine existing measures of autism symptomatology. © 2017 Association for Child and Adolescent Mental Health.

  4. Experimental design optimisation: theory and application to estimation of receptor model parameters using dynamic positron emission tomography

    International Nuclear Information System (INIS)

    Delforge, J.; Syrota, A.; Mazoyer, B.M.

    1989-01-01

    General framework and various criteria for experimental design optimisation are presented. The methodology is applied to estimation of receptor-ligand reaction model parameters with dynamic positron emission tomography data. The possibility of improving parameter estimation using a new experimental design combining an injection of the β+-labelled ligand and an injection of the cold ligand is investigated. Numerical simulations predict remarkable improvement in the accuracy of parameter estimates with this new experimental design, and in particular the possibility of separate estimation of the association constant (k+1) and of the receptor density (B'max) in a single experiment. Simulation predictions are validated using experimental PET data, in which parameter uncertainties are reduced by factors ranging from 17 to 1000. (author)
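
    The design-optimisation idea can be illustrated with D-optimality for a one-parameter toy model, not the receptor-ligand model of the paper: choose the sample time that maximizes the Fisher information about the rate k in y(t) = exp(-k t), which for a single sample peaks at t = 1/k:

```python
import math

def fisher_info(times, k, sigma=1.0):
    """Fisher information about rate k for the one-parameter model
    y(t) = exp(-k t) with i.i.d. Gaussian noise: the sum of squared
    sensitivities (dy/dk)^2 = (t * exp(-k t))^2, divided by sigma^2."""
    return sum((t * math.exp(-k * t)) ** 2 for t in times) / sigma**2

k = 0.5
candidates = [0.5, 1.0, 2.0, 4.0, 8.0]   # candidate single sample times (min)
best = max(candidates, key=lambda t: fisher_info([t], k))
print(best)  # 2.0, i.e. t = 1/k
```

    For multi-parameter receptor models the scalar information becomes a matrix, and D-optimal designs maximize its determinant over the injection and sampling schedule.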

  5. Minute Impurities Contribute Significantly to Olfactory Receptor Ligand Studies: Tales from Testing the Vibration Theory.

    Science.gov (United States)

    Paoli, M; Münch, D; Haase, A; Skoulakis, E; Turin, L; Galizia, C G

    2017-01-01

    Several studies have attempted to test the vibrational hypothesis of odorant receptor activation in behavioral and physiological studies using deuterated compounds as odorants. The results have been mixed. Here, we attempted to test how deuterated compounds activate odorant receptors using calcium imaging of the fruit fly antennal lobe. We found specific activation of one area of the antennal lobe corresponding to inputs from a specific receptor. However, upon more detailed analysis, we discovered that an impurity of 0.0006% ethyl acetate in a chemical sample of benzaldehyde-d5 was entirely responsible for a sizable odorant-evoked response in Drosophila melanogaster olfactory receptor cells expressing dOr42b. Without gas chromatographic purification within the experimental setup, this impurity would have created a difference in the responses to deuterated and nondeuterated benzaldehyde, suggesting that dOr42b is a vibration-sensitive receptor, which we show here not to be the case. Our results point to a broad problem in the literature concerning the use of non-GC-pure compounds to test receptor selectivity, and we suggest how these limitations can be overcome in future studies.

  6. Validity and Reliability of Published Comprehensive Theory of Mind Tests for Normal Preschool Children: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Seyyede Zohreh Ziatabar Ahmadi

    2015-12-01

    Objective: Theory of mind (ToM), or mindreading, is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. Method: We searched MEDLINE (PubMed interface), Web of Science, ScienceDirect, PsycINFO, and evidence-based medicine (The Cochrane Library) databases from 1990 to June 2015. The search strategy was the Latin transcription of 'Theory of Mind' AND test AND children. We also manually studied the reference lists of all finally included articles and searched their references. Inclusion criteria were as follows: valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children. Exclusion criteria were as follows: studies that only used ToM tests and single tasks (false-belief tasks) for ToM assessment and/or had no description of the structure, validity, or reliability of their tests. Methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). Result: In the primary search, we found 1237 articles across all databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. Conclusion: There were few valid, reliable, and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles. The included ToM tests differed in populations, tasks, modes of presentation, scoring, modes of response, timing, and other variables. They also had varying validities and reliabilities. Therefore, it is recommended that researchers and clinicians select ToM tests according to their psychometric

  7. Can large Nc equivalence between supersymmetric Yang-Mills theory and its orbifold projections be valid?

    International Nuclear Information System (INIS)

    Kovtun, Pavel; Uensal, Mithat; Yaffe, Laurence G.

    2005-01-01

    In previous work, we found that necessary and sufficient conditions for large-N_c equivalence between parent and daughter theories, for a wide class of orbifold projections of U(N_c) gauge theories, are just the natural requirements that the discrete symmetry used to define the projection not be spontaneously broken in the parent theory, and the discrete symmetry permuting equivalent gauge group factors not be spontaneously broken in the daughter theory. In this paper, we discuss the application of this result to Z_k projections of N=1 supersymmetric Yang-Mills theory in four dimensions, as well as various multiflavor generalizations. Z_k projections with k>2 yielding chiral gauge theories violate the symmetry realization conditions needed for large-N_c equivalence, due to the spontaneous breaking of the discrete chiral symmetry in the parent super-Yang-Mills theory. But for Z_2 projections, we show that previous assertions of large-N_c inequivalence, in infinite volume, between the parent and daughter theories were based on incorrect mappings of vacuum energies, theta angles, or connected correlators between the two theories. With the correct identifications, there is no sign of any inconsistency. A subtle but essential feature of the connection between parent and daughter theories involves multivaluedness in the mapping of theta parameters from parent to daughter

  8. Retrieval of Droplet size Density Distribution from Multiple field of view Cross polarized Lidar Signals: Theory and Experimental Validation

    Science.gov (United States)

    2016-06-02

    Retrieval of droplet-size density distribution from multiple-field-of-view cross-polarized lidar signals: theory and experimental validation. Gilles Roy, Luc Bissonnette, Christian Bastille, and Gilles Vallee. Multiple-field-of-view (MFOV) secondary-polarization lidar signals are used to … use secondary polarization. A mathematical relation among the PSD, the lidar fields of view, the scattering angles, and the angular depolarization …

  9. Numerical verification/validation of the theory of coupled reactors for deuterium critical assembly, using MCNP5 and Serpent codes

    International Nuclear Information System (INIS)

    Hussein, M.S; Lewis, B.J.; Bonin, H.W.

    2013-01-01

    The theory of multipoint coupled reactors developed by multi-group transport is verified by using the probabilistic transport code MCNP5 and the continuous-energy Monte Carlo reactor physics burnup calculation Serpent code. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly, DCA. The multiplication factors k_eff calculated numerically and independently from simulations of the DCA by the MCNP5 and Serpent codes are compared with the multiplication factors k_eff calculated based on the coupled reactor theory. Excellent agreement was obtained between the multiplication factors k_eff calculated with the Serpent code, with MCNP5, and from the coupled reactor theory. This analysis demonstrates that the Serpent code is valid for multipoint coupled reactor calculations. (author)
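As a minimal sketch of the kind of bookkeeping involved (not the paper's actual multi-group formulation), multipoint coupled-reactor theory can be illustrated by treating each region's own multiplication factor together with coupling coefficients, and taking the system k_eff as the dominant eigenvalue of the resulting coupling matrix. All numerical values below are hypothetical, chosen only for illustration:

```python
# Illustrative two-point coupled-reactor sketch: region i has its own
# multiplication factor k_i, and eps[i][j] is a coupling coefficient for
# neutrons born in region j inducing fission in region i. The system k_eff
# is then the dominant eigenvalue of the coupling matrix. This is a generic
# textbook-style formulation, not the specific model verified in the paper.
import numpy as np

def system_keff(k, eps):
    """Dominant eigenvalue of diag(k) + eps for an n-region core."""
    M = np.diag(k) + np.asarray(eps)
    return max(np.linalg.eigvals(M).real)

# Hypothetical two-region example loosely analogous to the DCA's two regions:
k = [0.95, 0.95]                      # per-region multiplication factors
eps = [[0.0, 0.06], [0.06, 0.0]]      # symmetric coupling coefficients
print(round(system_keff(k, eps), 4))  # → 1.01 (coupling raises k_eff above 0.95)
```

For a symmetric two-region system the eigenvalues are simply k ± eps, which makes the effect of coupling on criticality easy to see.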

  10. Numerical verification/validation of the theory of coupled reactors for deuterium critical assembly, using MCNP5 and Serpent codes

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, M.S, E-mail: mohamed.hussein@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada); Lewis, B.J., E-mail: Brent.Lewis@uoit.ca [Univ. of Ontario Inst. of Technology, Faculty of Energy Systems and Nuclear Science, Oshawa, Ontario (Canada); Bonin, H.W., E-mail: bonin-h@rmc.ca [Royal Military College of Canada, Dept. of Chemistry and Chemical Engineering, Kingston, Ontario (Canada)

    2013-07-01

    The theory of multipoint coupled reactors developed by multi-group transport is verified by using the probabilistic transport code MCNP5 and the continuous-energy Monte Carlo reactor physics burnup calculation Serpent code. The verification was performed by calculating the multiplication factors (or criticality factors) and coupling coefficients for a two-region test reactor known as the Deuterium Critical Assembly, DCA. The multiplication factors k_eff calculated numerically and independently from simulations of the DCA by the MCNP5 and Serpent codes are compared with the multiplication factors k_eff calculated based on the coupled reactor theory. Excellent agreement was obtained between the multiplication factors k_eff calculated with the Serpent code, with MCNP5, and from the coupled reactor theory. This analysis demonstrates that the Serpent code is valid for multipoint coupled reactor calculations. (author)

  11. Investigating Postgraduate College Admission Interviews: Generalizability Theory Reliability and Incremental Predictive Validity

    Science.gov (United States)

    Arce-Ferrer, Alvaro J.; Castillo, Irene Borges

    2007-01-01

    The use of face-to-face interviews is controversial for college admissions decisions in light of the lack of availability of validity and reliability evidence for most college admission processes. This study investigated reliability and incremental predictive validity of a face-to-face postgraduate college admission interview with a sample of…

  12. Test of Achievement in Quantitative Economics for Secondary Schools: Construction and Validation Using Item Response Theory

    Science.gov (United States)

    Eleje, Lydia I.; Esomonu, Nkechi P. M.

    2018-01-01

    A Test to measure achievement in quantitative economics among secondary school students was developed and validated in this study. The test is made up 20 multiple choice test items constructed based on quantitative economics sub-skills. Six research questions guided the study. Preliminary validation was done by two experienced teachers in…

  13. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    Science.gov (United States)

    Long, Haiying

    2017-01-01

    Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…

  14. The General Necessary Condition for the Validity of Dirac's Transition Perturbation Theory

    Science.gov (United States)

    Quang, Nguyen Vinh

    1996-01-01

    For the first time, from the natural requirements for the successive approximation, the general necessary condition for the validity of Dirac's method is explicitly established. It is proved that the concept of 'the transition probability per unit time' is not valid. The 'super-platinium rules' for calculating the transition probability are derived for the case of an arbitrarily strong time-independent perturbation.

  15. The predictive validity of prospect theory versus expected utility in health utility measurement.

    Science.gov (United States)

    Abellan-Perpiñan, Jose Maria; Bleichrodt, Han; Pinto-Prades, Jose Luis

    2009-12-01

    Most health care evaluations today still assume expected utility even though the descriptive deficiencies of expected utility are well known. Prospect theory is the dominant descriptive alternative to expected utility. This paper tests whether prospect theory leads to better health evaluations than expected utility. The approach is purely descriptive: we explore how simple measurements together with prospect theory and expected utility predict choices and rankings between more complex stimuli. For decisions involving risk, prospect theory is significantly more consistent with rankings and choices than expected utility. This conclusion no longer holds when we use prospect theory utilities and expected utilities to predict intertemporal decisions. The latter finding cautions against the common assumption in health economics that health state utilities are transferable across decision contexts. Our results suggest that the standard gamble, and algorithms based on it, should not be used to value health.
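The contrast the paper draws can be illustrated with a toy calculation. The sketch below uses a simplified separable form of the original (1979) prospect theory for gains, with the standard Tversky-Kahneman (1992) functional forms and parameter values for the weighting and value functions; the health-state utilities in the example are hypothetical, and this is not the paper's estimation procedure:

```python
# Expected utility (EU) vs. a simplified prospect-theory (PT) valuation of a
# risky health prospect. prob_weight is the inverse-S weighting function,
# which overweights small probabilities and underweights moderate-to-large ones.
def prob_weight(p, gamma=0.61):
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def expected_utility(prospect):
    # prospect: list of (probability, utility) pairs
    return sum(p * u for p, u in prospect)

def prospect_value(prospect, alpha=0.88):
    # gains-only, separable form: decision weight w(p) times value u**alpha
    return sum(prob_weight(p) * u**alpha for p, u in prospect)

# Hypothetical 50-50 gamble: full health (utility 1.0) vs. a poor state (0.2)
gamble = [(0.5, 1.0), (0.5, 0.2)]
print(expected_utility(gamble))   # 0.6
print(prospect_value(gamble))     # lower, because w(0.5) < 0.5
```

The divergence between the two valuations of the same gamble is exactly the kind of systematic difference the paper exploits to test predictive validity.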

  16. receptores

    Directory of Open Access Journals (Sweden)

    Salete Regina Daronco Benetti

    2006-01-01

    Full Text Available This is an ethnographic study that aimed to interpret the system of knowledge and the meanings attributed to blood and blood transfusion by the donors and recipients of a blood bank. Information was collected through participant observation and ethnographic interviews, and domain, taxonomic and thematic analyses were performed. The cultural domains were: blood is life: source of life and valuable nourishment; religious beliefs: symbolic sources of support; blood donation: a collaborative gesture that requires self-care, gratifies and brings happiness; blood donation: a symbolic source of insecurity; being ill is a condition for receiving a blood transfusion; blood transfusion: hope of life; popular beliefs: blood transfusion as a risk to health; blood donors: blessed people; donating and receiving blood: as a meaning of happiness. Theme: 'precious liquid that originates, sustains and modifies life, and provokes fear and insecurity'.

  17. Testing odorant-receptor interaction theories in humans through discrimination of isotopomers

    Directory of Open Access Journals (Sweden)

    Mara Andrione

    2017-12-01

    Full Text Available Odour reception takes place on the olfactory receptor neuron membrane, where molecular receptors interact with volatile odorant molecules. This interaction is classically thought to rely on chemical and structural features of the odorant, e.g. size, shape, functional groups. However, this model does not allow formulating a correct prediction for the smell of an odorant, suggesting that other molecular properties may play a role in the odour transduction process. An alternative model of olfaction maintains that odorant receptors can probe not only the structural and chemical features, but also the molecular vibration spectrum of the odorants. This constitutes the so-called vibration model of olfaction. According to this model, two isotopomers of the same molecule, i.e. two forms of the same molecule, one unaltered and one in which one or more hydrogen atoms are substituted with deuterium – which are therefore structurally and chemically identical, but with different molecular vibration spectra – would interact differently with an olfactory receptor, producing different olfactory perceptions in the brain. Here, we report on a duo-trio discrimination experiment conducted on human subjects, testing isotopomer pairs that have recently been shown to be differentially encoded in the honeybee brain.

  18. Using big-data to validate theories of rehabilitation in aphasia

    Directory of Open Access Journals (Sweden)

    Swathi Kiran

    2015-05-01

    Full Text Available Introduction. While the evidence for the efficacy of rehabilitation of language disorders is fairly robust and conclusive (Allen, et al., 2012; Brady, et al., 2012; Kelly, Brady, & Enderby, 2010; Cherney, et al., 2008), a major limitation identified by these reviews is that the sample sizes of patients in each of the interventions have been modest (5-20 patients). As technology moves our field forward, we can now collect and analyze larger sets of data to validate theories of rehabilitation. As a first step, we report data from a recently completed study examining the effectiveness of a software platform (Constant Therapy) to deliver, monitor and analyze treatment for individuals with aphasia (Des Roches et al., 2015). Methods. Fifty-one individuals with language and cognitive deficits were administered standardized tests (Western Aphasia Battery, Boston Naming Test, Pyramids and Palm Trees, and Cognitive Linguistic Quick Test) prior to initiation and following completion of therapy. Forty-two experimental patients used the iPad-based therapy once a week with the clinician and up to six days a week for home practice. Nine control patients practiced therapy on the iPad once per week with the clinician only. Thirty-eight therapy tasks, divided into language and cognitive activities, were developed (Des Roches et al., 2015); 28 of these tasks included buttons that revealed a hint to help the patient answer the item. The assigned therapy tasks were tailored to each individual's language and cognitive impairment profile based on an initial baseline assessment. Each task was practiced until accuracy on the task reached 100% on multiple occasions, at which point that task was replaced with the task at the next level of difficulty. The 51 patients each completed a 10-week program, leading to a total of 3327 therapy sessions across patients. Analysis and Results: Mixed regression models showed that both the experimental and control groups improved but

  19. Measuring theory of mind across middle childhood: Reliability and validity of the Silent Films and Strange Stories tasks.

    Science.gov (United States)

    Devine, Rory T; Hughes, Claire

    2016-09-01

    Recent years have seen a growth of research on the development of children's ability to reason about others' mental states (or "theory of mind") beyond the narrow confines of the preschool period. The overall aim of this study was to investigate the psychometric properties of a task battery composed of items from Happé's Strange Stories task and Devine and Hughes' Silent Film task. A sample of 460 ethnically and socially diverse children (211 boys) between 7 and 13years of age completed the task battery at two time points separated by 1month. The Strange Stories and Silent Film tasks were strongly correlated even when verbal ability and narrative comprehension were taken into account, and all items loaded onto a single theory-of-mind latent factor. The theory-of-mind latent factor provided reliable estimates of performance across a wide range of theory-of-mind ability and showed no evidence of differential item functioning across gender, ethnicity, or socioeconomic status. The theory-of-mind latent factor also exhibited strong 1-month test-retest reliability, and this stability did not vary as a function of child characteristics. Taken together, these findings provide evidence for the validity and reliability of the Strange Stories and Silent Film task battery as a measure of individual differences in theory of mind suitable for use across middle childhood. We consider the methodological and conceptual implications of these findings for research on theory of mind beyond the preschool years. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Binding equilibrium and kinetics of membrane-anchored receptors and ligands in cell adhesion: Insights from computational model systems and theory

    Science.gov (United States)

    Weikl, Thomas R.; Hu, Jinglei; Xu, Guang-Kui; Lipowsky, Reinhard

    2016-01-01

    The adhesion of cell membranes is mediated by the binding of membrane-anchored receptor and ligand proteins. In this article, we review recent results from simulations and theory that lead to novel insights on how the binding equilibrium and kinetics of these proteins are affected by the membranes and by the membrane anchoring and molecular properties of the proteins. Simulations and theory both indicate that the binding equilibrium constant K2D and the on- and off-rate constants of anchored receptors and ligands in their 2-dimensional (2D) membrane environment strongly depend on the membrane roughness from thermally excited shape fluctuations on nanoscales. Recent theory corroborated by simulations provides a general relation between K2D and the binding constant K3D of soluble variants of the receptors and ligands that lack the membrane anchors and are free to diffuse in 3 dimensions (3D). PMID:27294442
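Once K2D is known (the review's point is that K2D itself depends on membrane roughness and anchoring), the equilibrium density of bound receptor-ligand complexes follows from ordinary 2D mass action. The sketch below solves that mass-action balance; the area concentrations and K2D value are hypothetical, and the relation between K2D and K3D from the review is not reproduced here:

```python
# 2D mass-action equilibrium for membrane-anchored binding: at equilibrium,
# B = K2D * (R_tot - B) * (L_tot - B), where B is the area density of bound
# complexes and R_tot, L_tot are total receptor/ligand densities.
import math

def bound_density(r_tot, l_tot, k2d):
    """Physical root of the quadratic K2D*B^2 - (K2D*(R+L)+1)*B + K2D*R*L = 0."""
    a = k2d
    b = -(k2d * (r_tot + l_tot) + 1.0)
    c = k2d * r_tot * l_tot
    # the smaller root is the physical one (B cannot exceed min(R_tot, L_tot))
    return (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)

r_tot, l_tot, k2d = 100.0, 80.0, 0.05   # hypothetical: molecules/um^2, um^2
B = bound_density(r_tot, l_tot, k2d)
print(round(B, 2))                       # bound complexes per um^2
```

The same calculation with a roughness-reduced K2D immediately shows how membrane fluctuations suppress the number of bound complexes.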

  1. Cultural theory and risk perception: validity and utility explored in the French context

    International Nuclear Information System (INIS)

    Brenot, J.; Bonnefous, S.; Mays, C.

    1996-01-01

    Explaining perceived risk can draw upon factors related to the person (e.g. demographics, personality, social/professional status, political orientation), or to the risk source (e.g. health impacts, economic effects). According to Cultural Theory risk perceptions are culturally biased. Wildavsky and Dake operationalised the Cultural Theory with questionnaire scales and found that resulting 'cultural profiles' best predict individual differences in risk perception. A French version of their questionnaire was inserted into a representative national risk opinion survey of May 1993; 1022 adults (age 18 and over) were interviewed. Major results are presented. The four cultural scales (hierarchy, egalitarianism, fatalism and individualism) show high correlations with political orientation as expected, but also with, for example, age, gender, income and education level. However, scale relationships to perception of risk situations (twenty, mainly technological) are not as strong as expected. Sjoeberg found similar results in Sweden. The utility of the existing operationalisation of Cultural Theory for risk perception analysis is discussed. (author)

  2. Cultural theory and risk perception: validity and utility explored in the French context

    Energy Technology Data Exchange (ETDEWEB)

    Brenot, J.; Bonnefous, S.; Mays, C. [CEA Centre d'Etudes de Fontenay-aux-Roses, 92 (France). Inst. de Protection et de Surete Nucleaire

    1996-12-31

    Explaining perceived risk can draw upon factors related to the person (e.g. demographics, personality, social/professional status, political orientation), or to the risk source (e.g. health impacts, economic effects). According to Cultural Theory risk perceptions are culturally biased. Wildavsky and Dake operationalised the Cultural Theory with questionnaire scales and found that resulting 'cultural profiles' best predict individual differences in risk perception. A French version of their questionnaire was inserted into a representative national risk opinion survey of May 1993; 1022 adults (age 18 and over) were interviewed. Major results are presented. The four cultural scales (hierarchy, egalitarianism, fatalism and individualism) show high correlations with political orientation as expected, but also with, for example, age, gender, income and education level. However, scale relationships to perception of risk situations (twenty, mainly technological) are not as strong as expected. Sjoeberg found similar results in Sweden. The utility of the existing operationalisation of Cultural Theory for risk perception analysis is discussed. (author).

  3. Determination of regional flow by use of intravascular PET tracers: microvascular theory and experimental validation for pig livers

    DEFF Research Database (Denmark)

    Munk, O L; Bass, L; Feng, H

    2003-01-01

    Today, the standard approach for the kinetic analysis of dynamic PET studies is compartment models, in which the tracer and its metabolites are confined to a few well-mixed compartments. We examine whether the standard model is suitable for modern PET data or whether theories including more physiologic realism can advance the interpretation of dynamic PET data. A more detailed microvascular theory is developed for intravascular tracers in single-capillary and multiple-capillary systems. The microvascular models, which account for concentration gradients in capillaries, are validated and compared with the standard model in a pig liver study. METHODS: Eight pigs underwent a 5-min dynamic PET study after (15)O-carbon monoxide inhalation. Throughout each experiment, hepatic arterial blood and portal venous blood were sampled, and flow was measured with transit-time flow meters. The hepatic dual

  4. JacketSE: An Offshore Wind Turbine Jacket Sizing Tool; Theory Manual and Sample Usage with Preliminary Validation

    Energy Technology Data Exchange (ETDEWEB)

    Damiani, Rick [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-02-08

    This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparing industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.

  5. A Cross-Cultural Validation of the Sequential-Simultaneous Theory of Intelligence in Children.

    Science.gov (United States)

    Moon, Soo-Back; McLean, James E.; Kaufman, Alan S.

    2003-01-01

    The Kaufman Assessment Battery for Children - Korean (K-ABC-K) was developed to assess the intelligence and achievement of preschool and school-aged Korean children. This study examined the validity of the Sequential Processing, Simultaneous Processing and Achievement scales of the K-ABC-K. The factor analyses provided strong support for the…

  6. Why Do College Students Cheat? A Structural Equation Modeling Validation of the Theory of Planned Behavior

    Science.gov (United States)

    AL-Dossary, Saeed Abdullah

    2017-01-01

    Cheating on tests is a serious problem in education. The purpose of this study was to test the efficacy of a modified form of the theory of planned behavior (TPB) to predict cheating behavior among a sample of Saudi university students. This study also sought to test the influence of cheating in high school on cheating in college within the…

  7. Experimental validation of the Wigner distributions theory of phase-contrast imaging

    International Nuclear Information System (INIS)

    Donnelly, Edwin F.; Price, Ronald R.; Pickens, David R.

    2005-01-01

    Recently, a new theory of phase-contrast imaging was proposed by Wu and Liu [Med. Phys. 31, 2378-2384 (2004)]. This theory, based upon Wigner distributions, provides a much stronger foundation for the evaluation of phase-contrast imaging systems than did the prior theories based upon Fresnel-Kirchhoff diffraction theory. In this paper, we compare phase-contrast measurements made in our laboratory for different geometries and tube voltages with the predictions of the Wu and Liu model. In our previous publications, we have used an empirical measurement (the edge enhancement index) to parametrize the degree of phase-contrast effects in an image. While the Wu and Liu model itself does not predict image contrast, it does quantify the degree of phase contrast that the system can image for a given spatial frequency. We have found that our previously published experimental results relating phase-contrast effects to geometry and x-ray tube voltage are consistent with the predictions of the Wu and Liu model

  8. Aromatic interactions impact ligand binding and function at serotonin 5-HT2C G protein-coupled receptors: receptor homology modelling, ligand docking, and molecular dynamics results validated by experimental studies

    Science.gov (United States)

    Córdova-Sintjago, Tania; Villa, Nancy; Fang, Lijuan; Booth, Raymond G.

    2014-02-01

    The serotonin (5-hydroxytryptamine, 5-HT) 5-HT2 G protein-coupled receptor (GPCR) family consists of types 2A, 2B, and 2C that share ∼75% transmembrane (TM) sequence identity. Agonists for 5-HT2C receptors are under development for psychoses; whereas, at 5-HT2A receptors, antipsychotic effects are associated with antagonists - in fact, 5-HT2A agonists can cause hallucinations and 5-HT2B agonists cause cardiotoxicity. It is known that 5-HT2A TM6 residues W6.48, F6.51, and F6.52 impact ligand binding and function; however, ligand interactions with these residues at the 5-HT2C receptor have not been reported. To predict and validate molecular determinants for 5-HT2C-specific activation, results from receptor homology modelling, ligand docking, and molecular dynamics simulation studies were compared with experimental results for ligand binding and function at wild type and W6.48A, F6.51A, and F6.52A point-mutated 5-HT2C receptors.

  9. Dopamine Receptor D4 Gene Variation Predicts Preschoolers' Developing Theory of Mind

    Science.gov (United States)

    Lackner, Christine; Sabbagh, Mark A.; Hallinan, Elizabeth; Liu, Xudong; Holden, Jeanette J. A.

    2012-01-01

    Individual differences in preschoolers' understanding that human action is caused by internal mental states, or representational theory of mind (RTM), are heritable, as are developmental disorders such as autism in which RTM is particularly impaired. We investigated whether polymorphisms of genes affecting dopamine (DA) utilization and metabolism…

  10. Experimental validation of the Maxwell-Stefan theory for the description of liquid-side mass transfer-absorption of NH3 in water using a stirred cell.

    NARCIS (Netherlands)

    Frank, M.J.W.; Frank, M.J.W.; Kuipers, J.A.M.; van Swaaij, Willibrordus Petrus Maria

    1996-01-01

    The main goal of this paper is to demonstrate the validity of the Maxwell-Stefan theory for the description of liquid-phase mass transport processes in a binary mixture. To critically test this theory, absorption experiments of ammonia in water were conducted in a stirred cell. The flux model

  11. Dislocation-mediated strain hardening in tungsten: Thermo-mechanical plasticity theory and experimental validation

    Science.gov (United States)

    Terentyev, Dmitry; Xiao, Xiazi; Dubinko, A.; Bakaeva, A.; Duan, Huiling

    2015-12-01

    A self-consistent thermo-mechanical model to study the strain-hardening behavior of polycrystalline tungsten was developed and validated by a dedicated experimental route. Dislocation-dislocation multiplication and storage, as well as dislocation-grain boundary (GB) pinning, were the major mechanisms underlying the evolution of plastic deformation, thus providing a link between the strain-hardening behavior and the material's microstructure. The microstructure of the polycrystalline tungsten samples was thoroughly investigated by scanning and transmission electron microscopy. The model was applied to compute stress-strain loading curves of commercial tungsten grades, in the as-received and as-annealed states, in the temperature range of 500-1000 °C. Fitted to independent experimental results obtained using a single crystal and as-received polycrystalline tungsten, the model demonstrated its capability to predict the deformation behavior of as-annealed samples over a wide range of temperature and applied strain. The relevance of the dislocation-mediated plasticity mechanisms used in the model has been validated using transmission electron microscopy examination of samples deformed to different amounts of strain. On the basis of the experimental validation, the limitations of the model are determined and discussed.

  12. Bridging the gap between neurocognitive processing theory and performance validity assessment among the cognitively impaired: a review and methodological approach.

    Science.gov (United States)

    Leighton, Angela; Weinborn, Michael; Maybery, Murray

    2014-10-01

    Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.

  13. The nutrition for sport knowledge questionnaire (NSKQ): development and validation using classical test theory and Rasch analysis.

    Science.gov (United States)

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-01-01

    Appropriate dietary intake can have a significant influence on athletic performance. There is a growing consensus on sports nutrition, and professionals working with athletes often provide dietary education. However, due to the limitations of existing sports nutrition knowledge questionnaires, previous reports of athletes' nutrition knowledge may be inaccurate. An updated questionnaire has been developed based on a recent review of sports nutrition guidelines. The tool has been validated using a robust methodology that incorporates relevant techniques from classical test theory (CTT) and item response theory (IRT), namely Rasch analysis. The final questionnaire has 89 questions and six sub-sections (weight management, macronutrients, micronutrients, sports nutrition, supplements, and alcohol). The content and face validity of the tool have been confirmed based on feedback from expert sports dietitians and university sports students, respectively. The internal reliability of the questionnaire as a whole is high (KR = 0.88), and most sub-sections achieved an acceptable internal reliability. Construct validity has been confirmed, with an independent t-test revealing a significant (p < 0.001) difference in knowledge scores between nutrition (64 ± 16%) and non-nutrition students (51 ± 19%). Test-retest reliability has been assured, with a strong correlation (r = 0.92, p < 0.001) between individuals' scores on two attempts of the test, 10 days to 2 weeks apart. Three of the sub-sections fit the Rasch Unidimensional Model. The final version of the questionnaire represents a significant improvement over previous tools. Each nutrition sub-section is unidimensional, and therefore researchers and practitioners can use these individually, as required. Use of the questionnaire will allow researchers to draw conclusions about the effectiveness of nutrition education programs, and differences in knowledge across athletes of varying ages, genders, and athletic

  14. Validation of an Instrument for Assessing Conceptual Change with Respect to the Theory of Evolution by Secondary Biology Students

    Science.gov (United States)

    Goff, Kevin David

    This pilot study evaluated the validity of a new quantitative, closed-response instrument for assessing student conceptual change regarding the theory of evolution. The instrument has two distinguishing design features. First, it is designed not only to gauge student mastery of the scientific model of evolution, but also to elicit a trio of deeply intuitive tendencies that are known to compromise many students' understanding: the projection of intentional agency, teleological directionality, and immutable essences onto biological phenomena. Second, in addition to a section of conventional multiple choice questions, the instrument contains a series of items where students may simultaneously endorse both scientifically normative propositions and intuitively appealing yet unscientific propositions, without having to choose between them. These features allow for the hypothesized possibility that the three intuitions are partly innate, themselves products of cognitive evolution in our hominin ancestors, and thus may continue to inform students' thinking even after instruction and conceptual change. The test was piloted with 340 high school students from diverse schools and communities. Confirmatory factor analysis and other statistical methods provided evidence that the instrument already has strong potential for validly distinguishing students who hold a correct scientific understanding from those who do not, but that revision and retesting are needed to render it valid for gauging students' adherence to intuitive misconceptions. Ultimately the instrument holds promise as a tool for classroom intervention studies by conceptual change researchers, for diagnostic testing and data gathering by instructional leaders, and for provoking classroom dialogue and debate by science teachers.

  15. Measuring health workers' motivation composition: validation of a scale based on Self-Determination Theory in Burkina Faso.

    Science.gov (United States)

    Lohmann, Julia; Souares, Aurélia; Tiendrebéogo, Justin; Houlfort, Nathalie; Robyn, Paul Jacob; Somda, Serge M A; De Allegri, Manuela

    2017-05-22

    Although motivation of health workers in low- and middle-income countries (LMICs) has become a topic of increasing interest to policy makers and researchers in recent years, many aspects are not well understood to date. This is partly due to a lack of appropriate measurement instruments. This article presents evidence on the construct validity of a psychometric scale developed to measure motivation composition, i.e., the extent to which motivation of different origin within and outside of a person contributes to their overall work motivation. It is theoretically grounded in Self-Determination Theory (SDT). We conducted a cross-sectional survey of 1142 nurses in 522 government health facilities in 24 districts of Burkina Faso. We assessed the scale's validity in a confirmatory factor analysis framework, investigating whether the scale measures what it was intended to measure (content, structural, and convergent/discriminant validity) and whether it does so equally well across health worker subgroups (measurement invariance). Our results show that the scale measures a slightly modified version of the SDT continuum of motivation well. Measurements were overall comparable between subgroups, but the results indicate that caution is warranted if a comparison of motivation scores between groups is the focus of analysis. The scale is a valuable addition to the repository of measurement tools for health worker motivation in LMICs. We expect it to prove useful in the quest for a more comprehensive understanding of motivation as well as of the effects and potential side effects of interventions intended to enhance motivation.

  16. A multidimensional assessment of the validity and utility of alcohol use disorder severity as determined by item response theory models.

    Science.gov (United States)

    Dawson, Deborah A; Saha, Tulshi D; Grant, Bridget F

    2010-02-01

    The relative severity of the 11 DSM-IV alcohol use disorder (AUD) criteria is represented by their severity threshold scores, item response theory (IRT) model parameters inversely proportional to their prevalence. These scores can be used to create a continuous severity measure comprising the total number of criteria endorsed, each weighted by its relative severity. This paper assesses the validity of the severity ranking of the 11 criteria and the overall severity score with respect to known AUD correlates, including alcohol consumption, psychological functioning, family history, antisociality, and early initiation of drinking, in a representative population sample of U.S. past-year drinkers (n=26,946). The unadjusted mean values for all validating measures increased steadily with the severity threshold score, except that legal problems, the criterion with the highest score, was associated with lower values than expected. After adjusting for the total number of criteria endorsed, this direct relationship was no longer evident. The overall severity score was no more highly correlated with the validating measures than a simple count of criteria endorsed, nor did the two measures yield different risk curves. This reflects both within-criterion variation in severity and the fact that the number of criteria endorsed and their severity are so highly correlated that the severity weighting is essentially redundant. Attempts to formulate a scalar measure of AUD will do as well by relying on simple counts of criteria or symptom items as by using scales weighted by IRT measures of severity. Published by Elsevier Ireland Ltd.
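
    The two measures being contrasted above can be sketched as follows. The criterion names and threshold weights here are hypothetical stand-ins, not the published DSM-IV AUD estimates; the point is only the mechanics of a severity-weighted score versus a simple count.

```python
# Hypothetical IRT severity threshold scores: common criteria get low
# (negative) thresholds, rare criteria get high (positive) thresholds.
severity_thresholds = {
    "larger_longer": -1.2,    # commonly endorsed -> low severity threshold
    "tolerance": -0.9,
    "withdrawal": 0.4,
    "legal_problems": 1.8,    # rarely endorsed -> high severity threshold
}

def simple_count(endorsed):
    """Unweighted count of endorsed criteria."""
    return len(endorsed)

def weighted_severity(endorsed):
    """Each endorsed criterion contributes its severity threshold score."""
    return sum(severity_thresholds[c] for c in endorsed)

mild = ["larger_longer", "tolerance"]
severe = ["larger_longer", "tolerance", "withdrawal", "legal_problems"]
print(simple_count(mild), weighted_severity(mild))
print(simple_count(severe), weighted_severity(severe))
```

Because rarer criteria are almost always endorsed on top of common ones, the two scores rank respondents nearly identically across a population, which is the redundancy the study reports.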

  17. Validation of quantitative brain dopamine D2 receptor imaging with a conventional single-head SPET camera

    International Nuclear Information System (INIS)

    Nikkinen, P.; Liewendahl, K.; Savolainen, S.; Launes, J.

    1993-01-01

    Phantom measurements were performed with a conventional single-head single-photon emission tomography (SPET) camera in order to validate the relevance of the basal ganglia/frontal cortex iodine-123 iodobenzamide (IBZM) uptake ratios measured in patients. Inside a cylindrical phantom (diameter 22 cm), two cylinders with a diameter of 3.3 cm were inserted. The activity concentrations of the cylinders ranged from 6.0 to 22.6 kBq/ml and the cylinder/background activity ratios varied from 1.4 to 3.8. From reconstructed SPET images the cylinder/background activity ratios were calculated using three different regions of interest (ROIs). A linear relationship between the measured activity ratio and the true activity ratio was obtained. In patient studies, basal ganglia/frontal cortex IBZM uptake ratios determined from the reconstructed slices using attenuation correction prior to reconstruction were 1.30 ±0.03 in idiopathic Parkinson's disease (n = 9), 1.33 ±0.09 in infantile and juvenile neuronal ceroid lipofuscinosis (n = 7) and 1.34 ±0.05 in narcolepsy (n = 8). Patients with Huntington's disease had significantly lower ratios (1.09 ±0.04, n = 5). The corrected basal ganglia/frontal cortex ratios, determined using linear regression, were about 80% higher. The use of dual-window scatter correction increased the measured ratios by about 10%. Although comprehensive correction methods can further improve the resolution in SPET images, the resolution of the SPET system used by us (1.5-2 cm) will determine what is achievable in basal ganglia D2 receptor imaging. (orig.)
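
    The linear-regression correction described above can be illustrated with a small least-squares sketch: fit measured versus true ratios from the phantom, then invert the fit to correct patient ratios. All phantom pairs and the example patient ratio below are invented for illustration, not data from the study.

```python
# Ordinary least-squares fit of a calibration line, then inversion of the
# line to recover a "true" uptake ratio from a measured one.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical phantom data: true cylinder/background ratio vs. measured
# ratio (the partial-volume effect flattens measured ratios toward 1).
true_ratios = [1.4, 2.0, 2.8, 3.8]
measured    = [1.22, 1.55, 2.00, 2.55]

slope, intercept = fit_line(true_ratios, measured)

def correct(measured_ratio):
    """Invert the calibration: estimate the true ratio from a measured one."""
    return (measured_ratio - intercept) / slope

print(correct(1.30))  # e.g. a patient basal ganglia/frontal cortex ratio
```

Because the fitted slope is well below 1, corrected ratios come out substantially higher than measured ones, which is the qualitative effect (corrected ratios about 80% higher) reported in the abstract.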

  18. Validation of quantitative brain dopamine D2 receptor imaging with a conventional single-head SPET camera

    Energy Technology Data Exchange (ETDEWEB)

    Nikkinen, P [Helsinki Univ. (Finland). Dept. of Clinical Chemistry; Liewendahl, K [Helsinki Univ. (Finland). Dept. of Clinical Chemistry; Savolainen, S [Helsinki Univ. (Finland). Dept. of Physics; Launes, J [Helsinki Univ. (Finland). Dept. of Neurology

    1993-08-01

    Phantom measurements were performed with a conventional single-head single-photon emission tomography (SPET) camera in order to validate the relevance of the basal ganglia/frontal cortex iodine-123 iodobenzamide (IBZM) uptake ratios measured in patients. Inside a cylindrical phantom (diameter 22 cm), two cylinders with a diameter of 3.3 cm were inserted. The activity concentrations of the cylinders ranged from 6.0 to 22.6 kBq/ml and the cylinder/background activity ratios varied from 1.4 to 3.8. From reconstructed SPET images the cylinder/background activity ratios were calculated using three different regions of interest (ROIs). A linear relationship between the measured activity ratio and the true activity ratio was obtained. In patient studies, basal ganglia/frontal cortex IBZM uptake ratios determined from the reconstructed slices using attenuation correction prior to reconstruction were 1.30 ±0.03 in idiopathic Parkinson's disease (n = 9), 1.33 ±0.09 in infantile and juvenile neuronal ceroid lipofuscinosis (n = 7) and 1.34 ±0.05 in narcolepsy (n = 8). Patients with Huntington's disease had significantly lower ratios (1.09 ±0.04, n = 5). The corrected basal ganglia/frontal cortex ratios, determined using linear regression, were about 80% higher. The use of dual-window scatter correction increased the measured ratios by about 10%. Although comprehensive correction methods can further improve the resolution in SPET images, the resolution of the SPET system used by us (1.5-2 cm) will determine what is achievable in basal ganglia D2 receptor imaging. (orig.)

  19. Cognitive-behavioural theories of helplessness/hopelessness: valid models of depression?

    Science.gov (United States)

    Henkel, V; Bussfeld, P; Möller, H-J; Hegerl, U

    2002-10-01

    Helplessness and hopelessness are central aspects of cognitive-behavioural explanations for the development and persistence of depression. In this article a general overview of the evolution of these approaches to depression is provided, including a critical examination of the theories. The review of the literature suggests that the cognitive models describing helplessness/hopelessness as trait factors mediating depression do not have a strong empirical base. The majority of the studies were conducted in healthy or only mildly depressed subjects, so there is little justification for broad generalisations beyond the populations studied. Some of the reported studies did not test the underlying theories adequately (e.g., correlation was sometimes interpreted as causation, and adequate prospective longitudinal study designs were seldom applied). Moreover, the theoretical models cannot generally explain all depressive features (e.g., the possibility of a spontaneous shift into a manic episode). Despite these limitations, the learned helplessness paradigm has had a substantial impact on preclinical research into the neurobiological correlates of depressive states. Last but not least, the models are of high interest with respect to the theoretical background of important modules of cognitive-behavioural therapy and its acute and prophylactic effects.

  20. Development and validation of a short-lag spatial coherence theory for photoacoustic imaging

    Science.gov (United States)

    Graham, Michelle T.; Lediju Bell, Muyinatu A.

    2018-02-01

    We previously derived a spatial coherence theory for studying the theoretical properties of Short-Lag Spatial Coherence (SLSC) beamforming applied to photoacoustic images. In this paper, our newly derived theoretical equation is evaluated to generate SLSC images of a point target and a 1.2 mm diameter target, along with the corresponding lateral profiles. We compared SLSC images simulated solely from our theory to SLSC images created by beamforming acoustic channel data from k-Wave simulations of a 1.2 mm diameter disc target. This process was repeated for a point target, and the full width at half maximum of the signal amplitude was measured to estimate the resolution of each imaging system. Resolution as a function of lag was comparable for the first 10% of the receive aperture (i.e., the short-lag region), after which resolution measurements diverged by a maximum of 1 mm between the two types of simulated images. These results indicate the potential for both simulation methods to be utilized as independent resources to study coherence-based photoacoustic beamformers when imaging point-like targets.
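
    The SLSC metric under study sums the normalized spatial correlation of receive-channel signals over the first few lags (the "short-lag" region, roughly the first 10% of the aperture). A minimal sketch on synthetic channel data, assuming the standard SLSC definition rather than the paper's exact implementation:

```python
import numpy as np

# Synthetic channel data for one pixel: a coherent signal common to all
# receive elements, plus uncorrelated per-channel noise.
rng = np.random.default_rng(0)
n_channels, n_depth = 64, 40
common = rng.standard_normal(n_depth)                              # coherent part
channels = common + 0.5 * rng.standard_normal((n_channels, n_depth))  # + noise

def spatial_coherence(data, lag):
    """Average normalized correlation between channel pairs separated by `lag`."""
    vals = []
    for i in range(data.shape[0] - lag):
        a, b = data[i], data[i + lag]
        vals.append(np.dot(a, b) / np.sqrt(np.dot(a, a) * np.dot(b, b)))
    return float(np.mean(vals))

def slsc(data, max_lag):
    """Sum the coherence curve over lags 1..max_lag (the short-lag region)."""
    return sum(spatial_coherence(data, m) for m in range(1, max_lag + 1))

# "Short lag" here: first ~10% of the 64-element receive aperture.
print(slsc(channels, max_lag=6))
```

A coherent target yields a high SLSC value while incoherent noise averages toward zero, which is why SLSC pixel values serve as a contrast mechanism in photoacoustic imaging.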

  1. On post-inflation validity of perturbation theory in Horndeski scalar-tensor models

    Energy Technology Data Exchange (ETDEWEB)

    Germani, Cristiano [Institut de Ciències del Cosmos (ICCUB), Universitat de Barcelona, Martí Franquès 1, E08028 Barcelona (Spain); Kudryashova, Nina [Arnold Sommerfeld Center, Ludwig-Maximilians-University, Theresienstr. 37, 80333 Muenchen (Germany); Watanabe, Yuki, E-mail: germani@icc.ub.edu, E-mail: nina.kudryashova@campus.lmu.de, E-mail: yuki.watanabe@nat.gunma-ct.ac.jp [Department of Physics, National Institute of Technology, Gunma College, Gunma 371-8530 (Japan)

    2016-08-01

    Using the Newtonian gauge, we re-confirm that, as in the minimal case, the re-scaled Mukhanov-Sasaki variable is conserved, leading to a constraint equation for the Newtonian potential. However, in contrast to the minimal case, in Horndeski theories the super-horizon Newtonian potential can grow to very large values after the end of inflation. If that happens, inflationary predictability is lost during the oscillating period. When this does not happen, the perturbations generated during inflation can be related to the CMB in the standard way, provided the theory chosen is minimal at low energies. As a concrete example, we analytically and numerically discuss the new Higgs inflationary case. There, the inflaton is the Higgs boson, which is non-minimally kinetically coupled to gravity. During the high-energy part of the post-inflationary oscillations, the system is anisotropic and the Newtonian potential is largely amplified. Thanks to the smallness of today's amplitude of curvature perturbations, however, the system stays in the linear regime, so that inflationary predictions are not lost. At low energies, when the system relaxes to the minimal case, the anisotropies disappear and the Newtonian potential converges to a constant value. We show that this constant value is related to the frozen part of the curvature perturbations during inflation, precisely as in the minimal case.

  2. Empirical validation of a real options theory based method for optimizing evacuation decisions within chemical plants.

    Science.gov (United States)

    Reniers, G L L; Audenaert, A; Pauwels, N; Soudan, K

    2011-02-15

    This article empirically assesses and validates a methodology for making evacuation decisions in case of major fire accidents in chemical clusters. A number of empirical results are presented, processed and discussed with respect to the implications and management of evacuation decisions in chemical companies. It is shown that in realistic industrial settings, suboptimal interventions may result when the prospect of obtaining additional information at later stages of the decision process is ignored. The empirical results also show that the implications of interventions, as well as the time and workforce required to complete particular shutdown activities, may differ greatly from one company to another. Therefore, to be optimal from an economic viewpoint, it is essential that precautionary evacuation decisions are tailor-made per company. Copyright © 2010 Elsevier B.V. All rights reserved.

  3. Combined Heat Transfer in High-Porosity High-Temperature Fibrous Insulations: Theory and Experimental Validation

    Science.gov (United States)

    Daryabeigi, Kamran; Cunnington, George R.; Miller, Steve D.; Knutson, Jeffry R.

    2010-01-01

    Combined radiation and conduction heat transfer through various high-temperature, high-porosity, unbonded (loose) fibrous insulations was modeled based on first principles. The diffusion approximation was used for modeling the radiation component of heat transfer in the optically thick insulations. The relevant parameters needed for the heat transfer model were derived from experimental data. Semi-empirical formulations were used to model the solid conduction contribution of heat transfer in fibrous insulations with the relevant parameters inferred from thermal conductivity measurements at cryogenic temperatures in a vacuum. The specific extinction coefficient for radiation heat transfer was obtained from high-temperature steady-state thermal measurements with large temperature gradients maintained across the sample thickness in a vacuum. Standard gas conduction modeling was used in the heat transfer formulation. This heat transfer modeling methodology was applied to silica, two types of alumina, and a zirconia-based fibrous insulation, and to a variation of opacified fibrous insulation (OFI). OFI is a class of insulations manufactured by embedding efficient ceramic opacifiers in various unbonded fibrous insulations to significantly attenuate the radiation component of heat transfer. The heat transfer modeling methodology was validated by comparison with more rigorous analytical solutions and with standard thermal conductivity measurements. The validated heat transfer model is applicable to various densities of these high-porosity insulations as long as the fiber properties are the same (index of refraction, size distribution, orientation, and length). Furthermore, the heat transfer data for these insulations can be obtained at any static pressure in any working gas environment without the need to perform tests in various gases at various pressures.
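
    In the diffusion (optically thick) approximation referred to above, the radiative contribution to the effective thermal conductivity of a fibrous insulation takes the textbook form k_rad = 16*sigma*n^2*T^3 / (3*rho*e), where rho is the bulk density and e the specific extinction coefficient. The sketch below uses this generic form with illustrative parameter values, not the paper's fitted model or data.

```python
# Radiative conductivity in the diffusion approximation for an optically
# thick medium. The extinction of the medium is rho * e (1/m), with rho the
# bulk density (kg/m^3) and e the specific extinction coefficient (m^2/kg).

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def k_radiative(temp_k, density, specific_extinction, refractive_index=1.0):
    """k_rad = 16 * sigma * n^2 * T^3 / (3 * rho * e), in W/(m K)."""
    return (16.0 * SIGMA * refractive_index**2 * temp_k**3
            / (3.0 * density * specific_extinction))

# Illustrative values: 1000 K, density 50 kg/m^3, specific extinction 30 m^2/kg
print(k_radiative(1000.0, 50.0, 30.0))
```

The strong T^3 dependence is why opacified insulations (OFI), which raise the extinction coefficient, suppress heat transfer mainly at high temperatures.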

  4. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Yidong [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andrs, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Martineau, Richard Charles [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-08-01

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method for solving compressible fluid flow problems is presented next, along with a Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration. The multi-fluid formulation is still under development; although it is not yet complete, BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility of the underlying MOOSE framework, BIGHORN is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent of this suite is to provide baseline comparison data that demonstrate the performance of the BIGHORN solution methods on problems varying in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and to suggest best practices when using BIGHORN.

  5. Theory and validation of a liquid radiation filter greenhouse simulation for performance prediction

    International Nuclear Information System (INIS)

    Feuermann, D.; Kopel, R.; Zeroni, M.; Levi, S.; Gale, J.

    1997-01-01

    A greenhouse is described which has a selectively absorbing liquid radiation filter (LRF) circulating in a double-layered cladding. The filter removes much of the near-infrared wave band of solar radiation (above 700 nm) while transmitting most of the photosynthetic radiation (400-700 nm). This greatly reduces the heat input to the greenhouse and, by transferring heat from day to night, facilitates better temperature control. This is particularly important for CO2 fertilization, which requires that the greenhouse remain closed during daylight hours. A computer simulation model was developed to study the relationship between the design parameters of such an LRF greenhouse and its thermal performance under different climatic conditions. The model was based on a small number of governing equations describing the major physical phenomena responsible for the greenhouse climate. Validation of the simulation was performed with data from a 330 m2 LRF greenhouse operating in the Negev (Israel) desert highlands. The predicted greenhouse temperatures were found to agree with measured values to within one to two degrees Celsius. The performance of an LRF greenhouse and of a conventional greenhouse were compared using the simulation and hourly meteorological data for central Israel. For the summer season of May to October, the number of daylight hours during which the LRF greenhouse could remain closed was about two-thirds greater than for the conventional greenhouse.

  6. Modeling of mitochondria bioenergetics using a composable chemiosmotic energy transduction rate law: theory and experimental validation.

    Directory of Open Access Journals (Sweden)

    Ivan Chang

    Full Text Available Mitochondrial bioenergetic processes are central to the production of cellular energy, and a decrease in the expression or activity of enzyme complexes responsible for these processes can result in energetic deficit that correlates with many metabolic diseases and aging. Unfortunately, existing computational models of mitochondrial bioenergetics either lack relevant kinetic descriptions of the enzyme complexes, or incorporate mechanisms too specific to a particular mitochondrial system and are thus incapable of capturing the heterogeneity associated with these complexes across different systems and system states. Here we introduce a new composable rate equation, the chemiosmotic rate law, that expresses the flux of a prototypical energy transduction complex as a function of: the saturation kinetics of the electron donor and acceptor substrates; the redox transfer potential between the complex and the substrates; and the steady-state thermodynamic force-to-flux relationship of the overall electro-chemical reaction. Modeling of bioenergetics with this rate law has several advantages: (1) it minimizes the use of arbitrary free parameters while featuring biochemically relevant parameters that can be obtained through progress curves of common enzyme kinetics protocols; (2) it is modular and can adapt to various enzyme complex arrangements for both in vivo and in vitro systems via transformation of its rate and equilibrium constants; (3) it provides a clear association between the sensitivity of the parameters of the individual complexes and the sensitivity of the system's steady-state. To validate our approach, we conduct in vitro measurements of ETC complex I, III, and IV activities using rat heart homogenates, and construct an estimation procedure for the parameter values directly from these measurements. In addition, we show the theoretical connections of our approach to the existing models, and compare the predictive accuracy of the rate law with

  7. Modeling of mitochondria bioenergetics using a composable chemiosmotic energy transduction rate law: theory and experimental validation.

    Science.gov (United States)

    Chang, Ivan; Heiske, Margit; Letellier, Thierry; Wallace, Douglas; Baldi, Pierre

    2011-01-01

    Mitochondrial bioenergetic processes are central to the production of cellular energy, and a decrease in the expression or activity of enzyme complexes responsible for these processes can result in energetic deficit that correlates with many metabolic diseases and aging. Unfortunately, existing computational models of mitochondrial bioenergetics either lack relevant kinetic descriptions of the enzyme complexes, or incorporate mechanisms too specific to a particular mitochondrial system and are thus incapable of capturing the heterogeneity associated with these complexes across different systems and system states. Here we introduce a new composable rate equation, the chemiosmotic rate law, that expresses the flux of a prototypical energy transduction complex as a function of: the saturation kinetics of the electron donor and acceptor substrates; the redox transfer potential between the complex and the substrates; and the steady-state thermodynamic force-to-flux relationship of the overall electro-chemical reaction. Modeling of bioenergetics with this rate law has several advantages: (1) it minimizes the use of arbitrary free parameters while featuring biochemically relevant parameters that can be obtained through progress curves of common enzyme kinetics protocols; (2) it is modular and can adapt to various enzyme complex arrangements for both in vivo and in vitro systems via transformation of its rate and equilibrium constants; (3) it provides a clear association between the sensitivity of the parameters of the individual complexes and the sensitivity of the system's steady-state. To validate our approach, we conduct in vitro measurements of ETC complex I, III, and IV activities using rat heart homogenates, and construct an estimation procedure for the parameter values directly from these measurements. 
In addition, we show the theoretical connections of our approach to the existing models, and compare the predictive accuracy of the rate law with our experimentally

  8. Localized multi-scale energy and vorticity analysis. II. Finite-amplitude instability theory and validation

    Science.gov (United States)

    San Liang, X.; Robinson, Allan R.

    2007-12-01

    Reynolds stress framework. Using the techniques of marginalization and localization, this work sets up an example for the generalization of certain geophysical fluid dynamics theories for more generic purposes.

  9. Hierarchical path planning and control of a small fixed-wing UAV: Theory and experimental validation

    Science.gov (United States)

    Jung, Dongwon

    2007-12-01

    problem is formulated by setting up geometric linear constraints as well as boundary conditions. Subsequently, we construct B-spline path templates by solving a set of distinct optimization problems. For application in UAV motion planning, the path templates are incorporated to replace parts of the entire path by the smooth B-spline paths. Each path segment is stitched together while preserving continuity to obtain a final smooth reference path to be used for path following control. The path following control for a small fixed-wing UAV to track the prescribed smooth reference path is also addressed. Assuming the UAV is equipped with an autopilot for low level control, we adopt a kinematic error model with respect to the moving Serret-Frenet frame attached to a path for tracking controller design. A kinematic path following control law that commands heading rate is presented. Backstepping is applied to derive the roll angle command by taking into account the approximate closed-loop roll dynamics. A parameter adaptation technique is employed to account for the inaccurate time constant of the closed-loop roll dynamics during actual implementation. Finally, we implement the proposed hierarchical path control of a small UAV on the actual hardware platform, which is based on a 1/5 scale R/C model airframe (Decathlon) and the autopilot hardware and software. Based on the hardware-in-the-loop (HIL) simulation environment, the proposed hierarchical path control algorithm has been validated through on-line, real-time implementation on a small micro-controller. Through a seamless integration of the control algorithms for path planning, path smoothing, and path following, it has been demonstrated that the UAV equipped with a small autopilot having limited computational resources manages to accomplish the path control objective to reach the goal while avoiding obstacles with minimal human intervention.

  10. Smart Aquifer Characterisation validated using Information Theory and Cost benefit analysis

    Science.gov (United States)

    Moore, Catherine

    2016-04-01

    The field data acquisition required to characterise aquifer systems is time consuming and expensive. Decisions regarding field testing, the type of field measurements to make, and the spatial and temporal resolution of measurements have significant cost repercussions and impact the accuracy of various predictive simulations. The Smart Aquifer Characterisation (SAC) research programme (New Zealand (NZ)) addresses this issue by assembling and validating a suite of innovative methods for characterising groundwater systems at the large, regional and national scales. The primary outcome is a suite of cost-effective tools and procedures provided to resource managers to advance the understanding and management of groundwater systems and thereby assist decision makers and communities in the management of their groundwater resources, including the setting of land use limits that protect fresh water flows and quality and the ecosystems dependent on that fresh water. The programme has focused on novel investigation approaches including the use of geophysics, satellite remote sensing, temperature sensing and age dating. The SMART (Save Money And Reduce Time) aspect of the programme emphasises techniques that use these passive, cost-effective data sources to characterise groundwater systems at both the aquifer and the national scale by:
    • Determination of aquifer hydraulic properties
    • Determination of aquifer dimensions
    • Quantification of fluxes between ground waters and surface water
    • Groundwater age dating
    These methods allow either a lower-cost means of estimating these properties and fluxes, or a greater spatial and temporal coverage for the same cost. To demonstrate the cost effectiveness of the methods, a 'data worth' analysis is undertaken. The data worth method involves quantification of the utility of observation data in terms of how much it reduces the uncertainty of model parameters and of decision-focused predictions which depend on these parameters. Such

  11. Novel and validated titrimetric method for determination of selected angiotensin-II-receptor antagonists in pharmaceutical preparations and its comparison with UV spectrophotometric determination

    Directory of Open Access Journals (Sweden)

    Shrikant H. Patil

    2012-12-01

    Full Text Available A novel and simple titrimetric method for the determination of commonly used angiotensin-II-receptor antagonists (ARA-IIs) is developed and validated. The direct acid-base titration of four ARA-IIs, namely eprosartan mesylate, irbesartan, telmisartan and valsartan, was carried out in a mixture of ethanol:water (1:1) as solvent, using standardized aqueous sodium hydroxide solution as titrant, either visually using phenolphthalein as an indicator or potentiometrically using a combined pH electrode. The method was found to be accurate and precise, with a relative standard deviation of less than 2% for all ARA-IIs studied. It was also shown that the method could be successfully applied to the assay of commercial pharmaceuticals containing the above-mentioned ARA-IIs. The validity of the method was tested by recovery studies of standard additions to pharmaceuticals, and the results were found to be satisfactory. Results obtained by this method were in good agreement with those obtained by the UV spectrophotometric method. For UV spectrophotometric analysis, ethanol was used as the solvent, and wavelengths of 233 nm, 246 nm, 296 nm and 250 nm were selected for the determination of eprosartan mesylate, irbesartan, telmisartan and valsartan, respectively. The proposed titrimetric method is simple, rapid, convenient and sufficiently precise for quality control purposes. Keywords: Angiotensin-II-receptor antagonists, Titrimetric assay, UV spectrophotometry, Validation

  12. Importance of the pharmacological profile of the bound ligand in enrichment on nuclear receptors: toward the use of experimentally validated decoy ligands.

    Science.gov (United States)

    Lagarde, Nathalie; Zagury, Jean-François; Montes, Matthieu

    2014-10-27

    The evaluation of virtual ligand screening methods is of major importance to ensure their reliability. Taking into account the agonist/antagonist pharmacological profile should improve the quality of the benchmarking data sets since ligand binding can induce conformational changes in the nuclear receptor structure and such changes may vary according to the agonist/antagonist ligand profile. We indeed found that splitting the agonist and antagonist ligands into two separate data sets for a given nuclear receptor target significantly enhances the quality of the evaluation. The pharmacological profile of the ligand bound in the binding site of the target structure was also found to be an additional critical parameter. We also illustrate that active compound data sets for a given pharmacological activity can be used as a set of experimentally validated decoy ligands for another pharmacological activity to ensure a reliable and challenging evaluation of virtual screening methods.
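    The benchmarking idea (checking that a screening method ranks actives above decoys, with active compounds of the opposite pharmacological profile serving as experimentally validated decoys) can be illustrated with a toy ROC AUC computation. All scores below are invented.

```python
# ROC AUC as the probability that a random active outscores a random decoy.
# Here, antagonist actives play the role of validated decoys when the
# screen targets the agonist profile.

def roc_auc(active_scores, decoy_scores):
    """Pairwise comparison estimate of ROC AUC (ties count as 0.5)."""
    wins = 0.0
    for a in active_scores:
        for d in decoy_scores:
            if a > d:
                wins += 1.0
            elif a == d:
                wins += 0.5
    return wins / (len(active_scores) * len(decoy_scores))

agonist_scores = [9.1, 8.7, 8.2, 7.9]      # screening scores of agonists
antagonist_scores = [6.5, 6.1, 5.8, 7.0]   # antagonists used as decoys

auc = roc_auc(agonist_scores, antagonist_scores)
```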

  13. Substance Use Stigma: Reliability and validity of a theory-based scale for substance-using populations*

    Science.gov (United States)

    Smith, Laramie R.; Earnshaw, Valerie A.; Copenhaver, Michael M.; Cunningham, Chinazo O.

    2016-01-01

    Background Substance use disorders consistently rank among the most stigmatized conditions worldwide. Thus, substance use stigma fosters health inequities among persons with substance use disorders and remains a key barrier to successful screening and treatment efforts. Current efforts to measure substance use stigma are limited. This study aims to advance measurement efforts by drawing on stigma theory to develop and evaluate the Substance Use Stigma Mechanisms Scale (SU-SMS). The SU-SMS was designed to capture enacted, anticipated, and internalized substance use stigma mechanisms among persons with current and past substance use disorders, and distinguish between key stigma sources most likely to impact this target population. Methods This study was a cross-sectional evaluation of the validity, reliability, and generalizability of the SU-SMS across two independent samples with diverse substance use and treatment histories. Results Findings support the structural and construct validity of the SU-SMS, suggesting the scale was able to capture enacted, anticipated, and internalized stigma as distinct stigma experiences. It also further differentiated between two distinct stigma sources (family and healthcare providers). Analysis of these mechanisms and psychosocial metrics suggests that the scale is also associated with other health-related outcomes. Furthermore, the SU-SMS demonstrated high levels of internal reliability and generalizability across two independent samples of persons with diverse substance use disorders and treatment histories. Conclusion The SU-SMS may serve as a valuable tool for better understanding the processes through which substance use stigma serves to undermine key health behaviors and outcomes among persons with substance use disorders. PMID:26972790
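    The internal-reliability evaluation reported for the SU-SMS typically rests on Cronbach's alpha, which can be sketched for a hypothetical three-item subscale. All responses below are invented.

```python
# Cronbach's alpha: internal consistency of a multi-item scale,
# computed from per-item variances and the variance of total scores.

def cronbach_alpha(items):
    """items: list of per-item score lists, one list per item."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Hypothetical 5 respondents x 3 items (Likert 1-5)
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [5, 3, 4, 2, 5],
]
alpha = cronbach_alpha(items)
```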

  14. A unified bond theory, probabilistic meso-scale modeling, and experimental validation of deformed steel rebar in normal strength concrete

    Science.gov (United States)

    Wu, Chenglin

    Bond between deformed rebar and concrete is affected by rebar deformation pattern, concrete properties, concrete confinement, and rebar-concrete interfacial properties. Two distinct groups of bond models were traditionally developed based on the dominant effects of concrete splitting and near-interface shear-off failures. Their accuracy depended heavily upon the test data sets selected in analysis and calibration. In this study, a unified bond model is proposed and developed based on an analogy to the indentation problem around the rib front of deformed rebar. This mechanics-based model can take into account the combined effect of concrete splitting and interface shear-off failures, resulting in average bond strengths for all practical scenarios. To understand the fracture process associated with bond failure, a probabilistic meso-scale model of concrete is proposed and its sensitivity to interface and confinement strengths is investigated. Both the mechanical and finite element models are validated with the available test data sets and are superior to existing models in prediction of average bond strength (rib spacing-to-height ratio of deformed rebar. It can accurately predict the transition of failure modes from concrete splitting to rebar pullout, and predict the effect of rebar surface characteristics as the rib spacing-to-height ratio increases. Based on the unified theory, a global bond model is proposed and developed by introducing bond-slip laws, and validated with testing of concrete beams with spliced reinforcement, achieving a load capacity prediction error of less than 26%. The optimal rebar parameters and concrete cover in structural designs can be derived from this study.

  15. Is the Market Eroding Moral Norms? A Micro-analytical Validation of Some Ideas of Anomie Theory

    Directory of Open Access Journals (Sweden)

    Eckhard Burkatzki

    2008-11-01

    Full Text Available Anomie theorists have been reporting the suppression of shared welfare orientations by the overwhelming dominance of economic values within capitalist societies since before the outset of the neoliberalism debate. Obligations concerning common welfare are more and more often subordinated to the overarching aim of realizing economic success goals. This should be especially valid for social life in contemporary market societies. This empirical investigation examines the extent to which market imperatives and values of the societal community are anchored within the normative orientations of market actors. Special attention is paid to whether the shape of these normative orientations varies with respect to the degree of market inclusion. Empirical analyses, based on the data of a standardized written survey within the German working population carried out in 2002, show that different types of normative orientation can be distinguished among market actors. These types are quite similar to the well-known types of anomic adaptation developed by Robert K. Merton in “Social Structure and Anomie” and are externally valid with respect to the prediction of different forms of economic crime. Further analyses show that the type of normative orientation actors adopt within everyday life depends on the degree of market inclusion. Confirming anomie theory, it is shown that the individual willingness to subordinate matters of common welfare to the aim of economic success—radical market activism—gets stronger the more actors are included in the market sphere. Finally, the relevance of the reported findings for the explanation of violent behavior, especially with a view to varieties of corporate violence, is discussed.

  16. Validation of molecular crystal structures from powder diffraction data with dispersion-corrected density functional theory (DFT-D)

    International Nuclear Information System (INIS)

    Streek, Jacco van de; Neumann, Marcus A.

    2014-01-01

    The accuracy of 215 experimental organic crystal structures from powder diffraction data is validated against a dispersion-corrected density functional theory method. In 2010 we energy-minimized 225 high-quality single-crystal (SX) structures with dispersion-corrected density functional theory (DFT-D) to establish a quantitative benchmark. For the current paper, 215 organic crystal structures determined from X-ray powder diffraction (XRPD) data and published in an IUCr journal were energy-minimized with DFT-D and compared to the SX benchmark. The on average slightly less accurate atomic coordinates of XRPD structures do lead to systematically higher root mean square Cartesian displacement (RMSCD) values upon energy minimization than for SX structures, but the RMSCD value is still a good indicator for the detection of structures that deserve a closer look. The upper RMSCD limit for a correct structure must be increased from 0.25 Å for SX structures to 0.35 Å for XRPD structures; the grey area must be extended from 0.30 to 0.40 Å. Based on the energy minimizations, three structures are re-refined to give more precise atomic coordinates. For six structures our calculations provide the missing positions for the H atoms; for five structures they provide corrected positions for some H atoms. Seven crystal structures showed a minor error for a non-H atom. For five structures the energy minimizations suggest a higher space-group symmetry. For the 225 SX structures, the only deviations observed upon energy minimization were three minor H-atom related issues. Preferred orientation is the most important cause of problems. A preferred-orientation correction is the only correction where the experimental data are modified to fit the model. We conclude that molecular crystal structures determined from powder diffraction data that are published in IUCr journals are of high quality, with less than 4% containing an error in a non-H atom.
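    The RMSCD criterion used above is straightforward to compute from a pair of coordinate sets; the atom positions below are invented, and the thresholds (0.35 Å correct for XRPD, 0.35-0.40 Å grey area) are taken from the abstract.

```python
import math

# Root mean square Cartesian displacement between experimental and
# energy-minimized coordinates, flagged against the paper's XRPD limits.

def rmscd(coords_exp, coords_min):
    n = len(coords_exp)
    total = sum((xe - xm) ** 2 + (ye - ym) ** 2 + (ze - zm) ** 2
                for (xe, ye, ze), (xm, ym, zm) in zip(coords_exp, coords_min))
    return math.sqrt(total / n)

# Invented coordinates (Å) for three non-H atoms
exp_xyz = [(0.00, 0.00, 0.00), (1.54, 0.00, 0.00), (2.10, 1.40, 0.10)]
min_xyz = [(0.05, -0.02, 0.01), (1.50, 0.06, -0.03), (2.20, 1.32, 0.12)]

d = rmscd(exp_xyz, min_xyz)
verdict = "correct" if d <= 0.35 else ("grey area" if d <= 0.40 else "check")
```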

  17. Long-Term Impact of Valid Case Criterion on Capturing Population-Level Growth under Item Response Theory Equating. Research Report. ETS RR-17-17

    Science.gov (United States)

    Deng, Weiling; Monfils, Lora

    2017-01-01

    Using simulated data, this study examined the impact of different levels of stringency of the valid case inclusion criterion on item response theory (IRT)-based true score equating over 5 years in the context of K-12 assessment when growth in student achievement is expected. Findings indicate that the use of the most stringent inclusion criterion…

  18. Experimental evidence of the validity of Bahe–Varela theory to describe the volumetric properties of ionic liquids

    International Nuclear Information System (INIS)

    Rilo, E.; Domínguez-Pérez, M.; Varela, L.M.; Cabeza, O.

    2012-01-01

    Highlights: ► We present the theoretical equation given by the Bahe–Varela pseudolattice model. ► The adaptation of that model to predict partial molar volume in mixtures is reviewed. ► We fit the B–V equation to 13 ionic liquids (ILs) + 2 solvent binary systems at four temperatures. ► The ILs used were four tetrafluoroborate and four alkyl sulphate imidazoliums, and the solvents water and ethanol. ► The fit quality is excellent for all systems. - Abstract: Bahe–Varela (B–V) theory, based on the pseudo-lattice model, explains the thermodynamics of electrolyte solutions over the whole composition range. Bahe in 1972 extended the Debye–Hückel theory, developed for the most dilute electrolytes, to more concentrated solutions by taking into account long-range inter-ionic Coulombic forces. The introduction of long-range interactions in the model naturally generates a pseudo-lattice arrangement of the ions in the concentrated liquid. As Bahe's description fails at very concentrated solutions, in 1997 Varela and co-workers extended that pseudo-lattice theory by including the short-range interactions with solvent molecules and other ions. In recent years, with the discovery of ionic liquids, B–V theory has attracted increasing interest because these compounds can be seen as very concentrated electrolyte solutions (in fact they can be seen as “solutions” in the limit of no solvent!). Following this line, Turmine and co-workers in 2007 adapted the B–V theory to explain the volumetric properties of ionic liquid solutions up to saturation (i.e. the pure compound). They applied the resulting equation to fit the variation of the partial molar volume of three different aqueous solutions of ILs, with good results. In the present paper, we extend that analysis to systems formed by eight ILs and two solvents: water and ethanol. We have used density measurements for six aqueous mixtures recently published, two with 1-alkyl-3-methyl imidazolium

  19. Identification of putative agouti-related protein(87-132)-melanocortin-4 receptor interactions by homology molecular modeling and validation using chimeric peptide ligands.

    Science.gov (United States)

    Wilczynski, Andrzej; Wang, Xiang S; Joseph, Christine G; Xiang, Zhimin; Bauzo, Rayna M; Scott, Joseph W; Sorensen, Nicholas B; Shaw, Amanda M; Millard, William J; Richards, Nigel G; Haskell-Luevano, Carrie

    2004-04-22

    Agouti-related protein (AGRP) is one of only two naturally known antagonists of G-protein-coupled receptors (GPCRs) identified to date. Specifically, AGRP antagonizes the brain melanocortin-3 and -4 receptors involved in energy homeostasis. Alpha-melanocyte stimulating hormone (alpha-MSH) is one of the known endogenous agonists for these melanocortin receptors. Insight into putative interactions between the antagonist AGRP amino acids with the melanocortin-4 receptor (MC4R) may be important for the design of unique ligands for the treatment of obesity related diseases and is currently lacking in the literature. A three-dimensional homology molecular model of the mouse MC4 receptor complex with the hAGRP(87-132) ligand docked into the receptor has been developed to identify putative antagonist ligand-receptor interactions. Key putative AGRP-MC4R interactions include the Arg111 of hAGRP(87-132) interacting in a negatively charged pocket located in a cavity formed by transmembrane spanning (TM) helices 1, 2, 3, and 7, capped by the acidic first extracellular loop (EL1) and specifically with the conserved melanocortin receptor residues mMC4R Glu92 (TM2), mMC4R Asp114 (TM3), and mMC4R Asp118 (TM3). Additionally, Phe112 and Phe113 of hAGRP(87-132) putatively interact with an aromatic hydrophobic pocket formed by the mMC4 receptor residues Phe176 (TM4), Phe193 (TM5), Phe253 (TM6), and Phe254 (TM6). To validate the AGRP-mMC4R model complex presented herein from a ligand perspective, we generated nine chimeric peptide ligands based on a modified antagonist template of the hAGRP(109-118) (Tyr-c[Asp-Arg-Phe-Phe-Asn-Ala-Phe-Dpr]-Tyr-NH(2)). In these chimeric ligands, the antagonist AGRP Arg-Phe-Phe residues were replaced by the melanocortin agonist His/D-Phe-Arg-Trp amino acids. These peptides resulted in agonist activity at the mouse melanocortin receptors (mMC1R and mMC3-5Rs). The most notable results include the identification of a novel subnanomolar melanocortin peptide

  20. Charge and pairing dynamics in the attractive Hubbard model: Mode coupling and the validity of linear-response theory

    Science.gov (United States)

    Bünemann, Jörg; Seibold, Götz

    2017-12-01

    Pump-probe experiments have turned out as a powerful tool in order to study the dynamics of competing orders in a large variety of materials. The corresponding analysis of the data often relies on standard linear-response theory generalized to nonequilibrium situations. Here we examine the validity of such an approach for the charge and pairing response of systems with charge-density wave and (or) superconducting (SC) order. Our investigations are based on the attractive Hubbard model which we study within the time-dependent Hartree-Fock approximation. In particular, we calculate the quench and pump-probe dynamics for SC and charge order parameters in order to analyze the frequency spectra and the coupling of the probe field to the specific excitations. Our calculations reveal that the "linear-response assumption" is justified for small to moderate nonequilibrium situations (i.e., pump pulses) in the case of a purely charge-ordered ground state. However, the pump-probe dynamics on top of a superconducting ground state is determined by phase and amplitude modes which get coupled far from the equilibrium state indicating the failure of the linear-response assumption.

  1. Development and validation of the Alzheimer's prevention beliefs measure in a multi-ethnic cohort-a behavioral theory approach.

    Science.gov (United States)

    Seifan, Alon; Ganzer, Christine A; Vermeylen, Francoise; Parry, Stephen; Zhu, Jifeng; Lyons, Abigail; Isaacson, Richard; Kim, Sarang

    2017-12-01

    Understanding health beliefs and how they influence willingness will enable the development of targeted curricula that maximize public engagement in Alzheimer's disease (AD) risk reduction behaviors. Literature on behavioral theory and community input were used to develop and validate a health beliefs survey about AD risk reduction among 428 community-dwelling adults. Principal component analysis was performed to assess internal consistency. Linear regression was performed to identify key predictors of willingness to engage in AD risk reduction behaviors. The measure as a whole, as well as the individual scales (Benefits, Barriers, Severity, Susceptibility and Social Norm), was found to be internally consistent. Overall, as Benefits and Barriers scores increased, Willingness scores also increased. Those without prior AD experience or family history had lower willingness scores. Finally, we observed an interaction between age and norms, suggesting that social factors related to AD prevention may differentially affect people of different ages. The Alzheimer Prevention Beliefs Measure provides assessment of several health belief factors related to AD prevention. Age, Family History, Logistical Barriers and total Benefits are significant determinants of willingness to engage in AD risk reduction behaviors, such as seeing a doctor or making a lifestyle change. © The Author 2017. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  2. Classical test theory and Rasch analysis validation of the Upper Limb Functional Index in subjects with upper limb musculoskeletal disorders.

    Science.gov (United States)

    Bravini, Elisabetta; Franchignoni, Franco; Giordano, Andrea; Sartorio, Francesco; Ferriero, Giorgio; Vercelli, Stefano; Foti, Calogero

    2015-01-01

    To perform a comprehensive analysis of the psychometric properties and dimensionality of the Upper Limb Functional Index (ULFI) using both classical test theory and Rasch analysis (RA). Prospective, single-group observational design. Freestanding rehabilitation center. Convenience sample of Italian-speaking subjects with upper limb musculoskeletal disorders (N=174). Not applicable. The Italian version of the ULFI. Data were analyzed using parallel analysis, exploratory factor analysis, and RA for evaluating dimensionality, functioning of rating scale categories, item fit, hierarchy of item difficulties, and reliability indices. Parallel analysis revealed 2 factors explaining 32.5% and 10.7% of the response variance. RA confirmed the failure of the unidimensionality assumption, and 6 items out of the 25 misfitted the Rasch model. When the analysis was rerun excluding the misfitting items, the scale showed acceptable fit values, loading meaningfully to a single factor. Item separation reliability and person separation reliability were .98 and .89, respectively. Cronbach alpha was .92. RA revealed weakness of the scale concerning dimensionality and internal construct validity. However, a set of 19 ULFI items defined through the statistical process demonstrated a unidimensional structure, good psychometric properties, and clinical meaningfulness. These findings represent a useful starting point for further analyses of the tool (based on modern psychometric approaches and confirmatory factor analysis) in larger samples, including different patient populations and nationalities. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
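    The parallel analysis used above to decide how many factors to retain can be sketched as Horn's procedure: keep components whose observed eigenvalues exceed the mean eigenvalues of random data of the same shape. The toy data below (one simulated latent trait driving six items) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def eigenvalues_of_corr(data):
    """Descending eigenvalues of the item correlation matrix."""
    corr = np.corrcoef(data, rowvar=False)
    return np.sort(np.linalg.eigvalsh(corr))[::-1]

def parallel_analysis(data, n_sims=50):
    """Horn's rule: retain components whose observed eigenvalue exceeds
    the mean eigenvalue of same-shaped random normal data."""
    n, k = data.shape
    obs = eigenvalues_of_corr(data)
    sims = np.mean(
        [eigenvalues_of_corr(rng.standard_normal((n, k))) for _ in range(n_sims)],
        axis=0)
    return int(np.sum(obs > sims)), obs, sims

# Toy data: 200 "respondents", 6 items driven by one latent trait plus noise
latent = rng.standard_normal((200, 1))
data = latent @ np.ones((1, 6)) + 0.8 * rng.standard_normal((200, 6))

n_factors, obs, sims = parallel_analysis(data)
```

    For these unidimensional toy data the procedure retains a single factor, which is exactly the kind of evidence the study weighed when judging the dimensionality of the ULFI.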

  3. Examining the validity of the unitary theory of clinical relationships: comparison of observed and experienced parent-doctor interaction.

    Science.gov (United States)

    Young, Bridget; Ward, Jo; Forsey, Mary; Gravenhorst, Katja; Salmon, Peter

    2011-10-01

    We explored parent-doctor relationships in the care of children with leukaemia from three perspectives simultaneously: parents', doctors' and observers'. Our aim was to investigate convergence and divergence between these perspectives and thereby examine the validity of unitary theory of emotionality and authority in clinical relationships. 33 audiorecorded parent-doctor consultations and separate interviews with parents and doctors, which we analysed qualitatively and from which we selected three prototype cases. Across the whole sample doctors' sense of relationship generally converged with our observations of consultation, but parents' sense of relationship diverged strongly from each. Contrary to current assumptions, parents' sense of emotional connection with doctors did not depend on doctors' emotional behaviour, and parents did not feel disempowered by doctors' authority. Moreover, authority and emotionality were not conceptually distinct for parents, who gained emotional support from doctors' exercise of authority. The relationships looked very different from the three perspectives. These divergences indicate weaknesses in current ideas of emotionality and authority in clinical relationships and the necessity of multisource datasets to develop these ideas in a way that characterises clinical relationships from all perspectives. Methodological development will be needed to address the challenges posed by multisource datasets. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  4. Examining the validity of the Academic Motivation Scale by comparing scale construction to self-determination theory.

    Science.gov (United States)

    Cokley, K O

    2000-04-01

    This study examined the construct validity of the Academic Motivation Scale. Specifically, subscale correlations were examined to assess whether support for a continuum of self-determination would be provided. The three types of Intrinsic Motivation were significantly and positively intercorrelated (.67, .62, and .58), while the three types of Extrinsic Motivation were significantly and positively intercorrelated (.50, .49, and .45). The former subscales, however, correlated more highly with Introjected Regulation than with Identified Regulation, suggesting that Introjected Regulation may be indicative of more self-determined behavior than has previously been believed. Also, the Intrinsic Motivation To Accomplish subscale had a stronger relationship with two of the Extrinsic Motivation subscales, Identified Regulation and Introjected Regulation, than the Extrinsic Motivation subscales had with each other. This suggests that the differences between Extrinsic and Intrinsic Motivation are not as clear-cut as has been believed. Also, contrary to self-determination theory, Amotivation had a stronger negative correlation with Identified Regulation (r = -.31) than with any of the Intrinsic Motivation subscales (rs = -.27, -.19, and -.11).

  5. Validation of an analysis method for marine biotoxins of the saxitoxin type based on a receptor binding assay (RBA) with radiochemical detection by liquid scintillation

    International Nuclear Information System (INIS)

    Selmi, Zied

    2009-01-01

    Saxitoxins are biotoxins belonging to the family of PSP-type toxins. They are paralysing toxins secreted by marine micro-organisms (phytoplankton of the genus Alexandrium). They pose a risk to human health if contaminated food is consumed. The acceptable maximum limit of these biotoxins in molluscs and shellfish is fixed at 800 μg/kg of mollusc or shellfish meat. It is thus essential to develop and validate analytical methods for monitoring the level of contamination of marine resources by these species, in order to establish a monitoring programme and to guarantee an acceptable level of food safety for the products available on the national and international markets. The present work allowed the validation of a quantification method for these toxins based on the Receptor Binding Assay (RBA) with detection by the liquid scintillation nuclear technique using tritium as radiotracer, proceeding through the different statistical validation tests (Standard NF XP T 90-210). The linear range extended from 0 to 20 nM and the limit of detection was found to be 1 nM. The validation of this method will reinforce the analytical means for marine biotoxins of the SXT type and allow the establishment, in the near future, of a routine monitoring and surveillance programme for these biotoxins at the national, regional and African scales. (Author)
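    The linearity and detection-limit checks reported above can be sketched with an ordinary least-squares calibration over the 0-20 nM range. The counts below are invented, and the LOD convention used here (3.3·s/slope, ICH-style) is an assumption, not stated in the record.

```python
# Least-squares calibration line and an LOD estimate from the residual
# standard deviation of the fit (LOD = 3.3 * s / slope, assumed convention).

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def lod(xs, ys):
    slope, intercept = fit_line(xs, ys)
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    s = (sum(r * r for r in resid) / (len(xs) - 2)) ** 0.5
    return 3.3 * s / slope

conc = [0, 2.5, 5, 10, 15, 20]          # nM, within the reported linear range
signal = [10, 130, 255, 500, 745, 990]  # assumed scintillation counts

limit = lod(conc, signal)               # LOD estimate in nM
```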

  6. In-house validation of a liquid chromatography-tandem mass spectrometry method for the determination of selective androgen receptor modulators (SARMS) in bovine urine.

    Science.gov (United States)

    Schmidt, Kathrin S; Mankertz, Joachim

    2018-06-01

    A sensitive and robust LC-MS/MS method allowing the rapid screening and confirmation of selective androgen receptor modulators in bovine urine was developed and successfully validated according to Commission Decision 2002/657/EC, chapter 3.1.3 'alternative validation', by applying a matrix-comprehensive in-house validation concept. The confirmation of the analytes in the validation samples was achieved both on the basis of the MRM ion ratios as laid down in Commission Decision 2002/657/EC and by comparison of their enhanced product ion (EPI) spectra with a reference mass spectral library by making use of the QTRAP technology. Here, in addition to the MRM survey scan, EPI spectra were generated in a data-dependent way according to an information-dependent acquisition criterion. Moreover, stability studies of the analytes in solution and in matrix according to an isochronous approach proved the stability of the analytes in solution and in matrix for at least the duration of the validation study. To identify factors that have a significant influence on the test method in routine analysis, a factorial effect analysis was performed. To this end, factors considered to be relevant for the method in routine analysis (e.g. operator, storage duration of the extracts before measurement, different cartridge lots and different hydrolysis conditions) were systematically varied on two levels. The examination of the extent to which these factors influence the measurement results of the individual analytes showed that none of the validation factors exerts a significant influence on the measurement results.
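    The factorial effect analysis described above (systematic two-level variation of robustness factors such as operator and cartridge lot) amounts to estimating main effects from a designed set of runs. The 2x2 design and recovery values below are invented for illustration.

```python
# Main effect of a two-level factor: mean response at the +1 level minus
# mean response at the -1 level, over a (here 2^2) factorial design.

def main_effect(levels, responses):
    hi = [r for l, r in zip(levels, responses) if l == +1]
    lo = [r for l, r in zip(levels, responses) if l == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# Hypothetical 2^2 design: factor A = operator, factor B = cartridge lot
A = [-1, +1, -1, +1]
B = [-1, -1, +1, +1]
y = [98.1, 98.5, 97.9, 98.4]   # recovery (%) per run, invented

effect_A = main_effect(A, y)
effect_B = main_effect(B, y)
```

    Effects that are small relative to run-to-run noise, as in the study's conclusion, indicate that the factor does not significantly influence the measurement results.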

  7. Empirical assessment of the validity limits of the surface wave full ray theory using realistic 3-D Earth models

    KAUST Repository

    Parisi, Laura

    2016-02-10

    The surface wave full ray theory (FRT) is an efficient tool to calculate synthetic waveforms of surface waves. It combines the concept of local modes with exact ray tracing as a function of frequency, providing a more complete description of surface wave propagation than the widely used great circle approximation (GCA). The purpose of this study is to evaluate the ability of the FRT approach to model teleseismic long-period surface waveforms (T ∼ 45–150 s) in the context of current 3-D Earth models to empirically assess its validity domain and its scope for future studies in seismic tomography. To achieve this goal, we compute vertical and horizontal component fundamental mode synthetic Rayleigh waveforms using the FRT, which are compared with calculations using the highly accurate spectral element method. We use 13 global earth models including 3-D crustal and mantle structure, which are derived by successively varying the strength and lengthscale of heterogeneity in current tomographic models. For completeness, GCA waveforms are also compared with the spectral element method. We find that the FRT accurately predicts the phase and amplitude of long-period Rayleigh waves (T ∼ 45–150 s) for almost all the models considered, with errors in the modelling of the phase (amplitude) of Rayleigh waves being smaller than 5 per cent (10 per cent) in most cases. The largest errors in phase and amplitude are observed for T ∼ 45 s and for the three roughest earth models considered that exhibit shear wave anomalies of up to ∼20 per cent, which is much larger than in current global tomographic models. In addition, we find that overall the GCA does not predict Rayleigh wave amplitudes well, except for the longest wave periods (T ∼ 150 s) and the smoothest models considered. Although the GCA accurately predicts Rayleigh wave phase for current earth models such as S20RTS and S40RTS, the FRT's phase errors are smaller, notably for the shortest wave periods considered (T

  8. Construct validity in Operations Management by using Rasch Measurement Theory. The case of the construct “motivation to implement continuous improvement"

    Directory of Open Access Journals (Sweden)

    Lidia Sanchez-Ruiz

    2016-12-01

    Full Text Available Construct design and validation is a common practice in the Operations Management field. In this sense, the aim of this study is to present Rasch Measurement Theory (RMT) as a rich and useful methodology for validating constructs. To this end, the measurement controversy in the social sciences is presented; then, RMT is explained as a solution to this measurement issue; after that, the different applications of RMT are described and, finally, the different stages of the validation process are presented. This work thus aims to serve as a guide for those researchers interested in the methodology. A specific case is included: the validation of the construct “motivation to implement continuous improvement”.
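    At the core of RMT is the dichotomous Rasch model, in which the probability of endorsing item i is P = exp(θ − b_i) / (1 + exp(θ − b_i)) for person ability θ and item difficulty b_i. The sketch below illustrates the model; the person and item values are invented, not from the study.

```python
import math

# Dichotomous Rasch model: endorsement probability and the model-expected
# raw score of a person across a set of items.

def rasch_p(theta, b):
    """P(endorse) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def expected_score(theta, item_difficulties):
    """Model-expected raw score of a person over all items."""
    return sum(rasch_p(theta, b) for b in item_difficulties)

items = [-1.0, -0.3, 0.0, 0.4, 1.2]   # item difficulties in logits (invented)
low = expected_score(-1.0, items)      # a low-motivation respondent
high = expected_score(1.0, items)      # a high-motivation respondent
```

    Comparing observed responses against these model expectations is the basis of the item-fit statistics used at the validation stages the article describes.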

  9. Development and Validation of an Instrument Measuring Theory-Based Determinants of Monitoring Obesogenic Behaviors of Pre-Schoolers among Hispanic Mothers

    Directory of Open Access Journals (Sweden)

    Paul Branscum

    2016-06-01

    Full Text Available Public health interventions are greatly needed for obesity prevention, and planning for such strategies should include community participation. The study’s purpose was to develop and validate a theory-based instrument with low-income, Hispanic mothers of preschoolers, to assess theory-based determinants of maternal monitoring of children’s consumption of fruits and vegetables and sugar-sweetened beverages (SSB). Nine focus groups with mothers were conducted to determine the nutrition-related behaviors that mothers found most obesogenic for their children. Next, behaviors were operationally defined and rated for importance and changeability. Two behaviors were selected for investigation (fruits and vegetables, and SSB). Twenty semi-structured interviews with mothers were then conducted to develop culturally appropriate items for the instrument. Afterwards, face and content validity were established using a panel of six experts. Finally, the instrument was tested with a sample of 238 mothers. Psychometric properties evaluated included construct validity (using the maximum likelihood extraction method of factor analysis) and internal consistency reliability (Cronbach’s alpha). Results suggested that all scales on the instrument were valid and reliable, except for the autonomy scales. Researchers and community planners working with Hispanic families can use this instrument to measure theory-based determinants of parenting behaviors related to preschoolers’ consumption of fruits and vegetables and SSB.

  10. Validity and reliability analysis of the planned behavior theory scale related to the testicular self-examination in a Turkish context.

    Science.gov (United States)

    Iyigun, Emine; Tastan, Sevinc; Ayhan, Hatice; Kose, Gulsah; Acikel, Cengizhan

    2016-06-01

This study aimed to determine the validity and reliability of the Planned Behavior Theory Scale as related to testicular self-examination. The study was carried out in a health-profession higher-education school in Ankara, Turkey, from April to June 2012. The participants comprised 215 male students. Study data were collected using a questionnaire, a planned behavior theory scale related to testicular self-examination, and Champion's Health Belief Model Scale (CHBMS). The sub-dimensions of the planned behavior theory scale, namely intention, attitude, subjective norms, and self-efficacy, had Cronbach's alpha values between 0.81 and 0.89. Exploratory factor analysis showed that the scale items loaded on five factors that accounted for 75% of the variance; of these, the intention sub-dimension contributed the most. A significant correlation was found between the sub-dimensions of the testicular self-examination planned behavior theory scale and those of the CHBMS (p < 0.05). The Planned Behavior Theory Scale is a valid and reliable measurement for Turkish society.

  11. Development and validation of the nasopharyngeal cancer scale among the system of quality of life instruments for cancer patients (QLICP-NA V2.0): combined classical test theory and generalizability theory.

    Science.gov (United States)

    Wu, Jiayuan; Hu, Liren; Zhang, Gaohua; Liang, Qilian; Meng, Qiong; Wan, Chonghua

    2016-08-01

This research was designed to develop a nasopharyngeal cancer (NPC) scale within the system of quality of life (QOL) instruments for cancer patients (QLICP-NA). The scale was developed using a modular approach and was evaluated by classical test theory and generalizability theory. Programmed decision procedures and theories on instrument development were applied to create QLICP-NA V2.0. A total of 121 NPC inpatients were assessed using QLICP-NA V2.0 to measure their QOL from hospital admission until discharge. Scale validity, reliability, and responsiveness were evaluated by correlation, factor, parallel, multi-trait scaling, and t test analyses, as well as by generalizability (G) and decision (D) studies of generalizability theory. Results of multi-trait scaling, correlation, factor, and parallel analyses indicated that QLICP-NA V2.0 exhibited good construct validity. The significant difference in QOL between treated and untreated NPC patients indicated good clinical validity of the questionnaire. The internal consistency (α) and test-retest reliability coefficients (intra-class correlations) of each domain, as well as of the overall scale, were all >0.70. Ceiling effects were not found in any domain and most facets, except for common side effects (24.8 %) in the domain of common symptoms and side effects, and early tumor symptoms (27.3 %) and therapeutic side effects (23.2 %) in the specific domain, whereas floor effects were absent in every domain/facet. The overall changes in the physical and social domains were significantly different between pre- and post-treatment, with a moderate effect size (standardized response mean) ranging from 0.21 to 0.27 (p < 0.05). QLICP-NA V2.0 exhibited reasonable degrees of validity, reliability, and responsiveness. However, this scale must be further improved before it can be used as a practical instrument to evaluate the QOL of NPC patients in China.
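The standardized response mean used as the responsiveness index in records like this one is simply the mean pre-post change divided by the standard deviation of the change scores. A minimal sketch with made-up scores (not the study's data):

```python
from statistics import mean, stdev

def standardized_response_mean(pre, post):
    """SRM = mean(change) / sample standard deviation of the change scores."""
    changes = [b - a for a, b in zip(pre, post)]
    return mean(changes) / stdev(changes)

# Hypothetical pre/post domain scores for six patients (not the QLICP-NA data)
pre = [50, 55, 60, 52, 58, 49]
post = [54, 60, 63, 55, 62, 55]
print(round(standardized_response_mean(pre, post), 2))
```

By the usual convention, an SRM near 0.2 reads as a small effect, near 0.5 moderate, and 0.8 or more large.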

  12. Test-retest paradigm of the forced swimming test in female mice is not valid for predicting antidepressant-like activity: participation of acetylcholine and sigma-1 receptors.

    Science.gov (United States)

    Su, Jing; Hato-Yamada, Noriko; Araki, Hiroaki; Yoshimura, Hiroyuki

    2013-01-01

The forced swimming test (FST) in mice is widely used to predict the antidepressant activity of a drug, but information describing the immobility of female mice is limited. We investigated whether a prior swimming experience affects the immobility duration in a second FST in female mice and whether the test-retest paradigm is a valid screening tool for antidepressants. Female ICR mice were exposed to the FST using two experimental paradigms: a single FST and a double FST in which mice had experienced the FST once 24 h prior to the second trial. The initial FST experience reliably prolonged immobility duration in the second FST. The antidepressants imipramine and paroxetine significantly reduced immobility duration in the single FST, but not in the double FST. Scopolamine and the sigma-1 (σ1) antagonist NE-100 administered before the second trial significantly prevented the prolongation of immobility. Neither a 5-HT1A nor a 5-HT2A receptor agonist affected immobility duration. We suggest that the test-retest paradigm in female mice is not adequate for predicting antidepressant-like activity of a drug; the prolongation of immobility in the double FST is modulated through acetylcholine and σ1 receptors.

  13. Pharmacokinetic-pharmacodynamic analysis of antipsychotics-induced extrapyramidal symptoms based on receptor occupancy theory incorporating endogenous dopamine release.

    Science.gov (United States)

    Matsui-Sakata, Akiko; Ohtani, Hisakazu; Sawada, Yasufumi

    2005-06-01

We aimed to analyze the risks of extrapyramidal symptoms (EPS) induced by typical and atypical antipsychotic drugs using a common pharmacokinetic-pharmacodynamic (PK-PD) model based on receptor occupancy. We collected literature data on EPS induced by the atypical antipsychotics risperidone, olanzapine, and quetiapine, and by the typical antipsychotic haloperidol, and analyzed the following five indices of EPS: the ratio of patients obliged to take anticholinergic medication, and the occurrence rates of plural extrapyramidal symptoms (more than one of tremor, dystonia, hypokinesia, akathisia, extrapyramidal syndrome, etc.), parkinsonism, akathisia, and extrapyramidal syndrome. We tested two models, i.e., a model incorporating endogenous dopamine release owing to 5-HT2A receptor inhibition and a model not considering endogenous dopamine release, and used them to examine the relationship between the D2 receptor occupancy of endogenous dopamine and the extent of drug-induced EPS. The model incorporating endogenous dopamine release better described the relationship between the mean D2 receptor occupancy of endogenous dopamine and the extent of EPS than the other model, as assessed by the final sum of squares of residuals (final SS) and Akaike's Information Criterion (AIC). Furthermore, the former model could appropriately predict the risks of EPS induced by two other atypical antipsychotics, clozapine and ziprasidone, which were not used in the model development. The developed model incorporating endogenous dopamine release owing to 5-HT2A receptor inhibition may be useful for the prediction of antipsychotics-induced EPS.
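Receptor occupancy in PK-PD analyses of this kind is commonly modeled with the hyperbolic binding relation C/(C + Ki), and competing models are ranked by AIC. A minimal sketch with illustrative numbers (the Ki, residuals, and point counts below are assumptions, not the study's values):

```python
import math

def d2_occupancy(conc_nm, ki_nm):
    """Fractional D2 receptor occupancy under a simple competitive binding model."""
    return conc_nm / (conc_nm + ki_nm)

def aic_least_squares(ss_residual, n_points, n_params):
    """AIC for least-squares fits: n*ln(SS/n) + 2k (additive constants dropped)."""
    return n_points * math.log(ss_residual / n_points) + 2 * n_params

# Occupancy is 50% when the free drug concentration equals Ki (illustrative values)
print(d2_occupancy(10.0, 10.0))

# Model selection: the lower AIC wins, so an extra parameter (e.g. an endogenous
# dopamine release term) must buy a sufficiently smaller residual sum of squares
aic_without = aic_least_squares(4.0, 20, 2)
aic_with = aic_least_squares(2.5, 20, 3)
print(aic_with < aic_without)
```

The comparison mirrors the paper's logic: the dopamine-release model is preferred only if its improved fit outweighs its added parameter.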

  14. Testing the validity of a receptor kinetic model via TcNGA functional imaging of liver transplant recipients. Final report

    International Nuclear Information System (INIS)

    Stadalnik, R.C.

    1993-01-01

The author has developed the expertise for I-125-HSA plasma volume measurement and galactose clearance for the determination of hepatic plasma flow, and has finalized the kinetic model. The microscale Scatchard assay has just been modified for greater precision of receptor measurement using only 5-10 mg of liver tissue. In addition, the author determined during the past year that the most practical and clinically reasonable measurement of liver volume was to measure the transplanted liver in vivo using Tc-NGA images in the anterior, posterior, and right lateral projections, using the method of Rollo and DeLand. Direct measurement of liver weight obtained during the transplant operation was not reliable due to variability of fluid retention in the donor liver secondary to ischemia, preservation fluid, etc., and therefore did not reflect the accurate liver weight needed in the kinetic analysis comparison, i.e., V_h (hepatic plasma volume)

  15. Development and validation of the irritable bowel syndrome scale under the system of quality of life instruments for chronic diseases QLICD-IBS: combinations of classical test theory and generalizability theory.

    Science.gov (United States)

    Lei, Pingguang; Lei, Guanghe; Tian, Jianjun; Zhou, Zengfen; Zhao, Miao; Wan, Chonghua

    2014-10-01

This paper aims to develop the irritable bowel syndrome (IBS) scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-IBS) by the modular approach and to validate it by both classical test theory and generalizability theory. The QLICD-IBS was developed based on programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, and quantitative statistical procedures. One hundred twelve inpatients with IBS provided QOL data at three measurements before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t tests, and G and D studies of generalizability theory analysis. Multi-trait scaling, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. Test-retest reliability coefficients (Pearson r and intra-class correlation (ICC)) for the overall score and all domains were higher than 0.80; the internal consistency α for all domains at two measurements was higher than 0.70, except for the social domain (0.55 and 0.67, respectively). The overall score and the scores for all domains/facets showed statistically significant changes after treatment, with moderate or higher effect sizes (standardized response mean, SRM) ranging from 0.72 to 1.02 at the domain level. G coefficients and indices of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-IBS has good validity, reliability, and responsiveness, along with several strengths, and can be used as a quality of life instrument for patients with IBS.
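The internal consistency α reported throughout these records is Cronbach's alpha, k/(k-1) · (1 - Σ item variances / variance of the total score). A self-contained sketch on hypothetical Likert responses (illustrative data, not the QLICD-IBS items):

```python
def cronbach_alpha(items):
    """items: one list of scores per item, all of equal length (same respondents).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance, used consistently in numerator and denominator
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three hypothetical 5-point Likert items answered by six respondents
items = [
    [3, 4, 5, 2, 4, 3],
    [3, 5, 5, 2, 4, 4],
    [2, 4, 4, 1, 3, 3],
]
print(round(cronbach_alpha(items), 2))
```

Values above 0.70, the threshold these scale papers use, indicate acceptable internal consistency; highly correlated items, as here, push alpha close to 1.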

  16. Development and validation of the coronary heart disease scale under the system of quality of life instruments for chronic diseases QLICD-CHD: combinations of classical test theory and Generalizability Theory.

    Science.gov (United States)

    Wan, Chonghua; Li, Hezhan; Fan, Xuejin; Yang, Ruixue; Pan, Jiahua; Chen, Wenru; Zhao, Rong

    2014-06-04

Quality of life (QOL) for patients with coronary heart disease (CHD) is now a worldwide concern, yet specific instruments are few and none has been developed by the modular approach. This paper aims to develop the CHD scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-CHD) by the modular approach and to validate it by both classical test theory and Generalizability Theory. The QLICD-CHD was developed based on programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, pre-testing, and quantitative statistical procedures. 146 inpatients with CHD provided QOL data at three measurements before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t-tests, and G and D studies of Generalizability Theory analysis. Multi-trait scaling, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. The internal consistency α and test-retest reliability coefficients (Pearson r and intra-class correlations, ICC) for the overall instrument and all domains were higher than 0.70 and 0.80, respectively. The overall instrument and all domains except the social domain showed statistically significant changes after treatment, with moderate effect sizes (standardized response mean, SRM) ranging from 0.32 to 0.67. G coefficients and indices of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-CHD has good validity and reliability, moderate responsiveness, and several strengths, and can be used as a quality of life instrument for patients with CHD. However, to obtain better reliability, the number of items in the social domain should be increased, or the items' quality, not quantity, should be improved.

  17. What Do You Think You Are Measuring? A Mixed-Methods Procedure for Assessing the Content Validity of Test Items and Theory-Based Scaling

    Science.gov (United States)

    Koller, Ingrid; Levenson, Michael R.; Glück, Judith

    2017-01-01

    The valid measurement of latent constructs is crucial for psychological research. Here, we present a mixed-methods procedure for improving the precision of construct definitions, determining the content validity of items, evaluating the representativeness of items for the target construct, generating test items, and analyzing items on a theoretical basis. To illustrate the mixed-methods content-scaling-structure (CSS) procedure, we analyze the Adult Self-Transcendence Inventory, a self-report measure of wisdom (ASTI, Levenson et al., 2005). A content-validity analysis of the ASTI items was used as the basis of psychometric analyses using multidimensional item response models (N = 1215). We found that the new procedure produced important suggestions concerning five subdimensions of the ASTI that were not identifiable using exploratory methods. The study shows that the application of the suggested procedure leads to a deeper understanding of latent constructs. It also demonstrates the advantages of theory-based item analysis. PMID:28270777

  18. Development and validation of the Brazilian version of the Attitudes to Aging Questionnaire (AAQ): An example of merging classical psychometric theory and the Rasch measurement model

    Directory of Open Access Journals (Sweden)

    Trentini Clarissa M

    2008-01-01

Abstract Background: Aging has driven a demographic shift worldwide, which is considered both a major societal achievement and a challenge. Aging is primarily a subjective experience, shaped by factors such as gender and culture. There is a lack of instruments to assess attitudes to aging adequately. In addition, no instrument has been developed or validated in developing-region contexts, so the particularities of aging in these areas are not captured by the available measures. This paper aims to develop and validate a reliable attitudes-to-aging instrument by combining the classical psychometric approach and Rasch analysis. Methods: The pilot study and field trial are described in detail. Statistical analysis included classical psychometric theory (EFA and CFA) and the Rasch measurement model; the latter was applied to examine unidimensionality, the response scale, and item fit. Results: The sample comprised 424 Brazilian older adults, who were compared to an international sample (n = 5238). The final instrument shows excellent psychometric performance (discriminant validity, confirmatory factor analysis, and Rasch fit statistics). Rasch analysis indicated that modifications to the response scale and item deletions improved the initial solution derived from the classic approach. Conclusion: Combining classical and modern psychometric theories in a complementary way is fruitful for the development and validation of instruments. The construction of a reliable Brazilian Attitudes to Aging Questionnaire is important for assessing cultural specificities of aging in a transcultural perspective and can be applied in international cross-cultural investigations with less risk of cultural bias.
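The Rasch measurement model referenced above gives the probability of endorsing an item as a logistic function of person ability θ minus item difficulty b. A minimal sketch (values are illustrative):

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: P(X = 1 | theta, b) = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability exactly matches item difficulty, the endorsement probability is 0.5
print(rasch_probability(0.0, 0.0))
# More able respondents endorse the same item with higher probability
print(rasch_probability(2.0, 0.0) > rasch_probability(-2.0, 0.0))
```

Item fit statistics of the kind the abstract mentions compare observed responses against these model-implied probabilities.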

  19. Analytical Validation and Clinical Qualification of a New Immunohistochemical Assay for Androgen Receptor Splice Variant-7 Protein Expression in Metastatic Castration-resistant Prostate Cancer.

    Science.gov (United States)

    Welti, Jonathan; Rodrigues, Daniel Nava; Sharp, Adam; Sun, Shihua; Lorente, David; Riisnaes, Ruth; Figueiredo, Ines; Zafeiriou, Zafeiris; Rescigno, Pasquale; de Bono, Johann S; Plymate, Stephen R

    2016-10-01

The androgen receptor splice variant-7 (AR-V7) has been implicated in the development of castration-resistant prostate cancer (CRPC) and resistance to abiraterone and enzalutamide. The aim was to develop a validated assay for detection of AR-V7 protein in tumour tissue and to determine its expression and clinical significance as patients progress from hormone-sensitive prostate cancer (HSPC) to CRPC. Following monoclonal antibody generation and validation, we retrospectively identified patients who had HSPC and CRPC tissue available for AR-V7 immunohistochemical (IHC) analysis. Nuclear AR-V7 expression was determined using IHC H score (HS) data. The change in nuclear AR-V7 expression from HSPC to CRPC and the association between nuclear AR-V7 expression and overall survival (OS) were determined. Nuclear AR-V7 expression was significantly lower in HSPC (median HS 50, interquartile range [IQR] 17.5-90) than in CRPC (HS 135, IQR 80-157.5). A higher level of AR-V7 identifies a group of patients who respond less well to certain prostate cancer treatments and live for a shorter period of time. Copyright © 2016 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  20. Stereochemical study of mouse muscone receptor MOR215-1 and vibrational theory based on statistical physics formalism.

    Science.gov (United States)

    Ben Khemis, Ismahene; Mechi, Nesrine; Ben Lamine, Abdelmottaleb

    2018-02-10

In a biosensor system, olfactory receptor sites can be activated by odorant molecules; the biological interactions are then converted into electrical signals by a signal transduction cascade that leads to the opening of ion channels, generating a current that flows into the cilia and depolarizes the membrane. The aim of this paper is to present a new investigation that determines the olfactory band using a monolayer adsorption model with identical sites, which may also describe the static and dynamic sensitivities through the expression of the olfactory response. Moreover, knowing the size of receptor sites in olfactory sensory neurons provides valuable information about the relationship between molecular structure and biological activity. The determination of microreceptors and mesoreceptors is mostly carried out via physical adsorption, and the radius is calculated using the Kelvin equation. The mean values of the radius obtained from the maxima of the receptor size distribution peaks are 4 nm for ℓ-muscone and 6 nm for d-muscone. Copyright © 2018. Published by Elsevier Ltd.
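The Kelvin equation used above relates pore (receptor-site) radius to relative pressure as r = -2γV_m / (RT ln(P/P0)). A sketch using typical liquid-nitrogen adsorption constants as assumed defaults (not values from the paper):

```python
import math

def kelvin_radius_nm(p_rel, gamma=8.85e-3, v_molar=3.47e-5, temp_k=77.0):
    """Kelvin radius (nm) at relative pressure p_rel = P/P0 < 1.
    Defaults are typical nitrogen-at-77-K values (assumptions, not from the paper):
    gamma: surface tension in N/m, v_molar: molar volume in m^3/mol."""
    R = 8.314  # gas constant, J/(mol*K)
    r_m = -2.0 * gamma * v_molar / (R * temp_k * math.log(p_rel))
    return r_m * 1e9

print(round(kelvin_radius_nm(0.5), 2))
# The inferred radius grows without bound as P/P0 approaches 1
print(kelvin_radius_nm(0.9) > kelvin_radius_nm(0.5))
```

Since ln(P/P0) is negative for P/P0 < 1, the leading minus sign keeps the radius positive.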

  1. Nucleation theory with delayed interactions: an application to the early stages of the receptor-mediated adhesion/fusion kinetics of lipid vesicles.

    Science.gov (United States)

    Raudino, Antonio; Pannuzzo, Martina

    2010-01-28

A semiquantitative theory aimed at describing the adhesion kinetics between soft objects, such as living cells or vesicles, has been developed. When rigid bodies are considered, the adhesion kinetics is successfully described by the classical Derjaguin, Landau, Verwey, and Overbeek (DLVO) picture, where the energy profile of two approaching bodies is given by two asymmetric potential wells separated by a barrier. The transition probability from the long-distance to the short-distance minimum defines the adhesion rate. Conversely, soft bodies might follow a different pathway to reach the short-distance minimum: thermally excited fluctuations give rise to local protrusions connecting the approaching bodies. These transient adhesion sites are stabilized by short-range adhesion forces (e.g., ligand-receptor interactions between membranes brought to contact distance), while they are destabilized both by repulsive forces and by the elastic deformation energy. Above a critical area of the contact site, the adhesion forces prevail: the contact site grows in size until the complete adhesion of the two bodies inside a short-distance minimum is attained. This nucleation mechanism has been developed in the framework of a nonequilibrium Fokker-Planck picture by considering both the adhesive patch growth and dissolution processes. In addition, we also investigated the effect of the ligand-receptor pairing kinetics at the adhesion site on the time course of the patch expansion. The ratio between the ligand-receptor pairing kinetics and the expansion rate of the adhesion site is of paramount relevance in determining the overall nucleation rate. The theory enables one to self-consistently include both thermodynamic (energy barrier height) and dynamic (viscosity) parameters, giving rise in some limiting cases to simple analytical formulas. The model could be employed to rationalize fusion kinetics between vesicles, provided the short-range adhesion transition is the rate-limiting step.
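The critical contact-site area described above follows the classical 2-D nucleation picture: a circular adhesion patch of radius r gains area energy -wπr² from short-range adhesion but pays rim (deformation) energy 2πrλ, so the barrier peaks at r* = λ/w. A minimal sketch with hypothetical parameter values:

```python
import math

def patch_energy(r, adhesion_w, rim_lambda):
    """Energy of a circular adhesion patch of radius r: rim cost minus area gain."""
    return 2.0 * math.pi * r * rim_lambda - math.pi * r ** 2 * adhesion_w

def critical_radius(adhesion_w, rim_lambda):
    """dE/dr = 0 at r* = lambda / w; larger patches grow, smaller ones dissolve."""
    return rim_lambda / adhesion_w

w, lam = 2.0, 6.0  # hypothetical adhesion energy density and rim line tension
r_star = critical_radius(w, lam)
print(r_star)
# The energy at r* is a maximum: both a smaller and a larger patch sit lower
print(patch_energy(r_star, w, lam) > patch_energy(0.5 * r_star, w, lam))
print(patch_energy(r_star, w, lam) > patch_energy(1.5 * r_star, w, lam))
```

The paper's Fokker-Planck treatment adds the stochastic growth/dissolution dynamics on top of this energy profile; the sketch only shows the barrier geometry.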

  2. The Children's Social Understanding Scale: construction and validation of a parent-report measure for assessing individual differences in children's theories of mind.

    Science.gov (United States)

    Tahiroglu, Deniz; Moses, Louis J; Carlson, Stephanie M; Mahy, Caitlin E V; Olofson, Eric L; Sabbagh, Mark A

    2014-11-01

    Children's theory of mind (ToM) is typically measured with laboratory assessments of performance. Although these measures have generated a wealth of informative data concerning developmental progressions in ToM, they may be less useful as the sole source of information about individual differences in ToM and their relation to other facets of development. In the current research, we aimed to expand the repertoire of methods available for measuring ToM by developing and validating a parent-report ToM measure: the Children's Social Understanding Scale (CSUS). We present 3 studies assessing the psychometric properties of the CSUS. Study 1 describes item analysis, internal consistency, test-retest reliability, and relation of the scale to children's performance on laboratory ToM tasks. Study 2 presents cross-validation data for the scale in a different sample of preschool children with a different set of ToM tasks. Study 3 presents further validation data for the scale with a slightly older age group and a more advanced ToM task, while controlling for several other relevant cognitive abilities. The findings indicate that the CSUS is a reliable and valid measure of individual differences in children's ToM that may be of great value as a complement to standard ToM tasks in many different research contexts. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  3. The Question of Drug-Dependent Workers under the Prism of Axel Honneth's Theory of Recognition: The Human Right to Work and its Claim to Universal Validity

    OpenAIRE

    Maria Cecília Máximo Teodoro; Sabrina Colares Nogueira

    2016-01-01

This article relates Axel Honneth's theory of recognition to the human right to work as a right with a claim to universal validity, from the perspective of the ILO Declaration on Fundamental Principles and Rights at Work, in order to set parameters for the recognition and social (re)integration of drug-dependent workers. We analyze the theoretical framework in which the pursuit of human dignity at work is a goal of the state, treating the dignity of the person as a right whose claim to universal validity is grounded in Honneth's theory of recognition.

  4. Rhetorical Approaches to Crisis Communication: The Research, Development, and Validation of an Image Repair Situational Theory for Educational Leaders

    Science.gov (United States)

    Vogelaar, Robert J.

    2005-01-01

    In this project a product to aid educational leaders in the process of communicating in crisis situations is presented. The product was created and received a formative evaluation using an educational research and development methodology. Ultimately, an administrative training course that utilized an Image Repair Situational Theory was developed.…

  5. Validation of molecular crystal structures from powder diffraction data with dispersion-corrected density functional theory (DFT-D)

    DEFF Research Database (Denmark)

    van de Streek, Jacco; Neumann, Marcus A

    2014-01-01

    In 2010 we energy-minimized 225 high-quality single-crystal (SX) structures with dispersion-corrected density functional theory (DFT-D) to establish a quantitative benchmark. For the current paper, 215 organic crystal structures determined from X-ray powder diffraction (XRPD) data and published...

  6. Empirical assessment of the validity limits of the surface wave full ray theory using realistic 3-D Earth models

    KAUST Repository

    Parisi, Laura; Ferreira, Ana M.G.

    2016-01-01

The surface wave full ray theory (FRT) is an efficient tool to calculate synthetic waveforms of surface waves. It combines the concept of local modes with exact ray tracing as a function of frequency, providing a more complete description of surface wave propagation.

  7. Life history theory and breast cancer risk: methodological and theoretical challenges: Response to "Is estrogen receptor negative breast cancer risk associated with a fast life history strategy?".

    Science.gov (United States)

    Aktipis, Athena

    2016-01-01

    In a meta-analysis published by myself and co-authors, we report differences in the life history risk factors for estrogen receptor negative (ER-) and estrogen receptor positive (ER+) breast cancers. Our meta-analysis did not find the association of ER- breast cancer risk with fast life history characteristics that Hidaka and Boddy suggest in their response to our article. There are a number of possible explanations for the differences between their conclusions and the conclusions we drew from our meta-analysis, including limitations of our meta-analysis and methodological challenges in measuring and categorizing estrogen receptor status. These challenges, along with the association of ER+ breast cancer with slow life history characteristics, may make it challenging to find a clear signal of ER- breast cancer with fast life history characteristics, even if that relationship does exist. The contradictory results regarding breast cancer risk and life history characteristics illustrate a more general challenge in evolutionary medicine: often different sub-theories in evolutionary biology make contradictory predictions about disease risk. In this case, life history models predict that breast cancer risk should increase with faster life history characteristics, while the evolutionary mismatch hypothesis predicts that breast cancer risk should increase with delayed reproduction. Whether life history tradeoffs contribute to ER- breast cancer is still an open question, but current models and several lines of evidence suggest that it is a possibility. © The Author(s) 2016. Published by Oxford University Press on behalf of the Foundation for Evolution, Medicine, and Public Health.

  8. Measuring selfhood according to self-determination theory: Construction and validation of the Ego Functioning Questionnaire (EFQ)

    Directory of Open Access Journals (Sweden)

    Majstorović Nebojša

    2008-01-01

The goal of this research was to develop and validate an instrument designed to measure the three types of self proposed by Hodgins and Knee (2002): integrated, ego-invested, and impersonal. This measure was termed the Ego Functioning Questionnaire (EFQ). In Study 1 (N=202), the factorial structure of the EFQ was examined by means of an exploratory factor analysis, and the metric properties of its subscales were documented. In Study 2 (N=300), the 3-factor structure of the EFQ was successfully corroborated using a confirmatory factor analysis. In Study 3 (N=131), associations between the EFQ and a variety of cognitive, affective, and social variables displayed meaningful patterns, thereby providing support for the EFQ's construct validity. The EFQ was also not susceptible to socially desirable responding. Results are discussed in terms of their fundamental and applied implications.

  9. Intersystem crossing and dynamics in O(3P) + C2H4 multichannel reaction: Experiment validates theory

    Science.gov (United States)

    Fu, Bina; Han, Yong-Chang; Bowman, Joel M.; Angelucci, Luca; Balucani, Nadia; Leonori, Francesca; Casavecchia, Piergiorgio

    2012-01-01

    The O(3P) + C2H4 reaction, of importance in combustion and atmospheric chemistry, stands out as a paradigm reaction involving triplet- and singlet-state potential energy surfaces (PESs) interconnected by intersystem crossing (ISC). This reaction poses challenges for theory and experiments owing to the ruggedness and high dimensionality of these potentials, as well as the long lifetimes of the collision complexes. Primary products from five competing channels (H + CH2CHO, H + CH3CO, H2 + CH2CO, CH3 + HCO, CH2 + CH2O) and branching ratios (BRs) are determined in crossed molecular beam experiments with soft electron-ionization mass-spectrometric detection at a collision energy of 8.4 kcal/mol. As some of the observed products can only be formed via ISC from triplet to singlet PESs, from the product BRs the extent of ISC is inferred. A new full-dimensional PES for the triplet state as well as spin-orbit coupling to the singlet PES are reported, and roughly half a million surface hopping trajectories are run on the coupled singlet-triplet PESs to compare with the experimental BRs and differential cross-sections. Both theory and experiment find almost equal contributions from the two PESs to the reaction, posing the question of how important is it to consider the ISC as one of the nonadiabatic effects for this and similar systems involved in combustion chemistry. Detailed comparisons at the level of angular and translational energy distributions between theory and experiment are presented for the two primary channel products, CH3 + HCO and H + CH2CHO. The agreement between experimental and theoretical functions is excellent, implying that theory has reached the capability of describing complex multichannel nonadiabatic reactions. PMID:22665777

  10. The Question of Drug-Dependent Workers under the Prism of Axel Honneth's Theory of Recognition: The Human Right to Work and its Claim to Universal Validity

    Directory of Open Access Journals (Sweden)

    Maria Cecília Máximo Teodoro

    2016-06-01

This article relates Axel Honneth's theory of recognition to the human right to work as a right with a claim to universal validity, from the perspective of the ILO Declaration on Fundamental Principles and Rights at Work, in order to set parameters for the recognition and social (re)integration of drug-dependent workers. We analyze the theoretical framework in which the pursuit of human dignity at work is a goal of the state, treating the dignity of the person as a right whose claim to universal validity is grounded in Honneth's theory of recognition. On this basis, we examine the possibility of dignity at work in a global context through the recognition of new rights, especially for the category of drug-dependent workers, whose participation in the social process is blocked, creating a category of socially excluded workers, an exclusion this recognition is intended to overcome.

  11. Testing for the validity of purchasing power parity theory both in the long-run and the short-run for ASEAN-5

    Science.gov (United States)

    Choji, Niri Martha; Sek, Siok Kun

    2017-11-01

The purchasing power parity (PPP) theory says that the exchange rate between two nations should equal the ratio of the aggregate price levels of the two nations. For more than a decade, there has been substantial interest in testing the validity of PPP empirically. This paper performs a series of tests to see whether PPP is valid for the ASEAN-5 nations over the period 2000-2016 using monthly data. For this purpose, we conducted four different stationarity tests and two cointegration tests (Pedroni and Westerlund), and estimated a VAR model. The stationarity (unit root) tests reveal that the variables are not stationary in levels but are stationary in first differences. The cointegration tests did not reject the H0 of no cointegration, implying the absence of a long-run association among the variables, and the results of the VAR model did not reveal a strong short-run relationship. Based on the data, we therefore conclude that PPP was not valid in either the long or the short run for the ASEAN-5 during 2000-2016.
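Absolute PPP as stated above means the exchange rate equals the ratio of national price levels, so the log real exchange rate q = s - p + p* should hover around zero (stationarity of q is what the unit-root tests probe). A toy check with illustrative numbers:

```python
import math

def ppp_implied_rate(price_domestic, price_foreign):
    """Absolute PPP: exchange rate (domestic currency per foreign) = P_dom / P_for."""
    return price_domestic / price_foreign

# If the same basket costs 420 at home and 100 abroad, PPP implies a rate of 4.2
rate = ppp_implied_rate(420.0, 100.0)
print(rate)

# Log real exchange rate q = s - p + p*; under exact PPP it is zero
q = math.log(rate) - math.log(420.0) + math.log(100.0)
print(abs(q) < 1e-12)
```

The empirical tests in the record ask whether observed q reverts to zero (PPP holds) or drifts like a unit-root process (PPP fails, as found here).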

  12. Assessing the Implicit Theory of Willpower for Strenuous Mental Activities Scale: Multigroup, across-gender, and cross-cultural measurement invariance and convergent and divergent validity.

    Science.gov (United States)

    Napolitano, Christopher M; Job, Veronika

    2018-05-21

    Why do some people struggle with self-control (colloquially called willpower) whereas others are able to sustain it during challenging circumstances? Recent research showed that a person's implicit theory of willpower-whether they think self-control capacity is a limited or nonlimited resource-predicts sustained self-control on laboratory tasks and on goal-related outcomes in everyday life. The present research tests the Implicit Theory of Willpower for Strenuous Mental Activities (ITW-M) Scale for measurement invariance across samples and gender within each culture, and across two cultural contexts (the U.S. and Switzerland/Germany). Across a series of multigroup confirmatory factor analyses, we found support for the measurement invariance of the ITW-M Scale across samples within and across the two cultures, as well as across men and women. Further, the analyses showed expected patterns of convergent (with life satisfaction and trait self-control) and discriminant validity (with implicit theory of intelligence). These results provide guidelines for future research and clinical practice using the ITW-M Scale for the investigation of latent group differences, for example, between genders or cultures. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  13. Cross-cultural validity of the theory of planned behavior for predicting healthy food choice in secondary school students of Inner Mongolia.

    Science.gov (United States)

    Shimazaki, Takashi; Bao, Hugejiletu; Deli, Geer; Uechi, Hiroaki; Lee, Ying-Hua; Miura, Kayo; Takenaka, Koji

    2017-11-01

    Unhealthy eating behavior is a serious health concern among secondary school students in Inner Mongolia. To predict their healthy food choices and devise methods of correcting unhealthy choices, we sought to confirm the cross-cultural validity of the theory of planned behavior among Inner Mongolian students. A cross-sectional study was conducted between November and December 2014; overall, 3047 students were enrolled. We devised a questionnaire based on the theory of planned behavior to measure its components (intentions, attitudes, subjective norms, and perceived behavioral control) in relation to healthy food choices; we also assessed their current engagement in healthy food choices. A principal component analysis revealed high contribution rates for the components (69.32%-88.77%). A confirmatory factor analysis indicated that the components of the questionnaire had adequate model fit (goodness-of-fit index = 0.997, adjusted goodness-of-fit index = 0.984, comparative fit index = 0.998, and root mean square error of approximation = 0.049). Notably, data from participants within the suburbs did not support the theory of planned behavior construction, and several paths did not predict the hypothesized variables. However, attitudes toward healthy food choices strongly predicted behavioral intention (path coefficients 0.49-0.77). The theory of planned behavior can thus apply to secondary school students in urban areas. Furthermore, attitudes towards healthy food choices were the best predictor of behavioral intentions to engage in such choices in Inner Mongolian students. Copyright © 2017 Diabetes India. Published by Elsevier Ltd. All rights reserved.

  14. Rejoinder to commentary on Palmatier and Rovner (2015): credibility assessment: Preliminary Process Theory, the polygraph process, and construct validity.

    Science.gov (United States)

    Palmatier, John J; Rovner, Louis

    2015-01-01

    We briefly review comments submitted in response to the target article in this series (Palmatier & Rovner, 2015), which argues that a scientifically defensible construct for the instrumental assessment of credibility (i.e., polygraph) may be found in Barry's Preliminary Process Theory (PPT). Our review of the relevant scientific literature discovered a growing body of converging evidence, particularly from the neurosciences, focused not only on deception but more broadly on memory, emotion, and the orienting response (OR), leading to this conclusion. After reviewing the submitted comments, we are further convinced, especially as applied scientists, that at this time the most viable direction forward is in the context of the PPT. Concurrently, we candidly acknowledge that research must be conducted to address not only commentator concerns but also, if warranted, modification of existing theory. Although disagreement continues to exist regarding the order in which questions are asked, the most significant finding is perhaps that not a single commentator argues against this growing and vital applied science (i.e., the instrumental assessment of credibility, or polygraph). Copyright © 2014 Elsevier B.V. All rights reserved.

  15. 3-Econsystems: MicroRNAs, Receptors, and Latent Viruses; Some Insights Biology Can Gain from Economic Theory.

    Science.gov (United States)

    Polansky, Hanan; Javaherian, Adrian

    2016-01-01

    This mini-review describes three biological systems. All three include competing molecules and a limiting molecule that binds the competing molecules. Such systems are extensively researched by economists. In fact, the issue of limited resources is the defining feature of economic systems. Therefore, we call these systems "econsystems." In an econsystem, the allocation of the limiting molecule between the competing molecules determines the behavior of the system. A cell is an example of an econsystem. Therefore, a change in the allocation of a limiting molecule as a result of, for instance, an abnormal change in the concentration of one of the competing molecules, may result in abnormal cellular behavior, and disease. The first econsystem described in this mini-review includes a long non-coding RNA and a messenger RNA (lncRNA and mRNA). The limiting molecule is a microRNA (miRNA). The lncRNA and mRNA are known as competing endogenous RNAs (ceRNAs). The second econsystem includes two receptors, and the limiting molecule is a ligand. The third econsystem includes a cis-regulatory element of a latent virus and that of a human gene. The limiting molecule is a transcription complex that binds both cis-elements.

  16. Crack Detection in Fibre Reinforced Plastic Structures Using Embedded Fibre Bragg Grating Sensors: Theory, Model Development and Experimental Validation

    DEFF Research Database (Denmark)

    Pereira, Gilmar Ferreira; Mikkelsen, Lars Pilgaard; McGugan, Malcolm

    2015-01-01

    … properties. When applying this concept to different structures, sensor systems and damage types, a combination of damage mechanics, monitoring technology, and modelling is required. The primary objective of this article is to demonstrate such a combination. This article is divided into three main topics: the damage mechanism (delamination of FRP), the structural health monitoring technology (fibre Bragg gratings to detect delamination), and the finite element method model of the structure that incorporates these concepts into a final and integrated damage-monitoring concept. A novel method for assessing … by the end-user. Conjointly, a novel model for sensor output prediction (virtual sensor) was developed using this FBG sensor crack monitoring concept and implemented in a finite element method code. The monitoring method was demonstrated and validated using glass fibre double cantilever beam specimens …

  17. Second-order theory for coupling 2D numerical and physical wave tanks: Derivation, evaluation and experimental validation

    DEFF Research Database (Denmark)

    Yang, Zhiwen; Liu, Shuxue; Bingham, Harry B.

    2013-01-01

    …, 171–186] is extended to include the second-order dispersive correction. The new formulation is presented in a unified form that includes both progressive and evanescent modes and covers wavemaker configurations of the piston- and flap-type. The second-order paddle stroke correction allows for improved nonlinear wave generation in the physical wave tank based on target numerical solutions. The performance and efficiency of the new model is first evaluated theoretically based on second-order Stokes waves. Due to the complexity of the problem, the proposed method has been truncated at 2D and the treatment … that the new second-order coupling theory provides an improvement in the quality of nonlinear wave generation when compared to existing techniques.

  18. A validation report for the KALIMER core design computing system by the Monte Carlo transport theory code

    International Nuclear Information System (INIS)

    Lee, Ki Bog; Kim, Yeong Il; Kim, Kang Seok; Kim, Sang Ji; Kim, Young Gyun; Song, Hoon; Lee, Dong Uk; Lee, Byoung Oon; Jang, Jin Wook; Lim, Hyun Jin; Kim, Hak Sung

    2004-05-01

    In this report, the results of the KALIMER (Korea Advanced LIquid MEtal Reactor) core design calculated by the K-CORE computing system are compared with those of an MCDEP calculation. The effective multiplication factor, flux distribution, fission power distribution, and the number densities of the important nuclides affected by the depletion calculation are compared for the R-Z and Hex-Z models of the KALIMER core. It is confirmed that the results of the K-CORE system agree with those of MCDEP, which is based on the Monte Carlo transport theory method, within 700 pcm for the effective multiplication factor and, for the reaction rate and the fission power density, within 2% in the driver fuel region and within 10% in the radial blanket region. Thus, the K-CORE system, which treats lumped fission products and the most important nuclides, can be used as a core design tool for KALIMER with the necessary accuracy.
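    The 700 pcm agreement band refers to a difference between multiplication factors expressed in per cent mille of reactivity. One common convention for that conversion can be sketched as follows; the numerical values below are hypothetical and not taken from the report:

    ```python
    def keff_diff_pcm(k_ref, k_test):
        """Reactivity difference in pcm between two multiplication factors,
        using the common convention delta-rho = (1/k_ref - 1/k_test) * 1e5."""
        return (1.0 / k_ref - 1.0 / k_test) * 1e5

    # Hypothetical k-eff pair for illustration only:
    print(abs(keff_diff_pcm(1.00000, 1.00700)))  # ≈ 695 pcm, inside a 700 pcm band
    ```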

  19. Psychometric validation of the Persian Bergen Social Media Addiction Scale using classic test theory and Rasch models.

    Science.gov (United States)

    Lin, Chung-Ying; Broström, Anders; Nilsen, Per; Griffiths, Mark D; Pakpour, Amir H

    2017-12-01

    Background and aims: The Bergen Social Media Addiction Scale (BSMAS) is a six-item self-report scale that serves as a brief and effective psychometric instrument for assessing at-risk social media addiction on the Internet. However, its psychometric properties in Persian have never been examined, and no studies have applied Rasch analysis for the psychometric testing. This study aimed to verify the construct validity of the Persian BSMAS using confirmatory factor analysis (CFA) and Rasch models among 2,676 Iranian adolescents. Methods: In addition to construct validity, measurement invariance in CFA and differential item functioning (DIF) in Rasch analysis across gender were tested for the Persian BSMAS. Results: Both CFA [comparative fit index (CFI) = 0.993; Tucker-Lewis index (TLI) = 0.989; root mean square error of approximation (RMSEA) = 0.057; standardized root mean square residual (SRMR) = 0.039] and Rasch analysis (infit MnSq = 0.88-1.28; outfit MnSq = 0.86-1.22) confirmed the unidimensionality of the BSMAS. Moreover, measurement invariance was supported in multigroup CFA, including metric invariance (ΔCFI = -0.001; ΔSRMR = 0.003; ΔRMSEA = -0.005) and scalar invariance (ΔCFI = -0.002; ΔSRMR = 0.005; ΔRMSEA = 0.001) across gender. No item displayed DIF (DIF contrast = -0.48 to 0.24) in Rasch analysis across gender. Conclusions: Given that the Persian BSMAS is unidimensional, it is concluded that the instrument can be used to assess how addicted an adolescent is to social media on the Internet. Moreover, users of the instrument may comfortably compare sum scores of the BSMAS across gender.
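    The infit and outfit mean-square statistics quoted in the Results are information-weighted and unweighted means of squared standardized residuals. A minimal simulation sketch of their computation for a dichotomous Rasch model is given below; the data are synthetic, not the BSMAS responses (the real BSMAS items are polytomous, which requires the partial-credit extension):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    theta = rng.normal(size=500)        # person locations (logits)
    b = np.linspace(-1.5, 1.5, 6)       # six hypothetical item difficulties

    P = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))  # Rasch probabilities
    X = (rng.uniform(size=P.shape) < P).astype(float)         # simulated responses

    W = P * (1 - P)                     # model variance of each response
    Z2 = (X - P) ** 2 / W               # squared standardized residuals

    outfit = Z2.mean(axis=0)                       # unweighted mean square per item
    infit = (W * Z2).sum(axis=0) / W.sum(axis=0)   # information-weighted mean square

    print(np.round(outfit, 2), np.round(infit, 2))  # both near 1 when the model fits
    ```

    Values near 1 indicate model-consistent responses, which is why the reported ranges of 0.88-1.28 (infit) and 0.86-1.22 (outfit) support unidimensionality.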

  20. Innovation, Product Development, and New Business Models in Networks: How to come from case studies to a valid and operational theory

    DEFF Research Database (Denmark)

    Rasmussen, Erik Stavnsager; Jørgensen, Jacob Høj; Goduscheit, René Chester

    2007-01-01

    We have in the research project NEWGIBM (New Global ICT Based Business Models) during 2005 and 2006 closely cooperated with a group of firms. The focus of the project has been the development of new business models (and innovation) in close cooperation with multiple partners. These partners have been customers, suppliers, R&D partners, and others. The methodological problem is thus how to come from, e.g., one in-depth case study to a more formalized theory or model of how firms can develop new projects and be innovative in a network. The paper is structured so that it starts with a short presentation of the two key concepts in our research setting and theoretical models: innovation and networks. It is not our intention in this paper to present a lengthy discussion of the two concepts, but a short presentation is necessary to understand the validity and interpretation discussion later in the paper. Next …

  1. Errors in the universal and sufficient heuristic criteria of estimating validity limits of geometric optics and of the geometric theory of diffraction

    International Nuclear Information System (INIS)

    Borovikov, V.A.; Kinber, B.E.

    1988-01-01

    The heuristic criteria (HC) for the validity of geometric optics (GO) and of the geometric theory of diffraction (GTD), suggested in [2-7, 13, 14] and based on identifying the physical volume occupied by a ray with the Fresnel volume (FV) introduced in those papers (i.e., the envelope of the first Fresnel zone), are analyzed. Numerous examples of HC invalidity are given, together with the reasons for it. In particular, the HC give an incorrect answer for all GO problems with caustics, since in these problems there always exists a ray whose FV is nonlocal and covers the FVs of other rays. The HC are also shown to be unsuitable for multiple-ray GTD problems, as well as for the simplest problems of diffraction of a cylindrical wave by a half-plane and of a plane wave by a curved half-plane.
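    The first Fresnel zone whose envelope defines the Fresnel volume has the standard radius r = sqrt(lambda * d1 * d2 / (d1 + d2)) at distances d1 and d2 from the two endpoints of the ray. A small sketch, with made-up numbers:

    ```python
    import math

    def fresnel_radius(wavelength, d1, d2):
        """Radius of the first Fresnel zone at distances d1 and d2 from the
        two endpoints of a ray (standard formula); the Fresnel volume is the
        envelope of this zone along the ray."""
        return math.sqrt(wavelength * d1 * d2 / (d1 + d2))

    # Illustrative numbers: a 3 cm wave, midpoint of a 200 m path.
    print(round(fresnel_radius(0.03, 100.0, 100.0), 3))  # ≈ 1.225 m
    ```

    The criteria analyzed in the paper fail precisely where this tube of influence becomes nonlocal, e.g. near caustics, where one ray's Fresnel volume engulfs those of its neighbours.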

  2. Quality of life in the Danish general population--normative data and validity of WHOQOL-BREF using Rasch and item response theory models

    DEFF Research Database (Denmark)

    Noerholm, V; Groenvold, M; Watt, T

    2004-01-01

    BACKGROUND: The main objective of this study was to investigate the construct validity of the WHOQOL-BREF by use of Rasch and item response theory (IRT) models and to examine the stability of the model across high/low-scoring individuals, gender, education, and depressive illness. Furthermore, the objective of the study was to estimate reference data for the quality-of-life questionnaire WHOQOL-BREF in the general Danish population and in subgroups defined by age, gender, and education. METHODS: Mail-out-mail-back questionnaires were sent to a randomly selected sample of the Danish general population. The response rate was 68.5%, and the sample reported here contained 1101 respondents: 578 women and 519 men (four respondents did not indicate their gender). RESULTS: Each of the four domains of the WHOQOL-BREF scale fitted a two-parameter IRT model, but did not fit the Rasch model. Due …

  3. A mathematical method for verifying the validity of measured information about the flows of energy resources based on the state estimation theory

    Science.gov (United States)

    Pazderin, A. V.; Sof'in, V. V.; Samoylenko, V. O.

    2015-11-01

    Efforts aimed at improving energy efficiency in all branches of the fuel and energy complex must begin with setting up a high-tech automated system for monitoring and accounting energy resources. Malfunctions and failures in the measurement and information parts of this system may distort commercial measurements of energy resources and lead to financial risks for power supplying organizations. In addition, measurement errors may be connected with intentional distortion of measurements aimed at reducing payment for energy resources on the consumer's side, which leads to commercial losses of energy resources. The article presents a universal mathematical method, based on state estimation theory, for verifying the validity of measurement information in networks for transporting energy resources such as electricity and heat, petroleum, gas, etc. The energy resource transportation network is represented by a graph whose nodes correspond to producers and consumers and whose branches stand for transportation mains (power lines, pipelines, and heat network elements). The main idea of state estimation is to obtain calculated analogs of all available measurements. Unlike "raw" measurements, which contain inaccuracies, the calculated flows of energy resources, called estimates, fully satisfy all the state equations describing the energy resource transportation network; the state equations written in terms of the calculated estimates are free from residuals. The difference between a measurement and its calculated analog (estimate) is called, in estimation theory, an estimation remainder. Large values of the estimation remainders indicate large errors in particular energy resource measurements. By using the presented method it is possible to improve the validity of energy resource measurements, to estimate the transportation network observability, to eliminate …
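    The projection step behind the estimates and estimation remainders can be sketched for a single-node flow network with an equal-weight least-squares fit. This is an illustrative simplification with invented meter readings; practical state estimators weight each meter by its accuracy and use many nodes:

    ```python
    import numpy as np

    # Toy network: flows f1 and f2 enter one node, f3 leaves it.
    # The state equation A @ f = 0 encodes flow conservation: f1 + f2 - f3 = 0.
    A = np.array([[1.0, 1.0, -1.0]])

    z = np.array([100.0, 52.0, 160.0])  # raw meter readings (they do not balance)

    # Equal-weight least-squares estimate: project z onto the null space of A,
    # i.e. the closest flow vector that satisfies conservation exactly.
    f_hat = z - A.T @ np.linalg.solve(A @ A.T, A @ z)

    remainders = z - f_hat  # estimation remainders absorb the 8-unit imbalance
    print(f_hat, remainders)  # f_hat ≈ [102.67, 54.67, 157.33]
    ```

    The estimates satisfy the state equation exactly, while the remainders carry the measurement inconsistency; with accuracy weights and a larger network, a disproportionately large remainder points at the faulty meter.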

  4. Using Rasch Analysis to Evaluate the Reliability and Validity of the Swallowing Quality of Life Questionnaire: An Item Response Theory Approach.

    Science.gov (United States)

    Cordier, Reinie; Speyer, Renée; Schindler, Antonio; Michou, Emilia; Heijnen, Bas Joris; Baijens, Laura; Karaduman, Ayşe; Swan, Katina; Clavé, Pere; Joosten, Annette Veronica

    2018-02-01

    The Swallowing Quality of Life questionnaire (SWAL-QOL) is widely used clinically and in research to evaluate quality of life related to swallowing difficulties. It has been described as a valid and reliable tool, but was developed and tested using classic test theory. This study describes the reliability and validity of the SWAL-QOL using item response theory (IRT; Rasch analysis). SWAL-QOL data were gathered from 507 participants at risk of oropharyngeal dysphagia (OD) across four European countries. OD was confirmed in 75.7% of participants via videofluoroscopy and/or fiberoptic endoscopic evaluation, or a clinical diagnosis based on meeting selected criteria. Patients with esophageal dysphagia were excluded. Data were analysed using Rasch analysis. Item and person reliability was good for all the items combined. However, person reliability was poor for 8 subscales and item reliability was poor for one subscale. Eight subscales exhibited poor person separation and two exhibited poor item separation. Overall item and person fit statistics were acceptable. However, at the individual item level, results indicated unpredictable responses for 28 items and item redundancy for 10 items. The item-person dimensionality map confirmed these findings. Results from the overall Rasch model fit and a principal component analysis were suggestive of a second dimension. For all the items combined, none of the item categories were 'category', 'threshold' or 'step' disordered; however, all subscales demonstrated disordered category functioning. These findings suggest an urgent need to further investigate the underlying structure of the SWAL-QOL and its psychometric characteristics using IRT.

  5. Investigating the little rip and other future singularities of the universe, and validity of the second law of thermodynamics in F(R) theory

    Directory of Open Access Journals (Sweden)

    M Aghaei Abchouyeh

    2015-01-01

    Full Text Available Future singularities are possible in a universe described by F(R) theory. In previous studies the occurrence of singularities in F(R) theory has been considered by using a special function for the Hubble parameter and calculating the F(R) function for each of the singularities. Using a specified Hubble parameter causes some difficulties in the study of the second law of thermodynamics. In this paper, by using the scale factor, the behavior of the F(R) function near each type of singularity is considered, and we can check the validity of the second law of thermodynamics near the singularities. We first study the Little Rip and then the other types of singularities. The results show that the second law of thermodynamics is satisfied near the type I singularity under certain conditions and violated under others; it is satisfied near the Little Rip and the type II, III, and IV singularities.

  6. The Concurrent and Construct Validity of Intrinsic/Extrinsic Motivation in Japanese EFL Learners : A Self-Determination Theory Perspective

    OpenAIRE

    HONDA, Katsuhisa; SAKYU, Masahide

    2005-01-01

    Vallerand, Blais, Brière, & Pelletier (1989) developed the Échelle de Motivation en Éducation (EME), which comprehensively measures intrinsic/extrinsic motivation and amotivation. The EME, developed in French, is based on Deci & Ryan's (1985) self-determination theory, and its validity has also come to be supported for English speakers and for learners of English as an L2. In Honda & Sakyu (2004), items extracted from its English version, the Academic Motivation Scale (AMS), were administered to junior college students majoring in English in order to examine the validity and reliability of the AMS in the Japanese language environment. The range and mean values of the test-retest reliability coefficients of the AMS indicated acceptable reliability for Japanese EFL learners, and a confirmatory factor analysis (AMOS model) supported the seven-factor structure of the AMS. In this paper, self-determination theory in the Japanese …

  7. Validation of molecular crystal structures from powder diffraction data with dispersion-corrected density functional theory (DFT-D).

    Science.gov (United States)

    van de Streek, Jacco; Neumann, Marcus A

    2014-12-01

    In 2010 we energy-minimized 225 high-quality single-crystal (SX) structures with dispersion-corrected density functional theory (DFT-D) to establish a quantitative benchmark. For the current paper, 215 organic crystal structures determined from X-ray powder diffraction (XRPD) data and published in an IUCr journal were energy-minimized with DFT-D and compared to the SX benchmark. The on average slightly less accurate atomic coordinates of XRPD structures do lead to systematically higher root mean square Cartesian displacement (RMSCD) values upon energy minimization than for SX structures, but the RMSCD value is still a good indicator for the detection of structures that deserve a closer look. The upper RMSCD limit for a correct structure must be increased from 0.25 Å for SX structures to 0.35 Å for XRPD structures, and the grey area must be extended from 0.30 to 0.40 Å. Based on the energy minimizations, three structures are re-refined to give more precise atomic coordinates. For six structures our calculations provide the missing positions for the H atoms; for five structures they provide corrected positions for some H atoms. Seven crystal structures showed a minor error in a non-H atom. For five structures the energy minimizations suggest a higher space-group symmetry. For the 225 SX structures, the only deviations observed upon energy minimization were three minor H-atom-related issues. Preferred orientation is the most important cause of problems, and a preferred-orientation correction is the only correction in which the experimental data are modified to fit the model. We conclude that molecular crystal structures determined from powder diffraction data and published in IUCr journals are of high quality, with fewer than 4% containing an error in a non-H atom.
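    The RMSCD criterion itself is a simple statistic over per-atom displacements. A minimal sketch, ignoring the molecular overlay step that precedes the comparison in practice, and using invented coordinates:

    ```python
    import numpy as np

    def rmscd(x_exp, x_min):
        """Root mean square Cartesian displacement (same units as the
        coordinates, here Å) between experimental and minimized positions."""
        d = x_exp - x_min
        return np.sqrt((d ** 2).sum(axis=1).mean())

    # Hypothetical 4-atom fragment (Å); coordinates are made up for illustration.
    exp_xyz = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0],
                        [1.5, 1.5, 0.0], [0.0, 1.5, 0.0]])
    min_xyz = exp_xyz + np.array([[0.1, 0.0, 0.0], [0.0, 0.1, 0.0],
                                  [-0.1, 0.0, 0.0], [0.0, -0.1, 0.0]])

    print(round(rmscd(exp_xyz, min_xyz), 3))  # 0.1, well below the 0.35 Å limit
    ```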

  8. Validation of the brief version of the Recovery Self-Assessment (RSA-B) using Rasch measurement theory.

    Science.gov (United States)

    Barbic, Skye P; Kidd, Sean A; Davidson, Larry; McKenzie, Kwame; O'Connell, Maria J

    2015-12-01

    In psychiatry, the recovery paradigm is increasingly identified as the overarching framework for service provision. Currently, the Recovery Self-Assessment (RSA), a 36-item rating scale, is commonly used to assess the uptake of a recovery orientation in clinical services. However, the consumer version of the RSA has been found challenging to complete because of its length and the reading level required. In response to this feedback, a brief 12-item version of the RSA was developed (RSA-B). This article describes the development of the modified instrument and the application of traditional psychometric analysis and Rasch measurement theory to test the psychometric properties of the RSA-B. Data from a multisite study of adults with serious mental illnesses (n = 1256) who were followed by assertive community treatment teams were examined for clinical meaning, targeting, response categories, model fit, reliability, dependency, and raw interval-level measurement. Analyses were performed using the Rasch Unidimensional Measurement Model (RUMM 2030). Adequate fit to the Rasch model was observed (χ2 = 112.46, df = 90, p = .06) and internal consistency was good (r = .86). However, Rasch analysis revealed limitations of the 12-item version, with items covering only 39% of the targeted theoretical continuum, two misfitting items, and strong evidence that the five response categories were not working as intended. This study revealed areas for improvement in the shortened 12-item RSA-B. A revisit of the conceptual model and the original 36-item rating scale is encouraged to select items that will help practitioners and researchers measure the full range of recovery orientation. (c) 2015 APA, all rights reserved.

  9. Construct and face validity of a new model for the three-hit theory of depression using PACAP mutant mice on CD1 background.

    Science.gov (United States)

    Farkas, József; Kovács, László Á; Gáspár, László; Nafz, Anna; Gaszner, Tamás; Ujvári, Balázs; Kormos, Viktória; Csernus, Valér; Hashimoto, Hitoshi; Reglődi, Dóra; Gaszner, Balázs

    2017-06-23

    Major depression is a common cause of chronic disability. Despite decades of efforts, no unequivocally accepted animal model is available for studying depression. We tested the validity of a new model based on the three-hit concept of vulnerability and resilience. Genetic predisposition (hit 1, mutation of the pituitary adenylate cyclase-activating polypeptide (PACAP) gene), early-life adversity (hit 2, 180-min maternal deprivation, MD180) and chronic variable mild stress (hit 3, CVMS) were combined. Physical, endocrinological, behavioral and functional morphological tools were used to validate the model. Body- and adrenal weight changes as well as corticosterone titers proved that CVMS was effective. The forced swim test indicated increased depression in CVMS PACAP heterozygous (Hz) mice with MD180 history, accompanied by elevated anxiety level in the marble burying test. Corticotropin-releasing factor neurons in the oval division of the bed nucleus of the stria terminalis showed increased FosB expression, which was refractory to CVMS exposure in wild-type and Hz mice. Urocortin1 neurons became over-active in CVMS-exposed PACAP knockout (KO) mice with MD180 history, suggesting the contribution of the centrally projecting Edinger-Westphal nucleus to the reduced depression and anxiety level of stressed KO mice. Serotoninergic neurons of the dorsal raphe nucleus lost their ability to adapt to CVMS in MD180 mice. In conclusion, the construct and face validity criteria suggest that MD180 PACAP Hz mice on CD1 background upon CVMS may be used as a reliable model for the three-hit theory. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.

  10. Technical note on the validation of a semi-automated image analysis software application for estrogen and progesterone receptor detection in breast cancer

    Science.gov (United States)

    2011-01-01

    Background The immunohistochemical detection of estrogen (ER) and progesterone (PR) receptors in breast cancer is routinely used for prognostic and predictive testing. Whole slide digitalization supported by dedicated software tools allows quantitation of the image objects (e.g. cell membranes, nuclei) and an unbiased analysis of immunostaining results. Validation studies of image analysis applications for the detection of ER and PR in breast cancer specimens have shown strong concordance between the pathologist's manual assessment of slides and scoring performed with different software applications. Methods The effectiveness of two connected semi-automated image analysis applications (NuclearQuant v. 1.13 for Pannoramic™ Viewer v. 1.14) for determining ER and PR status in formalin-fixed, paraffin-embedded breast cancer specimens immunostained with the automated Leica Bond Max system was studied. First, the detection algorithm was calibrated to the scores provided by an independent assessor (pathologist), using selected areas from 38 small digital slides (created from 16 cases) containing a mean of 195 cells. Each cell was manually marked and scored according to the Allred system, which combines frequency and intensity scores. The performance of the calibrated algorithm was then tested on 16 cases (14 invasive ductal carcinoma, 2 invasive lobular carcinoma) against the pathologist's manual scoring of digital slides. Results The detection was calibrated to 87 percent object detection agreement and almost perfect Total Score agreement (Cohen's kappa 0.859, quadratic weighted kappa 0.986), up from slight or moderate agreement at the start of the study with the un-calibrated algorithm. The performance of the application was tested against the pathologist's manual scoring of digital slides on 53 regions of interest of 16 ER and PR slides covering all positivity ranges, and the quadratic weighted kappa provided almost perfect agreement (κ = 0.981) between the two …
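    The quadratic weighted kappa used for the Total Score comparisons can be computed directly from the joint score distribution of the two raters. A self-contained sketch with hypothetical Allred scores (0-8, hence nine categories; the scores below are invented, not from the study):

    ```python
    import numpy as np

    def quadratic_weighted_kappa(a, b, k):
        """Quadratic weighted Cohen's kappa for two raters scoring on 0..k-1."""
        a, b = np.asarray(a), np.asarray(b)
        O = np.zeros((k, k))                 # observed joint distribution
        for i, j in zip(a, b):
            O[i, j] += 1
        O /= O.sum()
        E = np.outer(O.sum(axis=1), O.sum(axis=0))  # expected under independence
        idx = np.arange(k)
        W = (idx[:, None] - idx[None, :]) ** 2 / (k - 1) ** 2  # quadratic weights
        return 1.0 - (W * O).sum() / (W * E).sum()

    # Hypothetical software vs. pathologist Allred Total Scores, for illustration:
    software    = [8, 7, 0, 5, 8, 3, 0, 6, 8, 2]
    pathologist = [8, 7, 0, 6, 8, 3, 0, 6, 8, 2]
    print(round(quadratic_weighted_kappa(software, pathologist, 9), 3))
    ```

    Because the weights grow with the squared distance between scores, near-misses (here a 5 vs. 6) are penalized far less than gross disagreements, which is why this statistic suits ordinal scales such as the Allred score.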

  11. Wide Angle Imaging Lidar (WAIL): Theory of Operation and Results from Cross-Platform Validation at the ARM Southern Great Plains Site

    Science.gov (United States)

    Polonsky, I. N.; Davis, A. B.; Love, S. P.

    2004-05-01

    WAIL was designed to determine physical and geometrical characteristics of optically thick clouds using the off-beam component of the lidar return, which can be accurately modeled within the 3D photon diffusion approximation. The theory shows that the WAIL signal depends not only on the cloud's optical characteristics (phase function, extinction and scattering coefficients) but also on the outer thickness of the cloud layer. This makes it possible to estimate the mean optical and geometrical thicknesses of the cloud. Comparison with Monte Carlo simulation demonstrates the high accuracy of the diffusion approximation for moderately to very dense clouds. During operation WAIL is able to collect a complete data set from a cloud every few minutes, with averaging over a horizontal scale of a kilometer or so. In order to validate WAIL's ability to deliver cloud properties, the LANL instrument was deployed as part of the THickness from Off-beam Returns (THOR) validation IOP. The goal was to probe clouds above the SGP CART site at night in March 2002 from below (WAIL and ARM instruments) and from NASA's P3 aircraft (carrying THOR, the GSFC counterpart of WAIL) flying above the clouds. The permanent cloud instruments used for comparison with the results obtained from WAIL were ARM's laser ceilometer, micro-pulse lidar (MPL), millimeter-wavelength cloud radar (MMCR), and microwave radiometer (MWR). The comparison shows that, in spite of an unusually low cloud ceiling, an unfavorable observation condition for WAIL's present configuration, cloud properties obtained from the new instrument are in good agreement with their counterparts obtained by other instruments. So WAIL can duplicate, at least for single-layer clouds, the cloud products of the MWR and MMCR together. But WAIL does this with green laser light, which is far more representative than microwaves of the photon transport processes at work in the climate system.

  12. Cross-cultural validity of the Spanish version of PHQ-9 among pregnant Peruvian women: a Rasch item response theory analysis.

    Science.gov (United States)

    Zhong, Qiuyue; Gelaye, Bizu; Fann, Jesse R; Sanchez, Sixto E; Williams, Michelle A

    2014-04-01

    We sought to evaluate the validity of the Spanish-language version of the Patient Health Questionnaire-9 (PHQ-9) depression scale in a large sample of pregnant Peruvian women using Rasch item response theory (IRT) approaches. We further sought to examine the appropriateness of the response formats, reliability and potential differential item functioning (DIF) by maternal age, educational attainment and employment status. This cross-sectional study was conducted among 1520 pregnant women in Lima, Peru. A structured interview was used to collect information on demographic characteristics and PHQ-9 items. Data from the PHQ-9 were fitted to the Rasch IRT model and tested for appropriate category ordering, the assumptions of unidimensionality and local independence, item fit, reliability and presence of DIF. The Spanish-language version of the PHQ-9 demonstrated unidimensionality, local independence, and acceptable fit for the Rasch IRT model. However, we detected disordered response categories for the original four response categories. After collapsing "more than half the days" and "nearly every day", the response categories ordered properly and the PHQ-9 fit the Rasch IRT model. The PHQ-9 had moderate internal consistency (person separation index, PSI = 0.72). Additionally, the items of the PHQ-9 were free of DIF with regard to age, educational attainment, and employment status. The Spanish-language version of the PHQ-9 was shown to have the item properties of an effective screening instrument. Collapsing rating-scale categories and reconstructing a three-point Likert scale for all items improved the fit of the instrument. Future studies are warranted to establish new cutoff scores and criterion validity of the three-point Likert-scale response options for the Spanish-language version of the PHQ-9. Copyright © 2014 Elsevier B.V. All rights reserved.
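    The category-collapsing step reported above (merging "more than half the days" with "nearly every day") amounts to a simple recode of the item responses. The sketch below is illustrative only, not the authors' code; it assumes the conventional 0-3 scoring of the PHQ-9 categories.

```python
import numpy as np

# Conventional PHQ-9 scoring: 0 = not at all, 1 = several days,
# 2 = more than half the days, 3 = nearly every day.
# Collapse categories 2 and 3 into one, yielding a 3-point scale (0, 1, 2).
COLLAPSE = {0: 0, 1: 1, 2: 2, 3: 2}

def collapse_categories(responses):
    """Recode an (n_subjects, n_items) matrix of PHQ-9 responses."""
    recode = np.vectorize(COLLAPSE.get)
    return recode(np.asarray(responses))

item_matrix = np.array([[0, 1, 3, 2],
                        [3, 3, 0, 1]])
print(collapse_categories(item_matrix))
```

After such a recode, the Rasch model is refitted on the collapsed data to check that the categories now order properly.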

  13. Clinical Validation of the Nursing Outcome "Swallowing Status" in People with Stroke: Analysis According to the Classical and Item Response Theories.

    Science.gov (United States)

    Oliveira-Kumakura, Ana Railka de Souza; de Araujo, Thelma Leite; Costa, Alice Gabrielle de Sousa; Cavalcante, Tahissa Frota; Lopes, Marcos Venícios de Oliveira; Carvalho, Emilia Campos

    2017-09-19

    OBJECTIVE: To clinically validate the nursing outcome "Swallowing status" in people with stroke. METHODS: The fit of the nursing outcome was investigated according to classical test theory and item response theory. The models were compared regarding information loss, goodness-of-fit, and differential item functioning; stability and internal consistency were also examined. RESULTS: The nursing outcome fit best under the generalized partial credit model with different discrimination parameters, which demonstrated unidimensionality of the outcome. Strong correlations among the scores of each indicator were observed, and there was no differential item functioning of the outcome indicators. The scale presented high internal consistency (Cronbach's α = .954) and stability (> .800). CONCLUSION: This study presents a valid nursing outcome. RELEVANCE TO CLINICAL PRACTICE: More accurate monitoring of sensitivity to an intervention. © 2017 NANDA International, Inc.
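    Several records in this list report internal consistency as Cronbach's alpha. The standard formula, α = k/(k − 1) · (1 − Σσᵢ² / σ²_total), can be computed directly; the sketch below uses synthetic data and is not connected to any study's dataset.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
true_score = rng.normal(size=200)
# Five noisy parallel items measuring the same underlying construct.
scores = np.column_stack([true_score + 0.5 * rng.normal(size=200)
                          for _ in range(5)])
print(round(cronbach_alpha(scores), 2))
```

With perfectly parallel (identical) items the formula returns exactly 1; noisier items pull alpha down.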

  14. Validation and psychometric properties of the Somatic and Psychological HEalth REport (SPHERE) in a young Australian-based population sample using non-parametric item response theory.

    Science.gov (United States)

    Couvy-Duchesne, Baptiste; Davenport, Tracey A; Martin, Nicholas G; Wright, Margaret J; Hickie, Ian B

    2017-08-01

    The Somatic and Psychological HEalth REport (SPHERE) is a 34-item self-report questionnaire that assesses symptoms of mental distress and persistent fatigue. As it was developed as a screening instrument for use mainly in primary care-based clinical settings, its validity and psychometric properties have not been studied extensively in population-based samples. We used non-parametric item response theory to assess scale validity and item properties of the SPHERE-34 scales, collected through four waves of the Brisbane Longitudinal Twin Study (N = 1707, mean age = 12, 51% females; N = 1273, mean age = 14, 50% females; N = 1513, mean age = 16, 54% females; N = 1263, mean age = 18, 56% females). We estimated the heritability of the new scores, their genetic correlation, and their predictive ability in a sub-sample (N = 1993) who completed the Composite International Diagnostic Interview. After excluding the items most responsible for noise, sex or wave bias, the SPHERE-34 questionnaire was reduced to 21 items (SPHERE-21), comprising a 14-item scale for anxiety-depression and a 10-item scale for chronic fatigue (3 items overlapping). These new scores showed high internal consistency (alpha > 0.78), moderate three-month reliability (ICC = 0.47-0.58) and item scalability (Hi > 0.23), and were positively correlated (phenotypic correlations r = 0.57-0.70; rG = 0.77-1.00). Heritability estimates ranged from 0.27 to 0.51. In addition, both scores were associated with later DSM-IV diagnoses of MDD, social anxiety and alcohol dependence (ORs ranging from 1.23 to 1.47). Finally, a post-hoc comparison showed that several psychometric properties of the SPHERE-21 were similar to those of the Beck Depression Inventory. The scales of SPHERE-21 measure valid and comparable constructs across sex and age groups (from 9 to 28 years). SPHERE-21 scores are heritable, genetically correlated and show good predictive ability of mental health in an Australian-based population

  15. Radiosynthesis and in vitro validation of 3H-NS14492 as a novel high affinity alpha7 nicotinic receptor radioligand

    DEFF Research Database (Denmark)

    Magnussen, Janus H.; Ettrup, Anders; Donat, Cornelius K.

    2015-01-01

    The neuronal alpha 7 nicotinic acetylcholine receptor is a homo-pentameric ligand-gated ion channel that is a promising drug target for cognitive deficits in Alzheimer's disease and schizophrenia. We have previously described 11C-NS14492 as a suitable agonist radioligand for in vivo positron emission tomography (PET) occupancy studies of the alpha 7 nicotinic receptor in the pig brain. In order to investigate the utility of the same compound for in vitro studies, 3H-NS14492 was synthesized and its binding properties were characterized using in vitro autoradiography and homogenate binding assays in pig frontal cortex. 3H-NS14492 showed specific binding to alpha 7 nicotinic receptors in autoradiography, revealing a dissociation constant (Kd) of 2.1 ± 0.7 nM and a maximum number of binding sites (Bmax) of 15.7 ± 2.0 fmol/mg tissue equivalent. Binding distribution was similar…
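    The Kd and Bmax values quoted above are the parameters of a one-site saturation binding model, B = Bmax·[L]/(Kd + [L]). A minimal curve-fitting sketch (synthetic data generated near the reported values; not the study's measurements) could look like:

```python
import numpy as np
from scipy.optimize import curve_fit

def one_site(L, bmax, kd):
    """Specific binding for a single-site saturation model."""
    return bmax * L / (kd + L)

# Synthetic saturation data around the reported values
# (Kd ~ 2.1 nM, Bmax ~ 15.7 fmol/mg); illustrative only.
ligand_nM = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
rng = np.random.default_rng(1)
bound = one_site(ligand_nM, 15.7, 2.1) + rng.normal(0, 0.2, ligand_nM.size)

(bmax_fit, kd_fit), _ = curve_fit(one_site, ligand_nM, bound, p0=(10.0, 1.0))
print(f"Bmax = {bmax_fit:.1f} fmol/mg, Kd = {kd_fit:.1f} nM")
```

In practice a Scatchard plot or nonlinear fit like this is run on specific binding (total minus non-specific) at each ligand concentration.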

  16. Insulin receptors

    International Nuclear Information System (INIS)

    Kahn, C.R.; Harrison, L.C.

    1988-01-01

    This book contains the proceedings on insulin receptors. Part A: Methods for the study of structure and function. Topics covered include: Method for purification and labeling of insulin receptors, the insulin receptor kinase, and insulin receptors on special tissues

  17. Validation of self-directed learning instrument and establishment of normative data for nursing students in Taiwan: using polytomous item response theory.

    Science.gov (United States)

    Cheng, Su-Fen; Lee-Hsieh, Jane; Turton, Michael A; Lin, Kuan-Chia

    2014-06-01

    Little research has investigated the establishment of norms for nursing students' self-directed learning (SDL) ability, recognized as an important capability for professional nurses. An item response theory (IRT) approach was used to establish norms for SDL abilities valid for the different nursing programs in Taiwan. The purposes of this study were (a) to use IRT with a graded response model to reexamine the SDL instrument, or the SDLI, originally developed by this research team using confirmatory factor analysis and (b) to establish SDL ability norms for the four different nursing education programs in Taiwan. Stratified random sampling with probability proportional to size was used. A minimum of 15% of students from the four different nursing education degree programs across Taiwan was selected. A total of 7,879 nursing students from 13 schools were recruited. The research instrument was the 20-item SDLI developed by Cheng, Kuo, Lin, and Lee-Hsieh (2010). IRT with the graded response model was used with a two-parameter logistic model (discrimination and difficulty) for the data analysis, calculated using MULTILOG. Norms were established using percentile rank. Analysis of item information and test information functions revealed that 18 items exhibited very high discrimination and two items had high discrimination. The test information function was higher in this range of scores, indicating greater precision in the estimate of nursing student SDL. Reliability fell between .80 and .94 for each domain and the SDLI as a whole. The total information function shows that the SDLI is appropriate for all nursing students, except for the top 2.5%. SDL ability norms were established for each nursing education program and for the nation as a whole. IRT is shown to be a potent and useful methodology for scale evaluation. The norms for SDL established in this research will provide practical standards for nursing educators and students in Taiwan.
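    Norms "established using percentile rank", as described above, map a raw score onto the percentage of the norming sample scoring at or below it. A minimal sketch (hypothetical scores, not the study's norms):

```python
import numpy as np

def percentile_rank(norm_scores, score):
    """Percent of the norming sample scoring at or below `score`."""
    norm_scores = np.sort(np.asarray(norm_scores))
    return 100.0 * np.searchsorted(norm_scores, score, side="right") / norm_scores.size

# Hypothetical total SDLI scores for a small norming sample.
sample = np.array([55, 60, 62, 64, 66, 68, 70, 72, 75, 80])
print(percentile_rank(sample, 70))  # → 70.0
```

Separate norming samples per education program would give the program-specific norms described in the abstract.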

  18. Molecular interactions of agonist and inverse agonist ligands at serotonin 5-HT2C G protein-coupled receptors: computational ligand docking and molecular dynamics studies validated by experimental mutagenesis results

    Science.gov (United States)

    Córdova-Sintjago, Tania C.; Liu, Yue; Booth, Raymond G.

    2015-02-01

    To understand molecular determinants for ligand activation of the serotonin 5-HT2C G protein-coupled receptor (GPCR), a drug target for obesity and neuropsychiatric disorders, a 5-HT2C homology model was built according to an adrenergic β2 GPCR (β2AR) structure and validated using a 5-HT2B GPCR crystal structure. The models were equilibrated in a simulated phosphatidyl choline membrane for ligand docking and molecular dynamics studies. Ligands included (2S, 4R)-(-)-trans-4-(3'-bromo- and trifluoro-phenyl)-N,N-dimethyl-1,2,3,4-tetrahydronaphthalene-2-amine (3'-Br-PAT and 3'-CF3-PAT), a 5-HT2C agonist and inverse agonist, respectively. Distinct interactions of 3'-Br-PAT and 3'-CF3-PAT at the wild-type (WT) 5-HT2C receptor model were observed and experimental 5-HT2C receptor mutagenesis studies were undertaken to validate the modelling results. For example, the inverse agonist 3'-CF3-PAT docked deeper in the WT 5-HT2C binding pocket and altered the orientation of transmembrane helices (TM) 6 in comparison to the agonist 3'-Br-PAT, suggesting that changes in TM orientation that result from ligand binding impact function. For both PATs, mutation of 5-HT2C residues S3.36, T3.37, and F5.47 to alanine resulted in significantly decreased affinity, as predicted from modelling results. It was concluded that upon PAT binding, 5-HT2C residues T3.37 and F5.47 in TMs 3 and 5, respectively, engage in inter-helical interactions with TMs 4 and 6, respectively. The movement of TMs 5 and 6 upon agonist and inverse agonist ligand binding observed in the 5-HT2C receptor modelling studies was similar to movements reported for the activation and deactivation of the β2AR, suggesting common mechanisms among aminergic neurotransmitter GPCRs.

  19. Assessment of a robust model protocol with accelerated throughput for a human recombinant full length estrogen receptor-alpha binding assay: protocol optimization and intralaboratory assay performance as initial steps towards validation.

    Science.gov (United States)

    Freyberger, Alexius; Wilson, Vickie; Weimer, Marc; Tan, Shirlee; Tran, Hoai-Son; Ahr, Hans-Jürgen

    2010-08-01

    Despite about two decades of research in the field of endocrine active compounds, still no validated human recombinant (hr) estrogen receptor-alpha (ERalpha) binding assay is available, although hr-ERalpha is available from several sources. In a joint effort, US EPA and Bayer Schering Pharma, with funding from the EU-sponsored 6th framework project ReProTect, developed a model protocol for such a binding assay. Important features of this assay are the use of a full-length hr-ERalpha and performance in a 96-well plate format. A full-length hr-ERalpha was chosen, as it was considered to provide the most accurate and human-relevant results, whereas truncated receptors could perform differently. Besides three reference compounds [17beta-estradiol, norethynodrel, dibutylphthalate], nine test compounds with different affinities for the ERalpha [diethylstilbestrol (DES), ethynylestradiol, meso-hexestrol, equol, genistein, o,p'-DDT, nonylphenol, n-butylparaben, and corticosterone] were used to explore the performance of the assay. Three independent experiments per compound were performed on different days, and dilutions of test compounds from deep-frozen stocks, solutions of radiolabeled ligand and receptor preparation were freshly prepared for each experiment. The ERalpha binding properties of reference and test compounds were well detected. As expected, dibutylphthalate and corticosterone were non-binders in this assay. In terms of the relative ranking of binding affinities, there was good agreement with published data obtained from experiments using a human recombinant ERalpha ligand binding domain. Irrespective of the chemical nature of the compound, individual IC50 values for a given compound varied by not more than a factor of 2.5. Our data demonstrate that the assay was robust and reliably ranked compounds with strong, weak, and no affinity for the ERalpha with high accuracy.
It avoids the manipulation and use of animals, i.e., the preparation of uterine cytosol as
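    The IC50 values underlying the affinity ranking above are the inhibitor concentrations producing 50% displacement of the radiolabeled ligand. One simple way to estimate an IC50 from a measured competition curve is log-linear interpolation around the 50% point (hypothetical data; the study's actual analysis may have used a full curve fit):

```python
import numpy as np

def ic50_interpolate(conc, pct_bound):
    """Estimate IC50 by log-linear interpolation around 50% specific binding.

    Assumes pct_bound decreases monotonically with increasing concentration
    and that 50% is bracketed within the measured range.
    """
    conc = np.asarray(conc, dtype=float)
    pct = np.asarray(pct_bound, dtype=float)
    logc = np.log10(conc)
    i = np.searchsorted(-pct, -50.0)  # first point at or below 50% binding
    x0, x1 = logc[i - 1], logc[i]
    y0, y1 = pct[i - 1], pct[i]
    return 10 ** (x0 + (50.0 - y0) * (x1 - x0) / (y1 - y0))

# Hypothetical competition curve: percent specific binding vs. inhibitor (nM).
conc_nM = np.array([0.1, 1.0, 10.0, 100.0, 1000.0])
binding = np.array([98.0, 90.0, 50.0, 10.0, 2.0])
print(ic50_interpolate(conc_nM, binding))  # → 10.0 (nM)
```

A factor-of-2.5 spread between replicate IC50 estimates, as reported above, is then simply max(IC50)/min(IC50) across the three experiments.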

  20. Communication Theory.

    Science.gov (United States)

    Penland, Patrick R.

    Three papers are presented which delineate the foundation of theory and principles which underlie the research and instructional approach to communications at the Graduate School of Library and Information Science, University of Pittsburgh. Cybernetic principles provide the integration, and validation is based in part on a situation-producing…

  1. Coarse-grained/molecular mechanics of the TAS2R38 bitter taste receptor: experimentally-validated detailed structural prediction of agonist binding.

    Directory of Open Access Journals (Sweden)

    Alessandro Marchiori

    Bitter molecules in humans are detected by ∼25 G protein-coupled receptors (GPCRs). The lack of an atomic-resolution structure for any of them is complicating an in-depth understanding of the molecular mechanisms underlying bitter taste perception. Here, we investigate the molecular determinants of the interaction of the TAS2R38 bitter taste receptor with its agonists phenylthiocarbamide (PTC) and propylthiouracil (PROP). We use the recently developed hybrid Molecular Mechanics/Coarse Grained (MM/CG) method tailored specifically for GPCRs. The method, through an extensive exploration of the conformational space in the binding pocket, allows the identification of several residues important for agonist binding that would have been very difficult to capture with a standard bioinformatics/docking approach. Our calculations suggest that both agonists bind to Asn103, Phe197, Phe264 and Trp201, whilst they do not interact with the so-called extracellular loop 2, involved in cis-retinal binding in the GPCR rhodopsin. These predictions are consistent with data sets based on more than 20 site-directed mutagenesis and functional calcium imaging experiments of TAS2R38. The method could be readily used for other GPCRs for which experimental information is currently lacking.

  2. Validation of the reference tissue model for estimation of dopaminergic D{sub 2}-like receptor binding with [{sup 18}F](N-methyl)benperidol in humans

    Energy Technology Data Exchange (ETDEWEB)

    Antenor-Dorsey, Jo Ann V. [Department of Anatomy and Neurobiology, Washington University School of Medicine, St. Louis, MO (United States); Markham, Joanne; Moerlein, Stephen M. [Department of Radiology, Washington University School of Medicine, St. Louis, MO (United States); Videen, Tom O. [Department of Radiology, Washington University School of Medicine, St. Louis, MO (United States); Department of Neurology, Washington University School of Medicine, St. Louis, MO (United States); Perlmutter, Joel S. [Department of Anatomy and Neurobiology, Washington University School of Medicine, St. Louis, MO (United States); Department of Radiology, Washington University School of Medicine, St. Louis, MO (United States); Department of Neurology, Washington University School of Medicine, St. Louis, MO (United States); Program in Physical Therapy, Washington University School of Medicine, St. Louis, MO (United States)], E-mail: joel@npg.wustl.edu

    2008-04-15

    Positron emission tomography measurements of dopaminergic D{sub 2}-like receptors may provide important insights into disorders such as Parkinson's disease, schizophrenia, dystonia and Tourette's syndrome. The positron emission tomography (PET) radioligand [{sup 18}F](N-methyl)benperidol ([{sup 18}F]NMB) has high affinity and selectivity for D{sub 2}-like receptors and is not displaced by endogenous dopamine. The goal of this study is to evaluate the use of a graphical method utilizing a reference tissue region for [{sup 18}F]NMB PET analysis by comparisons to an explicit three-compartment tracer kinetic model and a graphical method that uses arterial blood measurements. We estimated binding potential (BP) in the caudate and putamen using all three methods in 16 humans and found that the three-compartment tracer kinetic method provided the highest BP estimates while the graphical method using a reference region yielded the lowest estimates (P < .0001 by repeated-measures ANOVA). However, the three methods yielded highly correlated BP estimates for the two regions of interest. We conclude that the graphical method using a reference region still provides a useful estimate of BP comparable to methods using arterial blood sampling, especially since the reference region method is less invasive and computationally more straightforward, thereby simplifying these measurements.
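    As background on what "binding potential" measures: at pseudo-equilibrium, BP can be approximated from the ratio of activity in a receptor-rich target region to that in a reference region devoid of specific binding. The sketch below is a deliberately simplified equilibrium-ratio estimate, not the graphical or compartmental methods compared in the study, and all numbers are hypothetical.

```python
import numpy as np

def bp_ratio(target_tac, reference_tac):
    """Equilibrium-ratio estimate of binding potential:
    BP ≈ (target / reference) - 1, averaged over late-time frames."""
    target_tac = np.asarray(target_tac, dtype=float)
    reference_tac = np.asarray(reference_tac, dtype=float)
    late = slice(-3, None)  # treat the last three frames as pseudo-equilibrium
    return target_tac[late].mean() / reference_tac[late].mean() - 1.0

# Hypothetical decay-corrected activity (arbitrary units) per time frame
# in a receptor-rich region vs. a reference region.
putamen = [4.0, 6.5, 7.2, 7.0, 6.8, 6.6]
cerebellum = [3.5, 3.0, 2.4, 2.0, 1.9, 1.8]
print(round(bp_ratio(putamen, cerebellum), 2))
```

Reference-region graphical methods such as the one evaluated in the study achieve the same end (BP without arterial sampling) but from the full time-activity curves rather than a late-frame ratio.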

  3. Validation of the reference tissue model for estimation of dopaminergic D2-like receptor binding with [18F](N-methyl)benperidol in humans

    International Nuclear Information System (INIS)

    Antenor-Dorsey, Jo Ann V.; Markham, Joanne; Moerlein, Stephen M.; Videen, Tom O.; Perlmutter, Joel S.

    2008-01-01

    Positron emission tomography measurements of dopaminergic D2-like receptors may provide important insights into disorders such as Parkinson's disease, schizophrenia, dystonia and Tourette's syndrome. The positron emission tomography (PET) radioligand [18F](N-methyl)benperidol ([18F]NMB) has high affinity and selectivity for D2-like receptors and is not displaced by endogenous dopamine. The goal of this study is to evaluate the use of a graphical method utilizing a reference tissue region for [18F]NMB PET analysis by comparisons to an explicit three-compartment tracer kinetic model and a graphical method that uses arterial blood measurements. We estimated binding potential (BP) in the caudate and putamen using all three methods in 16 humans and found that the three-compartment tracer kinetic method provided the highest BP estimates while the graphical method using a reference region yielded the lowest estimates (P < .0001 by repeated-measures ANOVA). However, the three methods yielded highly correlated BP estimates for the two regions of interest. We conclude that the graphical method using a reference region still provides a useful estimate of BP comparable to methods using arterial blood sampling, especially since the reference region method is less invasive and computationally more straightforward, thereby simplifying these measurements.

  4. Experimental and numerical validation of the effective medium theory for the B-term band broadening in 1st and 2nd generation monolithic silica columns.

    Science.gov (United States)

    Deridder, Sander; Vanmessen, Alison; Nakanishi, Kazuki; Desmet, Gert; Cabooter, Deirdre

    2014-07-18

    Effective medium theory (EMT) expressions for the B-term band broadening in monolithic silica columns are presented at the whole-column as well as at the mesoporous skeleton level. Given the bi-continuous nature of the monolithic medium, regular as well as inverse formulations of the EMT-expressions have been established. The established expressions were validated by applying them to a set of experimental effective diffusion (Deff)-data obtained via peak parking on a number of 1st and 2nd generation monolithic silica columns, as well as to a set of numerical diffusion simulations in a simplified monolithic column representation (tetrahedral skeleton model) with different external porosities and internal diffusion coefficients. The numerically simulated diffusion data can be very closely represented over a very broad range of zone retention factors (up to k″=80) using the established EMT-expressions, especially when using the inverse variant. The expressions also allow representing the experimentally measured effective diffusion data very closely. The measured Deff/Dmol-values were found to decrease significantly with increasing retention factor, in general going from about Deff/Dmol=0.55 to 0.65 at low k″ (k″≅1.5-3.8) to Deff/Dmol=0.25 at very high k″ (k″≅40-80). These values are significantly larger than observed in fully-porous and core-shell particles. The intra-skeleton diffusion coefficient (Dpz) was typically found to be of the order of Dpz/Dmol=0.4, compared to Dpz/Dmol=0.2-0.35 observed in most particle-based columns. These higher Dpz/Dmol values are the cause of the higher Deff/Dmol values observed. In addition, it also appears that the higher internal diffusion is linked to the higher porosity of the mesoporous skeleton that has a relatively open structure with relatively wide pores. The observed (weak) relation between Dpz/Dmol and the zone retention factor appears to be in good agreement with that predicted when applying the regular
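    The paper's monolith-specific EMT expressions are not reproduced here, but the flavor of such models can be seen in the classical Maxwell effective-medium relation for spherical inclusions of one phase dispersed in another; the formula and the parameter values below are illustrative assumptions, not the authors' equations.

```python
def maxwell_deff(d1, d2, f2):
    """Maxwell effective-medium diffusivity for spheres of phase 2
    (diffusivity d2, volume fraction f2) dispersed in phase 1 (d1).
    Reduces to d1 at f2 = 0 and to d2 at f2 = 1."""
    num = 2 * d1 + d2 - 2 * f2 * (d1 - d2)
    den = 2 * d1 + d2 + f2 * (d1 - d2)
    return d1 * num / den

# Through-pore diffusivity normalised to Dmol = 1, skeleton Dpz/Dmol = 0.4,
# skeleton volume fraction f2 = 1 - external porosity (e.g. eps_e = 0.7).
print(round(maxwell_deff(1.0, 0.4, 0.3), 3))
```

The abstract's observation, that a higher intra-skeleton Dpz/Dmol (~0.4 vs. 0.2-0.35 for particles) drives the higher Deff/Dmol, corresponds to raising d2 in such an expression.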

  5. Validation of the Danish version of the McGill Ingestive Skills Assessment using classical test theory and the Rasch model

    DEFF Research Database (Denmark)

    Hansen, Tina; Lambert, Heather C; Faber, Jens

    2012-01-01

    Purpose: The study aimed to validate the Danish version of the Canadian "McGill Ingestive Skills Assessment" (MISA-DK) for measuring dysphagia in frail elders. Method: One hundred and ten consecutive older medical patients were recruited to the study. Reliability was assessed by internal consistency (Cronbach's alpha). External construct validity (convergent and known-groups validity) was evaluated against theoretical constructs assessing the complex concept of ingestive skills. Internal construct validity was tested using Rasch analysis. Results: High internal consistency reliability, with Cronbach's alpha of 0.77-0.95, was evident. External construct validity was supported by expected high correlations with most of the constructs related to ingestive skills (r_s = 0.53 to r_s = 0.66). The MISA-DK discriminated significantly between known groups. Fit to the Rasch model (χ² (df) = 12 (12…

  6. Principles of Proper Validation

    DEFF Research Database (Denmark)

    Esbensen, Kim; Geladi, Paul

    2010-01-01

    to suffer from the same deficiencies. The PPV are universal and can be applied to all situations in which the assessment of performance is desired: prediction, classification, time-series forecasting, and modeling validation. The key element of PPV is the Theory of Sampling (TOS), which allows insight… ) is critically necessary for the inclusion of the sampling errors incurred in all 'future' situations in which the validated model must perform. Logically, therefore, all one-data-set re-sampling approaches for validation, especially cross-validation and leverage-corrected validation, should be terminated…

  7. In Vitro Pre-Clinical Validation of Suicide Gene Modified Anti-CD33 Redirected Chimeric Antigen Receptor T-Cells for Acute Myeloid Leukemia.

    Directory of Open Access Journals (Sweden)

    Kentaro Minagawa

    Approximately fifty percent of patients with acute myeloid leukemia can be cured with current therapeutic strategies, which include standard-dose chemotherapy for patients at standard risk of relapse as assessed by cytogenetic and molecular analysis, or high-dose chemotherapy with allogeneic hematopoietic stem cell transplant for high-risk patients. Despite allogeneic hematopoietic stem cell transplant, about 25% of patients still succumb to disease relapse; therefore, novel strategies are needed to improve the outcome of patients with acute myeloid leukemia. We developed an immunotherapeutic strategy targeting the CD33 myeloid antigen, expressed in ~85-90% of patients with acute myeloid leukemia, using chimeric antigen receptor redirected T-cells. Considering that administration of CAR T-cells has been associated with cytokine release syndrome and other potential off-tumor effects in patients, safety measures were here investigated and reported. We genetically modified human activated T-cells from healthy donors or patients with acute myeloid leukemia with retroviral supernatant encoding the inducible Caspase9 suicide gene, a ΔCD19 selectable marker, and a humanized third-generation chimeric antigen receptor recognizing human CD33. ΔCD19-selected inducible Caspase9-CAR.CD33 T-cells had 75 ± 3.8% (average ± standard error of the mean) chimeric antigen receptor expression, were able to specifically lyse CD33+ targets in vitro, including freshly isolated leukemic blasts from patients, produce significant amounts of tumor-necrosis-factor-alpha and interferon-gamma, express the CD107a degranulation marker, and proliferate upon antigen-specific stimulation. Challenging ΔCD19-selected inducible Caspase9-CAR.CD33 T-cells with programmed-death-ligand-1 enriched leukemia blasts resulted in significant killing, similar to that observed for the programmed-death-ligand-1 negative leukemic blast fraction. Since the administration of 10 nanomolar of a non

  8. Social Connectedness and Perceived Listening Effort in Adult Cochlear Implant Users: A Grounded Theory to Establish Content Validity for a New Patient-Reported Outcome Measure.

    Science.gov (United States)

    Hughes, Sarah E; Hutchings, Hayley A; Rapport, Frances L; McMahon, Catherine M; Boisvert, Isabelle

    2018-02-08

    Individuals with hearing loss often report a need for increased effort when listening, particularly in challenging acoustic environments. Despite audiologists' recognition of the impact of listening effort on individuals' quality of life, there are currently no standardized clinical measures of listening effort, including patient-reported outcome measures (PROMs). To generate items and content for a new PROM, this qualitative study explored the perceptions, understanding, and experiences of listening effort in adults with severe-profound sensorineural hearing loss before and after cochlear implantation. Three focus groups (1 to 3) were conducted. Purposive sampling was used to recruit 17 participants from a cochlear implant (CI) center in the United Kingdom. The participants included adults (n = 15, mean age = 64.1 years, range 42 to 84 years) with acquired severe-profound sensorineural hearing loss who satisfied the UK's national candidacy criteria for cochlear implantation and their normal-hearing significant others (n = 2). Participants were CI candidates who used hearing aids (HAs) and were awaiting CI surgery or CI recipients who used a unilateral CI or a CI and contralateral HA (CI + HA). Data from a pilot focus group conducted with 2 CI recipients were included in the analysis. The data, verbatim transcripts of the focus group proceedings, were analyzed qualitatively using constructivist grounded theory (GT) methodology. A GT of listening effort in cochlear implantation was developed from participants' accounts. The participants provided rich, nuanced descriptions of the complex and multidimensional nature of their listening effort. Interpreting and integrating these descriptions through GT methodology, listening effort was described as the mental energy required to attend to and process the auditory signal, as well as the effort required to adapt to, and compensate for, a hearing loss. 
Analyses also suggested that listening effort for most participants was

  9. Validation of missed space-group symmetry in X-ray powder diffraction structures with dispersion-corrected density functional theory

    DEFF Research Database (Denmark)

    Hempler, Daniela; Schmidt, Martin U.; Van De Streek, Jacco

    2017-01-01

    More than 600 molecular crystal structures with correct, incorrect and uncertain space-group symmetry were energy-minimized with dispersion-corrected density functional theory (DFT-D, PBE-D3). For the purpose of determining the correct space-group symmetry the required tolerance on the atomic… with missed symmetry were investigated by dispersion-corrected density functional theory. In 98.5% of the cases the correct space group is found.

  10. Somatostatin receptors

    DEFF Research Database (Denmark)

    Møller, Lars Neisig; Stidsen, Carsten Enggaard; Hartmann, Bolette

    2003-01-01

    functional units, receptors co-operate. The total receptor apparatus of individual cell types is composed of different-ligand receptors (e.g. SRIF and non-SRIF receptors) and co-expressed receptor subtypes (e.g. sst(2) and sst(5) receptors) in characteristic proportions. In other words, levels of individual… -peptides, receptor agonists and antagonists. Relatively long half-lives, as compared to those of the endogenous ligands, have been paramount from the outset. Motivated by theoretical puzzles or the shortcomings of present-day diagnostics and therapy, investigators have also aimed to produce subtype…

  11. Hypothesis and Theory: Revisiting Views on the Co-evolution of the Melanocortin Receptors and the Accessory Proteins, MRAP1 and MRAP2.

    Science.gov (United States)

    Dores, Robert M

    2016-01-01

    The evolution of the melanocortin receptors (MCRs) is closely associated with the evolution of the melanocortin-2 receptor accessory proteins (MRAPs). Recent annotation of the elephant shark genome project revealed the sequence of a putative MRAP1 ortholog. The presence of this sequence in the genome of a cartilaginous fish raises the possibility that the mrap1 and mrap2 genes in the genomes of gnathostome vertebrates were the result of the chordate 2R genome duplication event. The presence of a putative MRAP1 ortholog in a cartilaginous fish genome is perplexing. Recent studies on melanocortin-2 receptor (MC2R) in the genomes of the elephant shark and the Japanese stingray indicate that these MC2R orthologs can be functionally expressed in CHO cells without co-expression of an exogenous mrap1 cDNA. The novel ligand selectivity of these cartilaginous fish MC2R orthologs is discussed. Finally, the origin of the mc2r and mc5r genes is reevaluated. The distinctive primary sequence conservation of MC2R and MC5R is discussed in light of the physiological roles of these two MCR paralogs.

  12. Clinical Validation of a Pixon-Based Reconstruction Method Allowing a Twofold Reduction in Planar Images Time of 111In-Pentetreotide Somatostatin Receptor Scintigraphy

    Directory of Open Access Journals (Sweden)

    Philippe Thuillier

    2017-08-01

    Full Text Available Objective: The objective of this study was to evaluate the diagnostic efficacy of a Pixon-based reconstruction method on planar somatostatin receptor scintigraphy (SRS). Methods: All patients with neuroendocrine tumor (NET) disease who were referred for SRS to our department during a 1-year period, from January to December 2015, were consecutively included. Three nuclear physicians independently reviewed all the data sets of images, which included conventional images (CI; 15 min/view) and processed images (PI) obtained by reconstructing the first 450 s of extracted data using the Oncoflash® software package. Image analysis using a 3-point rating scale for abnormal uptake of 111 Indium-DTPA-Phe-octreotide in any lesion or organ was interpreted as positive, uncertain, or negative for evidence of NET disease. The maximum grade of radiotracer uptake in a lesion was assessed by the Krenning scale method. The results of image interpretation by the two methods were considered significantly discordant when the difference in organ-involvement assessment was negative vs. positive, or when the difference in lesion uptake was ≥2 grades. Agreement between the results of the two methods and between different scan observers was evaluated using Cohen κ coefficients. Results: There was no significant correlation (p = 0.403) between the data-acquisition protocol and image quality. The rates of significant discrepancies in exam interpretation and organ-involvement assessment were 2.8 and 2.6%, respectively. Mean κ values revealed good agreement between CI and PI interpretation, with no difference in agreement on inter/intra-observer analysis. Conclusion: Our results suggest the feasibility of using a Pixon-based reconstruction method for SRS planar images, allowing a twofold reduction of acquisition time without significant alteration of image quality or image interpretation.
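    Inter-method and inter-observer agreement of the kind reported in this record is summarized with Cohen's κ, which corrects observed agreement for chance agreement. A minimal stdlib sketch with hypothetical positive/uncertain/negative reads (not the study's data):

    ```python
    from collections import Counter

    def cohens_kappa(r1, r2):
        """Cohen's kappa for two raters' categorical ratings of the same cases."""
        assert len(r1) == len(r2)
        n = len(r1)
        observed = sum(a == b for a, b in zip(r1, r2)) / n
        c1, c2 = Counter(r1), Counter(r2)
        # Chance agreement from each rater's marginal category frequencies
        expected = sum(c1[c] * c2[c] for c in set(c1) | set(c2)) / n ** 2
        return (observed - expected) / (1 - expected)

    # Hypothetical reads of the same scans by the two methods (CI vs. PI)
    ci = ["pos", "pos", "neg", "neg", "pos", "unc", "neg", "pos"]
    pi = ["pos", "pos", "neg", "pos", "pos", "unc", "neg", "neg"]
    print(round(cohens_kappa(ci, pi), 3))  # -> 0.579
    ```

    Perfect agreement gives κ = 1; agreement no better than chance gives κ = 0.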

  13. Spectroscopic studies of model photo-receptors: validation of a nanosecond time-resolved micro-spectrophotometer design using photoactive yellow protein and α-phycoerythrocyanin.

    Science.gov (United States)

    Purwar, Namrta; Tenboer, Jason; Tripathi, Shailesh; Schmidt, Marius

    2013-09-13

    Time-resolved spectroscopic experiments have been performed with protein in solution and in crystalline form using a newly designed microspectrophotometer. The time-resolution of these experiments can be as good as two nanoseconds (ns), which is the minimal response time of the image intensifier used. With the current setup, the effective time-resolution is about seven ns, determined mainly by the pulse duration of the nanosecond laser. The amount of protein required is small, on the order of 100 nanograms. Bleaching, which is an undesirable effect common to photoreceptor proteins, is minimized by using a millisecond shutter to avoid extensive exposure to the probing light. We investigate two model photoreceptors, photoactive yellow protein (PYP), and α-phycoerythrocyanin (α-PEC), on different time scales and at different temperatures. Relaxation times obtained from kinetic time-series of difference absorption spectra collected from PYP are consistent with previous results. The comparison with these results validates the capability of this spectrophotometer to deliver high quality time-resolved absorption spectra.

  14. Patient self-report section of the ASES questionnaire: a Spanish validation study using classical test theory and the Rasch model.

    Science.gov (United States)

    Vrotsou, Kalliopi; Cuéllar, Ricardo; Silió, Félix; Rodriguez, Miguel Ángel; Garay, Daniel; Busto, Gorka; Trancho, Ziortza; Escobar, Antonio

    2016-10-18

    The aim of the current study was to validate the self-report section of the American Shoulder and Elbow Surgeons questionnaire (ASES-p) in Spanish. Shoulder pathology patients were recruited and followed up to 6 months post-treatment. The ASES-p, Constant, SF-36 and Barthel scales were filled in pre- and post-treatment. Reliability was tested with Cronbach's alpha, and convergent validity with Spearman's correlation coefficients. Confirmatory factor analysis (CFA) and the Rasch model were implemented to assess the structural validity and unidimensionality of the scale. Models with and without the pain item were considered. Responsiveness to change was explored via standardised effect sizes. Results were acceptable for both tested models. Cronbach's alpha was 0.91, and total-scale correlations with the Constant and physical SF-36 dimensions were >0.50. Factor loadings for the CFA were >0.40. The Rasch model confirmed the unidimensionality of the scale, even though item 10 ("do usual sport") was flagged as non-informative. Finally, patients with improved post-treatment shoulder function and those receiving surgery had higher standardised effect sizes. The adapted Spanish ASES-p version is a valid and reliable tool for shoulder evaluation, and its unidimensionality is supported by the data.
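    The internal-consistency index reported in this record, Cronbach's alpha, relates the sum of item variances to the variance of total scores. A minimal stdlib sketch on hypothetical item scores (not the ASES-p data):

    ```python
    def cronbach_alpha(items):
        """items: list of per-item score lists, one entry per respondent each."""
        k = len(items)
        n = len(items[0])

        def var(xs):  # population variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / len(xs)

        item_vars = sum(var(it) for it in items)
        totals = [sum(it[i] for it in items) for i in range(n)]
        return k / (k - 1) * (1 - item_vars / var(totals))

    # Hypothetical 3-item scale answered by 5 respondents
    items = [
        [3, 4, 5, 2, 4],
        [3, 5, 4, 2, 5],
        [2, 4, 5, 3, 4],
    ]
    print(round(cronbach_alpha(items), 3))  # -> 0.886
    ```

    Values near the study's 0.91 indicate that the items covary strongly, i.e. the scale behaves as one consistent measure.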

  15. The Children's Social Understanding Scale: Construction and Validation of a Parent-Report Measure for Assessing Individual Differences in Children's Theories of Mind

    Science.gov (United States)

    Tahiroglu, Deniz; Moses, Louis J.; Carlson, Stephanie M.; Mahy, Caitlin E. V.; Olofson, Eric L.; Sabbagh, Mark A.

    2014-01-01

    Children's theory of mind (ToM) is typically measured with laboratory assessments of performance. Although these measures have generated a wealth of informative data concerning developmental progressions in ToM, they may be less useful as the sole source of information about individual differences in ToM and their relation to other facets of…

  16. A Meta-Analytic Review of the Theories of Reasoned Action and Planned Behavior in Physical Activity: Predictive Validity and the Contribution of Additional Variables.

    Science.gov (United States)

    Hagger, Martin S.; Chatzisarantis, Nikos L. D.; Biddle, Stuart J. H.

    2002-01-01

    Examined relations between behavior, intentions, attitudes, subjective norms, perceived behavioral control, self-efficacy, and past behaviors using the Theories of Reasoned Action (TRA) and Planned Behavior (TPB) in physical activity. This quantitative integration of the physical activity literature supported the major relationships of the…

  17. Validity of Memory Tasks across the WJ III COG, NEPSYII, and WRAML-2 in a Mixed Clinical Sample of Children: Applicability to Four Neurocognitive Theories

    Science.gov (United States)

    Psimas, J. Lynsey

    2012-01-01

    Current research regarding the neurocognitive construct of memory in children and adolescents within clinical populations is insufficient (Hughes & Graham, 2002). Controversial theories of memory have led to divergent hypotheses about the construct of memory. Based on current disparity regarding the theoretical paradigm of memory, it cannot be…

  18. Detection and validation of unscalable item score patterns using Item Response Theory: An illustration with Harter's Self-Perception Profile for Children

    NARCIS (Netherlands)

    Meijer, R.R.; Egberink, I.J.L.; Emons, Wilco H.M.; Sijtsma, Klaas

    2008-01-01

    We illustrate the usefulness of person-fit methodology for personality assessment. For this purpose, we use person-fit methods from item response theory. First, we give a nontechnical introduction to existing person-fit statistics. Second, we analyze data from Harter's (1985)Self-Perception Profile

  19. Parametric mapping of 5HT1A receptor sites in the human brain with the Hypotime method: theory and normal values

    DEFF Research Database (Denmark)

    Møller, Mette; Rodell, Anders; Gjedde, Albert

    2009-01-01

    The radioligand [carbonyl-(11)C]WAY-100635 ((11)C-WAY) is a PET tracer of the serotonin 5HT(1A) receptors in the human brain. It is metabolized so rapidly in the circulation that it behaves more as a chemical microsphere than as a tracer subject to continuous exchange between the circulation...... and reference regions continue to exchange radioligand with the circulation during the entire uptake period. Here, we proposed a method of calculation (Hypotime) that specifically uses the washout rather than the accumulation of (11)C-WAY to determine binding potentials (BP(ND)), without the use of regression...

  20. Clinical Translation and Validation of a Predictive Biomarker for Patritumab, an Anti-human Epidermal Growth Factor Receptor 3 (HER3) Monoclonal Antibody, in Patients With Advanced Non-small Cell Lung Cancer.

    Science.gov (United States)

    Mendell, Jeanne; Freeman, Daniel J; Feng, Wenqin; Hettmann, Thore; Schneider, Matthias; Blum, Sabine; Ruhe, Jens; Bange, Johannes; Nakamaru, Kenji; Chen, Shuquan; Tsuchihashi, Zenta; von Pawel, Joachim; Copigneaux, Catherine; Beckman, Robert A

    2015-03-01

    During early clinical development, prospective identification of a predictive biomarker and validation of an assay method may not always be feasible. Dichotomizing a continuous biomarker measure to classify responders also leads to challenges. We present a case study of a prospective-retrospective approach for a continuous biomarker identified after patient enrollment but defined prospectively before the unblinding of data. An analysis of the strengths and weaknesses of this approach and the challenges encountered in its practical application are also provided. HERALD (NCT02134015) was a double-blind, phase 2 study in patients with non-small cell lung cancer (NSCLC) randomized to erlotinib with placebo or with high or low doses of patritumab, a monoclonal antibody targeted against human epidermal growth factor receptor 3 (HER3). While the primary objective was to assess safety and progression-free survival (PFS), a secondary objective was to determine a single predictive biomarker hypothesis to identify subjects most likely to benefit from the addition of patritumab. Although not identified as the primary biomarker in the study protocol, on the basis of preclinical results from 2 independent laboratories, expression levels of the HER3 ligand heregulin (HRG) were prospectively declared the predictive biomarker before data unblinding but after subject enrollment. An assay to measure HRG mRNA was developed and validated. Other biomarkers, such as epidermal growth factor receptor (EGFR) mutation status, were also evaluated in an exploratory fashion. The cutoff value for high vs. low HRG mRNA levels was set at the median delta threshold cycle. A maximum likelihood analysis was performed to evaluate the provisional cutoff. The relationship of HRG values to PFS hazard ratios (HRs) was assessed as a measure of internal validation. Additional NSCLC samples were analyzed to characterize HRG mRNA distribution. The subgroup of patients with high HRG mRNA levels ("HRG

  1. Antagonistic properties of a natural product-Bicuculline with the gamma-aminobutyric acid receptor: studied through electrostatic potential mapping, electronic and vibrational spectra using ab initio and density functional theory.

    Science.gov (United States)

    Srivastava, Anubha; Tandon, Poonam; Jain, Sudha; Asthana, B P

    2011-12-15

    (+)-Bicuculline (hereinafter referred to as bicuculline), a phthalide isoquinoline alkaloid is of current interest as an antagonist of gamma-aminobutyric acid (GABA). Its inhibitor properties have been studied through molecular electrostatic potential (MEP) mapping of this molecule and GABA receptor. The hot site on the potential surface of bicuculline, which is also isosteric with GABA receptor, has been used to interpret the inhibitor property. A systematic quantum chemical study of the possible conformations, their relative stabilities, FT-Raman, FT-IR and UV-vis spectroscopic analysis of bicuculline has been reported. The optimized geometries, wavenumber and intensity of the vibrational bands of all the conformers of bicuculline have been calculated using ab initio Hartree-Fock (HF) and density functional theory (DFT) employing B3LYP functional and 6-311G(d,p) basis set. Mulliken atomic charges, HOMO-LUMO gap ΔE, ionization potential, dipole moments and total energy have also been obtained for the optimized geometries of both the molecules. TD-DFT method is used to calculate the electronic absorption parameters in gas phase as well as in solvent environment using integral equation formalism-polarizable continuum model (IEF-PCM) employing 6-31G basis set and the results thus obtained are compared with the UV absorption spectra. The combination of experimental and calculated results provides an insight into the structural and vibrational spectroscopic properties of bicuculline. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Validation of the bipolar disorder etiology scale based on psychological behaviorism theory and factors related to the onset of bipolar disorder.

    Directory of Open Access Journals (Sweden)

    Jae Woo Park

    Full Text Available OBJECTIVES: The aim of this study was to identify psychosocial factors related to the onset of bipolar I disorder (BD). To do so, the Bipolar Disorder Etiology Scale (BDES), based on psychological behaviorism, was developed and validated. Using the BDES, common factors related to both major depressive disorder (MDD) and BD and specific factors related only to BD were investigated. METHOD: The BDES, which measures 17 factors based on psychological behaviorism hypotheses, was developed and validated. This scale was administered to 113 non-clinical control subjects, 30 subjects with MDD, and 32 people with BD. ANOVA and post hoc analyses were conducted. Subscales on which the MDD and BD groups scored higher than controls were classified as common factors, while those on which the BD group scored higher than the MDD and control groups were classified as specific factors. RESULTS: The BDES has acceptable reliability and validity. Twelve common factors influence both MDD and BD and one specific factor influences only BD. Common factors include the following: learning grandiose self-labeling, learning dangerous behavior, reinforcing impulsive behavior, exposure to irritability, punishment of negative emotional expression, lack of support, sleep problems, antidepressant problems, positive arousal to threat, lack of social skills, and pursuit of short-term pleasure. The specific factor is manic emotional response. CONCLUSIONS: Manic emotional response was identified as a specific factor related to the onset of BD, while parents' grandiose labeling is a candidate for a specific factor. Many factors are related to the onset of both MDD and BD.

  3. Validation of the bipolar disorder etiology scale based on psychological behaviorism theory and factors related to the onset of bipolar disorder.

    Science.gov (United States)

    Park, Jae Woo; Park, Kee Hwan

    2014-01-01

    The aim of this study was to identify psychosocial factors related to the onset of bipolar I disorder (BD). To do so, the Bipolar Disorder Etiology Scale (BDES), based on psychological behaviorism, was developed and validated. Using the BDES, common factors related to both major depressive disorder (MDD) and BD and specific factors related only to BD were investigated. The BDES, which measures 17 factors based on psychological behaviorism hypotheses, was developed and validated. This scale was administered to 113 non-clinical control subjects, 30 subjects with MDD, and 32 people with BD. ANOVA and post hoc analyses were conducted. Subscales on which MDD and BD groups scored higher than controls were classified as common factors, while those on which the BD group scored higher than MDD and control groups were classified as specific factors. The BDES has acceptable reliability and validity. Twelve common factors influence both MDD and BD and one specific factor influences only BD. Common factors include the following: learning grandiose self-labeling, learning dangerous behavior, reinforcing impulsive behavior, exposure to irritability, punishment of negative emotional expression, lack of support, sleep problems, antidepressant problems, positive arousal to threat, lack of social skills, and pursuit of short-term pleasure. The specific factor is manic emotional response. Manic emotional response was identified as a specific factor related to the onset of BD, while parents' grandiose labeling is a candidate for a specific factor. Many factors are related to the onset of both MDD and BD.

  4. Detection and validation of unscalable item score patterns using item response theory: an illustration with Harter's Self-Perception Profile for Children.

    Science.gov (United States)

    Meijer, Rob R; Egberink, Iris J L; Emons, Wilco H M; Sijtsma, Klaas

    2008-05-01

    We illustrate the usefulness of person-fit methodology for personality assessment. For this purpose, we use person-fit methods from item response theory. First, we give a nontechnical introduction to existing person-fit statistics. Second, we analyze data from Harter's (1985) Self-Perception Profile for Children (Harter, 1985) in a sample of children ranging from 8 to 12 years of age (N = 611) and argue that for some children, the scale scores should be interpreted with care and caution. Combined information from person-fit indexes and from observation, interviews, and self-concept theory showed that similar score profiles may have a different interpretation. For some children in the sample, item scores did not adequately reflect their trait level. Based on teacher interviews, this was found to be due most likely to a less developed self-concept and/or problems understanding the meaning of the questions. We recommend investigating the scalability of score patterns when using self-report inventories to help the researcher interpret respondents' behavior correctly.
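    A simple nonparametric person-fit statistic in the spirit of the methods this record introduces is the count of Guttman errors: with items ordered from easiest to hardest, every pair in which an easier item is failed while a harder one is passed counts as an error, and a high count flags a pattern whose scale score should be interpreted with caution. A minimal sketch on hypothetical 0/1 response patterns (not the Self-Perception Profile data):

    ```python
    def guttman_errors(pattern):
        """Count Guttman errors in a 0/1 response pattern.

        `pattern` is ordered from easiest to hardest item; an error is any
        pair (i, j) with i easier than j, item i failed, and item j passed.
        """
        errors = 0
        for i in range(len(pattern)):
            for j in range(i + 1, len(pattern)):
                if pattern[i] == 0 and pattern[j] == 1:
                    errors += 1
        return errors

    print(guttman_errors([1, 1, 1, 0, 0]))  # perfect Guttman pattern -> 0
    print(guttman_errors([0, 0, 1, 1, 1]))  # fails easy, passes hard -> 6
    ```

    The two example patterns yield the same total score (3) but very different fit, which is precisely the point made in the abstract: similar score profiles may have different interpretations.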

  5. Gonadotropins, their receptors, and the regulation of testicular functions in fish

    NARCIS (Netherlands)

    Schulz, Rüdiger W; Vischer, H F; Cavaco, J E; Dos Santos Rocha, M.E.; Tyler, R.C.; Goos, H.J.; Bogerd, J.

    The pituitary gonadotropins luteinizing hormone (LH) and follicle-stimulating hormone (FSH) regulate steroidogenesis and spermatogenesis by activating receptors expressed by Leydig cells (LH receptor) and Sertoli cells (FSH receptor), respectively. This concept is also valid in fish, although the

  6. Validation of missed space-group symmetry in X-ray powder diffraction structures with dispersion-corrected density functional theory.

    Science.gov (United States)

    Hempler, Daniela; Schmidt, Martin U; van de Streek, Jacco

    2017-08-01

    More than 600 molecular crystal structures with correct, incorrect and uncertain space-group symmetry were energy-minimized with dispersion-corrected density functional theory (DFT-D, PBE-D3). For the purpose of determining the correct space-group symmetry the required tolerance on the atomic coordinates of all non-H atoms is established to be 0.2 Å. For 98.5% of 200 molecular crystal structures published with missed symmetry, the correct space group is identified; there are no false positives. Very small, very symmetrical molecules can end up in artificially high space groups upon energy minimization, although this is easily detected through visual inspection. If the space group of a crystal structure determined from powder diffraction data is ambiguous, energy minimization with DFT-D provides a fast and reliable method to select the correct space group.
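    The 0.2 Å tolerance reported in this record amounts to checking the largest per-atom displacement between the experimental and energy-minimized coordinates of the non-H atoms. A minimal stdlib sketch with hypothetical coordinates in Å (the atom names, values, and pairing are illustrative, not from the paper):

    ```python
    import math

    def max_displacement(coords_a, coords_b):
        """Largest per-atom distance between two matched coordinate lists (Å)."""
        return max(
            math.dist(a, b)  # Euclidean distance, Python 3.8+
            for a, b in zip(coords_a, coords_b)
        )

    # Hypothetical non-H atomic positions before/after DFT-D minimization
    experimental = [(0.00, 0.00, 0.00), (1.40, 0.10, 0.00), (2.10, 1.20, 0.30)]
    minimized    = [(0.02, 0.01, 0.00), (1.45, 0.05, 0.02), (2.05, 1.25, 0.28)]

    d = max_displacement(experimental, minimized)
    print(d <= 0.2)  # within the 0.2 Å symmetry-detection tolerance -> True
    ```

    If the minimized structure stays within the tolerance of a higher-symmetry arrangement, the higher space group is the candidate to accept.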

  7. Particle-size distribution modified effective medium theory and validation by magneto-dielectric Co-Ti substituted BaM ferrite composites

    Science.gov (United States)

    Li, Qifan; Chen, Yajie; Harris, Vincent G.

    2018-05-01

    This letter reports an extended effective medium theory (EMT) including particle-size distribution functions to maximize the magnetic properties of magneto-dielectric composites. It is experimentally verified by Co-Ti substituted barium ferrite (BaCoxTixFe12-2xO19)/wax composites with specifically designed particle-size distributions. In the form of an integral equation, the extended EMT formula essentially takes the size-dependent parameters of magnetic particle fillers into account. It predicts the effective permeability of magneto-dielectric composites with various particle-size distributions, indicating an optimal distribution for a population of magnetic particles. The improvement of the optimized effective permeability is significant concerning magnetic particles whose properties are strongly size dependent.

  8. Quantification of human epidermal growth factor receptor 2 immunohistochemistry using the Ventana Image Analysis System: correlation with gene amplification by fluorescence in situ hybridization: the importance of instrument validation for achieving high (>95%) concordance rate.

    Science.gov (United States)

    Dennis, Jake; Parsa, Rezvaneh; Chau, Donnie; Koduru, Prasad; Peng, Yan; Fang, Yisheng; Sarode, Venetia Rumnong

    2015-05-01

    The use of computer-based image analysis for scoring human epidermal growth factor receptor 2 (HER2) immunohistochemistry (IHC) has gained a lot of interest recently. We investigated the performance of the Ventana Image Analysis System (VIAS) in HER2 quantification by IHC and its correlation with fluorescence in situ hybridization (FISH). We specifically compared the 3+ IHC results using the manufacturer's machine score cutoffs versus laboratory-defined cutoffs with the FISH assay. Using the manufacturer's 3+ cutoff (VIAS score; 2.51 to 3.5), 181/536 (33.7%) were scored 3+, and FISH was positive in 147/181 (81.2%), 2 (1.1%) were equivocal, and 32 (17.6%) were FISH (-). Using the laboratory-defined 3+ cutoff (VIAS score 3.5), 52 (28.7%) cases were downgraded to 2+, of which 29 (55.7%) were FISH (-), and 23 (44.2%) were FISH (+). With the revised cutoff, there were improvements in the concordance rate from 89.1% to 97.0% and in the positive predictive value from 82.1% to 97.6%. The false-positive rate for 3+ decreased from 9.0% to 0.8%. Six of 175 (3.4%) IHC (-) cases were FISH (+). Three cases with a VIAS score 3.5 showed polysomy of chromosome 17. In conclusion, the VIAS may be a valuable tool for assisting pathologists in HER2 scoring; however, the positive cutoff defined by the manufacturer is associated with a high false-positive rate. This study highlights the importance of instrument validation/calibration to reduce false-positive results.
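    Concordance rate and positive predictive value in IHC-vs-FISH comparisons like this one come straight from the 2×2 table of calls. A minimal sketch with hypothetical counts (not the study's exact table):

    ```python
    def concordance_and_ppv(tp, fp, fn, tn):
        """Concordance = overall agreement; PPV = TP / (TP + FP)."""
        total = tp + fp + fn + tn
        concordance = (tp + tn) / total
        ppv = tp / (tp + fp)
        return concordance, ppv

    # Hypothetical counts: IHC 3+ call (test) vs. FISH amplification (reference)
    conc, ppv = concordance_and_ppv(tp=120, fp=3, fn=6, tn=380)
    print(f"concordance={conc:.3f}, PPV={ppv:.3f}")
    ```

    Raising the 3+ cutoff, as the study did, moves borderline cases out of the TP/FP cells; fewer false positives raises PPV, at the cost of sending more equivocal 2+ cases to reflex FISH testing.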

  9. Receptor assay

    Energy Technology Data Exchange (ETDEWEB)

    Kato, K; Ibayashi, H [Kyushu Univ., Fukuoka (Japan). Faculty of Medicine

    1975-05-01

    This paper summarizes the present status and problems of hormone-receptor analysis, with a few considerations on the clinical significance of receptor abnormalities. It was pointed out that, in the clinical field of the future, the quantitative and qualitative analysis of receptors will not be confined to etiological discussion; rather, it is an epoch-making field of investigation that holds the possibility of artificially altering the sensitivity of the living body to drugs, and of developments connected directly with the treatment of various diseases.

  10. Explicating Validity

    Science.gov (United States)

    Kane, Michael T.

    2016-01-01

    How we choose to use a term depends on what we want to do with it. If "validity" is to be used to support a score interpretation, validation would require an analysis of the plausibility of that interpretation. If validity is to be used to support score uses, validation would require an analysis of the appropriateness of the proposed…

  11. An analytical model for the prediction of fluid-elastic forces in a rod bundle subjected to axial flow: theory, experimental validation and application to PWR fuel assemblies

    International Nuclear Information System (INIS)

    Beaud, F.

    1997-01-01

    A model predicting the fluid-elastic forces in a bundle of circular cylinders subjected to axial flow is presented in this paper. Whereas previously published models were limited to a circular flow channel, the present one can take a rectangular external flow boundary into account. For that purpose, an original approach is derived from the standard method of images. This model will eventually be used to predict the fluid-structure coupling between the flow of primary coolant and the fuel assemblies in PWR nuclear reactors. It is indeed of major importance, since the flow is shown to induce quite high damping and could therefore mitigate the effect of an external load, such as a seismic excitation, on the dynamics of the assemblies. The proposed model is validated on two cases from the literature but still needs further comparison with the experiments currently being carried out on the EDF set-up. The flow has been shown to induce approximately 12% damping on a PWR fuel assembly at nominal reactor conditions. The possible effect of the grids on the fluid-structure coupling has been neglected so far but will soon be investigated at EDF. (author)

  12. Nuclear data covariances and sensitivity analysis, validation of a methodology based on the perturbation theory; application to an innovative concept: the molten thorium salt fueled reactor

    International Nuclear Information System (INIS)

    Bidaud, A.

    2005-10-01

    Neutron transport simulation of nuclear reactors is based on knowledge of the neutron-nucleus interaction (cross-sections, fission neutron yields and spectra...) for the dozens of nuclei present in the core over a very large energy range (fractions of an eV to several MeV). To meet the goal of the sustainable development of nuclear power, future reactors must satisfy new and stricter design constraints: optimal use of ore materials will necessitate breeding (generation of fissile material from fertile material), and waste management will require transmutation. Innovative reactors that could achieve such objectives - generation IV or ADS (accelerator driven systems) - are loaded with new fuels (thorium, heavy actinides) and operate with neutron spectra for which nuclear data do not benefit from 50 years of industrial experience, and thus present particular challenges. After validation on an experimental reactor using an international benchmark, we use classical reactor physics tools along with available nuclear data uncertainties to calculate the sensitivities and uncertainties of the criticality and temperature coefficient of a thorium molten salt reactor. In addition, a study based on the reaction rates important for the calculation of the cycle's equilibrium allows us to estimate the efficiency of different reprocessing strategies and the contribution of these reaction rates to the uncertainty of breeding, and hence to the uncertainty in the size of the reprocessing plant. Finally, we use this work to propose an improvement of the high-priority experimental request list. (author)

  13. Gastrointestinal Spatiotemporal mRNA Expression of Ghrelin vs Growth Hormone Receptor and New Growth Yield Machine Learning Model Based on Perturbation Theory.

    Science.gov (United States)

    Ran, Tao; Liu, Yong; Li, Hengzhi; Tang, Shaoxun; He, Zhixiong; Munteanu, Cristian R; González-Díaz, Humberto; Tan, Zhiliang; Zhou, Chuanshe

    2016-07-27

    The management of ruminant growth yield has economic importance. The current work presents a study of the spatiotemporal dynamic expression of Ghrelin and GHR at mRNA levels throughout the gastrointestinal tract (GIT) of kid goats under housing and grazing systems. The experiments show that the feeding system and age affected the expression of either Ghrelin or GHR with different mechanisms. Furthermore, the experimental data are used to build new Machine Learning models based on the Perturbation Theory, which can predict the effects of perturbations of Ghrelin and GHR mRNA expression on the growth yield. The models consider eight longitudinal GIT segments (rumen, abomasum, duodenum, jejunum, ileum, cecum, colon and rectum), seven time points (0, 7, 14, 28, 42, 56 and 70 d) and two feeding systems (Supplemental and Grazing feeding) as perturbations from the expected values of the growth yield. The best regression model was obtained using Random Forest, with the coefficient of determination R(2) of 0.781 for the test subset. The current results indicate that the non-linear regression model can accurately predict the growth yield and the key nodes during gastrointestinal development, which is helpful to optimize the feeding management strategies in ruminant production system.
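    The perturbation-theory encoding described in this record treats each case as a deviation from the expected (reference-group average) value of the outcome; those deviation features are then fed to the regressor (Random Forest in the paper, omitted here). A minimal sketch of that feature construction on hypothetical records; the grouping by GIT segment and the values are illustrative assumptions, not the paper's exact feature definition:

    ```python
    from collections import defaultdict

    # Hypothetical records: (GIT segment, day, feeding system, growth yield)
    records = [
        ("rumen",    7, "grazing",      0.8),
        ("rumen",    7, "supplemental", 1.1),
        ("duodenum", 7, "grazing",      0.9),
        ("duodenum", 7, "supplemental", 1.2),
    ]

    # Expected value of the outcome per reference group (here: per segment)
    groups = defaultdict(list)
    for seg, day, feed, y in records:
        groups[seg].append(y)
    expected = {seg: sum(v) / len(v) for seg, v in groups.items()}

    # Perturbation feature: deviation of the observed yield from its expectation
    features = [(seg, day, feed, y - expected[seg]) for seg, day, feed, y in records]
    for row in features:
        print(row)
    ```

    Each row now describes how far a given segment/time/feeding combination perturbs the outcome away from its reference expectation, which is the representation the regression model learns from.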

  14. Field theory and strings

    International Nuclear Information System (INIS)

    Bonara, L.; Cotta-Ramusino, P.; Rinaldi, M.

    1987-01-01

    It is well known that type I and heterotic superstring theories have a zero-mass spectrum which corresponds to the field content of N=1 supergravity theory coupled to supersymmetric Yang-Mills theory in 10-D. The authors study the field theory per se, in the hope that simple consistency requirements will determine the theory completely once one knows the field content inherited from string theory. The simplest consistency requirements are: N=1 supersymmetry, and absence of chiral anomalies. This is what the authors discuss in this paper, leaving undetermined the question of the range of validity of the resulting field theory. A model of N=1 supergravity (SUGRA) coupled to supersymmetric Yang-Mills (SYM) theory was already known in the form given by Chapline and Manton. The coupling of SUGRA to SYM was determined by the definition of the "field strength" 3-form H in this paper.

  15. Dopamine Receptors and Parkinson's Disease

    Directory of Open Access Journals (Sweden)

    Shin Hisahara

    2011-01-01

    Full Text Available Parkinson's disease (PD) is a progressive extrapyramidal motor disorder. Pathologically, this disease is characterized by selective dopaminergic (DAergic) neuronal degeneration in the substantia nigra. Correcting the DA deficiency in PD with levodopa (L-dopa) significantly attenuates the motor symptoms; however, its effectiveness often declines, and L-dopa-related adverse effects emerge after long-term treatment. Nowadays, DA receptor agonists are useful medications, even regarded as a first choice to delay the start of L-dopa therapy. In the advanced stage of PD, they are also used as adjunct therapy together with L-dopa. DA receptor agonists act by stimulation of presynaptic and postsynaptic DA receptors. Despite their usefulness, they can be causative drugs for valvulopathy and for nonmotor complications such as DA dysregulation syndrome (DDS). In this paper, the physiological characteristics of the DA receptor family are discussed. We also discuss the validity, benefits, and specific adverse effects of pharmaceutical DA receptor agonists.

  16. "The Theory of Heat Radiation" Revisited: A Commentary on the Validity of Kirchhoff's Law of Thermal Emission and Max Planck's Claim of Universality

    Directory of Open Access Journals (Sweden)

    Robitaille P.-M.

    2015-04-01

    Full Text Available Affirming Kirchhoff's Law of thermal emission, Max Planck conferred upon his own equation and its constants, h and k, universal significance. All arbitrary cavities were said to behave as blackbodies. They were thought to contain black, or normal, radiation, which depended only upon the temperature and frequency of observation, irrespective of the nature of the cavity walls. Today, laboratory blackbodies are specialized, heated devices whose interior walls are lined with highly absorptive surfaces, such as graphite, soot, or other sophisticated materials. Such evidence repeatedly calls into question Kirchhoff's Law, as nothing in the laboratory is independent of the nature of the walls. By focusing on Max Planck's classic text, "The Theory of Heat Radiation", it can be demonstrated that the German physicist was unable to properly justify Kirchhoff's Law. At every turn, he was confronted with the fact that materials possess frequency-dependent reflectivity and absorptivity, but he often chose to sidestep these realities. He used polarized light to derive Kirchhoff's Law, when it is well known that blackbody radiation is never polarized. Through the use of an element, dσ, at the bounding surface between two media, he reached the untenable position that arbitrary materials have the same reflective properties. His Eq. 40 (ρ = ρ′) constituted a dismissal of experimental reality. It is evident that if one neglects reflection, then all cavities must be black. Unable to ensure that perfectly reflecting cavities can be filled with black radiation, Planck inserted a minute carbon particle, which he qualified as a "catalyst". In fact, it was acting as a perfect absorber, fully able to provide, on its own, the radiation sought. In 1858, Balfour Stewart had outlined that the proper treatment of cavity radiation must include reflection. Yet, Max Planck did not cite the Scottish scientist. He also did not correctly address

  17. Methodical Challenges and a Possible Resolution in the Assessment of Receptor Reserve for Adenosine, an Agonist with Short Half-Life

    Directory of Open Access Journals (Sweden)

    Judit Zsuga

    2017-05-01

    Full Text Available The term receptor reserve, first introduced and used in the traditional receptor theory, is an integrative measure of the response-inducing ability of the interaction between an agonist and a receptor system (consisting of a receptor and its downstream signaling). The underlying phenomenon, i.e., that stimulation of a submaximal fraction of receptors can, in certain cases, apparently elicit the maximal effect, provides an opportunity to assess the receptor reserve. However, determining receptor reserve is challenging for agonists with short half-lives, such as adenosine. Although adenosine metabolism can be inhibited in several ways (in order to prevent the rapid elimination of adenosine administered to construct concentration–effect (E/c) curves for the determination), the consequent accumulation of endogenous adenosine biases the results. To address this problem, we previously proposed a method by means of which this bias can be mathematically corrected (utilizing a traditional receptor theory-independent approach). In the present investigation, we have offered in silico validation of this method by simulating E/c curves with the use of the operational model of agonism and then evaluating them using our method. We have found that our method is suitable to reliably assess the receptor reserve for adenosine in our recently published experimental setting, suggesting that it may be capable of a qualitative determination of receptor reserve for rapidly eliminating agonists in general. In addition, we have disclosed a possible interference between FSCPX (8-cyclopentyl-N3-[3-(4-(fluorosulfonylbenzoyloxypropyl]-N1-propylxanthine), an irreversible A1 adenosine receptor antagonist, and NBTI (S-(2-hydroxy-5-nitrobenzyl)-6-thioinosine), a nucleoside transport inhibitor, i.e., FSCPX may blunt the effect of NBTI.
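The operational model of agonism used for the in silico E/c simulations can be sketched as follows. All parameter values (Em, KA, τ, n) are illustrative assumptions, not the study's fitted values:

```python
import numpy as np

def operational_model(conc, Em=100.0, KA=1e-6, tau=10.0, n=1.0):
    """Black-Leff operational model of agonism: effect E as a function of
    agonist concentration [A]; tau is the transducer ratio. Parameter
    values here are illustrative only."""
    num = Em * (tau * conc) ** n
    return num / ((KA + conc) ** n + (tau * conc) ** n)

# Simulate an E/c curve over a wide concentration range.
conc = np.logspace(-10, -3, 400)
effect = operational_model(conc)

# Receptor reserve shows up as EC50 << KA: with n = 1 the half-maximal
# effect is reached at KA / (1 + tau), well below the dissociation constant.
emax = effect.max()
ec50 = conc[np.argmin(np.abs(effect - emax / 2.0))]
print(ec50 < 1e-6)  # True: spare receptors shift the E/c curve leftward
```

With τ = 10, roughly a tenfold leftward shift relative to KA is expected, which is the quantitative signature of receptor reserve the abstract refers to.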

  18. The validation of language tests

    African Journals Online (AJOL)

    KATEVG

    Stellenbosch Papers in Linguistics, Vol. ... validation is necessary because of the major impact which test results can have on the many ... Messick (1989: 20) introduces his much-quoted progressive matrix (cf. table 1), which ... argue that current accounts of validity only superficially address theories of measurement.

  19. TLX: An elusive receptor.

    Science.gov (United States)

    Benod, Cindy; Villagomez, Rosa; Webb, Paul

    2016-03-01

    TLX (tailless receptor) is a member of the nuclear receptor superfamily and belongs to a class of nuclear receptors for which no endogenous or synthetic ligands have yet been identified. TLX is a promising therapeutic target in neurological disorders and brain tumors. Thus, regulatory ligands for TLX need to be identified to complete the validation of TLX as a useful target; such ligands would also serve as chemical probes with which to pursue the study of this receptor in disease models. It has recently been proved that TLX is druggable. However, to identify potent and specific TLX ligands with desirable biological activity, a deeper understanding is needed of where ligands bind, of how they alter TLX conformation, and of the mechanism by which TLX mediates the transcription of its target genes. While TLX is in the process of escaping from orphanhood, future ligand design needs to progress in parallel with an improved understanding of (i) the binding cavity or surfaces to target with small molecules on the TLX ligand binding domain and (ii) the nature of the TLX coregulators in particular cell and disease contexts. Both of these topics are discussed in this review. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Pharmacological approach of the receptors

    International Nuclear Information System (INIS)

    Puech, A.J.

    1989-01-01

    This paper explains the three main goals of clinical positron emission tomography (PET) studies: detection of receptor abnormalities in groups of patients to propose therapeutic indications for new ligands; validation of current hypotheses of drug effects; and rational clinical drug development, especially for dose-finding studies. (H.W.)

  1. FACTAR validation

    International Nuclear Information System (INIS)

    Middleton, P.B.; Wadsworth, S.L.; Rock, R.C.; Sills, H.E.; Langman, V.J.

    1995-01-01

    A detailed strategy to validate fuel channel thermal mechanical behaviour codes for use in current power reactor safety analysis is presented. The strategy is derived from a validation process that has recently been adopted industry wide. The focus of the discussion is the validation plan for a code, FACTAR, for application in assessing fuel channel integrity safety concerns during a large break loss of coolant accident (LOCA). (author)

  2. Employee Development and Turnover Intention: Theory Validation

    Science.gov (United States)

    Rahman, Wali; Nas, Zekeriya

    2013-01-01

    Purpose: This study aims to examine the pattern of behavior of turnover intentions in developing countries "vis-a-vis" the one in advanced countries through the empirical data from public universities in Khyber Pakhtunkhwa, Pakistan. The study provides empirical evidence from academia in Pakistan, thereby enriching the understanding of…

  3. A Validation of Cognitive Evaluation Theory.

    Science.gov (United States)

    McDonald, Charles H.

    The role of money and other types of feedback on motivation to do a task or job has long been of interest to managers in business. To examine Deci's hypothesis concerning the effects of contingent rewards on intrinsic task interest, 42 high school students worked puzzles involving the solution of mazes and anagrams. Competence in task was made…

  4. Developing a Domain Theory Defining and Exemplifying a Learning Theory of Progressive Attainments

    Science.gov (United States)

    Bunderson, C. Victor

    2011-01-01

    This article defines the concept of Domain Theory, or, when educational measurement is the goal, one might call it a "Learning Theory of Progressive Attainments in X Domain". The concept of Domain Theory is first shown to be rooted in validity theory, then the concept of domain theory is expanded to amplify its necessary but long neglected…

  5. Waltz's Theory of Theory

    DEFF Research Database (Denmark)

    Wæver, Ole

    2009-01-01

    -empiricism and anti-positivism of his position. Followers and critics alike have treated Waltzian neorealism as if it was at bottom a formal proposition about cause-effect relations. The extreme case of Waltz being so victorious in the discipline, and yet being consistently mis-interpreted on the question of theory......, shows the power of a dominant philosophy of science in US IR, and thus the challenge facing any ambitious theorising. The article suggests a possible movement of fronts away from the ‘fourth debate' between rationalism and reflectivism towards one of theory against empiricism. To help this new agenda...

  6. Identification, validation, and clinical implementation of tumor-associated biomarkers to improve therapy concepts, survival, and quality of life of cancer patients: tasks of the Receptor and Biomarker Group of the European Organization for Research and Treatment of Cancer.

    NARCIS (Netherlands)

    Schmitt, M.; Harbeck, N.; Daidone, M.G.; Brynner, N.; Duffy, M.J.; Foekens, J.A.; Sweep, C.G.J.

    2004-01-01

    Guiding principles are provided and discussed on how to inform the physician scientist and cancer researcher about quality control systems to enable a consistent assessment of the clinical value of tumor-associated biomarkers. Next to cancer research itself, the Receptor and Biomarker Group of the

  7. Nuclear data covariances and sensitivity analysis, validation of a methodology based on the perturbation theory; application to an innovative concept: the molten thorium salt fueled reactor; Analyses de sensibilite et d'incertitude de donnees nucleaires. Contribution a la validation d'une methodologie utilisant la theorie des perturbations; application a un concept innovant: reacteur a sels fondus thorium a spectre epithermique

    Energy Technology Data Exchange (ETDEWEB)

    Bidaud, A

    2005-10-15

    Neutron transport simulation of nuclear reactors is based on knowledge of the neutron-nucleus interaction (cross-sections, fission neutron yields and spectra...) for the dozens of nuclei present in the core over a very large energy range (fractions of an eV to several MeV). To meet the goal of the sustainable development of nuclear power, future reactors must satisfy new and stricter design constraints: optimization of ore materials will necessitate breeding (generation of fissile material from fertile material), and waste management will require transmutation. Innovative reactors that could achieve such objectives - generation IV or ADS (accelerator driven systems) - are loaded with new fuels (thorium, heavy actinides) and operate with neutron spectra for which nuclear data do not benefit from 50 years of industrial experience, and thus present particular challenges. After validation on an experimental reactor using an international benchmark, we apply classical reactor physics tools, along with available nuclear data uncertainties, to calculate the sensitivities and uncertainties of the criticality and temperature coefficient of a thorium molten salt reactor. In addition, a study based on the reaction rates important for the calculation of the cycle's equilibrium allows us to estimate the efficiency of different reprocessing strategies and the contribution of these reaction rates to the uncertainty of the breeding, and hence to the uncertainty in the size of the reprocessing plant. Finally, we use this work to propose an improvement of the high priority experimental request list. (author)
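Propagating nuclear data covariances through sensitivities, as described above, conventionally uses the "sandwich rule" Var(R) = Sᵀ C S. A minimal numeric sketch, with an invented sensitivity vector and covariance matrix (not real nuclear data):

```python
import numpy as np

# Invented example: relative sensitivities of k_eff to two cross-sections
# and a relative covariance matrix for those nuclear data.
S = np.array([0.8, -0.3])            # (dk/k) / (dsigma/sigma)
C = np.array([[0.0004, 0.0001],      # relative variances/covariances
              [0.0001, 0.0009]])

# Sandwich rule: propagated relative variance of the response.
var_k = S @ C @ S
print(round(float(np.sqrt(var_k)) * 100, 2), "% relative uncertainty")  # 1.7 %
```

The off-diagonal covariance term matters: ignoring it here would change the propagated uncertainty, which is why full covariance matrices, not just variances, are needed for such analyses.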

  8. Imidazole derivatives as angiotensin II AT1 receptor blockers: Benchmarks, drug-like calculations and quantitative structure-activity relationships modeling

    Science.gov (United States)

    Alloui, Mebarka; Belaidi, Salah; Othmani, Hasna; Jaidane, Nejm-Eddine; Hochlaf, Majdi

    2018-03-01

    We performed benchmark studies on the molecular geometry, electronic properties and vibrational analysis of imidazole using semi-empirical, density functional theory and post Hartree-Fock methods. These studies validated the use of AM1 for the treatment of larger systems. Then, we treated the structural, physical and chemical relationships for a series of imidazole derivatives acting as angiotensin II AT1 receptor blockers using AM1. QSAR studies were done for these imidazole derivatives using a combination of various physicochemical descriptors. A multiple linear regression procedure was used to model the relationships between molecular descriptors and the activity of the imidazole derivatives. The results validate the derived QSAR model.
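A multiple-linear-regression QSAR fit of the kind described can be sketched as follows. The descriptor matrix and activity values are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical data: rows are imidazole derivatives, columns are two
# physicochemical descriptors (e.g. lipophilicity and dipole moment).
X = np.array([[2.1, 4.5],
              [2.8, 3.9],
              [3.5, 5.1],
              [1.9, 4.0],
              [3.1, 4.8]])
y = np.array([6.2, 6.9, 7.8, 5.9, 7.4])  # invented activities (e.g. pIC50)

# Ordinary least squares: activity = b0 + b1*x1 + b2*x2.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# R^2 measures how much of the activity variance the descriptors explain.
pred = A @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(0.0 <= r2 <= 1.0)  # True
```

In practice a QSAR study would also report cross-validated statistics (e.g. q²) rather than the training-set R² alone, since the latter always improves as descriptors are added.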

  9. Microcontinuum field theories

    CERN Document Server

    Eringen, A Cemal

    1999-01-01

    Microcontinuum field theories constitute an extension of classical field theories -- of elastic bodies, deformations, electromagnetism, and the like -- to microscopic spaces and short time scales. Material bodies are here viewed as collections of large numbers of deformable particles, much as each volume element of a fluid in statistical mechanics is viewed as consisting of a large number of small particles for which statistical laws are valid. Classical continuum theories are valid when the characteristic length associated with external forces or stimuli is much larger than any internal scale of the body under consideration. When the characteristic lengths are comparable, however, the response of the individual constituents becomes important, for example, in considering the fluid or elastic properties of blood, porous media, polymers, liquid crystals, slurries, and composite materials. This volume is concerned with the kinematics of microcontinua. It begins with a discussion of strain, stress tensors, balanc...

  10. Motivasyonel Dil (MD) Teorisi ve Ölçme Aracının Türkçede Geçerlik ve Güvenilirlik Analizi = The Reliability and Validity Analyses of Motivational Language Theory and Scale

    Directory of Open Access Journals (Sweden)

    Türker BAŞ

    2011-08-01

    Full Text Available When the literature on leadership and communication is examined, it can be seen that until the 1990s there was not enough study of the effects of a leader’s language and its content on the motivation and performance of employees. This gap was filled in the theoretical dimension by the Motivating Language Theory of Sullivan (1988) and, in the practical dimension, by the Motivating Language Scale developed by Mayfield, Mayfield and Kopf (1995) as an extension of this theory. In this study, the scale developed by Mayfield, Mayfield and Kopf (1995) has been tested for its validity and reliability. As a result of the analyses carried out, it has been determined that the scale has a high rate of validity and reliability. Therefore, it is assessed that this scale can contribute to empirical studies in the future.

  11. Validation philosophy

    International Nuclear Information System (INIS)

    Vornehm, D.

    1994-01-01

    To determine when a set of calculations falls under the umbrella of existing validation documentation, it is necessary to generate a quantitative definition of the range of applicability (our definition is only qualitative) for two reasons: (1) the current trend in our regulatory environment will soon make it impossible to support the legitimacy of a validation without quantitative guidelines; and (2) in my opinion, the lack of support by DOE for further critical experiment work is directly tied to our inability to draw a quantitative "line in the sand" beyond which we will not use computer-generated values

  12. Kerlinger's Criterial Referents Theory Revisited.

    Science.gov (United States)

    Zak, Itai; Birenbaum, Menucha

    1980-01-01

    Kerlinger's criterial referents theory of attitudes was tested cross-culturally by administering an education attitude referents summated-rating scale to 713 individuals in Israel. The response pattern to criterial and noncriterial referents was examined. Results indicated empirical cross-cultural validity of theory, but questioned measuring…

  13. Ring Theory

    CERN Document Server

    Jara, Pascual; Torrecillas, Blas

    1988-01-01

    The papers in this proceedings volume are selected research papers in different areas of ring theory, including graded rings, differential operator rings, K-theory of noetherian rings, torsion theory, regular rings, cohomology of algebras, local cohomology of noncommutative rings. The book will be important for mathematicians active in research in ring theory.

  14. Game theory

    DEFF Research Database (Denmark)

    Hendricks, Vincent F.

    Game Theory is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in game theory. We hear their views on game theory, its aim, scope, use, the future direction of game theory and how their work fits in these respects....

  15. SEEKING A VALID THEORY OF MAGIC REALISM: A CRITICAL LITERATURE REVIEW / BUSCANDO UNA TEORÍA VALIDA DE REALISMO MÁGICO: UNA REVISIÓN BIBLIOGRÁFICA CRÍTICA

    Directory of Open Access Journals (Sweden)

    Milagro Asensio

    2017-11-01

    Full Text Available Although much has been researched and written about the literary genre of magic realism, the conclusions at which most studies arrive vary and, in some cases, even contradict each other. The misconceptions associated with the term, on the one hand, and the attachment to a structuralist model, even in postcolonial and postmodern studies of the genre, on the other, have hindered the development of a conceptual framework that can be used for analysing magical realist works. This article addresses these issues by reviewing the critical literature on magic realism since the term was first coined. The most widely used criteria to define and characterise the genre are questioned in the light of both postcolonial and postmodern theories. This examination of the many contributions made by researchers from different literary approaches aims at outlining a well-grounded theoretical framework with sound criteria to define the genre that can be validated and improved in future research.

  16. Knock-In Mice with NOP-eGFP Receptors Identify Receptor Cellular and Regional Localization.

    Science.gov (United States)

    Ozawa, Akihiko; Brunori, Gloria; Mercatelli, Daniela; Wu, Jinhua; Cippitelli, Andrea; Zou, Bende; Xie, Xinmin Simon; Williams, Melissa; Zaveri, Nurulain T; Low, Sarah; Scherrer, Grégory; Kieffer, Brigitte L; Toll, Lawrence

    2015-08-19

    The nociceptin/orphanin FQ (NOP) receptor, the fourth member of the opioid receptor family, is involved in many processes common to the opioid receptors including pain and drug abuse. To better characterize receptor location and trafficking, knock-in mice were created by inserting the gene encoding enhanced green fluorescent protein (eGFP) into the NOP receptor gene (Oprl1) and producing mice expressing a functional NOP-eGFP C-terminal fusion in place of the native NOP receptor. The NOP-eGFP receptor was present in brain of homozygous knock-in animals in concentrations somewhat higher than in wild-type mice and was functional when tested for stimulation of [(35)S]GTPγS binding in vitro and in patch-clamp electrophysiology in dorsal root ganglia (DRG) neurons and hippocampal slices. Inhibition of morphine analgesia was equivalent when tested in knock-in and wild-type mice. Imaging revealed detailed neuroanatomy in brain, spinal cord, and DRG and was generally consistent with in vitro autoradiographic imaging of receptor location. Multicolor immunohistochemistry identified cells coexpressing various spinal cord and DRG cellular markers, as well as coexpression with μ-opioid receptors in DRG and brain regions. Both in tissue slices and primary cultures, the NOP-eGFP receptors appear throughout the cell body and in processes. These knock-in mice have NOP receptors that function both in vitro and in vivo and appear to be an exceptional tool to study receptor neuroanatomy and correlate with NOP receptor function. The NOP receptor, the fourth member of the opioid receptor family, is involved in pain, drug abuse, and a number of other CNS processes. The regional and cellular distribution has been difficult to determine due to lack of validated antibodies for immunohistochemical analysis. To provide a new tool for the investigation of receptor localization, we have produced knock-in mice with a fluorescent-tagged NOP receptor in place of the native NOP receptor. These

  17. String theory

    International Nuclear Information System (INIS)

    Chan Hongmo.

    1987-10-01

    The paper traces the development of the String Theory, and was presented at Professor Sir Rudolf Peierls' 80th Birthday Symposium. The String theory is discussed with respect to the interaction of strings, the inclusion of both gauge theory and gravitation, inconsistencies in the theory, and the role of space-time. The physical principles underlying string theory are also outlined. (U.K.)

  18. Synthesis, in vitro validation and in vivo pharmacokinetics of [{sup 125}I]N-[2-(4-iodophenyl)ethyl]-N-methyl-2-(1-piperidinyl) ethylamine: A high-affinity ligand for imaging sigma receptor positive tumors

    Energy Technology Data Exchange (ETDEWEB)

    John, Christy S; Gulden, Mary E; Vilner, Bertold J; Bowen, Wayne D

    1996-08-01

    N-[2-(4-iodophenyl)ethyl]-N-methyl-2-(1-piperidinyl)ethylamine, IPEMP, and the corresponding bromo derivative, BrPEMP, have been synthesized and characterized. Both BrPEMP and IPEMP were evaluated for sigma-1 and sigma-2 subtype receptor affinities and found to possess very high affinities for both receptor subtypes. The precursor for radioiodination, n-tributylstannylphenylethylpiperidinylethylamine, was prepared from its bromo derivative by a palladium-catalyzed stannylation reaction. Radioiodinated 4-[{sup 125}I]PEMP was readily prepared in high yields and at high specific activity by an oxidative iododestannylation reaction using chloramine-T as the oxidizing agent. Sites labeled by 4-[{sup 125}I]PEMP in guinea pig brain membranes showed high affinity for BD1008, haloperidol, and (+)-pentazocine (Ki = 5.06 {+-} 0.40, 32.6 {+-} 2.75, and 48.1 {+-} 8.60 nM, respectively), which is consistent with sigma receptor pharmacology. Competition binding studies of 4-[{sup 125}I]PEMP in melanoma (A375) and MCF-7 breast cancer cells showed high-affinity, dose-dependent inhibition of binding by the known sigma ligand N-[2-(3,4-dichlorophenyl)ethyl]-N-methyl-2-(1-pyrrolidinyl)ethylamine, BD1008 (Ki = 5 and 11 nM, respectively), supporting the labeling of sigma sites in these cells. Haloperidol, however, showed a weaker affinity (Ki = 100-200 nM) for the sites labeled by 4-[{sup 125}I]PEMP in these cells. Biodistribution studies of 4-[{sup 125}I]PEMP in rats showed fast clearance of this radiopharmaceutical from blood, liver, lung, and other organs. Co-injection of 4-IPEMP with 4-[{sup 125}I]PEMP resulted in 37%, 69%, and 35% decreases in activity in liver, kidney, and brain (organs possessing sigma receptors), respectively, at 1-h postinjection. These results suggest that 4-[{sup 125}I]PEMP is a promising radiopharmaceutical for pursuing further studies in animal models with tumors.
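Competition-binding IC50 values of the kind measured here are conventionally converted to the reported Ki values with the Cheng-Prusoff equation. A minimal sketch, with illustrative numbers rather than the paper's data:

```python
def cheng_prusoff_ki(ic50_nm, radioligand_nm, kd_nm):
    """Cheng-Prusoff correction: Ki = IC50 / (1 + [L]/Kd), where [L] is
    the radioligand concentration and Kd its dissociation constant.
    All values below are illustrative, not taken from the study."""
    return ic50_nm / (1.0 + radioligand_nm / kd_nm)

# Example: IC50 = 10 nM measured with 2 nM radioligand whose Kd is 2 nM.
print(cheng_prusoff_ki(10.0, 2.0, 2.0))  # 5.0
```

The correction matters because a raw IC50 overstates Ki whenever the radioligand concentration is not negligible relative to its Kd.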

  19. Sociological theory and social reality

    Directory of Open Access Journals (Sweden)

    J Díez Nicolás

    2014-12-01

    Full Text Available This paper aims to demonstrate the complementary relations between three relatively recent sociological theories, each of which explains a different aspect of the same social object: the origin, diffusion and change of social and cultural values. The goal is to show that there is no single sociological theory that explains everything, but rather diverse theories that offer partial explanations of social reality. To that effect, and on the basis of the necessary relationship between theory and research, three different theories are evaluated separately: Hawley’s and Duncan’s theory of the social ecosystem, Galtung’s centre-periphery theory, and Inglehart’s theory of values’ change in modern-industrial societies, offering theoretical and empirical evidence of their complementary relations based on Spanish and international data. Social ecosystem and centre-periphery theories show a high level of generalization (through space and time) and a high level of abstraction, though both can easily operationalize their main concepts through valid and reliable indicators. The theory of values’ change, however, though showing a high level of generalization, is limited in time to the historical period after World War II, and also shows a high level of abstraction. Centre-periphery theory and values’ change theory use individual and collective units of analysis, but social ecosystem theory only uses collective units, by definition. The three theories lead to the conclusion that ‘security’ values will gain growing importance in present societies.

  20. Validating MEDIQUAL Constructs

    Science.gov (United States)

    Lee, Sang-Gun; Min, Jae H.

    In this paper, we validate MEDIQUAL constructs across different media users in help desk service. In previous research, only two end-user constructs were used: assurance and responsiveness. In this paper, we extend the MEDIQUAL constructs to include reliability, empathy, assurance, tangibles, and responsiveness, which are based on the SERVQUAL theory. The results suggest that: 1) the five MEDIQUAL constructs are validated through factor analysis; that is, measures of the same construct obtained by different methods correlate relatively highly, while measures of constructs expected to differ correlate weakly; and 2) the five MEDIQUAL constructs are statistically significant predictors of media users' satisfaction in help desk service, as shown by regression analysis.
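The convergent/discriminant pattern described (same-construct measures correlating highly across methods, different-construct measures correlating weakly) can be checked mechanically. A toy sketch with invented ratings, not the study's survey data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
# Invented latent scores for two constructs (e.g. assurance, empathy).
assurance = rng.normal(size=n)
empathy = rng.normal(size=n)

# Two "methods" (media) measuring each construct with independent noise.
a_method1 = assurance + 0.3 * rng.normal(size=n)
a_method2 = assurance + 0.3 * rng.normal(size=n)
e_method1 = empathy + 0.3 * rng.normal(size=n)

same_construct = np.corrcoef(a_method1, a_method2)[0, 1]   # convergent
cross_construct = np.corrcoef(a_method1, e_method1)[0, 1]  # discriminant
print(same_construct > abs(cross_construct))  # True
```

This is the logic of a multitrait-multimethod check: validity is supported when the first correlation is high and the second is near zero.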

  1. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, A.V.

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps one better understand the role and place of experimental phenomena; it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility of formulating the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review [ru

  2. Validity of Management Control Topoi

    DEFF Research Database (Denmark)

    Nørreklit, Lennart; Nørreklit, Hanne; Israelsen, Poul

    2004-01-01

    The validity of research and company topoi for constructing/analyzing reality is analyzed as the integration of four aspects (dimensions): fact, possibility (logic), value and communication. Mainstream, agency theory and social constructivism are criticized for reductivism (incomplete integrat...

  3. Nonperturbative perturbation theory

    International Nuclear Information System (INIS)

    Bender, C.M.

    1989-01-01

    In this talk we describe a recently proposed graphical perturbative calculational scheme for quantum field theory. The basic idea is to expand in the power of the interaction term. For example, to solve a λφ^4 theory in d-dimensional space-time, we introduce a small parameter δ and consider a λ(φ^2)^(1+δ) field theory. We show how to expand such a theory as a series in powers of δ. The resulting perturbation series appears to have a finite radius of convergence and numerical results for low-dimensional models are good. We have computed the two-point and four-point Green's functions to second order in powers of δ and the 2n-point Green's functions (n > 2) to order δ. We explain how to renormalize the theory and show that, to first order in powers of δ, when δ > 0 and d ≥ 4 the theory is free. This conclusion remains valid to second order in powers of δ, and we believe that it remains valid to all orders in powers of δ. The new perturbative scheme is consistent with global supersymmetry invariance. We examine a two-dimensional supersymmetric quantum field theory in which we do not know of any other means for doing analytical calculations. We illustrate the power of this new technique by computing the ground-state energy density E to second order in this new perturbation theory. We show that there is a beautiful and delicate cancellation between infinite classes of graphs which leads to the result that E = 0. (orig.)
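The expansion underlying the scheme can be written out explicitly. Schematically (suppressing the mass factor needed to keep the argument of the logarithm dimensionless), the interaction is expanded as

```latex
\lambda \left(\varphi^{2}\right)^{1+\delta}
  = \lambda \varphi^{2}\, e^{\delta \ln \varphi^{2}}
  = \lambda \varphi^{2}
    \left[ 1 + \delta \ln \varphi^{2}
      + \frac{\delta^{2}}{2} \left( \ln \varphi^{2} \right)^{2}
      + \cdots \right],
```

so that each order in δ contributes graphs with logarithmic insertions, and setting δ = 1 formally recovers the λφ^4 interaction.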

  4. Certification Testing as an Illustration of Argument-Based Validation

    Science.gov (United States)

    Kane, Michael

    2004-01-01

    The theories of validity developed over the past 60 years are quite sophisticated, but the methodology of validity is not generally very effective. The validity evidence for major testing programs is typically much weaker than the evidence for more technical characteristics such as reliability. In addition, most validation efforts have a strong…

  5. Supergravity theories

    International Nuclear Information System (INIS)

    Uehara, S.

    1985-01-01

    Of all supergravity theories, the maximal one, i.e., N = 8 in 4 dimensions or N = 1 in 11 dimensions, should accomplish the unification, since it possesses the highest degree of symmetry. As to the N = 1, d = 11 theory, it has been investigated how to compactify it to the d = 4 theories. From the phenomenological point of view, local SUSY GUTs, i.e., N = 1 SUSY GUTs with soft breaking terms, have been studied from various angles. The structures of extended supergravity theories are less well understood than those of N = 1 supergravity theories, and matter couplings in N = 2 extended supergravity theories are under investigation. The harmonic superspace was recently proposed, which may be useful for investigating the quantum effects of extended supersymmetry and supergravity theories. As to the so-called Kaluza-Klein supergravity, there is another possibility. (Mori, K.)

  6. Topos theory

    CERN Document Server

    Johnstone, PT

    2014-01-01

    Focusing on topos theory's integration of geometric and logical ideas into the foundations of mathematics and theoretical computer science, this volume explores internal category theory, topologies and sheaves, geometric morphisms, other subjects. 1977 edition.

  7. The LDL receptor.

    Science.gov (United States)

    Goldstein, Joseph L; Brown, Michael S

    2009-04-01

    In this article, the history of the LDL receptor is recounted by its codiscoverers. Their early work on the LDL receptor explained a genetic cause of heart attacks and led to new ways of thinking about cholesterol metabolism. The LDL receptor discovery also introduced three general concepts to cell biology: receptor-mediated endocytosis, receptor recycling, and feedback regulation of receptors. The latter concept provides the mechanism by which statins selectively lower plasma LDL, reducing heart attacks and prolonging life.

  8. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  9. Gauge theories

    International Nuclear Information System (INIS)

    Lee, B.W.

    1976-01-01

    Some introductory remarks to Yang-Mills fields are given and the problem of the Coulomb gauge is considered. The perturbation expansion for quantized gauge theories is discussed and a survey of renormalization schemes is made. The role of Ward-Takahashi identities in gauge theories is discussed. The author then discusses the renormalization of pure gauge theories and theories with spontaneously broken symmetry. (B.R.H.)

  10. Holographic effective field theories

    Energy Technology Data Exchange (ETDEWEB)

    Martucci, Luca [Dipartimento di Fisica ed Astronomia “Galileo Galilei”, Università di Padova, and INFN - Sezione di Padova, Via Marzolo 8, I-35131 Padova (Italy)]; Zaffaroni, Alberto [Dipartimento di Fisica, Università di Milano-Bicocca, and INFN - Sezione di Milano-Bicocca, I-20126 Milano (Italy)]

    2016-06-28

    We derive the four-dimensional low-energy effective field theory governing the moduli space of strongly coupled superconformal quiver gauge theories associated with D3-branes at Calabi-Yau conical singularities in the holographic regime of validity. We use the dual supergravity description provided by warped resolved conical geometries with mobile D3-branes. Information on the baryonic directions of the moduli space is also obtained by using wrapped Euclidean D3-branes. We illustrate our general results by discussing in detail their application to the Klebanov-Witten model.

  11. Nonstationary statistical theory for multipactor

    International Nuclear Information System (INIS)

    Anza, S.; Vicente, C.; Gil, J.; Boria, V. E.; Gimeno, B.; Raboso, D.

    2010-01-01

    This work presents a new and general approach to the real dynamics of the multipactor process: the nonstationary statistical multipactor theory. The nonstationary theory removes the stationarity assumption of the classical theory and, as a consequence, it is able to adequately model electron exponential growth as well as absorption processes, above and below the multipactor breakdown level. In addition, it considers both double-surface and single-surface interactions constituting a full framework for nonresonant polyphase multipactor analysis. This work formulates the new theory and validates it with numerical and experimental results with excellent agreement.

  12. Laboratory compliance with the American Society of Clinical Oncology/College of American Pathologists human epidermal growth factor receptor 2 testing guidelines: a 3-year comparison of validation procedures.

    Science.gov (United States)

    Dyhdalo, Kathryn S; Fitzgibbons, Patrick L; Goldsmith, Jeffery D; Souers, Rhona J; Nakhleh, Raouf E

    2014-07-01

    The American Society of Clinical Oncology/College of American Pathologists (ASCO/CAP) published guidelines in 2007 regarding testing accuracy, interpretation, and reporting of results for HER2 studies. A 2008 survey identified areas needing improved compliance. To reassess laboratory response to those guidelines following a full accreditation cycle for an updated snapshot of laboratory practices regarding ASCO/CAP guidelines. In 2011, a survey was distributed with the HER2 immunohistochemistry (IHC) proficiency testing program identical to the 2008 survey. Of the 1150 surveys sent, 977 (85.0%) were returned, comparable to the original survey response in 2008 (757 of 907; 83.5%). New participants submitted 124 of 977 (12.7%) surveys. The median laboratory accession rate was 14,788 cases with 211 HER2 tests performed annually. Testing was validated with fluorescence in situ hybridization in 49.1% (443 of 902) of the laboratories; 26.3% (224 of 853) of the laboratories used another IHC assay. The median number of cases to validate fluorescence in situ hybridization (n = 40) and IHC (n = 27) was similar to those in 2008. Ninety-five percent concordance with fluorescence in situ hybridization was achieved by 76.5% (254 of 332) of laboratories for IHC(-) findings and 70.4% (233 of 331) for IHC(+) cases. Ninety-five percent concordance with another IHC assay was achieved by 71.1% (118 of 168) of the laboratories for negative findings and 69.6% (112 of 161) of the laboratories for positive cases. The proportion of laboratories interpreting HER2 IHC using ASCO/CAP guidelines (86.6% [798 of 921] in 2011; 83.8% [605 of 722] in 2008) remains similar. Although fixation time improvements have been made, assay validation deficiencies still exist. The results of this survey were shared within the CAP, including the Laboratory Accreditation Program and the ASCO/CAP panel revising the HER2 guidelines published in October 2013. The Laboratory Accreditation Program checklist was

  13. Atomic theories

    CERN Document Server

    Loring, FH

    2014-01-01

    Summarising the most novel facts and theories which were coming into prominence at the time, particularly those which had not yet been incorporated into standard textbooks, this important work was first published in 1921. The subjects treated cover a wide range of research that was being conducted into the atom, and include Quantum Theory, the Bohr Theory, the Sommerfield extension of Bohr's work, the Octet Theory and Isotopes, as well as Ionisation Potentials and Solar Phenomena. Because much of the material of Atomic Theories lies on the boundary between experimentally verified fact and spec

  14. Grounded theory.

    Science.gov (United States)

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  15. Number theory via Representation theory

    Indian Academy of Sciences (India)

    2014-11-09

    Number theory via Representation theory. Eknath Ghate. November 9, 2014. Eightieth Annual Meeting, Chennai, Indian Academy of Sciences. This is a non-technical 20-minute talk intended for a general Academy audience.

  16. Superstring theory

    International Nuclear Information System (INIS)

    Schwarz, J.H.

    1985-01-01

    Dual string theories, initially developed as phenomenological models of hadrons, now appear more promising as candidates for a unified theory of fundamental interactions. Type I superstring theory (SST I) is a ten-dimensional theory of interacting open and closed strings, with one supersymmetry, that is free from ghosts and tachyons. It requires that an SO(n) or Sp(2n) gauge group be used. A light-cone-gauge string action with space-time supersymmetry automatically incorporates the superstring restrictions and leads to the discovery of type II superstring theory (SST II). SST II is an interacting theory of closed strings only, with two D=10 supersymmetries, that is also free from ghosts and tachyons. By taking six of the spatial dimensions to form a compact space, it becomes possible to reconcile the models with our four-dimensional perception of spacetime and to define low-energy limits in which SST I reduces to N=4, D=4 super Yang-Mills theory and SST II reduces to N=8, D=4 supergravity theory. The superstring theories can be described by a light-cone-gauge action principle based on fields that are functionals of string coordinates. With this formalism any physical quantity should be calculable. There is some evidence that, unlike any conventional field theory, the superstring theories provide perturbatively renormalizable (SST I) or finite (SST II) unifications of gravity with other interactions.

  17. String theory or field theory?

    International Nuclear Information System (INIS)

    Marshakov, Andrei V

    2002-01-01

    The status of string theory is reviewed, and major recent developments - especially those in going beyond perturbation theory in the string theory and quantum field theory frameworks - are analyzed. This analysis helps better understand the role and place of string theory in the modern picture of the physical world. Even though quantum field theory describes a wide range of experimental phenomena, it is emphasized that there are some insurmountable problems inherent in it - notably the impossibility to formulate the quantum theory of gravity on its basis - which prevent it from being a fundamental physical theory of the world of microscopic distances. It is this task, the creation of such a theory, which string theory, currently far from completion, is expected to solve. In spite of its somewhat vague current form, string theory has already led to a number of serious results and greatly contributed to progress in the understanding of quantum field theory. It is these developments which are our concern in this review. (reviews of topical problems)

  18. Dependence theory via game theory

    NARCIS (Netherlands)

    Grossi, D.; Turrini, P.

    2011-01-01

    In the multi-agent systems community, dependence theory and game theory are often presented as two alternative perspectives on the analysis of social interaction. Up till now no research has been done relating these two approaches. The unification presented provides dependence theory with the sort

  19. An Asymptotic Derivation of Weakly Nonlinear Ray Theory

    Indian Academy of Sciences (India)

    The transport equation for the amplitude has been deduced with an error O(ε²), where ε is the small parameter appearing in the high-frequency approximation. On a length scale over which Choquet–Bruhat's theory is valid, this theory reduces to the former. The theory is valid on a much larger length scale and the leading ...

  20. Model theory

    CERN Document Server

    Chang, CC

    2012-01-01

    Model theory deals with a branch of mathematical logic showing connections between a formal language and its interpretations or models. This is the first and most successful textbook in logical model theory. Extensively updated and corrected in 1990 to accommodate developments in model theoretic methods - including classification theory and nonstandard analysis - the third edition added entirely new sections, exercises, and references. Each chapter introduces an individual method and discusses specific applications. Basic methods of constructing models include constants, elementary chains, Sko

  1. Viability Theory

    CERN Document Server

    Aubin, Jean-Pierre; Saint-Pierre, Patrick

    2011-01-01

    Viability theory designs and develops mathematical and algorithmic methods for investigating the adaptation to viability constraints of evolutions governed by complex systems under uncertainty that are found in many domains involving living beings, from biological evolution to economics, from environmental sciences to financial markets, from control theory and robotics to cognitive sciences. It involves interdisciplinary investigations spanning fields that have traditionally developed in isolation. The purpose of this book is to present an initiation to applications of viability theory, explai

  2. Galois Theory

    CERN Document Server

    Cox, David A

    2012-01-01

    Praise for the First Edition: ". . . will certainly fascinate anyone interested in abstract algebra: a remarkable book!"—Monatshefte für Mathematik. Galois theory is one of the most established topics in mathematics, with historical roots that led to the development of many central concepts in modern algebra, including groups and fields. Covering classic applications of the theory, such as solvability by radicals, geometric constructions, and finite fields, Galois Theory, Second Edition delves into novel topics like Abel’s theory of Abelian equations, casus irreducibilis, and the Galo

  3. Game theory.

    Science.gov (United States)

    Dufwenberg, Martin

    2011-03-01

    Game theory is a toolkit for examining situations where decision makers influence each other. I discuss the nature of game-theoretic analysis, the history of game theory, why game theory is useful for understanding human psychology, and why game theory has played a key role in the recent explosion of interest in the field of behavioral economics. WIREs Cogni Sci 2011 2 167-173 DOI: 10.1002/wcs.119 For further resources related to this article, please visit the WIREs website. Copyright © 2010 John Wiley & Sons, Ltd.
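
The interactions game theory examines can be made concrete with a toy example. The sketch below is illustrative only and not drawn from the article: it checks which strategy profiles of the prisoner's dilemma are Nash equilibria, i.e. profiles from which neither player gains by deviating unilaterally, using standard textbook payoff values.

```python
# Toy prisoner's dilemma (illustrative payoffs, not from the article).
# payoff[(a, b)] = (row player's payoff, column player's payoff).
COOPERATE, DEFECT = 0, 1
payoff = {
    (COOPERATE, COOPERATE): (3, 3),
    (COOPERATE, DEFECT):    (0, 5),
    (DEFECT,    COOPERATE): (5, 0),
    (DEFECT,    DEFECT):    (1, 1),
}

def is_nash(a, b):
    """Neither player can improve by changing only their own strategy."""
    row_ok = all(payoff[(a, b)][0] >= payoff[(alt, b)][0] for alt in (0, 1))
    col_ok = all(payoff[(a, b)][1] >= payoff[(a, alt)][1] for alt in (0, 1))
    return row_ok and col_ok

equilibria = [s for s in payoff if is_nash(*s)]  # → [(DEFECT, DEFECT)]
```

Mutual defection is the unique equilibrium even though mutual cooperation pays both players more, which is exactly the kind of tension between individual incentives and joint outcomes the article discusses.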

  4. Elastoplasticity theory

    CERN Document Server

    Hashiguchi, Koichi

    2009-01-01

    This book details the mathematics and continuum mechanics necessary as a foundation of elastoplasticity theory. It explains physical backgrounds with illustrations and provides descriptions of detailed derivation processes.

  5. Complete relaxation and conformational exchange matrix (CORCEMA) analysis of intermolecular saturation transfer effects in reversibly forming ligand-receptor complexes.

    Science.gov (United States)

    Jayalakshmi, V; Krishna, N Rama

    2002-03-01

    A couple of recent applications of intermolecular NOE (INOE) experiments as applied to biomolecular systems involve the (i) saturation transfer difference NMR (STD-NMR) method and (ii) the intermolecular cross-saturation NMR (ICS-NMR) experiment. STD-NMR is a promising tool for rapid screening of a large library of compounds to identify bioactive ligands binding to a target protein. Additionally, it is also useful in mapping the binding epitopes presented by a bioactive ligand to its target protein. In this latter application, the STD-NMR technique is essentially similar to the ICS-NMR experiment, which is used to map protein-protein or protein-nucleic acid contact surfaces in complexes. In this work, we present a complete relaxation and conformational exchange matrix (CORCEMA) theory (H. N. B. Moseley et al., J. Magn. Reson. B 108, 243-261 (1995)) applicable for these two closely related experiments. As in our previous work, we show that when exchange is fast on the relaxation rate scale, a simplified CORCEMA theory can be formulated using a generalized average relaxation rate matrix. Its range of validity is established by comparing its predictions with those of the exact CORCEMA theory which is valid for all exchange rates. Using some ideal model systems we have analyzed the factors that influence the ligand proton intensity changes when the resonances from some protons on the receptor protein are saturated. The results show that the intensity changes in the ligand signals in an intermolecular NOE experiment are very much dependent upon: (1) the saturation time, (2) the location of the saturated receptor protons with respect to the ligand protons, (3) the conformation of the ligand-receptor interface, (4) the rotational correlation times for the molecular species, (5) the kinetics of the reversibly forming complex, and (6) the ligand/receptor ratio. As an example of a typical application of the STD-NMR experiment we have also simulated the STD effects for a

  6. Verification, validation, and reliability of predictions

    International Nuclear Information System (INIS)

    Pigford, T.H.; Chambre, P.L.

    1987-04-01

    The objective of predicting long-term performance should be to make reliable determinations of whether the prediction falls within the criteria for acceptable performance. Establishing reliable predictions of long-term performance of a waste repository requires emphasis on valid theories to predict performance. The validation process must establish the validity of the theory, the parameters used in applying the theory, the arithmetic of calculations, and the interpretation of results; but validation of such performance predictions is not possible unless there are clear criteria for acceptable performance. Validation programs should emphasize identification of the substantive issues of prediction that need to be resolved. Examples relevant to waste package performance are predicting the life of waste containers and the time distribution of container failures, establishing the criteria for defining container failure, validating theories for time-dependent waste dissolution that depend on details of the repository environment, and determining the extent of congruent dissolution of radionuclides in the UO 2 matrix of spent fuel. Prediction and validation should go hand in hand and should be done and reviewed frequently, as essential tools for the programs to design and develop repositories. 29 refs

  7. Perturbation theory

    International Nuclear Information System (INIS)

    Bartlett, R.; Kirtman, B.; Davidson, E.R.

    1978-01-01

    After noting some advantages of using perturbation theory some of the various types are related on a chart and described, including many-body nonlinear summations, quartic force-field fit for geometry, fourth-order correlation approximations, and a survey of some recent work. Alternative initial approximations in perturbation theory are also discussed. 25 references

  8. Need theory

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2014-01-01

    Need theory of happiness is linked to affect theory, which holds that happiness is a reflection of how well we feel generally. In this view, we do not "calculate" happiness but rather "infer" it, the typical heuristic being "I feel good most of the time, hence

  9. Diffraction theory

    NARCIS (Netherlands)

    Bouwkamp, C.J.

    1954-01-01

    A critical review is presented of recent progress in classical diffraction theory. Both scalar and electromagnetic problems are discussed. The report may serve as an introduction to general diffraction theory although the main emphasis is on diffraction by plane obstacles. Various modifications of

  10. DO TANZANIAN COMPANIES PRACTICE PECKING ORDER THEORY, AGENCY COST THEORY OR TRADE-OFF THEORY? AN EMPIRICAL STUDY IN TANZANIAN LISTED COMPANIES

    Directory of Open Access Journals (Sweden)

    Ntogwa Ng'habi Bundala

    2012-01-01

    The empirical study focused predominantly on validity tests of three capital-structure theories in the Tanzanian context: the static trade-off theory, the pecking order theory (information asymmetry theory), and agency cost theory. The study used secondary data from eight of the non-financial companies listed on the Dar es Salaam Stock Exchange (DSE) from 2006-2012. The study used a descriptive (quantitative) approach to test the practicality of the theories in Tanzania. A multiple regression model was used to test the theoretical relationship between financial leverage and characteristics of the company. The research found no strong evidence validating the static trade-off theory and little support for the pecking order theory, but the agency cost theory was confirmed to be valid and practiced in Tanzania. It is recommended that Tanzanian companies adhere to the determinants of capital structure in the Tanzanian context found by this study.
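
The kind of multiple-regression test described above can be sketched as follows. Everything here is a synthetic stand-in, not the DSE sample: the firm characteristics, coefficients, and sample size are invented for illustration, and financial leverage is regressed on them by ordinary least squares.

```python
# Hypothetical sketch of regressing financial leverage on firm
# characteristics (synthetic data with known coefficients, not the
# Tanzanian DSE data used in the study).
import numpy as np

rng = np.random.default_rng(0)
n = 56  # e.g. 8 firms observed over 7 years

profitability = rng.normal(0.1, 0.05, n)
tangibility = rng.uniform(0.2, 0.8, n)
size = rng.normal(12, 1, n)

# Leverage generated from assumed "true" coefficients plus small noise.
leverage = (0.5 - 1.0 * profitability + 0.3 * tangibility
            + 0.02 * size + rng.normal(0, 0.01, n))

# Design matrix with an intercept column; OLS via least squares.
X = np.column_stack([np.ones(n), profitability, tangibility, size])
beta, *_ = np.linalg.lstsq(X, leverage, rcond=None)
```

With enough data the estimated `beta` recovers the generating coefficients; in the study, the signs and significance of such estimates are what discriminate between the competing capital-structure theories.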

  11. Potential Theory

    CERN Document Server

    Lukeš, Jaroslav; Netuka, Ivan; Veselý, Jiří

    1988-01-01

    Within the tradition of meetings devoted to potential theory, a conference on potential theory took place in Prague on 19-24 July 1987. The Conference was organized by the Faculty of Mathematics and Physics, Charles University, with the collaboration of the Institute of Mathematics, Czechoslovak Academy of Sciences, the Department of Mathematics, Czech University of Technology, the Union of Czechoslovak Mathematicians and Physicists, the Czechoslovak Scientific and Technical Society, and supported by IMU. During the Conference, 69 scientific communications from different branches of potential theory were presented; the majority of them are included in the present volume. (Papers based on survey lectures delivered at the Conference, its program as well as a collection of problems from potential theory will appear in a special volume of the Lecture Notes Series published by Springer-Verlag). Topics of these communications truly reflect the vast scope of contemporary potential theory. Some contributions deal...

  12. Conspiracy Theory

    DEFF Research Database (Denmark)

    Bjerg, Ole; Presskorn-Thygesen, Thomas

    2017-01-01

    The paper is a contribution to current debates about conspiracy theories within philosophy and cultural studies. Wittgenstein’s understanding of language is invoked to analyse the epistemological effects of designating particular questions and explanations as a ‘conspiracy theory’. It is demonstrated how such a designation relegates these questions and explanations beyond the realm of meaningful discourse. In addition, Agamben’s concept of sovereignty is applied to explore the political effects of using the concept of conspiracy theory. The exceptional epistemological status assigned to alleged conspiracy theories within our prevalent paradigms of knowledge and truth is compared to the exceptional legal status assigned to individuals accused of terrorism under the War on Terror. The paper concludes by discussing the relation between conspiracy theory and ‘the paranoid style…

  13. Field theory

    CERN Multimedia

    1999-11-08

    In these lectures I will build up the concept of field theory using the language of Feynman diagrams. As a starting point, field theory in zero spacetime dimensions is used as a vehicle to develop all the necessary techniques: path integral, Feynman diagrams, Schwinger-Dyson equations, asymptotic series, effective action, renormalization etc. The theory is then extended to more dimensions, with emphasis on the combinatorial aspects of the diagrams rather than their particular mathematical structure. The concept of unitarity is used to, finally, arrive at the various Feynman rules in an actual, four-dimensional theory. The concept of gauge-invariance is developed, and the structure of a non-abelian gauge theory is discussed, again on the level of Feynman diagrams and Feynman rules.

  14. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  15. Concept theory

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2009-01-01

    Concept theory is an extremely broad, interdisciplinary and complex field of research related to many deep fields with very long historical traditions without much consensus. However, information science and knowledge organization cannot avoid relating to theories of concepts. Knowledge organizing systems (e.g. classification systems, thesauri and ontologies) should be understood as systems basically organizing concepts and their semantic relations. The same is the case with information retrieval systems. Different theories of concepts have different implications for how to construe, evaluate and use such systems. Based on "a post-Kuhnian view" of paradigms, this paper puts forward arguments that the best understanding and classification of theories of concepts is to view and classify them in accordance with epistemological theories (empiricism, rationalism, historicism and pragmatism…

  16. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings… differential equations, but in this thesis, we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives are described: the forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages…
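
The forward method mentioned above can be illustrated with dual numbers, which carry a value and its derivative together through every arithmetic operation so that the chain rule is applied automatically. This is a minimal Python sketch of the idea only, not the C++ FADBAD/TADIFF interface; the class and function names are invented for illustration.

```python
# Minimal forward-mode automatic differentiation with dual numbers
# (illustrative sketch; names are hypothetical, not the FADBAD/TADIFF API).
class Dual:
    def __init__(self, val, der=0.0):
        self.val = val  # function value
        self.der = der  # derivative value, propagated by the chain rule

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__


def derivative(f, x):
    """Evaluate f and f' at x in a single forward pass."""
    result = f(Dual(x, 1.0))  # seed the derivative of the input with 1
    return result.val, result.der


# f(x) = x^2 + 3x, so f(2) = 10 and f'(2) = 2*2 + 3 = 7.
val, der = derivative(lambda x: x * x + 3 * x, 2.0)
```

The backward and Taylor-expansion methods differ in how derivatives are propagated (in reverse through a recorded computation, or as truncated series), but all three share this operator-overloading structure.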

  17. How Far Does a Receptor Influence Vibrational Properties of an Odorant?

    DEFF Research Database (Denmark)

    Reese, Anna; List, Nanna Holmgaard; Kongsted, Jacob

    2016-01-01

    …-assisted electron transfer. Through molecular dynamics simulations we elucidate the binding specificity of a receptor towards the acetophenone odorant. The vibrational properties of acetophenone inside the receptor are then studied by the polarizable embedding density functional theory approach, allowing us to quantify protein-odorant interactions. Finally, we judge whether the effects of the protein provide any indications towards the existing theories of olfaction.

  18. General Open Systems Theory and the Substrata-Factor Theory of Reading.

    Science.gov (United States)

    Kling, Martin

    This study was designed to extend the generality of the Substrata-Factor Theory by two methods of investigation: (1) theoretically, to establish the validity of the hypothesis that an isomorphic relationship exists between the Substrata-Factor Theory and the General Open Systems Theory, and (2) experimentally, to discover through a…

  19. General covariance and quantum theory

    International Nuclear Information System (INIS)

    Mashhoon, B.

    1986-01-01

    The extension of the principle of relativity to general coordinate systems is based on the hypothesis that an accelerated observer is locally equivalent to a hypothetical inertial observer with the same velocity as the noninertial observer. This hypothesis of locality is expected to be valid for classical particle phenomena as well as for classical wave phenomena but only in the short-wavelength approximation. The generally covariant theory is therefore expected to be in conflict with the quantum theory which is based on wave-particle duality. This is explicitly demonstrated for the frequency of electromagnetic radiation measured by a uniformly rotating observer. The standard Doppler formula is shown to be valid only in the geometric optics approximation. A new definition for the frequency is proposed, and the resulting formula for the frequency measured by the rotating observer is shown to be consistent with expectations based on the classical theory of electrons. A tentative quantum theory is developed on the basis of the generalization of the Bohr frequency condition to include accelerated observers. The description of the causal sequence of events is assumed to be independent of the motion of the observer. Furthermore, the quantum hypothesis is supposed to be valid for all observers. The implications of this theory are critically examined. The new formula for frequency, which is still based on the hypothesis of locality, leads to the observation of negative energy quanta by the rotating observer and is therefore in conflict with the quantum theory

  20. Validation of three-dimensional incompressible spatial direct numerical simulation code: A comparison with linear stability and parabolic stability equation theories for boundary-layer transition on a flat plate

    Science.gov (United States)

    Joslin, Ronald D.; Streett, Craig L.; Chang, Chau-Lyan

    1992-01-01

    Spatially evolving instabilities in a boundary layer on a flat plate are computed by direct numerical simulation (DNS) of the incompressible Navier-Stokes equations. In a truncated physical domain, a nonstaggered mesh is used for the grid. A Chebyshev-collocation method is used normal to the wall; finite difference and compact difference methods are used in the streamwise direction; and a Fourier series is used in the spanwise direction. For time stepping, implicit Crank-Nicolson and explicit Runge-Kutta schemes are used with the time-splitting method. The influence-matrix technique is used to solve the pressure equation. At the outflow boundary, the buffer-domain technique is used to prevent convective wave reflection or upstream propagation of information from the boundary. Results of the DNS are compared with those from both linear stability theory (LST) and parabolized stability equation (PSE) theory. Computed disturbance amplitudes and phases are in very good agreement with those of LST (for small inflow disturbance amplitudes). A measure of the sensitivity of the inflow condition is demonstrated with both LST and PSE theory used to approximate inflows. Although the DNS numerics are very different from those of PSE theory, the results are in good agreement. A small discrepancy in the results that does occur is likely a result of the variation in PSE boundary condition treatment in the far field. Finally, a small-amplitude wave triad is forced at the inflow, and simulation results are compared with those of LST. Again, very good agreement is found between DNS and LST results for the 3-D simulations, the implication being that the disturbance amplitudes are sufficiently small that nonlinear interactions are negligible.
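
The implicit Crank-Nicolson scheme named above can be shown on a much simpler problem than the Navier-Stokes equations. The sketch below (illustrative only, not the DNS code) applies it to the 1-D heat equation u_t = u_xx with homogeneous Dirichlet boundaries: the spatial operator is averaged between time levels, which is what makes the scheme second-order in time and unconditionally stable.

```python
# Crank-Nicolson for u_t = u_xx on (0, 1) with u = 0 at both ends
# (illustrative sketch; not related to the DNS code in the paper).
import numpy as np

def crank_nicolson_heat(u0, dx, dt, steps):
    """Advance interior values u0 by `steps` Crank-Nicolson time steps."""
    n = len(u0)
    r = dt / (2 * dx * dx)
    # Second-difference matrix L, so u_xx ≈ L u / dx^2 on interior points.
    L = (np.diag(-2 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1))
    # Scheme: (I - r L) u^{n+1} = (I + r L) u^n  (spatial operator averaged).
    A = np.eye(n) - r * L
    B = np.eye(n) + r * L
    u = u0.copy()
    for _ in range(steps):
        u = np.linalg.solve(A, B @ u)
    return u

x = np.linspace(0.0, 1.0, 51)[1:-1]  # interior grid points, dx = 0.02
u = crank_nicolson_heat(np.sin(np.pi * x), x[1] - x[0], 1e-3, 100)
```

For the initial condition sin(pi x), the exact solution decays as exp(-pi^2 t), so after t = 0.1 the computed peak should sit close to exp(-pi^2 * 0.1) ≈ 0.373, which is a quick correctness check on the scheme.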

  1. Dengeleme Teorisi’nin Geçerliliğinin Panel Veri Analizi ile Test Edilmesi: BİST’de Ampirik Bir Uygulama (Testing the Validity of Trade-Off Theory by Using Panel Regression Analysis: An Empirical Application on ISE)

    Directory of Open Access Journals (Sweden)

    İbrahim BOZKURT

    2014-12-01

    The aim of this study is to test the validity of Trade-Off theory by investigating the relationship between the capital structures and market values of firms on the ISE. The study uses 127,008 financial ratios, 20,664 monthly stock returns, and 4,704 market values and debt ratios belonging to 168 firms traded on the ISE between 2005 and 2011. First, an efficient model for predicting bankruptcy is established for the ISE by using balanced panel regression analysis. Second, using this model, the firms are divided for each period into two groups, consisting of firms with and without bankruptcy risk, and the relationship between the market values and debt levels of the firms in each group is analyzed by using unbalanced panel regression analysis. The results of the analysis reveal a positive relationship between the market values and debt levels of firms in both groups. This result implies that Trade-Off theory is not valid on the ISE.

  2. Receptor-receptor interactions within receptor mosaics. Impact on neuropsychopharmacology.

    Science.gov (United States)

    Fuxe, K; Marcellino, D; Rivera, A; Diaz-Cabiale, Z; Filip, M; Gago, B; Roberts, D C S; Langel, U; Genedani, S; Ferraro, L; de la Calle, A; Narvaez, J; Tanganelli, S; Woods, A; Agnati, L F

    2008-08-01

    Future therapies for diseases associated with altered dopaminergic signaling, including Parkinson's disease, schizophrenia and drug addiction or drug dependence may substantially build on the existence of intramembrane receptor-receptor interactions within dopamine receptor containing receptor mosaics (RM; dimeric or high-order receptor oligomers) where it is believed that the dopamine D(2) receptor may operate as the 'hub receptor' within these complexes. The constitutive adenosine A(2A)/dopamine D(2) RM, located in the dorsal striato-pallidal GABA neurons, are of particular interest in view of the demonstrated antagonistic A(2A)/D(2) interaction within these heteromers; an interaction that led to the suggestion and later demonstration that A(2A) antagonists could be used as novel anti-Parkinsonian drugs. Based on the likely existence of A(2A)/D(2)/mGluR5 RM located both extrasynaptically on striato-pallidal GABA neurons and on cortico-striatal glutamate terminals, multiple receptor-receptor interactions within this RM involving synergism between A(2A)/mGluR5 to counteract D(2) signaling, has led to the proposal of using combined mGluR5 and A(2A) antagonists as a future anti-Parkinsonian treatment. Based on the same RM in the ventral striato-pallidal GABA pathways, novel strategies for the treatment of schizophrenia, building on the idea that A(2A) agonists and/or mGluR5 agonists will help reduce the increased dopaminergic signaling associated with this disease, have been suggested. Such treatment may ensure the proper glutamatergic drive from the mediodorsal thalamic nucleus to the prefrontal cortex, one which is believed to be reduced in schizophrenia due to a dominance of D(2)-like signaling in the ventral striatum. Recently, A(2A) receptors also have been shown to counteract the locomotor and sensitizing actions of cocaine and increases in A(2A) receptors have also been observed in the nucleus accumbens after extended cocaine self-administration, probably

  3. Number theory

    CERN Document Server

    Andrews, George E

    1994-01-01

    Although mathematics majors are usually conversant with number theory by the time they have completed a course in abstract algebra, other undergraduates, especially those in education and the liberal arts, often need a more basic introduction to the topic. In this book the author solves the problem of maintaining the interest of students at both levels by offering a combinatorial approach to elementary number theory. In studying number theory from such a perspective, mathematics majors are spared repetition and provided with new insights, while other students benefit from the consequent simpl...

  4. Risk theory

    CERN Document Server

    Schmidli, Hanspeter

    2017-01-01

    This book provides an overview of classical actuarial techniques, including material that is not readily accessible elsewhere such as the Ammeter risk model and the Markov-modulated risk model. Other topics covered include utility theory, credibility theory, claims reserving and ruin theory. The author treats both theoretical and practical aspects and also discusses links to Solvency II. Written by one of the leading experts in the field, these lecture notes serve as a valuable introduction to some of the most frequently used methods in non-life insurance. They will be of particular interest to graduate students, researchers and practitioners in insurance, finance and risk management.

  5. Mapping Theory

    DEFF Research Database (Denmark)

    Smith, Shelley

    This paper came about within the context of a 13-month research project, Focus Area 1 - Method and Theory, at the Center for Public Space Research at the Royal Academy of the Arts School of Architecture in Copenhagen, Denmark. This project has been funded by RealDania. The goals of the research project, Focus Area 1 - Method and Theory, which forms the framework for this working paper, are: * To provide a basis from which to discuss the concept of public space in a contemporary architectural and urban context - specifically relating to theory and method * To broaden the discussion of the concept...

  6. Plasticity theory

    CERN Document Server

    Lubliner, Jacob

    2008-01-01

    The aim of Plasticity Theory is to provide a comprehensive introduction to the contemporary state of knowledge in basic plasticity theory and to its applications. It treats several areas not commonly found between the covers of a single book: the physics of plasticity, constitutive theory, dynamic plasticity, large-deformation plasticity, and numerical methods, in addition to a representative survey of problems treated by classical methods, such as elastic-plastic problems, plane plastic flow, and limit analysis; the problems discussed come from areas of interest to mechanical, structural, and...

  7. Agency Theory

    DEFF Research Database (Denmark)

    Linder, Stefan; Foss, Nicolai Juul

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting and informational conditions, the theory addresses problems of ex ante (“hidden characteristics”) as well as ex post information asymmetry (“hidden action”), and examines conditions under which various kinds of incentive instruments and monitoring arrangements can be deployed to minimize the welfare loss. Its clear predictions and broad applicability have allowed agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism.

  8. Agency Theory

    DEFF Research Database (Denmark)

    Linder, Stefan; Foss, Nicolai Juul

    2015-01-01

    Agency theory studies the problems and solutions linked to delegation of tasks from principals to agents in the context of conflicting interests between the parties. Beginning from clear assumptions about rationality, contracting, and informational conditions, the theory addresses problems of ex ante (‘hidden characteristics’) as well as ex post information asymmetry (‘hidden action’), and examines conditions under which various kinds of incentive instruments and monitoring arrangements can be deployed to minimize the welfare loss. Its clear predictions and broad applicability have allowed agency theory to enjoy considerable scientific impact on social science; however, it has also attracted considerable criticism.

  9. Relativistic fluid theories - Self organization

    International Nuclear Information System (INIS)

    Mahajan, S.M.; Hazeltine, R.D.; Yoshida, Z.

    2003-01-01

    Developments in two distinct but related subjects are reviewed: 1) Formulation and investigation of closed fluid theories which transcend the limitations of standard magnetohydrodynamics (MHD), in particular, theories which are valid in the long mean free path limit and in which pressure anisotropy, heat flow, and arbitrarily strong sheared flows are treated consistently, and 2) Exploitation of the two-fluid theories to derive new plasma configurations in which the flow-field is a co-determinant of the overall dynamics; some of these states belong to the category of self-organized relaxed states. Physical processes which may provide a route to self-organization and complexity are also explored. (author)

  10. Gauge theory of amorphous magnets

    International Nuclear Information System (INIS)

    Nesterov, A.I.; Ovchinnikov, S.G.

    1989-01-01

    A gauge theory of disordered magnets as a field theory in the principal fiber bundle with structure group SL(3, R) is constructed. The gauge field interacting with a vector field (the magnetization) is responsible for the disorder. A complete system of equations, valid for arbitrary disordered magnets, is obtained. In the limiting case of a free gauge field the proposed approach leads to the well-known Volovik-Dzyaloshinskii theory, which describes isotropic spin glasses. In the other limiting case when the curvature is zero the results of Ignatchenko and Iskhakov for weakly disordered ferromagnets are reproduced

  11. How Developments in Psychology and Technology Challenge Validity Argumentation

    Science.gov (United States)

    Mislevy, Robert J.

    2016-01-01

    Validity is the sine qua non of properties of educational assessment. While a theory of validity and a practical framework for validation has emerged over the past decades, most of the discussion has addressed familiar forms of assessment and psychological framings. Advances in digital technologies and in cognitive and social psychology have…

  12. Validation and Design Science Research in Information Systems

    NARCIS (Netherlands)

    Sol, H G; Gonzalez, Rafael A.; Mora, Manuel

    2012-01-01

    Validation within design science research in Information Systems (DSRIS) is much debated. The relationship of validation to artifact evaluation is still not clear. This chapter aims at elucidating several components of DSRIS in relation to validation. The role of theory and theorizing are an...

  13. Continuity theory

    CERN Document Server

    Nel, Louis

    2016-01-01

    This book presents a detailed, self-contained theory of continuous mappings. It is mainly addressed to students who have already studied these mappings in the setting of metric spaces, as well as multidimensional differential calculus. The needed background facts about sets, metric spaces and linear algebra are developed in detail, so as to provide a seamless transition between students' previous studies and new material. In view of its many novel features, this book will be of interest also to mature readers who have studied continuous mappings from the subject's classical texts and wish to become acquainted with a new approach. The theory of continuous mappings serves as infrastructure for more specialized mathematical theories like differential equations, integral equations, operator theory, dynamical systems, global analysis, topological groups, topological rings and many more. In light of the centrality of the topic, a book of this kind fits a variety of applications, especially those that contribute to ...

  14. Model theory

    CERN Document Server

    Hodges, Wilfrid

    1993-01-01

    An up-to-date and integrated introduction to model theory, designed to be used for graduate courses (for students who are familiar with first-order logic), and as a reference for more experienced logicians and mathematicians.

  15. Interpolation theory

    CERN Document Server

    Lunardi, Alessandra

    2018-01-01

    This book is the third edition of the 1999 lecture notes of the courses on interpolation theory that the author delivered at the Scuola Normale in 1998 and 1999. In the mathematical literature there are many good books on the subject, but none of them is very elementary, and in many cases the basic principles are hidden below great generality. In this book the principles of interpolation theory are illustrated aiming at simplification rather than at generality. The abstract theory is reduced as far as possible, and many examples and applications are given, especially to operator theory and to regularity in partial differential equations. Moreover the treatment is self-contained, the only prerequisite being the knowledge of basic functional analysis.

  16. [Nuclear theory]

    International Nuclear Information System (INIS)

    1989-06-01

    This report discusses concepts in nuclear theory such as: neutrino nucleosynthesis; double beta decay; neutrino oscillations; chiral symmetry breaking; T invariance; quark propagator; cold fusion; and other related topics

  17. Livability theory

    NARCIS (Netherlands)

    R. Veenhoven (Ruut)

    2014-01-01

    Livability theory involves the following six key assumptions: 1. Like all animals, humans have innate needs, such as for food, safety, and companionship. 2. Gratification of needs manifests in hedonic experience. 3. Hedonic experience determines how...

  18. Acetylcholine receptor antibody

    Science.gov (United States)

    Acetylcholine receptor antibody is a protein found in the blood of ... (medlineplus.gov/ency/article/003576.htm)

  19. Nokton theory

    OpenAIRE

    SAIDANI Lassaad

    2015-01-01

    The nokton theory is an attempt to construct a theory adapted to every physical phenomenon. Space and time have been discretized. Its laws are iterative and precise. Probability plays an important role here. At first I defined the notion of image function and its mathematical framework. The notion of nokton and its state are the basis of several definitions. I later defined the canonical image function and the canonical contribution. Two constants have been necessary to define the dynam...

  20. Nokton theory

    OpenAIRE

    SAIDANI Lassaad

    2017-01-01

    The nokton theory is an attempt to construct a theory adapted to every physical phenomenon. Space and time have been discretized. Its laws are iterative and precise. Probability plays an important role here. At first I defined the notion of image function and its mathematical framework. The notion of nokton and its state are the basis of several definitions. I later defined the canonical image function and the canonical contribution. Two constants have been necessary to define the dynam...

  1. Graph theory

    CERN Document Server

    Gould, Ronald

    2012-01-01

    This introduction to graph theory focuses on well-established topics, covering primary techniques and including both algorithmic and theoretical problems. The algorithms are presented with a minimum of advanced data structures and programming details. This thoroughly corrected 1988 edition provides insights to computer scientists as well as advanced undergraduates and graduate students of topology, algebra, and matrix theory. Fundamental concepts and notation and elementary properties and operations are the first subjects, followed by examinations of paths and searching, trees, and networks. S...

  2. Molecular pharmacology of promiscuous seven transmembrane receptors sensing organic nutrients.

    Science.gov (United States)

    Wellendorph, Petrine; Johansen, Lars Dan; Bräuner-Osborne, Hans

    2009-09-01

    A number of highly promiscuous seven transmembrane (7TM) receptors have been cloned and characterized within the last few years. It is noteworthy that many of these receptors are activated broadly by amino acids, proteolytic degradation products, carbohydrates, or free fatty acids and are expressed in taste tissue, the gastrointestinal tract, endocrine glands, adipose tissue, and/or kidney. These receptors thus hold the potential to act as sensors of food intake, regulating, for example, release of incretin hormones from the gut, insulin/glucagon from the pancreas, and leptin from adipose tissue. The promiscuous tendency in ligand recognition of these receptors is in contrast to the typical specific interaction with one physiological agonist seen for most receptors, which challenges the classic "lock-and-key" concept. We here review the molecular mechanisms of nutrient sensing of the calcium-sensing receptor, the G protein-coupled receptor family C, group 6, subtype A (GPRC6A), and the taste1 receptor T1R1/T1R3, which are sensing L-alpha-amino acids, the carbohydrate-sensing T1R2/T1R3 receptor, the proteolytic degradation product sensor GPR93 (also termed GPR92), and the free fatty acid (FFA) sensing receptors FFA1, FFA2, FFA3, GPR84, and GPR120. The involvement of the individual receptors in sensing of food intake has been validated to different degrees because of limited availability of specific pharmacological tools and/or receptor knockout mice. However, as a group, the receptors represent potential drug targets, to treat, for example, type II diabetes by mimicking food intake by potent agonists or positive allosteric modulators. The ligand-receptor interactions of the promiscuous receptors of organic nutrients thus remain an interesting subject of emerging functional importance.

  3. Cooperative ethylene receptor signaling

    OpenAIRE

    Liu, Qian; Wen, Chi-Kuang

    2012-01-01

    The gaseous plant hormone ethylene is perceived by a family of five ethylene receptor members in the dicotyledonous model plant Arabidopsis. Genetic and biochemical studies suggest that the ethylene response is suppressed by ethylene receptor complexes, but the biochemical nature of the receptor signal is unknown. Without appropriate biochemical measures to trace the ethylene receptor signal and quantify the signal strength, the biological significance of the modulation of ethylene responses ...

  4. Exploring a Theory Describing the Physics of Information Systems, Characterizing the Phenomena of Complex Information Systems

    National Research Council Canada - National Science Library

    Harmon, Scott

    2001-01-01

    This project accomplished all of its objectives: document a theory of information physics, conduct a workshop on planning experiments to test this theory, and design experiments that validate this theory...

  5. Theory of spiral structure

    International Nuclear Information System (INIS)

    Lin, C.C.

    1977-01-01

    The density wave theory of galactic spirals has now developed into a form suitable for consideration by experts in Applied Mechanics. On the one hand, comparison of theoretical deductions with observational data has convinced astrophysicists of the validity of the basic physical picture and the calculated results. On the other hand, the dynamical problems of a stellar system, such as those concerning the origin of spiral structure in galaxies, have not been completely solved. This paper reviews the current status of such developments, including a brief summary of comparison with observations. A particularly important mechanism, currently called the mechanism of energy exchange, is described in some detail. The mathematical problems and the physical processes involved are similar to those occurring in certain instability mechanisms in the 'magnetic bottle' designed for plasma containment. Speculations are given on the future developments of the theory and on observational programs. (Auth.)

  6. A clockwork theory

    Energy Technology Data Exchange (ETDEWEB)

    Giudice, Gian F.; McCullough, Matthew [CERN, Theoretical Physics Department,Geneva (Switzerland)

    2017-02-07

    The clockwork is a mechanism for generating light particles with exponentially suppressed interactions in theories which contain no small parameters at the fundamental level. We develop a general description of the clockwork mechanism valid for scalars, fermions, gauge bosons, and gravitons. This mechanism can be implemented with a discrete set of new fields or, in its continuum version, through an extra spatial dimension. In both cases the clockwork emerges as a useful tool for model-building applications. Notably, the continuum clockwork offers a solution to the Higgs naturalness problem, which turns out to be the same as in linear dilaton duals of Little String Theory. We also elucidate the similarities and differences of the continuum clockwork with large extra dimensions and warped spaces. All clockwork models, in the discrete and continuum, exhibit novel phenomenology with a distinctive spectrum of closely spaced resonances.

  7. Salinas : theory manual.

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, Timothy Francis; Reese, Garth M.; Bhardwaj, Manoj Kumar

    2011-11-01

    Salinas provides a massively parallel implementation of structural dynamics finite element analysis, required for high fidelity, validated models used in modal, vibration, static and shock analysis of structural systems. This manual describes the theory behind many of the constructs in Salinas. For a more detailed description of how to use Salinas, we refer the reader to Salinas, User's Notes. Many of the constructs in Salinas are pulled directly from published material. Where possible, these materials are referenced herein. However, certain functions in Salinas are specific to our implementation. We try to be far more complete in those areas. The theory manual was developed from several sources including general notes, a programmer notes manual, the user's notes and of course the material in the open literature.

  8. Perturbation theory with instantons

    International Nuclear Information System (INIS)

    Carruthers, P.; Pinsky, S.S.; Zachariasen, F.

    1977-05-01

    ''Perturbation theory'' rules are developed for calculating the effect of instantons in a pure Yang-Mills theory with no fermions, in the ''dilute gas'' approximation in which the N-instanton solution is assumed to be the sum of N widely separated one-instanton solutions. These rules are then used to compute the gluon propagator and proper vertex function including all orders of the instanton interaction but only to lowest order in the gluon coupling. It is to be expected that such an approximation is valid only for momenta q larger than the physical mass μ. The result is that in this regime instantons cause variations in the propagator and vertex of the form (μ²/q²)^(−8π²b), where b is the coefficient in the expansion of the β function: β = bg³ + ...
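The momentum dependence quoted in this abstract can be set out as a display equation. This is a hedged reconstruction using only the abstract's own symbols (physical mass μ, momentum q, coupling g, leading β-function coefficient b); the symbol Δ for the instanton-induced variation is introduced here for illustration and is not in the original:

```latex
% Instanton-induced variation of the gluon propagator/vertex, claimed to be
% valid in the regime q > mu, with b the leading beta-function coefficient.
\[
  \Delta \;\sim\; \left(\frac{\mu^{2}}{q^{2}}\right)^{-8\pi^{2} b},
  \qquad
  \beta(g) = b\,g^{3} + \cdots
\]
```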

  9. Affine field theories

    International Nuclear Information System (INIS)

    Cadavid, A.C.

    1989-01-01

    The author constructs a non-Abelian field theory by gauging a Kac-Moody algebra, obtaining an infinite tower of interacting vector fields and associated ghosts, that obey slightly modified Feynman rules. She discusses the spontaneous symmetry breaking of such theory via the Higgs mechanism. If the Higgs particle lies in the Cartan subalgebra of the Kac-Moody algebra, the previously massless vectors acquire a mass spectrum that is linear in the Kac-Moody index and has additional fine structure depending on the associated Lie algebra. She proceeds to show that there is no obstacle in implementing the affine extension of supersymmetric Yang-Mills theories. The result is valid in four, six and ten space-time dimensions. Then the affine extension of supergravity is investigated. She discusses only the loop algebra since the affine extension of the super-Poincare algebra appears inconsistent. The construction of the affine supergravity theory is carried out by the group manifold method and leads to an action describing infinite towers of spin 2 and spin 3/2 fields that interact subject to the symmetries of the loop algebra. The equations of motion satisfy the usual consistency check. Finally, she postulates a theory in which both the vector and scalar fields lie in the loop algebra of SO(3). This theory has an expanded soliton sector, and corresponding to the original 't Hooft-Polyakov solitonic solutions she now finds an infinite family of exact, special solutions of the new equations. She also proposes a perturbation method for obtaining an arbitrary solution of those equations for each level of the affine index

  10. Critical theory and holocaust

    Directory of Open Access Journals (Sweden)

    Krstić Predrag

    2006-01-01

    In this paper the author attempts to establish the relationship - or the lack of it - of Critical Theory to the "Jewish question", and the justification for perceiving signs of Jewish religious heritage in the thought of the representatives of this movement. The holocaust, marked out by the name "Auschwitz", is here tested as the point where the nature of this relationship has been decided. In this encounter with the cardinal challenge for contemporary social theory, the particularity of the Frankfurt School's reaction is revealed through Adorno's installation of Auschwitz as the unexpected but lawful emblem of the ending of the course that modern history has assumed. The critique of this "fascination" with Auschwitz, as well as a certain theoretical pacification and measured positioning of the holocaust into the discontinued plane of an "unfinished" project, and the continuation and closure of the valued project, are given through the communicative-theoretical pre-orientation of Jürgen Habermas's Critical Theory and of his followers. Finally, through the work of Detlev Claussen, it is suggested that in the youngest generation of Adorno's students there are signs of revision of the once already revised Critical Theory and a kind of defractured and differentiated return to the initial understanding of the decisiveness of the holocaust experience. This shift in the attitude of Critical Theory thinkers to the provocation of the holocaust is not, however, particularly reflected in the status of Jews and their tradition, but more in the age-old questioning and explanatory patterns for which they served as a "model". The question of the validity of the enlightenment project, the nature of occidental rationalism, the (non)existence of historical theology, and the understanding of identity and emancipation describe the circle of problems around which disagreement is concentrated in social critical theory.

  11. Construct Validity and Case Validity in Assessment

    Science.gov (United States)

    Teglasi, Hedwig; Nebbergall, Allison Joan; Newman, Daniel

    2012-01-01

    Clinical assessment relies on both "construct validity", which focuses on the accuracy of conclusions about a psychological phenomenon drawn from responses to a measure, and "case validity", which focuses on the synthesis of the full range of psychological phenomena pertaining to the concern or question at hand. Whereas construct validity is…

  12. Implausibility of the vibrational theory of olfaction.

    Science.gov (United States)

    Block, Eric; Jang, Seogjoo; Matsunami, Hiroaki; Sekharan, Sivakumar; Dethier, Bérénice; Ertem, Mehmed Z; Gundala, Sivaji; Pan, Yi; Li, Shengju; Li, Zhen; Lodge, Stephene N; Ozbil, Mehmet; Jiang, Huihong; Penalba, Sonia F; Batista, Victor S; Zhuang, Hanyi

    2015-05-26

    The vibrational theory of olfaction assumes that electron transfer occurs across odorants at the active sites of odorant receptors (ORs), serving as a sensitive measure of odorant vibrational frequencies, ultimately leading to olfactory perception. A previous study reported that human subjects differentiated hydrogen/deuterium isotopomers (isomers with isotopic atoms) of the musk compound cyclopentadecanone as evidence supporting the theory. Here, we find no evidence for such differentiation at the molecular level. In fact, we find that the human musk-recognizing receptor, OR5AN1, identified using a heterologous OR expression system and robustly responding to cyclopentadecanone and muscone, fails to distinguish isotopomers of these compounds in vitro. Furthermore, the mouse (methylthio)methanethiol-recognizing receptor, MOR244-3, as well as other selected human and mouse ORs, responded similarly to normal, deuterated, and ¹³C isotopomers of their respective ligands, paralleling our results with the musk receptor OR5AN1. These findings suggest that the proposed vibration theory does not apply to the human musk receptor OR5AN1, mouse thiol receptor MOR244-3, or other ORs examined. Also, contrary to the vibration theory predictions, muscone-d30 lacks the 1,380 to 1,550 cm⁻¹ IR bands claimed to be essential for musk odor. Furthermore, our theoretical analysis shows that the proposed electron transfer mechanism of the vibrational frequencies of odorants could be easily suppressed by quantum effects of nonodorant molecular vibrational modes. These and other concerns about electron transfer at ORs, together with our extensive experimental data, argue against the plausibility of the vibration theory.

  13. Zn Coordination Chemistry:  Development of Benchmark Suites for Geometries, Dipole Moments, and Bond Dissociation Energies and Their Use To Test and Validate Density Functionals and Molecular Orbital Theory.

    Science.gov (United States)

    Amin, Elizabeth A; Truhlar, Donald G

    2008-01-01

    We present nonrelativistic and relativistic benchmark databases (obtained by coupled cluster calculations) of 10 Zn-ligand bond distances, 8 dipole moments, and 12 bond dissociation energies in Zn coordination compounds with O, S, NH3, H2O, OH, SCH3, and H ligands. These are used to test the predictions of 39 density functionals, Hartree-Fock theory, and seven more approximate molecular orbital theories. In the nonrelativistic case, the M05-2X, B97-2, and mPW1PW functionals emerge as the most accurate ones for this test data, with unitless balanced mean unsigned errors (BMUEs) of 0.33, 0.38, and 0.43, respectively. The best local functionals (i.e., functionals with no Hartree-Fock exchange) are M06-L and τ-HCTH with BMUEs of 0.54 and 0.60, respectively. The popular B3LYP functional has a BMUE of 0.51, only slightly better than the value of 0.54 for the best local functional, which is less expensive. Hartree-Fock theory itself has a BMUE of 1.22. The M05-2X functional has a mean unsigned error of 0.008 Å for bond lengths, 0.19 D for dipole moments, and 4.30 kcal/mol for bond energies. The X3LYP functional has a smaller mean unsigned error (0.007 Å) for bond lengths but has mean unsigned errors of 0.43 D for dipole moments and 5.6 kcal/mol for bond energies. The M06-2X functional has a smaller mean unsigned error (3.3 kcal/mol) for bond energies but has mean unsigned errors of 0.017 Å for bond lengths and 0.37 D for dipole moments. The best of the semiempirical molecular orbital theories are PM3 and PM6, with BMUEs of 1.96 and 2.02, respectively. The ten most accurate functionals from the nonrelativistic benchmark analysis are then tested in relativistic calculations against new benchmarks obtained with coupled-cluster calculations and a relativistic effective core potential, resulting in M05-2X (BMUE = 0.895), PW6B95 (BMUE = 0.90), and B97-2 (BMUE = 0.93) as the top three functionals. We find significant relativistic effects (∼0.01 Å in bond lengths, ∼0...
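The per-property errors quoted above (Å for bond lengths, D for dipoles, kcal/mol for energies) cannot be averaged directly because they carry different units. As a sketch of how a unitless "balanced" mean unsigned error can be formed, each property's MUE is divided by a characteristic scale before averaging; the scale values below are hypothetical placeholders, not the reference values used in the paper:

```python
# Sketch: combining per-property mean unsigned errors (MUEs) into one
# unitless balanced score, in the spirit of the BMUE quoted above.
# NOTE: the normalization scales are hypothetical, not the paper's own.

def mue(predicted, reference):
    """Mean unsigned (absolute) error over paired values."""
    return sum(abs(p - r) for p, r in zip(predicted, reference)) / len(predicted)

def balanced_mue(errors_by_property, scales):
    """Average the per-property MUEs after making each unitless via its scale."""
    return sum(errors_by_property[k] / scales[k]
               for k in errors_by_property) / len(errors_by_property)

# M05-2X errors quoted in the abstract: bond lengths (Å), dipole moments (D),
# bond energies (kcal/mol), each paired with a hypothetical scale.
errors = {"lengths": 0.008, "dipoles": 0.19, "energies": 4.30}
scales = {"lengths": 0.01, "dipoles": 0.5, "energies": 10.0}
print(round(balanced_mue(errors, scales), 3))  # → 0.537
```

The design point is that the normalization makes comparisons across functionals unit-free, so a single ranking number can summarize accuracy over geometries, dipoles, and energetics at once.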

  14. Nevanlinna theory

    CERN Document Server

    Kodaira, Kunihiko

    2017-01-01

    This book deals with the classical theory of Nevanlinna on the value distribution of meromorphic functions of one complex variable, based on minimum prerequisites for complex manifolds. The theory was extended to several variables by S. Kobayashi, T. Ochiai, J. Carleson, and P. Griffiths in the early 1970s. K. Kodaira took up this subject in his course at The University of Tokyo in 1973 and gave an introductory account of this development in the context of his final paper, contained in this book. The first three chapters are devoted to holomorphic mappings from C to complex manifolds. In the fourth chapter, holomorphic mappings between higher dimensional manifolds are covered. The book is a valuable treatise on the Nevanlinna theory, of special interests to those who want to understand Kodaira's unique approach to basic questions on complex manifolds.

  15. Gauge theories

    International Nuclear Information System (INIS)

    Kenyon, I.R.

    1986-01-01

    Modern theories of the interactions between fundamental particles are all gauge theories. In the case of gravitation, application of this principle to space-time leads to Einstein's theory of general relativity. All the other interactions involve the application of the gauge principle to internal spaces. Electromagnetism serves to introduce the idea of a gauge field, in this case the electromagnetic field. The next example, the strong force, shows unique features at long and short range which have their origin in the self-coupling of the gauge fields. Finally the unification of the description of the superficially dissimilar electromagnetic and weak nuclear forces completes the picture of successes of the gauge principle. (author)

  16. What If Quantum Theory Violates All Mathematics?

    Directory of Open Access Journals (Sweden)

    Rosinger Elemér Elad

    2017-09-01

    It is shown, by using a rather elementary argument in Mathematical Logic, that if indeed quantum theory does violate the famous Bell Inequalities, then quantum theory must inevitably also violate all valid mathematical statements, and in particular such basic algebraic relations as 0 = 0, 1 = 1, 2 = 2, 3 = 3, … and so on …

  17. A systematic review of the reliability and validity of discrete choice experiments in valuing non-market environmental goods

    DEFF Research Database (Denmark)

    Rokotonarivo, Sarobidy; Schaafsma, Marije; Hockley, Neal

    2016-01-01

    ... reliability measures. DCE results were generally consistent with those of other stated preference techniques (convergent validity), but hypothetical bias was common. Evidence supporting theoretical validity (consistency with assumptions of rational choice theory) was limited. In content validity tests, 2...

  18. Galois theory

    CERN Document Server

    Stewart, Ian

    2003-01-01

    Ian Stewart's Galois Theory has been in print for 30 years. Resoundingly popular, it still serves its purpose exceedingly well. Yet mathematics education has changed considerably since 1973, when theory took precedence over examples, and the time has come to bring this presentation in line with more modern approaches. To this end, the story now begins with polynomials over the complex numbers, and the central quest is to understand when such polynomials have solutions that can be expressed by radicals. Reorganization of the material places the concrete before the abstract, thus motivating the g...

  19. Scattering theory

    International Nuclear Information System (INIS)

    Sitenko, A.

    1991-01-01

This book emerged out of graduate lectures given by the author at the University of Kiev and is intended as a graduate text. The fundamentals of non-relativistic quantum scattering theory are covered, including some topics, such as the phase-function formalism, separable potentials, and inverse scattering, which are not always covered in textbooks on scattering theory. Criticisms of the text are minor, but the reviewer feels that the index is inadequate and that the citing of references in the Russian language is a hindrance in a graduate text

  20. Whose consensus is it anyway? Scientific versus legalistic conceptions of validity

    NARCIS (Netherlands)

    Borsboom, D.

    2012-01-01

    Paul E. Newton provides an insightful and scholarly overview of central issues in validity theory. As he notes, many of the conceptual problems in validity theory derive from the fact that the word validity has two meanings. First, it indicates whether a test measures what it purports to measure.

  1. Comparing theories' performance in predicting violence

    OpenAIRE

    Haas, Henriette; Cusson, Maurice

    2015-01-01

    The stakes of choosing the best theory as a basis for violence prevention and offender rehabilitation are high. However, no single theory of violence has ever been universally accepted by a majority of established researchers. Psychiatry, psychology and sociology are each subdivided into different schools relying upon different premises. All theories can produce empirical evidence for their validity, some of them stating the opposite of each other. Calculating different models wit...

  2. Unresolved issues in theories of autoimmune disease using myocarditis as a framework

    OpenAIRE

    Root-Bernstein, Robert; Fairweather, DeLisa

    2014-01-01

    Many theories of autoimmune disease have been proposed since the discovery that the immune system can attack the body. These theories include the hidden or cryptic antigen theory, modified antigen theory, T cell bypass, T cell-B cell mismatch, epitope spread or drift, the bystander effect, molecular mimicry, anti-idiotype theory, antigenic complementarity, and dual-affinity T cell receptors. We critically review these theories and relevant mathematical models as they apply to autoimmune myoca...

  3. Neutrons moderation theory; Theorie du ralentissement des neutrons

    Energy Technology Data Exchange (ETDEWEB)

    Vigier, J P

    1949-07-01

    This report gives a summarized presentation of the theory of fast neutrons diffusion and moderation in a given environment as elaborated by M. Langevin, E. Fermi, R. Marshak and others. This statistical theory is based on three assumptions: there is no inelastic diffusion, the elastic diffusion has a spherical symmetry with respect to the center of gravity of the neutron-nucleus system (s-scattering), and the effects of chemical bonds and thermal agitation of nuclei are neglected. The first chapter analyzes the Boltzmann equation of moderation, its first approximate solution (age-velocity equation) and its domain of validity, the extension of the age-velocity theory (general solution) and the boundary conditions, the upper order approximation (spherical harmonics method and Laplace transformation), the asymptotic solutions, and the theory of spatial momenta. The second chapter analyzes the energy distribution of delayed neutrons (stationary and non-stationary cases). (J.S.)
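For orientation (a standard textbook statement, not quoted from the report itself), the age-velocity equation referred to in the first chapter takes the form

```latex
\frac{\partial q(\mathbf{r},\tau)}{\partial \tau} = \nabla^{2} q(\mathbf{r},\tau)
```

where $q$ is the slowing-down density and $\tau$ the Fermi age, a quantity with dimensions of area that plays the role of a time-like variable as neutrons lose energy.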

  4. Leadership Theories.

    Science.gov (United States)

    Sferra, Bobbie A.; Paddock, Susan C.

    This booklet describes various theoretical aspects of leadership, including the proper exercise of authority, effective delegation, goal setting, exercise of control, assignment of responsibility, performance evaluation, and group process facilitation. It begins by describing the evolution of general theories of leadership from historic concepts…

  5. Combinatorial Theory

    CERN Document Server

    Hall, Marshall

    2011-01-01

    Includes proof of van der Waerden's 1926 conjecture on permanents, Wilson's theorem on asymptotic existence, and other developments in combinatorics since 1967. Also covers coding theory and its important connection with designs, problems of enumeration, and partition. Presents fundamentals in addition to latest advances, with illustrative problems at the end of each chapter. Enlarged appendixes include a longer list of block designs.

  6. Control Theory.

    Science.gov (United States)

    Toso, Robert B.

    2000-01-01

    Inspired by William Glasser's Reality Therapy ideas, Control Theory (CT) is a disciplinary approach that stresses people's ability to control only their own behavior, based on internal motivations to satisfy five basic needs. At one North Dakota high school, CT-trained teachers are the program's best recruiters. (MLH)

  7. Framing theory

    NARCIS (Netherlands)

    de Vreese, C.H.; Lecheler, S.; Mazzoleni, G.; Barnhurst, K.G.; Ikeda, K.; Maia, R.C.M.; Wessler, H.

    2016-01-01

    Political issues can be viewed from different perspectives and they can be defined differently in the news media by emphasizing some aspects and leaving others aside. This is at the core of news framing theory. Framing originates within sociology and psychology and has become one of the most used

  8. Electricity Theory

    International Nuclear Information System (INIS)

    Gong, Ha Soung

    2006-12-01

The textbook is composed of five parts: a summary of the book; an arrangement of electricity theory covering electricity and magnetism, direct current, and alternating current; two dictionaries of electricity terms and synonyms; and an appendix. It is intended as preparation for the examinations for officers, electrical engineers, and fire-fighting engineers.

  9. Theory U

    DEFF Research Database (Denmark)

    Monthoux, Pierre Guillet de; Statler, Matt

    2014-01-01

    The recent Carnegie report (Colby, et al., 2011) characterizes the goal of business education as the development of practical wisdom. In this chapter, the authors reframe Scharmer’s Theory U as an attempt to develop practical wisdom by applying certain European philosophical concepts. Specifically...

  10. Theory U

    DEFF Research Database (Denmark)

    Guillet de Monthoux, Pierre; Statler, Matt

    2017-01-01

    The recent Carnegie report (Colby, et al., 2011) characterizes the goal of business education as the development of practical wisdom. In this chapter, the authors reframe Scharmer's Theory U as an attempt to develop practical wisdom by applying certain European philosophical concepts. Specifically...

  11. Theory summary

    International Nuclear Information System (INIS)

    Tang, W.M.

    2001-01-01

This is a summary of the advances in magnetic fusion energy theory research presented at the 17th International Atomic Energy Agency Fusion Energy Conference, 19-24 October 1998, in Yokohama, Japan. Theory and simulation results from this conference provided encouraging evidence of significant progress in understanding the physics of thermonuclear plasmas. Indeed, the grand challenge for this field is to acquire the basic understanding that can readily enable the innovations which would make fusion energy practical. In this sense, research in fusion energy is increasingly able to be categorized as fitting well the 'Pasteur's Quadrant' paradigm, where the research strongly couples basic science ('Bohr's Quadrant') to technological impact ('Edison's Quadrant'). As supported by some of the work presented at this conference, this trend will be further enhanced by advanced simulations. Eventually, realistic three-dimensional modeling capabilities, when properly combined with rapid and complete data interpretation of results from both experiments and simulations, can contribute to a greatly enhanced cycle of understanding and innovation. Plasma science theory and simulation have provided reliable foundations for this improved modeling capability, and the exciting advances in high-performance computational resources have further accelerated progress. There were 68 papers presented at this conference in the area of magnetic fusion energy theory

  12. Complexity Theory

    Science.gov (United States)

    Lee, William H K.

    2016-01-01

A complex system consists of many interacting parts, generates new collective behavior through self-organization, and adaptively evolves through time. Many theories have been developed to study complex systems, including chaos, fractals, cellular automata, self-organization, stochastic processes, turbulence, and genetic algorithms.
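As a minimal, self-contained illustration (my own sketch, not drawn from the record) of one model family the abstract lists, an elementary cellular automaton shows how a simple local rule generates new collective behavior over time; the rule number and the helper name `step` are illustrative choices:

```python
def step(cells, rule=30):
    """Advance one generation of an elementary cellular automaton.

    `cells` is a list of 0/1 states; `rule` is the Wolfram rule number
    whose bits give the new state for each 3-cell neighborhood.
    """
    n = len(cells)
    out = []
    for i in range(n):
        # Read the neighborhood (periodic boundary) as a 3-bit index.
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (center << 1) | right
        out.append((rule >> idx) & 1)
    return out

# A single active cell unfolds into an intricate pattern under rule 30.
row = [0] * 7
row[3] = 1
for _ in range(3):
    row = step(row)
# row is now [1, 1, 0, 1, 1, 1, 1], the third row of the rule-30 triangle.
```

Despite the triviality of each local update, the global pattern is famously irregular, which is the sense in which collective behavior emerges from interacting parts.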

  13. Matching theory

    CERN Document Server

    Plummer, MD

    1986-01-01

    This study of matching theory deals with bipartite matching, network flows, and presents fundamental results for the non-bipartite case. It goes on to study elementary bipartite graphs and elementary graphs in general. Further discussed are 2-matchings, general matching problems as linear programs, the Edmonds Matching Algorithm (and other algorithmic approaches), f-factors and vertex packing.
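To make the flavor of these topics concrete, here is a minimal sketch (my own, not taken from the book) of maximum bipartite matching via augmenting paths, the basic idea underlying the algorithmic approaches the record mentions; all names are illustrative:

```python
def max_bipartite_matching(adj, n_right):
    """Size of a maximum matching; adj[u] lists right-vertices adjacent to left-vertex u."""
    match_right = [-1] * n_right  # match_right[v] = left vertex currently matched to v

    def try_augment(u, seen):
        # Try to match u, possibly re-matching previously matched vertices.
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            # v is free, or its current partner can be shifted elsewhere.
            if match_right[v] == -1 or try_augment(match_right[v], seen):
                match_right[v] = u
                return True
        return False

    matching = 0
    for u in range(len(adj)):
        if try_augment(u, set()):
            matching += 1
    return matching

# Example: 3 left vertices, 3 right vertices; a perfect matching exists.
# max_bipartite_matching([[0, 1], [0], [1, 2]], 3) returns 3.
```

Each call to `try_augment` searches for an augmenting path from one unmatched left vertex; repeating this for every left vertex yields a maximum matching, the bipartite special case that the book's later chapters extend to general graphs.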

  14. Activity Theory

    DEFF Research Database (Denmark)

    Bertelsen, Olav Wedege; Bødker, Susanne

    2003-01-01

    the young HCI research tradition. But HCI was already facing problems: lack of consideration for other aspects of human behavior, for interaction with other people, for culture. Cognitive science-based theories lacked means to address several issues that came out of the empirical projects....

  15. Contribution to diffraction theory

    International Nuclear Information System (INIS)

    Chako, N.

    1966-11-01

    In a first part, we have given a general and detailed treatment of the modern theory of diffraction. The rigorous theory is formulated as a boundary value problem of the wave equation or Maxwell equations. However, up to the present time, such a program of treating diffraction by optical systems, even for simple optical instruments, has not been realized due to the complicated character of the boundary conditions. The recent developments show clearly the nature of the approximation of the classical theories originally due to Fresnel and Young, later formulated in a rigorous manner by Kirchhoff and Rubinowicz, respectively and, at the same time the insufficiency of these theories in explaining a number of diffraction phenomena. Furthermore, we have made a study of the limitations of the approximate theories and the recent attempts to improve these. The second part is devoted to a general mathematical treatment of the theory of diffraction of optical systems including aberrations. After a general and specific analysis of geometrical and wave aberrations along classical and modern (Nijboer) lines, we have been able to evaluate the diffraction integrals representing the image field at any point in image space explicitly, when the aberrations are small. Our formulas are the generalisations of all anterior results obtained by previous investigators. Moreover, we have discussed the Zernike-Nijboer theory of aberration and generalised it not only for rotational systems, but also for non-symmetric systems as well, including the case of non circular apertures. The extension to non-circular apertures is done by introducing orthogonal functions or polynomials over such aperture shapes. So far the results are valid for small aberrations, that is to say, where the deformation of the real wave front emerging from the optical system is less than a wave length of light or of the electromagnetic wave from the ideal wave front. 
If the aberrations are large, then one must employ the

  16. GABA receptor imaging

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Doo [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2007-04-15

GABA is primarily an inhibitory neurotransmitter that is localized in inhibitory interneurons. GABA is released from presynaptic terminals and functions by binding to GABA receptors. There are two types of GABA receptors: the GABA{sub A}-receptor, which allows chloride to pass through a ligand-gated ion channel, and the GABA{sub B}-receptor, which uses G-proteins for signaling. The GABA{sub A}-receptor has a GABA binding site as well as a benzodiazepine binding site, which modulates GABA{sub A}-receptor function. Benzodiazepine GABA{sub A} receptor imaging can be accomplished by radiolabeling derivatives that act on the benzodiazepine binding sites. There has been much research on flumazenil (FMZ) labeled with {sup 11}C ({sup 11}C-FMZ), a benzodiazepine derivative that is a selective, reversible antagonist of GABA{sub A} receptors. Recently, {sup 18}F-fluoroflumazenil (FFMZ) has been developed to overcome the short half-life of {sup 11}C. {sup 18}F-FFMZ shows high selective affinity and good pharmacodynamics, and is a promising PET agent with better central benzodiazepine receptor imaging capabilities. In an epileptic focus the amount of GABA/benzodiazepine receptor is decreased, so {sup 11}C-FMZ PET delineates the foci better than {sup 18}F-FDG PET and may also help find lesions better than high-resolution MR. GABA{sub A} receptors are widely distributed in the cerebral cortex and can be used as a viable neuronal marker; they can therefore serve as a neuronal cell viability marker in cerebral ischemia. Also, GABA receptors decrease in areas where neuronal plasticity develops, so GABA imaging can be used to evaluate plasticity. Beyond these uses, GABA receptors are implicated in psychiatric diseases, especially depression and schizophrenia, as well as in cerebral palsy, a motor-related disorder, so further in-depth studies are needed in these areas.

  17. GABA receptor imaging

    International Nuclear Information System (INIS)

    Lee, Jong Doo

    2007-01-01

GABA is primarily an inhibitory neurotransmitter that is localized in inhibitory interneurons. GABA is released from presynaptic terminals and functions by binding to GABA receptors. There are two types of GABA receptors: the GABA-A receptor, which allows chloride to pass through a ligand-gated ion channel, and the GABA-B receptor, which uses G-proteins for signaling. The GABA-A receptor has a GABA binding site as well as a benzodiazepine binding site, which modulates GABA-A receptor function. Benzodiazepine GABA-A receptor imaging can be accomplished by radiolabeling derivatives that act on the benzodiazepine binding sites. There has been much research on flumazenil (FMZ) labeled with 11C (11C-FMZ), a benzodiazepine derivative that is a selective, reversible antagonist of GABA-A receptors. Recently, 18F-fluoroflumazenil (FFMZ) has been developed to overcome the short half-life of 11C. 18F-FFMZ shows high selective affinity and good pharmacodynamics, and is a promising PET agent with better central benzodiazepine receptor imaging capabilities. In an epileptic focus the amount of GABA/benzodiazepine receptor is decreased, so 11C-FMZ PET delineates the foci better than 18F-FDG PET and may also help find lesions better than high-resolution MR. GABA-A receptors are widely distributed in the cerebral cortex and can be used as a viable neuronal marker; they can therefore serve as a neuronal cell viability marker in cerebral ischemia. Also, GABA receptors decrease in areas where neuronal plasticity develops, so GABA imaging can be used to evaluate plasticity. Beyond these uses, GABA receptors are implicated in psychiatric diseases, especially depression and schizophrenia, as well as in cerebral palsy, a motor-related disorder, so further in-depth studies are needed in these areas

  18. Sociological Theory and Social Reality [ENG

    Directory of Open Access Journals (Sweden)

    JUAN DÍEZ NICOLÁS

    2013-01-01

Full Text Available This paper aims to demonstrate the complementary relations between three relatively recent sociological theories, each of which explains a different aspect of the same social object: the origin, diffusion, and change of social and cultural values. The goal is to show that there is no single sociological theory that explains everything, but rather diverse theories that offer partial explanations of social reality. To that effect, and on the basis of the necessary relationship between theory and research, three different theories are evaluated separately: Hawley's and Duncan's theory of the social ecosystem, Galtung's centre-periphery theory, and Inglehart's theory of value change in modern industrial societies, offering theoretical and empirical evidence of their complementary relations based on Spanish and international data. Social ecosystem and centre-periphery theories show a high level of generalization (through space and time) and a high level of abstraction, though both can easily operationalize their main concepts through valid and reliable indicators. The theory of value change, however, though showing a high level of generalization, is limited in time to the historical period after World War II, and also shows a high level of abstraction. Centre-periphery theory and value-change theory use individual and collective units of analysis, but social ecosystem theory only uses collective units, by definition. The three theories lead to the conclusion that 'security' values will gain growing importance in present societies.

  19. Bootstrapping N=3 superconformal theories

    Energy Technology Data Exchange (ETDEWEB)

    Lemos, Madalena; Liendo, Pedro [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Meneghelli, Carlo [Stony Brook Univ., Stony Brook, NY (United States). Simons Center for Geometry and Physics; Mitev, Vladimir [Mainz Univ. (Germany). PRISMA Cluster of Excellence

    2016-12-15

We initiate the bootstrap program for N=3 superconformal field theories (SCFTs) in four dimensions. The problem is considered from two fronts: the protected subsector described by a 2d chiral algebra, and crossing symmetry for half-BPS operators whose superconformal primaries parametrize the Coulomb branch of N=3 theories. With the goal of describing a protected subsector of a family of N=3 SCFTs, we propose a new 2d chiral algebra with super Virasoro symmetry that depends on an arbitrary parameter, identified with the central charge of the theory. Turning to the crossing equations, we work out the superconformal block expansion and apply standard numerical bootstrap techniques in order to constrain the CFT data. We obtain bounds valid for any theory but also, thanks to input from the chiral algebra results, we are able to exclude solutions with N=4 supersymmetry, allowing us to zoom in on a specific N=3 SCFT.

  20. Bootstrapping N=3 superconformal theories

    Energy Technology Data Exchange (ETDEWEB)

    Lemos, Madalena; Liendo, Pedro [DESY Hamburg, Theory Group,Notkestrasse 85, D-22607 Hamburg (Germany); Meneghelli, Carlo [Simons Center for Geometry and Physics,Stony Brook University, Stony Brook, NY 11794-3636 (United States); Mitev, Vladimir [PRISMA Cluster of Excellence, Institut für Physik,JGU Mainz, Staudingerweg 7, 55128 Mainz (Germany)

    2017-04-06

    We initiate the bootstrap program for N=3 superconformal field theories (SCFTs) in four dimensions. The problem is considered from two fronts: the protected subsector described by a 2d chiral algebra, and crossing symmetry for half-BPS operators whose superconformal primaries parametrize the Coulomb branch of N=3 theories. With the goal of describing a protected subsector of a family of N=3 SCFTs, we propose a new 2d chiral algebra with super Virasoro symmetry that depends on an arbitrary parameter, identified with the central charge of the theory. Turning to the crossing equations, we work out the superconformal block expansion and apply standard numerical bootstrap techniques in order to constrain the CFT data. We obtain bounds valid for any theory but also, thanks to input from the chiral algebra results, we are able to exclude solutions with N=4 supersymmetry, allowing us to zoom in on a specific N=3 SCFT.

  1. Cavitation Nuclei: Experiments and Theory

    DEFF Research Database (Denmark)

    Mørch, Knud Aage

    2009-01-01

The Swedish astrophysicist and Nobel Prize winner Hannes Alfven said: Theories come and go - the experiment is here forever. Often a theory, which we set up to describe an observed physical phenomenon, suffers from the lack of knowledge of decisive parameters, and therefore at best the theory becomes insufficient. On the contrary, the experiment always reveals nature itself, though under the prevailing experimental conditions. With essential parameters out of control and perhaps even unidentified, apparently similar experiments may deviate far beyond our expectations. However, these discrepancies offer us a chance to reflect on the character of the unknown parameters. In this way non-concordant experimental results may hold the key to the development of better theories - and to new experiments for the testing of their validity. Cavitation and cavitation nuclei are phenomena of that character.

  2. Glucocorticoid receptor modulators.

    Science.gov (United States)

    Meijer, Onno C; Koorneef, Lisa L; Kroon, Jan

    2018-06-01

    The glucocorticoid hormone cortisol acts throughout the body to support circadian processes and adaptation to stress. The glucocorticoid receptor is the target of cortisol and of synthetic glucocorticoids, which are used widely in the clinic. Both agonism and antagonism of the glucocorticoid receptor may be beneficial in disease, but given the wide expression of the receptor and involvement in various processes, beneficial effects are often accompanied by unwanted side effects. Selective glucocorticoid receptor modulators are ligands that induce a receptor conformation that allows activation of only a subset of downstream signaling pathways. Such molecules thereby combine agonistic and antagonistic properties. Here we discuss the mechanisms underlying selective receptor modulation and their promise in treating diseases in several organ systems where cortisol signaling plays a role. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  3. Dengue virus receptor

    OpenAIRE

    Hidari, Kazuya I.P.J.; Suzuki, Takashi

    2011-01-01

    Dengue virus is an arthropod-borne virus transmitted by Aedes mosquitoes. Dengue virus causes fever and hemorrhagic disorders in humans and non-human primates. Direct interaction of the virus introduced by a mosquito bite with host receptor molecule(s) is crucial for virus propagation and the pathological progression of dengue diseases. Therefore, elucidation of the molecular mechanisms underlying the interaction between dengue virus and its receptor(s) in both humans and mosquitoes is essent...

  4. Communication theory

    DEFF Research Database (Denmark)

    Stein, Irene F.; Stelter, Reinhard

    2011-01-01

Communication theory covers a wide variety of theories related to the communication process (Littlejohn, 1999). Communication is not simply an exchange of information, in which we have a sender and a receiver. This very technical concept of communication is clearly outdated; a human being is not a data processing device. In this chapter, communication is understood as a process of shared meaning-making (Bruner, 1990). Human beings interpret their environment, other people, and themselves on the basis of their dynamic interaction with the surrounding world. Meaning is essential because people ascribe specific meanings to their experiences, their actions in life or work, and their interactions. Meaning is reshaped, adapted, and transformed in every communication encounter. Furthermore, meaning is cocreated in dialogues or in communities of practice, such as in teams at a workplace or in school...

  5. Operator theory

    CERN Document Server

    2015-01-01

    A one-sentence definition of operator theory could be: The study of (linear) continuous operations between topological vector spaces, these being in general (but not exclusively) Fréchet, Banach, or Hilbert spaces (or their duals). Operator theory is thus a very wide field, with numerous facets, both applied and theoretical. There are deep connections with complex analysis, functional analysis, mathematical physics, and electrical engineering, to name a few. Fascinating new applications and directions regularly appear, such as operator spaces, free probability, and applications to Clifford analysis. In our choice of the sections, we tried to reflect this diversity. This is a dynamic ongoing project, and more sections are planned, to complete the picture. We hope you enjoy the reading, and profit from this endeavor.

  6. Potential theory

    CERN Document Server

    Helms, Lester L

    2014-01-01

    Potential Theory presents a clear path from calculus to classical potential theory and beyond, with the aim of moving the reader into the area of mathematical research as quickly as possible. The subject matter is developed from first principles using only calculus. Commencing with the inverse square law for gravitational and electromagnetic forces and the divergence theorem, the author develops methods for constructing solutions of Laplace's equation on a region with prescribed values on the boundary of the region. The latter half of the book addresses more advanced material aimed at those with the background of a senior undergraduate or beginning graduate course in real analysis. Starting with solutions of the Dirichlet problem subject to mixed boundary conditions on the simplest of regions, methods of morphing such solutions onto solutions of Poisson's equation on more general regions are developed using diffeomorphisms and the Perron-Wiener-Brelot method, culminating in application to Brownian motion. In ...
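In standard notation (added here for orientation, not quoted from the book), the boundary-value problem described above is the classical Dirichlet problem:

```latex
\Delta u = 0 \quad \text{in } \Omega, \qquad u = f \quad \text{on } \partial\Omega
```

where $\Omega$ is the region and $f$ the prescribed boundary values; the Perron-Wiener-Brelot method mentioned in the abstract constructs such solutions on general regions.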

  7. Practical theories

    DEFF Research Database (Denmark)

    Jensen, Klaus Bruhn

    2016-01-01

This article revisits the place of normative and other practical issues in the wider conceptual architecture of communication theory, building on the tradition of philosophical pragmatism. The article first characterizes everyday concepts of communication as the accumulated outcome of natural evolution and history: practical resources for human existence and social coexistence. Such practical concepts have served as the point of departure for diverse theoretical conceptions of what communication is. The second part of the article highlights the past neglect and current potential of normative communication theories that ask, in addition, what communication ought to be, and what it could be, taking the relationship between communication and justice as a case in point. The final section returns to empirical conceptualizations of different institutions, practices and discourses of communication...

  8. Gauge theories

    International Nuclear Information System (INIS)

    Jarlskog, C.

An introduction to the unified gauge theories of weak and electromagnetic interactions is given. The ingredients of gauge theories and symmetries and conservation laws lead to discussion of local gauge invariance and QED, followed by weak interactions and quantum flavor dynamics. The construction of the standard SU(2)xU(1) model precedes discussion of the unification of weak and electromagnetic interactions and weak neutral current couplings in this model. Presentation of spontaneous symmetry breaking and spontaneous breaking of a local symmetry leads to a spontaneous breaking scheme for the standard SU(2)xU(1) model. Consideration of quarks, leptons, masses and the Cabibbo angles, of the four quark and six quark models and CP violation lead finally to grand unification, followed by discussion of mixing angles in the Georgi-Glashow model, the Higgses of the SU(5) model and proton/neutron decay in SU(5). (JIW)

  9. Twistor theory

    International Nuclear Information System (INIS)

    Perjes, Z.

    1982-01-01

    Particle models in twistor theory are reviewed, starting with an introduction into the kinematical-twistor formalism which describes massive particles in Minkowski space-time. The internal transformations of constituent twistors are then discussed. The quantization rules available from a study of twistor scattering situations are used to construct quantum models of fundamental particles. The theory allows the introduction of an internal space with a Kaehlerian metric where hadron structure is described by spherical states of bound constituents. It is conjectured that the spectrum of successive families of hadrons might approach an accumulation point in energy. Above this threshold energy, the Kaehlerian analog of ionization could occur wherein the zero-mass constituents (twistors) of the particle break free. (Auth.)

  10. Biocultural Theory

    DEFF Research Database (Denmark)

    Carroll, Joseph; Clasen, Mathias; Jonsson, Emelie

    2017-01-01

Biocultural theory is an integrative research program designed to investigate the causal interactions between biological adaptations and cultural constructions. From the biocultural perspective, cultural processes are rooted in the biological necessities of the human life cycle: specifically human forms of birth, growth, survival, mating, parenting, and sociality. Conversely, from the biocultural perspective, human biological processes are constrained, organized, and developed by culture, which includes technology, culturally specific socioeconomic and political structures, religious... The article frames these diverse strands of research as contributions to a coherent, collective research program, arguing that a mature biocultural paradigm needs to be informed by at least 7 major research clusters: (a) gene-culture coevolution; (b) human life history theory; (c) evolutionary social psychology; (d) anthropological...

  11. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  12. Elastoplasticity theory

    CERN Document Server

    Hashiguchi, Koichi

    2014-01-01

    This book was written to serve as the standard textbook of elastoplasticity for students, engineers and researchers in the field of applied mechanics. The present second edition is improved thoroughly from the first edition by selecting the standard theories from various formulations and models, which are required to study the essentials of elastoplasticity steadily and effectively and will remain universally in the history of elastoplasticity. It opens with an explanation of vector-tensor analysis and continuum mechanics as a foundation to study elastoplasticity theory, extending over various strain and stress tensors and their rates. Subsequently, constitutive equations of elastoplastic and viscoplastic deformations for monotonic, cyclic and non-proportional loading behavior in a general rate and their applications to metals and soils are described in detail, and constitutive equations of friction behavior between solids and its application to the prediction of stick-slip phenomena are delineated. In additi...

  13. Livability theory

    OpenAIRE

    Veenhoven, Ruut

    2014-01-01

    Abstract: Assumptions: Livability theory involves the following six key assumptions: 1. Like all animals, humans have innate needs, such as for food, safety, and companionship. 2. Gratification of needs manifests in hedonic experience. 3. Hedonic experience determines how much we like the life we live (happiness). Hence, happiness depends on need gratification. 4. Need gratification depends on both external living conditions and inner abilities to use these. Hence, bad living...

  14. Testing theories

    International Nuclear Information System (INIS)

    Casten, R F

    2015-01-01

    This paper discusses some simple issues that arise in testing models, with a focus on models for low energy nuclear structure. By way of simplified examples, we illustrate some dangers in blind statistical assessments, pointing out especially the need to include theoretical uncertainties, the danger of over-weighting precise or physically redundant experimental results, the need to assess competing theories with independent and physically sensitive observables, and the value of statistical tests properly evaluated. (paper)

  15. Validity evidence based on test content.

    Science.gov (United States)

    Sireci, Stephen; Faulkner-Bond, Molly

    2014-01-01

    Validity evidence based on test content is one of the five forms of validity evidence stipulated in the Standards for Educational and Psychological Testing developed by the American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. In this paper, we describe the logic and theory underlying such evidence and describe traditional and modern methods for gathering and analyzing content validity data. A comprehensive review of the literature and of the aforementioned Standards is presented. For educational tests and other assessments targeting knowledge and skill possessed by examinees, validity evidence based on test content is necessary for building a validity argument to support the use of a test for a particular purpose. By following the methods described in this article, practitioners have a wide arsenal of tools available for determining how well the content of an assessment is congruent with and appropriate for the specific testing purposes.

  16. Graph theory

    CERN Document Server

    Diestel, Reinhard

    2017-01-01

    This standard textbook of modern graph theory, now in its fifth edition, combines the authority of a classic with the engaging freshness of style that is the hallmark of active mathematics. It covers the core material of the subject with concise yet reliably complete proofs, while offering glimpses of more advanced methods in each field by one or two deeper results, again with proofs given in full detail. The book can be used as a reliable text for an introductory course, as a graduate text, and for self-study. From the reviews: “This outstanding book cannot be substituted with any other book on the present textbook market. It has every chance of becoming the standard textbook for graph theory.” Acta Scientiarum Mathematicarum. “Deep, clear, wonderful. This is a serious book about the heart of graph theory. It has depth and integrity.” Persi Diaconis & Ron Graham, SIAM Review. “The book has received a very enthusiastic reception, which it amply deserves. A masterly elucidation of modern graph theo...

  17. Scattering theory

    CERN Document Server

    Friedrich, Harald

    2016-01-01

    This corrected and updated second edition of "Scattering Theory" presents a concise and modern coverage of the subject. In the present treatment, special attention is given to the role played by the long-range behaviour of the projectile-target interaction, and a theory is developed, which is well suited to describe near-threshold bound and continuum states in realistic binary systems such as diatomic molecules or molecular ions. It is motivated by the fact that experimental advances have shifted and broadened the scope of applications where concepts from scattering theory are used, e.g. to the field of ultracold atoms and molecules, which has been experiencing enormous growth in recent years, largely triggered by the successful realization of Bose-Einstein condensates of dilute atomic gases in 1995. The book contains sections on special topics such as near-threshold quantization, quantum reflection, Feshbach resonances and the quantum description of scattering in two dimensions. The level of abstraction is k...

  18. Dimensions of ecosystem theory

    International Nuclear Information System (INIS)

    O'Neill, R.V.; Reichle, D.E.

    1979-01-01

    Various dimensions of ecosystem structure and behavior that seem to develop from the ubiquitous phenomena of system growth and persistence were studied. While growth and persistence attributes of ecosystems may appear to be simplistic phenomena upon which to base a comprehensive ecosystem theory, these same attributes have been fundamental to the theoretical development of other biological disciplines. These attributes were explored at a hierarchical level in a self-organizing system, and adaptive system strategies that result were analyzed. Previously developed causative relations (Reichle et al., 1975c) were examined, their theoretical implications expounded upon, and the assumptions tested with data from a variety of forest types. The conclusions are not a theory in themselves, but a state of organization of concepts contributing towards a unifying theory, along the lines promulgated by Bray (1958). The inferences drawn rely heavily upon data from forested ecosystems of the world, and have yet to be validated against data from a much more diverse range of ecosystem types. Not all of the interpretations are logically tight - there is room for other explanations, which it is hoped will provide fruitful grounds for further speculation

  19. Multipactor theory for multicarrier signals

    International Nuclear Information System (INIS)

    Anza, S.; Vicente, C.; Gil, J.; Mattes, M.; Raboso, D.; Boria, V. E.; Gimeno, B.

    2011-01-01

    This work presents a new theory of multipactor under multicarrier signals for parallel-plate geometries, assuming a homogeneous electric field and one-dimensional electron motion. It is the generalization of the nonstationary multipactor theory for single-carrier signals [S. Anza et al.,Phys. Plasmas 17, 062110 (2010)]. It is valid for multicarrier signals with an arbitrary number of carriers with different amplitude, arbitrary frequency, and phase conditions and for any material coating. This new theory is able to model the real dynamics of the electrons during the multipactor discharge for both single and double surface interactions. Among other parameters of the discharge, it calculates the evolution in time of the charge growth, electron absorption, and creation rates as well as the instantaneous secondary emission yield and order. An extensive set of numerical tests with particle-in-cell software has been carried out in order to validate the theory under many different conditions. This theoretical development constitutes the first multipactor theory which completely characterizes the multipactor discharge for arbitrary multicarrier signals, setting the first step for further investigations in the field.

  20. Is the string theory doomed?

    International Nuclear Information System (INIS)

    Le Meur, H.; Daninos, F.; Bachas, C.

    2007-01-01

    Since its beginnings in the sixties, string theory has succeeded in overcoming many theoretical difficulties, but the complete absence of experimental validation now raises doubts about its ability to represent the real world and calls into question its hegemony in today's theoretical physics. Other space-time theories, such as twistors, non-commutative geometry, loop quantum gravity, or causal dynamical triangulation, might begin receiving more attention. Despite all that, string theory can be credited with 4 achievements. First, string theory has provided a consistent quantum description of gravity. Secondly, string theory has built a theoretical frame that has allowed the unification of the 4 basic interactions. Thirdly, string theory applied to astrophysics issues has demonstrated that the evaporation of a black hole does not necessarily lead to a loss of information, which supports the universality of the conservation of information in any system and consequently deals a fatal blow to the so-called information paradox observed in black holes. Fourthly, string theory has given a new and original meaning to the true nature of space-time. (A.C.)

  1. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10{sup 13}-10{sup 14} neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  2. Correcting Fallacies in Validity, Reliability, and Classification

    Science.gov (United States)

    Sijtsma, Klaas

    2009-01-01

    This article reviews three topics from test theory that continue to raise discussion and controversy and capture test theorists' and constructors' interest. The first topic concerns the discussion of the methodology of investigating and establishing construct validity; the second topic concerns reliability and its misuse, alternative definitions…

  3. Playing to win over: validating persuasive games

    NARCIS (Netherlands)

    R.S. Jacobs (Ruud)

    2017-01-01

    textabstractThis dissertation describes four years of scientific inquiry into persuasive games – digital games designed to persuade – as part of a multidisciplinary research project ‘Persuasive Gaming. From Theory-Based Design to Validation and Back’ funded by the Netherlands Organization for

  4. Statistical test theory for the behavioral sciences

    CERN Document Server

    de Gruijter, Dato N M

    2007-01-01

    Since the development of the first intelligence test in the early 20th century, educational and psychological tests have become important measurement techniques to quantify human behavior. Focusing on this ubiquitous yet fruitful area of research, Statistical Test Theory for the Behavioral Sciences provides both a broad overview and a critical survey of assorted testing theories and models used in psychology, education, and other behavioral science fields. Following a logical progression from basic concepts to more advanced topics, the book first explains classical test theory, covering true score, measurement error, and reliability. It then presents generalizability theory, which provides a framework to deal with various aspects of test scores. In addition, the authors discuss the concept of validity in testing, offering a strategy for evidence-based validity. In the two chapters devoted to item response theory (IRT), the book explores item response models, such as the Rasch model, and applications, incl...

  5. Theory of tapered laser cooling

    International Nuclear Information System (INIS)

    Okamoto, Hiromi; Wei, J.

    1998-01-01

    A theory of tapered laser cooling for fast circulating ion beams in a storage ring is constructed. The authors describe the fundamentals of this new cooling scheme, emphasizing that it might be the most promising way to beam crystallization. The cooling rates are analytically evaluated to study the ideal operating condition. They discuss the physical implication of the tapering factor of cooling laser, and show how to determine its optimum value. Molecular dynamics method is employed to demonstrate the validity of the present theory

  6. Contribution to a Theory of Detailed Design

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik

    1999-01-01

    It has been recognised that the literature does not actually propose a theory of detailed design. In this paper a theory contribution is proposed, linking part design to organ design and allowing a type of functional reasoning. The proposed theory satisfies our need for explaining the nature of a part structure, for support of synthesis of part structure, i.e. detailed design, and our need for digital modelling of part structures. The aim of this paper is to contribute to a design theory valid for detailed design. The proposal is based upon the theory's ability to explain the nature of machine parts and assemblies, to support the synthesis of parts and to allow the modelling, especially digital modelling, of a part structure. The contribution is based upon the Theory of Technical Systems, Hubka, and the Domain Theory, Andreasen. This paper is based on a paper presented at ICED 99, Mortensen, but focus...

  7. Water soluble and efficient amino acid Schiff base receptor for reversible fluorescence turn-on detection of Zn2+ ions: Quantum chemical calculations and detection of bacteria

    Science.gov (United States)

    Subha, L.; Balakrishnan, C.; Natarajan, Satheesh; Theetharappan, M.; Subramanian, Balanehru; Neelakantan, M. A.

    2016-01-01

    An amino acid Schiff base (R) capable of recognizing Zn2+ ions selectively and sensitively in an aqueous medium was prepared and characterized. Upon addition of Zn2+ ions, the receptor exhibits fluorescence intensity enhancement (≈40-fold) at 460 nm (quantum yield, Φ = 0.05 for R and Φ = 0.18 for R-Zn2+) and can be detected by the naked eye under UV light. The receptor can recognize Zn2+ (1.04 × 10⁻⁸ M) selectively over other metal ions in the pH range of 7.5-11. Zn2+ chelation with R decreases the loss of energy through non-radiative transitions and leads to fluorescence enhancement. The binding mode of the receptor with Zn2+ was investigated by 1H NMR titration and further validated by ESI-MS. Elemental color mapping and SEM/EDS analysis were also used to study the binding of R with Zn2+. Density functional theory calculations were carried out to understand the binding mechanism. The receptor was applied as a microbial sensor for Escherichia coli and Staphylococcus aureus.

  8. Angiotensin type 2 receptors

    DEFF Research Database (Denmark)

    Sumners, Colin; de Kloet, Annette D; Krause, Eric G

    2015-01-01

    In most situations, the angiotensin AT2-receptor (AT2R) mediates physiological actions opposing those mediated by the AT1-receptor (AT1R), including a vasorelaxant effect. Nevertheless, experimental evidence vastly supports that systemic application of AT2R-agonists is blood pressure neutral...

  9. Communication theory

    CERN Document Server

    Goldie, Charles M

    1991-01-01

    This book is an introduction, for mathematics students, to the theories of information and codes. They are usually treated separately but, as both address the problem of communication through noisy channels (albeit from different directions), the authors have been able to exploit the connection to give a reasonably self-contained treatment, relating the probabilistic and algebraic viewpoints. The style is discursive and, as befits the subject, plenty of examples and exercises are provided. Some examples of computer codes are given to provide concrete illustrations of abstract ideas.
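    The noisy-channel problem both theories address can be made concrete with the binary symmetric channel, whose capacity has a closed form, C = 1 - H(p). (A standard result of information theory, not an example taken from the book.)

```python
from math import log2

def bsc_capacity(p: float) -> float:
    """Capacity, in bits per channel use, of the binary symmetric
    channel with crossover probability p: C = 1 - H(p), where H is
    the binary entropy function."""
    if p in (0.0, 1.0):
        return 1.0  # a deterministic channel carries one full bit
    h = -p * log2(p) - (1.0 - p) * log2(1.0 - p)
    return 1.0 - h

# Capacity drops to zero when the channel is pure noise (p = 0.5):
for p in (0.0, 0.1, 0.5):
    print(p, bsc_capacity(p))
```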

  10. Design theory

    CERN Document Server

    2009-01-01

    This book deals with the basic subjects of design theory. It begins with balanced incomplete block designs, various constructions of which are described in ample detail. In particular, finite projective and affine planes, difference sets and Hadamard matrices, as tools to construct balanced incomplete block designs, are included. Orthogonal latin squares are also treated in detail. Zhu's simpler proof of the falsity of Euler's conjecture is included. The construction of some classes of balanced incomplete block designs, such as Steiner triple systems and Kirkman triple systems, are also given.
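    The Steiner triple systems mentioned in the abstract can be illustrated by the smallest nontrivial example, the Fano plane, generated here from the perfect difference set {0, 1, 3} mod 7. (A standard textbook construction, not code from the book.)

```python
from itertools import combinations

# Fano plane: the 2-(7,3,1) design, i.e. the Steiner triple system S(2,3,7).
FANO = [{(i + d) % 7 for d in (0, 1, 3)} for i in range(7)]

def is_2_design(blocks, v, k, lam):
    """Check that each block has k points and that every pair of the v
    points occurs together in exactly lam blocks (the balance condition
    of a balanced incomplete block design)."""
    pairs_ok = all(
        sum(set(pair) <= b for b in blocks) == lam
        for pair in combinations(range(v), 2)
    )
    return pairs_ok and all(len(b) == k for b in blocks)

print(is_2_design(FANO, v=7, k=3, lam=1))  # → True
```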

  11. Theory of coherent resonance energy transfer

    International Nuclear Information System (INIS)

    Jang, Seogjoo; Cheng, Y.-C.; Reichman, David R.; Eaves, Joel D.

    2008-01-01

    A theory of coherent resonance energy transfer is developed combining the polaron transformation and a time-local quantum master equation formulation, which is valid for arbitrary spectral densities including common modes. The theory contains inhomogeneous terms accounting for nonequilibrium initial preparation effects and elucidates how quantum coherence and nonequilibrium effects manifest themselves in the coherent energy transfer dynamics beyond the weak resonance coupling limit of the Foerster and Dexter (FD) theory. Numerical tests show that quantum coherence can cause significant changes in steady state donor/acceptor populations from those predicted by the FD theory and illustrate delicate cooperation of nonequilibrium and quantum coherence effects on the transient population dynamics.
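    For reference, the incoherent Förster (FD) limit against which the coherent theory is compared predicts a transfer efficiency set only by the donor-acceptor distance r and the Förster radius r0. (The textbook distance dependence, not the paper's master-equation result.)

```python
def forster_efficiency(r: float, r0: float) -> float:
    """Förster transfer efficiency: E = 1 / (1 + (r/r0)**6), where r0
    is the distance at which half the excitation energy is transferred."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# The 1/r**6 dependence makes efficiency fall off steeply around r0:
for r in (0.5, 1.0, 2.0):
    print(r, forster_efficiency(r, r0=1.0))
```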

  12. What If Quantum Theory Violates All Mathematics?

    Science.gov (United States)

    Rosinger, Elemér Elad

    2017-09-01

    It is shown by using a rather elementary argument in Mathematical Logic that if indeed, quantum theory does violate the famous Bell Inequalities, then quantum theory must inevitably also violate all valid mathematical statements, and in particular, such basic algebraic relations like 0 = 0, 1 = 1, 2 = 2, 3 = 3, … and so on … An interest in that result is due to the following three alternatives which it imposes upon both Physics and Mathematics: Quantum Theory is inconsistent. Quantum Theory together with Mathematics are inconsistent. Mathematics is inconsistent. In this regard one should recall that, up until now, it is not known whether Mathematics is indeed consistent.

  13. Glutamate receptor agonists

    DEFF Research Database (Denmark)

    Vogensen, Stine Byskov; Greenwood, Jeremy R; Bunch, Lennart

    2011-01-01

    The neurotransmitter (S)-glutamate [(S)-Glu] is responsible for most of the excitatory neurotransmission in the central nervous system. The effect of (S)-Glu is mediated by both ionotropic and metabotropic receptors. Glutamate receptor agonists are generally α-amino acids with one or more stereogenic centers due to strict requirements in the agonist binding pocket of the activated state of the receptor. By contrast, there are many examples of achiral competitive antagonists. The present review addresses how stereochemistry affects the activity of glutamate receptor ligands. The review focuses mainly on agonists and discusses stereochemical and conformational considerations as well as biostructural knowledge of the agonist binding pockets, which is useful in the design of glutamate receptor agonists. Examples are chosen to demonstrate how stereochemistry not only determines how the agonist...

  14. AMPA receptor ligands

    DEFF Research Database (Denmark)

    Strømgaard, Kristian; Mellor, Ian

    2004-01-01

    Alpha-amino-3-hydroxy-5-methyl-4-isoxazole propionic acid (AMPA) receptors (AMPAR), a subtype of the ionotropic glutamate receptors (IGRs), mediate fast synaptic transmission in the central nervous system (CNS), are involved in many neurological disorders, and are a key player in the formation of memory. Hence, ligands affecting AMPARs are highly important for the study of the structure and function of this receptor, and in this regard polyamine-based ligands, particularly polyamine toxins, are unique as they selectively block Ca2+-permeable AMPARs. Indeed, endogenous intracellular...

  15. International business theory and marketing theory

    OpenAIRE

    Soldner, Helmut

    1984-01-01

    International business theory and marketing theory : elements for internat. marketing theory building. - In: Marketing aspects of international business / Gerald M. Hampton ... (eds.). - Boston u.a. : Kluwer, 1984. - S. 25-57

  16. Options theory

    International Nuclear Information System (INIS)

    Markland, J.T.

    1992-01-01

    Techniques used in conventional project appraisal are mathematically very simple in comparison to those used in reservoir modelling and in the geosciences. Clearly it would be possible to value assets in mathematically more sophisticated ways if it were meaningful and worthwhile to do so. The DCF approach in common use has recognized limitations, the inability to select a meaningful discount rate being particularly significant. Financial theory has advanced enormously over the last few years, along with computational techniques, and methods are beginning to appear which may change the way we do project evaluations in practice. The starting point for all of this was a paper by Black and Scholes, which asserts that almost all corporate liabilities can be viewed as options of varying degrees of complexity. Although the financial presentation may be unfamiliar to engineers and geoscientists, some of the concepts used will not be. This paper outlines, in plain English, the basis of option pricing theory for assessing the market value of a project. It also attempts to assess the future role of this type of approach in practical petroleum exploration and engineering economics. Reference is made to relevant published natural resource literature.
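    The Black-Scholes result cited in the abstract admits a closed form for a European call, which makes the option-pricing approach easy to sketch. (Illustrative parameter values; a minimal sketch of the standard formula, not a project-valuation tool.)

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S: float, K: float, T: float,
                       r: float, sigma: float) -> float:
    """Black-Scholes value of a European call.
    S: spot price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: annualized volatility."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# The call is always worth at least its discounted intrinsic value
# S - K*exp(-rT); volatility adds option value on top of that floor.
price = black_scholes_call(S=120.0, K=100.0, T=1.0, r=0.05, sigma=0.2)
print(round(price, 2))
```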

  17. Lipophorin Receptor: The Insect Lipoprotein Receptor

    Indian Academy of Sciences (India)

    IAS Admin

    Director of ... function of the Lp is to deliver lipids throughout the insect body for metabolism ... Lipid is used as a major energy source for development as well as other metabolic .... LpR4 receptor variant was expressed exclusively in the brain and.

  18. General Theory of Absorption in Porous Materials: Restricted Multilayer Theory.

    Science.gov (United States)

    Aduenko, Alexander A; Murray, Andy; Mendoza-Cortes, Jose L

    2018-04-18

    In this article, we present an approach for generalizing the adsorption of light gases in porous materials. This new theory goes beyond the Langmuir and Brunauer-Emmett-Teller theories, the standard approaches, whose application to crystalline porous materials is limited by their unphysical assumptions about the number of possible adsorption layers. The derivation of a more general equation for any crystalline porous framework is presented: the restricted multilayer theory. Our approach allows the determination of gas uptake considering only the geometrical constraints of the porous framework and the interaction energy of the guest molecule with the framework. On the basis of this theory, we calculated optimal values for the adsorption enthalpy at different temperatures and pressures. We also present the use of this theory to determine the optimal linker length for a topologically equivalent framework series. We validate this theoretical approach by applying it to metal-organic frameworks (MOFs) and show that it reproduces the experimental results for seven different reported materials. We obtained the universal equation for the optimal linker length, given the topology of a porous framework. This work applied the general equation to MOFs and H2 to create energy-storage materials; however, this theory can be applied to other crystalline porous materials and light gases, which opens the possibility of designing the next generations of energy-storage materials by first considering only the geometrical constraints of the porous materials.
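    The Langmuir and BET baselines that the restricted multilayer theory generalizes can be written down directly; the divergence of the BET form as the pressure approaches saturation is exactly the "unlimited layers" assumption the abstract criticizes. (Standard textbook isotherms, not the paper's equations.)

```python
def langmuir_uptake(p: float, K: float) -> float:
    """Langmuir isotherm: fractional monolayer coverage at pressure p
    with equilibrium constant K. Saturates at 1, i.e. a single layer."""
    return K * p / (1.0 + K * p)

def bet_uptake(p: float, p0: float, c: float) -> float:
    """BET isotherm: uptake in units of the monolayer capacity.
    x = p/p0 is the relative pressure; c encodes the adsorbate-surface
    interaction energy. Diverges as p -> p0 (unlimited layers), the
    unphysical assumption the restricted multilayer theory removes."""
    x = p / p0
    return c * x / ((1.0 - x) * (1.0 - x + c * x))

# Near saturation, BET uptake exceeds a full monolayer without bound:
for x in (0.1, 0.5, 0.9):
    print(x, langmuir_uptake(x, K=1.0), bet_uptake(x, p0=1.0, c=100.0))
```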

  19. Validating presupposed versus focused text information.

    Science.gov (United States)

    Singer, Murray; Solar, Kevin G; Spear, Jackie

    2017-04-01

    There is extensive evidence that readers continually validate discourse accuracy and congruence, but that they may also overlook conspicuous text contradictions. Validation may be thwarted when the inaccurate ideas are embedded sentence presuppositions. In four experiments, we examined readers' validation of presupposed ("given") versus new text information. Throughout, a critical concept, such as a truck versus a bus, was introduced early in a narrative. Later, a character stated or thought something about the truck, which therefore matched or mismatched its antecedent. Furthermore, truck was presented as either given or new information. Mismatch target reading times uniformly exceeded the matching ones by similar magnitudes for given and new concepts. We obtained this outcome using different grammatical constructions and with different antecedent-target distances. In Experiment 4, we examined only given critical ideas, but varied both their matching and the main verb's factivity (e.g., factive know vs. nonfactive think). The Match × Factivity interaction closely resembled that previously observed for new target information (Singer, 2006). Thus, readers can successfully validate given target information. Although contemporary theories tend to emphasize either deficient or successful validation, both types of theory can accommodate the discourse and reader variables that may regulate validation.

  20. Ground-water models: Validate or invalidate

    Science.gov (United States)

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification are misleading, at best. These terms should be abandoned by the ground-water community.

  1. 76 FR 62421 - Submission for OMB Review; Comment Request; A Generic Submission for Theory Development and...

    Science.gov (United States)

    2011-10-07

    ...; Comment Request; A Generic Submission for Theory Development and Validation (NCI) SUMMARY: Under the... control number. Proposed Collection: Title: A Generic Submission for Theory Development and Validation...), to conduct and support behavioral research informed by and informing theory. Formative research in...

  2. Lesson 6: Signature Validation

    Science.gov (United States)

    Checklist items 13 through 17 are grouped under the Signature Validation Process, and represent CROMERR requirements that the system must satisfy as part of ensuring that electronic signatures it receives are valid.

  3. Electric theory

    International Nuclear Information System (INIS)

    Gong, Ha Seong

    2006-02-01

    This book explains electric theory in four chapters. The first chapter includes electricity and materials, the electric field, capacitance, the magnetic field and electromagnetic force, and inductance. The second chapter covers electric circuit analysis, electric resistance, heating and power, and the chemical activity of current and batteries with electrolysis. The third chapter deals with the alternating current circuit: the basics of an AC circuit; the operation of resistance, inductance and capacitance; series and parallel RLC circuits; three-phase alternating current; two-terminal pair networks; and voltage and current in non-linear circuits. The last chapter explains transient phenomena in the RC series circuit, the RL series circuit, the alternating current circuit, and the RLC series circuit.

  4. Sustainable growth theories

    International Nuclear Information System (INIS)

    Nobile, G.

    1993-07-01

    With reference to highly debated sustainable growth strategies to counter pressing interrelated global environmental and socio-economic problems, this paper reviews economic and resource development theories proposed by classical and neoclassical economists. The review evidences the growing debate among public administration decision makers regarding appropriate methods to assess the worth of natural resources and ecosystems. Proposed methods tend to be biased either towards environmental protection or economic development. Two major difficulties in the effective implementation of sustainable growth strategies are also evidenced - the management of such strategies would require appropriate revisions to national accounting systems, and the dynamic flow of energy and materials between an economic system and the environment would generate a sequence of unstable structures evolving in a chaotic and unpredictable way

  5. Validation of UTAUT and UTAUT2 scales for deal sites

    DEFF Research Database (Denmark)

    Sudzina, Frantisek

    2016-01-01

    ... that is usual for an early stage of investigation of anything. The Unified Theory of Acceptance and Use of Technology explains adoption and continued-usage motives. The aim of this paper is to validate scales for the first and the second versions of the Unified Theory of Acceptance and Use of Technology for deal...

  6. The conceptualization and empirical validation of website user satisfaction

    NARCIS (Netherlands)

    Moenaert, R.K.; Muylle, S.; Despontin, M.

    2004-01-01

    This article addresses the concern for effective web site design by means of the conceptualization and empirical validation of a web site user satisfaction construct. Based on IS success theory, hypermedia design theory, a qualitative exploratory pilot study, and a quantitative online critical

  7. Serotonin Receptors in Hippocampus

    Science.gov (United States)

    Berumen, Laura Cristina; Rodríguez, Angelina; Miledi, Ricardo; García-Alcocer, Guadalupe

    2012-01-01

    Serotonin is an ancient molecular signal and a recognized neurotransmitter distributed brain-wide, with a particular presence in the hippocampus. Almost all serotonin receptor subtypes are expressed in the hippocampus, which implies an intricate modulating system, considering that they can be localized as autosynaptic, presynaptic, and postsynaptic receptors, can even be colocalized within the same cell, and are targets of homo- and heterodimerization. Neurons and glia, including immune cells, integrate a functional network that uses several serotonin receptors to regulate their roles in this particular part of the limbic system. PMID:22629209

  8. A theory of human error

    Science.gov (United States)

    Mcruer, D. T.; Clement, W. F.; Allen, R. W.

    1981-01-01

    Human errors tend to be treated in terms of clinical and anecdotal descriptions, from which remedial measures are difficult to derive. Correction of the sources of human error requires an attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A comprehensive analytical theory of the cause-effect relationships governing propagation of human error is indispensable to a reconstruction of the underlying and contributing causes. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation, maritime, automotive, and process control operations is highlighted. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.

  9. A Future of Communication Theory: Systems Theory.

    Science.gov (United States)

    Lindsey, Georg N.

    Concepts of general systems theory, cybernetics and the like may provide the methodology for communication theory to move from a level of technology to a level of pure science. It was the purpose of this paper to (1) demonstrate the necessity of applying systems theory to the construction of communication theory, (2) review relevant systems…

  10. The validation of Huffaz Intelligence Test (HIT)

    Science.gov (United States)

    Rahim, Mohd Azrin Mohammad; Ahmad, Tahir; Awang, Siti Rahmah; Safar, Ajmain

    2017-08-01

    In general, a hafiz who can memorize the Quran has many specialties especially in respect to their academic performances. In this study, the theory of multiple intelligences introduced by Howard Gardner is embedded in a developed psychometric instrument, namely Huffaz Intelligence Test (HIT). This paper presents the validation and the reliability of HIT of some tahfiz students in Malaysia Islamic schools. A pilot study was conducted involving 87 huffaz who were randomly selected to answer the items in HIT. The analysis method used includes Partial Least Square (PLS) on reliability, convergence and discriminant validation. The study has validated nine intelligences. The findings also indicated that the composite reliabilities for the nine types of intelligences are greater than 0.8. Thus, the HIT is a valid and reliable instrument to measure the multiple intelligences among huffaz.
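The reliability criterion used above (composite reliabilities greater than 0.8) can be computed directly from standardized factor loadings. A minimal sketch in Python, using hypothetical loadings rather than values from the HIT study:

```python
def composite_reliability(loadings):
    """Composite reliability (CR) for one construct from standardized
    factor loadings: CR = (sum L)^2 / ((sum L)^2 + sum(1 - L^2)),
    where 1 - L^2 is the error variance of each indicator."""
    s = sum(loadings)
    error = sum(1.0 - l * l for l in loadings)
    return s * s / (s * s + error)

# Hypothetical standardized loadings for one intelligence subscale.
loadings = [0.82, 0.78, 0.85, 0.74]
cr = composite_reliability(loadings)
print(round(cr, 3))  # values above 0.8 are conventionally deemed reliable
```

Higher loadings concentrate more variance in the construct, so CR rises with them; the 0.8 cutoff is the convention the abstract invokes.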

  11. La teoría del valor-trabajo y la cuestión de su validez en el marco del llamado "posfordismo" The labour theory of value and the question of its validity in the context of the called "post-fordism"

    Directory of Open Access Journals (Sweden)

    Nicolás G. Pagura

    2010-12-01

    Full Text Available The purpose of this article is to review the explanatory validity of the Marxist theory of value in light of the changes produced in capitalism after the crisis of the 1970s, when a regime of accumulation and regulation that many authors have called "post-Fordism" was progressively established (especially in the central countries), with important differences from the regime dominant since the post-war period. The central thesis defended is that a large part of the trends anchored in the transformations of the production system of the hegemonic cores of international capital, from the shift to services to the new modes of management and training of the labour force, call into question the concept of "abstract labour" that underlay the theory of value in its classic formulation by Marx, making a critical review of this theory necessary.

  12. Theory of edge radiation

    Energy Technology Data Exchange (ETDEWEB)

    Geloni, G.; Kocharyan, V.; Saldin, E.; Schneidmiller, E.; Yurkov, M.

    2008-08-15

    We formulate a complete theory of Edge Radiation based on a novel method relying on Fourier Optics techniques. Similar types of radiation, like Transition Undulator Radiation, are addressed in the framework of the same formalism. Special attention is paid to discussing the validity of the approximations upon which the theory is built. Our study makes consistent use of both similarity techniques and comparisons with numerical results from simulation. We discuss both the near and the far zone, and provide physical understanding of many asymptotes. Based on the solution of the field equation with a tensor Green's function technique, we also discuss an analytical model describing the presence of a vacuum chamber; in particular, explicit calculations for a circular vacuum chamber are reported. Finally, we consider the use of Edge Radiation as a tool for electron-beam diagnostics. We discuss Coherent Edge Radiation, extraction of Edge Radiation by a mirror, and other issues that become important at high electron energy and long radiation wavelength. Based on this work, we also study the impact of Edge Radiation on XFEL setups and discuss recent results. (orig.)

  13. Rate theory

    International Nuclear Information System (INIS)

    Maillard, S.; Skorek, R.; Maugis, P.; Dumont, M.

    2015-01-01

    This chapter presents the basic principles of cluster dynamics as a particular case of the mesoscopic rate-theory models developed to investigate fuel behaviour under irradiation, such as in UO2. Because this method simulates the evolution of the concentration of every type of point or aggregated defect in a grain of material, it produces rich information that sheds light on the mechanisms involved in microstructure evolution and gas behaviour that are not accessible through conventional models, yet can provide for improvements in those models. Cluster-dynamics parameters are mainly the energetic values governing the basic evolution mechanisms of the material (diffusion, trapping and thermal resolution). In this sense, the model has general applicability to very different operational situations (irradiation, ion-beam implantation, annealing), provided that they rely on the same basic mechanisms, without requiring the additional data fitting demanded by more empirical conventional models. Applied to krypton-implanted and annealed samples, this technique yields a precise interpretation of the release curves and helps assess migration mechanisms and the krypton diffusion coefficient, for which data are very difficult to obtain owing to the low solubility of the gas. (authors)

  14. Alberi Validates New Theory, Sheds Light on Semiconductors | News | NREL

    Science.gov (United States)

    Research at NREL validated a new theory about defects in semiconductors such as CdTe (cadmium telluride) and GaN (gallium nitride), which are used for cell phones, solar panels, and LEDs. The work points to suppressing defects with light, which may allow higher efficiencies in solar panels and greater lifespans for LEDs.

  15. Predicting interactions from mechanistic information: Can omic data validate theories?

    International Nuclear Information System (INIS)

    Borgert, Christopher J.

    2007-01-01

    To address the most pressing and relevant issues for improving mixture risk assessment, researchers must first recognize that risk assessment is driven by both regulatory requirements and scientific research, and that regulatory concerns may expand beyond the purely scientific interests of researchers. Concepts of 'mode of action' and 'mechanism of action' are used in particular ways within the regulatory arena, depending on the specific assessment goals. The data requirements for delineating a mode of action and predicting interactive toxicity in mixtures are not well defined from a scientific standpoint due largely to inherent difficulties in testing certain underlying assumptions. Understanding the regulatory perspective on mechanistic concepts will be important for designing experiments that can be interpreted clearly and applied in risk assessments without undue reliance on extrapolation and assumption. In like fashion, regulators and risk assessors can be better equipped to apply mechanistic data if the concepts underlying mechanistic research and the limitations that must be placed on interpretation of mechanistic data are understood. This will be critically important for applying new technologies to risk assessment, such as functional genomics, proteomics, and metabolomics. It will be essential not only for risk assessors to become conversant with the language and concepts of mechanistic research, including new omic technologies, but also, for researchers to become more intimately familiar with the challenges and needs of risk assessment

  16. Theory of edge diffraction in electromagnetics

    CERN Document Server

    Ufimtsev, Pyotr

    2009-01-01

    This book is an essential resource for researchers involved in designing antennas and RCS calculations. It is also useful for students studying high frequency diffraction techniques. It contains basic original ideas of the Physical Theory of Diffraction (PTD), examples of its practical application, and its validation by the mathematical theory of diffraction. The derived analytic expressions are convenient for numerical calculations and clearly illustrate the physical structure of the scattered field.

  17. G-protein coupling of cannabinoid receptors

    International Nuclear Information System (INIS)

    Glass, M.

    2001-01-01

    Full text: Since the cloning of the cannabinoid CB1 and CB2 receptors in the early 1990s, extensive research has focused on understanding their signal transduction pathways. While it has been known for some time that both receptors can couple to intracellular signalling via pertussis toxin-sensitive G-proteins (Gi/Go), the specificity and kinetics of these interactions have only recently been elucidated. We have developed an in situ reconstitution approach to investigating receptor-G-protein interactions. This approach involves chaotropic extraction of receptor-containing membranes in order to inactivate or remove endogenous G-proteins. Recombinant or isolated brain G-proteins can then be added back to the receptors, and their activation monitored through the binding of [35S]-GTPγS. This technique has been utilised for an extensive study of cannabinoid receptor-mediated activation of G-proteins. In these studies we have established that CB1 couples with high affinity to both Gi- and Go-type G-proteins. In contrast, CB2 couples strongly to Gi but has a very low affinity for Go. This finding correlates well with the previous findings that, while CB1 and CB2 both couple to the inhibition of adenylate cyclase, CB1 but not CB2 could also inhibit calcium channels. We then examined the ability of a range of cannabinoid agonists to activate Gi and Go via CB1. Conventional receptor theory suggests that a receptor is either active or inactive with regard to a G-protein and that the active receptor activates all relevant G-proteins equally. However, in this study we found that agonists could produce different degrees of activation, depending on which G-protein was present. Further studies have compared the ability of the two endocannabinoids to drive the activation of Gi or Go. These studies show that agonists can induce multiple forms of activated receptor that differ in their ability to catalyse the activation of Gi or Go. The ability of an agonist to drive a receptor…

  18. Validity of Linder Hypothesis in Bric Countries

    Directory of Open Access Journals (Sweden)

    Rana Atabay

    2016-03-01

    Full Text Available In this study, the theory of similarity in preferences (the Linder hypothesis) is introduced, and trade among the BRIC countries is examined to determine whether it is consistent with this hypothesis. Using data for the period 1996–2010, the study applies panel data analysis in order to provide evidence regarding the empirical validity of the Linder hypothesis for the BRIC countries' international trade. Empirical findings show that trade between the BRIC countries supports the Linder hypothesis.

  19. Status of the Vibrational Theory of Olfaction

    Science.gov (United States)

    Hoehn, Ross D.; Nichols, David E.; Neven, Hartmut; Kais, Sabre

    2018-03-01

    The vibrational theory of olfaction is an attempt to describe a possible mechanism for olfaction which is explanatory and provides researchers with a set of principles that permit predictions of structure-odor relations. Similar theories have occurred several times throughout olfactory science; this theory has recently come to prominence again through Luca Turin, who suggested that inelastic electron tunneling is the method by which vibrations are detected by the olfactory receptors within the nose. This work is intended to convey to the reader an up-to-date account of the vibrational theory of olfaction, covering both its historical iterations and its present iteration. The text is designed to give a chronological account of both theoretical and experimental studies on the topic, while providing context, comments, and background where needed.

  20. Perturbation Theory for Open Two-Level Nonlinear Quantum Systems

    International Nuclear Information System (INIS)

    Zhang Zhijie; Jiang Dongguang; Wang Wei

    2011-01-01

    Perturbation theory is an important tool in quantum mechanics. In this paper, we extend traditional perturbation theory to open nonlinear two-level systems, treating the decoherence parameter γ as a perturbation. In this way, we give a perturbative solution to the master equation that describes a nonlinear open quantum system. The results show that, for a small decoherence rate γ, the ratio of the nonlinear rate C to the tunneling coefficient V (i.e., r = C/V) determines the validity of the perturbation theory: for small r the perturbation theory is valid, otherwise it yields wrong results. (general)

  1. Development and Validation of the Sorokin Psychosocial Love Inventory for Divorced Individuals

    Science.gov (United States)

    D'Ambrosio, Joseph G.; Faul, Anna C.

    2013-01-01

    Objective: This study describes the development and validation of the Sorokin Psychosocial Love Inventory (SPSLI) measuring love actions toward a former spouse. Method: Classical measurement theory and confirmatory factor analysis (CFA) were utilized with an a priori theory and factor model to validate the SPSLI. Results: A 15-item scale…

  2. Latinas/os in Community College Developmental Education: Increasing Moments of Academic and Interpersonal Validation

    Science.gov (United States)

    Acevedo-Gil, Nancy; Santos, Ryan E.; Alonso, LLuliana; Solorzano, Daniel G.

    2015-01-01

    This qualitative study examines the experiences of Latinas/os in community college English and math developmental education courses. Critical race theory in education and the theory of validation serve as guiding frameworks. The authors find that institutional agents provide academic validation by emphasizing high expectations, focusing on social…

  3. Validity in Qualitative Evaluation

    OpenAIRE

    Vasco Lub

    2015-01-01

    This article provides a discussion on the question of validity in qualitative evaluation. Although validity in qualitative inquiry has been widely reflected upon in the methodological literature (and is still often subject of debate), the link with evaluation research is underexplored. Elaborating on epistemological and theoretical conceptualizations by Guba and Lincoln and Creswell and Miller, the article explores aspects of validity of qualitative research with the explicit objective of con...

  4. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    Science.gov (United States)

    MACCIA, ELIZABETH S.; AND OTHERS

    An annotated bibliography of 20 items and a discussion of its significance were presented to describe the current utilization of the subject theories in the construction of an educational theory. Also, a theory model was used to demonstrate the construction of a scientific educational theory. The theory model incorporated set theory (S), information theory…

  5. Validation of HEDR models

    International Nuclear Information System (INIS)

    Napier, B.A.; Simpson, J.C.; Eslinger, P.W.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1994-05-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computer models for estimating the possible radiation doses that individuals may have received from past Hanford Site operations. This document describes the validation of these models. In the HEDR Project, the model validation exercise consisted of comparing computational model estimates with limited historical field measurements and experimental measurements that are independent of those used to develop the models. The results of any one test do not mean that a model is valid. Rather, the collection of tests together provide a level of confidence that the HEDR models are valid
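A typical screening statistic for this kind of model-measurement comparison is the fraction of model estimates falling within a multiplicative factor of the corresponding observations. The sketch below is illustrative only (it is not the HEDR project's actual test), and all numbers are hypothetical:

```python
def fraction_within_factor(predicted, observed, factor=2.0):
    """Fraction of model estimates whose predicted-to-observed ratio
    lies within [1/factor, factor]. A crude but common screening check
    in environmental dose-model validation."""
    hits = sum(
        1
        for p, o in zip(predicted, observed)
        if o > 0 and 1.0 / factor <= p / o <= factor
    )
    return hits / len(predicted)

predicted = [1.2, 0.8, 3.5, 0.4, 2.0]  # hypothetical model estimates
observed = [1.0, 1.0, 1.0, 1.0, 1.0]   # hypothetical field measurements
print(fraction_within_factor(predicted, observed))
```

As the abstract notes, no single comparison validates a model; a collection of such checks across independent data sets builds confidence.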

  6. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org ). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  7. Ionotropic crustacean olfactory receptors.

    Directory of Open Access Journals (Sweden)

    Elizabeth A Corey

    Full Text Available The nature of the olfactory receptor in crustaceans, a major group of arthropods, has remained elusive. We report that spiny lobsters, Panulirus argus, express ionotropic receptors (IRs), the insect chemosensory variants of ionotropic glutamate receptors. Unlike insect IRs, which are expressed in a specific subset of olfactory cells, two lobster IR subunits are expressed in most, if not all, lobster olfactory receptor neurons (ORNs), as confirmed by antibody labeling and in situ hybridization. Ligand-specific ORN responses visualized by calcium imaging are consistent with a restricted expression pattern found for other potential subunits, suggesting that cell-specific expression of uncommon IR subunits determines the ligand sensitivity of individual cells. IRs are the only type of olfactory receptor that we have detected in spiny lobster olfactory tissue, suggesting that they likely mediate olfactory signaling. Given long-standing evidence for G protein-mediated signaling in the activation of lobster ORNs, this finding raises the interesting specter that IRs act in concert with second messenger-mediated signaling.

  8. Recursion Theory Week

    CERN Document Server

    Müller, Gert; Sacks, Gerald

    1990-01-01

    These proceedings contain research and survey papers from many subfields of recursion theory, with emphasis on degree theory, in particular the development of frameworks for current techniques in this field. Other topics covered include computational complexity theory, generalized recursion theory, proof theoretic questions in recursion theory, and recursive mathematics.

  9. Development and validation of the Alcohol Myopia Scale.

    Science.gov (United States)

    Lac, Andrew; Berger, Dale E

    2013-09-01

    Alcohol myopia theory conceptualizes the ability of alcohol to narrow attention and how this demand on mental resources produces the impairments of self-inflation, relief, and excess. The current research was designed to develop and validate a scale based on this framework. People who were alcohol users rated items representing myopic experiences arising from drinking episodes in the past month. In Study 1 (N = 260), the preliminary 3-factor structure was supported by exploratory factor analysis. In Study 2 (N = 289), the 3-factor structure was substantiated with confirmatory factor analysis, and it was superior in fit to an empirically indefensible 1-factor structure. The final 14-item scale was evaluated with internal consistency reliability, discriminant validity, convergent validity, criterion validity, and incremental validity. The alcohol myopia scale (AMS) illuminates conceptual underpinnings of this theory and yields insights for understanding the tunnel vision that arises from intoxication.

  10. Noncommutative gauge theories and Kontsevich's formality theorem

    International Nuclear Information System (INIS)

    Jurco, B.; Schupp, P.; Wess, J.

    2001-01-01

    The equivalence of star products that arise from the background field with and without fluctuations, together with Kontsevich's formality theorem, allows an explicit construction of a map that relates ordinary gauge theory and noncommutative gauge theory (the Seiberg-Witten map). Using noncommutative extra dimensions, the construction is extended to noncommutative nonabelian gauge theory for arbitrary gauge groups; as a byproduct we obtain a 'Mini Seiberg-Witten map' that explicitly relates ordinary abelian and nonabelian gauge fields. All constructions are also valid for a non-constant B-field, and even more generally for any Poisson tensor.

  11. On perturbation theory for distance dependent statistics.

    Energy Technology Data Exchange (ETDEWEB)

    Mashkevich, S V

    1994-12-31

    It is known that perturbation theory for anyons has to be modified near Bose statistics in order to get correct finite results. For "distance dependent statistics", or anyons with smeared flux tubes, perturbation theory is in principle applicable directly, but it gives results which hold only for very small values of the statistical parameter and, in particular, are not valid as the flux-tube radius tends to zero. In this paper we discuss the way to modify perturbation theory for this situation, which allows one to obtain the appropriate results. (author). 6 refs.

  12. K-theory and representation theory

    International Nuclear Information System (INIS)

    Kuku, A.O.

    2003-01-01

    This contribution includes K-theory of orders, group-rings and modules over EI categories, equivariant higher algebraic K-theory for finite, profinite and compact Lie group actions together with their relative generalisations and applications

  13. Implicit ligand theory for relative binding free energies

    Science.gov (United States)

    Nguyen, Trung Hai; Minh, David D. L.

    2018-03-01

    Implicit ligand theory enables noncovalent binding free energies to be calculated based on an exponential average of the binding potential of mean force (BPMF)—the binding free energy between a flexible ligand and rigid receptor—over a precomputed ensemble of receptor configurations. In the original formalism, receptor configurations were drawn from or reweighted to the apo ensemble. Here we show that BPMFs averaged over a holo ensemble yield binding free energies relative to the reference ligand that specifies the ensemble. When using receptor snapshots from an alchemical simulation with a single ligand, the new statistical estimator outperforms the original.
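The exponential average at the heart of implicit ligand theory can be sketched in a few lines. Here the BPMF values, units, and temperature factor are hypothetical illustrations, not data from the paper:

```python
import math

def exponential_average_free_energy(bpmfs, kT=0.593):
    """Implicit-ligand estimate G = -kT * ln( (1/N) * sum_i exp(-B_i / kT) ),
    where B_i is the binding PMF (BPMF) for receptor snapshot i.
    kT = 0.593 kcal/mol corresponds to roughly 298 K."""
    n = len(bpmfs)
    boltzmann_sum = sum(math.exp(-b / kT) for b in bpmfs)
    return -kT * math.log(boltzmann_sum / n)

# Hypothetical BPMFs (kcal/mol) over four receptor snapshots.
bpmfs = [-5.1, -4.7, -6.0, -5.5]
g = exponential_average_free_energy(bpmfs)
print(round(g, 3))
```

Because the average is exponential, the estimate is dominated by the most favorable (lowest) BPMF values, which is why adequate sampling of receptor configurations matters for the estimator.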

  14. Gravity, general relativity theory and alternative theories

    International Nuclear Information System (INIS)

    Zel'dovich, Ya.B.; Grishchuk, L.P.; Moskovskij Gosudarstvennyj Univ.

    1986-01-01

    The main steps in the construction of the current theory of gravitation and some prospects for its subsequent development are reviewed. Attention is concentrated on a comparison of the relativistic gravitational field with other physical fields. Two equivalent formulations of general relativity (GR), geometrical and field-theoretical, are considered in detail. It is shown that some theories of gravity constructed as field theories on a flat background space-time are in fact just different formulations of GR and not alternative theories.

  15. Generalizability theory and item response theory

    OpenAIRE

    Glas, Cornelis A.W.; Eggen, T.J.H.M.; Veldkamp, B.P.

    2012-01-01

    Item response theory is usually applied to items with a selected-response format, such as multiple choice items, whereas generalizability theory is usually applied to constructed-response tasks assessed by raters. However, in many situations, raters may use rating scales consisting of items with a selected-response format. This chapter presents a short overview of how item response theory and generalizability theory were integrated to model such assessments. Further, the precision of the esti...

  16. Personal receptor repertoires: olfaction as a model

    Directory of Open Access Journals (Sweden)

    Olender Tsviya

    2012-08-01

    Full Text Available Abstract Background Information on nucleotide diversity along completely sequenced human genomes has increased tremendously over the last few years. This makes it possible to reassess the diversity status of distinct receptor proteins in different human individuals. To this end, we focused on the complete inventory of human olfactory receptor coding regions as a model for personal receptor repertoires. Results By performing data-mining from public and private sources we scored genetic variations in 413 intact OR loci, for which one or more individuals had an intact open reading frame. Using 1000 Genomes Project haplotypes, we identified a total of 4069 full-length polypeptide variants encoded by these OR loci, an average of ~10 per locus, constituting a lower limit for the effective human OR repertoire. Each individual is found to harbor as many as 600 OR allelic variants, ~50% higher than the locus count. Because OR neuronal expression is allelically excluded, this has a direct effect on the smell-perception diversity of the species. We further identified 244 OR segregating pseudogenes (SPGs), loci showing both intact and pseudogene forms in the population, twenty-six of which are annotatively “resurrected” from a pseudogene status in the reference genome. Using a custom SNP microarray we validated 150 SPGs in a cohort of 468 individuals, with every individual genome averaging 36 disrupted sequence variations, 15 of them in homozygous form. Finally, we generated a multi-source compendium of 63 OR loci harboring deletion Copy Number Variations (CNVs). Our combined data suggest that 271 of the 413 intact OR loci (66%) are affected by nonfunctional SNPs/indels and/or CNVs. Conclusions These results portray a case of unusually high genetic diversity, and suggest that individual humans have a highly personalized inventory of functional olfactory receptors, a conclusion that might apply to other receptor multigene families.

  17. String Theory and M-Theory

    Science.gov (United States)

    Becker, Katrin; Becker, Melanie; Schwarz, John H.

    String theory is one of the most exciting and challenging areas of modern theoretical physics. This book guides the reader from the basics of string theory to recent developments. It introduces the basics of perturbative string theory, world-sheet supersymmetry, space-time supersymmetry, conformal field theory and the heterotic string, before describing modern developments, including D-branes, string dualities and M-theory. It then covers string geometry and flux compactifications, applications to cosmology and particle physics, black holes in string theory and M-theory, and the microscopic origin of black-hole entropy. It concludes with Matrix theory, the AdS/CFT duality and its generalizations. This book is ideal for graduate students and researchers in modern string theory, and will make an excellent textbook for a one-year course on string theory. It contains over 120 exercises with solutions, and over 200 homework problems with solutions available on a password-protected website for lecturers at www.cambridge.org/9780521860697.

  18. Validation of simulation models

    DEFF Research Database (Denmark)

    Rehman, Muniza; Pedersen, Stig Andur

    2012-01-01

    In philosophy of science, the interest for computational models and simulations has increased heavily during the past decades. Different positions regarding the validity of models have emerged but the views have not succeeded in capturing the diversity of validation methods. The wide variety...

  19. Hormone receptor densities in relation to 10B neutron capture therapy

    International Nuclear Information System (INIS)

    Hechter, O.; Schwartz, I.L.

    1982-01-01

This presentation is a theoretical discussion of the possibility that appropriate steroid-carborane derivatives might be used to selectively deliver boron-10 (10B) to tumor cells with sex-hormone receptors in sufficient concentration for effective neutron capture therapy (NCT) of hormone-dependent mammary and prostatic cancer. The results indicate that the concentrations of androgen receptors (AR) and progesterone receptors (PR) in malignant prostatic cells, or of estrogen receptors (ER) in malignant mammary cells, are too low to achieve nuclear 10B concentrations of 1 μg per g of tumor by using a steroid ligand coupled to a single carborane cage.

  20. Introduction to the theory of gravitational radiation

    International Nuclear Information System (INIS)

    Damour, T.

    1987-01-01

In these lectures our attention is restricted to analytical investigations of the theory of gravitational radiation. There already exist several reviews of this topic and, in particular, a recent detailed review by Thorne, where gravitational radiation theory is put in a form suitable for astrophysical studies. The scope of these lectures is therefore limited to supplementing the existing reviews in two ways. First, both the basic concepts of gravitational radiation theory and the precise conditions, as well as the limitations, of validity of some of the well-known results in this theory are presented. Indeed, as these results have been, or will be, applied in astrophysics, it is important to have clearly in mind both what they mean and when they can be legitimately applied. Second, a progress report on some of the ongoing analytical research in gravitational radiation theory is presented. 144 references

  1. Infrared problems in field perturbation theory

    International Nuclear Information System (INIS)

    David, Francois.

    1982-12-01

The work presented mainly covers questions related to the presence of "infrared" divergences in perturbation expansions of the Green functions of certain massless field theories. It is important to determine the mathematical status of perturbation expansions in field theory in order to define the region in which they are valid. Renormalization and the symmetry of a theory are important factors in infrared problems. The main contribution of this thesis resides in the mathematical techniques employed: integral representations of the Feynman amplitudes, and methods for desingularization, regularization and dimensional renormalization. Nonlinear two-dimensional space-time sigma models describing the low-energy dynamics of the Goldstone bosons associated with a breaking of continuous symmetry are studied. Random surface models are then investigated, followed by infrared divergences in super-renormalizable theories. Finally, nonperturbative effects in massless theories are studied by expanding the two-dimensional nonlinear sigma model in 1/N.

  2. Cosmological constraints on Brans-Dicke theory.

    Science.gov (United States)

    Avilez, A; Skordis, C

    2014-07-04

We report strong cosmological constraints on the Brans-Dicke (BD) theory of gravity using cosmic microwave background data from Planck. We consider two types of models. First, the initial condition of the scalar field is fixed to give the same effective gravitational strength Geff today as the one measured on Earth, GN. In this case, the BD parameter ω is constrained to ω>692 at the 99% confidence level, an order of magnitude improvement over previous constraints. In the second type, the initial condition for the scalar is a free parameter, leading to a somewhat stronger constraint of ω>890, while Geff is constrained to 0.981<Geff/GN<1.285 at the 99% confidence level. These constraints are independent of the details of the theory and are valid for any Horndeski theory, the most general second-order scalar-tensor theory, which approximates the BD theory on cosmological scales. In this sense, our constraints place strong limits on possible modifications of gravity that might explain cosmic acceleration.

  3. On multiscale moving contact line theory.

    Science.gov (United States)

    Li, Shaofan; Fan, Houfu

    2015-07-08

    In this paper, a multiscale moving contact line (MMCL) theory is presented and employed to simulate liquid droplet spreading and capillary motion. The proposed MMCL theory combines a coarse-grained adhesive contact model with a fluid interface membrane theory, so that it can couple molecular scale adhesive interaction and surface tension with hydrodynamics of microscale flow. By doing so, the intermolecular force, the van der Waals or double layer force, separates and levitates the liquid droplet from the supporting solid substrate, which avoids the shear stress singularity caused by the no-slip condition in conventional hydrodynamics theory of moving contact line. Thus, the MMCL allows the difference of the surface energies and surface stresses to drive droplet spreading naturally. To validate the proposed MMCL theory, we have employed it to simulate droplet spreading over various elastic substrates. The numerical simulation results obtained by using MMCL are in good agreement with the molecular dynamics results reported in the literature.

  4. Deflation-activated receptors, not classical inflation-activated receptors, mediate the Hering-Breuer deflation reflex.

    Science.gov (United States)

    Yu, Jerry

    2016-11-01

    Many airway sensory units respond to both lung inflation and deflation. Whether those responses to opposite stimuli come from one sensor (one-sensor theory) or more than one sensor (multiple-sensor theory) is debatable. One-sensor theory is commonly presumed in the literature. This article proposes a multiple-sensor theory in which a sensory unit contains different sensors for sensing different forces. Two major types of mechanical sensors operate in the lung: inflation- and deflation-activated receptors (DARs). Inflation-activated sensors can be further divided into slowly adapting receptors (SARs) and rapidly adapting receptors (RARs). Many SAR and RAR units also respond to lung deflation because they contain DARs. Pure DARs, which respond to lung deflation only, are rare in large animals but are easily identified in small animals. Lung deflation-induced reflex effects previously attributed to RARs should be assigned to DARs (including pure DARs and DARs associated with SARs and RARs) if the multiple-sensor theory is accepted. Thus, based on the information, it is proposed that activation of DARs can attenuate lung deflation, shorten expiratory time, increase respiratory rate, evoke inspiration, and cause airway secretion and dyspnea.

  5. Assays for calcitonin receptors

    International Nuclear Information System (INIS)

    Teitelbaum, A.P.; Nissenson, R.A.; Arnaud, C.D.

    1985-01-01

The assays for calcitonin receptors described focus on their use in the study of the well-established target organs for calcitonin, bone and kidney. The radioligand used in virtually all calcitonin binding studies is 125I-labelled salmon calcitonin. The lack of methionine residues in this peptide permits the use of chloramine-T for the iodination reaction. Binding assays are described for intact bone, skeletal plasma membranes, renal plasma membranes, and primary kidney cell cultures of rats. Studies on calcitonin metabolism in laboratory animals and regulation of calcitonin receptors are reviewed.

  6. How Far Does a Receptor Influence Vibrational Properties of an Odorant?

    Science.gov (United States)

    Reese, Anna; List, Nanna Holmgaard; Kongsted, Jacob; Solov'yov, Ilia A

    2016-01-01

The biophysical mechanism of the sense of smell, or olfaction, is still highly debated. The mainstream explanation argues for a shape-based recognition of odorant molecules by olfactory receptors, while recent investigations suggest the primary olfactory event is triggered by a vibrationally-assisted electron transfer reaction. We address this controversy by studying, in atomistic detail, the influence of a receptor on the vibrational properties of an odorant, since the coupling between electronic degrees of freedom of the receptor and the vibrations of the odorant is the key parameter of vibrationally-assisted electron transfer. Through molecular dynamics simulations we elucidate the binding specificity of a receptor towards the acetophenone odorant. The vibrational properties of acetophenone inside the receptor are then studied by the polarizable embedding density functional theory approach, allowing us to quantify protein-odorant interactions. Finally, we judge whether the effects of the protein provide any indication in favor of the existing theories of olfaction.

  7. Validating experimental and theoretical Langmuir probe analyses

    Science.gov (United States)

    Pilling, L. S.; Carnegie, D. A.

    2007-08-01

Analysis of Langmuir probe characteristics contains a paradox in that it is unknown a priori which theory is applicable before it is applied. Theories are often assumed to be correct when certain criteria are met, although meeting those criteria may not validate the approach used. We have analysed Langmuir probe data from cylindrical double and single probes acquired from a dc discharge plasma over a wide variety of conditions. This discharge contains a dual-temperature distribution, and hence fitting a theoretically generated curve is impractical. To determine the densities, an examination of the current theories was necessary. For conditions where the probe radius is of the same order of magnitude as the Debye length, the gradient expected for orbital-motion-limited (OML) theory is approximately the same as the radial-motion gradients. An analysis of the 'gradients' from radial-motion theory was able to resolve the differences from the OML gradient value of two. The method was also able to determine whether radial-motion or OML theory applied without knowledge of the electron temperature, or separation of the ion and electron contributions. Only the value of the space potential is necessary to determine the applicable theory.

  8. Water soluble and efficient amino acid Schiff base receptor for reversible fluorescence turn-on detection of Zn²⁺ ions: Quantum chemical calculations and detection of bacteria.

    Science.gov (United States)

    Subha, L; Balakrishnan, C; Natarajan, Satheesh; Theetharappan, M; Subramanian, Balanehru; Neelakantan, M A

    2016-01-15

An amino acid Schiff base (R) capable of recognizing Zn(2+) ions selectively and sensitively in an aqueous medium was prepared and characterized. Upon addition of Zn(2+) ions, the receptor exhibits an ~40-fold fluorescence intensity enhancement at 460 nm (quantum yield Φ=0.05 for R and Φ=0.18 for R-Zn(2+)), detectable by the naked eye under UV light. The receptor can recognize Zn(2+) (1.04×10(-8) M) selectively over other metal ions in the pH range 7.5-11. Chelation of Zn(2+) with R decreases the loss of energy through non-radiative transitions and leads to fluorescence enhancement. The binding mode of the receptor with Zn(2+) was investigated by (1)H NMR titration and further validated by ESI-MS. Elemental color mapping and SEM/EDS analysis were also used to study the binding of R with Zn(2+). Density functional theory calculations were carried out to understand the binding mechanism. The receptor was applied as a microbial sensor for Escherichia coli and Staphylococcus aureus. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Simple recursion relations for general field theories

    International Nuclear Information System (INIS)

    Cheung, Clifford; Shen, Chia-Hsien; Trnka, Jaroslav

    2015-01-01

    On-shell methods offer an alternative definition of quantum field theory at tree-level, replacing Feynman diagrams with recursion relations and interaction vertices with a handful of seed scattering amplitudes. In this paper we determine the simplest recursion relations needed to construct a general four-dimensional quantum field theory of massless particles. For this purpose we define a covering space of recursion relations which naturally generalizes all existing constructions, including those of BCFW and Risager. The validity of each recursion relation hinges on the large momentum behavior of an n-point scattering amplitude under an m-line momentum shift, which we determine solely from dimensional analysis, Lorentz invariance, and locality. We show that all amplitudes in a renormalizable theory are 5-line constructible. Amplitudes are 3-line constructible if an external particle carries spin or if the scalars in the theory carry equal charge under a global or gauge symmetry. Remarkably, this implies the 3-line constructibility of all gauge theories with fermions and complex scalars in arbitrary representations, all supersymmetric theories, and the standard model. Moreover, all amplitudes in non-renormalizable theories without derivative interactions are constructible; with derivative interactions, a subset of amplitudes is constructible. We illustrate our results with examples from both renormalizable and non-renormalizable theories. Our study demonstrates both the power and limitations of recursion relations as a self-contained formulation of quantum field theory.

  10. Angiotensin type 2 receptor (AT2R) and receptor Mas

    DEFF Research Database (Denmark)

    Villela, Daniel; Leonhardt, Julia; Patel, Neal

    2015-01-01

    The angiotensin type 2 receptor (AT2R) and the receptor Mas are components of the protective arms of the renin-angiotensin system (RAS), i.e. they both mediate tissue protective and regenerative actions. The spectrum of actions of these two receptors and their signalling mechanisms display striki...

  11. Unitary field theories

    International Nuclear Information System (INIS)

    Bergmann, P.G.

    1980-01-01

The problem of constructing a unitary field theory is discussed. The preconditions of the theory are briefly described. The main attention is paid to the geometrical interpretation of physical fields. The meaning of the concepts of manifold and fibration is elucidated. Two unitary field theories are described: Weyl's conformal geometry and Kaluza's five-dimensional theory. It is proposed to consider supersymmetric theories as a new approach to the problem of a unitary field theory. It is noted that supergravity theories are genuinely unitary theories, since the fields appearing in them do not admit an invariant decomposition.

  12. Theory of thermal stresses

    CERN Document Server

    Boley, Bruno A

    1997-01-01

    Highly regarded text presents detailed discussion of fundamental aspects of theory, background, problems with detailed solutions. Basics of thermoelasticity, heat transfer theory, thermal stress analysis, more. 1985 edition.

  13. Elementary particle theory

    International Nuclear Information System (INIS)

    Marciano, W.J.

    1984-12-01

    The present state of the art in elementary particle theory is reviewed. Topics include quantum electrodynamics, weak interactions, electroweak unification, quantum chromodynamics, and grand unified theories. 113 references

  14. Entropy Evaluation Based on Value Validity

    Directory of Open Access Journals (Sweden)

    Tarald O. Kvålseth

    2014-09-01

Besides its importance in statistical physics and information theory, the Boltzmann-Shannon entropy S has become one of the most widely used and misused summary measures of various attributes (characteristics) in diverse fields of study. It has also been the subject of extensive and perhaps excessive generalizations. This paper introduces the concept and criteria for value validity as a means of determining if an entropy takes on values that reasonably reflect the attribute being measured and that permit different types of comparisons to be made for different probability distributions. While neither S nor its relative entropy equivalent S* meet the value-validity conditions, certain power functions of S and S* do to a considerable extent. No parametric generalization offers any advantage over S in this regard. A measure based on Euclidean distances between probability distributions is introduced as a potential entropy that does comply fully with the value-validity requirements, and its statistical inference procedure is discussed.
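To make the contrast concrete, the following sketch compares the Boltzmann-Shannon entropy with a toy index built from Euclidean distances between probability distributions. The distance-based function here is purely illustrative (a hypothetical construction, not the measure proposed in the paper): it scores a certain outcome 0 and the uniform distribution 1.

```python
import math

def shannon_entropy(p):
    """Boltzmann-Shannon entropy S = -sum p_i ln p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def distance_index(p):
    """Toy entropy-like index from Euclidean distance: the distance from p
    to the nearest degenerate (one-point) distribution, scaled so a certain
    outcome scores 0 and the uniform distribution scores 1.  Illustrative
    only -- not Kvalseth's actual measure."""
    n = len(p)
    peak = max(range(n), key=lambda i: p[i])  # index of the nearest degenerate dist.
    d = math.sqrt(sum((pi - (1.0 if i == peak else 0.0)) ** 2
                      for i, pi in enumerate(p)))
    d_max = math.sqrt((n - 1) / n)            # distance from uniform to degenerate
    return d / d_max
```

Both functions are zero for a certain outcome and maximal for the uniform distribution; the value-validity question raised in the paper is whether the intermediate values scale in a way that supports meaningful comparisons.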

  15. Some considerations for validation of repository performance assessment models

    International Nuclear Information System (INIS)

    Eisenberg, N.

    1991-01-01

    Validation is an important aspect of the regulatory uses of performance assessment. A substantial body of literature exists indicating the manner in which validation of models is usually pursued. Because performance models for a nuclear waste repository cannot be tested over the long time periods for which the model must make predictions, the usual avenue for model validation is precluded. Further impediments to model validation include a lack of fundamental scientific theory to describe important aspects of repository performance and an inability to easily deduce the complex, intricate structures characteristic of a natural system. A successful strategy for validation must attempt to resolve these difficulties in a direct fashion. Although some procedural aspects will be important, the main reliance of validation should be on scientific substance and logical rigor. The level of validation needed will be mandated, in part, by the uses to which these models are put, rather than by the ideal of validation of a scientific theory. Because of the importance of the validation of performance assessment models, the NRC staff has engaged in a program of research and international cooperation to seek progress in this important area. 2 figs., 16 refs

  16. Glutamate receptor ligands

    DEFF Research Database (Denmark)

    Guldbrandt, Mette; Johansen, Tommy N; Frydenvang, Karla Andrea

    2002-01-01

    Homologation and substitution on the carbon backbone of (S)-glutamic acid [(S)-Glu, 1], as well as absolute stereochemistry, are structural parameters of key importance for the pharmacological profile of (S)-Glu receptor ligands. We describe a series of methyl-substituted 2-aminoadipic acid (AA...

  17. Ginkgolides and glycine receptors

    DEFF Research Database (Denmark)

    Jaracz, Stanislav; Nakanishi, Koji; Jensen, Anders A.

    2004-01-01

    Ginkgolides from the Ginkgo biloba tree are diterpenes with a cage structure consisting of six five-membered rings and a unique tBu group. They exert a variety of biological properties. In addition to being antagonists of the platelet activating factor receptor (PAFR), it has recently been shown ...

  18. adrenergic receptor with preeclampsia

    African Journals Online (AJOL)

    User

    2011-05-09

May 9, 2011 ... due to a post-receptor defect (Karadas et al., 2007). Several polymorphisms have ... the detection of the Arg16Gly polymorphism, overnight digestion at 37°C with 10 U ..... DW, Wood AJ, Stein CM (2004). Beta2-adrenoceptor ...

  19. Metformin and insulin receptors

    International Nuclear Information System (INIS)

    Vigneri, R.; Gullo, D.; Pezzino, V.

    1984-01-01

The authors evaluated the effect of metformin (N,N-dimethylbiguanide), a biguanide known to be less toxic than phenformin, on insulin binding to its receptors, both in vitro and in vivo. Specific 125I-insulin binding to cultured IM-9 human lymphocytes and MCF-7 human breast cancer cells was determined after preincubation with metformin. Specific 125I-insulin binding to circulating monocytes was also evaluated in six controls, eight obese subjects, and six obese type II diabetic patients before and after a short-term treatment with metformin. Plasma insulin levels and blood glucose were also measured on both occasions. Metformin significantly increased insulin binding in vitro to both IM-9 lymphocytes and MCF-7 cells; the maximum increment was 47.1% and 38.0%, respectively. Metformin treatment significantly increased insulin binding in vivo to monocytes of obese subjects and diabetic patients. Scatchard analysis indicated that the increased binding was mainly due to an increase in receptor capacity. Insulin binding to monocytes of normal controls was unchanged after metformin, as were insulin levels in all groups; blood glucose was significantly reduced after metformin only in diabetic patients. These data indicate that metformin increases insulin binding to its receptors in vitro and in vivo. The effect in vivo is observed in obese subjects and in obese type II diabetic patients, paralleling the clinical effectiveness of this antidiabetic agent, and is not due to receptor regulation by circulating insulin, since no variation in insulin levels was recorded.

  20. Sierra Structural Dynamics Theory Manual

    Energy Technology Data Exchange (ETDEWEB)

    Reese, Garth M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-19

Sierra/SD provides a massively parallel implementation of structural dynamics finite element analysis, required for high-fidelity, validated models used in modal, vibration, static and shock analysis of structural systems. This manual describes the theory behind many of the constructs in Sierra/SD. For a more detailed description of how to use Sierra/SD, we refer the reader to the Sierra/SD User's Notes. Many of the constructs in Sierra/SD are pulled directly from published material. Where possible, these materials are referenced herein. However, certain functions in Sierra/SD are specific to our implementation. We try to be far more complete in those areas. The theory manual was developed from several sources, including general notes, a programmer notes manual, the user's notes and, of course, the material in the open literature.

  1. Evidence for Alpha Receptors in the Human Ureter

    Science.gov (United States)

    Madeb, Ralph; Knopf, Joy; Golijanin, Dragan; Bourne, Patricia; Erturk, Erdal

    2007-04-01

An immunohistochemical and western blot expression analysis of human ureters was performed in order to characterize the alpha-1-adrenergic receptor distribution along the length of the human ureteral wall. Mapping the distribution will assist in understanding the potential role alpha-1-adrenergic receptors and their subtype density might have in the pathophysiology of ureteral colic and stone passage. Patients diagnosed with renal cancer or bladder cancer undergoing nephrectomy, nephroureterectomy, or cystectomy had ureteral specimens taken from the proximal, mid, distal and tunneled ureter. Tissues were processed for fresh-frozen examination and fixed in formalin. None of the ureteral specimens were involved with cancer. Serial histologic sections and immunohistochemical studies were performed using antibodies specific for the alpha-1-adrenergic receptor subtypes (alpha-1a, alpha-1b, alpha-1d). The sections were examined under a light microscope and scored as positive or negative. To validate and quantify the alpha-receptor subtypes along the human ureter, Western blotting techniques were applied. Human ureter stained positively for alpha-1-adrenergic receptors. Immunostaining appeared red, with an intense reaction in the smooth muscle of the ureter and the endothelium of neighboring blood vessels. There was differential expression among the receptor subtypes, with the highest staining for the alpha-1D subtype. The highest protein expression for all three subtypes was in the renal pelvis and decreased with advancement along the ureter to the distal ureter. At the distal ureter, there was a marked increase in expression as one progressed towards the ureteral orifice. The same pattern of protein expression was exhibited for all three alpha-1-adrenergic receptor subtypes. We provide preliminary evidence for the ability to detect and quantify the alpha-1-receptor subtypes along the human ureter which to the best of our knowledge has never been done with

  2. Local homotopy theory

    CERN Document Server

    Jardine, John F

    2015-01-01

    This monograph on the homotopy theory of topologized diagrams of spaces and spectra gives an expert account of a subject at the foundation of motivic homotopy theory and the theory of topological modular forms in stable homotopy theory. Beginning with an introduction to the homotopy theory of simplicial sets and topos theory, the book covers core topics such as the unstable homotopy theory of simplicial presheaves and sheaves, localized theories, cocycles, descent theory, non-abelian cohomology, stacks, and local stable homotopy theory. A detailed treatment of the formalism of the subject is interwoven with explanations of the motivation, development, and nuances of ideas and results. The coherence of the abstract theory is elucidated through the use of widely applicable tools, such as Barr's theorem on Boolean localization, model structures on the category of simplicial presheaves on a site, and cocycle categories. A wealth of concrete examples convey the vitality and importance of the subject in topology, n...

  3. Model Validation Status Review

    International Nuclear Information System (INIS)

    E.L. Hardin

    2001-01-01

    The primary objective for the Model Validation Status Review was to perform a one-time evaluation of model validation associated with the analysis/model reports (AMRs) containing model input to total-system performance assessment (TSPA) for the Yucca Mountain site recommendation (SR). This review was performed in response to Corrective Action Request BSC-01-C-01 (Clark 2001, Krisha 2001) pursuant to Quality Assurance review findings of an adverse trend in model validation deficiency. The review findings in this report provide the following information which defines the extent of model validation deficiency and the corrective action needed: (1) AMRs that contain or support models are identified, and conversely, for each model the supporting documentation is identified. (2) The use for each model is determined based on whether the output is used directly for TSPA-SR, or for screening (exclusion) of features, events, and processes (FEPs), and the nature of the model output. (3) Two approaches are used to evaluate the extent to which the validation for each model is compliant with AP-3.10Q (Analyses and Models). The approaches differ in regard to whether model validation is achieved within individual AMRs as originally intended, or whether model validation could be readily achieved by incorporating information from other sources. (4) Recommendations are presented for changes to the AMRs, and additional model development activities or data collection, that will remedy model validation review findings, in support of licensing activities. The Model Validation Status Review emphasized those AMRs that support TSPA-SR (CRWMS M and O 2000bl and 2000bm). A series of workshops and teleconferences was held to discuss and integrate the review findings. The review encompassed 125 AMRs (Table 1) plus certain other supporting documents and data needed to assess model validity. The AMRs were grouped in 21 model areas representing the modeling of processes affecting the natural and


  5. Quantile regression theory and applications

    CERN Document Server

    Davino, Cristina; Vistocco, Domenico

    2013-01-01

A guide to the implementation and interpretation of Quantile Regression models This book explores the theory and numerous applications of quantile regression, offering empirical data analysis as well as the software tools to implement the methods. The main focus of this book is to provide the reader with a comprehensive description of the main issues concerning quantile regression; these include basic modeling, geometrical interpretation, estimation and inference for quantile regression, as well as issues concerning the validity of the model and diagnostic tools. Each methodological aspect is explored and
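The estimation idea underlying quantile regression can be illustrated with the pinball (check) loss, whose minimizer over a constant is the sample quantile. A minimal sketch (illustrative grid search; real quantile regression fits solve a linear program over covariates):

```python
import numpy as np

def pinball_loss(y, q, tau):
    """Check loss at quantile level tau: tau*(y-q) for y >= q,
    (tau-1)*(y-q) otherwise, averaged over the sample."""
    u = np.asarray(y, dtype=float) - q
    return float(np.mean(np.maximum(tau * u, (tau - 1.0) * u)))

def fit_quantile(y, tau, n_grid=2001):
    """Estimate the tau-quantile of y by grid search over the pinball
    loss -- a toy stand-in for the linear-programming fit used in practice."""
    grid = np.linspace(min(y), max(y), n_grid)
    losses = [pinball_loss(y, q, tau) for q in grid]
    return float(grid[int(np.argmin(losses))])
```

With tau = 0.5 the loss is half the mean absolute deviation and the minimizer is the median; asymmetric tau values tilt the loss so that over- and under-prediction are penalized differently, which is what lets quantile regression trace out the whole conditional distribution.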

  6. Graphical Model Theory for Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Davis, William B.

    2002-01-01

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems, with decentralized data structures, and efficient local queries; pattern classification, and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm
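On a chain-structured model the junction tree is the chain itself, and the junction tree algorithm reduces to forward-backward (sum-product) message passing between neighbouring nodes. A minimal sketch of exact marginal inference on such a chain (illustrative code, not tied to any particular sensor-network implementation):

```python
import numpy as np

def chain_marginals(unary, pairwise):
    """Sum-product message passing on a chain MRF.
    unary: (n, k) local evidence per node; pairwise: (k, k) compatibility
    between neighbouring nodes.  Returns (n, k) normalized marginals.
    On a chain the junction tree is the chain itself, so this is exact."""
    n, k = unary.shape
    fwd = np.zeros((n, k))
    bwd = np.zeros((n, k))
    fwd[0] = 1.0
    bwd[-1] = 1.0
    for i in range(1, n):            # forward messages, left to right
        m = (unary[i - 1] * fwd[i - 1]) @ pairwise
        fwd[i] = m / m.sum()
    for i in range(n - 2, -1, -1):   # backward messages, right to left
        m = pairwise @ (unary[i + 1] * bwd[i + 1])
        bwd[i] = m / m.sum()
    marg = unary * fwd * bwd         # combine evidence with both messages
    return marg / marg.sum(axis=1, keepdims=True)
```

With a uniform pairwise potential the nodes decouple and each marginal equals its normalized local evidence; with an attractive potential, strong evidence at one sensor shifts its neighbours' beliefs, which is the decentralized-fusion behaviour the abstract describes.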

  7. Rationality, Theory Acceptance and Decision Theory

    Directory of Open Access Journals (Sweden)

    J. Nicolas Kaufmann

    1998-06-01

Following Kuhn's main thesis, according to which theory revision and acceptance is always paradigm relative, I propose to outline some possible consequences of such a view. First, asking in what sense Bayesian decision theory could serve as the appropriate (normative) theory of rationality examined from the point of view of the epistemology of theory acceptance, I argue that Bayesianism leads to a narrow conception of theory acceptance. Second, regarding the different types of theory revision, i.e. expansion, contraction, replacement and residual shifts, I extract from Kuhn's view a series of indications showing that theory replacement cannot be rationalized within the framework of Bayesian decision theory, not even within a more sophisticated version of that model. Third, and finally, I will point to the need for a more comprehensive model of rationality than the Bayesian expected utility maximization model, one which could better deal with the different aspects of theory replacement. I will show that Kuhn's distinction between normal and revolutionary science gives us several hints for a more adequate theory of rationality in science. I will also show that Kuhn is not in a position to fully articulate his main ideas and that he will be confronted with a serious problem concerning the collective choice of a paradigm.

  8. Fermion boson metamorphosis in field theory

    International Nuclear Information System (INIS)

    Ha, Y.K.

    1982-01-01

    In two-dimensional field theories many features are especially transparent if the Fermi fields are represented by non-local expressions of the Bose fields. Such a procedure is known as boson representation. Bilinear quantities appearing in the Lagrangian of a fermion theory transform, however, as simple local expressions of the bosons, so that the resulting theory may be written as a theory of bosons. Conversely, a theory of bosons may be transformed into an equivalent theory of fermions. Together they provide a basis for generating many interesting equivalences between theories of different types. In the present work a consistent scheme for constructing a canonical Fermi field in terms of a real scalar field is developed, and it is verified that such a procedure is valid and consistent with the tenets of quantum field theory. A boson formulation offers a unifying theme in understanding the structure of many theories. This is illustrated by the boson formulation of a multifermion theory with chiral and internal symmetries. The nature of dynamical mass generation when the theory undergoes boson transmutation, and the preservation of continuous chiral symmetry in the massive case, are examined. The dynamics of the system depend to a great extent on the specific number of fermions, and different models of the same system can have very different properties. Many unusual symmetries of the fermion theory, such as hidden symmetry, duality and triality symmetries, are only manifest in the boson formulation. The underlying connections between some models with U(N) internal symmetry and another class of fermion models built with Majorana fermions, which have O(2N) internal symmetry, are uncovered.

  9. Olfactory Receptor Database: a sensory chemoreceptor resource

    OpenAIRE

    Skoufos, Emmanouil; Marenco, Luis; Nadkarni, Prakash M.; Miller, Perry L.; Shepherd, Gordon M.

    2000-01-01

    The Olfactory Receptor Database (ORDB) is a WWW-accessible database that has been expanded from an olfactory receptor resource to a chemoreceptor resource. It stores data on six classes of G-protein-coupled sensory chemoreceptors: (i) olfactory receptor-like proteins, (ii) vomeronasal receptors, (iii) insect olfactory receptors, (iv) worm chemoreceptors, (v) taste papilla receptors and (vi) fungal pheromone receptors. A complementary database of the ligands of these receptors (OdorDB) has bee...

  10. From chaos to unification: U theory vs. M theory

    International Nuclear Information System (INIS)

    Ye, Fred Y.

    2009-01-01

    A unified physical theory called U theory, that is different from M theory, is defined and characterized. U theory, which includes spinor and twistor theory, loop quantum gravity, causal dynamical triangulations, E-infinity unification theory, and Clifford-Finslerian unifications, is based on physical tradition and experimental foundations. In contrast, M theory pays more attention to mathematical forms. While M theory is characterized by supersymmetry string theory, U theory is characterized by non-supersymmetry unified field theory.

  11. Validity in Qualitative Evaluation

    Directory of Open Access Journals (Sweden)

    Vasco Lub

    2015-12-01

    This article provides a discussion of the question of validity in qualitative evaluation. Although validity in qualitative inquiry has been widely reflected upon in the methodological literature (and is still often a subject of debate), the link with evaluation research is underexplored. Elaborating on epistemological and theoretical conceptualizations by Guba and Lincoln, and by Creswell and Miller, the article explores aspects of validity of qualitative research with the explicit objective of connecting them with aspects of evaluation in social policy. It argues that different purposes of qualitative evaluations can be linked with different scientific paradigms and perspectives, thus transcending unproductive paradigmatic divisions as well as providing a flexible yet rigorous validity framework for researchers and reviewers of qualitative evaluations.

  12. Generalized algebra-valued models of set theory

    NARCIS (Netherlands)

    Löwe, B.; Tarafder, S.

    2015-01-01

    We generalize the construction of lattice-valued models of set theory due to Takeuti, Titani, Kozawa and Ozawa to a wider class of algebras and show that this yields a model of a paraconsistent logic that validates all axioms of the negation-free fragment of Zermelo-Fraenkel set theory.

  13. Quantal foundation of the nucleon exchange transport theory

    International Nuclear Information System (INIS)

    Randrup, J.

    1985-07-01

    The central elements of the nucleon exchange transport theory are discussed within a fully quantal framework in order to elucidate the principal characteristics, validity and limitations of the theory. Special consideration is given to the mean rate of energy dissipation and the penetrability coefficient. (orig.)

  14. Notes on the Verlinde formula in nonrational conformal field theories

    International Nuclear Information System (INIS)

    Jego, Charles; Troost, Jan

    2006-01-01

    We review and extend evidence for the validity of a generalized Verlinde formula in nonrational conformal field theories in particular. We identify a subset of representations of the chiral algebra in nonrational conformal field theories that give rise to an analogue of the relation between modular S-matrices and fusion coefficients in rational conformal field theories. To that end we review and extend the Cardy-type brane calculations in bosonic and supersymmetric Liouville theory (and its duals) as well as in H_3^+. We analyze the three-point functions of Liouville theory and of H_3^+ in detail to directly identify the fusion coefficients from the operator product expansion. Moreover, we check the validity of a proposed generic formula for localized brane one-point functions in nonrational conformal field theories.

  15. Cross validation in LULOO

    DEFF Research Database (Denmark)

    Sørensen, Paul Haase; Nørgård, Peter Magnus; Hansen, Lars Kai

    1996-01-01

    The leave-one-out cross-validation scheme for generalization assessment of neural network models is computationally expensive due to replicated training sessions. Linear unlearning of examples has recently been suggested as an approach to approximative cross-validation. Here we briefly review the linear unlearning scheme, dubbed LULOO, and illustrate it on a system identification example. Further, we address the possibility of extracting confidence information (error bars) from the LULOO ensemble.
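For reference, exact leave-one-out cross-validation retrains the model once per example, which is the cost LULOO avoids by linearly "unlearning" each example from a single trained model. A minimal sketch of the exact (expensive) scheme for 1-D least-squares regression, illustrative rather than the neural network setting of the paper:

```python
# Exact leave-one-out cross-validation for 1-D least-squares regression.
# Each of the n folds refits the model without one example, then scores
# the refitted model on that held-out example.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope  # (intercept, slope)

def loo_mse(xs, ys):
    errs = []
    for i in range(len(xs)):
        a, b = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        errs.append((ys[i] - (a + b * xs[i])) ** 2)
    return sum(errs) / len(errs)
```

The n retraining passes in `loo_mse` are exactly the replicated training sessions the abstract refers to; LULOO replaces each refit with a linear correction to the full-data model.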

  16. Verification and validation benchmarks.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
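The manufactured-solution idea recommended above for code verification can be sketched in a few lines (a simple illustrative check, not one of the paper's benchmarks): choose an exact solution, apply the discrete operator to it, and confirm the observed convergence order matches theory.

```python
import math

# Code verification with a manufactured solution: take u(x) = sin(x), whose
# exact second derivative is -sin(x), and check that a central-difference
# stencil converges at its theoretical second order as h is halved.
def d2_central(u, x, h):
    return (u(x - h) - 2.0 * u(x) + u(x + h)) / h ** 2

x, exact = 0.7, -math.sin(0.7)
errors = [abs(d2_central(math.sin, x, h) - exact) for h in (0.1, 0.05, 0.025)]
observed_orders = [math.log(errors[i] / errors[i + 1], 2)
                   for i in range(len(errors) - 1)]  # each close to 2.0
```

An observed order that drifts away from the theoretical one as h shrinks is the classic signature of a coding error, which is precisely what such verification benchmarks are designed to expose.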

  17. Transient FDTD simulation validation

    OpenAIRE

    Jauregui Tellería, Ricardo; Riu Costa, Pere Joan; Silva Martínez, Fernando

    2010-01-01

    In computational electromagnetic simulations, most validation methods have until now been developed for use in the frequency domain. However, EMC analysis of systems in the frequency domain is often not enough to evaluate the immunity of current communication devices. Based on several studies, in this paper we propose an alternative method for validating transients in the time domain, allowing a rapid and objective quantification of the simulation results.
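As one illustration of an objective time-domain comparison (a generic metric chosen for this sketch, not the specific method the paper proposes), the peak normalized cross-correlation scores how well a simulated transient matches a measured one:

```python
import math

# Peak normalized cross-correlation between two equal-length transients;
# 1.0 indicates a perfect match up to amplitude scaling and time delay.
def xcorr_peak(a, b):
    n = len(a)
    norm = math.sqrt(sum(x * x for x in a) * sum(y * y for y in b))
    best = 0.0
    for lag in range(-(n - 1), n):
        s = sum(a[i] * b[i - lag] for i in range(max(0, lag), min(n, n + lag)))
        best = max(best, abs(s) / norm)
    return best
```

Scanning over lags makes the score insensitive to a pure time delay between measurement and simulation, which a point-by-point error norm would heavily penalize.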

  18. HEDR model validation plan

    International Nuclear Information System (INIS)

    Napier, B.A.; Gilbert, R.O.; Simpson, J.C.; Ramsdell, J.V. Jr.; Thiede, M.E.; Walters, W.H.

    1993-06-01

    The Hanford Environmental Dose Reconstruction (HEDR) Project has developed a set of computational "tools" for estimating the possible radiation dose that individuals may have received from past Hanford Site operations. This document describes the planned activities to "validate" these tools. In the sense of the HEDR Project, "validation" is a process carried out by comparing computational model predictions with field observations and experimental measurements that are independent of those used to develop the model.

  19. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the

  20. Verification and validation benchmarks

    International Nuclear Information System (INIS)

    Oberkampf, William L.; Trucano, Timothy G.

    2008-01-01

    Verification and validation (V and V) are the primary means to assess the accuracy and reliability of computational simulations. V and V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V and V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the