WorldWideScience

Sample records for normalization process theory

  1. Development of a theory of implementation and integration: Normalization Process Theory

    Directory of Open Access Journals (Sweden)

    May Carl R

    2009-05-01

    Abstract Background Theories are important tools in the social and natural sciences. The methods by which they are derived are rarely described and discussed. Normalization Process Theory explains how new technologies, ways of acting, and ways of working become routinely embedded in everyday practice, and has applications in the study of implementation processes. This paper describes the process by which it was built. Methods Between 1998 and 2008, we developed a theory. We derived a set of empirical generalizations from analysis of data collected in qualitative studies of healthcare work and organization. We developed an applied theoretical model through analysis of empirical generalizations. Finally, we built a formal theory through a process of extension and implication analysis of the applied theoretical model. Results Each phase of theory development showed that the constructs of the theory did not conflict with each other, had explanatory power, and possessed sufficient robustness for formal testing. As the theory developed, its scope expanded from a set of observed regularities in data with procedural explanations, to an applied theoretical model, to a formal middle-range theory. Conclusion Normalization Process Theory has been developed through procedures that were properly sceptical and critical, and which were open to review at each stage of development. The theory has been shown to merit formal testing.

  2. Overcoming the Problem of Embedding Change in Educational Organizations: A Perspective from Normalization Process Theory

    Science.gov (United States)

    Wood, Phil

    2017-01-01

    In this article, I begin by outlining some of the barriers which constrain sustainable organizational change in schools and universities. I then go on to introduce a theory which has already started to help explain complex change and innovation processes in health and care contexts, Normalization Process Theory. Finally, I consider what this…

  3. Process evaluation of discharge planning implementation in healthcare using normalization process theory.

    Science.gov (United States)

    Nordmark, Sofi; Zingmark, Karin; Lindberg, Inger

    2016-04-27

    Discharge planning is a care process that aims to secure the transfer of care for the patient at transition from home to hospital and back home. Information exchange and collaboration between care providers are essential, but deficits are common. A wide range of initiatives to improve the discharge planning process (DPP) have been developed and implemented over the past three decades. However, there are still high rates of reported medical errors and adverse events related to failures in the discharge planning process. Using theoretical frameworks such as Normalization Process Theory (NPT) can support evaluations of complex interventions and processes in healthcare. The aim of this study was to explore the embedding and integration of the DPP from the perspective of registered nurses, district nurses and homecare organizers. The study design was explorative, using the NPT as a framework to explore the embedding and integration of the DPP. Data consisted of written documentation from workshops with staff, registered adverse events and system failures, a web-based survey, and individual interviews with staff. Using the NPT as a framework to explore the embedding and integration of discharge planning after 10 years in use showed that the staff had reached a consensus of opinion about what the process was (coherence) and how they evaluated the process (reflexive monitoring). However, they had not reached a consensus of opinion about who performed the process (cognitive participation) and how it was performed (collective action). This could be interpreted as meaning that the process had not become normalized in daily practice. The results show the necessity of observing the implementation of old practices to better understand the needs of new ones before developing and implementing new practices or supportive tools within healthcare, to reach the aim of development and to accomplish sustainable implementation. The NPT offers a generalizable framework for analysis, which can explain and shape the

  4. A qualitative systematic review of studies using the normalization process theory to research implementation processes.

    Science.gov (United States)

    McEvoy, Rachel; Ballini, Luciana; Maltoni, Susanna; O'Donnell, Catherine A; Mair, Frances S; Macfarlane, Anne

    2014-01-02

    There is a well-recognized need for greater use of theory to address research translational gaps. Normalization Process Theory (NPT) provides a set of sociological tools to understand and explain the social processes through which new or modified practices of thinking, enacting, and organizing work are implemented, embedded, and integrated in healthcare and other organizational settings. This review of NPT offers readers the opportunity to observe how, and in what areas, a particular theoretical approach to implementation is being used. In this article we review the literature on NPT in order to understand what interventions NPT is being used to analyze, how NPT is being operationalized, and the reported benefits, if any, of using NPT. Using a framework analysis approach, we conducted a qualitative systematic review of peer-reviewed literature using NPT. We searched 12 electronic databases and all citations linked to six key NPT development papers. Grey literature/unpublished studies were not sought. The search was limited to English-language papers in healthcare settings published between 2006 and June 2012. Twenty-nine articles met the inclusion criteria; in the main, NPT is being applied to qualitatively analyze a diverse range of complex interventions, many beyond its original field of e-health and telehealth. The NPT constructs have high stability across settings and, notwithstanding challenges in applying NPT in terms of managing overlaps between constructs, there is evidence that it is a beneficial heuristic device to explain and guide implementation processes. NPT offers a generalizable framework that can be applied across contexts with opportunities for incremental knowledge gain over time and an explicit framework for analysis, which can explain and potentially shape implementation processes. This is the first review of NPT in use and it generates an impetus for further and extended use of NPT. We recommend that in future NPT research, authors should explicate

  5. Self-consistent normal ordering of gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1987-01-01

    Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs

  6. Effect of care management program structure on implementation: a normalization process theory analysis.

    Science.gov (United States)

    Holtrop, Jodi Summers; Potworowski, Georges; Fitzpatrick, Laurie; Kowalk, Amy; Green, Lee A

    2016-08-15

    Care management in primary care can be effective in helping patients with chronic disease improve their health status; however, primary care practices are often challenged with implementation. Further, there are different ways to structure care management that may make implementation more or less successful. Normalization process theory (NPT) provides a means of understanding how a new complex intervention can become routine (normalized) in practice. In this study, we used NPT to understand how care management structure affected how well care management became routine in practice. Data collection involved semi-structured interviews and observations conducted at 25 practices in five physician organizations in Michigan, USA. Practices were selected to reflect variation in physician organizations, type of care management program, and degree of normalization. Data were transcribed, qualitatively coded and analyzed, initially using an editing approach and then a template approach with NPT as a guiding framework. Seventy interviews and 25 observations were completed. Two key structures for care management organization emerged: practice-based care management, where the care managers were embedded in the practice as part of the practice team; and centralized care management, where the care managers worked independently of the practice workflow and were located outside the practice. There were differences in normalization of care management across practices. Practice-based care management was generally better normalized as compared to centralized care management. Differences in normalization were well explained by the NPT, and in particular the collective action construct. When care managers had multiple and flexible opportunities for communication (interactional workability), had the requisite knowledge, skills, and personal characteristics (skill set workability), and the organizational support and resources (contextual integration), a trusting professional relationship

  7. Normal form theory and spectral sequences

    OpenAIRE

    Sanders, Jan A.

    2003-01-01

    The concept of unique normal form is formulated in terms of a spectral sequence. As an illustration of this technique some results of Baider and Churchill concerning the normal form of the anharmonic oscillator are reproduced. The aim of this paper is to show that spectral sequences give us a natural framework in which to formulate normal form theory. © 2003 Elsevier Science (USA). All rights reserved.

  8. Evaluating Complex Interventions and Health Technologies Using Normalization Process Theory: Development of a Simplified Approach and Web-Enabled Toolkit

    LENUS (Irish Health Repository)

    May, Carl R

    2011-09-30

    Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  9. Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit

    Directory of Open Access Journals (Sweden)

    Murray Elizabeth

    2011-09-01

    Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  10. Theory of normal metals

    International Nuclear Information System (INIS)

    Mahan, G.D.

    1992-01-01

    The organizers requested that I give eight lectures on the theory of normal metals, ''with an eye on superconductivity.'' My job was to cover the general properties of metals. The topics were selected according to what the students would need to know for the following lectures on superconductivity. My role was to prepare the groundwork for the later lectures. The problem is that there is not yet a widely accepted theory for the mechanism which pairs the electrons. Many mechanisms have been proposed, with those of phonons and spin fluctuations having the most followers. So I tried to discuss both topics. I also introduced the tight-binding model for metals, which forms the basis for most of the work on the cuprate superconductors.
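
    For reference, the tight-binding model mentioned above has a compact closed form; a standard nearest-neighbour band on a simple cubic lattice (a textbook illustration, not quoted from the lectures themselves) is

        \varepsilon(\mathbf{k}) = \varepsilon_0 - 2t\,[\cos(k_x a) + \cos(k_y a) + \cos(k_z a)],

    where t is the hopping integral and a the lattice constant; work on the cuprates typically uses the two-dimensional square-lattice analogue with only the k_x and k_y terms.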

  11. Understanding the challenges to implementing case management for people with dementia in primary care in England: a qualitative study using Normalization Process Theory.

    Science.gov (United States)

    Bamford, Claire; Poole, Marie; Brittain, Katie; Chew-Graham, Carolyn; Fox, Chris; Iliffe, Steve; Manthorpe, Jill; Robinson, Louise

    2014-11-08

    Case management has been suggested as a way of improving the quality and cost-effectiveness of support for people with dementia. In this study we adapted and implemented a successful United States' model of case management in primary care in England. The results are reported elsewhere, but a key finding was that little case management took place. This paper reports the findings of the process evaluation which used Normalization Process Theory to understand the barriers to implementation. Ethnographic methods were used to explore the views and experiences of case management. Interviews with 49 stakeholders (patients, carers, case managers, health and social care professionals) were supplemented with observation of case managers during meetings and initial assessments with patients. Transcripts and field notes were analysed initially using the constant comparative approach and emerging themes were then mapped onto the framework of Normalization Process Theory. The primary focus during implementation was on the case managers as isolated individuals, with little attention being paid to the social or organizational context within which they worked. Barriers relating to each of the four main constructs of Normalization Process Theory were identified, with a lack of clarity over the scope and boundaries of the intervention (coherence); variable investment in the intervention (cognitive participation); a lack of resources, skills and training to deliver case management (collective action); and limited reflection and feedback on the case manager role (reflexive monitoring). Despite the intuitive appeal of case management to all stakeholders, there were multiple barriers to implementation in primary care in England including: difficulties in embedding case managers within existing well-established community networks; the challenges of protecting time for case management; and case managers' inability to identify, and act on, emerging patient and carer needs (an essential, but

  12. Promoting health workers' ownership of infection prevention and control: using Normalization Process Theory as an interpretive framework.

    Science.gov (United States)

    Gould, D J; Hale, R; Waters, E; Allen, D

    2016-12-01

    All health workers should take responsibility for infection prevention and control (IPC). Recent reduction in key reported healthcare-associated infections in the UK is impressive, but the determinants of success are unknown. It is imperative to understand how IPC strategies operate as new challenges arise and threats of antimicrobial resistance increase. The authors undertook a retrospective, independent evaluation of an action plan to enhance IPC and 'ownership' (individual accountability) for IPC introduced throughout a healthcare organization. Twenty purposively selected informants were interviewed. Data were analysed inductively. Normalization Process Theory (NPT) was applied to interpret the findings and explain how the action plan was operating. Six themes emerged through inductive analysis. Theme 1: 'Ability to make sense of ownership' provided evidence of the first element of NPT (coherence). Regardless of occupational group or seniority, informants understood the importance of IPC ownership and described what it entailed. They identified three prerequisites: 'Always being vigilant' (Theme 2), 'Importance of access to information' (Theme 3) and 'Being able to learn together in a no-blame culture' (Theme 4). Data relating to each theme provided evidence of the other elements of NPT that are required to embed change: planning implementation (cognitive participation), undertaking the work necessary to achieve change (collective action), and reflection on what else is needed to promote change as part of continuous quality improvement (reflexive monitoring). Informants identified barriers (e.g. workload) and facilitators (clear lines of communication and expectations for IPC). Eighteen months after implementing the action plan incorporating IPC ownership, there was evidence of continuous service improvement and significant reduction in infection rates. Applying a theory that identifies factors that promote/inhibit routine incorporation ('normalization') of IPC

  13. Mean fields and self consistent normal ordering of lattice spin and gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1986-01-01

    Classical Heisenberg spin models on lattices possess mean-field theories that are well defined real field theories on finite lattices. These mean-field theories can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is carried over to lattice gauge theories. We first construct an appropriate real mean-field theory. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (orig.)
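
    For orientation, normal ordering with respect to a Gaussian kernel C can be written through the probabilists' Hermite polynomials He_n (a standard identity, added as an illustration rather than quoted from the paper):

        :\phi^n:_C = C^{n/2}\,\mathrm{He}_n(\phi/\sqrt{C}), \qquad :e^{\alpha\phi}:_C = e^{\alpha\phi - \alpha^2 C/2},

    so every normal-ordered monomial has vanishing Gaussian expectation; "self-consistent" then means that the kernel C is itself determined from the mean-field equations.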

  14. Theory of the low-voltage impedance of superconductor--insulator--normal metal tunnel junctions

    International Nuclear Information System (INIS)

    Lemberger, T.R.

    1984-01-01

    A theory for the low-voltage impedance of a superconductor--insulator--normal metal tunnel junction is developed that includes the effects of charge imbalance and of quasiparticle fluctuations. A novel, inelastic, charge-imbalance relaxation process is identified that is associated with the junction itself. This new process leads to the surprising result that the charge-imbalance component of the dc resistance of a junction becomes independent of the electron-phonon scattering rate as the insulator resistance decreases

  15. Entropy generation and momentum transfer in the superconductor-normal and normal-superconductor phase transformations and the consistency of the conventional theory of superconductivity

    Science.gov (United States)

    Hirsch, J. E.

    2018-05-01

    Since the discovery of the Meissner effect, the superconductor to normal (S-N) phase transition in the presence of a magnetic field is understood to be a first-order phase transformation that is reversible under ideal conditions and obeys the laws of thermodynamics. The reverse (N-S) transition is the Meissner effect. This implies in particular that the kinetic energy of the supercurrent is not dissipated as Joule heat in the process where the superconductor becomes normal and the supercurrent stops. In this paper, we analyze the entropy generation and the momentum transfer between the supercurrent and the body in the S-N transition and the N-S transition as described by the conventional theory of superconductivity. We find that it is not possible to explain the transition in a way that is consistent with the laws of thermodynamics unless the momentum transfer between the supercurrent and the body occurs with zero entropy generation, for which the conventional theory of superconductivity provides no mechanism. Instead, we point out that the alternative theory of hole superconductivity does not encounter such difficulties.

  16. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process...
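
    As a hedged illustration of the OU-based stochastic volatility models referred to above (a generic Barndorff-Nielsen-Shephard form, not quoted from the paper), the squared volatility can be taken to follow an OU process driven by a Lévy subordinator z:

        d\sigma^2(t) = -\lambda\,\sigma^2(t)\,dt + dz(\lambda t), \qquad \lambda > 0,

    where the timing z(\lambda t) keeps the marginal law of \sigma^2(t) fixed as \lambda varies, and that marginal law can be chosen from families such as the (normal) modified stable laws discussed in the paper.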

  17. Perturbations and quasi-normal modes of black holes in Einstein-Aether theory

    International Nuclear Information System (INIS)

    Konoplya, R.A.; Zhidenko, A.

    2007-01-01

    We develop a new method for calculation of quasi-normal modes of black holes, when the effective potential, which governs black hole perturbations, is known only numerically in some region near the black hole. This method can be applied to perturbations of a wide class of numerical black hole solutions. We apply it to black holes in the Einstein-Aether theory, a theory where general relativity is coupled to a unit time-like vector field, in order to observe local Lorentz symmetry violation. We found that in the non-reduced Einstein-Aether theory, the real oscillation frequencies and damping rates of quasi-normal modes are larger than those of Schwarzschild black holes in the Einstein theory
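
    For context, quasi-normal modes are typically computed from a wave-like master equation with an effective potential (schematic standard form, with r_* a tortoise coordinate):

        \frac{d^2\Psi}{dr_*^2} + [\omega^2 - V(r_*)]\,\Psi = 0,

    solved with purely in-going waves at the horizon and purely out-going waves at infinity; the method described in the abstract addresses the case where V is known only numerically near the black hole.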

  18. Nevanlinna theory, normal families, and algebraic differential equations

    CERN Document Server

    Steinmetz, Norbert

    2017-01-01

    This book offers a modern introduction to Nevanlinna theory and its intricate relation to the theory of normal families, algebraic functions, asymptotic series, and algebraic differential equations. Following a comprehensive treatment of Nevanlinna’s theory of value distribution, the author presents advances made since Hayman’s work on the value distribution of differential polynomials and illustrates how value- and pair-sharing problems are linked to algebraic curves and Briot–Bouquet differential equations. In addition to discussing classical applications of Nevanlinna theory, the book outlines state-of-the-art research, such as the effect of the Yosida and Zalcman–Pang method of re-scaling to algebraic differential equations, and presents the Painlevé–Yosida theorem, which relates Painlevé transcendents and solutions to selected 2D Hamiltonian systems to certain Yosida classes of meromorphic functions. Aimed at graduate students interested in recent developments in the field and researchers wor...

  19. A normal form approach to the theory of nonlinear betatronic motion

    International Nuclear Information System (INIS)

    Bazzani, A.; Todesco, E.; Turchetti, G.; Servizi, G.

    1994-01-01

    The betatronic motion of a particle in a circular accelerator is analysed using the transfer map description of the magnetic lattice. In the linear case the transfer matrix approach is shown to be equivalent to the Courant-Snyder theory: in the normal coordinates representation the transfer matrix is a pure rotation. When the nonlinear effects due to the multipolar components of the magnetic field are taken into account, a similar procedure is used: a nonlinear change of coordinates provides a normal form representation of the map, which exhibits explicit symmetry properties depending on the absence or presence of resonance relations among the linear tunes. The use of normal forms is illustrated in the simplest but significant model of a cell with a sextupolar nonlinearity, which is described by the quadratic Hénon map. After recalling the basic theoretical results in Hamiltonian dynamics, we show how the normal forms describe the different topological structures of phase space such as KAM tori, chains of islands and chaotic regions; a critical comparison with the usual perturbation theory for Hamilton equations is given. The normal form theory is applied to compute the tune shift and deformation of the orbits for the lattices of the SPS and LHC accelerators, and scaling laws are obtained. Finally, the correction procedure of the multipolar errors of the LHC, based on the analytic minimization of the tune shift computed via the normal forms, is described and the results for a model of the LHC are presented. This application, relevant for the lattice design, focuses on the advantages of normal forms with respect to tracking when parametric dependences have to be explored. (orig.)
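
    For orientation, the quadratic Hénon map mentioned above is usually written as a linear rotation composed with a quadratic (sextupolar) kick (a standard form, stated for reference rather than quoted from the paper):

        \begin{pmatrix} x' \\ p' \end{pmatrix} = \begin{pmatrix} \cos\omega & \sin\omega \\ -\sin\omega & \cos\omega \end{pmatrix} \begin{pmatrix} x \\ p + x^2 \end{pmatrix}, \qquad \omega = 2\pi\nu,

    where \nu is the linear tune; normal form theory seeks a near-identity change of coordinates in which this map becomes a pure amplitude-dependent rotation.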

  20. A critical analysis of the implementation of service user involvement in primary care research and health service development using normalization process theory.

    Science.gov (United States)

    Tierney, Edel; McEvoy, Rachel; O'Reilly-de Brún, Mary; de Brún, Tomas; Okonkwo, Ekaterina; Rooney, Michelle; Dowrick, Chris; Rogers, Anne; MacFarlane, Anne

    2016-06-01

    There have been recent important advances in conceptualizing and operationalizing involvement in health research and health-care service development. However, problems persist in the field that impact on the scope for meaningful involvement to become a routine - normalized - way of working in primary care. In this review, we focus on current practice to critically interrogate factors known to be relevant for normalization - definition, enrolment, enactment and appraisal. Ours was a multidisciplinary, interagency team, with community representation. We searched EBSCOhost for papers from 2007 to 2011 and engaged in an iterative, reflexive approach to sampling, appraising and analysing the literature, following the principles of a critical interpretive synthesis approach and using Normalization Process Theory. Twenty-six papers were chosen from 289 papers, as a purposeful sample of work that is reported as service user involvement in the field. Few papers provided a clear working definition of service user involvement. The dominant identified rationale for enrolling service users in primary care projects was linked with policy imperatives for co-governance and emancipatory ideals. The majority of methodologies employed were standard health services research methods that do not qualify as research with service users. This indicates a lack of congruence between the stated aims and methods. Most studies only reported positive outcomes, raising questions about the balance or completeness of the published appraisals. To improve normalization of meaningful involvement in primary care, it is necessary to encourage explicit reporting of definitions, methodological innovation to enhance co-governance and dissemination of research processes and findings. © 2014 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  1. Understanding the implementation of complex interventions in health care: the normalization process model

    Directory of Open Access Journals (Sweden)

    Rogers Anne

    2007-09-01

    Abstract Background The Normalization Process Model is a theoretical model that assists in explaining the processes by which complex interventions become routinely embedded in health care practice. It offers a framework for process evaluation and also for comparative studies of complex interventions. It focuses on the factors that promote or inhibit the routine embedding of complex interventions in health care practice. Methods A formal theory structure is used to define the model, and its internal causal relations and mechanisms. The model is broken down to show that it is consistent and adequate in generating accurate description, systematic explanation, and the production of rational knowledge claims about the workability and integration of complex interventions. Results The model explains the normalization of complex interventions by reference to four factors demonstrated to promote or inhibit the operationalization and embedding of complex interventions (interactional workability, relational integration, skill-set workability, and contextual integration). Conclusion The model is consistent and adequate. Repeated calls for theoretically sound process evaluations in randomized controlled trials of complex interventions, and policy-makers who call for a proper understanding of implementation processes, emphasize the value of conceptual tools like the Normalization Process Model.

  2. Implementing nutrition guidelines for older people in residential care homes: a qualitative study using Normalization Process Theory

    Directory of Open Access Journals (Sweden)

    Bamford Claire

    2012-10-01

    Abstract Background Optimizing the dietary intake of older people can prevent nutritional deficiencies and diet-related diseases, thereby improving quality of life. However, there is evidence that the nutritional intake of older people living in care homes is suboptimal, with high levels of saturated fat, salt, and added sugars. The UK Food Standards Agency therefore developed nutrient- and food-based guidance for residential care homes. The acceptability of these guidelines and their feasibility in practice is unknown. This study used the Normalization Process Theory (NPT) to understand the barriers and facilitators to implementing the guidelines and inform future implementation. Methods We conducted a process evaluation in five care homes in the north of England using qualitative methods (observation and interviews) to explore the views of managers, care staff, catering staff, and domestic staff. Data were analyzed thematically and discussed in data workshops; emerging themes were then mapped to the constructs of NPT. Results Many staff perceived the guidelines as unnecessarily restrictive and irrelevant to older people. In terms of NPT, the guidelines simply did not make sense (coherence), and as a result, relatively few staff invested in the guidelines (cognitive participation). Even where staff supported the guidelines, implementation was hampered by a lack of nutritional knowledge and institutional support (collective action). Finally, the absence of observable benefits to clients confirmed the negative preconceptions of many staff, with limited evidence of reappraisal following implementation (reflexive monitoring). Conclusions The successful implementation of the nutrition guidelines requires that the fundamental issues relating to their perceived value and fit with other priorities and goals be addressed. Specialist support is needed to equip staff with the technical knowledge and skills required for menu analysis and development and to

  3. Irreversible processes kinetic theory

    CERN Document Server

    Brush, Stephen G

    2013-01-01

    Kinetic Theory, Volume 2: Irreversible Processes deals with the kinetic theory of gases and the irreversible processes they undergo. It includes the two papers by James Clerk Maxwell and Ludwig Boltzmann in which the basic equations for transport processes in gases are formulated, together with the first derivation of Boltzmann's "H-theorem" and a discussion of this theorem, along with the problem of irreversibility. Comprised of 10 chapters, this volume begins with an introduction to the fundamental nature of heat and of gases, along with Boltzmann's work on the kinetic theory of gases and s
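
    For reference, Boltzmann's H-theorem concerns the functional (standard definition, added for orientation)

        H(t) = \int f(\mathbf{v},t)\,\ln f(\mathbf{v},t)\,d^3v, \qquad \frac{dH}{dt} \le 0,

    with equality only for the Maxwell-Boltzmann distribution; this is the kinetic-theory statement of irreversibility that the volume discusses.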

  4. Qualitative Process Theory.

    Science.gov (United States)

    1984-07-01

    Qualitative Process Theory, Kenneth Dale Forbus, MIT Artificial Intelligence Laboratory, Technical Report 789, July 1984 (DTIC AD-A148 987). Keywords: common sense reasoning, mathematical reasoning, naive physics, artificial intelligence.

  5. Destination memory and cognitive theory of mind in normal ageing.

    Science.gov (United States)

    El Haj, Mohamad; Raffard, Stéphane; Gély-Nargeot, Marie-Christine

    2016-01-01

    Destination memory is the ability to remember the destination to which a piece of information has been addressed (e.g., "Did I tell you about the promotion?"). This ability is found to be impaired in normal ageing. Our work aimed to link this deterioration to the decline in theory of mind. Forty younger adults (M age = 23.13 years, SD = 4.00) and 36 older adults (M age = 69.53 years, SD = 8.93) performed a destination memory task. They also performed the False-belief test addressing cognitive theory of mind and the Reading the Mind in the Eyes test addressing affective theory of mind. Results showed significant deterioration in destination memory, cognitive theory of mind and affective theory of mind in the older adults. The older adults' performance on destination memory was significantly correlated with and predicted by their performance on cognitive theory of mind. Difficulties in the ability to interpret and predict others' mental states are related to destination memory decline in older adults.

  6. 'It Opened My Eyes'-examining the impact of a multifaceted chlamydia testing intervention on general practitioners using Normalization Process Theory.

    Science.gov (United States)

    Yeung, Anna; Hocking, Jane; Guy, Rebecca; Fairley, Christopher K; Smith, Kirsty; Vaisey, Alaina; Donovan, Basil; Imrie, John; Gunn, Jane; Temple-Smith, Meredith

    2018-03-28

    Chlamydia is the most common notifiable sexually transmissible infection in Australia. Left untreated, it can develop into pelvic inflammatory disease and infertility. The majority of notifications come from general practice, which is ideally situated to test young Australians. The Australian Chlamydia Control Effectiveness Pilot (ACCEPt) was a multifaceted intervention that aimed to reduce chlamydia prevalence by increasing testing in 16- to 29-year-olds attending general practice. GPs were interviewed to describe the effectiveness of the ACCEPt intervention in integrating chlamydia testing into routine practice using Normalization Process Theory (NPT). GPs were purposively selected based on age, gender, geographic location and size of practice at baseline and midpoint. Interview data were analysed regarding the intervention components and results were interpreted using NPT. A total of 44 GPs at baseline and 24 at midpoint were interviewed. Most GPs reported offering a test based on age at midpoint versus offering a test based on symptoms or patient request at baseline. Quarterly feedback was the most significant ACCEPt component for facilitating a chlamydia test. The ACCEPt intervention has been able to moderately normalize chlamydia testing among GPs, although the components had varying levels of effectiveness. NPT can demonstrate the effective implementation of an intervention in general practice and has been valuable in understanding which components are essential and which components can be improved upon.

  7. The relationship of theory of mind and executive functions in normal, deaf and cochlear-implanted children

    Directory of Open Access Journals (Sweden)

    Farideh Nazarzadeh

    2014-08-01

    Background and Aim: Theory of mind refers to the ability to understand that others have mental states that can be different from one's own mental states or from the facts. This study aimed to investigate the relationship of theory of mind and executive functions in normal-hearing, deaf, and cochlear-implanted children. Methods: The study population consisted of normal, deaf and cochlear-implanted girl students in Mashhad city, Iran. Using random sampling, 30 children (10 normal, 10 deaf and 10 cochlear-implanted) in the age group of 8-12 years were selected. To measure theory of mind, the 38-item theory of mind scale was used, and to assess executive function, the Coolidge neuropsychological and personality test was used. Research data were analyzed using the Spearman correlation coefficient, analysis of variance and Kruskal-Wallis tests. Results: There was a significant difference between the groups in theory of mind and the executive function subscales of organization, planning-decision-making, and inhibition. Between the normal and deaf groups (p=0.01), as well as the cochlear-implanted and deaf groups (p=0.01), there was a significant difference in the planning-decision-making subscale. There was no significant relationship between theory of mind and executive functions generally, or between theory of mind and the executive function subscales, in any of the three groups. Conclusion: Based on our findings, cochlear-implanted and deaf children have lower performance in theory of mind and executive function compared with normal-hearing children.

  8. Theory of normal and superconducting properties of fullerene-based solids

    International Nuclear Information System (INIS)

    Cohen, M.L.

    1992-10-01

    Recent experiments on the normal-state and superconducting properties of fullerene-based solids are used to constrain the proposed theories of the electronic nature of these materials. In general, models of superconductivity based on electron pairing induced by phonons are consistent with electronic band theory. The latter experiments also yield estimates of the parameters characterizing these type II superconductors. It is argued that, at this point, a ''standard model'' of phonons interacting with itinerant electrons may be a good first approximation for explaining the properties of the metallic fullerenes

  9. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  10. The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.

    Science.gov (United States)

    Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica

    2014-05-01

    The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases. And as skewness in absolute value and kurtosis increases, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
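
    As a hedged sketch of the contrast the article draws (illustrative Python, not the authors' implementation; the function name and example values are hypothetical), one can compare the symmetric normal-theory interval for an indirect effect ab with an interval from the simulated distribution of the product:

        import numpy as np

        def indirect_cis(a, se_a, b, se_b, alpha=0.05, draws=200_000, seed=0):
            """Normal-theory vs distribution-of-product CIs for the indirect effect a*b."""
            ab = a * b
            # Sobel (normal-theory) standard error of the product of two coefficients
            se_ab = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
            z = 1.96  # standard normal critical value for alpha = 0.05
            normal_ci = (ab - z * se_ab, ab + z * se_ab)
            # Monte Carlo approximation to the asymmetric distribution of the product
            rng = np.random.default_rng(seed)
            prod = rng.normal(a, se_a, draws) * rng.normal(b, se_b, draws)
            mc_ci = tuple(np.quantile(prod, [alpha / 2, 1 - alpha / 2]))
            return normal_ci, mc_ci

        # Example: when a or b is small relative to its SE, the product is skewed
        print(indirect_cis(a=0.4, se_a=0.15, b=0.3, se_b=0.12))

    The Monte Carlo interval is asymmetric about ab exactly when the product's skewness and kurtosis depart from normality, which is the coverage effect the article traces to the moments of the distribution of the product.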

  11. Microscopic theory of the current-voltage relationship across a normal-superconducting interface

    International Nuclear Information System (INIS)

    Kraehenbuehl, Y.; Watts-Tobin, R.J.

    1979-01-01

    Measurements by Pippard et al. have shown the existence of an extra resistance due to the penetration of an electrical potential into a superconductor. Previous theories of this effect are unable to explain the full temperature dependence of the extra resistance because they use oversimplified models of the normal--superconducting interface. We show that the microscopic theory for dirty superconductors leads to a good agreement with experiment over the whole temperature range

  12. Statistical Theory of Normal Grain Growth Revisited

    International Nuclear Information System (INIS)

    Gadomski, A.; Luczka, J.

    2002-01-01

    In this paper, we discuss three physically relevant problems concerning the normal grain growth process. These are: infinite vs finite size of the system under study (a step towards more realistic modeling); conditions of fine-grained structure formation, with possible applications to thin films and biomembranes, and interesting relations to superplasticity of materials; and the approach to log-normality, an ubiquitous natural phenomenon frequently reported in the literature. It turns out that all three important points mentioned are possible to include in a Mulheran-Harding type behavior of evolving grain-containing systems that we have studied previously. (author)
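
    For reference, the log-normal grain-size distribution mentioned above has the standard density (added for orientation)

        f(R) = \frac{1}{R\,\sigma\sqrt{2\pi}}\,\exp\!\left[-\frac{(\ln R - \mu)^2}{2\sigma^2}\right], \qquad R > 0,

    so that \ln R is normally distributed with mean \mu and variance \sigma^2.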

  13. The challenge of transferring an implementation strategy from academia to the field: a process evaluation of local quality improvement collaboratives in Dutch primary care using the normalization process theory.

    Science.gov (United States)

    Trietsch, Jasper; van Steenkiste, Ben; Hobma, Sjoerd; Frericks, Arnoud; Grol, Richard; Metsemakers, Job; van der Weijden, Trudy

    2014-12-01

    A quality improvement strategy consisting of comparative feedback and peer review embedded in available local quality improvement collaboratives proved to be effective in changing the test-ordering behaviour of general practitioners. However, implementing this strategy was problematic. We aimed for large-scale implementation of an adapted strategy covering both test ordering and prescribing performance. Because we failed to achieve large-scale implementation, the aim of this study was to describe and analyse the challenges of the transferring process. In a qualitative study 19 regional health officers, pharmacists, laboratory specialists and general practitioners were interviewed within 6 months after the transfer period. The interviews were audiotaped, transcribed and independently coded by two of the authors. The codes were matched to the dimensions of the normalization process theory. The general idea of the strategy was widely supported, but generating the feedback was more complex than expected and the need for external support after transfer of the strategy remained high because participants did not assume responsibility for the work and the distribution of resources that came with it. Evidence on effectiveness, a national infrastructure for these collaboratives and a general positive attitude were not sufficient for normalization. Thinking about managing large databases, responsibility for tasks and distribution of resources should start as early as possible when planning complex quality improvement strategies. Merely exploring the barriers and facilitators experienced in a preceding trial is not sufficient. Although multifaceted implementation strategies to change professional behaviour are attractive, their inherent complexity is also a pitfall for large-scale implementation. © 2014 John Wiley & Sons, Ltd.

  14. Overweight but unseen: a review of the underestimation of weight status and a visual normalization theory.

    Science.gov (United States)

    Robinson, E

    2017-10-01

    Although overweight and obesity are widespread across most of the developed world, a considerable body of research has now accumulated, which suggests that adiposity often goes undetected. A substantial proportion of individuals with overweight or obesity do not identify they are overweight, and large numbers of parents of children with overweight or obesity fail to identify their child as being overweight. Lay people and medical practitioners are also now poor at identifying overweight and obesity in others. A visual normalization theory of the under-detection of overweight and obesity is proposed. This theory is based on the notion that weight status is judged relative to visual body size norms. Because larger body sizes are now common, this has caused a recalibration to the range of body sizes that are perceived as being 'normal' and increased the visual threshold for what constitutes 'overweight'. Evidence is reviewed that indicates this process has played a significant role in the under-detection of overweight and obesity. The public health relevance of the under-detection of overweight and obesity is also discussed. © 2017 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of World Obesity.

  15. Acquisition by Processing Theory: A Theory of Everything?

    Science.gov (United States)

    Carroll, Susanne E.

    2004-01-01

    Truscott and Sharwood Smith (henceforth T&SS) propose a novel theory of language acquisition, "Acquisition by Processing Theory" (APT), designed to account for both first and second language acquisition, monolingual and bilingual speech perception and parsing, and speech production. This is a tall order. Like any theoretically ambitious…

  16. Examination of the neighborhood activation theory in normal and hearing-impaired listeners.

    Science.gov (United States)

    Dirks, D D; Takayanagi, S; Moshfegh, A; Noffsinger, P D; Fausti, S A

    2001-02-01

    ...well as to an elderly group of listeners with sensorineural hearing loss in the speech-shaped noise (Experiment 3). The results of three experiments verified predictions of NAM in both normal-hearing and hearing-impaired listeners. In each experiment, words from low-density neighborhoods were recognized more accurately than those from high-density neighborhoods. The presence of high-frequency neighbors (average neighborhood frequency) produced poorer recognition performance than comparable conditions with low-frequency neighbors. Word frequency was found to have a highly significant effect on word recognition. Lexical conditions with high word frequencies produced higher performance scores than conditions with low-frequency words. The results supported the basic tenets of NAM theory and identified both neighborhood structural properties and word frequency as significant lexical factors affecting word recognition when listening in noise and "in quiet." The results of the third experiment permit extension of NAM theory to individuals with sensorineural hearing loss. Future development of speech recognition tests should allow for the effects of higher level cognitive (lexical) factors on lower level phonemic processing.
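
    For orientation, NAM's frequency-weighted neighborhood probability rule is commonly cited in the form (paraphrased from the NAM literature, not from this abstract)

        p(\mathrm{ID}) = \frac{p(S \mid S)\,\mathrm{Freq}_S}{p(S \mid S)\,\mathrm{Freq}_S + \sum_j p(N_j \mid S)\,\mathrm{Freq}_{N_j}},

    where S is the stimulus word and the N_j are its lexical neighbors; the experiments above manipulate exactly these quantities (neighborhood density, neighborhood frequency and word frequency).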

  17. An adaptive orienting theory of error processing.

    Science.gov (United States)

    Wessel, Jan R

    2018-03-01

    The ability to detect and correct action errors is paramount to safe and efficient goal-directed behaviors. Existing work on the neural underpinnings of error processing and post-error behavioral adaptations has led to the development of several mechanistic theories of error processing. These theories can be roughly grouped into adaptive and maladaptive theories. While adaptive theories propose that errors trigger a cascade of processes that will result in improved behavior after error commission, maladaptive theories hold that error commission momentarily impairs behavior. Neither group of theories can account for all available data, as different empirical studies find both impaired and improved post-error behavior. This article attempts a synthesis between the predictions made by prominent adaptive and maladaptive theories. Specifically, it is proposed that errors invoke a nonspecific cascade of processing that will rapidly interrupt and inhibit ongoing behavior and cognition, as well as orient attention toward the source of the error. It is proposed that this cascade follows all unexpected action outcomes, not just errors. In the case of errors, this cascade is followed by error-specific, controlled processing, which is specifically aimed at (re)tuning the existing task set. This theory combines existing predictions from maladaptive orienting and bottleneck theories with specific neural mechanisms from the wider field of cognitive control, including from error-specific theories of adaptive post-error processing. The article aims to describe the proposed framework and its implications for post-error slowing and post-error accuracy, propose mechanistic neural circuitry for post-error processing, and derive specific hypotheses for future empirical investigations. © 2017 Society for Psychophysiological Research.

  18. How to Develop a Multi-Grounded Theory: the evolution of a business process theory

    Directory of Open Access Journals (Sweden)

    Mikael Lind

    2006-05-01

    In the information systems field there is a great need for different theories. Theory development can be performed in different ways – deductively and/or inductively. Different approaches, with their pros and cons for theory development, exist. A combined approach, which builds on inductive as well as deductive thinking, has been put forward – a Multi-Grounded Theory approach. In this paper the evolution of a business process theory is regarded as the development of a multi-grounded theory. This evolution is based on empirical studies, theory-informed conceptual development and the creation of conceptual cohesion. The theoretical development has involved a dialectic approach aiming at a theoretical synthesis based on antagonistic theories. The result of this research process was a multi-grounded business process theory. Multi-grounded means that the theory is empirically, internally and theoretically founded. This business process theory can be used as an aid for business modellers to direct attention towards relevant aspects when business process determination is performed.

  19. Stochastic processes and quantum theory

    International Nuclear Information System (INIS)

    Klauder, J.R.

    1975-01-01

    The author analyses a variety of stochastic processes, namely real-time diffusion phenomena, which are analogues of imaginary-time quantum theory and covariant imaginary-time quantum field theory. He elaborates some standard properties involving probability measures and stochastic variables and considers a simple class of examples. Finally he develops the fact that certain stochastic theories actually exhibit divergences that simulate those of covariant quantum field theory and presents examples of both renormalizable and unrenormalizable behavior. (V.J.C.)
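
    The diffusion-quantum analogy described here is usually made precise by the Feynman-Kac formula (standard statement, added for orientation):

        u(x,t) = \mathbb{E}_x\!\left[\exp\!\left(-\int_0^t V(B_s)\,ds\right) f(B_t)\right],

    which solves the imaginary-time Schrödinger (heat) equation \partial_t u = \tfrac{1}{2}\Delta u - Vu with u(x,0) = f(x), where B is Brownian motion.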

  20. Estimating Non-Normal Latent Trait Distributions within Item Response Theory Using True and Estimated Item Parameters

    Science.gov (United States)

    Sass, D. A.; Schmitt, T. A.; Walker, C. M.

    2008-01-01

    Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…
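
    For orientation, a common IRT formulation in such simulation studies is the two-parameter logistic model (a standard form, not necessarily the one used in this study):

        P(X_{ij} = 1 \mid \theta_i) = \frac{1}{1 + \exp[-a_j(\theta_i - b_j)]},

    where \theta_i is the latent trait and a_j, b_j are item discrimination and difficulty; studying non-normal latent traits amounts to replacing the usual \theta \sim N(0,1) assumption with a skewed or otherwise non-normal density.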

  1. Gaussian processes and constructive scalar field theory

    International Nuclear Information System (INIS)

    Benfatto, G.; Nicolo, F.

    1981-01-01

    The last years have seen very deep progress in constructive Euclidean field theory, with many implications in the area of random field theory. The authors discuss an approach to super-renormalizable scalar field theories which puts in evidence, in particular, the connections with the theory of the Gaussian processes associated to elliptic operators. The paper consists of two parts. Part I treats some problems in the theory of Gaussian processes which arise in the approach to the φ⁴₃ theory. Part II is devoted to the discussion of the ultraviolet stability in the φ⁴₃ theory. (Auth.)
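
    The Gaussian processes in question are those whose covariance is the Green's function of an elliptic operator; schematically (standard construction, added for orientation)

        \mathbb{E}[\phi(x)\,\phi(y)] = (-\Delta + m^2)^{-1}(x,y),

    which defines the free massive Euclidean field as a Gaussian process and is the starting point for the ultraviolet analysis.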

  2. Accumulating Project Management Knowledge Using Process Theory

    NARCIS (Netherlands)

    Niederman, Fred; March, Salvatore T.; Mueller, Benjamin

    2016-01-01

    Process theory has become an important mechanism for the accumulation of knowledge in a number of disciplines. In contrast with variance theory, which focuses on co-variation of dependent and independent variables, process theory focuses on sequences of activities, their duration and the intervals

  3. Empirical processes: theory and applications

    OpenAIRE

    Venturini Sergio

    2005-01-01

    Proceedings of the 2003 Summer School in Statistics and Probability in Torgnon (Aosta, Italy) held by Prof. Jon A. Wellner and Prof. M. Banerjee. The topic presented was the theory of empirical processes with applications to statistics (m-estimation, bootstrap, semiparametric theory).

  4. Dissociative Functions in the Normal Mourning Process.

    Science.gov (United States)

    Kauffman, Jeffrey

    1994-01-01

    Sees dissociative functions in mourning process as occurring in conjunction with integrative trends. Considers initial shock reaction in mourning as model of normal dissociation in mourning process. Dissociation is understood to be related to traumatic significance of death in human consciousness. Discerns four psychological categories of…

  5. Long-wave theory for a new convective instability with exponential growth normal to the wall.

    Science.gov (United States)

    Healey, J J

    2005-05-15

    A linear stability theory is presented for the boundary-layer flow produced by an infinite disc rotating at constant angular velocity in otherwise undisturbed fluid. The theory is developed in the limit of long waves and when the effects of viscosity on the waves can be neglected. This is the parameter regime recently identified by the author in a numerical stability investigation where a curious new type of instability was found, in which disturbances propagate and grow exponentially in the direction normal to the disc (i.e. the growth takes place in a region of zero mean shear). The theory describes the mechanisms controlling the instability, the role and location of critical points, and presents a saddle-point analysis describing the large-time evolution of a wave packet in frames of reference moving normal to the disc. The theory also shows that the previously obtained numerical solutions for numerically large wavelengths do indeed lie in the asymptotic long-wave regime, and so the behaviour and mechanisms described here may apply to a number of cross-flow instability problems.

  6. Evaluating accounting information systems that support multiple GAAP reporting using Normalized Systems Theory

    NARCIS (Netherlands)

    Vanhoof, E.; Huysmans, P.; Aerts, Walter; Verelst, J.; Aveiro, D.; Tribolet, J.; Gouveia, D.

    2014-01-01

    This paper uses a mixed methods approach of design science and case study research to evaluate structures of Accounting Information Systems (AIS) that report in multiple Generally Accepted Accounting Principles (GAAP), using Normalized Systems Theory (NST). To comply with regulation, many companies

  7. On the theory of optimal processes

    International Nuclear Information System (INIS)

    Goldenberg, P.; Provenzano, V.

    1975-01-01

    The theory of optimal processes is a recent mathematical formalism that is used to solve an important class of problems in science and in technology that cannot be solved by classical variational techniques. An example of such processes would be the control of a nuclear reactor. Certain features of the theory of optimal processes are discussed, emphasizing the central contribution of Pontryagin with his formulation of the maximum principle. An application of the theory of optimal control is presented: a time-optimal problem applied to a simplified model of a nuclear reactor, dealing with the question of changing the equilibrium power level of the reactor in an optimum time.
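
    Pontryagin's maximum principle typically yields bang-bang solutions for time-optimal problems with bounded controls. The record does not give the reactor model, so the sketch below uses the textbook double-integrator analogue instead (state: power deviation and its rate; control bounded by |u| ≤ U_MAX), for which the time-optimal law switches once across a parabolic switching curve. All names and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

U_MAX = 1.0   # control bound (illustrative)
DT = 1e-3     # Euler integration step

def switching_function(x1, x2):
    # Parabolic switching curve of the time-optimal double integrator
    return x1 + x2 * abs(x2) / (2.0 * U_MAX)

def bang_bang_control(x1, x2):
    # Maximum-principle result: control saturates at +/- U_MAX
    return -U_MAX if switching_function(x1, x2) > 0 else U_MAX

def simulate(x1, x2, t_max=10.0, tol=1e-2):
    t = 0.0
    while t < t_max and (abs(x1) > tol or abs(x2) > tol):
        u = bang_bang_control(x1, x2)
        x1 += x2 * DT          # e.g. power deviation
        x2 += u * DT           # e.g. its rate of change, driven by the control
        t += DT
    return t

print(f"time to reach target: {simulate(1.0, 0.0):.3f} s")
```

    From the initial state (1, 0) the analytic minimum time is 2·sqrt(1/U_MAX) = 2, which the simulation should approximately reproduce.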

  8. Stochastic processes and filtering theory

    CERN Document Server

    Jazwinski, Andrew H

    1970-01-01

    This unified treatment of linear and nonlinear filtering theory presents material previously available only in journals, and in terms accessible to engineering students. Its sole prerequisites are advanced calculus, the theory of ordinary differential equations, and matrix analysis. Although theory is emphasized, the text discusses numerous practical applications as well. Taking the state-space approach to filtering, this text models dynamical systems by finite-dimensional Markov processes, outputs of stochastic difference and differential equations. Starting with background material on probab

  9. The conflict and process theory of Melanie Klein.

    Science.gov (United States)

    Kavaler-Adler, S

    1993-09-01

    This article depicts the theory of Melanie Klein in both its conflict and process dimensions. In addition, it outlines Klein's strategic place in psychoanalytic history and in psychoanalytic theory formation. Her major contributions are seen in light of their clinical imperatives, and aspects of her metapsychology that seem negligible are differentiated from these clinical imperatives. Klein's role as a dialectical fulcrum between drive and object relations theories is explicated. Within the conflict theory, drive derivatives of sex and aggression are reformulated as object-related passions of love and hate. The process dimensions of Klein's theory are outlined in terms of dialectical increments of depressive position process as it alternates with regressive paranoid-schizoid-position mental phenomenology. The mourning process as a developmental process is particularly highlighted in terms of self-integrative progression within the working through of the depressive position.

  10. Accumulating project management knowledge through process theory

    NARCIS (Netherlands)

    Niederman, Fred; March, Salvatore T.; Mueller, Benjamin

    2014-01-01

    This paper describes how the general notion of process theory can provide a foundational component in a portfolio of project management theories. The paper begins by outlining a variety of views pertaining to the nature of theory and theory development. This forms a basis for understanding how

  11. Biomechanical Analysis of Normal Brain Development during the First Year of Life Using Finite Strain Theory

    OpenAIRE

    Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili

    2016-01-01

    The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and...

  12. Theories on migration processes of Cd in Jiaozhou Bay

    Science.gov (United States)

    Yang, Dongfang; Li, Haixia; Wang, Qi; Ding, Jun; Zhang, Longlei

    2018-03-01

    Understanding the migration process is essential to pollution control, and developing theories of the migration process is its scientific basis. This paper further developed five key theories on the migration processes of Cd: homogeneous theory, environmental dynamic theory, horizontal loss theory, migration trend theory and vertical migration theory. The performance and practical value of these theories were demonstrated by applying them to the analysis of the migration process of Cd in Jiaozhou Bay. These theories help to better understand the migration processes of pollutants in marine bays.

  13. Elementary process theory axiomatic introduction and applications

    CERN Document Server

    Cabbolet, Marcoen J T F

    2011-01-01

    Modern physics lacks a unitary theory that applies to all four fundamental interactions. This PhD thesis is a proposal for a single, complete, and coherent scheme of mathematically formulated elementary laws of nature. While the first chapter presents the general background, the second chapter addresses the method by which the main result has been developed. The next three chapters rigorously introduce the Elementary Process Theory, its mathematical foundations, and its applications to physics, cosmology and philosophy of mind. The final two chapters discuss the results and present the conclusions. Summarizing, the Elementary Process Theory is a scheme of seven well-formed closed expressions, written in the mathematical language of set matrix theory – a generalization of Zermelo-Fraenkel set theory. In the physical world, these seven expressions can be interpreted as elementary principles governing the universe at supersmall scale. The author critically confronts the theory with Quantum Mechanics and Genera...

  14. Recollection is a continuous process: implications for dual-process theories of recognition memory.

    Science.gov (United States)

    Mickes, Laura; Wais, Peter E; Wixted, John T

    2009-04-01

    Dual-process theory, which holds that recognition decisions can be based on recollection or familiarity, has long seemed incompatible with signal detection theory, which holds that recognition decisions are based on a singular, continuous memory-strength variable. Formal dual-process models typically regard familiarity as a continuous process (i.e., familiarity comes in degrees), but they construe recollection as a categorical process (i.e., recollection either occurs or does not occur). A continuous process is characterized by a graded relationship between confidence and accuracy, whereas a categorical process is characterized by a binary relationship such that high confidence is associated with high accuracy but all lower degrees of confidence are associated with chance accuracy. Using a source-memory procedure, we found that the relationship between confidence and source-recollection accuracy was graded. Because recollection, like familiarity, is a continuous process, dual-process theory is more compatible with signal detection theory than previously thought.
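
    The graded-versus-binary signature the record describes can be demonstrated with a small simulation: under a continuous (signal-detection) model, accuracy should rise smoothly with confidence, while under a threshold model all sub-threshold confidence levels should sit near chance. This is a hedged sketch of that logic with arbitrary parameters, not a reconstruction of the authors' source-memory procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

def accuracy_by_confidence(confidence, correct, n_bins=5):
    """Bin trials by confidence (quantiles) and return accuracy per bin."""
    edges = np.quantile(confidence, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(confidence, edges[1:-1]), 0, n_bins - 1)
    return np.array([correct[idx == b].mean() for b in range(n_bins)])

# Continuous model: graded source evidence drives both choice and confidence
evidence = rng.normal(1.0, 1.0, N)                 # signed source evidence
acc_cont = accuracy_by_confidence(np.abs(evidence), evidence > 0)

# Threshold model: recollection either succeeds (accurate) or fails (pure guess)
recollected = rng.random(N) < 0.5
correct_thr = recollected | (rng.random(N) < 0.5)  # guesses are right half the time
conf_thr = np.where(recollected, rng.uniform(2, 3, N), rng.uniform(0, 2, N))
acc_thr = accuracy_by_confidence(conf_thr, correct_thr)

print("continuous:", np.round(acc_cont, 2))   # rises smoothly with confidence
print("threshold: ", np.round(acc_thr, 2))    # near chance until the threshold
```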

  15. Processing Information in Quantum Decision Theory

    OpenAIRE

    Yukalov, V. I.; Sornette, D.

    2008-01-01

    A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention int...

  16. Emotion processes in normal and abnormal development and preventive intervention.

    Science.gov (United States)

    Izard, Carroll E; Fine, Sarah; Mostow, Allison; Trentacosta, Christopher; Campbell, Jan

    2002-01-01

    We present an analysis of the role of emotions in normal and abnormal development and preventive intervention. The conceptual framework stems from three tenets of differential emotions theory (DET). These principles concern the constructs of emotion utilization; intersystem connections among modular emotion systems, cognition, and action; and the organizational and motivational functions of discrete emotions. Particular emotions and patterns of emotions function differentially in different periods of development and in influencing the cognition and behavior associated with different forms of psychopathology. Established prevention programs have not emphasized the concept of emotion as motivation. It is even more critical that they have generally neglected the idea of modulating emotions, not simply to achieve self-regulation, but also to utilize their inherently adaptive functions as a means of facilitating the development of social competence and preventing psychopathology. The paper includes a brief description of a theory-based prevention program and suggestions for complementary targeted interventions to address specific externalizing and internalizing problems. In the final section, we describe ways in which emotion-centered preventions can provide excellent opportunities for research on the development of normal and abnormal behavior.

  17. Theories of transporting processes of Cu in Jiaozhou Bay

    Science.gov (United States)

    Yang, Dongfang; Su, Chunhua; Zhu, Sixi; Wu, Yunjie; Zhou, Wei

    2018-02-01

    Many marine bays have been polluted along with the rapid development of industry and population size, and understanding the transport processes of pollutants is essential to pollution control. In order to better understand the transport processes of pollutants in marine bays, this paper carried out comprehensive research on the theories of transporting processes of Cu in Jiaozhou Bay. Results showed that the transporting processes of Cu in this bay could be summarized into seven key theories: homogeneous theory, environmental dynamic theory, horizontal loss theory, source-to-waters transporting theory, sedimentation transporting theory, migration trend theory and vertical transporting theory. These theories help to better understand the migration processes of pollutants in marine bays.

  18. A predictive processing theory of sensorimotor contingencies: Explaining the puzzle of perceptual presence and its absence in synesthesia.

    Science.gov (United States)

    Seth, Anil K

    2014-01-01

    Normal perception involves experiencing objects within perceptual scenes as real, as existing in the world. This property of "perceptual presence" has motivated "sensorimotor theories" which understand perception to involve the mastery of sensorimotor contingencies. However, the mechanistic basis of sensorimotor contingencies and their mastery has remained unclear. Sensorimotor theory also struggles to explain instances of perception, such as synesthesia, that appear to lack perceptual presence and for which relevant sensorimotor contingencies are difficult to identify. On alternative "predictive processing" theories, perceptual content emerges from probabilistic inference on the external causes of sensory signals, however, this view has addressed neither the problem of perceptual presence nor synesthesia. Here, I describe a theory of predictive perception of sensorimotor contingencies which (1) accounts for perceptual presence in normal perception, as well as its absence in synesthesia, and (2) operationalizes the notion of sensorimotor contingencies and their mastery. The core idea is that generative models underlying perception incorporate explicitly counterfactual elements related to how sensory inputs would change on the basis of a broad repertoire of possible actions, even if those actions are not performed. These "counterfactually-rich" generative models encode sensorimotor contingencies related to repertoires of sensorimotor dependencies, with counterfactual richness determining the degree of perceptual presence associated with a stimulus. While the generative models underlying normal perception are typically counterfactually rich (reflecting a large repertoire of possible sensorimotor dependencies), those underlying synesthetic concurrents are hypothesized to be counterfactually poor. In addition to accounting for the phenomenology of synesthesia, the theory naturally accommodates phenomenological differences between a range of experiential states

  19. Spin-filtering effect and proximity effect in normal metal/ferromagnetic insulator/normal metal/superconductor junctions

    International Nuclear Information System (INIS)

    Li Hong; Yang Wei; Yang Xinjian; Qin Minghui; Xu Yihong

    2007-01-01

    Taking into account the thickness of the ferromagnetic insulator (FI), the spin-filtering effect and proximity effect in normal metal/ferromagnetic insulator/normal metal/superconductor (NM/FI/NM/SC) junctions are studied based on an extended Blonder-Tinkham-Klapwijk (BTK) theory. It is shown that a spin-dependent energy shift during the tunneling process induces splitting of the sub-energy gap conductance peaks and the spin polarization in the ferromagnetic insulator causes an imbalance of the peak heights. Different from the ferromagnet the spin-filtering effect of the FI cannot cause the reversion of the normalized conductance in NM/FI/NM/SC junctions

  20. Calculation of TC in a normal-superconductor bilayer using the microscopic-based Usadel theory

    International Nuclear Information System (INIS)

    Martinis, John M.; Hilton, G.C.; Irwin, K.D.; Wollman, D.A.

    2000-01-01

    The Usadel equations give a theory of superconductivity, valid in the diffusive limit, that is a generalization of the microscopic equations of the BCS theory. Because the theory is expressed in a tractable and physical form, even experimentalists can analytically and numerically calculate detailed properties of superconductors in physically relevant geometries. Here, we describe the Usadel equations and review their solution in the case of predicting the transition temperature T_C of a thin normal-superconductor bilayer. We also extend this calculation for thicker bilayers to show the dependence on the resistivity of the films. These results, which show a dependence on both the interface resistance and heat capacity of the films, provide important guidance on fabricating bilayers with reproducible transition temperatures.

  1. Normalized value coding explains dynamic adaptation in the human valuation process.

    Science.gov (United States)

    Khaw, Mel W; Glimcher, Paul W; Louie, Kenway

    2017-11-28

    The notion of subjective value is central to choice theories in ecology, economics, and psychology, serving as an integrated decision variable by which options are compared. Subjective value is often assumed to be an absolute quantity, determined in a static manner by the properties of an individual option. Recent neurobiological studies, however, have shown that neural value coding dynamically adapts to the statistics of the recent reward environment, introducing an intrinsic temporal context dependence into the neural representation of value. Whether valuation exhibits this kind of dynamic adaptation at the behavioral level is unknown. Here, we show that the valuation process in human subjects adapts to the history of previous values, with current valuations varying inversely with the average value of recently observed items. The dynamics of this adaptive valuation are captured by divisive normalization, linking these temporal context effects to spatial context effects in decision making as well as spatial and temporal context effects in perception. These findings suggest that adaptation is a universal feature of neural information processing and offer a unifying explanation for contextual phenomena in fields ranging from visual psychophysics to economic choice.
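
    Divisive normalization, as invoked here, rescales each value by a pooled (or, in the temporal version, recency-weighted) sum of contextual values. A minimal sketch of the standard functional form, with generic parameter names not taken from the paper:

```python
import numpy as np

def divisive_normalization(values, sigma=1.0, beta=1.0):
    """Spatial version: normalize each value by the pooled context."""
    values = np.asarray(values, dtype=float)
    return values / (sigma + beta * values.sum())

def temporal_normalization(value_stream, sigma=1.0, beta=1.0, decay=0.9):
    """Temporal version: divide by an exponentially weighted value history."""
    history, out = 0.0, []
    for v in value_stream:
        out.append(v / (sigma + beta * history))
        history = decay * history + (1 - decay) * v   # running average of past values
    return out

# Identical rewards feel smaller after a run of high-value items:
print(temporal_normalization([10, 10, 10, 10]))   # successive subjective values shrink
```

    With these illustrative parameters, a constant stream of rewards yields shrinking subjective values, matching the inverse dependence on the recent average value described above.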

  2. Developing Visualization Support System for Teaching/Learning Database Normalization

    Science.gov (United States)

    Folorunso, Olusegun; Akinwale, AdioTaofeek

    2010-01-01

    Purpose: In tertiary institution, some students find it hard to learn database design theory, in particular, database normalization. The purpose of this paper is to develop a visualization tool to give students an interactive hands-on experience in database normalization process. Design/methodology/approach: The model-view-controller architecture…

  3. Self-consistent theory of normal-to-superconducting transition

    International Nuclear Information System (INIS)

    Radzihovsky, L.; Chicago Univ., IL

    1995-01-01

    I study the normal-to-superconducting (NS) transition within the Ginzburg-Landau (GL) model, taking into account the fluctuations in the m-component complex order parameter ψ_α and the vector potential A in arbitrary dimension d, for any m. I find that the transition is of second order and that the previous conclusion of a fluctuation-driven first-order transition is a possible artifact of the breakdown of the ε-expansion and the inaccuracy of the 1/m-expansion for the physical values ε = 1, m = 1. I compute the anomalous η(d, m) exponent at the NS transition, and find η(3, 1) ∼ -0.38. In the m → ∞ limit, η(d, m) becomes exact and agrees with the 1/m-expansion. Near d = 4 the theory is also in good agreement with the perturbative ε-expansion results for m > 183 and provides a sensible interpolation formula for arbitrary d and m. (orig.)
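
    For context, the GL model referred to here is conventionally written as a free-energy functional for an m-component complex order parameter minimally coupled to a fluctuating gauge field; a generic form (coefficients and conventions vary between papers, so this is only a sketch) is:

```latex
F[\psi, \mathbf{A}] = \int d^d x \left[
  \sum_{\alpha=1}^{m} \left( |(\nabla - i e \mathbf{A})\psi_\alpha|^2
  + r\, |\psi_\alpha|^2 \right)
  + \frac{u}{2} \Big( \sum_{\alpha=1}^{m} |\psi_\alpha|^2 \Big)^{2}
  + \frac{1}{2} (\nabla \times \mathbf{A})^2
\right]
```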

  4. Dual-Process Theories and Cognitive Development: Advances and Challenges

    Science.gov (United States)

    Barrouillet, Pierre

    2011-01-01

    Dual-process theories have gained increasing importance in psychology. The contrast that they describe between an old intuitive and a new deliberative mind seems to make these theories especially suited to account for development. Accordingly, this special issue aims at presenting the latest applications of dual-process theories to cognitive…

  5. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Full Text Available Process capability indices are very important process quality assessment tools in automotive industries. The common process capability indices (PCIs Cp, Cpk, Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed based on the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed for surrogate process capability indices under non-normality, but few literature sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods have been reviewed and capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
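
    As a hedged sketch of the transformation route compared in the record (Box-Cox followed by conventional indices), the following uses scipy on simulated skewed data; the specification limits are invented, and they must be transformed with the same λ as the data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.lognormal(mean=0.0, sigma=0.4, size=500)   # skewed "resistivity-like" data
LSL, USL = 0.4, 3.0                                   # illustrative spec limits

# Box-Cox requires positive data; scipy estimates lambda by maximum likelihood
transformed, lam = stats.boxcox(data)
lsl_t = stats.boxcox(LSL, lam)    # transform the spec limits with the same lambda
usl_t = stats.boxcox(USL, lam)

mu, s = transformed.mean(), transformed.std(ddof=1)
cp  = (usl_t - lsl_t) / (6 * s)
cpk = min(usl_t - mu, mu - lsl_t) / (3 * s)
print(f"lambda={lam:.3f}  Cp={cp:.3f}  Cpk={cpk:.3f}")
```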

  6. Spin-filter effect in normal metal/ferromagnetic insulator/normal metal/superconductor structures

    International Nuclear Information System (INIS)

    Li, Hong; Yang, Wei; Yang, Xinjian; Qin, Minghui; Guo, Jianqin

    2007-01-01

    Taking into account the thickness of the ferromagnetic insulator, the spin-filter effect in normal metal/ferromagnetic insulator/normal metal/superconductor (NM/FI/NM/SC) junctions is studied based on the Blonder-Tinkham-Klapwijk (BTK) theory. It is shown that a spin-dependent energy shift during the tunneling process induces splitting of the subgap resonance peaks. The spin polarization due to the spin-filter effect of the FI causes an imbalance of the peak heights and can enhance the Zeeman splitting of the gap peaks caused by an applied magnetic field. The spin-filter effect has no contribution to the proximity-effect-induced superconductivity in the NM interlayer
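
    Both this record and record 19 extend the BTK framework, whose standard (unextended) result expresses the normalized NS conductance as 1 + A(E) − B(E), with A the Andreev and B the normal reflection probability for a barrier of dimensionless strength Z. A sketch of the textbook BTK expressions, without the ferromagnetic-insulator physics added by these papers:

```python
import numpy as np

def btk_conductance(E, Delta=1.0, Z=1.0):
    """Normalized NS conductance 1 + A(E) - B(E) from standard BTK theory."""
    E = np.asarray(E, dtype=float)
    A = np.empty_like(E)
    B = np.empty_like(E)
    sub = np.abs(E) < Delta                       # sub-gap energies
    A[sub] = Delta**2 / (E[sub]**2 + (Delta**2 - E[sub]**2) * (1 + 2 * Z**2)**2)
    B[sub] = 1.0 - A[sub]
    e = E[~sub]
    u2 = 0.5 * (1 + np.sqrt(e**2 - Delta**2) / np.abs(e))   # BCS coherence factors
    v2 = 1.0 - u2
    gamma2 = (u2 + Z**2 * (u2 - v2))**2
    A[~sub] = u2 * v2 / gamma2
    B[~sub] = (u2 - v2)**2 * Z**2 * (1 + Z**2) / gamma2
    return 1.0 + A - B

energies = np.linspace(0.01, 2.5, 6)
print(np.round(btk_conductance(energies, Z=0.0), 3))   # Z=0: sub-gap value 2 (pure Andreev)
```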

  7. On diffusion process generators and scattering theory

    International Nuclear Information System (INIS)

    Demuth, M.

    1980-01-01

    In scattering theory the existence of wave operators is one of the main points of interest. For two selfadjoint operators K and H defined in separable Hilbert spaces H̃ and H̃′, respectively, the usual two-space wave operator is defined by Ω±(H, J, K) = s-lim e^{itH} J e^{-itK} P^{ac} as t → ±∞, if these limits exist. Here J is the identification operator mapping H̃ into H̃′, and P^{ac} is the orthogonal projection onto the absolutely continuous subspace of K. The objective is to prove the existence and completeness of the wave operator for K and K+V, where K is a diffusion process generator and V a singular perturbation. Because generators of diffusion processes can be obtained by extension of second-order differential operators with variable coefficients, the result connects hard-core potential problems with the existence of wave operators for diffusion process generators, including scattering theory for second-order elliptic differential operators, by means of stochastic process theory and solutions of stochastic differential equations. (author)

  8. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  9. How to Develop a Multi-Grounded Theory: the evolution of a business process theory

    OpenAIRE

    Mikael Lind; Goran Goldkuhl

    2006-01-01

    In the information systems field there is a great need for different theories. Theory development can be performed in different ways – deductively and/or inductively. Different approaches, each with their pros and cons for theory development, exist. A combined approach, which builds on inductive as well as deductive thinking, has been put forward – a Multi-Grounded Theory approach. In this paper the evolution of a business process theory is regarded as the development of a multi-grounded theory. Th...

  10. Does Normal Processing Provide Evidence of Specialised Semantic Subsystems?

    Science.gov (United States)

    Shapiro, Laura R.; Olson, Andrew C.

    2005-01-01

    Category-specific disorders are frequently explained by suggesting that living and non-living things are processed in separate subsystems (e.g. Caramazza & Shelton, 1998). If subsystems exist, there should be benefits for normal processing, beyond the influence of structural similarity. However, no previous study has separated the relative…

  11. Information processing theory in the early design stages

    DEFF Research Database (Denmark)

    Cash, Philip; Kreye, Melanie

    2014-01-01

    …suggestions for improvements and support. One theory that may be particularly applicable to the early design stages is Information Processing Theory (IPT), as it is linked to the design process with regard to the key concepts considered. IPT states that designers search for information if they perceive uncertainty with regard to the knowledge necessary to solve a design challenge. They then process this information and compare whether the new knowledge they have gained covers the previous knowledge gap. The new knowledge is then shared within the design team to reduce ambiguity with regard to its meaning and to build a shared understanding – reducing perceived uncertainty. In engineering design, uncertainty plays a key role, particularly in the early design stages. Thus, we propose that Information Processing Theory is suitable to describe designer activity in the early design stages…

  12. Applying normalization process theory to understand implementation of a family violence screening and care model in maternal and child health nursing practice: a mixed method process evaluation of a randomised controlled trial.

    Science.gov (United States)

    Hooker, Leesa; Small, Rhonda; Humphreys, Cathy; Hegarty, Kelsey; Taft, Angela

    2015-03-28

    In Victoria, Australia, Maternal and Child Health (MCH) services deliver primary health care to families with children 0-6 years, focusing on health promotion, parenting support and early intervention. Family violence (FV) has been identified as a major public health concern, with increased prevalence in the child-bearing years. Victorian Government policy recommends routine FV screening of all women attending MCH services. Using Normalization Process Theory (NPT), we aimed to understand the barriers and facilitators of implementing an enhanced screening model into MCH nurse clinical practice. NPT informed the process evaluation of a pragmatic, cluster randomised controlled trial in eight MCH nurse teams in metropolitan Melbourne, Victoria, Australia. Using mixed methods (surveys and interviews), we explored the views of MCH nurses, MCH nurse team leaders, FV liaison workers and FV managers on implementation of the model. Quantitative data were analysed by comparing proportionate group differences and change within trial arm over time between interim and impact nurse surveys. Qualitative data were inductively coded, thematically analysed and mapped to NPT constructs (coherence, cognitive participation, collective action and reflexive monitoring) to enhance our understanding of the outcome evaluation. MCH nurse participation rates for interim and impact surveys were 79% (127/160) and 71% (114/160), respectively. Twenty-three key stakeholder interviews were completed. FV screening work was meaningful and valued by participants; however, the implementation coincided with a significant (government directed) change in clinical practice which impacted on full engagement with the model (coherence and cognitive participation). The use of MCH nurse-designed FV screening/management tools in focussed women's health consultations and links with FV services enhanced the participants' work (collective action). Monitoring of FV work (reflexive monitoring) was limited. The use of

  13. Dynamic competition and enterprising discovery: Kirzner’s market process theory

    Directory of Open Access Journals (Sweden)

    Ahmet İhsan KAYA

    2011-12-01

    Full Text Available Market process theory was designed by followers of the Austrian School tradition as an alternative to neoclassical price theory in order to explain observable markets. Contrary to neoclassical economics, which focuses on the concept of equilibrium, market process theory seeks to explore disequilibrium and the movement towards equilibrium. In doing so, Israel Kirzner's analyses rest on the role of the entrepreneur in dealing with the limited information, time-related uncertainty and market uncertainty that are not taken into consideration in neoclassical price theory. In this study, Kirzner's theory of competition and entrepreneurship is discussed together with the contributions of Mises and Hayek. The study constitutes an introduction to Kirzner's market process theory.

  14. The Theory of Linear Prediction

    CERN Document Server

    Vaidyanathan, PP

    2007-01-01

    Linear prediction theory has had a profound impact in the field of digital signal processing. Although the theory dates back to the early 1940s, its influence can still be seen in applications today. The theory is based on very elegant mathematics and leads to many beautiful insights into statistical signal processing. Although prediction is only a part of the more general topics of linear estimation, filtering, and smoothing, this book focuses on linear prediction. This has enabled detailed discussion of a number of issues that are normally not found in texts. For example, the theory of vecto
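
    The computational core of linear prediction is solving the Toeplitz (Yule-Walker) normal equations for the predictor coefficients from the autocorrelation sequence, which the Levinson-Durbin recursion does in O(p²). A hedged sketch on a synthetic AR(2) signal:

```python
import numpy as np

def levinson_durbin(r, order):
    """Solve the Toeplitz normal equations for prediction coefficients.

    r: autocorrelation sequence r[0], ..., r[order].
    Returns (a, err): coefficients of x_hat[n] = sum_i a[i] * x[n-1-i],
    and the final prediction error power.
    """
    a = np.zeros(order)
    err = r[0]
    for k in range(order):
        # reflection (PARCOR) coefficient
        kappa = (r[k + 1] - np.dot(a[:k], r[k:0:-1])) / err
        a[:k + 1] = np.concatenate([a[:k] - kappa * a[:k][::-1], [kappa]])
        err *= 1.0 - kappa**2
    return a, err

# Synthetic AR(2) signal: x[n] = 0.75 x[n-1] - 0.5 x[n-2] + white noise
rng = np.random.default_rng(0)
x = np.zeros(50_000)
for n in range(2, len(x)):
    x[n] = 0.75 * x[n - 1] - 0.5 * x[n - 2] + rng.normal()

# Sample autocorrelation at lags 0..2, then solve for the predictor
r = np.array([x[: len(x) - k] @ x[k:] for k in range(3)]) / len(x)
a, err = levinson_durbin(r, 2)
print(np.round(a, 3))   # expected close to [0.75, -0.5]
```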

  15. Linking the Value Assessment of Oil and Gas Firms to Ambidexterity Theory Using a Mixture of Normal Distributions

    Directory of Open Access Journals (Sweden)

    Casault Sébastien

    2016-05-01

    Full Text Available Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory – the variation in their market returns is non-Gaussian. In this paper, the nature and underlying reason for these significant deviations from expected behavior are considered. Understanding these differences in financial market behavior is important for a wide range of reasons, including: assessing investments, investor relations, decisions to raise capital, and assessment of firm and management performance. We show that using a "thicker tailed" mixture of two normal distributions offers a significantly more accurate model than the traditional Gaussian approach in describing the behavior of the value of oil and gas firms. This mixture of normal distributions is also more effective in bridging the gap between management theory and practice without the need to introduce complex time-sensitive GARCH and/or jump diffusion dynamics. The mixture distribution is consistent with ambidexterity theory, which suggests firms operate in two distinct states driven by the primary focus of the firm: an exploration state with high uncertainty and an exploitation (or production) state with lower uncertainty. The findings have direct implications for improving the accuracy of real option pricing techniques and futures analysis of risk management. Traditional options pricing models assume that commercial returns from these assets are described by a normal random walk. However, a normal random walk model discounts the possibility of large changes to the marketplace from events such as the discovery of important reserves or the introduction of new technology. The mixture distribution proves to be well suited to inherently describe the unusually large risks and opportunities associated with oil and gas production and exploration. A significance testing study of 554 oil and gas exploration and production firms empirically supports using a mixture
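
    The two-state return model described here amounts to fitting a two-component Gaussian mixture to a return series. A hedged sketch on simulated data (the paper's data set and fitted parameters are not reproduced here):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated daily returns: a calm "exploitation" state and a volatile "exploration" state
calm     = rng.normal(0.0005, 0.01, 1800)
volatile = rng.normal(0.0, 0.04, 200)
returns = np.concatenate([calm, volatile]).reshape(-1, 1)

gm = GaussianMixture(n_components=2, random_state=0).fit(returns)
for w, mu, var in zip(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel()):
    print(f"weight={w:.2f}  mean={mu:+.4f}  sigma={np.sqrt(var):.3f}")
```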

  16. Semi-analytical quasi-normal mode theory for the local density of states in coupled photonic crystal cavity-waveguide structures

    DEFF Research Database (Denmark)

    de Lasson, Jakob Rosenkrantz; Kristensen, Philip Trøst; Mørk, Jesper

    2015-01-01

    We present and validate a semi-analytical quasi-normal mode (QNM) theory for the local density of states (LDOS) in coupled photonic crystal (PhC) cavity-waveguide structures. By means of an expansion of the Green's function on one or a few QNMs, a closed-form expression for the LDOS is obtained, and for two types of two-dimensional PhCs, with one and two cavities side-coupled to an extended waveguide, the theory is validated against numerically exact computations. For the single cavity, a slightly asymmetric spectrum is found, which the QNM theory reproduces, and for two cavities a non-trivial spectrum with a peak and a dip is found, which is reproduced only when including both the two relevant QNMs in the theory. In both cases, we find relative errors below 1% in the bandwidth of interest.

  17. Information Theory Analysis of Cascading Process in a Synthetic Model of Fluid Turbulence

    Directory of Open Access Journals (Sweden)

    Massimo Materassi

    2014-02-01

    Full Text Available The use of transfer entropy has proven to be helpful in detecting the direction of dynamical driving in the interaction of two processes, X and Y. In this paper, we present a different normalization for the transfer entropy, which is capable of better detecting the information transfer direction. This new normalized transfer entropy is applied to the detection of the direction of energy flux transfer in a synthetic model of fluid turbulence, namely the Gledzer–Ohkitani–Yamada shell model. This is a well-known model of fully developed turbulence in Fourier space, characterized by an energy cascade towards the small scales (large wavenumbers k), so that applying the information-theory analysis to its outcome tests the reliability of the analysis tool rather than exploring the model physics. As a result, the presence of a direct cascade along the scales in the shell model and the locality of the interactions in the space of wavenumbers come out as expected, indicating the validity of this data analysis tool. In this context, the use of a normalized version of transfer entropy, able to account for the difference in the intrinsic randomness of the interacting processes, appears to perform better, avoiding the wrong conclusions to which the "traditional" transfer entropy would lead.
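
    As a hedged sketch of the base quantity under discussion, the following implements a plug-in (histogram) estimator of transfer entropy TE(X→Y) = I(Y_{t+1}; X_t | Y_t) on a synthetic driven pair; the paper's normalization, which corrects for the intrinsic randomness of the interacting processes, is not reproduced here:

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Plug-in estimate of TE(X -> Y) = I(Y_{t+1}; X_t | Y_t), in bits."""
    xd = np.digitize(x, np.histogram_bin_edges(x, bins))
    yd = np.digitize(y, np.histogram_bin_edges(y, bins))
    triples = np.stack([yd[1:], yd[:-1], xd[:-1]], axis=1)   # (y_next, y, x)
    joint, _ = np.histogramdd(triples, bins=bins + 2)
    p = joint / joint.sum()
    p_yy = p.sum(axis=2)            # p(y_next, y)
    p_yx = p.sum(axis=0)            # p(y, x)
    p_y  = p.sum(axis=(0, 2))       # p(y)
    nz = p > 0
    num = p * p_y[None, :, None]
    den = p_yy[:, :, None] * p_yx[None, :, :]
    return np.sum(p[nz] * np.log2(num[nz] / den[nz]))

rng = np.random.default_rng(0)
x = rng.normal(size=20_000)
y = np.zeros_like(x)
for t in range(1, len(x)):          # y is driven by past x
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
print(f"TE(x->y) = {transfer_entropy(x, y):.3f} bits")
print(f"TE(y->x) = {transfer_entropy(y, x):.3f} bits")   # should be much smaller
```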

  18. Complete Normal Ordering 1: Foundations

    CERN Document Server

    Ellis, John; Skliros, Dimitri P.

    2016-01-01

    We introduce a new prescription for quantising scalar field theories perturbatively around a true minimum of the full quantum effective action, which is to `complete normal order' the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all `cephalopod' Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of `complete normal ordering' (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative i...

  19. Normal Patterns of Deja Experience in a Healthy, Blind Male: Challenging Optical Pathway Delay Theory

    Science.gov (United States)

    O'Connor, Akira R.; Moulin, Christopher J. A.

    2006-01-01

    We report the case of a 25-year-old healthy, blind male, MT, who experiences normal patterns of deja vu. The optical pathway delay theory of deja vu formation assumes that neuronal input from the optical pathways is necessary for the formation of the experience. Surprisingly, although the sensation of deja vu is known to be experienced by blind…

  20. System Theory and Physiological Processes.

    Science.gov (United States)

    Jones, R W

    1963-05-03

    Engineers and physiologists working together in experimental and theoretical studies predict that the application of system analysis to biological processes will increase understanding of these processes and broaden the base of system theory. Richard W. Jones, professor of electrical engineering at Northwestern University, Evanston, Illinois, and John S. Gray, professor of physiology at Northwestern's Medical School, discuss these developments. Their articles are adapted from addresses delivered in Chicago in November 1962 at the 15th Annual Conference on Engineering in Medicine and Biology.

  1. Quantum measurement and algebraic quantum field theories

    International Nuclear Information System (INIS)

    DeFacio, B.

    1976-01-01

    It is shown that the physics and semantics of quantum measurement provide a natural interpretation of the weak neighborhoods of the states on observable algebras without invoking any ideas of "a reading error" or "a measured range." Then the state preparation process in quantum measurement theory is shown to give the normal (or locally normal) states on the observable algebra. Some remarks are made concerning the physical implications of normal states for systems with an infinite number of degrees of freedom, including questions on open and closed algebraic theories

  2. Applying Information Processing Theory to Supervision: An Initial Exploration

    Science.gov (United States)

    Tangen, Jodi L.; Borders, L. DiAnne

    2017-01-01

    Although clinical supervision is an educational endeavor (Borders & Brown, [Borders, L. D., 2005]), many scholars neglect theories of learning in working with supervisees. The authors describe 1 learning theory--information processing theory (Atkinson & Shiffrin, 1968, 1971; Schunk, 2016)--and the ways its associated interventions may…

  3. Complete normal ordering 1: Foundations

    Directory of Open Access Journals (Sweden)

    John Ellis

    2016-08-01

    Full Text Available We introduce a new prescription for quantising scalar field theories (in generic spacetime dimension and background) perturbatively around a true minimum of the full quantum effective action, which is to 'complete normal order' the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all 'cephalopod' Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of 'complete normal ordering' (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative interactions, and by using a point splitting 'trick' we extend this result to theories with derivative interactions, such as those appearing as non-linear σ-models in the world-sheet formulation of string theory. We focus here on theories with trivial vacua, generalising the discussion to non-trivial vacua in a follow-up paper.
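
    For context on the prescription being extended: for a Gaussian (free) field, standard normal ordering subtracts self-contractions and admits a closed generating function; 'complete normal ordering' generalizes this subtraction to the interacting vacuum (see the paper for the precise definition). The standard textbook relations are:

```latex
% Standard normal ordering removes self-contractions, e.g.
:\phi^2(x): \;=\; \phi^2(x) - \langle \phi^2 \rangle ,
\qquad
:\phi^4(x): \;=\; \phi^4(x) - 6\,\langle\phi^2\rangle\,\phi^2(x) + 3\,\langle\phi^2\rangle^2 .
% For a Gaussian (free) field it is generated compactly by
:e^{\alpha\phi}: \;=\; e^{\alpha\phi}\, e^{-\alpha^2 \langle \phi^2 \rangle / 2}.
```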

  4. Diffusive epidemic process: theory and simulation

    International Nuclear Information System (INIS)

    Maia, Daniel Souza; Dickman, Ronald

    2007-01-01

    We study the continuous absorbing-state phase transition in the one-dimensional diffusive epidemic process via mean-field theory and Monte Carlo simulation. In this model, particles of two species (A and B) hop on a lattice and undergo reactions B → A and A+B → 2B; the total particle number is conserved. We formulate the model as a continuous-time Markov process described by a master equation. A phase transition between the (absorbing) B-free state and an active state is observed as the parameters (reaction and diffusion rates, and total particle density) are varied. Mean-field theory reveals a surprising, nonmonotonic dependence of the critical recovery rate on the diffusion rate of B particles. A computational realization of the process that is faithful to the transition rates defining the model is devised, allowing for direct comparison with theory. Using the quasi-stationary simulation method we determine the order parameter and the survival time in systems of up to 4000 sites. Due to strong finite-size effects, the results converge only for large system sizes. We find no evidence for a discontinuous transition. Our results are consistent with the existence of three distinct universality classes, depending on whether A particles diffuse more rapidly, less rapidly or at the same rate as B particles. We also perform quasi-stationary simulations of the triplet creation model, which yield results consistent with a discontinuous transition at high diffusion rates
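
    The homogeneous mean-field limit of this model can be written down and integrated directly: with conserved density ρ = a + b and rates r (B → A) and λ (A + B → 2B), one gets ḃ = λ(ρ − b)b − rb, so the absorbing state b = 0 loses stability when λρ > r. Note that the diffusion rates drop out at this level, so the nonmonotonic diffusion dependence reported above requires the full model. Rate names here are generic:

```python
def mean_field(b0=0.01, rho=1.0, lam=1.0, r=0.5, dt=0.01, steps=5000):
    """Integrate db/dt = lam*(rho - b)*b - r*b (Euler); a = rho - b is conserved."""
    b = b0
    for _ in range(steps):
        b += dt * (lam * (rho - b) * b - r * b)
    return b

# Above threshold (lam*rho > r) the B density survives; below it, B dies out.
print(f"active phase:    b* = {mean_field(r=0.5):.3f}")   # expect rho - r/lam = 0.5
print(f"absorbing phase: b* = {mean_field(r=1.5):.3f}")   # expect 0
```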

  5. Process theory for supervisory control of stochastic systems with data

    NARCIS (Netherlands)

    Markovski, J.

    2012-01-01

    We propose a process theory for supervisory control of stochastic nondeterministic plants with data-based observations. The Markovian process theory with data relies on the notion of Markovian partial bisimulation to capture controllability of stochastic nondeterministic systems. It presents a

  6. Fractal Point Process and Queueing Theory and Application to Communication Networks

    National Research Council Canada - National Science Library

    Wornel, Gregory

    1999-01-01

    .... A unifying theme in the approaches to these problems has been an integration of interrelated perspectives from communication theory, information theory, signal processing theory, and control theory...

  7. Reggeon field theory and Markov processes

    International Nuclear Information System (INIS)

    Grassberger, P.; Sundermeyer, K.

    1978-01-01

    Reggeon field theory with a quartic coupling in addition to the standard cubic one is shown to be mathematically equivalent to a chemical process where a radical can undergo diffusion, absorption, recombination, and autocatalytic production. Physically, these 'radicals' are wee partons. (Auth.)

  8. The uncertainty processing theory of motivation.

    Science.gov (United States)

    Anselme, Patrick

    2010-04-02

    Most theories describe motivation using basic terminology (drive, 'wanting', goal, pleasure, etc.) that fails to inform well about the psychological mechanisms controlling its expression. This leads to a conception of motivation as a mere psychological state 'emerging' from neurophysiological substrates. However, the involvement of motivation in a large number of behavioural parameters (triggering, intensity, duration, and directedness) and cognitive abilities (learning, memory, decision, etc.) suggest that it should be viewed as an information processing system. The uncertainty processing theory (UPT) presented here suggests that motivation is the set of cognitive processes allowing organisms to extract information from the environment by reducing uncertainty about the occurrence of psychologically significant events. This processing of information is shown to naturally result in the highlighting of specific stimuli. The UPT attempts to solve three major problems: (i) how motivations can affect behaviour and cognition so widely, (ii) how motivational specificity for objects and events can result from nonspecific neuropharmacological causal factors (such as mesolimbic dopamine), and (iii) how motivational interactions can be conceived in psychological terms, irrespective of their biological correlates. The UPT is in keeping with the conceptual tradition of the incentive salience hypothesis while trying to overcome the shortcomings inherent to this view. Copyright 2009 Elsevier B.V. All rights reserved.

  9. Implementing monitoring technologies in care homes for people with dementia: A qualitative exploration using Normalization Process Theory.

    Science.gov (United States)

    Hall, Alex; Wilson, Christine Brown; Stanmore, Emma; Todd, Chris

    2017-07-01

    Ageing societies and a rising prevalence of dementia are associated with increasing demand for care home places. Monitoring technologies (e.g. bed-monitoring systems; wearable location-tracking devices) are appealing to care homes as they may enhance safety, increase resident freedom, and reduce staff burden. However, there are ethical concerns about the use of such technologies, and it is unclear how they might be implemented to deliver their full range of potential benefits. This study explored facilitators and barriers to the implementation of monitoring technologies in care homes. Embedded multiple-case study with qualitative methods. Three dementia-specialist care homes in North-West England. Purposive sample of 24 staff (including registered nurses, clinical specialists, senior managers and care workers), 9 relatives and 9 residents. 36 semi-structured interviews with staff, relatives and residents; 175h of observation; resident care record review. Data collection informed by Normalization Process Theory, which seeks to account for how novel interventions become routine practice. Data analysed using Framework Analysis. Findings are presented under three main themes: 1. Reasons for using technologies: The primary reason for using monitoring technologies was to enhance safety. This often seemed to override consideration of other potential benefits (e.g. increased resident freedom) or ethical concerns (e.g. resident privacy); 2. Ways in which technologies were implemented: Some staff, relatives and residents were not involved in discussions and decision-making, which seemed to limit understandings of the potential benefits and challenges from the technologies. Involvement of residents appeared particularly challenging. Staff highlighted the importance of training, but staff training appeared mainly informal which did not seem sufficient to ensure that staff fully understood the technologies; 3. Use of technologies in practice: Technologies generated frequent

  10. Stationary stochastic processes theory and applications

    CERN Document Server

    Lindgren, Georg

    2012-01-01

    Contents: Some Probability and Process Background (sample space, sample function, and observables; random variables and stochastic processes; stationary processes and fields; Gaussian processes; four historical landmarks); Sample Function Properties (quadratic mean properties; sample function continuity; derivatives, tangents, and other characteristics; stochastic integration; an ergodic result; exercises); Spectral Representations (complex-valued stochastic processes; Bochner's theorem and the spectral distribution; spectral representation of a stationary process; Gaussian processes; stationary counting processes; exercises); Linear Filters – General Properties (linear time invariant filters; linear filters and differential equations; white noise in linear systems; long range dependence, non-integrable spectra, and unstable systems; the ARMA-family); Linear Filters – Special Topics (the Hilbert transform and the envelope; the sampling theorem; Karhunen-Loève expansion); Classical Ergodic Theory and Mixing (the basic ergodic theorem in L2; stationarity and transformations; the ergodic th...

  11. Scattering process in the Scalar Duffin-Kemmer-Petiau gauge theory

    International Nuclear Information System (INIS)

    Beltran, J; M Pimentel, B; E Soto, D

    2016-01-01

    In this work we calculate the cross sections of scattering processes in the Duffin-Kemmer-Petiau theory coupled to Maxwell's electromagnetic field. Specifically, we find the propagator of the free theory and the scattering amplitudes and cross sections at Born level for the Møller and Compton scattering processes of this model. For this purpose we use the analytic representation for free propagators and work within the framework of the causal perturbation theory of Epstein and Glaser. (paper)

  12. Fetterman-House: A Process Use Distinction and a Theory.

    Science.gov (United States)

    Fetterman, David

    2003-01-01

    Discusses the concept of process use as an important distinction between the evaluation theories of E. House and D. Fetterman, thus helping to explain the discordant results of C. Christie for these two theories. (SLD)

  13. Mathematics Education as a Proving-Ground for Information-Processing Theories.

    Science.gov (United States)

    Greer, Brian, Ed.; Verschaffel, Lieven, Ed.

    1990-01-01

    Five papers discuss the current and potential contributions of information-processing theory to our understanding of mathematical thinking as those contributions affect the practice of mathematics education. It is concluded that information-processing theories need to be supplemented in various ways to more adequately reflect the complexity of…

  14. Exploring drivers and challenges in implementation of health promotion in community mental health services: a qualitative multi-site case study using Normalization Process Theory.

    Science.gov (United States)

    Burau, Viola; Carstensen, Kathrine; Fredens, Mia; Kousgaard, Marius Brostrøm

    2018-01-24

    There is an increased interest in improving the physical health of people with mental illness. Little is known about implementing health promotion interventions in adult mental health organisations where many users also have physical health problems. The literature suggests that contextual factors are important for implementation in community settings. This study focused on the change process and analysed the implementation of a structural health promotion intervention in community mental health organisations in different contexts in Denmark. The study was based on a qualitative multiple-case design and included two municipal and two regional provider organisations. Data were various written sources and 13 semi-structured interviews with 22 key managers and frontline staff. The analysis was organised around the four main constructs of Normalization Process Theory: Coherence, Cognitive Participation, Collective Action, and Reflexive Monitoring. Coherence: Most respondents found the intervention to be meaningful in that the intervention fitted well into existing goals, practices and treatment approaches. Cognitive Participation: Management engagement varied across providers and low engagement impeded implementation. Engaging all staff was a general problem although some of the initial resistance was apparently overcome. Collective Action: Daily enactment depended on staff being attentive and flexible enough to manage the complex needs and varying capacities of users. Reflexive Monitoring: During implementation, staff evaluations of the progress and impact of the intervention were mostly informal and ad hoc and staff used these to make on-going adjustments to activities. Overall, characteristics of context common to all providers (work force and user groups) seemed to be more important for implementation than differences in the external political-administrative context. In terms of research, future studies should adopt a more bottom-up, grounded description of context

  15. Assessment the Plasticity of Cortical Brain Theory through Visual Memory in Deaf and Normal Students

    Directory of Open Access Journals (Sweden)

    Ali Ghanaee-Chamanabad

    2012-10-01

    Full Text Available Background: The main aim of this research was to assess the differences in visual memory between deaf and normal-hearing students in light of the theory of cortical brain plasticity. Materials and Methods: This is an ex post facto study. The Benton visual test was administered in two different ways to 46 primary school students (22 deaf and 24 normal-hearing). Student's t-test was used to analyse the data. Results: Visual memory in deaf students was significantly higher than in comparable normal-hearing students. While visual memory performance in deaf girls was higher than in normal-hearing girls on both administrations, deaf boys performed better on only one of the two administrations of the Benton visual memory test. Conclusion: Brain plasticity shows that the adult brain is dynamic and subject to change, and this plasticity is not limited to somatosensory systems. According to the theory of cortical brain plasticity, deaf students, owing to their hearing deficit, have increased visual input, which has developed their procedural visual memory.

  16. Bonding in Mercury Molecules Described by the Normalized Elimination of the Small Component and Coupled Cluster Theory

    NARCIS (Netherlands)

    Cremer, Dieter; Kraka, Elfi; Filatov, Michael

    2008-01-01

    Bond dissociation energies (BDEs) of neutral HgX and cationic HgX⁺ molecules range from less than 1 kcal mol⁻¹ to as much as 60 kcal mol⁻¹. Using NESC/CCSD(T) [normalized elimination of the small component and coupled-cluster theory with all single and double excitations and a perturbative

  17. Induction of depressed mood: a test of opponent-process theory.

    Science.gov (United States)

    Ranieri, D J; Zeiss, A M

    1984-12-01

    Solomon's (1980) opponent-process theory of acquired motivation has been used to explain many phenomena in which affective or hedonic contrasts appear to exist, but has not been applied to the induction of depressed mood. The purpose of this study, therefore, was to determine whether opponent-process theory can be applied to this area. Velten's (1968) mood-induction procedure was used and subjects were assigned either to a depression-induction condition or to one of two control groups. Self-report measures of depressed mood were taken before, during, and at several points after the mood induction. Results were not totally consistent with a rigorous set of criteria for supporting an opponent-process interpretation. This suggests that the opponent-process model may not be applicable to induced depressed mood. Possible weaknesses in the experimental design, along with implications for opponent-process theory, are discussed.

  18. Bicervical normal uterus with normal vagina | Okeke | Annals of ...

    African Journals Online (AJOL)

    To the best of our knowledge, only a few cases of a bicervical normal uterus with a normal vagina exist in the literature; one of the cases had an anterior-posterior disposition. This form of uterine abnormality is not explicable by the existing classical theory of müllerian anomalies and suggests that a complex interplay of events ...

  19. Varying acoustic-phonemic ambiguity reveals that talker normalization is obligatory in speech processing.

    Science.gov (United States)

    Choi, Ja Young; Hu, Elly R; Perrachione, Tyler K

    2018-04-01

    The nondeterministic relationship between speech acoustics and abstract phonemic representations imposes a challenge for listeners to maintain perceptual constancy despite the highly variable acoustic realization of speech. Talker normalization facilitates speech processing by reducing the degrees of freedom for mapping between encountered speech and phonemic representations. While this process has been proposed to facilitate the perception of ambiguous speech sounds, it is currently unknown whether talker normalization is affected by the degree of potential ambiguity in acoustic-phonemic mapping. We explored the effects of talker normalization on speech processing in a series of speeded classification paradigms, parametrically manipulating the potential for inconsistent acoustic-phonemic relationships across talkers for both consonants and vowels. Listeners identified words with varying potential acoustic-phonemic ambiguity across talkers (e.g., beet/boat vs. boot/boat) spoken by single or mixed talkers. Auditory categorization of words was always slower when listening to mixed talkers compared to a single talker, even when there was no potential acoustic ambiguity between target sounds. Moreover, the processing cost imposed by mixed talkers was greatest when words had the most potential acoustic-phonemic overlap across talkers. Models of acoustic dissimilarity between target speech sounds did not account for the pattern of results. These results suggest (a) that talker normalization incurs the greatest processing cost when disambiguating highly confusable sounds and (b) that talker normalization appears to be an obligatory component of speech perception, taking place even when the acoustic-phonemic relationships across sounds are unambiguous.

  20. Conflict Monitoring in Dual Process Theories of Thinking

    Science.gov (United States)

    De Neys, Wim; Glumicic, Tamara

    2008-01-01

    Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and a demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision-making errors, there are some widely different views on the efficiency of the process. Kahneman…

  1. Process service quality evaluation based on Dempster-Shafer theory and support vector machine.

    Science.gov (United States)

    Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei

    2017-01-01

    Human involvement influences traditional service quality evaluations, leading to low accuracy, poor reliability, and weak predictability. This paper proposes a method that employs a support vector machine (SVM) and Dempster-Shafer evidence theory to evaluate the service quality of a production process by handling a high number of input features with a small sampling data set; the method is called SVMs-DS. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, the basic probability assignments (BPAs) are constructed, which support the evaluation in both a qualitative and a quantitative way. The process service quality evaluation results are validated by the Dempster rules; the decision threshold to resolve conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
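
    To make the evidence-combination step concrete, here is a minimal Python sketch of Dempster's rule of combination applied to two hand-made basic probability assignments over a two-hypothesis frame; the masses and hypothesis names are invented for illustration and are not taken from the SVMs-DS paper.

        # Dempster's rule of combination for two basic probability assignments
        # (BPAs). Hypotheses are frozensets; masses are invented example values.
        from itertools import product

        def combine(m1, m2):
            combined, conflict = {}, 0.0
            for (a, x), (b, y) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + x * y
                else:
                    conflict += x * y          # mass falling on the empty set
            k = 1.0 - conflict                 # normalization constant
            return {h: v / k for h, v in combined.items()}

        GOOD, BAD = frozenset({'good'}), frozenset({'bad'})
        EITHER = GOOD | BAD
        m_svm1 = {GOOD: 0.7, BAD: 0.2, EITHER: 0.1}   # BPA from one SVM output
        m_svm2 = {GOOD: 0.6, BAD: 0.3, EITHER: 0.1}   # BPA from a second SVM
        print(combine(m_svm1, m_svm2))                # combined belief masses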

  2. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    Full Text Available The study analyses data with the aim of improving the quality of statistical tools used in the assembly processes of automobile seats. Normal distribution of variables is one of the inevitable conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although there are constantly more approaches to non-normal data handling. An appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution with the theoretical normal distribution, on the basis of hypothesis testing using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of 1st row automobile seats for each characteristic of quality (Safety Regulation - S/R) individually. The study closely processes the measured data of an airbag's assembly and aims to obtain normally distributed data and apply statistical process control to them. The results of the contribution conclude in a rejection of the null hypothesis (the measured variables do not follow the normal distribution); it is therefore necessary to begin work on data transformation, supported by Minitab15. Even this approach does not yield normally distributed data, so a procedure should be proposed that leads to quality output of the whole statistical control of manufacturing processes.
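
    The test-then-transform workflow described above is easy to sketch in Python; the snippet below uses synthetic skewed data, the Shapiro-Wilk test as a stand-in for the paper's goodness-of-fit test, and a Box-Cox transformation as one common choice, none of which is claimed to match the authors' exact procedure in StatGraphics or Minitab.

        # Test normality, then attempt a power transformation (illustrative data).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        x = rng.lognormal(mean=0.0, sigma=0.6, size=200)   # skewed "measurements"

        _, p_raw = stats.shapiro(x)                        # normality test
        print(f"raw data: Shapiro-Wilk p = {p_raw:.4f}")   # small p => not normal

        x_bc, lam = stats.boxcox(x)                        # Box-Cox transformation
        _, p_bc = stats.shapiro(x_bc)
        print(f"Box-Cox (lambda = {lam:.2f}): Shapiro-Wilk p = {p_bc:.4f}")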

  3. Quantum processes: A Whiteheadian interpretation of quantum field theory

    Science.gov (United States)

    Bain, Jonathan

    Quantum processes: A Whiteheadian interpretation of quantum field theory is an ambitious and thought-provoking exercise in physics and metaphysics, combining an erudite study of the very complex metaphysics of A.N. Whitehead with a well-informed discussion of contemporary issues in the philosophy of algebraic quantum field theory. Hättich's overall goal is to construct an interpretation of quantum field theory. He does this by translating key concepts in Whitehead's metaphysics into the language of algebraic quantum field theory. In brief, this Hättich-Whitehead (H-W, hereafter) interpretation takes "actual occasions" as the fundamental ontological entities of quantum field theory. An actual occasion is the result of two types of processes: a "transition process" in which a set of initial possibly-possessed properties for the occasion (in the form of "eternal objects") is localized to a space-time region; and a "concrescence process" in which a subset of these initial possibly-possessed properties is selected and actualized to produce the occasion. Essential to these processes is the "underlying activity", which conditions the way in which properties are initially selected and subsequently actualized. In short, under the H-W interpretation of quantum field theory, an initial set of possibly-possessed eternal objects is represented by a Boolean sublattice of the lattice of projection operators determined by a von Neumann algebra R (O) associated with a region O of Minkowski space-time, and the underlying activity is represented by a state on R (O) obtained by conditionalizing off of the vacuum state. The details associated with the H-W interpretation involve imposing constraints on these representations motivated by principles found in Whitehead's metaphysics. These details are spelled out in the three sections of the book. The first section is a summary and critique of Whitehead's metaphysics, the second section introduces the formalism of algebraic quantum field

  4. The field theory approach to percolation processes

    International Nuclear Information System (INIS)

    Janssen, Hans-Karl; Taeuber, Uwe C.

    2005-01-01

    We review the field theory approach to percolation processes. Specifically, we focus on the so-called simple and general epidemic processes that display continuous non-equilibrium active to absorbing state phase transitions whose asymptotic features are governed, respectively, by the directed (DP) and dynamic isotropic percolation (dIP) universality classes. We discuss the construction of a field theory representation for these Markovian stochastic processes based on fundamental phenomenological considerations, as well as from a specific microscopic reaction-diffusion model realization. Subsequently we explain how dynamic renormalization group (RG) methods can be applied to obtain the universal properties near the critical point in an expansion about the upper critical dimensions d_c = 4 (DP) and d_c = 6 (dIP). We provide a detailed overview of results for critical exponents, scaling functions, crossover phenomena, finite-size scaling, and also briefly comment on the influence of long-range spreading, the presence of a boundary, multispecies generalizations, coupling of the order parameter to other conserved modes, and quenched disorder

  5. Reclaiming life on one's own terms: a grounded theory study of the process of breast cancer survivorship.

    Science.gov (United States)

    Sherman, Deborah Witt; Rosedale, Mary; Haber, Judith

    2012-05-01

    To develop a substantive theory of the process of breast cancer survivorship. Grounded theory. A LISTSERV announcement posted on the SHARE Web site and purposeful recruitment of women known to be diagnosed and treated for breast cancer. 15 women diagnosed with early-stage breast cancer. Constant comparative analysis. Breast cancer survivorship. The core variable identified was Reclaiming Life on One's Own Terms. The perceptions and experiences of the participants revealed overall that the diagnosis of breast cancer was a turning point in life and the stimulus for change. That was followed by the recognition of breast cancer as now being a part of life, leading to the necessity of learning to live with breast cancer, and finally, creating a new life after breast cancer. Participants revealed that breast cancer survivorship is a process marked and shaped by time, the perception of support, and coming to terms with the trauma of a cancer diagnosis and the aftermath of treatment. The process of survivorship continues by assuming an active role in self-healing, gaining a new perspective and reconciling paradoxes, creating a new mindset and moving to a new normal, developing a new way of being in the world on one's own terms, and experiencing growth through adversity beyond survivorship. The process of survivorship for women with breast cancer is an evolutionary journey with short- and long-term challenges. This study shows the development of an empirically testable theory of survivorship that describes and predicts women's experiences following breast cancer treatment from the initial phase of recovery and beyond. The theory also informs interventions that not only reduce negative outcomes, but promote ongoing healing, adjustment, and resilience over time.

  6. Effect of normal processes on thermal conductivity of germanium ...

    Indian Academy of Sciences (India)

    Abstract. The effect of normal scattering processes is considered to redistribute the phonon momentum in (a) the same phonon branch – KK-S model and (b) between different phonon branches – KK-H model. Simplified thermal conductivity relations are used to estimate the thermal conductivity of germanium, silicon and ...

  7. Quantum theory of gauge fields and rigid processes calculation

    International Nuclear Information System (INIS)

    Andreev, I.V.

    1981-01-01

    The first part of the paper presents an elementary account of the basic facts about the nature of quark interactions and their role in high-energy processes. The second part deals with the gauge theory (GT) of strong interactions (chromodynamics, CD) and its application to the calculation of rigid processes with quark participation. It is based on the method of functional integration (MFI). A comparatively simple representation of the MFI in quantum theory and a formulation of the perturbation theory for gauge fields are given. A derivation of the rules of the diagram technique is presented. Renormalization invariance of the theory and the phenomenon of asymptotic freedom, basic for CD, are discussed. The application of the theory to the calculation of certain effects at high energies is considered. From the CD viewpoint, a parton model is considered, on the basis of which the ''rigid'' stage of evolution of quark and gluon jets produced at high energies can be quantitatively described, and some quantitative experimental tests of CD are suggested [ru]

  8. Communication: Density functional theory model for multi-reference systems based on the exact-exchange hole normalization.

    Science.gov (United States)

    Laqua, Henryk; Kussmann, Jörg; Ochsenfeld, Christian

    2018-03-28

    The correct description of multi-reference electronic ground states within Kohn-Sham density functional theory (DFT) requires an ensemble-state representation, employing fractionally occupied orbitals. However, the use of fractional orbital occupation leads to non-normalized exact-exchange holes, resulting in large fractional-spin errors for conventional approximative density functionals. In this communication, we present a simple approach to directly include the exact-exchange-hole normalization into DFT. Compared to conventional functionals, our model strongly improves the description for multi-reference systems, while preserving the accuracy in the single-reference case. We analyze the performance of our proposed method at the example of spin-averaged atoms and spin-restricted bond dissociation energy surfaces.

  10. Using Process Theory for Accumulating Project Management Knowledge : A Seven-Category Model

    NARCIS (Netherlands)

    Niederman, Fred; Mueller, Benjamin; March, Salvatore T.

    2018-01-01

    Process theory has become an important type of theory for the accumulation of knowledge in a number of disciplines. Process theory focuses on sequences of activities, their duration and the intervals between them, as they lead to particular outcomes. Thus it is particularly relevant to project

  11. Business Process Management Theory and Applications

    CERN Document Server

    2013-01-01

    Business Process Management (BPM) has been in existence for decades. It uses, complements, integrates and extends theories, methods and tools from other scientific disciplines like strategic management, information technology, managerial accounting, operations management etc. During this period the main focus themes of researchers and professionals in BPM were: business process modeling, business process analysis, activity based costing, business process simulation, performance measurement, workflow management, the link between information technology and BPM for process automation etc. More recently the focus moved to subjects like Knowledge Management, Enterprise Resource Planning (ERP) Systems, Service Oriented Architectures (SOAs), Process Intelligence (PI) and even Social Networks. In this collection of papers we present a review of the work and the outcomes achieved in the classic BPM fields as well as a deeper insight on recent advances in BPM. We present a review of business process modeling a...

  12. Dual-Process Theories of Reasoning: The Test of Development

    Science.gov (United States)

    Barrouillet, Pierre

    2011-01-01

    Dual-process theories have become increasingly influential in the psychology of reasoning. Though the distinction they introduced between intuitive and reflective thinking should have strong developmental implications, the developmental approach has rarely been used to refine or test these theories. In this article, I review several contemporary…

  13. Linear circuits, systems and signal processing: theory and application

    International Nuclear Information System (INIS)

    Byrnes, C.I.; Saeks, R.E.; Martin, C.F.

    1988-01-01

    In part because of its universal role as a first approximation of more complicated behaviour and in part because of the depth and breadth of its principal paradigms, the study of linear systems continues to play a central role in control theory and its applications. Enhancing more traditional applications to aerospace and electronics, application areas such as econometrics, finance, and speech and signal processing have contributed to a renaissance in areas such as realization theory and classical automatic feedback control. Thus, the last few years have witnessed a remarkable research effort expended in understanding both new algorithms and new paradigms for modeling and realization of linear processes and in the analysis and design of robust control strategies. The papers in this volume reflect these trends in both the theory and applications of linear systems and were selected from the invited and contributed papers presented at the 8th International Symposium on the Mathematical Theory of Networks and Systems held in Phoenix on June 15-19, 1987

  14. Teachers' Professional Ethics Cultivation and Ideological and Political Theory Courses for Normal University Students

    Institute of Scientific and Technical Information of China (English)

    冉静; 王京强; 冯晋

    2015-01-01

    Among the diverse educational functions of Ideological and Political Theory Courses, moral education is one of the more prominent. In their attributes, goals, and processes, these courses are closely related to the cultivation of normal university students' professional ethics, and they play an active and effective role in promoting it. This article expounds the significance of cultivating normal university students' professional ethics as teachers, and mainly discusses the role the Ideological and Political Theory Courses play in that cultivation.

  16. Anterior EEG asymmetries and opponent process theory.

    Science.gov (United States)

    Kline, John P; Blackhart, Ginette C; Williams, William C

    2007-03-01

    The opponent process theory of emotion [Solomon, R.L., and Corbit, J.D. (1974). An opponent-process theory of motivation: I. Temporal dynamics of affect. Psychological Review, 81, 119-143.] predicts a temporary reversal of emotional valence during the recovery from emotional stimulation. We hypothesized that this affective contrast would be apparent in asymmetrical activity patterns in the frontal lobes, and would be more apparent for left frontally active individuals. The present study tested this prediction by examining EEG asymmetries during and after blocked presentations of aversive pictures selected from the International Affective Picture System (IAPS). 12 neutral images, 12 aversive images, and 24 neutral images were presented in blocks. Participants who were right frontally active at baseline did not show changes in EEG asymmetry while viewing aversive slides or after cessation. Participants left frontally active at baseline, however, exhibited greater relative left frontal activity after aversive stimulation than before stimulation. Asymmetrical activity patterns in the frontal lobes may relate to affect regulatory processes, including contrasting opponent after-reactions to aversive stimuli.

  17. A steady state theory for processive cellulases

    DEFF Research Database (Denmark)

    Cruys-Bagger, Nicolaj; Olsen, Jens Elmerdahl; Præstgaard, Eigil

    2013-01-01

    ...'coefficient', which represents the probability of the enzyme dissociating from the substrate strand before completing n sequential catalytic steps, where n is the mean processivity number measured experimentally. Typical processive cellulases have high substrate affinity, and therefore this probability is low. ... This has significant kinetic implications, for example the maximal specific rate (Vmax/E0) for processive cellulases is much lower than the catalytic rate constant (kcat). We discuss how relationships based on this theory may be used in both comparative and mechanistic analyses of cellulases. ...
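
    To make the quoted dissociation coefficient concrete: under the simplifying assumption that the enzyme dissociates independently with the same probability q at each catalytic step, the chance of falling off before completing n steps is 1 - (1 - q)^n; the values of q and n below are invented, not taken from the paper.

        # Probability of dissociating before n catalytic steps complete,
        # assuming a constant, independent per-step dissociation probability q.
        q, n = 0.02, 50                      # made-up illustrative values
        p_dissociate = 1 - (1 - q) ** n
        print(f"P(dissociate before {n} steps) = {p_dissociate:.3f}")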

  18. Elements of the theory of Markov processes and their applications

    CERN Document Server

    Bharucha-Reid, A T

    2010-01-01

    This graduate-level text and reference in probability, with numerous applications to several fields of science, presents a non-measure-theoretic introduction to the theory of Markov processes. The work also covers mathematical models based on the theory, employed in various applied fields. Prerequisites are a knowledge of elementary probability theory, mathematical statistics, and analysis. Appendixes. Bibliographies. 1960 edition.

  19. Comparing Sensory Information Processing and Alexithymia between People with Substance Dependency and Normal.

    Science.gov (United States)

    Bashapoor, Sajjad; Hosseini-Kiasari, Seyyedeh Tayebeh; Daneshvar, Somayeh; Kazemi-Taskooh, Zeinab

    2015-01-01

    Sensory information processing and alexithymia are two important factors in determining behavioral reactions. Some studies explain the effect of the sensitivity of sensory processing and alexithymia on the tendency to substance abuse. Given that, the aim of the current study was to compare the styles of sensory information processing and alexithymia between substance-dependent people and normal ones. The research method was cross-sectional and the statistical population of the current study comprised all substance-dependent men present in substance quitting camps of Masal, Iran, in October 2013 (n = 78). 36 persons were selected randomly by simple random sampling from this population as the study group, and 36 persons were also selected among the normal population in the same way as the comparison group. Both groups were evaluated using the Toronto alexithymia scale (TAS) and the adult sensory profile, and the multivariate analysis of variance (MANOVA) test was applied to analyze the data. The results showed significant differences between the two groups in low registration (P < 0.05) and in difficulty describing emotions (P < 0.05), indicating that substance-dependent people process sensory information in a different way than normal people and show more alexithymia features than them.

  20. Familiarity Breeds Attempts: A Critical Review of Dual-Process Theories of Recognition.

    Science.gov (United States)

    Mandler, George

    2008-09-01

    Recognition memory and recall/recollection are the major divisions of the psychology of human memory. Theories of recognition have shifted from a "strength" approach to a dual-process view, which distinguishes between knowing that one has experienced an object before and knowing what it was. In this article, I discuss the history of this approach and the two processes of familiarity and recollection and locate their origin in pattern matching and organization. I evaluate various theories in terms of their basic requirements and their defining research and propose the extension of the original two process theory to domains such as pictorial recognition. Finally, I present the main phenomena that a dual-process theory of recognition must account for and discuss future needs and directions of research and development. © 2008 Association for Psychological Science.

  1. Dual-Process Theories of Higher Cognition: Advancing the Debate.

    Science.gov (United States)

    Evans, Jonathan St B T; Stanovich, Keith E

    2013-05-01

    Dual-process and dual-system theories in both cognitive and social psychology have been subjected to a number of recently published criticisms. However, they have been attacked as a category, incorrectly assuming there is a generic version that applies to all. We identify and respond to 5 main lines of argument made by such critics. We agree that some of these arguments have force against some of the theories in the literature but believe them to be overstated. We argue that the dual-processing distinction is supported by much recent evidence in cognitive science. Our preferred theoretical approach is one in which rapid autonomous processes (Type 1) are assumed to yield default responses unless intervened on by distinctive higher order reasoning processes (Type 2). What defines the difference is that Type 2 processing supports hypothetical thinking and loads heavily on working memory. © The Author(s) 2013.

  2. Moral Judgement in the Context of Dual-Process Theory

    OpenAIRE

    Schinková, Kristýna

    2017-01-01

    The philosopher and psychologist Joshua Greene came up with a theory of moral judgement that integrates both rationalism and intuitionism - the dual process theory. It says that during moral judgement the unconscious, emotional processes as well as the conscious, rational processes play an important role. At the same time it binds together the process and the respective moral output. If the judgement is made based on intuition, it will be of a deontological type and on the other hand the cont...

  3. A Short Review of the Theory of Hard Exclusive Processes

    International Nuclear Information System (INIS)

    Wallon, S.

    2012-01-01

    We first present an introduction to the theory of hard exclusive processes. We then illustrate this theory by a few selected examples. The last part is devoted to the most recent developments in the asymptotic energy limit. (author)

  4. M-momentum transfer between gravitons, membranes, and fivebranes as perturbative gauge theory processes

    International Nuclear Information System (INIS)

    Keski-Vakkuri, E.; Kraus, P.

    1998-01-01

    Polchinski and Pouliot have shown that M-momentum transfer between membranes in supergravity can be understood as a non-perturbative instanton effect in gauge theory. Here we consider a dual process: electric flux transmission between D-branes. We show that this process can be described in perturbation theory as virtual string pair creation, and is closely related to Schwinger's treatment of the pair creation of charged particles in a uniform electric field. Through the application of dualities, our perturbative calculation gives results for various non-perturbative amplitudes, including M-momentum transfer between gravitons, membranes and longitudinal fivebranes. Thus perturbation theory plus dualities are sufficient to demonstrate agreement between supergravity and gauge theory for a number of M-momentum transferring processes. A variety of other processes where branes are transmitted between branes, e.g. (p,q)-string transmission in IIB theory, can also be studied. We discuss the implications of our results for proving the eleven-dimensional Lorentz invariance of matrix theory. (orig.)

  5. Log-normal distribution from a process that is not multiplicative but is additive.

    Science.gov (United States)

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
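
    The claim is straightforward to probe numerically; the Python sketch below (with arbitrarily chosen heavy-tailed positive summands and sample sizes, not the paper's analytical example) compares the distribution of the sum against fitted normal and log-normal alternatives.

        # Sum positive heavy-tailed variables and compare fits; since both
        # candidate distributions are fitted to the same data, the KS p-values
        # are only indicative, but the contrast is usually stark.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        sums = rng.lognormal(0.0, 1.5, size=(20000, 30)).sum(axis=1)

        _, p_norm = stats.kstest(sums, 'norm', args=(sums.mean(), sums.std()))
        shape, loc, scale = stats.lognorm.fit(sums)
        _, p_lognorm = stats.kstest(sums, 'lognorm', args=(shape, loc, scale))
        print(f"KS p-value vs normal:     {p_norm:.2e}")
        print(f"KS p-value vs log-normal: {p_lognorm:.2e}")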

  6. Asymptotic theory of weakly dependent random processes

    CERN Document Server

    Rio, Emmanuel

    2017-01-01

    Presenting tools to aid understanding of asymptotic theory and weakly dependent processes, this book is devoted to inequalities and limit theorems for sequences of random variables that are strongly mixing in the sense of Rosenblatt, or absolutely regular. The first chapter introduces covariance inequalities under strong mixing or absolute regularity. These covariance inequalities are applied in Chapters 2, 3 and 4 to moment inequalities, rates of convergence in the strong law, and central limit theorems. Chapter 5 concerns coupling. In Chapter 6 new deviation inequalities and new moment inequalities for partial sums via the coupling lemmas of Chapter 5 are derived and applied to the bounded law of the iterated logarithm. Chapters 7 and 8 deal with the theory of empirical processes under weak dependence. Lastly, Chapter 9 describes links between ergodicity, return times and rates of mixing in the case of irreducible Markov chains. Each chapter ends with a set of exercises. The book is an updated and extended ...

  7. Modal analysis of inter-area oscillations using the theory of normal modes

    Energy Technology Data Exchange (ETDEWEB)

    Betancourt, R.J. [School of Electromechanical Engineering, University of Colima, Manzanillo, Col. 28860 (Mexico); Barocio, E. [CUCEI, University of Guadalajara, Guadalajara, Jal. 44480 (Mexico); Messina, A.R. [Graduate Program in Electrical Engineering, Cinvestav, Guadalajara, Jal. 45015 (Mexico); Martinez, I. [State Autonomous University of Mexico, Toluca, Edo. Mex. 50110 (Mexico)

    2009-04-15

    Based on the notion of normal modes in mechanical systems, a method is proposed for the analysis and characterization of oscillatory processes in power systems. The method is based on the property of invariance of modal subspaces and can be used to represent complex power system modal behavior by a set of decoupled, two-degree-of-freedom nonlinear oscillator equations. Using techniques from nonlinear mechanics, a new approach is outlined, for determining the normal modes (NMs) of motion of a general n-degree-of-freedom nonlinear system. Equations relating the normal modes and the physical velocities and displacements are developed from the linearized system model and numerical issues associated with the application of the technique are discussed. In addition to qualitative insight, this method can be utilized in the study of nonlinear behavior and bifurcation analyses. The application of these procedures is illustrated on a planning model of the Mexican interconnected system using a quadratic nonlinear model. Specifically, the use of normal mode analysis as a basis for identifying modal parameters, including natural frequencies and damping ratios of general, linear systems with n degrees of freedom is discussed. Comparisons to conventional linear analysis techniques demonstrate the ability of the proposed technique to extract the different oscillation modes embedded in the oscillation. (author)
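
    The linear core of such a modal analysis, reading natural frequencies and damping ratios off the eigenvalues of the linearized state matrix, can be sketched in a few lines; the 4x4 matrix below is a toy two-oscillator example, not a model of the Mexican interconnected system.

        # Modal frequencies and damping ratios from a linearized state matrix.
        import numpy as np

        A = np.array([[ 0.0,  1.0,  0.0,  0.0],    # states: x1, v1, x2, v2
                      [-2.0, -0.1,  1.0,  0.0],
                      [ 0.0,  0.0,  0.0,  1.0],
                      [ 1.0,  0.0, -3.0, -0.2]])

        for lam in np.linalg.eigvals(A):
            if lam.imag > 0:                        # one per conjugate pair
                freq = abs(lam) / (2 * np.pi)       # natural frequency, Hz
                zeta = -lam.real / abs(lam)         # damping ratio
                print(f"mode: f = {freq:.3f} Hz, zeta = {zeta:.3f}")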

  8. Toward a Theory of Entrepreneurial Rents: a Simulation of the Market Process

    NARCIS (Netherlands)

    Keyhani, M; Levesque, M.; Madhok, A.

    2015-01-01

    While strategy theory relies heavily on equilibrium theories of economic rents such as Ricardian and monopoly rents, we do not yet have a comprehensive theory of disequilibrium or entrepreneurial rents. We use cooperative game theory to structure computer simulations of the market process in which

  9. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily of random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  10. Improving the requirements process in Axiomatic Design Theory

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn

    2013-01-01

    This paper introduces a model to integrate the traditional requirements process into Axiomatic Design Theory and proposes a method to structure the requirements process. The method includes a requirements classification system to ensure that all requirements information can be included in the Axiomatic Design process, a stakeholder classification system to reduce the chances of excluding one or more key stakeholders, and a table to visualize the mapping between the stakeholders and their requirements....

  11. Biomechanical Analysis of Normal Brain Development during the First Year of Life Using Finite Strain Theory.

    Science.gov (United States)

    Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili

    2016-12-02

    The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and shear strains) were measured to reveal directional growth information every 3 months during the first year of life. Directional normal strain maps revealed that, during the first 6 months, the growth pattern of gray matter is anisotropic and spatially inhomogeneous with higher left-right stretch around the temporal lobe and interhemispheric fissure, anterior-posterior stretch in the frontal and occipital lobes, and superior-inferior stretch in right inferior occipital and right inferior temporal gyri. In contrast, anterior lateral ventricles and insula showed an isotropic stretch pattern. Volumetric and directional growth rates were linearly decreased with age for most of the cortical regions. Our results revealed anisotropic and inhomogeneous brain growth patterns of the human brain during the first year of life using longitudinal MRI and a biomechanical framework.
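
    The two strain measures named in the abstract reduce to a few lines of linear algebra: for a deformation gradient F, the Jacobian determinant det(F) gives the local volume change, and the Green-Lagrange tensor E = (F^T F - I)/2 holds normal strains on its diagonal and shear strains off it. The F below is a made-up example, not one of the study's image-derived fields.

        # Volume change and Lagrange strains from a deformation gradient.
        import numpy as np

        F = np.array([[1.10, 0.02, 0.00],      # illustrative deformation gradient
                      [0.00, 1.05, 0.01],
                      [0.00, 0.00, 0.98]])

        J = np.linalg.det(F)                    # > 1 means local volume growth
        E = 0.5 * (F.T @ F - np.eye(3))         # Green-Lagrange strain tensor
        print(f"Jacobian determinant: {J:.3f}")
        print("normal strains:", np.round(np.diag(E), 4))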

  12. Relating Memory To Functional Performance In Normal Aging to Dementia Using Hierarchical Bayesian Cognitive Processing Models

    Science.gov (United States)

    Shankle, William R.; Pooley, James P.; Steyvers, Mark; Hara, Junko; Mangrola, Tushar; Reisberg, Barry; Lee, Michael D.

    2012-01-01

    Determining how cognition affects functional abilities is important in Alzheimer’s disease and related disorders (ADRD). 280 patients (normal or ADRD) received a total of 1,514 assessments using the Functional Assessment Staging Test (FAST) procedure and the MCI Screen (MCIS). A hierarchical Bayesian cognitive processing (HBCP) model was created by embedding a signal detection theory (SDT) model of the MCIS delayed recognition memory task into a hierarchical Bayesian framework. The SDT model used latent parameters of discriminability (memory process) and response bias (executive function) to predict, simultaneously, recognition memory performance for each patient and each FAST severity group. The observed recognition memory data did not distinguish the six FAST severity stages, but the latent parameters completely separated them. The latent parameters were also used successfully to transform the ordinal FAST measure into a continuous measure reflecting the underlying continuum of functional severity. HBCP models applied to recognition memory data from clinical practice settings accurately translated a latent measure of cognition to a continuous measure of functional severity for both individuals and FAST groups. Such a translation links two levels of brain information processing, and may enable more accurate correlations with other levels, such as those characterized by biomarkers. PMID:22407225
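
    The signal detection core of the model, which separates a memory process (discriminability) from an executive process (response bias), can be illustrated with the standard equal-variance SDT estimates; the recognition counts below are hypothetical, and the sketch omits the hierarchical Bayesian layer the authors build on top.

        # Equal-variance signal detection estimates from recognition counts.
        from scipy.stats import norm

        hits, misses = 42, 8                    # hypothetical "old" item counts
        fas, crs = 12, 38                       # false alarms, correct rejections

        h = hits / (hits + misses)              # hit rate
        f = fas / (fas + crs)                   # false-alarm rate
        d_prime = norm.ppf(h) - norm.ppf(f)     # discriminability (memory)
        c = -0.5 * (norm.ppf(h) + norm.ppf(f))  # response bias (executive)
        print(f"d' = {d_prime:.2f}, c = {c:.2f}")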

  13. Concreteness effects in semantic processing: ERP evidence supporting dual-coding theory.

    Science.gov (United States)

    Kounios, J; Holcomb, P J

    1994-07-01

    Dual-coding theory argues that processing advantages for concrete over abstract (verbal) stimuli result from the operation of 2 systems (i.e., imaginal and verbal) for concrete stimuli, rather than just 1 (for abstract stimuli). These verbal and imaginal systems have been linked with the left and right hemispheres of the brain, respectively. Context-availability theory argues that concreteness effects result from processing differences in a single system. The merits of these theories were investigated by examining the topographic distribution of event-related brain potentials in 2 experiments (lexical decision and concrete-abstract classification). The results were most consistent with dual-coding theory. In particular, different scalp distributions of an N400-like negativity were elicited by concrete and abstract words.

  14. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  15. Continuous-time Markov decision processes theory and applications

    CERN Document Server

    Guo, Xianping

    2009-01-01

    This volume provides the first book entirely devoted to recent developments on the theory and applications of continuous-time Markov decision processes (MDPs). The MDPs presented here include most of the cases that arise in applications.

  16. A Grounded Theory of the Process of Spiritual Change Among Homicide Survivors.

    Science.gov (United States)

    Johnson, Shannon K; Zitzmann, Brooks

    2018-01-01

    Grounded theory was used to generate a mid-range theory of the process of spiritual change in the lives of survivors of homicide victims. Theoretical sampling guided the selection of 30 participants from a larger study of spiritual change after homicide (N = 112). Individual interviews were analyzed using a four-step sequence of line-by-line, focused, axial, and selective coding. Analysis generated a closed theory consisting of three fluid, consecutive but nonlinear stages. Each stage consisted of an overarching process and a state of being in the world: (a) Disintegrating: living in a state of shock; (b) Reckoning: living in a state of stagnation; (c) Recreating and reintegrating the self: living in a state of renewal. Movement through the stages was fueled by processes of spiritual connection that yielded changes that permeated the theory. Findings can be used to help practitioners address the processes that drive spiritual change in the lives of homicide survivors.

  17. [Quantitative analysis method based on fractal theory for medical imaging of normal brain development in infants].

    Science.gov (United States)

    Li, Heheng; Luo, Liangping; Huang, Li

    2011-02-01

    The present paper aims to study the fractal spectrum of cerebral computerized tomography in 158 normal infants of different age groups, based on calculations from chaos theory. The distribution range in the neonatal period was 1.88-1.90 (mean = 1.8913 +/- 0.0064); it reached a stable condition at the level of 1.89-1.90 during 1-12 months of age (mean = 1.8927 +/- 0.0045); the normal range for 1-2 year old infants was 1.86-1.90 (mean = 1.8863 +/- 0.0085); and the value remained invariant within 1.88-1.91 (mean = 1.8958 +/- 0.0083) during 2-3 years of age. ANOVA indicated no significant difference between boys and girls (F = 0.243, P > 0.05), but the difference among age groups was significant (F = 8.947, P < 0.05), consistent with normal brain development.

  18. Asymptotic theory for Brownian semi-stationary processes with application to turbulence

    DEFF Research Database (Denmark)

    Corcuera, José Manuel; Hedevang, Emil; Pakkanen, Mikko S.

    2013-01-01

    This paper presents some asymptotic results for statistics of Brownian semi-stationary (BSS) processes. More precisely, we consider power variations of BSS processes, which are based on high frequency (possibly higher order) differences of the BSS model. We review the limit theory discussed ..., which allows one to obtain a valid central limit theorem for the critical region. Finally, we apply our statistical theory to turbulence data. ...

  19. Normalization and gene p-value estimation: issues in microarray data processing.

    Science.gov (United States)

    Fundel, Katrin; Küffner, Robert; Aigner, Thomas; Zimmer, Ralf

    2008-05-28

    Numerous methods exist for basic processing, e.g. normalization, of microarray gene expression data. These methods have an important effect on the final analysis outcome. Therefore, it is crucial to select methods appropriate for a given dataset in order to assure the validity and reliability of expression data analysis. Furthermore, biological interpretation requires expression values for genes, which are often represented by several spots or probe sets on a microarray. How to best integrate spot/probe set values into gene values has so far been a somewhat neglected problem. We present a case study comparing different between-array normalization methods with respect to the identification of differentially expressed genes. Our results show that it is feasible and necessary to use prior knowledge on gene expression measurements to select an adequate normalization method for the given data. Furthermore, we provide evidence that combining spot/probe set p-values into gene p-values for detecting differentially expressed genes has advantages compared to combining expression values for spots/probe sets into gene expression values. The comparison of different methods suggests to use Stouffer's method for this purpose. The study has been conducted on gene expression experiments investigating human joint cartilage samples of osteoarthritis related groups: a cDNA microarray (83 samples, four groups) and an Affymetrix (26 samples, two groups) data set. The apparently straight forward steps of gene expression data analysis, e.g. between-array normalization and detection of differentially regulated genes, can be accomplished by numerous different methods. We analyzed multiple methods and the possible effects and thereby demonstrate the importance of the single decisions taken during data processing. We give guidelines for evaluating normalization outcomes. An overview of these effects via appropriate measures and plots compared to prior knowledge is essential for the biological
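
    Stouffer's method, which the study recommends for combining spot/probe-set p-values into gene-level p-values, is compact enough to sketch directly; the p-values below are invented and equal weights are assumed.

        # Stouffer's method: z_i = Phi^-1(1 - p_i), Z = sum(z_i) / sqrt(k).
        import numpy as np
        from scipy.stats import norm

        def stouffer(pvals):
            z = norm.ppf(1.0 - np.asarray(pvals))
            Z = z.sum() / np.sqrt(len(pvals))
            return 1.0 - norm.cdf(Z)            # combined one-sided p-value

        print(f"{stouffer([0.04, 0.10, 0.02]):.4f}")   # three spots, one gene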

  20. [Description of clinical thinking by the dual-process theory].

    Science.gov (United States)

    Peña G, Luis

    2012-06-01

    Clinical thinking is a very complex process that can be described by the dual-process theory: it has an intuitive part (that recognizes patterns) and an analytical part (that tests hypotheses). It is vulnerable to cognitive biases that professionals must be aware of in order to minimize diagnostic errors.

  1. Information Architecture without Internal Theory: An Inductive Design Process.

    Science.gov (United States)

    Haverty, Marsha

    2002-01-01

    Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…

  2. Theory of suppressing avalanche process of carrier in short pulse laser irradiated dielectrics

    Energy Technology Data Exchange (ETDEWEB)

    Deng, H. X., E-mail: hxdeng@uestc.edu.cn, E-mail: xtzu@uestc.edu.cn, E-mail: kaisun@umich.edu; Zu, X. T., E-mail: hxdeng@uestc.edu.cn, E-mail: xtzu@uestc.edu.cn, E-mail: kaisun@umich.edu; Xiang, X. [School of Physical Electronics, University of Electronic Science and Technology of China, Chengdu 610054 (China); Zheng, W. G.; Yuan, X. D. [Research Center of Laser Fusion, China Academy of Engineering Physics, Mianyang 621900 (China); Sun, K., E-mail: hxdeng@uestc.edu.cn, E-mail: xtzu@uestc.edu.cn, E-mail: kaisun@umich.edu [Department of Materials Engineering and Sciences, University of Michigan, 413B Space Research Building, Ann Arbor, Michigan 48109-2143 (United States); Gao, F. [Pacific Northwest National Laboratory, P. O. Box 999, Richland, Washington 99352 (United States)

    2014-05-28

    A theory for controlling the avalanche process of carriers during short-pulse laser irradiation is proposed. We show that the avalanche process of conduction band electrons (CBEs) is determined by the occupation number of phonons in dielectrics. The theory provides a way to suppress the avalanche process and a direct judgment of the contributions of the avalanche process and the photon ionization process to the generation of CBEs. The obtained temperature-dependent rate equation shows that the laser induced damage threshold of dielectrics, e.g., fused silica, increases nonlinearly as temperature decreases. The present theory predicts a new approach to improve the laser induced damage threshold of dielectrics.

  3. Event-related potential evidence for the processing efficiency theory.

    Science.gov (United States)

    Murray, N P; Janelle, C M

    2007-01-15

    The purpose of this study was to examine the central tenets of the processing efficiency theory using psychophysiological measures of attention and effort. Twenty-eight participants were divided equally into either a high or low trait anxiety group. They were then required to perform a simulated driving task while responding to one of four target light-emitting diodes. Cortical activity and dual task performance were recorded under two conditions -- baseline and competition -- with cognitive anxiety being elevated in the competitive session by an instructional set. Although driving speed was similar across sessions, a reduction in P3 amplitude to cue onset in the light detection task occurred for both groups during the competitive session, suggesting a reduction in processing efficiency as participants became more state anxious. Our findings provide more comprehensive and mechanistic evidence for processing efficiency theory, and confirm that increases in cognitive anxiety can result in a reduction of processing efficiency with little change in performance effectiveness.

  4. Exploiting attractiveness in persuasion: senders' implicit theories about receivers' processing motivation.

    Science.gov (United States)

    Vogel, Tobias; Kutzner, Florian; Fiedler, Klaus; Freytag, Peter

    2010-06-01

    Previous research suggests a positive correlation between physical attractiveness and the expectation of positive outcomes in social interactions, such as successful persuasion. However, prominent persuasion theories do not imply a general advantage of attractive senders. Instead, the persuasion success should vary with the receivers' processing motivation and processing capacity. Focusing on the perspective of the sender, the authors elaborate on lay theories about how attractiveness affects persuasion success. They propose that lay theories (a) match scientific models in that they also comprise the interaction of senders' attractiveness and receivers' processing characteristics, (b) guide laypersons' anticipation of persuasion success, and (c) translate into strategic behavior. They show that anticipated persuasion success depends on the interplay of perceived attractiveness and expectations about receivers' processing motivation (Experiment 1 and 2). Further experiments show that laypersons strategically attempt to exploit attractiveness in that they approach situations (Experiment 3) and persons (Experiment 4) that promise persuasion success.

  5. The experience of weight management in normal weight adults.

    Science.gov (United States)

    Hernandez, Cheri Ann; Hernandez, David A; Wellington, Christine M; Kidd, Art

    2016-11-01

    No prior research has examined normal weight persons' experience of weight management specifically. The purpose of this research was to discover the experience of weight management in normal weight individuals. Glaserian grounded theory was used. Qualitative data (focus group) and quantitative data (food diary, study questionnaire, and anthropometric measures) were collected. Weight management was an ongoing process of trying to focus on living (family, work, and social) while maintaining normal weight targets through five consciously and unconsciously used strategies. Although participants maintained normal weights, the nutritional composition of the foods they ate was grossly inadequate. These five strategies can be used to develop new weight management strategies that could be integrated into existing weight management programs, or could be developed into novel weight management interventions. Surprisingly, normal weight individuals require dietary assessment and nutrition education to prevent future negative health consequences. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Theory of the dissociation process, ch. 1

    International Nuclear Information System (INIS)

    Asselt, N.P.F.B. van

    1976-01-01

    The formalism of Moeller operators and channel Hamiltonians, originating from scattering theory, is used for the description of the dissociation process. The proper choice of the initial state wave function is discussed. A method is given which accounts for the symmetry requirements which appear in the case of a homonuclear molecule where identical particles are present

  7. Understanding the coping process from a self-determination theory perspective.

    Science.gov (United States)

    Ntoumanis, Nikos; Edmunds, Jemma; Duda, Joan L

    2009-05-01

    To explore conceptual links between the cognitive-motivational-relational theory (CMRT) of coping (Lazarus, 1991) and self-determination theory (SDT) of motivation (Deci & Ryan, 1985). We present a very brief overview of the two theories. We also discuss how components from the two theories can be examined together to facilitate research in the health/exercise domain. To this effect, we offer a preliminary integrated model of stress, coping, and motivation, based on the two aforementioned theories, in an attempt to illustrate and instigate research on how motivational factors are implicated in the coping process. We believe that the proposed model can serve as a platform for generating new research ideas which, besides their theoretical relevance, may have important applied implications.

  8. Dual-process models of health-related behaviour and cognition: a review of theory.

    Science.gov (United States)

    Houlihan, S

    2018-03-01

    The aim of this review was to synthesise a spectrum of theories incorporating dual-process models of health-related behaviour. Review of theory, adapted loosely from Cochrane-style systematic review methodology. Inclusion criteria were specified to identify all relevant dual-process models that explain decision-making in the context of decisions made about human health. Data analysis took the form of iterative template analysis (adapted from the conceptual synthesis framework used in other reviews of theory), and in this way theories were synthesised on the basis of shared theoretical constructs and causal pathways. Analysis and synthesis proceeded in turn, instead of moving uni-directionally from analysis of individual theories to synthesis of multiple theories. Namely, the reviewer considered and reconsidered individual theories and theoretical components in generating the narrative synthesis' main findings. Drawing on systematic review methodology, 11 electronic databases were searched for relevant dual-process theories. After de-duplication, 12,198 records remained. Screening of title and abstract led to the exclusion of 12,036 records, after which 162 full-text records were assessed. Of those, 21 records were included in the review. Moving back and forth between analysis of individual theories and the synthesis of theories grouped on the basis of theme or focus yielded additional insights into the orientation of a theory to an individual. Theories could be grouped in part on their treatment of an individual as an irrational actor, as social actor, as actor in a physical environment or as a self-regulated actor. Synthesising identified theories into a general dual-process model of health-related behaviour indicated that such behaviour is the result of both propositional and unconscious reasoning driven by an individual's response to internal cues (such as heuristics, attitude and affect), physical cues (social and physical environmental stimuli) as well as

  9. Performances on a cognitive theory of mind task: specific decline or general cognitive deficits? Evidence from normal aging.

    Science.gov (United States)

    Fliss, Rafika; Lemerre, Marion; Mollard, Audrey

    2016-06-01

    Compromised theory of mind (ToM) can be explained either by a failure to implement specific representational capacities (mental state representations) or by more general executive selection demands. In older adult populations, evidence supporting affected executive functioning and cognitive ToM in normal aging has been reported. However, links between these two functions remain unclear. In the present paper, we address these shortcomings by using a specific task of ToM and classical executive tasks. We studied, using an original cognitive ToM task, the effect of age on ToM performance, in link with the progressive executive decline. 96 elderly participants were recruited. They were asked to perform a cognitive ToM task and 5 executive tests (the Stroop test and the Hayling Sentence Completion Test to assess inhibitory processes, the Trail Making Test and Verbal Fluency for shifting assessment, and backward span to estimate working memory capacity). The results show changes in cognitive ToM performance according to executive demands. Correlational studies indicate a significant relationship between ToM performance and the selected executive measures. Regression analyses demonstrate that level of vocabulary and age are the best predictors of ToM performance. The results are consistent with the hypothesis that ToM deficits are related to age-related domain-general decline rather than to a breakdown in a specialized representational system. The implications of these findings for the nature of social cognition tests in normal aging are also discussed.

  10. Introduction to Measure Theory and Integration

    CERN Document Server

    Ambrosio, Luigi; Mennucci, Andrea

    2011-01-01

    This textbook collects the notes for an introductory course in measure theory and integration. The course was taught by the authors to undergraduate students of the Scuola Normale Superiore, in the years 2000-2011. The goal of the course was to present, in a quick but rigorous way, the modern point of view on measure theory and integration, putting Lebesgue's Euclidean space theory into a more general context and presenting the basic applications to Fourier series, calculus and real analysis. The text can also pave the way to more advanced courses in probability, stochastic processes or geomet

  11. Frames and operator theory in analysis and signal processing

    CERN Document Server

    Larson, David R; Nashed, Zuhair; Nguyen, Minh Chuong; Papadakis, Manos

    2008-01-01

    This volume contains articles based on talks presented at the Special Session Frames and Operator Theory in Analysis and Signal Processing, held in San Antonio, Texas, in January of 2006. Recently, the field of frames has undergone tremendous advancement. Most of the work in this field is focused on the design and construction of more versatile frames and frames tailored towards specific applications, e.g., finite dimensional uniform frames for cellular communication. In addition, frames are now becoming a hot topic in mathematical research as a part of many engineering applications, e.g., matching pursuits and greedy algorithms for image and signal processing. Topics covered in this book include: Application of several branches of analysis (e.g., PDEs; Fourier, wavelet, and harmonic analysis; transform techniques; data representations) to industrial and engineering problems, specifically image and signal processing. Theoretical and applied aspects of frames and wavelets. Pure aspects of operator theory empha...

  12. The Relevance of Theories of the Policy Process to Educational Decision-Making.

    Science.gov (United States)

    Ryan, R. J.

    1985-01-01

    Two case studies of educational decision making are used to test the utility of some current theories of the policy-formation process; a framework for the application of these theories is proposed; and the merits of applying existing theories before seeking new paradigms are stressed. (MSE)

  13. A Numerical Theory for Impedance Eduction in Three-Dimensional Normal Incidence Tubes

    Science.gov (United States)

    Watson, Willie R.; Jones, Michael G.

    2016-01-01

    A method for educing the locally-reacting acoustic impedance of a test sample mounted in a 3-D normal incidence impedance tube is presented and validated. The unique feature of the method is that the excitation frequency (or duct geometry) may be such that high-order duct modes may exist. The method educes the impedance, iteratively, by minimizing an objective function consisting of the difference between the measured and numerically computed acoustic pressure at preselected measurement points in the duct. The method is validated on planar and high-order mode sources with data synthesized from exact mode theory. These data are then subjected to random jitter to simulate the effects of measurement uncertainties on the educed impedance spectrum. The primary conclusions of the study are 1) without random jitter, the educed impedance is in excellent agreement with that of known impedance samples, and 2) random jitter comparable to that found in a typical experiment has minimal impact on the accuracy of the educed impedance.
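
    A minimal sketch of the eduction procedure described above, in Python: the duct model here is a stand-in single-mode plane-wave field (the paper's 3-D high-order-mode solver is not reproduced), and all names and numeric values are illustrative assumptions.

        # Impedance eduction by objective-function minimization (sketch).
        # The propagation model is a simple plane-wave tube terminated by a
        # normalized impedance z; the paper's high-order-mode solver is not shown.
        import numpy as np
        from scipy.optimize import minimize

        k = 2 * np.pi * 1000 / 343.0          # free-space wavenumber at 1 kHz
        x = np.linspace(0.0, 0.5, 8)          # assumed microphone positions (m)

        def pressure(z, x):
            """Plane-wave field in a tube terminated by normalized impedance z."""
            R = (z - 1) / (z + 1)             # reflection coefficient
            return np.exp(-1j * k * x) + R * np.exp(1j * k * x)

        z_true = 1.8 - 0.7j                   # "unknown" sample impedance
        p_meas = pressure(z_true, x)
        # add random jitter to simulate measurement uncertainty
        p_meas = p_meas + 0.01 * (np.random.randn(x.size) + 1j * np.random.randn(x.size))

        def objective(v):
            """Sum of squared differences between measured and modeled pressure."""
            return np.sum(np.abs(p_meas - pressure(v[0] + 1j * v[1], x)) ** 2)

        res = minimize(objective, x0=[1.0, 0.0], method="Nelder-Mead")
        print("educed impedance:", res.x[0], "+", res.x[1], "j")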

  14. Dynamic interracial/intercultural processes: the role of lay theories of race.

    Science.gov (United States)

    Hong, Ying-yi; Chao, Melody Manchi; No, Sun

    2009-10-01

    This paper explores how the lay theory approach provides a framework beyond previous stereotype/prejudice research to understand dynamic personality processes in interracial/ethnic contexts. The authors conceptualize theory of race within the Cognitive-Affective Personality System (CAPS), in which lay people's beliefs regarding the essential nature of race sets up a mind-set through which individuals construe and interpret their social experiences. The research findings illustrate that endorsement of the essentialist theory (i.e., that race reflects deep-seated, inalterable essence and is indicative of traits and ability) versus the social constructionist theory (i.e., that race is socially constructed, malleable, and arbitrary) are associated with different encoding and representation of social information, which in turn affect feelings, motivation, and competence in navigating between racial and cultural boundaries. These findings shed light on dynamic interracial/intercultural processes. Relations of this approach to CAPS are discussed.

  15. Theory of mind for processing unexpected events across contexts.

    Science.gov (United States)

    Dungan, James A; Stepanovic, Michael; Young, Liane

    2016-08-01

    Theory of mind, or mental state reasoning, may be particularly useful for making sense of unexpected events. Here, we investigated unexpected behavior across both social and non-social contexts in order to characterize the precise role of theory of mind in processing unexpected events. We used functional magnetic resonance imaging to examine how people respond to unexpected outcomes when initial expectations were based on (i) an object's prior behavior, (ii) an agent's prior behavior and (iii) an agent's mental states. Consistent with prior work, brain regions for theory of mind were preferentially recruited when people first formed expectations about social agents vs non-social objects. Critically, unexpected vs expected outcomes elicited greater activity in dorsomedial prefrontal cortex, which also discriminated in its spatial pattern of activity between unexpected and expected outcomes for social events. In contrast, social vs non-social events elicited greater activity in precuneus across both expected and unexpected outcomes. Finally, given prior information about an agent's behavior, unexpected vs expected outcomes elicited an especially robust response in right temporoparietal junction, and the magnitude of this difference across participants correlated negatively with autistic-like traits. Together, these findings illuminate the distinct contributions of brain regions for theory of mind for processing unexpected events across contexts. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  16. Algorithmic information theory mathematics of digital information processing

    CERN Document Server

    Seibt, Peter

    2007-01-01

    Treats the Mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.

  17. Validity and Reliability of Published Comprehensive Theory of Mind Tests for Normal Preschool Children: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Seyyede Zohreh Ziatabar Ahmadi

    2015-12-01

    Full Text Available Objective: Theory of mind (ToM) or mindreading is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. Method: We searched MEDLINE (PubMed interface), Web of Science, ScienceDirect, PsycINFO, and evidence-based medicine (The Cochrane Library) databases from 1990 to June 2015. The search strategy was the Latin transcription of 'Theory of Mind' AND test AND children. We also manually studied the reference lists of all retrieved articles and searched their references. Inclusion criteria were as follows: valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children. Exclusion criteria were as follows: studies that only used ToM tests and single tasks (false belief tasks) for ToM assessment and/or had no description of the structure, validity or reliability of their tests. Methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). Result: In the primary search, we found 1237 articles across all databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. Conclusion: There were few valid, reliable and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles. The described ToM tests differed in populations, tasks, modes of presentation, scoring, modes of response, timing and other variables. They also had various validities and reliabilities. Therefore, it is recommended that researchers and clinicians select ToM tests according to their psychometric ...

  18. Theory and praxis of map analysis in CHEF part 1: Linear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /Fermilab

    2008-10-01

    This memo begins a series which, put together, could comprise the 'CHEF Documentation Project' if there were such a thing. The first--and perhaps only--three will telegraphically describe theory, algorithms, implementation and usage of the normal form map analysis procedures encoded in CHEF's collection of libraries. [1] This one will begin the sequence by explaining the linear manipulations that connect the Jacobian matrix of a symplectic mapping to its normal form. It is a 'Reader's Digest' version of material I wrote in Intermediate Classical Dynamics (ICD) [2] and randomly scattered across technical memos, seminar viewgraphs, and lecture notes for the past quarter century. Much of its content is old, well known, and in some places borders on the trivial. Nevertheless, completeness requires their inclusion. The primary objective is the 'fundamental theorem' on normalization written on page 8. I plan to describe the nonlinear procedures in a subsequent memo and devote a third to laying out algorithms and lines of code, connecting them with equations written in the first two. Originally this was to be done in one short paper, but I jettisoned that approach after its first section exceeded a dozen pages. The organization of this document is as follows. A brief description of notation is followed by a section containing a general treatment of the linear problem. After the 'fundamental theorem' is proved, two further subsections discuss the generation of equilibrium distributions and the issue of 'phase'. The final major section reviews parameterizations--that is, lattice functions--in two and four dimensions with a passing glance at the six-dimensional version. Appearances to the contrary, for the most part I have tried to restrict consideration to matters needed to understand the code in CHEF's libraries.
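
    A minimal numerical sketch of the linear normal-form idea, assuming the standard Courant-Snyder parameterization of a 2x2 one-turn matrix; the lattice values below are illustrative, and CHEF's own routines (not reproduced here) handle the general coupled case.

        # Linear normal form in 2-D (sketch): a symplectic one-turn matrix M is
        # conjugate to a pure rotation through the phase advance mu, with
        # cos(mu) = Tr(M)/2 and the lattice functions read off the elements.
        import numpy as np

        alpha, beta, mu = 0.5, 4.0, 2 * np.pi * 0.31   # assumed lattice values
        gamma = (1 + alpha**2) / beta
        M = np.array([
            [np.cos(mu) + alpha * np.sin(mu), beta * np.sin(mu)],
            [-gamma * np.sin(mu),             np.cos(mu) - alpha * np.sin(mu)],
        ])

        assert abs(np.linalg.det(M) - 1.0) < 1e-12     # symplectic in 2-D <=> det = 1

        mu_rec = np.arccos(np.trace(M) / 2)            # phase advance from the trace
        beta_rec = M[0, 1] / np.sin(mu_rec)            # beta function from M12
        print(mu_rec / (2 * np.pi), beta_rec)          # fractional tune 0.31, beta 4.0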

  19. Postmortem abdominal CT: Assessing normal cadaveric modifications and pathological processes

    International Nuclear Information System (INIS)

    Charlier, P.; Carlier, R.; Roffi, F.; Ezra, J.; Chaillot, P.F.; Duchat, F.; Huynh-Charlier, I.; Lorin de la Grandmaison, G.

    2012-01-01

    Purpose: To investigate the value of postmortem non-enhanced computed tomography (CT) for abdominal lesions in a forensic context of suspicious death, and to list the different cadaveric modifications normally occurring in the abdomen, which must be known to non-forensic radiologists performing any postmortem exam. Materials and methods: 30 cadavers underwent a whole-body CT scan without injection of contrast material. CT exams were reviewed by two independent radiologists and radiological findings were compared with forensic autopsy data. Results: False positive CT findings included physiological postmortem transudates misdiagnosed as intra-abdominal bleeding, and putrefaction gas misdiagnosed as gas embolism, aeroportia, aerobilia, or digestive parietal pneumatosis. Incidentalomas without any role in the death process were also reported. False negative CT findings included small contusions, vascular thromboses, acute infarct foci, and non-radio-opaque foreign bodies. Normal cadaveric modifications were due to livor mortis and putrefaction, and are seen quickly (within hours) after death. Conclusion: The non-forensic radiologist should be familiar with the normal abdominal postmortem features in order to avoid misdiagnoses and detect informative lesions that can help and guide the forensic practitioner or the clinical physician.

  20. An Opponent-Process Theory of Motivation: II. Cigarette Addiction

    Science.gov (United States)

    Solomon, Richard L.; Corbit, John D.

    1973-01-01

    Methods suggested by opponent-process theory of acquired motivation in helping smokers to quit the habit include use of antagonistic drugs, total cessation from tobacco, and decrease in intensity and frequency of tobacco use. (DS)

  1. Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.

    Science.gov (United States)

    Sznitman, Sharon R; Taubman, Danielle S

    2016-09-01

    Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.

  2. Accident and Off-Normal Response and Recovery from Multi-Canister Overpack (MCO) Processing Events

    International Nuclear Information System (INIS)

    ALDERMAN, C.A.

    2000-01-01

    In the process of removing spent nuclear fuel (SNF) from the K Basins through its subsequent packaging, drying, transportation and storage steps, the SNF Project must be able to respond to all anticipated or foreseeable off-normal and accident events that may occur. Response procedures and recovery plans need to be in place, and personnel training established and implemented, to ensure the project will be capable of appropriate actions. To establish suitable project planning, these events must first be identified and analyzed for their expected impact on the project. This document assesses all off-normal and accident events for their potential cross-facility or Multi-Canister Overpack (MCO) process reversal impact. Table 1 provides the methodology for establishing the event planning level, and these events are provided in Table 2 along with the general response and recovery planning. Accidents and off-normal events of the SNF Project have been evaluated and are identified in the appropriate facility Safety Analysis Report (SAR) or in the transportation Safety Analysis Report for Packaging (SARP). Hazards and accidents are summarized from these safety analyses and listed in separate tables for each facility and the transportation system in Appendix A, along with identified off-normal events. The tables identify the general response time required to ensure a stable state after the event, the governing response documents, and the events with potential cross-facility or SNF process reversal impacts. The event closure is predicated on stable-state response time, impact to operations, and the mitigated annual occurrence frequency of the event as developed in the hazard analysis process.

  3. Instructional Transaction Theory: Knowledge Relationships among Processes, Entities, and Activities.

    Science.gov (United States)

    Merrill, M. David; And Others

    1993-01-01

    Discussion of instructional transaction theory focuses on knowledge representation in an automated instructional design expert system. A knowledge structure called PEA-Net (processes, entities, and activities) is explained; the refrigeration process is used as an example; text resources and graphic resources are described; and simulations are…

  4. Elevated intrabolus pressure identifies obstructive processes when integrated relaxation pressure is normal on esophageal high-resolution manometry.

    Science.gov (United States)

    Quader, Farhan; Reddy, Chanakyaram; Patel, Amit; Gyawali, C Prakash

    2017-07-01

    Elevated integrated relaxation pressure (IRP) on esophageal high-resolution manometry (HRM) identifies obstructive processes at the esophagogastric junction (EGJ). Our aim was to determine whether intrabolus pressure (IBP) can identify structural EGJ processes when IRP is normal. In this observational cohort study, adult patients with dysphagia undergoing HRM were evaluated for endoscopic evidence of structural EGJ processes (strictures, rings, hiatus hernia) in the setting of normal IRP. HRM metrics [IRP, distal contractile integral (DCI), distal latency (DL), IBP, and EGJ contractile integral (EGJ-CI)] were compared among 74 patients with structural EGJ findings (62.8 ± 1.6 yr, 67.6% women), 27 patients with normal EGD (52.9 ± 3.2 yr, 70.3% women), and 21 healthy controls (27.6 ± 0.6 yr, 52.4% women). Findings were validated in 85 consecutive symptomatic patients to address clinical utility. In the primary cohort, mean IBP (18.4 ± 0.9 mmHg) was higher with structural EGJ findings compared with dysphagia with normal EGD (13.5 ± 1.1 mmHg, P = 0.002) and healthy controls (10.9 ± 0.9 mmHg, P < 0.001); the remaining HRM metrics were similar across groups (P > 0.05 for each comparison). During multiple rapid swallows, IBP remained higher in the structural findings group compared with controls (P = 0.02). Similar analysis of the prospective validation cohort confirmed IBP elevation in structural EGJ processes, but correlation with dysphagia could not be demonstrated. We conclude that elevated IBP predicts the presence of structural EGJ processes even when IRP is normal, but correlation with dysphagia is suboptimal. NEW & NOTEWORTHY Integrated relaxation pressure (IRP) above the upper limit of normal defines esophageal outflow obstruction using high-resolution manometry. In patients with normal IRP, elevated intrabolus pressure (IBP) can be a surrogate marker for a structural restrictive or obstructive process at the esophagogastric junction (EGJ). This has the potential to augment the clinical value of ...
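
    A minimal sketch of the screening logic suggested by this abstract; the numeric cutoffs below are illustrative assumptions, not the study's validated thresholds.

        # Flag HRM studies where IRP is normal but IBP is elevated, as a
        # surrogate marker for a structural EGJ process. Cutoffs are assumed
        # for illustration only.
        from dataclasses import dataclass

        IRP_UPPER_LIMIT = 15.0   # mmHg, assumed upper limit of normal
        IBP_UPPER_LIMIT = 17.0   # mmHg, assumed illustrative cutoff

        @dataclass
        class HRMStudy:
            irp: float   # integrated relaxation pressure, mmHg
            ibp: float   # intrabolus pressure, mmHg

        def flag_structural_egj(study: HRMStudy) -> bool:
            """True if IRP is normal but IBP suggests an obstructive process."""
            return study.irp <= IRP_UPPER_LIMIT and study.ibp > IBP_UPPER_LIMIT

        print(flag_structural_egj(HRMStudy(irp=12.0, ibp=18.4)))  # True
        print(flag_structural_egj(HRMStudy(irp=12.0, ibp=11.0)))  # False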

  5. Information Processing Theories and the Education of the Gifted.

    Science.gov (United States)

    Rawl, Ruth K.; O'Tuel, Frances S.

    1983-01-01

    The basic assumptions of information processing theories in cognitive psychology are reviewed, and the application of this approach to problem solving in gifted education is considered. Specific implications are cited on problem selection and instruction giving. (CL)

  6. Individual Differences in Working Memory Capacity and Dual-Process Theories of the Mind

    Science.gov (United States)

    Barrett, Lisa Feldman; Tugade, Michele M.; Engle, Randall W.

    2004-01-01

    Dual-process theories of the mind are ubiquitous in psychology. A central principle of these theories is that behavior is determined by the interplay of automatic and controlled processing. In this article, the authors examine individual differences in the capacity to control attention as a major contributor to differences in working memory…

  7. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    International Nuclear Information System (INIS)

    Michelotti, Leo

    2009-01-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first (1) explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. (1) To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material has been lifted - and modified - from ...

  8. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /FERMILAB

    2009-04-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first [1] explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. [1] To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material ...

  9. Generalized Poisson processes in quantum mechanics and field theory

    International Nuclear Information System (INIS)

    Combe, P.; Rodriguez, R.; Centre National de la Recherche Scientifique, 13 - Marseille; Hoegh-Krohn, R.; Centre National de la Recherche Scientifique, 13 - Marseille; Sirugue, M.; Sirugue-Collin, M.; Centre National de la Recherche Scientifique, 13 - Marseille

    1981-01-01

    In section 2 we describe more carefully the generalized Poisson processes, giving a realization of the underlying probability space, and we characterize these processes by their characteristic functionals. Section 3 is devoted to the proof of the previous formula for quantum mechanical systems, with possibly velocity dependent potentials and in section 4 we give an application of the previous theory to some relativistic Bose field models. (orig.)

  10. Theory-data comparisons for jet measurements in hadron-induced processes

    Energy Technology Data Exchange (ETDEWEB)

    Wobisch, M. [Lousiana Tech Univ., Ruston, LA (United States); Britzger, D. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Kluge, T. [Liverpool Univ. (United Kingdom); Rabbertz, K.; Stober, F. [Karlsruher Institut fuer Technologie (KIT), Karlsruhe (Germany)

    2011-09-15

    We present a comprehensive overview of theory-data comparisons for inclusive jet production. Theory predictions are derived for recent parton distribution functions and compared with jet data from different hadron-induced processes at various center-of-mass energies √s. The comparisons are presented as a function of jet transverse momentum p_T or, alternatively, of the scaling variable x_T = 2p_T/√s. (orig.)
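
    A quick worked example of the scaling variable defined above:

        # x_T = 2*pT/sqrt(s) lets jet spectra from different center-of-mass
        # energies be compared on a common axis.
        def x_t(pt_gev: float, sqrt_s_gev: float) -> float:
            return 2.0 * pt_gev / sqrt_s_gev

        # A 100 GeV jet at the 1.96 TeV Tevatron probes roughly the same x_T
        # as a 357 GeV jet at the 7 TeV LHC:
        print(x_t(100.0, 1960.0))   # ~0.102
        print(x_t(357.0, 7000.0))   # ~0.102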

  11. Evaluating Transfer Entropy for Normal and y-Order Normal Distributions

    Czech Academy of Sciences Publication Activity Database

    Hlaváčková-Schindler, Kateřina; Toulias, T. L.; Kitsos, C. P.

    2016-01-01

    Vol. 17, No. 5 (2016), pp. 1-20. ISSN 2231-0851. Institutional support: RVO:67985556. Keywords: transfer entropy * time series * Kullback-Leibler divergence * causality * generalized normal distribution. Subject RIV: BC - Control Systems Theory. http://library.utia.cas.cz/separaty/2016/AS/hlavackova-schindler-0461261.pdf
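
    A minimal plug-in estimator sketch for transfer entropy between two binarized time series, under the standard definition TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log₂[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]; the data and estimator are illustrative, not the paper's.

        # Transfer entropy X -> Y for binary sequences (plug-in histogram estimate).
        import numpy as np
        from collections import Counter

        def transfer_entropy(x, y):
            """Estimate TE from x to y, in bits."""
            triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
            pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
            pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
            singles = Counter(y[:-1])                       # y_t
            n = len(x) - 1
            te = 0.0
            for (y1, y0, x0), c in triples.items():
                p_joint = c / n
                p_cond_full = c / pairs_yx[(y0, x0)]                 # p(y1 | y0, x0)
                p_cond_self = pairs_yy[(y1, y0)] / singles[y0]       # p(y1 | y0)
                te += p_joint * np.log2(p_cond_full / p_cond_self)
            return te

        rng = np.random.default_rng(0)
        x = rng.integers(0, 2, 5000)
        y = np.roll(x, 1)                 # y copies x with lag 1 => strong X->Y transfer
        print(transfer_entropy(x, y))     # close to 1 bit
        print(transfer_entropy(y, x))     # close to 0 bits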

  12. The mathematical theory of signal processing and compression-designs

    Science.gov (United States)

    Feria, Erlan H.

    2006-05-01

    The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with compressing signal source memory space, while processor coding deals with compressing signal processor computational time. Their combination is named compression-designs, or Conde for short. A compelling and pedagogically appealing diagram will be discussed highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.

  13. Physics of Laser Materials Processing Theory and Experiment

    CERN Document Server

    Gladush, Gennady G

    2011-01-01

    This book describes the basic mechanisms, theory, simulations and technological aspects of laser processing techniques. It covers the principles of laser quenching, welding, cutting, alloying, selective sintering, ablation, etc. The main attention is paid to the quantitative description. The diversity and complexity of technological and physical processes is discussed using a unitary approach. The book aims at understanding the cause-and-effect relations in the physical processes of laser technologies. It will help researchers and engineers improve existing and develop new laser machining techniques. The book addresses readers with a certain background in general physics and mathematical analysis: graduate students, researchers and engineers practicing laser applications.

  14. Reasoning on the Autism Spectrum: A Dual Process Theory Account

    Science.gov (United States)

    Brosnan, Mark; Lewton, Marcus; Ashwin, Chris

    2016-01-01

    Dual process theory proposes two distinct reasoning processes in humans, an intuitive style that is rapid and automatic and a deliberative style that is more effortful. However, no study to date has specifically examined these reasoning styles in relation to the autism spectrum. The present studies investigated deliberative and intuitive reasoning…

  15. Non-equilibrium reacting gas flows kinetic theory of transport and relaxation processes

    CERN Document Server

    Nagnibeda, Ekaterina

    2009-01-01

    This volume develops the kinetic theory of transport phenomena and relaxation processes in the flows of reacting gas mixtures. The theory is applied to the modeling of non-equilibrium flows behind strong shock waves, in the boundary layer, and in nozzles.

  16. Analysis of a renormalization group method and normal form theory for perturbed ordinary differential equations

    Science.gov (United States)

    DeVille, R. E. Lee; Harkin, Anthony; Holzer, Matt; Josić, Krešimir; Kaper, Tasso J.

    2008-06-01

    For singular perturbation problems, the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. E. 49 (1994) 4502-4511] has been shown to be an effective general approach for deriving reduced or amplitude equations that govern the long time dynamics of the system. It has been applied to a variety of problems traditionally analyzed using disparate methods, including the method of multiple scales, boundary layer theory, the WKBJ method, the Poincaré-Lindstedt method, the method of averaging, and others. In this article, we show how the RG method may be used to generate normal forms for large classes of ordinary differential equations. First, we apply the RG method to systems with autonomous perturbations, and we show that the reduced or amplitude equations generated by the RG method are equivalent to the classical Poincaré-Birkhoff normal forms for these systems up to and including terms of O(ɛ²), where ɛ is the perturbation parameter. This analysis establishes our approach and generalizes to higher order. Second, we apply the RG method to systems with nonautonomous perturbations, and we show that the reduced or amplitude equations so generated constitute time-asymptotic normal forms, which are based on KBM averages. Moreover, for both classes of problems, we show that the main coordinate changes are equivalent, up to translations between the spaces in which they are defined. In this manner, our results show that the RG method offers a new approach for deriving normal forms for nonautonomous systems, and it offers advantages since one can typically more readily identify resonant terms from naive perturbation expansions than from the nonautonomous vector fields themselves. Finally, we establish how well the solution to the RG equations approximates the solution of the original equations on time scales of O(1/ɛ).
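
    A compact worked example of the RG idea on a standard textbook problem (the weakly damped linear oscillator, in the spirit of Chen, Goldenfeld, and Oono); it is included only to illustrate how renormalization removes secular terms and yields an amplitude equation, and is not taken from the paper itself.

        % RG removal of a secular term: weakly damped oscillator (illustrative).
        \documentclass{article}
        \usepackage{amsmath}
        \begin{document}
        Consider the weakly damped oscillator
        \[
          \ddot{y} + 2\epsilon \dot{y} + y = 0, \qquad 0 < \epsilon \ll 1 .
        \]
        The naive expansion $y = y_0 + \epsilon y_1 + \cdots$ gives
        \[
          y = A_0 \cos(t + \phi_0) - \epsilon A_0 (t - t_0) \cos(t + \phi_0) + O(\epsilon^2),
        \]
        whose secular term $(t - t_0)$ ruins the expansion once $t - t_0 \sim 1/\epsilon$.
        Renormalizing the amplitude to $A(\tau)$ and demanding that $y$ be independent
        of the arbitrary split point $\tau$ yields the RG (amplitude) equations
        \[
          \frac{dA}{d\tau} = -\epsilon A, \qquad \frac{d\phi}{d\tau} = 0,
        \]
        so $y \simeq A_0 e^{-\epsilon t} \cos(t + \phi_0)$, matching the exact envelope
        $e^{-\epsilon t}$ of the damped solution to this order.
        \end{document}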

  17. Process for preparing a normal lighting and heating gas etc

    Energy Technology Data Exchange (ETDEWEB)

    Becker, J

    1910-12-11

    A process for preparing a normal lighting and heating gas from Australian bituminous shale by distillation and decomposition in the presence of water vapor is characterized by the fact that the gasification is suitably undertaken with gradual filling of a retort and with simultaneous introduction of water vapor at a temperature not exceeding 1,000 °C. The resulting gas is heated in the same or a second heated retort with freshly supplied vapor.

  18. Finding Commonalities: Social Information Processing and Domain Theory in the Study of Aggression

    Science.gov (United States)

    Nucci, Larry

    2004-01-01

    The Arsenio and Lemerise (this issue) proposal integrating social information processing (SIP) and domain theory to study children's aggression is evaluated from a domain theory perspective. Basic tenets of domain theory rendering it compatible with SIP are discussed as well as points of divergence. Focus is directed to the proposition that…

  19. Towards a general theory of implementation

    Science.gov (United States)

    2013-01-01

    Understanding and evaluating the implementation of complex interventions in practice is an important problem for healthcare managers and policy makers, and for patients and others who must operationalize them beyond formal clinical settings. It has been argued that this work should be founded on theory that provides a foundation for understanding, designing, predicting, and evaluating dynamic implementation processes. This paper sets out core constituents of a general theory of implementation, building on Normalization Process Theory and linking it to key constructs from recent work in sociology and psychology. These are informed by ideas about agency and its expression within social systems and fields, social and cognitive mechanisms, and collective action. This approach unites a number of contending perspectives in a way that makes possible a more comprehensive explanation of the implementation and embedding of new ways of thinking, enacting and organizing practice. PMID:23406398

  20. Periodic Schur process, cylindric partitions and N=2* theory

    International Nuclear Information System (INIS)

    Iqbal, Amer; Kozcaz, Can; Sohail, Tanweer

    2011-01-01

    Type IIA string theory compactified on an elliptic CY3-fold gives rise to N=2U(1) gauge theory with an adjoint hypermultiplet. We study the refined open and closed topological string partition functions of this geometry using the refined topological vertex. We show that these partition functions, open and closed, are examples of periodic Schur process and are related to the generating function of the cylindric partitions if the Kaehler parameters are quantized in units of string coupling. The level-rank duality appears as the exchange symmetry of the two Kaehler parameters of the elliptic CY3-fold.

  1. Seeking Humanizing Care in Patient-Centered Care Process: A Grounded Theory Study.

    Science.gov (United States)

    Cheraghi, Mohammad Ali; Esmaeili, Maryam; Salsali, Mahvash

    Patient-centered care is both a goal in itself and a tool for enhancing health outcomes. The application of patient-centered care in health care services globally, however, is diverse. This article reports on a study that sought to introduce patient-centered care. The aim of this study is to explore the process of providing patient-centered care in critical care units. The study used a grounded theory method. Data were collected in 5 critical care units at Tehran University of Medical Sciences. Purposive and theoretical sampling directed the collection of data using 29 semistructured interviews with 27 participants (nurses, patients, and physicians). Data obtained were analyzed according to the analysis stages of grounded theory and constant comparison to identify the concepts, context, and process of the study. The core category of this grounded theory is "humanizing care," which consisted of 4 interrelated phases, including patient acceptance, purposeful patient assessment and identification, understanding patients, and patient empowerment. A core category of humanizing care integrated the theory. Humanizing care was both an outcome and a process. Patient-centered care is a dynamic and multifaceted process provided according to the nurses' understanding of the concept. Patient-centered care does not involve repeating routine tasks; rather, it requires an all-embracing understanding of the patients and showing respect for their values, needs, and preferences.

  2. Research of radioecological processes by methods of the theory of reliability

    International Nuclear Information System (INIS)

    Kutlakhmedov, Yu.A.; Salivon, A.G.; Pchelovskaya, S.A.; Rodina, V.V.; Bevza, A.G.; Matveeva, I.V.

    2012-01-01

    The theory and models of ecosystem radiocapacity, combined with the theory and models of reliability, have made it possible to adequately describe the laws of radionuclide migration and distribution for different types of reservoir and land ecosystems. The theory and models of radiocapacity allow critical elements of an ecosystem, where temporary or final depositing of radionuclides is to be expected, to be strictly defined. The approach based on biogenic tracers allows, within the framework of the theory and models of radiocapacity and reliability, simultaneous estimation of radionuclide migration processes, determination of dose loads on ecosystem biota, and establishment of fundamental parameters of the redistribution rates of radionuclides and other pollutants in different types of ecosystems.

  3. Tutorial - applying extreme value theory to characterize food-processing systems

    DEFF Research Database (Denmark)

    Skou, Peter Bæk; Holroyd, Stephen E.; van der Berg, Franciscus Winfried J

    2017-01-01

    This tutorial presents extreme value theory (EVT) as an analytical tool in process characterization and shows its potential to describe production performance, e.g., across different factories, via reliable estimates of the frequency and scale of extreme events. Two alternative EVT methods are discussed: peaks over threshold and block maxima. We illustrate the theoretical framework for EVT with process data from two different examples from the food-processing industry. Finally, we discuss limitations, decisions, and possibilities when applying EVT to process data.
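
    A minimal sketch of the block-maxima method on synthetic data, assuming SciPy's generalized extreme value distribution; the data, block size, and return period are illustrative.

        # Block maxima: fit a GEV distribution to per-block maxima of a process
        # variable and read off a return level. Synthetic data for illustration.
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)
        readings = rng.gumbel(loc=10.0, scale=2.0, size=1825)  # e.g. a process variable
        block_max = readings.reshape(-1, 25).max(axis=1)       # maxima of 73 blocks

        c, loc, scale = genextreme.fit(block_max)              # GEV shape, location, scale
        return_level = genextreme.ppf(1 - 1 / 100, c, loc, scale)
        print(f"level exceeded on average once per 100 blocks: {return_level:.2f}")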

  4. Dual-Process Theories of Reasoning: Contemporary Issues and Developmental Applications

    Science.gov (United States)

    Evans, Jonathan St. B. T.

    2011-01-01

    In this paper, I discuss the current state of theorising about dual processes in adult performance on reasoning and decision making tasks, in which Type 1 intuitive processing is distinguished from Type 2 reflective thinking. I show that there are many types of theory some of which distinguish modes rather than types of thinking and that…

  5. Can dual processing theory explain physics students’ performance on the Force Concept Inventory?

    Directory of Open Access Journals (Sweden)

    Anna K. Wood

    2016-07-01

    Full Text Available According to dual processing theory there are two types, or modes, of thinking: system 1, which involves intuitive and nonreflective thinking, and system 2, which is more deliberate and requires conscious effort and thought. The Cognitive Reflection Test (CRT) is a widely used and robust three-item instrument that measures the tendency to override system 1 thinking and to engage in reflective, system 2 thinking. Each item on the CRT has an intuitive (but wrong) answer that must be rejected in order to answer the item correctly. We therefore hypothesized that performance on the CRT may give useful insights into the cognitive processes involved in learning physics, where success involves rejecting common, intuitive ideas about the world (often called misconceptions) and instead carefully applying physical concepts. This paper presents initial results from an ongoing study examining the relationship between students' CRT scores and their performance on the Force Concept Inventory (FCI), which tests students' understanding of Newtonian mechanics. We find that a higher CRT score predicts a higher FCI score for both precourse and postcourse tests. However, we also find that the FCI normalized gain is independent of CRT score. The implications of these results are discussed.
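
    For reference, the FCI normalized gain mentioned above is commonly computed as Hake's g, the fraction of the possible improvement actually realized; a quick worked example:

        # Hake's normalized gain: g = (post - pre) / (100 - pre), scores in percent.
        def normalized_gain(pre_pct: float, post_pct: float) -> float:
            return (post_pct - pre_pct) / (100.0 - pre_pct)

        # Two students with different pre-scores but the same g = 0.5:
        print(normalized_gain(40.0, 70.0))   # 0.5
        print(normalized_gain(60.0, 80.0))   # 0.5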

  6. Colorimetric determination of reducing normality in the Purex process

    International Nuclear Information System (INIS)

    Baumann, E.W.

    1983-07-01

    Adjustment of the valence state of plutonium from extractable Pu(IV) to nonextractable Pu(III) in the Purex process is accomplished by addition of reductants such as Fe(II), hydroxylamine nitrate (HAN), or U(IV). To implement on-line monitoring of this reduction step for improved process control at the Savannah River Plant, a simple colorimetric method for determining excess reductant (reducing normality) was developed. The method is based on formation of a colored complex of Fe(II) with FerroZine (Hach Chemical Company). The concentration of Fe(II) is determined directly. The concentration of HAN or U(IV), in addition to Fe(II), is determined indirectly as Fe(II), produced through reduction of Fe(III). Experimental conditions for a HAN-Fe(III) reaction of known stoichiometry were established. The effect of hydrazine, which stabilizes U(IV), was also determined. Real-time measurements of color development were made that simulated on-line performance. A laboratory analytical procedure is included. 5 references, 8 figures

  7. Toward a general evolutionary theory of oncogenesis.

    Science.gov (United States)

    Ewald, Paul W; Swain Ewald, Holly A

    2013-01-01

    We propose an evolutionary framework, the barrier theory of cancer, which is based on the distinction between barriers to oncogenesis and restraints. Barriers are defined as mechanisms that prevent oncogenesis. Restraints, which are more numerous, inhibit but do not prevent oncogenesis. Processes that compromise barriers are essential causes of cancer; those that interfere with restraints are exacerbating causes. The barrier theory is built upon the three evolutionary processes involved in oncogenesis: natural selection acting on multicellular organisms to mold barriers and restraints, natural selection acting on infectious organisms to abrogate these protective mechanisms, and oncogenic selection which is responsible for the evolution of normal cells into cancerous cells. The barrier theory is presented as a first step toward the development of a general evolutionary theory of cancer. Its attributes and implications for intervention are compared with those of other major conceptual frameworks for understanding cancer: the clonal diversification model, the stem cell theory and the hallmarks of cancer. The barrier theory emphasizes the practical value of distinguishing between essential and exacerbating causes. It also stresses the importance of determining the scope of infectious causation of cancer, because individual pathogens can be responsible for multiple essential causes in infected cells.

  8. The process of accepting breast cancer among Chinese women: A grounded theory study.

    Science.gov (United States)

    Chen, Shuang-Qin; Liu, Jun-E; Li, Zhi; Su, Ya-Li

    2017-06-01

    To describe the process by which Chinese women accept living with breast cancer. Individual interviews were conducted with 18 Chinese women who completed breast cancer treatment. Data were collected from September 2014 to January 2015 at a large tertiary teaching hospital in Beijing, China. In this grounded theory study, data were analyzed using constant comparative and coding analysis methods. To explain the process by which women in China accept having breast cancer, a model that includes 5 axial categories was developed. Cognitive reconstruction emerged as the core category. The extent to which the women accepted having the disease was found to increase as their treatment progressed over time. The accepting process included five stages: non-acceptance, passive acceptance, willingness to accept, behavioral acceptance, and transcendence of acceptance. Our grounded theory study develops a model describing the process by which women accept having breast cancer. The model provides intervention opportunities at every point of the process. Copyright © 2017. Published by Elsevier Ltd.

  9. Theory of mind and emotion-recognition functioning in autistic spectrum disorders and in psychiatric control and normal children.

    Science.gov (United States)

    Buitelaar, J K; van der Wees, M; Swaab-Barneveld, H; van der Gaag, R J

    1999-01-01

    The hypothesis was tested that weak theory of mind (ToM) and/or emotion recognition (ER) abilities are specific to subjects with autism. Differences in ToM and ER performance were examined between autistic (n = 20), pervasive developmental disorder-not otherwise specified (PDD-NOS) (n = 20), psychiatric control (n = 20), and normal children (n = 20). The clinical groups were matched person-to-person on age and verbal IQ. We used tasks for the matching and the context recognition of emotional expressions, and a set of first- and second-order ToM tasks. Autistic and PDD-NOS children could not be significantly differentiated from each other, nor could they be differentiated from the psychiatric controls with a diagnosis of ADHD (n = 9). The psychiatric controls with conduct disorder or dysthymia performed about as well as normal children. The variance in second-order ToM performance contributed most to differences between diagnostic groups.

  10. Adaptive Convergence Rates of a Dirichlet Process Mixture of Multivariate Normals

    OpenAIRE

    Tokdar, Surya T.

    2011-01-01

    It is shown that a simple Dirichlet process mixture of multivariate normals offers Bayesian density estimation with adaptive posterior convergence rates. Toward this, a novel sieve for non-parametric mixture densities is explored, and its rate adaptability to various smoothness classes of densities in arbitrary dimension is demonstrated. This sieve construction is expected to offer a substantial technical advancement in studying Bayesian non-parametric mixture models based on stick-breaking p...
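
    A minimal sketch of density estimation with a (truncated) Dirichlet process mixture of multivariate normals, using scikit-learn's variational BayesianGaussianMixture; this illustrates the model class only, not the paper's convergence-rate analysis.

        # Truncated DP mixture of multivariate normals via variational inference.
        import numpy as np
        from sklearn.mixture import BayesianGaussianMixture

        rng = np.random.default_rng(0)
        data = np.vstack([
            rng.multivariate_normal([0, 0], np.eye(2), 300),
            rng.multivariate_normal([4, 4], 0.5 * np.eye(2), 200),
        ])

        dpgmm = BayesianGaussianMixture(
            n_components=10,                                  # truncation level
            weight_concentration_prior_type="dirichlet_process",
            covariance_type="full",
            max_iter=500,
            random_state=0,
        ).fit(data)

        # Most stick-breaking weights collapse toward zero, leaving ~2 live components.
        print(np.round(dpgmm.weights_, 3))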

  11. Process research on Emotionally Focused Therapy (EFT) for couples: linking theory to practice.

    Science.gov (United States)

    Greenman, Paul S; Johnson, Susan M

    2013-03-01

    The focus of this article is on the link among theory, process, and outcome in the practice of Emotionally Focused Therapy (EFT) for couples. We describe the EFT model of change and the EFT perspective on adult love as the reflection of underlying attachment processes. We outline the manner in which theory and research inform EFT interventions. This leads into a detailed review of the literature on the processes of change in EFT. We highlight the client responses and therapist operations that have emerged from process research and their relation to treatment outcomes. We discuss the implications of this body of research for clinical practice and training. © FPI, Inc.

  12. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e., the limitations of single-channel approaches, we first develop an information processing model with multiple stages, which contains information flows. Then the uncertainty of the information is quantified using Conant's model, a kind of information theory
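
    Since the record does not spell out Conant's decomposition, here is a generic sketch of quantifying the information processed at a single stage as the mutual information between stimuli and operator responses; names and data are illustrative assumptions.

        # Plug-in estimate of I(X;Y) in bits for discrete stimulus/response sequences.
        import numpy as np

        def mutual_information(x, y):
            x, y = np.asarray(x), np.asarray(y)
            mi = 0.0
            for xv in np.unique(x):
                for yv in np.unique(y):
                    p_xy = np.mean((x == xv) & (y == yv))
                    if p_xy > 0:
                        p_x, p_y = np.mean(x == xv), np.mean(y == yv)
                        mi += p_xy * np.log2(p_xy / (p_x * p_y))
            return mi

        alarms = [0, 1, 0, 2, 1, 0, 2, 2, 1, 0]         # stimulus categories
        actions = [0, 1, 0, 2, 1, 0, 2, 2, 1, 0]        # operator responses
        print(mutual_information(alarms, actions))       # ~1.57 bits (perfect mapping)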

  13. Energy shocks, crises and the policy process: A review of theory and application

    International Nuclear Information System (INIS)

    Grossman, Peter Z.

    2015-01-01

    What motivates changes in energy policy? Typically, the process begins with a notable exogenous event, a shock. Often, the shock leads to what is perceived to be a crisis. This review essay surveys theories of crisis policymaking from the social science literature and considers their application to changes in energy policy. Two cases — one from the U.S., the other from Germany — are examined in more detail from the standpoint of the theories discussed. Suggestions are made for improving energy policy analysis in the future. - Highlights: • An analysis of the idea of “crisis” and its application to energy. • A review of theories and models of the policy process and of policy change. • Theory applied to two energy cases. • Suggestion as to how the analysis of energy policymaking might be approached in the future

  14. Critically Engaging "Mutually Engaged Supervisory Processes": A Proposed Theory for CPE Supervisory Education.

    Science.gov (United States)

    Fitchett, George; Altenbaumer, Mary L; Atta, Osofo Kwesi; Stowman, Sheryl Lyndes; Vlach, Kyle

    2014-12-01

    Revisions to the processes for training and certifying supervisors continue to be debated within the Association for Clinical Pastoral Education (ACPE). In 2012, Ragsdale and colleagues published "Mutually engaged supervisory processes," a qualitative grounded theory study, based on interviews with 19 recently certified Associate CPE Supervisors, of nine components that facilitate the development of CPE supervisory education students. In this article we critically engage this theory and the research upon which it is based. We also reflect on three issues highlighted by the theory: personal transformation in CPE supervisory education, how CPE supervisory education students develop theoretical foundations for their work, and engaging multicultural issues in supervisory education. We conclude that this theory offers ACPE the possibility of using research to guide future modifications to its practice of supervisory education. © 2014 Journal of Pastoral Care Publications Inc.

  15. Cognitive load disrupts implicit theory-of-mind processing.

    Science.gov (United States)

    Schneider, Dana; Lam, Rebecca; Bayliss, Andrew P; Dux, Paul E

    2012-08-01

    Eye movements in Sally-Anne false-belief tasks appear to reflect the ability to implicitly monitor the mental states of other individuals (theory of mind, or ToM). It has recently been proposed that an early-developing, efficient, and automatically operating ToM system subserves this ability. Surprisingly absent from the literature, however, is an empirical test of the influence of domain-general executive processing resources on this implicit ToM system. In the study reported here, a dual-task method was employed to investigate the impact of executive load on eye movements in an implicit Sally-Anne false-belief task. Under no-load conditions, adult participants displayed eye movement behavior consistent with implicit belief processing, whereas evidence for belief processing was absent for participants under cognitive load. These findings indicate that the cognitive system responsible for implicitly tracking beliefs draws at least minimally on executive processing resources. Thus, even the most low-level processing of beliefs appears to reflect a capacity-limited operation.

  16. Understanding Reactions to Workplace Injustice through Process Theories of Motivation: A Teaching Module and Simulation

    Science.gov (United States)

    Stecher, Mary D.; Rosse, Joseph G.

    2007-01-01

    Management and organizational behavior students are often overwhelmed by the plethora of motivation theories they must master at the undergraduate level. This article offers a teaching module geared toward helping students understand how two major process theories of motivation, equity and expectancy theories and theories of organizational…

  17. Extending Attribution Theory: Considering Students' Perceived Control of the Attribution Process

    Science.gov (United States)

    Fishman, Evan J.; Husman, Jenefer

    2017-01-01

    Research in attribution theory has shown that students' causal thinking profoundly affects their learning and motivational outcomes. Very few studies, however, have explored how students' attribution-related beliefs influence the causal thought process. The present study used the perceived control of the attribution process (PCAP) model to examine…

  18. Application of a model of social information processing to nursing theory: how nurses respond to patients.

    Science.gov (United States)

    Sheldon, Lisa Kennedy; Ellington, Lee

    2008-11-01

    This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from the cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of how nurses respond to patients and help develop nursing theories further. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions, with implications for nursing care and patient outcomes.

  19. Managing fear in public health campaigns: a theory-based formative evaluation process.

    Science.gov (United States)

    Cho, Hyunyi; Witte, Kim

    2005-10-01

    The HIV/AIDS infection rate of Ethiopia is one of the world's highest. Prevention campaigns should systematically incorporate and respond to the at-risk population's existing beliefs, emotions, and perceived barriers in the message design process to effectively promote behavior change. However, guidelines for conducting formative evaluation that are grounded in proven risk communication theory and empirical data analysis techniques are hard to find. This article provides a five-step formative evaluation process that translates theory and research for developing effective messages for behavior change. Guided by the extended parallel process model, the five-step process helps message designers manage the public's fear surrounding issues such as HIV/AIDS. An entertainment education project that used the process to design HIV/AIDS prevention messages for Ethiopian urban youth is reported. Data were collected in five urban regions of Ethiopia and analyzed according to the process to develop key messages for a 26-week radio soap opera.

  20. Mermin Non-Locality in Abstract Process Theories

    Directory of Open Access Journals (Sweden)

    Stefano Gogioso

    2015-11-01

    Full Text Available The study of non-locality is fundamental to the understanding of quantum mechanics. The past 50 years have seen a number of non-locality proofs, but its fundamental building blocks, and the exact role it plays in quantum protocols, have remained elusive. In this paper, we focus on a particular flavour of non-locality, generalising Mermin's argument on the GHZ state. Using strongly complementary observables, we provide necessary and sufficient conditions for Mermin non-locality in abstract process theories. We show that the existence of more phases than classical points (aka eigenstates) is not sufficient, and that the key to Mermin non-locality lies in the presence of certain algebraically non-trivial phases. This allows us to show that fRel, a favourite toy model for categorical quantum mechanics, is Mermin local. We show Mermin non-locality to be the key resource ensuring the device-independent security of the HBB CQ (N,N) family of Quantum Secret Sharing protocols. Finally, we challenge the unspoken assumption that the measurements involved in Mermin-type scenarios should be complementary (like the pair X,Y), opening the doors to a much wider class of potential experimental setups than currently employed. In short, we give conditions for Mermin non-locality tests on any number of systems, where each party has an arbitrary number of measurement choices, where each measurement has an arbitrary number of outcomes, and which work in any abstract process theory.
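
    A minimal numerical companion to Mermin's GHZ argument referenced above: the classical bound on the Mermin operator is 2, while the GHZ state attains 4. The code is an illustrative check, not the paper's categorical construction.

        # Mermin operator M = XXX - XYY - YXY - YYX on the 3-qubit GHZ state.
        import numpy as np

        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

        def kron3(a, b, c):
            return np.kron(np.kron(a, b), c)

        ghz = np.zeros(8, dtype=complex)
        ghz[0] = ghz[7] = 1 / np.sqrt(2)          # (|000> + |111>)/sqrt(2)

        M = kron3(X, X, X) - kron3(X, Y, Y) - kron3(Y, X, Y) - kron3(Y, Y, X)
        print(np.real(ghz.conj() @ M @ ghz))      # 4.0 > classical bound of 2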

  1. Developing a new theory of knowledge sharing : Documenting and reflecting on a messy process

    NARCIS (Netherlands)

    Martinsons, M.G.; Davison, R.M.; Ou, Carol

    2015-01-01

    Much has been written about theories and how they can be tested. Unfortunately, much less has been written about how to develop them. This paper sheds light on the process of new theory development. We document and reflect on how we developed a context-sensitive indigenous theory of knowledge sharing.

  2. An Explanation of the Relationship between Instructor Humor and Student Learning: Instructional Humor Processing Theory

    Science.gov (United States)

    Wanzer, Melissa B.; Frymier, Ann B.; Irwin, Jeffrey

    2010-01-01

    This paper proposes the Instructional Humor Processing Theory (IHPT), a theory that incorporates elements of incongruity-resolution theory, disposition theory, and the elaboration likelihood model (ELM) of persuasion. IHPT is proposed and offered as an explanation for why some types of instructor-generated humor result in increased student…

  3. Researches Concerning to Minimize Vibrations when Processing Normal Lathe

    Directory of Open Access Journals (Sweden)

    Lenuța Cîndea

    2015-09-01

    Full Text Available In the cutting process the appearance of vibration is inevitable, and in situations where its amplitude exceeds the limits of the dimensional and shape precision of the generated surfaces, the vibratory phenomenon is detrimental. The field of vibration is one of increasing development, so the future will bring a better understanding of vibrations and of their use even in other sectors. The paper develops the experimental measurement of vibrations in normal lathe machining. The kinematic scheme of the machine tool, the cutting tool and the cutting conditions are described, and the experimental facility for measuring the vibrations occurring in turning is presented. The experiments measured the amplitude occurring during internal turning with a tool without an incorporated damper. The tests were performed continuously for different speeds, feeds and depths of cut.

  4. Teaching queer theory at a Normal School.

    Science.gov (United States)

    Bacon, Jen

    2006-01-01

    This article presents a case study of the ongoing struggle to queer West Chester University at the level of the institution, the curriculum, and the classroom. Part of that struggle includes an effort to establish a policy for free speech that accommodates the values of the institution toward diversity. Another part involves attempts to introduce LGBT Studies into the curriculum, and the resulting debates over whether the curriculum should be "gayer" or "queerer." I discuss the personal struggle to destabilize ready-made categories and encourage non-binary thinking, while honoring the identities we live, and perform, in the classroom. In the last four years, WCU has hired half a dozen out gay or lesbian faculty members, some of whom identify as "queer." In many ways, those faculty members have entered a climate open to new ideas for adding LGBT content to the curriculum and to queering the structure and curriculum of the university. But as faculty, staff, and students engage this cause, along with the broader cause of social justice at the University, we have found that our enemies are often closer than we might have guessed. Detailing the tensions that have characterized the landscape at WCU during my three and a half years there, this essay elaborates on the epistemological and pedagogical issues that arise when queer theory meets LGBT Studies in the process of institutional, curricular, and pedagogical reform. I argue that questions about content and method, inclusion and exclusion, and identity and performance can be answered only with a concerted effort and continued attention to the cultural tendency to re-assert binaries while simultaneously learning from them. What is true of West Chester, I argue, is true of the larger social system, where the contested terrain of the queer has implications for the choices we make as both stakeholders and deviants in the systems we chronicle and critique.

  5. A test of processing efficiency theory in a team sport context.

    Science.gov (United States)

    Smith, N C; Bellamy, M; Collins, D J; Newell, D

    2001-05-01

    In this study, we tested some key postulates of Eysenck and Calvo's processing efficiency theory in a team sport. The participants were 12 elite male volleyball players who were followed throughout the course of a competitive season. Self-report measures of pre-match and in-game cognitive anxiety and mental effort were collected in groups of players high and low in dispositional anxiety. Player performance was determined from the statistical analysis of match-play. Sets were classified according to the point spread separating the two teams into one of three levels of criticality. Game momentum was also analysed to determine its influence on in-game state anxiety. Significant differences in in-game cognitive anxiety were apparent between high and low trait anxiety groups. An interaction between anxiety grouping and momentum condition was also evident in cognitive anxiety. Differences in set criticality were reflected in significant elevations in mental effort, an effect more pronounced in dispositionally high-anxious performers. Consistent with the predictions of processing efficiency theory, mental effort ratings were higher in high trait-anxious players in settings where their performance was equivalent to that of low trait-anxious performers. The usefulness of processing efficiency theory as an explanatory framework in sport anxiety research is discussed in the light of these findings.

  6. The Theory of High Energy Collision Processes - Final Report DOE/ER/40158-1

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Tai Tsun

    2011-09-15

    In 1984, DOE awarded Harvard University a new Grant DE-FG02-84ER40158 to continue their support of Tai Tsun Wu as Principal Investigator of research on the theory of high energy collision processes. This Grant was renewed and remained active continuously from June 1, 1984 through November 30, 2007. Topics of interest during the 23-year duration of this Grant include: the theory and phenomenology of collision and production processes at ever higher energies; helicity methods of QED and QCD; neutrino oscillations and masses; Yang-Mills gauge theory; Beamstrahlung; Fermi pseudopotentials; magnetic monopoles and dyons; cosmology; classical confinement; mass relations; Bose-Einstein condensation; and large-momentum-transfer scattering processes. This Final Report describes the research carried out on Grant DE-FG02-84ER40158 for the period June 1, 1984 through November 30, 2007. Two books resulted from this project and a total of 125 publications.

  7. Semi adiabatic theory of seasonal Markov processes

    Energy Technology Data Exchange (ETDEWEB)

    Talkner, P. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)]

    1999-08-01

    The dynamics of many natural and technical systems are essentially influenced by a periodic forcing. Analytic solutions of the equations of motion for periodically driven systems are generally not known. Simulations, numerical solutions or in some limiting cases approximate analytic solutions represent the known approaches to study the dynamics of such systems. Besides the regime of weak periodic forces where linear response theory works, the limit of a slow driving force can often be treated analytically using an adiabatic approximation. For this approximation to hold all intrinsic processes must be fast on the time-scale of a period of the external driving force. We developed a perturbation theory for periodically driven Markovian systems that covers the adiabatic regime but also works if the system has a single slow mode that may even be slower than the driving force. We call it the semi adiabatic approximation. Some results of this approximation for a system exhibiting stochastic resonance which usually takes place within the semi adiabatic regime are indicated. (author) 1 fig., 8 refs.
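
    As a toy illustration of the regime discussed (a driving force that need not be slower than every intrinsic relaxation), one can integrate the master equation of a periodically driven two-state Markov system numerically; a minimal sketch in Python, with hypothetical rates not taken from the paper:

      import numpy as np

      # Two-state master equation dp1/dt = -k12(t) p1 + k21(t) (1 - p1),
      # with rates modulated by a periodic force (hypothetical parameters).
      k0, eps, omega = 1.0, 0.5, 0.05   # base rate, modulation depth, driving frequency

      def rates(t):
          k12 = k0 * np.exp(+eps * np.cos(omega * t))
          k21 = k0 * np.exp(-eps * np.cos(omega * t))
          return k12, k21

      dt, T = 0.01, 5 * 2 * np.pi / omega
      p1 = 0.5
      for t in np.arange(0.0, T, dt):
          k12, k21 = rates(t)
          p1 += dt * (-k12 * p1 + k21 * (1.0 - p1))  # explicit Euler step

      # In the adiabatic limit (omega << k0) p1 tracks the instantaneous
      # equilibrium k21(t) / (k12(t) + k21(t)); the semi-adiabatic regime is
      # precisely where this simple picture starts to fail.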

  8. Ingredients and change processes in occupational therapy for children: a grounded theory study.

    Science.gov (United States)

    Armitage, Samantha; Swallow, Veronica; Kolehmainen, Niina

    2017-05-01

    There is limited evidence about the effectiveness of occupational therapy interventions for participation outcomes in children with coordination difficulties. Developing theory about the interventions, i.e. their ingredients and change processes, is the first step to advance the evidence base. To develop theory about the key ingredients of occupational therapy interventions for children with coordination difficulties and the processes through which change in participation might happen. Grounded theory methodology, as described by Kathy Charmaz, was used to develop the theory. Children and parents participated in semi-structured interviews to share their experiences of occupational therapy and processes of change. Data collection and analysis were completed concurrently using constant comparison methods. Five key ingredients of interventions were described: performing activities and tasks; achieving; carer support; helping and supporting the child; and labelling. Ingredients related to participation by changing children's mastery experience, increasing capability beliefs and sense of control. Parents' knowledge, skills, positive emotions, sense of empowerment and capability beliefs also related to children's participation. The results identify intervention ingredients and change pathways within occupational therapy to increase participation. It is unclear how explicitly and how often therapists consider and make use of these ingredients and pathways.

  9. Building bridges to observational perspectives: a grounded theory of therapy processes in psychosis.

    Science.gov (United States)

    Dilks, Sarah; Tasker, Fiona; Wren, Bernadette

    2008-06-01

    This study set out to explore therapy processes in psychosis with an initial focus on reflexivity and how this might be expressed in therapy conversations. Leiman's (2000) definition of reflexivity was used as a starting-point for an exploratory investigation of the use of language as reflective activity. Grounded theory was chosen as an appropriate methodology to distil an explanatory account across the qualitative data collected. Six psychologist-client pairs supplied three tapes of therapy sessions spread out across the course of therapy. Each participant was separately interviewed on two occasions to ascertain their views of therapy and of the emerging grounded theory. A grounded theory was developed conceptualizing the processes and activities in psychological therapy in psychosis. Building bridges to observational perspectives summarizes the core process in psychological therapy in psychosis. Therapy in psychosis is understood as intimately linking the social and internal world in a dialogical process aimed at enhancing the client's functioning in the social world rather than at specifically developing the private mental experience of reflexivity or mentalizing.

  10. Kinetic theory of age-structured stochastic birth-death processes

    Science.gov (United States)

    Greenman, Chris D.; Chou, Tom

    2016-01-01

    Classical age-structured mass-action models such as the McKendrick-von Foerster equation have been extensively studied but are unable to describe stochastic fluctuations or population-size-dependent birth and death rates. Stochastic theories that treat semi-Markov age-dependent processes using, e.g., the Bellman-Harris equation do not resolve a population's age structure and are unable to quantify population-size dependencies. Conversely, current theories that include size-dependent population dynamics (e.g., mathematical models that include carrying capacity such as the logistic equation) cannot be easily extended to take into account age-dependent birth and death rates. In this paper, we present a systematic derivation of a new, fully stochastic kinetic theory for interacting age-structured populations. By defining multiparticle probability density functions, we derive a hierarchy of kinetic equations for the stochastic evolution of an aging population undergoing birth and death. We show that the fully stochastic age-dependent birth-death process precludes factorization of the corresponding probability densities, which then must be solved by using a Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY)-like hierarchy. Explicit solutions are derived in three limits: no birth, no death, and steady state. These are then compared with their corresponding mean-field results. Our results generalize both deterministic models and existing master equation approaches by providing an intuitive and efficient way to simultaneously model age- and population-dependent stochastic dynamics applicable to the study of demography, stem cell dynamics, and disease evolution.
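
    For reference, the classical deterministic baseline that the stochastic theory generalises is the McKendrick-von Foerster equation for the age density n(a, t), with age-dependent death rate μ(a) and birth rate β(a):

      \[
      \frac{\partial n(a,t)}{\partial t} + \frac{\partial n(a,t)}{\partial a}
      = -\mu(a)\, n(a,t),
      \qquad
      n(0,t) = \int_0^\infty \beta(a)\, n(a,t)\, \mathrm{d}a .
      \]

    The kinetic theory described above replaces n(a, t) by multiparticle probability densities, so that fluctuations and population-size-dependent rates survive the description.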

  11. Multitrophic microbial interactions for eco- and agro-biotechnological processes: theory and practice.

    Science.gov (United States)

    Saleem, Muhammad; Moe, Luke A

    2014-10-01

    Multitrophic level microbial loop interactions mediated by protist predators, bacteria, and viruses drive eco- and agro-biotechnological processes such as bioremediation, wastewater treatment, plant growth promotion, and ecosystem functioning. To what extent these microbial interactions are context-dependent in performing biotechnological and ecosystem processes remains largely unstudied. Theory-driven research may advance the understanding of eco-evolutionary processes underlying the patterns and functioning of microbial interactions for successful development of microbe-based biotechnologies for real world applications. This could also be a great avenue to test the validity or limitations of ecology theory for managing diverse microbial resources in an era of altering microbial niches, multitrophic interactions, and microbial diversity loss caused by climate and land use changes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Stochastic processes in cell biology

    CERN Document Server

    Bressloff, Paul C

    2014-01-01

    This book develops the theory of continuous and discrete stochastic processes within the context of cell biology.  A wide range of biological topics are covered including normal and anomalous diffusion in complex cellular environments, stochastic ion channels and excitable systems, stochastic calcium signaling, molecular motors, intracellular transport, signal transduction, bacterial chemotaxis, robustness in gene networks, genetic switches and oscillators, cell polarization, polymerization, cellular length control, and branching processes. The book also provides a pedagogical introduction to the theory of stochastic processes – Fokker-Planck equations, stochastic differential equations, master equations and jump Markov processes, diffusion approximations and the system size expansion, first passage time problems, stochastic hybrid systems, reaction-diffusion equations, exclusion processes, WKB methods, martingales and branching processes, stochastic calculus, and numerical methods.   This text is primarily...
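
    As one concrete instance of the jump Markov processes listed, a Gillespie-type stochastic simulation of a simple birth-death process; a minimal sketch with hypothetical rates, not an example drawn from the book:

      import numpy as np

      rng = np.random.default_rng(0)
      birth, death = 1.0, 0.1          # constant birth rate, per-capita death rate
      n, t, t_end = 10, 0.0, 50.0
      while t < t_end and n > 0:
          a_birth, a_death = birth, death * n   # reaction propensities
          a_total = a_birth + a_death
          t += rng.exponential(1.0 / a_total)   # waiting time to the next event
          n += 1 if rng.random() < a_birth / a_total else -1
      # n is now one stochastic realization of the population at time t_end.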

  13. Theory and Metatheory in the Study of Dual Processing: Reply to Comments.

    Science.gov (United States)

    Evans, Jonathan St B T; Stanovich, Keith E

    2013-05-01

    In this article, we respond to the four comments on our target article. Some of the commentators suggest that we have formulated our proposals in a way that renders our account of dual-process theory untestable and less interesting than the broad theory that has been critiqued in recent literature. Our response is that there is a confusion of levels. Falsifiable predictions occur not at the level of paradigm or metatheory, where this debate is taking place, but rather in the instantiation of such a broad framework in task-level models. Our proposal that many dual-processing characteristics are only correlated features does not weaken the testability of task-level dual-processing accounts. We also respond to arguments that types of processing are not qualitatively distinct and discuss specific evidence disputed by the commentators. Finally, we welcome the constructive comments of one commentator who provides strong arguments for the reality of the dual-process distinction. © The Author(s) 2013.

  14. A Novel Higher-Order Shear and Normal Deformable Plate Theory for the Static, Free Vibration and Buckling Analysis of Functionally Graded Plates

    Directory of Open Access Journals (Sweden)

    Shi-Chao Yi

    2017-01-01

    Full Text Available A closed-form solution of a special higher-order shear and normal deformable plate theory is presented for the static behaviour, natural frequencies, and buckling responses of simply supported functionally graded material (FGM) plates. What distinguishes the new plate theory from the usual ones is its uniqueness: each individual FGM plate has specific characteristics, such as its material properties and length-thickness ratio, and these attributes determine a set of orthogonal polynomials, which in turn form a plate theory exclusive to that plate. Thus, the novel plate theory has two merits: one is orthogonality, whereby the majority of the coefficients of the equations derived from Hamilton's principle are zero; the other is flexibility, whereby the order of the plate theory can be set arbitrarily. Numerical examples with different plate shapes are presented and the results are compared with reference solutions available in the literature. Several aspects of the model involving the relevant parameters, length-to-thickness ratios, stiffness ratios, and so forth, as affected by static and dynamic situations, are analyzed in detail. The applicability and effectiveness of the present method for accurately computing the deflection, stresses, natural frequencies, and buckling response of various FGM plates are thereby demonstrated.

  15. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...
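
    To sketch the variographic idea mentioned above: for a stream of process samples x_1, ..., x_N, the experimental variogram at lag j is commonly estimated as V(j) = Σ_i (x_{i+j} - x_i)² / (2(N - j)); a minimal illustration in Python (synthetic data, not drawn from the book):

      import numpy as np

      def variogram(x, max_lag):
          """Experimental variogram V(j) of a sequence of process samples."""
          x = np.asarray(x, dtype=float)
          return np.array([np.mean((x[j:] - x[:-j]) ** 2) / 2.0
                           for j in range(1, max_lag + 1)])

      # Hypothetical process stream: trend + daily cycle + random noise
      t = np.arange(200.0)
      rng = np.random.default_rng(1)
      stream = 0.01 * t + 0.5 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.2, 200)
      v = variogram(stream, 50)   # the trend and 24-sample cycle show up in V(j)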

  16. Applying circular economy innovation theory in business process modeling and analysis

    Science.gov (United States)

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovative theory as a source for business knowledge management. The last part of the paper presents an author’s proposed basic structure for a new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy this paper provides new ideas for clustering their concepts.

  17. Theory and applications of spherical microphone array processing

    CERN Document Server

    Jarrett, Daniel P; Naylor, Patrick A

    2017-01-01

    This book presents the signal processing algorithms that have been developed to process the signals acquired by a spherical microphone array. Spherical microphone arrays can be used to capture the sound field in three dimensions and have received significant interest from researchers and audio engineers. Algorithms for spherical array processing are different to corresponding algorithms already known in the literature of linear and planar arrays because the spherical geometry can be exploited to great beneficial effect. The authors aim to advance the field of spherical array processing by helping those new to the field to study it efficiently and from a single source, as well as by offering a way for more experienced researchers and engineers to consolidate their understanding, adding either or both of breadth and depth. The level of the presentation corresponds to graduate studies at MSc and PhD level. This book begins with a presentation of some of the essential mathematical and physical theory relevant to ...
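
    The mathematical object at the heart of such algorithms is the expansion of the pressure field on the sphere in spherical harmonics; schematically (a standard decomposition, not quoted from the book):

      \[
      p(k, r, \theta, \phi)
      = \sum_{n=0}^{\infty} \sum_{m=-n}^{n} p_{nm}(k, r)\, Y_n^m(\theta, \phi),
      \qquad
      p_{nm}(k, r) = \int_{\Omega} p(k, r, \theta, \phi)\,
      \bigl[Y_n^m(\theta, \phi)\bigr]^{*} \, \mathrm{d}\Omega ,
      \]

    Beamforming and the other algorithms referred to above then operate on the coefficients p_nm (the "eigenbeams") rather than on the raw microphone signals.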

  18. Supporting the use of theory in cross-country health services research: a participatory qualitative approach using Normalisation Process Theory as an example.

    Science.gov (United States)

    O'Donnell, Catherine A; Mair, Frances S; Dowrick, Christopher; Brún, Mary O'Reilly-de; Brún, Tomas de; Burns, Nicola; Lionis, Christos; Saridaki, Aristoula; Papadakaki, Maria; Muijsenbergh, Maria van den; Weel-Baumgarten, Evelyn van; Gravenhorst, Katja; Cooper, Lucy; Princz, Christine; Teunissen, Erik; Mareeuw, Francine van den Driessen; Vlahadi, Maria; Spiegel, Wolfgang; MacFarlane, Anne

    2017-08-21

    To describe and reflect on the process of designing and delivering a training programme supporting the use of theory, in this case Normalisation Process Theory (NPT), in a multisite cross-country health services research study. Participatory research approach using qualitative methods. Six European primary care settings involving research teams from Austria, England, Greece, Ireland, The Netherlands and Scotland. RESTORE research team consisting of 8 project applicants, all senior primary care academics, and 10 researchers. Professional backgrounds included general practitioners/family doctors, social/cultural anthropologists, sociologists and health services/primary care researchers. Views of all research team members (n=18) were assessed using qualitative evaluation methods, analysed qualitatively by the trainers after each session. Most of the team had no experience of using NPT and many had not applied theory to prospective, qualitative research projects. Early training proved didactic and overloaded participants with information. Drawing on RESTORE's methodological approach of Participatory Learning and Action, workshops using role play, experiential interactive exercises and light-hearted examples not directly related to the study subject matter were developed. Evaluation showed the study team quickly grew in knowledge and confidence in applying theory to fieldwork.Recommendations applicable to other studies include: accepting that theory application is not a linear process, that time is needed to address researcher concerns with the process, and that experiential, interactive learning is a key device in building conceptual and practical knowledge. An unanticipated benefit was the smooth transition to cross-country qualitative coding of study data. A structured programme of training enhanced and supported the prospective application of a theory, NPT, to our work but raised challenges. These were not unique to NPT but could arise with the application of any

  19. Toward a computational theory of conscious processing.

    Science.gov (United States)

    Dehaene, Stanislas; Charles, Lucie; King, Jean-Rémi; Marti, Sébastien

    2014-04-01

    The study of the mechanisms of conscious processing has become a productive area of cognitive neuroscience. Here we review some of the recent behavioral and neuroscience data, with the specific goal of constraining present and future theories of the computations underlying conscious processing. Experimental findings imply that most of the brain's computations can be performed in a non-conscious mode, but that conscious perception is characterized by an amplification, global propagation and integration of brain signals. A comparison of these data with major theoretical proposals suggests that firstly, conscious access must be carefully distinguished from selective attention; secondly, conscious perception may be likened to a non-linear decision that 'ignites' a network of distributed areas; thirdly, information which is selected for conscious perception gains access to additional computations, including temporary maintenance, global sharing, and flexible routing; and finally, measures of the complexity, long-distance correlation and integration of brain signals provide reliable indices of conscious processing, clinically relevant to patients recovering from coma. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Memory processes during sleep: beyond the standard consolidation theory.

    Science.gov (United States)

    Axmacher, Nikolai; Draguhn, Andreas; Elger, Christian E; Fell, Juergen

    2009-07-01

    Two-step theories of memory formation suggest that an initial encoding stage, during which transient neural assemblies are formed in the hippocampus, is followed by a second step called consolidation, which involves re-processing of activity patterns and is associated with an increasing involvement of the neocortex. Several studies in human subjects as well as in animals suggest that memory consolidation occurs predominantly during sleep (standard consolidation model). Alternatively, it has been suggested that consolidation may occur during waking state as well and that the role of sleep is rather to restore encoding capabilities of synaptic connections (synaptic downscaling theory). Here, we review the experimental evidence favoring and challenging these two views and suggest an integrative model of memory consolidation.

  1. "Theory Becoming Alive": The Learning Transition Process of Newly Graduated Nurses in Canada.

    Science.gov (United States)

    Nour, Violet; Williams, Anne M

    2018-01-01

    Background Newly graduated nurses often encounter a gap between theory and practice in clinical settings. Although this has been the focus of considerable research, little is known about the learning transition process. Purpose The purpose of this study was to explore the experiences of newly graduated nurses in acute healthcare settings within Canada. This study was conducted to gain a greater understanding of the experiences and challenges faced by graduates. Methods Grounded theory method was utilized with a sample of 14 registered nurses who were employed in acute-care settings. Data were collected using in-depth interviews. Constant comparative analysis was used to analyze data. Results Findings revealed a core category, "Theory Becoming Alive," and four supporting categories: Entry into Practice, Immersion, Committing, and Evolving. Theory Becoming Alive described the process of new graduate nurses' clinical learning experiences as well as the challenges that they encountered in clinical settings after graduating. Conclusions This research provides a greater understanding of learning process of new graduate nurses in Canada. It highlights the importance of providing supportive environments to assist new graduate nurses to develop confidence as independent registered nurses in clinical areas. Future research directions as well as supportive educational strategies are described.

  2. Verbal Processing Reaction Times in "Normal" and "Poor" Readers.

    Science.gov (United States)

    Culbertson, Jack; And Others

    After it had been determined that reaction time (RT) was a sensitive measure of hemispheric dominance in a verbal task performed by normal adult readers, the reaction times of three groups of subjects (20 normal reading college students, 12 normal reading third graders and 11 poor reading grade school students) were compared. Ss were exposed to…

  3. Note: Determination of torsional spring constant of atomic force microscopy cantilevers: Combining normal spring constant and classical beam theory

    DEFF Research Database (Denmark)

    Álvarez-Asencio, R.; Thormann, Esben; Rutland, M.W.

    2013-01-01

    A technique has been developed for the calculation of torsional spring constants of AFM cantilevers, based on the combination of the normal spring constant and plate/beam theory. It is easy to apply and allows the determination of torsional constants for stiff cantilevers where the thermal power spectrum is difficult to obtain due to the high resonance frequency and low signal/noise ratio. The applicability is shown to be general, and this simple approach can thus be used to obtain torsional constants for any beam-shaped cantilever. © 2013 AIP Publishing LLC.
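
    A sketch of the kind of relation such a combination yields: for a long rectangular cantilever of length L and Poisson ratio ν, elementary beam/plate theory links the torsional constant to the normal one as k_φ = 2 k_N L² / (3(1 + ν)); whether this matches the paper's exact expression is an assumption here.

      def torsional_from_normal(k_n, length, poisson=0.3):
          """Torsional spring constant (N·m/rad) from the normal constant (N/m),
          assuming a long rectangular beam: k_phi = 2 k_N L^2 / (3 (1 + nu))."""
          return 2.0 * k_n * length**2 / (3.0 * (1.0 + poisson))

      # Example: a stiff cantilever with k_N = 40 N/m, L = 125 um (hypothetical values)
      k_phi = torsional_from_normal(40.0, 125e-6)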

  4. Coronary heart disease patients transitioning to a normal life: perspectives and stages identified through a grounded theory approach.

    Science.gov (United States)

    Najafi Ghezeljeh, Tahereh; Yadavar Nikravesh, Mansoureh; Emami, Azita

    2014-02-01

    To explore how Iranian patients with coronary heart disease experience their lives. Coronary heart disease is a leading cause of death in Iran and worldwide. Understanding qualitatively how patients experience the acute and postacute stages of this chronic condition is essential knowledge for minimising the negative consequences of coronary heart disease. Qualitative study using grounded theory for the data analysis. Data for this study were collected through individual qualitative interviews with 24 patients with coronary heart disease, conducted between January 2009 and January 2011. Patients with angina pectoris were selected for participation through purposive sampling, and sample size was determined by data saturation. Data analysis began with initial coding and continued with focused coding. Categories were determined, and the core category was subsequently developed and finalised. The main categories of the transition from acute phase to a modified or 'new normal' life were: (1) Loss of normal life. Experiencing emotions and consequences of illness; (2) Coming to terms. Using coping strategies; (3) Recreating normal life. Healthcare providers must correctly recognise the stages of transition patients navigate while coping with coronary heart disease to support and educate them appropriately throughout these stages. Patients with coronary heart disease lose their normal lives and must work towards recreating a revised life using coping strategies that enable them to come to terms with their situations. By understanding Iranian patients' experiences, healthcare providers and especially nurses can use the information to support and educate patients with coronary heart disease on how to more effectively deal with their illness and its consequences. © 2013 John Wiley & Sons Ltd.

  5. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  6. Mothers' daily person and process praise: implications for children's theory of intelligence and motivation.

    Science.gov (United States)

    Pomerantz, Eva M; Kempner, Sara G

    2013-11-01

    This research examined if mothers' day-to-day praise of children's success in school plays a role in children's theory of intelligence and motivation. Participants were 120 children (mean age = 10.23 years) and their mothers who took part in a 2-wave study spanning 6 months. During the first wave, mothers completed a 10-day daily interview in which they reported on their use of person (e.g., "You are smart") and process (e.g., "You tried hard") praise. Children's entity theory of intelligence and preference for challenge in school were assessed with surveys at both waves. Mothers' person, but not process, praise was predictive of children's theory of intelligence and motivation: The more person praise mothers used, the more children subsequently held an entity theory of intelligence and avoided challenge over and above their earlier functioning on these dimensions.

  7. Theory of Neural Information Processing Systems

    International Nuclear Information System (INIS)

    Galla, Tobias

    2006-01-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the

  8. Bridging between basic theory and clinical practice.

    Science.gov (United States)

    Barnard, Philip J

    2004-09-01

    This paper articulates and discusses the parts played by different processes and representations in the overall conduct of applied clinical science. It distinguishes two sorts of representation, theories in the science base and bridging representations needed to map from real world behaviour to basic theory and from theory back to the real world. It is then argued that macro-theories of the "normal" human mental architecture could help synthesise basic theoretical accounts of diverse psychopathologies, without recourse to special purpose clinical cognitive theories of particular psychopathologies or even specific symptoms. Using the Interacting Cognitive Subsystems model [Affect, Cognition and Change: Re-modelling Depressive Thought, Lawrence Erlbaum Associates, Hove, 1993], some specific macro-theoretic variables are identified. Concrete illustrations are given of how the essence of quite complex basic theory can be translated into a simpler representational format to help clinicians conceptualise a psychopathological state and pinpoint relevant variables that might be changed by therapeutic interventions. Some suggestions are also offered about how the inevitable problem of complexity in multiple component theories might be directly confronted.

  9. Challenging the Ideology of Normal in Schools

    Science.gov (United States)

    Annamma, Subini A.; Boelé, Amy L.; Moore, Brooke A.; Klingner, Janette

    2013-01-01

    In this article, we build on Brantlinger's work to critique the binary of normal and abnormal applied in US schools that create inequities in education. Operating from a critical perspective, we draw from Critical Race Theory, Disability Studies in Education, and Cultural/Historical Activity Theory to build a conceptual framework for…

  10. Using a theory-driven conceptual framework in qualitative health research.

    Science.gov (United States)

    Macfarlane, Anne; O'Reilly-de Brún, Mary

    2012-05-01

    The role and merits of highly inductive research designs in qualitative health research are well established, and there has been a powerful proliferation of grounded theory method in the field. However, tight qualitative research designs informed by social theory can be useful to sensitize researchers to concepts and processes that they might not necessarily identify through inductive processes. In this article, we provide a reflexive account of our experience of using a theory-driven conceptual framework, the Normalization Process Model, in a qualitative evaluation of general practitioners' uptake of a free, pilot, language interpreting service in the Republic of Ireland. We reflect on our decisions about whether or not to use the Model, and describe our actual use of it to inform research questions, sampling, coding, and data analysis. We conclude with reflections on the added value that the Model and tight design brought to our research.

  11. Modulation of the inter-hemispheric processing of semantic information during normal aging. A divided visual field experiment.

    Science.gov (United States)

    Hoyau, E; Cousin, E; Jaillard, A; Baciu, M

    2016-12-01

    We evaluated the effect of normal aging on the inter-hemispheric processing of semantic information by using the divided visual field (DVF) method, with words and pictures. Two main theoretical models have been considered, (a) the HAROLD model, which posits that aging is associated with supplementary recruitment of the right hemisphere (RH) and decreased hemispheric specialization, and (b) the RH decline theory, which assumes that the RH becomes less efficient with aging, associated with increased LH specialization. Two groups of subjects were examined, a Young Group (YG) and an Old Group (OG), while participants performed a semantic categorization task (living vs. non-living) on words and pictures. The DVF was realized in two steps: (a) unilateral DVF presentation with stimuli presented separately in each visual field, left or right, allowing for their initial processing by only one hemisphere, right or left, respectively; (b) bilateral DVF presentation (BVF) with stimuli presented simultaneously in both visual fields, followed by their processing by both hemispheres. These two types of presentation permitted the evaluation of two main characteristics of the inter-hemispheric processing of information, the hemispheric specialization (HS) and the inter-hemispheric cooperation (IHC). Moreover, the BVF allowed determining the driver-hemisphere for processing information presented in BVF. Results obtained in OG indicated that: (a) semantic categorization was performed as accurately as in YG, even if more slowly, (b) a non-semantic RH decline was observed, and (c) the LH controls the semantic processing during the BVF, suggesting an increased role of the LH in aging. However, despite the stronger involvement of the LH in OG, the RH is not completely devoid of semantic abilities. As discussed in the paper, neither the HAROLD model nor the RH decline theory fully explains this pattern of results. We rather suggest that the effect of aging on the hemispheric specialization and inter

  12. Normal processes of phonon-phonon scattering and thermal conductivity of germanium crystals with isotopic disorder

    CERN Document Server

    Kuleev, I G

    2001-01-01

    The effect of normal phonon-phonon scattering processes on the thermal conductivity of germanium crystals with various degrees of isotopic disorder is considered. The redistribution of phonon momentum in normal scattering processes, both within each vibrational branch (the Simons mechanism) and between different phonon vibrational branches (the Herring mechanism), is accounted for. The contributions of the drift motion of longitudinal and transverse phonons to the thermal conductivity are analyzed. It is shown that the momentum redistribution in the Herring relaxation mechanism leads to essential suppression of the drift motion of longitudinal phonons in isotopically pure germanium crystals. The calculated thermal conductivities for the Herring relaxation mechanism agree well with experimental data on germanium crystals with various degrees of isotopic disorder.

  13. Optimal operation planning of radioactive waste processing system by fuzzy theory

    International Nuclear Information System (INIS)

    Yang, Jin Yeong; Lee, Kun Jai

    2000-01-01

    This study is concerned with the application of linear goal programming and fuzzy theory to the analysis of management and operational problems in a radioactive waste processing system (RWPS). The developed model is validated and verified using actual data obtained from the RWPS at Kyoto University in Japan. The solution by goal programming and fuzzy theory shows the optimal operating point, which maximizes the total treatable radioactive waste volume and minimizes the released radioactivity of liquid waste, even under restricted resources. (orig.)
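
    A minimal sketch of the weighted goal-programming step (hypothetical coefficients, not the Kyoto RWPS data, and with the fuzzy-membership weighting omitted): the two goals are folded into one linear objective under shared resource constraints.

      import numpy as np
      from scipy.optimize import linprog

      # x[0], x[1]: waste volumes (m^3) routed to two treatment processes.
      activity = np.array([0.8, 0.3])   # released activity per m^3 treated
      w_vol, w_act = 1.0, 2.0           # goal weights
      c = -w_vol * np.ones(2) + w_act * activity   # minimize: -volume + activity

      A_ub = np.array([[2.0, 5.0],      # labour hours consumed per m^3
                       [1.0, 1.0]])     # tank throughput per m^3
      b_ub = np.array([100.0, 40.0])    # available labour, tank capacity

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2, method="highs")
      # res.x is the compromise operating point between the two goals.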

  14. The theory of nilpotent groups

    CERN Document Server

    Clement, Anthony E; Zyman, Marcos

    2017-01-01

    This monograph presents both classical and recent results in the theory of nilpotent groups and provides a self-contained, comprehensive reference on the topic.  While the theorems and proofs included can be found throughout the existing literature, this is the first book to collect them in a single volume.  Details omitted from the original sources, along with additional computations and explanations, have been added to foster a stronger understanding of the theory of nilpotent groups and the techniques commonly used to study them.  Topics discussed include collection processes, normal forms and embeddings, isolators, extraction of roots, P-localization, dimension subgroups and Lie algebras, decision problems, and nilpotent groups of automorphisms.  Requiring only a strong undergraduate or beginning graduate background in algebra, graduate students and researchers in mathematics will find The Theory of Nilpotent Groups to be a valuable resource.

  15. Working Memory Processing In Normal Subjects and Subjects with Dyslexia

    Science.gov (United States)

    Bowyer, S. M.; Lajiness-O'Neill, R.; Weiland, B. J.; Mason, K.; Tepley, N.

    2004-10-01

    Magnetoencephalography (MEG) was used to determine the neuroanatomical location of working memory (WM) processes. Differences between subjects with dyslexia (SD; n=5) and normal readers (NR; n=5) were studied during two WM tasks. A spatial WM task (SWM) consisted of blocks visually presented in one of 12 positions for 2 s each. Subjects were to determine if the current position matched the position presented 2 slides earlier (N-Back Test). The verbal WM task (VWM) consisted of presentation of a single letter. The location of cortical activity during SWM in NR (determined with MR-FOCUSS analysis) was in the right superior temporal gyrus (STG) and right angular gyrus (AG). Similar activation was seen in SD with a slight delay of approximately 20 ms. During VWM, activity was seen in LEFT STG and LEFT AG in NR. In contrast, for SD, activation was in the RIGHT STG and RIGHT AG. This study demonstrates the possibility of differentiating WM processing in subjects with and without learning disorders.

  16. Overriding Moral Intuitions – Does It Make Us Immoral? Dual-Process Theory of Higher Cognition Account for Moral Reasoning

    OpenAIRE

    Michał Białek; Simon J. Handley

    2013-01-01

    Moral decisions are considered an intuitive process, while conscious reasoning is mostly used only to justify those intuitions. This problem is described in a few different dual-process theories of mind that are being developed, e.g., by Frederick and Kahneman, Stanovich, and Evans. Those theories have recently evolved into tri-process theories, with a proposed process that makes the ultimate decision or allows paraformal processing with focal bias. The presented experiment compares

  17. Theory of superconductivity. II. Excited Cooper pairs. Why does sodium remain normal down to 0 K?

    International Nuclear Information System (INIS)

    Fujita, S.

    1992-01-01

    Based on a generalized BCS Hamiltonian in which the interaction strengths (V_11, V_22, V_12) among and between electron (1) and hole (2) Cooper pairs are differentiated, the thermodynamic properties of a type-I superconductor below the critical temperature T_c are investigated. An expression for the ground-state energy, W - W_0, relative to the unperturbed Bloch system is obtained. The usual BCS formulas are obtained in the limits: (all) V_jl = V_0, N_1(0) = N_2(0). Any excitations generated through the BCS interaction Hamiltonian containing V_jl must involve Cooper pairs of antiparallel spins and nearly opposite momenta. The nonzero-momentum or excited Cooper pairs below T_c are shown to have an excitation energy band minimum lower than the quasi-electrons, which were regarded as the elementary excitations in the original BCS theory. The energy gap ε_g(T), defined relative to excited and zero-momentum Cooper pairs (when V_jl > 0), decreases from ε_g(0) to 0 as the temperature T is raised from 0 to T_c. If electrons only are available, as in a monovalent metal like sodium (V_12 = 0), the energy constant Δ_1 is finite but the energy gap vanishes identically for all T. In agreement with the BCS theory, the present theory predicts that a pure nonmagnetic metal in any dimensions should have a Cooper-pair ground state whose energy is lower than that of the Bloch ground state. Additionally, it predicts that a monovalent metal should remain normal down to 0 K, and that there should be no strictly one-dimensional superconductor.

  18. Integration of multiple theories for the simulation of laser interference lithography processes.

    Science.gov (United States)

    Lin, Te-Hsun; Yang, Yin-Kuang; Fu, Chien-Chung

    2017-11-24

    The periodic structure of laser interference lithography (LIL) fabrication is superior to other lithography technologies. In contrast to traditional lithography, LIL has the advantages of being a simple optical system with no mask requirements, low cost, high depth of focus, and large patterning area in a single exposure. Generally, a simulation pattern for the periodic structure is obtained through optical interference prior to its fabrication through LIL. However, the LIL process is complex and combines the fields of optical and polymer materials; thus, a single simulation theory cannot reflect the real situation. Therefore, this research integrates multiple theories, including those of optical interference, standing waves, and photoresist characteristics, to create a mathematical model for the LIL process. The mathematical model can accurately estimate the exposure time and reduce the LIL process duration through trial and error.
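
    The optical-interference ingredient of the model is compact enough to state: two coherent beams of wavelength λ crossing at half-angle θ produce a fringe period Λ = λ / (2 sin θ); a minimal sketch with illustrative parameters (the standing-wave and photoresist terms of the full model are omitted):

      import numpy as np

      lam = 325e-9                         # He-Cd laser wavelength (illustrative)
      theta = np.deg2rad(30.0)             # half-angle between the two beams
      period = lam / (2 * np.sin(theta))   # fringe period, 325 nm here

      x = np.linspace(0.0, 5 * period, 1000)
      I = 2.0 * (1.0 + np.cos(2 * np.pi * x / period))  # two-beam intensity, I0 = 1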

  20. An introduction to continuous-time stochastic processes theory, models, and applications to finance, biology, and medicine

    CERN Document Server

    Capasso, Vincenzo

    2015-01-01

    This textbook, now in its third edition, offers a rigorous and self-contained introduction to the theory of continuous-time stochastic processes, stochastic integrals, and stochastic differential equations. Expertly balancing theory and applications, the work features concrete examples of modeling real-world problems from biology, medicine, industrial applications, finance, and insurance using stochastic methods. No previous knowledge of stochastic processes is required. Key topics include: * Markov processes * Stochastic differential equations * Arbitrage-free markets and financial derivatives * Insurance risk * Population dynamics and epidemics * Agent-based models New to the Third Edition: * Infinitely divisible distributions * Random measures * Lévy processes * Fractional Brownian motion * Ergodic theory * Karhunen-Loève expansion * Additional applications * Additional exercises * Smoluchowski approximation of Langevin systems An Introduction to Continuous-Time Stochastic Processes, Third Editio...

  1. Processes of self-regulated learning in music theory in elementary music schools in Slovenia

    OpenAIRE

    Peklaj, Cirila; Smolej-Fritz, Barbara

    2015-01-01

    The aim of our study was to determine how students regulate their learning in music theory (MT). The research is based on the socio-cognitive theory of learning. The aim was twofold: first, to design the instruments for measuring (meta)cognitive and affective-motivational processes in learning MT, and, second, to examine the relationship between these processes. A total of 457 fifth- and sixth-grade students from 10 different elementary music schools in Slovenia participated in the...

  2. Signal Detection Theory-Based Information Processing for the Detection of Breast Cancer at Microwave Frequencies

    National Research Council Canada - National Science Library

    Nolte, Loren

    2002-01-01

    The hypothesis is that one can use signal detection theory to improve the performance in detecting tumors in the breast by using this theory to develop task-oriented information processing techniques...

  3. [Who teaches queer: the prospect of queer theory analysis in the health education process].

    Science.gov (United States)

    Motta, Jose Inácio Jardim; Ribeiro, Victória Maria Brant

    2013-06-01

    The scope of this essay is to reflect on the possibilities of incorporating a queer analytical framework into processes of education in the health field. This is because the development of the Unified Health System, with its new set of health practices, has revealed challenges that include broadening the knowledge required to revitalize the notion of the subject. Queer theory is needed to understand how identities, and in particular gender and sexuality, are incorporated in a social and cultural process, and how, in micro-social spaces, it can determine educational practices with the power to reinforce the status of so-called minority sexualities. Queer theory, framed within the so-called post-critical theories of education, is analyzed from the categories of power, resistance and transgression in the context of standardization and subjectivity. It is assumed that processes of education in health, grounded in queer teaching and working in terms of difference rather than diversity, propose processes of deconstruction of binaries such as nature/culture, reason/passion and homosexual/heterosexual, working towards shaping more assertive cultural and social subjects.

  4. Recent experiments testing an opponent-process theory of acquired motivation.

    Science.gov (United States)

    Solomon, R L

    1980-01-01

    There are acquired motives of the addiction type which seem to be non-associative in nature. They all seem to involve affective phenomena caused by reinforcers, unconditioned stimuli or innate releasers. When such stimuli are repeatedly presented, at least three affective phenomena occur: (1) affective contrast effects, (2) affective habituation (tolerance), and (3) affective withdrawal syndromes. These phenomena can be precipitated either by pleasant or unpleasant events (positive or negative reinforcers). Whenever we see these three phenomena, we also see the development of an addictive cycle, a new motivational system. These phenomena are explained by an opponent-process theory of motivation which holds that there are affect control systems which oppose large departures from affective equilibrium. The control systems are strengthened by use and weakened by disuse. Current observations and experiments testing the theory are described for: (1) the growth of social attachment (imprinting) in ducklings; and (2) the growth of adjunctive behaviors. The findings so far support the theory.

  5. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory.

    Science.gov (United States)

    Pelaccia, Thierry; Tardif, Jacques; Triby, Emmanuel; Charlin, Bernard

    2011-03-14

    Clinical reasoning plays a major role in the ability of doctors to make diagnoses and decisions. It is considered as the physician's most critical competence, and has been widely studied by physicians, educationalists, psychologists and sociologists. Since the 1970s, many theories about clinical reasoning in medicine have been put forward. This paper aims at exploring a comprehensive approach: the "dual-process theory", a model developed by cognitive psychologists over the last few years. After 40 years of sometimes contradictory studies on clinical reasoning, the dual-process theory gives us many answers on how doctors think while making diagnoses and decisions. It highlights the importance of physicians' intuition and the high level of interaction between analytical and non-analytical processes. However, it has not received much attention in the medical education literature. The implications of dual-process models of reasoning in terms of medical education will be discussed.

  6. R-Matrix Theory of Atomic Collisions Application to Atomic, Molecular and Optical Processes

    CERN Document Server

    Burke, Philip George

    2011-01-01

    Commencing with a self-contained overview of atomic collision theory, this monograph presents recent developments of R-matrix theory and its applications to a wide range of atomic, molecular and optical processes. These developments include electron and photon collisions with atoms, ions and molecules, required in the analysis of laboratory and astrophysical plasmas; multiphoton processes, required in the analysis of superintense laser interactions with atoms and molecules; and positron collisions with atoms and molecules, required in antimatter studies of scientific and technological importance. Basic mathematical results and general and widely used R-matrix computer programs are summarized in the appendices.

  7. The theory, practice, and future of process improvement in general thoracic surgery.

    Science.gov (United States)

    Freeman, Richard K

    2014-01-01

    Process improvement, in its broadest sense, is the analysis of a given set of actions with the aim of elevating quality and reducing costs. The tenets of process improvement have been applied to medicine with increasing frequency for at least the last quarter century, including in thoracic surgery. This review outlines the theory underlying process improvement, the currently available data sources for process improvement, and possible future directions of research. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Fractional Flow Theory Applicable to Non-Newtonian Behavior in EOR Processes

    NARCIS (Netherlands)

    Rossen, W.R.; Venkatraman, A.; Johns, R.T.; Kibodeaux, K.R.; Lai, H.; Moradi Tehrani, N.

    2011-01-01

    The method of characteristics, or fractional-flow theory, is extremely useful in understanding complex Enhanced Oil Recovery (EOR) processes and in calibrating simulators. One limitation has been its restriction to Newtonian rheology except in rectilinear flow. Its inability to deal with

  9. Fuzzy-Trace Theory and Lifespan Cognitive Development.

    Science.gov (United States)

    Brainerd, C J; Reyna, Valerie F

    2015-12-01

    Fuzzy-trace theory (FTT) emphasizes the use of core theoretical principles, such as the verbatim-gist distinction, to predict new findings about cognitive development that are counterintuitive from the perspective of other theories or of common sense. To the extent that such predictions are confirmed, the range of phenomena that are explained expands without increasing the complexity of the theory's assumptions. We examine research on recent examples of such predictions during four epochs of cognitive development: childhood, adolescence, young adulthood, and late adulthood. During the first two, the featured predictions are surprising developmental reversals in false memory (childhood) and in risky decision making (adolescence). During young adulthood, FTT predicts that a retrieval operation that figures centrally in dual-process theories of memory, recollection, is bivariate rather than univariate. During late adulthood, FTT identifies a retrieval operation, reconstruction, that has been omitted from current theories of normal memory declines in aging and pathological declines in dementia. The theory predicts that reconstruction is a major factor in such declines and that it is able to forecast future dementia.

  10. Quasi-Normal Modes of Stars and Black Holes

    Directory of Open Access Journals (Sweden)

    Kokkotas Kostas

    1999-01-01

    Perturbations of stars and black holes have been one of the main topics of relativistic astrophysics for the last few decades. They are of particular importance today, because of their relevance to gravitational wave astronomy. In this review we present the theory of quasi-normal modes of compact objects from both the mathematical and astrophysical points of view. The discussion includes perturbations of black holes (Schwarzschild, Reissner-Nordström, Kerr and Kerr-Newman) and relativistic stars (non-rotating and slowly-rotating). The properties of the various families of quasi-normal modes are described, and numerical techniques for calculating quasi-normal modes are reviewed. The successes, as well as the limits, of perturbation theory are presented, and its role in the emerging era of numerical relativity and supercomputers is discussed.

  11. Translating Unstructured Workflow Processes to Readable BPEL: Theory and Implementation

    DEFF Research Database (Denmark)

    van der Aalst, Willibrordus Martinus Pancratius; Lassen, Kristian Bisgaard

    2008-01-01

    The Business Process Execution Language for Web Services (BPEL) has emerged as the de-facto standard for implementing processes. Although intended as a language for connecting web services, its application is not limited to cross-organizational processes. It is expected that in the near future...... and not easy to use by end-users. Therefore, we provide a mapping from Workflow Nets (WF-nets) to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. In addition to this we have implemented the algorithm in a tool called...

  12. Novel welding image processing method based on fractal theory

    Institute of Scientific and Technical Information of China (English)

    陈强; 孙振国; 肖勇; 路井荣

    2002-01-01

    Computer vision has come into use in the fields of welding process control and automation. In order to improve the precision and speed of welding image processing, a novel method based on fractal theory is put forward in this paper. In contrast to traditional methods, the image is first processed coarsely in the macroscopic regions and then analyzed thoroughly in the microscopic regions. The image is divided into regions according to the fractal characteristics of the image edges, and the fuzzy regions containing edges are detected; the edges are then identified with the Sobel operator and fitted by the least-squares method (LSM). Since the amount of data to be processed is decreased and the image noise is reduced, experiments have verified that the edges of the weld seam or weld pool can be recognized correctly and quickly.
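
    The coarse-to-fine strategy described above can be sketched in code. This is a schematic reconstruction rather than the authors' implementation: the block-level fractal-character test is replaced by a crude standard-deviation roughness proxy, and all names, thresholds and the synthetic image are invented for illustration.

        import numpy as np
        from scipy import ndimage

        def weld_edges(img, block=16, rough_thresh=5.0):
            """Flag 'fuzzy' blocks first, run the Sobel operator only there,
            then fit the detected edge pixels by least squares."""
            h, w = img.shape
            edge_pts = []
            for i in range(0, h - block + 1, block):
                for j in range(0, w - block + 1, block):
                    patch = img[i:i + block, j:j + block].astype(float)
                    if patch.std() < rough_thresh:   # smooth macroscopic region: skip
                        continue
                    gx = ndimage.sobel(patch, axis=1)
                    gy = ndimage.sobel(patch, axis=0)
                    mag = np.hypot(gx, gy)
                    ys, xs = np.nonzero(mag > 0.5 * mag.max())
                    edge_pts += [(j + x, i + y) for y, x in zip(ys, xs)]
            pts = np.array(edge_pts)
            m, c = np.polyfit(pts[:, 0], pts[:, 1], 1)   # least-squares line y = m*x + c
            return pts, m, c

        # synthetic image with a horizontal step edge standing in for a weld seam
        img = np.zeros((64, 64)); img[30:, :] = 255.0
        pts, m, c = weld_edges(img, rough_thresh=1.0)
        print(f"fitted edge: y = {m:.2f}*x + {c:.2f}")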

  13. Evaluation of EMG processing techniques using Information Theory.

    Science.gov (United States)

    Farfán, Fernando D; Politti, Julio C; Felice, Carmelo J

    2010-11-12

    Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movement of the arm in the scapular plane were registered, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
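
    The four time-domain features compared in this study are straightforward to compute. The sketch below is a minimal illustration, not the authors' code; the window length follows the 200 ms segmentation the study found optimal for static contractions, and the synthetic signal and function names are assumptions.

        import numpy as np

        def emg_features(signal, fs, window_ms=200):
            """AMV, RMS, VAR and DAMV over non-overlapping analysis windows."""
            n = int(fs * window_ms / 1000)                 # samples per window
            w = signal[:len(signal) // n * n].reshape(-1, n)
            amv = np.mean(np.abs(w), axis=1)                      # absolute mean value
            rms = np.sqrt(np.mean(w ** 2, axis=1))                # root mean square
            var = np.var(w, axis=1, ddof=1)                       # variance
            damv = np.mean(np.abs(np.diff(w, axis=1)), axis=1)    # difference absolute mean value
            return amv, rms, var, damv

        # amplitude-modulated noise standing in for a 2 s EMG record sampled at 1 kHz
        rng = np.random.default_rng(0)
        emg = rng.normal(0.0, 1.0, 2000) * np.hanning(2000)
        amv, rms, var, damv = emg_features(emg, fs=1000)
        print(np.round(rms, 3))   # one RMS value per 200 ms window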

  14. A field theory description of constrained energy-dissipation processes

    International Nuclear Information System (INIS)

    Mandzhavidze, I.D.; Sisakyan, A.N.

    2002-01-01

    A field theory description of dissipation processes constrained by a high-symmetry group is given. The formalism is presented using the example of multiple-hadron production processes, where the transition to thermodynamic equilibrium results from the kinetic energy of the colliding particles dissipating into hadron masses. The dynamics of these processes is restricted because the constraints responsible for the colour charge confinement must be taken into account. We develop a more general S-matrix formulation of the thermodynamics of nonequilibrium dissipative processes and find a necessary and sufficient condition for the validity of this description; this condition is similar to the correlation relaxation condition, which, according to Bogolyubov, must apply as the system approaches equilibrium. This situation must physically occur in processes with an extremely high multiplicity, at least if the hadron mass is nonzero. We also describe a new strong-coupling perturbation scheme, which is useful for taking symmetry restrictions on the dynamics of dissipation processes into account. We review the literature devoted to this problem

  15. Hidden Markov processes theory and applications to biology

    CERN Document Server

    Vidyasagar, M

    2014-01-01

    This book explores important aspects of Markov and hidden Markov processes and the applications of these ideas to various problems in computational biology. The book starts from first principles, so that no previous knowledge of probability is necessary. However, the work is rigorous and mathematical, making it useful to engineers and mathematicians, even those not interested in biological applications. A range of exercises is provided, including drills to familiarize the reader with concepts and more advanced problems that require deep thinking about the theory. Biological applications are t

  16. Whiteheadian process and quantum theory of mind

    International Nuclear Information System (INIS)

    Stapp, H.

    1998-01-01

    There are deep similarities between Whitehead's idea of the process by which nature unfolds and the ideas of quantum theory. Whitehead says that the world is made of ''actual occasions'', each of which arises from potentialities created by prior actual occasions. These actual occasions are happenings modeled on experiential events, each of which comes into being and then perishes, only to be replaced by a successor. It is these experience-like happenings that are the basic realities of nature, according to Whitehead, not the persisting physical particles that Newtonian physics took to be the basic entities. Similarly, Heisenberg says that what is really happening in a quantum process is the emergence of an actuality from potentialities created by prior actualities. In the orthodox Copenhagen interpretation of quantum theory the actual things to which the theory refers are increments in ''our knowledge''. These increments are experiential events. The particles of classical physics lose their fundamental status: they dissolve into diffuse clouds of possibilities. At each stage of the unfolding of nature the complete cloud of possibilities acts like the potentiality for the occurrence of a next increment in knowledge, whose occurrence can radically change the cloud of possibilities/potentialities for the still-later increments in knowledge. The fundamental difference between these ideas about nature and the classical ideas that reigned from the time of Newton until this century concerns the status of the experiential aspects of nature. These are things such as thoughts, ideas, feelings, and sensations. They are distinguished from the physical aspects of nature, which are described in terms of quantities explicitly located in tiny regions of space and time. According to the ideas of classical physics the physical world is made up exclusively of things of this latter type, and the unfolding of the physical world is determined by causal connections involving only these things

  17. Zero cosmological constant from normalized general relativity

    International Nuclear Information System (INIS)

    Davidson, Aharon; Rubin, Shimon

    2009-01-01

    Normalizing the Einstein-Hilbert action by the volume functional makes the theory invariant under constant shifts in the Lagrangian. The associated field equations then resemble unimodular gravity whose otherwise arbitrary cosmological constant is now determined as a Machian universal average. We prove that an empty space-time is necessarily Ricci tensor flat, and demonstrate the vanishing of the cosmological constant within the scalar field paradigm. The cosmological analysis, carried out at the mini-superspace level, reveals a vanishing cosmological constant for a universe which cannot be closed as long as gravity is attractive. Finally, we give an example of a normalized theory of gravity which does give rise to a non-zero cosmological constant.
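
    Schematically, the normalization described above can be written as follows (a sketch of the idea as far as it can be inferred from the abstract, not the authors' exact action):

        \[
        S_N[g] = \frac{\int d^4x \,\sqrt{-g}\,\mathcal{L}}{\int d^4x \,\sqrt{-g}},
        \qquad
        \mathcal{L} \to \mathcal{L} + c
        \;\Longrightarrow\;
        S_N \to S_N + c ,
        \]

    so adding a constant to the Lagrangian (a bare cosmological-constant term) merely shifts the normalized action by a constant and drops out of the field equations, which is the shift invariance the abstract refers to.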

  18. Imitation and processes of institutionalization - Insights from Bourdieu's theory of practice

    NARCIS (Netherlands)

    Sieweke, Jost

    2014-01-01

    New institutional theory highlights the importance of language in processes of institutionalization, but Bourdieu argues that institutions are also transmitted by mimesis, i.e., the unconscious imitation of other actors' actions. The aim of this paper is to develop a framework that explains

  19. How Innovation Theory Can Contribute to the Military Operations Planning Process

    DEFF Research Database (Denmark)

    Heltberg, Anna Therese; Dahl, Kåre

    The research study considers how the application of innovation theory might contribute to military staff work planning processes and bring new perspectives to operational models of analysis such as NATO’s Comprehensive Operations Planning Directive (COPD) and the Danish Field Manual III....

  20. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    Normalization is a pre-processing stage for any type of problem statement. In particular, normalization plays an important role in fields such as soft computing and cloud computing for manipulating data, that is, scaling the range of the data down or up before it is used in a further stage. There are many normalization techniques, namely min-max normalization, Z-score normalization and decimal scaling normalization. By referring to these normalization techniques we are ...
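
    For concreteness, the three techniques named above can be sketched as follows; this is a minimal illustration with invented sample data, not code from the paper.

        import numpy as np

        def min_max(x, new_min=0.0, new_max=1.0):
            """Min-max normalization: rescale x linearly into [new_min, new_max]."""
            return (x - x.min()) / (x.max() - x.min()) * (new_max - new_min) + new_min

        def z_score(x):
            """Z-score normalization: zero mean, unit standard deviation."""
            return (x - x.mean()) / x.std()

        def decimal_scaling(x):
            """Decimal scaling: divide by the smallest power of 10 that makes |x| < 1."""
            j = int(np.floor(np.log10(np.abs(x).max()))) + 1
            return x / 10.0 ** j

        data = np.array([120.0, 250.0, 375.0, 990.0])
        print(min_max(data))          # values rescaled into [0, 1]
        print(z_score(data))          # mean 0, standard deviation 1
        print(decimal_scaling(data))  # j = 3, so every value is divided by 1000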

  1. Scalar utility theory and proportional processing: What does it actually imply?

    Science.gov (United States)

    Rosenström, Tom; Wiesner, Karoline; Houston, Alasdair I

    2016-09-07

    Scalar Utility Theory (SUT) is a model used to predict animal and human choice behaviour in the context of reward amount, delay to reward, and variability in these quantities (risk preferences). This article reviews and extends SUT, deriving novel predictions. We show that, contrary to what has been implied in the literature, (1) SUT can predict both risk averse and risk prone behaviour for both reward amounts and delays to reward depending on experimental parameters, (2) SUT implies violations of several concepts of rational behaviour (e.g. it violates strong stochastic transitivity and its equivalents, and leads to probability matching) and (3) SUT can predict, but does not always predict, a linear relationship between risk sensitivity in choices and coefficient of variation in the decision-making experiment. SUT derives from Scalar Expectancy Theory which models uncertainty in behavioural timing using a normal distribution. We show that the above conclusions also hold for other distributions, such as the inverse Gaussian distribution derived from drift-diffusion models. A straightforward way to test the key assumptions of SUT is suggested and possible extensions, future prospects and mechanistic underpinnings are discussed.
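
    A toy simulation illustrates the scalar-representation mechanism behind these predictions. The sketch below is an assumption-laden illustration, not the authors' implementation: each option's remembered value is sampled from a normal distribution whose standard deviation is proportional to its mean (the scalar property), and the option with the larger sample is chosen. With these particular parameters the simulated agent comes out risk averse for reward amounts; as the article stresses, other parameters can reverse the preference.

        import numpy as np

        rng = np.random.default_rng(1)
        cv, n = 0.3, 100_000   # coefficient of variation and number of simulated choices

        def perceived(amount):
            """Scalar representation: one memory sample ~ Normal(amount, cv * amount)."""
            return rng.normal(amount, cv * amount, n)

        # fixed option always pays 5 units; risky option is a 50/50 gamble on 1 or 9 (same mean)
        fixed = perceived(5.0)
        risky = np.where(rng.random(n) < 0.5, perceived(1.0), perceived(9.0))

        p_risky = np.mean(risky > fixed)
        print(f"P(choose risky) = {p_risky:.3f}")   # below 0.5 here, i.e. risk-averse choices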

  2. High energy instanton induced processes in electroweak theory

    International Nuclear Information System (INIS)

    McLerran, L.

    1992-01-01

    It is well known that in electroweak theory, baryon plus lepton number is conserved by the classical equations of motion. This is of course consistent with the lack of experimental observation of such processes. It is a little less well known that when quantum corrections are included in electroweak theory, baryon plus lepton number is not conserved. This was first discovered as a consequence of the Adler-Bardeen-Bell-Jackiw triangle anomaly. It is perhaps most easily understood as a consequence of vacuum degeneracy, fermion energy level crossing and filling of the negative energy Dirac sea upon second quantization. To understand how baryon plus lepton number is not conserved upon second quantization, consider the following situation: the energy of the system is shown as a function of a parameter which characterizes the gauge fields, the Chern-Simons charge. The Chern-Simons charge is a function only of the gauge fields, and the B + L change is equal to the change in Chern-Simons charge, ΔQ_{B+L} = ΔQ_{CS}
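
    For reference, in one standard convention the anomaly relation carries a factor counting the N_f fermion generations, which the normalization used above absorbs; with that factor explicit, and with the Chern-Simons charge written in its usual gauge-field form, one has

        \[
        \Delta Q_{B+L} = 2 N_f \,\Delta Q_{CS},
        \qquad
        Q_{CS} = \frac{g^2}{32\pi^2} \int d^3x \,\epsilon^{ijk}
        \left( W^a_{ij} W^a_k - \frac{g}{3}\,\epsilon_{abc}\, W^a_i W^b_j W^c_k \right).
        \]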

  3. Practice of Connectivism As Learning Theory: Enhancing Learning Process Through Social Networking Site (Facebook)

    Directory of Open Access Journals (Sweden)

    Fahriye Altınay Aksal

    2013-12-01

    The impact of the digital age on learning and social interaction has been growing rapidly. The realm of the digital age and computer-mediated communication requires reconsidering instruction based on a collaborative, interactive learning process and socio-contextual experience for learning. Social networking sites such as Facebook can help create group space for digital dialogue to inform, question and challenge within a frame of connectivism as a learning theory for the digital age. The aim of this study is to elaborate the practice of connectivism as a learning theory in the context of an internship course. A Facebook group space provided a social learning platform for dialogue and negotiation alongside the classroom teaching and learning process in this study. The 35 internship students provided self-reports within the frame of this qualitative research. The results showed how the principles of the theory were put into practice and how the theory and the Facebook group space contributed to learning, self-leadership, decision-making and reflection skills. As the research reflects a practice of a new theory based on action research, learning is not an individualistic endeavour in the digital age, as regards the debate on learning within a frame of connectivism

  4. Proposed experimental test of the theory of hole superconductivity

    Energy Technology Data Exchange (ETDEWEB)

    Hirsch, J.E., E-mail: jhirsch@ucsd.edu

    2016-06-15

    Highlights: • The conventional theory of superconductivity predicts no charge flow when the normal-superconductor phase boundary moves. • The theory of hole superconductivity predicts flow and counterflow of charge. • An experiment to measure a voltage is proposed. • No voltage will be measured if the conventional theory is correct. • A voltage will be measured if the theory of hole superconductivity is correct. - Abstract: The theory of hole superconductivity predicts that in the reversible transition between normal and superconducting phases in the presence of a magnetic field there is charge flow in direction perpendicular to the normal-superconductor phase boundary. In contrast, the conventional BCS-London theory of superconductivity predicts no such charge flow. Here we discuss an experiment to test these predictions.

  5. Close relationship processes and health: implications of attachment theory for health and disease.

    Science.gov (United States)

    Pietromonaco, Paula R; Uchino, Bert; Dunkel Schetter, Christine

    2013-05-01

    Health psychology has contributed significantly to understanding the link between psychological factors and health and well-being, but it has not often incorporated advances in relationship science into hypothesis generation and study design. We present one example of a theoretical model, following from a major relationship theory (attachment theory) that integrates relationship constructs and processes with biopsychosocial processes and health outcomes. We briefly describe attachment theory and present a general framework linking it to dyadic relationship processes (relationship behaviors, mediators, and outcomes) and health processes (physiology, affective states, health behavior, and health outcomes). We discuss the utility of the model for research in several health domains (e.g., self-regulation of health behavior, pain, chronic disease) and its implications for interventions and future research. This framework revealed important gaps in knowledge about relationships and health. Future work in this area will benefit from taking into account individual differences in attachment, adopting a more explicit dyadic approach, examining more integrated models that test for mediating processes, and incorporating a broader range of relationship constructs that have implications for health. A theoretical framework for studying health that is based in relationship science can accelerate progress by generating new research directions designed to pinpoint the mechanisms through which close relationships promote or undermine health. Furthermore, this knowledge can be applied to develop more effective interventions to help individuals and their relationship partners with health-related challenges.

  6. UNIVERSITY TEACHING-LEARNING PROCESS: REFLECTIONS THROUGH THE AGENCY THEORY

    Directory of Open Access Journals (Sweden)

    Víctor Jacques Parraguez

    2011-04-01

    This work analyses some reasons that might explain the insufficient academic level perceived in universities of developing countries. The element under discussion is the teacher-student relationship, which is studied from the perspective of agency theory. It is concluded that, in the absence of efficient mechanisms for monitoring teacher and student behavior, lapses of due diligence may proliferate, undermining the quality of the teaching-learning process.

  7. Microscopic theory of normal liquid 3He

    International Nuclear Information System (INIS)

    Nafari, N.; Doroudi, A.

    1994-03-01

    We have used the self-consistent scheme proposed by Singwi, Tosi, Land and Sjoelander (STLS) to study the properties of normal liquid ³He. By employing the Aziz potential (HFD-B) and some other realistic pairwise interactions, we have calculated the static structure factor, the pair-correlation function, the zero sound frequencies as a function of wave-vector, and the Landau parameter F_0^s for different densities. Our results show considerable improvement over Ng-Singwi's model potential of a hard core plus an attractive tail. Agreement between our results and the experimental data for the static structure factor and the zero sound frequencies is fairly good. (author). 30 refs, 6 figs, 2 tabs

  8. A theory-informed, process-oriented Resident Scholarship Program.

    Science.gov (United States)

    Thammasitboon, Satid; Darby, John B; Hair, Amy B; Rose, Karen M; Ward, Mark A; Turner, Teri L; Balmer, Dorene F

    2016-01-01

    The Accreditation Council for Graduate Medical Education requires residency programs to provide curricula for residents to engage in scholarly activities but does not specify particular guidelines for instruction. We propose a Resident Scholarship Program that is framed by the self-determination theory (SDT) and emphasize the process of scholarly activity versus a scholarly product. The authors report on their longitudinal Resident Scholarship Program, which aimed to support psychological needs central to SDT: autonomy, competence, and relatedness. By addressing those needs in program aims and program components, the program may foster residents' intrinsic motivation to learn and to engage in scholarly activity. To this end, residents' engagement in scholarly processes, and changes in perceived autonomy, competence, and relatedness were assessed. Residents engaged in a range of scholarly projects and expressed positive regard for the program. Compared to before residency, residents felt more confident in the process of scholarly activity, as determined by changes in increased perceived autonomy, competence, and relatedness. Scholarly products were accomplished in return for a focus on scholarly process. Based on our experience, and in line with the SDT, supporting residents' autonomy, competence, and relatedness through a process-oriented scholarship program may foster the curiosity, inquisitiveness, and internal motivation to learn that drives scholarly activity and ultimately the production of scholarly products.

  9. Language Learning Strategies and English Proficiency: Interpretations from Information-Processing Theory

    Science.gov (United States)

    Rao, Zhenhui

    2016-01-01

    The research reported here investigated the relationship between students' use of language learning strategies and their English proficiency, and then interpreted the data from two models in information-processing theory. Results showed that the students' English proficiency significantly affected their use of learning strategies, with high-level…

  10. Evaluation of EMG processing techniques using Information Theory

    Directory of Open Access Journals (Sweden)

    Felice Carmelo J

    2010-11-01

    Abstract. Background: Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. Methods: These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movement of the arm in the scapular plane were registered, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Results: Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively), the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Conclusions: Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.

  11. Gröbner bases in control theory and signal processing

    CERN Document Server

    Regensburger, Georg

    2007-01-01

    This volume contains survey and original articles presenting the state of the art on the application of Gröbner bases in control theory and signal processing. The contributions are based on talks delivered at the Special Semester on Gröbner Bases and Related Methods at the Johann Radon Institute of Computational and Applied Mathematics (RICAM), Linz, Austria, in May 2006.

  12. Transport through hybrid superconducting/normal nanostructures

    Energy Technology Data Exchange (ETDEWEB)

    Futterer, David

    2013-01-29

    We mainly investigate transport through interacting quantum dots proximized by superconductors. For this purpose we extend an existing theory to describe transport through proximized quantum dots coupled to normal and superconducting leads. It allows us to study the influence of a strong Coulomb interaction on Andreev currents and Josephson currents. This is a particularly interesting topic because it combines two competing properties: in superconductors Cooper pairs are formed by two electrons which experience an attractive interaction while two electrons located on a quantum dot repel each other due to the Coulomb interaction. It seems at first glance that transport processes involving Cooper pairs should be suppressed because of the two competing interactions. However, it is possible to proximize the dot in nonequilibrium situations. At first, we study a setup composed of a quantum dot coupled to one normal, one ferromagnetic, and one superconducting lead in the limit of an infinitely-large superconducting gap. Within this limit the coupling between dot and superconductor is described exactly by the presented theory. It leads to the formation of Andreev-bound states (ABS) and an additional bias scheme opens in which a pure spin current, i.e. a spin current with a vanishing associated charge current, can be generated. In a second work, starting from the infinite-gap limit, we perform a systematic expansion of the superconducting gap around infinity and investigate Andreev currents and Josephson currents. This allows us to estimate the validity of infinite-gap calculations for real systems in which the superconducting gap is usually a rather small quantity. We find indications that a finite gap renormalizes the ABS and propose a resummation approach to explore the finite-gap ABS. Despite the renormalization effects the modifications of transport by finite gaps are rather small. This result lets us conclude that the infinite-gap calculation is a valuable tool to

  13. Transport through hybrid superconducting/normal nanostructures

    International Nuclear Information System (INIS)

    Futterer, David

    2013-01-01

    We mainly investigate transport through interacting quantum dots proximized by superconductors. For this purpose we extend an existing theory to describe transport through proximized quantum dots coupled to normal and superconducting leads. It allows us to study the influence of a strong Coulomb interaction on Andreev currents and Josephson currents. This is a particularly interesting topic because it combines two competing properties: in superconductors Cooper pairs are formed by two electrons which experience an attractive interaction while two electrons located on a quantum dot repel each other due to the Coulomb interaction. It seems at first glance that transport processes involving Cooper pairs should be suppressed because of the two competing interactions. However, it is possible to proximize the dot in nonequilibrium situations. At first, we study a setup composed of a quantum dot coupled to one normal, one ferromagnetic, and one superconducting lead in the limit of an infinitely-large superconducting gap. Within this limit the coupling between dot and superconductor is described exactly by the presented theory. It leads to the formation of Andreev-bound states (ABS) and an additional bias scheme opens in which a pure spin current, i.e. a spin current with a vanishing associated charge current, can be generated. In a second work, starting from the infinite-gap limit, we perform a systematic expansion of the superconducting gap around infinity and investigate Andreev currents and Josephson currents. This allows us to estimate the validity of infinite-gap calculations for real systems in which the superconducting gap is usually a rather small quantity. We find indications that a finite gap renormalizes the ABS and propose a resummation approach to explore the finite-gap ABS. Despite the renormalization effects the modifications of transport by finite gaps are rather small. This result lets us conclude that the infinite-gap calculation is a valuable tool to

  14. Affect intensity and processing fluency of deterrents.

    Science.gov (United States)

    Holman, Andrei

    2013-01-01

    The theory of emotional intensity (Brehm, 1999) suggests that the intensity of affective states depends on the magnitude of their current deterrents. Our study investigated the role that fluency, the subjective experience of ease of information processing, plays in the modulation of emotional intensity in reaction to deterrents. Following an induction phase of good mood, we manipulated both the magnitude of the deterrents (using sets of photographs with pre-tested potential to instigate an emotion incompatible with the pre-existing affective state: pity) and their processing fluency (normal vs. enhanced through subliminal priming). The current affective state and the perception of deterrents were then measured. In the normal processing conditions, the results revealed the cubic effect predicted by the emotional intensity theory, with the initial affective state being replaced by the one appropriate to the deterrent only in participants exposed to the high-magnitude deterrence. In the enhanced fluency conditions the emotional intensity pattern was drastically altered; moreover, the replacement of the initial affective state occurred at a lower level of deterrence magnitude (moderate instead of high), suggesting that enhanced fluency strengthens the emotional impact of deterrence.

  15. Online dating in Japan: a test of social information processing theory.

    Science.gov (United States)

    Farrer, James; Gavin, Jeff

    2009-08-01

    This study examines the experiences of past and present members of a popular Japanese online dating site in order to explore the extent to which Western-based theories of computer-mediated communication (CMC) and the development of online relationships are relevant to the Japanese online dating experience. Specifically, it examines whether social information processing theory (SIPT) is applicable to Japanese online dating interactions, and how and to what extent Japanese daters overcome the limitations of CMC through the use of contextual and other cues. Thirty-six current members and 27 former members of Match.com Japan completed an online survey. Using issue-based procedures for grounded theory analysis, we found strong support for SIPT. Japanese online daters adapt their efforts to present and acquire social information using the cues that the online dating platform provides, although many of these cues are specific to Japanese social context.

  16. Occupational therapy students in the process of interprofessional collaborative learning: a grounded theory study.

    Science.gov (United States)

    Howell, Dana

    2009-01-01

    The purpose of this grounded theory study was to generate a theory of the interprofessional collaborative learning process of occupational therapy (OT) students who were engaged in a collaborative learning experience with students from other allied health disciplines. Data consisted of semi-structured interviews with nine OT students from four different interprofessional collaborative learning experiences at three universities. The emergent theory explained OT students' need to build a culture of mutual respect among disciplines in order to facilitate interprofessional collaborative learning. Occupational therapy students went through a progression of learned skills that included learning how to represent the profession of OT, hold their weight within a team situation, solve problems collaboratively, work as a team, and ultimately, to work in an actual team in practice. This learning process occurred simultaneously as students also learned course content. The students had to contend with barriers and facilitators that influenced their participation and the success of their collaboration. Understanding the interprofessional learning process of OT students will help allied health faculty to design more effective, inclusive interprofessional courses.

  17. A comparative study of deficit pattern in theory of mind and emotion regulation methods in evaluating patients with bipolar disorder and normal individuals

    OpenAIRE

    Ali Fakhari; Khalegh Minashiri; Abolfazl Fallahi; Mohammad Taher Panah

    2013-01-01

    BACKGROUND: This study compared patterns of deficit in "theory of mind" and "emotion regulation" in patients with bipolar disorder and normal individuals. METHODS: In this causal-comparative study, subjects were 20 patients with bipolar disorder and 20 normal individuals. Patients were selected via convenience sampling method among hospitalized patients at Razi hospital of Tabriz, Iran. The data was collected through two scales: Reading the Mind in the Eyes Test and Emotion Regulation Questionnaire...

  18. Habituation of the orienting reflex and the development of Preliminary Process Theory.

    Science.gov (United States)

    Barry, Robert J

    2009-09-01

    The orienting reflex (OR), elicited by an innocuous stimulus, can be regarded as a model of the organism's interaction with its environment, and has been described as the unit of attentional processing. A major determinant of the OR is the novelty of the eliciting stimulus, generally operationalized in terms of its reduction with stimulus repetition, the effects of which are commonly described in habituation terms. This paper provides an overview of a research programme, spanning more than 30 years, investigating psychophysiological aspects of the OR in humans. The major complication in this research is that the numerous physiological measures used as dependent variables in the OR context fail to jointly covary with stimulus parameters. This has led to the development of the Preliminary Process Theory (PPT) of the OR to accommodate the complexity of the observed stimulus-response patterns. PPT is largely grounded in autonomic measures, and current work is attempting to integrate electroencephalographic measures, particularly components in the event-related brain potentials reflecting aspects of stimulus processing. The emphasis in the current presentation is on the use of the defining criteria of the habituation phenomenon, and Groves and Thompson's Dual-process Theory, in the development of PPT.

  19. Processes of Self-Regulated Learning in Music Theory in Elementary Music Schools in Slovenia

    Science.gov (United States)

    Fritz, Barbara Smolej; Peklaj, Cirila

    2011-01-01

    The aim of our study was to determine how students regulate their learning in music theory (MT). The research is based on the socio-cognitive theory of learning. The aim was twofold: first, to design instruments for measuring (meta)cognitive and affective-motivational processes in learning MT, and, second, to examine the relationship…

  20. Time Series, Stochastic Processes and Completeness of Quantum Theory

    International Nuclear Information System (INIS)

    Kupczynski, Marian

    2011-01-01

    Most physical experiments are usually described as repeated measurements of some random variables. Experimental data registered by on-line computers form time series of outcomes. The frequencies of different outcomes are compared with the probabilities provided by the algorithms of quantum theory (QT). In spite of the statistical character of its predictions, a claim was made that QT provides the most complete description of the data and of the underlying physical phenomena. This claim could be easily rejected if some fine structures, averaged out in the standard descriptive statistical analysis, were found in time series of experimental data. To search for these structures one has to use more subtle statistical tools which were developed to study time series produced by various stochastic processes. In this talk we review some of these tools. As an example we show how the standard descriptive statistical analysis of the data is unable to reveal a fine structure in a simulated sample of an AR(2) stochastic process. We emphasize once again that the violation of Bell inequalities gives no information on the completeness or the non-locality of QT. The appropriate way to test the completeness of quantum theory is to search for fine structures in time series of the experimental data by means of purity tests or by studying the autocorrelation and partial autocorrelation functions.
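
    The AR(2) example is easy to reproduce. The sketch below is an illustration under assumed coefficients, not the author's simulation: the simulated series has the mean and variance of featureless noise, while its autocorrelation function exposes the underlying structure.

        import numpy as np

        rng = np.random.default_rng(7)

        # simulate a stationary AR(2) process: x[t] = a1*x[t-1] + a2*x[t-2] + e[t]
        a1, a2, n = 0.6, -0.3, 5000
        x = np.zeros(n)
        for t in range(2, n):
            x[t] = a1 * x[t - 1] + a2 * x[t - 2] + rng.normal()

        def acf(series, max_lag):
            """Sample autocorrelation function up to max_lag."""
            s = series - series.mean()
            denom = s @ s
            return np.array([(s[:len(s) - k] @ s[k:]) / denom
                             for k in range(max_lag + 1)])

        # descriptive statistics look like white noise; the decaying,
        # sign-alternating ACF reveals the hidden AR(2) dependence
        print(np.round(acf(x, 10), 3))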

  1. Theory of novel normal and superconducting states in doped oxide high-Tc superconductors

    International Nuclear Information System (INIS)

    Dzhumanov, S.

    2001-10-01

    A consistent and complete theory of the novel normal and superconducting (SC) states of doped high-T_c superconductors (HTSC) is developed by combining the continuum model of carrier self-trapping, the tight-binding model and the novel Fermi-Bose-liquid (FBL) model. The ground-state energy of carriers in lightly doped HTSC is calculated within the continuum model and adiabatic approximation using the variational method. The destruction of the long-range antiferromagnetic (AF) order at low doping x ≥ x_c1 ≅ 0.015, the formation of the in-gap states or bands and novel (bi)polaronic insulating phases at x < x_c2 ≅ 0.06-0.08, and the new metal-insulator transition at x ≅ x_c2 in HTSC are studied within the continuum model of impurity (defect) centers and large (bi)polarons by using the appropriate tight-binding approximations. It is found that the three-dimensional (3d) large (bi)polarons are formed at ε_∞/ε_0 ≤ 0.1 and become itinerant when the (bi)polaronic insulator-to-(bi)polaronic metal transitions occur at x > x_c2. We show that the novel pseudogapped metallic and SC states in HTSC are formed at x_c2 ≤ x ≤ x_p ≅ 0.20-0.24. We demonstrate that the large polaronic and small BCS-like pairing pseudogaps opening in the excitation spectrum of underdoped (x < x_BCS = 0.125), optimally doped (x_BCS < x < x_o ≅ 0.20) and overdoped (x > x_o) HTSC above T_c are unrelated to superconductivity and they are responsible for the observed anomalous optical, transport, magnetic and other properties of these HTSC. We develop the original two-stage FBL model of novel superconductivity describing the combined novel BCS-like pairing scenario of fermions and true superfluid (SF) condensation scenario of composite bosons (i.e. bipolarons and cooperons) in any Fermi-systems, where the SF condensate gap Δ_B and the BCS-like pairing pseudogap Δ_F have different origins. The pair and single particle condensations of attracting 3d and two-dimensional (2d) composite bosons are responsible for

  2. Rationality, Theory Acceptance and Decision Theory

    Directory of Open Access Journals (Sweden)

    J. Nicolas Kaufmann

    1998-06-01

    Following Kuhn's main thesis, according to which theory revision and acceptance is always paradigm-relative, I propose to outline some possible consequences of such a view. First, asking in what sense Bayesian decision theory could serve as the appropriate (normative) theory of rationality examined from the point of view of the epistemology of theory acceptance, I argue that Bayesianism leads to a narrow conception of theory acceptance. Second, regarding the different types of theory revision, i.e. expansion, contraction, replacement and residual shifts, I extract from Kuhn's view a series of indications showing that theory replacement cannot be rationalized within the framework of Bayesian decision theory, not even within a more sophisticated version of that model. Third, and finally, I point to the need for a more comprehensive model of rationality than the Bayesian expected-utility maximization model, one which could better deal with the different aspects of theory replacement. I show that Kuhn's distinction between normal and revolutionary science gives us several hints for a more adequate theory of rationality in science. I also show that Kuhn is not in a position to fully articulate his main ideas and that he will be confronted with a serious problem concerning the collective choice of a paradigm.

  3. A theory-informed, process-oriented Resident Scholarship Program

    Science.gov (United States)

    Thammasitboon, Satid; Darby, John B.; Hair, Amy B.; Rose, Karen M.; Ward, Mark A.; Turner, Teri L.; Balmer, Dorene F.

    2016-01-01

    Background The Accreditation Council for Graduate Medical Education requires residency programs to provide curricula for residents to engage in scholarly activities but does not specify particular guidelines for instruction. We propose a Resident Scholarship Program that is framed by the self-determination theory (SDT) and emphasize the process of scholarly activity versus a scholarly product. Methods The authors report on their longitudinal Resident Scholarship Program, which aimed to support psychological needs central to SDT: autonomy, competence, and relatedness. By addressing those needs in program aims and program components, the program may foster residents’ intrinsic motivation to learn and to engage in scholarly activity. To this end, residents’ engagement in scholarly processes, and changes in perceived autonomy, competence, and relatedness were assessed. Results Residents engaged in a range of scholarly projects and expressed positive regard for the program. Compared to before residency, residents felt more confident in the process of scholarly activity, as determined by changes in increased perceived autonomy, competence, and relatedness. Scholarly products were accomplished in return for a focus on scholarly process. Conclusions Based on our experience, and in line with the SDT, supporting residents’ autonomy, competence, and relatedness through a process-oriented scholarship program may foster the curiosity, inquisitiveness, and internal motivation to learn that drives scholarly activity and ultimately the production of scholarly products. PMID:27306995

  4. Superconducting proximity effect in mesoscopic superconductor/normal-metal junctions

    CERN Document Server

    Takayanagi, H; Toyoda, E

    1999-01-01

    The superconducting proximity effect is discussed in mesoscopic superconductor/normal-metal junctions. The newly developed theory shows a long-range phase-coherent effect which explains early experimental results of giant magnetoresistance oscillations in an Andreev interferometer. The theory also shows that the proximity correction to the conductance (PCC) has a reentrant behavior as a function of energy. The reentrant behavior is systematically studied in a gated superconductor-semiconductor junction. A negative PCC is observed in the case of weak coupling between the normal metal and the external reservoir. A phase-coherent ac effect is also observed when the junction is irradiated with rf.

  5. Trichotomous Processes in Early Memory Development, Aging, and Neurocognitive Impairment: A Unified Theory

    Science.gov (United States)

    Brainerd, C. J.; Reyna, V. F.; Howe, M. L.

    2009-01-01

    One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and…

  6. Situation-specific theories from the middle-range transitions theory.

    Science.gov (United States)

    Im, Eun-Ok

    2014-01-01

    The purpose of this article was to analyze the theory development process of the situation-specific theories that were derived from the middle-range transitions theory. This analysis aims to provide directions for future development of situation-specific theories. First, transitions theory is concisely described with its history, goal, and major concepts. Then, the approach that was used to retrieve the situation-specific theories derived from transitions theory is described. Next, an analysis of 6 situation-specific theories is presented. Finally, 4 themes reflecting commonalities and variances in the theory development process are discussed with implications for future theoretical development.

  7. Theory of the shock process in dense fluids

    International Nuclear Information System (INIS)

    Wallace, D.C.

    1991-01-01

    A shock is assumed to be a steady plane wave, and irreversible thermodynamics is assumed valid. The fluid is characterized by heat conduction and by viscous or viscoelastic response, according to the strain rate. It is shown that setting the viscosity zero produces a solution which constitutes a lower bound through the shock process for the shear stress, and upper bounds for the temperature, entropy, pressure, and heat current. It is shown that there exists an upper bound to the dynamic stresses which can be achieved during shock compression, that this bound corresponds to a purely elastic response of the fluid, and that solution for the shock process along this bound constitutes lower bounds for the temperature and entropy. It is shown that a continuous steady shock is possible only if the heat current is positive and the temperature is an increasing function of compression almost everywhere. In his theory of shocks in gases, Rayleigh showed that there is a maximum shock strength for which a continuous steady solution can exist with heat conduction but without viscosity. Two more limits are shown to exist for dense fluids, based on the fluid response in the leading edge of the shock: for shocks at the overdriven threshold and above, no solution is possible without heat transport; for shocks near the viscous fluid limit and above, viscous fluid theory is not valid, and the fluid response in the leading edge of the shock is approximately that of a nonplastic solid. The viscous fluid limit is estimated to be 13 kbar for water and 690 kbar for mercury

  8. Magnetic moment calculation for the p + d → ³He + γ process in Big-bang nucleosynthesis with effective field theory

    International Nuclear Information System (INIS)

    Bayegan, S.; Sadeghi, H.

    2004-01-01

    In big-bang nucleosynthesis, processes relevant to the increase of nucleon density are more important. Effective Field Theory is one of the theories whose solutions explain the experimental results more accurately. In this paper, the magnetic moment (χ_M1) for the radiative capture of protons by deuterons, p + d → ³He + γ, is calculated using Effective Field Theory. The calculation includes the Coulomb interaction up to next-to-next-to-leading order (N²LO)

  9. Adaptive nonlinear control using input normalized neural networks

    International Nuclear Information System (INIS)

    Leeghim, Henzeh; Seo, In Ho; Bang, Hyo Choong

    2008-01-01

    An adaptive feedback linearization technique combined with a neural network is addressed to control uncertain nonlinear systems. Neural network-based adaptive control theory has been widely studied. However, the stability analysis of the closed-loop system with a neural network is rather complicated and difficult to understand, and sometimes unnecessary assumptions are involved. Here, unnecessary assumptions for the stability analysis are avoided by using a neural network with an input normalization technique. The ultimate boundedness of the tracking error is simply proved by Lyapunov stability theory. A new simple update law for adaptive nonlinear control is derived by simplification of the input-normalized neural network, assuming that the variation of the uncertain term is sufficiently small

  10. Describing long-range charge-separation processes with subsystem density-functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Solovyeva, Alisa; Neugebauer, Johannes, E-mail: j.neugebauer@uni-muenster.de [Theoretische Organische Chemie, Organisch-Chemisches Institut and Center for Multiscale Theory and Simulation, Westfälische Wilhelms-Universität Münster, Corrensstraße 40, 48149 Münster (Germany); Pavanello, Michele, E-mail: m.pavanello@rutgers.edu [Department of Chemistry, Rutgers University, 73 Warren St., Newark, New Jersey 07102 (United States)

    2014-04-28

    Long-range charge-transfer processes in extended systems are difficult to describe with quantum chemical methods. In particular, cost-effective (non-hybrid) approximations within time-dependent density functional theory (DFT) are not applicable unless special precautions are taken. Here, we show that the efficient subsystem DFT can be employed as a constrained DFT variant to describe the energetics of long-range charge-separation processes. A formal analysis of the energy components in subsystem DFT for such excitation energies is presented, which demonstrates that both the distance dependence and the long-range limit are correctly described. In addition, electronic couplings for these processes as needed for rate constants in Marcus theory can be obtained from this method. It is shown that the electronic structure of charge-separated states constructed by a positively charged subsystem interacting with a negatively charged one is difficult to converge — charge leaking from the negative subsystem to the positive one can occur. This problem is related to the delocalization error in DFT and can be overcome with asymptotically correct exchange–correlation (XC) potentials or XC potentials including a sufficiently large amount of exact exchange. We also outline an approximate way to obtain charge-transfer couplings between locally excited and charge-separated states.
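
    For reference, the nonadiabatic Marcus rate into which such electronic couplings enter has the standard form (H_AB is the coupling between the locally excited and charge-separated states, λ the reorganization energy, ΔG⁰ the driving force):

        \[
        k_{ET} = \frac{2\pi}{\hbar}\,\lvert H_{AB}\rvert^{2}\,
        \frac{1}{\sqrt{4\pi\lambda k_B T}}\,
        \exp\!\left[-\frac{(\Delta G^{0}+\lambda)^{2}}{4\lambda k_B T}\right].
        \]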

  11. Describing long-range charge-separation processes with subsystem density-functional theory

    International Nuclear Information System (INIS)

    Solovyeva, Alisa; Neugebauer, Johannes; Pavanello, Michele

    2014-01-01

    Long-range charge-transfer processes in extended systems are difficult to describe with quantum chemical methods. In particular, cost-effective (non-hybrid) approximations within time-dependent density functional theory (DFT) are not applicable unless special precautions are taken. Here, we show that the efficient subsystem DFT can be employed as a constrained DFT variant to describe the energetics of long-range charge-separation processes. A formal analysis of the energy components in subsystem DFT for such excitation energies is presented, which demonstrates that both the distance dependence and the long-range limit are correctly described. In addition, electronic couplings for these processes as needed for rate constants in Marcus theory can be obtained from this method. It is shown that the electronic structure of charge-separated states constructed by a positively charged subsystem interacting with a negatively charged one is difficult to converge — charge leaking from the negative subsystem to the positive one can occur. This problem is related to the delocalization error in DFT and can be overcome with asymptotically correct exchange–correlation (XC) potentials or XC potentials including a sufficiently large amount of exact exchange. We also outline an approximate way to obtain charge-transfer couplings between locally excited and charge-separated states

  12. Nonepileptic seizures under levetiracetam therapy: a case report of forced normalization process

    Directory of Open Access Journals (Sweden)

    Anzellotti F

    2014-05-01

    Nonepileptic seizures (NES) apparently look like epileptic seizures, but are not associated with ictal electrical discharges in the brain. NES constitute one of the most important differential diagnoses of epilepsy. They have been recognized as a distinctive clinical phenomenon for centuries, and video/electroencephalogram monitoring has allowed clinicians to make near-certain diagnoses. NES are supposedly unrelated to organic brain lesions, and despite the preponderance of a psychiatric/psychological context, they may have an iatrogenic origin. We report a patient with NES precipitated by levetiracetam therapy; in this case, NES was observed during the disappearance of epileptiform discharges from the routine video/electroencephalogram. We discuss the possible mechanisms underlying NES with regard to alternative psychoses associated with the phenomenon of the forced normalization process. Keywords: nonepileptic seizures, forced normalization, levetiracetam, behavioral side effects

  13. Unifying Theories of Psychedelic Drug Effects

    Directory of Open Access Journals (Sweden)

    Link R. Swanson

    2018-03-01

    How do psychedelic drugs produce their characteristic range of acute effects in perception, emotion, cognition, and sense of self? How do these effects relate to the clinical efficacy of psychedelic-assisted therapies? Efforts to understand psychedelic phenomena date back more than a century in Western science. In this article I review theories of psychedelic drug effects and highlight key concepts which have endured over the last 125 years of psychedelic science. First, I describe the subjective phenomenology of acute psychedelic effects using the best available data. Next, I review late 19th-century and early 20th-century theories—model psychoses theory, filtration theory, and psychoanalytic theory—and highlight their shared features. I then briefly review recent findings on the neuropharmacology and neurophysiology of psychedelic drugs in humans. Finally, I describe recent theories of psychedelic drug effects which leverage 21st-century cognitive neuroscience frameworks—entropic brain theory, integrated information theory, and predictive processing—and point out key shared features that link back to earlier theories. I identify an abstract principle which cuts across many theories past and present: psychedelic drugs perturb universal brain processes that normally serve to constrain neural systems central to perception, emotion, cognition, and sense of self. I conclude that making an explicit effort to investigate the principles and mechanisms of psychedelic drug effects is a uniquely powerful way to iteratively develop and test unifying theories of brain function.

  14. Unifying Theories of Psychedelic Drug Effects

    Science.gov (United States)

    Swanson, Link R.

    2018-01-01

    How do psychedelic drugs produce their characteristic range of acute effects in perception, emotion, cognition, and sense of self? How do these effects relate to the clinical efficacy of psychedelic-assisted therapies? Efforts to understand psychedelic phenomena date back more than a century in Western science. In this article I review theories of psychedelic drug effects and highlight key concepts which have endured over the last 125 years of psychedelic science. First, I describe the subjective phenomenology of acute psychedelic effects using the best available data. Next, I review late 19th-century and early 20th-century theories—model psychoses theory, filtration theory, and psychoanalytic theory—and highlight their shared features. I then briefly review recent findings on the neuropharmacology and neurophysiology of psychedelic drugs in humans. Finally, I describe recent theories of psychedelic drug effects which leverage 21st-century cognitive neuroscience frameworks—entropic brain theory, integrated information theory, and predictive processing—and point out key shared features that link back to earlier theories. I identify an abstract principle which cuts across many theories past and present: psychedelic drugs perturb universal brain processes that normally serve to constrain neural systems central to perception, emotion, cognition, and sense of self. I conclude that making an explicit effort to investigate the principles and mechanisms of psychedelic drug effects is a uniquely powerful way to iteratively develop and test unifying theories of brain function. PMID:29568270

  15. Effect of perceptual load on conceptual processing: an extension of Vermeulen's theory.

    Science.gov (United States)

    Xie, Jiushu; Wang, Ruiming; Sun, Xun; Chang, Song

    2013-10-01

    The effect of color and shape load on conceptual processing was studied. Perceptual load effects have been found in visual and auditory conceptual processing, supporting the theory of embodied cognition. However, whether different types of visual concepts, such as color and shape, share the same perceptual load effects is unknown. In the current experiment, 32 participants performed simultaneous perceptual and conceptual tasks to assess the relation between perceptual load and conceptual processing. Holding a color load in mind obstructed color conceptual processing; hence, perceptual load and conceptual processing shared the same resources, supporting embodied cognition. Color conceptual processing was not affected by shape pictures, indicating that different types of properties within vision are processed separately.

  16. Quasiclassical Theory of Spin Imbalance in a Normal Metal-Superconductor Heterostructure with a Spin-Active Interface

    International Nuclear Information System (INIS)

    Shevtsov, O; Löfwander, T

    2014-01-01

    Non-equilibrium phenomena in superconductors have attracted much attention since the first experiments on charge imbalance in the early 1970s. Nowadays a new promising line of research lies at the intersection between superconductivity and spintronics. Here we develop a quasiclassical theory of a single junction between a normal metal and a superconductor with a spin-active interface at finite bias voltages. Due to the spin-mixing and spin-filtering effects of the interface, a non-equilibrium magnetization (or spin imbalance) is induced on the superconducting side of the junction, which relaxes to zero in the bulk. A peculiar feature of the system is the presence of interface-induced Andreev bound states, which influence the magnitude and the decay length of the spin imbalance. Recent experiments on spin and charge density separation in superconducting wires required an external magnetic field to observe a spin signal via non-local measurements. Here, we propose an alternative way to observe spin imbalance without applying a magnetic field.

  17. The construction of normal expectations

    DEFF Research Database (Denmark)

    Quitzau, Maj-Britt; Røpke, Inge

    2008-01-01

    The gradual upward changes of standards in normal everyday life have significant environmental implications, and it is therefore important to study how these changes come about. The intention of the article is to analyze the social construction of normal expectations through a case study. The case concerns the present boom in bathroom renovations in Denmark, which offers an excellent opportunity to study the interplay between a wide variety of consumption drivers and social changes pointing toward long-term changes of normal expectations regarding bathroom standards. The study is problem-oriented and transdisciplinary and draws on a wide range of sociological, anthropological, and economic theories. The empirical basis comprises a combination of statistics, a review of magazine and media coverage, visits to exhibitions, and qualitative interviews. A variety of consumption drivers are identified. Among...

  18. Dimensional modeling: beyond data processing constraints.

    Science.gov (United States)

    Bunardzic, A

    1995-01-01

    The focus of information processing requirements is shifting from on-line transaction processing (OLTP) issues to on-line analytical processing (OLAP) issues. While the former serves to ensure the feasibility of real-time on-line transaction processing (which has already exceeded rates of 1,000 transactions per second under normal conditions), the latter aims at enabling more sophisticated analytical manipulation of data. The OLTP requirements, or how to efficiently get data into the system, have been solved by applying relational theory in the form of the Entity-Relationship model. There is presently no theory related to OLAP that would resolve the analytical processing requirements as efficiently as relational theory did for transaction processing. The "relational dogma" also provides the mathematical foundation for the Centralized Data Processing paradigm, in which mission-critical information is incorporated as 'one and only one instance' of data, thus ensuring data integrity. In such surroundings, the information that supports business analysis and decision support activities is obtained by running predefined reports and queries provided by the IS department. In today's intensified competitive climate, businesses are finding that this traditional approach is not good enough. The only way to stay on top of things, and to survive and prosper, is to decentralize the IS services. The newly emerging Distributed Data Processing, with its increased emphasis on empowering the end user, does not seem to find enough merit in the relational database model to justify relying upon it. Relational theory proved too rigid and complex to accommodate analytical processing needs. In order to satisfy the OLAP requirements, or how to efficiently get the data out of the system, different models, metaphors, and theories have been devised. All of them point to the need to simplify the highly non-intuitive mathematical constraints found...
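
    To make the OLTP/OLAP contrast concrete, the following minimal sketch (all table and column names are hypothetical) shows a dimensional "star schema" in pandas: a central fact table joined to descriptive dimension tables and rolled up along them, which is the kind of analytical query that pure OLTP designs handle poorly:

```python
import pandas as pd

# Hypothetical star schema: one fact table plus two dimension tables.
sales = pd.DataFrame({            # fact table: one row per transaction
    "product_id": [1, 1, 2, 2],
    "date_id":    [10, 11, 10, 11],
    "amount":     [120.0, 80.0, 200.0, 150.0],
})
products = pd.DataFrame({"product_id": [1, 2],
                         "category": ["laptop", "phone"]})   # dimension
dates = pd.DataFrame({"date_id": [10, 11],
                      "quarter": ["Q1", "Q2"]})              # dimension

# OLAP-style roll-up: join facts to dimensions, then aggregate along them.
cube = sales.merge(products, on="product_id").merge(dates, on="date_id")
print(cube.groupby(["category", "quarter"])["amount"].sum())
```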

  19. Process Reengineering for Quality Improvement in ICU Based on Taylor's Management Theory.

    Science.gov (United States)

    Tao, Ziqi

    2015-06-01

    Using methods including questionnaire-based surveys and control analysis, we analyzed the improvements in the efficiency of ICU rescue, service quality, and patients' satisfaction in Xuzhou Central Hospital after the implementation of fine management, in an attempt to further introduce the concept of fine management and implement the brand construction. Originating in Taylor's "Theory of Scientific Management" (1982), fine management uses programmed, standardized, digitalized, and informational approaches to ensure each unit of an organization runs with great accuracy, high efficiency, strong coordination, and sustained duration (Wang et al., Fine Management, 2007). The nature of fine management is a process that breaks up the strategy and goal and executes it; strategic planning takes place at every part of the process. Fine management demonstrates that everybody has a role to play in the management process, every area must be examined through the management process, and everything has to be managed (Zhang et al., The Experience of Hospital Nursing Precise Management, 2006). In other words, this kind of management theory demands that all people be involved in the entire process (Liu and Chen, Med Inf, 2007). As public hospital reform becomes more widespread, it is imperative to "build a unified and efficient public hospital management system" and "improve the quality of medical services" (Guidelines on the Pilot Reform of Public Hospitals, 2010). The execution of fine management is important in optimizing the medical process, improving medical services, and building a prestigious hospital brand.

  20. Bernstein's theory of pedagogic discourse as a theoretical framework for educators studying student radiographers' interpretation of normality vs. abnormality

    International Nuclear Information System (INIS)

    Winter, Peter D.; Linehan, Mark J.

    2014-01-01

    Purpose: To identify the tacit rules underpinning the academic practice of undergraduate radiographers in determining normality vs. abnormality when appraising skeletal images. Methodology: Twelve students were interviewed individually using in-depth semi-structured questions. Interviews were mediated through a PowerPoint presentation containing two digital X-ray images. Each image was based on a level of expertise: the elementary (Case 1) and the complicated (Case 2). The questions were based on regular ‘frames’ created from observing tutor–student contact in class, and then validated through a group interview. Bernstein's theory of pedagogic discourse was then utilised as a data analysis instrument to determine how third-year diagnostic radiography students interpreted X-ray images, in relation to the ‘recognition’ and ‘realisation’ rules of the educational theoretical framework. Conclusion: Bernstein's framework has made it possible to specify, in detail, how issues and difficulties are formed at the level of the acquirer during interpretation. The recognition rules enabled students to meaningfully recognise what trauma characteristics can be associated with the image and the demands of a detailed scrutiny, so as to enact a competent interpretation. Realisation rules made it possible for students to establish their own systematic approach and realise legitimate meanings of normality and abnormality. Whereas obvious or visible trauma generated realisation rules (represented via homogeneous terminology), latent trauma led students to deviate from legitimate meanings. The latter rule, in this context, has directed attention to the student issue of perceiving abnormality when images are normal.

  1. The motivation to breastfeed: a fit to the opponent-process theory?

    Science.gov (United States)

    Myers, H H; Siegel, P S

    1985-07-01

    The opponent-process theory, a dynamic model of acquired motivation presented by Solomon and Corbit (1974), was applied to the process of breastfeeding. A modified form of the Nowlis Mood Adjective Checklist (MACL; Nowlis, 1965, 1970) and a discomfort measure were used to assess, through recall, the affective course predicted by the theory. The data were analyzed using multivariate analysis of variance (MANOVA) and correlational procedures. Results were highly significant: women who breastfed for relatively long periods recalled positive affective responses while the baby was at the breast and a subsequent negative or dysphoric response. The additional characteristics of acquired motivation, habituation and withdrawal, were also evident in the data. As a control for possible confounding demand characteristics inherent in the methodology, a sample of childless women was surveyed using an "as-if" form of the same questionnaire. Very little similarity to the breastfeeders was found in the pattern of responses yielded by this group. It was concluded that our major findings are quite likely free of influence from this source.

  2. Integrative mental health care: from theory to practice, part 1.

    Science.gov (United States)

    Lake, James

    2007-01-01

    Integrative approaches will lead to more accurate and different understandings of mental illness. Beneficial responses to complementary and alternative therapies provide important clues about the phenomenal nature of the human body in space-time and disparate biological, informational, and energetic factors associated with normal and abnormal psychological functioning. The conceptual framework of contemporary Western psychiatry includes multiple theoretical viewpoints, and there is no single best explanatory model of mental illness. Future theories of mental illness causation will not depend exclusively on empirical verification of strictly biological processes but will take into account both classically described biological processes and non-classical models, including complexity theory, resulting in more complete explanations of the characteristics and causes of symptoms and mechanisms of action that result in beneficial responses to treatments. Part 1 of this article examines the limitations of the theory and contemporary clinical methods employed in Western psychiatry and discusses implications of emerging paradigms in physics and the biological sciences for the future of psychiatry. In part 2, a practical methodology for planning integrative assessment and treatment strategies in mental health care is proposed. Using this methodology the integrative management of moderate and severe psychiatric symptoms is reviewed in detail. As the conceptual framework of Western medicine evolves toward an increasingly integrative perspective, novel understandings of complex relationships between biological, informational, and energetic processes associated with normal psychological functioning and mental illness will lead to more effective integrative assessment and treatment strategies addressing the causes or meanings of symptoms at multiple hierarchic levels of body-brain-mind.

  3. Integrative mental health care: from theory to practice, Part 2.

    Science.gov (United States)

    Lake, James

    2008-01-01

    Integrative approaches will lead to more accurate and different understandings of mental illness. Beneficial responses to complementary and alternative therapies provide important clues about the phenomenal nature of the human body in space-time and disparate biological, informational, and energetic factors associated with normal and abnormal psychological functioning. The conceptual framework of contemporary Western psychiatry includes multiple theoretical viewpoints, and there is no single best explanatory model of mental illness. Future theories of mental illness causation will not depend exclusively on empirical verification of strictly biological processes but will take into account both classically described biological processes and non-classical models, including complexity theory, resulting in more complete explanations of the characteristics and causes of symptoms and mechanisms of action that result in beneficial responses to treatments. Part 1 of this article examined the limitations of the theory and contemporary clinical methods employed in Western psychiatry and discussed implications of emerging paradigms in physics and the biological sciences for the future of psychiatry. In part 2, a practical methodology for planning integrative assessment and treatment strategies in mental health care is proposed. Using this methodology the integrative management of moderate and severe psychiatric symptoms is reviewed in detail. As the conceptual framework of Western medicine evolves toward an increasingly integrative perspective, novel understandings of complex relationships between biological, informational, and energetic processes associated with normal psychological functioning and mental illness will lead to more effective integrative assessment and treatment strategies addressing the causes or meanings of symptoms at multiple hierarchic levels of body-brain-mind.

  4. Effect of ions on sulfuric acid-water binary particle formation: 2. Experimental data and comparison with QC-normalized classical nucleation theory

    CERN Document Server

    Duplissy, J.; Franchin, A.; Tsagkogeorgas, G.; Kangasluoma, J.; Wimmer, D.; Vuollekoski, H.; Schobesberger, S.; Lehtipalo, K.; Flagan, R. C.; Brus, D.; Donahue, N. M.; Vehkamäki, H.; Almeida, J.; Amorim, A.; Barmet, P.; Bianchi, F.; Breitenlechner, M.; Dunne, E. M.; Guida, R.; Henschel, H.; Junninen, H.; Kirkby, J.; Kürten, A.; Kupc, A.; Määttänen, A.; Makhmutov, V.; Mathot, S.; Nieminen, T.; Onnela, A.; Praplan, A. P.; Riccobono, F.; Rondo, L.; Steiner, G.; Tome, A.; Walther, H.; Baltensperger, U.; Carslaw, K. S.; Dommen, J.; Hansel, A.; Petäjä, T.; Sipilä, M.; Stratmann, F.; Vrtala, A.; Wagner, P. E.; Worsnop, D. R.; Curtius, J.; Kulmala, M.

    2015-09-04

    We report comprehensive, demonstrably contaminant-free measurements of binary particle formation rates by sulfuric acid and water for neutral and ion-induced pathways, conducted in the European Organization for Nuclear Research Cosmics Leaving Outdoor Droplets chamber. The recently developed Atmospheric Pressure interface time-of-flight mass spectrometer was used to detect contaminants in charged clusters and to identify runs free of any contaminants. Four parameters were varied to cover ambient conditions: sulfuric acid concentration (10⁵ to 10⁹ mol cm⁻³), relative humidity (11% to 58%), temperature (207 K to 299 K), and total ion concentration (0 to 6800 ions cm⁻³). Formation rates were directly measured with novel instruments at sizes close to the critical cluster size (mobility size of 1.3 nm to 3.2 nm). We compare our results with predictions from Classical Nucleation Theory normalized by Quantum Chemical calculation (QC-normalized CNT), which is described in a companion pape...

  5. An Algorithm for Higher Order Hopf Normal Forms

    Directory of Open Access Journals (Sweden)

    A.Y.T. Leung

    1995-01-01

    Full Text Available Normal form theory is important for studying the qualitative behavior of nonlinear oscillators. In some cases, higher order normal forms are required to understand the dynamic behavior near an equilibrium or a periodic orbit. However, the computation of high-order normal forms is usually quite complicated. This article provides an explicit formula for the normalization of nonlinear differential equations. The higher order normal form is given explicitly. Illustrative examples include a cubic system, a quadratic system and a Duffing–Van der Pol system. We use exact arithmetic and find that the undamped Duffing equation can be represented by an exact polynomial differential amplitude equation in a finite number of terms.
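
    As a point of reference for the normal forms discussed here, the classical third-order Hopf normal form (the standard textbook case, not the paper's higher-order result) reads:

```latex
\dot{z} = (\mu + i\omega)\,z + c_{1}\,z\,|z|^{2} + \mathcal{O}(|z|^{5}),
\qquad z \in \mathbb{C}
```

    Here μ is the bifurcation parameter, and the sign of Re c₁ determines whether the Hopf bifurcation is supercritical or subcritical; higher-order normal forms extend this expansion to higher powers of |z|².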

  6. Tensor products of process matrices with indefinite causal structure

    Science.gov (United States)

    Jia, Ding; Sakharwade, Nitica

    2018-03-01

    Theories with indefinite causal structure have been studied from both the fundamental perspective of quantum gravity and the practical perspective of information processing. In this paper we point out a restriction in forming tensor products of objects with indefinite causal structure in certain models: there exist both classical and quantum objects the tensor products of which violate the normalization condition of probabilities, if all local operations are allowed. We obtain a necessary and sufficient condition for when such unrestricted tensor products of multipartite objects are (in)valid. This poses a challenge to extending communication theory to indefinite causal structures, as the tensor product is the fundamental ingredient in the asymptotic setting of communication theory. We discuss a few options to evade this issue. In particular, we show that the sequential asymptotic setting does not suffer the violation of normalization.
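
    For orientation, the normalization condition at issue is the one imposed on the generalized Born rule of the process-matrix framework; in the standard bipartite formulation (notation illustrative), the probabilities

```latex
p(a,b) = \operatorname{Tr}\!\left[\,W\left(M^{A}_{a} \otimes M^{B}_{b}\right)\right],
\qquad \sum_{a,b} p(a,b) = 1
```

    must sum to one for every choice of local instruments, where W is the process matrix and M^A_a, M^B_b are the Choi matrices of the parties' operations. The paper's observation is that tensor products of individually valid W's can break this constraint.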

  7. A Qualitative Analysis Framework Using Natural Language Processing and Graph Theory

    Science.gov (United States)

    Tierney, Patrick J.

    2012-01-01

    This paper introduces a method of extending natural language-based processing of qualitative data analysis with the use of a very quantitative tool--graph theory. It is not an attempt to convert qualitative research to a positivist approach with a mathematical black box, nor is it a "graphical solution". Rather, it is a method to help qualitative…
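
    One plausible concretization of such a pipeline (a sketch only; the codes and segment structure below are hypothetical, not the paper's data) is to turn coded qualitative segments into a weighted co-occurrence graph and then apply quantitative graph metrics:

```python
import itertools
import networkx as nx

# Hypothetical coded interview segments (one set of codes per segment).
segments = [
    {"trust", "workload", "communication"},
    {"workload", "burnout"},
    {"trust", "communication"},
]

G = nx.Graph()
for codes in segments:
    # Link every pair of codes that co-occur within a segment.
    for u, v in itertools.combinations(sorted(codes), 2):
        weight = G.get_edge_data(u, v, {"weight": 0})["weight"]
        G.add_edge(u, v, weight=weight + 1)

# A quantitative handle on qualitative data: which codes are most central?
print(nx.degree_centrality(G))
```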

  8. Research on Remote Sensing Image Template Processing Based on Global Subdivision Theory

    OpenAIRE

    Xiong Delan; Du Genyuan

    2013-01-01

    To address the problems of vast data volumes, complex operations, and time-consuming processing of remote sensing images, a subdivision template based on global subdivision theory is proposed, which can establish a high level of abstraction and generalization for remote sensing imagery. The paper discusses the model and structure of the subdivision template in detail and puts forward new ideas for remote sensing image template processing, its key technologies, and rapid application demonstration. The research has ...

  9. An integrated model of clinical reasoning: dual-process theory of cognition and metacognition.

    Science.gov (United States)

    Marcum, James A

    2012-10-01

    Clinical reasoning is an important component for providing quality medical care. The aim of the present paper is to develop a model of clinical reasoning that integrates both the non-analytic and analytic processes of cognition, along with metacognition. The dual-process theory of cognition (system 1 non-analytic and system 2 analytic processes) and the metacognition theory are used to develop an integrated model of clinical reasoning. In the proposed model, clinical reasoning begins with system 1 processes in which the clinician assesses a patient's presenting symptoms, as well as other clinical evidence, to arrive at a differential diagnosis. Additional clinical evidence, if necessary, is acquired and analysed utilizing system 2 processes to assess the differential diagnosis, until a clinical decision is made diagnosing the patient's illness and then how best to proceed therapeutically. Importantly, the outcome of these processes feeds back, in terms of metacognition's monitoring function, either to reinforce or to alter cognitive processes, which, in turn, enhances synergistically the clinician's ability to reason quickly and accurately in future consultations. The proposed integrated model has distinct advantages over other models proposed in the literature for explicating clinical reasoning. Moreover, it has important implications for addressing the paradoxical relationship between experience and expertise, as well as for designing a curriculum to teach clinical reasoning skills. © 2012 Blackwell Publishing Ltd.

  10. Commutative $C^*$-algebras and $\sigma$-normal morphisms

    OpenAIRE

    de Jeu, Marcel

    2003-01-01

    We prove in an elementary fashion that the image of a commutative monotone $\sigma$-complete $C^*$-algebra under a $\sigma$-normal morphism is again monotone $\sigma$-complete, and give an application of this result in spectral theory.

  11. Effect of normalization on the neutron spectrum adjustment procedure

    International Nuclear Information System (INIS)

    Zsolnay, E.M.; Zijp, W.L.; Nolthenius, H.J.

    1983-10-01

    Various computer programs currently applied for neutron spectrum adjustment based on multifoil activation data use different ways to determine the normalization factor to be applied to an unnormalized input spectrum. The influence of the various definitions of the normalization factor on the adjusted results is shown for the ORR and YAYOI spectra considered in the international REAL-80 exercise. The actual expression defining the normalization factor is more important than previously assumed. The theory of the generalized least-squares principle provides an optimal definition for the normalization factor.
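
    As an illustration of why the definition matters, one common weighted least-squares choice (a generic sketch, not necessarily the definition used by any of the adjustment codes in the exercise) fixes the scale factor c of an unnormalized spectrum by minimizing the chi-square between measured reaction rates m_i and the rates a_i computed from the input spectrum:

```latex
\chi^{2}(c) = \sum_{i} \frac{\left(m_{i} - c\,a_{i}\right)^{2}}{\sigma_{i}^{2}}
\quad\Longrightarrow\quad
c^{*} = \frac{\sum_{i} m_{i}\,a_{i}/\sigma_{i}^{2}}{\sum_{i} a_{i}^{2}/\sigma_{i}^{2}}
```

    Different weightings σ_i (or different functionals altogether) produce different factors, which is exactly the sensitivity the paper reports.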

  12. Interpolation theory

    CERN Document Server

    Lunardi, Alessandra

    2018-01-01

    This book is the third edition of the 1999 lecture notes of the courses on interpolation theory that the author delivered at the Scuola Normale in 1998 and 1999. In the mathematical literature there are many good books on the subject, but none of them is very elementary, and in many cases the basic principles are hidden below great generality. In this book the principles of interpolation theory are illustrated aiming at simplification rather than at generality. The abstract theory is reduced as far as possible, and many examples and applications are given, especially to operator theory and to regularity in partial differential equations. Moreover the treatment is self-contained, the only prerequisite being the knowledge of basic functional analysis.

  13. Consensual decision-making model based on game theory for LNG processes

    International Nuclear Information System (INIS)

    Castillo, Luis; Dorao, Carlos A.

    2012-01-01

    Highlights: ► A decision-making (DM) approach for LNG projects based on game theory is presented. ► The DM framework was tested with two different cases, using analytical models and a simple LNG process. ► The problems were solved by using a Genetic Algorithm (GA) with binary coding and a Nash-GA. ► Integrated models covering the design and optimization of the process could yield more realistic outcomes. ► The major challenge in such a framework lies in the uncertainties of the market models. - Abstract: Decision-making (DM) in LNG projects is a quite complex process due to the number of actors, approval phases, large investments, and long-term capital return. Furthermore, due to the very high investment of an LNG project, a detailed and efficient DM process is required in order to minimize risks. In this work a decision-making approach for LNG projects is presented. The approach is based on a consensus algorithm that drives the parties toward agreement on a common value using cost functions, within a framework based on game theory. The DM framework was tested with two different cases. The first case was used to evaluate the performance of the framework with analytical models, while the second case corresponds to a simple LNG process. The problems were solved by using a Genetic Algorithm (GA) with binary coding and a Nash-GA. The results of the DM framework in the LNG project indicate that a more realistic outcome could be obtained by considering an integrated DM model that includes the market's role from the design and optimization of the process onwards. However, the major challenge in such a framework is related to the uncertainties in the market models.
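
    To give a flavor of the game-theoretic machinery involved (a toy sketch with made-up payoffs, unrelated to the authors' actual cost functions), a pure-strategy Nash equilibrium of a small two-player game can be found by brute-force enumeration:

```python
import numpy as np

# Hypothetical payoffs: rows = player 1's strategies, cols = player 2's.
p1 = np.array([[3, 1], [0, 2]])
p2 = np.array([[2, 1], [0, 3]])

equilibria = []
for i in range(p1.shape[0]):
    for j in range(p1.shape[1]):
        # (i, j) is a Nash equilibrium if neither player gains by deviating.
        if p1[i, j] >= p1[:, j].max() and p2[i, j] >= p2[i, :].max():
            equilibria.append((i, j))

print(equilibria)  # [(0, 0), (1, 1)] for this coordination-style game
```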

  14. [Mourning and depression, from the attachment theory perspective].

    Science.gov (United States)

    Wolfberg, Elsa; Ekboir, Alberto; Faiman, Graciela; Finzi, Josefina; Freedman, Margarita; Heath, Adela; Martínez de Cipolatti, María C

    2011-01-01

    Since depression, according to the WHO, is such a widespread condition worldwide, it is necessary to be able to distinguish normal mourning from pathological mourning and from depression, so that patients and health professionals are equipped to support a normal mourning without medicating or hurrying it, and to treat a depression adequately when it appears as a complication. Attachment theory approaches mourning after loss with notions such as: (1) acceptance of the search for the lost person as a normal fact; (2) that mourning in children may have non-pathological outcomes; (3) that a non-processed mourning may be transmitted intergenerationally; and (4) a definition of which elements may determine a pathological mourning or a depression. A clinical case is presented with an analysis of these notions.

  15. Diagnostic Problem-Solving Process in Professional Contexts: Theory and Empirical Investigation in the Context of Car Mechatronics Using Computer-Generated Log-Files

    Science.gov (United States)

    Abele, Stephan

    2018-01-01

    This article deals with a theory-based investigation of the diagnostic problem-solving process in professional contexts. To begin with, a theory of the diagnostic problem-solving process was developed drawing on findings from different professional contexts. The theory distinguishes between four sub-processes of the diagnostic problem-solving…

  16. Changes of regional cerebral glucose metabolism in normal aging process : A study with FDG PET

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Joon Kee; Kim, Sang Eun; Lee, Kyung Han; Choi, Yong; Choe, Yearn Seong; Kim, Byung Tae [Sungkyunkwan Univ., School of Medicine, Seoul (Korea, Republic of)

    2001-08-01

    Normal aging results in detectable changes in brain structure and function. We evaluated the changes of regional cerebral glucose metabolism in the normal aging process with FDG PET. Brain PET images were obtained in 44 healthy volunteers (age range 20-69 y; M:F = 29:15) who had no history of neuropsychiatric disorders. On 6 representative transaxial images, ROIs were drawn in the cortical and subcortical areas. Regional FDG uptake was normalized using whole-brain uptake to adjust for the injected dose and correct for nonspecific declines of glucose metabolism affecting all brain areas equally. In the prefrontal, temporoparietal, and primary sensorimotor cortex, the normalized FDG uptake (NFU) reached a peak in subjects in their 30s. The NFU in the prefrontal and primary sensorimotor cortex declined with age after the 30s at rates of 3.15%/decade and 1.93%/decade, respectively. However, the NFU in the temporoparietal cortex did not change significantly with age after the 30s. The anterior (prefrontal) to posterior (temporoparietal) gradient peaked in subjects in their 30s and declined with age thereafter at a rate of 35%/decade. The NFU in the caudate nucleus decreased with age after the 20s at a rate of 2.39%/decade. In the primary visual cortex, putamen, and thalamus, the NFU values did not change significantly throughout the ages covered. These patterns were not significantly different between the right and left cerebral hemispheres. Of interest was that the NFU in the left cerebellar cortex increased with age after the 20s at a rate of 2.86%/decade. These data demonstrate regional variation of the age-related changes in cerebral glucose metabolism, with the most prominent age-related decline in the prefrontal cortex. The increase in cerebellar metabolism with age might reflect a process of neuronal plasticity associated with aging.
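
    The normalization step described above is simple enough to state in code; this sketch uses invented ROI values purely for illustration:

```python
# Hypothetical mean FDG uptake per ROI (arbitrary units).
roi_uptake = {
    "prefrontal": 6.2,
    "temporoparietal": 6.0,
    "primary_sensorimotor": 5.8,
    "left_cerebellar": 4.9,
}
whole_brain_mean = 5.5  # global mean used as the normalization reference

# Normalized FDG uptake (NFU): removes injected-dose differences and
# nonspecific global declines, leaving regional contrasts.
nfu = {roi: uptake / whole_brain_mean for roi, uptake in roi_uptake.items()}
for roi, value in sorted(nfu.items()):
    print(f"{roi}: {value:.2f}")
```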

  17. Changes of regional cerebral glucose metabolism in normal aging process : A study with FDG PET

    International Nuclear Information System (INIS)

    Yoon, Joon Kee; Kim, Sang Eun; Lee, Kyung Han; Choi, Yong; Choe, Yearn Seong; Kim, Byung Tae

    2001-01-01

    Normal aging results in detectable changes in brain structure and function. We evaluated the changes of regional cerebral glucose metabolism in the normal aging process with FDG PET. Brain PET images were obtained in 44 healthy volunteers (age range 20-69 y; M:F = 29:15) who had no history of neuropsychiatric disorders. On 6 representative transaxial images, ROIs were drawn in the cortical and subcortical areas. Regional FDG uptake was normalized using whole-brain uptake to adjust for the injected dose and correct for nonspecific declines of glucose metabolism affecting all brain areas equally. In the prefrontal, temporoparietal, and primary sensorimotor cortex, the normalized FDG uptake (NFU) reached a peak in subjects in their 30s. The NFU in the prefrontal and primary sensorimotor cortex declined with age after the 30s at rates of 3.15%/decade and 1.93%/decade, respectively. However, the NFU in the temporoparietal cortex did not change significantly with age after the 30s. The anterior (prefrontal) to posterior (temporoparietal) gradient peaked in subjects in their 30s and declined with age thereafter at a rate of 35%/decade. The NFU in the caudate nucleus decreased with age after the 20s at a rate of 2.39%/decade. In the primary visual cortex, putamen, and thalamus, the NFU values did not change significantly throughout the ages covered. These patterns were not significantly different between the right and left cerebral hemispheres. Of interest was that the NFU in the left cerebellar cortex increased with age after the 20s at a rate of 2.86%/decade. These data demonstrate regional variation of the age-related changes in cerebral glucose metabolism, with the most prominent age-related decline in the prefrontal cortex. The increase in cerebellar metabolism with age might reflect a process of neuronal plasticity associated with aging.

  18. EFFECTIVE ACTIONS FOR HETEROTIC STRING THEORY

    NARCIS (Netherlands)

    SUELMANN, H

    Heterotic String Theory (HST) is an attempt to construct a description of nature that is more satisfying than the Standard Model. A major problem is that it is very difficult to do explicit calculations in string theory. Therefore, it is useful to construct a 'normal' field theory that approximates HST.

  19. A new integrability theory for certain nonlinear physical problems

    International Nuclear Information System (INIS)

    Berger, M.S.

    1993-01-01

    A new mathematically sound integrability theory for certain nonlinear problems defined by ordinary or partial differential equations is defined. The new theory works in an arbitrary finite number of space dimensions. Moreover, if a system is integrable in the new sense described here, it has a remarkable stability property that distinguishes it from any previously known integrability ideas. The new theory proceeds by establishing a ''global normal form'' for the problem at hand. This normal form holds subject to canonical coordinate transformations, extending such classical ideas by using new nonlinear methods of infinite-dimensional functional analysis. The global normal form in question is related to the mathematical theory of singularities of mappings of H. Whitney and R. Thom, extended globally and from finite to infinite dimensions. Thus bifurcation phenomena are naturally included in the new integrability theory. Typical examples include the classically nonintegrable Riccati equation, certain non-Euclidean mean field theories, certain parabolic reaction-diffusion equations, and the hyperbolic nonlinear telegrapher's equation. (Author)

  20. Thermalization in a holographic confining gauge theory

    Science.gov (United States)

    Ishii, Takaaki; Kiritsis, Elias; Rosen, Christopher

    2015-08-01

    Time dependent perturbations of states in the holographic dual of a 3+1 dimensional confining theory are considered. The perturbations are induced by varying the coupling to the theory's most relevant operator. The dual gravitational theory belongs to a class of Einstein-dilaton theories which exhibit a mass gap at zero temperature and a first order deconfining phase transition at finite temperature. The perturbation is realized in various thermal bulk solutions by specifying time dependent boundary conditions on the scalar, and we solve the fully backreacted Einstein-dilaton equations of motion subject to these boundary conditions. We compute the characteristic time scale of many thermalization processes, noting that in every case we examine, this time scale is determined by the imaginary part of the lowest lying quasi-normal mode of the final state black brane. We quantify the dependence of this final state on parameters of the quench, and construct a dynamical phase diagram. Further support for a universal scaling regime in the abrupt quench limit is provided.
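
    The role of the lowest quasi-normal mode can be stated compactly: a small perturbation of the final-state black brane decays as (schematically, in the linearized regime)

```latex
\delta\mathcal{O}(t) \sim e^{-i\omega_{\mathrm{QNM}} t},
\qquad
\tau_{\mathrm{eq}} \simeq \frac{1}{|\operatorname{Im}\,\omega_{\mathrm{QNM}}|}
```

    so the thermalization time scale reported in the abstract is set by the imaginary part of the lowest-lying quasi-normal frequency of the final state.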

  1. Proteolytic processing of connective tissue growth factor in normal ocular tissues and during corneal wound healing.

    Science.gov (United States)

    Robinson, Paulette M; Smith, Tyler S; Patel, Dilan; Dave, Meera; Lewin, Alfred S; Pi, Liya; Scott, Edward W; Tuli, Sonal S; Schultz, Gregory S

    2012-12-13

    Connective tissue growth factor (CTGF) is a fibrogenic cytokine that is up-regulated by TGF-β and mediates most key fibrotic actions of TGF-β, including stimulation of synthesis of extracellular matrix and differentiation of fibroblasts into myofibroblasts. This study addresses the role of proteolytic processing of CTGF in human corneal fibroblasts (HCF) stimulated with TGF-β, normal ocular tissues and wounded corneas. Proteolytic processing of CTGF in HCF cultures, normal animal eyes, and excimer laser wounded rat corneas were examined by Western blot. The identity of a 21-kDa band was determined by tandem mass spectrometry, and possible alternative splice variants of CTGF were assessed by 5' Rapid Amplification of cDNA Ends (RACE). HCF stimulated by TGF-β contained full length 38-kDa CTGF and fragments of 25, 21, 18, and 13 kDa, while conditioned medium contained full length 38- and a 21-kDa fragment of CTGF that contained the middle "hinge" region of CTGF. Fragmentation of recombinant CTGF incubated in HCF extracts was blocked by the aspartate protease inhibitor, pepstatin. Normal mouse, rat, and rabbit whole eyes and rabbit ocular tissues contained abundant amounts of C-terminal 25- and 21-kDa fragments and trace amounts of 38-kDa CTGF, although no alternative transcripts were detected. All forms of CTGF (38, 25, and 21 kDa) were detected during healing of excimer ablated rat corneas, peaking on day 11. Proteolytic processing of 38-kDa CTGF occurs during corneal wound healing, which may have important implications in regulation of corneal scar formation.

  2. Emotion effects on implicit and explicit musical memory in normal aging.

    Science.gov (United States)

    Narme, Pauline; Peretz, Isabelle; Strub, Marie-Laure; Ergis, Anne-Marie

    2016-12-01

    Normal aging affects explicit memory while leaving implicit memory relatively spared. Normal aging also modifies how emotions are processed and experienced, with increasing evidence that older adults (OAs) focus more on positive information than younger adults (YAs). The aim of the present study was to investigate how age-related changes in emotion processing influence explicit and implicit memory. We used emotional melodies that differed in terms of valence (positive or negative) and arousal (high or low). Implicit memory was assessed with a preference task exploiting exposure effects, and explicit memory with a recognition task. Results indicated that effects of valence and arousal interacted to modulate both implicit and explicit memory in YAs. In OAs, recognition was poorer than in YAs; however, recognition of positive and high-arousal (happy) studied melodies was comparable. Insofar as socioemotional selectivity theory (SST) predicts a preservation of the recognition of positive information, our findings are not fully consistent with the extension of this theory to positive melodies since recognition of low-arousal (peaceful) studied melodies was poorer in OAs. In the preference task, YAs showed stronger exposure effects than OAs, suggesting an age-related decline of implicit memory. This impairment is smaller than the one observed for explicit memory (recognition), extending to the musical domain the dissociation between explicit memory decline and implicit memory relative preservation in aging. Finally, the disproportionate preference for positive material seen in OAs did not translate into stronger exposure effects for positive material, suggesting no age-related emotional bias in implicit memory.

  3. Fuzzy-trace theory: dual processes in memory, reasoning, and cognitive neuroscience.

    Science.gov (United States)

    Brainerd, C J; Reyna, V F

    2001-01-01

    Fuzzy-trace theory has evolved in response to counterintuitive data on how memory development influences the development of reasoning. The two traditional perspectives on memory-reasoning relations--the necessity and constructivist hypotheses--stipulate that the accuracy of children's memory for problem information and the accuracy of their reasoning are closely intertwined, albeit for different reasons. However, contrary to necessity, correlational and experimental dissociations have been found between children's memory for problem information that is determinative in solving certain problems and their solutions of those problems. In these same tasks, age changes in memory for problem information appear to be dissociated from age changes in reasoning. Contrary to constructivism, correlational and experimental dissociations also have been found between children's performance on memory tests for actual experience and memory tests for the meaning of experience. As in memory-reasoning studies, age changes in one type of memory performance do not seem to be closely connected to age changes in the other type of performance. Subsequent experiments have led to dual-process accounts in both the memory and reasoning spheres. The account of memory development features four other principles: parallel verbatim-gist storage, dissociated verbatim-gist retrieval, memorial bases of conscious recollection, and identity/similarity processes. The account of the development of reasoning features three principles: gist extraction, fuzzy-to-verbatim continua, and fuzzy-processing preferences. The fuzzy-processing preference is a particularly important notion because it implies that gist-based intuitive reasoning often suffices to deliver "logical" solutions and that such reasoning confers multiple cognitive advantages that enhance accuracy. The explanation of memory-reasoning dissociations in cognitive development then falls out of fuzzy-trace theory's dual-process models of memory and

  4. Arrays of surface-normal electroabsorption modulators for the generation and signal processing of microwave photonics signals

    NARCIS (Netherlands)

    Noharet, Bertrand; Wang, Qin; Platt, Duncan; Junique, Stéphane; Marpaung, D.A.I.; Roeloffzen, C.G.H.

    2011-01-01

    The development of an array of 16 surface-normal electroabsorption modulators operating at 1550 nm is presented. The modulator array is dedicated to the generation and processing of microwave photonics signals, targeting a modulation bandwidth in excess of 5 GHz. The hybrid integration of the

  5. Professional Socialization: A Grounded Theory of the Clinical Reasoning Processes That RNs and LPNs Use to Recognize Delirium.

    Science.gov (United States)

    El Hussein, Mohamed; Hirst, Sandra; Osuji, Joseph

    2017-08-01

    Delirium is an acute disorder of attention and cognition. It affects half of older adults in acute care settings and is a cause of increasing mortality and costs. Registered nurses (RNs) and licensed practical nurses (LPNs) frequently fail to recognize delirium. The goals of this research were to identify the reasoning processes that RNs and LPNs use to recognize delirium, to compare their reasoning processes, and to generate a theory that explains their clinical reasoning processes. Theoretical sampling was employed to elicit data from 28 participants using grounded theory methodology. Theoretical coding culminated in the emergence of Professional Socialization as the substantive theory. Professional Socialization emerged from participants' responses and was based on two social processes, specifically reasoning to uncover and reasoning to report. Professional Socialization makes explicit the similarities and variations in the clinical reasoning processes between RNs and LPNs and highlights their main concerns when interacting with delirious patients.

  6. Factors Affecting Christian Parents' School Choice Decision Processes: A Grounded Theory Study

    Science.gov (United States)

    Prichard, Tami G.; Swezey, James A.

    2016-01-01

    This study identifies factors affecting the decision processes for school choice by Christian parents. Grounded theory design incorporated interview transcripts, field notes, and a reflective journal to analyze themes. Comparative analysis, including open, axial, and selective coding, was used to reduce the coded statements to five code families:…

  7. Parameter-free effective field theory calculation for the solar proton-fusion and hep processes

    International Nuclear Information System (INIS)

    T.S. Park; L.E. Marcucci; R. Schiavilla; M. Viviani; A. Kievsky; S. Rosati; K. Kubodera; D.P. Min; M. Rho

    2002-01-01

    Spurred by the recent complete determination of the weak currents in two-nucleon systems up to O(Q³) in heavy-baryon chiral perturbation theory, we carry out a parameter-free calculation of the threshold S-factors for the solar pp (proton-fusion) and hep processes in an effective field theory that combines the merits of the standard nuclear physics method and systematic chiral expansion. The power of the EFT adopted here is that one can correlate in a unified formalism the weak-current matrix elements of two-, three- and four-nucleon systems. Using the tritium β-decay rate as an input to fix the only unknown parameter in the theory, we can evaluate the threshold S-factors with drastically improved precision; the results are S_pp(0) = 3.94 × (1 ± 0.004) × 10⁻²⁵ MeV b and S_hep(0) = (8.6 ± 1.3) × 10⁻²⁰ keV b. The dependence of the calculated S-factors on the momentum cutoff parameter Λ has been examined for a physically reasonable range of Λ. This dependence is found to be extremely small for the pp process, and to be within acceptable levels for the hep process, substantiating the consistency of our calculational scheme.

  8. Reward value-based gain control: divisive normalization in parietal cortex.

    Science.gov (United States)

    Louie, Kenway; Grattan, Lauren E; Glimcher, Paul W

    2011-07-20

    The representation of value is a critical component of decision making. Rational choice theory assumes that options are assigned absolute values, independent of the value or existence of other alternatives. However, context-dependent choice behavior in both animals and humans violates this assumption, suggesting that biological decision processes rely on comparative evaluation. Here we show that neurons in the monkey lateral intraparietal cortex encode a relative form of saccadic value, explicitly dependent on the values of the other available alternatives. Analogous to extra-classical receptive field effects in visual cortex, this relative representation incorporates target values outside the response field and is observed in both stimulus-driven activity and baseline firing rates. This context-dependent modulation is precisely described by divisive normalization, indicating that this standard form of sensory gain control may be a general mechanism of cortical computation. Such normalization in decision circuits effectively implements an adaptive gain control for value coding and provides a possible mechanistic basis for behavioral context-dependent violations of rationality.
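
    The divisive normalization referred to here has a canonical functional form in the sensory-gain-control literature; a generic version for the value-coding case (parameters illustrative) is

```latex
R_{i} = R_{\max}\,\frac{V_{i}}{\sigma + \sum_{j} V_{j}}
```

    where V_i is the value of the target in the response field, the sum runs over all available alternatives (including targets outside the response field), and σ sets the semisaturation level.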

  9. Design for human factors (DfHF): a grounded theory for integrating human factors into production design processes.

    Science.gov (United States)

    Village, Judy; Searcy, Cory; Salustri, Filipo; Patrick Neumann, W

    2015-01-01

    The 'design for human factors' grounded theory explains 'how' human factors (HF) went from a reactive, after-injury programme in safety, to being proactively integrated into each step of the production design process. In this longitudinal case study collaboration with engineers and HF Specialists in a large electronics manufacturer, qualitative data (e.g. meetings, interviews, observations and reflections) were analysed using a grounded theory methodology. The central tenet in the theory is that when HF Specialists acclimated to the engineering process, language and tools, and strategically aligned HF to the design and business goals of the organisation, HF became a means to improve business performance. This led to engineers 'pulling' HF Specialists onto their team. HF targets were adopted into engineering tools to communicate HF concerns quantitatively, drive continuous improvement, visibly demonstrate change and lead to benchmarking. Senior management held engineers accountable for HF as a key performance indicator, thus integrating HF into the production design process. Practitioner Summary: Research and practice lack explanations about how HF can be integrated early in design of production systems. This three-year case study and the theory derived demonstrate how ergonomists changed their focus to align with design and business goals to integrate HF into the design process.

  10. Queer theory and education to approach not normalizing

    Directory of Open Access Journals (Sweden)

    Wendel Souza Santos

    2017-12-01

    Full Text Available The queer analytic, commonly related to gender studies, is a recent conceptual approach. This article aims mainly to bring this perspective to a critical analysis of the educational field. The big challenge in education is thus to rethink what it means to educate, who educates, and who is educated. In a non-normalizing perspective, to educate would be a dialogical activity in which experiences hitherto unviable, unrecognized, or, more commonly, violated begin to be incorporated into the school routine, changing the hierarchy between who teaches and who is taught, and seeking to establish more symmetry between them in order to move from education to a relational and transformative learning for both.

  11. Assertiveness process of Iranian nurse leaders: a grounded theory study.

    Science.gov (United States)

    Mahmoudirad, Gholamhossein; Ahmadi, Fazlollah; Vanaki, Zohreh; Hajizadeh, Ebrahim

    2009-06-01

    The purpose of this study was to explore the assertiveness process in Iranian nursing leaders. A qualitative design based on the grounded theory approach was used to collect and analyze the assertiveness experiences of 12 nurse managers working in four hospitals in Iran. Purposeful and theoretical sampling methods were employed for data collection and selection of the participants, and semistructured interviews were held. During the data analysis, 17 categories emerged, and these were grouped into three themes: "task generation", "assertiveness behavior", and "executive agents". From the participants' experiences, assertiveness theory emerged as being fundamental to the development of a schematic model describing nursing leadership behaviors. Religious beliefs also played a fundamental role in Iranian nursing leadership assertiveness. It was concluded that changes in the current support from top managers and improvements in self-directed learning are required to enhance the assertiveness of nursing leaders in Iran.

  12. Dynamical description of the fission process using the TD-BCS theory

    Energy Technology Data Exchange (ETDEWEB)

    Scamps, Guillaume, E-mail: scamps@nucl.phys.tohoku.ac.jp [Department of Physics, Tohoku University, Sendai 980-8578 (Japan); Simenel, Cédric [Department of Nuclear Physics, Research School of Physics and Engineering Australian National University, Canberra, Australian Capital Territory 2601 (Australia); Lacroix, Denis [Institut de Physique Nucléaire, IN2P3-CNRS, Université Paris-Sud, F-91406 Orsay Cedex (France)

    2015-10-15

    The description of fission remains a challenge for microscopic nuclear theories. The time-dependent Hartree-Fock approach with BCS pairing is applied to study the last stage of the fission process. Good agreement is found for the one-body observables: the total kinetic energy and the average mass asymmetry. The non-physical dependence of two-body observables on the initial shape is discussed.

  13. Seeking Comfort: Women Mental Health Process in I. R. Iran: A Grounded Theory Study

    Science.gov (United States)

    Mohammadi, Farahnaz; Eftekhari, Monir Baradaran; Dejman, Masoumeh; Forouzan, Ameneh Setareh; Mirabzadeh, Arash

    2014-01-01

    Background: Psychosocial factors are considered intermediate social determinants of health, because they have powerful effects on health, especially in women. Hence, a deeper understanding of the mental-health process is needed for its promotion. The aim of this study was to explore women's experience of mental-health problems and the related action-interaction activities, in order to design appropriate interventions. Methods: In-depth interviews with women aged 18-65 years were analyzed according to the grounded theory method. The selection of participants was based on purposeful and theoretical sampling. Results: In this study, a substantive theory was generated explaining how women with mental-health problems handled their main concern, which was identified as their effort to achieve comfort (core variable). The other six categories are elements in this process: daily stress as the trigger, satisfaction as the end point, marriage as the key point, and, as action-interaction activities, strengthening human essence, developing life skills, and help-seeking. Conclusions: Better understanding of the mental-health process might be useful in designing interventional programs for women with mental-health problems. PMID:24627750

  14. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    Science.gov (United States)

    2016-05-12

    Final report (reporting period 15-May-2014 to 14-Feb-2015; approved for public release, distribution unlimited), U.S. Army Research Office, Research Triangle Park, NC 27709-2211. Subject terms: mathematical statistics; time series; Markov chains; random... Report title: Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory. Three areas

  15. Chaos Theory as a Model for Life Transitions Counseling: Nonlinear Dynamics and Life's Changes

    Science.gov (United States)

    Bussolari, Cori J.; Goodell, Judith A.

    2009-01-01

    Chaos theory is presented for counselors working with clients experiencing life transitions. It is proposed as a model that considers disorder, unpredictability, and lack of control as normal parts of transition processes. Nonlinear constructs from physics are adapted for use in counseling. The model provides a method clients can use to…

  16. Renewal theory for perturbed random walks and similar processes

    CERN Document Server

    Iksanov, Alexander

    2016-01-01

    This book offers a detailed review of perturbed random walks, perpetuities, and random processes with immigration. Being of major importance in modern probability theory, both theoretical and applied, these objects have been used to model various phenomena in the natural sciences as well as in insurance and finance. The book also presents the many significant results and efficient techniques and methods that have been worked out in the last decade. The first chapter is devoted to perturbed random walks and discusses their asymptotic behavior and various functionals pertaining to them, including supremum and first-passage time. The second chapter examines perpetuities, presenting results on continuity of their distributions and the existence of moments, as well as weak convergence of divergent perpetuities. Focusing on random processes with immigration, the third chapter investigates the existence of moments, describes long-time behavior and discusses limit theorems, both with and without scaling. Chapters fou...
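
    As a toy illustration of the functionals treated in the first chapter (here for an ordinary Gaussian walk with negative drift, a simplified stand-in for the perturbed walks of the book), the supremum and first-passage time can be estimated by Monte Carlo simulation:

```python
import random

def walk_functionals(n_steps, threshold, drift=-0.1):
    """Simulate one random walk; return its supremum and first-passage time."""
    position, supremum, passage = 0.0, 0.0, None
    for t in range(1, n_steps + 1):
        position += drift + random.gauss(0.0, 1.0)
        supremum = max(supremum, position)
        if passage is None and position > threshold:
            passage = t
    return supremum, passage

random.seed(0)
samples = [walk_functionals(10_000, threshold=5.0) for _ in range(200)]
print("mean supremum:", sum(s for s, _ in samples) / len(samples))
print("walks crossing the threshold:", sum(p is not None for _, p in samples))
```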

  17. Nonlinear closure relations theory for transport processes in nonequilibrium systems

    International Nuclear Information System (INIS)

    Sonnino, Giorgio

    2009-01-01

    A decade ago, a macroscopic theory for closure relations was proposed for systems out of Onsager's region. This theory is referred to as the thermodynamic field theory (TFT). The aim of this work was to determine the nonlinear flux-force relations that respect the thermodynamic theorems for systems far from equilibrium. We propose a formulation of the TFT where one of the basic restrictions, namely the closed-form solution for the skew-symmetric piece of the transport coefficients, has been removed. In addition, the general covariance principle is replaced by the De Donder-Prigogine thermodynamic covariance principle (TCP). The introduction of the TCP requires the application of an appropriate mathematical formalism, which is referred to as the entropy-covariant formalism. By geometrical arguments, we prove the validity of the Glansdorff-Prigogine universal criterion of evolution. A new set of closure equations determining the nonlinear corrections to the linear ('Onsager') transport coefficients is also derived. The geometry of the thermodynamic space is non-Riemannian; however, it tends to be Riemannian for high values of the entropy production. In this limit, we recover the transport equations found by the old theory. Applications of our approach to transport in magnetically confined plasmas, in materials subjected to temperature and electric potential gradients, or in unimolecular triangular chemical reactions can be found in the references cited herein. Transport processes in tokamak plasmas are of particular interest. In this case, even in the absence of turbulence, the state of the plasma remains close to (but is not in) a state of local equilibrium. This prevents the transport relations from being linear.

  18. Progress in the application of classical S-matrix theory to inelastic collision processes

    International Nuclear Information System (INIS)

    McCurdy, C.W.; Miller, W.H.

    1980-01-01

    Methods are described which effectively solve two of the technical difficulties associated with applying classical S-matrix theory to inelastic/reactive scattering. Specifically, it is shown that rather standard numerical methods can be used to solve the "root search" problem (i.e., the nonlinear boundary value problem necessary to impose semiclassical quantum conditions at the beginning and the end of the classical trajectories), and also how complex classical trajectories, which are necessary to describe classically forbidden (i.e., tunneling) processes, can be computed in a numerically stable way. Application is made to vibrational relaxation of H₂ by collision with He (within the helicity conserving approximation). The only remaining problem with regard to applying classical S-matrix theory to complex collision processes has to do with the availability of multidimensional uniform asymptotic formulas for interpolating the "primitive" semiclassical expressions between their various regions of validity.

  19. High-energy, large-momentum-transfer processes: Ladder diagrams in φ³ theory. Pt. 1

    International Nuclear Information System (INIS)

    Osland, P.; Wu, T.T.; Harvard Univ., Cambridge, MA

    1987-01-01

    Relativistic quantum field theories may give us useful guidance to understanding high-energy, large-momentum-transfer processes, where the center-of-mass energy is much larger than the transverse momentum transfers, which are in turn much larger than the masses of the participating particles. With this possibility in mind, we study the ladder diagrams in φ³ theory. In this paper, some of the necessary techniques are developed and applied to the simplest cases of the fourth- and sixth-order ladder diagrams. (orig.)

  20. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory

    Directory of Open Access Journals (Sweden)

    Thierry Pelaccia

    2011-03-01

    Full Text Available Context. Clinical reasoning plays a major role in the ability of doctors to make diagnoses and decisions. It is considered the physician's most critical competence, and has been widely studied by physicians, educationalists, psychologists and sociologists. Since the 1970s, many theories about clinical reasoning in medicine have been put forward. Purpose. This paper aims at exploring a comprehensive approach: the “dual-process theory”, a model developed by cognitive psychologists over the last few years. Discussion. After 40 years of sometimes contradictory studies on clinical reasoning, the dual-process theory gives us many answers on how doctors think while making diagnoses and decisions. It highlights the importance of physicians’ intuition and the high level of interaction between analytical and non-analytical processes. However, it has not received much attention in the medical education literature. The implications of dual-process models of reasoning in terms of medical education will be discussed.

  1. Normality in Analytical Psychology

    Science.gov (United States)

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  2. Normality in Analytical Psychology

    Directory of Open Access Journals (Sweden)

    Steve Myers

    2013-11-01

    Full Text Available Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity.

  3. BOOK REVIEW: Theory of Neural Information Processing Systems

    Science.gov (United States)

    Galla, Tobias

    2006-04-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10¹¹ neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kühn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the

  4. Behavioural investigations into uncertainty perception in service exchanges: Lessons from dual-processing theory

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2015-01-01

    by experience and knowledge. Based on dual-processing theory, this paper proposes an analysis method for assessing both explicit and implicit uncertainty perception depending on the individual’s use of tacit or explicit knowledge. Analysing two industrial case studies of service relationships, this paper...

  5. Can practice theory inspire studies of ICTs in everyday life?

    DEFF Research Database (Denmark)

    Christensen, Toke Haunstrup; Røpke, Inge

    2010-01-01

    a new ‘normality’ in everyday life: the expectations and conventions regarding a normal home’s necessary ‘infrastructure’ and the ordinary gear for a normal way of life are changing, and the changes are proceeding rapidly. This chapter takes a closer look at the construction of a new normality in everyday life and discusses how this development can be studied from the perspective of practice theory. We show how a practice theory approach shifts the analytic focus away from the consumption of ICT as such and toward the practices that integrate ICT as one element among many others. Thereby, a practice theory approach helps us to avoid the risk of ending up with a ‘media-centric’ understanding of the use of new media and adds interesting details and subtleties to the study of the construction of a new normality in everyday life. Our application of practice theory in the study

  6. Toward a Philosophy and Theory of Volumetric Nonthermal Processing.

    Science.gov (United States)

    Sastry, Sudhir K

    2016-06-01

    Nonthermal processes for food preservation have been under intensive investigation for about the past quarter century, with varying degrees of success. We focus this discussion on two volumetrically acting nonthermal processes, high pressure processing (HPP) and pulsed electric fields (PEF), with emphasis on scientific understanding of each, and the research questions that need to be addressed for each to be more successful in the future. We discuss the character or "philosophy" of food preservation, with a question about the nature of the kill step(s), and the sensing challenges that need to be addressed. For HPP, key questions and needs center around whether its nonthermal effectiveness can be increased by increased pressures or pulsing, the theoretical treatment of rates of reaction as influenced by pressure, the assumption of uniform pressure distribution, and the need for (and difficulties involved in) in-situ measurement. For PEF, the questions include the rationale for pulsing, difficulties involved in continuous flow treatment chambers, the difference between electroporation theory and experimental observations, and the difficulties involved in in-situ measurement and monitoring of electric field distribution. © 2016 Institute of Food Technologists®

  7. On generalisations of the log-Normal distribution by means of a new product definition in the Kapteyn process

    Science.gov (United States)

    Duarte Queirós, Sílvio M.

    2012-07-01

    We discuss the modification of the Kapteyn multiplicative process using the q-product of Borges [E.P. Borges, A possible deformed algebra and calculus inspired in nonextensive thermostatistics, Physica A 340 (2004) 95]. Depending on the value of the index q, a generalisation of the log-Normal distribution is yielded. Namely, the distribution increases the tail for small (when q < 1) or large (when q > 1) values of the variable under analysis. The usual log-Normal distribution is retrieved when q = 1, which corresponds to the traditional Kapteyn multiplicative process. The main statistical features of this distribution, as well as related random number generators and tables of quantiles of the Kolmogorov-Smirnov distance, are presented. Finally, we illustrate the validity of this scenario by describing a set of variables of biological and financial origin.
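    A minimal numerical sketch of this construction, assuming Borges' q-product x ⊗_q y = [x^(1−q) + y^(1−q) − 1]^(1/(1−q)) (set to 0 when the bracket is non-positive); the log-normal shocks and all parameter values below are illustrative choices, not taken from the paper:

```python
import numpy as np

def q_product(x, y, q):
    """Borges' q-product; reduces to the ordinary product x*y as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return x * y
    base = x ** (1.0 - q) + y ** (1.0 - q) - 1.0
    out = np.zeros_like(base)
    pos = base > 0
    out[pos] = base[pos] ** (1.0 / (1.0 - q))
    return out

def kapteyn_q(n_steps, n_samples, q, rng):
    """Deformed Kapteyn multiplicative process: X_(j+1) = X_j (x)_q f_j,
    with positive i.i.d. multiplicative shocks f_j."""
    x = np.ones(n_samples)
    for _ in range(n_steps):
        f = rng.lognormal(mean=0.0, sigma=0.1, size=n_samples)
        x = q_product(x, f, q)
    return x

rng = np.random.default_rng(0)
for q in (1.0, 0.9, 1.1):
    x = kapteyn_q(200, 10_000, q, rng)
    logx = np.log(x[x > 0])
    print(f"q={q}: mean(log x)={logx.mean():.3f}, std(log x)={logx.std():.3f}")
```

    At q = 1 the ordinary product is recovered and log X is Gaussian by the central limit theorem; moving q away from 1 deforms the tails in the manner the abstract describes.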

  8. Reflective processes of practitioners in head and neck cancer rehabilitation: a grounded theory study.

    Science.gov (United States)

    Caty, Marie-Ève; Kinsella, Elizabeth Anne; Doyle, Philip C

    2016-12-01

    This study systematically examined how experienced Speech-Language Pathologists (SLPs) use the processes of reflection to develop knowledge relevant for practice in the context of head and neck cancer (HNC) rehabilitation. In-depth, semi-structured interviews were conducted with 12 SLPs working in HNC rehabilitation in North America. Grounded theory methodology was adopted for data collection and analysis. The findings inform a preliminary reflective practice model that depicts the processes of reflection used by the practitioners interviewed. Nine categories of reflective processes were identified among participant SLPs: ongoing questioning, experimenting through trial and error, integrating knowledge from past cases, embracing surprise, thinking out of the box, being in the moment, consulting with colleagues, putting oneself in the patients' shoes, and discerning ethical issues. These findings provide empirical evidence that supports Schön's theory of reflective practice and contribute to knowledge about the ways in which SLPs use processes of reflection in the context of HNC rehabilitation. The findings of this study have implications for how SLPs perceive and consider their role as knowledge-users and knowledge-producers in their day-to-day clinical work, as well as for building capacity for reflective practice.

  9. Asymptotic theory for the sample covariance matrix of a heavy-tailed multivariate time series

    DEFF Research Database (Denmark)

    Davis, Richard A.; Mikosch, Thomas Valentin; Pfaffel, Olivier

    2016-01-01

    In this paper we give an asymptotic theory for the eigenvalues of the sample covariance matrix of a multivariate time series. The time series constitutes a linear process across time and between components. The input noise of the linear process has regularly varying tails with index α∈(0,4); in particular, the time series has infinite fourth moment. We derive the limiting behavior for the largest eigenvalues of the sample covariance matrix and show point process convergence of the normalized eigenvalues. The limiting process has an explicit form involving points of a Poisson process and eigenvalues of a non-negative definite matrix. Based on this convergence we derive limit theory for a host of other continuous functionals of the eigenvalues, including the joint convergence of the largest eigenvalues, the joint convergence of the largest eigenvalue and the trace of the sample covariance matrix...
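    A small simulation sketch of this setting, assuming i.i.d. symmetric Pareto-type noise with tail index α < 4 (so the fourth moment is infinite) and, for simplicity, no linear dependence across time or components; the dimensions and α are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

def top_eigenvalues(alpha, p, n, k=3):
    """Largest eigenvalues of the sample covariance matrix X X^T / n for a
    p-dimensional series of length n with symmetric Pareto(alpha) entries;
    for alpha < 4 a few extreme observations dominate the spectrum."""
    u = rng.uniform(size=(p, n))
    x = rng.choice([-1.0, 1.0], size=(p, n)) * u ** (-1.0 / alpha)
    sample_cov = x @ x.T / n
    return np.sort(np.linalg.eigvalsh(sample_cov))[::-1][:k]

print(top_eigenvalues(alpha=1.5, p=50, n=2_000))
```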

  10. Direct social perception and dual process theories of mindreading.

    Science.gov (United States)

    Herschbach, Mitchell

    2015-11-01

    The direct social perception (DSP) thesis claims that we can directly perceive some mental states of other people. The direct perception of mental states has been formulated phenomenologically and psychologically, and typically restricted to the mental state types of intentions and emotions. I will compare DSP to another account of mindreading: dual process accounts that posit a fast, automatic "Type 1" form of mindreading and a slow, effortful "Type 2" form. I will here analyze whether dual process accounts' Type 1 mindreading serves as a rival to DSP or whether some Type 1 mindreading can be perceptual. I will focus on Apperly and Butterfill's dual process account of mindreading epistemic states such as perception, knowledge, and belief. This account posits a minimal form of Type 1 mindreading of belief-like states called registrations. I will argue that general dual process theories fit well with a modular view of perception that is considered a kind of Type 1 process. I will show that this modular view of perception challenges and has significant advantages over DSP's phenomenological and psychological theses. Finally, I will argue that if such a modular view of perception is accepted, there is significant reason for thinking Type 1 mindreading of belief-like states is perceptual in nature. This would mean extending the scope of DSP to at least one type of epistemic state.

  11. Integer, fractional, and anomalous quantum Hall effects explained with Eyring's rate process theory and free volume concept.

    Science.gov (United States)

    Hao, Tian

    2017-02-22

    The Hall effects, especially the integer, fractional and anomalous quantum Hall effects, have been addressed using Eyring's rate process theory and the free volume concept. The basic assumptions are that the conduction process is a common rate-controlled "reaction" process that can be described with Eyring's absolute rate process theory, and that the mobility of electrons should depend on the free volume available for conduction electrons. The obtained Hall conductivity is clearly quantized, in units of e²/h, with prefactors related to both the magnetic flux quantum number and the magnetic quantum number via the azimuthal quantum number, with and without an externally applied magnetic field. This article focuses on two-dimensional (2D) systems, but the approaches developed here can be extended to 3D systems.

  12. Challenges in clinical natural language processing for automated disorder normalization.

    Science.gov (United States)

    Leaman, Robert; Khare, Ritu; Lu, Zhiyong

    2015-10-01

    Identifying key variables such as disorders within the clinical narratives in electronic health records has wide-ranging applications within clinical practice and biomedical research. Previous research has demonstrated reduced performance of disorder named entity recognition (NER) and normalization (or grounding) in clinical narratives compared with biomedical publications. In this work, we aim to identify the cause of this performance difference and introduce general solutions. We use closure properties to compare the richness of the vocabulary in clinical narrative text to biomedical publications. We approach both disorder NER and normalization using machine learning methodologies. Our NER methodology is based on linear-chain conditional random fields with a rich feature approach, and we introduce several improvements to enhance the lexical knowledge of the NER system. Our normalization method - never previously applied to clinical data - uses pairwise learning to rank to automatically learn term variation directly from the training data. We find that while the size of the overall vocabulary is similar between clinical narrative and biomedical publications, clinical narrative uses a richer terminology to describe disorders than publications. We apply our system, DNorm-C, to locate and normalize disorder mentions in the clinical narratives from the recent ShARe/CLEF eHealth Task. For NER (strict span-only), our system achieves precision=0.797, recall=0.713, f-score=0.753. For the normalization task (strict span+concept) it achieves precision=0.712, recall=0.637, f-score=0.672. The improvements described in this article increase the NER f-score by 0.039 and the normalization f-score by 0.036. We also describe a high-recall version of the NER, which increases the normalization recall to as high as 0.744, albeit with reduced precision. We perform an error analysis, demonstrating that NER errors outnumber normalization errors by more than 4-to-1. Abbreviations and acronyms are found
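    As a quick sanity check, the reported f-scores are the harmonic means of the quoted precision and recall values:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

print(round(f1(0.797, 0.713), 3))  # NER (strict span-only): 0.753
print(round(f1(0.712, 0.637), 3))  # normalization (strict span+concept): 0.672
```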

  13. Intervention mapping: a process for developing theory- and evidence-based health education programs.

    Science.gov (United States)

    Bartholomew, L K; Parcel, G S; Kok, G

    1998-10-01

    The practice of health education involves three major program-planning activities: needs assessment, program development, and evaluation. Over the past 20 years, significant enhancements have been made to the conceptual base and practice of health education. Models that outline explicit procedures and detailed conceptualization of community assessment and evaluation have been developed. Other advancements include the application of theory to health education and promotion program development and implementation. However, there remains a need for more explicit specification of the processes by which one uses theory and empirical findings to develop interventions. This article presents the origins, purpose, and description of Intervention Mapping, a framework for health education intervention development. Intervention Mapping is composed of five steps: (1) creating a matrix of proximal program objectives, (2) selecting theory-based intervention methods and practical strategies, (3) designing and organizing a program, (4) specifying adoption and implementation plans, and (5) generating program evaluation plans.

  14. On Kaluza-Klein theory

    International Nuclear Information System (INIS)

    Salam, A.; Strathdee, J.

    1981-10-01

    Assuming the compactification of 4+K-dimensional spacetime implied in Kaluza-Klein type theories, we consider the case in which the internal manifold is a quotient space, G/H. We develop normal mode expansions on the internal manifold and show that the conventional gravitational plus Yang-Mills theory (realizing local G symmetry) is obtained in the leading approximation. The higher terms in the expansions give rise to field theories of massive particles. In particular, for the original Kaluza-Klein 4+1-dimensional theory, the higher excitations describe massive, charged, purely spin-2 particles. These belong to infinite dimensional representations of an O(1,2). (author)

  15. Dynamic Training Elements in a Circuit Theory Course to Implement a Self-Directed Learning Process

    Science.gov (United States)

    Krouk, B. I.; Zhuravleva, O. B.

    2009-01-01

    This paper reports on the implementation of a self-directed learning process in a circuit theory course, incorporating dynamic training elements which were designed on the basis of a cybernetic model of cognitive process management. These elements are centrally linked in a dynamic learning frame, created on the monitor screen, which displays the…

  16. Dual-process theory and consumer response to front-of-package nutrition label formats.

    Science.gov (United States)

    Sanjari, S Setareh; Jahn, Steffen; Boztug, Yasemin

    2017-11-01

    Nutrition labeling literature yields fragmented results about the effect of front-of-package (FOP) nutrition label formats on healthy food choice. Specifically, it is unclear which type of nutrition label format is effective across different shopping situations. To address this gap, the present review investigates the available nutrition labeling literature through the prism of dual-process theory, which posits that decisions are made either quickly and automatically (system 1) or slowly and deliberately (system 2). A systematically performed review of nutrition labeling literature returned 59 papers that provide findings that can be explained according to dual-process theory. The findings of these studies suggest that the effectiveness of nutrition label formats is influenced by the consumer's dominant processing system, which is a function of specific contexts and personal variables (eg, motivation, nutrition knowledge, time pressure, and depletion). Examination of reported findings through a situational processing perspective reveals that consumers might prefer different FOP nutrition label formats in different situations and can exhibit varying responses to the same label format across situations. This review offers several suggestions for policy makers and researchers to help improve current FOP nutrition label formats.

  17. Tuned with a tune: Talker normalization via general auditory processes

    Directory of Open Access Journals (Sweden)

    Erika J C Laing

    2012-06-01

    Full Text Available Voices have unique acoustic signatures, contributing to the acoustic variability listeners must contend with in perceiving speech, and it has long been proposed that listeners normalize speech perception to information extracted from a talker’s speech. Initial attempts to explain talker normalization relied on extraction of articulatory referents, but recent studies of context-dependent auditory perception suggest that general auditory referents such as the long-term average spectrum (LTAS of a talker’s speech similarly affect speech perception. The present study aimed to differentiate the contributions of articulatory/linguistic versus auditory referents for context-driven talker normalization effects and, more specifically, to identify the specific constraints under which such contexts impact speech perception. Synthesized sentences manipulated to sound like different talkers influenced categorization of a subsequent speech target only when differences in the sentences’ LTAS were in the frequency range of the acoustic cues relevant for the target phonemic contrast. This effect was true both for speech targets preceded by spoken sentence contexts and for targets preceded by nonspeech tone sequences that were LTAS-matched to the spoken sentence contexts. Specific LTAS characteristics, rather than perceived talker, predicted the results suggesting that general auditory mechanisms play an important role in effects considered to be instances of perceptual talker normalization.
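    A minimal sketch of how a long-term average spectrum and a band level might be computed, assuming Welch's method as the spectral estimator; the sampling rate, window length, band edges, and the white-noise stand-in signal are illustrative assumptions, not the study's analysis parameters:

```python
import numpy as np
from scipy.signal import welch

def ltas(signal, fs, nperseg=1024):
    """Long-term average spectrum: power spectral density averaged over
    the whole utterance (Welch's method), returned in dB."""
    freqs, psd = welch(signal, fs=fs, nperseg=nperseg)
    return freqs, 10.0 * np.log10(psd + 1e-12)

def band_level(freqs, psd_db, lo, hi):
    """Mean level in a frequency band, e.g. the region carrying the
    acoustic cues for the target phonemic contrast."""
    mask = (freqs >= lo) & (freqs <= hi)
    return psd_db[mask].mean()

fs = 16_000
context = np.random.default_rng(1).standard_normal(fs)  # 1 s noise stand-in
freqs, psd_db = ltas(context, fs)
print(band_level(freqs, psd_db, 1000, 3000))  # level in an assumed cue region
```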

  18. Exact perturbation theory of multiphoton processes at high intensities. [Schroedinger equation, perturbation theory, matrix

    Energy Technology Data Exchange (ETDEWEB)

    Faisal, F H.M. [Bielefeld Univ. (Germany, F.R.). Fakultaet fuer Physik

    1976-06-11

    In this work the perturbation theory for multiphoton processes at high intensities is investigated, and an analytical method is described for summing the perturbation series to extract the contribution from all terms that give rise to the absorption of N photons by an atomic system. The method is first applied to the solution of a simple model problem, and the result is confirmed by direct integration of the model Schroedinger equation. The usual lowest (nonvanishing)-order perturbation-theoretical calculation is also carried out for this model to demonstrate explicitly that the full result correctly reproduces that of the lowest-order theory in the limit of low intensity. The method is then extended to the case of an atomic system with a well-developed spectrum (e.g. the H atom), and the N-photon T-matrix is derived in terms of a "photon matrix" a_N, for which a three-term recurrence relation is established. Next, from the vantage point of the general result obtained here, a probe is made into the nature of several approximate nonperturbative solutions that have appeared in the literature in the past. It is shown here that their applicability is severely restricted by the requirement of essential spectral degeneracy of the atomic system. Finally, appendix A outlines a prescription for computing the photon matrix a_N, which (as in the usual lowest-order perturbation-theoretical calculation) requires knowledge of the eigenfunctions and eigenvalues of the atomic Hamiltonian only.

  19. Practice Evaluation Strategies Among Social Workers: Why an Evidence-Informed Dual-Process Theory Still Matters.

    Science.gov (United States)

    Davis, Thomas D

    2017-01-01

    Practice evaluation strategies range in style from the formal-analytic tools of single-subject designs, rapid assessment instruments, algorithmic steps in evidence-informed practice, and computer software applications, to the informal-interactive tools of clinical supervision, consultation with colleagues, use of client feedback, and clinical experience. The purpose of this article is to provide practice researchers in social work with an evidence-informed theory that is capable of explaining both how and why social workers use practice evaluation strategies to self-monitor the effectiveness of their interventions in terms of client change. The author delineates the theoretical contours and consequences of what is called dual-process theory. Drawing on evidence-informed advances in the cognitive and social neurosciences, the author identifies among everyday social workers a theoretically stable, informal-interactive tool preference that is a cognitively necessary, sufficient, and stand-alone preference that requires neither the supplementation nor balance of formal-analytic tools. The author's delineation of dual-process theory represents a theoretical contribution in the century-old attempt to understand how and why social workers evaluate their practice the way they do.

  20. Processing Capacity under Perceptual and Cognitive Load: A Closer Look at Load Theory

    Science.gov (United States)

    Fitousi, Daniel; Wenger, Michael J.

    2011-01-01

    Variations in perceptual and cognitive demands (load) play a major role in determining the efficiency of selective attention. According to load theory (Lavie, Hirst, Fockert, & Viding, 2004) these factors (a) improve or hamper selectivity by altering the way resources (e.g., processing capacity) are allocated, and (b) tap resources rather than…

  1. Grounded theory: building a middle-range theory in nursing

    Directory of Open Access Journals (Sweden)

    Maria João Fernandes

    2015-03-01

    Full Text Available The development of nursing as a discipline results from a boom of investigations underway for nearly a century, and from the construction of theories that arose during the 1950s, with greater relevance since the 1960s. Continuing the production of knowledge in nursing and seeking to contribute to the increase in the number of explanatory theories of the functional content of nurses, there is interest in answering the question: how can a middle-range theory in nursing be built that explains the nurse-elderly interaction in a successful aging process? We also address the goal of describing the process of building a middle-range theory in nursing. Middle-range theory refers to a qualitative paradigm study of inductive thinking, developed in the context of primary health care. The information was collected through participant observation and interviews. The method of analysis, grounded theory by Corbin and Strauss(1), was followed, utilizing the triangulation of data and theoretical sampling. Grounded theory has become a method of analysis which facilitates the understanding and explanation of the phenomenon under study. By making clear the nature and process of the nurse-elderly interaction in the selected context and within the context of successful aging, a middle-range theory proposal emerged.

  2. Thermalization in a holographic confining gauge theory

    International Nuclear Information System (INIS)

    Ishii, Takaaki; Kiritsis, Elias; Rosen, Christopher

    2015-01-01

    Time dependent perturbations of states in the holographic dual of a 3+1 dimensional confining theory are considered. The perturbations are induced by varying the coupling to the theory’s most relevant operator. The dual gravitational theory belongs to a class of Einstein-dilaton theories which exhibit a mass gap at zero temperature and a first order deconfining phase transition at finite temperature. The perturbation is realized in various thermal bulk solutions by specifying time dependent boundary conditions on the scalar, and we solve the fully backreacted Einstein-dilaton equations of motion subject to these boundary conditions. We compute the characteristic time scale of many thermalization processes, noting that in every case we examine, this time scale is determined by the imaginary part of the lowest lying quasi-normal mode of the final state black brane. We quantify the dependence of this final state on parameters of the quench, and construct a dynamical phase diagram. Further support for a universal scaling regime in the abrupt quench limit is provided.

  3. Nonepileptic seizures under levetiracetam therapy: a case report of forced normalization process.

    Science.gov (United States)

    Anzellotti, Francesca; Franciotti, Raffaella; Zhuzhuni, Holta; D'Amico, Aurelio; Thomas, Astrid; Onofrj, Marco

    2014-01-01

    Nonepileptic seizures (NES) apparently look like epileptic seizures, but are not associated with ictal electrical discharges in the brain. NES constitute one of the most important differential diagnoses of epilepsy. They have been recognized as a distinctive clinical phenomenon for centuries, and video/electroencephalogram monitoring has allowed clinicians to make near-certain diagnoses. NES are supposedly unrelated to organic brain lesions, and despite the preponderance of a psychiatric/psychological context, they may have an iatrogenic origin. We report a patient with NES precipitated by levetiracetam therapy; in this case, NES was observed during the disappearance of epileptiform discharges from the routine video/electroencephalogram. We discuss the possible mechanisms underlying NES with regard to alternative psychoses associated with the phenomenon of the forced normalization process.

  4. Normal mode analysis and applications in biological physics.

    Science.gov (United States)

    Dykeman, Eric C; Sankey, Otto F

    2010-10-27

    Normal mode analysis has become a popular and often used theoretical tool in the study of functional motions in enzymes, viruses, and large protein assemblies. The use of normal modes in the study of these motions is often extremely fruitful since many of the functional motions of large proteins can be described using just a few normal modes which are intimately related to the overall structure of the protein. In this review, we present a broad overview of several popular methods used in the study of normal modes in biological physics including continuum elastic theory, the elastic network model, and a new all-atom method, recently developed, which is capable of computing a subset of the low frequency vibrational modes exactly. After a review of the various methods, we present several examples of applications of normal modes in the study of functional motions, with an emphasis on viral capsids.
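    A minimal sketch of one method mentioned here, the elastic network model in its anisotropic form, assuming a harmonic spring between every pair of residues closer than a cutoff; the cutoff, spring constant, and toy coordinates are illustrative:

```python
import numpy as np

def anm_modes(coords, cutoff=15.0, gamma=1.0):
    """Anisotropic network model: assemble the 3N x 3N Hessian of a network
    of harmonic springs between residues within `cutoff`, then diagonalize.
    Low-frequency eigenvectors approximate collective functional motions."""
    n = len(coords)
    hessian = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            d = coords[j] - coords[i]
            r2 = d @ d
            if r2 > cutoff ** 2:
                continue
            block = -gamma * np.outer(d, d) / r2
            hessian[3*i:3*i+3, 3*j:3*j+3] = block
            hessian[3*j:3*j+3, 3*i:3*i+3] = block
            hessian[3*i:3*i+3, 3*i:3*i+3] -= block
            hessian[3*j:3*j+3, 3*j:3*j+3] -= block
    return np.linalg.eigh(hessian)

coords = np.random.default_rng(2).uniform(0.0, 30.0, size=(50, 3))  # toy trace
eigenvalues, eigenvectors = anm_modes(coords)
print(eigenvalues[:8])  # six near-zero rigid-body modes, then the softest modes
```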

  5. Processing deficits in monitoring analog and digital displays: Implications for attentional theory and mental-state estimation research

    Science.gov (United States)

    Payne, David G.; Gunther, Virginia A. L.

    1988-01-01

    Subjects performed short-term memory tasks, involving both spatial and verbal components, and a visual monitoring task involving either analog or digital display formats. These two tasks (memory vs. monitoring) were performed both singly and in conjunction. Contrary to expectations derived from multiple-resource theories of attentional processes, there was no evidence that when the two tasks involved the same cognitive codes (i.e., either both spatial or both verbal/linguistic) there was more of a dual-task performance decrement than when the two tasks employed different cognitive codes/processes. These results are discussed in terms of their implications for theories of attentional processes and also for research in mental-state estimation.

  6. Sequences, groups, and number theory

    CERN Document Server

    Rigo, Michel

    2018-01-01

    This collaborative book presents recent trends on the study of sequences, including combinatorics on words and symbolic dynamics, and new interdisciplinary links to group theory and number theory. Other chapters branch out from those areas into subfields of theoretical computer science, such as complexity theory and theory of automata. The book is built around four general themes: number theory and sequences, word combinatorics, normal numbers, and group theory. Those topics are rounded out by investigations into automatic and regular sequences, tilings and theory of computation, discrete dynamical systems, ergodic theory, numeration systems, automaton semigroups, and amenable groups.  This volume is intended for use by graduate students or research mathematicians, as well as computer scientists who are working in automata theory and formal language theory. With its organization around unified themes, it would also be appropriate as a supplemental text for graduate level courses.

  7. Process convergence of self-normalized sums of i.i.d. random ...

    Indian Academy of Sciences (India)

    The study of the asymptotics of self-normalized sums is also interesting. Logan ... if the constituent random variables are from the domain of attraction of a normal distribution ... index of stability α which equals 2 (for definition, see §2).
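    A small simulation sketch of the object in question, the self-normalized sum T_n = S_n / V_n with V_n² = Σ X_i²; the symmetric Pareto-type tail below is an illustrative choice of a law in the domain of attraction of an α-stable distribution (α = 2 corresponding to the normal case):

```python
import numpy as np

rng = np.random.default_rng(3)

def self_normalized_sums(alpha, n, n_rep):
    """T_n = S_n / V_n with V_n^2 = sum of X_i^2, for symmetric variables
    with P(|X| > t) = t**(-alpha), t >= 1 (Pareto-type tails)."""
    u = rng.uniform(size=(n_rep, n))
    x = rng.choice([-1.0, 1.0], size=(n_rep, n)) * u ** (-1.0 / alpha)
    return x.sum(axis=1) / np.sqrt((x ** 2).sum(axis=1))

for alpha in (1.0, 1.5, 2.0):
    t = self_normalized_sums(alpha, n=2_000, n_rep=2_000)
    print(f"alpha={alpha}: std={t.std():.3f}")  # ~1 in the normal-attraction case
```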

  8. Normal and superconducting metals at microwave frequencies-classic experiments

    International Nuclear Information System (INIS)

    Dheer, P.N.

    1999-01-01

    A brief review of experimental and theoretical work on the behaviour of normal and superconducting materials at microwave frequencies before the publication of Bardeen, Cooper and Schrieffer's theory of superconductivity is given. The work discussed is mostly that of Pippard and his coworkers. It is shown that these investigations led not only to a better understanding of the electrodynamics of the normal and superconducting states but also of the nature of the superconducting state itself. (author)

  9. Discrete Curvature Theories and Applications

    KAUST Repository

    Sun, Xiang

    2016-08-25

    Discrete Differential Geometry (DDG) concerns discrete counterparts of notions and methods in differential geometry. This thesis deals with a core subject in DDG: discrete curvature theories on various types of polyhedral surfaces that are practically important for free-form architecture, sunlight-redirecting shading systems, and face recognition. Modeled as polyhedral surfaces, the shapes of free-form structures may have to satisfy different geometric or physical constraints. We study a combination of geometry and physics: the discrete surfaces that can stand on their own, as well as having proper shapes for manufacture. These proper shapes, known as circular and conical meshes, are closely related to discrete principal curvatures. We study curvature theories that make such surfaces possible. Shading systems of freeform building skins are new types of energy-saving structures that can re-direct the sunlight. From these systems, discrete line congruences across polyhedral surfaces can be abstracted. We develop a new curvature theory for polyhedral surfaces equipped with normal congruences, a particular type of congruence defined by linear interpolation of vertex normals. The main results are a discussion of various definitions of normality, a detailed study of the geometry of such congruences, and a concept of curvatures and shape operators associated with the faces of a triangle mesh. These curvatures are compatible with both normal congruences and the Steiner formula. In addition to architecture, we consider the role of discrete curvatures in face recognition. We use geometric measure theory to introduce the notion of asymptotic cones associated with a singular subspace of a Riemannian manifold, which is an extension of the classical notion of asymptotic directions. We get a simple expression of these cones for polyhedral surfaces, as well as convergence and approximation theorems. We use the asymptotic cones as facial descriptors and demonstrate the

  10. Density functional theory and parallel processing

    International Nuclear Information System (INIS)

    Ward, R.C.; Geist, G.A.; Butler, W.H.

    1987-01-01

    The authors demonstrate a method for obtaining the ground state energies and charge densities of a system of atoms described within density functional theory using simulated annealing on a parallel computer

  11. Learning from doing: the case for combining normalisation process theory and participatory learning and action research methodology for primary healthcare implementation research.

    Science.gov (United States)

    de Brún, Tomas; O'Reilly-de Brún, Mary; O'Donnell, Catherine A; MacFarlane, Anne

    2016-08-03

    The implementation of research findings is not a straightforward matter. There are substantive and recognised gaps in the process of translating research findings into practice and policy. In order to overcome some of these translational difficulties, a number of strategies have been proposed for researchers. These include greater use of theoretical approaches in research focused on implementation, and use of a wider range of research methods appropriate to policy questions and the wider social context in which they are placed. However, questions remain about how to combine theory and method in implementation research. In this paper, we respond to these proposals. Focussing on a contemporary social theory, Normalisation Process Theory, and a participatory research methodology, Participatory Learning and Action, we discuss the potential of their combined use for implementation research. We note ways in which Normalisation Process Theory and Participatory Learning and Action are congruent and may therefore be used as heuristic devices to explore, better understand and support implementation. We also provide examples of their use in our own research programme about community involvement in primary healthcare. Normalisation Process Theory alone has, to date, offered useful explanations for the success or otherwise of implementation projects post-implementation. We argue that Normalisation Process Theory can also be used to prospectively support implementation journeys. Furthermore, Normalisation Process Theory and Participatory Learning and Action can be used together so that interventions to support implementation work are devised and enacted with the expertise of key stakeholders. We propose that the specific combination of this theory and methodology possesses the potential, because of their combined heuristic force, to offer a more effective means of supporting implementation projects than either one might do on its own, and of providing deeper understandings of

  12. The effects of limited bandwidth and noise on verbal processing time and word recall in normal-hearing children.

    Science.gov (United States)

    McCreery, Ryan W; Stelmachowicz, Patricia G

    2013-09-01

    Understanding speech in acoustically degraded environments can place significant cognitive demands on school-age children who are developing the cognitive and linguistic skills needed to support this process. Previous studies suggest the speech understanding, word learning, and academic performance can be negatively impacted by background noise, but the effect of limited audibility on cognitive processes in children has not been directly studied. The aim of the present study was to evaluate the impact of limited audibility on speech understanding and working memory tasks in school-age children with normal hearing. Seventeen children with normal hearing between 6 and 12 years of age participated in the present study. Repetition of nonword consonant-vowel-consonant stimuli was measured under conditions with combinations of two different signal to noise ratios (SNRs; 3 and 9 dB) and two low-pass filter settings (3.2 and 5.6 kHz). Verbal processing time was calculated based on the time from the onset of the stimulus to the onset of the child's response. Monosyllabic word repetition and recall were also measured in conditions with a full bandwidth and 5.6 kHz low-pass cutoff. Nonword repetition scores decreased as audibility decreased. Verbal processing time increased as audibility decreased, consistent with predictions based on increased listening effort. Although monosyllabic word repetition did not vary between the full bandwidth and 5.6 kHz low-pass filter condition, recall was significantly poorer in the condition with limited bandwidth (low pass at 5.6 kHz). Age and expressive language scores predicted performance on word recall tasks, but did not predict nonword repetition accuracy or verbal processing time. Decreased audibility was associated with reduced accuracy for nonword repetition and increased verbal processing time in children with normal hearing. Deficits in free recall were observed even under conditions where word repetition was not affected

  13. Behavioral finance: Finance with normal people

    Directory of Open Access Journals (Sweden)

    Meir Statman

    2014-06-01

    Behavioral finance substitutes normal people for the rational people in standard finance. It substitutes behavioral portfolio theory for mean-variance portfolio theory, and behavioral asset pricing model for the CAPM and other models where expected returns are determined only by risk. Behavioral finance also distinguishes rational markets from hard-to-beat markets in the discussion of efficient markets, a distinction that is often blurred in standard finance, and it examines why so many investors believe that it is easy to beat the market. Moreover, behavioral finance expands the domain of finance beyond portfolios, asset pricing, and market efficiency and is set to continue that expansion while adhering to the scientific rigor introduced by standard finance.

  14. Phenomenological rate process theory for the storage of atomic H in solid H₂

    International Nuclear Information System (INIS)

    Rosen, G.

    1976-01-01

    A phenomenological rate process theory is developed for the storage and rapid recombination of atomic hydrogen fuel radicals in a crystalline molecular hydrogen solid at temperatures in the range 0.1 K ≤ T ≤ … K. It is shown that such a theory can account quantitatively for the recently observed dependence of the storage time on the storage temperature, for the maximum concentration of trapped H atoms, and for the time duration of the energy release in the tritium decay experiments of Webeler.

  15. Can Dual Processing Theory Explain Physics Students' Performance on the Force Concept Inventory?

    Science.gov (United States)

    Wood, Anna K.; Galloway, Ross K.; Hardy, Judy

    2016-01-01

    According to dual processing theory there are two types, or modes, of thinking: system 1, which involves intuitive and nonreflective thinking, and system 2, which is more deliberate and requires conscious effort and thought. The Cognitive Reflection Test (CRT) is a widely used and robust three item instrument that measures the tendency to override…

  16. Sphalerons, deformed sphalerons and normal modes

    International Nuclear Information System (INIS)

    Brihaye, Y.; Kunz, J.; Oldenburg Univ.

    1992-01-01

    Topological arguments suggest that the Weinberg-Salam model possesses unstable solutions, sphalerons, representing the top of energy barriers between inequivalent vacua of the gauge theory. In the limit of vanishing Weinberg angle, such unstable solutions are known: the sphaleron of Klinkhamer and Manton and, at large values of the Higgs mass, in addition the deformed sphalerons. Here a systematic study of the discrete normal modes about these sphalerons for the full range of the Higgs mass is presented. The emergence of deformed sphalerons at critical values of the Higgs mass is seen to be related to the crossing of zero of the eigenvalue of a particular normal mode about the sphaleron. 6 figs., 1 tab., 19 refs. (author)

  17. Normal modes of vibration in nickel

    Energy Technology Data Exchange (ETDEWEB)

    Birgeneau, R J [Yale Univ., New Haven, Connecticut (United States); Cordes, J [Cambridge Univ., Cambridge (United Kingdom); Dolling, G; Woods, A D B

    1964-07-01

    The frequency-wave-vector dispersion relation, ν(q), for the normal vibrations of a nickel single crystal at 296 K has been measured for the [ζ00], [ζζ0], [ζζζ], and [0ζ1] symmetric directions using inelastic neutron scattering. The results can be described in terms of the Born-von Karman theory of lattice dynamics with interactions out to fourth-nearest neighbors. The shapes of the dispersion curves are very similar to those of copper, the normal mode frequencies in nickel being about 1.24 times the corresponding frequencies in copper. The fourth-neighbor model was used to calculate the frequency distribution function g(ν) and related thermodynamic properties. (author)
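    For intuition, a one-dimensional analogue of such a Born-von Karman fit with interactions out to fourth-nearest neighbors; the force constants and mass are arbitrary illustrative values, not the fitted nickel parameters:

```python
import numpy as np

def dispersion(zeta, force_constants, mass):
    """Monatomic 1D Born-von Karman chain with n-th neighbor force constants
    phi_n: 4 pi^2 m nu(q)^2 = 2 sum_n phi_n (1 - cos(2 pi n zeta)),
    where zeta = qa / (2 pi) is the reduced wave vector."""
    zeta = np.asarray(zeta, dtype=float)
    omega2 = np.zeros_like(zeta)
    for n, phi in enumerate(force_constants, start=1):
        omega2 += 2.0 * phi * (1.0 - np.cos(2.0 * np.pi * n * zeta)) / mass
    return np.sqrt(omega2) / (2.0 * np.pi)  # frequency nu in natural units

zeta = np.linspace(0.0, 0.5, 6)  # zone center to zone boundary
print(dispersion(zeta, force_constants=[20.0, 2.0, 0.5, 0.1], mass=1.0))
```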

  18. Higher Fasting Plasma Glucose Levels, within the Normal Range, are Associated with Decreased Processing Speed in High Functioning Young Elderly

    OpenAIRE

    Raizes, Meytal; Elkana, Odelia; Franko, Motty; Springer, Ramit Ravona; Segev, Shlomo; Beeri, Michal Schnaider

    2016-01-01

    We explored the association of plasma glucose levels within the normal range with processing speed in high-functioning young elderly, free of type 2 diabetes mellitus (T2DM). A sample of 41 participants (mean age = 64.7, SD = 10; glucose 94.5 mg/dL, SD = 9.3) was examined with a computerized cognitive battery. Hierarchical linear regression analysis showed that higher plasma glucose levels, albeit within the normal range…

  19. Application of adult attachment theory to group member transference and the group therapy process.

    Science.gov (United States)

    Markin, Rayna D; Marmarosh, Cheri

    2010-03-01

    Although clinical researchers have applied attachment theory to client conceptualization and treatment in individual therapy, few researchers have applied this theory to group therapy. The purpose of this article is to begin to apply theory and research on adult dyadic and group attachment styles to our understanding of group dynamics and processes in adult therapy groups. In particular, we set forth theoretical propositions on how group members' attachment styles affect relationships within the group. Specifically, this article offers some predictions on how identifying group member dyadic and group attachment styles could help leaders predict member transference within the therapy group. Implications of group member attachment for the selection and composition of a group and the different group stages are discussed. Recommendations for group clinicians and researchers are offered.

  20. Habituation and sensitization of aggression in bullfrogs (Rana catesbeiana): testing the dual-process theory of habituation.

    Science.gov (United States)

    Bee, M A

    2001-09-01

    The aggressive response of male bullfrogs (Rana catesbeiana) habituates with repeated broadcasts of acoustic stimuli simulating a new territorial neighbor. The effects of stimulus repetition rate and stimulus intensity on bullfrog aggressive responses were tested in a field experiment designed to test the assumptions of a dual-process theory of habituation. Synthetic advertisement calls were broadcast at 2 repetition rates and 2 intensities in a factorial design. Bullfrogs were more aggressive at the higher stimulus intensity at both repetition rates. Aggressive responses habituated more slowly at the higher stimulus intensity and slower repetition rate compared with other treatments. Several biotic and abiotic factors had small or negligible effects on aggressive responses. Although consistent with the operation of 2 opposing processes, habituation and sensitization, the data provide only partial support for the assumptions of dual-process theory.

  1. e⁺ + e⁻ → ν_e + ν̄_e + γ process in gauge theories

    International Nuclear Information System (INIS)

    Dzhafarov, I.G.; Mustafaev, Kh.A.; Sultanov, S.F.

    1977-01-01

    The e⁺ + e⁻ → ν_e + ν̄_e + γ process has been treated within the framework of the unified theory of weak and electromagnetic interactions (the Weinberg-Salam model). The analytical expressions for the photoproduction differential cross section are presented for two energy ranges: the range of relatively low energies and that of the resonance production expected due to Z-boson exchange. Angular and energy distributions of photons produced in the process under consideration are investigated. At the resonance energy, the photoproduction differential cross section is two to three orders of magnitude higher than the cross section predicted for the same process by the local four-fermion V-A theory. The formulae obtained can also be used to describe the μ⁺ + μ⁻ → ν_μ + ν̄_μ + γ reaction.

  2. Compressed normalized block difference for object tracking

    Science.gov (United States)

    Gao, Yun; Zhang, Dengzhuo; Cai, Donglan; Zhou, Hao; Lan, Ge

    2018-04-01

    Feature extraction is very important for robust and real-time tracking. Compressive sensing provides technical support for real-time feature extraction. However, all existing compressive trackers have been based on the compressed Haar-like feature, and how to compress other excellent high-dimensional features is worth researching. In this paper, a novel compressed normalized block difference (CNBD) feature is proposed. To resist noise more effectively than the high-dimensional normalized pixel difference (NPD) feature, the normalized block difference feature extends the two pixels in the original NPD formula to two blocks. A CNBD feature is obtained by compressing a normalized block difference feature based on compressive sensing theory, with a sparse random Gaussian matrix as the measurement matrix. Comparative experiments with 7 trackers on 20 challenging sequences showed that the tracker based on the CNBD feature performs better than the other trackers, especially the FCT tracker based on the compressed Haar-like feature, in terms of AUC, SR and precision.
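    A minimal sketch of the two ingredients described here, with an illustrative definition of the normalized block difference and a sparse random Gaussian measurement matrix in the spirit of compressive sensing; the block geometry, sparsity, and dimensions are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(4)

def sparse_gaussian_matrix(m, n, density=0.1):
    """Sparse random Gaussian measurement matrix: most entries are zero,
    the rest are i.i.d. standard normal."""
    mask = rng.uniform(size=(m, n)) < density
    return mask * rng.standard_normal((m, n))

def normalized_block_differences(image, block_pairs):
    """(mean(A) - mean(B)) / (mean(A) + mean(B)) for pairs of rectangular
    blocks A, B -- an illustrative reading of the NBD feature."""
    feats = []
    for (ay, ax, ah, aw), (by, bx, bh, bw) in block_pairs:
        a = image[ay:ay + ah, ax:ax + aw].mean()
        b = image[by:by + bh, bx:bx + bw].mean()
        feats.append((a - b) / (a + b + 1e-12))
    return np.array(feats)

image = rng.uniform(size=(64, 64))  # toy grayscale patch
pairs = [((0, 0, 8, 8), (8, 8, 8, 8)), ((16, 0, 8, 8), (0, 16, 8, 8))] * 50
high_dim = normalized_block_differences(image, pairs)  # 100-dim NBD vector
phi = sparse_gaussian_matrix(16, high_dim.size)        # measurement matrix
compressed = phi @ high_dim                            # 16-dim CNBD feature
print(compressed.shape)
```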

  3. A Geometrically Nonlinear Plate Theory

    Institute of Scientific and Technical Information of China (English)

    Albert C. J. LUO

    1999-01-01

    An approximate plate theory developed in this paper is based on an assumed displacement field, the strains described by a Taylor series in the normal distance from the middle surface, the exact strains of the middle surface, and the equations of equilibrium governing the exact configuration of the deformed middle surface. In this theory the exact geometry of the deformed middle surface is used to derive the strains and equilibrium of the plate. Application of this theory does not depend on the constitutive law. This theory can reduce to some existing nonlinear theories through imposition of constraints.

  4. Theory of emission spectra from metal films irradiated by low energy electrons near normal incidence

    International Nuclear Information System (INIS)

    Kretschmann, E.; Callcott, T.A.; Arakawa, E.T.

    1980-01-01

    The emission spectrum produced by low energy electrons incident on a rough metal surface has been calculated for a roughness auto-correlation function containing a prominent peak at a high wave vector. For low energy electrons near normal incidence, the high wavevector peak dominates the roughness coupled surface plasmon radiation (RCSPR) process. The calculation yields estimates of the ratio of RCSPR to transition radiation, the dependence of emission intensity on electron energy and the shape and position of the RCSPR peak. The most interesting result is that the high-wavevector roughness can split the RCSPR radiation into peaks lying above and below the asymptotic surface plasma frequency. The results are compared with data from Ag in the following paper. (orig.)

  5. Chern-Simons theory and three-dimensional surfaces

    International Nuclear Information System (INIS)

    Guven, Jemal

    2007-01-01

    There are two natural Chern-Simons theories associated with the embedding of a three-dimensional surface in Euclidean space: one is constructed using the induced metric connection and involves only the intrinsic geometry; the other is extrinsic and uses the connection associated with the gauging of normal rotations. As such, the two theories appear to describe very different aspects of the surface geometry. Remarkably, at a classical level, they are equivalent. In particular, it will be shown that their stress tensors differ only by a null contribution. Their Euler-Lagrange equations provide identical constraints on the normal curvature. A new identity for the Cotton tensor is associated with the triviality of the Chern-Simons theory for embedded hypersurfaces implied by this equivalence.

  6. Influence of the growth process on some laws deduced from percolation theory

    International Nuclear Information System (INIS)

    Hachi, M.; Olivier, G.

    1985-09-01

    A naive application of percolation theory to some physical problems can lead to erroneous interpretation of the experimental results. Among these problems, the influence of the growth process on the percolation laws is studied. The behaviour of n_s(t), the number of clusters of size s at time t, is analyzed and linked to a macroscopic property of the system for comparison with experimental laws. (author)

  7. Toward a Unified Consciousness Theory

    Science.gov (United States)

    Johnson, Richard H.

    1977-01-01

    The beginning of a holistic theory that can treat paranormal phenomena as normal human development is presented. Implications for counseling, counselor education, and counselor supervision are discussed. (Author)

  8. Cue acquisition: A feature of Malawian midwives' decision-making process to support normality during the first stage of labour.

    Science.gov (United States)

    Chodzaza, Elizabeth; Haycock-Stuart, Elaine; Holloway, Aisha; Mander, Rosemary

    2018-03-01

    To explore Malawian midwives' decision making when caring for women during the first stage of labour in the hospital setting. This focused ethnographic study examined the decision-making process of 9 nurse-midwives with varying years of clinical experience in the real-world setting of an urban and a semi-urban hospital from October 2013 to May 2014. This was done using 27 participant observations and 27 post-observation in-depth interviews over a period of six months. Qualitative data analysis software, NVivo 10, was used to assist with data management for the analysis. All data were analysed using the principle of theme and category formation. Analysis revealed a six-stage process of decision making that includes a baseline for labour, deciding to admit a woman to the labour ward, ascertaining the normal physiological progress of labour, supporting the normal physiological progress of labour, embracing uncertainty (the midwives' construction of unusual labour as normal), and dealing with uncertainty and deciding to intervene in unusual labour. This six-stage process of decision making is conceptualised as the 'role of cue acquisition', illustrating the ways in which midwives utilise their assessment of labouring women to reason and make decisions on how to care for them in labour. Cue acquisition involved the midwives piecing together segments of information they obtained from the women to formulate an understanding of the woman's birthing progress and inform the midwives' decision-making process. This understanding of cue acquisition by midwives is significant for supporting safe care in the labour setting. When there was uncertainty in a woman's progress of labour, midwives used deductive reasoning, for example by cross-checking and analysing the information obtained during the span of labour. Supporting normal labour physiological processes was identified as an underlying principle that shaped the midwives' clinical judgement and decision making when they cared for women in labour.

  9. Theory Meets Practice

    DEFF Research Database (Denmark)

    Schlamovitz, Jesper

    2015-01-01

    Process thinking and process-based theory are receiving increased attention in the field of organization studies and organization theory development (Tsoukas & Chia, 2002; Langley & Tsoukas, 2010; Hernes, 2014). The aim has been to study processes, rather than structures, in organizations. This has ... recently inspired research on the organizing of projects and the development of a (new) theory of temporary organizations (Bakker, 2010; Blomquist et al. 2010; Söderlund, 2013). These theories are still under development and need empirical studies that can show their relevance for practice. This paper ... will give an overview of this theoretical development and discuss the consequences for the practice of project management. The paper finds that processes such as time and temporality, meaning structures, and articulation are covered in project management research, but sparsely documented ...

  10. A Look at the Memory Performance of Retarded and Normal Children Utilizing the Levels of Processing Framework.

    Science.gov (United States)

    Lupart, Judy L.; Mulcahy, Robert F.

    Memory performance differences of mental-age-matched (9-12 years) educable mentally retarded (EMR) (n=56) and normal (n=56) children were examined in two experiments using the F. Craik and R. Lockhart levels of processing framework. In experiment 1, Ss were randomly assigned to an incidental, intentional, or planned intentional learning condition,…

  11. Pitch angle scattering of relativistic electrons from stationary magnetic waves: Continuous Markov process and quasilinear theory

    International Nuclear Information System (INIS)

    Lemons, Don S.

    2012-01-01

    We develop a Markov process theory of charged particle scattering from stationary, transverse, magnetic waves. We examine approximations that lead to quasilinear theory, in particular the resonant diffusion approximation. We find that, when appropriate, the resonant diffusion approximation simplifies the result of the weak turbulence approximation without significantly further restricting the regime of applicability. We also explore a theory generated by expanding drift and diffusion rates in terms of a presumed small correlation time. This small correlation time expansion leads to results valid for relatively small pitch angle and large wave energy density, a regime that may govern pitch angle scattering of high-energy electrons into the geomagnetic loss cone.
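    The record states results rather than equations; for context, the standard quasilinear pitch-angle diffusion equation that such drift and diffusion rates feed into is shown below in its textbook form (not quoted from the paper).

```latex
% Quasilinear pitch-angle diffusion: f is the particle distribution,
% \mu the cosine of the pitch angle, and D_{\mu\mu} the pitch-angle
% diffusion coefficient built from the wave spectrum.
\frac{\partial f}{\partial t}
  = \frac{\partial}{\partial \mu}
    \left( D_{\mu\mu}\, \frac{\partial f}{\partial \mu} \right)
```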

  12. Calculating the Price for Derivative Financial Assets of Bessel Processes Using the Sturm-Liouville Theory

    Directory of Open Access Journals (Sweden)

    Burtnyak Ivan V.

    2017-06-01

    Full Text Available In the paper we apply spectral theory to find the price of derivatives of financial assets, assuming that the processes described are Markov processes that can be considered in the Hilbert space L^2 using the Sturm-Liouville theory. Bessel diffusion processes are used in studying Asian options. We consider the financial flows generated by the Bessel diffusions by expressing them in terms of the system of Bessel functions of the first kind, provided that they take into account the linear combination of the flow and its spatial derivative. Such an expression enables calculating the size of the market portfolio, provides a measure of the amount of internal volatility in the market at any given moment, and allows investigating the dynamics of the equity market. The expansion of the Green's function in terms of the system of Bessel functions is expressed by an analytic formula that is convenient for calculating the volume of financial flows. All assumptions are natural, resulting in analytic formulas that are consistent with empirical data and, when applied in practice, adequately reflect the processes in equity markets.
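    The record describes a Green's function expanded in Bessel eigenfunctions without reproducing the formulas. The SciPy sketch below only shows the generic Sturm-Liouville mechanics of such an expansion, on the unit interval with a Dirichlet boundary; the domain, boundary condition, and normalization here are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from scipy.special import j0, j1, jn_zeros

def green_bessel(r, rp, n_terms=200):
    """Eigenfunction expansion of a radial Green's function,
        G(r, r') = sum_n J0(z_n r) J0(z_n r') / (z_n**2 * norm_n),
    where z_n are the positive zeros of J0 (Dirichlet condition at r = 1)
    and norm_n = J1(z_n)**2 / 2 is the squared L^2(r dr) norm from
    Sturm-Liouville orthogonality of the Bessel system."""
    z = jn_zeros(0, n_terms)          # first n_terms zeros of J0
    norms = 0.5 * j1(z) ** 2          # squared norms under weight r dr
    return np.sum(j0(z * r) * j0(z * rp) / (z ** 2 * norms))

# Example: evaluate the truncated expansion at a pair of interior points.
print(green_bessel(0.3, 0.6))
```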

  13. Transition theory and its relevance to patients with chronic wounds.

    Science.gov (United States)

    Neil, J A; Barrell, L M

    1998-01-01

    A wound, in the broadest sense, is a disruption of normal anatomic structure and function. Acute wounds progress through a timely and orderly sequence of repair that leads to the restoration of functional integrity. In chronic wounds, this timely and orderly sequence goes awry. As a result, people with chronic wounds often face not only physiological difficulties but emotional ones as well. The study of body image and its damage as a result of a chronic wound fits well with Selder's transition theory. This article describes interviews with seven patients with chronic wounds. The themes that emerged from those interviews were compared with Selder's theory to describe patients' experience with chronic wounds as a transition process that can be identified and better understood by healthcare providers.

  14. Morse theory interpretation of topological quantum field theories

    International Nuclear Information System (INIS)

    Labastida, J.M.F.

    1989-01-01

    Topological quantum field theories are interpreted as a generalized form of Morse theory. This interpretation is applied to formulate the simplest topological quantum field theory: Topological quantum mechanics. The only non-trivial topological invariant corresponding to this theory is computed and identified with the Euler characteristic. Using field theoretical methods this topological invariant is calculated in different ways and in the process a proof of the Gauss-Bonnet-Chern-Avez formula as well as some results of degenerate Morse theory are obtained. (orig.)

  15. Becoming Therapeutic Agents: A Grounded Theory of Mothers' Process When Implementing Cognitive Behavioural Therapy at Home with an Anxious Child.

    Science.gov (United States)

    Pishva, Rana

    2017-05-01

    The premise of parent-centred programmes for parents of anxious children is to educate and train caregivers in the sustainable implementation of cognitive behaviour therapy (CBT) in the home. The existing operationalization of parent involvement, however, does not address the systemic, parent or child factors that could influence this process. The qualitative approach of grounded theory was employed to examine patterns of action and interaction involved in the complex process of carrying out CBT with one's child in one's home. A grounded theory goes beyond the description of a process, offering an explanatory theory that brings taken-for-granted meanings and processes to the surface. The theory that emerged from the analysis suggests that CBT implementation by mothers of anxious children is characterized by the evolution of mothers' perception of their child and mothers' perception of their role as well as a shift from reacting with emotion to responding pragmatically to the child. Changes occur as mothers recognize the crisis, make links between the treatment rationale, child's symptoms and their own parenting strategies, integrate tenets of CBT for anxiety and eventually focus on sustaining therapeutic gains through natural life transitions. The theory widens our understanding of mothers' role, therapeutic engagement, process, and decision-making. The theory also generates new hypotheses regarding parent involvement in the treatment of paediatric anxiety disorders and proposes novel research avenues that aim to maximize the benefits of parental involvement in the treatment of paediatric anxiety disorders. Copyright © 2016 John Wiley & Sons, Ltd. Mothers of anxious youth who take part in parent-centred programmes experience a shift in their perception of the child and of their role. Parental strategy after CBT implementation shifts from emotional empathy to cognitive empathy. Mothers experience significant challenges and require additional support in prevention

  16. Phonological processes analysis in children with normal phonological development (Análise dos processos fonológicos em crianças com desenvolvimento fonológico normal)

    Directory of Open Access Journals (Sweden)

    Carla Ferrante

    2009-01-01

    Full Text Available PURPOSE: The aim of this study was to verify the use of phonological processes in a group of children with normal phonological development. METHODS: The participants were 240 children of both genders, aged between three and eight years. Analyses regarding phonological processes were carried out, and the data were compared considering age and gender. RESULTS: The results allowed the conclusion that at the ages of three, four and five years the most frequently used processes were cluster reduction, lateralization, and final consonant deletion. Metathesis was the second most frequently used process at the age of six years, appearing in third place at the age of seven years. Regarding the number of phonological processes used per age group, at three years of age the children used a minimum of two processes; from four years of age onward the minimum number of processes used was zero, and the maximum number, as well as the mean, decreased gradually with increasing age. Regarding gender, no statistically significant difference was observed in any of the analyses carried out in this research. CONCLUSIONS: The data found in this research evidence the difficulty children have in producing liquids and more complex syllabic structures.

  17. 'Normal' markets, market imperfections and energy efficiency

    International Nuclear Information System (INIS)

    Sanstad, A.H.; Howarth, R.B.

    1994-01-01

    The conventional distinction between 'economic' and 'engineering' approaches to energy analysis obscures key methodological issues concerning the measurement of the costs and benefits of policies to promote the adoption of energy-efficient technologies. The engineering approach is in fact based upon firm economic foundations: the principle of lifecycle cost minimization, which arises directly from the theory of rational investment. Thus, evidence that so-called 'market barriers' impede the adoption of cost-effective energy-efficient technologies implies the existence of market failures as defined in the context of microeconomic theory. The widely held view that the engineering approach lacks economic justification is based on the fallacy that markets are 'normally' efficient. (author)
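    A minimal sketch of the lifecycle cost minimization principle invoked above; all numbers are invented for illustration.

```python
# Lifecycle cost of an investment: upfront capital plus the present
# value of future energy bills. Rational-investment theory says to pick
# the option with the lower lifecycle cost.

def lifecycle_cost(capital, annual_energy_cost, years, discount_rate):
    pv_factor = sum(1.0 / (1.0 + discount_rate) ** t for t in range(1, years + 1))
    return capital + annual_energy_cost * pv_factor

standard = lifecycle_cost(capital=1000, annual_energy_cost=300, years=10, discount_rate=0.05)
efficient = lifecycle_cost(capital=1400, annual_energy_cost=220, years=10, discount_rate=0.05)
print(standard, efficient)  # here the efficient option has the lower lifecycle cost
```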

  18. Attachment and the Processing of Social Information across the Life Span: Theory and Evidence

    Science.gov (United States)

    Dykas, Matthew J.; Cassidy, Jude

    2011-01-01

    Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the…

  19. [United theory of aging].

    Science.gov (United States)

    Trubitsyn, A G

    2012-01-01

    In attempts to develop a means of life prolongation, humankind has created more than three hundred theories of aging; each offers an original cause of aging. However, none of them has yet given a practical result, and the majority of the theories are now of only historical interest. Several different theories remain under consideration currently. They are based on reliable, proven evidence: the free radical theory, the protein error theory, the replicative senescence theory, the theory of repair weakening, the immunological theory, several versions of neuroendocrine theories, and the programmed aging theory. The theory presented here is based on the conception that life as a phenomenon represents many interconnected physical and chemical processes propelled by the energy of the mitochondrial bioenergetic machine. Gradual degradation of all vital processes is caused by a programmed decrease in the level of bioenergetics. This theory unites all existing theories of aging constructed on authentic facts: it is shown that such fundamental phenomena accompanying the aging process as the increase in the level of reactive oxygen species (ROS), the decrease in the general level of protein synthesis, the limitation of cell division (the Hayflick limit), and the decrease in the efficiency of repair mechanisms are caused by bioenergetic attenuation. Each of these phenomena in turn generates a number of harmful secondary processes. Each of the existing theories is based on one of these destructive phenomena or their combination; hence, each of them describes one side of the process of aging that is initially caused by the programmed decrease in the level of bioenergetics. This united theory gives a chance to understand the nature of the aging clock and explains the phenomenon of increased longevity under conditions of food restriction. Failures of attempts to develop remedies against aging are explained by the fact that manipulations with the separate secondary phenomena of attenuation of…

  20. Relational description of the measurement process in quantum field theory

    International Nuclear Information System (INIS)

    Gambini, Rodolfo; Porto, Rafael A.

    2002-01-01

    We have recently introduced a realistic, covariant interpretation for the reduction process in relativistic quantum mechanics. The basic problem for a covariant description is the dependence of the states on the frame within which collapse takes place. A suitable use of the causal structure of the devices involved in the measurement process allowed us to introduce a covariant notion for the collapse of quantum states. However, a fully consistent description in the relativistic domain requires the extension of the interpretation to quantum fields. The extension is far from straightforward. Besides the obvious difficulty of dealing with the infinite degrees of freedom of the field theory, one has to analyse the restrictions imposed by causality concerning the allowed operations in a measurement process. In this paper we address these issues. We shall show that, in the case of partially causally connected measurements, our description allows us to include a wider class of causal operations than the one resulting from the standard way of computing conditional probabilities. This alternative description could be experimentally tested. A verification of this proposal would give stronger support to the realistic interpretations of the states in quantum mechanics. (author)

  1. Parser Adaptation for Social Media by Integrating Normalization

    NARCIS (Netherlands)

    van der Goot, Rob; van Noord, Gerardus

    This work explores normalization for parser adaptation. Traditionally, normalization is used as a separate pre-processing step. We show that integrating the normalization model into the parsing algorithm is beneficial. This way, multiple normalization candidates can be leveraged, which improves

  2. Solitary-wave families of the Ostrovsky equation: An approach via reversible systems theory and normal forms

    International Nuclear Information System (INIS)

    Roy Choudhury, S.

    2007-01-01

    The Ostrovsky equation is an important canonical model for the unidirectional propagation of weakly nonlinear long surface and internal waves in a rotating, inviscid and incompressible fluid. Limited functional analytic results exist for the occurrence of one family of solitary-wave solutions of this equation, as well as their approach to the well-known solitons of the famous Korteweg-de Vries equation in the limit as the rotation becomes vanishingly small. Since solitary-wave solutions often play a central role in the long-time evolution of an initial disturbance, we consider such solutions here (via the normal form approach) within the framework of reversible systems theory. Besides confirming the existence of the known family of solitary waves and its reduction to the KdV limit, we find a second family of multihumped (or N-pulse) solutions, as well as a continuum of delocalized solitary waves (or homoclinics to small-amplitude periodic orbits). On isolated curves in the relevant parameter region, the delocalized waves reduce to genuine embedded solitons. The second and third families of solutions occur in regions of parameter space distinct from the known solitary-wave solutions and are thus entirely new. Directions for future work are also mentioned

  3. Proximity effect in normal metal-multiband superconductor hybrid structures

    NARCIS (Netherlands)

    Brinkman, Alexander; Golubov, Alexandre Avraamovitch; Kupriyanov, M. Yu

    2004-01-01

    A theory of the proximity effect in normal metal-multiband superconductor hybrid structures is formulated within the quasiclassical Green's function formalism. The quasiclassical boundary conditions for multiband hybrid structures are derived in the dirty limit. It is shown that the existence of

  4. Energetic arousal and language: predictions from the computational theory of quantifiers processing.

    Science.gov (United States)

    Zajenkowski, Marcin

    2013-10-01

    The author examines the relationship between energetic arousal (EA) and the processing of sentences containing natural-language quantifiers. Previous studies and theories have shown that energy may differentially affect various cognitive functions. Recent investigations devoted to quantifiers strongly support the theory that various types of quantifiers involve different cognitive functions in the sentence-picture verification task. In the present study, 201 students were presented with a sentence-picture verification task consisting of simple propositions containing a quantifier that referred to the color of a car on display. Color pictures of cars accompanied the propositions. In addition, the level of participants' EA was measured before and after the verification task. It was found that EA and performance on proportional quantifiers (e.g., "More than half of the cars are red") are in an inverted U-shaped relationship. This result may be explained by the fact that proportional sentences engage working memory to a high degree, and previous models of EA-cognition associations have been based on the assumption that tasks that require parallel attentional and memory processes are best performed when energy is moderate. The research described in the present article has several applications, as it shows the optimal human conditions for verbal comprehension. For instance, it may be important in workplace design to control the level of arousal experienced by office staff when work is mostly related to the processing of complex texts. Energy level may be influenced by many factors, such as noise, time of day, or thermal conditions.

  5. Quantum decision theory as quantum theory of measurement

    International Nuclear Information System (INIS)

    Yukalov, V.I.; Sornette, D.

    2008-01-01

    We present a general theory of quantum information processing devices, that can be applied to human decision makers, to atomic multimode registers, or to molecular high-spin registers. Our quantum decision theory is a generalization of the quantum theory of measurement, endowed with an action ring, a prospect lattice and a probability operator measure. The algebra of probability operators plays the role of the algebra of local observables. Because of the composite nature of prospects and of the entangling properties of the probability operators, quantum interference terms appear, which make actions noncommutative and the prospect probabilities nonadditive. The theory provides the basis for explaining a variety of paradoxes typical of the application of classical utility theory to real human decision making. The principal advantage of our approach is that it is formulated as a self-consistent mathematical theory, which allows us to explain not just one effect but actually all known paradoxes in human decision making. Being general, the approach can serve as a tool for characterizing quantum information processing by means of atomic, molecular, and condensed-matter systems

  6. Theory of active collisions in processes of electro-coagulation of admixtures in water technological environments

    Directory of Open Access Journals (Sweden)

    В.В. Березуцький

    2012-10-01

    Full Text Available In the article, the theoretical bases of electro-coagulation of admixtures in a water technological environment are examined using the theory of active collisions, building on the results of the executed research and an analysis of the scientific literature. Application of the theory of active collisions to coagulation provides high efficiency of the extraction of admixtures from water environments while minimizing energy consumption and material expenses.

  7. Conflict monitoring in dual process theories of thinking.

    Science.gov (United States)

    De Neys, Wim; Glumicic, Tamara

    2008-03-01

    Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision making errors there are some widely different views on the efficiency of the process. Kahneman [Kahneman, D. (2002). Maps of bounded rationality: A perspective on intuitive judgement and choice. Nobel Prize Lecture. Retrieved January 11, 2006, from: http://nobelprize.org/nobel_prizes/economics/laureates/2002/kahnemann-lecture.pdf] and Evans [Evans, J. St. B. T. (1984). Heuristic and analytic processing in reasoning. British Journal of Psychology, 75, 451-468], for example, claim that the monitoring of the heuristic system is typically quite lax whereas others such as Sloman [Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3-22] and Epstein [Epstein, S. (1994). Integration of the cognitive and psychodynamic unconscious. American Psychologists, 49, 709-724] claim it is flawless and people typically experience a struggle between what they "know" and "feel" in case of a conflict. The present study contrasted these views. Participants solved classic base rate neglect problems while thinking aloud. In these problems a stereotypical description cues a response that conflicts with the response based on the analytic base rate information. Verbal protocols showed no direct evidence for an explicitly experienced conflict. As Kahneman and Evans predicted, participants hardly ever mentioned the base rates and seemed to base their judgment exclusively on heuristic reasoning. However, more implicit measures of conflict detection such as participants' retrieval of the base rate information in an unannounced recall test, decision making latencies, and the tendency to review the base rates indicated that the base rates had been thoroughly processed. On control problems where base rates and

  8. Applying Catastrophe Theory to an Information-Processing Model of Problem Solving in Science Education

    Science.gov (United States)

    Stamovlasis, Dimitrios; Tsaparlis, Georgios

    2012-01-01

    In this study, we test an information-processing model (IPM) of problem solving in science education, namely the working memory overload model, by applying catastrophe theory. Changes in students' achievement were modeled as discontinuities within a cusp catastrophe model, where working memory capacity was implemented as asymmetry and the degree…
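    As a sketch of the cusp machinery such a study applies: equilibria of the cusp potential are roots of a cubic, and the number of equilibria jumps from one to three as the control factors cross the bifurcation set. The variable roles below are my reading of the truncated abstract (working memory capacity as the asymmetry factor); the code is generic, not the authors' model.

```python
import numpy as np

# Cusp catastrophe: equilibria of the potential
#   V(x) = x**4 / 4 - b * x**2 / 2 - a * x
# are the real roots of V'(x) = x**3 - b*x - a = 0, where 'a' is the
# asymmetry (normal) factor and 'b' the bifurcation (splitting) factor.

def equilibria(a, b):
    roots = np.roots([1.0, 0.0, -b, -a])
    return np.sort(roots[np.isreal(roots)].real)

print(equilibria(a=0.1, b=1.5))   # three equilibria: bistable (discontinuous) regime
print(equilibria(a=0.1, b=-1.0))  # one equilibrium: smooth regime
```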

  9. The Diagonal Model of Job Satisfaction and Motivation: Extracted from the Logical Comparison of Content and Process Theories

    Science.gov (United States)

    Sahito, Zafarullah; Vaisanen, Pertti

    2017-01-01

    The purpose of this study is to explore the strongest areas of all prime theories of job satisfaction and motivation to create a new multidimensional model. This model relies on all explored areas from the logical comparison of content and process theories to understand the phenomenon of job satisfaction and motivation of employees. The model…

  10. Dual processing theory and experts' reasoning: exploring thinking on national multiple-choice questions.

    Science.gov (United States)

    Durning, Steven J; Dong, Ting; Artino, Anthony R; van der Vleuten, Cees; Holmboe, Eric; Schuwirth, Lambert

    2015-08-01

    An ongoing debate exists in the medical education literature regarding the potential benefits of pattern recognition (non-analytic reasoning), actively comparing and contrasting diagnostic options (analytic reasoning), or using a combination approach. Studies have not, however, explicitly explored faculty's thought processes while tackling clinical problems through the lens of dual process theory to inform this debate. Further, these thought processes have not been studied in relation to the difficulty of the task or other potential mediating influences such as personal factors and fatigue, which could also be influenced by personal factors such as sleep deprivation. We therefore sought to determine which reasoning process(es) were used when answering clinically oriented multiple-choice questions (MCQs) and whether these processes differed based on the dual process theory characteristics: accuracy, reading time, and answering time, as well as psychometrically determined item difficulty and sleep deprivation. We performed a think-aloud procedure to explore faculty's thought processes while taking these MCQs, coding think-aloud data based on reasoning process (analytic, non-analytic, guessing, or a combination of processes) as well as word count, number of stated concepts, reading time, answering time, and accuracy. We also included questions regarding the amount of work in the recent past. We then conducted statistical analyses to examine the associations between these measures, such as correlations between frequencies of reasoning processes and item accuracy and difficulty. We also observed the total frequencies of different reasoning processes in situations where answers were correct and incorrect. Regardless of whether the questions were classified as 'hard' or 'easy', non-analytical reasoning led to the correct answer more often than to an incorrect answer. Significant correlations were found between the self-reported recent number of hours worked and think-aloud word count

  11. Explicit and implicit cognition: a preliminary test of a dual-process theory of cognitive vulnerability to depression.

    Science.gov (United States)

    Haeffel, Gerald J; Abramson, Lyn Y; Brazy, Paige C; Shah, James Y; Teachman, Bethany A; Nosek, Brian A

    2007-06-01

    Two studies were conducted to test a dual-process theory of cognitive vulnerability to depression. According to this theory, implicit and explicit cognitive processes have differential effects on depressive reactions to stressful life events. Implicit processes are hypothesized to be critical in determining an individual's immediate affective reaction to stress, whereas explicit cognitions are thought to be more involved in long-term depressive reactions. Consistent with hypotheses, the results of study 1 (cross-sectional; N=237) showed that implicit, but not explicit, cognitions predicted immediate affective reactions to a lab stressor. Study 2 (longitudinal; N=251) also supported the dual-process model of cognitive vulnerability to depression. Results showed that both the implicit and explicit measures interacted with life stress to predict prospective changes in depressive symptoms. However, when both implicit and explicit predictors were entered into a regression equation simultaneously, only the explicit measure interacted with stress to remain a unique predictor of depressive symptoms over the five-week prospective interval.

  12. Integral-type operators from normal weighted Bloch spaces to Q_{T,S} spaces

    Directory of Open Access Journals (Sweden)

    Yongyi GU

    2016-08-01

    Full Text Available Operator theory is an important part of the theory of analytic function spaces, and studying operators together with the function spaces they act on is an effective way to understand both. Assuming that φ is an analytic self-map of the unit disk Δ, and that the normal weighted Bloch space μ-B is a Banach space on Δ, the composition operator C_φ is defined by C_φ(f) = f∘φ for all f ∈ μ-B; the integral-type operators J_h C_φ and C_φ J_h are generated by combining an integral operator with the composition operator. The boundedness and compactness of the integral-type operator J_h C_φ acting from normal weighted Bloch spaces to Q_{T,S} spaces are discussed, as well as the boundedness of the integral-type operator C_φ J_h acting from normal weighted Bloch spaces to Q_{T,S} spaces. The related sufficient and necessary conditions are given.
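    One common convention in the literature for such integral-type operators is shown below; it is stated as an assumption, since the record itself does not spell the definitions out.

```latex
% Assumed (conventional) definitions: \varphi an analytic self-map of
% the unit disk, h analytic, f in the normal weighted Bloch space.
(C_\varphi f)(z) = f(\varphi(z)), \qquad
(J_h f)(z) = \int_0^z f(\xi)\, h'(\xi)\,\mathrm{d}\xi,
\qquad\text{so}\qquad
(J_h C_\varphi f)(z) = \int_0^z f(\varphi(\xi))\, h'(\xi)\,\mathrm{d}\xi .
```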

  13. Self-regulated processes as predictors of students' achievement in music theory in Slovenian elementary music schools

    OpenAIRE

    Barbara Smolej Fritz; Cirila Peklaj

    2010-01-01

    The aim of the present research was to examine the relation between processes of self-regulated learning and achievement in Music Theory (MT), a basic and obligatory subject in Slovenian music schools. A total of 457 fifth- and sixth-grade students (153 boys and 303 girls) from 10 different elementary music schools in Slovenia participated in the study. Students completed a questionnaire about affective-motivational processes and a questionnaire about (meta)cognitive processes of self-regulated…

  14. Normal Pressure and Friction Stress Measurement in Rolling Processes

    DEFF Research Database (Denmark)

    Henningsen, Poul; Arentoft, Mogens; Lagergren, Jonas

    2005-01-01

    From the output of the transducer, the friction stress and normal pressure in the contact zone can be determined. The new concept differs from existing pin designs by a lower disturbance of the lubricant film and material flow and limited penetration of material between transducer and roll. Aluminum, copper…

  15. Quantum transport in graphene normal-metal-superconductor-normal-metal structures

    Directory of Open Access Journals (Sweden)

    H. Mohammadpour

    2008-06-01

    Full Text Available We study the transport of electrons in a graphene NSN structure in which two normal regions are connected by a superconducting strip of thickness d. Within the Dirac-Bogoliubov-de Gennes equations we describe the transmission through the contact in terms of different scattering processes consisting of quasiparticle cotunneling and local and crossed Andreev reflections. Compared to a fully normal structure, we show that the angular dependence of the transmission probability is significantly modified by the presence of superconducting correlations. This modification can be explained in terms of the interplay between Andreev reflection and Klein tunneling of chiral quasiparticles. We further analyze the energy dependence of the resulting differential conductance of the structure. The subgap differential conductance shows features of Andreev reflection and cotunneling processes, which tend to the values of an NS structure for large d. Above the gap, the differential conductance shows an oscillatory behavior with energy even at very large d.
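    For reference, the Dirac-Bogoliubov-de Gennes equations the abstract works within take the schematic textbook form below (not quoted from the paper).

```latex
% Schematic DBdG equations for graphene NS hybrids: H is the 2D Dirac
% Hamiltonian, \Delta the superconducting pair potential, and (u, v)
% the electron and hole envelope wave functions.
\begin{pmatrix} H - E_F & \Delta \\ \Delta^{*} & E_F - H \end{pmatrix}
\begin{pmatrix} u \\ v \end{pmatrix}
= \varepsilon
\begin{pmatrix} u \\ v \end{pmatrix},
\qquad
H = -i\hbar v_F\,\boldsymbol{\sigma}\cdot\nabla + U(\mathbf{r})
```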

  16. Limit theory for the sample autocorrelations and extremes of a GARCH(1,1) process

    NARCIS (Netherlands)

    Mikosch, T; Starica, C

    2000-01-01

    The asymptotic theory for the sample autocorrelations and extremes of a GARCH(1,1) process is provided. Special attention is given to the case when the sum of the ARCH and GARCH parameters is close to 1, that is, when one is close to an infinite-variance marginal distribution. This situation has
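    A small NumPy sketch of the object under study: a simulated GARCH(1,1) path and the sample autocorrelations of its squares. The parameter values are illustrative only, with alpha + beta = 0.98 to mimic the near-unit regime the record highlights.

```python
import numpy as np

# GARCH(1,1): x_t = sigma_t * z_t,
#             sigma_t^2 = omega + alpha * x_{t-1}^2 + beta * sigma_{t-1}^2

def simulate_garch(n, omega=0.05, alpha=0.08, beta=0.90, seed=0):
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    sigma2 = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for t in range(n):
        x[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = omega + alpha * x[t] ** 2 + beta * sigma2
    return x

def sample_acf(y, max_lag=20):
    y = y - y.mean()
    denom = np.dot(y, y)
    return np.array([np.dot(y[:-k], y[k:]) / denom for k in range(1, max_lag + 1)])

x = simulate_garch(100_000)
print(sample_acf(x ** 2, max_lag=5))  # slowly decaying ACF of the squares
```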

  17. Higher Fasting Plasma Glucose Levels, within the Normal Range, are Associated with Decreased Processing Speed in High Functioning Young Elderly.

    Science.gov (United States)

    Raizes, Meytal; Elkana, Odelia; Franko, Motty; Ravona Springer, Ramit; Segev, Shlomo; Beeri, Michal Schnaider

    2016-01-01

    We explored the association of plasma glucose levels within the normal range with processing speed in high functioning young elderly, free of type 2 diabetes mellitus (T2DM). A sample of 41 participants (mean age = 64.7, SD = 10; glucose 94.5 mg/dL, SD = 9.3) was examined with a computerized cognitive battery. Hierarchical linear regression analysis showed that higher plasma glucose levels, albeit within the normal range, were associated with slower processing speed. These results suggest that even glucose levels within the normal range may have an impact on cognitive function.

  18. A Bayes Theory-Based Modeling Algorithm to End-to-end Network Traffic

    Directory of Open Access Journals (Sweden)

    Zhao Hong-hao

    2016-01-01

    Full Text Available Recently, network traffic has been increasing exponentially due to all kinds of applications, such as mobile Internet, smart cities, smart transportation, the Internet of things, and so on. The end-to-end network traffic thus becomes more important for traffic engineering, but end-to-end traffic estimation is usually highly difficult. This paper proposes a Bayes-theory-based method to model the end-to-end network traffic. Firstly, the end-to-end network traffic is described as an independent identically distributed normal process. Then Bayes theory is used to characterize the end-to-end network traffic. By calculating the parameters, the model is determined correctly. Simulation results show that our approach is feasible and effective.
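    In its simplest conjugate form, the i.i.d.-normal Bayes idea in the record reduces to a normal-normal update; the sketch below is that textbook reduction, not the paper's exact algorithm, and all numbers are synthetic.

```python
import numpy as np

# Traffic volumes y_1..y_n ~ N(mu, sigma2), sigma2 treated as known,
# with a conjugate normal prior mu ~ N(mu0, tau0^2). The posterior of
# mu is again normal, via the familiar precision-weighted update.

def posterior_mu(y, sigma2, mu0, tau0_sq):
    n = len(y)
    prec = 1.0 / tau0_sq + n / sigma2            # posterior precision
    mean = (mu0 / tau0_sq + y.sum() / sigma2) / prec
    return mean, 1.0 / prec                      # posterior mean, variance

rng = np.random.default_rng(0)
y = rng.normal(120.0, 15.0, size=50)             # synthetic traffic volumes (Mb/s)
print(posterior_mu(y, sigma2=15.0 ** 2, mu0=100.0, tau0_sq=40.0 ** 2))
```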

  19. Stochastic processes, optimization, and control theory a volume in honor of Suresh Sethi

    CERN Document Server

    Yan, Houmin

    2006-01-01

    This edited volume contains 16 research articles. It presents recent and pressing issues in stochastic processes, control theory, differential games, optimization, and their applications in finance, manufacturing, queueing networks, and climate control. One of the salient features is that the book is highly multi-disciplinary. The book is dedicated to Professor Suresh Sethi on the occasion of his 60th birthday, in view of his distinguished career.

  20. Facial-Attractiveness Choices Are Predicted by Divisive Normalization.

    Science.gov (United States)

    Furl, Nicholas

    2016-10-01

    Do people appear more attractive or less attractive depending on the company they keep? A divisive-normalization account, in which representation of stimulus intensity is normalized (divided) by concurrent stimulus intensities, predicts that choice preferences among options increase with the range of option values. In the first experiment reported here, I manipulated the range of attractiveness of the faces presented on each trial by varying the attractiveness of an undesirable distractor face that was presented simultaneously with two attractive targets, and participants were asked to choose the most attractive face. I used normalization models to predict the context dependence of preferences regarding facial attractiveness. The more unattractive the distractor, the more one of the targets was preferred over the other target, which suggests that divisive normalization (a potential canonical computation in the brain) influences social evaluations. I obtained the same result when I manipulated faces' averageness and participants chose the most average face. This finding suggests that divisive normalization is not restricted to value-based decisions (e.g., attractiveness). This new application to social evaluation of normalization, a classic theory, opens possibilities for predicting social decisions in naturalistic contexts such as advertising or dating.
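    A toy illustration of the divisive-normalization account described above: with an unattractive distractor the normalizing denominator shrinks, stretching the effective difference between the two attractive targets. The saturation constant sigma, the weight w, and the softmax choice rule are generic modeling choices, not the paper's fitted model.

```python
import numpy as np

def normalized_values(values, sigma=1.0, w=1.0):
    """Divisive normalization: each value divided by a saturation term
    plus the (weighted) sum of all values present on the trial."""
    values = np.asarray(values, dtype=float)
    return values / (sigma + w * values.sum())

def softmax_choice(values, beta=20.0):
    """Map normalized values to choice probabilities."""
    z = np.exp(beta * np.asarray(values))
    return z / z.sum()

# Two attractive targets (0.80, 0.75) with a bad vs. a good distractor.
with_bad_distractor = normalized_values([0.80, 0.75, 0.20])
with_good_distractor = normalized_values([0.80, 0.75, 0.60])
print(softmax_choice(with_bad_distractor))   # sharper preference between targets
print(softmax_choice(with_good_distractor))  # weaker preference between targets
```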

  1. Modern tendencies and problems of the theory of spiritual-moral processes management in higher school

    Directory of Open Access Journals (Sweden)

    Iryna Sidanich

    2016-03-01

    Full Text Available In the article, the modern tendencies and problems of the theory of managing spiritual-moral processes in higher school are analyzed. The key tasks of reforming higher education are defined: ensuring its quality and constructing an effective educational system of higher school institutions with effective economy and management. The problem of ensuring the axiological direction of the spiritual-humanitarian component of the educational process in the system of higher education is characterized. The priorities of national interests in the spiritual-moral education of the younger generation in state educational activity are defined: national self-consciousness, spiritual-cultural unity of the nation, patriotism, humanism, tolerance, and responsibility. The system of higher education is analyzed in terms of the interaction of its spiritual and secular components in the coordinates of the moral recovery and spiritual enlightenment of the nation, the elaboration of the democratic principles of society, and the construction of a modern theory of managing spiritual-moral processes in higher school. New directions of this theory are defined in terms of the development of innovations and commercialization; the attraction of employers to collaboration with scientists in dedicated working groups for the creation of new educational programmes and the modernization of existing ones; and mentor support and training of students for job placement and the development of entrepreneurial skills, including support for programmes of probation or practical participation of students in 'real social projects'. Prospects for further research are characterized in terms of elaborating the main functions that must establish the main requirements for production tasks in the professional activity of holders of a master's degree in the speciality 'Christian pedagogics in higher education'.

  2. Temporal and speech processing skills in normal hearing individuals exposed to occupational noise.

    Science.gov (United States)

    Kumar, U Ajith; Ameenudin, Syed; Sangamanatha, A V

    2012-01-01

    Prolonged exposure to high levels of occupational noise can cause damage to hair cells in the cochlea and result in permanent noise-induced cochlear hearing loss. Consequences of cochlear hearing loss on speech perception and psychophysical abilities have been well documented. The primary goal of this research was to explore temporal processing and speech perception skills in individuals who are exposed to occupational noise of more than 80 dBA and have not yet incurred clinically significant threshold shifts. The contribution of temporal processing skills to speech perception in adverse listening situations was also evaluated. A total of 118 participants took part in this research. Participants comprised three groups of train drivers in the age ranges of 30-40 (n = 13), 41-50 (n = 9), and 51-60 (n = 6) years and their non-noise-exposed counterparts (n = 30 in each age group). Participants of all the groups, including the train drivers, had hearing sensitivity within 25 dB HL at the octave frequencies between 250 Hz and 8 kHz. Temporal processing was evaluated using gap detection, modulation detection, and duration pattern tests. Speech recognition was tested in the presence of multi-talker babble at -5 dB SNR. Differences between experimental and control groups were analyzed using ANOVA and independent-sample t-tests. Results showed a trend of reduced temporal processing skills in individuals with noise exposure. These deficits were observed despite normal peripheral hearing sensitivity. Speech recognition scores in the presence of noise were also significantly poorer in the noise-exposed group. Furthermore, poor temporal processing skills partially accounted for the speech recognition difficulties exhibited by the noise-exposed individuals. These results suggest that noise can cause significant distortions in the processing of suprathreshold temporal cues, which may add to difficulties in hearing in adverse listening conditions.

  3. Temporal and speech processing skills in normal hearing individuals exposed to occupational noise

    Directory of Open Access Journals (Sweden)

    U Ajith Kumar

    2012-01-01

    Full Text Available Prolonged exposure to high levels of occupational noise can cause damage to hair cells in the cochlea and result in permanent noise-induced cochlear hearing loss. Consequences of cochlear hearing loss on speech perception and psychophysical abilities have been well documented. The primary goal of this research was to explore temporal processing and speech perception skills in individuals who are exposed to occupational noise of more than 80 dBA and have not yet incurred clinically significant threshold shifts. The contribution of temporal processing skills to speech perception in adverse listening situations was also evaluated. A total of 118 participants took part in this research. Participants comprised three groups of train drivers in the age ranges of 30-40 (n = 13), 41-50 (n = 9), and 51-60 (n = 6) years and their non-noise-exposed counterparts (n = 30 in each age group). Participants of all the groups, including the train drivers, had hearing sensitivity within 25 dB HL at the octave frequencies between 250 Hz and 8 kHz. Temporal processing was evaluated using gap detection, modulation detection, and duration pattern tests. Speech recognition was tested in the presence of multi-talker babble at -5 dB SNR. Differences between experimental and control groups were analyzed using ANOVA and independent-sample t-tests. Results showed a trend of reduced temporal processing skills in individuals with noise exposure. These deficits were observed despite normal peripheral hearing sensitivity. Speech recognition scores in the presence of noise were also significantly poorer in the noise-exposed group. Furthermore, poor temporal processing skills partially accounted for the speech recognition difficulties exhibited by the noise-exposed individuals. These results suggest that noise can cause significant distortions in the processing of suprathreshold temporal cues, which may add to difficulties in hearing in adverse listening conditions.

  4. The Higgs particle and higher-dimensional theories

    International Nuclear Information System (INIS)

    Lim, C. S.

    2014-01-01

    In spite of the great success of LHC experiments, we do not know whether the discovered “standard model-like” Higgs particle is really what the standard model predicts, or a particle that some new physics has in its low-energy effective theory. Also, the long-standing problems concerning the property of the Higgs and its interactions are still there, and we still do not have any conclusive argument on the origin of the Higgs itself. In this article we focus on higher-dimensional theories as new physics. First we give a brief review of their representative scenarios and closely related 4D scenarios. Among them, we mainly discuss two interesting possibilities of the origin of the Higgs: the Higgs as a gauge boson and the Higgs as a (pseudo) Nambu–Goldstone boson. Next, we argue that theories of new physics are divided into two categories, i.e., theories with normal Higgs interactions and those with anomalous Higgs interactions. Interestingly, both the candidates for the origin of the Higgs mentioned above predict characteristic “anomalous” Higgs interactions, such as the deviation of the Yukawa couplings from the standard model predictions. Such deviations can hopefully be investigated by precision tests of Higgs interactions at the planned ILC experiment. Also discussed is the main decay mode of the Higgs, H→γγ. Again, theories belonging to different categories are known to predict remarkably different new physics contributions to this important process

  5. The Helicobacter pylori theory and duodenal ulcer disease. A case study of the research process

    DEFF Research Database (Denmark)

    Christensen, A H; Gjørup, T

    1995-01-01

    OBJECTIVES: To describe the medical research process from the time of the generation of a new theory to its implementation in clinical practice. The Helicobacter pylori (H. pylori) theory, i.e. the theory that H. pylori plays a significant causal role in duodenal ulcer disease, was chosen as a case. MATERIAL: Abstracts from 1984 to 1993, identified in the CD-ROM Medline system ("Silverplatter") using the search terms Campylobacter pylori and Helicobacter pylori, and reviews and editorials about H. pylori in some of the most widespread clinical journals. RESULTS: 2204 papers on H. pylori were ... should be selected for H. pylori eradication treatment. CONCLUSION: Descriptive clinical studies and laboratory studies of disease mechanisms were the prevailing types of research about H. pylori. Comparatively few therapeutic intervention studies were done; this fact may have hampered the acceptance ...

  6. Ibuprofen as a medication for the correction of symptoms of the normal vaccinal process in children

    Directory of Open Access Journals (Sweden)

    T.A. Chebotareva

    2008-01-01

    Full Text Available The pathogenetic approach to the treatment of symptoms of the normal vaccinal process in children after standard vaccination, based on the results of the application of the anti-inflammatory medications ibuprofen (Nurofen for children) and paracetamol, is presented in this article. The clinical activity of ibuprofen was established on the basis of clinical catamnestic observation of 856 vaccinated children aged from 3 months to 3 years. Recommendations for the application of these medications for the correction of vaccinal reactions are given. Key words: children, ibuprofen, paracetamol, vaccination.

  7. Propagation of normal zones in composite superconductors

    International Nuclear Information System (INIS)

    Dresner, L.

    1976-08-01

    This paper describes calculations of propagation velocities of normal zones in composite superconductors. Full accounting is made for (1) current sharing, (2) the variation with temperature of the thermal conductivity of the copper matrix, and the specific heats of the matrix and the superconductor, and (3) the variation with temperature of the steady-state heat transfer at a copper-helium interface in the nucleate-boiling, transition, and film-boiling ranges. The theory, which contains no adjustable parameters, is compared with experiments on bare (uninsulated) conductors. Agreement is not good. It is concluded that the effects of transient heat transfer may need to be included in the theory to improve agreement with experiment

  8. A Grounded Theory of Text Revision Processes Used by Young Adolescents Who Are Deaf

    Science.gov (United States)

    Yuknis, Christina

    2014-01-01

    This study examined the revising processes used by 8 middle school students who are deaf or hard-of-hearing as they composed essays for their English classes. Using grounded theory, interviews with students and teachers in one middle school, observations of the students engaging in essay creation, and writing samples were collected for analysis.…

  9. Towards Transition Theory

    NARCIS (Netherlands)

    J. de Haan (Hans)

    2010-01-01

    This thesis is a treatise on a theory for societal transitions: pillar theory. Societal transitions are complex processes taking place in complex systems, large-scale, long-term processes in which societal systems radically change the way they are composed and function. Since we all are

  10. Analysis of the stochastic channel model by Saleh & Valenzuela via the theory of point processes

    DEFF Research Database (Denmark)

    Jakobsen, Morten Lomholt; Pedersen, Troels; Fleury, Bernard Henri

    2012-01-01

    and underlying features, like the intensity function of the component delays and the delay-power intensity. The flexibility and clarity of the mathematical instruments utilized to obtain these results lead us to conjecture that the theory of spatial point processes provides a unifying mathematical framework

  11. Normal gravity field in relativistic geodesy

    Science.gov (United States)

    Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao

    2018-02-01

    Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in development of quantum sensors for applications in geodesy including quantum gravimeters and gradientometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of the Earth's gravitational field are referred is a normal gravity field represented in the Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field that is sufficient for current and near-future practical applications. We show that in general relativity the level surface of homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are

  12. Hypergame theory applied to cyber attack and defense

    Science.gov (United States)

    House, James Thomas; Cybenko, George

    2010-04-01

    This work concerns cyber attack and defense in the context of game theory, specifically hypergame theory. Hypergame theory extends classical game theory with the ability to deal with differences in players' expertise, differences in their understanding of game rules, misperceptions, and so forth. Each of these different sub-scenarios, or subgames, is associated with a probability representing the likelihood that the given subgame is truly "in play" at a given moment. In order to form an optimal attack or defense policy, these probabilities must be learned if they are not known a priori. We present hidden Markov model and maximum entropy approaches for accurately learning these probabilities through multiple iterations of both normal and modified game play. We also give a widely applicable approach for the analysis of cases where an opponent is aware that he is being studied, and intentionally plays to spoil the process of learning and thereby obfuscate his attributes. These are considered in the context of a generic, abstract cyber attack example. We demonstrate that machine learning efficacy can be heavily dependent on the goals and styles of participant behavior. To this end, detailed simulation results under various combinations of attacker and defender behaviors are presented and analyzed.
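    A toy Bayes-filter sketch of the inference problem described above: maintain a belief over candidate subgames and update it from the likelihood each subgame assigns to observed moves. The paper's actual machinery is hidden Markov models and maximum entropy; every name and number below is invented for illustration.

```python
import numpy as np

subgames = ["opponent knows topology", "opponent probing blindly"]
prior = np.array([0.5, 0.5])

# Assumed P(observation | subgame) for two kinds of observed scans.
likelihood = {
    "targeted_scan": np.array([0.7, 0.2]),
    "random_scan":   np.array([0.3, 0.8]),
}

# Bayes update after each observed move: multiply by the likelihood
# column and renormalize to keep a probability over subgames.
belief = prior.copy()
for obs in ["targeted_scan", "targeted_scan", "random_scan"]:
    belief *= likelihood[obs]
    belief /= belief.sum()
print(dict(zip(subgames, belief.round(3))))
```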

  13. Norm Theory: Comparing Reality to Its Alternatives.

    Science.gov (United States)

    Kahneman, Daniel; Miller, Dale T.

    1986-01-01

    A theory of norms and normality is applied to some phenomena of emotional responses, social judgment, and conversations about causes. Norm theory is applied in analyses of enhanced emotional response to events that have abnormal causes, of generation of prediction from observations of behavior, and of the role of norms. (Author/LMO)

  14. Testing of the coping flexibility hypothesis based on the dual-process theory: Relationships between coping flexibility and depressive symptoms.

    Science.gov (United States)

    Kato, Tsukasa

    2015-12-15

    According to the dual-process theory of coping flexibility (Kato, 2012), coping flexibility is the ability to discontinue an ineffective coping strategy (i.e., the evaluation coping process) and implement an alternative strategy (i.e., the adaptive coping process). The coping flexibility hypothesis (CFH) proposes that the ability to engage in flexible coping is related to better psychological functioning and physical health, including less depression. In the present study, participants were 393 American Whites, 429 Australian Whites, and 496 Chinese, selected from the data pool of the 2013 Coping and Health Survey (see Kato, 2014b). They completed both the Coping Flexibility Scale (Kato, 2012), which is based on the dual-process theory of coping flexibility, and the Center for Epidemiologic Studies Depression Scale (CES-D). For all nationalities and genders, evaluation coping and adaptive coping were significantly correlated with lower levels of depressive symptoms. Structural equation modeling revealed that evaluation coping was associated with lower depressive symptoms for all nationalities and genders, whereas no significant relationships between adaptive coping and depressive symptoms were found for any nationality. Our results partially supported the CFH as framed by the dual-process theory of coping flexibility. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  15. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  16. Hamiltonian formulation of theory with higher order derivatives

    International Nuclear Information System (INIS)

    Gitman, D.M.; Lyakhovich, S.L.; Tyutin, I.V.

    1983-01-01

    A method of ''hamiltonization'' of a special theory with higher order derivatives is described. In a nonspecial case the result coincides with the known Ostrogradsky formulation. It is shown that in the nonspecial theory the Lagrange equations of motion are reduced to the normal form.
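
    For reference, the Ostrogradsky construction that this record invokes proceeds, for a Lagrangian L(q, q̇, q̈) nondegenerate in q̈, by introducing one coordinate-momentum pair per derivative order. This is the standard textbook form, not a formula taken from the paper itself:

      \begin{aligned}
      p_1 &= \frac{\partial L}{\partial \dot q} - \frac{d}{dt}\,\frac{\partial L}{\partial \ddot q},
      \qquad
      p_2 = \frac{\partial L}{\partial \ddot q},\\
      H &= p_1\,\dot q + p_2\,\ddot q(q,\dot q,p_2)
           - L\bigl(q,\dot q,\ddot q(q,\dot q,p_2)\bigr),
      \end{aligned}

    with (q, q̇) promoted to independent canonical coordinates conjugate to (p₁, p₂).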

  17. Projecting Grammatical Features in Nominals: Cognitive Processing Theory & Computational Implementation

    Science.gov (United States)

    2010-03-01

    The underlying linguistic theory is an adaptation of X-Bar Theory (Chomsky, 1970; Jackendoff, 1977) called Bi-Polar Theory (Ball, 2007a). In Bi-Polar...

  18. Effects of Cognitive Complexity and Emotional Upset on Processing Supportive Messages: Two Tests of a Dual-Process Theory of Supportive Communication Outcomes

    Science.gov (United States)

    Bodie, Graham D.; Burleson, Brant R.; Holmstrom, Amanda J.; McCullough, Jennifer D.; Rack, Jessica J.; Hanasono, Lisa K.; Rosier, Jennifer G.

    2011-01-01

    We report tests of hypotheses derived from a theory of supportive communication outcomes that maintains that the effects of supportive messages are moderated by factors influencing the motivation and ability to process these messages. Participants in two studies completed a measure of cognitive complexity, which provided an assessment of processing…

  19. A measure theoretical approach to quantum stochastic processes

    CERN Document Server

    Von Waldenfels, Wilhelm

    2014-01-01

    This monograph takes as its starting point that abstract quantum stochastic processes can be understood as a quantum field theory in one space and in one time coordinate. As a result, it is appropriate to represent operators as power series of creation and annihilation operators in normal-ordered form, which can be achieved using classical measure theory. Considering in detail four basic examples (e.g., a two-level atom coupled to a heat bath of oscillators), in each case the Hamiltonian of the associated one-parameter strongly continuous group is determined and the spectral decomposition is explicitly calculated in the form of generalized eigenvectors. Advanced topics include the theory of the Hudson-Parthasarathy equation and the amplified oscillator problem. To that end, a chapter on white noise calculus has also been included.

  20. Biomechanics of normal and pathological gait: implications for understanding human locomotor control.

    Science.gov (United States)

    Winter, D A

    1989-12-01

    The biomechanical (kinetic) analysis of human gait reveals the integrated and detailed motor patterns that are essential in pinpointing the abnormal patterns in pathological gait. In a similar manner, these motor patterns (moments, powers, and EMGs) can be used to identify synergies and to validate theories of CNS control. Based on kinetic and EMG patterns for a wide range of normal subjects and cadences, evidence is presented that both supports and negates the central pattern generator theory of locomotion. Adaptive motor patterns that are evident in peripheral gait pathologies reinforce a strong peripheral rather than a central control. Finally, a three-component subtask theory of human gait is presented and is supported by reference to the motor patterns seen in a normal gait. The identified subtasks are (a) support (against collapse during stance); (b) dynamic balance of the upper body, also during stance; and (c) feedforward control of the foot trajectory to achieve safe ground clearance and a gentle heel contact.

  1. Theory of high energy collision processes. Final report, June 1, 1969-May 31, 1984

    International Nuclear Information System (INIS)

    Wu, T.T.

    1984-01-01

    We have developed a comprehensive theory for scattering processes at extremely high energies. On the basis of relativistic quantum field theories with or without isotopic spin, we have obtained a simple physical picture, called the impact picture, which gives a number of unusual predictions. Many of these have been verified experimentally, including the increasing total cross sections, the increasing total elastic cross sections, the moving dip, and the rising plateau. An especially accurate experimental verification of the prediction of increasing total cross section has been provided by the CERN p̄p Collider at a c.m. energy of 540 GeV. All of these predictions were obtained by resumming the perturbation series. The natural next step is to look for important physical effects that cannot be seen by any method of resumming the perturbation series. One such method is to find nonperturbative effects already present on the classical level; another is to construct exactly solvable models of quantum field theory. Both approaches have been pursued. Recent theoretical results include the possible occurrence of indeterminate-mass particles, dynamic determination of coupling constants, a solvable Z₂ lattice gauge theory, a generalization of the method of helicity amplitudes, classical models of confinement, and a monopole as a short-distance probe. 152 publications are listed

  2. On the theory of polar ozone holes

    International Nuclear Information System (INIS)

    Njau, E.C.

    1990-12-01

    The viable theories already proposed to explain polar ozone holes generally fall into two main categories, namely, chemical theories and dynamical theories. In both of these categories, polar stratospheric clouds (PSCs) are taken as part of the essential basis. Besides, all the dynamical theories are based upon temperature changes. Since formation of the PSCs is highly temperature-dependent, it has been concluded from recent research (e.g. see Kawahira and Hirooka) that temperature changes are a cause, not a result of ozone depletion in polar regions. On this basis, formulations are developed that represent short-term and long-term temperature variations in the polar regions due to natural processes. These variations, which are confined to a limited area around each pole, include specific oscillations with periods ranging from ∼ 2 years up to ∼ 218,597 years. Polar ozone variations are normally expected to be influenced by these temperature oscillations. It is, therefore, apparent that the generally decreasing trend observed in mean October ozone column at Halley Bay (76 deg. S, 27 deg. W) from 1956 up to 1987 is mostly caused by the decreasing phase of a combination of two natural temperature oscillations, one with a period of ∼ 70-80 years and the other with a period of ∼ 160-180 years. Contributions of other natural temperature oscillations are also mentioned and briefly discussed. (author). 35 refs, 4 figs

  3. The Process of Social Identity Development in Adolescent High School Choral Singers: A Grounded Theory

    Science.gov (United States)

    Parker, Elizabeth Cassidy

    2014-01-01

    The purpose of this grounded theory study was to describe the process of adolescent choral singers' social identity development within three midsized, midwestern high school mixed choirs. Forty-nine interviews were conducted with 36 different participants. Secondary data sources included memoing, observations, and interviews with the choir…

  4. ‘Living' theory: a pedagogical framework for process support in networked learning

    Directory of Open Access Journals (Sweden)

    Philipa Levy

    2006-12-01

    This paper focuses on the broad outcome of an action research project in which practical theory was developed in the field of networked learning through case-study analysis of learners' experiences and critical evaluation of educational practice. It begins by briefly discussing the pedagogical approach adopted for the case-study course and the action research methodology. It then identifies key dimensions of four interconnected developmental processes – orientation, communication, socialisation and organisation – that were associated with 'learning to learn' in the course's networked environment, and offers a flavour of participants' experiences in relation to these processes. A number of key evaluation issues that arose are highlighted. Finally, the paper presents the broad conceptual framework for the design and facilitation of process support in networked learning that was derived from this research. The framework proposes a strong, explicit focus on support for process as well as domain learning, and progression from tighter to looser design and facilitation structures for process-focused (as well as domain-focused) learning tasks.

  5. Bell-type quantum field theories

    International Nuclear Information System (INIS)

    Duerr, Detlef; Goldstein, Sheldon; Tumulka, Roderich; Zanghi, Nino

    2005-01-01

    In his paper (1986 Beables for quantum field theory Phys. Rep. 137 49-54) John S Bell proposed how to associate particle trajectories with a lattice quantum field theory, yielding what can be regarded as a |Ψ|²-distributed Markov process on the appropriate configuration space. A similar process can be defined in the continuum, for more or less any regularized quantum field theory; we call such processes Bell-type quantum field theories. We describe methods for explicitly constructing these processes. These concern, in addition to the definition of the Markov processes, the efficient calculation of jump rates, how to obtain the process from the processes corresponding to the free and interaction Hamiltonian alone, and how to obtain the free process from the free Hamiltonian or, alternatively, from the one-particle process by a construction analogous to 'second quantization'. As an example, we consider the process for a second quantized Dirac field in an external electromagnetic field. (topical review)
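
    For orientation, the jump rates mentioned above take, in the authors' related work, the schematic form below, where P(·) is the projection-valued measure on configuration space and [x]⁺ = max(x, 0); this is quoted from memory of that literature, so consult the review itself for the precise statement and domain conditions:

      \sigma(\mathrm{d}q \mid q') \;=\;
      \frac{\bigl[(2/\hbar)\,\operatorname{Im}\,
      \langle\Psi|P(\mathrm{d}q)\,H\,P(\mathrm{d}q')|\Psi\rangle\bigr]^{+}}
      {\langle\Psi|P(\mathrm{d}q')|\Psi\rangle}.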

  6. Cancer Theory from Systems Biology Point of View

    Science.gov (United States)

    Wang, Gaowei; Tang, Ying; Yuan, Ruoshi; Ao, Ping

    In our previous work, we proposed a novel cancer theory, the endogenous network theory, to understand the mechanisms underlying cancer genesis and development. Recently, we applied this theory to hepatocellular carcinoma (HCC). A core endogenous network of the hepatocyte was established by integrating the current understanding of the hepatocyte at the molecular level. The quantitative description of the endogenous network consisted of a set of stochastic differential equations which could generate many local attractors with obvious or non-obvious biological functions. Comparison with clinical observations and experimental data showed that two robust attractors from the model reproduced the main known features of the normal hepatocyte and the cancerous hepatocyte, respectively, at both the modular and molecular levels. In light of our theory, the genesis and progression of cancer is viewed as a transition from the normal attractor to the HCC attractor. A set of new insights into understanding cancer genesis and progression, and into strategies for cancer prevention, cure, and care, was provided.
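
    Only as an assumption about the model class being described (the record does not reproduce the paper's actual equations), endogenous-network dynamics of this kind are typically written as coupled stochastic differential equations,

      \dot{x}_i \;=\; f_i(x_1,\dots,x_n) \;+\; \zeta_i(t), \qquad i = 1,\dots,n,

    where x_i are the activities of the network nodes, f_i encodes the activation and inhibition couplings, and ζ_i is a noise term; the normal and cancerous phenotypes then correspond to distinct stable attractors of the deterministic part.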

  7. Theory of charge transport in diffusive normal metal conventional superconductor point contacts

    NARCIS (Netherlands)

    Tanaka, Y.; Golubov, Alexandre Avraamovitch; Kashiwaya, S.

    2003-01-01

    Tunneling conductance in diffusive normal (DN) metal/insulator/s-wave superconductor junctions is calculated for various situations by changing the magnitudes of the resistance and the Thouless energy in the DN and the transparency of the insulating barrier. The generalized boundary condition introduced by Nazarov is applied.

  8. Theory of thermal and charge transport in diffusive normal metal / superconductor junctions

    NARCIS (Netherlands)

    Yokoyama, T.; Tanaka, Y.; Golubov, Alexandre Avraamovitch; Asano, Y.

    2005-01-01

    Thermal and charge transport in diffusive normal metal (DN)/insulator/s-, d-, and p-wave superconductor junctions are studied based on the Usadel equation with Nazarov's generalized boundary condition. We derive a general expression for the thermal conductance in unconventional superconducting junctions.

  9. Recent progresses in relativistic beam-plasma instability theory

    Directory of Open Access Journals (Sweden)

    A. Bret

    2010-11-01

    Beam-plasma instabilities are a key physical process in many astrophysical phenomena. Within the fireball model of gamma-ray bursts, they first mediate a relativistic collisionless shock before they produce, upstream, the turbulence needed for the Fermi acceleration process. While non-relativistic systems are usually governed by flow-aligned unstable modes, relativistic ones are likely to be dominated by normally or even obliquely propagating waves. After reviewing the basis of the theory, results related to the relativistic kinetic regime of the poorly-known oblique unstable modes are presented. Relevant systems besides the well-known electron beam-plasma interaction are presented, and it is shown how the concept of mode hierarchy yields a criterion to assess the proton-to-electron mass ratio in particle-in-cell simulations.

  10. Inhibitory processes and cognitive flexibility: evidence for the theory of attentional inertia

    Directory of Open Access Journals (Sweden)

    Isabel Introzzi

    2015-07-01

    The aim of this study was to discriminate the differential contribution of different inhibitory processes – perceptual, cognitive, and behavioral inhibition – to the switching-cost effect associated with task-alternation. A correlational design was used. Several experimental paradigms (e.g., the stop-signal, visual search, Sternberg, and Simon paradigms) were adapted and included in a computerized program called TAC (Introzzi & Canet Juric, 2014) for the assessment of the different cognitive processes. The final sample consisted of 45 adults (18-50 years). Perceptual and behavioral inhibition showed moderate and low correlations with attentional cost, cognitive inhibition showed no relation with flexibility, and only perceptual inhibition predicted switching-cost effects, suggesting that different inhibitory processes contribute differentially to switching cost. This can be interpreted as evidence for the main argument of Attentional Inertia Theory, which postulates that inhibition plays an essential role in the ability to flexibly switch between tasks and/or representations.

  11. Theory of Constraints (TOC)

    DEFF Research Database (Denmark)

    Michelsen, Aage U.

    2004-01-01

    The thinking behind the Theory of Constraints, together with the planning principle Drum-Buffer-Rope and a sketch of the Thinking Process.

  12. Band theory of metals the elements

    CERN Document Server

    Altmann, Simon L

    1970-01-01

    Band Theory of Metals: The Elements focuses on the band theory of solids. The book first discusses revision of quantum mechanics. Topics include Heisenberg's uncertainty principle, normalization, stationary states, wave and group velocities, mean values, and variational method. The text takes a look at the free-electron theory of metals, including heat capacities, density of states, Fermi energy, core and metal electrons, and eigenfunctions in three dimensions. The book also reviews the effects of crystal fields in one dimension. The eigenfunctions of the translations; symmetry operations of t

  13. Minimal theory of massive gravity

    International Nuclear Information System (INIS)

    De Felice, Antonio; Mukohyama, Shinji

    2016-01-01

    We propose a new theory of massive gravity with only two propagating degrees of freedom. While the homogeneous and isotropic background cosmology and the tensor linear perturbations around it are described by exactly the same equations as those in the de Rham–Gabadadze–Tolley (dRGT) massive gravity, the scalar and vector gravitational degrees of freedom are absent in the new theory at the fully nonlinear level. Hence the new theory provides a stable nonlinear completion of the self-accelerating cosmological solution that was originally found in the dRGT theory. The cosmological solution in the other branch, often called the normal branch, is also rendered stable in the new theory and, for the first time, makes it possible to realize an effective equation-of-state parameter different from (either larger or smaller than) −1 without introducing any extra degrees of freedom.

  14. Immediate survival focus: synthesizing life history theory and dual process models to explain substance use.

    Science.gov (United States)

    Richardson, George B; Hardesty, Patrick

    2012-01-01

    Researchers have recently applied evolutionary life history theory to the understanding of behaviors often conceived of as prosocial or antisocial. In addition, researchers have applied cognitive science to the understanding of substance use and used dual process models, where explicit cognitive processes are modeled as relatively distinct from implicit cognitive processes, to explain and predict substance use behaviors. In this paper we synthesized these two theoretical perspectives to produce an adaptive and cognitive framework for explaining substance use. We contend that this framework provides new insights into the nature of substance use that may be valuable for both clinicians and researchers.

  15. Hiding in plain sight: communication theory in implementation science.

    Science.gov (United States)

    Manojlovich, Milisa; Squires, Janet E; Davies, Barbara; Graham, Ian D

    2015-04-23

    Poor communication among healthcare professionals is a pressing problem, contributing to widespread barriers to patient safety. The word "communication" means to share or make common. In the literature, two communication paradigms dominate: (1) communication as a transactional process responsible for information exchange, and (2) communication as a transformational process responsible for causing change. Implementation science has focused on information exchange attributes while largely ignoring transformational attributes of communication. In this paper, we debate the merits of encompassing both paradigms. We conducted a two-staged literature review searching for the concept of communication in implementation science to understand how communication is conceptualized. Twenty-seven theories, models, or frameworks were identified; only Rogers' Diffusion of Innovations theory provides a definition of communication and includes both communication paradigms. Most models (notable exceptions include Diffusion of Innovations, The Ottawa Model of Research Use, and Normalization Process Theory) describe communication as a transactional process. But thinking of communication solely as information transfer or exchange misrepresents reality. We recommend that implementation science theories (1) propose and test the concept of shared understanding when describing communication, (2) acknowledge that communication is multi-layered, identify at least a few layers, and posit how identified layers might affect the development of shared understanding, (3) acknowledge that communication occurs in a social context, providing a frame of reference for both individuals and groups, (4) acknowledge the unpredictability of communication (and healthcare processes in general), and (5) engage with and draw on work done by communication theorists. Implementation science literature has conceptualized communication as a transactional process (when communication has been mentioned at all), thereby

  16. Parallel Distributed Processing Theory in the Age of Deep Networks.

    Science.gov (United States)

    Bowers, Jeffrey S

    2017-12-01

    Parallel distributed processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely that all knowledge is coded in a distributed format and cognition is mediated by non-symbolic computations. These claims have long been debated in cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks learn units that respond selectively to meaningful categories, and researchers are finding that deep networks need to be supplemented with symbolic systems to perform some tasks. Given the close links between PDP and deep networks, it is surprising that research with deep networks is challenging PDP theory. Copyright © 2017. Published by Elsevier Ltd.

  17. Problems in probability theory, mathematical statistics and theory of random functions

    CERN Document Server

    Sveshnikov, A A

    1979-01-01

    Problem solving is the main thrust of this excellent, well-organized workbook. Suitable for students at all levels in probability theory and statistics, the book presents over 1,000 problems and their solutions, illustrating fundamental theory and representative applications in the following fields: Random Events; Distribution Laws; Correlation Theory; Random Variables; Entropy & Information; Markov Processes; Systems of Random Variables; Limit Theorems; Data Processing; and more.The coverage of topics is both broad and deep, ranging from the most elementary combinatorial problems through lim

  18. Normal Forms for Fuzzy Logics: A Proof-Theoretic Approach

    Czech Academy of Sciences Publication Activity Database

    Cintula, Petr; Metcalfe, G.

    2007-01-01

    Roč. 46, č. 5-6 (2007), s. 347-363 ISSN 1432-0665 R&D Projects: GA MŠk(CZ) 1M0545 Institutional research plan: CEZ:AV0Z10300504 Keywords : fuzzy logic * normal form * proof theory * hypersequents Subject RIV: BA - General Mathematics Impact factor: 0.620, year: 2007

  19. The process of adopting and incorporating simulation into undergraduate nursing curricula: a grounded theory study.

    Science.gov (United States)

    Taplay, Karyn; Jack, Susan M; Baxter, Pamela; Eva, Kevin; Martin, Lynn

    2015-01-01

    The aim of this study is to explain the process of adopting and incorporating simulation as a teaching strategy in undergraduate nursing programs, define uptake, and discuss potential outcomes. In many countries, simulation is increasingly adopted as a common teaching strategy. However, there is a dearth of knowledge related to the process of adoption and incorporation. We used an interpretive, constructivist approach to grounded theory to guide this research study. We conducted the study in Ontario, Canada, during 2011-2012. Multiple data sources informed the development of this theory, including in-depth interviews (n = 43) and a review of key organizational documents, such as mission and vision statements (n = 67), from multiple nursing programs (n = 13). The adoption and uptake of mid- to high-fidelity simulation equipment is a multistep iterative process involving various organizational levels within the institution that entails a seven-phase process: (a) securing resources, (b) nursing leaders working in tandem, (c) getting it out of the box, (d) learning about simulation and its potential for teaching, (e) finding a fit, (f) trialing the equipment, and (g) integrating it into the curriculum. These findings could assist nursing programs in Canada and internationally that wish to adopt or further incorporate simulation into their curricula, and they highlight potential organizational and program-level outcomes. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  20. A Quantum Version of Wigner's Transition State Theory

    NARCIS (Netherlands)

    Schubert, R.; Waalkens, H.; Wiggins, S.

    A quantum version of a recent realization of Wigner's transition state theory in phase space is presented. The theory developed builds on a quantum normal form which locally decouples the quantum dynamics near the transition state to any desired order in ℏ. This leads to an explicit

  1. Evaluation of Two Absolute Radiometric Normalization Algorithms for Pre-processing of Landsat Imagery

    Institute of Scientific and Technical Information of China (English)

    Xu Hanqiu

    2006-01-01

    In order to evaluate radiometric normalization techniques, two image normalization algorithms for absolute radiometric correction of Landsat imagery were quantitatively compared in this paper: the Illumination Correction Model proposed by Markham and Irish, and the Illumination and Atmospheric Correction Model developed by the Remote Sensing and GIS Laboratory of Utah State University. Relative noise, correlation coefficient, and slope value were used as the criteria for the evaluation and comparison; these were derived from pseudo-invariant features identified in the multitemporal images. The noise between the normalized multitemporal images was significantly reduced when the seasons of the multitemporal images were different. However, there was no significant difference between the normalized and unnormalized images under similar seasonal conditions. Furthermore, the correction results of the two algorithms are similar when the images are relatively clear with a uniform atmospheric condition. Therefore, radiometric normalization procedures should be carried out if the multitemporal images have a significant seasonal difference.
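
    The three evaluation criteria named above can be computed from paired pseudo-invariant-feature (PIF) pixel values; the sketch below is a minimal illustration under assumed variable names and an assumed definition of relative noise, not the paper's exact procedure.

      # Evaluate a normalization by regressing date-2 PIF radiances on date-1:
      # a correlation and slope near 1 and small residual noise indicate good
      # radiometric agreement between the two dates.
      import numpy as np

      def evaluate_normalization(pif_date1, pif_date2):
          r = np.corrcoef(pif_date1, pif_date2)[0, 1]
          slope, intercept = np.polyfit(pif_date1, pif_date2, 1)
          residuals = pif_date2 - (slope * pif_date1 + intercept)
          relative_noise = residuals.std() / pif_date2.mean()
          return r, slope, relative_noise

      x = np.array([32., 55., 71., 90., 120., 143.])  # date-1 PIF radiances
      y = np.array([30., 57., 70., 93., 118., 145.])  # date-2, after normalization
      print(evaluate_normalization(x, y))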

  2. Log-Normal Turbulence Dissipation in Global Ocean Models

    Science.gov (United States)

    Pearson, Brodie; Fox-Kemper, Baylor

    2018-03-01

    Data from turbulent numerical simulations of the global ocean demonstrate that the dissipation of kinetic energy obeys a nearly log-normal distribution even at large horizontal scales O(10 km). As the horizontal scales of resolved turbulence are larger than the ocean is deep, the Kolmogorov-Yaglom theory for intermittency in 3D homogeneous, isotropic turbulence cannot apply; instead, the down-scale potential enstrophy cascade of quasigeostrophic turbulence should. Yet, energy dissipation obeys approximate log-normality, robustly across depths, seasons, regions, and subgrid schemes. The distribution parameters, skewness and kurtosis, show small systematic departures from log-normality with depth and subgrid friction schemes. Log-normality suggests that a few high-dissipation locations dominate the integrated energy and enstrophy budgets, which should be taken into account when making inferences from simplified models and inferring global energy budgets from sparse observations.
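
    A quick way to probe the claimed log-normality is to examine the skewness and excess kurtosis of log(ε), both of which vanish for an exactly log-normal field; the snippet below does this on synthetic data standing in for actual model diagnostics.

      # If eps is log-normal, log(eps) is Gaussian: skewness ~ 0 and excess
      # kurtosis ~ 0. Departures quantify the systematic deviations noted above.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      eps = rng.lognormal(mean=-10.0, sigma=2.0, size=100_000)  # synthetic dissipation

      log_eps = np.log(eps)
      print("skewness of log(eps):", stats.skew(log_eps))
      print("excess kurtosis of log(eps):", stats.kurtosis(log_eps))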

  3. Differential expression and processing of transforming growth factor beta induced protein (TGFBIp) in the normal human cornea during postnatal development and aging

    DEFF Research Database (Denmark)

    Karring, Henrik; Runager, Kasper; Valnickova, Zuzana

    2010-01-01

    Transforming growth factor beta induced protein (TGFBIp, also named keratoepithelin) is an extracellular matrix protein abundant in the cornea. The purpose of this study was to determine the expression and processing of TGFBIp in the normal human cornea during postnatal development and aging...... trimming events from the N-terminus of mature TGFBIp generate TGFBIp isoforms which form a similar "zig-zag" pattern when separated by 2-D polyacrylamide gel electrophoresis (PAGE). This study shows that in humans TGFBIp is more abundant in mature corneas than in the developing cornea...... and that the processing of TGFBIp changes during postnatal development of the cornea. In addition, TGFBIp appears to be degraded in a highly orchestrated manner in the normal human cornea with the resulting C-terminal fragments being retained in the cornea. The age-related changes in the expression and processing...

  4. Decision and intuition during organizational change : an evolutionary critique of dual process theory

    OpenAIRE

    Talat, U; Chang, K; Nguyen, B

    2017-01-01

    Purpose: The purpose of this paper is to review intuition in the context of organizational change. We argue that intuition as a concept requires attention and its formulation is necessary prior to its application in organizations. The paper provides a critique of Dual Process Theory and highlights shortcomings in organization theorizing of intuition. Design/methodology/approach: The paper is conceptual and provides in-depth theoretical discussions by drawing from the literature on decision...

  5. Normal Mode Analysis to a Poroelastic Half-Space Problem under Generalized Thermoelasticity

    Directory of Open Access Journals (Sweden)

    Chunbao Xiong

    The thermo-hydro-mechanical problems associated with a poroelastic half-space soil medium with variable properties under generalized thermoelasticity theory were investigated in this study. Remaining faithful to Biot's theory of dynamic poroelasticity, we idealized the foundation material as a uniform, fully saturated, poroelastic half-space medium. We first subjected this medium to time-harmonic loads consisting of normal or thermal loads, then investigated the differences between the coupled thermo-hydro-mechanical dynamic models and the thermo-elastic dynamic models. We used normal mode analysis to solve the resulting non-dimensional coupled equations, then investigated the effects that non-dimensional vertical displacement, excess pore water pressure, vertical stress, and temperature distribution exerted on the poroelastic half-space medium and represented them graphically.
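
    Normal mode analysis, as used here, assumes separable solutions of the schematic form (the symbols are illustrative, not the paper's notation)

      [u, w, p, \theta](x, z, t) \;=\; [u^{*}, w^{*}, p^{*}, \theta^{*}](z)\; e^{\,i(ax - \omega t)},

    which reduces the coupled partial differential equations to ordinary differential equations in the depth coordinate z.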

  6. Optically stimulated exoelectron emission processes in quartz: comparison of experiment and theory

    DEFF Research Database (Denmark)

    Pagonis, V.; Ankjærgaard, Christina; Murray, A.S.

    2009-01-01

    Recent experiments have demonstrated that it is possible to measure optically stimulated exoelectron emission (OSE) signals simultaneously with optically stimulated luminescence (OSL) from quartz samples. These experiments provide valuable information on the charge movement in quartz grains. Two...... data yield a value of χ1.2 eV for the work function of quartz. The experimental temperature dependence of the OSE signals is interpreted on the basis of a photo-thermostimulated (PTSEE) process involving the main OSL trap at 320 °C; this process takes place with a thermal assistance energy estimated...... at W(0.29±0.02) eV. Good quantitative agreement is obtained between theory and experiment by assuming a thermal broadening of the thermal depletion factor for the OSL traps, described by a Gaussian distribution of energies....

  7. Improving Readability of an Evaluation Tool for Low-Income Clients Using Visual Information Processing Theories

    Science.gov (United States)

    Townsend, Marilyn S.; Sylva, Kathryn; Martin, Anna; Metz, Diane; Wooten-Swanson, Patti

    2008-01-01

    Literacy is an issue for many low-income audiences. Using visual information processing theories, the goal was improving readability of a food behavior checklist and ultimately improving its ability to accurately capture existing changes in dietary behaviors. Using group interviews, low-income clients (n = 18) evaluated 4 visual styles. The text…

  8. Capability-oriented agent theory and its applications in dependable systems and process engineering

    Energy Technology Data Exchange (ETDEWEB)

    Thunem, Atoosa P-J.

    2004-04-15

    During the rapid growth of computerised systems over the past 15 years, the variety of services and their efficiency have been the strongest deciding factors in the design and development of systems within various industrial branches. At the same time, the introduction and popularity of emerging design and development techniques seem to have forced industry to include these in its product development process. Unfortunately, too many examples of lack of use or erroneous use of these techniques within industries such as telecommunications, telemedicine, aerospace, and indeed the energy sector indicate that a common understanding of, and belief in, the rationale behind the techniques and their solution domains has not been obtained. At the same time, a tremendous increase in the number of emerging techniques has made such an understanding difficult to gain, especially when the techniques share the same application field but differ in a few yet important issues. Finally, the lack of knowledge about system aspects and about the integration of various abstraction levels to describe them has added even more to the confusion on how to use different techniques. The work resulting in the Capability-Oriented Agent Theory began while trying to find more descriptive system models, taking into account a wider selection of system aspects. Although related to object-oriented and agent-oriented principles, the theory differs from such principles in many respects. Among other things, its focal point is a category of system aspects neither addressed nor recognised within such principles before. Additionally, the theory opposes the well-established idea of a distinct separation between requirement, design, implementation, and test specifications, suggesting instead a systematic integration of the related activities so as to increase their traceability and intercommunication in both a top-down and a bottom-up manner along the development process. (Author)

  9. Chasing the Mirage: a grounded theory of the clinical reasoning processes that Registered Nurses use to recognize delirium.

    Science.gov (United States)

    El Hussein, Mohamed; Hirst, Sandra

    2016-02-01

    The aim of this study was to construct a grounded theory that explains the clinical reasoning processes that registered nurses use to recognize delirium in older adults in acute care hospitals. Delirium is under-recognized in acute hospital settings; this may stem from underdeveloped clinical reasoning processes. Little is known about registered nurses' (RNs) clinical reasoning processes in complex situations such as delirium recognition. A grounded theory approach was used to analyse interview data about the clinical reasoning processes of RNs in acute hospital settings. Seventeen RNs were recruited. Concurrent data collection, comparative analysis, and theoretical sampling were conducted in 2013-2014. The core category to emerge from the data was 'chasing the mirage', which describes RNs' clinical reasoning processes to recognize delirium during their interactions with older adults. Understanding the reasoning that contributes to delirium under-recognition provides a strategy by which this problem can be brought to the forefront of RNs' awareness and intervention. Delirium recognition will contribute to quality care for older adults. © 2015 John Wiley & Sons Ltd.

  10. Advanced multilateration theory, software development, and data processing: The MICRODOT system

    Science.gov (United States)

    Escobal, P. R.; Gallagher, J. F.; Vonroos, O. H.

    1976-01-01

    The process of geometric parameter estimation to accuracies of one centimeter, i.e., multilateration, is defined and applications are listed. A brief functional explanation of the theory is presented. Next, various multilateration systems are described in order of increasing system complexity. Expected systems accuracy is discussed from a general point of view and a summary of the errors is listed. An outline of the design of a software processing system for multilateration, called MICRODOT, is presented next. The links of this software, which can be used for multilateration data simulations or operational data reduction, are examined on an individual basis. Functional flow diagrams are presented to aid in understanding the software capability. MICRODOT capability is described with respect to vehicle configurations, interstation coordinate reduction, geophysical parameter estimation, and orbit determination. Numerical results obtained from MICRODOT via data simulations are displayed both for hypothetical and real world vehicle/station configurations such as used in the GEOS-3 Project. These simulations show the inherent power of the multilateration procedure.

  11. High-energy, large-momentum-transfer processes: Ladder diagrams in φ³ theory

    International Nuclear Information System (INIS)

    Newton, C.L.J.

    1990-01-01

    Relativistic quantum field theories may help one to understand high-energy, large-momentum-transfer processes, where the center-of-mass energy is much larger than the transverse momentum transfers, which are in turn much larger than the masses of the participating particles. With this possibility in mind, the author studies ladder diagrams in φ³ theory. He shows that in the limit s ≫ |t| ≫ m², the scattering amplitude for the N-rung ladder diagram takes the form s^{-1}|t|^{-N+1} times a homogeneous polynomial of degree 2N − 2 in ln s and ln |t|. This polynomial takes different forms depending on the relation of ln |t| to ln s. More precisely, the asymptotic formula for the N-rung ladder diagram has points of non-analyticity when ln |t| = γ ln s for γ = 1/2, 1/3, …, 1/(N−2).

  12. Mathematical theory of peer-instruction dynamics

    Directory of Open Access Journals (Sweden)

    Hideo Nitta

    2010-08-01

    A mathematical theory of peer instruction describing the increase of the normalized number of correct answers due to peer discussion is presented. A simple analytic expression is derived which agrees with class data. It is shown that our theory is connected to the mathematical learning models proposed by Pritchard et al. It is also shown that the obtained theoretical lines are useful for analyzing peer-instruction efficiencies.

  13. Decision making using AHP (Analytic Hierarchy Process) and fuzzy set theory in waste management

    International Nuclear Information System (INIS)

    Chung, J.Y.; Lee, K.J.; Kim, C.D.

    1995-01-01

    The major problem is how to consider the differences in opinions when many experts are involved in a decision-making process. This paper provides a simple general methodology to treat such differences. The authors determined the grades of membership through a process of magnitude estimation derived from pairwise comparisons and the AHP developed by Saaty. They used fuzzy set theory to consider the differences in opinions and to obtain the priorities for each alternative. An example that can be applied to radioactive waste management is also presented. The result shows good agreement with the results of averaging methods.
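
    As a concrete illustration of the AHP step, the sketch below derives a priority vector from a reciprocal pairwise comparison matrix using the common geometric-mean approximation to Saaty's principal eigenvector; the matrix entries are illustrative, not taken from the paper.

      # Row geometric means of a reciprocal comparison matrix, normalized to
      # sum to one, approximate Saaty's principal-eigenvector priorities.
      import numpy as np

      def ahp_priorities(pairwise):
          gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[0])
          return gm / gm.sum()

      A = np.array([[1.0,   3.0,   5.0],
                    [1/3.0, 1.0,   2.0],
                    [1/5.0, 1/2.0, 1.0]])
      print(ahp_priorities(A))  # roughly [0.65, 0.23, 0.12]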

  14. Cognitive levels of performance account for hemispheric lateralisation effects in dyslexic and normally reading children.

    Science.gov (United States)

    Heim, Stefan; Grande, Marion; Meffert, Elisabeth; Eickhoff, Simon B; Schreiber, Helen; Kukolja, Juraj; Shah, Nadim Jon; Huber, Walter; Amunts, Katrin

    2010-12-01

    Recent theories of developmental dyslexia explain reading deficits in terms of deficient phonological awareness, attention, visual and auditory processing, or automaticity. Since dyslexia has a neurobiological basis, the question arises how the reader's proficiency in these cognitive variables affects the brain regions involved in visual word recognition. This question was addressed in two fMRI experiments with 19 normally reading children (Experiment 1) and 19 children with dyslexia (Experiment 2). First, reading-specific brain activation was assessed by contrasting the BOLD signal for reading aloud words vs. overtly naming pictures of real objects. Next, ANCOVAs were performed on brain activation during reading, with the individuals' scores for all five cognitive variables, assessed outside the scanner, as covariates. Whereas the normal readers' brain activation during reading showed co-variation effects predominantly in the right hemisphere, the reverse pattern was observed for the dyslexics. In particular, the middle frontal gyrus, inferior parietal cortex, and precuneus showed contralateral effects for controls as compared to dyslexics. In line with earlier findings in the literature, these data hint at a global change in hemispheric asymmetry during cognitive processing in dyslexic readers, which, in turn, might affect reading proficiency. Copyright © 2010 Elsevier Inc. All rights reserved.

  15. [Introduction to grounded theory].

    Science.gov (United States)

    Wang, Shou-Yu; Windsor, Carol; Yates, Patsy

    2012-02-01

    Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The theory is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful in investigating complex issues and behaviours not previously addressed and concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective coding levels. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.

  16. From topological quantum field theories to supersymmetric gauge theories

    Energy Technology Data Exchange (ETDEWEB)

    Bossard, G

    2007-10-15

    This thesis contains two parts based on scientific contributions that have led to two series of publications. The first concerns the introduction of vector symmetry in cohomological theories through a generalization of the so-called Baulieu-Singer equation. Together with the topological BRST (Becchi-Rouet-Stora-Tyutin) operator, this symmetry gives an off-shell closed subsector of supersymmetry that permits the action to be determined uniquely. The second part proposes a methodology for renormalizing supersymmetric Yang-Mills theory without assuming a regularization scheme that preserves both supersymmetry and gauge invariance. The renormalization prescription is derived thanks to the definition of two consistent Slavnov-Taylor operators for supersymmetry and gauge invariance, whose construction requires the introduction of the so-called shadow fields. We demonstrate the renormalizability of supersymmetric Yang-Mills theories. We give a fully consistent, regularization-scheme-independent proof of the vanishing of the β function and of the anomalous dimensions of the one-half BPS operators in maximally supersymmetric Yang-Mills theory. After a short introduction, in chapter two we give a review of the cohomological Yang-Mills theory in eight dimensions. We then study its dimensional reductions in seven and six dimensions. The last chapter gives quite independent results about a geometrical interpretation of the shadow fields, an unpublished work about topological gravity in four dimensions, an extension of the shadow formalism to superconformal invariance, and finally the solution of the constraints in a twisted superspace. (author)

  17. Motivation for aggressive religious radicalization: goal regulation theory and a personality × threat × affordance hypothesis

    OpenAIRE

    McGregor, Ian; Hayes, Joseph; Prentice, Mike

    2015-01-01

    A new set of hypotheses is presented regarding the cause of aggressive religious radicalization (ARR). It is grounded in classic and contemporary theory of human motivation and goal regulation, together with recent empirical advances in personality, social, and neurophysiological psychology. We specify personality traits, threats, and group affordances that combine to divert normal motivational processes toward ARR. Conducive personality traits are oppositional, anxiety-prone, and identity-we...

  19. Glomerular epithelial foot processes in normal man and rats. Distribution of true width and its intra- and inter-individual variation.

    Science.gov (United States)

    Gundersen, H J; Seefeldt, T; Osterby, R

    1980-01-01

    The width of individual glomerular epithelial foot processes appears very different on electron micrographs. A method for obtaining distributions of the true width of foot processes from that of their apparent width on electron micrographs has been developed, based on geometric probability theory pertaining to a specific geometric model. Analyses of foot process width in humans and rats show a remarkable inter-individual invariance, implying rigid control and therefore great biological significance of foot process width or a derivative thereof. The very low inter-individual variation of the true width, shown in the present paper, makes it possible to demonstrate slight changes in rather small groups of patients or experimental animals.
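
    The unfolding idea behind the method can be illustrated with a toy Monte Carlo: a process of fixed true width looks wider on a section cut at a random angle. The 1/sin(φ) projection model and the numbers below are illustrative assumptions, not the paper's specific geometric model.

      # Toy model: apparent width = true width / sin(phi) for a random
      # sectioning angle phi; the apparent-width distribution is biased wide,
      # so the true width must be recovered by "unfolding" the distribution.
      import numpy as np

      rng = np.random.default_rng(1)
      true_width = 0.35                           # micrometres, illustrative
      phi = rng.uniform(1e-3, np.pi / 2, 50_000)  # random sectioning angles
      apparent = true_width / np.sin(phi)

      print("true width:", true_width)
      print("median apparent width:", np.median(apparent))  # exceeds true width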

  20. Forests as Patrimonies? From Theory to Tangible Processes at Various Scales

    Directory of Open Access Journals (Sweden)

    Genevieve Michon

    2012-09-01

    Among theoretical fields addressing the conceptualization of interrelationships between nature and society, patrimonial approaches remain relatively unexplored. Stressing the multiplication of local dynamics in which elements of nature are redefined as "patrimonies" (ranging from local patrimonies to world heritage) by various social groups, this conceptual field tries to qualify these dynamics and their determinants in order to understand how they allow us to better address contemporary environmental challenges. Through a multidisciplinary approach in the social sciences, centered on rural forests, we analyze the multiplication of patrimonial processes in forest development at various scales. First, we elaborate on the concept of patrimony and on patrimonial processes, and we present the current construction and dynamics of forest patrimonies. A crucial question concerns the links that form between the many spatial-temporal levels where these processes develop. Moreover, these patrimonial processes can be quite divergent, not only across scales from local to global, but also between "endogenous" (or bottom-up) and "exogenous" (or top-down) processes. We present two detailed examples in Morocco and Sumatra, where patrimonial constructions are developing simultaneously at various scales and through various actors who treat the forest in very different ways. Drawing from these examples, we discuss how and why the simultaneous development of different, often overlapping, patrimonial constructions along these scales allows collaboration or, conversely, can lead their holders into conflict. Lastly, we discuss the contribution of patrimonial concepts to resilience thinking and social-ecological systems theory.

  1. Minimal theory of massive gravity

    Directory of Open Access Journals (Sweden)

    Antonio De Felice

    2016-01-01

    We propose a new theory of massive gravity with only two propagating degrees of freedom. While the homogeneous and isotropic background cosmology and the tensor linear perturbations around it are described by exactly the same equations as those in the de Rham–Gabadadze–Tolley (dRGT) massive gravity, the scalar and vector gravitational degrees of freedom are absent in the new theory at the fully nonlinear level. Hence the new theory provides a stable nonlinear completion of the self-accelerating cosmological solution that was originally found in the dRGT theory. The cosmological solution in the other branch, often called the normal branch, is also rendered stable in the new theory and, for the first time, makes it possible to realize an effective equation-of-state parameter different from (either larger or smaller than) −1 without introducing any extra degrees of freedom.

  2. Structural and functional perspectives on classification and seriation in psychotic and normal children.

    Science.gov (United States)

    Breslow, L; Cowan, P A

    1984-02-01

    This study describes a strategy for examining cognitive functioning in psychotic and normal children without the usual confounding effects of marked differences in cognitive structure that occur when children of the same age are compared. Participants were 14 psychotic children, 12 males and 2 females, mean age 9-2, matched with normal children at preoperational and concrete operational stage levels on a set of Piagetian classification tasks. The mean age of the normal children was 6-4, replicating the usually found developmental delay in psychotic samples. Participants were then compared on both structural level and functional abilities on a set of tasks involving seriation of sticks; the higher-level children were also administered a seriation drawing task. Analysis of children's processes of seriating and seriation drawings indicated that over and above the structural retardation, psychotic children at all levels showed functional deficits, especially in the use of anticipatory imagery. The implications for general developmental theory are that progress in structural development is not sufficient for imaginal development, and that structural development of logical concepts is relatively independent of the development of imagery. It was suggested that "thought disorder" may not be a disordered structure of thinking or a retardation in psychotic populations but rather a mismatch between higher-level logical structures and lower-level functions.

  3. Adolescent Perspectives Following Ostomy Surgery: A Grounded Theory Study.

    Science.gov (United States)

    Mohr, Lynn D; Hamilton, Rebekah J

    2016-01-01

    The purpose of this study was to provide a theoretical account of how adolescents aged 13 to 18 years process the experience of having an ostomy. This was a qualitative study using a grounded theory design. The sample comprised 12 English-speaking adolescents aged 13-18 years: 10 with an ostomy and 2 with medical management of their disease. Respondents completed audio-recorded interviews that were transcribed verbatim. Data were analyzed using the constant comparative method until data saturation occurred. Dedoose, a Web-based qualitative methods management tool, was used to capture major themes arising from the data. Study results indicate that for adolescents between 13 and 18 years of age, processing the experience of having an ostomy includes concepts of the "physical self" and "social self", with the goal of "normalizing". Subcategories of the physical self include (a) changing reality, (b) learning, and (c) adapting. Subcategories of the social self include (a) reentering and (b) disclosing. This study sheds light on how adolescents process the experience of having an ostomy and how health care providers can assist adolescents to move through the process to get back to their desired "normal" state. Health care providers can facilitate the adolescent through the ostomy experience by being proactive in conversations not only about care issues but also about school and family concerns and spirituality. Further research is needed to understand how parents process their adolescents' ostomy surgery experience and how spirituality assists adolescents in coping with and adjusting to body-altering events.

  4. A phenomenological theory of transient creep

    International Nuclear Information System (INIS)

    Ajaja, O.; Ardell, A.J.

    1979-01-01

    A new creep theory is proposed which takes into account the strain generated during the annihilation of dislocations. This contribution is found to be very significant when recovery is appreciable, and is mainly responsible for the decreasing creep rate associated with the normal primary creep of class II materials. The theory provides excellent semiquantitative rationalization for the types of creep curves presented in the preceding paper. In particular, the theory predicts a change in the shape of the primary creep curve from normal to inverted as recovery becomes less important, i.e. as the applied stress and/or temperature decrease(s). It also predicts a minimum creep rate under certain circumstances, hence pseudo-tertiary behaviour. These different types of creep curves are predicted even though the net dislocation density decreases monotonically with time in all cases. Qualitative rationalization is presented for the inverted transient which always follows a stress drop in class II materials, as well as for the inverted primary and sigmoidal creep behaviour of class I solid solutions. (author)

  5. Immediate Survival Focus: Synthesizing Life History Theory and Dual Process Models to Explain Substance Use

    Directory of Open Access Journals (Sweden)

    George B. Richardson

    2012-10-01

    Researchers have recently applied evolutionary life history theory to the understanding of behaviors often conceived of as prosocial or antisocial. In addition, researchers have applied cognitive science to the understanding of substance use and used dual process models, where explicit cognitive processes are modeled as relatively distinct from implicit cognitive processes, to explain and predict substance use behaviors. In this paper we synthesized these two theoretical perspectives to produce an adaptive and cognitive framework for explaining substance use. We contend that this framework provides new insights into the nature of substance use that may be valuable for both clinicians and researchers.

  6. Dynamical Systems Theory: Application to Pedagogy

    Science.gov (United States)

    Abraham, Jane L.

    Theories of learning affect how cognition is viewed, and this subsequently leads to the style of pedagogical practice that is used in education. Traditionally, educators have relied on a variety of theories on which to base pedagogy. Behavioral learning theories influenced the teaching/learning process for over 50 years. In the 1960s, the information processing approach brought the mind back into the learning process. The current emphasis on constructivism integrates the views of Piaget, Vygotsky, and cognitive psychology. Additionally, recent scientific advances have allowed researchers to shift attention to biological processes in cognition. The problem is that these theories do not provide an integrated approach to understanding principles responsible for differences among students in cognitive development and learning ability. Dynamical systems theory offers a unifying theoretical framework to explain the wider context in which learning takes place and the processes involved in individual learning. This paper describes how principles of Dynamic Systems Theory can be applied to cognitive processes of students, the classroom community, motivation to learn, and the teaching/learning dynamic giving educational psychologists a framework for research and pedagogy.

  7. Preliminary Process Theory does not validate the Comparison Question Test: A comment on Palmatier and Rovner

    NARCIS (Netherlands)

    Ben-Shakar, G.; Gamer, M.; Iacono, W.; Meijer, E.; Verschuere, B.

    2015-01-01

    Palmatier and Rovner (2015) attempt to establish the construct validity of the Comparison Question Test (CQT) by citing extensive research ranging from modern neuroscience to memory and psychophysiology. In this comment we argue that merely citing studies on the preliminary process theory (PPT) of

  8. Normalization in Lie algebras via mould calculus and applications

    Science.gov (United States)

    Paul, Thierry; Sauzin, David

    2017-11-01

    We establish Écalle's mould calculus in an abstract Lie-theoretic setting and use it to solve a normalization problem, which covers several formal normal form problems in the theory of dynamical systems. The mould formalism allows us to reduce the Lie-theoretic problem to a mould equation, the solutions of which are remarkably explicit and can be fully described by means of a gauge transformation group. The dynamical applications include the construction of Poincaré-Dulac formal normal forms for a vector field around an equilibrium point, a formal infinite-order multiphase averaging procedure for vector fields with fast angular variables (Hamiltonian or not), or the construction of Birkhoff normal forms both in classical and quantum situations. As a by-product we obtain, in the case of harmonic oscillators, the convergence of the quantum Birkhoff form to the classical one, without any Diophantine hypothesis on the frequencies of the unperturbed Hamiltonians.
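    For orientation, the classical Poincaré-Dulac setting that such normalization problems generalize is sketched below; this is standard background in our notation, not the paper's mould formalism.

    ```latex
    % A formal vector field near an equilibrium, with A = diag(\lambda_1, \dots, \lambda_n):
    \dot{x} = A x + f(x), \qquad f(x) = O(|x|^2).
    % A formal change of variables x = y + h(y) conjugates it to the normal form
    \dot{y} = A y + g(y),
    % where g retains only resonant monomials y^m e_j (|m| \ge 2), i.e. those with
    \lambda_j = \sum_{k=1}^{n} m_k \lambda_k .
    ```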

  9. Threads: theory of Vygotsky to learning processes and child development in early childhood education mediated by toy construction

    Directory of Open Access Journals (Sweden)

    Fábio Tadeu Reina

    2016-05-01

    Full Text Available This article aims to point out some threads of Vygotsky's cultural-historical theory that bear on the processes of learning and development of children in early childhood education mediated by the construction of toys and games. In this direction, it approaches the foundations of this theory so as to ground the reader in Vygotsky's work, in search of reflections and readings on the theme proposed here.

  10. On electromagnetic forming processes in finitely strained solids: Theory and examples

    Science.gov (United States)

    Thomas, J. D.; Triantafyllidis, N.

    2009-08-01

    The process of electromagnetic forming (EMF) is a high velocity manufacturing technique that uses electromagnetic (Lorentz) body forces to shape sheet metal parts. EMF holds several advantages over conventional forming techniques: speed, repeatability, one-sided tooling, and most importantly considerable ductility increase in several metals. Current modeling techniques for EMF processes are not based on coupled variational principles to simultaneously account for electromagnetic and mechanical effects. Typically, separate solutions to the electromagnetic (Maxwell) and motion (Newton) equations are combined in staggered or lock-step methods, sequentially solving the mechanical and electromagnetic problems. The present work addresses these issues by introducing a fully coupled Lagrangian (reference configuration) least-action variational principle, involving magnetic flux and electric potentials and the displacement field as independent variables. The corresponding Euler-Lagrange equations are Maxwell's and Newton's equations in the reference configuration, which are shown to coincide with their current configuration counterparts obtained independently by a direct approach. The general theory is subsequently simplified for EMF processes by considering the eddy current approximation. Next, an application is presented for axisymmetric EMF problems. It is shown that the proposed variational principle forms the basis of a variational integration numerical scheme that provides an efficient staggered solution algorithm. As an illustration a number of such processes are simulated, inspired by recent experiments of freely expanding uncoated and polyurea-coated aluminum tubes.
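    As background on the eddy current approximation invoked above, the standard magnetoquasistatic field equations and Lorentz body force are sketched below; this is textbook material, not the paper's variational formulation.

    ```latex
    % Neglect the displacement current \partial D/\partial t in Ampère's law:
    \nabla \times H = J, \qquad
    \nabla \times E = -\frac{\partial B}{\partial t}, \qquad
    \nabla \cdot B = 0,
    % with Ohm's law in a conductor moving at velocity v, and the body force
    % that drives the forming process:
    J = \sigma\,(E + v \times B), \qquad f = J \times B .
    ```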

  11. Origin and evolution of the free radical theory of aging: a brief personal history, 1954–2009.

    Science.gov (United States)

    Harman, Denham

    2009-12-01

    Aging is the progressive accumulation in an organism of diverse, deleterious changes with time that increase the chance of disease and death. The basic chemical process underlying aging was first advanced by the free radical theory of aging (FRTA) in 1954: the reaction of active free radicals, normally produced in organisms, with cellular constituents initiates the changes associated with aging. The involvement of free radicals in aging is related to their key role in the origin and evolution of life. The initially low acceptance of the FRTA by the scientific community, and the slow growth in its acceptance manifested in meetings and occasional papers based on the theory, prompted this account of the theory's intermittent progress over the past nearly 55 years.

  12. [Low level auditory skills compared to writing skills in school children attending third and fourth grade: evidence for the rapid auditory processing deficit theory?].

    Science.gov (United States)

    Ptok, M; Meisen, R

    2008-01-01

    The rapid auditory processing deficit theory holds that impaired reading/writing skills are not caused exclusively by a cognitive deficit specific to the representation and processing of speech sounds, but arise from sensory, mainly auditory, deficits. To further explore this theory we compared different measures of auditory low-level skills with writing skills in school children. Design: prospective study. Participants: school children attending third and fourth grade. Measures: just noticeable differences for intensity and frequency (JNDI, JNDF), gap detection (GD), monaural and binaural temporal order judgement (TOJm and TOJb); grades in writing, language and mathematics. Analysis: correlation analysis. Results: no relevant correlation was found between any auditory low-level processing variable and writing skills. These data do not support the rapid auditory processing deficit theory.

  13. Multiscale System Theory

    Science.gov (United States)

    1990-02-21

    LIDS-P-1953: Multiscale System Theory. Albert Benveniste (IRISA-INRIA, Campus de Beaulieu, 35042 Rennes Cedex, France); Ramine Nikoukhah (INRIA). ... the development of a corresponding system theory and a theory of stochastic processes and their estimation. The research presented in this and several ...

  14. Adding Theoretical Grounding to Grounded Theory: Toward Multi-Grounded Theory

    OpenAIRE

    Göran Goldkuhl; Stefan Cronholm

    2010-01-01

    The purpose of this paper is to challenge some of the cornerstones of the grounded theory approach and propose an extended and alternative approach for data analysis and theory development, which the authors call multi-grounded theory (MGT). A multi-grounded theory is not only empirically grounded; it is also grounded in other ways. Three different grounding processes are acknowledged: theoretical, empirical, and internal grounding. The authors go beyond the pure inductivist approach in GT an...

  15. A suicidal recovery theory to guide individuals on their healing and recovering process following a suicide attempt.

    Science.gov (United States)

    Sun, Fan-Ko; Long, Ann

    2013-09-01

    To develop a theory to guide the recovery process following a recent suicide attempt. Suicide is one of the 10 leading causes of death in many countries. Many nations have set targets to reduce the high incidence of suicide, aiming both to prevent people from taking their own lives and to provide care that promotes the healing of those who attempt suicide. A qualitative grounded theory approach was used. Data were collected in 2011-2012 in a Taiwanese hospital until data saturation occurred. Twenty participants were interviewed, comprising patients who had recovered from suicide attempts (N = 14) and their caregivers (N = 6). Data were analysed using open, axial, and selective coding and the constant comparison technique. A substantive theory was formulated to guide the recovery process of people who have recently attempted suicide. The core category that emerged from the data was 'Striving to accept the value of self-in-existence'. Other key categories linked to and embraced in this core category were: becoming flexible and open-minded, re-building a positive sense of self, and endeavouring to live a peaceful and contented life. Nurses could use this theory as a theoretical framework to guide people recovering from a suicide attempt by affording them the opportunity to grow and heal, facilitating the re-building of a positive sense of self, acknowledging the uncertainties of life, and inspiring hope. © 2013 Blackwell Publishing Ltd.

  16. Performance Feedback Processing Is Positively Biased As Predicted by Attribution Theory.

    Directory of Open Access Journals (Sweden)

    Christoph W Korn

    incorrect performance alone could not explain the observed positivity bias. Furthermore, participants' behavior in our task was linked to the most widely used measure of attribution style. In sum, our findings suggest that positive and negative performance feedback influences the evaluation of task-related stimuli, as predicted by attribution theory. Therefore, our study points to the relevance of attribution theory for feedback processing in decision-making and provides a novel outlook for decision-making biases.

  17. Performance Feedback Processing Is Positively Biased As Predicted by Attribution Theory.

    Science.gov (United States)

    Korn, Christoph W; Rosenblau, Gabriela; Rodriguez Buritica, Julia M; Heekeren, Hauke R

    2016-01-01

    alone could not explain the observed positivity bias. Furthermore, participants' behavior in our task was linked to the most widely used measure of attribution style. In sum, our findings suggest that positive and negative performance feedback influences the evaluation of task-related stimuli, as predicted by attribution theory. Therefore, our study points to the relevance of attribution theory for feedback processing in decision-making and provides a novel outlook for decision-making biases.

  18. A Theory of Information Quality and a Framework for its Implementation in the Requirements Engineering Process

    Science.gov (United States)

    Grenn, Michael W.

    This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q, which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy H_R is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process. The information I increases as H_R and uncertainty decrease, and the change in information ΔI needed
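    As an illustration only: the abstract gives no explicit formulas, so the following assumes Bose-Einstein-style counting of R indistinguishable requirements over N quality levels and a Boltzmann-style entropy; the quantities below are a hypothetical reading, not taken from the source.

    ```latex
    % Hypothetical reading, not from the source:
    % arrangements of R requirements among N quality levels,
    P = \binom{R + N - 1}{R},
    % a Boltzmann-style requirements entropy,
    H_R = \log_2 P,
    % and the information gained as engineering effort drives quality up
    % and entropy down,
    \Delta I = H_R^{\mathrm{initial}} - H_R^{\mathrm{final}} .
    ```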

  19. The Emergence of the Teaching/Learning Process in Preschoolers: Theory of Mind and Age Effect

    Science.gov (United States)

    Bensalah, Leila

    2011-01-01

    This study analysed the gradual emergence of the teaching/learning process by examining theory of mind (ToM) acquisition and age effects in the preschool period. We observed five dyads performing a jigsaw task drawn from a previous study. Three stages were identified. In the first one, the teacher focuses on the execution of her/his own task…

  20. Stochastic Processes and Queueing Theory used in Cloud Computer Performance Simulations

    Directory of Open Access Journals (Sweden)

    Florin-Catalin ENACHE

    2015-10-01

    Full Text Available The cloud business has grown exponentially over the last five years. Capacity managers need a practical way to simulate the random demand a cloud infrastructure may face, even though there are not many mathematical tools for simulating such demand. This paper presents an introduction to the most important stochastic processes and queueing theory concepts used for modeling computer performance. Moreover, it shows the cases where such concepts are and are not applicable, using clear programming examples of how to simulate a queue, and of how to use and validate a simulation when there are no mathematical concepts to back it up.
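    In that spirit, a minimal sketch (not the paper's code) of the kind of simulation the authors describe: an M/M/1 queue with Poisson arrivals and exponential service, validated against the analytic mean waiting time W_q = ρ/(μ − λ).

    ```python
    # Minimal M/M/1 sketch, assuming Poisson arrivals (rate lam) and
    # exponential service (rate mu); not the paper's code.
    import random

    def simulate_mm1(lam, mu, num_customers, seed=42):
        """Return the mean time customers spend waiting in the queue."""
        rng = random.Random(seed)
        arrival = 0.0          # arrival time of the current customer
        server_free_at = 0.0   # when the server finishes the previous customer
        total_wait = 0.0
        for _ in range(num_customers):
            arrival += rng.expovariate(lam)               # next Poisson arrival
            start = max(arrival, server_free_at)          # wait if the server is busy
            total_wait += start - arrival
            server_free_at = start + rng.expovariate(mu)  # exponential service time
        return total_wait / num_customers

    if __name__ == "__main__":
        lam, mu = 0.8, 1.0                        # utilisation rho = 0.8
        simulated = simulate_mm1(lam, mu, 200_000)
        analytic = (lam / mu) / (mu - lam)        # W_q for M/M/1; 4.0 here
        print(f"simulated mean wait: {simulated:.3f}, analytic: {analytic:.3f}")
    ```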