WorldWideScience

Sample records for normalization process theory

  1. Development of a theory of implementation and integration: Normalization Process Theory

    Directory of Open Access Journals (Sweden)

    May Carl R

    2009-05-01

    Abstract. Background: Theories are important tools in the social and natural sciences. The methods by which they are derived are rarely described and discussed. Normalization Process Theory explains how new technologies, ways of acting, and ways of working become routinely embedded in everyday practice, and has applications in the study of implementation processes. This paper describes the process by which it was built. Methods: Between 1998 and 2008, we developed the theory. We derived a set of empirical generalizations from analysis of data collected in qualitative studies of healthcare work and organization. We developed an applied theoretical model through analysis of the empirical generalizations. Finally, we built a formal theory through a process of extension and implication analysis of the applied theoretical model. Results: Each phase of theory development showed that the constructs of the theory did not conflict with each other, had explanatory power, and possessed sufficient robustness for formal testing. As the theory developed, its scope expanded from a set of observed regularities in data with procedural explanations, to an applied theoretical model, to a formal middle-range theory. Conclusion: Normalization Process Theory has been developed through procedures that were properly sceptical and critical, and which were open to review at each stage of development. The theory has been shown to merit formal testing.

  2. Overcoming the Problem of Embedding Change in Educational Organizations: A Perspective from Normalization Process Theory

    Science.gov (United States)

    Wood, Phil

    2017-01-01

    In this article, I begin by outlining some of the barriers which constrain sustainable organizational change in schools and universities. I then go on to introduce a theory which has already started to help explain complex change and innovation processes in health and care contexts, Normalization Process Theory. Finally, I consider what this…

  3. Process evaluation of discharge planning implementation in healthcare using normalization process theory.

    Science.gov (United States)

    Nordmark, Sofi; Zingmark, Karin; Lindberg, Inger

    2016-04-27

    Discharge planning is a care process that aims to secure the transfer of care for the patient at the transition from home to hospital and back home. Information exchange and collaboration between care providers are essential, but deficits are common. A wide range of initiatives to improve the discharge planning process have been developed and implemented over the past three decades. However, there are still high rates of reported medical errors and adverse events related to failures in the discharge planning process. Theoretical frameworks such as Normalization Process Theory (NPT) can support evaluations of complex interventions and processes in healthcare. The aim of this study was to explore the embedding and integration of the discharge planning process (DPP) from the perspective of registered nurses, district nurses and homecare organizers. The study design was explorative, using NPT as a framework to explore the embedding and integration of the DPP. Data consisted of written documentation from: workshops with staff, registered adverse events and system failures, a web-based survey, and individual interviews with staff. Using NPT as a framework to explore the embedding and integration of discharge planning after 10 years in use showed that the staff had reached a consensus on what the process was (coherence) and on how they evaluated the process (reflexive monitoring). However, they had not reached a consensus on who performed the process (cognitive participation) or on how it was performed (collective action). This can be interpreted as meaning that the process had not become normalized in daily practice. The results show the necessity of examining the implementation of existing practices, in order to better understand their needs, before developing and implementing new practices or supportive tools within healthcare, if development is to achieve its aims and implementation is to be sustainable. The NPT offers a generalizable framework for analysis, which can explain and shape the…

  4. A qualitative systematic review of studies using the normalization process theory to research implementation processes.

    Science.gov (United States)

    McEvoy, Rachel; Ballini, Luciana; Maltoni, Susanna; O'Donnell, Catherine A; Mair, Frances S; Macfarlane, Anne

    2014-01-02

    There is a well-recognized need for greater use of theory to address research translational gaps. Normalization Process Theory (NPT) provides a set of sociological tools to understand and explain the social processes through which new or modified practices of thinking, enacting, and organizing work are implemented, embedded, and integrated in healthcare and other organizational settings. This review of NPT offers readers the opportunity to observe how, and in what areas, a particular theoretical approach to implementation is being used. In this article we review the literature on NPT in order to understand what interventions NPT is being used to analyze, how NPT is being operationalized, and the reported benefits, if any, of using NPT. Using a framework analysis approach, we conducted a qualitative systematic review of peer-reviewed literature using NPT. We searched 12 electronic databases and all citations linked to six key NPT development papers. Grey literature and unpublished studies were not sought. Searches were limited to English-language papers in healthcare settings published between 2006 and June 2012. Twenty-nine articles met the inclusion criteria; in the main, NPT is being applied to qualitatively analyze a diverse range of complex interventions, many beyond its original field of e-health and telehealth. The NPT constructs have high stability across settings and, notwithstanding challenges in applying NPT in terms of managing overlaps between constructs, there is evidence that it is a beneficial heuristic device to explain and guide implementation processes. NPT offers a generalizable framework that can be applied across contexts with opportunities for incremental knowledge gain over time and an explicit framework for analysis, which can explain and potentially shape implementation processes. This is the first review of NPT in use and it generates an impetus for further and extended use of NPT. We recommend that in future NPT research, authors should explicate…

  5. Effect of care management program structure on implementation: a normalization process theory analysis.

    Science.gov (United States)

    Holtrop, Jodi Summers; Potworowski, Georges; Fitzpatrick, Laurie; Kowalk, Amy; Green, Lee A

    2016-08-15

    Care management in primary care can be effective in helping patients with chronic disease improve their health status; however, primary care practices are often challenged with implementation. Further, there are different ways to structure care management that may make implementation more or less successful. Normalization process theory (NPT) provides a means of understanding how a new complex intervention can become routine (normalized) in practice. In this study, we used NPT to understand how care management structure affected how well care management became routine in practice. Data collection involved semi-structured interviews and observations conducted at 25 practices in five physician organizations in Michigan, USA. Practices were selected to reflect variation in physician organizations, type of care management program, and degree of normalization. Data were transcribed, qualitatively coded and analyzed, initially using an editing approach and then a template approach with NPT as a guiding framework. Seventy interviews and 25 observations were completed. Two key structures for care management organization emerged: practice-based care management, where the care managers were embedded in the practice as part of the practice team; and centralized care management, where the care managers worked independently of the practice work flow and were located outside the practice. There were differences in normalization of care management across practices. Practice-based care management was generally better normalized than centralized care management. Differences in normalization were well explained by the NPT, and in particular the collective action construct. When care managers had multiple and flexible opportunities for communication (interactional workability), had the requisite knowledge, skills, and personal characteristics (skill set workability), and the organizational support and resources (contextual integration), a trusting professional relationship…

  6. Theory of normal metals

    International Nuclear Information System (INIS)

    Mahan, G.D.

    1992-01-01

    The organizers requested that I give eight lectures on the theory of normal metals, "with an eye on superconductivity." My job was to cover the general properties of metals. The topics were selected according to what the students would need to know for the following lectures on superconductivity. My role was to prepare the groundwork for the later lectures. The problem is that there is not yet a widely accepted theory for the mechanism which pairs the electrons. Many mechanisms have been proposed, with those of phonons and spin fluctuations having the most followers. So I tried to discuss both topics. I also introduced the tight-binding model for metals, which forms the basis for most of the work on the cuprate superconductors
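    Although the lecture notes themselves are not reproduced in this record, the tight-binding dispersion alluded to in the last sentence has a standard form; a minimal sketch for a square lattice with nearest-neighbour hopping (the hopping amplitude t, on-site energy ε₀ and lattice constant a are generic symbols, not values from the lectures):

    ```latex
    % Nearest-neighbour tight-binding band on a square lattice,
    % the usual starting point for cuprate band structures.
    \varepsilon(\mathbf{k}) = \varepsilon_0 - 2t\,\bigl[\cos(k_x a) + \cos(k_y a)\bigr]
    ```

    The resulting bandwidth is 8t, and near half filling this band gives the large Fermi surface on which much of the cuprate pairing literature is built.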

  7. Evaluating Complex Interventions and Health Technologies Using Normalization Process Theory: Development of a Simplified Approach and Web-Enabled Toolkit

    LENUS (Irish Health Repository)

    May, Carl R

    2011-09-30

    Abstract. Background: Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual. Methods: Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results: On-line data collection was effective: over a four week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion: Normalization Process Theory has been developed through…

  8. Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit

    Directory of Open Access Journals (Sweden)

    Murray Elizabeth

    2011-09-01

    Abstract. Background: Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users manual. Methods: Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results: On-line data collection was effective: over a four week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion: Normalization Process Theory has been developed through…

  9. Promoting health workers' ownership of infection prevention and control: using Normalization Process Theory as an interpretive framework.

    Science.gov (United States)

    Gould, D J; Hale, R; Waters, E; Allen, D

    2016-12-01

    All health workers should take responsibility for infection prevention and control (IPC). Recent reduction in key reported healthcare-associated infections in the UK is impressive, but the determinants of success are unknown. It is imperative to understand how IPC strategies operate as new challenges arise and threats of antimicrobial resistance increase. The authors undertook a retrospective, independent evaluation of an action plan to enhance IPC and 'ownership' (individual accountability) for IPC introduced throughout a healthcare organization. Twenty purposively selected informants were interviewed. Data were analysed inductively. Normalization Process Theory (NPT) was applied to interpret the findings and explain how the action plan was operating. Six themes emerged through inductive analysis. Theme 1: 'Ability to make sense of ownership' provided evidence of the first element of NPT (coherence). Regardless of occupational group or seniority, informants understood the importance of IPC ownership and described what it entailed. They identified three prerequisites: 'Always being vigilant' (Theme 2), 'Importance of access to information' (Theme 3) and 'Being able to learn together in a no-blame culture' (Theme 4). Data relating to each theme provided evidence of the other elements of NPT that are required to embed change: planning implementation (cognitive participation), undertaking the work necessary to achieve change (collective action), and reflection on what else is needed to promote change as part of continuous quality improvement (reflexive monitoring). Informants identified barriers (e.g. workload) and facilitators (clear lines of communication and expectations for IPC). Eighteen months after implementing the action plan incorporating IPC ownership, there was evidence of continuous service improvement and significant reduction in infection rates. Applying a theory that identifies factors that promote/inhibit routine incorporation ('normalization') of IPC…

  10. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process…
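    To make the OU-based stochastic volatility setting concrete, here is a minimal simulation sketch in Python of the general idea: the spot variance follows a positive Ornstein-Uhlenbeck process driven by jumps, and log-returns are scaled by it. A compound-Poisson subordinator with exponential jumps stands in for the richer MS/NMS driving laws of the paper, and all parameter values are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative parameters, not taken from the paper
    lam = 0.5          # OU decay rate (with the conventional z(lam*t) time change)
    jump_rate = 2.0    # jump intensity of the driving subordinator z
    jump_mean = 0.1    # mean (exponential) jump size
    T, n = 10.0, 10_000
    dt = T / n

    sigma2 = np.empty(n + 1)
    sigma2[0] = jump_rate * jump_mean   # stationary mean of the variance process
    log_price = np.zeros(n + 1)

    for i in range(n):
        # Compound-Poisson increment of the background driving process z(lam*t)
        n_jumps = rng.poisson(jump_rate * lam * dt)
        dz = rng.exponential(jump_mean, n_jumps).sum()
        # OU dynamics for the spot variance: d(sigma^2) = -lam * sigma^2 dt + dz
        sigma2[i + 1] = max(sigma2[i] - lam * sigma2[i] * dt + dz, 0.0)
        # Log-price driven by the stochastic volatility: dX = sigma dW
        log_price[i + 1] = log_price[i] + np.sqrt(sigma2[i] * dt) * rng.standard_normal()

    print(f"sample mean of spot variance: {sigma2.mean():.4f}")
    ```

    The λt time change in the driving process is the usual device that makes the stationary law of the variance independent of the decay rate λ.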

  11. Implementing nutrition guidelines for older people in residential care homes: a qualitative study using Normalization Process Theory

    Directory of Open Access Journals (Sweden)

    Bamford Claire

    2012-10-01

    Abstract. Background: Optimizing the dietary intake of older people can prevent nutritional deficiencies and diet-related diseases, thereby improving quality of life. However, there is evidence that the nutritional intake of older people living in care homes is suboptimal, with high levels of saturated fat, salt, and added sugars. The UK Food Standards Agency therefore developed nutrient- and food-based guidance for residential care homes. The acceptability of these guidelines and their feasibility in practice is unknown. This study used the Normalization Process Theory (NPT) to understand the barriers and facilitators to implementing the guidelines and inform future implementation. Methods: We conducted a process evaluation in five care homes in the north of England using qualitative methods (observation and interviews) to explore the views of managers, care staff, catering staff, and domestic staff. Data were analyzed thematically and discussed in data workshops; emerging themes were then mapped to the constructs of NPT. Results: Many staff perceived the guidelines as unnecessarily restrictive and irrelevant to older people. In terms of NPT, the guidelines simply did not make sense (coherence), and as a result, relatively few staff invested in the guidelines (cognitive participation). Even where staff supported the guidelines, implementation was hampered by a lack of nutritional knowledge and institutional support (collective action). Finally, the absence of observable benefits to clients confirmed the negative preconceptions of many staff, with limited evidence of reappraisal following implementation (reflexive monitoring). Conclusions: The successful implementation of the nutrition guidelines requires that the fundamental issues relating to their perceived value and fit with other priorities and goals be addressed. Specialist support is needed to equip staff with the technical knowledge and skills required for menu analysis and development and to…

  12. Implementing monitoring technologies in care homes for people with dementia: A qualitative exploration using Normalization Process Theory.

    Science.gov (United States)

    Hall, Alex; Wilson, Christine Brown; Stanmore, Emma; Todd, Chris

    2017-07-01

    Ageing societies and a rising prevalence of dementia are associated with increasing demand for care home places. Monitoring technologies (e.g. bed-monitoring systems; wearable location-tracking devices) are appealing to care homes as they may enhance safety, increase resident freedom, and reduce staff burden. However, there are ethical concerns about the use of such technologies, and it is unclear how they might be implemented to deliver their full range of potential benefits. This study explored facilitators and barriers to the implementation of monitoring technologies in care homes. Embedded multiple-case study with qualitative methods. Three dementia-specialist care homes in North-West England. Purposive sample of 24 staff (including registered nurses, clinical specialists, senior managers and care workers), 9 relatives and 9 residents. 36 semi-structured interviews with staff, relatives and residents; 175 h of observation; resident care record review. Data collection informed by Normalization Process Theory, which seeks to account for how novel interventions become routine practice. Data analysed using Framework Analysis. Findings are presented under three main themes: 1. Reasons for using technologies: The primary reason for using monitoring technologies was to enhance safety. This often seemed to override consideration of other potential benefits (e.g. increased resident freedom) or ethical concerns (e.g. resident privacy); 2. Ways in which technologies were implemented: Some staff, relatives and residents were not involved in discussions and decision-making, which seemed to limit understandings of the potential benefits and challenges from the technologies. Involvement of residents appeared particularly challenging. Staff highlighted the importance of training, but staff training appeared mainly informal, which did not seem sufficient to ensure that staff fully understood the technologies; 3. Use of technologies in practice: Technologies generated frequent…

  13. Normal form theory and spectral sequences

    OpenAIRE

    Sanders, Jan A.

    2003-01-01

    The concept of unique normal form is formulated in terms of a spectral sequence. As an illustration of this technique some results of Baider and Churchill concerning the normal form of the anharmonic oscillator are reproduced. The aim of this paper is to show that spectral sequences give us a natural framework in which to formulate normal form theory. © 2003 Elsevier Science (USA). All rights reserved.

  14. Understanding the challenges to implementing case management for people with dementia in primary care in England: a qualitative study using Normalization Process Theory.

    Science.gov (United States)

    Bamford, Claire; Poole, Marie; Brittain, Katie; Chew-Graham, Carolyn; Fox, Chris; Iliffe, Steve; Manthorpe, Jill; Robinson, Louise

    2014-11-08

    Case management has been suggested as a way of improving the quality and cost-effectiveness of support for people with dementia. In this study we adapted and implemented a successful United States model of case management in primary care in England. The results are reported elsewhere, but a key finding was that little case management took place. This paper reports the findings of the process evaluation which used Normalization Process Theory to understand the barriers to implementation. Ethnographic methods were used to explore the views and experiences of case management. Interviews with 49 stakeholders (patients, carers, case managers, health and social care professionals) were supplemented with observation of case managers during meetings and initial assessments with patients. Transcripts and field notes were analysed initially using the constant comparative approach and emerging themes were then mapped onto the framework of Normalization Process Theory. The primary focus during implementation was on the case managers as isolated individuals, with little attention being paid to the social or organizational context within which they worked. Barriers relating to each of the four main constructs of Normalization Process Theory were identified, with a lack of clarity over the scope and boundaries of the intervention (coherence); variable investment in the intervention (cognitive participation); a lack of resources, skills and training to deliver case management (collective action); and limited reflection and feedback on the case manager role (reflexive monitoring). Despite the intuitive appeal of case management to all stakeholders, there were multiple barriers to implementation in primary care in England including: difficulties in embedding case managers within existing well-established community networks; the challenges of protecting time for case management; and case managers' inability to identify, and act on, emerging patient and carer needs (an essential, but…

  15. A critical analysis of the implementation of service user involvement in primary care research and health service development using normalization process theory.

    Science.gov (United States)

    Tierney, Edel; McEvoy, Rachel; O'Reilly-de Brún, Mary; de Brún, Tomas; Okonkwo, Ekaterina; Rooney, Michelle; Dowrick, Chris; Rogers, Anne; MacFarlane, Anne

    2016-06-01

    There have been recent important advances in conceptualizing and operationalizing involvement in health research and health-care service development. However, problems persist in the field that impact on the scope for meaningful involvement to become a routine - normalized - way of working in primary care. In this review, we focus on current practice to critically interrogate factors known to be relevant for normalization - definition, enrolment, enactment and appraisal. Ours was a multidisciplinary, interagency team, with community representation. We searched EBSCOhost for papers from 2007 to 2011 and engaged in an iterative, reflexive approach to sampling, appraising and analysing the literature following the principles of a critical interpretive synthesis approach and using Normalization Process Theory. Twenty-six papers were chosen from 289 papers, as a purposeful sample of work that is reported as service user involvement in the field. Few papers provided a clear working definition of service user involvement. The dominant identified rationale for enrolling service users in primary care projects was linked with policy imperatives for co-governance and emancipatory ideals. The majority of methodologies employed were standard health services research methods that do not qualify as research with service users. This indicates a lack of congruence between the stated aims and methods. Most studies only reported positive outcomes, raising questions about the balance or completeness of the published appraisals. To improve normalization of meaningful involvement in primary care, it is necessary to encourage explicit reporting of definitions, methodological innovation to enhance co-governance and dissemination of research processes and findings. © 2014 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  16. 'It Opened My Eyes'-examining the impact of a multifaceted chlamydia testing intervention on general practitioners using Normalization Process Theory.

    Science.gov (United States)

    Yeung, Anna; Hocking, Jane; Guy, Rebecca; Fairley, Christopher K; Smith, Kirsty; Vaisey, Alaina; Donovan, Basil; Imrie, John; Gunn, Jane; Temple-Smith, Meredith

    2018-03-28

    Chlamydia is the most common notifiable sexually transmissible infection in Australia. Left untreated, it can develop into pelvic inflammatory disease and infertility. The majority of notifications come from general practice, which is ideally situated to test young Australians. The Australian Chlamydia Control Effectiveness Pilot (ACCEPt) was a multifaceted intervention that aimed to reduce chlamydia prevalence by increasing testing in 16- to 29-year-olds attending general practice. GPs were interviewed to describe the effectiveness of the ACCEPt intervention in integrating chlamydia testing into routine practice using Normalization Process Theory (NPT). GPs were purposively selected based on age, gender, geographic location and size of practice at baseline and midpoint. Interview data were analysed regarding the intervention components and results were interpreted using NPT. A total of 44 GPs at baseline and 24 at midpoint were interviewed. Most GPs reported offering a test based on age at midpoint versus offering a test based on symptoms or patient request at baseline. Quarterly feedback was the most significant ACCEPt component for facilitating a chlamydia test. The ACCEPt intervention has been able to moderately normalize chlamydia testing among GPs, although the components had varying levels of effectiveness. NPT can demonstrate the effective implementation of an intervention in general practice and has been valuable in understanding which components are essential and which components can be improved upon.

  17. Irreversible processes kinetic theory

    CERN Document Server

    Brush, Stephen G

    2013-01-01

    Kinetic Theory, Volume 2: Irreversible Processes deals with the kinetic theory of gases and the irreversible processes they undergo. It includes the two papers by James Clerk Maxwell and Ludwig Boltzmann in which the basic equations for transport processes in gases are formulated, together with the first derivation of Boltzmann's "H-theorem" and a discussion of this theorem, along with the problem of irreversibility. Comprised of 10 chapters, this volume begins with an introduction to the fundamental nature of heat and of gases, along with Boltzmann's work on the kinetic theory of gases and…

  18. Statistical Theory of Normal Grain Growth Revisited

    International Nuclear Information System (INIS)

    Gadomski, A.; Luczka, J.

    2002-01-01

    In this paper, we discuss three physically relevant problems concerning the normal grain growth process. These are: infinite vs. finite size of the system under study (a step towards more realistic modeling); conditions of fine-grained structure formation, with possible applications to thin films and biomembranes, and interesting relations to superplasticity of materials; and the approach to log-normality, a ubiquitous natural phenomenon frequently reported in the literature. It turns out that all three points mentioned can be included in a Mulheran-Harding type behavior of evolving grain-containing systems that we have studied previously. (author)
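    For reference, the log-normal grain-size distribution that the abstract alludes to has the standard density (here R is the grain radius; μ and σ are the mean and standard deviation of ln R, introduced only for illustration):

    ```latex
    % Log-normal probability density for grain sizes R > 0
    f(R) = \frac{1}{R\,\sigma\sqrt{2\pi}}
           \exp\!\left[-\frac{(\ln R - \mu)^{2}}{2\sigma^{2}}\right]
    ```

    Distributions of this shape arise naturally when growth proceeds by many small multiplicative steps, which is one reason log-normality is so frequently reported in grain-growth data.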

  19. Qualitative Process Theory.

    Science.gov (United States)

    1984-07-01

    Technical Report AI-TR-789 (AD-A148 987): Qualitative Process Theory. Kenneth D. Forbus, MIT Artificial Intelligence Laboratory, July 1984. Keywords: common sense reasoning, mathematical reasoning, naive physics, artificial intelligence.

  20. Exploring drivers and challenges in implementation of health promotion in community mental health services: a qualitative multi-site case study using Normalization Process Theory.

    Science.gov (United States)

    Burau, Viola; Carstensen, Kathrine; Fredens, Mia; Kousgaard, Marius Brostrøm

    2018-01-24

    There is an increased interest in improving the physical health of people with mental illness. Little is known about implementing health promotion interventions in adult mental health organisations where many users also have physical health problems. The literature suggests that contextual factors are important for implementation in community settings. This study focused on the change process and analysed the implementation of a structural health promotion intervention in community mental health organisations in different contexts in Denmark. The study was based on a qualitative multiple-case design and included two municipal and two regional provider organisations. Data were various written sources and 13 semi-structured interviews with 22 key managers and frontline staff. The analysis was organised around the four main constructs of Normalization Process Theory: Coherence, Cognitive Participation, Collective Action, and Reflexive Monitoring. Coherence: Most respondents found the intervention to be meaningful in that the intervention fitted well into existing goals, practices and treatment approaches. Cognitive Participation: Management engagement varied across providers and low engagement impeded implementation. Engaging all staff was a general problem although some of the initial resistance was apparently overcome. Collective Action: Daily enactment depended on staff being attentive and flexible enough to manage the complex needs and varying capacities of users. Reflexive Monitoring: During implementation, staff evaluations of the progress and impact of the intervention were mostly informal and ad hoc and staff used these to make on-going adjustments to activities. Overall, characteristics of context common to all providers (work force and user groups) seemed to be more important for implementation than differences in the external political-administrative context. In terms of research, future studies should adopt a more bottom-up, grounded description of context…

  1. Stochastic processes inference theory

    CERN Document Server

    Rao, Malempati M

    2014-01-01

    This is the revised and enlarged 2nd edition of the authors’ original text, which was intended to be a modest complement to Grenander's fundamental memoir on stochastic processes and related inference theory. The present volume gives a substantial account of regression analysis, both for stochastic processes and measures, and includes recent material on Ridge regression with some unexpected applications, for example in econometrics. The first three chapters can be used for a quarter or semester graduate course on inference on stochastic processes. The remaining chapters provide more advanced material on stochastic analysis suitable for graduate seminars and discussions, leading to dissertation or research work. In general, the book will be of interest to researchers in probability theory, mathematical statistics and electrical and information theory.

  2. Self-consistent normal ordering of gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1987-01-01

    Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs

  3. Teaching queer theory at a Normal School.

    Science.gov (United States)

    Bacon, Jen

    2006-01-01

    This article presents a case study of the ongoing struggle to queer West Chester University at the level of the institution, the curriculum, and the classroom. Part of that struggle includes an effort to establish a policy for free speech that accommodates the values of the institution toward diversity. Another part involves attempts to introduce LGBT Studies into the curriculum, and the resulting debates over whether the curriculum should be "gayer" or "queerer." I discuss the personal struggle to destabilize ready-made categories and encourage non-binary thinking, while honoring the identities we live, and perform, in the classroom. In the last four years, WCU has hired half a dozen out gay or lesbian faculty members, some of whom identify as "queer." In many ways, those faculty members have entered a climate open to new ideas for adding LGBT content to the curriculum and to queering the structure and curriculum of the university. But as faculty, staff, and students engage this cause, along with the broader cause of social justice at the University, we have found that our enemies are often closer than we might have guessed. Detailing the tensions that have characterized the landscape at WCU during my three and a half years there, this essay elaborates on the epistemological and pedagogical issues that arise when queer theory meets LGBT Studies in the process of institutional, curricular, and pedagogical reform. I argue that questions about content and method, inclusion and exclusion, and identity and performance can be answered only with a concerted effort and continued attention to the cultural tendency to re-assert binaries while simultaneously learning from them. What is true of West Chester, I argue, is true of the larger social system where the contested terrain of the queer has implications for the choices we make as both stakeholders and deviants in the systems we chronicle and critique.

  4. The challenge of transferring an implementation strategy from academia to the field: a process evaluation of local quality improvement collaboratives in Dutch primary care using the normalization process theory.

    Science.gov (United States)

    Trietsch, Jasper; van Steenkiste, Ben; Hobma, Sjoerd; Frericks, Arnoud; Grol, Richard; Metsemakers, Job; van der Weijden, Trudy

    2014-12-01

    A quality improvement strategy consisting of comparative feedback and peer review embedded in available local quality improvement collaboratives proved to be effective in changing the test-ordering behaviour of general practitioners. However, implementing this strategy was problematic. We aimed for large-scale implementation of an adapted strategy covering both test ordering and prescribing performance. Because we failed to achieve large-scale implementation, the aim of this study was to describe and analyse the challenges of the transferring process. In a qualitative study 19 regional health officers, pharmacists, laboratory specialists and general practitioners were interviewed within 6 months after the transfer period. The interviews were audiotaped, transcribed and independently coded by two of the authors. The codes were matched to the dimensions of the normalization process theory. The general idea of the strategy was widely supported, but generating the feedback was more complex than expected and the need for external support after transfer of the strategy remained high because participants did not assume responsibility for the work and the distribution of resources that came with it. Evidence on effectiveness, a national infrastructure for these collaboratives and a generally positive attitude were not sufficient for normalization. Thinking about managing large databases, responsibility for tasks and distribution of resources should start as early as possible when planning complex quality improvement strategies. Merely exploring the barriers and facilitators experienced in a preceding trial is not sufficient. Although multifaceted implementation strategies to change professional behaviour are attractive, their inherent complexity is also a pitfall for large-scale implementation. © 2014 John Wiley & Sons, Ltd.

  5. Applying normalization process theory to understand implementation of a family violence screening and care model in maternal and child health nursing practice: a mixed method process evaluation of a randomised controlled trial.

    Science.gov (United States)

    Hooker, Leesa; Small, Rhonda; Humphreys, Cathy; Hegarty, Kelsey; Taft, Angela

    2015-03-28

    In Victoria, Australia, Maternal and Child Health (MCH) services deliver primary health care to families with children 0-6 years, focusing on health promotion, parenting support and early intervention. Family violence (FV) has been identified as a major public health concern, with increased prevalence in the child-bearing years. Victorian Government policy recommends routine FV screening of all women attending MCH services. Using Normalization Process Theory (NPT), we aimed to understand the barriers and facilitators of implementing an enhanced screening model into MCH nurse clinical practice. NPT informed the process evaluation of a pragmatic, cluster randomised controlled trial in eight MCH nurse teams in metropolitan Melbourne, Victoria, Australia. Using mixed methods (surveys and interviews), we explored the views of MCH nurses, MCH nurse team leaders, FV liaison workers and FV managers on implementation of the model. Quantitative data were analysed by comparing proportionate group differences and change within trial arm over time between interim and impact nurse surveys. Qualitative data were inductively coded, thematically analysed and mapped to NPT constructs (coherence, cognitive participation, collective action and reflexive monitoring) to enhance our understanding of the outcome evaluation. MCH nurse participation rates for interim and impact surveys were 79% (127/160) and 71% (114/160), respectively. Twenty-three key stakeholder interviews were completed. FV screening work was meaningful and valued by participants; however, the implementation coincided with a significant (government directed) change in clinical practice which impacted on full engagement with the model (coherence and cognitive participation). The use of MCH nurse-designed FV screening/management tools in focussed women's health consultations and links with FV services enhanced the participants' work (collective action). Monitoring of FV work (reflexive monitoring) was limited. The use of…

  6. Microscopic theory of normal liquid 3He

    International Nuclear Information System (INIS)

    Nafari, N.; Doroudi, A.

    1994-03-01

    We have used the self-consistent scheme proposed by Singwi, Tosi, Land and Sjoelander (STLS) to study the properties of normal liquid 3 He. By employing the Aziz potential (HFD-B) and some other realistic pairwise interactions, we have calculated the static structure factor, the pair-correlation function, the zero sound frequencies as a function of wave-vector, and the Landau parameter F_0^s for different densities. Our results show considerable improvement over the Ng-Singwi's model potential of a hard core plus an attractive tail. Agreement between our results and the experimental data for the static structure factor and the zero sound frequencies is fairly good. (author). 30 refs, 6 figs, 2 tabs
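    For orientation, the generic STLS closure referred to in the abstract couples the density response function to a static local-field correction; schematically, for a pair potential with Fourier transform v(q) and number density n (the authors' treatment of the hard-core Aziz potential will differ in detail):

    ```latex
    % Self-consistent STLS scheme: response with local-field correction G(q)
    \chi(q,\omega) = \frac{\chi_0(q,\omega)}{1 - v(q)\,[1 - G(q)]\,\chi_0(q,\omega)},
    \qquad
    G(q) = -\frac{1}{n}\int\!\frac{d^{3}k}{(2\pi)^{3}}\,
           \frac{\mathbf{q}\cdot\mathbf{k}}{q^{2}}\,
           \frac{v(k)}{v(q)}\,\bigl[S(|\mathbf{q}-\mathbf{k}|) - 1\bigr]
    ```

    The static structure factor S(q) is in turn obtained from χ(q, ω) through the fluctuation-dissipation theorem, which closes the self-consistency loop.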

  7. Dissociative Functions in the Normal Mourning Process.

    Science.gov (United States)

    Kauffman, Jeffrey

    1994-01-01

    Sees dissociative functions in mourning process as occurring in conjunction with integrative trends. Considers initial shock reaction in mourning as model of normal dissociation in mourning process. Dissociation is understood to be related to traumatic significance of death in human consciousness. Discerns four psychological categories of…

  8. Stochastic processes and quantum theory

    International Nuclear Information System (INIS)

    Klauder, J.R.

    1975-01-01

    The author analyses a variety of stochastic processes, namely real time diffusion phenomena, which are analogues of imaginary time quantum theory and covariant imaginary time quantum field theory. He elaborates some standard properties involving probability measures and stochastic variables and considers a simple class of examples. Finally he develops the fact that certain stochastic theories actually exhibit divergences that simulate those of covariant quantum field theory and presents examples of both renormalizable and unrenormalizable behavior. (V.J.C.)

  9. Stochastic processes and filtering theory

    CERN Document Server

    Jazwinski, Andrew H

    1970-01-01

    This unified treatment of linear and nonlinear filtering theory presents material previously available only in journals, and in terms accessible to engineering students. Its sole prerequisites are advanced calculus, the theory of ordinary differential equations, and matrix analysis. Although theory is emphasized, the text discusses numerous practical applications as well. Taking the state-space approach to filtering, this text models dynamical systems by finite-dimensional Markov processes, outputs of stochastic difference, and differential equations. Starting with background material on probability…
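    As a flavour of the state-space filtering approach described above, here is a minimal discrete-time linear Kalman filter in Python (a generic textbook sketch under standard linear-Gaussian assumptions, not code from the book):

    ```python
    import numpy as np

    def kalman_step(x, P, z, F, H, Q, R):
        """One predict/update cycle of the discrete-time linear Kalman filter."""
        # Predict: propagate the state estimate and covariance through the dynamics
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Update: correct the prediction using the measurement z
        S = H @ P_pred @ H.T + R                 # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

    # Toy usage: estimate a constant scalar observed in noise
    F = np.array([[1.0]]); H = np.array([[1.0]])
    Q = np.array([[1e-5]]); R = np.array([[0.1]])
    x, P = np.array([0.0]), np.array([[1.0]])
    rng = np.random.default_rng(1)
    for _ in range(50):
        z = np.array([1.0 + rng.normal(scale=0.3)])
        x, P = kalman_step(x, P, z, F, H, Q, R)
    print(f"estimate after 50 noisy observations: {x[0]:.3f}")
    ```

    The filter is the workhorse example of the Markov state-space viewpoint: the state x evolves linearly with process noise Q and is observed through H with measurement noise R.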

  10. Empirical processes: theory and applications

    OpenAIRE

    Venturini Sergio

    2005-01-01

    Proceedings of the 2003 Summer School in Statistics and Probability in Torgnon (Aosta, Italy), held by Prof. Jon A. Wellner and Prof. M. Banerjee. The topic presented was the theory of empirical processes with applications to statistics (M-estimation, bootstrap, semiparametric theory).

  11. Restricted broadcast process theory

    NARCIS (Netherlands)

    F. Ghassemi; W.J. Fokkink (Wan); A. Movaghar; A. Cerone; S. Gruner

    2008-01-01

    We present a process algebra for modeling and reasoning about Mobile Ad hoc Networks (MANETs) and their protocols. In our algebra we model the essential modeling concepts of ad hoc networks, i.e. local broadcast, connectivity of nodes and connectivity changes. Connectivity and…

  12. Nevanlinna theory, normal families, and algebraic differential equations

    CERN Document Server

    Steinmetz, Norbert

    2017-01-01

    This book offers a modern introduction to Nevanlinna theory and its intricate relation to the theory of normal families, algebraic functions, asymptotic series, and algebraic differential equations. Following a comprehensive treatment of Nevanlinna’s theory of value distribution, the author presents advances made since Hayman’s work on the value distribution of differential polynomials and illustrates how value- and pair-sharing problems are linked to algebraic curves and Briot–Bouquet differential equations. In addition to discussing classical applications of Nevanlinna theory, the book outlines state-of-the-art research, such as the effect of the Yosida and Zalcman–Pang method of re-scaling to algebraic differential equations, and presents the Painlevé–Yosida theorem, which relates Painlevé transcendents and solutions to selected 2D Hamiltonian systems to certain Yosida classes of meromorphic functions. Aimed at graduate students interested in recent developments in the field and researchers wor...
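    For orientation, the central object of the value-distribution theory treated in the book is the Nevanlinna characteristic, in its standard form for a meromorphic function f:

    ```latex
    % Nevanlinna characteristic: proximity term plus pole-counting term
    T(r,f) = m(r,f) + N(r,f), \qquad
    m(r,f) = \frac{1}{2\pi}\int_{0}^{2\pi} \log^{+}\bigl|f(re^{i\theta})\bigr|\,d\theta,
    \qquad
    N(r,f) = \int_{0}^{r} \frac{n(t,f) - n(0,f)}{t}\,dt + n(0,f)\log r
    ```

    where n(t, f) counts the poles of f in |z| ≤ t with multiplicity; T(r, f) plays the role for meromorphic functions that the degree plays for polynomials.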

  13. Destination memory and cognitive theory of mind in normal ageing.

    Science.gov (United States)

    El Haj, Mohamad; Raffard, Stéphane; Gély-Nargeot, Marie-Christine

    2016-01-01

    Destination memory is the ability to remember the destination to which a piece of information has been addressed (e.g., "Did I tell you about the promotion?"). This ability is found to be impaired in normal ageing. Our work aimed to link this deterioration to the decline in theory of mind. Forty younger adults (M age = 23.13 years, SD = 4.00) and 36 older adults (M age = 69.53 years, SD = 8.93) performed a destination memory task. They also performed the False-belief test addressing cognitive theory of mind and the Reading the mind in the eyes test addressing affective theory of mind. Results showed significant deterioration in destination memory, cognitive theory of mind and affective theory of mind in the older adults. The older adults' performance on destination memory was significantly correlated with and predicted by their performance on cognitive theory of mind. Difficulties in the ability to interpret and predict others' mental states are related to destination memory decline in older adults.

  14. System Theory and Physiological Processes.

    Science.gov (United States)

    Jones, R W

    1963-05-03

    Engineers and physiologists working together in experimental and theoretical studies predict that the application of system analysis to biological processes will increase understanding of these processes and broaden the base of system theory. Richard W. Jones, professor of electrical engineering at Northwestern University, Evanston, Illinois, and John S. Gray, professor of physiology at Northwestern's Medical School, discuss these developments. Their articles are adapted from addresses delivered in Chicago in November 1962 at the 15th Annual Conference on Engineering in Medicine and Biology.

  15. Mean fields and self consistent normal ordering of lattice spin and gauge field theories

    International Nuclear Information System (INIS)

    Ruehl, W.

    1986-01-01

    Classical Heisenberg spin models on lattices possess mean field theories that are well-defined real field theories on finite lattices. These mean field theories can be self-consistently normal ordered. This leads to a considerable improvement over standard mean field theory. This concept is carried over to lattice gauge theories. We first construct an appropriate real mean field theory. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean field theory are derived. (orig.)

  16. Theory of the low-voltage impedance of superconductor--insulator--normal metal tunnel junctions

    International Nuclear Information System (INIS)

    Lemberger, T.R.

    1984-01-01

    A theory for the low-voltage impedance of a superconductor--insulator--normal metal tunnel junction is developed that includes the effects of charge imbalance and of quasiparticle fluctuations. A novel, inelastic, charge-imbalance relaxation process is identified that is associated with the junction itself. This new process leads to the surprising result that the charge-imbalance component of the dc resistance of a junction becomes independent of the electron-phonon scattering rate as the insulator resistance decreases.

  17. Gaussian processes and constructive scalar field theory

    International Nuclear Information System (INIS)

    Benfatto, G.; Nicolo, F.

    1981-01-01

    Recent years have seen very deep progress in constructive Euclidean field theory, with many implications for the theory of random fields. The authors discuss an approach to super-renormalizable scalar field theories which puts in particular evidence the connections with the theory of the Gaussian processes associated to elliptic operators. The paper consists of two parts. Part I treats some problems in the theory of Gaussian processes which arise in the approach to the φ₃⁴ theory. Part II is devoted to the discussion of the ultraviolet stability in the φ₃⁴ theory. (Auth.)
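    The connection mentioned above is that the Gaussian measure of the free Euclidean field has as its covariance the Green's function of an elliptic operator; schematically, for the massive free field in d dimensions:

    ```latex
    % Free-field covariance = kernel of the inverse elliptic operator
    C(x,y) = \bigl(-\Delta + m^{2}\bigr)^{-1}(x,y)
           = \int \frac{d^{d}p}{(2\pi)^{d}}\,
             \frac{e^{ip\cdot(x-y)}}{p^{2} + m^{2}}
    ```

    The interacting φ₃⁴ measure is then built, heuristically, by weighting this Gaussian measure with exp(−λ∫ :φ⁴: d³x), and the ultraviolet stability question of Part II is whether this weighting survives removal of the cutoff.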

  18. Accumulating project management knowledge through process theory

    NARCIS (Netherlands)

    Niederman, Fred; March, Salvatore T.; Mueller, Benjamin

    2014-01-01

    This paper describes how the general notion of process theory can provide a foundational component in a portfolio of project management theories. The paper begins by outlining a variety of views pertaining to the nature of theory and theory development. This forms a basis for understanding how…

  19. Self-consistent theory of normal-to-superconducting transition

    International Nuclear Information System (INIS)

    Radzihovsky, L.; Chicago Univ., IL

    1995-01-01

    I study the normal-to-superconducting (NS) transition within the Ginzburg-Landau (GL) model, taking into account the fluctuations in the m-component complex order parameter ψ_α and the vector potential A in arbitrary dimension d, for any m. I find that the transition is of second order and that the previous conclusion of a fluctuation-driven first-order transition is a possible artifact of the breakdown of the ε-expansion and the inaccuracy of the 1/m-expansion for the physical values ε = 1, m = 1. I compute the anomalous exponent η(d, m) at the NS transition, and find η(3, 1) ∼ -0.38. In the m → ∞ limit, η(d, m) becomes exact and agrees with the 1/m-expansion. Near d = 4 the theory is also in good agreement with the perturbative ε-expansion results for m > 183 and provides a sensible interpolation formula for arbitrary d and m. (orig.)
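    For reference, the GL free-energy functional being studied has the standard form for an m-component complex order parameter minimally coupled to a fluctuating vector potential (e is the gauge coupling; r and u are the usual quadratic and quartic couplings):

    ```latex
    % Ginzburg-Landau functional with m complex components and gauge field
    F[\psi,\mathbf{A}] = \int d^{d}x\,\Bigl[
        \sum_{\alpha=1}^{m} \bigl|(\nabla - ie\mathbf{A})\psi_{\alpha}\bigr|^{2}
        + r\sum_{\alpha=1}^{m} |\psi_{\alpha}|^{2}
        + \frac{u}{2}\Bigl(\sum_{\alpha=1}^{m} |\psi_{\alpha}|^{2}\Bigr)^{2}
        + \frac{1}{2}\,(\nabla\times\mathbf{A})^{2}
    \Bigr]
    ```

    The m = 1 case is the physical superconductor; the m → ∞ limit is what makes the 1/m-expansion mentioned in the abstract tractable.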

  20. Accumulating Project Management Knowledge Using Process Theory

    NARCIS (Netherlands)

    Niederman, Fred; March, Salvatore T.; Mueller, Benjamin

    2016-01-01

    Process theory has become an important mechanism for the accumulation of knowledge in a number of disciplines. In contrast with variance theory, which focuses on co-variation of dependent and independent variables, process theory focuses on sequences of activities, their duration and the intervals…

  1. Acquisition by Processing Theory: A Theory of Everything?

    Science.gov (United States)

    Carroll, Susanne E.

    2004-01-01

    Truscott and Sharwood Smith (henceforth T&SS) propose a novel theory of language acquisition, "Acquisition by Processing Theory" (APT), designed to account for both first and second language acquisition, monolingual and bilingual speech perception and parsing, and speech production. This is a tall order. Like any theoretically ambitious…

  2. Elementary process theory axiomatic introduction and applications

    CERN Document Server

    Cabbolet, Marcoen J T F

    2011-01-01

    Modern physics lacks a unitary theory that applies to all four fundamental interactions. This PhD thesis is a proposal for a single, complete, and coherent scheme of mathematically formulated elementary laws of nature. While the first chapter presents the general background, the second chapter addresses the method by which the main result has been developed. The next three chapters rigorously introduce the Elementary Process Theory, its mathematical foundations, and its applications to physics, cosmology and philosophy of mind. The final two chapters discuss the results and present the conclusions. Summarizing, the Elementary Process Theory is a scheme of seven well-formed closed expressions, written in the mathematical language of set matrix theory – a generalization of Zermelo-Fraenkel set theory. In the physical world, these seven expressions can be interpreted as elementary principles governing the universe at supersmall scale. The author critically confronts the theory with Quantum Mechanics and General Relativity.

  3. An adaptive orienting theory of error processing.

    Science.gov (United States)

    Wessel, Jan R

    2018-03-01

    The ability to detect and correct action errors is paramount to safe and efficient goal-directed behaviors. Existing work on the neural underpinnings of error processing and post-error behavioral adaptations has led to the development of several mechanistic theories of error processing. These theories can be roughly grouped into adaptive and maladaptive theories. While adaptive theories propose that errors trigger a cascade of processes that will result in improved behavior after error commission, maladaptive theories hold that error commission momentarily impairs behavior. Neither group of theories can account for all available data, as different empirical studies find both impaired and improved post-error behavior. This article attempts a synthesis between the predictions made by prominent adaptive and maladaptive theories. Specifically, it is proposed that errors invoke a nonspecific cascade of processing that will rapidly interrupt and inhibit ongoing behavior and cognition, as well as orient attention toward the source of the error. It is proposed that this cascade follows all unexpected action outcomes, not just errors. In the case of errors, this cascade is followed by error-specific, controlled processing, which is specifically aimed at (re)tuning the existing task set. This theory combines existing predictions from maladaptive orienting and bottleneck theories with specific neural mechanisms from the wider field of cognitive control, including from error-specific theories of adaptive post-error processing. The article aims to describe the proposed framework and its implications for post-error slowing and post-error accuracy, propose mechanistic neural circuitry for post-error processing, and derive specific hypotheses for future empirical investigations. © 2017 Society for Psychophysiological Research.

  4. Radiative processes in gauge theories

    International Nuclear Information System (INIS)

    Berends, F.A.; Kleiss, R.; Danckaert, D.; Causmaecker, P. De; Gastmans, R.; Troost, W.; Tai Tsun Wu

    1982-01-01

    It is shown how the introduction of explicit polarization vectors of the radiated gauge particles leads to great simplifications in the calculation of bremsstrahlung processes at high energies. (author)

  5. Processing Information in Quantum Decision Theory

    OpenAIRE

    Yukalov, V. I.; Sornette, D.

    2008-01-01

    A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has been recently advanced as a novel variant of decision making, based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention interference.

  6. On the theory of optimal processes

    International Nuclear Information System (INIS)

    Goldenberg, P.; Provenzano, V.

    1975-01-01

    The theory of optimal processes is a recent mathematical formalism used to solve an important class of problems in science and technology that cannot be solved by classical variational techniques. An example of such a process is the control of a nuclear reactor. Certain features of the theory of optimal processes are discussed, emphasizing the central contribution of Pontryagin and his formulation of the maximum principle. An application of the theory of optimal control is presented: a time-optimal problem applied to a simplified model of a nuclear reactor, dealing with the question of changing the equilibrium power level of the reactor in minimum time.
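    The flavor of Pontryagin's maximum principle is easiest to see in a standard textbook example rather than the reactor model above: time-optimal control of a double integrator, where the principle yields a bang-bang law that switches sign on a parabolic switching curve. A minimal sketch (the time step, tolerances, and initial state are illustrative choices, not from the record):

```python
import numpy as np

def bang_bang_control(x1, x2):
    """Time-optimal feedback for the double integrator x'' = u, |u| <= 1.
    Classic Pontryagin result: bang-bang control switching sign on the
    parabolic curve x1 = -0.5 * x2 * |x2|."""
    s = x1 + 0.5 * x2 * abs(x2)   # which side of the switching curve we are on
    if abs(s) > 1e-9:
        return -np.sign(s)
    return -np.sign(x2)           # on the curve: ride it into the origin

# forward-Euler simulation from an arbitrary initial state
x1, x2, dt, t = 2.0, 1.0, 1e-3, 0.0
while (abs(x1) > 1e-3 or abs(x2) > 1e-3) and t < 20.0:
    u = bang_bang_control(x1, x2)
    x1, x2, t = x1 + x2 * dt, x2 + u * dt, t + dt

print(f"reached ({x1:.3f}, {x2:.3f}) at t ~ {t:.2f}")
```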

  7. Does Normal Processing Provide Evidence of Specialised Semantic Subsystems?

    Science.gov (United States)

    Shapiro, Laura R.; Olson, Andrew C.

    2005-01-01

    Category-specific disorders are frequently explained by suggesting that living and non-living things are processed in separate subsystems (e.g. Caramazza & Shelton, 1998). If subsystems exist, there should be benefits for normal processing, beyond the influence of structural similarity. However, no previous study has separated the relative…

  8. Queer theory and education: toward a non-normalizing approach

    Directory of Open Access Journals (Sweden)

    Wendel Souza Santos

    2017-12-01

    Full Text Available Queer analysis, commonly related to gender studies, is a recent conceptual approach. This article aims chiefly to bring this perspective into critical analysis of the educational field. The central challenge in education is thus to rethink what it means to educate, who educates, and who is educated. In a non-normalizing perspective, educating becomes a dialogical activity in which experiences that were previously unviable, unrecognized, or, more commonly, violated begin to be incorporated into the school routine, changing the hierarchy between who teaches and who is educated and seeking greater symmetry between them, so as to move from education to a relational learning that is transformative for both.

  9. Effect of normal processes on thermal conductivity of germanium ...

    Indian Academy of Sciences (India)

    Abstract. Normal scattering processes are considered to redistribute the phonon momentum (a) within the same phonon branch (the KK-S model) and (b) between different phonon branches (the KK-H model). Simplified thermal conductivity relations are used to estimate the thermal conductivity of germanium, silicon and ...

  10. Overweight but unseen: a review of the underestimation of weight status and a visual normalization theory.

    Science.gov (United States)

    Robinson, E

    2017-10-01

    Although overweight and obesity are widespread across most of the developed world, a considerable body of research has now accumulated, which suggests that adiposity often goes undetected. A substantial proportion of individuals with overweight or obesity do not identify they are overweight, and large numbers of parents of children with overweight or obesity fail to identify their child as being overweight. Lay people and medical practitioners are also now poor at identifying overweight and obesity in others. A visual normalization theory of the under-detection of overweight and obesity is proposed. This theory is based on the notion that weight status is judged relative to visual body size norms. Because larger body sizes are now common, this has caused a recalibration to the range of body sizes that are perceived as being 'normal' and increased the visual threshold for what constitutes 'overweight'. Evidence is reviewed that indicates this process has played a significant role in the under-detection of overweight and obesity. The public health relevance of the under-detection of overweight and obesity is also discussed. © 2017 The Authors. Obesity Reviews published by John Wiley & Sons Ltd on behalf of World Obesity.

  11. Emotion processes in normal and abnormal development and preventive intervention.

    Science.gov (United States)

    Izard, Carroll E; Fine, Sarah; Mostow, Allison; Trentacosta, Christopher; Campbell, Jan

    2002-01-01

    We present an analysis of the role of emotions in normal and abnormal development and preventive intervention. The conceptual framework stems from three tenets of differential emotions theory (DET). These principles concern the constructs of emotion utilization; intersystem connections among modular emotion systems, cognition, and action; and the organizational and motivational functions of discrete emotions. Particular emotions and patterns of emotions function differentially in different periods of development and in influencing the cognition and behavior associated with different forms of psychopathology. Established prevention programs have not emphasized the concept of emotion as motivation. It is even more critical that they have generally neglected the idea of modulating emotions, not simply to achieve self-regulation, but also to utilize their inherently adaptive functions as a means of facilitating the development of social competence and preventing psychopathology. The paper includes a brief description of a theory-based prevention program and suggestions for complementary targeted interventions to address specific externalizing and internalizing problems. In the final section, we describe ways in which emotion-centered preventions can provide excellent opportunities for research on the development of normal and abnormal behavior.

  12. Biomechanical Analysis of Normal Brain Development during the First Year of Life Using Finite Strain Theory

    OpenAIRE

    Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili

    2016-01-01

    The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and...

  13. Process for preparing a normal lighting and heating gas etc

    Energy Technology Data Exchange (ETDEWEB)

    Becker, J

    1910-12-11

    A process for preparing a normal lighting and heating gas from Australian bituminous shale by distillation and decomposition in the presence of water vapor is characterized by the fact that the gasification is suitably undertaken with gradual filling of a retort and with simultaneous introduction of water vapor at a temperature not exceeding 1,000 °C. The resulting gas is heated in the same or a second heated retort with freshly supplied vapor.

  14. Theory of the dissociation process, ch. 1

    International Nuclear Information System (INIS)

    Asselt, N.P.F.B. van

    1976-01-01

    The formalism of Moeller operators and channel Hamiltonians, originating in scattering theory, is used to describe the dissociation process. The proper choice of the initial-state wave function is discussed. A method is given which accounts for the symmetry requirements that arise in the case of a homonuclear molecule, where identical particles are present.

  15. Reggeon field theory and Markov processes

    International Nuclear Information System (INIS)

    Grassberger, P.; Sundermeyer, K.

    1978-01-01

    Reggeon field theory with a quartic coupling in addition to the standard cubic one is shown to be mathematically equivalent to a chemical process where a radical can undergo diffusion, absorption, recombination, and autocatalytic production. Physically, these 'radicals' are wee partons. (Auth.)

  16. Aligning Grammatical Theories and Language Processing Models

    Science.gov (United States)

    Lewis, Shevaun; Phillips, Colin

    2015-01-01

    We address two important questions about the relationship between theoretical linguistics and psycholinguistics. First, do grammatical theories and language processing models describe separate cognitive systems, or are they accounts of different aspects of the same system? We argue that most evidence is consistent with the one-system view. Second,…

  17. Business Process Management Theory and Applications

    CERN Document Server

    2013-01-01

    Business Process Management (BPM) has been in existence for decades. It uses, complements, integrates and extends theories, methods and tools from other scientific disciplines such as strategic management, information technology, managerial accounting and operations management. During this period the main focus themes of researchers and professionals in BPM were business process modeling, business process analysis, activity-based costing, business process simulation, performance measurement, workflow management, and the link between information technology and BPM for process automation. More recently the focus has moved to subjects like Knowledge Management, Enterprise Resource Planning (ERP) Systems, Service Oriented Architectures (SOAs), Process Intelligence (PI) and even Social Networks. In this collection of papers we present a review of the work and the outcomes achieved in the classic BPM fields as well as a deeper insight on recent advances in BPM. We present a review of business process modeling a...

  18. A steady state theory for processive cellulases

    DEFF Research Database (Denmark)

    Cruys-Bagger, Nicolaj; Olsen, Jens Elmerdahl; Præstgaard, Eigil

    2013-01-01

    coefficient’, which represents the probability of the enzyme dissociating from the substrate strand before completing n sequential catalytic steps, where n is the mean processivity number measured experimentally. Typical processive cellulases have high substrate affinity, and therefore this probability is low....... This has significant kinetic implications, for example the maximal specific rate (Vmax/E0) for processive cellulases is much lower than the catalytic rate constant (kcat). We discuss how relationships based on this theory may be used in both comparative and mechanistic analyses of cellulases....
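    Read at face value, the probabilistic content of this coefficient admits a simple back-of-the-envelope sketch (an illustration of the idea only, not the paper's derivation): if the enzyme dissociates with probability $p$ at each catalytic step, then

    $$ \Pr(\text{dissociation before } n \text{ steps}) \;=\; 1-(1-p)^{n}, \qquad \langle n \rangle \;=\; \frac{1-p}{p} \;\approx\; \frac{1}{p} \quad (p \ll 1). $$

    High substrate affinity means a small per-step dissociation probability $p$, hence a small coefficient and a large mean processivity; turnover then tends to be limited by dissociation rather than by catalysis, which is one way to see why $V_{\max}/E_0 \ll k_{\mathrm{cat}}$.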

  19. On diffusion process generators and scattering theory

    International Nuclear Information System (INIS)

    Demuth, M.

    1980-01-01

    In scattering theory, the existence of wave operators is one of the main points of interest. For two selfadjoint operators $K$ and $H$ defined in separable Hilbert spaces $\tilde{H}$ and $\tilde{H}'$, respectively, the usual two-space wave operator is defined by $\Omega_{\pm}(H,J,K) = \operatorname{s\text{-}lim}_{t \to \pm\infty} e^{itH} J e^{-itK} P^{\mathrm{ac}}$, if these limits exist. Here $J$ is the identification operator mapping $\tilde{H}$ into $\tilde{H}'$, and $P^{\mathrm{ac}}$ is the orthogonal projection onto the absolutely continuous subspace of $K$. The objective is to prove the existence and completeness of the wave operator for $K$ and $K+V$, where $K$ is a diffusion process generator and $V$ a singular perturbation. Because generators of diffusion processes can be obtained by extending second-order differential operators with variable coefficients, the result connects hard-core potential problems with the existence of wave operators for diffusion process generators, and thereby extends scattering theory for second-order elliptic differential operators by means of stochastic process theory and solutions of stochastic differential equations. (author)

  20. Postmortem abdominal CT: Assessing normal cadaveric modifications and pathological processes

    International Nuclear Information System (INIS)

    Charlier, P.; Carlier, R.; Roffi, F.; Ezra, J.; Chaillot, P.F.; Duchat, F.; Huynh-Charlier, I.; Lorin de la Grandmaison, G.

    2012-01-01

    Purpose: To investigate the value of postmortem non-enhanced computed tomography (CT) for abdominal lesions in a forensic context of suspicious death, and to list the different radiological cadaveric modifications that normally occur in the abdomen, which must be known by non-forensic radiologists performing any postmortem exam. Materials and methods: 30 cadavers underwent whole-body CT without injection of contrast material. CT exams were reviewed by two independent radiologists and radiological findings were compared with forensic autopsy data. Results: False-positive CT findings included physiological postmortem transudates misdiagnosed as intra-abdominal bleeding, and putrefaction gas misdiagnosed as gas embolism, aeroportia, aerobilia, or pneumatosis of the digestive wall. Incidentalomas without any role in the death process were also reported. False-negative CT findings included small contusions, vascular thromboses, acute infarct foci, and non-radio-opaque foreign bodies. Normal cadaveric modifications were due to livor mortis and putrefaction, and are seen quickly (within hours) after death. Conclusion: The non-forensic radiologist should be familiar with the normal abdominal postmortem features in order to avoid misdiagnoses and to detect informative lesions that can help and guide the forensic practitioner or the clinical physician.

  1. Perturbations and quasi-normal modes of black holes in Einstein-Aether theory

    International Nuclear Information System (INIS)

    Konoplya, R.A.; Zhidenko, A.

    2007-01-01

    We develop a new method for the calculation of quasi-normal modes of black holes for cases where the effective potential governing black hole perturbations is known only numerically in some region near the black hole. This method can be applied to perturbations of a wide class of numerical black hole solutions. We apply it to black holes in the Einstein-Aether theory, a theory in which general relativity is coupled to a unit time-like vector field, allowing local Lorentz symmetry violation. We find that in the non-reduced Einstein-Aether theory, the real oscillation frequencies and damping rates of quasi-normal modes are larger than those of Schwarzschild black holes in Einstein's theory.

  2. Asymptotic theory of weakly dependent random processes

    CERN Document Server

    Rio, Emmanuel

    2017-01-01

    Presenting tools to aid understanding of asymptotic theory and weakly dependent processes, this book is devoted to inequalities and limit theorems for sequences of random variables that are strongly mixing in the sense of Rosenblatt, or absolutely regular. The first chapter introduces covariance inequalities under strong mixing or absolute regularity. These covariance inequalities are applied in Chapters 2, 3 and 4 to moment inequalities, rates of convergence in the strong law, and central limit theorems. Chapter 5 concerns coupling. In Chapter 6 new deviation inequalities and new moment inequalities for partial sums via the coupling lemmas of Chapter 5 are derived and applied to the bounded law of the iterated logarithm. Chapters 7 and 8 deal with the theory of empirical processes under weak dependence. Lastly, Chapter 9 describes links between ergodicity, return times and rates of mixing in the case of irreducible Markov chains. Each chapter ends with a set of exercises. The book is an updated and extended ...

  3. Microscopic theory of the current-voltage relationship across a normal-superconducting interface

    International Nuclear Information System (INIS)

    Kraehenbuehl, Y.; Watts-Tobin, R.J.

    1979-01-01

    Measurements by Pippard et al. have shown the existence of an extra resistance due to the penetration of an electrical potential into a superconductor. Previous theories of this effect are unable to explain the full temperature dependence of the extra resistance because they use oversimplified models of the normal--superconducting interface. We show that the microscopic theory for dirty superconductors leads to a good agreement with experiment over the whole temperature range

  4. Diffusive epidemic process: theory and simulation

    International Nuclear Information System (INIS)

    Maia, Daniel Souza; Dickman, Ronald

    2007-01-01

    We study the continuous absorbing-state phase transition in the one-dimensional diffusive epidemic process via mean-field theory and Monte Carlo simulation. In this model, particles of two species (A and B) hop on a lattice and undergo reactions B → A and A+B → 2B; the total particle number is conserved. We formulate the model as a continuous-time Markov process described by a master equation. A phase transition between the (absorbing) B-free state and an active state is observed as the parameters (reaction and diffusion rates, and total particle density) are varied. Mean-field theory reveals a surprising, nonmonotonic dependence of the critical recovery rate on the diffusion rate of B particles. A computational realization of the process that is faithful to the transition rates defining the model is devised, allowing for direct comparison with theory. Using the quasi-stationary simulation method we determine the order parameter and the survival time in systems of up to 4000 sites. Due to strong finite-size effects, the results converge only for large system sizes. We find no evidence for a discontinuous transition. Our results are consistent with the existence of three distinct universality classes, depending on whether A particles diffuse more rapidly than, less rapidly than, or at the same rate as B particles. We also perform quasi-stationary simulations of the triplet creation model, which yield results consistent with a discontinuous transition at high diffusion rates
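    The model definition above is concrete enough to simulate directly. A minimal random-sequential-update Monte Carlo sketch of the one-dimensional process follows (the unit rates, on-contact infection, and parameter values are my simplifications; this is far cruder than the quasi-stationary method used in the paper):

```python
import random

random.seed(1)
L_sites, n_A, n_B = 200, 300, 20   # ring size and initial particle numbers
D_A = D_B = 0.5                    # hop probabilities per particle update
mu = 0.02                          # recovery probability B -> A per update
sweeps = 500

pos = [random.randrange(L_sites) for _ in range(n_A + n_B)]
species = ['A'] * n_A + ['B'] * n_B   # total particle number is conserved

for _ in range(sweeps * len(pos)):
    i = random.randrange(len(pos))
    # diffusion: hop to a random neighbour with the species' hop probability
    if random.random() < (D_A if species[i] == 'A' else D_B):
        pos[i] = (pos[i] + random.choice((-1, 1))) % L_sites
    if species[i] == 'B' and random.random() < mu:
        species[i] = 'A'              # recovery B -> A
    if species[i] == 'B':
        # infection A + B -> 2B: every A sharing the site turns into B
        for j in range(len(pos)):
            if species[j] == 'A' and pos[j] == pos[i]:
                species[j] = 'B'

# the absorbing state is reached when no B particles survive
print("surviving B particles:", species.count('B'))
```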

  5. Entropy generation and momentum transfer in the superconductor-normal and normal-superconductor phase transformations and the consistency of the conventional theory of superconductivity

    Science.gov (United States)

    Hirsch, J. E.

    2018-05-01

    Since the discovery of the Meissner effect, the superconductor to normal (S-N) phase transition in the presence of a magnetic field is understood to be a first-order phase transformation that is reversible under ideal conditions and obeys the laws of thermodynamics. The reverse (N-S) transition is the Meissner effect. This implies in particular that the kinetic energy of the supercurrent is not dissipated as Joule heat in the process where the superconductor becomes normal and the supercurrent stops. In this paper, we analyze the entropy generation and the momentum transfer between the supercurrent and the body in the S-N transition and the N-S transition as described by the conventional theory of superconductivity. We find that it is not possible to explain the transition in a way that is consistent with the laws of thermodynamics unless the momentum transfer between the supercurrent and the body occurs with zero entropy generation, for which the conventional theory of superconductivity provides no mechanism. Instead, we point out that the alternative theory of hole superconductivity does not encounter such difficulties.

  6. The uncertainty processing theory of motivation.

    Science.gov (United States)

    Anselme, Patrick

    2010-04-02

    Most theories describe motivation using basic terminology (drive, 'wanting', goal, pleasure, etc.) that fails to inform well about the psychological mechanisms controlling its expression. This leads to a conception of motivation as a mere psychological state 'emerging' from neurophysiological substrates. However, the involvement of motivation in a large number of behavioural parameters (triggering, intensity, duration, and directedness) and cognitive abilities (learning, memory, decision, etc.) suggest that it should be viewed as an information processing system. The uncertainty processing theory (UPT) presented here suggests that motivation is the set of cognitive processes allowing organisms to extract information from the environment by reducing uncertainty about the occurrence of psychologically significant events. This processing of information is shown to naturally result in the highlighting of specific stimuli. The UPT attempts to solve three major problems: (i) how motivations can affect behaviour and cognition so widely, (ii) how motivational specificity for objects and events can result from nonspecific neuropharmacological causal factors (such as mesolimbic dopamine), and (iii) how motivational interactions can be conceived in psychological terms, irrespective of their biological correlates. The UPT is in keeping with the conceptual tradition of the incentive salience hypothesis while trying to overcome the shortcomings inherent to this view. Copyright 2009 Elsevier B.V. All rights reserved.

  7. Stationary stochastic processes theory and applications

    CERN Document Server

    Lindgren, Georg

    2012-01-01

    Contents: Some Probability and Process Background: sample space, sample function, and observables; random variables and stochastic processes; stationary processes and fields; Gaussian processes; four historical landmarks. Sample Function Properties: quadratic mean properties; sample function continuity; derivatives, tangents, and other characteristics; stochastic integration; an ergodic result; exercises. Spectral Representations: complex-valued stochastic processes; Bochner's theorem and the spectral distribution; spectral representation of a stationary process; Gaussian processes; stationary counting processes; exercises. Linear Filters - General Properties: linear time invariant filters; linear filters and differential equations; white noise in linear systems; long range dependence, non-integrable spectra, and unstable systems; the ARMA family. Linear Filters - Special Topics: the Hilbert transform and the envelope; the sampling theorem; Karhunen-Loève expansion. Classical Ergodic Theory and Mixing: the basic ergodic theorem in L2; stationarity and transformations; the ergodic th...

  8. Evaluating accounting information systems that support multiple GAAP reporting using Normalized Systems Theory

    NARCIS (Netherlands)

    Vanhoof, E.; Huysmans, P.; Aerts, Walter; Verelst, J.; Aveiro, D.; Tribolet, J.; Gouveia, D.

    2014-01-01

    This paper uses a mixed methods approach of design science and case study research to evaluate structures of Accounting Information Systems (AIS) that report in multiple Generally Accepted Accounting Principles (GAAP), using Normalized Systems Theory (NST). To comply with regulation, many companies

  9. Normal Patterns of Deja Experience in a Healthy, Blind Male: Challenging Optical Pathway Delay Theory

    Science.gov (United States)

    O'Connor, Akira R.; Moulin, Christopher J. A.

    2006-01-01

    We report the case of a 25-year-old healthy, blind male, MT, who experiences normal patterns of deja vu. The optical pathway delay theory of deja vu formation assumes that neuronal input from the optical pathways is necessary for the formation of the experience. Surprisingly, although the sensation of deja vu is known to be experienced by blind…

  10. Researches Concerning to Minimize Vibrations when Processing Normal Lathe

    Directory of Open Access Journals (Sweden)

    Lenuța Cîndea

    2015-09-01

    Full Text Available In the cutting process, the appearance of vibration is inevitable, and in situations where its amplitude exceeds the limits of dimensional and shape precision of the generated surfaces, the phenomenon is detrimental. Vibration is an increasingly developed field of study, and the future will bring a better understanding of vibrations and their use in other sectors as well. This paper presents experimental measurements of vibrations in normal lathe machining. It describes the kinematics of the machine tool, the cutting tool, and the cutting conditions, and presents the experimental setup for measuring the vibrations occurring during turning. The experiments measured the amplitude occurring during internal turning with a tool without an incorporated damper. The tests were performed continuously for different speeds, feeds, and depths of cut.

  11. Understanding the implementation of complex interventions in health care: the normalization process model

    Directory of Open Access Journals (Sweden)

    Rogers Anne

    2007-09-01

    Full Text Available Abstract Background The Normalization Process Model is a theoretical model that assists in explaining the processes by which complex interventions become routinely embedded in health care practice. It offers a framework for process evaluation and also for comparative studies of complex interventions. It focuses on the factors that promote or inhibit the routine embedding of complex interventions in health care practice. Methods A formal theory structure is used to define the model, and its internal causal relations and mechanisms. The model is broken down to show that it is consistent and adequate in generating accurate description, systematic explanation, and the production of rational knowledge claims about the workability and integration of complex interventions. Results The model explains the normalization of complex interventions by reference to four factors demonstrated to promote or inhibit the operationalization and embedding of complex interventions (interactional workability, relational integration, skill-set workability, and contextual integration. Conclusion The model is consistent and adequate. Repeated calls for theoretically sound process evaluations in randomized controlled trials of complex interventions, and policy-makers who call for a proper understanding of implementation processes, emphasize the value of conceptual tools like the Normalization Process Model.

  12. The field theory approach to percolation processes

    International Nuclear Information System (INIS)

    Janssen, Hans-Karl; Taeuber, Uwe C.

    2005-01-01

    We review the field theory approach to percolation processes. Specifically, we focus on the so-called simple and general epidemic processes that display continuous non-equilibrium active to absorbing state phase transitions whose asymptotic features are governed, respectively, by the directed (DP) and dynamic isotropic percolation (dIP) universality classes. We discuss the construction of a field theory representation for these Markovian stochastic processes based on fundamental phenomenological considerations, as well as from a specific microscopic reaction-diffusion model realization. Subsequently we explain how dynamic renormalization group (RG) methods can be applied to obtain the universal properties near the critical point in an expansion about the upper critical dimensions d_c = 4 (DP) and d_c = 6 (dIP). We provide a detailed overview of results for critical exponents, scaling functions, crossover phenomena, finite-size scaling, and also briefly comment on the influence of long-range spreading, the presence of a boundary, multispecies generalizations, coupling of the order parameter to other conserved modes, and quenched disorder
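    The DP class referred to here is easy to probe numerically. A minimal sketch of a (1+1)-dimensional bond directed percolation process (my own illustration, not taken from the review; p_c ≈ 0.6447 is the known bond-DP threshold on the square lattice):

```python
import numpy as np

rng = np.random.default_rng(0)
L, T, p = 10_000, 2_000, 0.6447   # lattice size, time steps, near bond-DP threshold

active = np.ones(L, dtype=bool)   # fully active initial condition
density = []
for t in range(T):
    # a site is active at t+1 if reached through an open bond (probability p)
    # from an active left or right neighbour at time t
    left = np.roll(active, 1) & (rng.random(L) < p)
    right = np.roll(active, -1) & (rng.random(L) < p)
    active = left | right
    density.append(active.mean())

# at criticality the density decays as a power law ~ t^(-delta) with
# delta ~ 0.16 for the DP class; off criticality it saturates or dies out
print(density[::400])
```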

  13. Anterior EEG asymmetries and opponent process theory.

    Science.gov (United States)

    Kline, John P; Blackhart, Ginette C; Williams, William C

    2007-03-01

    The opponent process theory of emotion [Solomon, R.L., and Corbit, J.D. (1974). An opponent-process theory of motivation: I. Temporal dynamics of affect. Psychological Review, 81, 119-143.] predicts a temporary reversal of emotional valence during the recovery from emotional stimulation. We hypothesized that this affective contrast would be apparent in asymmetrical activity patterns in the frontal lobes, and would be more apparent for left frontally active individuals. The present study tested this prediction by examining EEG asymmetries during and after blocked presentations of aversive pictures selected from the International Affective Picture System (IAPS). 12 neutral images, 12 aversive images, and 24 neutral images were presented in blocks. Participants who were right frontally active at baseline did not show changes in EEG asymmetry while viewing aversive slides or after cessation. Participants left frontally active at baseline, however, exhibited greater relative left frontal activity after aversive stimulation than before stimulation. Asymmetrical activity patterns in the frontal lobes may relate to affect regulatory processes, including contrasting opponent after-reactions to aversive stimuli.

  14. Theories on migration processes of Cd in Jiaozhou Bay

    Science.gov (United States)

    Yang, Dongfang; Li, Haixia; Wang, Qi; Ding, Jun; Zhang, Longlei

    2018-03-01

    Understanding migration processes is essential to pollution control, and developing theories of those processes provides its scientific basis. This paper further develops five key theories on the migration processes of Cd: homogeneous theory, environmental dynamic theory, horizontal loss theory, migration trend theory, and vertical migration theory. The performance and practical value of these theories are demonstrated by applying them to the analysis of the migration process of Cd in Jiaozhou Bay. The results show that these theories are helpful for better understanding the migration of pollutants in marine bays.

  15. Examination of the neighborhood activation theory in normal and hearing-impaired listeners.

    Science.gov (United States)

    Dirks, D D; Takayanagi, S; Moshfegh, A; Noffsinger, P D; Fausti, S A

    2001-02-01

    well as to an elderly group of listeners with sensorineural hearing loss in the speech-shaped noise (Experiment 3). The results of three experiments verified predictions of NAM in both normal hearing and hearing-impaired listeners. In each experiment, words from low density neighborhoods were recognized more accurately than those from high density neighborhoods. The presence of high frequency neighbors (average neighborhood frequency) produced poorer recognition performance than comparable conditions with low frequency neighbors. Word frequency was found to have a highly significant effect on word recognition. Lexical conditions with high word frequencies produced higher performance scores than conditions with low frequency words. The results supported the basic tenets of NAM theory and identified both neighborhood structural properties and word frequency as significant lexical factors affecting word recognition when listening in noise and "in quiet." The results of the third experiment permit extension of NAM theory to individuals with sensorineural hearing loss. Future development of speech recognition tests should allow for the effects of higher level cognitive (lexical) factors on lower level phonemic processing.

  16. A normal form approach to the theory of nonlinear betatronic motion

    International Nuclear Information System (INIS)

    Bazzani, A.; Todesco, E.; Turchetti, G.; Servizi, G.

    1994-01-01

    The betatronic motion of a particle in a circular accelerator is analysed using the transfer map description of the magnetic lattice. In the linear case the transfer matrix approach is shown to be equivalent to the Courant-Snyder theory: in the normal-coordinates representation, the transfer matrix is a pure rotation. When the nonlinear effects due to the multipolar components of the magnetic field are taken into account, a similar procedure is used: a nonlinear change of coordinates provides a normal form representation of the map, which exhibits explicit symmetry properties depending on the absence or presence of resonance relations among the linear tunes. The use of normal forms is illustrated in the simplest but significant model of a cell with a sextupolar nonlinearity, which is described by the quadratic Hénon map. After recalling the basic theoretical results in Hamiltonian dynamics, we show how the normal forms describe the different topological structures of phase space such as KAM tori, chains of islands and chaotic regions; a critical comparison with the usual perturbation theory for Hamilton equations is given. The normal form theory is applied to compute the tune shift and deformation of the orbits for the lattices of the SPS and LHC accelerators, and scaling laws are obtained. Finally, the correction procedure of the multipolar errors of the LHC, based on the analytic minimization of the tune shift computed via the normal forms, is described and the results for a model of the LHC are presented. This application, relevant for the lattice design, focuses on the advantages of normal forms with respect to tracking when parametric dependences have to be explored. (orig.)

  17. Verbal Processing Reaction Times in "Normal" and "Poor" Readers.

    Science.gov (United States)

    Culbertson, Jack; And Others

    After it had been determined that reaction time (RT) was a sensitive measure of hemispheric dominance in a verbal task performed by normal adult readers, the reaction times of three groups of subjects (20 normal reading college students, 12 normal reading third graders and 11 poor reading grade school students) were compared. Ss were exposed to…

  18. Colorimetric determination of reducing normality in the Purex process

    International Nuclear Information System (INIS)

    Baumann, E.W.

    1983-07-01

    Adjustment of the valence state of plutonium from extractable Pu(IV) to nonextractable Pu(III) in the Purex process is accomplished by addition of reductants such as Fe(II), hydroxylamine nitrate (HAN), or U(IV). To implement on-line monitoring of this reduction step for improved process control at the Savannah River Plant, a simple colorimetric method for determining excess reductant (reducing normality) was developed. The method is based on formation of a colored complex of Fe(II) with FerroZine (Hach Chemical Company). The concentration of Fe(II) is determined directly. The concentration of HAN or U(IV), in addition to Fe(II), is determined indirectly as Fe(II), produced through reduction of Fe(III). Experimental conditions for a HAN-Fe(III) reaction of known stoichiometry were established. The effect of hydrazine, which stabilizes U(IV), was also determined. Real-time measurements of color development were made that simulated on-line performance. A laboratory analytical procedure is included. 5 references, 8 figures

  19. Working Memory Processing In Normal Subjects and Subjects with Dyslexia

    Science.gov (United States)

    Bowyer, S. M.; Lajiness-O'Neill, R.; Weiland, B. J.; Mason, K.; Tepley, N.

    2004-10-01

    Magnetoencephalography (MEG) was used to determine the neuroanatomical location of working memory (WM) processes. Differences between subjects with dyslexia (SD; n=5) and normal readers (NR; n=5) were studied during two WM tasks. A spatial WM task (SWM) consisted of blocks visually presented in one of 12 positions for 2 s each. Subjects were to determine if the current position matched the position presented 2 slides earlier (N-Back Test). The verbal WM task (VWM) consisted of presentation of a single letter. The location of cortical activity during SWM in NR (determined with MR-FOCUSS analysis) was in the right superior temporal gyrus (STG) and right angular gyrus (AG). Similar activation was seen in SD with a slight delay of approximately 20 ms. During VWM, activity was seen in the LEFT STG and LEFT AG in NR. In contrast, for SD, activation was in the RIGHT STG and RIGHT AG. This study demonstrates the possibility of differentiating WM processing in subjects with and without learning disorders.

  20. Theory of normal and superconducting properties of fullerene-based solids

    International Nuclear Information System (INIS)

    Cohen, M.L.

    1992-10-01

    Recent experiments on the normal-state and superconducting properties of fullerene-based solids are used to constrain the proposed theories of the electronic nature of these materials. In general, models of superconductivity based on electron pairing induced by phonons are consistent with electronic band theory. The experiments also yield estimates of the parameters characterizing these type-II superconductors. It is argued that, at this point, a ''standard model'' of phonons interacting with itinerant electrons may be a good first approximation for explaining the properties of the metallic fullerenes.

  1. Semi adiabatic theory of seasonal Markov processes

    Energy Technology Data Exchange (ETDEWEB)

    Talkner, P [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    The dynamics of many natural and technical systems are essentially influenced by a periodic forcing. Analytic solutions of the equations of motion for periodically driven systems are generally not known. Simulations, numerical solutions or, in some limiting cases, approximate analytic solutions represent the known approaches to studying the dynamics of such systems. Besides the regime of weak periodic forces where linear response theory works, the limit of a slow driving force can often be treated analytically using an adiabatic approximation. For this approximation to hold, all intrinsic processes must be fast on the time-scale of a period of the external driving force. We developed a perturbation theory for periodically driven Markovian systems that covers the adiabatic regime but also works if the system has a single slow mode that may even be slower than the driving force. We call it the semi-adiabatic approximation. Some results of this approximation are indicated for a system exhibiting stochastic resonance, which usually takes place within the semi-adiabatic regime. (author) 1 fig., 8 refs.

  2. Toward a computational theory of conscious processing.

    Science.gov (United States)

    Dehaene, Stanislas; Charles, Lucie; King, Jean-Rémi; Marti, Sébastien

    2014-04-01

    The study of the mechanisms of conscious processing has become a productive area of cognitive neuroscience. Here we review some of the recent behavioral and neuroscience data, with the specific goal of constraining present and future theories of the computations underlying conscious processing. Experimental findings imply that most of the brain's computations can be performed in a non-conscious mode, but that conscious perception is characterized by an amplification, global propagation and integration of brain signals. A comparison of these data with major theoretical proposals suggests that firstly, conscious access must be carefully distinguished from selective attention; secondly, conscious perception may be likened to a non-linear decision that 'ignites' a network of distributed areas; thirdly, information which is selected for conscious perception gains access to additional computations, including temporary maintenance, global sharing, and flexible routing; and finally, measures of the complexity, long-distance correlation and integration of brain signals provide reliable indices of conscious processing, clinically relevant to patients recovering from coma. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.

    Science.gov (United States)

    Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica

    2014-05-01

    The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases. And as skewness in absolute value and kurtosis increases, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
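    The asymmetric distribution-of-the-product interval discussed here can be approximated by Monte Carlo simulation of the two coefficient estimates, assumed normal. A sketch with hypothetical path estimates and standard errors, contrasted with the symmetric normal-theory (Sobel-type) interval:

```python
import numpy as np

def product_ci(a, se_a, b, se_b, n_sim=200_000, alpha=0.05, seed=1):
    """Monte Carlo approximation to the distribution-of-the-product CI for
    the indirect effect a*b (asymmetric, unlike the normal-theory CI)."""
    rng = np.random.default_rng(seed)
    ab = rng.normal(a, se_a, n_sim) * rng.normal(b, se_b, n_sim)
    return np.quantile(ab, [alpha / 2, 1 - alpha / 2])

# hypothetical path estimates and standard errors
a, se_a, b, se_b = 0.40, 0.15, 0.30, 0.12
lo, hi = product_ci(a, se_a, b, se_b)

# symmetric normal-theory interval from the first-order (Sobel) standard error
z = 1.96
se_ab = np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
print(f"distribution-of-product CI: [{lo:.4f}, {hi:.4f}]")
print(f"normal-theory CI:           [{a*b - z*se_ab:.4f}, {a*b + z*se_ab:.4f}]")
```

    The asymmetry of the first interval relative to the second is exactly the skewness effect the article describes.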

  4. Whiteheadian process and quantum theory of mind

    International Nuclear Information System (INIS)

    Stapp, H.

    1998-01-01

    There are deep similarities between Whitehead's idea of the process by which nature unfolds and the ideas of quantum theory. Whitehead says that the world is made of ''actual occasions'', each of which arises from potentialities created by prior actual occasions. These actual occasions are happenings modeled on experiential events, each of which comes into being and then perishes, only to be replaced by a successor. It is these experience-like happenings that are the basic realities of nature, according to Whitehead, not the persisting physical particles that Newtonian physics took be the basic entities. Similarly, Heisenberg says that what is really happening in a quantum process is the emergence of an actual from potentialities created by prior actualities. In the orthodox Copenhagen interpretation of quantum theory the actual things to which the theory refer are increments in ''our knowledge''. These increments are experiential events. The particles of classical physics lose their fundamental status: they dissolve into diffuse clouds of possibilities. At each stage of the unfolding of nature the complete cloud of possibilities acts like the potentiality for the occurrence of a next increment in knowledge, whose occurrence can radically change the cloud of possibilities/potentialities for the still-later increments in knowledge. The fundamental difference between these ideas about nature and the classical ideas that reigned from the time of Newton until this century concerns the status of the experiential aspects of nature. These are things such as thoughts, ideas, feelings, and sensations. They are distinguished from the physical aspects of nature, which are described in terms of quantities explicitly located in tiny regions of space and time. According to the ideas of classical physics the physical world is made up exclusively of things of this latter type, and the unfolding of the physical world is determined by causal connections involving only these things

  5. Calculation of TC in a normal-superconductor bilayer using the microscopic-based Usadel theory

    International Nuclear Information System (INIS)

    Martinis, John M.; Hilton, G.C.; Irwin, K.D.; Wollman, D.A.

    2000-01-01

    The Usadel equations give a theory of superconductivity, valid in the diffusive limit, that is a generalization of the microscopic equations of the BCS theory. Because the theory is expressed in a tractable and physical form, even experimentalists can analytically and numerically calculate detailed properties of superconductors in physically relevant geometries. Here, we describe the Usadel equations and review their solution in the case of predicting the transition temperature TC of a thin normal-superconductor bilayer. We also extend this calculation to thicker bilayers to show the dependence on the resistivity of the films. These results, which show a dependence on both the interface resistance and heat capacity of the films, provide important guidance on fabricating bilayers with reproducible transition temperatures.
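    For rough orientation only (this is the much cruder thin-film Cooper limit, not the Usadel calculation of the paper): with both films much thinner than the coherence length, a transparent interface, and no intrinsic pairing in the normal layer, the bilayer behaves as a single superconductor with a thickness-weighted coupling,

    $$ \lambda_{\mathrm{eff}} \;=\; \frac{N_S d_S\, \lambda_S}{N_S d_S + N_N d_N}, \qquad T_C \;\approx\; 1.13\,\Theta_D\, e^{-1/\lambda_{\mathrm{eff}}}, $$

    where $N$, $d$ and $\lambda$ are the densities of states, thicknesses and BCS couplings of the superconducting (S) and normal (N) films. This already shows why TC falls as the normal film thickens; capturing the interface-resistance dependence requires the full Usadel treatment.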

  6. Long-wave theory for a new convective instability with exponential growth normal to the wall.

    Science.gov (United States)

    Healey, J J

    2005-05-15

    A linear stability theory is presented for the boundary-layer flow produced by an infinite disc rotating at constant angular velocity in otherwise undisturbed fluid. The theory is developed in the limit of long waves and when the effects of viscosity on the waves can be neglected. This is the parameter regime recently identified by the author in a numerical stability investigation where a curious new type of instability was found in which disturbances propagate and grow exponentially in the direction normal to the disc, (i.e. the growth takes place in a region of zero mean shear). The theory describes the mechanisms controlling the instability, the role and location of critical points, and presents a saddle-point analysis describing the large-time evolution of a wave packet in frames of reference moving normal to the disc. The theory also shows that the previously obtained numerical solutions for numerically large wavelengths do indeed lie in the asymptotic long-wave regime, and so the behaviour and mechanisms described here may apply to a number of cross-flow instability problems.

  7. Theory of Neural Information Processing Systems

    International Nuclear Information System (INIS)

    Galla, Tobias

    2006-01-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kuehn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the

  8. How to Develop a Multi-Grounded Theory: the evolution of a business process theory

    OpenAIRE

    Mikael Lind; Goran Goldkuhl

    2006-01-01

    In the information systems field there is a great need for different theories. Theory development can be performed in different ways – deductively and/or inductively. Different approaches to theory development exist, each with its pros and cons. A combined approach, which builds on inductive as well as deductive thinking, has been put forward – a Multi-Grounded Theory approach. In this paper the evolution of a business process theory is regarded as the development of a multi-grounded theory. Th...

  9. Dual-Process Theories and Cognitive Development: Advances and Challenges

    Science.gov (United States)

    Barrouillet, Pierre

    2011-01-01

    Dual-process theories have gained increasing importance in psychology. The contrast that they describe between an old intuitive and a new deliberative mind seems to make these theories especially suited to account for development. Accordingly, this special issue aims at presenting the latest applications of dual-process theories to cognitive…

  10. Estimating Non-Normal Latent Trait Distributions within Item Response Theory Using True and Estimated Item Parameters

    Science.gov (United States)

    Sass, D. A.; Schmitt, T. A.; Walker, C. M.

    2008-01-01

    Item response theory (IRT) procedures have been used extensively to study normal latent trait distributions and have been shown to perform well; however, less is known concerning the performance of IRT with non-normal latent trait distributions. This study investigated the degree of latent trait estimation error under normal and non-normal…

  11. Theories of transporting processes of Cu in Jiaozhou Bay

    Science.gov (United States)

    Yang, Dongfang; Su, Chunhua; Zhu, Sixi; Wu, Yunjie; Zhou, Wei

    2018-02-01

    Many marine bays have been polluted along with the rapid development of industry and growth in population, and understanding the transport processes of pollutants is essential to pollution control. In order to better understand the transport of pollutants in the marine environment, this paper presents a comprehensive study of the theories of the transport processes of Cu in Jiaozhou Bay. Results showed that the transport processes of Cu in this bay can be summarized in seven key theories: homogeneous theory, environmental dynamic theory, horizontal loss theory, source-to-waters transport theory, sedimentation transport theory, migration trend theory, and vertical transport theory. These theories are helpful for better understanding the migration of pollutants in marine bays.

  12. A Short Review of the Theory of Hard Exclusive Processes

    International Nuclear Information System (INIS)

    Wallon, S.

    2012-01-01

    We first present an introduction to the theory of hard exclusive processes. We then illustrate this theory by a few selected examples. The last part is devoted to the most recent developments in the asymptotical energy limit. (author)

  13. Fetterman-House: A Process Use Distinction and a Theory.

    Science.gov (United States)

    Fetterman, David

    2003-01-01

    Discusses the concept of process use as an important distinction between the evaluation theories of E. House and D. Fetterman, thus helping to explain the discordant results of C. Christie for these two theories. (SLD)

  14. NORMAL PRESSURE AND FRICTION STRESS MEASUREMENT IN ROLLING PROCESSES

    DEFF Research Database (Denmark)

    Henningsen, Poul; Arentoft, Mogens; Lagergren, Jonas

    2005-01-01

    the output from the transducer, the friction stress and normal pressure in the contact zone can be determined. The new concept differs from existing pin designs by a lower disturbance of the lubricant film and material flow and by limited penetration of material between transducer and roll. Aluminum, copper...

  15. Tuned with a tune: Talker normalization via general auditory processes

    Directory of Open Access Journals (Sweden)

    Erika J C Laing

    2012-06-01

    Full Text Available Voices have unique acoustic signatures, contributing to the acoustic variability listeners must contend with in perceiving speech, and it has long been proposed that listeners normalize speech perception to information extracted from a talker’s speech. Initial attempts to explain talker normalization relied on extraction of articulatory referents, but recent studies of context-dependent auditory perception suggest that general auditory referents such as the long-term average spectrum (LTAS of a talker’s speech similarly affect speech perception. The present study aimed to differentiate the contributions of articulatory/linguistic versus auditory referents for context-driven talker normalization effects and, more specifically, to identify the specific constraints under which such contexts impact speech perception. Synthesized sentences manipulated to sound like different talkers influenced categorization of a subsequent speech target only when differences in the sentences’ LTAS were in the frequency range of the acoustic cues relevant for the target phonemic contrast. This effect was true both for speech targets preceded by spoken sentence contexts and for targets preceded by nonspeech tone sequences that were LTAS-matched to the spoken sentence contexts. Specific LTAS characteristics, rather than perceived talker, predicted the results suggesting that general auditory mechanisms play an important role in effects considered to be instances of perceptual talker normalization.
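    The LTAS referent invoked here is straightforward to compute. A minimal sketch (assuming a mono signal array; Welch-averaged short-time spectra are one common way to define an LTAS, and the parameter values are illustrative):

```python
import numpy as np
from scipy.signal import welch

def ltas(signal, fs, seg_seconds=0.025):
    """Long-term average spectrum: short-time power spectra averaged over
    the whole utterance (Welch's method), returned in dB."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(fs * seg_seconds))
    return freqs, 10.0 * np.log10(psd + 1e-12)

# toy 'utterance': white noise smoothed to tilt energy toward low frequencies
fs = 16000
rng = np.random.default_rng(0)
x = np.convolve(rng.normal(size=fs), np.ones(8) / 8, mode="same")
freqs, spectrum = ltas(x, fs)
print(freqs[np.argmax(spectrum)])   # peak sits at low frequency for this signal
```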

  16. [Quantitative analysis method based on fractal theory for medical imaging of normal brain development in infants].

    Science.gov (United States)

    Li, Heheng; Luo, Liangping; Huang, Li

    2011-02-01

    The present paper is aimed at studying the fractal spectrum of cerebral computerized tomography in 158 normal infants of different age groups, based on calculations from chaos theory. The distribution range in the neonatal period was 1.88-1.90 (mean = 1.8913 +/- 0.0064); it reached a stable condition at the level of 1.89-1.90 during 1-12 months of age (mean = 1.8927 +/- 0.0045); the normal range for infants 1-2 years old was 1.86-1.90 (mean = 1.8863 +/- 0.0085); and the quantitative value remained invariant within 1.88-1.91 (mean = 1.8958 +/- 0.0083) during 2-3 years of age. ANOVA indicated no significant difference between boys and girls (F = 0.243, P > 0.05), but the difference between age groups was significant (F = 8.947, P < 0.05) ... development.
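    The fractal dimension of a segmented slice is commonly estimated by box counting, the slope of log N(s) against log(1/s). A minimal sketch (assuming a binary 2-D image as input, not the authors' exact pipeline):

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary 2-D array by box counting:
    the slope of log N(s) versus log(1/s), where N(s) is the number of boxes
    of side s containing any foreground pixel."""
    counts = []
    for s in sizes:
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# sanity check on a dense random mask: dimension should come out close to 2
rng = np.random.default_rng(0)
print(box_counting_dimension(rng.random((256, 256)) < 0.5))
```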

  17. Assessment the Plasticity of Cortical Brain Theory through Visual Memory in Deaf and Normal Students

    Directory of Open Access Journals (Sweden)

    Ali Ghanaee-Chamanabad

    2012-10-01

    Full Text Available Background: The main aim of this research was to assess the differences of visual memory in deaf and normal students according to plasticity of cortical brain.Materials and Methods: This is an ex-post factor research. Benton visual test was performed by two different ways on 46 students of primary school. (22 deaf and 24 normal students. The t-student was used to analysis the data. Results: The visual memory in deaf students was significantly higher than the similar normal students (not deaf.While the action of visual memory in deaf girls was risen in comparison to normal girls in both ways, the deaf boys presented the better action in just one way of the two performances of Benton visual memory test.Conclusion: The action of plasticity of brain shows that the brain of an adult is dynamic and there are some changes in it. This brain plasticity has not limited to sensory somatic systems. Therefore according to plasticity of cortical brain theory, the deaf students due to the defect of hearing have increased the visual the visual inputs which developed the procedural visual memory.

  18. Biomechanical Analysis of Normal Brain Development during the First Year of Life Using Finite Strain Theory.

    Science.gov (United States)

    Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili

    2016-12-02

    The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and shear strains) were measured to reveal directional growth information every 3 months during the first year of life. Directional normal strain maps revealed that, during the first 6 months, the growth pattern of gray matter is anisotropic and spatially inhomogeneous with higher left-right stretch around the temporal lobe and interhemispheric fissure, anterior-posterior stretch in the frontal and occipital lobes, and superior-inferior stretch in right inferior occipital and right inferior temporal gyri. In contrast, anterior lateral ventricles and insula showed an isotropic stretch pattern. Volumetric and directional growth rates were linearly decreased with age for most of the cortical regions. Our results revealed anisotropic and inhomogeneous brain growth patterns of the human brain during the first year of life using longitudinal MRI and a biomechanical framework.
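    The voxel-wise quantities named here follow directly from the deformation gradient of a registration's displacement field. A sketch under simplifying assumptions (unit voxel spacing, displacement field stored as an array of shape (3, X, Y, Z); this is my convention, not the authors' pipeline):

```python
import numpy as np

def growth_maps(u):
    """Voxel-wise Jacobian determinant and Green-Lagrange strain tensor from
    a displacement field u of shape (3, X, Y, Z) with unit voxel spacing.
    F = I + grad(u); J = det(F); E = 0.5 * (F^T F - I), whose diagonal holds
    the normal strains and whose off-diagonal holds the shear strains."""
    grad = np.stack([np.stack(np.gradient(u[i]), axis=0) for i in range(3)])
    # grad[i, j] = d u_i / d x_j, shape (3, 3, X, Y, Z)
    F = np.eye(3)[:, :, None, None, None] + grad
    F = np.moveaxis(F, (0, 1), (-2, -1))          # -> (X, Y, Z, 3, 3)
    J = np.linalg.det(F)
    E = 0.5 * (np.swapaxes(F, -1, -2) @ F - np.eye(3))
    return J, E

# toy field: uniform 5% stretch along x -> J ~ 1.05, E_xx ~ 0.05125
X, Y, Z = 8, 8, 8
u = np.zeros((3, X, Y, Z))
u[0] = 0.05 * np.arange(X)[:, None, None]
J, E = growth_maps(u)
print(J.mean(), E[..., 0, 0].mean())
```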

  19. Challenges in clinical natural language processing for automated disorder normalization.

    Science.gov (United States)

    Leaman, Robert; Khare, Ritu; Lu, Zhiyong

    2015-10-01

    Identifying key variables such as disorders within the clinical narratives in electronic health records has wide-ranging applications within clinical practice and biomedical research. Previous research has demonstrated lower performance of disorder named entity recognition (NER) and normalization (or grounding) in clinical narratives than in biomedical publications. In this work, we aim to identify the cause of this performance difference and introduce general solutions. We use closure properties to compare the richness of the vocabulary in clinical narrative text to biomedical publications. We approach both disorder NER and normalization using machine learning methodologies. Our NER methodology is based on linear-chain conditional random fields with a rich feature approach, and we introduce several improvements to enhance the lexical knowledge of the NER system. Our normalization method - never previously applied to clinical data - uses pairwise learning to rank to automatically learn term variation directly from the training data. We find that while the size of the overall vocabulary is similar between clinical narrative and biomedical publications, clinical narrative uses a richer terminology to describe disorders than publications. We apply our system, DNorm-C, to locate and normalize disorder mentions in the clinical narratives from the recent ShARe/CLEF eHealth Task. For NER (strict span-only), our system achieves precision = 0.797, recall = 0.713, f-score = 0.753. For the normalization task (strict span+concept) it achieves precision = 0.712, recall = 0.637, f-score = 0.672. The improvements described in this article increase the NER f-score by 0.039 and the normalization f-score by 0.036. We also describe a high-recall version of the NER, which increases the normalization recall to as high as 0.744, albeit with reduced precision. We perform an error analysis, demonstrating that NER errors outnumber normalization errors by more than 4-to-1. Abbreviations and acronyms are found
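
    The actual system scores a mention m against a candidate concept name c as m^T W c, with W learned from training pairs by pairwise learning to rank. A toy stand-in with W = I (plain TF-IDF cosine over character n-grams) conveys the candidate-ranking step; the lexicon, concept IDs, and mention below are all hypothetical.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy disorder lexicon (IDs are illustrative, not real concept identifiers)
lexicon = {"D001": "myocardial infarction",
           "D002": "heart failure",
           "D003": "renal failure"}

vec = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
names = list(lexicon.values())
C = vec.fit_transform(names)                      # candidate concept-name vectors

def rank_candidates(mention):
    """Rank lexicon entries for a mention. DNorm-style systems replace this
    cosine score (W = I) with m^T W c, learning W via pairwise ranking."""
    m = vec.transform([mention])                  # TF-IDF rows are l2-normalized,
    scores = (m @ C.T).toarray().ravel()          # so the dot product is cosine
    order = np.argsort(-scores)
    return [(list(lexicon)[i], names[i], float(scores[i])) for i in order]

print(rank_candidates("cardiac failure"))         # shared n-grams give partial credit
```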

  20. Density functional theory and parallel processing

    International Nuclear Information System (INIS)

    Ward, R.C.; Geist, G.A.; Butler, W.H.

    1987-01-01

    The authors demonstrate a method for obtaining the ground state energies and charge densities of a system of atoms described within density functional theory using simulated annealing on a parallel computer
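
    The record gives no implementation detail; the following is a generic simulated-annealing sketch of the kind of minimization involved, with a toy energy function standing in for the DFT total-energy functional and the parallel decomposition omitted (all names and parameters are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)

def anneal(energy, x0, T0=1.0, cooling=0.995, steps=20000, scale=0.1):
    """Generic simulated annealing: accept uphill moves with probability
    exp(-dE/T) while the temperature T is slowly cooled."""
    x, E, T = np.asarray(x0, float), energy(x0), T0
    for _ in range(steps):
        cand = x + rng.normal(scale=scale, size=x.shape)   # random trial move
        dE = energy(cand) - E
        if dE < 0 or rng.random() < np.exp(-dE / T):
            x, E = cand, E + dE
        T *= cooling
    return x, E

# Toy stand-in for a total-energy functional with several minima
E = lambda x: np.sum((x**2 - 1.0)**2)
print(anneal(E, rng.normal(size=4)))
```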

  1. Elements of the theory of Markov processes and their applications

    CERN Document Server

    Bharucha-Reid, A T

    2010-01-01

    This graduate-level text and reference in probability, with numerous applications to several fields of science, presents a nonmeasure-theoretic introduction to the theory of Markov processes. The work also covers mathematical models based on the theory, employed in various applied fields. Prerequisites are a knowledge of elementary probability theory, mathematical statistics, and analysis. Appendixes. Bibliographies. 1960 edition.

  2. Applying Information Processing Theory to Supervision: An Initial Exploration

    Science.gov (United States)

    Tangen, Jodi L.; Borders, L. DiAnne

    2017-01-01

    Although clinical supervision is an educational endeavor (Borders & Brown, 2005), many scholars neglect theories of learning in working with supervisees. The authors describe 1 learning theory--information processing theory (Atkinson & Shiffrin, 1968, 1971; Schunk, 2016)--and the ways its associated interventions may…

  3. Could information theory provide an ecological theory of sensory processing?

    Science.gov (United States)

    Atick, Joseph J

    2011-01-01

    The sensory pathways of animals are well adapted to processing a special class of signals, namely stimuli from the animal's environment. An important fact about natural stimuli is that they are typically very redundant and hence the sampled representation of these signals formed by the array of sensory cells is inefficient. One could argue for some animals and pathways, as we do in this review, that efficiency of information representation in the nervous system has several evolutionary advantages. Consequently, one might expect that much of the processing in the early levels of these sensory pathways could be dedicated towards recoding incoming signals into a more efficient form. In this review, we explore the principle of efficiency of information representation as a design principle for sensory processing. We give a preliminary discussion on how this principle could be applied in general to predict neural processing and then discuss concretely some neural systems where it recently has been shown to be successful. In particular, we examine the fly's LMC coding strategy and the mammalian retinal coding in the spatial, temporal and chromatic domains.
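
    The review's central idea - recoding redundant sensory input into a more efficient representation - can be illustrated numerically with a whitening transform. The sketch below (synthetic correlated signal; not any specific retina or LMC model from the review) shows redundancy being removed so that the recoded channels carry decorrelated information.

```python
import numpy as np

rng = np.random.default_rng(1)

# Redundant "natural" signal: neighbouring channels are strongly correlated
n, dim = 5000, 8
z = rng.normal(size=(n, dim + 2))
x = z[:, :dim] + z[:, 1:dim + 1] + z[:, 2:dim + 2]    # smoothing adds redundancy

# Whitening transform: decorrelate the channels (redundancy reduction)
C = np.cov(x, rowvar=False)
evals, evecs = np.linalg.eigh(C)
W = evecs @ np.diag(evals ** -0.5) @ evecs.T          # symmetric (ZCA) whitening
y = x @ W.T

print(np.round(np.cov(x, rowvar=False)[0, :3], 2))    # correlated inputs
print(np.round(np.cov(y, rowvar=False)[0, :3], 2))    # ~identity after recoding
```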

  4. Measurement of Normal and Friction Forces in a Rolling Process

    DEFF Research Database (Denmark)

    Henningsen, Poul; Arentoft, Mogens; Wanheim, Tarras

    2004-01-01

    by the friction conditions. To achieve this important information, measurements of the normal pressure and friction stresses in the deformation zone are required. The direction of the friction stresses changes along the roll gap. At the entrance of the deformation zone, the peripheral velocity of the roll is higher than that of the incoming material, which causes frictional stresses on the material acting in the rolling direction. At the outlet of the roll gap, the velocity of the deformed material exceeds the velocity of the roll, generating frictional stresses contrary to the direction of rolling. In a narrow area of the deformation zone, the velocity of the deformed material is equal to the velocity of the rolls; this area or line is named the "neutral line". The position of the neutral line depends on friction, reduction ratio, roll diameter, and sheet width.

  5. Analysis of a renormalization group method and normal form theory for perturbed ordinary differential equations

    Science.gov (United States)

    DeVille, R. E. Lee; Harkin, Anthony; Holzer, Matt; Josić, Krešimir; Kaper, Tasso J.

    2008-06-01

    For singular perturbation problems, the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. E. 49 (1994) 4502-4511] has been shown to be an effective general approach for deriving reduced or amplitude equations that govern the long time dynamics of the system. It has been applied to a variety of problems traditionally analyzed using disparate methods, including the method of multiple scales, boundary layer theory, the WKBJ method, the Poincaré-Lindstedt method, the method of averaging, and others. In this article, we show how the RG method may be used to generate normal forms for large classes of ordinary differential equations. First, we apply the RG method to systems with autonomous perturbations, and we show that the reduced or amplitude equations generated by the RG method are equivalent to the classical Poincaré-Birkhoff normal forms for these systems up to and including terms of O(ɛ2), where ɛ is the perturbation parameter. This analysis establishes our approach and generalizes to higher order. Second, we apply the RG method to systems with nonautonomous perturbations, and we show that the reduced or amplitude equations so generated constitute time-asymptotic normal forms, which are based on KBM averages. Moreover, for both classes of problems, we show that the main coordinate changes are equivalent, up to translations between the spaces in which they are defined. In this manner, our results show that the RG method offers a new approach for deriving normal forms for nonautonomous systems, and it offers advantages since one can typically more readily identify resonant terms from naive perturbation expansions than from the nonautonomous vector fields themselves. Finally, we establish how well the solution to the RG equations approximates the solution of the original equations on time scales of O(1/ɛ).
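
    As a concrete illustration (a standard textbook example, not one drawn from the article), the weakly nonlinear oscillator below shows the kind of amplitude/normal-form equation the RG procedure produces at first order in the perturbation parameter:

```latex
% Duffing-type oscillator:  y'' + y + \epsilon y^3 = 0,  with 0 < \epsilon \ll 1.
% Writing y \approx A(t) e^{it} + \mathrm{c.c.}, the resonant (secular) part of
% -\epsilon y^3 is -3\epsilon |A|^2 A \, e^{it}; removing it by renormalizing A
% yields the amplitude (normal-form) equation
\begin{equation}
  \frac{dA}{dt} \;=\; \frac{3i\epsilon}{2}\,\lvert A\rvert^{2} A .
\end{equation}
% With A = \tfrac{a}{2} e^{i\phi} this reads a' = 0,\; \phi' = \tfrac{3\epsilon}{8} a^{2},
% i.e. a pure frequency shift \omega = 1 + \tfrac{3}{8}\epsilon a^{2} + O(\epsilon^{2}),
% in agreement with the Poincare-Lindstedt result.
```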

  6. The relationship of theory of mind and executive functions in normal, deaf and cochlear-implanted children

    Directory of Open Access Journals (Sweden)

    Farideh Nazarzadeh

    2014-08-01

    Full Text Available Background and Aim: Theory of mind refers to the ability to understand that others have mental states that can be different from one's own mental states or from the facts. This study aimed to investigate the relationship of theory of mind and executive functions in normal-hearing, deaf, and cochlear-implanted children. Methods: The study population consisted of normal-hearing, deaf, and cochlear-implanted girl students in Mashhad city, Iran. Using random sampling, 30 children (10 normal, 10 deaf, and 10 cochlear-implanted) in age groups of 8-12 years were selected. The 38-item theory of mind scale was used to measure theory of mind, and the Coolidge neuropsychological and personality test was used to assess executive function. Research data were analyzed using the Spearman correlation coefficient, analysis of variance, and Kruskal-Wallis tests. Results: There was a significant difference between the groups in theory of mind and in the executive function subscales of organization, planning/decision-making, and inhibition. Between the normal and deaf groups (p=0.01), as well as the cochlear-implanted and deaf groups (p=0.01), there was a significant difference in the planning/decision-making subscale. There was no significant relationship between theory of mind and executive functions generally, or between theory of mind and the executive function subscales, in any of the three groups independently. Conclusion: Based on our findings, cochlear-implanted and deaf children have lower performance in theory of mind and executive function compared with normal-hearing children.

  7. How to Develop a Multi-Grounded Theory: the evolution of a business process theory

    Directory of Open Access Journals (Sweden)

    Mikael Lind

    2006-05-01

    Full Text Available In the information systems field there is a great need for different theories. Theory development can be performed in different ways - deductively and/or inductively. Different approaches to theory development exist, each with its pros and cons. A combined approach, which builds on inductive as well as deductive thinking, has been put forward - a Multi-Grounded Theory approach. In this paper the evolution of a business process theory is regarded as the development of a multi-grounded theory. This evolution is based on empirical studies, theory-informed conceptual development, and the creation of conceptual cohesion. The theoretical development has involved a dialectic approach aiming at a theoretical synthesis based on antagonistic theories. The result of this research process was a multi-grounded business process theory. Multi-grounded means that the theory is empirically, internally, and theoretically founded. This business process theory can be used as an aid for business modellers to direct attention towards relevant aspects when business process determination is performed.

  8. Process theory for supervisory control of stochastic systems with data

    NARCIS (Netherlands)

    Markovski, J.

    2012-01-01

    We propose a process theory for supervisory control of stochastic nondeterministic plants with data-based observations. The Markovian process theory with data relies on the notion of Markovian partial bisimulation to capture controllability of stochastic nondeterministic systems. It presents a

  9. Fractal Point Process and Queueing Theory and Application to Communication Networks

    National Research Council Canada - National Science Library

    Wornel, Gregory

    1999-01-01

    .... A unifying theme in the approaches to these problems has been an integration of interrelated perspectives from communication theory, information theory, signal processing theory, and control theory...

  10. Hamiltonian kinetic theory of plasma ponderomotive processes

    International Nuclear Information System (INIS)

    McDonald, S.W.; Kaufman, A.N.

    1982-01-01

    The nonlinear nonresonant interaction of plasma waves and particles is formulated in Hamiltonian kinetic theory which treats the wave-action and particle distributions on an equal footing, thereby displaying reciprocity relations. In the quasistatic limit, a nonlinear wave-kinetic equation is obtained. The generality of the formalism allows for applications to arbitrary geometry, with the nonlinear effects expressed in terms of the linear susceptibility

  11. Hamiltonian kinetic theory of plasma ponderomotive processes

    International Nuclear Information System (INIS)

    McDonald, S.W.; Kaufman, A.N.

    1981-12-01

    The nonlinear nonresonant interaction of plasma waves and particles is formulated in a Hamiltonian kinetic theory which treats the wave-action and particle distributions on an equal footing, thereby displaying reciprocity relations. In the quasistatic limit, a nonlinear wave-kinetic equation is obtained. The generality of the formalism allows for applications to arbitrary geometry, with the nonlinear effects expressed in terms of the linear susceptibility

  12. The conflict and process theory of Melanie Klein.

    Science.gov (United States)

    Kavaler-Adler, S

    1993-09-01

    This article depicts the theory of Melanie Klein in both its conflict and process dimensions. In addition, it outlines Klein's strategic place in psychoanalytic history and in psychoanalytic theory formation. Her major contributions are seen in light of their clinical imperatives, and aspects of her metapsychology that seem negligible are differentiated from these clinical imperatives. Klein's role as a dialectical fulcrum between drive and object relations theories is explicated. Within the conflict theory, drive derivatives of sex and aggression are reformulated as object-related passions of love and hate. The process dimensions of Klein's theory are outlined in terms of dialectical increments of depressive-position process as it alternates with regressive paranoid-schizoid-position mental phenomenology. The mourning process as a developmental process is particularly highlighted in terms of self-integrative progression within the working through of the depressive position.

  13. Modal analysis of inter-area oscillations using the theory of normal modes

    Energy Technology Data Exchange (ETDEWEB)

    Betancourt, R.J. [School of Electromechanical Engineering, University of Colima, Manzanillo, Col. 28860 (Mexico); Barocio, E. [CUCEI, University of Guadalajara, Guadalajara, Jal. 44480 (Mexico); Messina, A.R. [Graduate Program in Electrical Engineering, Cinvestav, Guadalajara, Jal. 45015 (Mexico); Martinez, I. [State Autonomous University of Mexico, Toluca, Edo. Mex. 50110 (Mexico)

    2009-04-15

    Based on the notion of normal modes in mechanical systems, a method is proposed for the analysis and characterization of oscillatory processes in power systems. The method is based on the property of invariance of modal subspaces and can be used to represent complex power system modal behavior by a set of decoupled, two-degree-of-freedom nonlinear oscillator equations. Using techniques from nonlinear mechanics, a new approach is outlined, for determining the normal modes (NMs) of motion of a general n-degree-of-freedom nonlinear system. Equations relating the normal modes and the physical velocities and displacements are developed from the linearized system model and numerical issues associated with the application of the technique are discussed. In addition to qualitative insight, this method can be utilized in the study of nonlinear behavior and bifurcation analyses. The application of these procedures is illustrated on a planning model of the Mexican interconnected system using a quadratic nonlinear model. Specifically, the use of normal mode analysis as a basis for identifying modal parameters, including natural frequencies and damping ratios of general, linear systems with n degrees of freedom is discussed. Comparisons to conventional linear analysis techniques demonstrate the ability of the proposed technique to extract the different oscillation modes embedded in the oscillation. (author)
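
    The paper's nonlinear normal-mode machinery is not reproduced here, but the modal parameters it mentions - natural frequencies and damping ratios of a linear n-degree-of-freedom system - fall directly out of an eigendecomposition of the state matrix. A minimal sketch (the matrix values below are illustrative, not a real power-system model):

```python
import numpy as np

def modal_parameters(A):
    """Natural frequencies (Hz) and damping ratios of x' = A x,
    from the eigenvalues sigma +/- j*omega of the state matrix."""
    eig = np.linalg.eigvals(A)
    modes = eig[eig.imag > 0]                  # one of each complex pair
    freq = modes.imag / (2 * np.pi)            # damped frequency in Hz
    zeta = -modes.real / np.abs(modes)         # damping ratio
    return freq, zeta

# Two weakly damped oscillatory modes, e.g. a local and an inter-area mode
A = np.zeros((4, 4))
A[0, 1], A[2, 3] = 1.0, 1.0
A[1, :2] = [-(2 * np.pi * 1.2) ** 2, -0.3]     # ~1.2 Hz "local" mode
A[3, 2:] = [-(2 * np.pi * 0.4) ** 2, -0.1]     # ~0.4 Hz "inter-area" mode
print(modal_parameters(A))
```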

  14. Process Dissociation and Mixture Signal Detection Theory

    Science.gov (United States)

    DeCarlo, Lawrence T.

    2008-01-01

    The process dissociation procedure was developed in an attempt to separate different processes involved in memory tasks. The procedure naturally lends itself to a formulation within a class of mixture signal detection models. The dual process model is shown to be a special case. The mixture signal detection model is applied to data from a widely…
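
    For readers unfamiliar with the procedure the abstract builds on, the classic process-dissociation estimates (Jacoby's inclusion/exclusion equations, which the mixture signal detection model generalizes) are a two-line computation; the response rates below are hypothetical.

```python
def process_dissociation(inclusion, exclusion):
    """Classic process-dissociation estimates.

    inclusion: P(respond "old") when recollection and familiarity both help
    exclusion: P(respond "old") when recollection opposes familiarity
    Assumes R < 1 so the familiarity estimate is defined.
    """
    R = inclusion - exclusion          # recollection
    F = exclusion / (1.0 - R)          # familiarity
    return R, F

print(process_dissociation(0.70, 0.30))   # R = 0.40, F = 0.50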

  15. Theory of superconductivity. II. Excited Cooper pairs. Why does sodium remain normal down to 0 K?

    International Nuclear Information System (INIS)

    Fujita, S.

    1992-01-01

    Based on a generalized BCS Hamiltonian in which the interaction strengths (V11, V22, V12) among and between electron (1) and hole (2) Cooper pairs are differentiated, the thermodynamic properties of a type-I superconductor below the critical temperature Tc are investigated. An expression for the ground-state energy, W - W0, relative to the unperturbed Bloch system is obtained. The usual BCS formulas are obtained in the limits: (all) Vjl = V0, N1(0) = N2(0). Any excitations generated through the BCS interaction Hamiltonian containing Vjl must involve Cooper pairs of antiparallel spins and nearly opposite momenta. The nonzero-momentum or excited Cooper pairs below Tc are shown to have an excitation energy band minimum lower than the quasi-electrons, which were regarded as the elementary excitations in the original BCS theory. The energy gap εg(T), defined relative to excited and zero-momentum Cooper pairs (when Vjl > 0), decreases from εg(0) to 0 as the temperature T is raised from 0 to Tc. If electrons only are available, as in a monovalent metal like sodium (V12 = 0), the energy constant Δ1 is finite but the energy gap vanishes identically for all T. In agreement with the BCS theory, the present theory predicts that a pure nonmagnetic metal in any dimensions should have a Cooper-pair ground state whose energy is lower than that of the Bloch ground state. Additionally it predicts that a monovalent metal should remain normal down to 0 K, and that there should be no strictly one-dimensional superconductor

  16. Theory and praxis of map analysis in CHEF part 1: Linear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /Fermilab

    2008-10-01

    This memo begins a series which, put together, could comprise the 'CHEF Documentation Project' if there were such a thing. The first--and perhaps only--three will telegraphically describe theory, algorithms, implementation and usage of the normal form map analysis procedures encoded in CHEF's collection of libraries. [1] This one will begin the sequence by explaining the linear manipulations that connect the Jacobian matrix of a symplectic mapping to its normal form. It is a 'Reader's Digest' version of material I wrote in Intermediate Classical Dynamics (ICD) [2] and randomly scattered across technical memos, seminar viewgraphs, and lecture notes for the past quarter century. Much of its content is old, well known, and in some places borders on the trivial. Nevertheless, completeness requires their inclusion. The primary objective is the 'fundamental theorem' on normalization written on page 8. I plan to describe the nonlinear procedures in a subsequent memo and devote a third to laying out algorithms and lines of code, connecting them with equations written in the first two. Originally this was to be done in one short paper, but I jettisoned that approach after its first section exceeded a dozen pages. The organization of this document is as follows. A brief description of notation is followed by a section containing a general treatment of the linear problem. After the 'fundamental theorem' is proved, two further subsections discuss the generation of equilibrium distributions and the issue of 'phase'. The final major section reviews parameterizations--that is, lattice functions--in two and four dimensions with a passing glance at the six-dimensional version. Appearances to the contrary, for the most part I have tried to restrict consideration to matters needed to understand the code in CHEF's libraries.

  17. Verifying Process Algebra Proofs in Type Theory

    NARCIS (Netherlands)

    Sellink, M.P.A.

    In this paper we study automatic verification of proofs in process algebra. Formulas of process algebra are represented by types in typed λ-calculus. Inhabitants (terms) of these types represent proofs. The specific typed λ-calculus we use is the Calculus of Inductive Constructions as implemented

  18. Relating Memory To Functional Performance In Normal Aging to Dementia Using Hierarchical Bayesian Cognitive Processing Models

    Science.gov (United States)

    Shankle, William R.; Pooley, James P.; Steyvers, Mark; Hara, Junko; Mangrola, Tushar; Reisberg, Barry; Lee, Michael D.

    2012-01-01

    Determining how cognition affects functional abilities is important in Alzheimer’s disease and related disorders (ADRD). 280 patients (normal or ADRD) received a total of 1,514 assessments using the Functional Assessment Staging Test (FAST) procedure and the MCI Screen (MCIS). A hierarchical Bayesian cognitive processing (HBCP) model was created by embedding a signal detection theory (SDT) model of the MCIS delayed recognition memory task into a hierarchical Bayesian framework. The SDT model used latent parameters of discriminability (memory process) and response bias (executive function) to predict, simultaneously, recognition memory performance for each patient and each FAST severity group. The observed recognition memory data did not distinguish the six FAST severity stages, but the latent parameters completely separated them. The latent parameters were also used successfully to transform the ordinal FAST measure into a continuous measure reflecting the underlying continuum of functional severity. HBCP models applied to recognition memory data from clinical practice settings accurately translated a latent measure of cognition to a continuous measure of functional severity for both individuals and FAST groups. Such a translation links two levels of brain information processing, and may enable more accurate correlations with other levels, such as those characterized by biomarkers. PMID:22407225
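
    The paper embeds signal detection theory in a hierarchical Bayesian framework; the non-hierarchical point estimates of its two latent parameters - discriminability (the memory process) and response bias (tied to executive function) - are nonetheless a short computation. A sketch with hypothetical counts:

```python
from scipy.stats import norm

def sdt_parameters(hits, misses, false_alarms, correct_rejections):
    """Equal-variance SDT point estimates of d' (discriminability) and
    c (response bias). The HBCP model infers these hierarchically instead."""
    # log-linear correction avoids infinite z-scores at rates of 0 or 1
    h = (hits + 0.5) / (hits + misses + 1.0)
    f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    d_prime = norm.ppf(h) - norm.ppf(f)
    c = -0.5 * (norm.ppf(h) + norm.ppf(f))
    return d_prime, c

print(sdt_parameters(hits=42, misses=8, false_alarms=12, correct_rejections=38))
```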

  19. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Full Text Available Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk, and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed on the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed as surrogate process capability indices under non-normality, but few sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
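
    To make the contrast concrete, the sketch below compares the conventional normal-theory indices with a percentile-based surrogate in the spirit of the Clements method (the true Clements method derives its percentiles from fitted Pearson curves; empirical percentiles are used here for brevity, and the specification limits and data are illustrative).

```python
import numpy as np
from scipy import stats

def cp_cpk_normal(x, lsl, usl):
    """Conventional indices: assume a normal process spread of 6 sigma."""
    mu, s = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * s)
    cpk = min(usl - mu, mu - lsl) / (3 * s)
    return cp, cpk

def cp_cpk_percentile(x, lsl, usl):
    """Percentile-based surrogate: replace mu +/- 3 sigma with the
    0.135 %, 50 % and 99.865 % percentiles of the (non-normal) data."""
    p_lo, med, p_hi = np.percentile(x, [0.135, 50.0, 99.865])
    cp = (usl - lsl) / (p_hi - p_lo)
    cpk = min((usl - med) / (p_hi - med), (med - lsl) / (med - p_lo))
    return cp, cpk

x = stats.lognorm(s=0.4).rvs(5000, random_state=2)   # skewed process output
print(cp_cpk_normal(x, 0.2, 3.0))
print(cp_cpk_percentile(x, 0.2, 3.0))                # fairer for skewed data
```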

  20. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    International Nuclear Information System (INIS)

    Michelotti, Leo

    2009-01-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first (1) explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. (1) To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material has been lifted - and modified - from

  1. Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form

    Energy Technology Data Exchange (ETDEWEB)

    Michelotti, Leo; /FERMILAB

    2009-04-01

    This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first [1] explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. [1] To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material

  2. What is algebraic in process theory?

    NARCIS (Netherlands)

    Luttik, B.

    2006-01-01

    This is an extended version of an essay with the same title that I wrote for the workshop Algebraic process calculi : the first twenty five years and beyond, held in Bertinoro, Italy in the first week of August 2005.

  3. What is algebraic in process theory?

    NARCIS (Netherlands)

    Luttik, B.

    2006-01-01

    This is an extended version of an essay with the same title that I wrote for the workshop Algebraic Process Calculi: The First Twenty Five Years and Beyond, held in Bertinoro, Italy in the first week of August 2005.

  4. Digital signal processing theory and practice

    CERN Document Server

    Rao, K Deergha

    2018-01-01

    The book provides a comprehensive exposition of all major topics in digital signal processing (DSP). With numerous illustrative examples for easy understanding of the topics, it also includes MATLAB-based examples with codes in order to encourage the readers to become more confident of the fundamentals and to gain insights into DSP. Further, it presents real-world signal processing design problems using MATLAB and programmable DSP processors. In addition to problems that require analytical solutions, it discusses problems that require solutions using MATLAB at the end of each chapter. Divided into 13 chapters, it addresses many emerging topics, which are not typically found in advanced texts on DSP. It includes a chapter on adaptive digital filters used in the signal processing problems for faster acceptable results in the presence of changing environments and changing system requirements. Moreover, it offers an overview of wavelets, enabling readers to easily understand the basics and applications of this po...

  5. Dual-Process Theories of Reasoning: The Test of Development

    Science.gov (United States)

    Barrouillet, Pierre

    2011-01-01

    Dual-process theories have become increasingly influential in the psychology of reasoning. Though the distinction they introduced between intuitive and reflective thinking should have strong developmental implications, the developmental approach has rarely been used to refine or test these theories. In this article, I review several contemporary…

  6. The theory of stochastic processes I

    CERN Document Server

    Gihman, Iosif Il’ich

    2004-01-01

    From the Reviews: "Gihman and Skorohod have done an excellent job of presenting the theory in its present state of rich imperfection." D.W. Stroock in Bulletin of the American Mathematical Society, 1980 "To call this work encyclopedic would not give an accurate picture of its content and style. Some parts read like a textbook, but others are more technical and contain relatively new results. ... The exposition is robust and explicit, as one has come to expect of the Russian tradition of mathematical writing. The set when completed will be an invaluable source of information and reference in this ever-expanding field" K.L. Chung in American Scientist, 1977 "..., the subject has grown enormously since 1953, and there will never be a true successor to Doob's book, but Gihman and Skorohod's three volumes will, I think, occupy a rather similar position as an invaluable tool of reference for all probability theorists. ... The dominant impression is of the authors' mastery of their material, and of their confident i...

  7. Performances on a cognitive theory of mind task: specific decline or general cognitive deficits? Evidence from normal aging.

    Science.gov (United States)

    Fliss, Rafika; Lemerre, Marion; Mollard, Audrey

    2016-06-01

    Compromised theory of mind (ToM) can be explained either by a failure to implement specific representational capacities (mental state representations) or by more general executive selection demands. In older adult populations, evidence of declines in both executive functioning and cognitive ToM in normal aging has been reported. However, the links between these two functions remain unclear. In the present paper, we address these shortcomings by using a specific task of ToM and classical executive tasks. Using an original cognitive ToM task, we studied the effect of age on ToM performance in relation to the progressive executive decline. Ninety-six elderly participants were recruited. They were asked to perform a cognitive ToM task and five executive tests (the Stroop test and the Hayling Sentence Completion Test to assess inhibition, the Trail Making Test and verbal fluency to assess shifting, and backward span to estimate working memory capacity). The results show changes in cognitive ToM performance according to executive demands. Correlational studies indicate a significant relationship between ToM performance and the selected executive measures. Regression analyses identified level of vocabulary and age as the best predictors of ToM performance. The results are consistent with the hypothesis that ToM deficits are related to age-related domain-general decline rather than to a breakdown in a specialized representational system. The implications of these findings for the nature of social cognition tests in normal aging are also discussed.

  8. Normalized value coding explains dynamic adaptation in the human valuation process.

    Science.gov (United States)

    Khaw, Mel W; Glimcher, Paul W; Louie, Kenway

    2017-11-28

    The notion of subjective value is central to choice theories in ecology, economics, and psychology, serving as an integrated decision variable by which options are compared. Subjective value is often assumed to be an absolute quantity, determined in a static manner by the properties of an individual option. Recent neurobiological studies, however, have shown that neural value coding dynamically adapts to the statistics of the recent reward environment, introducing an intrinsic temporal context dependence into the neural representation of value. Whether valuation exhibits this kind of dynamic adaptation at the behavioral level is unknown. Here, we show that the valuation process in human subjects adapts to the history of previous values, with current valuations varying inversely with the average value of recently observed items. The dynamics of this adaptive valuation are captured by divisive normalization, linking these temporal context effects to spatial context effects in decision making as well as spatial and temporal context effects in perception. These findings suggest that adaptation is a universal feature of neural information processing and offer a unifying explanation for contextual phenomena in fields ranging from visual psychophysics to economic choice.
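
    The canonical divisive normalization form the paper links to - each option's coded value is its raw value divided by a pooled (here, contextual or recent-history) value term - is a one-liner; the pool definition, constants, and values below are illustrative rather than the paper's fitted model.

```python
import numpy as np

def normalized_value(values, sigma=1.0, w=1.0):
    """Divisive normalization: v_i = x_i / (sigma + w * pooled value).
    Here the pool is the mean of the contextual values."""
    values = np.asarray(values, float)
    return values / (sigma + w * values.mean())

# The same raw value of 20 is coded as smaller amid a high-value context,
# mirroring the inverse dependence on recent value history reported here
print(normalized_value([20, 5, 8]))     # low-value context
print(normalized_value([20, 50, 80]))   # high-value context
```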

  9. An Opponent-Process Theory of Motivation: II. Cigarette Addiction

    Science.gov (United States)

    Solomon, Richard L.; Corbit, John D.

    1973-01-01

    Methods suggested by opponent-process theory of acquired motivation in helping smokers to quit the habit include use of antagonistic drugs, total cessation from tobacco, and decrease in intensity and frequency of tobacco use. (DS)

  10. Information Processing Theories and the Education of the Gifted.

    Science.gov (United States)

    Rawl, Ruth K.; O'Tuel, Frances S.

    1983-01-01

    The basic assumptions of information processing theories in cognitive psychology are reviewed, and the application of this approach to problem solving in gifted education is considered. Specific implications are cited on problem selection and instruction giving. (CL)

  11. Continuous-time Markov decision processes theory and applications

    CERN Document Server

    Guo, Xianping

    2009-01-01

    This volume provides the first book entirely devoted to recent developments on the theory and applications of continuous-time Markov decision processes (MDPs). The MDPs presented here include most of the cases that arise in applications.

  12. Prospect Theory in the Automated Advisory Process

    OpenAIRE

    WERNER, JONATAN; SJÖBERG, JONAS

    2016-01-01

    With robo-advisors and regulation eventually changing the market conditions of the financial advisory industry, traditional advisors will have to adapt to a new world of asset management. Thus, it will be of interest to traditional advisors to further explore the topic of how to automatically evaluate soft aspects such as client preferences and behavior, and transform them into portfolio allocations while retaining stringency and high quality in the process. In this thesis, we show how client pr...
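
    The prospect-theoretic building block such an advisory process would evaluate is the Kahneman-Tversky value function; a minimal sketch with the original 1992 median parameter estimates (not calibrated to this thesis's data) is:

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains, convex and
    steeper for losses (loss aversion, lambda > 1). Parameters are the
    Tversky-Kahneman (1992) median estimates, used here for illustration."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

print(prospect_value(100.0))    # subjective value of a gain of 100
print(prospect_value(-100.0))   # a same-sized loss looms ~2.25x larger
```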

  13. Pseudo random signal processing theory and application

    CERN Document Server

    Zepernick, Hans-Jurgen

    2013-01-01

    In recent years, pseudo random signal processing has proven to be a critical enabler of modern communication, information, security and measurement systems. The signal's pseudo random, noise-like properties make it vitally important as a tool for protecting against interference, alleviating multipath propagation and allowing the potential of sharing bandwidth with other users. Taking a practical approach to the topic, this text provides a comprehensive and systematic guide to understanding and using pseudo random signals. Covering theoretical principles, design methodologies and applications
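
    The workhorse generator behind most pseudo random (PN) sequences of this kind is the linear feedback shift register; a minimal Fibonacci LFSR sketch (the tap positions below realize one maximal-length 4-stage example) is:

```python
def lfsr(taps, state, n):
    """Fibonacci LFSR: feedback is the XOR of the tapped stages.
    With a primitive feedback polynomial, an m-stage register yields a
    maximal-length (2**m - 1) sequence with noise-like autocorrelation."""
    out = []
    state = list(state)
    for _ in range(n):
        out.append(state[-1])              # output the last stage
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]          # shift right, insert feedback
    return out

# 4-stage register with taps at stages 0 and 3: period 2**4 - 1 = 15
print(lfsr(taps=[0, 3], state=[1, 0, 0, 0], n=15))
```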

  14. Conflict Monitoring in Dual Process Theories of Thinking

    Science.gov (United States)

    De Neys, Wim; Glumicic, Tamara

    2008-01-01

    Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision making errors there are some widely different views on the efficiency of the process. Kahneman…

  15. A Numerical Theory for Impedance Eduction in Three-Dimensional Normal Incidence Tubes

    Science.gov (United States)

    Watson, Willie R.; Jones, Michael G.

    2016-01-01

    A method for educing the locally-reacting acoustic impedance of a test sample mounted in a 3-D normal incidence impedance tube is presented and validated. The unique feature of the method is that the excitation frequency (or duct geometry) may be such that high-order duct modes may exist. The method educes the impedance, iteratively, by minimizing an objective function consisting of the difference between the measured and numerically computed acoustic pressure at preselected measurement points in the duct. The method is validated on planar and high-order mode sources with data synthesized from exact mode theory. These data are then subjected to random jitter to simulate the effects of measurement uncertainties on the educed impedance spectrum. The primary conclusions of the study are 1) without random jitter, the educed impedance is in excellent agreement with that of known impedance samples, and 2) random jitter comparable to that found in a typical experiment has minimal impact on the accuracy of the educed impedance.

  16. Cascade theory in isotopic separation processes

    International Nuclear Information System (INIS)

    Agostini, J.P.

    1994-06-01

    Three main areas are developed within the scope of this work: - the first one is devoted to fundamentals: separative power, value function, ideal cascade and square cascade. Applications to two main cases are carried out, namely the study of a binary isotopic mixture and the study of processes with a small enrichment coefficient. - The second one is devoted to cascade coupling - high-flux coupling (more widely used and better known) as well as low-flux coupling are presented and compared to one another. - The third one is an outlook on problems linked to cascade transients. Those problems are somewhat intricate, and their interest lies mainly in two areas: economics, where the start-up time may have a large influence on the interest paid during the construction and start-up period, and military production, where the start-up time has a direct bearing on the production schedule. (author). 50 figs. 3 annexes. 12 refs. 6 tabs
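
    The two fundamentals the first area names have standard closed forms: the value function U(x) = (2x - 1) ln(x/(1 - x)) and the separative power obtained from it via a mass balance on the cascade flows. A short sketch (the enrichment example uses textbook uranium assays, not figures from this work):

```python
import numpy as np

def value_function(x):
    """Standard separation value function U(x) = (2x - 1) ln(x / (1 - x))."""
    return (2 * x - 1) * np.log(x / (1 - x))

def separative_power(x_f, x_p, x_w, P=1.0):
    """Separative work dU = P*U(x_P) + W*U(x_W) - F*U(x_F), with flows
    fixed by F = P + W and the balance F*x_F = P*x_P + W*x_W."""
    W = P * (x_p - x_f) / (x_f - x_w)    # waste (tails) flow from the balance
    F = P + W                            # feed flow
    return (P * value_function(x_p) + W * value_function(x_w)
            - F * value_function(x_f))

# Natural uranium (0.711 %) enriched to 3.5 % with 0.3 % tails:
print(separative_power(x_f=0.00711, x_p=0.035, x_w=0.003))  # ~4.3 SWU per unit product
```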

  17. Theory of emission spectra from metal films irradiated by low energy electrons near normal incidence

    International Nuclear Information System (INIS)

    Kretschmann, E.; Callcott, T.A.; Arakawa, E.T.

    1980-01-01

    The emission spectrum produced by low energy electrons incident on a rough metal surface has been calculated for a roughness auto-correlation function containing a prominent peak at a high wave vector. For low energy electrons near normal incidence, the high wavevector peak dominates the roughness coupled surface plasmon radiation (RCSPR) process. The calculation yields estimates of the ratio of RCSPR to transition radiation, the dependence of emission intensity on electron energy and the shape and position of the RCSPR peak. The most interesting result is that the high-wavevector roughness can split the RCSPR radiation into peaks lying above and below the asymptotic surface plasma frequency. The results are compared with data from Ag in the following paper. (orig.)

  18. Theory of novel normal and superconducting states in doped oxide high-Tc superconductors

    International Nuclear Information System (INIS)

    Dzhumanov, S.

    2001-10-01

    A consistent and complete theory of the novel normal and superconducting (SC) states of doped high-Tc superconductors (HTSC) is developed by combining the continuum model of carrier self-trapping, the tight-binding model and the novel Fermi-Bose-liquid (FBL) model. The ground-state energy of carriers in lightly doped HTSC is calculated within the continuum model and adiabatic approximation using the variational method. The destruction of the long-range antiferromagnetic (AF) order at low doping x ≥ x_cl ≅ 0.015, the formation of the in-gap states or bands and novel (bi)polaronic insulating phases at x < x_c2 ≅ 0.06-0.08, and the new metal-insulator transition at x ≅ x_c2 in HTSC are studied within the continuum model of impurity (defect) centers and large (bi)polarons by using the appropriate tight-binding approximations. It is found that the three-dimensional (3d) large (bi)polarons are formed at ε_∞/ε_0 ≤ 0.1 and become itinerant when the (bi)polaronic insulator-to-(bi)polaronic metal transitions occur at x ≥ x_c2. We show that the novel pseudogapped metallic and SC states in HTSC are formed at x_c2 ≤ x ≤ x_p ≅ 0.20-0.24. We demonstrate that the large polaronic and small BCS-like pairing pseudogaps opening in the excitation spectrum of underdoped (x_c2 ≤ x < x_BCS ≅ 0.125), optimally doped (x_BCS ≤ x ≤ x_o ≅ 0.20) and overdoped (x > x_o) HTSC above Tc are unrelated to superconductivity, and they are responsible for the observed anomalous optical, transport, magnetic and other properties of these HTSC. We develop the original two-stage FBL model of novel superconductivity describing the combined novel BCS-like pairing scenario of fermions and true superfluid (SF) condensation scenario of composite bosons (i.e. bipolarons and cooperons) in any Fermi systems, where the SF condensate gap Δ_B and the BCS-like pairing pseudogap Δ_F have different origins. The pair and single-particle condensations of attracting 3d and two-dimensional (2d) composite bosons are responsible for

  19. IMPROVING QUALITY OF STATISTICAL PROCESS CONTROL BY DEALING WITH NON‐NORMAL DATA IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Zuzana ANDRÁSSYOVÁ

    2012-07-01

    Full Text Available This study deals with the analysis of data with the aim of improving the quality of statistical tools in the assembly processes of automobile seats. Normal distribution of variables is one of the inevitable conditions for the analysis, examination, and improvement of manufacturing processes (e.g., manufacturing process capability), although more and more approaches to handling non-normal data are emerging. An appropriate probability distribution of the measured data is first tested by the goodness of fit of the empirical distribution to the theoretical normal distribution on the basis of hypothesis testing, using the programme StatGraphics Centurion XV.II. Data are collected from the assembly process of first-row automobile seats for each characteristic of quality (Safety Regulation, S/R) individually. The study processes the measured data of an airbag's assembly and aims to obtain normally distributed data to which statistical process control can be applied. The results of the contribution conclude in a rejection of the null hypothesis (the measured variables do not follow the normal distribution); it is therefore necessary to begin work on data transformation, supported by Minitab 15. Even this approach does not yield normally distributed data, and so a procedure should be proposed that leads to a quality output of the whole statistical control of the manufacturing processes.

  20. Bonding in Mercury Molecules Described by the Normalized Elimination of the Small Component and Coupled Cluster Theory

    NARCIS (Netherlands)

    Cremer, Dieter; Kraka, Elfi; Filatov, Michael

    2008-01-01

    Bond dissociation energies (BDEs) of neutral HgX and cationic HgX(+) molecules range from less than a kcal mol(-1) to as much as 60 kcal mol(-1). Using NESC/CCSD(T) [normalized elimination of the small component and coupled-cluster theory with all single and double excitations and a perturbative

  1. Information processing theory in the early design stages

    DEFF Research Database (Denmark)

    Cash, Philip; Kreye, Melanie

    2014-01-01

    suggestions for improvements and support. One theory that may be particularly applicable to the early design stages is Information Processing Theory (IPT), as it is linked to the design process with regard to the key concepts considered. IPT states that designers search for information if they perceive uncertainty with regard to the knowledge necessary to solve a design challenge. They then process this information and compare whether the new knowledge they have gained covers the previous knowledge gap. The new knowledge is shared within the design team to reduce ambiguity with regard to its meaning and to build a shared understanding - reducing perceived uncertainty. In engineering design, uncertainty plays a key role, particularly in the early design stages. Thus, we propose that Information Processing Theory is suitable to describe designer activity in the early design stages...

  2. [Description of clinical thinking by the dual-process theory].

    Science.gov (United States)

    Peña G, Luis

    2012-06-01

    Clinical thinking is a very complex process that can be described by the dual-process theory, it has an intuitive part (that recognizes patterns) and an analytical part (that tests hypotheses). It is vulnerable to cognitive bias that professionals must be aware of, to minimize diagnostic errors.

  3. Reasoning on the Autism Spectrum: A Dual Process Theory Account

    Science.gov (United States)

    Brosnan, Mark; Lewton, Marcus; Ashwin, Chris

    2016-01-01

    Dual process theory proposes two distinct reasoning processes in humans, an intuitive style that is rapid and automatic and a deliberative style that is more effortful. However, no study to date has specifically examined these reasoning styles in relation to the autism spectrum. The present studies investigated deliberative and intuitive reasoning…

  4. Information Architecture without Internal Theory: An Inductive Design Process.

    Science.gov (United States)

    Haverty, Marsha

    2002-01-01

    Suggests that information architecture design is primarily an inductive process, partly because it lacks internal theory and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. Suggests a resemblance to Constructive Induction, a design process that locates the best representational…

  5. Instructional Transaction Theory: Knowledge Relationships among Processes, Entities, and Activities.

    Science.gov (United States)

    Merrill, M. David; And Others

    1993-01-01

    Discussion of instructional transaction theory focuses on knowledge representation in an automated instructional design expert system. A knowledge structure called PEA-Net (processes, entities, and activities) is explained; the refrigeration process is used as an example; text resources and graphic resources are described; and simulations are…

  6. Recollection is a continuous process: implications for dual-process theories of recognition memory.

    Science.gov (United States)

    Mickes, Laura; Wais, Peter E; Wixted, John T

    2009-04-01

    Dual-process theory, which holds that recognition decisions can be based on recollection or familiarity, has long seemed incompatible with signal detection theory, which holds that recognition decisions are based on a singular, continuous memory-strength variable. Formal dual-process models typically regard familiarity as a continuous process (i.e., familiarity comes in degrees), but they construe recollection as a categorical process (i.e., recollection either occurs or does not occur). A continuous process is characterized by a graded relationship between confidence and accuracy, whereas a categorical process is characterized by a binary relationship such that high confidence is associated with high accuracy but all lower degrees of confidence are associated with chance accuracy. Using a source-memory procedure, we found that the relationship between confidence and source-recollection accuracy was graded. Because recollection, like familiarity, is a continuous process, dual-process theory is more compatible with signal detection theory than previously thought.

  7. Improving the requirements process in Axiomatic Design Theory

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn

    2013-01-01

    This paper introduces a model to integrate the traditional requirements process into Axiomatic Design Theory and proposes a method to structure the requirements process. The method includes a requirements classification system to ensure that all requirements information can be included...... in the Axiomatic Design process, a stakeholder classification system to reduce the chances of excluding one or more key stakeholders, and a table to visualize the mapping between the stakeholders and their requirements....

  8. Dual-Process Theories of Higher Cognition: Advancing the Debate.

    Science.gov (United States)

    Evans, Jonathan St B T; Stanovich, Keith E

    2013-05-01

    Dual-process and dual-system theories in both cognitive and social psychology have been subjected to a number of recently published criticisms. However, they have been attacked as a category, incorrectly assuming there is a generic version that applies to all. We identify and respond to 5 main lines of argument made by such critics. We agree that some of these arguments have force against some of the theories in the literature but believe them to be overstated. We argue that the dual-processing distinction is supported by much recent evidence in cognitive science. Our preferred theoretical approach is one in which rapid autonomous processes (Type 1) are assumed to yield default responses unless intervened on by distinctive higher-order reasoning processes (Type 2). What defines the difference is that Type 2 processing supports hypothetical thinking and loads heavily on working memory. © The Author(s) 2013.

  9. Log-normal distribution from a process that is not multiplicative but is additive.

    Science.gov (United States)

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
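
    A quick numerical illustration in the spirit of the paper's claim (summand distribution, counts, and sample sizes are arbitrary choices): the sum of many positive, skewed summands remains markedly skewed, while its logarithm is nearly symmetric, i.e. the sum is close to log-normal even though the process is purely additive.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Additive (linear) process: sum of positive, skewed summands
n_summands, n_samples = 50, 100_000
s = rng.lognormal(mean=0.0, sigma=1.0,
                  size=(n_samples, n_summands)).sum(axis=1)

print("skewness of sum      :", round(stats.skew(s), 3))          # > 0: not Gaussian
print("skewness of log(sum) :", round(stats.skew(np.log(s)), 3))  # ~ 0: near log-normal
```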

  10. Varying acoustic-phonemic ambiguity reveals that talker normalization is obligatory in speech processing.

    Science.gov (United States)

    Choi, Ja Young; Hu, Elly R; Perrachione, Tyler K

    2018-04-01

    The nondeterministic relationship between speech acoustics and abstract phonemic representations imposes a challenge for listeners to maintain perceptual constancy despite the highly variable acoustic realization of speech. Talker normalization facilitates speech processing by reducing the degrees of freedom for mapping between encountered speech and phonemic representations. While this process has been proposed to facilitate the perception of ambiguous speech sounds, it is currently unknown whether talker normalization is affected by the degree of potential ambiguity in acoustic-phonemic mapping. We explored the effects of talker normalization on speech processing in a series of speeded classification paradigms, parametrically manipulating the potential for inconsistent acoustic-phonemic relationships across talkers for both consonants and vowels. Listeners identified words with varying potential acoustic-phonemic ambiguity across talkers (e.g., beet/boat vs. boot/boat) spoken by single or mixed talkers. Auditory categorization of words was always slower when listening to mixed talkers compared to a single talker, even when there was no potential acoustic ambiguity between target sounds. Moreover, the processing cost imposed by mixed talkers was greatest when words had the most potential acoustic-phonemic overlap across talkers. Models of acoustic dissimilarity between target speech sounds did not account for the pattern of results. These results suggest (a) that talker normalization incurs the greatest processing cost when disambiguating highly confusable sounds and (b) that talker normalization appears to be an obligatory component of speech perception, taking place even when the acoustic-phonemic relationships across sounds are unambiguous.

  11. Event-related potential evidence for the processing efficiency theory.

    Science.gov (United States)

    Murray, N P; Janelle, C M

    2007-01-15

    The purpose of this study was to examine the central tenets of the processing efficiency theory using psychophysiological measures of attention and effort. Twenty-eight participants were divided equally into either a high or low trait anxiety group. They were then required to perform a simulated driving task while responding to one of four target light-emitting diodes. Cortical activity and dual task performance were recorded under two conditions -- baseline and competition -- with cognitive anxiety being elevated in the competitive session by an instructional set. Although driving speed was similar across sessions, a reduction in P3 amplitude to cue onset in the light detection task occurred for both groups during the competitive session, suggesting a reduction in processing efficiency as participants became more state anxious. Our findings provide more comprehensive and mechanistic evidence for processing efficiency theory, and confirm that increases in cognitive anxiety can result in a reduction of processing efficiency with little change in performance effectiveness.

  12. Algorithmic information theory mathematics of digital information processing

    CERN Document Server

    Seibt, Peter

    2007-01-01

    Treats the Mathematics of many important areas in digital information processing. This book covers, in a unified presentation, five topics: Data Compression, Cryptography, Sampling (Signal Theory), Error Control Codes, Data Reduction. It is useful for teachers, students and practitioners in Electronic Engineering, Computer Science and Mathematics.

  13. Teachers' Professional Ethics and Ideological and Political Theory Courses for Normal University Students

    Institute of Scientific and Technical Information of China (English)

    冉静; 王京强; 冯晋

    2015-01-01

    Among the pluralistic educational functions of Ideological and Political Theory Courses, moral education is one of the most prominent. These courses are closely related to the cultivation of normal university students' professional ethics in their attributes, goals, and process, and they actively and effectively promote that cultivation. This article explains the significance of cultivating the professional ethics of normal university students and focuses on the role that Ideological and Political Theory Courses play in this cultivation.

  14. Generalized Poisson processes in quantum mechanics and field theory

    International Nuclear Information System (INIS)

    Combe, P.; Rodriguez, R.; Centre National de la Recherche Scientifique, 13 - Marseille; Hoegh-Krohn, R.; Centre National de la Recherche Scientifique, 13 - Marseille; Sirugue, M.; Sirugue-Collin, M.; Centre National de la Recherche Scientifique, 13 - Marseille

    1981-01-01

    In section 2 we describe more carefully the generalized Poisson processes, giving a realization of the underlying probability space, and we characterize these processes by their characteristic functionals. Section 3 is devoted to the proof of the previous formula for quantum mechanical systems with possibly velocity-dependent potentials, and in section 4 we give an application of the previous theory to some relativistic Bose field models. (orig.)

  15. Quantum theory of gauge fields and rigid processes calculation

    International Nuclear Information System (INIS)

    Andreev, I.V.

    1981-01-01

    The first part of the paper presents an elementary account of the basic facts about the nature of quark interactions and their role in high-energy processes. The second part deals with the gauge theory (GT) of strong interactions (chromodynamics, CD) and its application to the calculation of hard (''rigid'') processes involving quarks. It is based on the method of functional integration (MFI). A comparatively simple representation of the MFI in quantum theory and a formulation of perturbation theory for gauge fields are given. A derivation of the rules of the diagram technique is presented. Renormalization invariance of the theory and asymptotic freedom, the phenomenon basic to CD, are discussed. Applications of the theory to the calculation of certain high-energy effects are considered. From the CD viewpoint, a parton model is considered, on the basis of which the ''hard'' stage of the evolution of quark and gluon jets produced at high energies can be described quantitatively, and some quantitative experimental tests of CD are suggested [ru

  16. Linking the Value Assessment of Oil and Gas Firms to Ambidexterity Theory Using a Mixture of Normal Distributions

    Directory of Open Access Journals (Sweden)

    Casault Sébastien

    2016-05-01

    Full Text Available Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory – the variation in their market returns is non-Gaussian. In this paper, the nature and underlying reason for these significant deviations from expected behavior are considered. Understanding these differences in financial market behavior is important for a wide range of reasons, including: assessing investments, investor relations, decisions to raise capital, and assessment of firm and management performance. We show that using a “thicker tailed” mixture of two normal distributions offers a significantly more accurate model than the traditional Gaussian approach in describing the behavior of the value of oil and gas firms. This mixture of normal distributions is also more effective in bridging the gap between management theory and practice without the need to introduce complex time-sensitive GARCH and/or jump diffusion dynamics. The mixture distribution is consistent with ambidexterity theory, which suggests firms operate in two distinct states driven by the primary focus of the firm: an exploration state with high uncertainty and an exploitation (or production) state with lower uncertainty. The findings have direct implications for improving the accuracy of real option pricing techniques and future analyses of risk management. Traditional option pricing models assume that commercial returns from these assets are described by a normal random walk. However, a normal random walk model discounts the possibility of large changes to the marketplace from events such as the discovery of important reserves or the introduction of new technology. The mixture distribution proves to be well suited to inherently describe the unusually large risks and opportunities associated with oil and gas production and exploration. A significance testing study of 554 oil and gas exploration and production firms empirically supports using a mixture of two normal distributions.
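
    As an illustration of the model class described above, a two-component normal mixture can be fitted to a return series by expectation-maximization. The sketch below uses scikit-learn on synthetic data; the regime parameters and sample sizes are illustrative assumptions, not estimates from the paper.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # Synthetic daily returns: a calm "exploitation" regime plus a rarer,
        # volatile "exploration" regime, echoing the ambidexterity reading.
        calm = rng.normal(0.0005, 0.01, size=8000)
        volatile = rng.normal(0.0, 0.04, size=2000)
        returns = np.concatenate([calm, volatile]).reshape(-1, 1)

        gm = GaussianMixture(n_components=2, random_state=0).fit(returns)
        for w, mu, var in zip(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel()):
            print(f"weight={w:.2f}  mean={mu:+.4f}  std={np.sqrt(var):.3f}")

    The fitted component with the larger standard deviation plays the role of the high-uncertainty exploration state; its weight measures how much of the return history the thick tail has to explain.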

  17. The mathematical theory of signal processing and compression-designs

    Science.gov (United States)

    Feria, Erlan H.

    2006-05-01

    The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with compressing the memory space of a signal source, while processor coding deals with compressing the computational time of a signal processor. Their combination is named compression-designs, referred to as Conde for short. A compelling and pedagogically appealing diagram will be discussed, highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.

  18. Periodic Schur process, cylindric partitions and N=2* theory

    International Nuclear Information System (INIS)

    Iqbal, Amer; Kozcaz, Can; Sohail, Tanweer

    2011-01-01

    Type IIA string theory compactified on an elliptic CY3-fold gives rise to an N=2 U(1) gauge theory with an adjoint hypermultiplet. We study the refined open and closed topological string partition functions of this geometry using the refined topological vertex. We show that these partition functions, open and closed, are examples of the periodic Schur process and are related to the generating function of the cylindric partitions if the Kaehler parameters are quantized in units of string coupling. The level-rank duality appears as the exchange symmetry of the two Kaehler parameters of the elliptic CY3-fold.

  19. Frames and operator theory in analysis and signal processing

    CERN Document Server

    Larson, David R; Nashed, Zuhair; Nguyen, Minh Chuong; Papadakis, Manos

    2008-01-01

    This volume contains articles based on talks presented at the Special Session Frames and Operator Theory in Analysis and Signal Processing, held in San Antonio, Texas, in January of 2006. Recently, the field of frames has undergone tremendous advancement. Most of the work in this field is focused on the design and construction of more versatile frames and frames tailored towards specific applications, e.g., finite dimensional uniform frames for cellular communication. In addition, frames are now becoming a hot topic in mathematical research as a part of many engineering applications, e.g., matching pursuits and greedy algorithms for image and signal processing. Topics covered in this book include: Application of several branches of analysis (e.g., PDEs; Fourier, wavelet, and harmonic analysis; transform techniques; data representations) to industrial and engineering problems, specifically image and signal processing. Theoretical and applied aspects of frames and wavelets. Pure aspects of operator theory empha...

  20. Quantum processes: A Whiteheadian interpretation of quantum field theory

    Science.gov (United States)

    Bain, Jonathan

    Quantum processes: A Whiteheadian interpretation of quantum field theory is an ambitious and thought-provoking exercise in physics and metaphysics, combining an erudite study of the very complex metaphysics of A.N. Whitehead with a well-informed discussion of contemporary issues in the philosophy of algebraic quantum field theory. Hättich's overall goal is to construct an interpretation of quantum field theory. He does this by translating key concepts in Whitehead's metaphysics into the language of algebraic quantum field theory. In brief, this Hättich-Whitehead (H-W, hereafter) interpretation takes "actual occasions" as the fundamental ontological entities of quantum field theory. An actual occasion is the result of two types of processes: a "transition process" in which a set of initial possibly-possessed properties for the occasion (in the form of "eternal objects") is localized to a space-time region; and a "concrescence process" in which a subset of these initial possibly-possessed properties is selected and actualized to produce the occasion. Essential to these processes is the "underlying activity", which conditions the way in which properties are initially selected and subsequently actualized. In short, under the H-W interpretation of quantum field theory, an initial set of possibly-possessed eternal objects is represented by a Boolean sublattice of the lattice of projection operators determined by a von Neumann algebra R (O) associated with a region O of Minkowski space-time, and the underlying activity is represented by a state on R (O) obtained by conditionalizing off of the vacuum state. The details associated with the H-W interpretation involve imposing constraints on these representations motivated by principles found in Whitehead's metaphysics. These details are spelled out in the three sections of the book. The first section is a summary and critique of Whitehead's metaphysics, the second section introduces the formalism of algebraic quantum field theory.

  1. Translating Unstructured Workflow Processes to Readable BPEL: Theory and Implementation

    DEFF Research Database (Denmark)

    van der Aalst, Willibrordus Martinus Pancratius; Lassen, Kristian Bisgaard

    2008-01-01

    The Business Process Execution Language for Web Services (BPEL) has emerged as the de-facto standard for implementing processes. Although intended as a language for connecting web services, its application is not limited to cross-organizational processes. It is expected that in the near future...... and not easy to use by end-users. Therefore, we provide a mapping from Workflow Nets (WF-nets) to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. In addition to this we have implemented the algorithm in a tool called...

  2. Physics of Laser Materials Processing Theory and Experiment

    CERN Document Server

    Gladush, Gennady G

    2011-01-01

    This book describes the basic mechanisms, theory, simulations, and technological aspects of laser processing techniques. It covers the principles of laser quenching, welding, cutting, alloying, selective sintering, ablation, etc. The main attention is paid to quantitative description. The diversity and complexity of the technological and physical processes are discussed using a unitary approach. The book aims at understanding the cause-and-effect relations in the physical processes of laser technologies. It will help researchers and engineers to improve existing laser machining techniques and to develop new ones. The book addresses readers with a certain background in general physics and mathematical analysis: graduate students, researchers, and engineers practicing laser applications.

  3. Theory of mind for processing unexpected events across contexts.

    Science.gov (United States)

    Dungan, James A; Stepanovic, Michael; Young, Liane

    2016-08-01

    Theory of mind, or mental state reasoning, may be particularly useful for making sense of unexpected events. Here, we investigated unexpected behavior across both social and non-social contexts in order to characterize the precise role of theory of mind in processing unexpected events. We used functional magnetic resonance imaging to examine how people respond to unexpected outcomes when initial expectations were based on (i) an object's prior behavior, (ii) an agent's prior behavior and (iii) an agent's mental states. Consistent with prior work, brain regions for theory of mind were preferentially recruited when people first formed expectations about social agents vs non-social objects. Critically, unexpected vs expected outcomes elicited greater activity in dorsomedial prefrontal cortex, which also discriminated in its spatial pattern of activity between unexpected and expected outcomes for social events. In contrast, social vs non-social events elicited greater activity in precuneus across both expected and unexpected outcomes. Finally, given prior information about an agent's behavior, unexpected vs expected outcomes elicited an especially robust response in right temporoparietal junction, and the magnitude of this difference across participants correlated negatively with autistic-like traits. Together, these findings illuminate the distinct contributions of brain regions for theory of mind for processing unexpected events across contexts. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  4. Linear circuits, systems and signal processing: theory and application

    International Nuclear Information System (INIS)

    Byrnes, C.I.; Saeks, R.E.; Martin, C.F.

    1988-01-01

    In part because of its universal role as a first approximation of more complicated behaviour and in part because of the depth and breadth of its principal paradigms, the study of linear systems continues to play a central role in control theory and its applications. Enhancing more traditional applications to aerospace and electronics, application areas such as econometrics, finance, and speech and signal processing have contributed to a renaissance in areas such as realization theory and classical automatic feedback control. Thus, the last few years have witnessed a remarkable research effort expended in understanding both new algorithms and new paradigms for the modeling and realization of linear processes and in the analysis and design of robust control strategies. The papers in this volume reflect these trends in both the theory and applications of linear systems and were selected from the invited and contributed papers presented at the 8th International Symposium on the Mathematical Theory of Networks and Systems held in Phoenix on June 15-19, 1987.

  5. Theory of charge transport in diffusive normal metal conventional superconductor point contacts

    NARCIS (Netherlands)

    Tanaka, Y.; Golubov, Alexandre Avraamovitch; Kashiwaya, S.

    2003-01-01

    Tunneling conductance in diffusive normal (DN) metal/insulator/s-wave superconductor junctions is calculated for various situations by changing the magnitudes of the resistance and the Thouless energy in the DN and the transparency of the insulating barrier. The generalized boundary condition introduced by Nazarov is applied.

  6. Theory of thermal and charge transport in diffusive normal metal / superconductor junctions

    NARCIS (Netherlands)

    Yokoyama, T.; Tanaka, Y.; Golubov, Alexandre Avraamovitch; Asano, Y.

    2005-01-01

    Thermal and charge transport in diffusive normal metal (DN)/insulator/s-, d-, and p-wave superconductor junctions are studied based on the Usadel equation with Nazarov's generalized boundary condition. We derive a general expression for the thermal conductance in unconventional superconducting junctions.

  7. A mixture theory model of fluid and solute transport in the microvasculature of normal and malignant tissues. I. Theory.

    Science.gov (United States)

    Schuff, M M; Gore, J P; Nauman, E A

    2013-05-01

    In order to better understand the mechanisms governing transport of drugs, nanoparticle-based treatments, and therapeutic biomolecules, and the role of the various physiological parameters, a number of mathematical models have previously been proposed. The limitations of the existing transport models indicate the need for a comprehensive model that includes transport in the vessel lumen, the vessel wall, and the interstitial space and considers the effects of the solute concentration on fluid flow. In this study, a general model to describe the transient distribution of fluid and multiple solutes at the microvascular level was developed using mixture theory. The model captures the experimentally observed dependence of the hydraulic permeability coefficient of the capillary wall on the concentration of solutes present in the capillary wall and the surrounding tissue. Additionally, the model demonstrates that transport phenomena across the capillary wall and in the interstitium are related to the solute concentration as well as the hydrostatic pressure. The model is used in a companion paper to examine fluid and solute transport for the simplified case of an axisymmetric geometry with no solid deformation or interconversion of mass.
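
    For orientation, models of this family build on the Starling description of transcapillary fluid exchange, in which the volume flux across the vessel wall is driven by hydrostatic and osmotic pressure differences. The relation below is the standard textbook form, quoted for context rather than the paper's full mixture-theory formulation:

        J_v = L_p \left[ (p_c - p_i) - \sigma (\pi_c - \pi_i) \right]

    Here L_p is the hydraulic permeability of the capillary wall, p_c and p_i are the capillary and interstitial hydrostatic pressures, \pi_c and \pi_i the corresponding osmotic pressures, and \sigma the solute reflection coefficient. The model summarized above generalizes this picture by letting L_p depend on the solute concentrations in the wall and the surrounding tissue.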

  8. UNIVERSITY TEACHING-LEARNING PROCESS: REFLECTIONS THROUGHOUT THE AGENCY THEORY

    Directory of Open Access Journals (Sweden)

    Víctor Jacques Parraguez

    2011-04-01

    Full Text Available This work analyses some of the reasons that might explain the insufficient academic level perceived in universities of developing countries. The element under discussion is the teacher-student relationship, which is studied from the perspective of agency theory. It is concluded that, in the absence of efficient mechanisms for monitoring teacher and student behavior, lapses of due diligence may proliferate, undermining the quality of the teaching-learning process.

  9. Normal loads program for aerodynamic lifting surface theory. [evaluation of spanwise and chordwise loading distributions

    Science.gov (United States)

    Medan, R. T.; Ray, K. S.

    1974-01-01

    A description of and user's manual for a U.S.A. FORTRAN IV computer program are presented; the program evaluates spanwise and chordwise loading distributions, lift coefficient, pitching moment coefficient, and other stability derivatives for thin wings in linearized, steady, subsonic flow. The program is based on a kernel function method lifting surface theory and is applicable to a large class of planforms, including asymmetrical ones and ones with mixed straight and curved edges.

  10. Comparing Sensory Information Processing and Alexithymia between People with Substance Dependency and Normal.

    Science.gov (United States)

    Bashapoor, Sajjad; Hosseini-Kiasari, Seyyedeh Tayebeh; Daneshvar, Somayeh; Kazemi-Taskooh, Zeinab

    2015-01-01

    Sensory information processing and alexithymia are two important factors in determining behavioral reactions. Some studies explain the effect of the sensitivity of sensory processing and alexithymia on the tendency to substance abuse. Given that, the aim of the current study was to compare the styles of sensory information processing and alexithymia between substance-dependent people and normal ones. The research method was cross-sectional, and the statistical population of the current study comprised all substance-dependent men present in substance quitting camps of Masal, Iran, in October 2013 (n = 78). From this population, 36 persons were selected by simple random sampling as the study group, and 36 persons were likewise selected from the normal population as the comparison group. Both groups were evaluated using the Toronto alexithymia scale (TAS) and the adult sensory profile, and the multivariate analysis of variance (MANOVA) test was applied to analyze the data. The results showed significant differences between the two groups in low registration of sensory processing and in difficulty describing emotions, indicating that substance-dependent people process sensory information in a different way than normal people and show more alexithymia features.

  11. Theory and applications of spherical microphone array processing

    CERN Document Server

    Jarrett, Daniel P; Naylor, Patrick A

    2017-01-01

    This book presents the signal processing algorithms that have been developed to process the signals acquired by a spherical microphone array. Spherical microphone arrays can be used to capture the sound field in three dimensions and have received significant interest from researchers and audio engineers. Algorithms for spherical array processing are different to corresponding algorithms already known in the literature of linear and planar arrays because the spherical geometry can be exploited to great beneficial effect. The authors aim to advance the field of spherical array processing by helping those new to the field to study it efficiently and from a single source, as well as by offering a way for more experienced researchers and engineers to consolidate their understanding, adding either or both of breadth and depth. The level of the presentation corresponds to graduate studies at MSc and PhD level. This book begins with a presentation of some of the essential mathematical and physical theory relevant to ...

  12. Novel welding image processing method based on fractal theory

    Institute of Scientific and Technical Information of China (English)

    陈强; 孙振国; 肖勇; 路井荣

    2002-01-01

    Computer vision has come into use in the fields of welding process control and automation. In order to improve the precision and rapidity of welding image processing, a novel method based on fractal theory is put forward in this paper. Compared with traditional methods, in the new method the image is preliminarily processed in the macroscopic regions and then thoroughly analyzed in the microscopic regions. The image is divided into regions according to the different fractal characteristics of the image edges, the fuzzy regions containing image edges are detected, and the edges are then identified with a Sobel operator and curve-fitted by the least square method (LSM). Since the amount of data to be processed has been decreased and the image noise reduced, experiments confirm that the edges of the weld seam or weld pool can be recognized correctly and quickly.
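
    As a rough sketch of the final two steps named above (edge identification with a Sobel operator followed by least-squares curve fitting), the Python fragment below finds edge pixels and fits a line through them. The fractal region partitioning that precedes these steps in the paper is omitted, and the threshold value is an illustrative assumption.

        import numpy as np
        from scipy import ndimage

        def fit_edge(image, threshold=0.5):
            """Detect edge pixels with a Sobel operator, then fit a
            straight line through them by the least square method."""
            gx = ndimage.sobel(image, axis=1)   # horizontal gradient
            gy = ndimage.sobel(image, axis=0)   # vertical gradient
            magnitude = np.hypot(gx, gy)
            ys, xs = np.nonzero(magnitude > threshold * magnitude.max())
            slope, intercept = np.polyfit(xs, ys, deg=1)  # y = slope*x + intercept
            return slope, intercept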

  13. Theory of mind and emotion-recognition functioning in autistic spectrum disorders and in psychiatric control and normal children.

    Science.gov (United States)

    Buitelaar, J K; van der Wees, M; Swaab-Barneveld, H; van der Gaag, R J

    1999-01-01

    The hypothesis was tested that weak theory of mind (ToM) and/or emotion recognition (ER) abilities are specific to subjects with autism. Differences in ToM and ER performance were examined between autistic (n = 20), pervasive developmental disorder-not otherwise specified (PDD-NOS) (n = 20), psychiatric control (n = 20), and normal children (n = 20). The clinical groups were matched person-to-person on age and verbal IQ. We used tasks for the matching and the context recognition of emotional expressions, and a set of first- and second-order ToM tasks. Autistic and PDD-NOS children could not be significantly differentiated from each other, nor could they be differentiated from the psychiatric controls with a diagnosis of ADHD (n = 9). The psychiatric controls with conduct disorder or dysthymia performed about as well as normal children. The variance in second-order ToM performance contributed most to differences between diagnostic groups.

  14. Communication: Density functional theory model for multi-reference systems based on the exact-exchange hole normalization.

    Science.gov (United States)

    Laqua, Henryk; Kussmann, Jörg; Ochsenfeld, Christian

    2018-03-28

    The correct description of multi-reference electronic ground states within Kohn-Sham density functional theory (DFT) requires an ensemble-state representation, employing fractionally occupied orbitals. However, the use of fractional orbital occupation leads to non-normalized exact-exchange holes, resulting in large fractional-spin errors for conventional approximative density functionals. In this communication, we present a simple approach to directly include the exact-exchange-hole normalization into DFT. Compared to conventional functionals, our model strongly improves the description for multi-reference systems, while preserving the accuracy in the single-reference case. We analyze the performance of our proposed method at the example of spin-averaged atoms and spin-restricted bond dissociation energy surfaces.
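
    The normalization invoked above is the exact-exchange-hole sum rule of Kohn-Sham theory: around an electron at position \mathbf{r}, the exchange hole h_{\mathrm{X}} integrates to exactly minus one electron,

        \int h_{\mathrm{X}}(\mathbf{r}, \mathbf{r}') \, \mathrm{d}^3 r' = -1 .

    With fractionally occupied orbitals, conventional approximate functionals violate this constraint, which is the origin of the fractional-spin error that the proposed model removes by enforcing the normalization directly.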

  16. Toward a General Research Process for Using Dubin's Theory Building Model

    Science.gov (United States)

    Holton, Elwood F.; Lowe, Janis S.

    2007-01-01

    Dubin developed a widely used methodology for theory building, which describes the components of the theory building process. Unfortunately, he does not define a research process for implementing his theory building model. This article proposes a seven-step general research process for implementing Dubin's theory building model. An example of a…

  17. Note: Determination of torsional spring constant of atomic force microscopy cantilevers: Combining normal spring constant and classical beam theory

    DEFF Research Database (Denmark)

    Álvarez-Asencio, R.; Thormann, Esben; Rutland, M.W.

    2013-01-01

    A technique has been developed for the calculation of torsional spring constants for AFM cantilevers based on the combination of the normal spring constant and plate/beam theory. It is easy to apply and allows the determination of torsional constants for stiff cantilevers where the thermal power spectrum is difficult to obtain due to the high resonance frequency and low signal/noise ratio. The applicability is shown to be general, and this simple approach can thus be used to obtain torsional constants for any beam-shaped cantilever. © 2013 AIP Publishing LLC.
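
    For the simplest case of a rectangular beam cantilever, the combination described above can be made explicit. Classical beam theory gives k_N = Ewt^3/(4L^3) for the normal spring constant and k_phi = Gwt^3/(3L) for the torsional one, so with the shear modulus G = E/(2(1+nu)) the width and thickness cancel, leaving k_phi = 2 k_N L^2 / (3(1+nu)). The sketch below encodes this textbook-level relation; it illustrates the approach rather than reproducing the authors' exact procedure, and the example values are assumptions.

        def torsional_from_normal(k_n, length, poisson=0.27):
            """Torsional spring constant (N m/rad) of a rectangular
            cantilever, from its normal spring constant k_n (N/m),
            length (m) and Poisson's ratio, via thin-beam theory."""
            return 2.0 * k_n * length**2 / (3.0 * (1.0 + poisson))

        # Example: a stiff 125-um silicon cantilever with k_n = 40 N/m.
        print(torsional_from_normal(40.0, 125e-6))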

  18. Memory processes during sleep: beyond the standard consolidation theory.

    Science.gov (United States)

    Axmacher, Nikolai; Draguhn, Andreas; Elger, Christian E; Fell, Juergen

    2009-07-01

    Two-step theories of memory formation suggest that an initial encoding stage, during which transient neural assemblies are formed in the hippocampus, is followed by a second step called consolidation, which involves re-processing of activity patterns and is associated with an increasing involvement of the neocortex. Several studies in human subjects as well as in animals suggest that memory consolidation occurs predominantly during sleep (standard consolidation model). Alternatively, it has been suggested that consolidation may occur during waking state as well and that the role of sleep is rather to restore encoding capabilities of synaptic connections (synaptic downscaling theory). Here, we review the experimental evidence favoring and challenging these two views and suggest an integrative model of memory consolidation.

  19. Assertiveness process of Iranian nurse leaders: a grounded theory study.

    Science.gov (United States)

    Mahmoudirad, Gholamhossein; Ahmadi, Fazlollah; Vanaki, Zohreh; Hajizadeh, Ebrahim

    2009-06-01

    The purpose of this study was to explore the assertiveness process in Iranian nursing leaders. A qualitative design based on the grounded theory approach was used to collect and analyze the assertiveness experiences of 12 nurse managers working in four hospitals in Iran. Purposeful and theoretical sampling methods were employed for the data collection and selection of the participants, and semistructured interviews were held. During the data analysis, 17 categories emerged and these were categorized into three themes: "task generation", "assertiveness behavior", and "executive agents". From the participants' experiences, assertiveness theory emerged as being fundamental to the development of a schematic model describing nursing leadership behaviors. From another aspect, religious beliefs also played a fundamental role in Iranian nursing leadership assertiveness. It was concluded that bringing a change in the current support from top managers and improving self-learning are required in order to enhance the assertiveness of the nursing leaders in Iran.

  20. Normal processes of phonon-phonon scattering and thermal conductivity of germanium crystals with isotopic disorder

    CERN Document Server

    Kuleev, I G

    2001-01-01

    The effect of normal phonon-phonon scattering processes on the thermal conductivity of germanium crystals with various degrees of isotopic disorder is considered. The redistribution of phonon momentum in normal scattering processes, both within each vibrational branch (the Simons mechanism) and between different phonon branches (the Herring mechanism), is taken into account. The contributions of the drift motion of longitudinal and transverse phonons to the thermal conductivity are analyzed. It is shown that momentum redistribution in the Herring relaxation mechanism leads to essential suppression of the drift motion of longitudinal phonons in isotopically pure germanium crystals. The thermal conductivity calculated for the Herring relaxation mechanism agrees well with experimental data on germanium crystals with various degrees of isotopic disorder.
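
    Analyses of this kind conventionally start from a Callaway-type kinetic formula for the lattice thermal conductivity, quoted here in its standard single-branch form for orientation (the paper's full treatment separates the longitudinal and transverse branches and the two momentum-redistribution mechanisms):

        \kappa_1 = \frac{k_B}{2\pi^2 v} \left( \frac{k_B T}{\hbar} \right)^3
                   \int_0^{\Theta_D/T} \tau_c(x) \, \frac{x^4 e^x}{(e^x - 1)^2} \, \mathrm{d}x ,
        \qquad x = \frac{\hbar \omega}{k_B T} ,

    where the combined relaxation rate \tau_c^{-1} = \tau_R^{-1} + \tau_N^{-1} adds resistive scattering (isotope, boundary, umklapp) to normal scattering, and a separate drift correction term accounts for the phonon momentum that normal processes conserve.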

  1. Accident and Off-Normal Response and Recovery from Multi-Canister Overpack (MCO) Processing Events

    International Nuclear Information System (INIS)

    ALDERMAN, C.A.

    2000-01-01

    In the process of removing spent nuclear fuel (SNF) from the K Basins through its subsequent packaging, drying, transportation, and storage steps, the SNF Project must be able to respond to all anticipated or foreseeable off-normal and accident events that may occur. Response procedures and recovery plans need to be in place, and personnel training must be established and implemented to ensure the project will be capable of appropriate actions. To establish suitable project planning, these events must first be identified and analyzed for their expected impact on the project. This document assesses all off-normal and accident events for their potential cross-facility or Multi-Canister Overpack (MCO) process reversal impact. Table 1 provides the methodology for establishing the event planning level, and these events are provided in Table 2 along with the general response and recovery planning. Accidents and off-normal events of the SNF Project have been evaluated and are identified in the appropriate facility Safety Analysis Report (SAR) or in the transportation Safety Analysis Report for Packaging (SARP). Hazards and accidents are summarized from these safety analyses and listed in separate tables for each facility and the transportation system in Appendix A, along with identified off-normal events. The tables identify the general response time required to ensure a stable state after the event, the governing response documents, and the events with potential cross-facility or SNF process reversal impacts. The event closure is predicated on stable-state response time, impact on operations, and the mitigated annual occurrence frequency of the event as developed in the hazard analysis process.

  2. Adaptive Convergence Rates of a Dirichlet Process Mixture of Multivariate Normals

    OpenAIRE

    Tokdar, Surya T.

    2011-01-01

    It is shown that a simple Dirichlet process mixture of multivariate normals offers Bayesian density estimation with adaptive posterior convergence rates. Toward this, a novel sieve for non-parametric mixture densities is explored, and its rate adaptability to various smoothness classes of densities in arbitrary dimension is demonstrated. This sieve construction is expected to offer a substantial technical advancement in studying Bayesian non-parametric mixture models based on stick-breaking p...
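
    The record is truncated mid-phrase, but the construction it refers to is the stick-breaking representation of the Dirichlet process. A minimal sketch of sampling from a truncated DP mixture of multivariate normals follows; the concentration parameter, truncation level, and base measure are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)
        alpha, K, dim, n = 1.0, 50, 2, 500  # concentration, truncation, dims, draws

        # Stick-breaking weights: v_k ~ Beta(1, alpha),
        # w_k = v_k * prod_{j<k} (1 - v_j)
        v = rng.beta(1.0, alpha, size=K)
        w = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))

        mu = rng.normal(0.0, 3.0, size=(K, dim))  # atoms from a Gaussian base measure
        z = rng.choice(K, size=n, p=w / w.sum())  # component labels
        x = mu[z] + rng.normal(size=(n, dim))     # mixture draws, unit covariance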

  3. Evaluation of EMG processing techniques using Information Theory.

    Science.gov (United States)

    Farfán, Fernando D; Politti, Julio C; Felice, Carmelo J

    2010-11-12

    Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movement of the arm in the scapular plane were registered, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively), the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
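
    For concreteness, the four time-domain features compared in the paper can be computed over each analysis window as follows. The sampling rate and the stand-in signal are illustrative assumptions; the 200-ms window follows the paper's optimum for static contractions.

        import numpy as np

        def emg_features(x):
            """Time-domain features of one EMG analysis window."""
            return {
                "AMV": np.mean(np.abs(x)),            # absolute mean value
                "RMS": np.sqrt(np.mean(x**2)),        # root mean square
                "VAR": np.var(x, ddof=1),             # variance
                "DAMV": np.mean(np.abs(np.diff(x))),  # difference absolute mean value
            }

        fs = 1000                            # Hz (assumed sampling rate)
        window = int(0.200 * fs)             # 200-ms segmentation
        emg = np.random.default_rng(2).normal(size=5 * fs)  # stand-in signal
        feats = [emg_features(emg[i:i + window])
                 for i in range(0, len(emg) - window + 1, window)]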

  4. Evaluation of EMG processing techniques using Information Theory

    Directory of Open Access Journals (Sweden)

    Felice Carmelo J

    2010-11-01

    Full Text Available Abstract Background Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. Methods These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movement of the arm in the scapular plane were registered, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Results Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively), the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Conclusions Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.

  5. Normalization and gene p-value estimation: issues in microarray data processing.

    Science.gov (United States)

    Fundel, Katrin; Küffner, Robert; Aigner, Thomas; Zimmer, Ralf

    2008-05-28

    Numerous methods exist for basic processing, e.g. normalization, of microarray gene expression data. These methods have an important effect on the final analysis outcome. Therefore, it is crucial to select methods appropriate for a given dataset in order to assure the validity and reliability of expression data analysis. Furthermore, biological interpretation requires expression values for genes, which are often represented by several spots or probe sets on a microarray. How best to integrate spot/probe set values into gene values has so far been a somewhat neglected problem. We present a case study comparing different between-array normalization methods with respect to the identification of differentially expressed genes. Our results show that it is feasible and necessary to use prior knowledge on gene expression measurements to select an adequate normalization method for the given data. Furthermore, we provide evidence that combining spot/probe set p-values into gene p-values for detecting differentially expressed genes has advantages compared to combining expression values for spots/probe sets into gene expression values. The comparison of different methods suggests using Stouffer's method for this purpose. The study was conducted on gene expression experiments investigating human joint cartilage samples of osteoarthritis-related groups: a cDNA microarray (83 samples, four groups) and an Affymetrix (26 samples, two groups) data set. The apparently straightforward steps of gene expression data analysis, e.g. between-array normalization and detection of differentially regulated genes, can be accomplished by numerous different methods. We analyzed multiple methods and the possible effects and thereby demonstrate the importance of the single decisions taken during data processing. We give guidelines for evaluating normalization outcomes. An overview of these effects via appropriate measures and plots compared to prior knowledge is essential for the biological interpretation.
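
    The combination rule recommended above, Stouffer's method, maps each probe-set p-value to a z-score, pools the z-scores with a square-root-of-k scaling, and maps the pooled score back to a gene-level p-value. A minimal one-sided sketch:

        import numpy as np
        from scipy.stats import norm

        def stouffer(pvalues):
            """Combine the one-sided p-values of a gene's probe sets
            into a single gene-level p-value (Stouffer's method)."""
            z = norm.isf(np.asarray(pvalues))          # z_i = Phi^-1(1 - p_i)
            return float(norm.sf(z.sum() / np.sqrt(len(z))))

        print(stouffer([0.01, 0.04, 0.10]))            # three probe sets, one gene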

  6. Hidden Markov processes theory and applications to biology

    CERN Document Server

    Vidyasagar, M

    2014-01-01

    This book explores important aspects of Markov and hidden Markov processes and the applications of these ideas to various problems in computational biology. The book starts from first principles, so that no previous knowledge of probability is necessary. However, the work is rigorous and mathematical, making it useful to engineers and mathematicians, even those not interested in biological applications. A range of exercises is provided, including drills to familiarize the reader with concepts and more advanced problems that require deep thinking about the theory. Biological applications are t

  7. A field theory description of constrained energy-dissipation processes

    International Nuclear Information System (INIS)

    Mandzhavidze, I.D.; Sisakyan, A.N.

    2002-01-01

    A field theory description of dissipation processes constrained by a high-symmetry group is given. The formalism is presented in the example of the multiple-hadron production processes, where the transition to the thermodynamic equilibrium results from the kinetic energy of colliding particles dissipating into hadron masses. The dynamics of these processes is restricted because the constraints responsible for the colour charge confinement must be taken into account. We develop a more general S-matrix formulation of the thermodynamics of nonequilibrium dissipative processes and find a necessary and sufficient condition for the validity of this description; this condition is similar to the correlation relaxation condition, which, according to Bogolyubov, must apply as the system approaches equilibrium. This situation must physically occur in processes with an extremely high multiplicity, at least if the hadron mass is nonzero. We also describe a new strong-coupling perturbation scheme, which is useful for taking symmetry restrictions on the dynamics of dissipation processes into account. We review the literature devoted to this problem

  8. Theory of the normal modes of vibrations in the lanthanide type crystals

    Science.gov (United States)

    Acevedo, Roberto; Soto-Bubert, Andres

    2008-11-01

    For the lanthanide type crystals, a vast and rich, though incomplete, amount of experimental data has been accumulated from linear and nonlinear optics during the last decades. The main goal of the current research work is to report a new methodology and strategy to put forward a more representative approach to account for the normal modes of vibration of a complex N-body system. For illustrative purposes, the chloride lanthanide type crystals Cs2NaLnCl6 have been chosen, and we develop new convergence tests as well as a criterion to deal with the details of the F-matrix (potential energy matrix). A novel and useful concept of natural potential energy distributions (NPED) is introduced and examined throughout the course of this work. The diagonal and non-diagonal contributions to these NPED values are evaluated explicitly for a series of these crystals. Our model is based upon a total of seventy-two internal coordinates and ninety-eight internal Hooke-type force constants. An optimization mathematical procedure is applied with reference to the series of chloride lanthanide crystals, and it is shown that the strategy and model adopted are sound from both chemical and physical viewpoints. We can argue that the current model is able to accommodate a number of interactions and to provide us with very useful physical insight. The limitations and advantages of the current model and the most likely sources for improvements are discussed in detail.

  9. Nonlinear closure relations theory for transport processes in nonequilibrium systems

    International Nuclear Information System (INIS)

    Sonnino, Giorgio

    2009-01-01

    A decade ago, a macroscopic theory for closure relations has been proposed for systems out of Onsager's region. This theory is referred to as the thermodynamic field theory (TFT). The aim of this work was to determine the nonlinear flux-force relations that respect the thermodynamic theorems for systems far from equilibrium. We propose a formulation of the TFT where one of the basic restrictions, namely, the closed-form solution for the skew-symmetric piece of the transport coefficients, has been removed. In addition, the general covariance principle is replaced by the De Donder-Prigogine thermodynamic covariance principle (TCP). The introduction of TCP requires the application of an appropriate mathematical formalism, which is referred to as the entropy-covariant formalism. By geometrical arguments, we prove the validity of the Glansdorff-Prigogine universal criterion of evolution. A new set of closure equations determining the nonlinear corrections to the linear ('Onsager') transport coefficients is also derived. The geometry of the thermodynamic space is non-Riemannian. However, it tends to be Riemannian for high values of the entropy production. In this limit, we recover the transport equations found by the old theory. Applications of our approach to transport in magnetically confined plasmas, materials submitted to temperature, and electric potential gradients or to unimolecular triangular chemical reactions can be found at references cited herein. Transport processes in tokamak plasmas are of particular interest. In this case, even in the absence of turbulence, the state of the plasma remains close to (but, it is not in) a state of local equilibrium. This prevents the transport relations from being linear.
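
    For orientation, the Onsager region mentioned above is the near-equilibrium regime in which the thermodynamic fluxes are linear in the conjugate forces and the entropy production is the associated bilinear form (standard nonequilibrium thermodynamics, quoted for context):

        J_\mu = \sum_\nu L_{\mu\nu} X_\nu , \qquad
        \sigma = \sum_\mu J_\mu X_\mu \ge 0 , \qquad
        L_{\mu\nu} = L_{\nu\mu} ,

    where the reciprocity relation holds in the absence of magnetic fields. The closure equations of the theory summarized above determine the nonlinear corrections to the coefficients L_{\mu\nu} that remain valid beyond this region.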

  10. Validity and Reliability of Published Comprehensive Theory of Mind Tests for Normal Preschool Children: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Seyyede Zohreh Ziatabar Ahmadi

    2015-12-01

    Full Text Available Objective: Theory of mind (ToM), or mindreading, is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. Method: We searched MEDLINE (PubMed interface), Web of Science, Science Direct, PsycINFO, and evidence-based medicine (The Cochrane Library) databases from 1990 to June 2015. The search strategy was the Latin transcription of ‘Theory of Mind’ AND test AND children. We also manually studied the reference lists of all finally retrieved articles and carried out a search of their references. Inclusion criteria were as follows: valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children. Exclusion criteria were as follows: studies that only used ToM tests and single tasks (false belief tasks) for ToM assessment and/or provided no description of the structure, validity or reliability of their tests. The methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). Results: In the primary search, we found 1237 articles in all databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. Conclusion: There were only a few valid, reliable and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles. The defined ToM tests differed in populations, tasks, modes of presentation, scoring, modes of response, times and other variables, and they had various validities and reliabilities. Therefore, it is recommended that researchers and clinicians select ToM tests according to their psychometric properties.

  11. Bernstein's theory of pedagogic discourse as a theoretical framework for educators studying student radiographers' interpretation of normality vs. abnormality

    International Nuclear Information System (INIS)

    Winter, Peter D.; Linehan, Mark J.

    2014-01-01

    Purpose: To acknowledge the tacit rules underpinning the academic practice of undergraduate radiographers in determining normality vs. abnormality when appraising skeletal images. Methodology: Twelve students were interviewed (individually) using in-depth semi-structured questions. Interviews were mediated through a PowerPoint presentation containing two digital X-ray images. Each image was based on a level of expertise: the elementary (Case 1) and the complicated (Case 2). The questions were based on regular ‘frames’ created from observing tutor–student contact in class, and then validated through a group interview. Bernstein's theory of pedagogic discourse was then utilised as a data analysis instrument to determine how third-year diagnostic radiography students interpreted X-ray images, in relation to the ‘recognition’ and ‘realisation’ rules of the Educational Theoretical Framework. Conclusion: Bernstein's framework has made it possible to specify, in detail, how issues and difficulties are formed at the level of the acquirer during interpretation. The recognition rules enabled students to meaningfully recognise which trauma characteristics can be associated with the image and the demands of a detailed scrutiny, so as to enact a competent interpretation. Realisation rules made it possible for students to establish their own systematic approach and realise legitimate meanings of normality and abnormality. Whereas obvious or visible trauma generated realisation rules (represented via homogeneous terminology), latent trauma authorised students to deviate from legitimate meanings. The latter rule, in this context, has directed attention to the issue of students perceiving abnormality when images are normal.

  12. Renewal theory for perturbed random walks and similar processes

    CERN Document Server

    Iksanov, Alexander

    2016-01-01

    This book offers a detailed review of perturbed random walks, perpetuities, and random processes with immigration. Being of major importance in modern probability theory, both theoretical and applied, these objects have been used to model various phenomena in the natural sciences as well as in insurance and finance. The book also presents the many significant results and efficient techniques and methods that have been worked out in the last decade. The first chapter is devoted to perturbed random walks and discusses their asymptotic behavior and various functionals pertaining to them, including supremum and first-passage time. The second chapter examines perpetuities, presenting results on continuity of their distributions and the existence of moments, as well as weak convergence of divergent perpetuities. Focusing on random processes with immigration, the third chapter investigates the existence of moments, describes long-time behavior and discusses limit theorems, both with and without scaling. Chapters fou...

  13. Mermin Non-Locality in Abstract Process Theories

    Directory of Open Access Journals (Sweden)

    Stefano Gogioso

    2015-11-01

    Full Text Available The study of non-locality is fundamental to the understanding of quantum mechanics. The past 50 years have seen a number of non-locality proofs, but its fundamental building blocks, and the exact role it plays in quantum protocols, have remained elusive. In this paper, we focus on a particular flavour of non-locality, generalising Mermin's argument on the GHZ state. Using strongly complementary observables, we provide necessary and sufficient conditions for Mermin non-locality in abstract process theories. We show that the existence of more phases than classical points (aka eigenstates) is not sufficient, and that the key to Mermin non-locality lies in the presence of certain algebraically non-trivial phases. This allows us to show that fRel, a favourite toy model for categorical quantum mechanics, is Mermin local. We show Mermin non-locality to be the key resource ensuring the device-independent security of the HBB CQ (N,N) family of Quantum Secret Sharing protocols. Finally, we challenge the unspoken assumption that the measurements involved in Mermin-type scenarios should be complementary (like the pair X,Y), opening the doors to a much wider class of potential experimental setups than currently employed. In short, we give conditions for Mermin non-locality tests on any number of systems, where each party has an arbitrary number of measurement choices and each measurement has an arbitrary number of outcomes, and that work in any abstract process theory.

  14. Time Series, Stochastic Processes and Completeness of Quantum Theory

    International Nuclear Information System (INIS)

    Kupczynski, Marian

    2011-01-01

    Most physical experiments are usually described as repeated measurements of some random variables. Experimental data registered by on-line computers form time series of outcomes. The frequencies of different outcomes are compared with the probabilities provided by the algorithms of quantum theory (QT). In spite of the statistical predictions of QT, a claim was made that it provided the most complete description of the data and of the underlying physical phenomena. This claim could be easily rejected if some fine structures, averaged out in the standard descriptive statistical analysis, were found in time series of experimental data. To search for these structures one has to use more subtle statistical tools which were developed to study time series produced by various stochastic processes. In this talk we review some of these tools. As an example we show how the standard descriptive statistical analysis of the data is unable to reveal a fine structure in a simulated sample of an AR(2) stochastic process. We emphasize once again that the violation of Bell inequalities gives no information on the completeness or the non-locality of QT. The appropriate way to test the completeness of quantum theory is to search for fine structures in time series of the experimental data by means of purity tests or by studying the autocorrelation and partial autocorrelation functions.
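
    The AR(2) illustration is easy to reproduce: the mean, variance, and histogram of such a sample look unremarkable, while the autocorrelation function immediately exposes the serial fine structure. A minimal sketch with illustrative coefficients:

        import numpy as np

        rng = np.random.default_rng(3)
        n, phi1, phi2 = 5000, 0.6, -0.3  # illustrative, stationary AR(2)
        x = np.zeros(n)
        for t in range(2, n):
            x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + rng.normal()

        def acf(series, lag):
            """Sample autocorrelation at the given lag."""
            s = series - series.mean()
            return float(np.dot(s[:-lag], s[lag:]) / np.dot(s, s))

        print([round(acf(x, k), 3) for k in range(1, 6)])  # clearly nonzero lags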

  15. Direct social perception and dual process theories of mindreading.

    Science.gov (United States)

    Herschbach, Mitchell

    2015-11-01

    The direct social perception (DSP) thesis claims that we can directly perceive some mental states of other people. The direct perception of mental states has been formulated phenomenologically and psychologically, and typically restricted to the mental state types of intentions and emotions. I will compare DSP to another account of mindreading: dual process accounts that posit a fast, automatic "Type 1" form of mindreading and a slow, effortful "Type 2" form. I will here analyze whether dual process accounts' Type 1 mindreading serves as a rival to DSP or whether some Type 1 mindreading can be perceptual. I will focus on Apperly and Butterfill's dual process account of mindreading epistemic states such as perception, knowledge, and belief. This account posits a minimal form of Type 1 mindreading of belief-like states called registrations. I will argue that general dual process theories fit well with a modular view of perception that is considered a kind of Type 1 process. I will show that this modular view of perception challenges and has significant advantages over DSP's phenomenological and psychological theses. Finally, I will argue that if such a modular view of perception is accepted, there is significant reason for thinking Type 1 mindreading of belief-like states is perceptual in nature. This would mean extending the scope of DSP to at least one type of epistemic state. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. A theory-informed, process-oriented Resident Scholarship Program.

    Science.gov (United States)

    Thammasitboon, Satid; Darby, John B; Hair, Amy B; Rose, Karen M; Ward, Mark A; Turner, Teri L; Balmer, Dorene F

    2016-01-01

    The Accreditation Council for Graduate Medical Education requires residency programs to provide curricula for residents to engage in scholarly activities but does not specify particular guidelines for instruction. We propose a Resident Scholarship Program that is framed by the self-determination theory (SDT) and emphasize the process of scholarly activity versus a scholarly product. The authors report on their longitudinal Resident Scholarship Program, which aimed to support psychological needs central to SDT: autonomy, competence, and relatedness. By addressing those needs in program aims and program components, the program may foster residents' intrinsic motivation to learn and to engage in scholarly activity. To this end, residents' engagement in scholarly processes, and changes in perceived autonomy, competence, and relatedness were assessed. Residents engaged in a range of scholarly projects and expressed positive regard for the program. Compared to before residency, residents felt more confident in the process of scholarly activity, as determined by changes in increased perceived autonomy, competence, and relatedness. Scholarly products were accomplished in return for a focus on scholarly process. Based on our experience, and in line with the SDT, supporting residents' autonomy, competence, and relatedness through a process-oriented scholarship program may foster the curiosity, inquisitiveness, and internal motivation to learn that drives scholarly activity and ultimately the production of scholarly products.

  17. Cognitive load disrupts implicit theory-of-mind processing.

    Science.gov (United States)

    Schneider, Dana; Lam, Rebecca; Bayliss, Andrew P; Dux, Paul E

    2012-08-01

    Eye movements in Sally-Anne false-belief tasks appear to reflect the ability to implicitly monitor the mental states of other individuals (theory of mind, or ToM). It has recently been proposed that an early-developing, efficient, and automatically operating ToM system subserves this ability. Surprisingly absent from the literature, however, is an empirical test of the influence of domain-general executive processing resources on this implicit ToM system. In the study reported here, a dual-task method was employed to investigate the impact of executive load on eye movements in an implicit Sally-Anne false-belief task. Under no-load conditions, adult participants displayed eye movement behavior consistent with implicit belief processing, whereas evidence for belief processing was absent for participants under cognitive load. These findings indicate that the cognitive system responsible for implicitly tracking beliefs draws at least minimally on executive processing resources. Thus, even the most low-level processing of beliefs appears to reflect a capacity-limited operation.

  18. A theory-informed, process-oriented Resident Scholarship Program

    Science.gov (United States)

    Thammasitboon, Satid; Darby, John B.; Hair, Amy B.; Rose, Karen M.; Ward, Mark A.; Turner, Teri L.; Balmer, Dorene F.

    2016-01-01

    Background The Accreditation Council for Graduate Medical Education requires residency programs to provide curricula for residents to engage in scholarly activities but does not specify particular guidelines for instruction. We propose a Resident Scholarship Program that is framed by self-determination theory (SDT) and emphasizes the process of scholarly activity rather than the scholarly product. Methods The authors report on their longitudinal Resident Scholarship Program, which aimed to support the psychological needs central to SDT: autonomy, competence, and relatedness. By addressing those needs in program aims and program components, the program may foster residents' intrinsic motivation to learn and to engage in scholarly activity. To this end, residents' engagement in scholarly processes and changes in perceived autonomy, competence, and relatedness were assessed. Results Residents engaged in a range of scholarly projects and expressed positive regard for the program. Compared to before residency, residents felt more confident in the process of scholarly activity, as reflected in increased perceived autonomy, competence, and relatedness. Scholarly products were also accomplished despite the focus on scholarly process. Conclusions Based on our experience, and in line with SDT, supporting residents' autonomy, competence, and relatedness through a process-oriented scholarship program may foster the curiosity, inquisitiveness, and internal motivation to learn that drives scholarly activity and ultimately the production of scholarly products. PMID:27306995

  19. Changes of regional cerebral glucose metabolism in normal aging process : A study with FDG PET

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Joon Kee; Kim, Sang Eun; Lee, Kyung Han; Choi, Yong; Choe, Yearn Seong; Kim, Byung Tae [Sungkyunkwan Univ., School of Medicine, Seoul (Korea, Republic of)

    2001-08-01

    Normal aging results in detectable changes in brain structure and function. We evaluated the changes of regional cerebral glucose metabolism in the normal aging process with FDG PET. Brain PET images were obtained in 44 healthy volunteers (age range 20-69 y; M:F = 29:15) who had no history of neuropsychiatric disorders. On 6 representative transaxial images, ROIs were drawn in the cortical and subcortical areas. Regional FDG uptake was normalized using whole-brain uptake to adjust for the injection dose and correct for nonspecific declines of glucose metabolism affecting all brain areas equally. In the prefrontal, temporoparietal and primary sensorimotor cortex, the normalized FDG uptake (NFU) reached a peak in subjects in their 30s. The NFU in the prefrontal and primary sensorimotor cortex declined with age after the 30s at rates of 3.15%/decade and 1.93%/decade, respectively. However, the NFU in the temporoparietal cortex did not change significantly with age after the 30s. The anterior (prefrontal) to posterior (temporoparietal) gradient peaked in subjects in their 30s and declined with age thereafter at a rate of 35%/decade. The NFU in the caudate nucleus decreased with age after the 20s at a rate of 2.39%/decade. In the primary visual cortex, putamen, and thalamus, the NFU values did not change significantly throughout the ages covered. These patterns were not significantly different between the right and left cerebral hemispheres. Of interest was that the NFU in the left cerebellar cortex increased with age after the 20s at a rate of 2.86%/decade. These data demonstrate regional variation of the age-related changes in cerebral glucose metabolism, with the most prominent age-related decline of metabolism in the prefrontal cortex. The increase in cerebellar metabolism with age might reflect a process of neuronal plasticity associated with aging.
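
    To make the normalization and rate figures concrete, the sketch below (our illustration with hypothetical numbers, not the study's data or pipeline) computes a normalized uptake value and expresses a linear age trend as %/decade.

```python
# Illustrative sketch with hypothetical values; not the study's data.
import numpy as np

def normalized_fdg_uptake(regional_uptake, whole_brain_uptake):
    """Normalize regional FDG uptake by whole-brain uptake to adjust for
    injection dose and nonspecific global declines in metabolism."""
    return regional_uptake / whole_brain_uptake

# Hypothetical (age, prefrontal NFU) pairs for subjects past their 30s.
ages = np.array([35.0, 42.0, 48.0, 55.0, 61.0, 68.0])
nfu = np.array([1.20, 1.17, 1.15, 1.12, 1.10, 1.07])

# Linear fit of NFU against age; express the slope as a percentage of the
# fitted age-30 value per decade, mirroring the %/decade figures above.
slope, intercept = np.polyfit(ages, nfu, 1)
nfu_at_30 = intercept + slope * 30.0
print("decline: %.2f %%/decade" % (-slope * 10.0 / nfu_at_30 * 100.0))
```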

  20. Changes of regional cerebral glucose metabolism in normal aging process : A study with FDG PET

    International Nuclear Information System (INIS)

    Yoon, Joon Kee; Kim, Sang Eun; Lee, Kyung Han; Choi, Yong; Choe, Yearn Seong; Kim, Byung Tae

    2001-01-01

    Normal aging results in detectable changes in brain structure and function. We evaluated the changes of regional cerebral glucose metabolism in the normal aging process with FDG PET. Brain PET images were obtained in 44 healthy volunteers (age range 20-69 y; M:F = 29:15) who had no history of neuropsychiatric disorders. On 6 representative transaxial images, ROIs were drawn in the cortical and subcortical areas. Regional FDG uptake was normalized using whole-brain uptake to adjust for the injection dose and correct for nonspecific declines of glucose metabolism affecting all brain areas equally. In the prefrontal, temporoparietal and primary sensorimotor cortex, the normalized FDG uptake (NFU) reached a peak in subjects in their 30s. The NFU in the prefrontal and primary sensorimotor cortex declined with age after the 30s at rates of 3.15%/decade and 1.93%/decade, respectively. However, the NFU in the temporoparietal cortex did not change significantly with age after the 30s. The anterior (prefrontal) to posterior (temporoparietal) gradient peaked in subjects in their 30s and declined with age thereafter at a rate of 35%/decade. The NFU in the caudate nucleus decreased with age after the 20s at a rate of 2.39%/decade. In the primary visual cortex, putamen, and thalamus, the NFU values did not change significantly throughout the ages covered. These patterns were not significantly different between the right and left cerebral hemispheres. Of interest was that the NFU in the left cerebellar cortex increased with age after the 20s at a rate of 2.86%/decade. These data demonstrate regional variation of the age-related changes in cerebral glucose metabolism, with the most prominent age-related decline of metabolism in the prefrontal cortex. The increase in cerebellar metabolism with age might reflect a process of neuronal plasticity associated with aging.

  1. Coronary heart disease patients transitioning to a normal life: perspectives and stages identified through a grounded theory approach.

    Science.gov (United States)

    Najafi Ghezeljeh, Tahereh; Yadavar Nikravesh, Mansoureh; Emami, Azita

    2014-02-01

    To explore how Iranian patients with coronary heart disease experience their lives. Coronary heart disease is a leading cause of death in Iran and worldwide. Understanding qualitatively how patients experience the acute and postacute stages of this chronic condition is essential knowledge for minimising the negative consequences of coronary heart disease. Qualitative study using grounded theory for the data analysis. Data for this study were collected through individual qualitative interviews with 24 patients with coronary heart disease, conducted between January 2009 and January 2011. Patients with angina pectoris were selected for participation through purposive sampling, and sample size was determined by data saturation. Data analysis began with initial coding and continued with focused coding. Categories were determined, and the core category was subsequently developed and finalised. The main categories of the transition from the acute phase to a modified or 'new normal' life were: (1) loss of normal life: experiencing emotions and consequences of illness; (2) coming to terms: using coping strategies; and (3) recreating normal life. Healthcare providers must correctly recognise the stages of transition patients navigate while coping with coronary heart disease to support and educate them appropriately throughout these stages. Patients with coronary heart disease lose their normal lives and must work towards recreating a revised life using coping strategies that enable them to come to terms with their situations. By understanding Iranian patients' experiences, healthcare providers and especially nurses can use this information to support and educate patients with coronary heart disease on how to more effectively deal with their illness and its consequences. © 2013 John Wiley & Sons Ltd.

  2. Probability, random variables, and random processes theory and signal processing applications

    CERN Document Server

    Shynk, John J

    2012-01-01

    Probability, Random Variables, and Random Processes is a comprehensive textbook on probability theory for engineers that provides a more rigorous mathematical framework than is usually encountered in undergraduate courses. It is intended for first-year graduate students who have some familiarity with probability and random variables, though not necessarily with random processes and systems that operate on random signals. It is also appropriate for advanced undergraduate students who have a strong mathematical background. The book has the following features: Several app

  3. Proteolytic processing of connective tissue growth factor in normal ocular tissues and during corneal wound healing.

    Science.gov (United States)

    Robinson, Paulette M; Smith, Tyler S; Patel, Dilan; Dave, Meera; Lewin, Alfred S; Pi, Liya; Scott, Edward W; Tuli, Sonal S; Schultz, Gregory S

    2012-12-13

    Connective tissue growth factor (CTGF) is a fibrogenic cytokine that is up-regulated by TGF-β and mediates most key fibrotic actions of TGF-β, including stimulation of synthesis of extracellular matrix and differentiation of fibroblasts into myofibroblasts. This study addresses the role of proteolytic processing of CTGF in human corneal fibroblasts (HCF) stimulated with TGF-β, normal ocular tissues and wounded corneas. Proteolytic processing of CTGF in HCF cultures, normal animal eyes, and excimer laser wounded rat corneas were examined by Western blot. The identity of a 21-kDa band was determined by tandem mass spectrometry, and possible alternative splice variants of CTGF were assessed by 5' Rapid Amplification of cDNA Ends (RACE). HCF stimulated by TGF-β contained full length 38-kDa CTGF and fragments of 25, 21, 18, and 13 kDa, while conditioned medium contained full length 38- and a 21-kDa fragment of CTGF that contained the middle "hinge" region of CTGF. Fragmentation of recombinant CTGF incubated in HCF extracts was blocked by the aspartate protease inhibitor, pepstatin. Normal mouse, rat, and rabbit whole eyes and rabbit ocular tissues contained abundant amounts of C-terminal 25- and 21-kDa fragments and trace amounts of 38-kDa CTGF, although no alternative transcripts were detected. All forms of CTGF (38, 25, and 21 kDa) were detected during healing of excimer ablated rat corneas, peaking on day 11. Proteolytic processing of 38-kDa CTGF occurs during corneal wound healing, which may have important implications in regulation of corneal scar formation.

  4. IBUPROFEN AS A MEDICATION FOR A CORRECTION OF SYMPTOMS OF NORMAL VACCINAL PROCESS IN CHILDREN

    Directory of Open Access Journals (Sweden)

    T.A. Chebotareva

    2008-01-01

    Full Text Available The pathogenetic approach to treatment of symptoms of the normal vaccinal process in children after standard vaccination, based on the results of application of anti-inflammatory medications (ibuprofen (Nurofen for children) and paracetamol), is presented in this article. Clinical activity of ibuprofen was established on the basis of clinical catamnestic observation of 856 vaccinated children aged from 3 months to 3 years. Recommendations for application of these medications as a treatment for correction of vaccinal reactions are given. Key words: children, ibuprofen, paracetamol, vaccination.

  5. High energy instanton induced processes in electroweak theory

    International Nuclear Information System (INIS)

    McLerran, L.

    1992-01-01

    It is well known that in electroweak theory, baryon plus lepton number is conserved by the classical equations of motion. This is of course consistent with the lack of experimental observation of such processes. It is a little less well known that when quantum corrections are included in electroweak theory, baryon plus lepton number is not conserved. This was first discovered as a consequence of the Adler-Bardeen-Bell-Jackiw triangle anomaly. It is perhaps most easily understood as a consequence of vacuum degeneracy, fermion energy level crossing, and filling of the negative-energy Dirac sea upon second quantization. To understand how baryon plus lepton number fails to be conserved upon second quantization, consider the situation in which the energy of the system is shown as a function of a parameter which characterizes the gauge fields, the Chern-Simons charge. The Chern-Simons charge is a function only of the gauge fields, and the B + L change is equal to the change in Chern-Simons charge, ΔQ_{B+L} = ΔQ_{CS}

  6. Parallel Distributed Processing Theory in the Age of Deep Networks.

    Science.gov (United States)

    Bowers, Jeffrey S

    2017-12-01

    Parallel distributed processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely that all knowledge is coded in a distributed format and cognition is mediated by non-symbolic computations. These claims have long been debated in cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks learn units that respond selectively to meaningful categories, and researchers are finding that deep networks need to be supplemented with symbolic systems to perform some tasks. Given the close links between PDP and deep networks, it is surprising that research with deep networks is challenging PDP theory. Copyright © 2017. Published by Elsevier Ltd.

  7. Toward a Philosophy and Theory of Volumetric Nonthermal Processing.

    Science.gov (United States)

    Sastry, Sudhir K

    2016-06-01

    Nonthermal processes for food preservation have been under intensive investigation for about the past quarter century, with varying degrees of success. We focus this discussion on two volumetrically acting nonthermal processes, high pressure processing (HPP) and pulsed electric fields (PEF), with emphasis on scientific understanding of each, and the research questions that need to be addressed for each to be more successful in the future. We discuss the character or "philosophy" of food preservation, with a question about the nature of the kill step(s), and the sensing challenges that need to be addressed. For HPP, key questions and needs center around whether its nonthermal effectiveness can be increased by increased pressures or pulsing, the theoretical treatment of rates of reaction as influenced by pressure, the assumption of uniform pressure distribution, and the need for (and difficulties involved in) in-situ measurement. For PEF, the questions include the rationale for pulsing, difficulties involved in continuous flow treatment chambers, the difference between electroporation theory and experimental observations, and the difficulties involved in in-situ measurement and monitoring of electric field distribution. © 2016 Institute of Food Technologists®

  8. Relational description of the measurement process in quantum field theory

    International Nuclear Information System (INIS)

    Gambini, Rodolfo; Porto, Rafael A.

    2002-01-01

    We have recently introduced a realistic, covariant, interpretation for the reduction process in relativistic quantum mechanics. The basic problem for a covariant description is the dependence of the states on the frame within which collapse takes place. A suitable use of the causal structure of the devices involved in the measurement process allowed us to introduce a covariant notion for the collapse of quantum states. However, a fully consistent description in the relativistic domain requires the extension of the interpretation to quantum fields. The extension is far from straightforward. Besides the obvious difficulty of dealing with the infinite degrees of freedom of the field theory, one has to analyse the restrictions imposed by causality concerning the allowed operations in a measurement process. In this paper we address these issues. We shall show that, in the case of partial causally connected measurements, our description allows us to include a wider class of causal operations than the one resulting from the standard way of computing conditional probabilities. This alternative description could be experimentally tested. A verification of this proposal would give stronger support to the realistic interpretations of the states in quantum mechanics. (author)

  9. Theory of the shock process in dense fluids

    International Nuclear Information System (INIS)

    Wallace, D.C.

    1991-01-01

    A shock is assumed to be a steady plane wave, and irreversible thermodynamics is assumed valid. The fluid is characterized by heat conduction and by viscous or viscoelastic response, according to the strain rate. It is shown that setting the viscosity zero produces a solution which constitutes a lower bound through the shock process for the shear stress, and upper bounds for the temperature, entropy, pressure, and heat current. It is shown that there exists an upper bound to the dynamic stresses which can be achieved during shock compression, that this bound corresponds to a purely elastic response of the fluid, and that solution for the shock process along this bound constitutes lower bounds for the temperature and entropy. It is shown that a continuous steady shock is possible only if the heat current is positive and the temperature is an increasing function of compression almost everywhere. In his theory of shocks in gases, Rayleigh showed that there is a maximum shock strength for which a continuous steady solution can exist with heat conduction but without viscosity. Two more limits are shown to exist for dense fluids, based on the fluid response in the leading edge of the shock: for shocks at the overdriven threshold and above, no solution is possible without heat transport; for shocks near the viscous fluid limit and above, viscous fluid theory is not valid, and the fluid response in the leading edge of the shock is approximately that of a nonplastic solid. The viscous fluid limit is estimated to be 13 kbar for water and 690 kbar for mercury

  10. Information Theory Analysis of Cascading Process in a Synthetic Model of Fluid Turbulence

    Directory of Open Access Journals (Sweden)

    Massimo Materassi

    2014-02-01

    Full Text Available The use of transfer entropy has proven helpful in detecting the direction of dynamical driving in the interaction of two processes, X and Y. In this paper, we present a different normalization for the transfer entropy, which is capable of better detecting the direction of information transfer. This new normalized transfer entropy is applied to the detection of the direction of energy flux transfer in a synthetic model of fluid turbulence, namely the Gledzer-Ohkitana-Yamada shell model. Indeed, this is a well-known model able to reproduce fully developed turbulence in Fourier space, characterized by an energy cascade towards the small scales (large wavenumbers k), so that applying the information-theory analysis to its outcome tests the reliability of the analysis tool rather than exploring the model physics. As a result, the presence of a direct cascade along the scales in the shell model and the locality of the interactions in the space of wavenumbers come out as expected, indicating the validity of this data-analysis tool. In this context, the use of a normalized version of transfer entropy, able to account for the difference in the intrinsic randomness of the interacting processes, appears to perform better, being able to discriminate the wrong conclusions to which the "traditional" transfer entropy would lead.
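
    As a rough illustration of the underlying quantity (the paper's contribution is a specific normalization, which is not reproduced here), the sketch below estimates a discrete, histogram-based transfer entropy; the function name, bin count, and test signals are our own choices.

```python
# Histogram-based estimate of transfer entropy T(Y -> X) in bits.
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=8):
    """T(Y->X) = sum p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ]."""
    # Discretize both series into equal-width bins.
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    triples = Counter(zip(xd[1:], xd[:-1], yd[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(xd[:-1], yd[:-1]))          # (x_t, y_t)
    pairs_xx = Counter(zip(xd[1:], xd[:-1]))           # (x_{t+1}, x_t)
    past_x = Counter(xd[:-1])                          # x_t
    n = len(xd) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]
        p_cond_marg = pairs_xx[(x1, x0)] / past_x[x0]
        te += p_joint * np.log2(p_cond_full / p_cond_marg)
    return te

# Toy check of directionality: y drives x with a one-step lag.
rng = np.random.default_rng(1)
y = rng.standard_normal(20000)
x = np.concatenate(([0.0], y[:-1])) + 0.5 * rng.standard_normal(20000)
print("T(y->x) = %.3f bits" % transfer_entropy(x, y))  # driving direction: large
print("T(x->y) = %.3f bits" % transfer_entropy(y, x))  # reverse direction: near zero
```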

  11. BOOK REVIEW: Theory of Neural Information Processing Systems

    Science.gov (United States)

    Galla, Tobias

    2006-04-01

    It is difficult not to be amazed by the ability of the human brain to process, to structure and to memorize information. Even by the toughest standards the behaviour of this network of about 10^11 neurons qualifies as complex, and both the scientific community and the public take great interest in the growing field of neuroscience. The scientific endeavour to learn more about the function of the brain as an information processing system is here a truly interdisciplinary one, with important contributions from biology, computer science, physics, engineering and mathematics as the authors quite rightly point out in the introduction of their book. The role of the theoretical disciplines here is to provide mathematical models of information processing systems and the tools to study them. These models and tools are at the centre of the material covered in the book by Coolen, Kühn and Sollich. The book is divided into five parts, providing basic introductory material on neural network models as well as the details of advanced techniques to study them. A mathematical appendix complements the main text. The range of topics is extremely broad, still the presentation is concise and the book well arranged. To stress the breadth of the book let me just mention a few keywords here: the material ranges from the basics of perceptrons and recurrent network architectures to more advanced aspects such as Bayesian learning and support vector machines; Shannon's theory of information and the definition of entropy are discussed, and a chapter on Amari's information geometry is not missing either. Finally the statistical mechanics chapters cover Gardner theory and the replica analysis of the Hopfield model, not without being preceded by a brief introduction of the basic concepts of equilibrium statistical physics. The book also contains a part on effective theories of the macroscopic dynamics of neural networks. Many dynamical aspects of neural networks are usually hard to find in the

  12. Exact perturbation theory of multiphoton processes at high intensities. [Schroedinger equation, perturbation theory, matrix

    Energy Technology Data Exchange (ETDEWEB)

    Faisal, F H.M. [Bielefeld Univ. (Germany, F.R.). Fakultaet fuer Physik

    1976-06-11

    In this work the perturbation theory for multiphoton processes at high intensities is investigated, and an analytical method is described for summing the perturbation series to extract the contribution from all terms that give rise to the absorption of N photons by an atomic system. The method is first applied to the solution of a simple model problem, and the result is confirmed by direct integration of the model Schroedinger equation. The usual lowest (nonvanishing)-order perturbation-theoretical calculation is also carried out for this model to demonstrate explicitly that the full result correctly reproduces that of the lowest-order theory in the limit of low intensity. The method is then extended to the case of an atomic system with a well-developed spectrum (e.g. the H atom), and the N-photon T-matrix is derived in terms of a ''photon matrix'' a_N, for which a three-term recurrence relation is established. Next, from the vantage point of the general result obtained here, a probe is made into the nature of several approximate nonperturbative solutions that have appeared in the literature in the past. It is shown here that their applicability is severely restricted by the requirement of essential spectral degeneracy of the atomic system. Finally, appendix A outlines a prescription for computing the photon matrix a_N, which (as in the usual lowest-order perturbation-theoretical calculation) requires a knowledge of the eigenfunctions and eigenvalues of the atomic Hamiltonian only.

  13. Projecting Grammatical Features in Nominals: Cognitive Processing Theory & Computational Implementation

    Science.gov (United States)

    2010-03-01

    underlying linguistic theory is an adaptation of X-Bar Theory (Chomsky, 1970; Jackendoff, 1977) called Bi-Polar Theory (Ball, 2007a). In Bi-Polar...University Press. Chomsky, N. (1970). Remarks on Nominalization. In Jacobs & Rosembaum (Eds.), Readings in English Transformational Grammar. Waltham, MA

  14. Process service quality evaluation based on Dempster-Shafer theory and support vector machine.

    Science.gov (United States)

    Pei, Feng-Que; Li, Dong-Bo; Tong, Yi-Fei; He, Fei

    2017-01-01

    Human involvement influences traditional service quality evaluations, which results in evaluations with low accuracy, poor reliability, and limited predictive power. This paper proposes a method employing a support vector machine (SVM) and Dempster-Shafer evidence theory to evaluate the service quality of a production process by handling a high number of input features with a small sample data set; the method is called SVMs-DS. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, the basic probability assignments (BPAs) are constructed, which can help the evaluation in a qualitative and quantitative way. The process service quality evaluation results are validated by the Dempster rules; the decision threshold to resolve conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.
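
    To make the fusion step concrete, here is a minimal sketch of Dempster's rule of combination over a two-class frame; the BPA numbers and class names are hypothetical, and this is not the authors' implementation (which derives BPAs from SVM outputs).

```python
# Dempster's rule of combination for basic probability assignments (BPAs).
from itertools import product

def dempster_combine(m1, m2):
    """Combine two BPAs given as dicts mapping frozenset hypotheses to mass."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:  # compatible evidence: assign mass to the intersection
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:      # incompatible evidence: accumulate conflict mass K
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    # Renormalize the non-conflicting mass by 1 - K.
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

GOOD, BAD = frozenset(["good"]), frozenset(["bad"])
EITHER = GOOD | BAD  # mass assigned to ignorance

# Hypothetical BPAs derived from three SVM outputs.
m_svm1 = {GOOD: 0.7, BAD: 0.1, EITHER: 0.2}
m_svm2 = {GOOD: 0.6, BAD: 0.2, EITHER: 0.2}
m_svm3 = {GOOD: 0.5, BAD: 0.3, EITHER: 0.2}

fused = dempster_combine(dempster_combine(m_svm1, m_svm2), m_svm3)
print({tuple(sorted(h)): round(w, 3) for h, w in fused.items()})
```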

  15. Process service quality evaluation based on Dempster-Shafer theory and support vector machine.

    Directory of Open Access Journals (Sweden)

    Feng-Que Pei

    Full Text Available Human involvement influences traditional service quality evaluations, which results in evaluations with low accuracy, poor reliability, and limited predictive power. This paper proposes a method employing a support vector machine (SVM) and Dempster-Shafer evidence theory to evaluate the service quality of a production process by handling a high number of input features with a small sample data set; the method is called SVMs-DS. Features that can affect production quality are extracted by a large number of sensors. Preprocessing steps such as feature simplification and normalization are reduced. Based on three individual SVM models, the basic probability assignments (BPAs) are constructed, which can help the evaluation in a qualitative and quantitative way. The process service quality evaluation results are validated by the Dempster rules; the decision threshold to resolve conflicting results is generated from the three SVM models. A case study is presented to demonstrate the effectiveness of the SVMs-DS method.

  16. Conflict monitoring in dual process theories of thinking.

    Science.gov (United States)

    De Neys, Wim; Glumicic, Tamara

    2008-03-01

    Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and a demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision-making errors, there are widely different views on the efficiency of the process. Kahneman [Kahneman, D. (2002). Maps of bounded rationality: A perspective on intuitive judgement and choice. Nobel Prize Lecture. Retrieved January 11, 2006, from: http://nobelprize.org/nobel_prizes/economics/laureates/2002/kahnemann-lecture.pdf] and Evans [Evans, J. St. B. T. (1984). Heuristic and analytic processing in reasoning. British Journal of Psychology, 75, 451-468], for example, claim that the monitoring of the heuristic system is typically quite lax, whereas others such as Sloman [Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological Bulletin, 119, 3-22] and Epstein [Epstein, S. (1994). Integration of the cognitive and psychodynamic unconscious. American Psychologist, 49, 709-724] claim it is flawless and that people typically experience a struggle between what they "know" and "feel" in the case of a conflict. The present study contrasted these views. Participants solved classic base-rate neglect problems while thinking aloud. In these problems a stereotypical description cues a response that conflicts with the response based on the analytic base-rate information. Verbal protocols showed no direct evidence for an explicitly experienced conflict. As Kahneman and Evans predicted, participants hardly ever mentioned the base rates and seemed to base their judgment exclusively on heuristic reasoning. However, more implicit measures of conflict detection, such as participants' retrieval of the base-rate information in an unannounced recall test, decision-making latencies, and the tendency to review the base rates, indicated that the base rates had been thoroughly processed. On control problems where base rates and

  17. Toward a Theory of Entrepreneurial Rents: a Simulation of the Market Process

    NARCIS (Netherlands)

    Keyhani, M; Levesque, M.; Madhok, A.

    2015-01-01

    While strategy theory relies heavily on equilibrium theories of economic rents such as Ricardian and monopoly rents, we do not yet have a comprehensive theory of disequilibrium or entrepreneurial rents. We use cooperative game theory to structure computer simulations of the market process in which

  18. Understanding Reactions to Workplace Injustice through Process Theories of Motivation: A Teaching Module and Simulation

    Science.gov (United States)

    Stecher, Mary D.; Rosse, Joseph G.

    2007-01-01

    Management and organizational behavior students are often overwhelmed by the plethora of motivation theories they must master at the undergraduate level. This article offers a teaching module geared toward helping students understand how two major process theories of motivation, equity and expectancy theories and theories of organizational…

  19. An Explanation of the Relationship between Instructor Humor and Student Learning: Instructional Humor Processing Theory

    Science.gov (United States)

    Wanzer, Melissa B.; Frymier, Ann B.; Irwin, Jeffrey

    2010-01-01

    This paper proposes the Instructional Humor Processing Theory (IHPT), a theory that incorporates elements of incongruity-resolution theory, disposition theory, and the elaboration likelihood model (ELM) of persuasion. IHPT is proposed and offered as an explanation for why some types of instructor-generated humor result in increased student…

  20. Temporal and speech processing skills in normal hearing individuals exposed to occupational noise.

    Science.gov (United States)

    Kumar, U Ajith; Ameenudin, Syed; Sangamanatha, A V

    2012-01-01

    Prolonged exposure to high levels of occupational noise can cause damage to hair cells in the cochlea and result in permanent noise-induced cochlear hearing loss. Consequences of cochlear hearing loss on speech perception and psychophysical abilities have been well documented. The primary goal of this research was to explore temporal processing and speech perception skills in individuals who are exposed to occupational noise of more than 80 dBA and have not yet incurred clinically significant threshold shifts. The contribution of temporal processing skills to speech perception in adverse listening situations was also evaluated. A total of 118 participants took part in this research. Participants comprised three groups of train drivers in the age ranges of 30-40 (n = 13), 41-50 (n = 9), and 51-60 (n = 6) years and their non-noise-exposed counterparts (n = 30 in each age group). Participants of all the groups, including the train drivers, had hearing sensitivity within 25 dB HL at the octave frequencies between 250 Hz and 8 kHz. Temporal processing was evaluated using gap detection, modulation detection, and duration pattern tests. Speech recognition was tested in the presence of multi-talker babble at -5 dB SNR. Differences between experimental and control groups were analyzed using ANOVA and independent-sample t-tests. Results showed a trend of reduced temporal processing skills in individuals with noise exposure. These deficits were observed despite normal peripheral hearing sensitivity. Speech recognition scores in the presence of noise were also significantly poorer in the noise-exposed group. Furthermore, poor temporal processing skills partially accounted for the speech recognition difficulties exhibited by the noise-exposed individuals. These results suggest that noise can cause significant distortions in the processing of suprathreshold temporal cues, which may add to difficulties in hearing in adverse listening conditions.

  1. Temporal and speech processing skills in normal hearing individuals exposed to occupational noise

    Directory of Open Access Journals (Sweden)

    U Ajith Kumar

    2012-01-01

    Full Text Available Prolonged exposure to high levels of occupational noise can cause damage to hair cells in the cochlea and result in permanent noise-induced cochlear hearing loss. Consequences of cochlear hearing loss on speech perception and psychophysical abilities have been well documented. The primary goal of this research was to explore temporal processing and speech perception skills in individuals who are exposed to occupational noise of more than 80 dBA and have not yet incurred clinically significant threshold shifts. The contribution of temporal processing skills to speech perception in adverse listening situations was also evaluated. A total of 118 participants took part in this research. Participants comprised three groups of train drivers in the age ranges of 30-40 (n = 13), 41-50 (n = 9), and 51-60 (n = 6) years and their non-noise-exposed counterparts (n = 30 in each age group). Participants of all the groups, including the train drivers, had hearing sensitivity within 25 dB HL at the octave frequencies between 250 Hz and 8 kHz. Temporal processing was evaluated using gap detection, modulation detection, and duration pattern tests. Speech recognition was tested in the presence of multi-talker babble at -5 dB SNR. Differences between experimental and control groups were analyzed using ANOVA and independent-sample t-tests. Results showed a trend of reduced temporal processing skills in individuals with noise exposure. These deficits were observed despite normal peripheral hearing sensitivity. Speech recognition scores in the presence of noise were also significantly poorer in the noise-exposed group. Furthermore, poor temporal processing skills partially accounted for the speech recognition difficulties exhibited by the noise-exposed individuals. These results suggest that noise can cause significant distortions in the processing of suprathreshold temporal cues, which may add to difficulties in hearing in adverse listening conditions.

  2. Group processes in medical education: learning from social identity theory.

    Science.gov (United States)

    Burford, Bryan

    2012-02-01

    The clinical workplace in which doctors learn involves many social groups, including representatives of different professions, clinical specialties and workplace teams. This paper suggests that medical education research does not currently take full account of the effects of group membership, and describes a theoretical approach from social psychology, the social identity approach, which allows those effects to be explored. The social identity approach has a long history in social psychology and provides an integrated account of group processes, from the adoption of group identity through a process of self-categorisation, to the biases and conflicts between groups. This paper outlines key elements of this theoretical approach and illustrates their relevance to medical education. The relevance of the social identity approach is illustrated with reference to a number of areas of medical education. The paper shows how research questions in medical education may be usefully reframed in terms of social identity in ways that allow a deeper exploration of the psychological processes involved. Professional identity and professionalism may be viewed in terms of self-categorisation rather than simply attainment; the salience of different identities may be considered as influences on teamwork and interprofessional learning, and issues in communication and assessment may be considered in terms of intergroup biases. Social identity theory provides a powerful framework with which to consider many areas of medical education. It allows disparate influences on, and consequences of, group membership to be considered as part of an integrated system, and allows assumptions, such as about the nature of professional identity and interprofessional tensions, to be made explicit in the design of research studies. This power to question assumptions and develop deeper and more meaningful research questions may be increasingly relevant as the nature and role of the medical profession change

  3. Nonepileptic seizures under levetiracetam therapy: a case report of forced normalization process

    Directory of Open Access Journals (Sweden)

    Anzellotti F

    2014-05-01

    Full Text Available Francesca Anzellotti, Raffaella Franciotti, Holta Zhuzhuni, Aurelio D'Amico, Astrid Thomas, Marco Onofrj Department of Neuroscience and Imaging, Aging Research Centre, Gabriele d'Annunzio University Foundation, Gabriele d'Annunzio University, Chieti, Italy Abstract: Nonepileptic seizures (NES) apparently look like epileptic seizures, but are not associated with ictal electrical discharges in the brain. NES constitute one of the most important differential diagnoses of epilepsy. They have been recognized as a distinctive clinical phenomenon for centuries, and video/electroencephalogram monitoring has allowed clinicians to make near-certain diagnoses. NES are supposedly unrelated to organic brain lesions, and despite the preponderance of a psychiatric/psychological context, they may have an iatrogenic origin. We report a patient with NES precipitated by levetiracetam therapy; in this case, NES was observed during the disappearance of epileptiform discharges from the routine video/electroencephalogram. We discuss the possible mechanisms underlying NES with regard to alternative psychoses associated with the phenomenon of the forced normalization process. Keywords: nonepileptic seizures, forced normalization, levetiracetam, behavioral side effects

  4. [Implications of mental image processing in the deficits of verbal information coding during normal aging].

    Science.gov (United States)

    Plaie, Thierry; Thomas, Delphine

    2008-06-01

    Our study specifies the contributions of image generation and image maintenance processes occurring at the time of imaginal coding of verbal information in memory during normal aging. The memory capacities of 19 young adults (average age of 24 years) and 19 older adults (average age of 75 years) were assessed using recall tasks according to the imagery value of the stimuli to learn. Mental visual imagery capacities were assessed using tasks of image generation and temporary storage of mental images. The analysis of variance indicates a greater decrease with age in the concreteness effect. The major contribution of our study rests on the fact that the decline with age of dual coding of verbal information in memory would result primarily from the decline of image maintenance capacities and from a slowdown in image generation. (PsycINFO Database Record (c) 2008 APA, all rights reserved).

  5. Nonepileptic seizures under levetiracetam therapy: a case report of forced normalization process.

    Science.gov (United States)

    Anzellotti, Francesca; Franciotti, Raffaella; Zhuzhuni, Holta; D'Amico, Aurelio; Thomas, Astrid; Onofrj, Marco

    2014-01-01

    Nonepileptic seizures (NES) apparently look like epileptic seizures, but are not associated with ictal electrical discharges in the brain. NES constitute one of the most important differential diagnoses of epilepsy. They have been recognized as a distinctive clinical phenomenon for centuries, and video/electroencephalogram monitoring has allowed clinicians to make near-certain diagnoses. NES are supposedly unrelated to organic brain lesions, and despite the preponderance of a psychiatric/psychological context, they may have an iatrogenic origin. We report a patient with NES precipitated by levetiracetam therapy; in this case, NES was observed during the disappearance of epileptiform discharges from the routine video/electroencephalogram. We discuss the possible mechanisms underlying NES with regard to alternative psychoses associated with the phenomenon of the forced normalization process.

  6. Using Process Theory for Accumulating Project Management Knowledge : A Seven-Category Model

    NARCIS (Netherlands)

    Niederman, Fred; Mueller, Benjamin; March, Salvatore T.

    2018-01-01

    Process theory has become an important type of theory for the accumulation of knowledge in a number of disciplines. Process theory focuses on sequences of activities, their duration and the intervals between them, as they lead to particular outcomes. Thus it is particularly relevant to project

  7. Bioattractors: dynamical systems theory and the evolution of regulatory processes

    Science.gov (United States)

    Jaeger, Johannes; Monk, Nick

    2014-01-01

    In this paper, we illustrate how dynamical systems theory can provide a unifying conceptual framework for evolution of biological regulatory systems. Our argument is that the genotype–phenotype map can be characterized by the phase portrait of the underlying regulatory process. The features of this portrait – such as attractors with associated basins and their bifurcations – define the regulatory and evolutionary potential of a system. We show how the geometric analysis of phase space connects Waddington's epigenetic landscape to recent computational approaches for the study of robustness and evolvability in network evolution. We discuss how the geometry of phase space determines the probability of possible phenotypic transitions. Finally, we demonstrate how the active, self-organizing role of the environment in phenotypic evolution can be understood in terms of dynamical systems concepts. This approach yields mechanistic explanations that go beyond insights based on the simulation of evolving regulatory networks alone. Its predictions can now be tested by studying specific, experimentally tractable regulatory systems using the tools of modern systems biology. A systematic exploration of such systems will enable us to understand better the nature and origin of the phenotypic variability, which provides the substrate for evolution by natural selection. PMID:24882812

  8. A general theory for radiative processes in rare earth compounds

    International Nuclear Information System (INIS)

    Acevedo, R.; Meruane, T.

    1998-01-01

    The formal theory of radiative processes in centrosymmetric coordination compounds of the Ln-X type (where Ln(3+) is a trivalent lanthanide ion and X(-1) = Cl(-1), Br(-1)) is put forward based on a symmetry vibronic crystal field-ligand polarisation model. This research considers a truncated basis set for the intermediate states of the central metal ion and derives general master equations to account for both the overall observed spectral intensities and the measured relative vibronic intensity distributions for parity-forbidden but vibronically allowed electronic transitions. In addition, a procedure which includes the closure approximation over the intermediate electronic states is included in order to estimate the quantitative crystal field contribution to the total transition dipole moments of various selected electronic transitions. This formalism is both general and flexible, and it may be employed in any electronic excitations involving f^N-type configurations for the rare earths in centrosymmetric coordination compounds in cubic environments, and also in doped host crystals belonging to the space group Fm3m. (author)

  9. Concepts, Perception and the Dual Process Theories of Mind

    Directory of Open Access Journals (Sweden)

    Marcello Frixione

    2014-12-01

    Full Text Available In this article we argue that the problem of the relationships between concepts and perception in cognitive science is blurred by the fact that the very notion of concept is rather confused. Since it is not always clear exactly what concepts are, it is not easy to say, for example, whether and in what measure concept possession involves entertaining and manipulating perceptual representations, whether concepts are entirely different from perceptual representations, and so on. As a paradigmatic example of this state of affairs, we will start by taking into consideration the distinction between conceptual and nonconceptual content. The analysis of such a distinction will lead us to the conclusion that concept is a heterogeneous notion. Then we shall take into account the so called dual process theories of mind; this approach also points to concepts being a heterogeneous phenomenon: different aspects of conceptual competence are likely to be ascribed to different types of systems. We conclude that without a clear specification of what concepts are, the problem of the relationships between concepts and perception is somewhat ill-posed.

  10. Signal Detection Theory-Based Information Processing for the Detection of Breast Cancer at Microwave Frequencies

    National Research Council Canada - National Science Library

    Nolte, Loren

    2002-01-01

    The hypothesis is that one can use signal detection theory to improve the performance in detecting tumors in the breast by using this theory to develop task-oriented information processing techniques...

  11. Pion interferometry theory for the hydrodynamic stage of multiple processes

    International Nuclear Information System (INIS)

    Makhlin, A.N.; Sinyukov, Yu.M.

    1986-01-01

    The double-pion inclusive cross section for identical particles is described in the hydrodynamical theory of multiparticle production. The pion interferometry theory is developed for the case when secondary particles are generated against the background of internal relativistic motion of the radiating hadron matter. The connection between correlation functions in various experimental schemes is found within the framework of the relativistic Wigner function formalism

  12. Solitary-wave families of the Ostrovsky equation: An approach via reversible systems theory and normal forms

    International Nuclear Information System (INIS)

    Roy Choudhury, S.

    2007-01-01

    The Ostrovsky equation is an important canonical model for the unidirectional propagation of weakly nonlinear long surface and internal waves in a rotating, inviscid and incompressible fluid. Limited functional analytic results exist for the occurrence of one family of solitary-wave solutions of this equation, as well as their approach to the well-known solitons of the famous Korteweg-de Vries equation in the limit as the rotation becomes vanishingly small. Since solitary-wave solutions often play a central role in the long-time evolution of an initial disturbance, we consider such solutions here (via the normal form approach) within the framework of reversible systems theory. Besides confirming the existence of the known family of solitary waves and its reduction to the KdV limit, we find a second family of multihumped (or N-pulse) solutions, as well as a continuum of delocalized solitary waves (or homoclinics to small-amplitude periodic orbits). On isolated curves in the relevant parameter region, the delocalized waves reduce to genuine embedded solitons. The second and third families of solutions occur in regions of parameter space distinct from the known solitary-wave solutions and are thus entirely new. Directions for future work are also mentioned
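
    For reference, a standard form of the Ostrovsky equation (quoted from the general literature; the record itself does not display it) reads:

```latex
\partial_x \left( u_t + c\,u_x + \alpha\, u\, u_x + \beta\, u_{xxx} \right) = \gamma\, u
```

    Here c is the linear long-wave speed, \alpha and \beta are the nonlinearity and dispersion coefficients, and \gamma measures the background rotation; integrating the \gamma \to 0 limit once in x recovers the Korteweg-de Vries equation mentioned above.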

  13. Quasiclassical Theory of Spin Imbalance in a Normal Metal-Superconductor Heterostructure with a Spin-Active Interface

    International Nuclear Information System (INIS)

    Shevtsov, O; Löfwander, T

    2014-01-01

    Non-equilibrium phenomena in superconductors have attracted much attention since the first experiments on charge imbalance in the early 1970's. Nowadays a new promising line of research lies at an intersection between superconductivity and spintronics. Here we develop a quasiclassical theory of a single junction between a normal metal and a superconductor with a spin-active interface at finite bias voltages. Due to spin-mixing and spin-filtering effects of the interface a non-equilibrium magnetization (or spin imbalance) is induced at the superconducting side of the junction, which relaxes to zero in the bulk. A peculiar feature of the system is the presence of interface-induced Andreev bound states, which influence the magnitude and the decay length of spin imbalance. Recent experiments on spin and charge density separation in superconducting wires required external magnetic field for observing a spin signal via non-local measurements. Here, we propose an alternative way to observe spin imbalance without applying magnetic field

  14. Modulation of the inter-hemispheric processing of semantic information during normal aging. A divided visual field experiment.

    Science.gov (United States)

    Hoyau, E; Cousin, E; Jaillard, A; Baciu, M

    2016-12-01

    We evaluated the effect of normal aging on the inter-hemispheric processing of semantic information by using the divided visual field (DVF) method, with words and pictures. Two main theoretical models have been considered: (a) the HAROLD model, which posits that aging is associated with supplementary recruitment of the right hemisphere (RH) and decreased hemispheric specialization, and (b) the RH decline theory, which assumes that the RH becomes less efficient with aging, associated with increased LH specialization. Two groups of subjects were examined, a Young Group (YG) and an Old Group (OG), while participants performed a semantic categorization task (living vs. non-living) on words and pictures. The DVF was realized in two steps: (a) unilateral DVF presentation, with stimuli presented separately in each visual field, left or right, allowing for their initial processing by only one hemisphere, right or left, respectively; and (b) bilateral DVF presentation (BVF), with stimuli presented simultaneously in both visual fields, followed by their processing by both hemispheres. These two types of presentation permitted the evaluation of two main characteristics of the inter-hemispheric processing of information: hemispheric specialization (HS) and inter-hemispheric cooperation (IHC). Moreover, the BVF allowed determining the driver hemisphere for processing information presented in BVF. Results obtained in the OG indicated that: (a) semantic categorization was performed as accurately as in the YG, even if more slowly; (b) a non-semantic RH decline was observed; and (c) the LH controls semantic processing during the BVF, suggesting an increased role of the LH in aging. However, despite the stronger involvement of the LH in the OG, the RH is not completely devoid of semantic abilities. As discussed in the paper, neither the HAROLD model nor the RH decline theory fully explains this pattern of results. We rather suggest that the effect of aging on the hemispheric specialization and inter

  15. STUDY ON ENERGY EXCHANGE PROCESSES IN NORMAL OPERATION OF METRO ROLLING STOCK WITH REGENERATIVE BRAKING SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. O. Sulym

    2017-10-01

    Full Text Available Purpose. The analysis of existing studies showed that increasing the energy efficiency of metro rolling stock is becoming especially important and requires timely solutions. It is known that the implementation of regenerative braking systems on rolling stock will significantly help to solve this problem. It was proved that one of the key issues regarding the introduction of the above-mentioned systems is research on the efficient use of the electric energy of regenerative braking. The purpose of the work is to evaluate the amount of excess electric energy of regenerative braking under normal operating conditions of rolling stock with regenerative braking systems, in order to analyze the energy-saving reserves. Methodology. Quantifiable values of the electrical energy consumed for traction, returned to the contact line, and dissipated in braking resistors (excess energy) are determined using the results of experimental studies of energy exchange processes under normal operating conditions of metro rolling stock with regenerative systems. Statistical methods of data processing were applied as well. Findings. Analysis of metro rolling stock operation under the specified conditions on the Sviatoshinsko-Brovarskaia line of KP «Kyiv Metro system» indicates the following: 1) introduction of regenerative braking systems into the rolling stock allows returning about 17.9-23.2% of the electrical energy consumed for traction to the contact line; 2) there are reserves for improving the energy efficiency of rolling stock with regenerative systems at the level of 20.2-29.9% of the electrical energy consumed for traction. Originality. For the first time, it is proved that the most significant factor influencing the quantifiable values of electrical energy regeneration is the track profile. It is suggested to use coefficients which indicate the amount and reserves of unused (excess) electrical energy for quantitative evaluation. Studies on

  16. Usability of a theory of visual attention (TVA) for parameter-based measurement of attention I: evidence from normal subjects

    DEFF Research Database (Denmark)

    Finke, Kathrin; Bublak, Peter; Krummenacher, Joseph

    2005-01-01

    The present study investigated the usability of whole and partial report of briefly displayed letter arrays as a diagnostic tool for the assessment of attentional functions. The tool is based on Bundesen's (1990, 1998, 2002; Bundesen et al., 2005) theory of visual attention (TVA), which assumes four separable attentional components: processing speed, working memory storage capacity, spatial distribution of attention, and top-down control. A number of studies (Duncan et al., 1999; Habekost & Bundesen, 2003; Peers et al., 2005) have already demonstrated the clinical relevance of these parameters. The present study was designed to examine whether (a) a shortened procedure bears sufficient accuracy and reliability, (b) the procedures reveal attentional constructs with clinical relevance, and (c) the mathematically independent parameters are also empirically independent

  17. A predictive processing theory of sensorimotor contingencies: Explaining the puzzle of perceptual presence and its absence in synesthesia.

    Science.gov (United States)

    Seth, Anil K

    2014-01-01

    Normal perception involves experiencing objects within perceptual scenes as real, as existing in the world. This property of "perceptual presence" has motivated "sensorimotor theories" which understand perception to involve the mastery of sensorimotor contingencies. However, the mechanistic basis of sensorimotor contingencies and their mastery has remained unclear. Sensorimotor theory also struggles to explain instances of perception, such as synesthesia, that appear to lack perceptual presence and for which relevant sensorimotor contingencies are difficult to identify. On alternative "predictive processing" theories, perceptual content emerges from probabilistic inference on the external causes of sensory signals, however, this view has addressed neither the problem of perceptual presence nor synesthesia. Here, I describe a theory of predictive perception of sensorimotor contingencies which (1) accounts for perceptual presence in normal perception, as well as its absence in synesthesia, and (2) operationalizes the notion of sensorimotor contingencies and their mastery. The core idea is that generative models underlying perception incorporate explicitly counterfactual elements related to how sensory inputs would change on the basis of a broad repertoire of possible actions, even if those actions are not performed. These "counterfactually-rich" generative models encode sensorimotor contingencies related to repertoires of sensorimotor dependencies, with counterfactual richness determining the degree of perceptual presence associated with a stimulus. While the generative models underlying normal perception are typically counterfactually rich (reflecting a large repertoire of possible sensorimotor dependencies), those underlying synesthetic concurrents are hypothesized to be counterfactually poor. In addition to accounting for the phenomenology of synesthesia, the theory naturally accommodates phenomenological differences between a range of experiential states

  18. Information processing in micro and meso-scale neural circuits during normal and disease states

    Science.gov (United States)

    Luongo, Francisco

    Neural computation can occur at multiple spatial and temporal scales. The net effect of all of these processes is to guide optimal behaviors within the constraints imposed by the physical world. How the circuits of the brain achieve this goal represents a central question in systems neuroscience. Here I explore the many ways in which the circuits of the brain can process information at both the micro and meso scale. Understanding the way information is represented and processed in the brain could shed light on the neuropathology underlying complex neuropsychiatric diseases such as autism and schizophrenia. Chapter 2 establishes an experimental paradigm for assaying patterns of microcircuit activity and examines the role of dopaminergic modulation on prefrontal microcircuits. We find that dopamine type 2 (D2) receptor activation results in an increase in spontaneous activity while dopamine type 1 (D1) activation does not. Chapter 3 of this dissertation presents a study that illustrates how cholinergic activation normally produces what has been suggested as a neural substrate of attention: pairwise decorrelation in microcircuit activity. This study also shows that in two etiologically distinct mouse models of autism, FMR1 knockout mice and valproic-acid-exposed mice, this ability to decorrelate in the presence of cholinergic activation is lost. This represents a putative microcircuit-level biomarker of autism. Chapter 4 examines the structure/function relationship within the prefrontal microcircuit. Spontaneous activity in prefrontal microcircuits is shown to be organized according to a small-world architecture. Interestingly, this architecture is important for one concrete function of neuronal microcircuits: the ability to produce temporally stereotyped patterns of activation. In the final chapter, we identify subnetworks in chronic intracranial electrocorticographic (ECoG) recordings using pairwise electrode coherence and dimensionality reduction

  19. Developing a new theory of knowledge sharing : Documenting and reflecting on a messy process

    NARCIS (Netherlands)

    Martinsons, M.G.; Davison, R.M.; Ou, Carol

    2015-01-01

    Much has been written about theories and how they can be tested. Unfortunately, much less has been written about how to develop them. This paper sheds light on the process of new theory development. We document and reflect on how we developed a context-sensitive indigenous theory of knowledge

  20. Finding Commonalities: Social Information Processing and Domain Theory in the Study of Aggression

    Science.gov (United States)

    Nucci, Larry

    2004-01-01

    The Arsenio and Lemerise (this issue) proposal integrating social information processing (SIP) and domain theory to study children's aggression is evaluated from a domain theory perspective. Basic tenets of domain theory rendering it compatible with SIP are discussed as well as points of divergence. Focus is directed to the proposition that…

  1. The Relevance of Theories of the Policy Process to Educational Decision-Making.

    Science.gov (United States)

    Ryan, R. J.

    1985-01-01

    Two case studies of educational decision making are used to test the utility of some current theories of the policy-formation process; a framework for the application of these theories is proposed; and the merits of applying existing theories before seeking new paradigms are stressed. (MSE)

  2. Process convergence of self-normalized sums of i.i.d. random ...

    Indian Academy of Sciences (India)

    The study of the asymptotics of self-normalized sums is also interesting. Logan ... if the constituent random variables are from the domain of attraction of a normal distribution ... index of stability α which equals 2 (for definition, see §2).
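    The self-normalized sum in question is T_n = S_n / V_n with S_n = Σ X_i and V_n² = Σ X_i². A small simulation sketch, assuming i.i.d. variables in the domain of attraction of a normal distribution, illustrates the asymptotic standard normality:

```python
import numpy as np

rng = np.random.default_rng(0)

def self_normalized_sum(x):
    """T_n = S_n / V_n with S_n = sum(x) and V_n = sqrt(sum(x**2))."""
    return x.sum() / np.sqrt((x ** 2).sum())

# For centred i.i.d. variables in the domain of attraction of a normal law,
# T_n is asymptotically standard normal.
samples = [self_normalized_sum(rng.standard_normal(1_000)) for _ in range(5_000)]
print(np.mean(samples), np.std(samples))  # close to 0 and 1, respectively
```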

  3. Evaluation of Two Absolute Radiometric Normalization Algorithms for Pre-processing of Landsat Imagery

    Institute of Scientific and Technical Information of China (English)

    Xu Hanqiu

    2006-01-01

    In order to evaluate radiometric normalization techniques, two image normalization algorithms for absolute radiometric correction of Landsat imagery were quantitatively compared in this paper: the Illumination Correction Model proposed by Markham and Irish, and the Illumination and Atmospheric Correction Model developed by the Remote Sensing and GIS Laboratory of Utah State University. Relative noise, correlation coefficient and slope value, derived from pseudo-invariant features identified in the multitemporal images, were used as the criteria for the evaluation and comparison. The differences between the normalized multitemporal images were significantly reduced when the seasons of the multitemporal images were different. However, there was no significant difference between the normalized and unnormalized images under similar seasonal conditions. Furthermore, the correction results of the two algorithms are similar when the images are relatively clear with a uniform atmospheric condition. Therefore, radiometric normalization procedures should be carried out if the multitemporal images have a significant seasonal difference.
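    The three evaluation criteria named in this record can be computed directly over pseudo-invariant feature (PIF) pixels of two co-registered bands. A hedged sketch follows; the function name, the synthetic data, and the exact definition of relative noise (residual scatter over mean level) are assumptions for illustration, not the paper's formulas:

```python
import numpy as np

def pif_criteria(reference, normalized, pif_mask):
    """Correlation coefficient, regression slope, and relative noise over
    pseudo-invariant feature (PIF) pixels of two co-registered bands."""
    x = reference[pif_mask].astype(float)
    y = normalized[pif_mask].astype(float)
    r = np.corrcoef(x, y)[0, 1]             # correlation coefficient
    slope = np.polyfit(x, y, 1)[0]          # slope of normalized vs. reference
    rel_noise = np.std(y - x) / np.mean(x)  # residual scatter over mean level
    return r, slope, rel_noise

# Synthetic demonstration data.
rng = np.random.default_rng(4)
ref = rng.uniform(0.0, 255.0, size=(100, 100))
norm = 0.95 * ref + 5.0 + rng.normal(0.0, 2.0, size=ref.shape)
mask = rng.random(ref.shape) < 0.05         # hypothetical PIF pixels
print(pif_criteria(ref, norm, mask))
```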

  4. Rate Theory for Correlated Processes: Double Jumps in Adatom Diffusion

    DEFF Research Database (Denmark)

    Jacobsen, J.; Jacobsen, Karsten Wedel; Sethna, J.

    1997-01-01

    We study the rate of activated motion over multiple barriers, in particular the correlated double jump of an adatom diffusing on a missing-row reconstructed platinum (110) surface. We develop a transition path theory, showing that the activation energy is given by the minimum-energy trajectory which succeeds in the double jump. We explicitly calculate this trajectory within an effective-medium molecular dynamics simulation. A cusp in the acceptance region leads to a √T prefactor for the activated rate of double jumps. Theory and numerical results agree.

  5. Large-scale grain growth in the solid-state process: From "Abnormal" to "Normal"

    Science.gov (United States)

    Jiang, Minhong; Han, Shengnan; Zhang, Jingwei; Song, Jiageng; Hao, Chongyan; Deng, Manjiao; Ge, Lingjing; Gu, Zhengfei; Liu, Xinyu

    2018-02-01

    Abnormal grain growth (AGG) has been a common phenomenon in ceramic and metallurgical processing since prehistoric times. However, it has usually been very difficult to grow big single crystals (over centimeter scale) by the AGG method due to its so-called occasionality. Based on AGG, a solid-state crystal growth (SSCG) method was developed. The greatest advantages of the SSCG technology are the simplicity and cost-effectiveness of the technique. But the traditional SSCG technology is still uncontrollable. This article first summarizes the history and current status of AGG, then reports recent technical developments from AGG to SSCG, and further introduces a new seed-free, solid-state crystal growth (SFSSCG) technology. This SFSSCG method allows us to repeatedly and controllably fabricate large-scale single crystals of appreciably high quality and relatively stable chemical composition at a relatively low temperature, at least in the (K0.5Na0.5)NbO3 (KNN) and Cu-Al-Mn systems. In this sense, the exaggerated grain growth is no longer 'Abnormal' but 'Normal', since it can now be artificially controlled and repeated. This article also provides a crystal growth model to qualitatively explain the mechanism of SFSSCG for the KNN system. Compared with the traditional melt and high-temperature solution growth methods, the SFSSCG method has the advantages of low energy consumption, low investment, simple technique, and composition homogeneity, overcoming the issues with incongruent melting and high volatility. SFSSCG could be helpful for improving the mechanical and physical properties of single crystals, which should be promising for industrial applications.

  6. Higher Fasting Plasma Glucose Levels, within the Normal Range, are Associated with Decreased Processing Speed in High Functioning Young Elderly

    OpenAIRE

    Raizes, Meytal; Elkana, Odelia; Franko, Motty; Springer, Ramit Ravona; Segev, Shlomo; Beeri, Michal Schnaider

    2016-01-01

    We explored the association of plasma glucose levels within the normal range with processing speed in high functioning young elderly, free of type 2 diabetes mellitus (T2DM). A sample of 41 participants (mean age = 64.7, SD = 10; glucose 94.5 mg/dL, SD = 9.3), were examined with a computerized cognitive battery. Hierarchical linear regression analysis showed that higher plasma glucose levels, albeit within the normal range (

  7. Scattering process in the Scalar Duffin-Kemmer-Petiau gauge theory

    International Nuclear Information System (INIS)

    Beltran, J.; Pimentel, B. M.; Soto, D. E.

    2016-01-01

    In this work we calculate the cross sections of scattering processes in the Duffin-Kemmer-Petiau theory coupled to Maxwell's electromagnetic field. Specifically, we find the propagator of the free theory, and the scattering amplitudes and cross sections at Born level for the Moeller and Compton scattering processes of this model. For this purpose we use the analytic representation for free propagators and work within the framework of the Causal Perturbation Theory of Epstein and Glaser. (paper)

  8. Theory of suppressing avalanche process of carrier in short pulse laser irradiated dielectrics

    Energy Technology Data Exchange (ETDEWEB)

    Deng, H. X.; Zu, X. T.; Xiang, X. [School of Physical Electronics, University of Electronic Science and Technology of China, Chengdu 610054 (China)]; Zheng, W. G.; Yuan, X. D. [Research Center of Laser Fusion, China Academy of Engineering Physics, Mianyang 621900 (China)]; Sun, K. [Department of Materials Engineering and Sciences, University of Michigan, 413B Space Research Building, Ann Arbor, Michigan 48109-2143 (United States)]; Gao, F. [Pacific Northwest National Laboratory, P. O. Box 999, Richland, Washington 99352 (United States)]

    2014-05-28

    A theory for controlling the avalanche process of carriers during short-pulse laser irradiation is proposed. We show that the avalanche process of conduction band electrons (CBEs) is determined by the occupation number of phonons in dielectrics. The theory provides a way to suppress the avalanche process and a direct criterion for the contributions of the avalanche process and the photon ionization process to the generation of CBEs. The obtained temperature-dependent rate equation shows that the laser-induced damage threshold of dielectrics, e.g., fused silica, increases nonlinearly as temperature decreases. The present theory suggests a new approach to improving the laser-induced damage threshold of dielectrics.

  9. Individual Differences in Working Memory Capacity and Dual-Process Theories of the Mind

    Science.gov (United States)

    Barrett, Lisa Feldman; Tugade, Michele M.; Engle, Randall W.

    2004-01-01

    Dual-process theories of the mind are ubiquitous in psychology. A central principle of these theories is that behavior is determined by the interplay of automatic and controlled processing. In this article, the authors examine individual differences in the capacity to control attention as a major contributor to differences in working memory…

  10. Mathematics Education as a Proving-Ground for Information-Processing Theories.

    Science.gov (United States)

    Greer, Brian, Ed.; Verschaffel, Lieven, Ed.

    1990-01-01

    Five papers discuss the current and potential contributions of information-processing theory to our understanding of mathematical thinking as those contributions affect the practice of mathematics education. It is concluded that information-processing theories need to be supplemented in various ways to more adequately reflect the complexity of…

  11. Moral judgement in the context of dual process theory

    OpenAIRE

    Schinková, Kristýna

    2017-01-01

    The philosopher and psychologist Joshua Greene came up with a theory of moral judgement that integrates both rationalism and intuitionism: the dual process theory. It says that during moral judgement the unconscious, emotional processes as well as the conscious, rational processes play an important role. At the same time it ties each kind of process to a respective moral output: if the judgement is made on the basis of intuition, it will be of a deontological type, and on the other hand the cont...

  12. Translating Unstructured Workflow Processes to Readable BPEL: Theory and Implementation

    DEFF Research Database (Denmark)

    van der Aalst, Willibrordus Martinus Pancratius; Lassen, Kristian Bisgaard

    2008-01-01

    and not easy to use by end-users. Therefore, we provide a mapping from Workflow Nets (WF-nets) to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. In addition, we have implemented the algorithm in a tool called WorkflowNet2BPEL4WS.

  13. Speech production, dual-process theory, and the attentive addressee

    OpenAIRE

    Pollard, A. J.

    2012-01-01

    This thesis outlines a model of Speaker-Addressee interaction that suggests some answers to two linked problems current in speech production. The first concerns an under-researched issue in psycholinguistics: how are decisions about speech content – conceptualization – carried out? The second, a pragmatics problem, asks how Speakers, working under the heavy time pressures of normal dialogue, achieve optimal relevance often enough for successful communication to take place.

  14. The Theory of Ratio Scale Estimation: Saaty's Analytic Hierarchy Process

    OpenAIRE

    Patrick T. Harker; Luis G. Vargas

    1987-01-01

    The Analytic Hierarchy Process developed by Saaty (Saaty, T. L. 1980. The Analytic Hierarchy Process. McGraw-Hill, New York.) has proven to be an extremely useful method for decision making and planning. However, some researchers in these areas have raised concerns over the theoretical basis underlying this process. This paper addresses currently debated issues concerning the theoretical foundations of the Analytic Hierarchy Process. We also illustrate through proof and through examples the v...

  15. Dynamic competition and enterprising discovery: Kirzner’s market process theory

    Directory of Open Access Journals (Sweden)

    Ahmet İhsan KAYA

    2011-12-01

    Full Text Available Market process theory was developed by followers of the Austrian School tradition as an alternative to neo-classical price theory in order to explain real markets. In contrast to neo-classical economics, which focuses on the concept of equilibrium, market process theory seeks to explain disequilibrium and the movement toward equilibrium. In doing so, Israel Kirzner's analyses rest on the role of the entrepreneur in dealing with the limited information, time-related uncertainty, and market uncertainty that neo-classical price theory leaves out of consideration. In this study, Kirzner's theory of competition and entrepreneurship is discussed along with the contributions of Mises and Hayek. The study constitutes an introduction to Kirzner's market process theory.

  16. Learning Theory Estimates with Observations from General Stationary Stochastic Processes.

    Science.gov (United States)

    Hang, Hanyuan; Feng, Yunlong; Steinwart, Ingo; Suykens, Johan A K

    2016-12-01

    This letter investigates the supervised learning problem with observations drawn from certain general stationary stochastic processes. Here by general, we mean that many stationary stochastic processes can be included. We show that when the stochastic processes satisfy a generalized Bernstein-type inequality, a unified treatment on analyzing the learning schemes with various mixing processes can be conducted and a sharp oracle inequality for generic regularized empirical risk minimization schemes can be established. The obtained oracle inequality is then applied to derive convergence rates for several learning schemes such as empirical risk minimization (ERM), least squares support vector machines (LS-SVMs) using given generic kernels, and SVMs using gaussian kernels for both least squares and quantile regression. It turns out that for independent and identically distributed (i.i.d.) processes, our learning rates for ERM recover the optimal rates. For non-i.i.d. processes, including geometrically [Formula: see text]-mixing Markov processes, geometrically [Formula: see text]-mixing processes with restricted decay, [Formula: see text]-mixing processes, and (time-reversed) geometrically [Formula: see text]-mixing processes, our learning rates for SVMs with gaussian kernels match, up to some arbitrarily small extra term in the exponent, the optimal rates. For the remaining cases, our rates are at least close to the optimal rates. As a by-product, the assumed generalized Bernstein-type inequality also provides an interpretation of the so-called effective number of observations for various mixing processes.

  17. Semi-analytical quasi-normal mode theory for the local density of states in coupled photonic crystal cavity-waveguide structures

    DEFF Research Database (Denmark)

    de Lasson, Jakob Rosenkrantz; Kristensen, Philip Trøst; Mørk, Jesper

    2015-01-01

    We present and validate a semi-analytical quasi-normal mode (QNM) theory for the local density of states (LDOS) in coupled photonic crystal (PhC) cavity-waveguide structures. By means of an expansion of the Green's function on one or a few QNMs, a closed-form expression for the LDOS is obtained, and for two types of two-dimensional PhCs, with one and two cavities side-coupled to an extended waveguide, the theory is validated against numerically exact computations. For the single cavity, a slightly asymmetric spectrum is found, which the QNM theory reproduces, and for two cavities a non-trivial spectrum with a peak and a dip is found, which is reproduced only when both of the two relevant QNMs are included in the theory. In both cases, we find relative errors below 1% in the bandwidth of interest.

  18. A unified bond theory, probabilistic meso-scale modeling, and experimental validation of deformed steel rebar in normal strength concrete

    Science.gov (United States)

    Wu, Chenglin

    Bond between deformed rebar and concrete is affected by the rebar deformation pattern, concrete properties, concrete confinement, and rebar-concrete interfacial properties. Two distinct groups of bond models were traditionally developed based on the dominant effects of concrete splitting and near-interface shear-off failures. Their accuracy depended heavily upon the test data sets selected in analysis and calibration. In this study, a unified bond model is proposed and developed based on an analogy to the indentation problem around the rib front of deformed rebar. This mechanics-based model can take into account the combined effect of concrete splitting and interface shear-off failures, resulting in average bond strengths for all practical scenarios. To understand the fracture process associated with bond failure, a probabilistic meso-scale model of concrete is proposed and its sensitivity to interface and confinement strengths is investigated. Both the mechanical and finite element models are validated with the available test data sets and are superior to existing models in prediction of average bond strength (rib spacing-to-height ratio of deformed rebar. It can accurately predict the transition of failure modes from concrete splitting to rebar pullout and predict the effect of rebar surface characteristics as the rib spacing-to-height ratio increases. Based on the unified theory, a global bond model is proposed and developed by introducing bond-slip laws, and validated with testing of concrete beams with spliced reinforcement, achieving a load capacity prediction error of less than 26%. The optimal rebar parameters and concrete cover in structural designs can be derived from this study.

  19. Implicit and explicit theories in the teaching and learning processes of music theory

    Directory of Open Access Journals (Sweden)

    Henry Roa Ordoñez

    2014-06-01

    Full Text Available This study explores the similarities and divergences between teachers' pedagogical discourse and their performance in the classroom, from the perspective of the different educational paradigms that guide educational practice today. The teaching and learning of music theory constitute the backbone of the curriculum proposed by the Department of Music, with implications for the other musical areas; therefore, the training program that orients the area of music theory requires an assessment of the impact of the performance of the teacher in charge of this course, as an essential condition for establishing elements of construction and transfer of knowledge in each of the disciplines that make up the curricular structure of the Department of Music.

  20. Dual-process models of health-related behaviour and cognition: a review of theory.

    Science.gov (United States)

    Houlihan, S

    2018-03-01

    The aim of this review was to synthesise a spectrum of theories incorporating dual-process models of health-related behaviour. Review of theory, adapted loosely from Cochrane-style systematic review methodology. Inclusion criteria were specified to identify all relevant dual-process models that explain decision-making in the context of decisions made about human health. Data analysis took the form of iterative template analysis (adapted from the conceptual synthesis framework used in other reviews of theory), and in this way theories were synthesised on the basis of shared theoretical constructs and causal pathways. Analysis and synthesis proceeded in turn, instead of moving uni-directionally from analysis of individual theories to synthesis of multiple theories. Namely, the reviewer considered and reconsidered individual theories and theoretical components in generating the narrative synthesis' main findings. Drawing on systematic review methodology, 11 electronic databases were searched for relevant dual-process theories. After de-duplication, 12,198 records remained. Screening of title and abstract led to the exclusion of 12,036 records, after which 162 full-text records were assessed. Of those, 21 records were included in the review. Moving back and forth between analysis of individual theories and the synthesis of theories grouped on the basis of theme or focus yielded additional insights into the orientation of a theory to an individual. Theories could be grouped in part on their treatment of an individual as an irrational actor, as social actor, as actor in a physical environment or as a self-regulated actor. Synthesising identified theories into a general dual-process model of health-related behaviour indicated that such behaviour is the result of both propositional and unconscious reasoning driven by an individual's response to internal cues (such as heuristics, attitude and affect), physical cues (social and physical environmental stimuli) as well as

  1. Can dual processing theory explain physics students’ performance on the Force Concept Inventory?

    Directory of Open Access Journals (Sweden)

    Anna K. Wood

    2016-07-01

    Full Text Available According to dual processing theory there are two types, or modes, of thinking: system 1, which involves intuitive and nonreflective thinking, and system 2, which is more deliberate and requires conscious effort and thought. The Cognitive Reflection Test (CRT) is a widely used and robust three-item instrument that measures the tendency to override system 1 thinking and to engage in reflective, system 2 thinking. Each item on the CRT has an intuitive (but wrong) answer that must be rejected in order to answer the item correctly. We therefore hypothesized that performance on the CRT may give useful insights into the cognitive processes involved in learning physics, where success involves rejecting the common, intuitive ideas about the world (often called misconceptions) and instead carefully applying physical concepts. This paper presents initial results from an ongoing study examining the relationship between students' CRT scores and their performance on the Force Concept Inventory (FCI), which tests students' understanding of Newtonian mechanics. We find that a higher CRT score predicts a higher FCI score for both precourse and postcourse tests. However, we also find that the FCI normalized gain is independent of CRT score. The implications of these results are discussed.
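    The FCI normalized gain referred to here is Hake's gain, g = (post − pre)/(100 − pre) for percentage scores: the fraction of the available improvement actually realised. A one-function sketch (the function name is illustrative):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain g = (post - pre) / (100 - pre),
    with pre- and post-test scores given in percent."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# A class moving from 40% to 70% on the FCI realises half of its possible gain.
print(normalized_gain(40.0, 70.0))  # -> 0.5
```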

  2. Attentional Processing and Teacher Ratings in Hyperactive, Learning Disabled and Normal Boys.

    Science.gov (United States)

    Brown, Ronald T.; Wynne, Martha Ellen

    Sustained attention and inhibitory control of 15 nonhyperactive, learning disabled (LD) boys, 15 hyperactive but not LD boys, and 15 normal boys (11-12 years old) were studied using teacher ratings of impulse control in the classroom and testing results. Coming to attention, decision making, sustained attention, and attention-concentration were…

  3. Contrasting dynamics of organizational learning : a process theory perspective

    NARCIS (Netherlands)

    Berends, J.J.; Lammers, I.S.

    2006-01-01

    In this paper we analyze the process characteristics of organizational learning. A wide variety of process models of organizational learning have been proposed in the literature, but these models have not been systematically investigated. In this paper we use Van de Ven and Poole's (1995) taxonomy

  4. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  5. A Look at the Memory Performance of Retarded and Normal Children Utilizing the Levels of Processing Framework.

    Science.gov (United States)

    Lupart, Judy L.; Mulcahy, Robert F.

    Memory performance differences of mental age matched (9-12 years) educable mentally retarded (EMR) (n=56) and normal (n=56) children were examined in two experiments using the F. Craik and R. Lockhart levels of processing framework. In experiment 1, Ss were randomly assigned to an incidental, intentional, or planned intentional learning condition,…

  6. Arrays of surface-normal electroabsorption modulators for the generation and signal processing of microwave photonics signals

    NARCIS (Netherlands)

    Noharet, Bertrand; Wang, Qin; Platt, Duncan; Junique, Stéphane; Marpaung, D.A.I.; Roeloffzen, C.G.H.

    2011-01-01

    The development of an array of 16 surface-normal electroabsorption modulators operating at 1550nm is presented. The modulator array is dedicated to the generation and processing of microwave photonics signals, targeting a modulation bandwidth in excess of 5GHz. The hybrid integration of the

  7. A comparative study of deficit pattern in theory of mind and emotion regulation methods in evaluating patients with bipolar disorder and normal individuals

    OpenAIRE

    Ali Fakhari; Khalegh Minashiri; Abolfazl Fallahi; Mohammad Taher Panah

    2013-01-01

    BACKGROUND: This study compared patterns of deficit in "theory of mind" and "emotion regulation" in patients with bipolar disorder and normal individuals. METHODS: In this causal-comparative study, subjects were 20 patients with bipolar disorder and 20 normal individuals. Patients were selected via convenience sampling method among hospitalized patients at Razi hospital of Tabriz, Iran. The data was collected through two scales: Reading the Mind in the Eyes Test and Emotion Regulation Questionnai...

  8. Application of a model of social information processing to nursing theory: how nurses respond to patients.

    Science.gov (United States)

    Sheldon, Lisa Kennedy; Ellington, Lee

    2008-11-01

    This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of how nurses respond to patients and help develop nursing theories further. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions, with implications for nursing care and patient outcomes.

  9. Eyewitness Identification Reform: Data, Theory, and Due Process.

    Science.gov (United States)

    Clark, Steven E

    2012-05-01

    Some commentators view my analyses (Clark, 2012, this issue) as an important step forward in assessing the costs and benefits of eyewitness identification reform. Others suggest that the trade-off between correct identifications lost and false identifications avoided is well-known; that the expected utility model is misspecified; and that the loss of correct identifications due to the use of reformed eyewitness identification procedures is irrelevant to policy decisions, as those correct identifications are the illegitimate product of suggestion and lucky guesses. Contrary to these criticisms, the loss of correct identifications has not been adequately considered in theoretical or policy matters, criticisms regarding the various utilities do not substantively change the nature of the trade-off, and the dismissal of lost correct identifications is based not on data but on an outdated theory of recognition memory. © The Author(s) 2012.

  10. Elevated intrabolus pressure identifies obstructive processes when integrated relaxation pressure is normal on esophageal high-resolution manometry.

    Science.gov (United States)

    Quader, Farhan; Reddy, Chanakyaram; Patel, Amit; Gyawali, C Prakash

    2017-07-01

    Elevated integrated relaxation pressure (IRP) on esophageal high-resolution manometry (HRM) identifies obstructive processes at the esophagogastric junction (EGJ). Our aim was to determine whether intrabolus pressure (IBP) can identify structural EGJ processes when IRP is normal. In this observational cohort study, adult patients with dysphagia undergoing HRM were evaluated for endoscopic evidence of structural EGJ processes (strictures, rings, hiatus hernia) in the setting of normal IRP. HRM metrics [IRP, distal contractile integral (DCI), distal latency (DL), IBP, and EGJ contractile integral (EGJ-CI)] were compared among 74 patients with structural EGJ findings (62.8 ± 1.6 yr, 67.6% women), 27 patients with normal EGD (52.9 ± 3.2 yr, 70.3% women), and 21 healthy controls (27.6 ± 0.6 yr, 52.4% women). Findings were validated in 85 consecutive symptomatic patients to address clinical utility. In the primary cohort, mean IBP (18.4 ± 0.9 mmHg) was higher with structural EGJ findings compared with dysphagia with normal EGD (13.5 ± 1.1 mmHg, P = 0.002) and healthy controls (10.9 ± 0.9 mmHg, P 0.05 for each comparison). During multiple rapid swallows, IBP remained higher in the structural findings group compared with controls (P = 0.02). A similar analysis of the prospective validation cohort confirmed IBP elevation in structural EGJ processes, but correlation with dysphagia could not be demonstrated. We conclude that elevated IBP predicts the presence of structural EGJ processes even when IRP is normal, but correlation with dysphagia is suboptimal. NEW & NOTEWORTHY Integrated relaxation pressure (IRP) above the upper limit of normal defines esophageal outflow obstruction using high-resolution manometry. In patients with normal IRP, elevated intrabolus pressure (IBP) can be a surrogate marker for a structural restrictive or obstructive process at the esophagogastric junction (EGJ). This has the potential to augment the clinical value of

  11. Application of Cortical Processing Theory to Acoustical Analysis

    National Research Council Canada - National Science Library

    Ghitza, Oded

    2007-01-01

    ... (TMC). Robustness against background noise is provided principally by the signal processing performed by the PAM, while insensitivity to time-scale variations is provided by properties of the TMC...

  12. A Novel Higher-Order Shear and Normal Deformable Plate Theory for the Static, Free Vibration and Buckling Analysis of Functionally Graded Plates

    Directory of Open Access Journals (Sweden)

    Shi-Chao Yi

    2017-01-01

    Full Text Available A closed-form solution of a special higher-order shear and normal deformable plate theory is presented for the static responses, natural frequencies, and buckling of simply supported functionally graded material (FGM) plates. What distinguishes the new plate theory from the usual ones is its uniqueness: each individual FGM plate has special characteristics, such as material properties and length-to-thickness ratio, and these distinctive attributes determine a set of orthogonal polynomials, which in turn form a plate theory exclusive to that plate. Thus, the novel plate theory has two merits: orthogonality, whereby the majority of the coefficients of the equations derived from Hamilton's principle are zero; and flexibility, whereby the order of the plate theory can be set arbitrarily. Numerical examples with different shapes of plates are presented and the achieved results are compared with reference solutions available in the literature. Several aspects of the model involving relevant parameters, length-to-thickness ratios, stiffness ratios, and so forth, affected by static and dynamic situations, are analyzed in detail. As a consequence, the applicability and effectiveness of the present method for accurately computing deflections, stresses, natural frequencies, and buckling responses of various FGM plates are demonstrated.

  13. Peace and power: a theory of emancipatory group process.

    Science.gov (United States)

    Chinn, Peggy L; Falk-Rafael, Adeline

    2015-01-01

    To present the theoretical basis for the group process known as "Peace and Power." A dialectic between two dominant forms of power-peace powers and power-over powers-forms the basis for a synthesis that yields an emancipatory group process characterized by praxis, empowerment, awareness, cooperation, and evolvement for individuals and groups. Critical analysis of prevailing competitive group dynamics and the ideals of cooperative group dynamics was conducted to project the potential for achieving group interactions that yield profound changes in the direction of justice, empowerment, and well-being for all. The theoretical framework of "Peace and Power" is consistent with characteristics of emancipatory integrity that are vital for social change. The processes of "Peace and Power" can be used to create peaceful, cooperative interactions among nurses, with other health professionals, with patients and families, and in communities. © 2014 Sigma Theta Tau International.

  14. Higher Fasting Plasma Glucose Levels, within the Normal Range, are Associated with Decreased Processing Speed in High Functioning Young Elderly.

    Science.gov (United States)

    Raizes, Meytal; Elkana, Odelia; Franko, Motty; Ravona Springer, Ramit; Segev, Shlomo; Beeri, Michal Schnaider

    2016-01-01

    We explored the association of plasma glucose levels within the normal range with processing speed in high functioning young elderly, free of type 2 diabetes mellitus (T2DM). A sample of 41 participants (mean age = 64.7, SD = 10; glucose 94.5 mg/dL, SD = 9.3), were examined with a computerized cognitive battery. Hierarchical linear regression analysis showed that higher plasma glucose levels, albeit within the normal range (levels may have an impact on cognitive function.

  15. Processes of Self-Regulated Learning in Music Theory in Elementary Music Schools in Slovenia

    Science.gov (United States)

    Fritz, Barbara Smolej; Peklaj, Cirila

    2011-01-01

    The aim of our study was determine how students regulate their learning in music theory (MT). The research is based on the socio-cognitive theory of learning. The aim of our study was twofold: first, to design the instruments for measuring (meta)cognitive and affective-motivational processes in learning MT, and, second, to examine the relationship…

  16. Non-equilibrium reacting gas flows kinetic theory of transport and relaxation processes

    CERN Document Server

    Nagnibeda, Ekaterina

    2009-01-01

    This volume develops the kinetic theory of transport phenomena and relaxation processes in the flows of reacting gas mixtures. The theory is applied to the modeling of non-equilibrium flows behind strong shock waves, in the boundary layer, and in nozzles.

  17. On a theory of media processing systems behaviour, with applications

    NARCIS (Netherlands)

    Weffers-Albu, M.A.; Lukkien, J.J.; Steffens, E.F.M.; Stok, van der P.D.V.

    2006-01-01

    In this article we provide a model for the dynamic behavior of media processing chains of tasks communicating via bounded buffers. The aim is to find the overall behavior of a chain from which performance parameters (such as start time and response time of individual tasks, chain end-to-end response

  18. Nursing's ways of knowing and dual process theories of cognition.

    Science.gov (United States)

    Paley, John; Cheyne, Helen; Dalgleish, Len; Duncan, Edward A S; Niven, Catherine A

    2007-12-01

    This paper is a comparison of nursing's patterns of knowing with the systems identified by cognitive science, and evaluates claims about the equal-status relation between scientific and non-scientific knowledge. Ever since Carper's seminal paper in 1978, it has been taken for granted in the nursing literature that there are ways of knowing, or patterns of knowing, that are not scientific. This idea has recently been used to argue that the concept of evidence, typically associated with evidence-based practice, is inappropriately restricted because it is identified exclusively with scientific research. The paper reviews literature in psychology which appears to draw a comparable distinction between rule-based, analytical cognitive processes and other forms of cognitive processing which are unconscious, holistic and intuitive. There is a convincing parallel between the 'patterns of knowing' distinction in nursing and the 'cognitive processing' distinction in psychology. However, there is an important difference in the way the relation between different forms of knowing (or cognitive processing) is depicted. In nursing, it is argued that the different patterns of knowing have equal status and weight. In cognitive science, it is suggested that the rule-based, analytical form of cognition has a supervisory and corrective function with respect to the other forms. Scientific reasoning and evidence-based knowledge have epistemological priority over the other forms of nursing knowledge. The implications of this claim for healthcare practice are briefly indicated.

  19. Foundations of digital signal processing theory, algorithms and hardware design

    CERN Document Server

    Gaydecki, Patrick

    2005-01-01

    An excellent introductory text, this book covers the basic theoretical, algorithmic and real-time aspects of digital signal processing (DSP). Detailed information is provided on off-line, real-time and DSP programming and the reader is effortlessly guided through advanced topics such as DSP hardware design, FIR and IIR filter design and difference equation manipulation.

  20. Regions of low density in the contrast-enhanced pituitary gland: normal and pathologic processes

    International Nuclear Information System (INIS)

    Chambers, E.F.; Turski, P.A.; LaMasters, D.; Newton, T.H.

    1982-01-01

    The incidence of low-density regions in the contrast-enhanced pituitary gland and the possible causes of these regions were investigated by a retrospective review of computed tomographic (CT) scans of the head in 50 patients and autopsy specimens of the pituitary in 100 other patients. It was found that focal areas of low density within the contrast-enhanced pituitary gland can be caused by various normal and pathologic conditions such as pituitary microadenomas, pars intermedia cysts, foci of metastasis, infarcts, epidermoid cysts, and abscesses. Although most focal low-density regions probably represent pituitary microadenomas, careful clinical correlation is needed to establish a diagnosis

  1. Familiarity Breeds Attempts: A Critical Review of Dual-Process Theories of Recognition.

    Science.gov (United States)

    Mandler, George

    2008-09-01

    Recognition memory and recall/recollection are the major divisions of the psychology of human memory. Theories of recognition have shifted from a "strength" approach to a dual-process view, which distinguishes between knowing that one has experienced an object before and knowing what it was. In this article, I discuss the history of this approach and the two processes of familiarity and recollection and locate their origin in pattern matching and organization. I evaluate various theories in terms of their basic requirements and their defining research and propose the extension of the original two process theory to domains such as pictorial recognition. Finally, I present the main phenomena that a dual-process theory of recognition must account for and discuss future needs and directions of research and development. © 2008 Association for Psychological Science.

  2. A Grounded Theory of the Process of Spiritual Change Among Homicide Survivors.

    Science.gov (United States)

    Johnson, Shannon K; Zitzmann, Brooks

    2018-01-01

    Grounded theory was used to generate a mid-range theory of the process of spiritual change in the lives of survivors of homicide victims. Theoretical sampling guided the selection of 30 participants from a larger study of spiritual change after homicide (N = 112). Individual interviews were analyzed using a four-step sequence of line-by-line, focused, axial, and selective coding. Analysis generated a closed theory consisting of three fluid, consecutive but nonlinear stages. Each stage consisted of an overarching process and a state of being in the world: (a) Disintegrating: living in a state of shock; (b) Reckoning: living in a state of stagnation; (c) Recreating and reintegrating the self: living in a state of renewal. Movement through the stages was fueled by processes of spiritual connection that yielded changes that permeated the theory. Findings can be used to help practitioners address the processes that drive spiritual change in the lives of homicide survivors.

  3. The Collinearity Free and Bias Reduced Regression Estimation Project: The Theory of Normalization Ridge Regression. Report No. 2.

    Science.gov (United States)

    Bulcock, J. W.; And Others

    Multicollinearity refers to the presence of highly intercorrelated independent variables in structural equation models, that is, models estimated by using techniques such as least squares regression and maximum likelihood. There is a problem of multicollinearity in both the natural and social sciences where theory formulation and estimation is in…

  4. Linking the Value Assessment of Oil and Gas Firms to Ambidexterity Theory Using a Mixture of Normal Distributions

    NARCIS (Netherlands)

    Casault, Sébastien; Groen, Arend J.; Linton, Jonathan D.; Linton, Jonathan

    2015-01-01

    Oil and gas exploration and production firms have return profiles that are not easily explained by current financial theory – the variation in their market returns is non-Gaussian. In this paper, the nature and underlying reason for these significant deviations from expected behavior are considered.

  5. Economic model predictive control theory, formulations and chemical process applications

    CERN Document Server

    Ellis, Matthew; Christofides, Panagiotis D

    2017-01-01

    This book presents general methods for the design of economic model predictive control (EMPC) systems for broad classes of nonlinear systems that address key theoretical and practical considerations including recursive feasibility, closed-loop stability, closed-loop performance, and computational efficiency. Specifically, the book proposes: Lyapunov-based EMPC methods for nonlinear systems; two-tier EMPC architectures that are highly computationally efficient; and EMPC schemes handling explicitly uncertainty, time-varying cost functions, time-delays and multiple-time-scale dynamics. The proposed methods employ a variety of tools ranging from nonlinear systems analysis, through Lyapunov-based control techniques to nonlinear dynamic optimization. The applicability and performance of the proposed methods are demonstrated through a number of chemical process examples. The book presents state-of-the-art methods for the design of economic model predictive control systems for chemical processes. In addition to being...

  6. Parallel Distributed Processing theory in the age of deep networks

    OpenAIRE

    Bowers, Jeffrey

    2017-01-01

    Parallel Distributed Processing (PDP) models in psychology are the precursors of deep networks used in computer science. However, only PDP models are associated with two core psychological claims, namely, that all knowledge is coded in a distributed format, and cognition is mediated by non-symbolic computations. These claims have long been debated within cognitive science, and recent work with deep networks speaks to this debate. Specifically, single-unit recordings show that deep networks le...

  7. Theory of charge transport in diffusive normal metal/conventional superconductor point contacts in the presence of magnetic impurity

    NARCIS (Netherlands)

    Yokoyama, T.; Tanaka, Y.; Golubov, Alexandre Avraamovitch; Inoue, J.; Asano, Y.

    2006-01-01

    Charge transport in the diffusive normal metal/insulator/s-wave superconductor junctions is studied in the presence of the magnetic impurity for various situations, where we have used the Usadel equation with Nazarov's generalized boundary condition. It is revealed that the magnetic impurity

  8. Processes of self-regulated learning in music theory in elementary music schools in Slovenia

    OpenAIRE

    Peklaj, Cirila; Smolej-Fritz, Barbara

    2015-01-01

    The aim of our study was to determine how students regulate their learning in music theory (MT). The research is based on the socio-cognitive theory of learning. The aim of our study was twofold: first, to design the instruments for measuring (meta)cognitive and affective-motivational processes in learning MT, and, second, to examine the relationship between these processes. A total of 457 fifth- and sixth-grade students from 10 different elementary music schools in Slovenia participated in the...

  9. Asymptotic theory for Brownian semi-stationary processes with application to turbulence

    DEFF Research Database (Denmark)

    Corcuera, José Manuel; Hedevang, Emil; Pakkanen, Mikko S.

    2013-01-01

    This paper presents some asymptotic results for statistics of Brownian semi-stationary (BSS) processes. More precisely, we consider power variations of BSS processes, which are based on high frequency (possibly higher order) differences of the BSS model. We review the limit theory discussed ..., which allows one to obtain a valid central limit theorem for the critical region. Finally, we apply our statistical theory to turbulence data.
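    The power variation statistic underlying such results is built from high-frequency increments, V(p)_n = Σ |X_{iΔ} − X_{(i−1)Δ}|^p. A minimal sketch, using a plain Brownian path as a stand-in for a BSS process (an illustrative assumption only):

```python
import numpy as np

def power_variation(path, p):
    """Realized p-th order power variation: the sum of |increments|**p
    computed from first-order high-frequency differences of the path."""
    return np.sum(np.abs(np.diff(path)) ** p)

# Stand-in path: Brownian motion on [0, 1] with 10,000 steps; its realized
# quadratic variation (p = 2) should be close to 1.
rng = np.random.default_rng(3)
bm = np.cumsum(np.sqrt(1.0 / 10_000) * rng.standard_normal(10_000))
print(power_variation(bm, p=2))
```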

  10. Constructal theory through thermodynamics of irreversible processes framework

    International Nuclear Information System (INIS)

    Tescari, S.; Mazet, N.; Neveu, P.

    2011-01-01

    Highlights: → Point-to-area flow problem is solved through thermodynamics of irreversible processes. → A new optimisation criterion is defined: the exergy or entropy impedance. → Optimisation is performed following two different routes, constructal or global. → Global optimisation is more efficient than constructal optimisation. → Global optimisation enlarges the domain of construct benefits. - Abstract: The point-to-volume flow problem is revisited on a thermodynamics of irreversible processes (TIP) basis. The first step consists in evaluating the local entropy production of the system, and deducing from this expression the phenomenological laws. Then, the total entropy production can be simply evaluated. It is demonstrated that the total entropy production can be written in a remarkable form: the product of the so-called entropy impedance with the square of the heat flux. As the heat flux is given, optimisation consists in minimising the entropy impedance. It is also shown that minimising the entropy impedance minimises the maximum temperature difference. Applied to the elemental volume, this optimisation process leads to a shape factor close to the one already published. For the first construction, the equivalent system is defined as stated by Prigogine: when subjected to the same constraints, two systems are thermodynamically equivalent if their entropy productions are equal. Two optimisation routes are then investigated: a global optimisation where all scales are taken into account, and the constructal optimisation where the system is optimised scale by scale. In this second case, results are close to Ghodossi's work. When global optimisation is performed, it is demonstrated that conductive paths have to be spread uniformly in the active material (i.e. the number of elemental volumes must go to infinity). Comparing the two routes, global optimisation leads to better performance than constructal optimisation. Moreover, global optimisation enlarges the domain of

  11. Extending the amygdala in theories of threat processing

    Science.gov (United States)

    Fox, Andrew S.; Oler, Jonathan A.; Tromp, Do P.M.; Fudge, Julie L.; Kalin, Ned H.

    2015-01-01

    The central extended amygdala is an evolutionarily conserved set of interconnected brain regions that play an important role in threat processing to promote survival. Two core components of the central extended amygdala, the central nucleus of the amygdala (Ce) and the lateral bed nucleus of the stria terminalis (BST), are highly similar regions that serve complementary roles by integrating fear- and anxiety-relevant information. Survival depends on the central extended amygdala's ability to rapidly integrate and respond to threats that vary in their immediacy, proximity, and characteristics. Future studies will benefit from understanding alterations in central extended amygdala function in relation to stress-related psychopathology. PMID:25851307

  12. Magnetohydrodynamic Particle Acceleration Processes: SSX Experiments, Theory and Astrophysical Applications

    International Nuclear Information System (INIS)

    Matthaeus, W.; Brown, M.

    2006-01-01

    This is the final technical report for a funded program to provide theoretical support to the Swarthmore Spheromak Experiment. We examined MHD relaxation, reconnection between two spheromaks, particle acceleration by these processes, and collisionless effects, e.g., the Hall effect near the reconnection zone. Throughout the project, applications to space plasma physics and astrophysics were included. Towards the end of the project we were examining a more fully turbulent relaxation associated with unconstrained dynamics in SSX. We employed experimental, spacecraft-observational, analytical and numerical methods.

  13. Reducing Error Rates for Iris Image using higher Contrast in Normalization process

    Science.gov (United States)

    Aminu Ghali, Abdulrahman; Jamel, Sapiee; Abubakar Pindar, Zahraddeen; Hasssan Disina, Abdulkadir; Mat Daris, Mustafa

    2017-08-01

    Iris recognition is among the most secure and fastest means of identification and authentication. However, iris recognition systems suffer from blurring, low contrast and poor illumination due to low-quality images, which compromises the accuracy of the system. The acceptance or rejection rate of a verified user depends solely on the quality of the image. In many cases, an iris recognition system with low image contrast could falsely accept or reject a user. Therefore this paper adopts the histogram equalization technique to address the problems of False Rejection Rate (FRR) and False Acceptance Rate (FAR) by enhancing the contrast of the iris image. The histogram equalization technique enhances the image quality and neutralizes the low contrast of the image at the normalization stage. The experimental results show that the histogram equalization technique reduces FRR and FAR compared to existing techniques.
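    Histogram equalization, the technique adopted in this record, remaps grey levels through the normalized cumulative histogram. A minimal NumPy sketch for an 8-bit grayscale iris image follows; this is not the authors' exact pipeline, and a constant image would need an extra guard:

```python
import numpy as np

def equalize_histogram(img):
    """Global histogram equalization of an 8-bit grayscale image: map each
    grey level through the normalised cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first non-zero cumulative count
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255.0),
                  0, 255).astype(np.uint8)
    return lut[img]

# Usage: enhanced = equalize_histogram(iris_gray)   # iris_gray: np.uint8 array
```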

  14. Rates of convergence and asymptotic normality of curve estimators for ergodic diffusion processes

    NARCIS (Netherlands)

    J.H. van Zanten (Harry)

    2000-01-01

    For ergodic diffusion processes, we study kernel-type estimators for the invariant density, its derivatives and the drift function. We determine rates of convergence and find the joint asymptotic distribution of the estimators at different points.
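    A kernel-type invariant density estimator of the kind studied here can be illustrated on a simulated ergodic diffusion. The sketch below uses an Euler-Maruyama path of an Ornstein-Uhlenbeck process, whose invariant law N(0, σ²/(2θ)) is known in closed form; the parameter values and bandwidth are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Euler-Maruyama path of an Ornstein-Uhlenbeck diffusion
# dX = -theta * X dt + sigma dW; its invariant density is N(0, sigma^2/(2*theta)).
theta, sigma, dt, n = 1.0, 1.0, 0.01, 100_000
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = x[i - 1] - theta * x[i - 1] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

def kernel_density(data, grid, h):
    """Gaussian-kernel estimator of the invariant density at the grid points."""
    u = (grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

grid = np.linspace(-3.0, 3.0, 61)
est = kernel_density(x[::20], grid, h=0.2)  # thinned path; compare with N(0, 0.5)
```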

  15. Dual-Process Theory and Signal-Detection Theory of Recognition Memory

    Science.gov (United States)

    Wixted, John T.

    2007-01-01

    Two influential models of recognition memory, the unequal-variance signal-detection model and a dual-process threshold/detection model, accurately describe the receiver operating characteristic, but only the latter model can provide estimates of recollection and familiarity. Such estimates often accord with those provided by the remember-know…

  16. On generalisations of the log-Normal distribution by means of a new product definition in the Kapteyn process

    Science.gov (United States)

    Duarte Queirós, Sílvio M.

    2012-07-01

    We discuss the modification of the Kapteyn multiplicative process using the q-product of Borges [E.P. Borges, A possible deformed algebra and calculus inspired in nonextensive thermostatistics, Physica A 340 (2004) 95]. Depending on the value of the index q, a generalisation of the log-Normal distribution is yielded. Namely, the distribution increases the tail for small (when q < 1) or large (when q > 1) values of the variable under analysis. The usual log-Normal distribution is retrieved when q = 1, which corresponds to the traditional Kapteyn multiplicative process. The main statistical features of this distribution, as well as related random number generators and tables of quantiles of the Kolmogorov-Smirnov distance, are presented. Finally, we illustrate the validity of this scenario by describing a set of variables of biological and financial origin.
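    Borges' q-product referenced here is x ⊗_q y = [x^(1−q) + y^(1−q) − 1]_+^(1/(1−q)), which recovers the ordinary product as q → 1. A sketch of the modified Kapteyn process built on it; the step count, factor distribution and seed are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def q_product(x, y, q):
    """Borges' q-product: [x**(1-q) + y**(1-q) - 1]_+ ** (1/(1-q));
    reduces to the ordinary product x*y in the limit q -> 1."""
    if q == 1.0:
        return x * y
    base = x ** (1.0 - q) + y ** (1.0 - q) - 1.0
    return np.maximum(base, 0.0) ** (1.0 / (1.0 - q))  # cutoff at zero for q < 1

def kapteyn_q(q, steps=100, x0=1.0):
    """One trajectory of the Kapteyn multiplicative process with the ordinary
    product replaced by the q-product; q = 1 reproduces the classic process."""
    x = x0
    for _ in range(steps):
        x = q_product(x, rng.lognormal(0.0, 0.1), q)
    return x

sample = [kapteyn_q(q=0.9) for _ in range(2_000)]  # q = 1 would yield a log-Normal
```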

  17. Theory and Practice Meets in Industrial Process Design -Educational Perspective-

    Science.gov (United States)

    Aramo-Immonen, Heli; Toikka, Tarja

    A software engineer should see himself as a business process designer in an enterprise resource planning (ERP) re-engineering project, and software engineers and managers should be in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education, in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. This article explores the affective components of design thinking in the industrial management domain. The conceptual part of the paper discusses the concepts of network and project economy, creativity, communication, use of metaphors, and design thinking. Finally, the empirical research plan and the first empirical results from design method experiments among multi-disciplinary groups of master's-level students of industrial engineering and management and software engineering are introduced.

  18. Algebraic Meta-Theory of Processes with Data

    Directory of Open Access Journals (Sweden)

    Daniel Gebler

    2013-07-01

    There exists a rich literature of rule formats guaranteeing different algebraic properties for formalisms with a Structural Operational Semantics. Moreover, there exist a few approaches for automatically deriving axiomatizations characterizing strong bisimilarity of processes. To our knowledge, this literature has never been extended to the setting with data (e.g., to model storage and memory). We show how the rule formats for algebraic properties can be exploited in a generic manner in the setting with data. Moreover, we introduce a new approach for deriving sound and ground-complete axiom schemata for a notion of bisimilarity with data, called stateless bisimilarity, based on intuitive auxiliary function symbols for handling the store component. We do restrict, however, the axiomatization to the setting where the store component is only given in terms of constants.

  19. Theory Building- Towards an understanding of business model innovation processes

    DEFF Research Database (Denmark)

    Taran, Yariv; Boer, Harry; Lindgren, Peter

    2009-01-01

    Companies today, in some industries more than others, invest more capital and resources just to stay competitive, develop more diverse solutions, and increasingly start to think more radically when considering how to innovate their business models. However, the development and innovation of business models is a complex venture that has not been widely researched yet. The objective of this paper is therefore 1) to build a [descriptive] theoretical understanding, based on Christensen's (2005) three-step procedure, of business models and their innovation and, as a result of that, 2) to strengthen researchers' and practitioners' perspectives as to how the process of business model innovation can be realized. By using various researchers' perspectives and assumptions, we identify relevant inconsistencies, which consequently lead us to propose possible supplementary solutions. We conclude our paper…

  20. An information theory-based approach to modeling the information processing of NPP operators

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Seong, Poong Hyun

    2002-01-01

    This paper proposes a quantitative approach to modeling the information processing of NPP operators. The aim of this work is to derive the amount of information processed during a certain control task. The focus is on i) developing a model for the information processing of NPP operators and ii) quantifying the model. To resolve the problems of previous approaches based on information theory, i.e., the problems of single-channel approaches, we first develop an information processing model with multiple stages, which contains information flows. The uncertainty of the information is then quantified using Conant's model, a kind of information theory.
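
    The record does not spell out Conant's model, so purely as an illustration of the information-theoretic quantity involved, this sketch computes the mutual information (throughput) of one processing stage from a joint distribution; the joint table is invented:

```python
import numpy as np

def mutual_information(joint: np.ndarray) -> float:
    """I(X;Y) in bits from a joint probability table p(x, y)."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# Invented joint distribution: displayed plant state (rows) versus the
# operator's internal classification (columns) during a control task.
joint = np.array([[0.30, 0.05, 0.00],
                  [0.05, 0.25, 0.05],
                  [0.00, 0.05, 0.25]])
print(f"throughput ~ {mutual_information(joint):.3f} bits per observation")
```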

  1. Process research on Emotionally Focused Therapy (EFT) for couples: linking theory to practice.

    Science.gov (United States)

    Greenman, Paul S; Johnson, Susan M

    2013-03-01

    The focus of this article is on the link among theory, process, and outcome in the practice of Emotionally Focused Therapy (EFT) for couples. We describe the EFT model of change and the EFT perspective on adult love as the reflection of underlying attachment processes. We outline the manner in which theory and research inform EFT interventions. This leads into a detailed review of the literature on the processes of change in EFT. We highlight the client responses and therapist operations that have emerged from process research and their relation to treatment outcomes. We discuss the implications of this body of research for clinical practice and training. © FPI, Inc.

  2. Reclaiming life on one's own terms: a grounded theory study of the process of breast cancer survivorship.

    Science.gov (United States)

    Sherman, Deborah Witt; Rosedale, Mary; Haber, Judith

    2012-05-01

    To develop a substantive theory of the process of breast cancer survivorship. Grounded theory. A LISTSERV announcement posted on the SHARE Web site and purposeful recruitment of women known to be diagnosed and treated for breast cancer. 15 women diagnosed with early-stage breast cancer. Constant comparative analysis. Breast cancer survivorship. The core variable identified was Reclaiming Life on One's Own Terms. The perceptions and experiences of the participants revealed overall that the diagnosis of breast cancer was a turning point in life and the stimulus for change. That was followed by the recognition of breast cancer as now being a part of life, leading to the necessity of learning to live with breast cancer, and finally, creating a new life after breast cancer. Participants revealed that breast cancer survivorship is a process marked and shaped by time, the perception of support, and coming to terms with the trauma of a cancer diagnosis and the aftermath of treatment. The process of survivorship continues by assuming an active role in self-healing, gaining a new perspective and reconciling paradoxes, creating a new mindset and moving to a new normal, developing a new way of being in the world on one's own terms, and experiencing growth through adversity beyond survivorship. The process of survivorship for women with breast cancer is an evolutionary journey with short- and long-term challenges. This study shows the development of an empirically testable theory of survivorship that describes and predicts women's experiences following breast cancer treatment from the initial phase of recovery and beyond. The theory also informs interventions that not only reduce negative outcomes, but promote ongoing healing, adjustment, and resilience over time.

  3. The logical syntax of number words: theory, acquisition and processing.

    Science.gov (United States)

    Musolino, Julien

    2009-04-01

    Recent work on the acquisition of number words has emphasized the importance of integrating linguistic and developmental perspectives [Musolino, J. (2004). The semantics and acquisition of number words: Integrating linguistic and developmental perspectives. Cognition, 93, 1-41; Papafragou, A., & Musolino, J. (2003). Scalar implicatures: Experiments at the semantics-pragmatics interface. Cognition, 86, 253-282; Hurewitz, F., Papafragou, A., Gleitman, L., & Gelman, R. (2006). Asymmetries in the acquisition of numbers and quantifiers. Language Learning and Development, 2, 76-97; Huang, Y. T., Snedeker, J., & Spelke, L. (submitted for publication). What exactly do numbers mean?]. Specifically, these studies have shown that data from experimental investigations of child language can be used to illuminate core theoretical issues in the semantic and pragmatic analysis of number terms. In this article, I extend this approach to the logico-syntactic properties of number words, focusing on the way numerals interact with each other (e.g. Three boys are holding two balloons) as well as with other quantified expressions (e.g. Three boys are holding each balloon). On the basis of their intuitions, linguists have claimed that such sentences give rise to at least four different interpretations, reflecting the complexity of the linguistic structure and syntactic operations involved. Using psycholinguistic experimentation with preschoolers (n=32) and adult speakers of English (n=32), I show that (a) for adults, the intuitions of linguists can be verified experimentally, (b) by the age of 5, children have knowledge of the core aspects of the logical syntax of number words, (c) in spite of this knowledge, children nevertheless differ from adults in systematic ways, and (d) the differences observed between children and adults can be accounted for on the basis of an independently motivated, linguistically-based processing model [Geurts, B. (2003). Quantifying kids. Language

  4. Regular-, irregular-, and pseudo-character processing in Chinese: The regularity effect in normal adult readers

    Directory of Open Access Journals (Sweden)

    Dustin Kai Yan Lau

    2014-03-01

    Background: Unlike alphabetic languages, Chinese uses a logographic script. However, many characters' phonetic radicals have the same pronunciation as the character as a whole. These are considered regular characters and can be read through a lexical non-semantic route (Weekes & Chen, 1999). Pseudocharacters are another way to study this non-semantic route. A pseudocharacter is the combination of existing semantic and phonetic radicals in their legal positions, resulting in a non-existing character (Ho, Chan, Chung, Lee, & Tsang, 2007). Pseudocharacters can be pronounced by direct derivation from the sound of the phonetic radical. Conversely, if the pronunciation of a character does not follow that of the phonetic radical, it is considered irregular and can only be correctly read through the lexical-semantic route. The aim of the current investigation was to examine reading aloud in normal adults. We hypothesized that the regularity effect, previously described for alphabetical scripts and acquired dyslexic patients of Chinese (Weekes & Chen, 1999; Wu, Liu, Sun, Chromik, & Zhang, 2014), would also be present in normal adult Chinese readers. Method: Participants. Thirty (50% female) native Hong Kong Cantonese speakers with a mean age of 19.6 years and a mean education of 12.9 years. Stimuli. Sixty regular-, 60 irregular-, and 60 pseudo-characters (with at least 75% name agreement) in Chinese were matched by initial phoneme, number of strokes and family size. Additionally, regular- and irregular-characters were matched by frequency (low) and consistency. Procedure. Each participant was asked to read aloud the stimuli presented on a laptop using the DMDX software. The order of stimuli presentation was randomized. Data analysis. ANOVAs were carried out by participants and items with RTs and errors as dependent variables and type of stimuli (regular-, irregular- and pseudo-character as repeated measures (F1 or between subject
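
    As an illustration of the by-participants (F1) analysis described, a minimal sketch using a repeated-measures ANOVA; the data are simulated and the column names are our own, not the study's:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Simulated by-participant mean RTs for the three stimulus types.
rng = np.random.default_rng(3)
subjects = np.repeat(np.arange(30), 3)
stim = np.tile(["regular", "irregular", "pseudo"], 30)
base = rng.normal(650, 40, size=30).repeat(3)
effect = np.tile([0.0, 35.0, 20.0], 30)        # invented condition effects
rt = base + effect + rng.normal(0, 15, size=90)
df = pd.DataFrame({"subject": subjects, "stimulus": stim, "rt": rt})

# F1 analysis: stimulus type as a within-subject (repeated) factor.
print(AnovaRM(df, depvar="rt", subject="subject", within=["stimulus"]).fit())
```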

  5. Induction of depressed mood: a test of opponent-process theory.

    Science.gov (United States)

    Ranieri, D J; Zeiss, A M

    1984-12-01

    Solomon's (1980) opponent-process theory of acquired motivation has been used to explain many phenomena in which affective or hedonic contrasts appear to exist, but has not been applied to the induction of depressed mood. The purpose of this study, therefore, was to determine whether opponent-process theory can be applied to this area. Velten's (1968) mood-induction procedure was used and subjects were assigned either to a depression-induction condition or to one of two control groups. Self-report measures of depressed mood were taken before, during, and at several points after the mood induction. Results were not totally consistent with a rigorous set of criteria for supporting an opponent-process interpretation. This suggests that the opponent-process model may not be applicable to induced depressed mood. Possible weaknesses in the experimental design, along with implications for opponent-process theory, are discussed.

  6. Nuclear medicine study of regeneration process of the liver after partial hepatectomy in normal rats

    International Nuclear Information System (INIS)

    Nomura, Yasushi

    1990-01-01

    To evaluate regeneration of the liver in rats after partial hepatectomy based on Higgins' and Anderson's method, the present study used morphological and radionuclide techniques. Adult Wistar rats over 8 weeks of age were prepared for this study and injected intravenously with either 99mTc-N-(2,6-dimethylphenylcarbamoylmethyl)iminodiacetic acid (99mTc-HIDA) or 99mTc-phytate. Using the Fishback equation, the ratio of wet-weight liver regeneration was approximately 80% at 14 days after partial hepatectomy. On pathology, the microscopic findings were as follows: congestion and hepatocyte swelling on day 1; diffuse fat deposition and nuclear division on day 2; decreased hepatocyte swelling, fat deposition, and regular alignment of the hepatocytes on day 5; appearance of normal liver on days 7-14. The uptake and excretion ratio of the hepatocytes measured with 99mTc-HIDA recovered to the value prior to partial hepatectomy on day 3, and the hepatic accumulation coefficient of Kupffer cells measured with 99mTc-phytate recovered on day 4. In conclusion, functional recovery occurred 3-4 days after partial hepatectomy. The present study, using two radiopharmaceuticals, demonstrates that radionuclide techniques can facilitate evaluation of the manifest pathological alterations of hepatocytes and Kupffer cells after partial hepatectomy. (author)

  7. The theory, practice, and future of process improvement in general thoracic surgery.

    Science.gov (United States)

    Freeman, Richard K

    2014-01-01

    Process improvement, in its broadest sense, is the analysis of a given set of actions with the aim of elevating quality and reducing costs. The tenets of process improvement have been applied to medicine with increasing frequency for at least the last quarter century, including in thoracic surgery. This review outlines the theory underlying process improvement, the currently available data sources for process improvement, and possible future directions of research. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Tutorial - applying extreme value theory to characterize food-processing systems

    DEFF Research Database (Denmark)

    Skou, Peter Bæk; Holroyd, Stephen E.; van der Berg, Franciscus Winfried J

    2017-01-01

    This tutorial presents extreme value theory (EVT) as an analytical tool in process characterization and shows its potential to describe production performance, e.g., across different factories, via reliable estimates of the frequency and scale of extreme events. Two alternative EVT methods are discussed: peaks over threshold and block maxima. We illustrate the theoretical framework for EVT with process data from two different examples from the food-processing industry. Finally, we discuss limitations, decisions, and possibilities when applying EVT to process data.
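
    A minimal sketch of the two methods on synthetic data (the Gumbel parent, block size, and threshold are illustrative choices, not the tutorial's):

```python
import numpy as np
from scipy.stats import genextreme, genpareto

rng = np.random.default_rng(4)
daily = rng.gumbel(loc=10.0, scale=2.0, size=20 * 365)  # synthetic process data

# Block maxima: fit a GEV distribution to the 20 yearly maxima.
yearly_max = daily.reshape(20, 365).max(axis=1)
shape_gev, loc_gev, scale_gev = genextreme.fit(yearly_max)

# Peaks over threshold: fit a generalized Pareto to exceedances of a high threshold.
u = np.quantile(daily, 0.98)
exceedances = daily[daily > u] - u
shape_gpd, _, scale_gpd = genpareto.fit(exceedances, floc=0.0)

# For a Gumbel parent, both fitted shape parameters should be near zero.
print(f"GEV shape {shape_gev:.3f}, GPD shape {shape_gpd:.3f}")
```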

  9. Phenomenological theory of the normal and superconductive states of Cu-O and Bi-O metals

    International Nuclear Information System (INIS)

    Varma, C.M.

    1991-01-01

    The universal normal-state anomalies in the Cu-O metals follow from a marginal Fermi liquid hypothesis: there exists a contribution to the polarizability over most of momentum space proportional to ω/T for ω/T ≪ 1 and constant thereafter up to a cutoff ωc. Using the same excitation spectrum, the properties of the superconductive state were calculated. The right order of Tc, the zero-temperature gap 2Δ(0)/Tc, and the nuclear relaxation rate near Tc can be obtained. The possible microscopic physics leading to the marginal Fermi liquid hypothesis is discussed

  10. Aberrant articulation of cervical vertebral transverse process: An uncommon normal variant and review of the literature

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Jeong Ah; Cha, Seung Woo [Dept. of Radiology, Hanyang University College of Medicine, Guri Hospital, Guri (Korea, Republic of); Song, Yoon Ah; Lee, Seung Hun; Joo, Kyung Bin [Dept. of Radiology, Hanyang University College of Medicine, Seoul Hospital, Seoul (Korea, Republic of)

    2013-09-15

    Aberrant articulation between two anterior tubercles is a rare congenital anomaly that should be considered for patients showing a bony projection anterior to the vertebral body on a lateral radiograph of the cervical spine. We present a case of an elongation of the anterior tubercles of the transverse processes of both the fifth and sixth cervical vertebrae. This finding is probably a form of supernumerary cervical rib developing at a level above the lowest cervical spine.

  11. Aberrant articulation of cervical vertebral transverse process: An uncommon normal variant and review of the literature

    International Nuclear Information System (INIS)

    Ryu, Jeong Ah; Cha, Seung Woo; Song, Yoon Ah; Lee, Seung Hun; Joo, Kyung Bin

    2013-01-01

    Aberrant articulation between two anterior tubercles is a rare congenital anomaly that should be considered for patients showing a bony projection anterior to the vertebral body on a lateral radiograph of the cervical spine. We present a case of an elongation of the anterior tubercles of the transverse processes of both the fifth and sixth cervical vertebrae. This finding is probably a form of supernumerary cervical rib developing at a level above the lowest cervical spine.

  12. Process Convergence of Self-Normalized Sums of i.i.d. Random ...

    Indian Academy of Sciences (India)

    ... either of tightness or finite dimensional convergence to a non-degenerate limiting distribution does not hold. This work is an extension of the work by Csörgő et al., who showed that Donsker's theorem for Y_{n,2}(·), i.e., for p = 2, holds iff α = 2 and identified the limiting process as a standard Brownian motion in sup norm.

  13. Depth of word processing in Alzheimer patients and normal controls: a magnetoencephalographic (MEG) study.

    Science.gov (United States)

    Walla, P; Püregger, E; Lehrner, J; Mayer, D; Deecke, L; Dal Bianco, P

    2005-05-01

    Effects related to depth of verbal information processing were investigated in probable Alzheimer's disease (AD) patients and age-matched controls. During word encoding sessions, 10 patients and 10 controls had either to decide whether the letter "s" appeared in visually presented words (alphabetical decision, shallow encoding), or whether the meaning of each presented word was animate or inanimate (lexical decision, deep encoding). These encoding sessions were followed by test sessions during which all previously encoded words were presented again together with the same number of new words. The task was then to discriminate between repeated and new words. Magnetic field changes related to brain activity were recorded with a whole-cortex MEG. Five probable AD patients showed recognition performances above chance level related to both depths of information processing. Those patients and 5 age-matched controls were then further analysed. Recognition performance was poorer in probable AD patients compared to controls for both levels of processing. However, in both groups deep encoding led to a higher recognition performance than shallow encoding. We therefore conclude that the performance reduction in the patient group was independent of depth of processing. Reaction times related to false alarms differed between patients and controls after deep encoding, which perhaps could already be used to support an early diagnosis. The analysis of the physiological data revealed significant differences between correctly recognised repetitions and correctly classified new words (old/new effect) in the control group, which were missing in the patient group after deep encoding. The lack of such an effect in the patient group is interpreted as being due to the respective neuropathology related to probable AD. The present results demonstrate that magnetic field recordings represent a useful tool to physiologically distinguish between probable AD patients and age-matched controls.

  14. Cascade theory in isotopic separation processes; Theorie des cascades en separation isotopique

    Energy Technology Data Exchange (ETDEWEB)

    Agostini, J P

    1994-06-01

    Three main areas are developed within the scope of this work: - the first is devoted to fundamentals: separative power, value function, ideal cascade and square cascade. Applications to two main cases are carried out, namely the study of a binary isotopic mix and the study of processes with a small enrichment coefficient. - The second is devoted to cascade coupling: high-flux coupling (more widely used and better known) as well as low-flux coupling are presented and compared to one another. - The third is an outlook on problems linked to cascade transients. Those problems are somewhat intricate, and their interest lies mainly in two areas: economics, where the start-up time may have a large influence on the interest paid during the construction and start-up period, and military production, where the start-up time has a direct bearing on the production schedule. (author). 50 figs. 3 annexes. 12 refs. 6 tabs.
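
    For concreteness, the fundamentals named here (value function and separative power) can be written down directly; this is textbook cascade theory rather than material reproduced from the report, and the flows and assays below are illustrative:

```python
import numpy as np

def value(x):
    """Value function V(x) = (2x - 1) * ln(x / (1 - x)) for assay x."""
    return (2 * x - 1) * np.log(x / (1 - x))

def separative_power(P, xP, W, xW, F, xF):
    """Separative power dU = P*V(xP) + W*V(xW) - F*V(xF) (same units as flow)."""
    return P * value(xP) + W * value(xW) - F * value(xF)

# Illustrative enrichment duty: feed at 0.711% U-235, product 3.5%, tails 0.25%.
xF, xP, xW = 0.00711, 0.035, 0.0025
P = 1.0                                   # product flow (arbitrary units)
F = P * (xP - xW) / (xF - xW)             # feed flow from the mass balance
W = F - P                                 # tails flow
print(f"F = {F:.3f}, W = {W:.3f}, SWU = {separative_power(P, xP, W, xW, F, xF):.3f}")
```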

  15. M-momentum transfer between gravitons, membranes, and fivebranes as perturbative gauge theory processes

    International Nuclear Information System (INIS)

    Keski-Vakkuri, E.; Kraus, P.

    1998-01-01

    Polchinski and Pouliot have shown that M-momentum transfer between membranes in supergravity can be understood as a non-perturbative instanton effect in gauge theory. Here we consider a dual process: electric flux transmission between D-branes. We show that this process can be described in perturbation theory as virtual string pair creation, and is closely related to Schwinger's treatment of the pair creation of charged particles in a uniform electric field. Through the application of dualities, our perturbative calculation gives results for various non-perturbative amplitudes, including M-momentum transfer between gravitons, membranes and longitudinal fivebranes. Thus perturbation theory plus dualities are sufficient to demonstrate agreement between supergravity and gauge theory for a number of M-momentum transferring processes. A variety of other processes where branes are transmitted between branes, e.g. (p,q)-string transmission in IIB theory, can also be studied. We discuss the implications of our results for proving the eleven-dimensional Lorentz invariance of matrix theory. (orig.)

  16. Dynamic pathways to mediate reactions buried in thermal fluctuations. I. Time-dependent normal form theory for multidimensional Langevin equation.

    Science.gov (United States)

    Kawai, Shinnosuke; Komatsuzaki, Tamiki

    2009-12-14

    We present a novel theory which enables us to explore the mechanism of reaction selectivity and robust functions in complex systems persisting under thermal fluctuation. The theory constructs a nonlinear coordinate transformation so that the equation of motion for the new reaction coordinate is independent of the other nonreactive coordinates in the presence of thermal fluctuation. In this article we suppose that reacting systems subject to thermal noise are described by a multidimensional Langevin equation without a priori assumption for the form of potential. The reaction coordinate is composed not only of all the coordinates and velocities associated with the system (solute) but also of the random force exerted by the environment (solvent) with friction constants. The sign of the reaction coordinate at any instantaneous moment in the region of a saddle determines the fate of the reaction, i.e., whether the reaction will proceed through to the products or go back to the reactants. By assuming the statistical properties of the random force, one can know a priori a well-defined boundary of the reaction which separates the full position-velocity space in the saddle region into mainly reactive and mainly nonreactive regions even under thermal fluctuation. The analytical expression of the reaction coordinate provides the firm foundation on the mechanism of how and why reaction proceeds in thermal fluctuating environments.
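
    The full normal-form construction is beyond a short example, but the setting (a Langevin equation near a saddle, where noise decides a trajectory's fate) can be caricatured in one dimension; the inverted-parabola barrier and all parameters below are our own:

```python
import numpy as np

# Toy Langevin dynamics near a saddle: U(q) = -0.5*k*q^2, so the force is +k*q.
#   dq = v dt,   dv = (k*q - gamma*v) dt + s dW,   s^2 = 2*gamma*kT.
k, gamma, kT = 1.0, 0.5, 0.2
dt, n_steps = 1e-2, 2_000
s = np.sqrt(2.0 * gamma * kT)
rng = np.random.default_rng(5)

def fate(q0, v0):
    """Follow one trajectory until it leaves the saddle region; True = product side."""
    q, v = q0, v0
    for _ in range(n_steps):
        q += v * dt
        v += (k * q - gamma * v) * dt + s * np.sqrt(dt) * rng.standard_normal()
        if abs(q) > 5.0:
            break
    return q > 0.0

# Ensemble launched from the barrier top with small random initial velocities.
outcomes = [fate(0.0, v0) for v0 in rng.normal(0.0, 0.3, size=200)]
print("fraction ending on the product side:", np.mean(outcomes))
```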

  17. Understanding the coping process from a self-determination theory perspective.

    Science.gov (United States)

    Ntoumanis, Nikos; Edmunds, Jemma; Duda, Joan L

    2009-05-01

    To explore conceptual links between the cognitive-motivational-relational theory (CMRT) of coping (Lazarus, 1991) and self-determination theory (SDT) of motivation (Deci & Ryan, 1985). We present a very brief overview of the two theories. We also discuss how components from the two theories can be examined together to facilitate research in the health/exercise domain. To this effect, we offer a preliminary integrated model of stress, coping, and motivation, based on the two aforementioned theories, in an attempt to illustrate and instigate research on how motivational factors are implicated in the coping process. We believe that the proposed model can serve as a platform for generating new research ideas which, besides their theoretical relevance, may have important applied implications.

  18. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory.

    Science.gov (United States)

    Pelaccia, Thierry; Tardif, Jacques; Triby, Emmanuel; Charlin, Bernard

    2011-03-14

    Clinical reasoning plays a major role in the ability of doctors to make diagnoses and decisions. It is considered as the physician's most critical competence, and has been widely studied by physicians, educationalists, psychologists and sociologists. Since the 1970s, many theories about clinical reasoning in medicine have been put forward. This paper aims at exploring a comprehensive approach: the "dual-process theory", a model developed by cognitive psychologists over the last few years. After 40 years of sometimes contradictory studies on clinical reasoning, the dual-process theory gives us many answers on how doctors think while making diagnoses and decisions. It highlights the importance of physicians' intuition and the high level of interaction between analytical and non-analytical processes. However, it has not received much attention in the medical education literature. The implications of dual-process models of reasoning in terms of medical education will be discussed.

  19. Pierre Gy's sampling theory and sampling practice heterogeneity, sampling correctness, and statistical process control

    CERN Document Server

    Pitard, Francis F

    1993-01-01

    Pierre Gy's Sampling Theory and Sampling Practice, Second Edition is a concise, step-by-step guide for process variability management and methods. Updated and expanded, this new edition provides a comprehensive study of heterogeneity, covering the basic principles of sampling theory and its various applications. It presents many practical examples to allow readers to select appropriate sampling protocols and assess the validity of sampling protocols from others. The variability of dynamic process streams using variography is discussed to help bridge sampling theory with statistical process control. Many descriptions of good sampling devices, as well as descriptions of poor ones, are featured to educate readers on what to look for when purchasing sampling systems. The book uses its accessible, tutorial style to focus on professional selection and use of methods. The book will be a valuable guide for mineral processing engineers; metallurgists; geologists; miners; chemists; environmental scientists; and practit...

  20. Exploiting attractiveness in persuasion: senders' implicit theories about receivers' processing motivation.

    Science.gov (United States)

    Vogel, Tobias; Kutzner, Florian; Fiedler, Klaus; Freytag, Peter

    2010-06-01

    Previous research suggests a positive correlation between physical attractiveness and the expectation of positive outcomes in social interactions, such as successful persuasion. However, prominent persuasion theories do not imply a general advantage of attractive senders. Instead, the persuasion success should vary with the receivers' processing motivation and processing capacity. Focusing on the perspective of the sender, the authors elaborate on lay theories about how attractiveness affects persuasion success. They propose that lay theories (a) match scientific models in that they also comprise the interaction of senders' attractiveness and receivers' processing characteristics, (b) guide laypersons' anticipation of persuasion success, and (c) translate into strategic behavior. They show that anticipated persuasion success depends on the interplay of perceived attractiveness and expectations about receivers' processing motivation (Experiment 1 and 2). Further experiments show that laypersons strategically attempt to exploit attractiveness in that they approach situations (Experiment 3) and persons (Experiment 4) that promise persuasion success.

  1. Dual-Process Theories of Reasoning: Contemporary Issues and Developmental Applications

    Science.gov (United States)

    Evans, Jonathan St. B. T.

    2011-01-01

    In this paper, I discuss the current state of theorising about dual processes in adult performance on reasoning and decision making tasks, in which Type 1 intuitive processing is distinguished from Type 2 reflective thinking. I show that there are many types of theory some of which distinguish modes rather than types of thinking and that…

  2. Trichotomous Processes in Early Memory Development, Aging, and Neurocognitive Impairment: A Unified Theory

    Science.gov (United States)

    Brainerd, C. J.; Reyna, V. F.; Howe, M. L.

    2009-01-01

    One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and…

  3. Extending Attribution Theory: Considering Students' Perceived Control of the Attribution Process

    Science.gov (United States)

    Fishman, Evan J.; Husman, Jenefer

    2017-01-01

    Research in attribution theory has shown that students' causal thinking profoundly affects their learning and motivational outcomes. Very few studies, however, have explored how students' attribution-related beliefs influence the causal thought process. The present study used the perceived control of the attribution process (PCAP) model to examine…

  4. Dynamic Training Elements in a Circuit Theory Course to Implement a Self-Directed Learning Process

    Science.gov (United States)

    Krouk, B. I.; Zhuravleva, O. B.

    2009-01-01

    This paper reports on the implementation of a self-directed learning process in a circuit theory course, incorporating dynamic training elements which were designed on the basis of a cybernetic model of cognitive process management. These elements are centrally linked in a dynamic learning frame, created on the monitor screen, which displays the…

  5. Concreteness effects in semantic processing: ERP evidence supporting dual-coding theory.

    Science.gov (United States)

    Kounios, J; Holcomb, P J

    1994-07-01

    Dual-coding theory argues that processing advantages for concrete over abstract (verbal) stimuli result from the operation of 2 systems (i.e., imaginal and verbal) for concrete stimuli, rather than just 1 (for abstract stimuli). These verbal and imaginal systems have been linked with the left and right hemispheres of the brain, respectively. Context-availability theory argues that concreteness effects result from processing differences in a single system. The merits of these theories were investigated by examining the topographic distribution of event-related brain potentials in 2 experiments (lexical decision and concrete-abstract classification). The results were most consistent with dual-coding theory. In particular, different scalp distributions of an N400-like negativity were elicited by concrete and abstract words.

  6. Energy shocks, crises and the policy process: A review of theory and application

    International Nuclear Information System (INIS)

    Grossman, Peter Z.

    2015-01-01

    What motivates changes in energy policy? Typically, the process begins with a notable exogenous event, a shock. Often, the shock leads to what is perceived to be a crisis. This review essay surveys theories of crisis policymaking from the social science literature and considers their application to changes in energy policy. Two cases — one from the U.S., the other from Germany — are examined in more detail from the standpoint of the theories discussed. Suggestions are made for improving energy policy analysis in the future. - Highlights: • An analysis of the idea of “crisis” and its application to energy. • A review of theories and models of the policy process and of policy change. • Theory applied to two energy cases. • Suggestion as to how the analysis of energy policymaking might be approached in the future

  7. Mothers' daily person and process praise: implications for children's theory of intelligence and motivation.

    Science.gov (United States)

    Pomerantz, Eva M; Kempner, Sara G

    2013-11-01

    This research examined if mothers' day-to-day praise of children's success in school plays a role in children's theory of intelligence and motivation. Participants were 120 children (mean age = 10.23 years) and their mothers who took part in a 2-wave study spanning 6 months. During the first wave, mothers completed a 10-day daily interview in which they reported on their use of person (e.g., "You are smart") and process (e.g., "You tried hard") praise. Children's entity theory of intelligence and preference for challenge in school were assessed with surveys at both waves. Mothers' person, but not process, praise was predictive of children's theory of intelligence and motivation: The more person praise mothers used, the more children subsequently held an entity theory of intelligence and avoided challenge over and above their earlier functioning on these dimensions.

  8. Dynamic interracial/intercultural processes: the role of lay theories of race.

    Science.gov (United States)

    Hong, Ying-yi; Chao, Melody Manchi; No, Sun

    2009-10-01

    This paper explores how the lay theory approach provides a framework beyond previous stereotype/prejudice research to understand dynamic personality processes in interracial/ethnic contexts. The authors conceptualize theory of race within the Cognitive-Affective Personality System (CAPS), in which lay people's beliefs regarding the essential nature of race sets up a mind-set through which individuals construe and interpret their social experiences. The research findings illustrate that endorsement of the essentialist theory (i.e., that race reflects deep-seated, inalterable essence and is indicative of traits and ability) versus the social constructionist theory (i.e., that race is socially constructed, malleable, and arbitrary) are associated with different encoding and representation of social information, which in turn affect feelings, motivation, and competence in navigating between racial and cultural boundaries. These findings shed light on dynamic interracial/intercultural processes. Relations of this approach to CAPS are discussed.

  9. Critically Engaging "Mutually Engaged Supervisory Processes": A Proposed Theory for CPE Supervisory Education.

    Science.gov (United States)

    Fitchett, George; Altenbaumer, Mary L; Atta, Osofo Kwesi; Stowman, Sheryl Lyndes; Vlach, Kyle

    2014-12-01

    Revisions to the processes for training and certifying supervisors continue to be debated within the Association for Clinical Pastoral Education (ACPE). In 2012 Ragsdale and colleagues published, "Mutually engaged supervisory processes," a qualitative research study utilizing grounded theory based on interviews with 19 recently certified Associate CPE Supervisors, of nine components that facilitate the development of CPE supervisory education students. In this article we critically engage this theory and the research upon which it is based. We also reflect on three issues highlighted by the theory: personal transformation in CPE supervisory education, how CPE supervisory education students develop theoretical foundations for their work, and engaging multicultural issues in supervisory education. We conclude that this theory offers ACPE the possibility of using research to guide future modifications to its practice of Supervisory education. © 2014 Journal of Pastoral Care Publications Inc.

  10. Cue acquisition: A feature of Malawian midwives decision making process to support normality during the first stage of labour.

    Science.gov (United States)

    Chodzaza, Elizabeth; Haycock-Stuart, Elaine; Holloway, Aisha; Mander, Rosemary

    2018-03-01

    to explore Malawian midwives' decision making when caring for women during the first stage of labour in the hospital setting. This focused ethnographic study examined the decision-making process of 9 nurse-midwives with varying years of clinical experience in the real-world setting of an urban and a semi-urban hospital from October 2013 to May 2014. This was done using 27 participant observations and 27 post-observation in-depth interviews over a period of six months. Qualitative data analysis software, NVivo 10, was used to assist with data management for the analysis. All data were analysed using the principle of theme and category formation. Analysis revealed a six-stage process of decision making that includes: a baseline for labour; deciding to admit a woman to the labour ward; ascertaining the normal physiological progress of labour; supporting the normal physiological progress of labour; embracing uncertainty: the midwives' construction of unusual labour as normal; and dealing with uncertainty and deciding to intervene in unusual labour. This six-stage process of decision making is conceptualised as the 'role of cue acquisition', illustrating the ways in which midwives utilise their assessment of labouring women to reason and make decisions on how to care for them in labour. Cue acquisition involved the midwives piecing together segments of information they obtained from the women to formulate an understanding of the woman's birthing progress and inform the midwives' decision-making process. This understanding of cue acquisition by midwives is significant for supporting safe care in the labour setting. When there was uncertainty in a woman's progress of labour, midwives used deductive reasoning, for example, by cross-checking and analysing the information obtained during the span of labour. Supporting normal labour physiological processes was identified as an underlying principle that shaped the midwives' clinical judgement and decision making when they cared for women in

  11. Pitch angle scattering of relativistic electrons from stationary magnetic waves: Continuous Markov process and quasilinear theory

    International Nuclear Information System (INIS)

    Lemons, Don S.

    2012-01-01

    We develop a Markov process theory of charged particle scattering from stationary, transverse, magnetic waves. We examine approximations that lead to quasilinear theory, in particular the resonant diffusion approximation. We find that, when appropriate, the resonant diffusion approximation simplifies the result of the weak turbulence approximation without significant further restricting the regime of applicability. We also explore a theory generated by expanding drift and diffusion rates in terms of a presumed small correlation time. This small correlation time expansion leads to results valid for relatively small pitch angle and large wave energy density - a regime that may govern pitch angle scattering of high-energy electrons into the geomagnetic loss cone.
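
    As a toy illustration of the drift-plus-diffusion (Markov) picture of pitch-angle scattering, the sketch below integrates the Ito process equivalent to a standard pitch-angle Fokker-Planck equation; the quasilinear-style coefficient D(mu) = D0*(1 - mu^2) is our illustrative choice, not the paper's:

```python
import numpy as np

# Ito SDE equivalent to the pitch-angle Fokker-Planck equation
#   df/dt = d/dmu [ D(mu) df/dmu ],  with D(mu) = D0*(1 - mu^2):
#   d(mu) = D'(mu) dt + sqrt(2*D(mu)) dW.
D0, dt, n_steps, n_particles = 0.1, 1e-2, 5_000, 2_000
rng = np.random.default_rng(6)
mu = np.full(n_particles, 0.9)   # beam launched far from 90-degree pitch angle

for _ in range(n_steps):
    D = D0 * (1.0 - mu**2)
    mu += -2.0 * D0 * mu * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(n_particles)
    mu = np.clip(mu, -1.0, 1.0)

# The stationary density of this process is uniform on [-1, 1]: the beam isotropizes.
print("mean mu:", mu.mean(), "| fraction |mu| < 0.5:", np.mean(np.abs(mu) < 0.5))
```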

  12. Theory-data comparisons for jet measurements in hadron-induced processes

    Energy Technology Data Exchange (ETDEWEB)

    Wobisch, M. [Lousiana Tech Univ., Ruston, LA (United States); Britzger, D. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Kluge, T. [Liverpool Univ. (United Kingdom); Rabbertz, K.; Stober, F. [Karlsruher Institut fuer Technologie (KIT), Karlsruhe (Germany)

    2011-09-15

    We present a comprehensive overview of theory-data comparisons for inclusive jet production. Theory predictions are derived for recent parton distribution functions and compared with jet data from different hadron-induced processes at various center-of-mass energies √s. The comparisons are presented as a function of jet transverse momentum pT or, alternatively, of the scaling variable xT = 2pT/√s. (orig.)

  13. Applying circular economy innovation theory in business process modeling and analysis

    Science.gov (United States)

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the authors' proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

  14. Phenomenological rate process theory for the storage of atomic H in solid H2*

    International Nuclear Information System (INIS)

    Rosen, G.

    1976-01-01

    A phenomenological rate process theory is developed for the storage and rapid recombination of atomic hydrogen free radicals in a crystalline molecular hydrogen solid at temperatures in the range 0.1 K ≤ T ≤ … K. It is shown that such a theory can account quantitatively for the recently observed dependence of the storage time on the storage temperature, for the maximum concentration of trapped H atoms, and for the time duration of the energy release in the tritium decay experiments of Webeler

  15. Hard is Normal: Military Families' Transitions Within the Process of Deployment.

    Science.gov (United States)

    Yablonsky, Abigail M; Barbero, Edie Devers; Richardson, Jeanita W

    2016-02-01

    US military deployments have become more frequent and lengthier in duration since 2003. Over half of US military members are married, and many also have children. The authors sought to understand the process of deployment from the perspective of the military family. After a thorough search of the literature, 21 primary research reports of 19 studies with an aggregate sample of 874 were analyzed using qualitative metasynthesis. The deployment process was experienced in four temporal domains. The military family as a whole shared the pre-deployment transition: all family members felt uncertain about the future, needed to complete tasks to "get ready" for deployment, and experienced a sense of distancing in preparation for the upcoming separation. The AD member went through the deployment transition independently, needing to "stay engaged" with the military mission, building a surrogate family and simultaneously trying to maintain connection with the family at home. In parallel, the home front family was going through a transposement transition, moving forward as an altered family unit, taking on new roles and responsibilities, and trying to simultaneously connect with the deployed member and find support from other military families. In post-deployment, the family went through the "reintegration" transition together, managing expectations, and readjusting family roles, all needing understanding and appreciation for their sacrifices during the recent separation. Effective family communication was important for military family well-being after deployment but unexpectedly challenging for many. Clinical, research, and policy recommendations are discussed. © 2015 Wiley Periodicals, Inc. This article has been contributed to by a US Government employee and her work is in the public domain in the USA.

  16. Overriding Moral Intuitions – Does It Make Us Immoral? Dual-Process Theory of Higher Cognition Account for Moral Reasoning

    OpenAIRE

    Michał Białek; Simon J. Handley

    2013-01-01

    Moral decisions are considered to be an intuitive process, while conscious reasoning is mostly used only to justify those intuitions. This problem is described in several different dual-process theories of mind that are being developed, e.g., by Frederick and Kahneman, Stanovich, and Evans. Those theories have recently evolved into tri-process theories, with a proposed third process that makes the ultimate decision or allows paraformal processing with focal bias. The presented experiment compares…

  17. A neural network model of normal and abnormal auditory information processing.

    Science.gov (United States)

    Du, X; Jansen, B H

    2011-08-01

    The ability of the brain to attenuate the response to irrelevant sensory stimulation is referred to as sensory gating. A gating deficiency has been reported in schizophrenia. To study the neural mechanisms underlying sensory gating, a neuroanatomically inspired model of auditory information processing has been developed. The mathematical model consists of lumped parameter modules representing the thalamus (TH), the thalamic reticular nucleus (TRN), auditory cortex (AC), and prefrontal cortex (PC). It was found that the membrane potential of the pyramidal cells in the PC module replicated auditory evoked potentials, recorded from the scalp of healthy individuals, in response to pure tones. Also, the model produced substantial attenuation of the response to the second of a pair of identical stimuli, just as seen in actual human experiments. We also tested the viewpoint that schizophrenia is associated with a deficit in prefrontal dopamine (DA) activity, which would lower the excitatory and inhibitory feedback gains in the AC and PC modules. Lowering these gains by less than 10% resulted in model behavior resembling the brain activity seen in schizophrenia patients, and replicated the reported gating deficits. The model suggests that the TRN plays a critical role in sensory gating, with the smaller response to a second tone arising from a reduction in inhibition of TH by the TRN. Copyright © 2011 Elsevier Ltd. All rights reserved.
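
    The record's lumped-parameter modules are not specified in detail; as a rough sketch of what one such module looks like, here is a minimal Jansen-Rit-style neural mass unit with the usual textbook parameter values. Reducing the TH-TRN-AC-PC network to a single module is our simplification:

```python
import numpy as np

# One lumped-parameter (neural mass) module: pyramidal cells receive
# excitatory and inhibitory feedback from two interneuron populations.
A, B = 3.25, 22.0          # excitatory / inhibitory synaptic gains (mV)
a, b = 100.0, 50.0         # inverse synaptic time constants (1/s)
C = 135.0                  # connectivity constant
C1, C2, C3, C4 = C, 0.8 * C, 0.25 * C, 0.25 * C

def sigmoid(v, e0=2.5, v0=6.0, r=0.56):
    """Population firing rate as a function of mean membrane potential."""
    return 2 * e0 / (1 + np.exp(r * (v0 - v)))

def step(y, p, dt):
    """Euler step of the 6-dimensional Jansen-Rit ODE; p is external input."""
    y0, y1, y2, y3, y4, y5 = y
    dy = np.array([
        y3,
        y4,
        y5,
        A * a * sigmoid(y1 - y2) - 2 * a * y3 - a**2 * y0,
        A * a * (p + C2 * sigmoid(C1 * y0)) - 2 * a * y4 - a**2 * y1,
        B * b * C4 * sigmoid(C3 * y0) - 2 * b * y5 - b**2 * y2,
    ])
    return y + dt * dy

dt, T = 1e-4, 2.0
y = np.zeros(6)
rng = np.random.default_rng(7)
out = []
for _ in range(int(T / dt)):
    y = step(y, p=rng.uniform(120, 320), dt=dt)  # stochastic afferent input
    out.append(y[1] - y[2])   # proxy for the recorded pyramidal potential
print("simulated EEG-like output, last value:", out[-1])
```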

  18. Managing fear in public health campaigns: a theory-based formative evaluation process.

    Science.gov (United States)

    Cho, Hyunyi; Witte, Kim

    2005-10-01

    The HIV/AIDS infection rate of Ethiopia is one of the world's highest. Prevention campaigns should systematically incorporate and respond to at-risk population's existing beliefs, emotions, and perceived barriers in the message design process to effectively promote behavior change. However, guidelines for conducting formative evaluation that are grounded in proven risk communication theory and empirical data analysis techniques are hard to find. This article provides a five-step formative evaluation process that translates theory and research for developing effective messages for behavior change. Guided by the extended parallel process model, the five-step process helps message designers manage public's fear surrounding issues such as HIV/AIDS. An entertainment education project that used the process to design HIV/AIDS prevention messages for Ethiopian urban youth is reported. Data were collected in five urban regions of Ethiopia and analyzed according to the process to develop key messages for a 26-week radio soap opera.

  19. The effects of limited bandwidth and noise on verbal processing time and word recall in normal-hearing children.

    Science.gov (United States)

    McCreery, Ryan W; Stelmachowicz, Patricia G

    2013-09-01

    Understanding speech in acoustically degraded environments can place significant cognitive demands on school-age children who are developing the cognitive and linguistic skills needed to support this process. Previous studies suggest the speech understanding, word learning, and academic performance can be negatively impacted by background noise, but the effect of limited audibility on cognitive processes in children has not been directly studied. The aim of the present study was to evaluate the impact of limited audibility on speech understanding and working memory tasks in school-age children with normal hearing. Seventeen children with normal hearing between 6 and 12 years of age participated in the present study. Repetition of nonword consonant-vowel-consonant stimuli was measured under conditions with combinations of two different signal to noise ratios (SNRs; 3 and 9 dB) and two low-pass filter settings (3.2 and 5.6 kHz). Verbal processing time was calculated based on the time from the onset of the stimulus to the onset of the child's response. Monosyllabic word repetition and recall were also measured in conditions with a full bandwidth and 5.6 kHz low-pass cutoff. Nonword repetition scores decreased as audibility decreased. Verbal processing time increased as audibility decreased, consistent with predictions based on increased listening effort. Although monosyllabic word repetition did not vary between the full bandwidth and 5.6 kHz low-pass filter condition, recall was significantly poorer in the condition with limited bandwidth (low pass at 5.6 kHz). Age and expressive language scores predicted performance on word recall tasks, but did not predict nonword repetition accuracy or verbal processing time. Decreased audibility was associated with reduced accuracy for nonword repetition and increased verbal processing time in children with normal hearing. Deficits in free recall were observed even under conditions where word repetition was not affected

  20. The Theory of High Energy Collision Processes - Final Report DOE/ER/40158-1

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Tai Tsun

    2011-09-15

    In 1984, DOE awarded Harvard University a new Grant DE-FG02-84ER40158 to continue their support of Tai Tsun Wu as Principal Investigator of research on the theory of high energy collision processes. This Grant was renewed and remained active continuously from June 1, 1984 through November 30, 2007. Topics of interest during the 23-year duration of this Grant include: the theory and phenomenology of collision and production processes at ever higher energies; helicity methods of QED and QCD; neutrino oscillations and masses; Yang-Mills gauge theory; Beamstrahlung; Fermi pseudopotentials; magnetic monopoles and dyons; cosmology; classical confinement; mass relations; Bose-Einstein condensation; and large-momentum-transfer scattering processes. This Final Report describes the research carried out on Grant DE-FG02-84ER40158 for the period June 1, 1984 through November 30, 2007. Two books resulted from this project and a total of 125 publications.

  1. Theory of the Andreev reflection and the density of states in proximity contact normal-superconducting infinite double-layer

    International Nuclear Information System (INIS)

    Nagato, Yasushi; Nagai, Katsuhiko

    1993-01-01

    A proximity-contact N-S double-layer with infinite layer widths is studied in the clean limit. The finite reflection at the interface is taken into account. Starting from a recent theory of the finite-width double-layer by Ashida et al., the authors obtain explicit expressions for the quasi-classical Green's function which already satisfy the boundary condition and include no exploding terms at infinities. The self-consistent pair potentials are obtained numerically with sufficient accuracy. The Andreev reflection at the N-S interface is discussed on the basis of the self-consistent pair potential. It is shown that there exists a resonance state in a potential valley formed between the depressed pair potential and the partially reflecting interface, which leads to a peak of the Andreev reflection coefficient with height unity slightly below the bulk superconductor energy gap. They also find a general relationship between the Andreev reflection coefficient and the local density of states of the superconductor just at the interface

  2. Caudal articular process dysplasia of thoracic vertebrae in neurologically normal French bulldogs, English bulldogs, and Pugs: Prevalence and characteristics.

    Science.gov (United States)

    Bertram, Simon; Ter Haar, Gert; De Decker, Steven

    2018-02-20

    The aims of this study were to evaluate the prevalence and anatomical characteristics of thoracic caudal articular process dysplasia in French bulldogs, English bulldogs and Pugs presenting for problems unrelated to spinal disease. In this retrospective cross-sectional study, computed tomography scans of the thoracic vertebral column of these three breeds were reviewed for the presence and location of caudal articular process hypoplasia and aplasia, and compared between breeds. A total of 271 dogs met the inclusion criteria: 108 French bulldogs, 63 English bulldogs, and 100 Pugs. A total of 70.4% of French bulldogs, 84.1% of English bulldogs, and 97.0% of Pugs showed evidence of caudal articular process dysplasia. Compared to French and English bulldogs, Pugs showed a significantly higher prevalence of caudal articular process aplasia, but also a lower prevalence of caudal articular process hypoplasia, a higher number of affected vertebrae per dog and demonstrated a generalized and bilateral spatial pattern more frequently. Furthermore, Pugs showed a significantly different anatomical distribution of caudal articular process dysplasia along the vertebral column, with a high prevalence of caudal articular process aplasia between T10 and T13. This area was almost completely spared in French and English bulldogs. As previously suggested, caudal articular process dysplasia is a common finding in neurologically normal Pugs but this also seems to apply to French and English bulldogs. The predisposition of clinically relevant caudal articular process dysplasia in Pugs is possibly not only caused by the higher prevalence of caudal articular process dysplasia, but also by breed specific anatomical characteristics. © 2018 American College of Veterinary Radiology.

  3. R-Matrix Theory of Atomic Collisions Application to Atomic, Molecular and Optical Processes

    CERN Document Server

    Burke, Philip George

    2011-01-01

    Commencing with a self-contained overview of atomic collision theory, this monograph presents recent developments of R-matrix theory and its applications to a wide range of atomic, molecular and optical processes. These developments include electron and photon collisions with atoms, ions and molecules required in the analysis of laboratory and astrophysical plasmas, multiphoton processes required in the analysis of superintense laser interactions with atoms and molecules, and positron collisions with atoms and molecules required in antimatter studies of scientific and technological importance. Basic mathematical results and general and widely used R-matrix computer programs are summarized in the appendices.

  4. Phonological processes in the speech of school-age children with hearing loss: Comparisons with children with normal hearing.

    Science.gov (United States)

    Asad, Areej Nimer; Purdy, Suzanne C; Ballard, Elaine; Fairgray, Liz; Bowen, Caroline

    2018-04-27

    In this descriptive study, phonological processes were examined in the speech of children aged 5;0-7;6 (years; months) with mild to profound hearing loss using hearing aids (HAs) and cochlear implants (CIs), in comparison to their peers. A second aim was to compare phonological processes of HA and CI users. Children with hearing loss (CWHL, N = 25) were compared to children with normal hearing (CWNH, N = 30) with similar age, gender, linguistic, and socioeconomic backgrounds. Speech samples obtained from a list of 88 words, derived from three standardized speech tests, were analyzed using the CASALA (Computer Aided Speech and Language Analysis) program to evaluate participants' phonological systems, based on lax (a process appeared at least twice in the speech of at least two children) and strict (a process appeared at least five times in the speech of at least two children) counting criteria. Developmental phonological processes were eliminated in the speech of younger and older CWNH while eleven developmental phonological processes persisted in the speech of both age groups of CWHL. CWHL showed a similar trend of age of elimination to CWNH, but at a slower rate. Children with HAs and CIs produced similar phonological processes. Final consonant deletion, weak syllable deletion, backing, and glottal replacement were present in the speech of HA users, affecting their overall speech intelligibility. Developmental and non-developmental phonological processes persist in the speech of children with mild to profound hearing loss compared to their peers with typical hearing. The findings indicate that it is important for clinicians to consider phonological assessment in pre-school CWHL and the use of evidence-based speech therapy in order to reduce non-developmental and non-age-appropriate developmental processes, thereby enhancing their speech intelligibility. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Research of radioecological processes by methods of the theory of reliability

    International Nuclear Information System (INIS)

    Kutlakhmedov, Yu.A.; Salivon, A.G.; Pchelovskaya, S.A.; Rodina, V.V.; Bevza, A.G.; Matveeva, I.V.

    2012-01-01

    The theory and models of ecosystem radiocapacity, combined with the theory and models of reliability, have made it possible to adequately describe the laws of radionuclide migration and distribution for different types of aquatic and terrestrial ecosystems. The theory and models of radiocapacity also allow the critical elements of an ecosystem, where temporary or final deposition of radionuclides is to be expected, to be strictly defined. The approach based on biogenic tracers allows, within the framework of the theory and models of radiocapacity and reliability, the simultaneous estimation of radionuclide migration processes, the determination of dose loads on the biota of ecosystems, and the establishment of fundamental parameters of the redistribution rates of radionuclides and other pollutants in different types of ecosystems.

  6. Revising Tinto's Interactionalist Theory of Student Departure Through Theory Elaboration: Examining the Role of Organizational Attributes in the Persistence Process.

    Science.gov (United States)

    Berger, Joseph B.; Braxton, John M.

    1998-01-01

    A study used theory elaboration to help revise Tinto's interactionalist theory of individual student departure from college to include the effects of organizational attributes on student withdrawal. Results provide strong support for including concepts from organizational theory and suggest future research should use theory elaboration to look for…

  7. The study of regional cerebral glucose metabolic change in human being normal aging process by using PET scanner

    International Nuclear Information System (INIS)

    Si Mingjue; Huang Gang

    2008-01-01

    Objective: With ongoing technical development, PET has been increasingly applied in brain function research. The aim of this study was to investigate the tendency of regional cerebral glucose metabolism changes in the normal human aging process by using 18F-fluorodeoxyglucose (FDG) PET/CT and statistical parametric mapping (SPM) software. Methods: 18F-FDG PET/CT brain imaging data acquired from 252 healthy normal subjects (age range: 21 to 88 years) were divided into 6 groups according to age: 21-30, 31-40, 41-50, 51-60, 61-70, 71-88. All 5 groups aged ≥31 years were compared to the control group of 21-30 years, and pixel-by-pixel t-statistic analysis was applied using SPM2. The hypo-metabolic areas were identified by MNI space utility (MSU) software and the voxel values of each brain area were calculated. Results: Subjects older than 60 years showed significant metabolic decreases with aging, mainly involving the bilateral frontal lobe (pre-motor cortex, dorsolateral prefrontal cortex, frontal pole), temporal lobe (temporal pole), insula, anterior cingulate cortex and cerebellum. The most significant metabolic decrease with aging was in the frontal lobe, followed by the anterior cingulate cortex, temporal lobe, insula and cerebellum, with right-hemisphere predominance (P<0.0001). The parietal lobe, parahippocampal gyrus, basal ganglia and thalamus remained metabolically unchanged with advancing age. Conclusions: The cerebral metabolic decrease in normal aging is an inconstant and asymmetrical process. Regional cerebral metabolism decreases much more significantly in healthy volunteers older than 60 years, mainly involving the bilateral frontal lobe, temporal lobe, insula, anterior cingulate cortex and cerebellum, with right-hemisphere predominance. (authors)

  8. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory

    Directory of Open Access Journals (Sweden)

    Thierry Pelaccia

    2011-03-01

    Full Text Available Context. Clinical reasoning plays a major role in the ability of doctors to make diagnoses and decisions. It is considered the physician's most critical competence, and has been widely studied by physicians, educationalists, psychologists and sociologists. Since the 1970s, many theories about clinical reasoning in medicine have been put forward. Purpose. This paper aims at exploring a comprehensive approach: the "dual-process theory", a model developed by cognitive psychologists over the last few years. Discussion. After 40 years of sometimes contradictory studies on clinical reasoning, the dual-process theory gives us many answers on how doctors think while making diagnoses and decisions. It highlights the importance of physicians' intuition and the high level of interaction between analytical and non-analytical processes. However, it has not received much attention in the medical education literature. The implications of dual-process models of reasoning in terms of medical education will be discussed.

  9. Applying Catastrophe Theory to an Information-Processing Model of Problem Solving in Science Education

    Science.gov (United States)

    Stamovlasis, Dimitrios; Tsaparlis, Georgios

    2012-01-01

    In this study, we test an information-processing model (IPM) of problem solving in science education, namely the working memory overload model, by applying catastrophe theory. Changes in students' achievement were modeled as discontinuities within a cusp catastrophe model, where working memory capacity was implemented as asymmetry and the degree…

  10. Behavioural investigations into uncertainty perception in service exchanges: Lessons from dual-processing theory

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2015-01-01

    by experience and knowledge. Based on dual-processing theory, this paper proposes an analysis method for assessing both explicit and implicit uncertainty perception depending on the individual’s use of tacit or explicit knowledge. Analysing two industrial case studies of service relationships, this paper...

  11. Analysis of the stochastic channel model by Saleh & Valenzuela via the theory of point processes

    DEFF Research Database (Denmark)

    Jakobsen, Morten Lomholt; Pedersen, Troels; Fleury, Bernard Henri

    2012-01-01

    and underlying features, like the intensity function of the component delays and the delaypower intensity. The flexibility and clarity of the mathematical instruments utilized to obtain these results lead us to conjecture that the theory of spatial point processes provides a unifying mathematical framework...

  12. Limit theory for the sample autocorrelations and extremes of a GARCH (1,1) process

    NARCIS (Netherlands)

    Mikosch, T; Starica, C

    2000-01-01

    The asymptotic theory for the sample autocorrelations and extremes of a GARCH(1,1) process is provided. Special attention is given to the case when the sum of the ARCH and GARCH parameters is close to 1, that is, when one is close to an infinite variance marginal distribution. This situation has
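
    A minimal simulation sketch of this setting (parameter values assumed for illustration, with α + β = 0.95, close to 1; not taken from the cited paper):

```python
import numpy as np

def simulate_garch11(n, omega=0.05, alpha=0.40, beta=0.55, seed=0):
    """Simulate x_t = sigma_t * z_t with
    sigma_t^2 = omega + alpha * x_{t-1}^2 + beta * sigma_{t-1}^2.
    Here alpha + beta = 0.95, i.e. close to the infinite-variance boundary."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    sigma2 = omega / (1.0 - alpha - beta)  # start at the stationary variance
    for t in range(1, n):
        sigma2 = omega + alpha * x[t - 1] ** 2 + beta * sigma2
        x[t] = np.sqrt(sigma2) * rng.standard_normal()
    return x

def sample_acf(y, max_lag=5):
    """Plain sample autocorrelations at lags 1..max_lag."""
    y = y - y.mean()
    denom = np.dot(y, y)
    return np.array([np.dot(y[:-k], y[k:]) / denom for k in range(1, max_lag + 1)])

x = simulate_garch11(100_000)
print(sample_acf(x))       # raw series: autocorrelations near zero
print(sample_acf(x ** 2))  # squared series: pronounced, slowly decaying ACF
```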

  13. Preliminary Process Theory does not validate the Comparison Question Test: A comment on Palmatier and Rovner

    NARCIS (Netherlands)

    Ben-Shakar, G.; Gamer, M.; Iacono, W.; Meijer, E.; Verschuere, B.

    2015-01-01

    Palmatier and Rovner (2015) attempt to establish the construct validity of the Comparison Question Test (CQT) by citing extensive research ranging from modern neuroscience to memory and psychophysiology. In this comment we argue that merely citing studies on the preliminary process theory (PPT) of

  14. The Process of Social Identity Development in Adolescent High School Choral Singers: A Grounded Theory

    Science.gov (United States)

    Parker, Elizabeth Cassidy

    2014-01-01

    The purpose of this grounded theory study was to describe the process of adolescent choral singers' social identity development within three midsized, midwestern high school mixed choirs. Forty-nine interviews were conducted with 36 different participants. Secondary data sources included memoing, observations, and interviews with the choir…

  15. A Qualitative Analysis Framework Using Natural Language Processing and Graph Theory

    Science.gov (United States)

    Tierney, Patrick J.

    2012-01-01

    This paper introduces a method of extending natural language-based processing of qualitative data analysis with the use of a very quantitative tool--graph theory. It is not an attempt to convert qualitative research to a positivist approach with a mathematical black box, nor is it a "graphical solution". Rather, it is a method to help qualitative…

  16. Improving Readability of an Evaluation Tool for Low-Income Clients Using Visual Information Processing Theories

    Science.gov (United States)

    Townsend, Marilyn S.; Sylva, Kathryn; Martin, Anna; Metz, Diane; Wooten-Swanson, Patti

    2008-01-01

    Literacy is an issue for many low-income audiences. Using visual information processing theories, the goal was improving readability of a food behavior checklist and ultimately improving its ability to accurately capture existing changes in dietary behaviors. Using group interviews, low-income clients (n = 18) evaluated 4 visual styles. The text…

  17. Language Learning Strategies and English Proficiency: Interpretations from Information-Processing Theory

    Science.gov (United States)

    Rao, Zhenhui

    2016-01-01

    The research reported here investigated the relationship between students' use of language learning strategies and their English proficiency, and then interpreted the data from two models in information-processing theory. Results showed that the students' English proficiency significantly affected their use of learning strategies, with high-level…

  18. The Emergence of the Teaching/Learning Process in Preschoolers: Theory of Mind and Age Effect

    Science.gov (United States)

    Bensalah, Leila

    2011-01-01

    This study analysed the gradual emergence of the teaching/learning process by examining theory of mind (ToM) acquisition and age effects in the preschool period. We observed five dyads performing a jigsaw task drawn from a previous study. Three stages were identified. In the first one, the teacher focuses on the execution of her/his own task…

  19. Factors Affecting Christian Parents' School Choice Decision Processes: A Grounded Theory Study

    Science.gov (United States)

    Prichard, Tami G.; Swezey, James A.

    2016-01-01

    This study identifies factors affecting the decision processes for school choice by Christian parents. Grounded theory design incorporated interview transcripts, field notes, and a reflective journal to analyze themes. Comparative analysis, including open, axial, and selective coding, was used to reduce the coded statements to five code families:…

  20. Seeking Humanizing Care in Patient-Centered Care Process: A Grounded Theory Study.

    Science.gov (United States)

    Cheraghi, Mohammad Ali; Esmaeili, Maryam; Salsali, Mahvash

    Patient-centered care is both a goal in itself and a tool for enhancing health outcomes. The application of patient-centered care in health care services globally however is diverse. This article reports on a study that sought to introduce patient-centered care. The aim of this study is to explore the process of providing patient-centered care in critical care units. The study used a grounded theory method. Data were collected on 5 critical care units in Tehran University of Medical Sciences. Purposive and theoretical sampling directed the collection of data using 29 semistructured interviews with 27 participants (nurses, patients, and physician). Data obtained were analyzed according to the analysis stages of grounded theory and constant comparison to identify the concepts, context, and process of the study. The core category of this grounded theory is "humanizing care," which consisted of 4 interrelated phases, including patient acceptance, purposeful patient assessment and identification, understanding patients, and patient empowerment. A core category of humanizing care integrated the theory. Humanizing care was an outcome and process. Patient-centered care is a dynamic and multifaceted process provided according to the nurses' understanding of the concept. Patient-centered care does not involve repeating routine tasks; rather, it requires an all-embracing understanding of the patients and showing respect for their values, needs, and preferences.

  1. A Grounded Theory of Text Revision Processes Used by Young Adolescents Who Are Deaf

    Science.gov (United States)

    Yuknis, Christina

    2014-01-01

    This study examined the revising processes used by 8 middle school students who are deaf or hard-of-hearing as they composed essays for their English classes. Using grounded theory, interviews with students and teachers in one middle school, observations of the students engaging in essay creation, and writing samples were collected for analysis.…

  2. Attachment and the Processing of Social Information across the Life Span: Theory and Evidence

    Science.gov (United States)

    Dykas, Matthew J.; Cassidy, Jude

    2011-01-01

    Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the…

  3. Can Dual Processing Theory Explain Physics Students' Performance on the Force Concept Inventory?

    Science.gov (United States)

    Wood, Anna K.; Galloway, Ross K.; Hardy, Judy

    2016-01-01

    According to dual processing theory there are two types, or modes, of thinking: system 1, which involves intuitive and nonreflective thinking, and system 2, which is more deliberate and requires conscious effort and thought. The Cognitive Reflection Test (CRT) is a widely used and robust three item instrument that measures the tendency to override…

  4. Gröbner bases in control theory and signal processing

    CERN Document Server

    Regensburger, Georg

    2007-01-01

    This volume contains survey and original articles presenting the state of the art on the application of Gröbner bases in control theory and signal processing. The contributions are based on talks delivered at the Special Semester on Gröbner Bases and Related Methods at the Johann Radon Institute of Computational and Applied Mathematics (RICAM), Linz, Austria, in May 2006.

  5. Imitation and processes of institutionalization - Insights from Bourdieu's theory of practice

    NARCIS (Netherlands)

    Sieweke, Jost

    2014-01-01

    New institutional theory highlights the importance of language in processes of institutionalization, but Bourdieu argues that institutions are also transmitted by mimesis, i.e., the unconscious imitation of other actors' actions. The aim of this paper is to develop a framework that explains

  6. Examining the influence of psychopathy, hostility biases, and automatic processing on criminal offenders' Theory of Mind

    NARCIS (Netherlands)

    Nentjes, L.; Bernstein, D.; Arntz, A.; van Breukelen, G.; Slaats, M.

    2015-01-01

    Theory of Mind (ToM) is a social perceptual skill that refers to the ability to take someone else's perspective and infer what others think. The current study examined the effect of potential hostility biases, as well as controlled (slow) versus automatic (fast) processing on ToM performance in

  7. Processing Capacity under Perceptual and Cognitive Load: A Closer Look at Load Theory

    Science.gov (United States)

    Fitousi, Daniel; Wenger, Michael J.

    2011-01-01

    Variations in perceptual and cognitive demands (load) play a major role in determining the efficiency of selective attention. According to load theory (Lavie, Hirst, Fockert, & Viding, 2004) these factors (a) improve or hamper selectivity by altering the way resources (e.g., processing capacity) are allocated, and (b) tap resources rather than…

  8. Fractional Flow Theory Applicable to Non-Newtonian Behavior in EOR Processes

    NARCIS (Netherlands)

    Rossen, W.R.; Venkatraman, A.; Johns, R.T.; Kibodeaux, K.R.; Lai, H.; Moradi Tehrani, N.

    2011-01-01

    The method of characteristics, or fractional-flow theory, is extremely useful in understanding complex Enhanced Oil Recovery (EOR) processes and in calibrating simulators. One limitation has been its restriction to Newtonian rheology except in rectilinear flow. Its inability to deal with

  9. How Innovation Theory Can Contribute to the Military Operations Planning Process

    DEFF Research Database (Denmark)

    Heltberg, Anna Therese; Dahl, Kåre

    The research study considers how the application of innovation theory might contribute to military staff work planning processes and bring new perspectives to operational models of analysis such as NATO’s Comprehensive Operations Planning Directive (COPD) and the Danish Field Manual III....

  10. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    Science.gov (United States)

    2016-05-12

    Final Report: Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory. U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Reporting period 15-May-2014 to 14-Feb-2015; report dated 12-05-2016; distribution unlimited. Subject terms: mathematical statistics; time series; Markov chains; random processes. Three areas

  11. Research on Remote Sensing Image Template Processing Based on Global Subdivision Theory

    OpenAIRE

    Xiong Delan; Du Genyuan

    2013-01-01

    To address the problems of vast data volumes, complex operations, and time-consuming processing of remote sensing images, a subdivision template based on global subdivision theory was proposed, which can establish a high level of abstraction and generalization for remote sensing images. The paper discusses in detail the model and structure of the subdivision template, and puts forward some new ideas for remote sensing image template processing, its key technologies, and rapid application demonstration. The research has ...

  12. Normal uptake of 68Ga-DOTA-TOC by the pancreas uncinate process mimicking malignancy at somatostatin receptor PET.

    Science.gov (United States)

    Jacobsson, Hans; Larsson, Patricia; Jonsson, Cathrine; Jussing, Emma; Grybäck, Per

    2012-04-01

    To characterize a commonly occurring increased uptake by the uncinate process of the pancreas at PET/CT using 68Ga-DOTA-d-Phe1-Tyr3-octreotide (68Ga-DOTA-TOC). This tracer has replaced 111In-pentetreotide (OctreoScan®) for somatostatin receptor scintigraphy at our laboratory. Fifty of our first 74 PET/CT examinations with 68Ga-DOTA-TOC could be evaluated retrospectively. None of these patients had undergone surgery or showed any pathology in the pancreatic head at the concomitant CT. Thirty-five of the 50 examinations (70%) showed an uptake by the uncinate process sufficiently intense to be interpreted as pathologic and simulating a tumor. Mean SUVmax was 9.2. Mean SUVmean using an isoactivity cut-off of >75% and >50% was 7.8 and 6.0, respectively. Volume calculations of the uncinate process activity using these definitions gave 0.9 mL and 4.2 mL, respectively. There is a frequent physiological uptake of 68Ga-DOTA-TOC by the pancreatic uncinate process. This may be caused by an accumulation of pancreatic polypeptide-containing cells expressing somatostatin receptors. If there is a normal finding at the concomitant diagnostic CT, this uptake should be regarded as physiological.
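
    The isoactivity-based VOI statistics reported above can be sketched as follows (hypothetical voxel values and voxel volume; an actual SUV computation also involves the injected dose and patient weight):

```python
import numpy as np

def suv_stats(voxels, cutoff_fraction, voxel_volume_ml=0.1):
    """SUVmax and SUVmean over an isoactivity VOI: the voxels whose value
    exceeds cutoff_fraction * SUVmax (0.75 and 0.50 in the study above)."""
    voxels = np.asarray(voxels, dtype=float)
    suv_max = voxels.max()
    voi = voxels[voxels > cutoff_fraction * suv_max]
    return suv_max, voi.mean(), voi.size * voxel_volume_ml

uptake = np.array([9.2, 8.1, 7.4, 6.3, 5.0, 3.1, 1.2])  # hypothetical SUVs
print(suv_stats(uptake, 0.75))  # SUVmax, SUVmean, VOI volume (mL)
print(suv_stats(uptake, 0.50))
```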

  13. Practice of Connectivism As Learning Theory: Enhancing Learning Process Through Social Networking Site (Facebook

    Directory of Open Access Journals (Sweden)

    Fahriye Altınay Aksal

    2013-12-01

    Full Text Available The impact of the digital age on learning and social interaction has been growing rapidly. The realm of the digital age and computer-mediated communication requires reconsidering instruction based on a collaborative, interactive learning process and socio-contextual experience for learning. Social networking sites such as Facebook can help create group spaces for digital dialogue to inform, question and challenge, within a frame of connectivism as a learning theory of the digital age. The aim of this study is to elaborate the practice of connectivism as a learning theory in the context of an internship course. In this study, a Facebook group space provided a social learning platform for dialogue and negotiation alongside the classroom learning and teaching process. The 35 internship students provided self-reports within the frame of this qualitative research. This showed how the principles of the theory were practiced and how the theory and the Facebook group space contributed to learning, self-leadership, decision-making and reflection skills. As the research reflects the practice of a new theory based on action research, learning in the digital age is not an individualistic endeavour, as regards the debate on learning in the digital age within a frame of connectivism.

  14. Superconductor-normal metal-superconductor process development for the fabrication of small Josephson junctions in ramp type configuration

    International Nuclear Information System (INIS)

    Poepel, R.; Hagedorn, D.; Weimann, T.; Buchholz, F.-I.; Niemeyer, J.

    2000-01-01

    At PTB, a fabrication process has been developed in SNS Nb/PdAu/Nb technology for the verification of small Josephson junctions (JJs) in the deep sub-micron range, to enable the implementation of JJs as active elements in highly integrated superconducting circuits. Two steps of this technological development are described with regard to appropriately designed circuit layouts of JJ series arrays (JJAs), the first one in a conventional window type junction (WTJ) configuration and the second one in a ramp type junction (RTJ) configuration. Test circuits of JJAs containing up to 10 000 JJs have been fabricated and experimentally tested. In WTJ configuration, the circuits proved to be sensitive to external perturbing effects affecting the stability of circuit operation. In contrast to that, in RTJ configuration, the circuits realized showed correct function and a high grade of operational reliability. To produce RTJ circuits, the technology parameters have been set to realize JJs with contact areas of A = 0.25 μm × 1.3 μm. At a PdAu normal metal layer thickness of d = 40 nm, the values achieved for the critical current density and for the product of critical current and normal-state resistance are about j_c = 200 kA cm⁻² and I_c·R_N = 21 μV. (author)
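
    As a back-of-the-envelope consistency check on the quoted figures (our own arithmetic, not from the paper): the contact area is A = 0.25 μm × 1.3 μm = 0.325 μm² = 0.325 × 10⁻⁸ cm², so the critical current is I_c = j_c·A ≈ 200 kA cm⁻² × 0.325 × 10⁻⁸ cm² ≈ 0.65 mA, and the normal-state resistance follows as R_N = (I_c·R_N)/I_c ≈ 21 μV / 0.65 mA ≈ 32 mΩ.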

  15. Optimal operation planning of radioactive waste processing system by fuzzy theory

    International Nuclear Information System (INIS)

    Yang, Jin Yeong; Lee, Kun Jai

    2000-01-01

    This study is concerned with the application of linear goal programming and fuzzy theory to the analysis of management and operational problems in a radioactive waste processing system (RWPS). The developed model is validated and verified using actual data obtained from the RWPS at Kyoto University in Japan. The solution by goal programming and fuzzy theory shows the optimal operation point, which maximizes the total treatable radioactive waste volume and minimizes the released radioactivity of liquid waste even under restricted resources. (orig.)
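
    The flavour of such an optimization can be sketched as a weighted goal program (a minimal sketch; all coefficients, limits and weights below are invented for illustration, and the fuzzy membership treatment used in the study is omitted):

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: x1, x2 = waste volumes routed to two treatment lines.
# Goals: maximize treated volume, minimize released activity (0.02 and 0.08
# activity units per volume unit); weights 1.0 and 0.5 fold the two goals
# into a single objective.  linprog minimizes, so signs are flipped.
c = np.array([-(1.0 - 0.5 * 0.02), -(1.0 - 0.5 * 0.08)])
A_ub = np.array([[2.0, 1.0],    # operating hours per volume unit <= 80
                 [1.0, 3.0]])   # chemicals per volume unit       <= 90
b_ub = np.array([80.0, 90.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # optimal volumes (here 30, 20) and weighted objective
```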

  16. High-energy, large-momentum-transfer processes: Ladder diagrams in φ3 theory. Pt. 1

    International Nuclear Information System (INIS)

    Osland, P.; Wu, T.T.; Harvard Univ., Cambridge, MA

    1987-01-01

    Relativistic quantum field theories may give us useful guidance to understanding high-energy, large-momentum-transfer processes, where the center-of-mass energy is much larger than the transverse momentum transfers, which are in turn much larger than the masses of the participating particles. With this possibility in mind, we study the ladder diagrams in φ3 theory. In this paper, some of the necessary techniques are developed and applied to the simplest cases of the fourth- and sixth-order ladder diagrams. (orig.)

  17. Ingredients and change processes in occupational therapy for children: a grounded theory study.

    Science.gov (United States)

    Armitage, Samantha; Swallow, Veronica; Kolehmainen, Niina

    2017-05-01

    There is limited evidence about the effectiveness of occupational therapy interventions for participation outcomes in children with coordination difficulties. Developing theory about the interventions, i.e. their ingredients and change processes, is the first step to advance the evidence base. To develop theory about the key ingredients of occupational therapy interventions for children with coordination difficulties and the processes through which change in participation might happen. Grounded theory methodology, as described by Kathy Charmaz, was used to develop the theory. Children and parents participated in semi-structured interviews to share their experiences of occupational therapy and processes of change. Data collection and analysis were completed concurrently using constant comparison methods. Five key ingredients of interventions were described: performing activities and tasks; achieving; carer support; helping and supporting the child; and labelling. Ingredients related to participation by changing children's mastery experience, increasing capability beliefs and sense of control. Parents' knowledge, skills, positive emotions, sense of empowerment and capability beliefs also related to children's participation. The results identify intervention ingredients and change pathways within occupational therapy to increase participation. It is unclear how explicitly and how often therapists consider and make use of these ingredients and pathways.

  18. "Theory Becoming Alive": The Learning Transition Process of Newly Graduated Nurses in Canada.

    Science.gov (United States)

    Nour, Violet; Williams, Anne M

    2018-01-01

    Background Newly graduated nurses often encounter a gap between theory and practice in clinical settings. Although this has been the focus of considerable research, little is known about the learning transition process. Purpose The purpose of this study was to explore the experiences of newly graduated nurses in acute healthcare settings within Canada. This study was conducted to gain a greater understanding of the experiences and challenges faced by graduates. Methods Grounded theory method was utilized with a sample of 14 registered nurses who were employed in acute-care settings. Data were collected using in-depth interviews. Constant comparative analysis was used to analyze data. Results Findings revealed a core category, "Theory Becoming Alive," and four supporting categories: Entry into Practice, Immersion, Committing, and Evolving. Theory Becoming Alive described the process of new graduate nurses' clinical learning experiences as well as the challenges that they encountered in clinical settings after graduating. Conclusions This research provides a greater understanding of learning process of new graduate nurses in Canada. It highlights the importance of providing supportive environments to assist new graduate nurses to develop confidence as independent registered nurses in clinical areas. Future research directions as well as supportive educational strategies are described.

  19. Effect of ions on sulfuric acid-water binary particle formation: 2. Experimental data and comparison with QC-normalized classical nucleation theory

    CERN Document Server

    Duplissy, J.; Franchin, A.; Tsagkogeorgas, G.; Kangasluoma, J.; Wimmer, D.; Vuollekoski, H.; Schobesberger, S.; Lehtipalo, K.; Flagan, R. C.; Brus, D.; Donahue, N. M.; Vehkamäki, H.; Almeida, J.; Amorim, A.; Barmet, P.; Bianchi, F.; Breitenlechner, M.; Dunne, E. M.; Guida, R.; Henschel, H.; Junninen, H.; Kirkby, J.; Kürten, A.; Kupc, A.; Määttänen, A.; Makhmutov, V.; Mathot, S.; Nieminen, T.; Onnela, A.; Praplan, A. P.; Riccobono, F.; Rondo, L.; Steiner, G.; Tome, A.; Walther, H.; Baltensperger, U.; Carslaw, K. S.; Dommen, J.; Hansel, A.; Petäjä, T.; Sipilä, M.; Stratmann, F.; Vrtala, A.; Wagner, P. E.; Worsnop, D. R.; Curtius, J.; Kulmala, M.

    2015-09-04

    We report comprehensive, demonstrably contaminant-free measurements of binary particle formation rates by sulfuric acid and water for neutral and ion-induced pathways, conducted in the European Organization for Nuclear Research Cosmics Leaving Outdoor Droplets (CLOUD) chamber. The recently developed Atmospheric Pressure interface time-of-flight mass spectrometer was used to detect contaminants in charged clusters and to identify runs free of any contaminants. Four parameters were varied to cover ambient conditions: sulfuric acid concentration (10⁵ to 10⁹ mol cm⁻³), relative humidity (11% to 58%), temperature (207 K to 299 K), and total ion concentration (0 to 6800 ions cm⁻³). Formation rates were directly measured with novel instruments at sizes close to the critical cluster size (mobility size of 1.3 nm to 3.2 nm). We compare our results with predictions from Classical Nucleation Theory normalized by Quantum Chemical calculation (QC-normalized CNT), which is described in a companion paper...

  20. Integration of multiple theories for the simulation of laser interference lithography processes.

    Science.gov (United States)

    Lin, Te-Hsun; Yang, Yin-Kuang; Fu, Chien-Chung

    2017-11-24

    The periodic structure of laser interference lithography (LIL) fabrication is superior to that of other lithography technologies. In contrast to traditional lithography, LIL has the advantages of a simple optical system with no mask requirements, low cost, high depth of focus, and a large patterning area in a single exposure. Generally, a simulation pattern for the periodic structure is obtained through optical interference prior to its fabrication through LIL. However, the LIL process is complex and combines the fields of optics and polymer materials; thus, a single simulation theory cannot reflect the real situation. Therefore, this research integrates multiple theories, including those of optical interference, standing waves, and photoresist characteristics, to create a mathematical model of the LIL process. The mathematical model can accurately estimate the exposure time and thereby reduce the time spent on trial and error in the LIL process.
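
    A rough sketch of the optical-interference ingredient of such a model (illustrative wavelength, angle, and exposure threshold; the paper's model additionally folds in standing waves and photoresist response):

```python
import numpy as np

# Two-beam interference: period L = lambda / (2 sin(theta)),
# intensity I(x) = 4 * I0 * cos^2(pi * x / L).
wavelength = 325e-9                 # m, assumed HeCd laser line
theta = np.deg2rad(30.0)            # half-angle between the beams, assumed
period = wavelength / (2.0 * np.sin(theta))

x = np.linspace(0.0, 3.0 * period, 600)
intensity = 4.0 * np.cos(np.pi * x / period) ** 2   # in units of I0

# Crude exposure model: resist clears where dose = intensity * t_exp
# exceeds a threshold, so the exposure time sets the linewidth.
t_exp, threshold = 1.0, 2.0
exposed = intensity * t_exp > threshold
print(f"period = {period * 1e9:.0f} nm, exposed fraction = {exposed.mean():.2f}")
```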

  2. Multitrophic microbial interactions for eco- and agro-biotechnological processes: theory and practice.

    Science.gov (United States)

    Saleem, Muhammad; Moe, Luke A

    2014-10-01

    Multitrophic level microbial loop interactions mediated by protist predators, bacteria, and viruses drive eco- and agro-biotechnological processes such as bioremediation, wastewater treatment, plant growth promotion, and ecosystem functioning. To what extent these microbial interactions are context-dependent in performing biotechnological and ecosystem processes remains largely unstudied. Theory-driven research may advance the understanding of eco-evolutionary processes underlying the patterns and functioning of microbial interactions for successful development of microbe-based biotechnologies for real world applications. This could also be a great avenue to test the validity or limitations of ecology theory for managing diverse microbial resources in an era of altering microbial niches, multitrophic interactions, and microbial diversity loss caused by climate and land use changes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Evaluation of the radiographic process using a experimental monobath solution compared with normal (Kodak) and rapid (RAY) developer solutions

    International Nuclear Information System (INIS)

    Baratieri, N.M.M.

    1985-01-01

    A comparative evaluation of the radiographic image quality of two dental X-ray films (Kodak's EP-21 and Agfa-Gevaert DOS-1) when processed in a normal (Kodak), a rapid (Ray), and an experimental monobath solution is presented. For these films, processed in those solutions, the development time, temperature, and agitation performance were checked by sensitometry; pH and color by routine methods; and residual hypo by spectrophotometry. The radiographs were also analysed by experienced professionals with regard to the best development time. The data so obtained allowed the conclusion that the best development time for the monobath was 3 minutes at 20 °C, although 25 or 30 °C also give acceptable results at shorter times. Agitation for 10 seconds every minute is an important factor for image quality. pH and color do alter rapidly, but with little influence on the final result. We found a certain amount of residual chemical compounds which were not identified but are not hypo components; importantly, they seem not to act upon the emulsion for at least one year after processing. (author) [pt

  4. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    International Nuclear Information System (INIS)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M.; Rahmat, K.; Ariffin, H.

    2012-01-01

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)

  5. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); Rahmat, K. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); University Malaya, Biomedical Imaging Department, Kuala Lumpur (Malaysia); Ariffin, H. [University of Malaya, Department of Paediatrics, Faculty of Medicine, Kuala Lumpur (Malaysia)

    2012-07-15

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)

  6. Close relationship processes and health: implications of attachment theory for health and disease.

    Science.gov (United States)

    Pietromonaco, Paula R; Uchino, Bert; Dunkel Schetter, Christine

    2013-05-01

    Health psychology has contributed significantly to understanding the link between psychological factors and health and well-being, but it has not often incorporated advances in relationship science into hypothesis generation and study design. We present one example of a theoretical model, following from a major relationship theory (attachment theory) that integrates relationship constructs and processes with biopsychosocial processes and health outcomes. We briefly describe attachment theory and present a general framework linking it to dyadic relationship processes (relationship behaviors, mediators, and outcomes) and health processes (physiology, affective states, health behavior, and health outcomes). We discuss the utility of the model for research in several health domains (e.g., self-regulation of health behavior, pain, chronic disease) and its implications for interventions and future research. This framework revealed important gaps in knowledge about relationships and health. Future work in this area will benefit from taking into account individual differences in attachment, adopting a more explicit dyadic approach, examining more integrated models that test for mediating processes, and incorporating a broader range of relationship constructs that have implications for health. A theoretical framework for studying health that is based in relationship science can accelerate progress by generating new research directions designed to pinpoint the mechanisms through which close relationships promote or undermine health. Furthermore, this knowledge can be applied to develop more effective interventions to help individuals and their relationship partners with health-related challenges. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  7. The process of accepting breast cancer among Chinese women: A grounded theory study.

    Science.gov (United States)

    Chen, Shuang-Qin; Liu, Jun-E; Li, Zhi; Su, Ya-Li

    2017-06-01

    To describe the process by which Chinese women accept living with breast cancer. Individual interviews were conducted with 18 Chinese women who had completed breast cancer treatment. Data were collected from September 2014 to January 2015 at a large tertiary teaching hospital in Beijing, China. In this grounded theory study, data were analyzed using constant comparative and coding analysis methods. To explain the process by which Chinese women accept having breast cancer, a model comprising 5 axial categories was developed. Cognitive reconstruction emerged as the core category. The extent to which the women accepted having the disease was found to increase as their treatment progressed over time. The accepting process included five stages: non-acceptance, passive acceptance, willingness to accept, behavioral acceptance, and transcendence of acceptance. Our grounded theory study develops a model describing the process by which women accept having breast cancer. The model provides intervention opportunities at every point of the process. Copyright © 2017. Published by Elsevier Ltd.

  8. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    Normalization is a pre-processing stage for any type of problem statement. In particular, normalization plays an important role in fields such as soft computing and cloud computing for manipulating data, e.g., scaling the range of the data down or up before it is used in a further stage. There are many normalization techniques, namely Min-Max normalization, Z-score normalization and Decimal scaling normalization. By referring to these normalization techniques we are ...
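
    The three techniques named above can be sketched in a few lines of Python (a generic illustration, not the authors' code):

```python
import numpy as np

def min_max(x, new_min=0.0, new_max=1.0):
    """Linearly rescale the data into [new_min, new_max]."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min()) * (new_max - new_min) + new_min

def z_score(x):
    """Center on the mean and scale by the standard deviation."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

def decimal_scaling(x):
    """Divide by 10**j with the smallest j putting all values in (-1, 1)."""
    x = np.asarray(x, dtype=float)
    j = int(np.ceil(np.log10(np.abs(x).max() + 1)))
    return x / 10 ** j

data = [120.0, 55.0, 870.0, 300.0]
print(min_max(data))
print(z_score(data))
print(decimal_scaling(data))   # divides by 1000 here
```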

  9. A branching process model for the analysis of abortive colony size distributions in carbon ion-irradiated normal human fibroblasts

    International Nuclear Information System (INIS)

    Sakashita, Tetsuya; Kobayashi, Yasuhiko; Hamada, Nobuyuki; Kawaguchi, Isao; Hara, Takamitsu; Saito, Kimiaki

    2014-01-01

    A single cell can form a colony, and ionizing irradiation has long been known to reduce such a cellular clonogenic potential. Analysis of abortive colonies unable to continue to grow should provide important information on the reproductive cell death (RCD) following irradiation. Our previous analysis with a branching process model showed that the RCD in normal human fibroblasts can persist over 16 generations following irradiation with low linear energy transfer (LET) γ-rays. Here we further set out to evaluate the RCD persistency in abortive colonies arising from normal human fibroblasts exposed to high-LET carbon ions (18.3 MeV/u, 108 keV/μm). We found that the abortive colony size distribution determined by biological experiments follows a linear relationship on the log–log plot, and that the Monte Carlo simulation using the RCD probability estimated from such a linear relationship well simulates the experimentally determined surviving fraction and the relative biological effectiveness (RBE). We identified the short-term phase and long-term phase for the persistent RCD following carbon-ion irradiation, which were similar to those previously identified following γ-irradiation. Taken together, our results suggest that subsequent secondary or tertiary colony formation would be invaluable for understanding the long-lasting RCD. Altogether, our framework for analysis with a branching process model and a colony formation assay is applicable to determination of cellular responses to low- and high-LET radiation, and suggests that the long-lasting RCD is a pivotal determinant of the surviving fraction and the RBE. (author)
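
    A minimal Monte Carlo sketch of a branching process of this kind (the death probability, colony-size threshold, and size measure below are illustrative, not the paper's fitted values):

```python
import numpy as np

rng = np.random.default_rng(1)

def grow_colony(p_death, max_cells=50, max_generations=16):
    """Galton-Watson-style sketch: each generation, every cell either dies
    (reproductive cell death, probability p_death) or divides into two.
    A colony reaching `max_cells` is scored as surviving; otherwise the
    peak cell number is recorded as the abortive colony size."""
    cells, peak = 1, 1
    for _ in range(max_generations):
        if cells == 0 or cells >= max_cells:
            break
        survivors = rng.binomial(cells, 1.0 - p_death)
        cells = 2 * survivors
        peak = max(peak, cells)
    return peak, cells >= max_cells

results = [grow_colony(p_death=0.35) for _ in range(20_000)]
abortive_sizes = np.array([peak for peak, ok in results if not ok])
print("surviving fraction:", np.mean([ok for _, ok in results]))
# histogram on logarithmic bins approximates the log-log size distribution
hist, edges = np.histogram(abortive_sizes, bins=np.logspace(0, np.log10(50), 10))
print(hist)
```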

  10. Recent experiments testing an opponent-process theory of acquired motivation.

    Science.gov (United States)

    Solomon, R L

    1980-01-01

    There are acquired motives of the addiction type which seem to be non-associative in nature. They all seem to involve affective phenomena caused by reinforcers, unconditioned stimuli or innate releasers. When such stimuli are repeatedly presented, at least three affective phenomena occur: (1) affective contrast effects, (2) affective habituation (tolerance), and (3) affective withdrawal syndromes. These phenomena can be precipitated either by pleasant or unpleasant events (positive or negative reinforcers). Whenever we see these three phenomena, we also see the development of an addictive cycle, a new motivational system. These phenomena are explained by an opponent-process theory of motivation which holds that there are affect control systems which oppose large departures from affective equilibrium. The control systems are strengthened by use and weakened by disuse. Current observations and experiments testing the theory are described for: (1) the growth of social attachment (imprinting) in ducklings; and (2) the growth of adjunctive behaviors. The findings so far support the theory.

  11. The Helicobacter pylori theory and duodenal ulcer disease. A case study of the research process

    DEFF Research Database (Denmark)

    Christensen, A H; Gjørup, T

    1995-01-01

    OBJECTIVES: To describe the medical research process from the time of the generation of a new theory to its implementation in clinical practice. The Helicobacter pylori (H. pylori) theory, i.e. the theory that H. pylori plays a significant causal role in duodenal ulcer disease, was chosen as a case. MATERIAL: Abstracts from 1984 to 1993, identified in the CD-Rom Medline system ("Silverplatter") using the search terms Campylobacter pylori and Helicobacter pylori, and reviews and editorials about H. pylori in some of the most widespread clinical journals. RESULTS: 2204 papers on H. pylori were... should be selected for H. pylori eradication treatment. CONCLUSION: Descriptive clinical studies and laboratory studies of disease mechanisms were the prevailing types of research about H. pylori. Comparatively few therapeutic intervention studies were done; this fact may have hampered the acceptance...

  12. Application of adult attachment theory to group member transference and the group therapy process.

    Science.gov (United States)

    Markin, Rayna D; Marmarosh, Cheri

    2010-03-01

    Although clinical researchers have applied attachment theory to client conceptualization and treatment in individual therapy, few researchers have applied this theory to group therapy. The purpose of this article is to begin to apply theory and research on adult dyadic and group attachment styles to our understanding of group dynamics and processes in adult therapy groups. In particular, we set forth theoretical propositions on how group members' attachment styles affect relationships within the group. Specifically, this article offers some predictions on how identifying group member dyadic and group attachment styles could help leaders predict member transference within the therapy group. Implications of group member attachment for the selection and composition of a group and the different group stages are discussed. Recommendations for group clinicians and researchers are offered. PsycINFO Database Record (c) 2010 APA, all rights reserved

  13. Online dating in Japan: a test of social information processing theory.

    Science.gov (United States)

    Farrer, James; Gavin, Jeff

    2009-08-01

    This study examines the experiences of past and present members of a popular Japanese online dating site in order to explore the extent to which Western-based theories of computer-mediated communication (CMC) and the development of online relationships are relevant to the Japanese online dating experience. Specifically, it examines whether social information processing theory (SIPT) is applicable to Japanese online dating interactions, and how and to what extent Japanese daters overcome the limitations of CMC through the use of contextual and other cues. Thirty-six current members and 27 former members of Match.com Japan completed an online survey. Using issue-based procedures for grounded theory analysis, we found strong support for SIPT. Japanese online daters adapt their efforts to present and acquire social information using the cues that the online dating platform provides, although many of these cues are specific to Japanese social context.

  14. Intervention mapping: a process for developing theory- and evidence-based health education programs.

    Science.gov (United States)

    Bartholomew, L K; Parcel, G S; Kok, G

    1998-10-01

    The practice of health education involves three major program-planning activities: needs assessment, program development, and evaluation. Over the past 20 years, significant enhancements have been made to the conceptual base and practice of health education. Models that outline explicit procedures and detailed conceptualization of community assessment and evaluation have been developed. Other advancements include the application of theory to health education and promotion program development and implementation. However, there remains a need for more explicit specification of the processes by which one uses theory and empirical findings to develop interventions. This article presents the origins, purpose, and description of Intervention Mapping, a framework for health education intervention development. Intervention Mapping is composed of five steps: (1) creating a matrix of proximal program objectives, (2) selecting theory-based intervention methods and practical strategies, (3) designing and organizing a program, (4) specifying adoption and implementation plans, and (5) generating program evaluation plans.

  15. Immediate survival focus: synthesizing life history theory and dual process models to explain substance use.

    Science.gov (United States)

    Richardson, George B; Hardesty, Patrick

    2012-01-01

    Researchers have recently applied evolutionary life history theory to the understanding of behaviors often conceived of as prosocial or antisocial. In addition, researchers have applied cognitive science to the understanding of substance use and used dual process models, where explicit cognitive processes are modeled as relatively distinct from implicit cognitive processes, to explain and predict substance use behaviors. In this paper we synthesized these two theoretical perspectives to produce an adaptive and cognitive framework for explaining substance use. We contend that this framework provides new insights into the nature of substance use that may be valuable for both clinicians and researchers.

  16. Immediate Survival Focus: Synthesizing Life History Theory and Dual Process Models to Explain Substance Use

    Directory of Open Access Journals (Sweden)

    George B. Richardson

    2012-10-01

    Full Text Available Researchers have recently applied evolutionary life history theory to the understanding of behaviors often conceived of as prosocial or antisocial. In addition, researchers have applied cognitive science to the understanding of substance use and used dual process models, where explicit cognitive processes are modeled as relatively distinct from implicit cognitive processes, to explain and predict substance use behaviors. In this paper we synthesized these two theoretical perspectives to produce an adaptive and cognitive framework for explaining substance use. We contend that this framework provides new insights into the nature of substance use that may be valuable for both clinicians and researchers.

  17. Wavelet theory and belt finishing process, influence of wavelet shape on the surface roughness parameter values

    International Nuclear Information System (INIS)

    Khawaja, Z; Mazeran, P-E; Bigerelle, M; Guillemot, G; Mansori, M El

    2011-01-01

    This article presents a multi-scale theory based on wavelet decomposition to characterize the evolution of roughness in relation to a finishing process or an observed surface property. To verify this approach under production conditions, analyses were developed for the finishing of hardened steel by abrasive belts. These conditions are described by seven parameters considered in a Taguchi experimental design. The main objective of this work is to identify the most relevant roughness parameter and characteristic length for assessing the influence of the finishing process, and to test the relevance of the measurement scale. Results show that the wavelet approach allows this scale to be found.
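
    A minimal sketch of the idea (assuming the PyWavelets package; the wavelet, decomposition level, and synthetic profile are illustrative):

```python
import numpy as np
import pywt  # PyWavelets, assumed installed

rng = np.random.default_rng(0)
profile = np.cumsum(rng.standard_normal(4096))  # synthetic surface profile

# Decompose the profile and report the RMS roughness carried by each scale;
# swapping "db4" for another mother wavelet changes these per-scale values,
# which is the sensitivity the article investigates.
coeffs = pywt.wavedec(profile, wavelet="db4", level=6)
for level, detail in enumerate(coeffs[1:], start=1):  # coarse -> fine
    print(f"detail level {level}: RMS = {np.sqrt(np.mean(detail ** 2)):.3f}")
```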

  18. Effect of perceptual load on conceptual processing: an extension of Vermeulen's theory.

    Science.gov (United States)

    Xie, Jiushu; Wang, Ruiming; Sun, Xun; Chang, Song

    2013-10-01

    The effect of color and shape load on conceptual processing was studied. Perceptual load effects have been found in visual and auditory conceptual processing, supporting the theory of embodied cognition. However, whether different types of visual concepts, such as color and shape, share the same perceptual load effects is unknown. In the current experiment, 32 participants were administered simultaneous perceptual and conceptual tasks to assess the relation between perceptual load and conceptual processing. Keeping color load in mind obstructed color conceptual processing. Hence, perceptual processing and conceptual load shared the same resources, suggesting embodied cognition. Color conceptual processing was not affected by shape pictures, indicating that different types of properties within vision were separate.

  19. Computing molecular fluctuations in biochemical reaction systems based on a mechanistic, statistical theory of irreversible processes.

    Science.gov (United States)

    Kulasiri, Don

    2011-01-01

    We discuss the quantification of molecular fluctuations in biochemical reaction systems within the context of intracellular processes associated with gene expression. We take the molecular reactions pertaining to circadian rhythms to develop models of molecular fluctuations in this chapter. There are a significant number of studies on stochastic fluctuations in intracellular genetic regulatory networks based on single cell-level experiments. In order to understand the fluctuations associated with gene expression in circadian rhythm networks, it is important to model the interactions of transcription factors with the E-boxes in the promoter regions of some of the genes. The pertinent aspects of a near-equilibrium theory that integrates the thermodynamical and particle-dynamic characteristics of intracellular molecular fluctuations are discussed, and the theory is extended by using the theory of stochastic differential equations. We then model the fluctuations associated with the promoter regions using general mathematical settings. We implemented the ubiquitous Gillespie algorithm, which is used to simulate stochasticity in biochemical networks, for each of the motifs. Both the theory and the Gillespie algorithm gave the same results in terms of the time evolution of the means and variances of molecular numbers. As biochemical reactions occur far away from equilibrium (hence the use of the Gillespie algorithm), these results suggest that the near-equilibrium theory should be a good approximation for some of the biochemical reactions. © 2011 Elsevier Inc. All rights reserved.
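
    As a minimal illustration of the Gillespie algorithm mentioned above (a generic birth-death sketch of mRNA production and degradation, not the chapter's circadian-network model; rates are assumed):

```python
import numpy as np

def gillespie_birth_death(k=10.0, gamma=1.0, t_end=200.0, seed=0):
    """Stochastic simulation of mRNA production (rate k) and degradation
    (rate gamma per molecule) via Gillespie's direct method."""
    rng = np.random.default_rng(seed)
    t, n = 0.0, 0
    times, counts = [0.0], [0]
    while t < t_end:
        a1, a2 = k, gamma * n           # reaction propensities
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)  # waiting time to the next event
        n += 1 if rng.random() < a1 / a0 else -1
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

times, counts = gillespie_birth_death()
dt, n = np.diff(times), counts[:-1]
keep = times[:-1] > 20.0               # discard the initial transient
mean = np.average(n[keep], weights=dt[keep])
var = np.average((n[keep] - mean) ** 2, weights=dt[keep])
print(mean, var)                       # both approach k/gamma = 10 (Poisson)
```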

  20. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline.

    Science.gov (United States)

    Loh, K B; Ramli, N; Tan, L K; Roziah, M; Rahmat, K; Ariffin, H

    2012-07-01

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. Diffusion tensor imaging outperforms conventional MR imaging in depicting white matter maturation. • DTI will become an important clinical tool for diagnosing paediatric neurological diseases. • DTI appears especially helpful for developmental abnormalities, tumours and white matter disease. • An automated processing pipeline assists quantitative analysis of high-throughput DTI data.
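
    For reference, the FA and MD values discussed above follow from the standard diffusion-tensor definitions; a minimal sketch (not part of the study's pipeline, with a hypothetical white-matter-like tensor):

```python
import numpy as np

def fa_md(eigenvalues):
    """Fractional anisotropy and mean diffusivity from the three
    eigenvalues of a diffusion tensor (standard definitions)."""
    lam = np.asarray(eigenvalues, dtype=float)
    md = lam.mean()
    fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
    return fa, md

# Hypothetical mature white-matter tensor, eigenvalues in 10^-3 mm^2/s
print(fa_md([1.7, 0.3, 0.3]))   # FA ~ 0.80, MD ~ 0.77
```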

  1. Building bridges to observational perspectives: a grounded theory of therapy processes in psychosis.

    Science.gov (United States)

    Dilks, Sarah; Tasker, Fiona; Wren, Bernadette

    2008-06-01

    This study set out to explore therapy processes in psychosis with an initial focus on reflexivity and how this might be expressed in therapy conversations. Leiman's (2000) definition of reflexivity was used as a starting-point for an exploratory investigation of the use of language as reflective activity. Grounded theory was chosen as an appropriate methodology to distil an explanatory account across the qualitative data collected. Six psychologist-client pairs supplied three tapes of therapy sessions spread out across the course of therapy. Each participant was separately interviewed on two occasions to ascertain their views of therapy and of the emerging grounded theory. A grounded theory was developed conceptualizing the processes and activities in psychological therapy in psychosis. Building bridges to observational perspectives summarizes the core process in psychological therapy in psychosis. Therapy in psychosis is understood as intimately linking the social and internal world in a dialogical process aimed at enhancing the client's functioning in the social world rather than at specifically developing the private mental experience of reflexivity or mentalizing.

  2. [Who teaches queer: the prospect of queer theory analysis in the health education process].

    Science.gov (United States)

    Motta, Jose Inácio Jardim; Ribeiro, Victória Maria Brant

    2013-06-01

    The scope of this essay is to reflect on the possibilities of including a queer analytical perspective in processes of education in the health field. This is because the development of the Unified Health System, with its new set of health practices, has revealed challenges that include broadening the knowledge base required to revitalize the notion of the subject. Queer theory is needed to understand how identities, in particular gender and sexuality, are incorporated in a social and cultural process, and how, in micro-social spaces, this can determine educational practices with the power to reinforce the status of so-called minority sexualities. Queer theory, framed within the so-called post-critical theories of education, is analyzed through the categories of power, resistance and transgression in the context of standardization and subjectivity. It is assumed that processes of education in health, grounded in queer teaching and working in terms of difference rather than diversity, propose the deconstruction of binaries such as nature/culture, reason/passion and homosexual/heterosexual, working towards shaping more assertive cultural and social subjects.

  3. An integrated model of clinical reasoning: dual-process theory of cognition and metacognition.

    Science.gov (United States)

    Marcum, James A

    2012-10-01

    Clinical reasoning is an important component for providing quality medical care. The aim of the present paper is to develop a model of clinical reasoning that integrates both the non-analytic and analytic processes of cognition, along with metacognition. The dual-process theory of cognition (system 1 non-analytic and system 2 analytic processes) and the metacognition theory are used to develop an integrated model of clinical reasoning. In the proposed model, clinical reasoning begins with system 1 processes in which the clinician assesses a patient's presenting symptoms, as well as other clinical evidence, to arrive at a differential diagnosis. Additional clinical evidence, if necessary, is acquired and analysed utilizing system 2 processes to assess the differential diagnosis, until a clinical decision is made diagnosing the patient's illness and then how best to proceed therapeutically. Importantly, the outcome of these processes feeds back, in terms of metacognition's monitoring function, either to reinforce or to alter cognitive processes, which, in turn, enhances synergistically the clinician's ability to reason quickly and accurately in future consultations. The proposed integrated model has distinct advantages over other models proposed in the literature for explicating clinical reasoning. Moreover, it has important implications for addressing the paradoxical relationship between experience and expertise, as well as for designing a curriculum to teach clinical reasoning skills. © 2012 Blackwell Publishing Ltd.

  4. A framework for analysis of abortive colony size distributions using a model of branching processes in irradiated normal human fibroblasts.

    Science.gov (United States)

    Sakashita, Tetsuya; Hamada, Nobuyuki; Kawaguchi, Isao; Ouchi, Noriyuki B; Hara, Takamitsu; Kobayashi, Yasuhiko; Saito, Kimiaki

    2013-01-01

    Clonogenicity gives important information about the cellular reproductive potential following ionizing irradiation, but abortive colonies that fail to continue growing remain poorly characterized. It was recently reported that the fraction of abortive colonies increases with increasing dose. Thus, we set out to investigate the production kinetics of abortive colonies using a model of branching processes. We first plotted the experimentally determined colony size distribution of abortive colonies in irradiated normal human fibroblasts, and found a linear relationship on log-linear or log-log plots. By applying a simple branching-process model to this linear relationship, we found persistent reproductive cell death (RCD) over several generations following irradiation. To verify the estimated probability of RCD, the abortive colony size distribution (≤ 15 cells) and the surviving fraction were simulated by a Monte Carlo computational approach for colony expansion. Parameters estimated from the log-log fit performed better in both simulations than those from the log-linear fit. Radiation-induced RCD, i.e. excess probability, lasted over 16 generations and mainly consisted of two components in the early (probability over 5 generations, whereas abortive colony size distribution was robust against it. These results suggest that, whereas short-term RCD is critical to the abortive colony size distribution, long-lasting RCD is important for the dose response of the surviving fraction. Our present model provides a single framework for understanding the behavior of primary cell colonies in culture following irradiation.
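
    The colony expansion can be sketched as a Galton-Watson-type branching process in which each proliferative cell either divides or undergoes RCD and remains in the colony as a non-dividing cell; the death probability and generation cap below are illustrative assumptions, not the paper's fitted values (Python):

        import numpy as np
        from collections import Counter

        def colony_size(p_death=0.3, max_gen=16, rng=None):
            """One colony: each proliferative cell per generation either divides
            into two (prob 1 - p_death) or suffers reproductive cell death and
            stays in the colony without dividing (prob p_death).
            Returns (final size, aborted?)."""
            rng = rng or np.random.default_rng()
            prolif, arrested = 1, 0
            for _ in range(max_gen):
                if prolif == 0:
                    return prolif + arrested, True   # abortive colony
                divides = rng.binomial(prolif, 1.0 - p_death)
                arrested += prolif - divides
                prolif = 2 * divides
            return prolif + arrested, False          # still proliferating

        rng = np.random.default_rng(1)
        outcomes = (colony_size(rng=rng) for _ in range(20000))
        sizes = Counter(s for s, aborted in outcomes if aborted)
        print(sorted(sizes.items())[:15])   # abortive colony size distribution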

  5. Progress in the application of classical S-matrix theory to inelastic collision processes

    International Nuclear Information System (INIS)

    McCurdy, C.W.; Miller, W.H.

    1980-01-01

    Methods are described which effectively solve two of the technical difficulties associated with applying classical S-matrix theory to inelastic/reactive scattering. Specifically, it is shown that rather standard numerical methods can be used to solve the ''root search'' problem (i.e., the nonlinear boundary value problem necessary to impose semiclassical quantum conditions at the beginning and the end of the classical trajectories) and also how complex classical trajectories, which are necessary to describe classically forbidden (i.e., tunneling) processes, can be computed in a numerically stable way. Application is made to vibrational relaxation of H₂ by collision with He (within the helicity conserving approximation). The only remaining problem with regard to applying classical S-matrix theory to complex collision processes has to do with the availability of multidimensional uniform asymptotic formulas for interpolating the ''primitive'' semiclassical expressions between their various regions of validity
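
    The ''root search'' is a two-point boundary value problem, and the standard numerical approach alluded to is a shooting method driven by an ordinary root finder. The following is a generic illustration on a toy oscillator with an invented target condition, not the actual semiclassical quantization conditions (Python):

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import brentq

        def final_q(p0, q0=0.0, T=1.0):
            """Integrate a toy unit-mass harmonic oscillator from (q0, p0)
            for time T and return the final coordinate q(T)."""
            sol = solve_ivp(lambda t, y: [y[1], -y[0]], (0.0, T), [q0, p0],
                            rtol=1e-10, atol=1e-12)
            return sol.y[0, -1]

        # Shooting: find the initial momentum whose trajectory satisfies the
        # boundary condition q(T) = 0.5, bracketing the root and refining it.
        q_target = 0.5
        p0_star = brentq(lambda p0: final_q(p0) - q_target, 0.0, 2.0)
        print(p0_star, final_q(p0_star))   # residual vanishes at the root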

  6. Process Reengineering for Quality Improvement in ICU Based on Taylor's Management Theory.

    Science.gov (United States)

    Tao, Ziqi

    2015-06-01

    Using methods including questionnaire-based surveys and control analysis, we analyzed the improvements in ICU rescue efficiency, service quality, and patient satisfaction in Xuzhou Central Hospital after the implementation of fine management, with a view to further introducing the concept of fine management and implementing brand construction. Originating in Taylor's "Theory of Scientific Management" (1982), fine management uses programmed, standardized, digitalized, and informational approaches to ensure each unit of an organization runs with great accuracy, high efficiency, strong coordination, and sustained duration (Wang et al., Fine Management, 2007). The nature of fine management is a process that breaks down strategy and goals and executes them, with strategic planning taking place at every step of the process. Fine management demonstrates that everybody has a role to play in the management process, every area must be examined through the management process, and everything has to be managed (Zhang et al., The Experience of Hospital Nursing Precise Management, 2006). In other words, this kind of management theory demands that all people be involved in the entire process (Liu and Chen, Med Inf, 2007). As public hospital reform becomes more widespread, it is imperative to "build a unified and efficient public hospital management system" and "improve the quality of medical services" (Guidelines on the Pilot Reform of Public Hospitals, 2010). The execution of fine management is important in optimizing the medical process, improving medical services and building a prestigious hospital brand.

  7. Dynamical description of the fission process using the TD-BCS theory

    Energy Technology Data Exchange (ETDEWEB)

    Scamps, Guillaume, E-mail: scamps@nucl.phys.tohoku.ac.jp [Department of Physics, Tohoku University, Sendai 980-8578 (Japan); Simenel, Cédric [Department of Nuclear Physics, Research School of Physics and Engineering Australian National University, Canberra, Australian Capital Territory 2601 (Australia); Lacroix, Denis [Institut de Physique Nucléaire, IN2P3-CNRS, Université Paris-Sud, F-91406 Orsay Cedex (France)

    2015-10-15

    The description of fission remains a challenge for nuclear microscopic theories. The time-dependent Hartree-Fock approach with BCS pairing is applied to study the last stage of the fission process. A good agreement is found for the one-body observables: the total kinetic energy and the average mass asymmetry. The non-physical dependence of two-body observables with the initial shape is discussed.

  8. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory

    OpenAIRE

    Pelaccia, Thierry; Tardif, Jacques; Triby, Emmanuel; Charlin, Bernard

    2011-01-01

    Context: Clinical reasoning plays a major role in the ability of doctors to make diagnoses and decisions. It is considered as the physician’s most critical competence, and has been widely studied by physicians, educationalists, psychologists and sociologists. Since the 1970s, many theories about clinical reasoning in medicine have been put forward. Purpose: This paper aims at exploring a comprehensive approach: the ‘‘dual-process theory’’, a model developed by co...

  9. Decision and intuition during organizational change : an evolutionary critique of dual process theory

    OpenAIRE

    Talat, U; Chang, K; Nguyen, B

    2017-01-01

    Purpose: The purpose of this paper is to review intuition in the context of organizational change. We argue that intuition as a concept requires attention and its formulation is necessary prior to its application in organizations. The paper provides a critique of Dual Process Theory and highlights shortcomings in organization theorizing of intuition. Design/methodology/approach: The paper is conceptual and provides in-depth theoretical discussions by drawing from the literature on decision...

  10. Does Joshua Greene’s Dual Process Theory of Moral Judgment Commit the Naturalistic Fallacy?

    OpenAIRE

    Javier Gracia Calandín

    2017-01-01

    In this article I analyse whether Joshua Greene’s dual process theory of moral judgment commits the naturalistic fallacy. Firstly, and against current authors such as Patricia S. Churchland, I uphold the validity of the naturalistic fallacy denounced by Moore for more than a century. Secondly, I highlight and question Greene’s naturalized way of understanding Deontologism. Thirdly, I assert the distinction between "neural basis" and "moral foundation" as the key to avoid committing the natura...

  11. Influence of the growth process on some laws deduced from percolation theory

    International Nuclear Information System (INIS)

    Hachi, M.; Olivier, G.

    1985-09-01

    A crude application of percolation theory to some physical problems can lead to erroneous interpretations of experimental results. Among these problems, the influence of the growth process on the percolation laws is studied. The behaviour of n_s(t), the number of clusters of size s at time t, is analyzed and linked to a macroscopic property of the system for comparison with experimental laws. (author)

  12. Stochastic processes, optimization, and control theory a volume in honor of Suresh Sethi

    CERN Document Server

    Yan, Houmin

    2006-01-01

    This edited volume contains 16 research articles. It presents recent and pressing issues in stochastic processes, control theory, differential games, optimization, and their applications in finance, manufacturing, queueing networks, and climate control. One of the salient features is that the book is highly multi-disciplinary. The book is dedicated to Professor Suresh Sethi on the occasion of his 60th birthday, in view of his distinguished career.

  13. A test of processing efficiency theory in a team sport context.

    Science.gov (United States)

    Smith, N C; Bellamy, M; Collins, D J; Newell, D

    2001-05-01

    In this study, we tested some key postulates of Eysenck and Calvo's processing efficiency theory in a team sport. The participants were 12 elite male volleyball players who were followed throughout the course of a competitive season. Self-report measures of pre-match and in-game cognitive anxiety and mental effort were collected in groups of players high and low in dispositional anxiety. Player performance was determined from the statistical analysis of match-play. Sets were classified according to the point spread separating the two teams into one of three levels of criticality. Game momentum was also analysed to determine its influence on in-game state anxiety. Significant differences in in-game cognitive anxiety were apparent between high and low trait anxiety groups. An interaction between anxiety grouping and momentum condition was also evident in cognitive anxiety. Differences in set criticality were reflected in significant elevations in mental effort, an effect more pronounced in dispositionally high anxious performers. Consistent with the predictions of processing efficiency theory, mental effort ratings were higher in high trait-anxious players in settings where their performance was equivalent to that of low trait-anxious performers. The usefulness of processing efficiency theory as an explanatory framework in sport anxiety research is discussed in the light of these findings.

  14. Occupational therapy students in the process of interprofessional collaborative learning: a grounded theory study.

    Science.gov (United States)

    Howell, Dana

    2009-01-01

    The purpose of this grounded theory study was to generate a theory of the interprofessional collaborative learning process of occupational therapy (OT) students who were engaged in a collaborative learning experience with students from other allied health disciplines. Data consisted of semi-structured interviews with nine OT students from four different interprofessional collaborative learning experiences at three universities. The emergent theory explained OT students' need to build a culture of mutual respect among disciplines in order to facilitate interprofessional collaborative learning. Occupational therapy students went through a progression of learned skills that included learning how to represent the profession of OT, hold their weight within a team situation, solve problems collaboratively, work as a team, and ultimately, to work in an actual team in practice. This learning process occurred simultaneously as students also learned course content. The students had to contend with barriers and facilitators that influenced their participation and the success of their collaboration. Understanding the interprofessional learning process of OT students will help allied health faculty to design more effective, inclusive interprofessional courses.

  15. Parameter-free effective field theory calculation for the solar proton-fusion and hep processes

    International Nuclear Information System (INIS)

    T.S. Park; L.E. Marcucci; R. Schiavilla; M. Viviani; A. Kievsky; S. Rosati; K. Kubodera; D.P. Min; M. Rho

    2002-01-01

    Spurred by the recent complete determination of the weak currents in two-nucleon systems up to O(Q³) in heavy-baryon chiral perturbation theory, we carry out a parameter-free calculation of the threshold S-factors for the solar pp (proton-fusion) and hep processes in an effective field theory (EFT) that combines the merits of the standard nuclear physics method and systematic chiral expansion. The power of the EFT adopted here is that one can correlate in a unified formalism the weak-current matrix elements of two-, three- and four-nucleon systems. Using the tritium β-decay rate as an input to fix the only unknown parameter in the theory, we can evaluate the threshold S-factors with drastically improved precision; the results are S_pp(0) = 3.94 × (1 ± 0.004) × 10⁻²⁵ MeV·b and S_hep(0) = (8.6 ± 1.3) × 10⁻²⁰ keV·b. The dependence of the calculated S-factors on the momentum cutoff parameter Λ has been examined for a physically reasonable range of Λ. This dependence is found to be extremely small for the pp process, and to be within acceptable levels for the hep process, substantiating the consistency of our calculational scheme

  16. Fundamental Theories and Key Technologies for Smart and Optimal Manufacturing in the Process Industry

    Directory of Open Access Journals (Sweden)

    Feng Qian

    2017-04-01

    Full Text Available Given the significant requirements for transforming and promoting the process industry, we present the major limitations of current petrochemical enterprises, including limitations in decision-making, production operation, efficiency and security, information integration, and so forth. To promote a vision of the process industry with efficient, green, and smart production, modern information technology should be utilized throughout the entire optimization process for production, management, and marketing. To focus on smart equipment in manufacturing processes, as well as on the adaptive intelligent optimization of the manufacturing process, operating mode, and supply chain management, we put forward several key scientific problems in engineering in a demand-driven and application-oriented manner, namely: ① intelligent sensing and integration of all process information, including production and management information; ② collaborative decision-making in the supply chain, industry chain, and value chain, driven by knowledge; ③ cooperative control and optimization of plant-wide production processes via human-cyber-physical interaction; and ④ life-cycle assessments for safety and environmental footprint monitoring, in addition to tracing analysis and risk control. In order to solve these limitations and core scientific problems, we further present fundamental theories and key technologies for smart and optimal manufacturing in the process industry. Although this paper discusses the process industry in China, the conclusions in this paper can be extended to the process industry around the world.

  17. Decision making using AHP (Analytic Hierarchy Process) and fuzzy set theory in waste management

    International Nuclear Information System (INIS)

    Chung, J.Y.; Lee, K.J.; Kim, C.D.

    1995-01-01

    The major problem is how to reconcile differences in opinion when many experts are involved in a decision-making process. This paper provides a simple general methodology to treat such differences. The authors determined the grade of membership through a process of magnitude estimation derived from pairwise comparisons and the AHP developed by Saaty. They used fuzzy set theory to account for the differences in opinions and to obtain the priorities for each alternative. An example, which can be applied to radioactive waste management, is also presented. The result shows good agreement with the results of averaging methods
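
    The eigenvector prioritization step of Saaty's AHP, on which such membership grades are built, can be sketched as follows; the 3x3 comparison matrix is invented for illustration (Python):

        import numpy as np

        # Saaty's tabulated random consistency index for n = 1..9
        RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
              6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

        def ahp_priorities(A):
            """Priority vector = principal right eigenvector of the pairwise
            comparison matrix A, normalized to sum to 1; also returns the
            consistency ratio CR (n >= 3 assumed)."""
            A = np.asarray(A, dtype=float)
            vals, vecs = np.linalg.eig(A)
            k = np.argmax(vals.real)
            w = np.abs(vecs[:, k].real)
            w /= w.sum()
            n = A.shape[0]
            ci = (vals[k].real - n) / (n - 1)   # consistency index
            return w, ci / RI[n]

        # Illustrative pairwise comparison of three waste-management options:
        A = [[1,   3,   5],
             [1/3, 1,   2],
             [1/5, 1/2, 1]]
        w, cr = ahp_priorities(A)
        print(w, cr)   # CR < 0.1 is conventionally considered acceptable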

  18. Theory and Metatheory in the Study of Dual Processing: Reply to Comments.

    Science.gov (United States)

    Evans, Jonathan St B T; Stanovich, Keith E

    2013-05-01

    In this article, we respond to the four comments on our target article. Some of the commentators suggest that we have formulated our proposals in a way that renders our account of dual-process theory untestable and less interesting than the broad theory that has been critiqued in recent literature. Our response is that there is a confusion of levels. Falsifiable predictions occur not at the level of paradigm or metatheory-where this debate is taking place-but rather in the instantiation of such a broad framework in task level models. Our proposal that many dual-processing characteristics are only correlated features does not weaken the testability of task-level dual-processing accounts. We also respond to arguments that types of processing are not qualitatively distinct and discuss specific evidence disputed by the commentators. Finally, we welcome the constructive comments of one commentator who provides strong arguments for the reality of the dual-process distinction. © The Author(s) 2013.

  19. Reflective processes of practitioners in head and neck cancer rehabilitation: a grounded theory study.

    Science.gov (United States)

    Caty, Marie-Ève; Kinsella, Elizabeth Anne; Doyle, Philip C

    2016-12-01

    This study systematically examined how experienced Speech-Language Pathologists (SLPs) use the processes of reflection to develop knowledge relevant for practice in the context of head and neck cancer (HNC) rehabilitation. In-depth, semi-structured interviews were conducted with 12 SLPs working in HNC rehabilitation in North America. Grounded theory methodology was adopted for data collection and analysis. The findings inform a preliminary reflective practice model that depicts the processes of reflection used by practitioners interviewed. Nine categories of reflective processes were identified by participant SLPs in terms of the processes of reflection: ongoing questioning, experimenting through trial and error, integrating knowledge from past cases, embracing surprise, thinking out of the box, being in the moment, consulting with colleagues, putting oneself in the patients' shoes, and discerning ethical issues. These findings provide empirical evidence that supports Schön's theory of reflective practice and contribute to knowledge about the ways in which SLPs use processes of reflection in the context of HNC rehabilitation. The findings of this study have implications for how SLPs perceive and consider their role as knowledge-users and knowledge producers in their day-to-day clinical work, as well as for building capacity for reflective practice.

  20. Theory of high energy collision processes. Final report, June 1, 1969-May 31, 1984

    International Nuclear Information System (INIS)

    Wu, T.T.

    1984-01-01

    We have developed a comprehensive theory for scattering processes at extremely high energies. On the basis of relativistic quantum field theories with or without isotopic spin, we have obtained a simple physical picture, called the impact picture, which gives a number of unusual predictions. Many of these have been verified experimentally, including the increasing total cross sections, the increasing total elastic cross sections, the moving dip, and the rising plateau. An especially accurate experimental verification of the prediction of increasing total cross section has been provided by the CERN p̄p Collider at a c.m. energy of 540 GeV. All of these predictions were obtained by resumming the perturbation series. The natural next step is to look for important physical effects that cannot be seen by any method of resumming the perturbation series. One such method is to find nonperturbative effects already present on the classical level; another is to construct exactly solvable models of quantum field theory. Both approaches have been pursued. Recent theoretical results include the possible occurrence of indeterminate-mass particles, dynamic determination of coupling constants, a solvable Z₂ lattice gauge theory, a generalization of the method of helicity amplitudes, classical models of confinement, and a monopole as a short-distance probe. 152 publications are listed

  1. Gamma-ray multiplicity measurements for the determination of the initial angular momentum ranges in normal and fast fission processes

    International Nuclear Information System (INIS)

    El Masri, Y.; Steckmeyer, J.C.; Martin, V.; Bizard, G.; Brou, R.; Laville, J.L.; Regimbart, R.; Tamain, B.; Peter, J.

    1990-01-01

    Gamma-ray multiplicities (first and second moments) have been measured, in the 220 MeV ²⁰Ne + natRe and 315 MeV ⁴⁰Ar + ¹⁶⁵Ho reactions, as a function of fission fragment mass and centre-of-mass total kinetic energy. The two reactions lead to the same fusion nucleus, ²⁰⁵At, at the same excitation energy (167 MeV). The experimental critical angular momentum for the fission process in the Ne+Re system, (91±3) ℏ, is close to I(Bf=0) (≈ 80 ℏ), while in the Ar+Ho reaction this critical angular momentum, (136±4) ℏ, is much larger than the I(Bf=0) value, favoring the occurrence of the fast fission process. The observed widths of the fission fragment mass distributions, (42±2) u in the Ne+Re system and (56±4) u in the Ar+Ho reaction, strengthen this hypothesis. For both the compound nucleus fission and fast fission components in Ar+Ho, the total spin values obtained, in absolute magnitude and in their dependence on the mass asymmetry, are well described by assuming rigid rotation of the fissioning complex and statistical excitation of some collective rotational modes such as 'bending' and 'wriggling', according to the Schmitt-Pacheco model. These modes, however, are not all fully excited; their degrees of excitation are approximately the same for both fission components. From theoretical estimates of equilibration times, one anticipates the 'tilting' mode to be by far the last to be excited, and from its non-excitation in the present data, together with the excitation of bending and wriggling, a time interval of about 10⁻²¹ s to 2×10⁻²⁰ s can be derived for the reaction time of both normal fission and fast fission. (orig./HSI)

  2. Molecular conformational analysis, vibrational spectra and normal coordinate analysis of trans-1,2-bis(3,5-dimethoxy phenyl)-ethene based on density functional theory calculations.

    Science.gov (United States)

    Joseph, Lynnette; Sajan, D; Chaitanya, K; Isac, Jayakumary

    2014-03-25

    The conformational behavior and structural stability of trans-1,2-bis(3,5-dimethoxy phenyl)-ethene (TDBE) were investigated by using density functional theory (DFT) method with the B3LYP/6-311++G(d,p) basis set combination. The vibrational wavenumbers of TDBE were computed at DFT level and complete vibrational assignments were made on the basis of normal coordinate analysis calculations (NCA). The DFT force field transformed to natural internal coordinates was corrected by a well-established set of scale factors that were found to be transferable to the title compound. The infrared and Raman spectra were also predicted from the calculated intensities. The observed Fourier transform infrared (FTIR) and Fourier transform (FT) Raman vibrational wavenumbers were analyzed and compared with the theoretically predicted vibrational spectra. Comparison of the simulated spectra with the experimental spectra provides important information about the ability of the computational method to describe the vibrational modes. Information about the size, shape, charge density distribution and site of chemical reactivity of the molecules has been obtained by mapping electron density isosurface with electrostatic potential surfaces (ESP). Copyright © 2013 Elsevier B.V. All rights reserved.
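
    The numerical core of a normal coordinate analysis is diagonalization of the mass-weighted Hessian; a toy one-dimensional diatomic sketch follows (the force constant and masses are illustrative, not the B3LYP results for TDBE) in Python:

        import numpy as np

        def harmonic_wavenumbers(hessian, masses_amu):
            """Harmonic wavenumbers (cm^-1) from a Hessian in atomic units
            (Hartree/Bohr^2), one coordinate per atom in this 1-D toy.
            (For a real 3-D molecule each mass appears three times.)"""
            m = np.asarray(masses_amu) * 1822.888486   # amu -> electron masses
            mw = hessian / np.sqrt(np.outer(m, m))     # mass-weighted Hessian
            w2 = np.linalg.eigvalsh(mw)
            w2 = w2[w2 > 1e-10]                        # drop the translation mode
            return np.sqrt(w2) * 219474.63             # a.u. -> cm^-1

        # Two hydrogen-like atoms coupled by a single force constant:
        k = 0.37  # Hartree/Bohr^2, roughly an H2-like stretch
        H = np.array([[k, -k], [-k, k]])
        print(harmonic_wavenumbers(H, [1.008, 1.008]))  # ~4400 cm^-1 stretch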

  3. Scalar utility theory and proportional processing: What does it actually imply?

    Science.gov (United States)

    Rosenström, Tom; Wiesner, Karoline; Houston, Alasdair I

    2016-09-07

    Scalar Utility Theory (SUT) is a model used to predict animal and human choice behaviour in the context of reward amount, delay to reward, and variability in these quantities (risk preferences). This article reviews and extends SUT, deriving novel predictions. We show that, contrary to what has been implied in the literature, (1) SUT can predict both risk averse and risk prone behaviour for both reward amounts and delays to reward depending on experimental parameters, (2) SUT implies violations of several concepts of rational behaviour (e.g. it violates strong stochastic transitivity and its equivalents, and leads to probability matching) and (3) SUT can predict, but does not always predict, a linear relationship between risk sensitivity in choices and coefficient of variation in the decision-making experiment. SUT derives from Scalar Expectancy Theory which models uncertainty in behavioural timing using a normal distribution. We show that the above conclusions also hold for other distributions, such as the inverse Gaussian distribution derived from drift-diffusion models. A straightforward way to test the key assumptions of SUT is suggested and possible extensions, future prospects and mechanistic underpinnings are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
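
    Under the scalar property, remembered amounts are normal with standard deviation proportional to the mean, so discriminability depends on the ratio of the means rather than their difference; a small sketch of the implied choice probability (the Weber fraction 0.2 is arbitrary) in Python:

        from math import sqrt
        from statistics import NormalDist

        def p_choose_larger(mu_a, mu_b, gamma=0.2):
            """P(choose A over B) when memories of amounts A and B are
            independent normals with SD = gamma * mean (scalar property)."""
            diff_sd = sqrt((gamma * mu_a) ** 2 + (gamma * mu_b) ** 2)
            return NormalDist(0.0, 1.0).cdf((mu_a - mu_b) / diff_sd)

        # Discriminability depends on the ratio of means, not their difference:
        print(p_choose_larger(2, 1), p_choose_larger(20, 10))   # identical
        print(p_choose_larger(11, 10))                          # near chance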

  4. Habituation of the orienting reflex and the development of Preliminary Process Theory.

    Science.gov (United States)

    Barry, Robert J

    2009-09-01

    The orienting reflex (OR), elicited by an innocuous stimulus, can be regarded as a model of the organism's interaction with its environment, and has been described as the unit of attentional processing. A major determinant of the OR is the novelty of the eliciting stimulus, generally operationalized in terms of its reduction with stimulus repetition, the effects of which are commonly described in habituation terms. This paper provides an overview of a research programme, spanning more than 30 years, investigating psychophysiological aspects of the OR in humans. The major complication in this research is that the numerous physiological measures used as dependent variables in the OR context fail to jointly covary with stimulus parameters. This has led to the development of the Preliminary Process Theory (PPT) of the OR to accommodate the complexity of the observed stimulus-response patterns. PPT is largely grounded in autonomic measures, and current work is attempting to integrate electroencephalographic measures, particularly components in the event-related brain potentials reflecting aspects of stimulus processing. The emphasis in the current presentation is on the use of the defining criteria of the habituation phenomenon, and Groves and Thompson's Dual-process Theory, in the development of PPT.

  5. Dual-process theory and consumer response to front-of-package nutrition label formats.

    Science.gov (United States)

    Sanjari, S Setareh; Jahn, Steffen; Boztug, Yasemin

    2017-11-01

    Nutrition labeling literature yields fragmented results about the effect of front-of-package (FOP) nutrition label formats on healthy food choice. Specifically, it is unclear which type of nutrition label format is effective across different shopping situations. To address this gap, the present review investigates the available nutrition labeling literature through the prism of dual-process theory, which posits that decisions are made either quickly and automatically (system 1) or slowly and deliberately (system 2). A systematically performed review of nutrition labeling literature returned 59 papers that provide findings that can be explained according to dual-process theory. The findings of these studies suggest that the effectiveness of nutrition label formats is influenced by the consumer's dominant processing system, which is a function of specific contexts and personal variables (eg, motivation, nutrition knowledge, time pressure, and depletion). Examination of reported findings through a situational processing perspective reveals that consumers might prefer different FOP nutrition label formats in different situations and can exhibit varying responses to the same label format across situations. This review offers several suggestions for policy makers and researchers to help improve current FOP nutrition label formats. © The Author(s) 2017. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  6. Calculating the Price for Derivative Financial Assets of Bessel Processes Using the Sturm-Liouville Theory

    Directory of Open Access Journals (Sweden)

    Burtnyak Ivan V.

    2017-06-01

    Full Text Available In the paper we apply spectral theory to find the price of derivatives of financial assets, assuming that the processes described are Markov processes that can be treated in the Hilbert space L² using Sturm-Liouville theory. Bessel diffusion processes are used in studying Asian options. We consider the financial flows generated by Bessel diffusions by expressing them in terms of the system of Bessel functions of the first kind, provided that they take into account the linear combination of the flow and its spatial derivative. Such an expression enables calculating the size of the market portfolio, provides a measure of the amount of internal volatility in the market at any given moment, and allows investigating the dynamics of the equity market. The expansion of the Green function in terms of the system of Bessel functions is expressed by an analytic formula that is convenient for calculating the volume of financial flows. All assumptions are natural, result in analytic formulas that are consistent with the empirical data and, when applied in practice, adequately reflect the processes in equity markets.
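
    The kind of expansion involved can be illustrated with a textbook Sturm-Liouville analogue: the Green function of the radial Bessel operator on (0, 1), expanded in eigenfunctions built from first-kind Bessel functions. This is a generic example, not the authors' pricing formula (Python):

        import numpy as np
        from scipy.special import jv, jn_zeros

        def green_bessel(r, r0, n_terms=200):
            """Green function of -(1/r) d/dr (r d/dr) on (0, 1) with a
            Dirichlet condition at r = 1, as the eigenfunction series
            G(r, r0) = sum_k phi_k(r) phi_k(r0) / j_k^2, where j_k is the
            k-th zero of J0 and phi_k(r) = sqrt(2) J0(j_k r) / |J1(j_k)|."""
            jk = jn_zeros(0, n_terms)
            phi_r = np.sqrt(2.0) * jv(0, jk * r) / np.abs(jv(1, jk))
            phi_r0 = np.sqrt(2.0) * jv(0, jk * r0) / np.abs(jv(1, jk))
            return np.sum(phi_r * phi_r0 / jk**2)

        # This operator has the closed form G(r, r0) = -ln(max(r, r0));
        # the truncated series agrees to a few decimal places:
        print(green_bessel(0.3, 0.6), -np.log(0.6))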

  7. Seeking Comfort: Women Mental Health Process in I. R. Iran: A Grounded Theory Study

    Science.gov (United States)

    Mohammadi, Farahnaz; Eftekhari, Monir Baradaran; Dejman, Masoumeh; Forouzan, Ameneh Setareh; Mirabzadeh, Arash

    2014-01-01

    Background: Psychosocial factors are considered intermediate social determinants of health because they have powerful effects on health, especially in women. Hence, a deeper understanding of the mental-health process is needed for its promotion. The aim of this study was to explore women's experience of mental-health problems and the related action-interaction activities in order to design appropriate interventions. Methods: In-depth interviews with women aged 18-65 years were analyzed according to the grounded theory method. The selection of participants was based on purposeful and theoretical sampling. Results: In this study, a substantive theory was generated explaining how women with mental-health problems handled their main concern, which was identified as their effort to achieve comfort (core variable). The other six categories are elements in this process: daily stress is the trigger, satisfaction is the end point, marriage is the key point, and the action-interaction activities in this process are strengthening human essence, developing life skills and help seeking. Conclusions: Better understanding of the mental-health process might be useful in designing interventional programs for women with mental-health problems. PMID:24627750

  8. The process of adopting and incorporating simulation into undergraduate nursing curricula: a grounded theory study.

    Science.gov (United States)

    Taplay, Karyn; Jack, Susan M; Baxter, Pamela; Eva, Kevin; Martin, Lynn

    2015-01-01

    The aim of this study is to explain the process of adopting and incorporating simulation as a teaching strategy in undergraduate nursing programs, define uptake, and discuss potential outcomes. In many countries, simulation is increasingly adopted as a common teaching strategy; however, there is a dearth of knowledge related to the process of adoption and incorporation. We used an interpretive, constructivist approach to grounded theory to guide this research study. We conducted the study in Ontario, Canada, during 2011-2012. Multiple data sources informed the development of this theory, including in-depth interviews (n = 43) and a review of key organizational documents, such as mission and vision statements (n = 67), from multiple nursing programs (n = 13). The adoption and uptake of mid- to high-fidelity simulation equipment is a multistep iterative process involving various organizational levels within the institution that entails a seven-phase process: (a) securing resources, (b) nursing leaders working in tandem, (c) getting it out of the box, (d) learning about simulation and its potential for teaching, (e) finding a fit, (f) trialing the equipment, and (g) integrating into the curriculum. These findings could assist nursing programs in Canada and internationally that wish to adopt or further incorporate simulation into their curricula, and they highlight potential organizational and program level outcomes. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  9. High-energy, large-momentum-transfer processes: Ladder diagrams in φ³ theory

    International Nuclear Information System (INIS)

    Newton, C.L.J.

    1990-01-01

    Relativistic quantum field theories may help one to understand high-energy, large-momentum-transfer processes, where the center-of-mass energy is much larger than the transverse momentum transfers, which are in turn much larger than the masses of the participating particles. With this possibility in mind, the author studies ladder diagrams in φ³ theory. He shows that in the limit s ≫ |t| ≫ m², the scattering amplitude for the N-rung ladder diagram takes the form s⁻¹|t|⁻ᴺ⁺¹ times a homogeneous polynomial of degree 2N - 2 in ln s and ln |t|. This polynomial takes different forms depending on the relation of ln |t| to ln s. More precisely, the asymptotic formula for the N-rung ladder diagram has points of non-analyticity when ln |t| = γ ln s for γ = 1/2, 1/3, …, 1/(N-2)

  10. Dual processing theory and experts' reasoning: exploring thinking on national multiple-choice questions.

    Science.gov (United States)

    Durning, Steven J; Dong, Ting; Artino, Anthony R; van der Vleuten, Cees; Holmboe, Eric; Schuwirth, Lambert

    2015-08-01

    An ongoing debate exists in the medical education literature regarding the potential benefits of pattern recognition (non-analytic reasoning), actively comparing and contrasting diagnostic options (analytic reasoning), or a combined approach. Studies have not, however, explicitly explored faculty's thought processes while tackling clinical problems through the lens of dual process theory to inform this debate. Further, these thought processes have not been studied in relation to the difficulty of the task or other potential mediating influences such as personal factors and fatigue, which could themselves be influenced by personal factors such as sleep deprivation. We therefore sought to determine which reasoning process(es) faculty used when answering clinically oriented multiple-choice questions (MCQs) and whether these processes differed based on the dual process theory characteristics accuracy, reading time and answering time, as well as psychometrically determined item difficulty and sleep deprivation. We performed a think-aloud procedure to explore faculty's thought processes while they took these MCQs, coding think-aloud data based on reasoning process (analytic, non-analytic, guessing or a combination of processes) as well as word count, number of stated concepts, reading time, answering time, and accuracy. We also included questions regarding the amount of work in the recent past. We then conducted statistical analyses to examine the associations between these measures, such as correlations between the frequencies of reasoning processes and item accuracy and difficulty. We also observed the total frequencies of the different reasoning processes in situations where answers were correct and incorrect. Regardless of whether the questions were classified as 'hard' or 'easy', non-analytical reasoning led to the correct answer more often than to an incorrect answer. Significant correlations were found between self-reported recent number of hours worked with think-aloud word count

  11. Using the theory of planned behavior to determine factors influencing processed foods consumption behavior

    Science.gov (United States)

    Kim, Og Yeon; Shim, Soonmi

    2014-01-01

    BACKGROUND/OBJECTIVES The purpose of this study is to identify how the level of information affected intention, using the Theory of Planned Behavior. SUBJECTS/METHODS The study was conducted as a survey in diverse community centers and shopping malls in Seoul, which yielded N = 209 datasets. To compare processed food consumption behavior, we divided the sample into two groups based on the level of information about food additives (whether respondents felt that information on food additives was sufficient or not). We analyzed differences in attitudes toward food additives and toward purchasing processed foods, subjective norms, perceived behavioral control, and behavioral intentions toward processed foods between the sufficient-information group and the insufficient-information group. RESULTS The results confirmed that more than 78% of respondents thought information on food additives was insufficient. However, the group who felt information was sufficient had more positive attitudes about consuming processed foods and stronger behavioral intentions than the group who thought information was inadequate. This study found that people who consider that they have sufficient information on food additives tend to have more positive attitudes toward processed foods and stronger intentions to consume processed foods. CONCLUSIONS This study suggests an increasing need for nutrition education on the appropriate use of processed foods. Designing useful nutrition education requires a good understanding of the factors which influence processed food consumption. PMID:24944779

  12. Using the theory of planned behavior to determine factors influencing processed foods consumption behavior.

    Science.gov (United States)

    Seo, Sunhee; Kim, Og Yeon; Shim, Soonmi

    2014-06-01

    The purpose of this study is to identify how the level of information affected intention, using the Theory of Planned Behavior. The study was conducted as a survey in diverse community centers and shopping malls in Seoul, which yielded N = 209 datasets. To compare processed food consumption behavior, we divided the sample into two groups based on the level of information about food additives (whether respondents felt that information on food additives was sufficient or not). We analyzed differences in attitudes toward food additives and toward purchasing processed foods, subjective norms, perceived behavioral control, and behavioral intentions toward processed foods between the sufficient-information group and the insufficient-information group. The results confirmed that more than 78% of respondents thought information on food additives was insufficient. However, the group who felt information was sufficient had more positive attitudes about consuming processed foods and stronger behavioral intentions than the group who thought information was inadequate. This study found that people who consider that they have sufficient information on food additives tend to have more positive attitudes toward processed foods and stronger intentions to consume processed foods. This study suggests an increasing need for nutrition education on the appropriate use of processed foods. Designing useful nutrition education requires a good understanding of the factors which influence processed food consumption.

  13. Modern tendencies and problems of the theory of spiritual-moral processes management in higher school

    Directory of Open Access Journals (Sweden)

    Iryna Sidanich

    2016-03-01

    Full Text Available The article analyzes modern tendencies and problems of the theory of managing spiritual-moral processes in higher school. The key tasks of higher-education reform are defined: ensuring its quality and building an effective educational system of higher-school institutions with an effective economy and management. The problem of ensuring the axiological direction of the spiritual-humanitarian component of the educational process in the system of higher education is characterized. Priorities of national interests in the spiritual-moral education of the younger generation in state educational activity are defined: national self-consciousness, the spiritual-cultural unity of the nation, patriotism, humanism, tolerance, and responsibility. The system of higher education is analyzed in terms of the interaction of spiritual and secular components in the coordinates of the moral recovery and spiritual enlightenment of the nation, the elaboration of democratic principles of society, and the construction of a modern theory of managing spiritual-moral processes in higher school. New directions of this theory are defined with respect to the development of innovations and commercialization; the involvement of employers in collaboration with scientists in dedicated working groups to create new educational programs and modernize existing ones; mentor support and training of students for job placement and the development of entrepreneurial skills; and support of programs of probation or practical participation of students in "real social projects". Prospects for further research are characterized with respect to elaborating the main functions that must establish the principal requirements for production tasks in the professional activity of holders of a master's degree in the speciality "Christian pedagogics in the high education".

  14. Inhibitory processes and cognitive flexibility: evidence for the theory of attentional inertia

    Directory of Open Access Journals (Sweden)

    Isabel Introzzi

    2015-07-01

    Full Text Available The aim of this study was to discriminate the differential contribution of different inhibitory processes (perceptual, cognitive and behavioral inhibition) to the switching cost effect associated with alternating cognitive tasks. A correlational design was used. Several experimental paradigms (e.g., stop-signal, visual search, Sternberg's experimental paradigm and the Simon paradigm) were adapted and included in a computerized program called TAC (Introzzi & Canet Juric, 2014) for the assessment of the different cognitive processes. The final sample consisted of 45 adults (18-50 years). Perceptual and behavioral inhibition show moderate and low correlations with attentional cost, cognitive inhibition shows no relation with flexibility, and only perceptual inhibition predicts switching cost effects, suggesting that different inhibitory processes contribute differentially to the switch cost. This could be interpreted as evidence for the main argument of Attentional Inertia Theory, which postulates that inhibition plays an essential role in the ability to flexibly switch between tasks and/or representations.

  15. Understanding how replication processes can maintain systems away from equilibrium using Algorithmic Information Theory.

    Science.gov (United States)

    Devine, Sean D

    2016-02-01

    Replication can be envisaged as a computational process that is able to generate and maintain order far from equilibrium. Replication processes can self-regulate, as the drive to replicate can counter degradation processes that impact on a system. The capability of replicated structures to access high quality energy and eject disorder allows Landauer's principle, in conjunction with Algorithmic Information Theory, to quantify the entropy requirements of maintaining a system far from equilibrium. Using Landauer's principle, where destabilising processes, operating under the second law of thermodynamics, change the information content or the algorithmic entropy of a system by ΔH bits, replication processes can access order, eject disorder, and counter the change without outside intervention. Both diversity in replicated structures and the coupling of different replicated systems increase the ability of the system (or systems) to self-regulate in a changing environment, as adaptation processes select those structures that use resources more efficiently. At the level of the structure, as selection processes minimise the information loss, the irreversibility is minimised. While each structure that emerges can be said to be more entropically efficient, as such replicating structures proliferate, the dissipation of the system as a whole is higher than would be the case for inert or simpler structures. While a detailed application to most real systems would be difficult, the approach may well be useful in understanding incremental changes to real systems and may provide broad descriptions of system behaviour. Copyright © 2016 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
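
    Landauer's principle turns the ΔH bits of disorder a replicator must eject into a minimum free-energy cost; a two-line check, with the temperature chosen arbitrarily (Python):

        from math import log

        KB = 1.380649e-23  # Boltzmann constant, J/K

        def landauer_cost(delta_h_bits, T=300.0):
            """Minimum work (J) to erase delta_h_bits of algorithmic entropy
            at temperature T: W >= delta_h_bits * kB * T * ln 2."""
            return delta_h_bits * KB * T * log(2)

        print(landauer_cost(1.0))    # ~2.9e-21 J per bit at 300 K
        print(landauer_cost(1e6))    # cost of ejecting a megabit of disorder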

  16. Rejoinder to commentary on Palmatier and Rovner (2015): credibility assessment: Preliminary Process Theory, the polygraph process, and construct validity.

    Science.gov (United States)

    Palmatier, John J; Rovner, Louis

    2015-01-01

    We briefly review comments submitted in response to the target article in this series (Palmatier & Rovner, 2015) arguing that a scientifically defensible construct for the instrumental assessment of credibility (i.e. polygraph) may be found in Barry's Preliminary Process Theory (PPT). Our review of the relevant scientific literature discovered a growing body of converging evidence, particularly from the neurosciences that focus not only on deception, but more broadly on memory, emotion, and the orienting response (OR), leading to this conclusion. After reviewing the submitted comments, we are further convinced, especially as applied scientists, that at this time the most viable direction forward is in the context of the PPT. Concurrently, we candidly acknowledge that research must be conducted to address not only commentator concerns but, if warranted, modification of existing theory. Although disagreement continues to exist regarding the order in which questions are asked, the most significant finding, is perhaps that not a single commentator argues against this growing, and vital applied science (i.e., the instrumental assessment of credibility - polygraph). Copyright © 2014 Elsevier B.V. All rights reserved.

  17. A multilayer electro-thermal model of pouch battery during normal discharge and internal short circuit process

    International Nuclear Information System (INIS)

    Chen, Mingbiao; Bai, Fanfei; Song, Wenji; Lv, Jie; Lin, Shili

    2017-01-01

    Highlights: • A 2D network equivalent circuit considers the interplay of cell units. • The temperature non-uniformity Φ of the multilayer model is bigger than that of the lumped model. • The temperature non-uniformity is quantified and the reason for the non-uniformity is analyzed. • Increasing the thermal conductivity of the separator can effectively relieve the hot-spot effect of an ISC. - Abstract: As the electrical and thermal characteristics affect a battery's safety, performance, calendar life and capacity fading, an electro-thermal coupled model of a LiFePO₄/C pouch battery is developed for normal discharge and internal short circuit (ISC) processes. The battery is discretized into many cell elements which are united as a 2D network equivalent circuit. The electro-thermal model is solved with the finite difference method. The non-uniformity of the current and temperature distributions is simulated, and the result is validated with experimental data at various discharge rates. Comparison of the lumped model and the multilayer structure model shows that the temperature non-uniformity Φ of the multilayer model is bigger than that of the lumped model, and the multilayer model is more precise. The temperature non-uniformity is quantified and the reason for the non-uniformity is analyzed. The electro-thermal model can also be used to guide the safety design of batteries. The temperature of the ISC element near the tabs is the highest, because the equivalent resistance of the external circuit (not including the ISC element) is the smallest when the resistance of the cell units is small. It is found that increasing the thermal conductivity of the integrated layer can effectively relieve the hot-spot effect of an ISC.
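
    The thermal half of such a model can be sketched with an explicit finite-difference scheme: one-dimensional conduction through the stack with a uniform Joule source and convective surfaces. All geometry, property and source values below are invented, and the paper's model is a full 2D electro-thermal network (Python):

        import numpy as np

        def heat_1d(n=50, L=0.01, k=1.0, rho_cp=2.0e6, q_vol=5.0e5,
                    t_end=10.0, T_amb=298.0, h=30.0):
            """Explicit FDM for rho*cp*dT/dt = k*d2T/dx2 + q_vol on a slab
            of thickness L (m); convective surfaces with coefficient h."""
            dx = L / (n - 1)
            alpha = k / rho_cp
            dt = 0.4 * dx**2 / alpha            # below the explicit stability limit
            T = np.full(n, T_amb)
            for _ in range(int(t_end / dt)):
                lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
                T[1:-1] += dt * (alpha * lap + q_vol / rho_cp)
                # Robin boundaries: conduction balances convection to ambient
                T[0] = (k / dx * T[1] + h * T_amb) / (k / dx + h)
                T[-1] = (k / dx * T[-2] + h * T_amb) / (k / dx + h)
            return T

        T = heat_1d()
        print(T.max() - T.min())   # mid-plane hot spot versus the surfaces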

  18. Consensual decision-making model based on game theory for LNG processes

    International Nuclear Information System (INIS)

    Castillo, Luis; Dorao, Carlos A.

    2012-01-01

    Highlights: ► A Decision Making (DM) approach for LNG projects based on game theory is presented. ► The DM framework was tested with two different cases, using analytical models and a simple LNG process. ► The problems were solved by using a Genetic Algorithm (GA) with binary coding and a Nash-GA. ► Integrating models from the design and optimization of the process could give a more realistic outcome. ► The major challenge in such a framework is related to the uncertainties in the market models. - Abstract: Decision-making (DM) in LNG projects is a quite complex process due to the number of actors, approval phases, large investments and capital return over the long term. Furthermore, due to the very high investment of an LNG project, a detailed and efficient DM process is required in order to minimize risks. In this work a DM approach for LNG projects is presented. The approach is based on a consensus algorithm that reaches a consensus output over a common value using cost functions, within a framework based on game theory. The DM framework was tested with two different cases. The first case was used for evaluating the performance of the framework with analytical models, while the second case corresponds to a simple LNG process. The problems were solved by using a Genetic Algorithm (GA) with binary coding and a Nash-GA. The results of the DM framework in the LNG project indicate that a more realistic outcome could be obtained by considering an integrated DM model and including the market's role from the design and optimization of the process onwards. However, the major challenge in such a framework is related to the uncertainties in the market models.
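
    The consensus mechanism can be sketched with two players whose quadratic cost functions penalize both deviation from a private target and mutual disagreement; iterated best responses converge to the Nash point, and a larger disagreement weight w drives the outputs toward a common value. The cost functions are invented for illustration (Python):

        def best_response_consensus(target_a=3.0, target_b=7.0, w=2.0,
                                    tol=1e-9, max_iter=1000):
            """Each player i minimizes
            J_i(x_i) = (x_i - target_i)^2 + w * (x_i - x_j)^2,
            trading private preference against disagreement. Setting the
            derivative to zero gives the best response
            x_i = (target_i + w * x_j) / (1 + w); iterate to the Nash point."""
            x_a, x_b = target_a, target_b
            for _ in range(max_iter):
                x_a_new = (target_a + w * x_b) / (1 + w)
                x_b_new = (target_b + w * x_a) / (1 + w)
                if abs(x_a_new - x_a) + abs(x_b_new - x_b) < tol:
                    break
                x_a, x_b = x_a_new, x_b_new
            return x_a, x_b

        # Converges to (4.6, 5.4) here; increasing w tightens the consensus.
        print(best_response_consensus())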

  19. Normal accidents

    International Nuclear Information System (INIS)

    Perrow, C.

    1989-01-01

    The author has chosen numerous concrete examples to illustrate the hazardousness inherent in high-risk technologies. Starting with the TMI reactor accident in 1979, he shows that it is not only the nuclear energy sector that bears the risk of 'normal accidents', but also quite a number of other technologies and industrial sectors, or research fields. The author refers to the petrochemical industry, shipping, air traffic, large dams, mining activities, and genetic engineering, showing that due to the complexity of the systems and their manifold, rapidly interacting processes, accidents happen that cannot be thoroughly calculated, and hence are unavoidable. (orig./HP) [de

  20. Describing long-range charge-separation processes with subsystem density-functional theory

    Energy Technology Data Exchange (ETDEWEB)

    Solovyeva, Alisa; Neugebauer, Johannes, E-mail: j.neugebauer@uni-muenster.de [Theoretische Organische Chemie, Organisch-Chemisches Institut and Center for Multiscale Theory and Simulation, Westfälische Wilhelms-Universität Münster, Corrensstraße 40, 48149 Münster (Germany); Pavanello, Michele, E-mail: m.pavanello@rutgers.edu [Department of Chemistry, Rutgers University, 73 Warren St., Newark, New Jersey 07102 (United States)

    2014-04-28

    Long-range charge-transfer processes in extended systems are difficult to describe with quantum chemical methods. In particular, cost-effective (non-hybrid) approximations within time-dependent density functional theory (DFT) are not applicable unless special precautions are taken. Here, we show that the efficient subsystem DFT can be employed as a constrained DFT variant to describe the energetics of long-range charge-separation processes. A formal analysis of the energy components in subsystem DFT for such excitation energies is presented, which demonstrates that both the distance dependence and the long-range limit are correctly described. In addition, electronic couplings for these processes as needed for rate constants in Marcus theory can be obtained from this method. It is shown that the electronic structure of charge-separated states constructed by a positively charged subsystem interacting with a negatively charged one is difficult to converge — charge leaking from the negative subsystem to the positive one can occur. This problem is related to the delocalization error in DFT and can be overcome with asymptotically correct exchange–correlation (XC) potentials or XC potentials including a sufficiently large amount of exact exchange. We also outline an approximate way to obtain charge-transfer couplings between locally excited and charge-separated states.
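
    For context, the rate expression into which such electronic couplings enter is the standard nonadiabatic Marcus formula (textbook form, not a result quoted from this paper), with reorganization energy λ, driving force ΔG°, and coupling H_AB:

```latex
% Standard nonadiabatic Marcus rate for electron transfer; H_AB is the
% electronic coupling that the subsystem-DFT scheme provides.
k_{\mathrm{ET}} = \frac{2\pi}{\hbar}\,|H_{AB}|^{2}\,
\frac{1}{\sqrt{4\pi\lambda k_{\mathrm B}T}}\,
\exp\!\left[-\frac{(\Delta G^{\circ}+\lambda)^{2}}{4\lambda k_{\mathrm B}T}\right]
```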

  1. Examining the influence of psychopathy, hostility biases, and automatic processing on criminal offenders' Theory of Mind.

    Science.gov (United States)

    Nentjes, Lieke; Bernstein, David; Arntz, Arnoud; van Breukelen, Gerard; Slaats, Mariëtte

    2015-01-01

    Theory of Mind (ToM) is a social perceptual skill that refers to the ability to take someone else's perspective and infer what others think. The current study examined the effect of potential hostility biases, as well as controlled (slow) versus automatic (fast) processing, on ToM performance in psychopathy. ToM ability (as assessed with the Reading the Mind in the Eyes Test; RMET; Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001) was compared between 39 PCL-R-diagnosed psychopathic offenders, 37 non-psychopathic offenders, and 26 nonoffender controls. Contrary to our hypothesis, psychopathic individuals presented with intact overall RMET performance when restrictions were imposed on how long task stimuli could be processed. In addition, psychopaths did not over-ascribe hostility to task stimuli (i.e., they showed no hostility bias). However, there was a significant three-way interaction between hostility, processing speed, and psychopathy: when there was no time limit on stimulus presentation, psychopathic offenders made fewer errors in identifying the more hostile eye stimuli than nonoffender controls, who seemed to be less accurate in detecting hostility. Psychopaths' more realistic appraisal of others' malevolent mental states is discussed in the light of theories that stress its potential adaptive function. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Energetic arousal and language: predictions from the computational theory of quantifiers processing.

    Science.gov (United States)

    Zajenkowski, Marcin

    2013-10-01

    The author examines the relationship between energetic arousal (EA) and the processing of sentences containing natural-language quantifiers. Previous studies and theories have shown that energy may differentially affect various cognitive functions. Recent investigations devoted to quantifiers strongly support the theory that various types of quantifiers involve different cognitive functions in the sentence-picture verification task. In the present study, 201 students were presented with a sentence-picture verification task consisting of simple propositions containing a quantifier that referred to the color of a car on display. Color pictures of cars accompanied the propositions. In addition, the level of participants' EA was measured before and after the verification task. It was found that EA and performance on proportional quantifiers (e.g., "More than half of the cars are red") are in an inverted U-shaped relationship. This result may be explained by the fact that proportional sentences engage working memory to a high degree, and previous models of EA-cognition associations have been based on the assumption that tasks that require parallel attentional and memory processes are best performed when energy is moderate. The research described in the present article has several applications, as it shows the optimal human conditions for verbal comprehension. For instance, it may be important in workplace design to control the level of arousal experienced by office staff when work is mostly related to the processing of complex texts. Energy level may be influenced by many factors, such as noise, time of day, or thermal conditions.

  3. Describing long-range charge-separation processes with subsystem density-functional theory

    International Nuclear Information System (INIS)

    Solovyeva, Alisa; Neugebauer, Johannes; Pavanello, Michele

    2014-01-01

    Long-range charge-transfer processes in extended systems are difficult to describe with quantum chemical methods. In particular, cost-effective (non-hybrid) approximations within time-dependent density functional theory (DFT) are not applicable unless special precautions are taken. Here, we show that the efficient subsystem DFT can be employed as a constrained DFT variant to describe the energetics of long-range charge-separation processes. A formal analysis of the energy components in subsystem DFT for such excitation energies is presented, which demonstrates that both the distance dependence and the long-range limit are correctly described. In addition, electronic couplings for these processes as needed for rate constants in Marcus theory can be obtained from this method. It is shown that the electronic structure of charge-separated states constructed by a positively charged subsystem interacting with a negatively charged one is difficult to converge — charge leaking from the negative subsystem to the positive one can occur. This problem is related to the delocalization error in DFT and can be overcome with asymptotically correct exchange–correlation (XC) potentials or XC potentials including a sufficiently large amount of exact exchange. We also outline an approximate way to obtain charge-transfer couplings between locally excited and charge-separated states.

  4. The process of patient enablement in general practice nurse consultations: a grounded theory study.

    Science.gov (United States)

    Desborough, Jane; Banfield, Michelle; Phillips, Christine; Mills, Jane

    2017-05-01

    The aim of this study was to gain insight into the process of patient enablement in general practice nursing consultations. Enhanced roles for general practice nurses may benefit patients through a range of mechanisms, one of which may be increasing patient enablement. In studies with general practitioners enhanced patient enablement has been associated with increases in self-efficacy and skill development. This study used a constructivist grounded theory design. In-depth interviews were conducted with 16 general practice nurses and 23 patients from 21 general practices between September 2013 - March 2014. Data generation and analysis were conducted concurrently using constant comparative analysis and theoretical sampling focussing on the process and outcomes of patient enablement. Use of the storyline technique supported theoretical coding and integration of the data into a theoretical model. A clearly defined social process that fostered and optimised patient enablement was constructed. The theory of 'developing enabling healthcare partnerships between nurses and patients in general practice' incorporates three stages: triggering enabling healthcare partnerships, tailoring care and the manifestation of patient enablement. Patient enablement was evidenced through: 1. Patients' understanding of their unique healthcare requirements informing their health seeking behaviours and choices; 2. Patients taking an increased lead in their partnership with a nurse and seeking choices in their care and 3. Patients getting health care that reflected their needs, preferences and goals. This theoretical model is in line with a patient-centred model of health care and is particularly suited to patients with chronic disease. © 2016 John Wiley & Sons Ltd.

  5. Does Joshua Greene’s Dual Process Theory of Moral Judgment Commit the Naturalistic Fallacy?

    Directory of Open Access Journals (Sweden)

    Javier Gracia Calandín

    2017-02-01

    Full Text Available In this article I analyse whether Joshua Greene’s dual process theory of moral judgment commits the naturalistic fallacy. Firstly, and against current authors such as Patricia S. Churchland, I uphold the validity of the naturalistic fallacy denounced by Moore more than a century ago. Secondly, I highlight and question Greene’s naturalized way of understanding Deontologism. Thirdly, I assert the distinction between "neural basis" and "moral foundation" as the key to avoiding the naturalistic fallacy. Finally, in light of that key distinction, I assess Greene’s neuroethical approach and analyse some of its most critical aspects related to normative issues.

  6. Unhealthy weight control behaviours in adolescent girls: a process model based on self-determination theory

    OpenAIRE

    Thøgersen-Ntoumani, Cecilie; Ntoumanis, Nikos

    2009-01-01

    This study used self-determination theory (Deci, E.L., & Ryan, R.M. (2000). The 'what' and 'why' of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11, 227-268.) to examine predictors of body image concerns and unhealthy weight control behaviours in a sample of 350 Greek adolescent girls. A process model was tested which proposed that perceptions of parental autonomy support and two life goals (health and image) would predict adolescents' degree of sa...

  7. Stochastic Processes and Queueing Theory used in Cloud Computer Performance Simulations

    Directory of Open Access Journals (Sweden)

    Florin-Catalin ENACHE

    2015-10-01

    Full Text Available The cloud business has grown exponentially over the last five years. Capacity managers need a practical way to simulate the random demands a cloud infrastructure may face, even though there are not many mathematical tools for simulating such demands. This paper presents an introduction to the most important stochastic processes and queueing theory concepts used for modeling computer performance. Moreover, it shows the cases where such concepts are applicable and where they are not, using clear programming examples of how to simulate a queue, and how to use and validate a simulation when there are no mathematical concepts to back it up.
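
    In the spirit of the programming examples the paper advocates, the snippet below simulates an M/M/1 queue and checks the simulated mean queueing delay against the analytic value W_q = λ/(μ(μ−λ)). The arrival and service rates are illustrative, not taken from the paper.

```python
import random

lam, mu, n = 0.8, 1.0, 200_000    # arrival rate, service rate, customers

t_arrival = 0.0
server_free_at = 0.0
total_wait = 0.0
for _ in range(n):
    t_arrival += random.expovariate(lam)             # Poisson arrivals
    start = max(t_arrival, server_free_at)           # wait if server is busy
    total_wait += start - t_arrival                  # time spent queueing
    server_free_at = start + random.expovariate(mu)  # exponential service

print("simulated Wq:", total_wait / n)
print("analytic  Wq:", lam / (mu * (mu - lam)))      # M/M/1 mean queueing delay
```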

  8. Exploring potentials of sense-making theory for understanding social processes in public hearing

    DEFF Research Database (Denmark)

    Lyhne, Ivar

    authorities and the public in such planning often characterised by conflict. A sense-making framework is developed based on Karl Weick's theory to investigate how participants at the meeting change their understanding of aspects like other actors' opinions and the infrastructure project. Through interviews...... and observations it is shown that participants' senses do not change except for a few aspects. The participants at the meeting thus seem stuck in their positions, without interest in being open to other interpretations or arguments. The investigation leads to considerations about the benefit and role...... of such a public meeting and the importance of trust and openness in the social processes in a public hearing....

  9. The Direct and Indirect Impact of Pharmaceutical Industry in Economic Expansion and Job Creation: Evidence from Bootstrapping and Normal Theory Methods

    Directory of Open Access Journals (Sweden)

    Rizwan Raheem Ahmed

    2018-05-01

    Full Text Available The objective of this research article is to examine the role of Pakistan’s pharmaceutical industry in creating job opportunities, with the intention of eradicating poverty and expanding economic activity. The research is quantitative in nature, and the data were gathered directly through closed-ended questionnaires from 300 respondents. Besides the predictors, four mediating variables that contribute indirectly to job creation were also taken into consideration. Bootstrapping and normal theory methods were employed in order to examine the impact of the predictors and mediating variables. The results of this research confirmed that the pharmaceutical industry plays a vital role in job creation in Pakistan. It is further concluded that the industry has a direct and significant impact on job creation by providing indigenous, direct job opportunities in sales, marketing, and other supporting departments for both skilled and unskilled workers. The pharmaceutical industry also provides indirect job opportunities through closely linked industries, such as pharmaceutical distributors, dealers, retailers, wholesalers, the hotel industry, and the event-management industry. It also acts as a knowledge- and skills-imparting institution: skills-based training and organizational learning are major mediating variables that transform unskilled people into human assets, which further improves future job prospects. The pharmaceutical industry is one of the biggest industries in Pakistan, consistently providing plentiful new job opportunities, and mediating variables such as motivation and interpersonal influence also play an active role in new job creation.
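
    To make the "bootstrapping" method concrete, here is a minimal sketch of a percentile-bootstrap confidence interval for an indirect (mediated) effect on synthetic data. The variables and effect sizes are invented, and for brevity the b path is estimated without partialling out the predictor, which a full mediation analysis would do.

```python
import random
import statistics

random.seed(1)
n = 300
x = [random.gauss(0, 1) for _ in range(n)]               # predictor
m = [0.5 * xi + random.gauss(0, 1) for xi in x]          # mediator
y = [0.4 * mi + 0.2 * xi + random.gauss(0, 1) for xi, mi in zip(x, m)]

def slope(u, v):
    """Ordinary least-squares slope of v regressed on u."""
    mu, mv = statistics.fmean(u), statistics.fmean(v)
    return (sum((a - mu) * (b - mv) for a, b in zip(u, v))
            / sum((a - mu) ** 2 for a in u))

boots = []
for _ in range(2000):
    s = [random.randrange(n) for _ in range(n)]          # resample rows
    xs = [x[i] for i in s]
    ms = [m[i] for i in s]
    ys = [y[i] for i in s]
    boots.append(slope(xs, ms) * slope(ms, ys))          # a*b indirect path

boots.sort()
print("95% bootstrap CI for the indirect effect:", (boots[49], boots[1949]))
```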

  10. Prior Knowledge and the Learning of Science. A Review of Ausubel's Theory of This Process

    Science.gov (United States)

    West, L. H. T.; Fensham, P. J.

    1974-01-01

    Examines Ausubel's theory of learning as a model of the role concerning the influence of prior knowledge on how learning occurs. Research evidence for Ausubel's theory is presented and discussed. Implications of Ausubel's theory for teaching are summarized. (PEB)

  11. The Diagonal Model of Job Satisfaction and Motivation: Extracted from the Logical Comparison of Content and Process Theories

    Science.gov (United States)

    Sahito, Zafarullah; Vaisanen, Pertti

    2017-01-01

    The purpose of this study is to explore the strongest areas of all prime theories of job satisfaction and motivation to create a new multidimensional model. This model relies on all explored areas from the logical comparison of content and process theories to understand the phenomenon of job satisfaction and motivation of employees. The model…

  12. Attachment and the processing of social information across the life span: theory and evidence.

    Science.gov (United States)

    Dykas, Matthew J; Cassidy, Jude

    2011-01-01

    Researchers have used J. Bowlby's (1969/1982, 1973, 1980, 1988) attachment theory frequently as a basis for examining whether experiences in close personal relationships relate to the processing of social information across childhood, adolescence, and adulthood. We present an integrative life-span-encompassing theoretical model to explain the patterns of results that have emerged from these studies. The central proposition is that individuals who possess secure experience-based internal working models of attachment will process--in a relatively open manner--a broad range of positive and negative attachment-relevant social information. Moreover, secure individuals will draw on their positive attachment-related knowledge to process this information in a positively biased schematic way. In contrast, individuals who possess insecure internal working models of attachment will process attachment-relevant social information in one of two ways, depending on whether the information could cause the individual psychological pain. If processing the information is likely to lead to psychological pain, insecure individuals will defensively exclude this information from further processing. If, however, the information is unlikely to lead to psychological pain, then insecure individuals will process this information in a negatively biased schematic fashion that is congruent with their negative attachment-related experiences. In a comprehensive literature review, we describe studies that illustrate these patterns of attachment-related information processing from childhood to adulthood. This review focuses on studies that have examined specific components (e.g., attention and memory) and broader aspects (e.g., attributions) of social information processing. We also provide general conclusions and suggestions for future research.

  13. A Theory of Information Quality and a Framework for its Implementation in the Requirements Engineering Process

    Science.gov (United States)

    Grenn, Michael W.

    This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q, which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy H_R is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process. The information I increases as H_R and uncertainty decrease, and the change in information ΔI needed
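
    A simplified numerical reading of the framework's central idea: treat the spread of R requirements over N quality levels as a probability distribution and compute its entropy, which falls as engineering effort concentrates requirements in the top level. This is the plain Shannon form, not the dissertation's quantum-statistics estimator, and the level counts below are invented.

```python
import math

def requirements_entropy(counts):
    """counts[i] = number of requirements at quality level i; returns bits."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c > 0)

spread = [12, 13, 12, 13]    # early: requirements spread over the levels
matured = [1, 2, 3, 44]      # later: effort has concentrated them at the top

print(requirements_entropy(spread))    # near the maximum log2(4) = 2 bits
print(requirements_entropy(matured))   # much lower: reduced disorder/uncertainty
```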

  14. Decision making process and factors contributing to research participation among general practitioners: A grounded theory study.

    Science.gov (United States)

    Tong, Seng Fah; Ng, Chirk Jenn; Lee, Verna Kar Mun; Lee, Ping Yein; Ismail, Irmi Zarina; Khoo, Ee Ming; Tahir, Noor Azizah; Idris, Iliza; Ismail, Mastura; Abdullah, Adina

    2018-01-01

    The participation of general practitioners (GPs) in primary care research is variable and often poor. We aimed to develop a substantive and empirical theoretical framework to explain GPs' decision-making process to participate in research. We used the grounded theory approach to construct a substantive theory to explain the decision-making process of GPs to participate in research activities. Five in-depth interviews and four focus group discussions were conducted among 21 GPs. Purposeful sampling followed by theoretical sampling were used to attempt saturation of the core category. Data were collected using semi-structured open-ended questions. Interviews were recorded, transcribed verbatim and checked prior to analysis. Open line-by-line coding followed by focus coding were used to arrive at a substantive theory. Memoing was used to help bring concepts to higher abstract levels. The GPs' decision to participate in research was attributed to their inner drive and appreciation for primary care research and their confidence in managing their social and research environments. The drive and appreciation for research motivated the GPs to undergo research training to enhance their research knowledge, skills and confidence. However, the critical step in the GPs' decision to participate in research was their ability to align their research agenda with priorities in their social environment, which included personal life goals, clinical practice and organisational culture. Perceived support for research, such as funding and technical expertise, facilitated the GPs' participation in research. In addition, prior experiences participating in research also influenced the GPs' confidence in taking part in future research. The key to GPs deciding to participate in research is whether the research agenda aligns with the priorities in their social environment. Therefore, research training is important, but should be included in further measures and should comply with GPs' social

  15. Capability-oriented agent theory and its applications in dependable systems and process engineering

    Energy Technology Data Exchange (ETDEWEB)

    Thunem, Atoosa P-J.

    2004-04-15

    During the rapid growth of computerised systems in the past 15 years, the variety of services and their efficiency have been the strongest deciding factors in design and development of the systems within various industrial branches. At the same time, the introduction and popularity of emerging design and development techniques seems to have forced the industry to include these in their product development process. Unfortunately, too many examples of lack of use or erroneous use of these techniques within industries such as telecommunications, telemedicine, aerospace and indeed the energy sector indicate that a common understanding of and belief in the rationale behind the techniques and their solution domains has not been obtained. At the same time, a tremendous increase in the number of emerging techniques has made such an understanding difficult to gain, especially when the techniques share the same application field, but differ in few yet important issues. Finally, the lack of knowledge about system aspects and the integration of various abstraction levels to describe them have added even more to the confusion on how to use different techniques. The work resulting in the Capability-Oriented Agent Theory began while trying to find more descriptive system models, taking into account a wider selection of system aspects. Although related to object-oriented and agent-oriented principles, the theory differs from such principles in many respects. Among others, its focal point is on a category of system aspects neither addressed nor recognised within such principles before. Additionally, the theory opposes the well-established idea of distinct separation between requirement, design, implementation and test specifications, but suggests a systematic integration of the related activities, hence to increase their traceability and intercommunication in both a top-down and a bottom-up manner along the development process. (Author)

  16. Kinetic theory of age-structured stochastic birth-death processes

    Science.gov (United States)

    Greenman, Chris D.; Chou, Tom

    2016-01-01

    Classical age-structured mass-action models such as the McKendrick-von Foerster equation have been extensively studied but are unable to describe stochastic fluctuations or population-size-dependent birth and death rates. Stochastic theories that treat semi-Markov age-dependent processes using, e.g., the Bellman-Harris equation do not resolve a population's age structure and are unable to quantify population-size dependencies. Conversely, current theories that include size-dependent population dynamics (e.g., mathematical models that include carrying capacity such as the logistic equation) cannot be easily extended to take into account age-dependent birth and death rates. In this paper, we present a systematic derivation of a new, fully stochastic kinetic theory for interacting age-structured populations. By defining multiparticle probability density functions, we derive a hierarchy of kinetic equations for the stochastic evolution of an aging population undergoing birth and death. We show that the fully stochastic age-dependent birth-death process precludes factorization of the corresponding probability densities, which then must be solved by using a Bogoliubov--Born--Green--Kirkwood--Yvon-like hierarchy. Explicit solutions are derived in three limits: no birth, no death, and steady state. These are then compared with their corresponding mean-field results. Our results generalize both deterministic models and existing master equation approaches by providing an intuitive and efficient way to simultaneously model age- and population-dependent stochastic dynamics applicable to the study of demography, stem cell dynamics, and disease evolution.
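
    For reference, the classical deterministic model that the kinetic theory generalizes is the McKendrick-von Foerster equation for the age density n(a,t), with age-dependent death rate μ(a) and a renewal (birth) boundary condition with fecundity β(a):

```latex
% McKendrick-von Foerster transport equation with renewal boundary condition.
\frac{\partial n(a,t)}{\partial t} + \frac{\partial n(a,t)}{\partial a}
  = -\mu(a)\,n(a,t),
\qquad
n(0,t) = \int_{0}^{\infty} \beta(a)\,n(a,t)\,\mathrm{d}a .
```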

  17. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is first to review the main inference tools using Bayesian approach, Entropy, Information theory and their corresponding geometries. This review is focused mainly on the ways these tools have been used in data, signal and image processing. After a short introduction of the different quantities related to the Bayes rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, Fisher information, we will study their use in different fields of data and signal processing such as: entropy in source separation, Fisher information in model order selection, different Maximum Entropy based methods in time series spectral estimation and finally, general linear inverse problems.
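
    Two of the reviewed quantities can be stated in a few lines of code: the sketch below computes Shannon entropy and the Kullback-Leibler divergence for small discrete distributions. The example distributions are arbitrary.

```python
import math

def entropy(p):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """D(p||q): asymmetric, >= 0, zero iff p == q; q must dominate p."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [1 / 3, 1 / 3, 1 / 3]
print(entropy(p))            # 1.5 bits
print(kl_divergence(p, q))   # > 0: cost of approximating p by the uniform q
```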

  18. The Theory of Laser Materials Processing Heat and Mass Transfer in Modern Technology

    CERN Document Server

    Dowden, John

    2009-01-01

    The purpose of the book is to show how general principles can be used to obtain insight into laser processes. The principles used may come from fundamental physical theory or from direct observation of experimental results, but an understanding of the general characteristics of the behaviour of a process is essential for intelligent investigation and implementation, whether the approach is experimental, observational, numerical or analytical. The last two have a special value since the associated costs can be relatively low and may be used as a starting point for more expensive techniques. The construction of simple models whose underlying principles are easy to see is therefore of special value, and an understanding of their strengths and limitations is essential. The applications considered in detail are cutting, keyhole welding, drilling, arc and hybrid laser-arc welding, hardening, cladding, forming and cutting, but the general principles have a very wide application; metallurgical aspects are considered,...

  19. Optically stimulated exoelectron emission processes in quartz: comparison of experiment and theory

    DEFF Research Database (Denmark)

    Pagonis, V.; Ankjærgaard, Christina; Murray, A.S.

    2009-01-01

    Recent experiments have demonstrated that it is possible to measure optically stimulated exoelectron emission (OSE) signals simultaneously with optically stimulated luminescence (OSL) from quartz samples. These experiments provide valuable information on the charge movement in quartz grains. Two...... data yield a value of χ ≈ 1.2 eV for the work function of quartz. The experimental temperature dependence of the OSE signals is interpreted on the basis of a photo-thermostimulated (PTSEE) process involving the main OSL trap at 320 °C; this process takes place with a thermal assistance energy estimated...... at W ≈ (0.29±0.02) eV. Good quantitative agreement is obtained between theory and experiment by assuming a thermal broadening of the thermal depletion factor for the OSL traps, described by a Gaussian distribution of energies....

  20. The theory of laser materials processing heat and mass transfer in modern technology

    CERN Document Server

    Schulz, Wolfgang

    2017-01-01

    The revised edition of this important reference volume presents an expanded overview of the analytical and numerical approaches employed when exploring and developing modern laser materials processing techniques. The book shows how general principles can be used to obtain insight into laser processes, whether derived from fundamental physical theory or from direct observation of experimental results. The book gives readers an understanding of the strengths and limitations of simple numerical and analytical models that can then be used as the starting-point for more elaborate models of specific practical, theoretical or commercial value. Following an introduction to the mathematical formulation of some relevant classes of physical ideas, the core of the book consists of chapters addressing key applications in detail: cutting, keyhole welding, drilling, arc and hybrid laser-arc welding, hardening, cladding and forming. The second edition includes a new a chapter on glass cutting with lasers, as employed in the ...

  1. The motivation to breastfeed: a fit to the opponent-process theory?

    Science.gov (United States)

    Myers, H H; Siegel, P S

    1985-07-01

    The opponent-process theory, a dynamic model of acquired motivation presented by Solomon and Corbit (1974), was applied to the process of breastfeeding. A modified form of the Nowlis Mood Adjective Checklist (MACL, Nowlis, 1965, 1970) and a discomfort measure were used in assessing through recall the affective course predicted by the theory. The data were analyzed using multivariate analysis of variance (MANOVA) and correlational procedures. Results were highly significant: Women who breastfed for relatively long periods recalled positive affective responses while the baby was at breast and a subsequent negative or dysphoric response. The additional characteristics of acquired motivation, habituation, and withdrawal, were also evidenced in the data. As a control for possible confounding demand characteristics inherent in the methodology, a sample of childless women was surveyed using an "as-if" form of the same questionnaire. Very little similarity to the breastfeeders was found in the pattern of responses yielded by this group. It was concluded that our major findings are quite likely free of influence from this source.

  2. The Linkages between Mindfulness and Social Information Processing Theory on the Usage of Whatsapp Media Groups

    Directory of Open Access Journals (Sweden)

    Dina Sekar Vusparatih

    2018-03-01

    Full Text Available The objective of the research was to find the linkages between mindfulness and social information processing theory in the use of the WhatsApp group of elementary school principals in District Cilandak Region III for distributing various information and instructions. Through the concept of mindfulness and the Social Information Processing (SIP) theory approach, this research explored the causes of the frequent noise, misunderstanding, and even friction in the WA group, which carried over into principals' meetings and the relationships between members. The research problem was why the WA group was still causing issues among the principals. Using a qualitative approach, the data collection techniques in this research were interviews, observation, and literature study. It was found that technological sophistication does not go hand in hand with maturity in communicating through media technologies. The main cause is a lack of mindfulness in the WA group, where organizational communication is simply transferred into text communication on mobile phones. In addition, the organizational structure is still inherent in the group, which serves only as a bridge or interim form of communication, because the main forms of communication are still correspondence and face-to-face meetings.

  3. A Grounded Theory Approach in a Branding Context: Challenges and lessons learnt during the research process

    Directory of Open Access Journals (Sweden)

    Anne Rindell, PhD.

    2009-06-01

    Full Text Available The purpose of this paper is to discuss challenges and lessons learnt when conducting a classic grounded theory study in a marketing context. The paper focuses on two specific challenges that were met during a specific research process. The first challenge related to positioning the study, namely, specifying "what the study is a study of". The second challenge concerned the choice between formal or substantive theory. Both challenges were accentuated because the emerged core category concerned a phenomenon that has attracted less attention in marketing, namely the temporal dimension in corporate images. By the temporal dimension in corporate images we mean that corporate images often have roots in earlier times through consumer memories. In other words, consumers are not tabula rasa, that is, blank sheets of paper on which communication messages can be printed. Rather, consumers have a pre-understanding of the company that works as an interpretation framework for company actions in the present. The lessons learnt from this research process can be summarized as "stay faithful to the data"; "write memos on issues you reflect upon, even if they seem to belong to another substantive field", as they might become useful later; and "look into thinking in other disciplines", as disciplines do not develop equally.

  4. Process-oriented dose assessment model for 14C due to releases during normal operation of a nuclear power plant

    International Nuclear Information System (INIS)

    Aquilonius, Karin; Hallberg, Bengt

    2005-01-01

    Swedish nuclear utility companies are required to assess doses due to releases of radionuclides during normal operation. In 2001, the calculation methods used earlier were updated due to new authority regulations. The isotope 14C is of special interest in dose assessments due to the role of carbon in the metabolism of all life forms. Earlier, factors expressing the ratio between the concentration of 14C in air and in various plants were used. In order to extend the possibility of taking local conditions into account, a process-oriented assessment model for the uptake of carbon and doses from releases of 14C to air was developed (POM14C). The model uses part of Daisy, which was developed to model the turnover of carbon in crops [Hansen, S., Jensen, H.E., Nielsen, N.E., Svendsen, H., 1993. Description of the Soil Plant System Model DAISY, Basic Principles and Modelling Approach. Simulation Model for Transformation and Transport of Energy and Matter in the Soil Plant Atmosphere System. Jordbruksförlaget, The Royal Veterinary and Agricultural University, Copenhagen, Denmark]. The main objectives were to test the model performance of the former method, and to investigate whether taking site-specific parameters into account to a greater degree would lead to major differences in the results. Several exposure pathways were considered: direct consumption of locally grown cereals, vegetables, and root vegetables, as well as consumption of milk and meat from cows having eaten fodder cereals and green fodder from the area around the nuclear plant. The total dose of the earlier model was compared with that of POM14C. The result of the former was shown to be slightly higher than the latter, but POM14C confirmed that the earlier results were of a reasonable magnitude. When full account of local conditions was taken, e.g. as regards solar radiation, temperature, and concentration of 14C in air at various places in the surroundings of each nuclear plant, a difference in dose between

  5. Forests as Patrimonies? From Theory to Tangible Processes at Various Scales

    Directory of Open Access Journals (Sweden)

    Genevieve Michon

    2012-09-01

    Full Text Available Among the theoretical fields addressing the conceptualization of interrelationships between nature and society, patrimonial approaches remain relatively unexplored. Stressing the multiplication of local dynamics in which elements of nature are redefined as "patrimonies" (ranging from local patrimonies to world heritage) by various social groups, this conceptual field tries to qualify these dynamics and their determinants in order to understand how they allow us to better address contemporary environmental challenges. Through a multidisciplinary approach in the social sciences, centered on rural forests, we analyze the multiplication of patrimonial processes in forest development at various scales. First, we elaborate on the concept of patrimony and on patrimonial processes and present the current construction and dynamics of forest patrimonies. A crucial question concerns the links that form between the many spatial-temporal levels where these processes develop. Moreover, these patrimonial processes can be quite divergent, not only across scales from local to global, but also between "endogenous" (or bottom-up) and "exogenous" (or top-down) processes. We present two detailed examples in Morocco and Sumatra, where patrimonial constructions are developing simultaneously at various scales and through various actors who treat the forest in very different ways. Drawing from these examples, we discuss how and why the simultaneous development of different, often overlapping, patrimonial constructions along these scales allows collaboration or, conversely, can lead their holders into conflict. Lastly, we discuss the contribution of patrimonial concepts to resilience thinking and social-ecological systems theory.

  6. Developing Visualization Support System for Teaching/Learning Database Normalization

    Science.gov (United States)

    Folorunso, Olusegun; Akinwale, AdioTaofeek

    2010-01-01

    Purpose: In tertiary institution, some students find it hard to learn database design theory, in particular, database normalization. The purpose of this paper is to develop a visualization tool to give students an interactive hands-on experience in database normalization process. Design/methodology/approach: The model-view-controller architecture…
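
    A minimal example of the kind of normalization step such a tool visualizes: removing a transitive dependency by decomposing one relation into two (toward 3NF). The schema and rows below are invented for illustration; the paper's tool is not reproduced.

```python
# Denormalized relation with the transitive dependency
# student_id -> dept -> dept_head, so dept_head is stored redundantly.
enrolment = [
    {"student_id": 1, "name": "Ada",  "dept": "CS",   "dept_head": "Turing"},
    {"student_id": 2, "name": "Carl", "dept": "CS",   "dept_head": "Turing"},
    {"student_id": 3, "name": "Emmy", "dept": "Math", "dept_head": "Noether"},
]

# Decompose: one relation keyed by student_id, one keyed by dept.
students = [{"student_id": r["student_id"], "name": r["name"],
             "dept": r["dept"]} for r in enrolment]
departments = {r["dept"]: r["dept_head"] for r in enrolment}

print(students)
print(departments)   # each dept_head is now stored exactly once
```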

  7. Dual process theory and intermediate effect: are faculty and residents' performance on multiple-choice, licensing exam questions different?

    NARCIS (Netherlands)

    Dong, T.; Durning, S.J.; Artino, A.R.; Vleuten, C.P.M. van der; Holmboe, E.; Lipner, R.; Schuwirth, L.

    2015-01-01

    BACKGROUND: Clinical reasoning is essential for the practice of medicine. Dual process theory conceptualizes reasoning as falling into two general categories: nonanalytic reasoning (pattern recognition) and analytic reasoning (active comparing and contrasting of alternatives). The debate continues

  8. Multistep process of neoplastic transformation of normal human fibroblasts by 60Co gamma rays and Harvey sarcoma viruses

    Energy Technology Data Exchange (ETDEWEB)

    Namba, M.; Nishitani, K.; Fukushima, F.; Kimoto, T.; Nose, K.

    1986-03-15

    As reported previously (Namba et al., 1985), normal human fibroblasts were transformed by 60Co gamma-ray irradiation into immortal cells with abnormal karyotypes. These transformed cells (KMST-6), however, showed a low cloning efficiency in soft agar and no transplantability. However, upon treatment with Harvey murine sarcoma virus (Ha-MSV), the cells acquired elevated clonability in soft agar and transplantability in nude mice. Ha-MSV alone, however, did not convert normal human fibroblasts into either immortal or tumorigenic cells. The Ha-MSV-transformed KMST-6 cells showed an enhanced expression of the ras oncogene, but normal and 60Co gamma-ray-transformed cells did not. Our current data suggest that gamma rays worked against normal human cells as an initiator, giving rise to chromosome aberrations and immortality, and that Ha-MSV, probably through its ras oncogene, played a role in the progression of the malignant cell population to a more malignant one showing enhanced colony formation in soft agar and tumorigenicity in nude mice.

  9. Macrotransport processes: Brownian tracers as stochastic averagers in effective medium theories of heterogeneous media

    International Nuclear Information System (INIS)

    Brenner, H.

    1991-01-01

    Macrotransport processes (generalized Taylor dispersion phenomena) constitute coarse-grained descriptions of comparable convective-diffusive-reactive microtransport processes, the latter supposed governed by microscale linear constitutive equations and boundary conditions, but characterized by spatially nonuniform phenomenological coefficients. Following a brief review of existing applications of the theory, the author focuses, by way of background information, upon the original (and now classical) Taylor-Aris dispersion problem, involving the combined convective and molecular diffusive transport of a point-size Brownian solute molecule (tracer) suspended in a Poiseuille solvent flow within a circular tube. A series of elementary generalizations of this prototype problem to chromatographic-like solute transport processes in tubes is used to illustrate some novel statistical-physical features. These examples emphasize the fact that a solute molecule may, on average, move axially down the tube at a different mean velocity (either larger or smaller) than that of a solvent molecule. Moreover, this solute molecule may suffer axial dispersion about its mean velocity at a rate greatly exceeding that attributable to its axial molecular diffusion alone. Such chromatographic anomalies represent novel macroscale non-linearities originating from physicochemical interactions between spatially inhomogeneous convective-diffusive-reactive microtransport processes.
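
    The quantitative heart of the prototype problem is the Taylor-Aris result: for a tracer with molecular diffusivity D in Poiseuille flow of mean velocity Ū through a tube of radius a, axial spreading is governed by an effective dispersivity (standard textbook form, quoted here for orientation):

```latex
% Taylor-Aris effective axial dispersivity; the convective term dominates
% the molecular one at high Peclet number.
D_{\mathrm{eff}} = D + \frac{a^{2}\,\bar{U}^{2}}{48\,D}
```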

  10. Information theory and signal transduction systems: from molecular information processing to network inference.

    Science.gov (United States)

    Mc Mahon, Siobhan S; Sim, Aaron; Filippi, Sarah; Johnson, Robert; Liepe, Juliane; Smith, Dominic; Stumpf, Michael P H

    2014-11-01

    Sensing and responding to the environment are two essential functions that all biological organisms need to master for survival and successful reproduction. Developmental processes are marshalled by a diverse set of signalling and control systems, ranging from systems with simple chemical inputs and outputs to complex molecular and cellular networks with non-linear dynamics. Information theory provides a powerful and convenient framework in which such systems can be studied; but it also provides the means to reconstruct the structure and dynamics of molecular interaction networks underlying physiological and developmental processes. Here we supply a brief description of its basic concepts and introduce some useful tools for systems and developmental biologists. Along with a brief but thorough theoretical primer, we demonstrate the wide applicability and biological application-specific nuances by way of different illustrative vignettes. In particular, we focus on the characterisation of biological information processing efficiency, examining cell-fate decision making processes, gene regulatory network reconstruction, and efficient signal transduction experimental design. Copyright © 2014 Elsevier Ltd. All rights reserved.
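
    As a concrete instance of the framework, the sketch below computes the mutual information of a noisy two-state signalling channel from its joint input-output distribution. The joint probabilities are invented for illustration.

```python
import math

def mutual_information(joint):
    """joint[s][r] = P(S=s, R=r); returns I(S;R) in bits."""
    ps = [sum(row) for row in joint]            # marginal over inputs
    pr = [sum(col) for col in zip(*joint)]      # marginal over outputs
    mi = 0.0
    for s, row in enumerate(joint):
        for r, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (ps[s] * pr[r]))
    return mi

# A noisy two-state channel: high input mostly produces high output.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(mutual_information(joint), "bits")   # well below 1 bit due to noise
```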

  11. 'Living' theory: a pedagogical framework for process support in networked learning

    Directory of Open Access Journals (Sweden)

    Philipa Levy

    2006-12-01

    Full Text Available This paper focuses on the broad outcome of an action research project in which practical theory was developed in the field of networked learning through case-study analysis of learners' experiences and critical evaluation of educational practice. It begins by briefly discussing the pedagogical approach adopted for the case-study course and the action research methodology. It then identifies key dimensions of four interconnected developmental processes (orientation, communication, socialisation and organisation) that were associated with 'learning to learn' in the course's networked environment, and offers a flavour of participants' experiences in relation to these processes. A number of key evaluation issues that arose are highlighted. Finally, the paper presents the broad conceptual framework for the design and facilitation of process support in networked learning that was derived from this research. The framework proposes a strong, explicit focus on support for process as well as domain learning, and progression from tighter to looser design and facilitation structures for process-focused (as well as domain-focused) learning tasks.

  12. On electromagnetic forming processes in finitely strained solids: Theory and examples

    Science.gov (United States)

    Thomas, J. D.; Triantafyllidis, N.

    2009-08-01

    The process of electromagnetic forming (EMF) is a high velocity manufacturing technique that uses electromagnetic (Lorentz) body forces to shape sheet metal parts. EMF holds several advantages over conventional forming techniques: speed, repeatability, one-sided tooling, and most importantly considerable ductility increase in several metals. Current modeling techniques for EMF processes are not based on coupled variational principles to simultaneously account for electromagnetic and mechanical effects. Typically, separate solutions to the electromagnetic (Maxwell) and motion (Newton) equations are combined in staggered or lock-step methods, sequentially solving the mechanical and electromagnetic problems. The present work addresses these issues by introducing a fully coupled Lagrangian (reference configuration) least-action variational principle, involving magnetic flux and electric potentials and the displacement field as independent variables. The corresponding Euler-Lagrange equations are Maxwell's and Newton's equations in the reference configuration, which are shown to coincide with their current configuration counterparts obtained independently by a direct approach. The general theory is subsequently simplified for EMF processes by considering the eddy current approximation. Next, an application is presented for axisymmetric EMF problems. It is shown that the proposed variational principle forms the basis of a variational integration numerical scheme that provides an efficient staggered solution algorithm. As an illustration a number of such processes are simulated, inspired by recent experiments of freely expanding uncoated and polyurea-coated aluminum tubes.

  13. Glomerular epithelial foot processes in normal man and rats. Distribution of true width and its intra- and inter-individual variation.

    Science.gov (United States)

    Gundersen, H J; Seefeldt, T; Osterby, R

    1980-01-01

    The width of individual glomerular epithelial foot processes appears very different on electron micrographs. A method for obtaining distributions of the true width of foot processes from that of their apparent width on electron micrographs has been developed, based on geometric probability theory pertaining to a specific geometric model. Analyses of foot process width in humans and rats show a remarkable interindividual invariance, implying rigid control, and therefore great biological significance, of foot process width or a derivative thereof. The very low inter-individual variation of the true width, shown in the present paper, makes it possible to demonstrate slight changes in rather small groups of patients or experimental animals.

  14. Fuzzy-trace theory: dual processes in memory, reasoning, and cognitive neuroscience.

    Science.gov (United States)

    Brainerd, C J; Reyna, V F

    2001-01-01

    Fuzzy-trace theory has evolved in response to counterintuitive data on how memory development influences the development of reasoning. The two traditional perspectives on memory-reasoning relations--the necessity and constructivist hypotheses--stipulate that the accuracy of children's memory for problem information and the accuracy of their reasoning are closely intertwined, albeit for different reasons. However, contrary to necessity, correlational and experimental dissociations have been found between children's memory for problem information that is determinative in solving certain problems and their solutions of those problems. In these same tasks, age changes in memory for problem information appear to be dissociated from age changes in reasoning. Contrary to constructivism, correlational and experimental dissociations also have been found between children's performance on memory tests for actual experience and memory tests for the meaning of experience. As in memory-reasoning studies, age changes in one type of memory performance do not seem to be closely connected to age changes in the other type of performance. Subsequent experiments have led to dual-process accounts in both the memory and reasoning spheres. The account of memory development features four other principles: parallel verbatim-gist storage, dissociated verbatim-gist retrieval, memorial bases of conscious recollection, and identity/similarity processes. The account of the development of reasoning features three principles: gist extraction, fuzzy-to-verbatim continua, and fuzzy-processing preferences. The fuzzy-processing preference is a particularly important notion because it implies that gist-based intuitive reasoning often suffices to deliver "logical" solutions and that such reasoning confers multiple cognitive advantages that enhance accuracy. The explanation of memory-reasoning dissociations in cognitive development then falls out of fuzzy-trace theory's dual-process models of memory and

  15. Aging of theory of mind: the influence of educational level and cognitive processing.

    Science.gov (United States)

    Li, Xiaoming; Wang, Kai; Wang, Fan; Tao, Qian; Xie, Yu; Cheng, Qi

    2013-01-01

    Previous studies of theory of mind (ToM) in old age have provided mixed results. We predicted that educational level and cognitive processing are two factors influencing the pattern of the aging of ToM. To test this hypothesis, a younger group who received higher education (mean age 20.46 years), an older group with an education level equal to that of the young group (mean age 76.29 years), and an older group with less education (mean age 73.52 years) were recruited. ToM tasks included the following tests: the second-order false-belief task, the faux-pas task, the eyes test, and tests of fundamental aspects of cognitive function that included two background tests (memory span and processing speed) and three subcomponents of executive function (inhibition, updating, and shifting). We found that the younger group and the older group with equally high education outperformed the older group with less education in false-belief and faux-pas tasks. However, there was no significant difference between the two former groups. The three groups of participants performed equivalently in the eyes test as well as in control tasks (false-belief control question, faux-pas control question, faux-pas control story, and Eyes Test control task). The younger group outperformed the other two groups in the cognitive processing tasks. Mediation analyses showed that difficulties in inhibition, memory span, and processing speed mediated the age differences in false-belief reasoning. Also, the variables of inhibition, updating, memory span, and processing speed mediated age-related variance in faux-pas. Discussion focused on the links between ToM aging, educational level, and cognitive processing. Supported by Chinese National Natural Science Foundation (number: 30870766) and Anhui Province Natural Science Foundation (number: 11040606M166).

  16. Random Process Theory Approach to Geometric Heterogeneous Surfaces: Effective Fluid-Solid Interaction

    Science.gov (United States)

    Khlyupin, Aleksey; Aslyamov, Timur

    2017-06-01

    Realistic fluid-solid interaction potentials are essential in the description of confined fluids, especially in the case of geometrically heterogeneous surfaces. A correlated random field is considered as a model of a random surface with high geometric roughness. We provide a general theory of the effective coarse-grained fluid-solid potential, obtained by properly averaging the free energy of fluid molecules interacting with the solid medium. This procedure is largely based on the theory of random processes. We apply the first passage time probability problem and assume local Markov properties of the random surfaces. A general expression for the effective fluid-solid potential is obtained. In the case of small surface irregularities, an analytical approximation for the effective potential is proposed. Both amorphous materials with large surface roughness and crystalline solids with several types of fcc lattices are considered. It is shown that the wider the lattice spacing in terms of the molecular diameter of the fluid, the more the obtained potentials differ from classical ones. A comparison with published Monte Carlo simulations is discussed. The work provides a promising approach to exploring how random geometric heterogeneity affects the thermodynamic properties of fluids.
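
    To illustrate the first-passage-time ingredient of the theory, here is a Monte Carlo sketch that estimates when a drifting 1D random walk first crosses a threshold. It stands in for the paper's surface-averaging machinery only loosely, and all parameters are invented.

```python
import random
import statistics

def first_passage(threshold=5.0, drift=0.01, sigma=0.1, max_steps=10**6):
    """Steps until a drifting Gaussian walk first reaches the threshold."""
    x, t = 0.0, 0
    while x < threshold and t < max_steps:
        x += drift + random.gauss(0.0, sigma)   # local Markov increments
        t += 1
    return t

samples = [first_passage() for _ in range(500)]
# With positive drift the mean first-passage time is ~ threshold / drift.
print("mean first-passage steps:", statistics.fmean(samples))
```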

  17. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    LENUS (Irish Health Repository)

    Murray, Elizabeth

    2010-10-20

    Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  18. Eyewitness decisions in simultaneous and sequential lineups: a dual-process signal detection theory analysis.

    Science.gov (United States)

    Meissner, Christian A; Tredoux, Colin G; Parker, Janat F; MacLin, Otto H

    2005-07-01

    Many eyewitness researchers have argued for the application of a sequential alternative to the traditional simultaneous lineup, given its role in decreasing false identifications of innocent suspects (sequential superiority effect). However, Ebbesen and Flowe (2002) have recently noted that sequential lineups may merely bring about a shift in response criterion, having no effect on discrimination accuracy. We explored this claim, using a method that allows signal detection theory measures to be collected from eyewitnesses. In three experiments, lineup type was factorially combined with conditions expected to influence response criterion and/or discrimination accuracy. Results were consistent with signal detection theory predictions, including that of a conservative criterion shift with the sequential presentation of lineups. In a fourth experiment, we explored the phenomenological basis for the criterion shift, using the remember-know-guess procedure. In accord with previous research, the criterion shift in sequential lineups was associated with a reduction in familiarity-based responding. It is proposed that the relative similarity between lineup members may create a context in which fluency-based processing is facilitated to a greater extent when lineup members are presented simultaneously.
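
    The distinction at issue, discrimination accuracy versus response criterion, reduces to two standard signal detection formulas, sketched below with invented hit and false-alarm rates. A conservative criterion shift raises c while leaving d' roughly unchanged.

```python
from statistics import NormalDist

def sdt_measures(hit_rate, fa_rate):
    """Standard SDT indices from hit and false-alarm rates."""
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(fa_rate)             # discrimination accuracy
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias c
    return d_prime, criterion

# Both hits and false alarms drop, d' stays similar, c becomes stricter.
print(sdt_measures(0.80, 0.30))   # simultaneous-style responding
print(sdt_measures(0.65, 0.15))   # sequential-style: higher (conservative) c
```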

  19. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    Directory of Open Access Journals (Sweden)

    Ong Bie

    2010-10-01

    Full Text Available Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  20. Professional Socialization: A Grounded Theory of the Clinical Reasoning Processes That RNs and LPNs Use to Recognize Delirium.

    Science.gov (United States)

    El Hussein, Mohamed; Hirst, Sandra; Osuji, Joseph

    2017-08-01

    Delirium is an acute disorder of attention and cognition. It affects half of older adults in acute care settings and is a cause of increasing mortality and costs. Registered nurses (RNs) and licensed practical nurses (LPNs) frequently fail to recognize delirium. The goals of this research were to identify the reasoning processes that RNs and LPNs use to recognize delirium, to compare their reasoning processes, and to generate a theory that explains their clinical reasoning processes. Theoretical sampling was employed to elicit data from 28 participants using grounded theory methodology. Theoretical coding culminated in the emergence of Professional Socialization as the substantive theory. Professional Socialization emerged from participants' responses and was based on two social processes, specifically reasoning to uncover and reasoning to report. Professional Socialization makes explicit the similarities and variations in the clinical reasoning processes between RNs and LPNs and highlights their main concerns when interacting with delirious patients.

  1. An introduction to continuous-time stochastic processes theory, models, and applications to finance, biology, and medicine

    CERN Document Server

    Capasso, Vincenzo

    2015-01-01

    This textbook, now in its third edition, offers a rigorous and self-contained introduction to the theory of continuous-time stochastic processes, stochastic integrals, and stochastic differential equations. Expertly balancing theory and applications, the work features concrete examples of modeling real-world problems from biology, medicine, industrial applications, finance, and insurance using stochastic methods. No previous knowledge of stochastic processes is required. Key topics include: * Markov processes * Stochastic differential equations * Arbitrage-free markets and financial derivatives * Insurance risk * Population dynamics and epidemics * Agent-based models New to the Third Edition: * Infinitely divisible distributions * Random measures * Lévy processes * Fractional Brownian motion * Ergodic theory * Karhunen-Loève expansion * Additional applications * Additional exercises * Smoluchowski approximation of Langevin systems An Introduction to Continuous-Time Stochastic Processes, Third Editio...
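    As a flavour of the applied side of such a text, a stochastic differential equation like geometric Brownian motion (the standard arbitrage-free asset model) can be simulated with the Euler-Maruyama scheme; this generic sketch is not drawn from the book itself.

```python
import numpy as np

def euler_maruyama_gbm(s0=100.0, mu=0.05, sigma=0.2, T=1.0, n=252, seed=0):
    """Simulate dS = mu*S dt + sigma*S dW on [0, T] with n Euler steps."""
    rng = np.random.default_rng(seed)
    dt = T / n
    s = np.empty(n + 1)
    s[0] = s0
    for k in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))  # Brownian increment
        s[k + 1] = s[k] + mu * s[k] * dt + sigma * s[k] * dw
    return s

path = euler_maruyama_gbm()
print(path[-1])  # terminal value of one simulated price path
```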

  2. Diagnostic Problem-Solving Process in Professional Contexts: Theory and Empirical Investigation in the Context of Car Mechatronics Using Computer-Generated Log-Files

    Science.gov (United States)

    Abele, Stephan

    2018-01-01

    This article deals with a theory-based investigation of the diagnostic problem-solving process in professional contexts. To begin with, a theory of the diagnostic problem-solving process was developed drawing on findings from different professional contexts. The theory distinguishes between four sub-processes of the diagnostic problem-solving…

  3. Performance Feedback Processing Is Positively Biased As Predicted by Attribution Theory.

    Directory of Open Access Journals (Sweden)

    Christoph W Korn

    incorrect performance alone could not explain the observed positivity bias. Furthermore, participants' behavior in our task was linked to the most widely used measure of attribution style. In sum, our findings suggest that positive and negative performance feedback influences the evaluation of task-related stimuli, as predicted by attribution theory. Therefore, our study points to the relevance of attribution theory for feedback processing in decision-making and provides a novel outlook for decision-making biases.

  4. Evolution is a cooperative process: the biodiversity-related niches differentiation theory (BNDT) can explain why.

    Science.gov (United States)

    Gatti, Roberto Cazzolla

    2011-01-01

    A. McFayden and G.E. Hutchinson defined a niche as a multidimensional space or hypervolume within the environment that allows an individual or a species to survive; we consider niches as a fundamental ecological variable that regulates species' composition and relations in ecosystems. The niche concept was subsequently associated with the genetic term "phenotype" by MacArthur, stressing the importance of what a species or a genome can show outside, either in environmental functions or in body characteristics. Several indexes have been developed to evaluate the degree of overlap and similarity of species' niches, some utilizing information theory. However, the factors that determine the number of species that can coexist in a given environment, and why a generalist species does not compete to the exclusion of the remaining species to maximize its fitness, remain largely unknown. Moreover, there are few studies and theories that clearly explain why the number of niches is so variable across ecosystems and how several species can live in the same basal niche, intended in a comprehensive sense as the range of basic conditions (temperature, humidity, food-guild, etc.). Here I show that the number of niches in an ecosystem depends on the number of species present at a particular moment and that the species themselves allow the enhancement of niches in terms of space and number. I found that, using a three-dimensional model as hypervolume and testing the theory on Mediterranean, temperate and tropical forest ecosystems, it is possible to demonstrate that each species plays a fundamental role in facilitating colonization by other species, simply by modifying the environment and exponentially increasing the available niche space and number. After some preliminary empirical tests, I summarized these hypotheses in the Biodiversity-related Niches Differentiation Theory (BNDT), stressing with this definition that the process of niches

  5. Performance Feedback Processing Is Positively Biased As Predicted by Attribution Theory.

    Science.gov (United States)

    Korn, Christoph W; Rosenblau, Gabriela; Rodriguez Buritica, Julia M; Heekeren, Hauke R

    2016-01-01

    alone could not explain the observed positivity bias. Furthermore, participants' behavior in our task was linked to the most widely used measure of attribution style. In sum, our findings suggest that positive and negative performance feedback influences the evaluation of task-related stimuli, as predicted by attribution theory. Therefore, our study points to the relevance of attribution theory for feedback processing in decision-making and provides a novel outlook for decision-making biases.

  6. Quantum information processing in the radical-pair mechanism: Haberkorn's theory violates the Ozawa entropy bound

    Science.gov (United States)

    Mouloudakis, K.; Kominis, I. K.

    2017-02-01

    Radical-ion-pair reactions, central for understanding the avian magnetic compass and spin transport in photosynthetic reaction centers, were recently shown to be a fruitful paradigm of the new synthesis of quantum information science with biological processes. We show here that the master equation so far constituting the theoretical foundation of spin chemistry violates fundamental bounds for the entropy of quantum systems, in particular the Ozawa bound. In contrast, a recently developed theory based on quantum measurements, quantum coherence measures, and quantum retrodiction, thus exemplifying the paradigm of quantum biology, satisfies the Ozawa bound as well as the Lanford-Robinson bound on information extraction. By considering Groenewold's information, the quantum information extracted during the reaction, we reproduce the known and unravel other magnetic-field effects not conveyed by reaction yields.
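    The bound in question constrains the von Neumann entropy S(ρ) = −Tr(ρ ln ρ) of the radical-pair spin state. Checking how a candidate master equation moves this entropy only requires diagonalising the density matrix; the helper below is a generic sketch (the two-spin states are textbook examples, not the authors' model).

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # discard numerical zeros
    return float(-np.sum(evals * np.log(evals)))

# Two electron spins of a radical pair, basis {|00>, |01>, |10>, |11>}:
psi_singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
rho_singlet = np.outer(psi_singlet, psi_singlet)  # pure state: S = 0
rho_mixed = np.eye(4) / 4.0                       # maximally mixed: S = ln 4

print(von_neumann_entropy(rho_singlet))  # -> ~0.0
print(von_neumann_entropy(rho_mixed))    # -> ~1.386 (= ln 4)
```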

  7. Critique of the Naturalization of Deontologism in Joshua Greene's Dual Process Theory of Moral Judgment

    Directory of Open Access Journals (Sweden)

    Javier Gracia

    2018-05-01

    Full Text Available In this paper I propose to question Joshua Greene's neuroethical thesis about the essentially emotional character of so-called "deontological moral judgments". First, I focus on the dual process theory of moral judgment and criticize the claim that deontological judgments are only, or mainly, intuitive and non-reflective. Secondly, I question the view that "utilitarian judgment" is linked to mathematical calculation while deontological judgment is exclusively reduced to the non-reflective factor of emotion. The main objection I raise to Greene's naturalism is that it tries to eliminate the philosophical justification of moral validity defended by Kant's deontologism, while Greene reduces "deontological moral judgment" to exclusively psychological and neurophysiological factors associated with emotion.

  8. Unhealthy weight control behaviours in adolescent girls: a process model based on self-determination theory.

    Science.gov (United States)

    Thøgersen-Ntoumani, Cecilie; Ntoumanis, Nikos; Nikitaras, Nikitas

    2010-06-01

    This study used self-determination theory (Deci, E.L., & Ryan, R.M. (2000). The 'what' and 'why' of goal pursuits: Human needs and the self-determination of behavior. Psychological Inquiry, 11, 227-268.) to examine predictors of body image concerns and unhealthy weight control behaviours in a sample of 350 Greek adolescent girls. A process model was tested which proposed that perceptions of parental autonomy support and two life goals (health and image) would predict adolescents' degree of satisfaction of their basic psychological needs. In turn, psychological need satisfaction was hypothesised to negatively predict body image concerns (i.e. drive for thinness and body dissatisfaction) and, indirectly, unhealthy weight control behaviours. The predictions of the model were largely supported indicating that parental autonomy support and adaptive life goals can indirectly impact upon the extent to which female adolescents engage in unhealthy weight control behaviours via facilitating the latter's psychological need satisfaction.

  9. Quantum information processing in the radical-pair mechanism: Haberkorn's theory violates the Ozawa entropy bound.

    Science.gov (United States)

    Mouloudakis, K; Kominis, I K

    2017-02-01

    Radical-ion-pair reactions, central for understanding the avian magnetic compass and spin transport in photosynthetic reaction centers, were recently shown to be a fruitful paradigm of the new synthesis of quantum information science with biological processes. We show here that the master equation so far constituting the theoretical foundation of spin chemistry violates fundamental bounds for the entropy of quantum systems, in particular the Ozawa bound. In contrast, a recently developed theory based on quantum measurements, quantum coherence measures, and quantum retrodiction, thus exemplifying the paradigm of quantum biology, satisfies the Ozawa bound as well as the Lanford-Robinson bound on information extraction. By considering Groenewold's information, the quantum information extracted during the reaction, we reproduce the known and unravel other magnetic-field effects not conveyed by reaction yields.

  10. Evaluation and selection of energy technologies using an integrated graph theory and analytic hierarchy process methods

    Directory of Open Access Journals (Sweden)

    P. B. Lanjewar

    2016-06-01

    Full Text Available The evaluation and selection of energy technologies involve a large number of attributes whose selection and weighting is decided in accordance with the social, environmental, technical and economic framework. In the present work an integrated multiple attribute decision making methodology is developed by combining graph theory and analytic hierarchy process (AHP) methods to deal with the evaluation and selection of energy technologies. The energy technology selection attributes digraph enables a quick visual appraisal of the attributes and their interrelationships. The preference index provides a total objective score for comparing energy technology alternatives. Application of the matrix permanent offers a better appreciation of the considered attributes and helps to analyze the different alternatives from a combinatorial viewpoint. The AHP is used to assign relative weights to the attributes. Four examples of evaluation and selection of energy technologies are considered in order to demonstrate and validate the proposed method.
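    The preference index described here is the permanent of the attribute matrix, with attribute scores on the diagonal and relative-importance values off-diagonal. For the small matrices typical of such problems it can be evaluated by brute force; the numbers below are invented for illustration, not taken from the paper.

```python
import numpy as np
from itertools import permutations

def permanent(m):
    """Matrix permanent by brute force -- adequate for the small
    attribute matrices used in graph-theoretic decision methods."""
    n = m.shape[0]
    return sum(np.prod([m[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

# Rows/columns = attributes (e.g. cost, efficiency, emissions).
# Diagonal: normalised attribute scores of one energy alternative;
# off-diagonal: relative importance of attribute i over attribute j.
A = np.array([[0.7, 0.6, 0.4],
              [0.4, 0.5, 0.7],
              [0.6, 0.3, 0.8]])
print(permanent(A))  # preference index of this alternative
```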

  11. A finite state, finite memory minimum principle, part 2. [a discussion of game theory, signaling, stochastic processes, and control theory

    Science.gov (United States)

    Sandell, N. R., Jr.; Athans, M.

    1975-01-01

    The development of the theory of the finite-state, finite-memory (FSFM) stochastic control problem is discussed. The sufficiency of the FSFM minimum principle (which is in general only a necessary condition) was investigated. By introducing the notion of a signaling strategy as defined in the literature on games, conditions under which the FSFM minimum principle is sufficient were determined. This result explicitly interconnects the information structure of the FSFM problem with its optimality conditions. The min-H algorithm for the FSFM problem was studied. It is demonstrated that a version of the algorithm always converges to a particular type of local minimum termed a person-by-person extremal.

  12. Interactions between Depression and Facilitation within Neural Networks: Updating the Dual-Process Theory of Plasticity

    Science.gov (United States)

    Prescott, Steven A.

    1998-01-01

    Repetitive stimulation often results in habituation of the elicited response. However, if the stimulus is sufficiently strong, habituation may be preceded by transient sensitization or even replaced by enduring sensitization. In 1970, Groves and Thompson formulated the dual-process theory of plasticity to explain these characteristic behavioral changes on the basis of competition between decremental plasticity (depression) and incremental plasticity (facilitation) occurring within the neural network. Data from both vertebrate and invertebrate systems are reviewed and indicate that the effects of depression and facilitation are not exclusively additive but, rather, that those processes interact in a complex manner. Serial ordering of induction of learning, in which a depressing locus precedes the modulatory system responsible for inducing facilitation, causes the facilitation to wane. The parallel and/or serial expression of depression and waning facilitation within the stimulus–response pathway culminates in the behavioral changes that characterize dual-process learning. A mathematical model is presented to formally express and extend understanding of the interactions between depression and facilitation. PMID:10489261
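    The competition the theory posits can be caricatured in a toy discrete-time model (an illustrative sketch only, not the mathematical model presented in the paper): each stimulus decrements a depression factor and increments a facilitation term that itself decays, and the response is their product.

```python
import numpy as np

def dual_process_response(n_trials=30, intensity=1.0,
                          dep_rate=0.15, fac_gain=0.5, fac_decay=0.3):
    """Toy dual-process model: response = depression * (1 + facilitation)."""
    depression, facilitation = 1.0, 0.0
    responses = []
    for _ in range(n_trials):
        responses.append(depression * (1.0 + facilitation))
        depression *= (1.0 - dep_rate)        # decremental process
        facilitation += fac_gain * intensity  # incremental process...
        facilitation *= (1.0 - fac_decay)     # ...which wanes over trials
    return np.array(responses)

weak = dual_process_response(intensity=0.2)    # habituation dominates
strong = dual_process_response(intensity=2.0)  # transient sensitization first
print(weak[:5].round(2), strong[:5].round(2))
```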

  13. Exact and conceptual repetition dissociate conceptual memory tests: problems for transfer appropriate processing theory.

    Science.gov (United States)

    McDermott, K B; Roediger, H L

    1996-03-01

    Three experiments examined whether a conceptual implicit memory test (specifically, category instance generation) would exhibit repetition effects similar to those found in free recall. The transfer appropriate processing account of dissociations among memory tests led us to predict that the tests would show parallel effects; this prediction was based upon the theory's assumption that conceptual tests will behave similarly as a function of various independent variables. In Experiment 1, conceptual repetition (i.e., following a target word [e.g., puzzles] with an associate [e.g., jigsaw]) did not enhance priming on the instance generation test relative to the condition of simply presenting the target word once, although this manipulation did affect free recall. In Experiment 2, conceptual repetition was achieved by following a picture with its corresponding word (or vice versa). In this case, there was an effect of conceptual repetition on free recall but no reliable effect on category instance generation or category cued recall. In addition, we obtained a picture superiority effect in free recall but not in category instance generation. In the third experiment, when the same study sequence was used as in Experiment 1, but with instructions that encouraged relational processing, priming on the category instance generation task was enhanced by conceptual repetition. Results demonstrate that conceptual memory tests can be dissociated and present problems for Roediger's (1990) transfer appropriate processing account of dissociations between explicit and implicit tests.

  14. Explaining of chronic pain management process in older people: A grounded theory Study

    Directory of Open Access Journals (Sweden)

    Shirazi Manouchehr

    2016-02-01

    Full Text Available Background and Objective: With regard to the multi-dimensional and complex nature of the chronic pain management process in the elderly, identifying its various aspects is essential for proper management of this type of pain. The current study aimed to explain the chronic pain management process in the elderly. Materials and Method: This study was conducted using a grounded theory approach in health care centers of Ahwaz in 2013-2014. The 62 participants comprised 30 elderly people, confirmed as free of cognitive disorders using the I.V.A.M.T.S, 3 of their relatives, and 29 health care providers. Data were collected through semi-structured interviews, observation and field notes, and analysed using Strauss and Corbin’s method. Results: Data analysis showed that “comprehensive support” is an important and facilitating factor in the process of chronic pain management in the elderly, consisting of four sub-categories: “being with family”, “team work”, “targeted treatment” and “social support”. Conclusion: Chronic pain management in the elderly will not be achieved without the help of effective supportive resources. Appropriate decision-making can be effective in identifying and gaining support from these sources for effective pain management.

  15. A Case Study: Dual-Process Theories of Higher Cognition-Commentary on Evans & Stanovich (2013).

    Science.gov (United States)

    Osman, Magda

    2013-05-01

    Dual-process theories of higher order cognition (DPTs) have been enjoying much success, particularly since Kahneman's 2002 Nobel prize address and recent book Thinking, Fast and Slow (2011). Historically, DPTs have attempted to provide a conceptual framework that helps classify and predict differences in patterns of behavior found under some circumstances and not others in a host of reasoning, judgment, and decision-making tasks. As evidence has changed and techniques for examining behavior have moved on, so too have DPTs. Killing two birds with one stone, Evans and Stanovich (2013, this issue) respond to five main criticisms of DPTs. Along with addressing each criticism in turn, they set out to clarify the essential defining characteristics that distinguish one form of higher order cognition from the other. The aim of this commentary is to consider the defining characteristics of Type 1 and Type 2 processing that have been proposed and to suggest that the evidence can be taken to support quantitative differences rather than qualitatively distinct processes. © The Author(s) 2013.

  16. Neural processing associated with cognitive and affective Theory of Mind in adolescents and adults.

    Science.gov (United States)

    Sebastian, Catherine L; Fontaine, Nathalie M G; Bird, Geoffrey; Blakemore, Sarah-Jayne; Brito, Stephane A De; McCrory, Eamon J P; Viding, Essi

    2012-01-01

    Theory of Mind (ToM) is the ability to attribute thoughts, intentions and beliefs to others. This involves component processes, including cognitive perspective taking (cognitive ToM) and understanding emotions (affective ToM). This study assessed the distinction and overlap of neural processes involved in these respective components, and also investigated their development between adolescence and adulthood. While data suggest that ToM develops between adolescence and adulthood, these populations have not been compared on cognitive and affective ToM domains. Using fMRI with 15 adolescent (aged 11-16 years) and 15 adult (aged 24-40 years) males, we assessed neural responses during cartoon vignettes requiring cognitive ToM, affective ToM or physical causality comprehension (control). An additional aim was to explore relationships between fMRI data and self-reported empathy. Both cognitive and affective ToM conditions were associated with neural responses in the classic ToM network across both groups, although only affective ToM recruited medial/ventromedial PFC (mPFC/vmPFC). Adolescents additionally activated vmPFC more than did adults during affective ToM. The specificity of the mPFC/vmPFC response during affective ToM supports evidence from lesion studies suggesting that vmPFC may integrate affective information during ToM. Furthermore, the differential neural response in vmPFC between adult and adolescent groups indicates developmental changes in affective ToM processing.

  17. Application of the Theory of Self-Organized Criticality to the Investigation of Historical Processes

    Directory of Open Access Journals (Sweden)

    Dmitry S. Zhukov

    2016-12-01

    Full Text Available The article demonstrates heuristic possibilities of the theory of self-organized criticality (SOC in the investigation of historical processes. Key SOC concepts and ideas are explained. Specifically, tools that can be used for identifying pink noise, an attribute of a critical state, are described. The results of spectral analyses of historical demographic data (i.e., birth and death rates in Russian settlements in the 19th and 20th centuries and historical market data (i.e., grain prices in regions of Russia in the 18th, 19th, and early 20th centuries are presented. It was found that noise color in the data series differed substantially across different periods. Based on these observations, the assumption that a change in noise color can serve as an indicator of changes in historical processes was made. In some cases, this indicator can enable one to establish the time, speed, and direction of state changes in historical processes. Pink noise was discovered in the examined birth and death rate dynamics, as well as in the dynamics of prices across periods. The described methods have the potential to be used beyond the limits of the presently considered historical subjects, including in investigations of different types of social transformation.
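    Pink noise is conventionally identified from the slope of the power spectrum on log-log axes: a spectral exponent β near 1 indicates pink noise, near 0 white noise, and near 2 brown noise. The sketch below estimates β with a Welch periodogram on synthetic series; it illustrates the technique only and uses no historical data.

```python
import numpy as np
from scipy.signal import welch

def spectral_exponent(x, fs=1.0):
    """Fit log-power vs log-frequency; pink noise gives beta ~ 1."""
    f, pxx = welch(x, fs=fs, nperseg=min(256, len(x)))
    mask = f > 0                     # drop the zero-frequency bin
    slope, _ = np.polyfit(np.log(f[mask]), np.log(pxx[mask]), 1)
    return -slope                    # beta in S(f) ~ 1 / f**beta

rng = np.random.default_rng(1)
white = rng.normal(size=4096)
brown = np.cumsum(white)             # integrated white noise
print(spectral_exponent(white))      # -> ~0 (white)
print(spectral_exponent(brown))      # -> ~2 (brown); pink would be ~1
```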

  18. Social anxiety disorder exhibit impaired networks involved in self and theory of mind processing.

    Science.gov (United States)

    Cui, Qian; Vanman, Eric J; Long, Zhiliang; Pang, Yajing; Chen, Yuyan; Wang, Yifeng; Duan, Xujun; Chen, Heng; Gong, Qiyong; Zhang, Wei; Chen, Huafu

    2017-08-01

    Most previous studies of social anxiety disorder (SAD) have focused on the role of emotional dysfunction, while impairments in self- and theory of mind (ToM)-processing have been relatively neglected. This study utilised functional connectivity density (FCD), resting-state functional connectivity (RSFC) and discriminant analyses to investigate impairments in self- and ToM-related networks in patients with SAD. Patients with SAD exhibited decreased long-range FCD in the right rostral anterior cingulate cortex (rACC) and decreased short-range FCD in the right superior temporal gyrus (STG), key nodes involved in self- and ToM-processing, respectively. Decreased RSFC of the right rACC and STG with widespread frontal, temporal, posteromedial, sensorimotor and somatosensory regions was also observed in patients with SAD. Altered RSFC between the right rACC and bilateral superior frontal gyrus, between the right rACC and right middle frontal gyrus, and within the right STG itself provided the greatest contribution to individual diagnoses of SAD, with an accuracy of 84.5%. These results suggest that a lack of cognitive inhibition of emotional self-referential processing, as well as impairments in social information integration, may play critical roles in the pathomechanism of SAD, and highlight the importance of recognising such features in the diagnosis and treatment of SAD. © The Author (2017). Published by Oxford University Press.
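    At its computational core, the resting-state functional connectivity used in such analyses is the Pearson correlation between regional time series. The sketch below uses random data in place of preprocessed BOLD signals; the regions and the induced coupling are invented for illustration.

```python
import numpy as np

def functional_connectivity(ts):
    """RSFC matrix: Pearson correlation between regional time series.
    ts has shape (n_regions, n_timepoints)."""
    return np.corrcoef(ts)

rng = np.random.default_rng(0)
n_regions, n_timepoints = 4, 200   # e.g. rACC, STG, mPFC, ... (hypothetical)
ts = rng.normal(size=(n_regions, n_timepoints))
ts[1] += 0.8 * ts[0]               # induce coupling between regions 0 and 1
fc = functional_connectivity(ts)
print(fc.round(2))                 # off-diagonal entries are the RSFC values
```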

  19. Explaining of chronic pain management process in older people: A grounded theory Study

    Directory of Open Access Journals (Sweden)

    Manouchehr Shirazi

    2016-02-01

    Full Text Available Background: With regard to the multi-dimensional and complex nature of the chronic pain management process in the elderly, identifying its various aspects is essential for proper management of this type of pain. The current study aimed to explain the chronic pain management process in the elderly. Methods: This study was conducted using a grounded theory approach in health care centers of Ahwaz in 2013-2014. The 62 participants comprised 30 elderly people, confirmed as free of cognitive disorders using the I.V.A.M.T.S, 3 of their relatives, and 29 health care providers. Data were collected through semi-structured interviews, observation and field notes, and analysed using Strauss and Corbin’s method. Results: Data analysis showed that “comprehensive support” is an important and facilitating factor in the process of chronic pain management in the elderly, consisting of four sub-categories: “being with family”, “team work”, “targeted treatment” and “social support”. Conclusion: Chronic pain management in the elderly will not be achieved without the help of effective supportive resources. Appropriate decision-making can be effective in identifying and gaining support from these sources for effective pain management.

  20. Implementation of the SMART MOVE intervention in primary care: a qualitative study using normalisation process theory.

    Science.gov (United States)

    Glynn, Liam G; Glynn, Fergus; Casey, Monica; Wilkinson, Louise Gaffney; Hayes, Patrick S; Heaney, David; Murphy, Andrew W M

    2018-05-02

    Problematic translational gaps continue to exist between demonstrating the positive impact of healthcare interventions in research settings and their implementation into routine daily practice. The aim of this qualitative evaluation of the SMART MOVE trial was to conduct a theoretically informed analysis, using normalisation process theory, of the potential barriers and levers to the implementation of an mHealth intervention to promote physical activity in primary care. The study took place in the West of Ireland, with recruitment in the community from the Clare Primary Care Network. SMART MOVE trial participants and the staff from four primary care centres were invited to take part and all agreed to do so. A qualitative methodology was employed, combining focus groups (general practitioners, practice nurses and non-clinical staff from four separate primary care centres, n = 14) and individual semi-structured interviews (intervention and control SMART MOVE trial participants, n = 4), with purposeful sampling and analysis following the principles of Framework Analysis. Normalisation Process Theory was used to develop the topic guide for the interviews and also informed the data analysis process. Four themes emerged from the analysis: personal and professional exercise strategies; roles and responsibilities to support active engagement; utilisation challenges; and evaluation, adoption and adherence. It was evident that introducing a new healthcare intervention demands a comprehensive evaluation of the intervention itself and of the environment in which it is to operate. Despite certain obstacles, the opportunity exists for the successful implementation of a novel healthcare intervention that addresses a hitherto unresolved healthcare need, provided that the intervention has strong usability attributes for both disseminators and target users and coheres strongly with the core objectives and culture of the health care environment in which it is to operate. We