WorldWideScience

Sample records for analysts predict further

  1. Analyst-to-Analyst Variability in Simulation-Based Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]; Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2017-02-01

    This report describes findings from the culminating experiment of the LDRD project entitled, "Analyst-to-Analyst Variability in Simulation-Based Prediction". For this experiment, volunteer participants solving a given test problem in engineering and statistics were interviewed at different points in their solution process. These interviews are used to trace differing solutions to differing solution processes, and differing processes to differences in reasoning, assumptions, and judgments. The issue that the experiment was designed to illuminate -- our paucity of understanding of the ways in which humans themselves have an impact on predictions derived from complex computational simulations -- is a challenging and open one. Although solution of the test problem by analyst participants in this experiment has taken much more time than originally anticipated, and is continuing past the end of this LDRD, this project has provided a rare opportunity to explore analyst-to-analyst variability in significant depth, from which we derive evidence-based insights to guide further explorations in this important area.

  2. Macroeconomic predictions – Three essays on analysts' forecast quality

    OpenAIRE

    Orbe, Sebastian

    2013-01-01

    Macroeconomic expectation data are of great interest to different agents due to their importance as central input factors in various applications. To name but a few, politicians, capital market participants, as well as academics, incorporate these forecast data into their decision processes. Consequently, a sound understanding of the quality properties of macroeconomic forecast data, their quality determinants, as well as potential ways to improve macroeconomic predictions is desirable. ...

  3. Analysts' forecast error: A robust prediction model and its short term trading

    NARCIS (Netherlands)

    Boudt, Kris; de Goeij, Peter; Thewissen, James; Van Campenhout, Geert

    We examine the profitability of implementing a short term trading strategy based on predicting the error in analysts' earnings per share forecasts using publicly available information. Since large earnings surprises may lead to extreme values in the forecast error series that disrupt their smooth

  4. Essays on financial analysts' forecasts

    OpenAIRE

    Rodriguez, Marius del Giudice

    2006-01-01

    This dissertation contains three self-contained chapters dealing with specific aspects of financial analysts' earnings forecasts. After recent accounting scandals, much attention has turned to the incentives present in the career of professional financial analysts. The literature points to several reasons why financial analysts behave overoptimistically when providing their predictions. In particular, analysts may wish to maintain good relations with firm management, to please the underwriter...

  5. Applied predictive analytics principles and techniques for the professional data analyst

    CERN Document Server

    Abbott, Dean

    2014-01-01

    Learn the art and science of predictive analytics - techniques that get results Predictive analytics is what translates big data into meaningful, usable business information. Written by a leading expert in the field, this guide examines the science of the underlying algorithms as well as the principles and best practices that govern the art of predictive analytics. It clearly explains the theory behind predictive analytics, teaches the methods, principles, and techniques for conducting predictive analytics projects, and offers tips and tricks that are essential for successful predictive mode

  6. Learning about Analysts

    DEFF Research Database (Denmark)

    Rudiger, Jesper; Vigier, Adrien

    2017-01-01

    We examine an analyst with career concerns making cheap talk recommendations to a sequence of traders, each of whom possesses noisy private information concerning the analyst's ability. Each period, the reputation of the analyst is updated based on the recommendation and price developments... An endogeneity problem thus arises, creating opportunities for the bad analyst to manipulate the market. We show that if, by a streak of good luck, the bad analyst builds up her reputation she can then momentarily hide her type. However, the capability of the bad analyst to manipulate the market has limits...
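    The reputation-updating mechanism described in this abstract can be illustrated with a minimal Bayesian sketch. The prior, the signal accuracies, and the function name below are illustrative assumptions, not taken from the paper; it only shows how a lucky streak inflates a low-ability analyst's reputation:

    ```python
    # Illustrative Bayesian reputation update for an analyst of unknown ability.
    # A "good" analyst's recommendation matches the subsequent price move with
    # probability p_good; a "bad" analyst's with probability p_bad. Traders
    # update their belief that the analyst is good after each observation.

    def update_reputation(prior_good, correct, p_good=0.8, p_bad=0.5):
        """Return posterior P(good analyst) after one recommendation outcome."""
        like_good = p_good if correct else 1.0 - p_good
        like_bad = p_bad if correct else 1.0 - p_bad
        evidence = like_good * prior_good + like_bad * (1.0 - prior_good)
        return like_good * prior_good / evidence

    # A streak of lucky correct calls inflates even a bad analyst's reputation:
    rep = 0.5
    for _ in range(5):
        rep = update_reputation(rep, correct=True)
    print(round(rep, 3))  # → 0.913
    ```

    Starting from an even prior, five consecutive correct calls push the market's belief above 0.9, which is the window in which the paper's bad analyst can "hide her type."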

  7. Gender issues of financial analysts

    OpenAIRE

    Jingwen Ge

    2013-01-01

    Increased attention has been drawn to the gender disparity in the workplace. This dissertation is dedicated to providing insight into gender issues among financial analysts. Thorough literature reviews are conducted on gender issues and on financial analysts, respectively, in order to understand the existing gender concerns in the business world and the role and functions of financial analysts. Research proposals are described to answer the following question: whether women financial analysts are more lik...

  8. Desire and the female analyst.

    Science.gov (United States)

    Schaverien, J

    1996-04-01

    The literature on erotic transference and countertransference between female analyst and male patient is reviewed and discussed. It is known that female analysts are less likely than their male colleagues to act out sexually with their patients. It has been claimed that a) male patients do not experience sustained erotic transferences, and b) female analysts do not experience erotic countertransferences with female or male patients. These views are challenged and it is argued that, if there is less sexual acting out by female analysts, it is not because of an absence of eros in the therapeutic relationship. The literature review covers material drawn from psychoanalysis, feminist psychotherapy, Jungian analysis, as well as some sociological and cultural sources. It is organized under the following headings: the gender of the analyst, sexual acting out, erotic transference, maternal and paternal transference, gender and power, countertransference, incest taboo--mothers and sons and sexual themes in the transference.

  9. Information Management Analyst | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Job Summary The Information Management Analyst is the technical resource person ... Performs systems configuration, testing and quality assurance for all IRO ... IMTD employees, identifying root cause of system issues and solving them or ...

  10. AN ANALYST'S UNCERTAINTY AND FEAR.

    Science.gov (United States)

    Chused, Judith Fingert

    2016-10-01

    The motivations for choosing psychoanalysis as a profession are many and differ depending on the psychology of the analyst. However, common to most psychoanalysts is the desire to forge a helpful relationship with the individuals with whom they work therapeutically. This article presents an example of what happens when an analyst is confronted by a patient for whom being in a relationship and being helped are intolerable. © 2016 The Psychoanalytic Quarterly, Inc.

  11. Training for spacecraft technical analysts

    Science.gov (United States)

    Ayres, Thomas J.; Bryant, Larry

    1989-01-01

    Deep space missions such as Voyager rely upon a large team of expert analysts who monitor activity in the various engineering subsystems of the spacecraft and plan operations. Senior team members generally come from the spacecraft designers, and new analysts receive on-the-job training. Neither of these methods will suffice for the creation of a new team in the middle of a mission, which may be the situation during the Magellan mission. New approaches are recommended, including electronic documentation, explicit cognitive modeling, and coached practice with archived data.

  12. Analyst workbenches state of the art report

    CERN Document Server

    Rock-Evans, R

    1987-01-01

    Analyst Workbenches examines various aspects of analyst workbenches and the tasks and data that they should support. The major advances and state of the art in analyst workbenches are discussed. A comprehensive list of the available analyst workbenches, both the experimental and the commercial products, is provided. Comprised of three parts, this book begins by describing International Computers Ltd's approach to automating analysis and design. It then explains what business analysis really means, outlines the principal features of analyst workbenches, and considers the ways in which they can

  13. Big Data, Data Analyst, and Improving the Competence of Librarian

    Directory of Open Access Journals (Sweden)

    Albertus Pramukti Narendra

    2016-01-01

    Full Text Available The issue of Big Data was already raised by Fremont Rider, an American librarian from Wesleyan University, in 1944. He predicted that the volume of American university collections would reach 200 million copies in 2040. As a result, this brings to the fore multiple issues such as big data users, storage capacity, and the need for data analysts. In Indonesia, the data analyst is still a rare profession, and therefore urgently needed. One of its distinctive tasks is to conduct visual analyses of various data resources and to present the results visually as interesting knowledge. It becomes a science enlivened by interactive visualization. In response to the issue, librarians have already been equipped with basic information management. They can see the opportunity and improve themselves as data analysts. In developed countries, it is common that librarians are also regarded as data analysts. They enhance themselves with the various skills required, such as cloud computing and smart computing. In the end, librarians with data analyst competency are well placed to extract and present complex data resources as interesting and discernible knowledge.

  15. Do Investors Learn About Analyst Accuracy?

    OpenAIRE

    Chang, Charles; Daouk, Hazem; Wang, Albert

    2008-01-01

    We study the impact of analyst forecasts on prices to determine whether investors learn about analyst accuracy. Our test market is the crude oil futures market. Prices rise (fall) when analysts forecast a decrease (increase) in crude supplies. In the 15 minutes following supply realizations, prices rise (fall) when forecasts have been too high (low). In both the initial price action relative to forecasts and in the subsequent reaction relative to realized forecast errors, the price response is stron...

  16. Financial Analyst | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    You will assist the Chief and the Senior Financial Analyst in the discharge of their responsibilities, and back up the Senior Financial Analyst, Treasury ... in the FAD Annual Work Plan), at the request of the Director, and ensures deliverables are met within assigned time lines as set by the FAD Manager leading the initiative;

  17. Setting analyst: A practical harvest planning technique

    Science.gov (United States)

    Olivier R.M. Halleux; W. Dale Greene

    2001-01-01

    Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...

  18. Analyst, Policy and Strategy | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Working under the supervision of the Director, and with guidance from the Senior Analyst, the Analyst provides research, analysis, and advice on matters of policy and strategy for the Centre and its Board. He or she contributes to strategic and operational planning, corporate reporting, trend monitoring, and engagement with ...

  19. Policy Analyst | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... reviews the plans produced to ensure that they are of the highest possible quality. ... The Policy Analyst plays a key role in information management. ... discussion and decision-making; prepares guidelines on issues relating to processes, ...

  20. Financial Analyst II - External Funds Management | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The Financial Analyst position is essential for the administration and smooth ... Prepare monthly journal voucher to record the donor partnership revenue in ... of the business requirements for the development and enhancement of financial ...

  1. Senior Systems Analyst | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... Systems Analyst will play a critical role as part of the Information Technology ... data analysis, and system design); the delivery of professional IM/IT advisory and ... with the current development activities ensuring resolution of those issues.

  2. Improving the information environment for analysts

    DEFF Research Database (Denmark)

    Farooq, Omar; Nielsen, Christian

    2014-01-01

    they have more information. Our results also show that intellectual capital disclosures related to employees and strategic statements are the most important disclosures for analysts. Research limitations/implications: More relevant methods, such as surveys or interviews with management, may be used to improve... the information content of intellectual capital disclosure. Analysts probably deduce the intellectual capital of a firm from interaction with management rather than from financial statements. Practical implications: Firms in the biotechnology sector can improve their information environment by disclosing more information...

  3. Key Performance Indicators and Analysts' Earnings Forecast Accuracy: An Application of Content Analysis

    OpenAIRE

    Alireza Dorestani; Zabihollah Rezaee

    2011-01-01

    We examine the association between the extent of change in key performance indicator (KPI) disclosures and the accuracy of forecasts made by analysts. KPIs are regarded as improving both the transparency and relevancy of public financial information. The results of using linear regression models show that contrary to our prediction and the hypothesis of this paper, there is no significant association between the change in non-financial KPI disclosures and the accuracy of analysts' forecasts....
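    The kind of test this abstract describes, regressing forecast accuracy on the change in KPI disclosure, can be sketched as a simple OLS fit. The variable names and synthetic data below are illustrative assumptions; the paper's actual specification and control variables are not reproduced here. A true slope of zero mirrors the reported finding of no significant association:

    ```python
    import numpy as np

    # Illustrative OLS: regress analyst forecast error on the change in
    # non-financial KPI disclosure. Synthetic data with a zero true slope,
    # echoing the paper's null result.
    rng = np.random.default_rng(0)
    n = 200
    kpi_change = rng.normal(size=n)                    # change in KPI disclosure score
    forecast_error = rng.normal(scale=0.5, size=n)     # error unrelated to KPI change

    X = np.column_stack([np.ones(n), kpi_change])      # intercept + regressor
    beta, *_ = np.linalg.lstsq(X, forecast_error, rcond=None)
    print(f"intercept={beta[0]:.3f}, slope={beta[1]:.3f}")
    ```

    With the simulated data, the estimated slope hovers near zero, which is what "no significant association" looks like in such a regression.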

  4. Do analysts disclose cash flow forecasts with earnings estimates when earnings quality is low?

    OpenAIRE

    Bilinski, P.

    2014-01-01

    Cash flows are incrementally useful to earnings in security valuation mainly when earnings quality is low. This suggests that when earnings quality decreases, analysts will be more likely to supplement their earnings forecasts with cash flow estimates. Contrary to this prediction, we find that analysts do not disclose cash flow forecasts when the quality of earnings is low. This is because cash flow forecast accuracy depends on the accuracy of the accrual estimates and the precision of accrua...

  5. Are security analysts rational? a literature review

    OpenAIRE

    Peixinho, Rúben; Coelho, Luís; Taffler, Richard J.

    2005-01-01

    Rational choice theory and bounded rationality constitute the basis for the discussion of human rationality in several areas. In finance, this discussion has taken place between the traditional finance and behavioural finance approaches, which have different perspectives concerning market agents' rationality. This paper reviews several studies addressing rationality among security analysts. The analysis shows that analysts' systematic optimism seems to be inconsistent with rationality....

  6. Paired analyst recommendations and internet IPOs

    NARCIS (Netherlands)

    van der Goot, T.; van Giersbergen, N.

    2008-01-01

    The paper investigates analyst recommendations for internet firms that went public during 1997-2000. Our contribution to the literature is that we match recommendations for the same firm issued by different investment banks that have published the recommendations in an interval around the same date.

  7. The analyst's participation in the analytic process.

    Science.gov (United States)

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  8. Ground Truth Annotation in T Analyst

    DEFF Research Database (Denmark)

    2015-01-01

    This video shows how to annotate the ground truth tracks in the thermal videos. The ground truth tracks are produced to be able to compare them to tracks obtained from a Computer Vision tracking approach. The program used for annotation is T-Analyst, which is developed by Aliaksei Laureshyn, Ph...

  9. Through the Eyes of Analysts: A Content Analysis of Analyst Report Narratives

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2004-01-01

    This paper contributes to the ongoing debate on developing corporate reporting practices by analyzing the information content of fundamental analyst reports and comparing this with annual reporting practices. As there has been much critique of the lacking relevance of disclosures through corporate... information as relevant to their analyses and recommendations. The paper shows that background information about the company, i.e. about products, markets and the industry, along with the analysts' own analysis of financial and operating data, account for nearly 55% of the total disclosure in fundamental... analyst reports, and the amount of financial data supplied is not related to the other disclosures in the reports. In comparison to business reporting practices, the fundamental analyst reports put relatively less weight on social and sustainability information, intellectual capital and corporate...

  10. A content analysis of analyst research: health care through the eyes of analysts.

    Science.gov (United States)

    Nielsen, Christian

    2008-01-01

    This article contributes to the understanding of how health care companies may communicate their business models by studying financial analysts' reports. The study examines the differences between the information conveyed in recurrent and fundamental analyst reports, as well as whether the characteristics of the analysts and their environment affect their business model analyses. A medium-sized health care company in the medical-technology sector, internationally renowned for its state-of-the-art business reporting, was chosen as the basis for the study. An analysis of 111 fundamental and recurrent analyst reports on this company by each investment bank actively following it was conducted using a content analysis methodology. The study reveals that the recurrent analyses are concerned with evaluating the information disclosed by the health care company itself and not so much with digging up new information. It also indicates that while maintenance work might be focused on evaluating specific details, fundamental research is more concerned with extending the understanding of the general picture, i.e., the sustainability and performance of the overall business model. The amount of financial information disclosed in either type of report is not correlated with the other disclosures in the reports. In comparison to business reporting practices, the fundamental analyst reports put considerably less weight on social and sustainability, intellectual capital and corporate governance information, and they disclose much less comparable non-financial information. The suggestion made is that looking at the types of information financial analysts consider important and convey to their "customers," the investors and fund managers, constitutes a valuable indication to health care companies regarding the needs of the financial market users of their reports and other communications. There are some limitations to the possibility of applying statistical tests to the data-set as
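    The content analysis methodology mentioned in this abstract can be sketched as a category-frequency count over report text. The categories, keywords, and sample sentences below are illustrative assumptions, not the study's actual coding scheme:

    ```python
    # Minimal content-analysis sketch: count the sentences of an analyst
    # report that mention each disclosure category. Categories and the
    # sample text are hypothetical, for illustration only.
    CATEGORIES = {
        "financial": ["revenue", "earnings", "margin"],
        "intellectual capital": ["patent", "know-how", "r&d"],
        "sustainability": ["environment", "emission", "social"],
    }

    def categorize(report_text):
        """Count sentences mentioning at least one keyword per category."""
        counts = {cat: 0 for cat in CATEGORIES}
        for sentence in report_text.lower().split("."):
            for cat, keywords in CATEGORIES.items():
                if any(kw in sentence for kw in keywords):
                    counts[cat] += 1
        return counts

    report = ("Revenue grew 8% on stronger margins. "
              "The patent portfolio supports long-term R&D. "
              "No emission targets were disclosed.")
    print(categorize(report))
    ```

    Tallies like these, aggregated across the 111 reports, are the kind of raw material from which relative disclosure weights (e.g. the finding that financial analysis dominates governance and sustainability content) could be derived.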

  11. The relevance of security analyst opinions for investment decisions

    NARCIS (Netherlands)

    Gerritsen, D.F.

    2014-01-01

    Security analysts analyze information regarding publicly traded companies after which they publish their opinion regarding these companies’ stocks. In this dissertation the published opinions of two different types of analysts are analyzed. Technical analysts derive a recommendation to buy, hold, or

  12. Investment Banking and Analyst Objectivity: Evidence from Forecasts and Recommendations of Analysts Affiliated with M&A Advisors

    OpenAIRE

    Kolasinski, Adam; Kothari, S.P.

    2004-01-01

    Previous research finds some evidence that analysts affiliated with equity underwriters issue more optimistic earnings growth forecasts and optimistic recommendations of client stock than unaffiliated analysts. Unfortunately, these studies are unable to discriminate between three competing hypotheses for the apparent optimism. Under the bribery hypothesis, underwriting clients, with the promise of underwriting fees, coax analysts to compromise their objectivity. The execution-related conflict...

  13. Examination of suspicious objects by virus analysts

    Science.gov (United States)

    Ananin, E. V.; Ananina, I. S.; Nikishova, A. V.

    2018-05-01

    The paper presents data on the urgency of virus threats. For antiviruses to work properly, data on new virus implementations must be added to their databases, which requires investigating every suspicious object. This is a dangerous process and should be done in a virtual system; even so, it is not fully secure for the main system. A diagram of a secure workplace for a virus analyst is therefore proposed, containing software for its protection. Settings to ensure the security of the process of investigating suspicious objects are also proposed. The proposed approach allows minimizing the risks caused by the virus.

  14. Implementation of the INEEL safety analyst training standard

    International Nuclear Information System (INIS)

    Hochhalter, E. E.

    2000-01-01

    The Idaho Nuclear Technology and Engineering Center (INTEC) safety analysis units at the Idaho National Engineering and Environmental Laboratory (INEEL) are in the process of implementing the recently issued INEEL Safety Analyst Training Standard (STD-1107). Safety analyst training and qualifications are integral to the development and maintenance of core safety analysis capabilities. The INEEL Safety Analyst Training Standard (STD-1107) was developed directly from the EFCOG Training Subgroup's draft safety analyst training plan template, but has been adapted to the needs and requirements of the INEEL safety analysis community. The implementation of this Safety Analyst Training Standard is part of the Integrated Safety Management System (ISMS) Phase II Implementation currently underway at the INEEL. The objective of this paper is to discuss (1) the INEEL Safety Analyst Training Standard, (2) the development of the safety analyst individual training plans, (3) the implementation issues encountered during this initial phase of implementation, (4) the solutions developed, and (5) the implementation activities remaining to be completed.

  15. Is customer satisfaction a relevant metric for financial analysts?

    OpenAIRE

    Ngobo , Paul-Valentin; Casta , Jean-François; Ramond , Olivier ,

    2012-01-01

    This study examines the effects of customer satisfaction on analysts' earnings forecast errors. Based on a sample of analysts following companies measured by the American Customer Satisfaction Index (ACSI), we find that customer satisfaction reduces earnings forecast errors. However, analysts respond to changes in customer satisfaction but not to the ACSI metric per se. Furthermore, the effects of customer satisfaction are asymmetric; fo...

  16. The Role of Analyst Conference Calls in Capital Markets

    NARCIS (Netherlands)

    E.M. Roelofsen (Erik)

    2010-01-01

    textabstractMany firms conduct a conference call with analysts shortly after the quarterly earnings announcement. In these calls, management discusses the completed quarter, and analysts can ask questions. Due to SEC requirements, conference calls in the United States are virtually always live

  17. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    Science.gov (United States)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics applied to raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, which are examined towards matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  18. Procurement and execution of PCB analyses: Customer-analyst interactions

    International Nuclear Information System (INIS)

    Erickson, M.D.

    1993-01-01

    The practical application of PCB (polychlorinated biphenyl) analyses begins with a request for the analysis and concludes with provision of the requested analysis. The key to successful execution of this iteration is timely, professional communication between the requester and the analyst. Often PCB analyses are not satisfactorily executed, either because the requester failed to give adequate instructions or because the analyst simply "did what he/she was told." The request for and conduct of a PCB analysis represents a contract for the procurement of a product (information about the sample); if both parties recognize and abide by this contractual relationship, the process generally proceeds smoothly. Requesters may be corporate purchasing agents working from a scope of work, a sample management office, a field team leader, a project manager, a physician's office, or the analyst himself. The analyst with whom the requester communicates may be a laboratory supervisor, a sample-receiving department, a salesperson for the laboratory, or the analyst himself. The analyst conducting the analysis is often a team, with custody of the sample being passed from sample receiving to the extraction laboratory, to the cleanup laboratory, to the gas chromatography (GC) laboratory, to the data reduction person, to the package preparation person, to the quality control (QC) department for verification, to shipping. Where a team of analysts is involved, the requester needs a central point of contact to minimize confusion and frustration. For the requester-analyst interface to work smoothly, it must function as if it is a one-to-one interaction. This article addresses the pitfalls of the requester-analyst interaction and provides suggestions for improving the quality of the analytical product through the requester-analyst interface.

  19. "This strange disease": adolescent transference and the analyst's sexual orientation.

    Science.gov (United States)

    Burton, John K; Gilmore, Karen

    2010-08-01

    The treatment of adolescents by gay analysts is uncharted territory regarding the impact of the analyst's sexuality on the analytic process. Since a core challenge of adolescence involves the integration of the adult sexual body, gender role, and reproductive capacities into evolving identity, and since adolescents seek objects in their environment to facilitate both identity formation and the establishment of autonomy from primary objects, the analyst's sexual orientation is arguably a potent influence on the outcome of adolescent development. However, because sexual orientation is a less visible characteristic of the analyst than gender, race, or age, for example, the line between reality and fantasy is less clearly demarcated. This brings up special considerations regarding discovery and disclosure in the treatment. To explore these issues, the case of a late adolescent girl in treatment with a gay male analyst is presented. In this treatment, the question of the analyst's sexual orientation, and the demand by the patient for the analyst's self-disclosure, became a transference nucleus around which the patient's individual dynamics and adolescent dilemmas could be explored and clarified.

  20. Residues in the analyst of the patient's symbiotic connection at a somatic level: unrepresented states in the patient and analyst.

    Science.gov (United States)

    Godsil, Geraldine

    2018-02-01

    This paper discusses the residues of a somatic countertransference that revealed its meaning several years after apparently successful analytic work had ended. Psychoanalytic and Jungian analytic ideas on primitive communication, dissociation and enactment are explored in the working through of a shared respiratory symptom between patient and analyst. Growth in the analyst was necessary so that the patient's communication at a somatic level could be understood. Bleger's concept that both the patient's and analyst's body are part of the setting was central in the working through. © 2018, The Society of Analytical Psychology.

  1. Development of a Nevada Statewide Database for Safety Analyst Software

    Science.gov (United States)

    2017-02-02

    Safety Analyst is a software package developed by the Federal Highway Administration (FHWA) and twenty-seven participating state and local agencies including the Nevada Department of Transportation (NDOT). The software package implemented many of the...

  2. The analyst's body as tuning fork: embodied resonance in countertransference.

    Science.gov (United States)

    Stone, Martin

    2006-02-01

    This paper examines the phenomenon of embodied countertransference: where the analyst experiences a somatic reaction rather than the more common countertransference responses of thoughts, feelings, images, fantasies and dreams. Discussion of clinical material considers neurotic and syntonic aspects. The analogy is made of resonance with a tuning fork. Several questions are posed: Why does countertransference resonate in the bodies of some analysts but not all? Why do those analysts who are sensitive to this experience it with some patients but not with others? And what are the conditions which are conducive to producing somatic responses? The paper proposes that somatic reactions are more likely to occur when a number of conditions come together: when working with patients exhibiting borderline, psychotic or severe narcissistic elements; where there has been early severe childhood trauma; and where there is fear of expressing strong emotions directly. In addition, a further theoretical factor is proposed, namely the typology of the analyst.

  3. Self-disclosure, trauma and the pressures on the analyst.

    Science.gov (United States)

    West, Marcus

    2017-09-01

    This paper argues that self-disclosure is intimately related to traumatic experience and the pressures on the analyst not to re-traumatize the patient or repeat traumatic dynamics. The paper gives a number of examples of such pressures and outlines the difficulties the analyst may experience in adopting an analytic attitude - attempting to stay as closely as possible with what the patient brings. It suggests that self-disclosure may be used to try to disconfirm the patient's negative sense of themselves or the analyst, or to try to induce a positive sense of self or of the analyst which, whilst well-meaning, may be missing the point and may be prolonging the patient's distress. Examples are given of staying with the co-construction of the traumatic early relational dynamics and thus working through the traumatic complex; this attitude is compared and contrasted with some relational psychoanalytic attitudes. © 2017, The Society of Analytical Psychology.

  4. COLAB: A Laboratory Environment for Studying Analyst Sensemaking and Collaboration

    National Research Council Canada - National Science Library

    Morrison, Clayton T; Cohen, Paul R

    2005-01-01

    COLAB is a laboratory for studying tools that facilitate collaboration and sensemaking among groups of human analysts as they build interpretations of unfolding situations based on accruing intelligence data...

  5. Senior Financial Analyst – External Funds Management | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Job Summary The Senior Financial Analyst, External Funds Management is responsible ... in accordance with the donor agreements and accounting principles. ... Assist the Manager in the development of the financial accounting structure for ...

  6. Do analysts anticipate and react to bankruptcy? Evidence

    OpenAIRE

    Coelho, Luís; Peixinho, Rúben

    2005-01-01

    Finance literature suggests that financial analysts are sophisticated agents that act as facilitators of market efficiency by releasing relevant information to the market. This paper uses a sample of four major US bankruptcies to explore if analysts are able to disclose information to the market that provides investors with material information for their investment decisions. In particular, we use a qualitative approach to analyse analysts’ reports in order to verify if these agents are ab...

  7. COMPREHENSIVE APPROACH OVER THE PROFESSIONAL JUDGMENT OF THE FINANCIAL ANALYST

    Directory of Open Access Journals (Sweden)

    Viorica Mirela ŞTEFAN-DUICU

    2016-06-01

    Professional judgment is emblematic at the decisional level. This paper aims to highlight the valences of the professional judgment of the financial analyst by describing the components of the analyst's activity and by highlighting the typologies of the mechanisms involved. Within this paper we present the types of financial analysts, the responsibilities that guide their professional judgment, and the interdependent elements of their activity.

  8. Entry Level Systems Analysts: What Does the Industry Want?

    Directory of Open Access Journals (Sweden)

    Donna M. Grant

    2016-06-01

    This study investigates the skill sets necessary for entry level systems analysts. Towards this end, the study combines two sources of data, namely, a content analysis of 200 systems analysts’ online job advertisements and a survey of 20 senior Information Systems (IS) professionals. Based on Chi-square tests, the results reveal that most employers prefer entry level systems analysts with an undergraduate Computer Science degree. Furthermore, most of the employers prefer entry level systems analysts to have some years of experience as well as industry certifications. The results also reveal that there is a higher preference for entry level systems analysts who have non-technical and people skills (e.g., problem solving and oral communication). The empirical results from this study will inform IS educators as they develop future systems analysts. Additionally, the results will be useful to the aspiring systems analysts who need to make sure that they have the necessary job skills before graduating and entering the labor market.

  9. Human Functions, Machine Tools, and the Role of the Analyst

    Directory of Open Access Journals (Sweden)

    Gordon R. Middleton

    2015-09-01

    In an era of rapidly increasing technical capability, the intelligence focus is often on the modes of collection and tools of analysis rather than the analysts themselves. Data are proliferating and so are tools to help analysts deal with the flood of data and the increasingly demanding timeline for intelligence production, but the role of the analyst in such a data-driven environment needs to be understood in order to support key management decisions (e.g., training and investment priorities). This paper describes a model of the analytic process, and analyzes the roles played by humans and machine tools in each process element. It concludes that human analytic functions are as critical in the intelligence process as they have ever been, and perhaps even more so due to the advance of technology in the intelligence business. Human functions performed by analysts are critical in nearly every step in the process, particularly at the front end of the analytic process, in defining and refining the problem statement, and at the end of the process, in generating knowledge, presenting the story in understandable terms, and tailoring the presentation of the results of the analysis to various audiences, as well as in determining when to initiate iterative loops in the process. The paper concludes with observations on the necessity of enabling expert analysts, tools to deal with big data, developing analysts with advanced analytic methods as well as with techniques for optimal use of advanced tools, and suggestions for further quantitative research.

  10. The reality of the other: dreaming of the analyst.

    Science.gov (United States)

    Ferruta, Anna

    2009-02-01

    The author discusses the obstacles to symbolization encountered when the analyst appears in the first dream of an analysis: the reality of the other is represented through the seeming recognition of the person of the analyst, who is portrayed in undisguised form. The interpretation of this first dream gives rise to reflections on the meaning of the other's reality in analysis: precisely this realistic representation indicates that the function of the other in the construction of the psychic world has been abolished. An analogous phenomenon is observed in the countertransference, as the analyst's mental processes are occluded by an exclusively self-generated interpretation of the patient's psychic world. For the analyst too, the reality of the other proves not to play a significant part in the construction of her interpretation. A 'turning-point' dream after five years bears witness to the power of the transforming function performed by the other throughout the analysis, by way of the representation of characters who stand for the necessary presence of a third party in the construction of a personal psychic reality. The author examines the mutual denial of the other's otherness, as expressed by the vicissitudes of the transference and countertransference between analyst and patient, otherness being experienced as a disturbance of self-sufficient narcissistic functioning. The paper ends with an analysis of the transformations that took place in the analytic relationship.

  11. Instruction in Information Structuring Improves Bayesian Judgment in Intelligence Analysts

    Directory of Open Access Journals (Sweden)

    David R. Mandel

    2015-04-01

    An experiment was conducted to test the effectiveness of brief instruction in information structuring (i.e., representing and integrating information) for improving the coherence of probability judgments and binary choices among intelligence analysts. Forty-three analysts were presented with comparable sets of Bayesian judgment problems before and immediately after instruction. After instruction, analysts’ probability judgments were more coherent (i.e., more additive and compliant with Bayes' theorem). Instruction also improved the coherence of binary choices regarding category membership: after instruction, subjects were more likely to invariably choose the category to which they assigned the higher probability of a target’s membership. The research provides a rare example of evidence-based validation of effectiveness in instruction to improve the statistical assessment skills of intelligence analysts. Such instruction could also be used to improve the assessment quality of other types of experts who are required to integrate statistical information or make probabilistic assessments.
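
    The two coherence criteria the study measures (additivity and compliance with Bayes' theorem) are easy to make concrete. The sketch below is illustrative only; the function names and the numbers are invented, not drawn from the study's materials.

```python
def bayes_posterior(prior, p_e_given_h, p_e_given_not_h):
    """P(H|E) from Bayes' theorem for a binary hypothesis H."""
    evidence = prior * p_e_given_h + (1 - prior) * p_e_given_not_h
    return prior * p_e_given_h / evidence

def is_additive(probs, tol=1e-9):
    """Additivity criterion: judged probabilities over mutually
    exclusive, exhaustive hypotheses must sum to 1."""
    return abs(sum(probs) - 1.0) <= tol

# Hypothetical judgment problem: prior 0.3, moderately diagnostic evidence.
posterior = bayes_posterior(0.3, 0.8, 0.2)
# A coherent judge reports (posterior, 1 - posterior); a pair such as
# (0.7, 0.4) violates additivity, the pattern the instruction targeted.
coherent = is_additive([posterior, 1 - posterior])
```

    A judge who assigns 0.7 to a hypothesis and 0.4 to its complement fails the additivity check even before any comparison with the Bayesian answer.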

  12. Gender heterogeneity in the sell-side analyst recommendation issuing process

    NARCIS (Netherlands)

    Bosquet, K.; de Goeij, P.C.; Smedts, K.

    Using analyst stock recommendations issued between January 1996 and December 2006 we show that the odds that female financial analysts issue optimistic investment advice are 40% lower than for male analysts. Although 17% of our sample of analysts is female, 48% is employed by a top financial

  13. When the analyst is ill: dimensions of self-disclosure.

    Science.gov (United States)

    Pizer, B

    1997-07-01

    This article examines questions related to the "inescapable," the "inadvertent," and the "deliberate" personal disclosures by an analyst. Technical and personal considerations that influence the analyst's decision to disclose, as well as the inherent responsibilities and potential clinical consequences involved in self-disclosure, are explored, with particular attention to transference-countertransference dynamics, therapeutic goals, and the negotiation of resistance. The author describes her clinical work during a period of prolonged illness, with case vignettes that illustrate how self-disclosure may be regarded as both an occasional authentic requirement and a regular intrinsic component of clinical technique.

  14. 17 CFR 200.17 - Chief Management Analyst.

    Science.gov (United States)

    2010-04-01

    Index excerpt for the Chief Management Analyst (17 Commodity and Securities Exchanges, 2, 2010-04-01). Listed areas of responsibility include: ... organizational structures and delegations of authority; (d) management information systems and concepts; and (e) ... Filed under: Conduct and Ethics; and Information and Requests; Organization and Program Management; General Organization.

  15. The Pentagon's Military Analyst Program

    Science.gov (United States)

    Valeri, Andy

    2014-01-01

    This article provides an investigatory overview of the Pentagon's military analyst program, what it is, how it was implemented, and how it constitutes a form of propaganda. A technical analysis of the program is applied using the theoretical framework of the propaganda model first developed by Noam Chomsky and Edward S. Herman. Definitions…

  16. Look who is talking now: analyst recommendations and internet IPOs

    NARCIS (Netherlands)

    van der Goot, T.; van Giersbergen, N.

    2009-01-01

    This paper investigates whether analyst recommendations are independent of their employer’s investment banking activities. Our sample consists of internet firms that went public during 1997-2000. The contribution of the paper to the literature is threefold. First, to account for missing

  17. Which analysts benefited most from mandatory IFRS adoption in Europe?

    NARCIS (Netherlands)

    Beuselinck, Christof; Joos, Philip; Khurana, I.K.; van der Meulen, Sofie

    2017-01-01

    This study examines whether financial analysts' research structure and portfolio selection choices helped in improving relative earnings forecast accuracy around mandatory IFRS adoption in Europe. Using a sample of 68,665 one-year ahead forecasts for 1,980 publicly listed firms, we find that

  18. MetaboAnalyst 3.0--making metabolomics more meaningful.

    Science.gov (United States)

    Xia, Jianguo; Sinelnikov, Igor V; Han, Beomsoo; Wishart, David S

    2015-07-01

    MetaboAnalyst (www.metaboanalyst.ca) is a web server designed to permit comprehensive metabolomic data analysis, visualization and interpretation. It supports a wide range of complex statistical calculations and high quality graphical rendering functions that require significant computational resources. First introduced in 2009, MetaboAnalyst has experienced more than a 50X growth in user traffic (>50 000 jobs processed each month). In order to keep up with the rapidly increasing computational demands and a growing number of requests to support translational and systems biology applications, we performed a substantial rewrite and major feature upgrade of the server. The result is MetaboAnalyst 3.0. By completely re-implementing the MetaboAnalyst suite using the latest web framework technologies, we have been able to substantially improve its performance, capacity and user interactivity. Three new modules have also been added including: (i) a module for biomarker analysis based on the calculation of receiver operating characteristic curves; (ii) a module for sample size estimation and power analysis for improved planning of metabolomics studies and (iii) a module to support integrative pathway analysis for both genes and metabolites. In addition, popular features found in existing modules have been significantly enhanced by upgrading the graphical output, expanding the compound libraries and by adding support for more diverse organisms. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
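
    The biomarker module described above ranks candidate compounds by ROC performance. As a toy illustration of the underlying statistic (not MetaboAnalyst's implementation), the area under an ROC curve can be computed directly from its rank-statistic definition; the intensity values below are invented.

```python
def roc_auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney formulation:
    the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative case (ties count half)."""
    wins = 0.0
    for p in pos_scores:
        for q in neg_scores:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical biomarker intensities: diseased vs. control samples.
auc = roc_auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2])  # 8 of 9 pairs ordered correctly
```

    An AUC of 1.0 means the marker separates the groups perfectly; 0.5 means it carries no discriminating information.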

  19. Teaching Bayesian Statistics To Intelligence Analysts: Lessons Learned

    Directory of Open Access Journals (Sweden)

    Hemangni Deshmukh

    2009-01-01

    The Community must develop and integrate into regular use new tools that can assist analysts in filtering and correlating the vast quantities of information that threaten to overwhelm the analytic process… —Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction (The WMD Report). Unlike the other social sciences and, particularly, the physical sciences, where scientists get to choose the questions they wish to answer and experiments are carefully designed to confirm or negate hypotheses, intelligence analysis requires analysts to deal with the demands of decision makers and estimate the intentions of foreign actors, criminals or business competitors in an environment filled with uncertainty and even deliberate deception.

  20. Audience as analyst: Dennis Potter's The Singing Detective.

    Science.gov (United States)

    Jeffrey, W

    1997-06-01

    Author Dennis Potter has written an exceptional psychoanalytically informed television series in The Singing Detective. Potter succeeds by eschewing the usual portrayal of psychoanalysis in cinema and television as a therapy which the viewer observes, instead creating, by means of the content and structure of the series, a production that forces the audience into the role of analyst. The story of the current life and the childhood of the protagonist, Philip Marlow, has depth and context which allow the audience to examine the personality of Marlow, including character pathology and traits, sexuality, fantasy, dreams, and delusions from several metapsychological viewpoints. Potter allows the audience to use the dynamic, genetic, topographic, and, most unusual in drama, structural viewpoints. The audience can experience aspects of an analyst's experience, including the process of formulating and evaluating over time analytic hypotheses and coping with emotional reactions to the material, which at times has transferencelike qualities.

  1. Temenos regained: reflections on the absence of the analyst.

    Science.gov (United States)

    Abramovitch, Henry

    2002-10-01

    The importance of the temenos as a metaphor to conceptualize therapeutic containment is discussed. Jung drew the analogy between the consulting room and the temenos, at the centre of the Greek Temple as a sacred and inviolate place where the analysand might encounter the Self. Although Jung believed that whether called or not, the gods would appear, under certain conditions, patients may experience 'temenos lost', the loss of the holding function of the analytic space. Two cases are presented in which temenos issues played a central role. In one case, an unorthodox method was used to preserve the analytic container during the absence of the analyst and in the other, the impact of an extra-analytical encounter had a dramatic effect on the holding function of the temenos. A discussion is presented of the appropriate circumstances in which analysts may deviate from traditional analytic practice in order to preserve the temenos and transform a 'temenos lost' into a 'temenos regained'.

  2. Special Nuclear Material Gamma-Ray Signatures for Reachback Analysts

    Energy Technology Data Exchange (ETDEWEB)

    Karpius, Peter Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Myers, Steven Charles [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-29

    These are slides on special nuclear material gamma-ray signatures for reachback analysts, from an LSS Spectroscopy course. The closing thoughts of the presentation are the following: SNM materials have definite spectral signatures that should be readily recognizable to analysts in both bare and shielded configurations. One can estimate the burnup of plutonium using certain pairs of peaks that are a few keV apart. In most cases, one cannot reliably estimate uranium enrichment in a way analogous to the estimation of plutonium burnup. The origin of the most intense peaks from some SNM items may be indirect, from 'associated nuclides'. Indirect SNM signatures sometimes have commonalities with the natural gamma-ray background.

  3. Understanding NASA surface missions with the PDS Analyst's Notebook

    Science.gov (United States)

    Stein, T.

    2011-10-01

    Planetary data archives of surface missions contain data from numerous hosted instruments. Because of the nondeterministic nature of surface missions, it is not possible to assess the data without understanding the context in which they were collected. The PDS Analyst's Notebook (http://an.rsl.wustl.edu) provides access to Mars Exploration Rover (MER) [1] and Mars Phoenix Lander [2] data archives by integrating sequence information, engineering and science data, observation planning and targeting, and documentation into web-accessible pages to facilitate "mission replay." In addition, Lunar Apollo surface mission data archives and LCROSS mission data are available in the Analyst's Notebook concept, and a Notebook is planned for Mars Science Laboratory (MSL) mission.

  4. Analyst Hype in IPOs: Explaining the Popularity of Bookbuilding

    OpenAIRE

    Francois Degeorge; Francois Derrien; Kent L. Womack

    2007-01-01

    The bookbuilding IPO procedure has captured significant market share from auction alternatives recently, despite the significantly lower costs related to the auction mechanism. In France, where both mechanisms were used in the 1990s, the ostensible advantages of bookbuilding were advertising-related benefits. Book-built issues were more likely to be followed and positively recommended by lead underwriters. Even nonunderwriters' analysts promote book-built issues more in order to curry favor w...

  5. The analyst's desire in the clinic of anorexia

    OpenAIRE

    Silva, Mariana Benatto Pereira da; Pereira, Mario Eduardo Costa; Celeri, Eloísa Helena Valler

    2010-01-01

    The present work deals with the issue of the analyst's desire in the psychoanalytical treatment of anorexia. It analyzes elements important for establishing transference in these cases, such as the pursuit of death and the choice of refusing food as a way of controlling the demands of the Other. It then discusses the "analyst's desire" function in this clinic. Rejecting the definition of a treatment model and the structural categorization of anorexia, we can find in the cases of the girl of Angouleme ...

  6. Translating the covenant: The behavior analyst as ambassador and translator.

    Science.gov (United States)

    Foxx, R M

    1996-01-01

    Behavior analysts should be sensitive to how others react to and interpret our language because it is inextricably related to our image. Our use of conceptual revision, with such terms as punishment, has created communicative confusion and hostility on the part of general and professional audiences we have attempted to influence. We must, therefore, adopt the role of ambassador and translator in the nonbehavioral world. A number of recommendations are offered for promoting, translating, and disseminating behavior analysis.

  7. Disclosure of Non-Financial Information: Relevant to Financial Analysts?

    OpenAIRE

    ORENS, Raf; LYBAERT, Nadine

    2013-01-01

    The decline in the relevance of financial statement information to value firms leads to calls from organizational stakeholders to convey non-financial information in order to be able to judge firms' financial performance and value. This literature review aims to report extant literature findings on the use of corporate non-financial information by sell-side financial analysts, the information intermediaries between corporate management and investors. Prior studies highlight that financial ana...

  8. Information seeking and use behaviour of economists and business analysts

    Directory of Open Access Journals (Sweden)

    Eric Thivant

    2005-01-01

    Introduction. The aim of this paper is to deal with the information seeking and use problem in a professional context and understand how activity can influence practices, taking as examples the research undertaken by economic analysts. We analyse the relationship between the situational approach described by Cheuk, the work environment complexity (with social, technological and personal aspects), and the information seeking and use strategies, which relied on Ellis and Wilson's model, with Bates's comments. Method. We interviewed eight economists, using a questionnaire and the SICIA (Situation, Complexity and Information Activity) method. The SICIA method is a qualitative approach which underlines the relationship between situations, professional contexts and strategies. Both methods allow a better understanding of how investment analysts find out what they need for their job. We can clarify their information sources and practices of information seeking, which are very particular because of their activities. We complete our analysis by interviewing analysts from financial institutions. Analysis. A qualitative mode of analysis was used to interpret the interviewees' comments, within the research framework adopted. Results. We find similarity in the information seeking and use strategies used by these two groups, and environmental levels meet in most situations. But some differences can also be found, explained by the activity frameworks and goals. Conclusion. This study demonstrates that the activity and also the professional context (here the financial context) can directly influence practices.

  9. Learning patterns of life from intelligence analyst chat

    Science.gov (United States)

    Schneider, Michael K.; Alford, Mark; Babko-Malaya, Olga; Blasch, Erik; Chen, Lingji; Crespi, Valentino; HandUber, Jason; Haney, Phil; Nagy, Jim; Richman, Mike; Von Pless, Gregory; Zhu, Howie; Rhodes, Bradley J.

    2016-05-01

    Our Multi-INT Data Association Tool (MIDAT) learns patterns of life (POL) of a geographical area from video analyst observations called out in textual reporting. Typical approaches to learning POLs from video make use of computer vision algorithms to extract locations in space and time of various activities. Such approaches are subject to the detection and tracking performance of the video processing algorithms. Numerous examples of human analysts monitoring live video streams annotating or "calling out" relevant entities and activities exist, such as security analysis, crime-scene forensics, news reports, and sports commentary. This user description typically corresponds with textual capture, such as chat. Although the purpose of these text products is primarily to describe events as they happen, organizations typically archive the reports for extended periods. This archive provides a basis to build POLs. Such POLs are useful for diagnosis to assess activities in an area based on historical context, and for consumers of products, who gain an understanding of historical patterns. MIDAT combines natural language processing, multi-hypothesis tracking, and Multi-INT Activity Pattern Learning and Exploitation (MAPLE) technologies in an end-to-end lab prototype that processes textual products produced by video analysts, infers POLs, and highlights anomalies relative to those POLs with links to "tracks" of related activities performed by the same entity. MIDAT technologies perform well, achieving, for example, a 90% F1-value on extracting activities from the textual reports.
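
    The 90% F1-value reported for activity extraction is the harmonic mean of precision and recall over extracted mentions. A minimal sketch of the metric follows; the activity strings are invented for illustration, not MIDAT output.

```python
def f1_score(gold, predicted):
    """F1 over extracted activity mentions: harmonic mean of
    precision (share of predictions that are correct) and
    recall (share of gold mentions that were found)."""
    tp = len(gold & predicted)  # true positives: mentions in both sets
    if tp == 0:
        return 0.0
    precision = tp / len(predicted)
    recall = tp / len(gold)
    return 2 * precision * recall / (precision + recall)

# Hypothetical analyst call-outs vs. system extractions.
gold = {"convoy departs", "crowd gathers", "vehicle stops"}
pred = {"convoy departs", "crowd gathers", "truck idles"}
score = f1_score(gold, pred)  # precision = recall = 2/3, so F1 = 2/3
```

    Because F1 penalizes both missed gold mentions and spurious extractions, a 0.90 score constrains both error types at once.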

  10. Analyst reluctance in conveying negative information to the market

    Directory of Open Access Journals (Sweden)

    Luca Piras

    2012-11-01

    This paper investigates one of the main sources of financial markets’ public information: financial analysts’ reports. We analyze reports on the S&P 500 index through a multidisciplinary approach integrating behavioral finance with linguistic analysis to understand how financial phenomena are reflected in or deviated by language, i.e. whether financial and linguistic trends follow the same patterns, boosting each other, or diverge. In the latter case, language could conceal financial events, mitigating analysts’ feelings and misleading investors. We therefore attempt to identify behavioral biases (mainly represented by cognitive dissonances) present in analysts’ reports. In doing so, we try to understand whether or not analysts try to hide the perception of negative price-sensitive events, eventually anticipating and controlling the market “mood”. This study focuses on how analysts use linguistic strategies in order to minimize their risk of issuing wrong advice. Our preliminary results show reluctance to incorporate negative information in the reports. A slight asymmetry between the use of the positive/negative keywords taken into account and the negative/positive trends of the index seems to emerge. In those weeks characterized by the index's poor performance, the frequency of keywords with a negative meaning is lower. Conversely, in recovering weeks a higher use of keywords with a positive meaning does not clearly appear. A thorough investigation of market moods and analysis of the text of the reports enable us to assess if and to what extent analysts have been willing to mitigate pessimism or emphasize confidence. Furthermore, we contribute to the existing literature by proposing a possible analysts’ value function based on Prospect Theory [Kahneman and Tversky, 1979], in which analysts try to maximize the value deriving from enhancing their reputation, taking into account the risks that may cause a reputational loss. This

  11. The Inefficient Use of Macroeconomic Information in Analysts' Earnings Forecasts in Emerging Markets

    NARCIS (Netherlands)

    G.J. de Zwart (Gerben); D.J.C. van Dijk (Dick)

    2008-01-01

    This paper presents empirical evidence that security analysts do not efficiently use publicly available macroeconomic information in their earnings forecasts for emerging market stocks. Analysts completely ignore forecasts on political stability, while these provide valuable information

  12. A Survey of Functional Behavior Assessment Methods Used by Behavior Analysts in Practice

    Science.gov (United States)

    Oliver, Anthony C.; Pratt, Leigh A.; Normand, Matthew P.

    2015-01-01

    To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially…

  13. Idealization of the analyst by the young adult.

    Science.gov (United States)

    Chused, J F

    1987-01-01

    Idealization is an intrapsychic process that serves many functions. In addition to its use defensively and for gratification of libidinal and aggressive drive derivatives, it can contribute to developmental progression, particularly during late adolescence and young adulthood. During an analysis, it is important to recognize all the determinants of idealization, including those related to the reworking of developmental conflicts. If an analyst understands idealization solely as a manifestation of pathology, he may interfere with his patient's use of it for the development of autonomous functioning.

  14. The analyst's desire in the clinic of anorexia

    Directory of Open Access Journals (Sweden)

    Mariana Benatto Pereira da Silva

    2010-06-01

    Full Text Available The present work deals with the issue of the analyst's desire in the psychoanalytical treatment of anorexia. It analyzes elements important for establishing transference in these cases, such as the pursuit of death and the choice of refusing food as a way of controlling the demands of the Other. It then discusses the function of the "analyst's desire" in this clinic. Rejecting the definition of a treatment model and the structural categorization of anorexia, we find in the cases of the girl of Angoulême (Charcot) and Sidonie (M. Mannoni) possible subjective ways out of this psychopathological impasse by means of this function.

  15. Fuzzy VIKOR approach for selection of big data analyst in procurement management

    Directory of Open Access Journals (Sweden)

    Surajit Bag

    2016-07-01

    Full Text Available Background: Big data and predictive analysis have been hailed as the fourth paradigm of science. Big data and analytics are critical to the future of business sustainability. The demand for data scientists is increasing with the dynamic nature of businesses, making it indispensable to manage big data, derive meaningful results and interpret management decisions. Objectives: The purpose of this study was to provide a brief conceptual review of big data and analytics and to illustrate the use of a multi-criteria decision-making technique in selecting the right skilled candidate for big data and analytics in procurement management. Method: It is important for firms to select and recruit the right data analyst, both in terms of skill sets and scope of analysis. Such a selection problem is complex and multi-criteria in nature, involving both qualitative and quantitative factors. In the current study, an application of the fuzzy VIsekriterijumska optimizacija i KOmpromisno Resenje (VIKOR) method was used to solve the big data analyst selection problem. Results: The study identified Technical knowledge (C1), Intellectual curiosity (C4) and Business acumen (C5) as the strongest influential criteria, which must be present in a candidate for a big data and analytics job. Conclusion: Fuzzy VIKOR is well suited to this kind of multi-criteria decision-making problem. This study will assist human resource managers and procurement managers in selecting the right workforce for big data analytics.
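The ranking step at the core of the VIKOR method can be sketched in a few lines. This is a minimal crisp sketch, assuming the fuzzy (triangular) candidate scores have already been defuzzified into single numbers; the scores, weights and criteria labels below are hypothetical, not the study's data.

```python
# Crisp VIKOR sketch: rank candidates by closeness to the ideal solution.
# Rows = candidates, columns = criteria (e.g. C1 technical knowledge,
# C4 intellectual curiosity, C5 business acumen). Higher score = better.

def vikor(scores, weights, v=0.5):
    m = len(weights)
    best = [max(row[j] for row in scores) for j in range(m)]
    worst = [min(row[j] for row in scores) for j in range(m)]
    S, R = [], []                       # group utility / individual regret
    for row in scores:
        d = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
             for j in range(m)]
        S.append(sum(d))
        R.append(max(d))
    Q = [v * (S[i] - min(S)) / (max(S) - min(S))
         + (1 - v) * (R[i] - min(R)) / (max(R) - min(R))
         for i in range(len(scores))]
    return Q                            # lower Q = better compromise

scores = [[7, 9, 6], [9, 6, 8], [8, 8, 7]]   # three hypothetical candidates
weights = [0.5, 0.25, 0.25]                  # hypothetical criteria weights
Q = vikor(scores, weights)
ranking = sorted(range(len(Q)), key=lambda i: Q[i])
```

With these numbers the second candidate scores best on the most heavily weighted criterion and comes out first; in the fuzzy variant the same S, R and Q aggregation is carried out on triangular fuzzy numbers before defuzzification.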

  16. Transference to the analyst as an excluded observer.

    Science.gov (United States)

    Steiner, John

    2008-02-01

    In this paper I briefly review some significant points in the development of ideas on transference, which owe so much to the discoveries of Freud. I then discuss some of the subsequent developments which were based on Freud's work and which have personally impressed me. In particular I mention Melanie Klein's elaboration of an internal world peopled by internal objects and her description of the mechanisms of splitting and projective identification, both of which profoundly affect our understanding of transference. Using some clinical material I try to illustrate an important transference situation which I do not think has been sufficiently emphasized, although it is part of the 'total situation' outlined by Klein. In this kind of transference the analyst finds himself in an observing position and is no longer the primary object to whom love and hate are directed. Instead he is put in the position of an excluded figure who can easily enact, rather than understand, the role he has been put in. In this situation he may try to regain his position as the patient's primary object in the transference, or avoid the transference altogether and make extra-transference interpretations, and in this way enact the role of a judgemental and critical super-ego. If he can tolerate the loss of a central role and understand the transference position he has been put in, the analyst can sometimes reduce enactments and release feelings to do with mourning and loss in both himself and his patient.

  17. Transformations in hallucinosis and the receptivity of the analyst.

    Science.gov (United States)

    Civitarese, Giuseppe

    2015-08-01

    Bion describes transformation in hallucinosis (TH) as a psychic defence present in elusive psychotic scenarios in which there is a total adherence to concrete reality: as the hallucinatory activity which physiologically infiltrates perception and allows us to know reality, setting it off against a background of familiarity; and then, surprisingly, as the ideal state of mind towards which the analyst has to move in order to intuit the facts of the analysis. When hallucinosis is followed by 'awakening', the analyst gains understanding from the experience and goes through a transformation that will inevitably be transmitted to the analytic field and to the patient. In this paper I illustrate Bion's concept and underline its eminently intersubjective nature. Then I differentiate it from two other technical devices: reverie, which unlike hallucinosis does not imply the persistence of a feeling of the real, and Ferro's transformation in dreaming, i.e., purposeful listening to everything that is said in the analysis as if it were the telling of a dream. Finally, I try to demonstrate the practical utility of the concept of transformation in hallucinosis in order to read the complex dynamics of a clinical vignette. Though not well known (only two references in English in the PEP archive), TH proves to be remarkably versatile and productive for thinking about psychoanalytic theory, technique and clinical work. Copyright © 2014 Institute of Psychoanalysis.

  18. How Analysts Cognitively “Connect the Dots”

    Energy Technology Data Exchange (ETDEWEB)

    Bradel, Lauren; Self, Jessica S.; Endert, Alexander; Hossain, Shahriar M.; North, Chris; Ramakrishnan, Naren

    2013-06-04

    As analysts attempt to make sense of a collection of documents, such as intelligence analysis reports, they may wish to “connect the dots” between pieces of information that may initially seem unrelated. This process of synthesizing information requires users to make connections between pairs of documents, creating a conceptual story. We conducted a user study to analyze the process by which users connect pairs of documents and how they spatially arrange information. Users created conceptual stories that connected the dots using organizational strategies that ranged in complexity. We propose taxonomies for the cognitive connections and physical structures used when trying to “connect the dots” between two documents. We compared the user-created stories with a data-mining algorithm that constructs chains of documents using co-occurrence metrics. Using the insight gained into the storytelling process, we offer design considerations for the existing data-mining algorithm and corresponding tools to combine the power of data mining with the complex cognitive processing of analysts.
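The flavor of chaining documents by co-occurrence can be shown with a toy sketch (an illustration of the general idea, not the authors' algorithm): link two documents when they share at least a minimum number of terms, then find a shortest chain between the endpoints with breadth-first search. The documents and threshold below are invented.

```python
# Toy "connect the dots": BFS over a co-occurrence graph of documents.
from collections import deque

def connect_the_dots(docs, start, end, min_shared=2):
    words = {name: set(text.lower().split()) for name, text in docs.items()}

    def neighbors(d):
        # documents sharing at least min_shared terms with d
        return [o for o in docs
                if o != d and len(words[d] & words[o]) >= min_shared]

    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == end:
            return path                 # shortest chain of documents
        for nxt in neighbors(path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None                         # no chain connects the endpoints

docs = {
    "d1": "suspect seen at the harbor warehouse",
    "d2": "warehouse shipment logged at the harbor office",
    "d3": "office phone records link the shipment to a buyer",
}
chain = connect_the_dots(docs, "d1", "d3")
```

Here d1 and d3 share only one term, so the shortest chain passes through the bridging document d2, which is exactly the "conceptual story" structure the study examines.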

  19. Dynamics of analyst forecasts and emergence of complexity: Role of information disparity.

    Directory of Open Access Journals (Sweden)

    Chansoo Kim

    Full Text Available We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important for better understanding the financial market. Carrying out a big-data analysis of analyst forecast data from I/B/E/S covering nearly thirty years, we find skew distributions as evidence for the emergence of complexity, and show how information asymmetry or disparity affects the way financial analysts form their forecasts. Here regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as proxies for a lower level of information disparity. It is found that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts, while the majority issue more accurate forecasts and flock to each other. The main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts, incorporating interactions between analysts. Explaining the empirical data on analyst forecasts well, the model provides an appealing instance of understanding social phenomena from the perspective of complex systems.
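The herding-versus-boldness dynamic that a Yule-type process captures can be illustrated with a toy simulation (an illustrative sketch under simplified assumptions, not the paper's calibrated model): each new forecast either starts an independent "bold" cluster with small probability p_new, or joins an existing cluster with probability proportional to its size. Preferential attachment of this kind is what generates skewed, heavy-tailed cluster-size distributions.

```python
# Toy Yule-type process: forecasts either herd (join a popular cluster,
# chosen with probability proportional to cluster size) or strike out
# alone (start a new cluster with probability p_new).
import random

def yule_clusters(n_forecasts, p_new=0.05, seed=42):
    rng = random.Random(seed)
    clusters = [1]                      # sizes of forecast clusters
    for _ in range(n_forecasts - 1):
        if rng.random() < p_new:
            clusters.append(1)          # bold, independent forecast
        else:                           # preferential attachment
            r = rng.uniform(0, sum(clusters))
            acc = 0
            for i, size in enumerate(clusters):
                acc += size
                if r <= acc:
                    clusters[i] += 1
                    break
    return clusters

sizes = sorted(yule_clusters(10_000), reverse=True)
# a few giant clusters and many tiny ones: a skewed, heavy-tailed distribution
```

Raising p_new (more independent forecasts, i.e. lower information disparity) flattens the distribution, mirroring the paper's contrast between bold and herding analysts.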

  20. Radiation litigation: Quality assurance and the radiation analyst

    International Nuclear Information System (INIS)

    Jose, D.E.

    1986-01-01

    This paper touches on three areas of interest to the radiation analyst: the dose issue, legal persuasion, and future legal issues. While many laboratory scientists would think that the actual dose received by the plaintiff's relevant organ would be an easy issue to resolve, that has not been the experience to date. All radiation cases are assumed to be ultrahazardous-activity cases, even though they involve doses well below yearly natural background. At some point the law needs to recognize that such low-dose cases are a waste of scarce judicial resources. Lawyers and scientists need to communicate with each other and work together to help improve the way the legal system processes these important cases.

  1. BETWEEN PSYCHOANALYSIS AND TESTIMONIAL SPACE: THE ANALYST AS A WITNESS.

    Science.gov (United States)

    Gondar, Jô

    2017-04-01

    The aim of this article is to think of the place of the witness as a third place that the analyst, in the clinical space of trauma, is able to sustain. According to Ferenczi, in traumatic dreams a third is already being summoned. It is not the witness of the realm of law, nor the place of the father or the symbolic law. This is a third space that can be called potential, interstitial space, indeterminate and formless, where something that at first would be incommunicable circulates and gradually takes shape. This space allows and supports the literalness of a testimonial narrative, its hesitations, paradoxes and silences. More than a trauma theory, the notion of a potential space would be the great contribution of psychoanalysis to the treatment of trauma survivors, establishing the difference between the task of a psychoanalyst and the one of a truth commission.

  2. Automatic theory generation from analyst text files using coherence networks

    Science.gov (United States)

    Shaffer, Steven C.

    2014-05-01

    This paper describes a three-phase process for extracting knowledge from analyst textual reports. Phase 1 involves performing natural language processing on the source text to extract subject-predicate-object triples. In phase 2, these triples are fed into a coherence network analysis process, using a genetic algorithm optimization. Finally, the highest-value subnetworks are processed into a semantic network graph for display. Initial work on a well-known data set (a Wikipedia article on Abraham Lincoln) has shown excellent results without any specific tuning. Next, we ran the process on the SYNthetic Counter-INsurgency (SYNCOIN) data set, developed at Penn State, yielding interesting and potentially useful results.

  3. Blurred Lines: Ethical Implications of Social Media for Behavior Analysts.

    Science.gov (United States)

    O'Leary, Patrick N; Miller, Megan M; Olive, Melissa L; Kelly, Amanda N

    2017-03-01

    Social networking has a long list of advantages: it enables access to a large group of people that would otherwise not be geographically convenient or possible to connect with; it reaches several different generations, particularly younger ones, which are not typically involved in discussion of current events; and these sites allow a cost-effective, immediate, and interactive way to engage with others. With the vast number of individuals who use social media sites as a way to connect with others, it may not be possible to completely abstain from discussions and interactions on social media that pertain to our professional practice. This is all the more reason for behavior analysts to attend to the contingencies specific to these tools. This paper discusses potential ethical situations that may arise and offers a review of the Behavior Analyst Certification Board (BACB) guidelines pertaining to social networking, as well as suggestions for avoiding or resolving potential violations relating to online social behavior.

  4. Connecting Hazard Analysts and Risk Managers to Sensor Information.

    Science.gov (United States)

    Le Cozannet, Gonéri; Hosford, Steven; Douglas, John; Serrano, Jean-Jacques; Coraboeuf, Damien; Comte, Jérémie

    2008-06-11

    Hazard analysts and risk managers of natural perils, such as earthquakes, landslides and floods, need to access information from sensor networks surveying their regions of interest. However, currently information about these networks is difficult to obtain and is available in varying formats, thereby restricting accesses and consequently possibly leading to decision-making based on limited information. As a response to this issue, state-of-the-art interoperable catalogues are being currently developed within the framework of the Group on Earth Observations (GEO) workplan. This article provides an overview of the prototype catalogue that was developed to improve access to information about the sensor networks surveying geological hazards (geohazards), such as earthquakes, landslides and volcanoes.

  5. Reflections: can the analyst share a traumatizing experience with a traumatized patient?

    Science.gov (United States)

    Lijtmaer, Ruth

    2010-01-01

    This is a personal account of a dreadful event in the analyst's life that was similar to a patient's trauma. It is a reflection on how the analyst dealt with her own trauma, the patient's trauma, and the transference and countertransference dynamics. Included is a description of the analyst's inner struggles with self-disclosure, the continuance of her professional work, and the need for persistent self-scrutiny. The meaning of objects in people's lives, particularly the concept of home, will be addressed.

  6. Effects of Motivation: Rewarding Hackers for Undetected Attacks Cause Analysts to Perform Poorly.

    Science.gov (United States)

    Maqbool, Zahid; Makhijani, Nidhi; Pammi, V S Chandrasekhar; Dutt, Varun

    2017-05-01

    The aim of this study was to determine how monetary motivations influence the decision making of humans performing as security analysts and hackers in a cybersecurity game. Cyberattacks are increasing at an alarming rate. As cyberattacks often cause damage to existing cyber infrastructures, it is important to understand how monetary rewards may influence the decision making of hackers and analysts in the cyber world. Currently, only limited attention has been given to this area. In an experiment, participants were randomly assigned to three between-subjects conditions (n = 26 for each condition): equal payoff, where the magnitude of monetary rewards for hackers and defenders was the same; rewarding hacker, where the magnitude of the monetary reward for the hacker's successful attack was 10 times the reward for the analyst's successful defense; and rewarding analyst, where the magnitude of the monetary reward for the analyst's successful defense was 10 times the reward for the hacker's successful attack. In all conditions, half of the participants were human hackers playing against Nash analysts and half were human analysts playing against Nash hackers. Results revealed that monetary rewards for human hackers and analysts caused a decrease in attack and defend actions compared with the baseline. Furthermore, rewarding human hackers for undetected attacks made analysts deviate significantly from their optimal behavior. If hackers are rewarded for their undetected attack actions, this causes analysts to deviate from optimal defend proportions. Thus, analysts need to be trained not to become overenthusiastic in defending networks. Our results apply to networks where the influence of monetary rewards may cause information theft and system damage.

  7. Data analyst technician: an innovative role for the pharmacy technician.

    Science.gov (United States)

    Ervin, K C; Skledar, S; Hess, M M; Ryan, M

    2001-10-01

    The development of an innovative role for the pharmacy technician is described. The role was based on a needs assessment and the expertise of the pharmacy technician selected. Initial responsibilities of the technician included chart reviews, benchmarking surveys, monthly financial impact analysis, initiative assessment, and quality improvement reporting. As the drug-use and disease-state management (DUDSM) program expanded, pharmacist activities increased, requiring the expansion of data analyst technician (DAT) duties. These new responsibilities included participation in patient assessment, data collection and interpretation, and formulary enforcement. Most recently, the technician's expanded duties include maintenance of a physician compliance profiling database, quality improvement reporting and graphing, an active role in patient risk assessment and database management for adult vaccination, and support of financial impact monitoring for other institutions within the health system. This pharmacist-technician collaboration resulted in a threefold increase in patient assessments completed per day. In addition, as the DUDSM program continues to expand across the health system, an increase in DAT resources from 0.5 to 1.0 full-time equivalent was obtained. The role of the DAT has increased the efficiency of the DUDSM program and has provided an innovative role for the pharmacy technician.

  8. 78 FR 14359 - Verizon Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet...

    Science.gov (United States)

    2013-03-05

    ... Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small And... Management, Voice Over Internet Protocol, Small And Medium Business, San Antonio, TX; Amended Certification... Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small and...

  9. The demand for corporate financial reporting: A survey among financial analysts

    NARCIS (Netherlands)

    A. de Jong (Abe); G.M.H. Mertens (Gerard); A.M. van der Poel (Marieke); R. van Dijk (Ronald)

    2010-01-01

    We examine financial analysts’ views on corporate financial reporting issues by means of a survey among 306 analysts and interviews among 21 analysts and compare their views with that of CFOs. Since CFOs believe that meeting or beating analysts’ forecasts and managing

  10. SafetyAnalyst: software tools for safety management of specific highway sites

    Science.gov (United States)

    2010-07-01

    SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...

  11. Analysis of Skills Requirement for Entry-Level Programmer/Analysts in Fortune 500 Corporations

    Science.gov (United States)

    Lee, Choong Kwon; Han, Hyo-Joo

    2008-01-01

    This paper presents the most up-to-date skill requirements for programmer/analyst, one of the most demanded entry-level job titles in the Information Systems (IS) field. In the past, several researchers studied job skills for IS professionals, but few have focused especially on "programmer/analyst." The authors conducted an extensive empirical…

  12. Predicting unpredictability

    Science.gov (United States)

    Davis, Steven J.

    2018-04-01

    Analysts and markets have struggled to predict a number of phenomena, such as the rise of natural gas, in US energy markets over the past decade or so. Research shows the challenge may grow because the industry — and consequently the market — is becoming increasingly volatile.

  13. The patient who believes and the analyst who does not (1).

    Science.gov (United States)

    Lijtmaer, Ruth M

    2009-01-01

    A patient's religious beliefs and practices challenge the clinical experience and self-knowledge of the analyst owing to a great complexity of factors, and often bring out the analyst's resistances and countertransference reactions to spiritual and religious issues. The analyst's feelings about the patient's encounters with religion and other forms of healing experiences may result in impasses and communication breakdown for a variety of reasons. These reasons include the analyst's own unresolved issues around her role as a psychoanalyst (which incorporates in some way psychoanalysis's views of religious belief), and these old conflicts may be stirred up by the religious themes expressed by the patient. Vignettes from the treatments of two patients provide examples of the analyst's countertransference conflicts, particularly envy in the case of a therapist who is an atheist.

  14. The Role of Analysts as Gatekeepers: Enhancing Transparency and Curbing Earnings Management in Brazil

    Directory of Open Access Journals (Sweden)

    Antonio Lopo Martinez

    2011-07-01

    Full Text Available This paper examines the relationship among analysts’ coverage, forecasting errors and earnings management. It corroborates the role of analysts as gatekeepers by finding that analysts enhance transparency and reduce the scope of earnings management. To identify analysts’ coverage we used the I/B/E/S, from which we also obtained information on the consensus projections of analysts for listed Brazilian companies. The results indicated a negative correlation between the number of analysts covering firms and the magnitude of their discretionary accruals in absolute terms, indicating that more scrutiny inhibits earnings management. We also found a negative correlation between analysts’ coverage and forecasting errors. Multivariate regressions showed statistically significant results in the same sense. Therefore, market analysts, despite the severe criticism they receive from the specialized press, actually have a beneficial effect on corporate governance by monitoring managers and inhibiting earnings management.

  15. Training the next generation analyst using red cell analytics

    Science.gov (United States)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-generation analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University has been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security, risk and intelligence training.

  16. Using the living laboratory framework as a basis for understanding next-generation analyst work

    Science.gov (United States)

    McNeese, Michael D.; Mancuso, Vincent; McNeese, Nathan; Endsley, Tristan; Forster, Pete

    2013-05-01

    The preparation of next-generation analyst work requires alternative levels of understanding and new methodological departures from the way current work transpires. Current work practices typically do not provide a comprehensive approach that emphasizes the role of, and interplay between, (a) cognition, (b) emergent activities in a shared situated context, and (c) collaborative teamwork. In turn, effective and efficient problem solving fails to take place, and practice is often composed of piecemeal, techno-centric tools that isolate analysts by providing rigid, limited levels of situation awareness. This, coupled with the fact that many analyst activities are classified, produces a challenging situation for researching such phenomena and for designing and evaluating systems to support analyst cognition and teamwork. Through our work with cyber, image, and intelligence analysts we have realized that more is required of researchers who study human-centered designs to provide for analysts' needs in a timely fashion. This paper identifies and describes how the Living Laboratory Framework can be utilized as a means to develop a comprehensive, human-centric, and problem-focused approach to next-generation analyst work, design, and training. We explain how the framework is utilized in specific cases in various applied settings (e.g., crisis management analysis, image analysis, and cyber analysis) to demonstrate its value and power in addressing an area of utmost importance to our national security. Attributes of analyst work settings are delineated to suggest potential design affordances that could help improve cognitive activities and awareness. Finally, the paper puts forth a research agenda for the use of the framework in future work that will move the analyst profession forward in a viable manner to address the concerns identified.

  17. The Determinants of Sell-side Analysts' Forecast Accuracy and Media Exposure

    OpenAIRE

    Sorogho, Samira Amadu

    2017-01-01

    This study examines contributing factors to the differential forecasting abilities of sell-side analysts and the relation between the sentiments of these analysts and their media exposure. In particular, I investigate whether the level of optimism expressed in sell-side analysts’ reports on fifteen constituents of primarily the S&P 500 Oil and Gas Industry enhances the media appearance of these analysts. Using a number of variables estimated from the I/B/E/S Detail history database, 15,455 a...

  18. On the relation between forecast precision and trading profitability of financial analysts

    DEFF Research Database (Denmark)

    Marinelli, Carlo; Weissensteiner, Alex

    2014-01-01

    We analyze the relation between earnings forecast accuracy and the expected profitability of financial analysts. Modeling forecast errors with a multivariate normal distribution, a complete characterization of the payoff of each analyst is provided. In particular, closed-form expressions are obtained for the probability density function, for the expectation, and, more generally, for moments of all orders. Our analysis shows that the relationship between forecast precision and trading profitability need not be monotonic, and that the impact of the correlation between the forecasts on the expected

  19. A reply to behavior analysts writing about rules and rule-governed behavior.

    Science.gov (United States)

    Schlinger, H D

    1990-01-01

    Verbal stimuli called "rules" or "instructions" continue to be interpreted as discriminative stimuli despite recent arguments against this practice. Instead, it might be more fruitful for behavior analysts to focus on "contingency-specifying stimuli," which are function-altering. Moreover, rather than having a special term, "rule," for verbal stimuli whose only function is discriminative, perhaps behavior analysts should reserve the term, if at all, only for these function-altering contingency-specifying stimuli.

  1. What's in a name: what analyst and patient call each other.

    Science.gov (United States)

    Barron, Grace Caroline

    2006-01-01

    Awkward moments often arise between patient and analyst involving the question, "What do we call each other?" The manner in which the dyad address each other contains material central to the patient's inner life. Names, like dreams, deserve a privileged status as providing a royal road into the paradoxical analytic relationship and the unconscious conflicts that feed it. Whether an analyst addresses the patient formally, informally, or not at all, awareness of the issues surrounding names is important.

  2. The analyst's anxieties in the first interview: barriers against analytic presence.

    Science.gov (United States)

    Møller, Mette

    2014-06-01

    To answer the questions of why more people do not enter analysis and how we might get more people to do so, attention is drawn to anxieties in the analyst that become obstacles to the initiation of analysis. The main focus of the paper is how to understand why analysts, irrespective of patient characteristics, seem to have resistances against embarking on analysis. Being a meeting between strangers, the consultation activates strong emotional reactions in both parties. One way of coping is defensively to diagnose, assess and exclude instead of being present as an analyst. The analytic frame of a consultation is ambiguous, and a secure analytic function is needed in order to meet the openness and unpredictability of this frame. A fragile psychoanalytic identity is seen as central to analysts' failure to create an analytic practice; it takes years to develop and maintain a robust analytic function, and analytic work continues to cause disturbing emotional reactions in the analyst. Analysts' vulnerable identity is also linked to the history of psychoanalysis, which has fostered an ideal of analytic practice that is omnipotent and impossible to reach. Therefore it is no wonder that attempts to reach a convincing recommendation of analysis can become diverted in the process of consultation. Confronting these inner impediments in order to strengthen the analytic identity is suggested as a better way to gain more analytic patients than to keep looking for so-called analysability in patients. Copyright © 2014 Institute of Psychoanalysis.

  3. Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.

    Science.gov (United States)

    Xia, Jianguo; Wishart, David S

    2016-09-07

    MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) for most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate and multivariate methods such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. Copyright © 2016 John Wiley & Sons, Inc.
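
    MetaboAnalyst itself is a point-and-click web application, but the normalize-then-project step this abstract describes can be illustrated compactly. The sketch below is not MetaboAnalyst code: it applies a log transform and autoscaling (both among MetaboAnalyst's normalization options) to an invented intensity matrix, then computes PCA scores via SVD; all data, group structure, and function names are hypothetical.

```python
import numpy as np

def autoscale(X):
    """Mean-center each metabolite column and scale it to unit variance."""
    return (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

def pca_scores(X, n_components=2):
    """Project samples onto the top principal components via SVD.

    Returns (scores, explained_variance_ratio)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, (s ** 2) / np.sum(s ** 2)

rng = np.random.default_rng(0)
# Toy intensity matrix: 10 samples x 5 metabolites, two simulated groups
X = rng.lognormal(mean=2.0, sigma=0.3, size=(10, 5))
X[:5] *= 1.5  # group 1 has elevated intensities
scores, explained = pca_scores(autoscale(np.log(X)))
print(scores.shape)  # (10, 2)
```

In MetaboAnalyst the equivalent pipeline is driven from the Statistical Analysis module; plotting the score matrix is how one would look for group separation.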

  4. An eye tracking study of bloodstain pattern analysts during pattern classification.

    Science.gov (United States)

    Arthur, R M; Hoogenboom, J; Green, R D; Taylor, M C; de Bruin, K G

    2018-05-01

    Bloodstain pattern analysis (BPA) is the forensic discipline concerned with the classification and interpretation of bloodstains and bloodstain patterns at the crime scene. At present, it is unclear exactly which stain or pattern properties and their associated values are most relevant to analysts when classifying a bloodstain pattern. Eye tracking technology has been widely used to investigate human perception and cognition. Its application to forensics, however, is limited. This is the first study to use eye tracking as a tool for gaining access to the mindset of the bloodstain pattern expert. An eye tracking method was used to follow the gaze of 24 bloodstain pattern analysts during an assigned task of classifying a laboratory-generated test bloodstain pattern. With the aid of an automated image-processing methodology, the properties of selected features of the pattern were quantified leading to the delineation of areas of interest (AOIs). Eye tracking data were collected for each AOI and combined with verbal statements made by analysts after the classification task to determine the critical range of values for relevant diagnostic features. Eye-tracking data indicated that there were four main regions of the pattern that analysts were most interested in. Within each region, individual elements or groups of elements that exhibited features associated with directionality, size, colour and shape appeared to capture the most interest of analysts during the classification task. The study showed that the eye movements of trained bloodstain pattern experts and their verbal descriptions of a pattern were well correlated.
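
    The AOI metrics described above come down to summing fixation durations over delineated regions. As a rough, hypothetical sketch (the AOI names echo features mentioned in the abstract; the coordinates, fixations, and durations are all invented):

```python
# Hypothetical AOIs as (name, x_min, y_min, x_max, y_max) in screen pixels
AOIS = [("directionality", 0, 0, 400, 300), ("size", 400, 0, 800, 300)]

def dwell_times(fixations, aois):
    """Sum fixation durations falling inside each AOI.

    fixations: list of (x, y, duration_ms) gaze fixations.
    Returns a dict mapping AOI name -> total dwell time in ms."""
    totals = {name: 0 for name, *_ in aois}
    for x, y, dur in fixations:
        for name, x0, y0, x1, y1 in aois:
            if x0 <= x < x1 and y0 <= y < y1:
                totals[name] += dur
                break  # assign each fixation to at most one AOI
    return totals

fixes = [(120, 80, 250), (450, 90, 400), (500, 120, 300), (900, 500, 150)]
print(dwell_times(fixes, AOIS))  # {'directionality': 250, 'size': 700}
```

Comparing such per-AOI dwell times against analysts' verbal statements is, in spirit, how the study links gaze behavior to diagnostic features.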

  5. Implementation status of performance demonstration program for steam generator tubing analysts in Korea

    International Nuclear Information System (INIS)

    Cho, Chan Hee; Lee, Hee Jong; Yoo, Hyun Ju; Nam, Min Woo; Hong, Sung Yull

    2013-01-01

    Some essential components in nuclear power plants are periodically inspected using nondestructive examinations, for example ultrasonic, eddy current and radiographic examinations, in order to determine their integrity. These components include nuclear power plant items such as vessels, containments, piping systems, pumps, valves, tubes and core support structures. Steam generator tubes have an important safety role because they constitute one of the primary barriers between the radioactive and non-radioactive sides of the nuclear power plant. There is a potential that if a tube bursts while a plant is operating, radioactivity from the primary coolant system could escape directly to the atmosphere. Therefore, in-service inspections are critical in maintaining steam generator tube integrity. In general, eddy current testing is widely used for the inspection of steam generator tubes due to its high inspection speed and flaw detectability on nonmagnetic tubes. However, it is not easy to analyze eddy current signals correctly because they are influenced by many factors. Therefore, the performance of eddy current data analysts for steam generator tubing should be demonstrated comprehensively. In Korea, the performance of steam generator tubing analysts has been demonstrated using the Qualified Data Analyst program. This paper describes the performance demonstration program for steam generator tubing analysts and its implementation results in Korea. The pass rate of domestic analysts for this program was 71.4%.

  6. Accuracy and Consistency of Grass Pollen Identification by Human Analysts Using Electron Micrographs of Surface Ornamentation

    Directory of Open Access Journals (Sweden)

    Luke Mander

    2014-08-01

    Full Text Available Premise of the study: Humans frequently identify pollen grains at a taxonomic rank above species. Grass pollen is a classic case of this situation, which has led to the development of computational methods for identifying grass pollen species. This paper aims to provide context for these computational methods by quantifying the accuracy and consistency of human identification. Methods: We measured the ability of nine human analysts to identify 12 species of grass pollen using scanning electron microscopy images. These are the same images that were used in computational identifications. We have measured the coverage, accuracy, and consistency of each analyst, and investigated their ability to recognize duplicate images. Results: Coverage ranged from 87.5% to 100%. Mean identification accuracy ranged from 46.67% to 87.5%. The identification consistency of each analyst ranged from 32.5% to 87.5%, and each of the nine analysts produced considerably different identification schemes. The proportion of duplicate image pairs that were missed ranged from 6.25% to 58.33%. Discussion: The identification errors made by each analyst, which result in a decline in accuracy and consistency, are likely related to psychological factors such as the limited capacity of human memory, fatigue and boredom, recency effects, and positivity bias.
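
    The coverage, accuracy, and duplicate-consistency measures reported above are simple proportions over an analyst's identification table. A minimal sketch, with invented species labels and responses (`None` marks an image the analyst declined to identify):

```python
def coverage(ids):
    """Fraction of images for which the analyst offered any identification."""
    return sum(sp is not None for sp in ids.values()) / len(ids)

def accuracy(ids, truth):
    """Fraction of offered identifications that match the true species."""
    answered = [(img, sp) for img, sp in ids.items() if sp is not None]
    correct = sum(sp == truth[img] for img, sp in answered)
    return correct / len(answered)

# Hypothetical responses over four images; im1 and im1b are duplicates
truth = {"im1": "Poa", "im1b": "Poa", "im2": "Zea", "im3": "Avena"}
ids   = {"im1": "Poa", "im1b": "Zea", "im2": "Zea", "im3": None}

print(coverage(ids))         # 0.75
print(accuracy(ids, truth))  # 0.666...

# Consistency on duplicate pairs: same answer given to both copies?
duplicates = [("im1", "im1b")]
consistent = sum(ids[a] == ids[b] for a, b in duplicates) / len(duplicates)
print(consistent)            # 0.0 (the duplicate pair was missed)
```

The study's per-analyst figures are these same ratios computed over 12 grass species and a larger image set.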

  7. Seeing, mirroring, desiring: the impact of the analyst's pregnant body on the patient's body image.

    Science.gov (United States)

    Yakeley, Jessica

    2013-08-01

    The paper explores the impact of the analyst's pregnant body on the course of two analyses, of a young man and a young woman, specifically focusing on how each patient's visual perception and affective experience of being with the analyst's pregnant body affected their own body image and subjective experience of their body. The pre-verbal or 'subsymbolic' material evoked in the analyses contributed to a greater understanding of the patients' developmental experiences in infancy and adolescence, which had resulted in both carrying a profoundly distorted body image into adulthood. The analyst's pregnancy offered a therapeutic window in which a shift in the patient's body image could be initiated. Clinical material is presented in detail with reference to the psychoanalytic literature on the pregnant analyst, and that of the development of the body image, particularly focusing on the role of visual communication and the face. The author proposes a theory of psychic change, drawing on Bucci's multiple code theory, in which the patients' unconscious or 'subsymbolic' awareness of her pregnancy, which was manifest in their bodily responses, feeling states and dreams, as well as in the analyst's countertransference, could gradually be verbalized and understood within the transference. Thus visual perception, or 'external seeing', could gradually become 'internal seeing', or insight into unconscious phantasies, leading to a shift in the patients' internal object world towards a less persecutory state and a more realistic appraisal of their body image. Copyright © 2013 Institute of Psychoanalysis.

  8. Nothing but the truth: self-disclosure, self-revelation, and the persona of the analyst.

    Science.gov (United States)

    Levine, Susan S

    2007-01-01

    The question of the analyst's self-disclosure and self-revelation inhabits every moment of every psychoanalytic treatment. All self-disclosures and revelations, however, are not equivalent, and differentiating among them allows us to define a construct that can be called the analytic persona. Analysts already rely on an unarticulated concept of an analytic persona that guides them, for instance, as they decide what constitutes appropriate boundaries. Clinical examples illustrate how self-disclosures and revelations from within and without the analytic persona feel different, for both patient and analyst. The analyst plays a specific role for each patient and is both purposefully and unconsciously different in this context than in other settings. To a great degree, the self is a relational phenomenon. Our ethics call for us to tell nothing but the truth and simultaneously for us not to tell the whole truth. The unarticulated working concept of an analytic persona that many analysts have refers to the self we step out of at the close of each session and the self we step into as the patient enters the room. Attitudes toward self-disclosure and self-revelation can be considered reflections of how we conceptualize this persona.

  9. Implementation status of performance demonstration program for steam generator tubing analysts in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Chan Hee; Lee, Hee Jong; Yoo, Hyun Ju; Nam, Min Woo [KHNP Central Research Institute, Daejeon (Korea, Republic of); Hong, Sung Yull [Yeungnam Univ., Gyeongsan (Korea, Republic of)

    2013-02-15

    Some essential components in nuclear power plants are periodically inspected using nondestructive examinations, for example ultrasonic, eddy current and radiographic examinations, in order to determine their integrity. These components include nuclear power plant items such as vessels, containments, piping systems, pumps, valves, tubes and core support structures. Steam generator tubes have an important safety role because they constitute one of the primary barriers between the radioactive and non-radioactive sides of the nuclear power plant. There is a potential that if a tube bursts while a plant is operating, radioactivity from the primary coolant system could escape directly to the atmosphere. Therefore, in-service inspections are critical in maintaining steam generator tube integrity. In general, eddy current testing is widely used for the inspection of steam generator tubes due to its high inspection speed and flaw detectability on nonmagnetic tubes. However, it is not easy to analyze eddy current signals correctly because they are influenced by many factors. Therefore, the performance of eddy current data analysts for steam generator tubing should be demonstrated comprehensively. In Korea, the performance of steam generator tubing analysts has been demonstrated using the Qualified Data Analyst program. This paper describes the performance demonstration program for steam generator tubing analysts and its implementation results in Korea. The pass rate of domestic analysts for this program was 71.4%.

  10. Physics-based and human-derived information fusion for analysts

    Science.gov (United States)

    Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael

    2017-05-01

    Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.

  11. Stock index adjustments, analyst coverage and institutional holdings: Evidence from China

    Directory of Open Access Journals (Sweden)

    Song Zhu

    2017-09-01

    Full Text Available Using 231 pairs of matched firms from 2009 to 2012 in the Chinese stock market, we find that stock index adjustments significantly affect analyst coverage: addition to the stock index leads to more analyst coverage, while deletion from the index has no significant effect, indicating that stock index adjustments can significantly change the information environments of firms that are added to the index. An index adjustment also affects institutional holdings in consideration of new information (e.g., changes in fundamentals and information environments). Changes in institutional holdings are partially due to changes in analyst coverage, and both index funds and other types can change their portfolios in response to changes in the target firms’ informativeness.

  12. Self-confidence in financial analysis: a study of younger and older male professional analysts.

    Science.gov (United States)

    Webster, R L; Ellis, T S

    2001-06-01

    Measures of reported self-confidence in performing financial analysis by 59 professional male analysts, 31 born between 1946 and 1964 and 28 born between 1965 and 1976, were investigated and reported. Self-confidence in one's ability is important in the securities industry because it affects recommendations and decisions to buy, sell, and hold securities. The respondents analyzed a set of multiyear corporate financial statements and reported their self-confidence in six separate financial areas. Data from the 59 male financial analysts were tallied and analyzed using both univariate and multivariate statistical tests. Rated self-confidence was not significantly different for the younger and the older men. These results are not consistent with a similar prior study of female analysts in which younger women showed significantly higher self-confidence than older women.

  13. It's the People, Stupid: The Role of Personality and Situational Variables in Predicting Decisionmaker Behavior

    National Research Council Canada - National Science Library

    Sticha, Paul J; Buede, Dennis M; Rees, Richard L

    2006-01-01

    .... The analyst builds Bayesian networks that integrate situational information with the Subject's personality and culture to provide a probabilistic prediction of the hypothesized actions a Subject might choose...

  14. Financial Analysts' Forecast Accuracy: Before and After the Introduction of AIFRS

    Directory of Open Access Journals (Sweden)

    Chee Seng Cheong

    2010-09-01

    Full Text Available We examine whether financial analysts’ forecast accuracy differs between the pre- and post-adoption of Australian Equivalents to the International Financial Reporting Standards (AIFRS). We find that forecast accuracy has improved after Australia adopted AIFRS. As a secondary objective, this paper also investigates the role of financial analysts in reducing information asymmetry in today’s Australian capital market. We find weak evidence that more analysts following a stock do not help to improve forecast accuracy by bringing more firm-specific information to the market.

  15. The ability of analysts' recommendations to predict optimistic and pessimistic forecasts.

    Directory of Open Access Journals (Sweden)

    Vahid Biglari

    Full Text Available Previous research shows that buy (growth) companies conduct income-increasing earnings management in order to meet forecasts and generate positive forecast errors (FEs). This behavior, however, is not inherent in sell (non-growth) companies. Using the aforementioned background, this research hypothesizes that since sell companies are pressured to avoid income-increasing earnings management, they are capable, and in fact more inclined, to pursue income-decreasing forecast management (FM) with the purpose of generating positive FEs. Using a sample of 6553 firm-years of companies that are listed in the NYSE between the years 2005-2010, the study determines that sell companies conduct income-decreasing FM to generate positive FEs. However, the frequency of positive FEs of sell companies does not exceed that of buy companies. Using the efficiency perspective, the study suggests that even though buy and sell companies have immense motivation in avoiding negative FEs, they exploit different but efficient strategies, respectively, in order to meet forecasts. Furthermore, the findings illuminate the complexities behind informative and opportunistic forecasts that fall under the efficiency versus opportunistic theories in the literature.

  16. The Ability of Analysts' Recommendations to Predict Optimistic and Pessimistic Forecasts

    Science.gov (United States)

    Biglari, Vahid; Alfan, Ervina Binti; Ahmad, Rubi Binti; Hajian, Najmeh

    2013-01-01

    Previous research shows that buy (growth) companies conduct income increasing earnings management in order to meet forecasts and generate positive forecast Errors (FEs). This behavior, however, is not inherent in sell (non-growth) companies. Using the aforementioned background, this research hypothesizes that since sell companies are pressured to avoid income increasing earnings management, they are capable, and in fact more inclined, to pursue income decreasing Forecast Management (FM) with the purpose of generating positive FEs. Using a sample of 6553 firm-years of companies that are listed in the NYSE between the years 2005–2010, the study determines that sell companies conduct income decreasing FM to generate positive FEs. However, the frequency of positive FEs of sell companies does not exceed that of buy companies. Using the efficiency perspective, the study suggests that even though buy and sell companies have immense motivation in avoiding negative FEs, they exploit different but efficient strategies, respectively, in order to meet forecasts. Furthermore, the findings illuminated the complexities behind informative and opportunistic forecasts that fall under the efficiency versus opportunistic theories in the literature. PMID:24146741

  17. Forecasting Hotspots: A Predictive Analytics Approach.

    Science.gov (United States)

    Maciejewski, R; Hafen, R; Rudolph, S; Larew, S G; Mitchell, M A; Cleveland, W S; Ebert, D S

    2011-04-01

    Current visual analytics systems provide users with the means to explore trends in their data. Linked views and interactive displays provide insight into correlations among people, events, and places in space and time. Analysts search for events of interest through statistical tools linked to visual displays, drill down into the data, and form hypotheses based upon the available information. However, current systems stop short of predicting events. In spatiotemporal data, analysts are searching for regions of space and time with unusually high incidences of events (hotspots). In the cases where hotspots are found, analysts would like to predict how these regions may grow in order to plan resource allocation and preventative measures. Furthermore, analysts would also like to predict where future hotspots may occur. To facilitate such forecasting, we have created a predictive visual analytics toolkit that provides analysts with linked spatiotemporal and statistical analytic views. Our system models spatiotemporal events through the combination of kernel density estimation for event distribution and seasonal trend decomposition by loess smoothing for temporal predictions. We provide analysts with estimates of error in our modeling, along with spatial and temporal alerts to indicate the occurrence of statistically significant hotspots. Spatial data are distributed based on a modeling of previous event locations, thereby maintaining a temporal coherence with past events. Such tools allow analysts to perform real-time hypothesis testing, plan intervention strategies, and allocate resources to correspond to perceived threats.
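
    The spatial half of the modeling described above rests on kernel density estimation of event locations (the temporal half, seasonal-trend decomposition by loess, is omitted here). The following is a minimal Gaussian-KDE sketch on synthetic events, flagging the densest grid cell as a candidate hotspot; the bandwidth, grid, and event data are all invented for illustration:

```python
import numpy as np

def kde2d(points, grid_x, grid_y, bandwidth=1.0):
    """Evaluate a 2-D Gaussian kernel density estimate of event locations on a grid."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    dens = np.zeros_like(gx, dtype=float)
    for px, py in points:
        dens += np.exp(-((gx - px) ** 2 + (gy - py) ** 2) / (2 * bandwidth ** 2))
    return dens / (2 * np.pi * bandwidth ** 2 * len(points))

rng = np.random.default_rng(1)
events = np.concatenate([rng.normal([2, 2], 0.3, size=(40, 2)),   # clustered events (hotspot)
                         rng.uniform(0, 10, size=(10, 2))])       # background noise
xs = ys = np.linspace(0, 10, 50)
density = kde2d(events, xs, ys, bandwidth=0.5)
hotspot = np.unravel_index(density.argmax(), density.shape)
print(xs[hotspot[1]], ys[hotspot[0]])  # peak lies near the simulated cluster at (2, 2)
```

A forecasting system in the paper's spirit would re-estimate this surface per time slice and extrapolate each cell's intensity with the temporal model, alerting when a cell exceeds a significance threshold.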

  18. Many analysts, one dataset: Making transparent how variations in analytical choices affect results

    NARCIS (Netherlands)

    Silberzahn, Raphael; Uhlmann, E.L.; Martin, D.P.; Anselmi, P.; Aust, F.; Awtrey, E.; Bahnik, Š.; Bai, F.; Bannard, C.; Bonnier, E.; Carlsson, R.; Cheung, F.; Christensen, G.; Clay, R.; Craig, M.A.; Dalla Rosa, A.; Dam, Lammertjan; Evans, M.H.; Flores Cervantes, I.; Fong, N.; Gamez-Djokic, M.; Glenz, A.; Gordon-McKeon, S.; Heaton, T.J.; Hederos, K.; Heene, M.; Hofelich Mohr, A.J.; Högden, F.; Hui, K.; Johannesson, M.; Kalodimos, J.; Kaszubowski, E.; Kennedy, D.M.; Lei, R.; Lindsay, T.A.; Liverani, S.; Madan, C.R.; Molden, D.; Molleman, Henricus; Morey, R.D.; Mulder, Laetitia; Nijstad, Bernard; Pope, N.G.; Pope, B.; Prenoveau, J.M.; Rink, Floortje; Robusto, E.; Roderique, H.; Sandberg, A.; Schlüter, E.; Schönbrodt, F.D.; Sherman, M.F.; Sommer, S.A.; Sotak, K.; Spain, S.; Spörlein, C.; Stafford, T.; Stefanutti, L.; Täuber, Susanne; Ullrich, J.; Vianello, M.; Wagenmakers, E.-J.; Witkowiak, M.; Yoon, S.; Nosek, B.A.

    2018-01-01

    Twenty-nine teams involving 61 analysts used the same dataset to address the same research question: whether soccer referees are more likely to give red cards to dark skin toned players than light skin toned players. Analytic approaches varied widely across teams, and estimated effect sizes ranged

  19. Intermediate systems analyst (m/f) | CRDI - Centre de ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The intermediate systems analyst works under the supervision of the ... in the implementation and maintenance of a wide range of systems while ensuring ... directly the contracted external consultants in order to facilitate the transition ...

  20. Evolution of Research on Interventions for Individuals with Autism Spectrum Disorder: Implications for Behavior Analysts

    Science.gov (United States)

    Smith, Tristram

    2012-01-01

    The extraordinary success of behavior-analytic interventions for individuals with autism spectrum disorder (ASD) has fueled the rapid growth of behavior analysis as a profession. One reason for this success is that for many years behavior analysts were virtually alone in conducting programmatic ASD intervention research. However, that era has…

  1. Senior systems analyst (m/f) | CRDI - Centre de recherches ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The senior systems analyst works under the supervision of the ... a basic functional analysis of requirements by discussing their scope and rationale ... and the IT infrastructure while putting themselves in the technicians' place, ...

  2. Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool

    Science.gov (United States)

    Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.

    2011-01-01

    This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…

  3. Prerequisites for Systems Analysts: Analytic and Management Demands of a New Approach to Educational Administration.

    Science.gov (United States)

    Ammentorp, William

    There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…

  4. Presentation of the results of a Bayesian automatic event detection and localization program to human analysts

    Science.gov (United States)

    Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.

    2016-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than its currently operating automatic localization program. However, given CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, extensively used by the analysts, so that they can check the accuracy and robustness of the Bayesian approach. We expect the workload of the analysts to be reduced because of the better performance of NET-VISA in finding missed events and getting a more complete set of stations than the current system, which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations borne out by the automatic tests, which show an overall overlap improvement of 11%, meaning that the missed-events rate is cut by 42%, hold for the integrated interactive module as well. New events are found by analysts, which qualify for the CTBTO Reviewed Event Bulletin, beyond the ones analyzed through the standard procedures. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.

  5. Storing and managing information artifacts collected by information analysts using a computing device

    Science.gov (United States)

    Pike, William A; Riensche, Roderick M; Best, Daniel M; Roberts, Ian E; Whyatt, Marie V; Hart, Michelle L; Carr, Norman J; Thomas, James J

    2012-09-18

    Systems and computer-implemented processes for storage and management of information artifacts collected by information analysts using a computing device. The processes and systems can capture a sequence of interactive operation elements that are performed by the information analyst, who is collecting an information artifact from at least one of the plurality of software applications. The information artifact can then be stored together with the interactive operation elements as a snippet on a memory device, which is operably connected to the processor. The snippet comprises a view from an analysis application, data contained in the view, and the sequence of interactive operation elements stored as a provenance representation comprising operation element class, timestamp, and data object attributes for each interactive operation element in the sequence.

  6. The role of analytic neutrality in the use of the child analyst as a new object.

    Science.gov (United States)

    Chused, J F

    1982-01-01

    The analyses of two children and one adolescent were presented to illustrate the concept that the neutrality of the analyst can be used not only to (a) establish a working, analyzing, and observing alliance, (b) permit the development, recognition, and working through of the transference neurosis, but also to (c) develop a sense of autonomy and self-esteem which had been contaminated by the neediness and lack of true empathy of the primary objects during the practicing and rapprochement phases of separation-individuation. For the patients discussed above, many ego functions which should have had a degree of secondary autonomy were either inhibited, enmeshed in conflict, or experienced as nongenuine, part of a "false self." It was as if the experience with the neutral analyst permitted an "autonomous practicing" that had not been possible during the period of separation-individuation.

  7. Are the People Backward? Algerian Symbolic Analysts and the Culture of the Masses

    Directory of Open Access Journals (Sweden)

    Thomas Serres

    2017-01-01

    Full Text Available This article studies representations of the Algerian population promoted by francophone intellectuals in a context of longstanding crisis and uncertainty. Borrowing the category of symbolic analysts from Robert Reich, it looks at the way in which novelists, scholars and journalists try to make sense of a critical situation by diagnosing the culture of the Algerian population as deviant or backward. Aiming to encourage social and political reform, these actors try to understand the characteristics of their "people", often by pointing to their so-called pre-modern or passive behaviors. This article analyzes two aspects of this activity: first, attempts to determine who is responsible for the ongoing crisis, and second, the reproduction of cultural prejudices in a context of increased transnationalization. Moreover, it argues that one can interpret the political and intellectual commitments of these analysts by drawing on the triad concept of "Naming, Blaming, Claiming", which has been used to study the publicization of disputes.

  8. THE MISSING FATHER FUNCTION IN PSYCHOANALYTIC THEORY AND TECHNIQUE: THE ANALYST'S INTERNAL COUPLE AND MATURING INTIMACY.

    Science.gov (United States)

    Diamond, Michael J

    2017-10-01

    This paper argues that recovering the "missing" paternal function in analytic space is essential for the patient's achievement of mature object relations. Emerging from the helpless infant's contact with primary caregivers, mature intimacy rests on establishing healthy triadic functioning based on an infant-with-mother-and-father. Despite a maternocentric bias in contemporary clinical theory, the emergence of triangularity and the inclusion of the paternal third as a separating element is vital in the analytic dyad. Effective technique requires the analyst's balanced interplay between the paternal, investigative and the maternal, maximally receptive modes of functioning (the good enough analytic couple within the analyst) to serve as the separating element that procreatively fertilizes the capacity for intimacy with a differentiated other. A clinical example illustrates how treatment is limited when the paternal function is minimized within more collusive, unconsciously symbiotic dyads. © 2017 The Psychoanalytic Quarterly, Inc.

  9. Meta-Analyst: software for meta-analysis of binary, continuous and diagnostic data

    Directory of Open Access Journals (Sweden)

    Schmid Christopher H

    2009-12-01

    Full Text Available Abstract Background Meta-analysis is increasingly used as a key source of evidence synthesis to inform clinical practice. The theory and statistical foundations of meta-analysis continually evolve, providing solutions to many new and challenging problems. In practice, most meta-analyses are performed in general statistical packages or dedicated meta-analysis programs. Results Herein, we introduce Meta-Analyst, a novel, powerful, intuitive, and free meta-analysis program for the meta-analysis of a variety of problems. Meta-Analyst is implemented in C# atop the Microsoft .NET framework, and features a graphical user interface. The software performs several meta-analysis and meta-regression models for binary and continuous outcomes, as well as analyses for diagnostic and prognostic test studies in the frequentist and Bayesian frameworks. Moreover, Meta-Analyst includes a flexible tool to edit and customize generated meta-analysis graphs (e.g., forest plots) and provides output in many formats (images, Adobe PDF, Microsoft Word-ready RTF). The software architecture employed allows for rapid changes to be made to either the Graphical User Interface (GUI) or to the analytic modules. We verified the numerical precision of Meta-Analyst by comparing its output with that from standard meta-analysis routines in Stata over a large database of 11,803 meta-analyses of binary outcome data, and 6,881 meta-analyses of continuous outcome data from the Cochrane Library of Systematic Reviews. Results from analyses of diagnostic and prognostic test studies have been verified in a limited number of meta-analyses versus MetaDisc and MetaTest. Bayesian statistical analyses use the OpenBUGS calculation engine (and are thus as accurate as the standalone OpenBUGS software). Conclusion We have developed and validated a new program for conducting meta-analyses that combines the advantages of existing software for this task.
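
    Meta-Analyst is a C#/.NET GUI application, but the core fixed-effect calculation it performs for binary outcomes can be sketched directly: inverse-variance pooling of per-study log odds ratios. The trial counts below are invented, and this is a generic textbook computation rather than Meta-Analyst's own code:

```python
import math

def log_odds_ratio(a, b, c, d):
    """Log odds ratio and its variance for a 2x2 table (events/non-events in two arms)."""
    lor = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d
    return lor, var

def fixed_effect_meta(estimates, variances):
    """Inverse-variance weighted pooled estimate and its variance (fixed-effect model)."""
    weights = [1 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    return pooled, 1 / sum(weights)

# Hypothetical trials: (events_trt, no_events_trt, events_ctl, no_events_ctl)
trials = [(15, 85, 25, 75), (8, 92, 14, 86), (30, 70, 40, 60)]
lors, vars_ = zip(*(log_odds_ratio(*t) for t in trials))
pooled, pooled_var = fixed_effect_meta(lors, vars_)
print(math.exp(pooled))  # pooled odds ratio < 1 (treatment benefit in this toy data)
```

The abstract notes the program also covers meta-regression, diagnostic/prognostic test analyses, and Bayesian models, all of which go beyond this fixed-effect sketch.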

  10. The Insider Threat to Cybersecurity: How Group Process and Ignorance Affect Analyst Accuracy and Promptitude

    Science.gov (United States)

    2017-09-01

    Dissertation by Ryan F. Kelly, September 2017, on how group process and ignorance affect insider-threat analyst accuracy and promptitude. Cited references include McCarthy, J. (1980), Circumscription - A Form of Nonmonotonic Reasoning, Artificial Intelligence, 13, 27–39, and McClure, S., Scambray, J., & Kurtz, G. (2012).

  11. Records Management Analyst (m/f) | CRDI - Centre de ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Summary of duties: The records management analyst is the technical resource within the Information and Records Management team. ... The analyst also keeps informed of all best practices related to the management of electronic and physical information and records that the CRDI must ...

  12. Dividend Policy: A Survey of Malaysian Public Listed Companies and Security Analysts

    OpenAIRE

    Chong, David Voon Chee

    2003-01-01

    This dissertation revisits the dividend puzzle and attempts to answer the fundamental question of “why do companies pay dividends?” by using a simple regression model incorporating the major theories on dividend relevance. In addition, this research investigates if the opinions of companies (both dividend and non-dividend paying) and security analysts differ with respect to the various explanations for paying dividends. Finally, this research also explores the views and opinions of corporate ...

  13. The Effect of Ownership Structure and Investor Protection on Firm Value: Analyst Following as Moderating Variable

    Directory of Open Access Journals (Sweden)

    Desi Susilawati

    2017-12-01

    Full Text Available Research on the association between ownership structure and firm value is part of the corporate governance debate, which still shows contradictory conclusions and mixed results. This indicates an open question that needs empirical evidence. The influence of concentrated ownership on firm value still involves conflicts of interest, so the role of analyst following can be regarded as an alternative corporate governance mechanism (Lang et al., 2004). The objectives of this research are to examine the interaction effect between concentrated ownership and analyst following, and the effect of investor protection, on firm value in five Asian countries. Asia is chosen because of its unique characteristics: corporate ownership structures are more concentrated in families and boards of governance are weak (Choi, 2003). The data consist of 7,100 firm-year observations obtained from the Bloomberg and OSIRIS databases for the period 2011-2013 in five Asian countries, i.e. China, South Korea, Malaysia, Taiwan, and Thailand. Multiple regression analysis is used to test the hypotheses. The results show that concentrated ownership positively affects firm value. However, there is no empirical evidence that the interaction of concentrated ownership and analyst following positively affects firm value. As hypothesized, this research also shows that investor protection has a negative impact on firm value.
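
    The hypothesis test described above can be sketched as an ordinary least-squares regression with an interaction term. This is a minimal illustration on synthetic data, not the study's Bloomberg/OSIRIS dataset; all variable names and coefficients are assumptions.

```python
import numpy as np

# Synthetic stand-ins for the study's variables (illustrative only)
rng = np.random.default_rng(0)
n = 500
ownership = rng.uniform(0, 1, n)      # stake of the largest shareholder
analysts = rng.poisson(5, n)          # number of analysts following
noise = rng.normal(0, 0.5, n)
firm_value = 1.0 + 0.8 * ownership + 0.05 * analysts + noise  # e.g. Tobin's Q

# Design matrix: intercept, main effects, and the interaction term
X = np.column_stack([
    np.ones(n),
    ownership,
    analysts,
    ownership * analysts,  # does analyst following moderate ownership's effect?
])
beta, *_ = np.linalg.lstsq(X, firm_value, rcond=None)
print(beta)  # [intercept, ownership, analysts, interaction]
```

    A significant coefficient on the interaction column would support the moderating role of analyst following; in this synthetic setup the true interaction effect is zero.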

  14. A feasibility study for Arizona's roadway safety management process using the Highway Safety Manual and SafetyAnalyst : final report.

    Science.gov (United States)

    2016-07-01

    To enable implementation of the American Association of State Highway and Transportation Officials (AASHTO) Highway Safety Manual using SafetyAnalyst (an AASHTOWare software product), the Arizona Department of Transportation (ADOT) studied the data assessment ...

  15. The body of the analyst and the analytic setting: reflections on the embodied setting and the symbiotic transference.

    Science.gov (United States)

    Lemma, Alessandra

    2014-04-01

    In this paper the author questions whether the body of the analyst may be helpfully conceptualized as an embodied feature of the setting and suggests that this may be especially helpful for understanding patients who develop a symbiotic transference and for whom any variance in the analyst's body is felt to be profoundly destabilizing. In such cases the patient needs to relate to the body of the analyst concretely and exclusively as a setting 'constant' and its meaning for the patient may thus remain inaccessible to analysis for a long time. When the separateness of the body of the analyst reaches the patient's awareness because of changes in the analyst's appearance or bodily state, it then mobilizes primitive anxieties in the patient. It is only when the body of the analyst can become a dynamic variable between them (i.e., part of the process) that it can be used by the patient to further the exploration of their own mind. Copyright © 2014 Institute of Psychoanalysis.

  16. A Few Insights Into Romanian Information Systems Analysts and Designers Toolbox

    Directory of Open Access Journals (Sweden)

    Fotache Marin

    2017-06-01

    Full Text Available Information Systems (IS) analysts and designers have been key members of software development teams. From waterfall to the Rational Unified Process, from UML to agile development, IS modelers have faced many trends and buzzwords. Even though the topic of models and modeling tools in software development is important, there are not many detailed studies identifying why developers, customers, and managers decide to use particular modeling methods and tools. Despite the popularity of the subject, studies showing which tools IS modelers prefer are scarce, and quasi-non-existent for the Romanian market. As Romania is an important IT outsourcing market, this paper investigates what methods and tools Romanian IS analysts and designers apply. In this context, the starting question of our research focuses on the preference of developers to choose between agile or non-agile methods in IT projects. As a result, the research questions targeted the main drivers in choosing specific methods and tools for IT projects deployed in Romanian companies. One of the main objectives of this paper was also to examine the relationship between the methodologies (agile or non-agile), diagrams and other tools (we refer in our study to CASE features), and other variables/metrics of the system/software development project. The observational study was conducted based on a survey filled in by IS modelers in Romanian IT companies. The data collected were processed and analyzed using Exploratory Data Analysis. The platform for data visualization and analysis was R.

  17. Analyzing the Qualitative Data Analyst: A Naturalistic Investigation of Data Interpretation

    Directory of Open Access Journals (Sweden)

    Wolff-Michael Roth

    2015-07-01

    Full Text Available Much qualitative research involves the analysis of verbal data. Although the possibility of conducting qualitative research in a rigorous manner is sometimes contested in debates over qualitative/quantitative methods, there are scholarly communities within which qualitative research is indeed data driven and enacted in rigorous ways. How might one teach rigorous approaches to the analysis of verbal data? In this study, 20 sessions were recorded in introductory graduate classes on qualitative research methods. The social scientist thought aloud while analyzing transcriptions that were handed to her immediately prior to the sessions and for which she had no background information. The students then assessed, sometimes showing the original video, the degree to which the analyst had recovered (the structures of) the original events. This study provides answers to the broad question: "How does an analyst recover an original event with a high degree of accuracy?" Implications are discussed for teaching qualitative data analysis. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1503119

  18. Could the outcome of the 2016 US elections have been predicted from past voting patterns?

    CSIR Research Space (South Africa)

    Schmitz, Peter MU

    2017-07-01

    Full Text Available In South Africa, a team of analysts has for some years been using statistical techniques to predict election outcomes during election nights. The prediction method involves using statistical clusters based on past voting patterns...
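
    The cluster-based idea can be sketched as follows; this is a hypothetical illustration, not the CSIR team's actual model: districts are grouped into clusters with similar past voting behaviour, and the districts already reported in a cluster are used to project those still outstanding.

```python
from collections import defaultdict

# (cluster_id, votes_for_party, total_votes, reported?) per district;
# all numbers are invented for illustration.
districts = [
    ("urban", 620, 1000, True),
    ("urban", 580, 1000, True),
    ("urban", None, 1000, False),   # not yet reported
    ("rural", 300, 1000, True),
    ("rural", None, 1000, False),
]

# Average vote share of the party in each cluster's reported districts
share_by_cluster = defaultdict(list)
for cluster, votes, total, reported in districts:
    if reported:
        share_by_cluster[cluster].append(votes / total)

# Project unreported districts with their cluster's mean share
projected = 0.0
total_votes = 0
for cluster, votes, total, reported in districts:
    share = (votes / total) if reported else (
        sum(share_by_cluster[cluster]) / len(share_by_cluster[cluster]))
    projected += share * total
    total_votes += total
print(f"projected national share: {projected / total_votes:.3f}")
```

    As more districts report, the cluster means and the projection are simply recomputed, which is what makes the approach usable live on election night.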

  19. [Concordance among analysts from Latin-American laboratories for rice grain appearance determination using a gallery of digital images].

    Science.gov (United States)

    Avila, Manuel; Graterol, Eduardo; Alezones, Jesús; Criollo, Beisy; Castillo, Dámaso; Kuri, Victoria; Oviedo, Norman; Moquete, Cesar; Romero, Marbella; Hanley, Zaida; Taylor, Margie

    2012-06-01

    The appearance of rice grain is a key aspect in quality determination. This analysis is mainly performed by expert analysts through visual observation; however, due to the subjective nature of the analysis, results may vary among analysts. In order to evaluate the concordance between analysts from Latin-American rice quality laboratories in determining rice grain appearance through digital images, an inter-laboratory test was performed with ten analysts and images of 90 grains captured with a high-resolution scanner. Rice grains were classified into four categories: translucent, chalky, white belly, and damaged grain. Data were analyzed using statistical parameters such as the mode and its frequency, the relative concordance, and the reproducibility parameter kappa. Additionally, a referential image gallery of typical grains for each category was constructed based on mode frequency. Results showed a kappa value of 0.49, corresponding to moderate reproducibility, attributable to subjectivity in the visual analysis of grain images. These results reveal the need to standardize the evaluation criteria among analysts to improve confidence in the determination of rice grain appearance.
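
    A reproducibility kappa for many raters and several categories is commonly computed as Fleiss' kappa. A minimal sketch, assuming an items-by-categories count matrix; the counts below are invented, not the study's data.

```python
import numpy as np

def fleiss_kappa(ratings: np.ndarray) -> float:
    """Fleiss' kappa for an (items x categories) count matrix, where
    ratings[i, j] = number of raters assigning item i to category j.
    Assumes the same number of raters for every item."""
    n_items, _ = ratings.shape
    n_raters = ratings[0].sum()
    p_j = ratings.sum(axis=0) / (n_items * n_raters)   # category proportions
    P_i = ((ratings ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar, P_e = P_i.mean(), (p_j ** 2).sum()
    return float((P_bar - P_e) / (1 - P_e))

# Toy example: 5 grains rated by 10 analysts into 4 categories
# (translucent, chalky, white belly, damaged)
counts = np.array([
    [10, 0, 0, 0],
    [8, 2, 0, 0],
    [0, 9, 1, 0],
    [0, 2, 6, 2],
    [1, 1, 2, 6],
])
print(round(fleiss_kappa(counts), 3))
```

    Values around 0.4-0.6 are conventionally read as "moderate" agreement, which matches the 0.49 reported above.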

  1. The Stewardship Role of Analyst Forecasts, and Discretionary Versus Non-Discretionary Accruals

    DEFF Research Database (Denmark)

    Christensen, Peter Ove; Frimor, Hans; Sabac, Florin

    2013-01-01

    We examine the interaction between discretionary and non-discretionary accruals in a stewardship setting. Contracting includes multiple rounds of renegotiation based on contractible accounting information and non-contractible but more timely non-accounting information. We show that accounting...... timely non-accounting information (analyst earnings forecasts) increases the ex ante value of the firm and reduces costly earnings management. There is an optimal level of reversible non-discretionary accrual noise introduced through revenue recognition policies. Tight rules-based accounting regulation...... regulation aimed at increasing earnings quality from a valuation perspective (earnings persistence) may have a significant impact on how firms rationally respond in terms of allowing accrual discretion in order to alleviate the impact on the stewardship role of earnings. Increasing the precision of more...

  2. Complexity Analysis of Industrial Organizations Based on a Perspective of Systems Engineering Analysts

    Directory of Open Access Journals (Sweden)

    I. H. Garbie

    2011-12-01

    Full Text Available Complexity in industrial organizations has become more difficult to analyze and resolve, and it needs more attention from academics and practitioners. For these reasons, complexity in industrial organizations represents a new challenge for the coming decades. Until now, the analysis of industrial organization complexity has remained a research topic of immense international interest, and such organizations require reduction of their complexity. In this paper, an analysis of complexity in industrial organizations is presented from the perspective of a systems engineering analyst. In this perspective, the analysis of complexity was divided into different levels, defined as complexity levels. A framework for analyzing these levels was proposed based on the complexity in industrial organizations. The analysis was divided into four main issues: industrial system vision, industrial system structure, industrial system operation, and industrial system evaluation. This analysis shows that the complexity of industrial organizations is still an ill-structured and multi-dimensional problem.

  3. Teleconsultation in school settings: linking classroom teachers and behavior analysts through web-based technology.

    Science.gov (United States)

    Frieder, Jessica E; Peterson, Stephanie M; Woodward, Judy; Crane, Jaelee; Garner, Marlane

    2009-01-01

    This paper describes a technically driven, collaborative approach to assessing the function of problem behavior using web-based technology. A case example is provided to illustrate the process used in this pilot project. A school team conducted a functional analysis with a child who demonstrated challenging behaviors in a preschool setting. Behavior analysts at a university setting provided the school team with initial workshop trainings, on-site visits, e-mail and phone communication, as well as live web-based feedback on functional analysis sessions. The school personnel implemented the functional analysis with high fidelity and scored the data reliably. Outcomes of the project suggest that there is great potential for collaboration via the use of web-based technologies for ongoing assessment and development of effective interventions. However, an empirical evaluation of this model should be conducted before wide-scale adoption is recommended.

  4. Analyst Tools and Quality Control Software for the ARM Data System

    Energy Technology Data Exchange (ETDEWEB)

    Moore, S.T.

    2004-12-14

    ATK Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed a web-based data analysis and visualization tool, called NCVweb, that allows for easy viewing of ARM NetCDF files. NCVweb, along with our library of sharable Interactive Data Language procedures and functions, allows even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers.
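
    Automated outlier flagging of the kind described can be sketched with a rolling median and MAD threshold; this is an illustrative approach, not the Data Quality Office's actual algorithm.

```python
import numpy as np

def flag_outliers(x: np.ndarray, window: int = 11, k: float = 5.0) -> np.ndarray:
    """Return a boolean mask marking points far from the local median."""
    flags = np.zeros(len(x), dtype=bool)
    half = window // 2
    for i in range(len(x)):
        seg = x[max(0, i - half): i + half + 1]
        med = np.median(seg)
        mad = np.median(np.abs(seg - med)) or 1e-12   # guard zero MAD
        # 1.4826 * MAD approximates one standard deviation for Gaussian noise
        flags[i] = abs(x[i] - med) > k * 1.4826 * mad
    return flags

data = np.sin(np.linspace(0, 6, 200))   # smooth synthetic data stream
data[50] += 10.0                        # inject an instrument spike
print(np.where(flag_outliers(data))[0])
```

    The median/MAD pair is preferred over mean/standard deviation here because a single spike barely shifts either statistic, so the spike itself stands out.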

  5. Oncology Modeling for Fun and Profit! Key Steps for Busy Analysts in Health Technology Assessment.

    Science.gov (United States)

    Beca, Jaclyn; Husereau, Don; Chan, Kelvin K W; Hawkins, Neil; Hoch, Jeffrey S

    2018-01-01

    In evaluating new oncology medicines, two common modeling approaches are state transition (e.g., Markov and semi-Markov) and partitioned survival. Partitioned survival models have become more prominent in oncology health technology assessment processes in recent years. Our experience in conducting and evaluating models for economic evaluation has highlighted many important and practical pitfalls. As there is little guidance available on best practices for those who wish to conduct them, we provide guidance in the form of 'Key steps for busy analysts,' who may have very little time and require highly favorable results. Our guidance highlights the continued need for rigorous conduct and transparent reporting of economic evaluations regardless of the modeling approach taken, and the importance of modeling that better reflects reality, which includes better approaches to considering plausibility, estimating relative treatment effects, dealing with post-progression effects, and appropriate characterization of the uncertainty from modeling itself.
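
    The partitioned survival approach mentioned above can be illustrated in a few lines: membership in three health states is partitioned from two survival curves, progression-free survival (PFS) and overall survival (OS), with the progressed state taken as their difference. The exponential curves below are invented for illustration, not from any real trial.

```python
import math

def pfs(t: float) -> float:
    return math.exp(-0.10 * t)   # probability of being alive and progression-free

def os_(t: float) -> float:
    return math.exp(-0.05 * t)   # probability of being alive

# State occupancy at each time point; the three states always sum to 1
for t in range(0, 25, 6):
    progression_free = pfs(t)
    progressed = os_(t) - pfs(t)   # alive but progressed
    dead = 1.0 - os_(t)
    print(t, round(progression_free, 3), round(progressed, 3), round(dead, 3))
```

    One of the pitfalls the authors warn about is visible here: the progressed state is defined purely by subtraction, so there is no structural link between progression and subsequent death, unlike in a state transition model.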

  6. What Performance Analysts Need to Know About Research Trends in Association Football (2012-2016): A Systematic Review.

    Science.gov (United States)

    Sarmento, Hugo; Clemente, Filipe Manuel; Araújo, Duarte; Davids, Keith; McRobert, Allistair; Figueiredo, António

    2018-04-01

    Evolving patterns of match analysis research need to be systematically reviewed regularly since this area of work is burgeoning rapidly and studies can offer new insights to performance analysts if theoretically and coherently organized. The purpose of this paper was to conduct a systematic review of published articles on match analysis in adult male football, identify and organize common research topics, and synthesize the emerging patterns of work between 2012 and 2016, according to the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) guidelines. The Web of Science database was searched for relevant published studies using the following keywords: 'football' and 'soccer', each one associated with the terms 'match analysis', 'performance analysis', 'notational analysis', 'game analysis', 'tactical analysis' and 'patterns of play'. Of 483 studies initially identified, 77 were fully reviewed and their outcome measures extracted and analyzed. Results showed that research mainly focused on (1) performance at set pieces, i.e. corner kicks, free kicks, penalty kicks; (2) collective system behaviours, captured by established variables such as team centroid (geometrical centre of a set of players) and team dispersion (quantification of how far players are apart), as well as tendencies for team communication (establishing networks based on passing sequences), sequential patterns (predicting future passing sequences), and group outcomes (relationships between match-related statistics and final match scores); and (3) activity profile of players, i.e. playing roles, effects of fatigue, substitutions during matches, and the effects of environmental constraints on performance, such as heat and altitude. From the previous review, novel variables were identified that require new measurement techniques. It is evident that the complexity engendered during performance in competitive soccer requires an integrated approach that considers multiple aspects. 
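
    Two of the collective variables named in the review, team centroid and team dispersion, can be computed directly from player coordinates. A minimal sketch with made-up pitch positions:

```python
import math

# (x, y) pitch positions in metres of 10 outfield players at one instant
players = [(12, 30), (25, 28), (30, 40), (35, 22), (40, 35),
           (45, 30), (50, 45), (55, 25), (60, 38), (65, 32)]

# centroid: geometrical centre of the set of players
cx = sum(p[0] for p in players) / len(players)
cy = sum(p[1] for p in players) / len(players)

# dispersion: mean distance of players from the centroid
dispersion = sum(math.hypot(x - cx, y - cy) for x, y in players) / len(players)

print(f"centroid=({cx:.1f}, {cy:.1f}), dispersion={dispersion:.1f} m")
```

    Tracking these two scalars frame by frame is what lets analysts quantify how a team advances and how compact it stays during different match phases.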

  7. A Comparative Study of Automated Infrasound Detectors - PMCC and AFD with Analyst Review.

    Energy Technology Data Exchange (ETDEWEB)

    Park, Junghyun; Hayward, Chris; Zeiler, Cleat; Arrowsmith, Stephen John; Stump, Brian

    2015-08-01

    Automated detections calculated by the progressive multi-channel correlation (PMCC) method (Cansi, 1995) and the adaptive F detector (AFD) (Arrowsmith et al., 2009) are compared to the signals identified by five independent analysts. Each detector was applied to a four-hour time sequence recorded by the Korean infrasound array CHNAR. This array was used because it is composed of both small (<100 m) and large (~1000 m) aperture element spacing. The four-hour time sequence contained a number of easily identified signals under noise conditions whose average RMS amplitudes varied from 1.2 to 4.5 mPa (1 to 5 Hz), estimated with a running five-minute window. The effectiveness of the detectors was estimated for the small aperture, large aperture, small aperture combined with the large aperture, and full array. The full and combined arrays performed best for AFD under all noise conditions, while the large aperture array had the poorest performance for both detectors. PMCC produced results similar to AFD under the lower noise conditions, but did not produce as dramatic an increase in detections using the full and combined arrays. Both automated detectors and the analysts produced a decrease in detections under the higher noise conditions. Comparing the detection probabilities with Estimated Receiver Operating Characteristic (EROC) curves, we found that the smaller value of consistency for PMCC and the larger p-value for AFD had the highest detection probability. These parameters produced greater changes in detection probability than estimates of the false alarm rate. The detection probability was impacted the most by noise level, with low noise (average RMS amplitude of 1.7 mPa) yielding an average detection probability of ~40% and high noise (average RMS amplitude of 2.9 mPa) an average detection probability of ~23%.
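
    The noise characterization used above, RMS amplitude in a running window, can be sketched as follows; the signal and sample rate are assumptions chosen for brevity, not CHNAR data.

```python
import math

def running_rms(samples, window):
    """RMS amplitude over a sliding window (one value per full window)."""
    out = []
    for i in range(len(samples) - window + 1):
        seg = samples[i:i + window]
        out.append(math.sqrt(sum(s * s for s in seg) / window))
    return out

rate_hz = 1                     # assumed sample rate, kept low for brevity
window = 5 * 60 * rate_hz       # five-minute running window
signal = [2.0] * (2 * window)   # constant 2.0 mPa synthetic "noise"
rms = running_rms(signal, window)
print(rms[0])                   # constant signal: RMS equals the amplitude
```

    In practice the waveform would first be band-pass filtered (1 to 5 Hz above) before the window statistic is computed.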

  8. Interfacing a biosurveillance portal and an international network of institutional analysts to detect biological threats.

    Science.gov (United States)

    Riccardo, Flavia; Shigematsu, Mika; Chow, Catherine; McKnight, C Jason; Linge, Jens; Doherty, Brian; Dente, Maria Grazia; Declich, Silvia; Barker, Mike; Barboza, Philippe; Vaillant, Laetitia; Donachie, Alastair; Mawudeku, Abla; Blench, Michael; Arthur, Ray

    2014-01-01

    The Early Alerting and Reporting (EAR) project, launched in 2008, is aimed at improving global early alerting and risk assessment and evaluating the feasibility and opportunity of integrating the analysis of biological, chemical, radionuclear (CBRN), and pandemic influenza threats. At a time when no international collaborations existed in the field of event-based surveillance, EAR's innovative approach involved both epidemic intelligence experts and internet-based biosurveillance system providers in the framework of an international collaboration called the Global Health Security Initiative, which involved the ministries of health of the G7 countries and Mexico, the World Health Organization, and the European Commission. The EAR project pooled data from 7 major internet-based biosurveillance systems onto a common portal that was progressively optimized for biological threat detection under the guidance of epidemic intelligence experts from public health institutions in Canada, the European Centre for Disease Prevention and Control, France, Germany, Italy, Japan, the United Kingdom, and the United States. The group became the first end users of the EAR portal, constituting a network of analysts working with a common standard operating procedure and risk assessment tools on a rotation basis to constantly screen and assess public information on the web for events that could suggest an intentional release of biological agents. Following the first 2-year pilot phase, the EAR project was tested in its capacity to monitor biological threats, proving that its working model was feasible and demonstrating the high commitment of the countries and international institutions involved. During the testing period, analysts using the EAR platform did not miss intentional events of a biological nature and did not issue false alarms. Through the findings of this initial assessment, this article provides insights into how the field of epidemic intelligence can advance through an

  9. The Effects of Bug-in-Ear Coaching on Pre-Service Behavior Analysts' Use of Functional Communication Training.

    Science.gov (United States)

    Artman-Meeker, Kathleen; Rosenberg, Nancy; Badgett, Natalie; Yang, Xueyan; Penney, Ashley

    2017-09-01

    Behavior analysts play an important role in supporting the behavior and learning of young children with disabilities in natural settings. However, there is very little research related specifically to developing the skills and competencies needed by pre-service behavior analysts. This study examined the effects of "bug-in-ear" (BIE) coaching on pre-service behavior analysts' implementation of functional communication training with pre-school children with autism in their classrooms. BIE coaching was associated with increases in the rate of functional communication training trials each intern initiated per session and in the fidelity with which interns implemented functional communication training. Adults created more intentional opportunities for children to communicate, and adults provided more systematic instruction around those opportunities.

  10. School-Wide PBIS: Extending the Impact of Applied Behavior Analysis. Why is This Important to Behavior Analysts?

    Science.gov (United States)

    Putnam, Robert F; Kincaid, Donald

    2015-05-01

    Horner and Sugai (2015) recently wrote a manuscript providing an overview of school-wide positive behavioral interventions and supports (PBIS) and why it is an example of applied behavior analysis at the scale of social importance. This paper will describe why school-wide PBIS is important to behavior analysts, how it helps promote applied behavior analysis in schools and other organizations, and how behavior analysts can use this framework to assist them in the promotion and implementation of applied behavior analysis both at the school and organizational level and at the classroom and individual level.

  11. Emotional Detachment of Partners and the Sanctity of the Relationship with the Analyst as the Most Powerful Curative Factor.

    Science.gov (United States)

    Gostečnik, Christian; Slavič, Tanja Repič; Lukek, Saša Poljak; Pate, Tanja; Cvetek, Robert

    2017-08-01

    The relationship between partners and the analyst is considered the most basic means of healing in contemporary psychoanalytic theories and analyses. It also stands as one of the most fundamental phenomena of psychoanalysis, so it comes as no surprise that it has always been deliberated over as an object of great interest as well as immense controversy. This same relationship, mutually co-created by the analyst and each individual and partner in analysis, also represents the core of sanctity and sacred space in contemporary psychoanalysis.

  12. Monday-Morning Quarterbacking: A Senior Analyst Uses His Early Work to Discuss Contemporary Child and Adolescent Psychoanalytic Technique.

    Science.gov (United States)

    Sugarman, Alan

    2015-01-01

    Contemporary child and adolescent psychoanalytic technique has evolved and changed a great deal in the last thirty years. This paper will describe the analysis of an adolescent girl from early in the author's career to demonstrate the ways in which technique has changed. The clinical material presented highlights six areas in which contemporary child and adolescent analysts practice and/or understand material and the clinical process differently than they did thirty years ago: (1) the contemporary perspective on mutative action, (2) the contemporary emphasis on mental organization, (3) the developmental lag in integrating the structural model, (4) the child analyst's multiple functions, (5) the child analyst's use of countertransference, and (6) the child analyst's work with parents. The author discusses how he would work differently with the patient now using his contemporary perspective. But he also wonders what might have been lost by not working in a more traditional manner, in particular the opportunity to analyze the patient's hypersensitivity to feeling hurt and mistreated so directly in the transference.

  13. Is Student Performance on the Information Systems Analyst Certification Exam Affected by Form of Delivery of Information Systems Coursework?

    Science.gov (United States)

    Haga, Wayne; Moreno, Abel; Segall, Mark

    2012-01-01

    In this paper, we compare the performance of Computer Information Systems (CIS) majors on the Information Systems Analyst (ISA) Certification Exam. The impact that the form of delivery of information systems coursework may have on the exam score is studied. Using a sample that spans three years, we test for significant differences between scores…

  14. Behavior Analysts to the Front! A 15-Step Tutorial on Public Speaking.

    Science.gov (United States)

    Friman, Patrick C

    2014-10-01

    Mainstream prominence was Skinner's vision for behavior analysis. Unfortunately, it remains elusive, even as we approach the 110th anniversary of his birth. It can be achieved, however, and there are many routes. One that seems overlooked in many (most?) behavior analytic training programs is what I call the front of the room. The front of the room is a very powerful locus for influencing people. Mastering it can turn a commoner into a king; a middling man into a mayor; or a group of disorganized, dispirited people into an energized force marching into battle. The most powerful members of our species had their most memorable moments at the front of the room. If so much is available there, why is mastery of it in such short supply, not just in behavior analysts but in the population at large? In this paper, I address why, argue that the primary reason can be overcome, and supply 15 behaviorally based steps to take in pursuit of front of the room mastery.

  15. Building Fire Behavior Analyst (FBAN) capability and capacity: Lessons learned From Victoria, Australia's Bushfire Behavior Predictive Services Strategy

    Science.gov (United States)

    K. E. Gibos; A. Slijepcevic; T. Wells; L. Fogarty

    2015-01-01

    Wildland fire managers must frequently make meaning from chaos in order to protect communities and infrastructure from the negative impacts of fire. Fire management personnel are increasingly turning to science to support their experience-based decision-making processes and to provide clear, confident leadership for communities frequently exposed to risk from wildfire...

  16. Identifying the Education Needs of the Business Analyst: An Australian Study

    Directory of Open Access Journals (Sweden)

    Deborah Richards

    2014-06-01

    Full Text Available The Business Analyst (BA) plays a key role in ensuring that technology is appropriately used to achieve the organisation’s goals. This important mediating role is currently in high (unmet) demand in many English-speaking countries, and thus more people need to be trained for this role. To determine the educational and/or training needs of a BA, we conducted a survey in the Information and Communication Technology industry in Australia. The survey items are based on prior studies of information systems educational requirements and the internationally developed Skills Framework for the Information Age (SFIA), which has been endorsed by the Australian Computer Society. From the literature we identified three types of skills: soft, business and technical. With the increasing importance of GreenIT and the pivotal role that the BA could play in green decision making, we added a fourth type of skill: green. The survey considers 85 skills, their importance, the level of attainment of that skill, skill gaps and types of skills. Results show that all soft skills were considered to be important, with the smallest knowledge gaps. Selected business skills and green skills were seen to be important. Technical skills were considered less important, but were also where the largest knowledge gaps existed. Further, we asked respondents whether each skill should be acquired via an undergraduate or postgraduate degree and/or industry training and experience. We found that the workplace was considered the most appropriate place to acquire and/or develop all skills, except the ability to innovate. While we found that soft skills should be taught almost equally at the undergraduate and postgraduate level, business and green skills were more appropriate in a postgraduate degree. In contrast, technical skills were best acquired in an undergraduate program of study.

  17. Evaluating the results of a site-specific PSHA from the perspective of a risk analyst

    Science.gov (United States)

    Klügel, Jens-Uwe

    2016-04-01

From 1998 to 2015, Swiss nuclear power plants sponsored a set of comprehensive site-specific PSHA studies (PEGASOS, PEGASOS Refinement Project) to obtain the input required for their plant-specific probabilistic risk assessments, following the US SSHAC procedures at their most elaborate level 4. The studies were performed by well-known earth scientists working completely independently of the sponsors, under participatory review by the Swiss Nuclear Safety Inspectorate. Risk analysts of the Swiss nuclear power plants have recently been mandated to implement the final results of the studies in their risk assessment studies. This triggered an in-depth assessment of the results, focused on their practical applicability to risk studies. The assessment yielded some important insights of interest for future PSHA studies performed for new nuclear power plants. It included a review of the completeness of the results with respect to risk applications, as well as plausibility checks of hazard results based on Black Swan theory and known historical events. The key lessons, and recommendations for more detailed project output specifications for future projects, are presented in the paper. It was established that future PSHA projects should provide the joint probability distribution of ground motion hazard and the associated strong-motion duration as output, to allow for a technically meaningful risk assessment. The recommendation of WENRA (West European Nuclear Regulators), published in their reference levels, to perform natural hazard assessment preferably on physical grounds (deterministic method) is also rationalized, by recommending a holistic approach to hazard analysis that compares PSHA insights with the results of deterministic seismic hazard analysis modelling.

  18. LES PREVISIONS DES ANALYSTES FINANCIERS ET LES INCORPORELS : LES IAS/IFRS APPORTENT-ELLES UNE AMELIORATION ?

    OpenAIRE

    Lenormand , Gaëlle; Touchais , Lionel

    2017-01-01

Due to identification and assessment difficulties, the accounting system does not always adequately take intangibles into account. With the IFRS, there are new accounting rules for these items. The article analyzes whether these changes convey more useful information for intangible assets, with an improvement in analysts' earnings forecasts. To test this question, we use a sample of 209 firms listed on Euronext over 9 years with the national GAAP from 200...

  19. Clearing and settlement of interbank card transactions: a MasterCard tutorial for Federal Reserve payments analysts

    OpenAIRE

    Susan Herbst-Murphy

    2013-01-01

    The Payment Cards Center organized a meeting at which senior officials from MasterCard shared information with Federal Reserve System payments analysts about the clearing and settlement functions that MasterCard performs for its client banks. These functions involve the transfer of information pertaining to card-based transactions (clearing) and the exchange of monetary value (settlement) that takes place between the banks whose customers are cardholders and those banks whose customers are ca...

  20. Conserving analyst attention units: use of multi-agent software and CEP methods to assist information analysis

    Science.gov (United States)

    Rimland, Jeffrey; McNeese, Michael; Hall, David

    2013-05-01

Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state of the art still falls short of computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements, based both on a priori knowledge of the analyst's goals and on the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability, as well as enabling collaborative context-aware reasoning in both human teams and hybrid human/software-agent teams.
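The attention-focusing idea in this record can be caricatured in a few lines: a sliding-window event filter (the basic CEP pattern) that surfaces only dense bursts of suspicious events to the analyst rather than every raw event. This is a minimal sketch; the function and event names are hypothetical and not taken from the Penn State framework.

```python
from collections import deque

def cep_alerts(events, window=5, threshold=3):
    """Flag time steps where the count of 'suspicious' events inside the
    sliding window reaches the threshold, directing analyst attention
    only to bursts. events is a sequence of (kind, payload) pairs."""
    recent = deque(maxlen=window)  # keeps only the last `window` kinds
    alerts = []
    for t, (kind, payload) in enumerate(events):
        recent.append(kind)
        if sum(1 for k in recent if k == "suspicious") >= threshold:
            alerts.append((t, payload))
    return alerts
```

A real CEP engine adds typed event streams, temporal joins, and pattern queries, but the window-and-threshold core is the same.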

  1. Cyber Situation Awareness through Instance-Based Learning: Modeling the Security Analyst in a Cyber-Attack Scenario

    Science.gov (United States)

    2012-01-01

...program and obtain control of the machine (event 21 of 25). During the course of this simple scenario, a security analyst is able to observe...

  2. ON THE ANALYST'S IDENTIFICATION WITH THE PATIENT: THE CASE OF J.-B. PONTALIS AND G. PEREC.

    Science.gov (United States)

    Schwartz, Henry P

    2016-01-01

    The writer Georges Perec was in psychoanalysis with Jean-Bertrand Pontalis for four years in the early 1970s. In this essay, the author presents the exceptional interest this analyst took in this patient and the ways in which that interest manifested itself in his work, psychoanalytic and otherwise. Many correlative factors suggest that identificatory processes persisted beyond the treatment and were maintained into Pontalis's later life. While this paper is primarily intended to provide evidence to support this view of a specific case, the author closes by reflecting that this may be a more general phenomenon and the reasons for this. © 2016 The Psychoanalytic Quarterly, Inc.

  3. Behavior analysts in the war on poverty: A review of the use of financial incentives to promote education and employment.

    Science.gov (United States)

    Holtyn, August F; Jarvis, Brantley P; Silverman, Kenneth

    2017-01-01

    Poverty is a pervasive risk factor underlying poor health. Many interventions that have sought to reduce health disparities associated with poverty have focused on improving health-related behaviors of low-income adults. Poverty itself could be targeted to improve health, but this approach would require programs that can consistently move poor individuals out of poverty. Governments and other organizations in the United States have tested a diverse range of antipoverty programs, generally on a large scale and in conjunction with welfare reform initiatives. This paper reviews antipoverty programs that used financial incentives to promote education and employment among welfare recipients and other low-income adults. The incentive-based, antipoverty programs had small or no effects on the target behaviors; they were implemented on large scales from the outset, without systematic development and evaluation of their components; and they did not apply principles of operant conditioning that have been shown to determine the effectiveness of incentive or reinforcement interventions. By applying basic principles of operant conditioning, behavior analysts could help address poverty and improve health through development of effective antipoverty programs. This paper describes a potential framework for a behavior-analytic antipoverty program, with the goal of illustrating that behavior analysts could be uniquely suited to make substantial contributions to the war on poverty. © 2017 Society for the Experimental Analysis of Behavior.

  4. A Tracking Analyst for large 3D spatiotemporal data from multiple sources (case study: Tracking volcanic eruptions in the atmosphere)

    Science.gov (United States)

    Gad, Mohamed A.; Elshehaly, Mai H.; Gračanin, Denis; Elmongui, Hicham G.

    2018-02-01

    This research presents a novel Trajectory-based Tracking Analyst (TTA) that can track and link spatiotemporally variable data from multiple sources. The proposed technique uses trajectory information to determine the positions of time-enabled and spatially variable scatter data at any given time through a combination of along trajectory adjustment and spatial interpolation. The TTA is applied in this research to track large spatiotemporal data of volcanic eruptions (acquired using multi-sensors) in the unsteady flow field of the atmosphere. The TTA enables tracking injections into the atmospheric flow field, the reconstruction of the spatiotemporally variable data at any desired time, and the spatiotemporal join of attribute data from multiple sources. In addition, we were able to create a smooth animation of the volcanic ash plume at interactive rates. The initial results indicate that the TTA can be applied to a wide range of multiple-source data.
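At its simplest, the "along trajectory adjustment" the abstract describes amounts to interpolating a tracked point's position between trajectory samples at an arbitrary query time. A minimal sketch under that assumption (the function name and data layout are hypothetical, not from the TTA):

```python
def position_at(trajectory, t):
    """Linearly interpolate a tracked point's (x, y) position along its
    trajectory at query time t. trajectory is a list of (time, x, y)
    samples sorted by time; times outside the range clamp to the ends."""
    if t <= trajectory[0][0]:
        return trajectory[0][1:]
    for (t0, x0, y0), (t1, x1, y1) in zip(trajectory, trajectory[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)  # fractional position in the segment
            return (x0 + w * (x1 - x0), y0 + w * (y1 - y0))
    return trajectory[-1][1:]
```

Reconstructing a whole plume at time t is then a matter of applying this per tracked point and spatially interpolating the attribute values among the repositioned points.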

  5. Predictive maintenance primer

    International Nuclear Information System (INIS)

    Flude, J.W.; Nicholas, J.R.

    1991-04-01

    This Predictive Maintenance Primer provides utility plant personnel with a single-source reference to predictive maintenance analysis methods and technologies used successfully by utilities and other industries. It is intended to be a ready reference to personnel considering starting, expanding or improving a predictive maintenance program. This Primer includes a discussion of various analysis methods and how they overlap and interrelate. Additionally, eighteen predictive maintenance technologies are discussed in sufficient detail for the user to evaluate the potential of each technology for specific applications. This document is designed to allow inclusion of additional technologies in the future. To gather the information necessary to create this initial Primer the Nuclear Maintenance Applications Center (NMAC) collected experience data from eighteen utilities plus other industry and government sources. NMAC also contacted equipment manufacturers for information pertaining to equipment utilization, maintenance, and technical specifications. The Primer includes a discussion of six methods used by analysts to study predictive maintenance data. These are: trend analysis; pattern recognition; correlation; test against limits or ranges; relative comparison data; and statistical process analysis. Following the analysis methods discussions are detailed descriptions for eighteen technologies analysts have found useful for predictive maintenance programs at power plants and other industrial facilities. Each technology subchapter has a description of the operating principles involved in the technology, a listing of plant equipment where the technology can be applied, and a general description of the monitoring equipment. Additionally, these descriptions include a discussion of results obtained from actual equipment users and preferred analysis techniques to be used on data obtained from the technology. 5 refs., 30 figs
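Two of the six analysis methods the Primer lists, trend analysis and testing against limits, reduce to a few lines of arithmetic. The sketch below is illustrative only and not taken from the Primer; a rising least-squares slope in, say, bearing vibration amplitude is the kind of signal a trend analysis would flag.

```python
def exceeds_limits(readings, low, high):
    """Return the indices of readings that fall outside [low, high]."""
    return [i for i, r in enumerate(readings) if not (low <= r <= high)]

def trend_slope(readings):
    """Least-squares slope of equally spaced readings (per-sample units)."""
    n = len(readings)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(readings) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, readings))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

Pattern recognition, correlation, and statistical process analysis build on the same data but require reference signatures or control-chart statistics rather than a single slope or limit.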

  6. Stress Voiding in IC Interconnects - Rules of Evidence for Failure Analysts

    Energy Technology Data Exchange (ETDEWEB)

    FILTER, WILLIAM F.

    1999-09-17

Mention the words "stress voiding", and everyone from technology engineer to manager to customer is likely to cringe. This IC failure mechanism elicits fear because it is insidious, capricious, and difficult to identify and arrest. There are reasons to believe that a damascene-copper future might be void-free. Nevertheless, engineers who continue to produce ICs with Al-alloy interconnects, or who assess the reliability of legacy ICs with long service lives, need up-to-date insights and techniques to deal with stress voiding problems. Stress voiding need not be feared. Not always predictable, neither is it inevitable. On the contrary, stress voids are caused by specific, avoidable processing errors. Analytical work, though often painful, can identify these errors when stress voiding occurs, and vigilance in monitoring the improved process can keep it from recurring. In this article, we show that a methodical, forensic approach to failure analysis can solve suspected cases of stress voiding. This approach uses new techniques, and patiently applies familiar ones, to develop evidence meeting strict standards of proof.

7. Engaging policy-makers, health system managers, and policy analysts in the knowledge synthesis process: a scoping review.

    Science.gov (United States)

    Tricco, Andrea C; Zarin, Wasifa; Rios, Patricia; Nincic, Vera; Khan, Paul A; Ghassemi, Marco; Diaz, Sanober; Pham, Ba'; Straus, Sharon E; Langlois, Etienne V

    2018-02-12

    It is unclear how to engage a wide range of knowledge users in research. We aimed to map the evidence on engaging knowledge users with an emphasis on policy-makers, health system managers, and policy analysts in the knowledge synthesis process through a scoping review. We used the Joanna Briggs Institute guidance for scoping reviews. Nine electronic databases (e.g., MEDLINE), two grey literature sources (e.g., OpenSIGLE), and reference lists of relevant systematic reviews were searched from 1996 to August 2016. We included any type of study describing strategies, barriers and facilitators, or assessing the impact of engaging policy-makers, health system managers, and policy analysts in the knowledge synthesis process. Screening and data abstraction were conducted by two reviewers independently with a third reviewer resolving discrepancies. Frequency and thematic analyses were conducted. After screening 8395 titles and abstracts followed by 394 full-texts, 84 unique documents and 7 companion reports fulfilled our eligibility criteria. All 84 documents were published in the last 10 years, and half were prepared in North America. The most common type of knowledge synthesis with knowledge user engagement was a systematic review (36%). The knowledge synthesis most commonly addressed an issue at the level of national healthcare system (48%) and focused on health services delivery (17%) in high-income countries (86%). Policy-makers were the most common (64%) knowledge users, followed by healthcare professionals (49%) and government agencies as well as patients and caregivers (34%). Knowledge users were engaged in conceptualization and design (49%), literature search and data collection (52%), data synthesis and interpretation (71%), and knowledge dissemination and application (44%). Knowledge users were most commonly engaged as key informants through meetings and workshops as well as surveys, focus groups, and interviews either in-person or by telephone and emails

  8. A Graph is Worth a Thousand Words: How Overconfidence and Graphical Disclosure of Numerical Information Influence Financial Analysts Accuracy on Decision Making.

    Directory of Open Access Journals (Sweden)

    Ricardo Lopes Cardoso

Full Text Available Previous research supports that graphs are relevant decision aids to tasks related to the interpretation of numerical information. Moreover, literature shows that different types of graphical information can help or harm the accuracy on decision making of accountants and financial analysts. We conducted a 4×2 mixed-design experiment to examine the effects of numerical information disclosure on financial analysts' accuracy, and investigated the role of overconfidence in decision making. Results show that compared to text, column graph enhanced accuracy on decision making, followed by line graphs. No difference was found between table and textual disclosure. Overconfidence harmed accuracy, and both genders behaved overconfidently. Additionally, the type of disclosure (text, table, line graph and column graph) did not affect the overconfidence of individuals, providing evidence that overconfidence is a personal trait. This study makes three contributions. First, it provides evidence from a larger sample size (295) of financial analysts instead of a smaller sample size of students that graphs are relevant decision aids to tasks related to the interpretation of numerical information. Second, it uses the text as a baseline comparison to test how different ways of information disclosure (line and column graphs, and tables) can enhance understandability of information. Third, it brings an internal factor to this process: overconfidence, a personal trait that harms the decision-making process of individuals. At the end of this paper several research paths are highlighted to further study the effect of internal factors (personal traits) on financial analysts' accuracy on decision making regarding numerical information presented in a graphical form. In addition, we offer suggestions concerning some practical implications for professional accountants, auditors, financial analysts and standard setters.

  9. A Graph is Worth a Thousand Words: How Overconfidence and Graphical Disclosure of Numerical Information Influence Financial Analysts Accuracy on Decision Making.

    Science.gov (United States)

    Cardoso, Ricardo Lopes; Leite, Rodrigo Oliveira; de Aquino, André Carlos Busanelli

    2016-01-01

Previous research supports that graphs are relevant decision aids to tasks related to the interpretation of numerical information. Moreover, literature shows that different types of graphical information can help or harm the accuracy on decision making of accountants and financial analysts. We conducted a 4×2 mixed-design experiment to examine the effects of numerical information disclosure on financial analysts' accuracy, and investigated the role of overconfidence in decision making. Results show that compared to text, column graph enhanced accuracy on decision making, followed by line graphs. No difference was found between table and textual disclosure. Overconfidence harmed accuracy, and both genders behaved overconfidently. Additionally, the type of disclosure (text, table, line graph and column graph) did not affect the overconfidence of individuals, providing evidence that overconfidence is a personal trait. This study makes three contributions. First, it provides evidence from a larger sample size (295) of financial analysts instead of a smaller sample size of students that graphs are relevant decision aids to tasks related to the interpretation of numerical information. Second, it uses the text as a baseline comparison to test how different ways of information disclosure (line and column graphs, and tables) can enhance understandability of information. Third, it brings an internal factor to this process: overconfidence, a personal trait that harms the decision-making process of individuals. At the end of this paper several research paths are highlighted to further study the effect of internal factors (personal traits) on financial analysts' accuracy on decision making regarding numerical information presented in a graphical form. In addition, we offer suggestions concerning some practical implications for professional accountants, auditors, financial analysts and standard setters.

  10. Household Consumption and Expenditures Surveys (HCES): a primer for food and nutrition analysts in low- and middle-income countries.

    Science.gov (United States)

    Fiedler, John L; Lividini, Keith; Bermudez, Odilia I; Smitz, Marc-Francois

    2012-09-01

The dearth of 24-hour recall and observed-weighed food record data--what most nutritionists regard as the gold-standard source of food consumption data--has long been an obstacle to evidence-based food and nutrition policy. There have been a steadily growing number of studies using household food acquisition and consumption data from a variety of multipurpose, nationally representative household surveys as a proxy measure to overcome this fundamental information gap. To describe the key characteristics of these increasingly available Household Consumption and Expenditures Surveys (HCES) in order to help familiarize food and nutrition analysts with the strengths and shortcomings of these data and thus encourage their use in low- and middle-income countries; and to identify common shortcomings that can be readily addressed in the near term in a country-by-country approach, as new HCES are fielded, thereby beginning a process of improving the potential of these surveys as sources of useful data for better understanding food- and nutrition-related issues. Common characteristics of key food and nutrition information that is available in HCES and some basic common steps in processing HCES data for food and nutrition analyses are described. The common characteristics of these surveys are documented, and their usefulness in addressing major food and nutrition issues, as well as their shortcomings, is demonstrated. Despite their limitations, the use of HCES data constitutes a generally unexploited opportunity to address the food consumption information gap by using survey data that most countries are already routinely collecting.

  11. The Regional Healthcare Ecosystem Analyst (RHEA): a simulation modeling tool to assist infectious disease control in a health system.

    Science.gov (United States)

    Lee, Bruce Y; Wong, Kim F; Bartsch, Sarah M; Yilmaz, S Levent; Avery, Taliser R; Brown, Shawn T; Song, Yeohan; Singh, Ashima; Kim, Diane S; Huang, Susan S

    2013-06-01

    As healthcare systems continue to expand and interconnect with each other through patient sharing, administrators, policy makers, infection control specialists, and other decision makers may have to take account of the entire healthcare 'ecosystem' in infection control. We developed a software tool, the Regional Healthcare Ecosystem Analyst (RHEA), that can accept user-inputted data to rapidly create a detailed agent-based simulation model (ABM) of the healthcare ecosystem (ie, all healthcare facilities, their adjoining community, and patient flow among the facilities) of any region to better understand the spread and control of infectious diseases. To demonstrate RHEA's capabilities, we fed extensive data from Orange County, California, USA, into RHEA to create an ABM of a healthcare ecosystem and simulate the spread and control of methicillin-resistant Staphylococcus aureus. Various experiments explored the effects of changing different parameters (eg, degree of transmission, length of stay, and bed capacity). Our model emphasizes how individual healthcare facilities are components of integrated and dynamic networks connected via patient movement and how occurrences in one healthcare facility may affect many other healthcare facilities. A decision maker can utilize RHEA to generate a detailed ABM of any healthcare system of interest, which in turn can serve as a virtual laboratory to test different policies and interventions.
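RHEA itself is a detailed agent-based model; the core idea that patient flow links facilities, so that an occurrence in one facility can reach many others, can be sketched in a few lines. This deterministic toy (all names hypothetical) is far simpler than the actual ABM, which models individual patients, lengths of stay, and transmission probabilities.

```python
def spread(flow, affected, steps):
    """Propagate an 'affected' status across facilities for a number of
    steps. flow[i][j] > 0 means facility i sends patients to facility j;
    a facility becomes affected once it receives flow from an affected one."""
    n = len(flow)
    state = list(affected)
    for _ in range(steps):
        nxt = state[:]
        for i in range(n):
            for j in range(n):
                if state[i] and flow[i][j] > 0:
                    nxt[j] = True
        state = nxt
    return state
```

Even this caricature shows the system-level point of the paper: containment decisions at one facility depend on the whole patient-sharing network, not on that facility alone.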

  12. Deployed Analyst Handbook

    Science.gov (United States)

    2016-06-01

...layered approach to data verification. One should use quality control measures throughout the data management process, from the entry of an initial... intermediate milestones aid in project management and ensure customer satisfaction. Communications: one should establish recurrent communication with the customer.

  13. IMPROVED GROUND TRUTH IN SOUTHERN ASIA USING IN-COUNTRY DATA, ANALYST WAVEFORM REVIEW, AND ADVANCED ALGORITHMS

    Energy Technology Data Exchange (ETDEWEB)

    Engdahl, Eric, R.; Bergman, Eric, A.; Myers, Stephen, C.; Ryall, Floriana

    2009-06-19

    A new catalog of seismicity at magnitudes above 2.5 for the period 1923-2008 in the Iran region is assembled from arrival times reported by global, regional, and local seismic networks. Using in-country data we have formed new events, mostly at lower magnitudes that were not previously included in standard global earthquake catalogs. The magnitude completeness of the catalog varies strongly through time, complete to about magnitude 4.2 prior to 1998 and reaching a minimum of about 3.6 during the period 1998-2005. Of the 25,722 events in the catalog, most of the larger events have been carefully reviewed for proper phase association, especially for depth phases and to eliminate outlier readings, and relocated. To better understand the quality of the data set of arrival times reported by Iranian networks that are central to this study, many waveforms for events in Iran have been re-picked by an experienced seismic analyst. Waveforms at regional distances in this region are often complex. For many events this makes arrival time picks difficult to make, especially for smaller magnitude events, resulting in reported times that can be substantially improved by an experienced analyst. Even when the signal/noise ratio is large, re-picking can lead to significant differences. Picks made by our analyst are compared with original picks made by the regional networks. In spite of the obvious outliers, the median (-0.06 s) and spread (0.51 s) are small, suggesting that reasonable confidence can be placed in the picks reported by regional networks in Iran. This new catalog has been used to assess focal depth distributions throughout Iran. 
A principal result of this study is that the geographic pattern of depth distributions revealed by the relatively small number of earthquakes (~167) with depths constrained by waveform modeling (+/- 4 km) are now in agreement with the much larger number of depths (~1229) determined using reanalysis of ISC arrival-times (+/-10 km), within their

  14. How well do financial experts perform? A review of empirical research on performance of analysts, day-traders, forecasters, fund managers, investors, and stockbrokers

    OpenAIRE

    Andersson, Patric

    2004-01-01

    In this manuscript, empirical research on performance of various types of financial experts is reviewed. Financial experts are used as the umbrella term for financial analysts, stockbrokers, money managers, investors, and day-traders etc. The goal of the review is to find out about the abilities of financial experts to produce accurate forecasts, to issue profitable stock recommendations, as well as to make successful investments and trades. On the whole, the reviewed studies show discouragin...

  15. Analysts’ forecast error: A robust prediction model and its short term trading profitability

    NARCIS (Netherlands)

    Boudt, K.M.R.; de Goei, P.; Thewissen, J.; van Campenhout, G.

    2015-01-01

    This paper contributes to the empirical evidence on the investment horizon salient to trading based on predicting the error in analysts' earnings forecasts. An econometric framework is proposed that accommodates the stylized fact of extreme values in the forecast error series. We find that between

  16. Massive Predictive Modeling using Oracle R Enterprise

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    R is fast becoming the lingua franca for analyzing data via statistics, visualization, and predictive analytics. For enterprise-scale data, R users have three main concerns: scalability, performance, and production deployment. Oracle's R-based technologies - Oracle R Distribution, Oracle R Enterprise, Oracle R Connector for Hadoop, and the R package ROracle - address these concerns. In this talk, we introduce Oracle's R technologies, highlighting how each enables R users to achieve scalability and performance while making production deployment of R results a natural outcome of the data analyst/scientist efforts. The focus then turns to Oracle R Enterprise with code examples using the transparency layer and embedded R execution, targeting massive predictive modeling. One goal behind massive predictive modeling is to build models per entity, such as customers, zip codes, simulations, in an effort to understand behavior and tailor predictions at the entity level. Predictions...

  17. Energy Predictions 2011

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-10-15

    Even as the recession begins to subside, the energy sector is still likely to experience challenging conditions as we enter 2011. It should be remembered how very important a role energy plays in driving the global economy. Serving as a simple yet global and unified measure of economic recovery, it is oil's price range and the strength and sustainability of the recovery which will impact the ways in which all forms of energy are produced and consumed. The report aims for a closer insight into these predictions: What will happen with M and A (Mergers and Acquisitions) in the energy industry?; What are the prospects for renewables?; Will the water-energy nexus grow in importance?; How will technological leaps and bounds affect E and P (exploration and production) operations?; What about electric cars? This is the second year Deloitte's Global Energy and Resources Group has published its predictions for the year ahead. The report is based on in-depth interviews with clients, industry analysts, and senior energy practitioners from Deloitte member firms around the world.

  18. Energy Predictions 2011

    International Nuclear Information System (INIS)

    2010-10-01

    Even as the recession begins to subside, the energy sector is still likely to experience challenging conditions as we enter 2011. It should be remembered how very important a role energy plays in driving the global economy. Serving as a simple yet global and unified measure of economic recovery, it is oil's price range and the strength and sustainability of the recovery which will impact the ways in which all forms of energy are produced and consumed. The report aims for a closer insight into these predictions: What will happen with M and A (Mergers and Acquisitions) in the energy industry?; What are the prospects for renewables?; Will the water-energy nexus grow in importance?; How will technological leaps and bounds affect E and P (exploration and production) operations?; What about electric cars? This is the second year Deloitte's Global Energy and Resources Group has published its predictions for the year ahead. The report is based on in-depth interviews with clients, industry analysts, and senior energy practitioners from Deloitte member firms around the world.

  19. Un estudio preliminar del fundamento pulsional de la "aptitud de analista" Preliminary study of the instinctual foundation of the "analyst's competence"

    Directory of Open Access Journals (Sweden)

    Osvaldo Delgado

    2007-12-01

Full Text Available This paper presents some questions and preliminary developments that emerged during a theoretical examination of Freud's texts concerning the term "aptitude" ("competence"). References to texts prior to 1920 are given, although, in line with the objectives of the research in progress, priority is given to ordering the "analyst's competence" and relating it to the fundamental concepts of Freud's second topography. Can the Spanish term "aptitud" and its German originals be raised to the status of a concept? The final consideration is that the instinctual dimension of the term is what permits giving the "analyst's competence" a conceptual status, since competence as "tauglich" in the advent of a new analyst implies a specific instinctual transmutation. The question of the relation between what the character carries and the recomposition of the alterations of the ego in the period following the analysis remains an orientation for further work.

  20. Comparison of ArcGIS and SAS Geostatistical Analyst to Estimate Population-Weighted Monthly Temperature for US Counties.

    Science.gov (United States)

    Xiaopeng, Q I; Liang, Wei; Barker, Laurie; Lekiachvili, Akaki; Xingyou, Zhang

Temperature changes are known to have significant impacts on human health. Accurate estimates of population-weighted average monthly air temperature for US counties are needed to evaluate temperature's association with health behaviours and disease, which are sampled or reported at the county level and measured on a monthly or 30-day basis. Most reported temperature estimates were calculated using ArcGIS; relatively few used SAS. We compared the performance of geostatistical models to estimate population-weighted average temperature in each month for counties in 48 states using ArcGIS v9.3 and SAS v9.2 on a CITGO platform. Monthly average temperature for Jan-Dec 2007 and elevation from 5435 weather stations were used to estimate the temperature at county population centroids. County estimates were produced with elevation as a covariate. Performance of models was assessed by comparing adjusted R², mean squared error, root mean squared error, and processing time. Prediction accuracy for split validation was above 90% for 11 months in ArcGIS and all 12 months in SAS. Cokriging in SAS achieved higher prediction accuracy and lower estimation bias than cokriging in ArcGIS. County-level estimates produced by both packages were positively correlated (adjusted R² range = 0.95 to 0.99); accuracy and precision improved with elevation as a covariate. Both the ArcGIS and SAS methods are reliable for US county-level temperature estimates; however, ArcGIS's merits in spatial data pre-processing and processing time may be important considerations for software selection, especially for multi-year or multi-state projects.
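The record above compares cokriging implementations in two GIS packages. As a much simpler illustration of the underlying task, interpolating station temperatures to a county population centroid, an inverse-distance-weighted (IDW) estimate can be sketched as follows. IDW is a deterministic stand-in for the kriging methods the study actually used, and the station coordinates and temperatures below are made up:

```python
import math

def idw_estimate(stations, target, power=2):
    """Inverse-distance-weighted temperature at a target point.

    stations: list of (x, y, temp) tuples (hypothetical station data)
    target:   (x, y) location, e.g. a county population centroid
    """
    num = den = 0.0
    for x, y, temp in stations:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return temp  # target coincides with a station
        w = 1.0 / d2 ** (power / 2)  # weight falls off with distance
        num += w * temp
        den += w
    return num / den

stations = [(0, 0, 10.0), (1, 0, 12.0), (0, 1, 14.0)]
print(idw_estimate(stations, (0.5, 0.5)))
```

Unlike cokriging, IDW has no covariate (such as elevation) and no error model, which is precisely what the kriging approaches in the study add.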

  1. Trabalho, saúde e gênero: estudo comparativo sobre analistas de sistemas Work and health: a gender study on systems analysts

    Directory of Open Access Journals (Sweden)

    Lys Esther Rocha

    2001-12-01

Full Text Available OBJECTIVE: To assess the health effects of work among male and female systems analysts. METHODS: This exploratory cross-sectional study covered 553 systems analysts from two data-processing companies in the metropolitan area of São Paulo. Ergonomic analyses of the work, semi-structured interviews, and self-administered questionnaires were carried out. Data analysis was based on contingency tables with chi-square tests at a 5% significance level, and on prevalence ratios and their confidence intervals by gender. RESULTS: Women made up 40.7% of the study group and were younger than the men. Having children was more common among the men, although the daily time devoted to domestic tasks was greater among the women. Men predominated in management positions. Discomfort factors reported with similar frequency by men and women were: work overload due to short deadlines; a high degree of responsibility; the mental demands of the work; and task complexity. Discomfort factors predominant among women were: uncomfortable posture; longer exposure to the computer; and obsolete equipment. Women reported a higher frequency of visual, muscular, and stress-related symptoms; greater dissatisfaction with work; and greater physical and mental fatigue. CONCLUSIONS: The study suggests that the health effects experienced by female systems analysts are associated with the demands of the work and with women's role in society. The results highlight the importance of studies on health, work, and gender that analyze the intersection between the productive and domestic spheres.

  2. Economic analyses to support decisions about HPV vaccination in low- and middle-income countries: a consensus report and guide for analysts.

    Science.gov (United States)

    Jit, Mark; Levin, Carol; Brisson, Marc; Levin, Ann; Resch, Stephen; Berkhof, Johannes; Kim, Jane; Hutubessy, Raymond

    2013-01-30

    Low- and middle-income countries need to consider economic issues such as cost-effectiveness, affordability and sustainability before introducing a program for human papillomavirus (HPV) vaccination. However, many such countries lack the technical capacity and data to conduct their own analyses. Analysts informing policy decisions should address the following questions: 1) Is an economic analysis needed? 2) Should analyses address costs, epidemiological outcomes, or both? 3) If costs are considered, what sort of analysis is needed? 4) If outcomes are considered, what sort of model should be used? 5) How complex should the analysis be? 6) How should uncertainty be captured? 7) How should model results be communicated? Selecting the appropriate analysis is essential to ensure that all the important features of the decision problem are correctly represented, but that the analyses are not more complex than necessary. This report describes the consensus of an expert group convened by the World Health Organization, prioritizing key issues to be addressed when considering economic analyses to support HPV vaccine introduction in these countries.

  3. Using Machine Learning to Predict MCNP Bias

    Energy Technology Data Exchange (ETDEWEB)

    Grechanuk, Pavel Aleksandrovi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-09

    For many real-world applications in radiation transport where simulations are compared to experimental measurements, like in nuclear criticality safety, the bias (simulated - experimental keff) in the calculation is an extremely important quantity used for code validation. The objective of this project is to accurately predict the bias of MCNP6 [1] criticality calculations using machine learning (ML) algorithms, with the intention of creating a tool that can complement the current nuclear criticality safety methods. In the latest release of MCNP6, the Whisper tool is available for criticality safety analysts and includes a large catalogue of experimental benchmarks, sensitivity profiles, and nuclear data covariance matrices. This data, coming from 1100+ benchmark cases, is used in this study of ML algorithms for criticality safety bias predictions.
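Whisper's actual bias assessment is based on benchmark-similarity weighting, and the specific ML algorithms studied are not named in this abstract. As a minimal illustration of the general idea, predicting a new case's bias from nearby benchmark cases in feature space, a k-nearest-neighbour sketch (with made-up two-component feature vectors standing in for sensitivity profiles) might look like:

```python
import math

def knn_predict_bias(train, query, k=3):
    """Predict calculation bias for a new case as the mean bias of
    its k nearest benchmark neighbours in feature space.

    train: list of (feature_vector, bias) pairs -- stand-ins for
           benchmark sensitivity profiles and known keff biases
    query: feature vector of the case to predict
    """
    dists = sorted((math.dist(f, query), b) for f, b in train)
    nearest = dists[:k]
    return sum(b for _, b in nearest) / len(nearest)

benchmarks = [
    ([0.1, 0.2], 0.001),
    ([0.1, 0.3], 0.002),
    ([0.9, 0.8], -0.004),
    ([0.8, 0.9], -0.003),
]
print(knn_predict_bias(benchmarks, [0.12, 0.25], k=2))
```

With 1100+ benchmark cases and high-dimensional sensitivity profiles, more capable regressors would be natural choices, but the neighbour-averaging intuition carries over.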

  4. An analyst's self-analysis.

    Science.gov (United States)

    Calder, K T

    1980-01-01

I have told you why I selected the topic of self-analysis, and I have described my method for it: recording primary data such as dreams, daydreams, memories, and symptoms; recording associations to these data; and then attempting to analyze the written material. I have described a dream, a memory, and a daydream that is also a symptom, each of which I found useful in understanding myself. Finally, I reached some conclusions regarding the uses of self-analysis, including self-analysis as a research tool.

  5. Intermediate Infrastructure Analyst | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Provides feedback for the creation of service project business cases and ... and project plans to allow IDRC to move forward with a specific product strategy. ... or team leader by undertaking research, investigations, evaluations and testing of ...

  6. Detector Fundamentals for Reachback Analysts

    Energy Technology Data Exchange (ETDEWEB)

    Karpius, Peter Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Myers, Steven Charles [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-03

    This presentation is a part of the DHS LSS spectroscopy course and provides an overview of the following concepts: detector system components, intrinsic and absolute efficiency, resolution and linearity, and operational issues and limits.

  7. The Value of Different Customer Satisfaction and Loyalty Metrics in Predicting Business Performance

    OpenAIRE

    Neil A. Morgan; Lopo Leotte Rego

    2006-01-01

    Managers commonly use customer feedback data to set goals and monitor performance on metrics such as “Top 2 Box” customer satisfaction scores and “intention-to-repurchase” loyalty scores. However, analysts have advocated a number of different customer feedback metrics including average customer satisfaction scores and the number of “net promoters” among a firm's customers. We empirically examine which commonly used and widely advocated customer feedback metrics are most valuable in predicting...

  8. WALS Prediction

    NARCIS (Netherlands)

    Magnus, J.R.; Wang, W.; Zhang, Xinyu

    2012-01-01

    Abstract: Prediction under model uncertainty is an important and difficult issue. Traditional prediction methods (such as pretesting) are based on model selection followed by prediction in the selected model, but the reported prediction and the reported prediction variance ignore the uncertainty

  9. Lawyers, Accountants and Financial Analysts: The “Architects” of the New EU Regime of Corporate Accountability

    Directory of Open Access Journals (Sweden)

    David Monciardini

    2016-09-01

Full Text Available International accounting rules are increasingly under pressure, as they are considered inadequate to respond to major changes in the way business is conducted, how it creates value and the context in which it operates. The paper identifies and juxtaposes two regulatory trends in this re-definition of traditional accounting: ‘accounting for intangible assets’ and ‘corporate social accountability’. They partially overlap, and both demand going beyond accounting for physical and financial assets. However, they are underpinned by different rationales and supported by competing professional claims. Deploying a reflexive socio-legal approach, the article outlines a preliminary symbolic ‘archaeology’ of these regulatory trends. Drawing on interviews and document analysis, it highlights the role of three professional communities in shaping regulatory changes: international accountants, activist-lawyers and financial analysts. Competing for the definition of what counts and what has value, they are generating an intriguing debate about the boundaries between business and society.

  10. The Hydrograph Analyst, an Arcview GIS Extension That Integrates Point, Spatial, and Temporal Data Provides A Graphical User Interface for Hydrograph Analysis

    International Nuclear Information System (INIS)

    Jones, M.L.; O'Brien, G.M.; Jones, M.L.

    2000-01-01

The Hydrograph Analyst (HA) is an ArcView GIS 3.2 extension developed by the authors to analyze hydrographs from a network of ground-water wells and springs in a regional ground-water flow model. ArcView GIS integrates geographic, hydrologic, and descriptive information and provides the base functionality needed for hydrograph analysis. The HA extends ArcView's base functionality by automating data integration procedures and by adding capabilities to visualize and analyze hydrologic data. Data integration procedures were automated by adding functionality to the View document's Document Graphical User Interface (DocGUI). A menu allows the user to query a relational database and select sites, which are displayed as a point theme in a View document. An "Identify One to Many" tool is provided within the View DocGUI to retrieve all hydrologic information for a selected site and display it in a simple and concise tabular format. For example, the display could contain various records from many tables storing data for one site. Another HA menu allows the user to generate a hydrograph for sites selected from the point theme. Hydrographs generated by the HA are added as hydrograph documents and accessed by the user with the Hydrograph DocGUI, which contains tools and buttons for hydrograph analysis. The Hydrograph DocGUI has a "Select By Polygon" tool used for isolating particular points on the hydrograph inside a user-drawn polygon, or the user can isolate the same points by constructing a logical expression with the ArcView GIS "Query Builder" dialog that is also accessible in the Hydrograph DocGUI. Other buttons can be selected to alter the query applied to the active hydrograph. The selected points on the active hydrograph can be attributed (or flagged) individually or as a group using the "Flag" tool found on the Hydrograph DocGUI.
The "Flag" tool activates a dialog box that prompts the user to select an attribute and "methods" or "conditions" that qualify

  11. The Role of Teamwork in the Analysis of Big Data: A Study of Visual Analytics and Box Office Prediction.

    Science.gov (United States)

    Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy

    2017-03-01

    Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.

  12. Climate prediction and predictability

    Science.gov (United States)

    Allen, Myles

    2010-05-01

Climate prediction is generally accepted to be one of the grand challenges of the Geophysical Sciences. What is less widely acknowledged is that fundamental issues have yet to be resolved concerning the nature of the challenge, even after decades of research in this area. How do we verify or falsify a probabilistic forecast of a singular event such as anthropogenic warming over the 21st century? How do we determine the information content of a climate forecast? What does it mean for a modelling system to be "good enough" to forecast a particular variable? How will we know when models and forecasting systems are "good enough" to provide detailed forecasts of weather at specific locations or, for example, of the risks associated with global geo-engineering schemes? This talk will provide an overview of these questions in the light of recent developments in multi-decade climate forecasting, drawing on concepts from information theory, machine learning and statistics. I will draw extensively but not exclusively on the experience of the climateprediction.net project, which runs multiple versions of climate models on personal computers.

  13. Counter-terrorism threat prediction architecture

    Science.gov (United States)

    Lehman, Lynn A.; Krause, Lee S.

    2004-09-01

This paper will evaluate the feasibility of constructing a system to support intelligence analysts engaged in counter-terrorism. It will discuss the use of emerging techniques to evaluate a large-scale threat data repository (or Infosphere), comparing analyst-developed models to identify and discover potential threat-related activity, with an uncertainty metric used to evaluate the threat. This system will also employ psychological (or intent) modeling to incorporate combatant (i.e., terrorist) beliefs and intent. The paper will explore the feasibility of constructing a hetero-hierarchical (a hierarchy of more than one kind or type, characterized by loose connection/feedback among elements of the hierarchy) agent-based framework, or "family of agents", to support "evidence retrieval", defined as combing, or searching, the threat data repository and returning information with an uncertainty metric. The counter-terrorism threat prediction architecture will be guided by a series of models constructed to represent threat operational objectives, potential targets, or terrorist objectives. The approach would compare model representations against information retrieved by the agent family to isolate or identify patterns that match within reasonable measures of proximity. The central areas of discussion will be the construction of an agent framework to search the available threat-related information repository, the evaluation of results against models representing the cultural foundations, mindset, sociology and emotional drive of typical threat combatants (i.e., the mind and objectives of a terrorist), and the development of evaluation techniques to compare result sets with the models representing threat behavior and threat targets.
The applicability of concepts surrounding Modeling Field Theory (MFT) will be discussed as the basis of this research into development of proximity measures between the models and result sets and to provide feedback in support of model

  14. Proactive Spatiotemporal Resource Allocation and Predictive Visual Analytics for Community Policing and Law Enforcement.

    Science.gov (United States)

    Malik, Abish; Maciejewski, Ross; Towers, Sherry; McCullough, Sean; Ebert, David S

    2014-12-01

    In this paper, we present a visual analytics approach that provides decision makers with a proactive and predictive environment in order to assist them in making effective resource allocation and deployment decisions. The challenges involved with such predictive analytics processes include end-users' understanding, and the application of the underlying statistical algorithms at the right spatiotemporal granularity levels so that good prediction estimates can be established. In our approach, we provide analysts with a suite of natural scale templates and methods that enable them to focus and drill down to appropriate geospatial and temporal resolution levels. Our forecasting technique is based on the Seasonal Trend decomposition based on Loess (STL) method, which we apply in a spatiotemporal visual analytics context to provide analysts with predicted levels of future activity. We also present a novel kernel density estimation technique we have developed, in which the prediction process is influenced by the spatial correlation of recent incidents at nearby locations. We demonstrate our techniques by applying our methodology to Criminal, Traffic and Civil (CTC) incident datasets.
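The paper's kernel density estimation is influenced by the spatial correlation of recent incidents at nearby locations. A rough sketch of a recency-weighted KDE is below; the Gaussian kernel form, the 30-day recency decay, and the incident data are all illustrative assumptions, not the authors' parameters:

```python
import math

def weighted_kde(incidents, point, bandwidth=1.0):
    """Kernel density estimate at `point`, with recent incidents
    weighted more heavily (illustrative stand-in for the paper's
    method).

    incidents: list of (x, y, age_in_days) tuples (hypothetical data)
    """
    total = 0.0
    for x, y, age in incidents:
        w = math.exp(-age / 30.0)              # recency weight
        d2 = (x - point[0]) ** 2 + (y - point[1]) ** 2
        total += w * math.exp(-d2 / (2 * bandwidth ** 2))  # Gaussian kernel
    return total / (2 * math.pi * bandwidth ** 2 * len(incidents))

incidents = [(0, 0, 1), (0.5, 0.5, 5), (5, 5, 2)]
near = weighted_kde(incidents, (0.2, 0.2))  # close to a cluster of recent incidents
far = weighted_kde(incidents, (5.0, 5.0))   # near a single incident
print(near > far)
```

In the visual analytics setting described above, such an estimate would be recomputed per grid cell and rendered as a predicted-activity heat map.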

  15. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  16. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    Science.gov (United States)

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selecting appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  17. Predictive medicine

    NARCIS (Netherlands)

    Boenink, Marianne; ten Have, Henk

    2015-01-01

    In the last part of the twentieth century, predictive medicine has gained currency as an important ideal in biomedical research and health care. Research in the genetic and molecular basis of disease suggested that the insights gained might be used to develop tests that predict the future health

  18. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    International Nuclear Information System (INIS)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J.

    2000-09-01

Digital I and C systems are widely used in the non-safety systems of nuclear power plants, and their applications are expanding to safety-critical systems. The regulatory body is shifting its policy toward risk-informed regulation and may require Probabilistic Safety Assessment of digital I and C systems. However, no established reliability prediction methodology yet exists for digital I and C systems covering both software and hardware. This survey report covers many reliability prediction methods for electronic systems from the hardware point of view. Each method has both strong and weak points. The report presents the state of the art of prediction methods, focusing in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to help analysts. The report also presents the state of the art of software tools that support reliability prediction
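The MIL-HDBK-217F parts-count approach mentioned in the record can be illustrated with a toy calculation: the system failure rate is the sum of the parts' base failure rates, each scaled by pi-factors for quality, environment, and so on, and the MTBF is its reciprocal. All numeric values below are invented for illustration and are not handbook values:

```python
def system_failure_rate(parts):
    """Parts-count reliability prediction in the MIL-HDBK-217F style:
    the system failure rate is the sum of each part's base failure
    rate scaled by its pi-factors.

    parts: list of (base_rate, pi_factors), rates in failures
           per 1e6 hours (all numbers here are illustrative).
    """
    total = 0.0
    for base_rate, pi_factors in parts:
        rate = base_rate
        for factor in pi_factors:
            rate *= factor          # apply pi_Q, pi_E, ...
        total += rate
    return total

parts = [
    (0.02, [2.0, 1.5]),   # e.g. a microcircuit: pi_Q=2.0, pi_E=1.5
    (0.01, [1.0, 4.0]),   # e.g. a resistor:     pi_Q=1.0, pi_E=4.0
]
lam = system_failure_rate(parts)  # failures per 1e6 hours
mtbf_hours = 1e6 / lam
print(round(lam, 3), round(mtbf_hours))
```

This additive model is exactly the kind of hardware-only prediction the report contrasts with the unresolved problem of combined software/hardware reliability for digital I and C.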

  19. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    Energy Technology Data Exchange (ETDEWEB)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J

    2000-09-01

Digital I and C systems are widely used in the non-safety systems of nuclear power plants, and their applications are expanding to safety-critical systems. The regulatory body is shifting its policy toward risk-informed regulation and may require Probabilistic Safety Assessment of digital I and C systems. However, no established reliability prediction methodology yet exists for digital I and C systems covering both software and hardware. This survey report covers many reliability prediction methods for electronic systems from the hardware point of view. Each method has both strong and weak points. The report presents the state of the art of prediction methods, focusing in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to help analysts. The report also presents the state of the art of software tools that support reliability prediction.

  20. Prediction Markets

    DEFF Research Database (Denmark)

    Horn, Christian Franz; Ivens, Bjørn Sven; Ohneberg, Michael

    2014-01-01

In recent years, Prediction Markets have gained growing interest as a forecasting tool among researchers as well as practitioners, resulting in an increasing number of publications. In order to track the latest developments in this research, including its extent and focus, this article provides a comprehensive review and classification of the literature on Prediction Markets. Overall, 316 relevant articles, published from 2007 through 2013, were identified and assigned to the classification scheme presented herein, which differentiates between descriptive works, articles of a theoretical nature, application-oriented studies, and articles dealing with law and policy. The analysis of the research results reveals that more than half of the literature pool deals with the application and actual function tests of Prediction Markets. The results...

  1. Rationale and design of the participant, investigator, observer, and data-analyst-blinded randomized AGENDA trial on associations between gene-polymorphisms, endophenotypes for depression and antidepressive intervention: the effect of escitalopram versus placebo on the combined dexamethasone-corticotrophine releasing hormone test and other potential endophenotypes in healthy first-degree relatives of persons with depression

    DEFF Research Database (Denmark)

    Knorr, Ulla; Vinberg, Maj; Klose, Marianne

    2009-01-01

    from baseline to the end of intervention. METHODS: The AGENDA trial is designed as a participant, investigator, observer, and data-analyst-blinded randomized trial. Participants are 80 healthy first-degree relatives of patients with depression. Participants are randomized to escitalopram 10 mg per day...

  2. Unification predictions

    International Nuclear Information System (INIS)

    Ghilencea, D.; Ross, G.G.; Lanzagorta, M.

    1997-07-01

    The unification of gauge couplings suggests that there is an underlying (supersymmetric) unification of the strong, electromagnetic and weak interactions. The prediction of the unification scale may be the first quantitative indication that this unification may extend to unification with gravity. We make a precise determination of these predictions for a class of models which extend the multiplet structure of the Minimal Supersymmetric Standard Model to include the heavy states expected in many Grand Unified and/or superstring theories. We show that there is a strong cancellation between the 2-loop and threshold effects. As a result the net effect is smaller than previously thought, giving a small increase in both the unification scale and the value of the strong coupling at low energies. (author). 15 refs, 5 figs

  3. Financial Distress Prediction of Iranian Companies Using Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Moradi Mahdi

    2013-01-01

Full Text Available Decision-making problems in the area of financial status evaluation are considered very important. Incorrect decisions in firms are very likely to cause financial crises and distress. Predicting the financial distress of factories and manufacturing companies is of interest to managers, investors, auditors, financial analysts, governmental officials, and employees. The current study therefore aims to predict the financial distress of Iranian companies. It applies support vector data description (SVDD) to the financial distress prediction problem in an attempt to suggest a new model with better explanatory power and stability. To serve this purpose, we use a grid-search technique with 3-fold cross-validation to find the optimal parameter values of the SVDD kernel function. To evaluate the prediction accuracy of SVDD, we compare its performance with fuzzy c-means (FCM). The experimental results show that SVDD outperforms the other method in the years before financial distress occurs. The data used in this research were obtained from the Iran Stock Market and Accounting Research Database. Based on data from 2000 to 2009, 70 pairs of companies listed on the Tehran Stock Exchange were selected as the initial data set.
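The grid search with 3-fold cross-validation described in this record can be sketched in miniature. The toy one-feature threshold classifier below stands in for SVDD (whose kernel parameters would be tuned the same way), and all data are fabricated:

```python
def k_fold_splits(n, k=3):
    """Yield (train_idx, test_idx) index lists for k-fold CV."""
    fold = n // k
    idx = list(range(n))
    for i in range(k):
        test = idx[i * fold:(i + 1) * fold]
        train = idx[:i * fold] + idx[(i + 1) * fold:]
        yield train, test

def grid_search(xs, ys, thresholds, k=3):
    """Return the (threshold, score) with the best mean
    cross-validated accuracy for a one-feature cut-off classifier."""
    best = None
    for t in thresholds:
        accs = []
        for train, test in k_fold_splits(len(xs), k):
            # the cut-off model needs no fitting; a real model (e.g.
            # SVDD) would be trained on `train` here
            correct = sum((xs[i] > t) == ys[i] for i in test)
            accs.append(correct / len(test))
        score = sum(accs) / len(accs)
        if best is None or score > best[1]:
            best = (t, score)
    return best

# toy data: the label is True when the feature exceeds ~0.5
xs = [0.1, 0.2, 0.3, 0.6, 0.7, 0.9]
ys = [False, False, False, True, True, True]
print(grid_search(xs, ys, thresholds=[0.0, 0.5, 1.0]))
```

The structure is the same for kernel-parameter tuning: the outer loop ranges over a grid of candidate parameter values, and each candidate is scored by its mean accuracy across the folds.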

  4. Michael Howard: Military Historian and Strategic Analyst.

    Science.gov (United States)

    1983-06-10

    eighties it became obvious that "the growing acceptability of mili- tary studies" was a consequence of a careful nurturing during the embryo stage by...would decide the destiny of the nation. If the machine gun and the artillery piece meant the "battle went on for longer than expected; the casualties

  5. Gamma-Ray Interactions for Reachback Analysts

    Energy Technology Data Exchange (ETDEWEB)

    Karpius, Peter Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Myers, Steven Charles [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-02

    This presentation is a part of the DHS LSS spectroscopy training course and presents an overview of the following concepts: identification and measurement of gamma rays; use of gamma counts and energies in research. Understanding the basic physics of how gamma rays interact with matter can clarify how certain features in a spectrum were produced.

  6. Technology Infrastructure, Analyst (Network, Video and ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    He/she participates in the business continuity and security of systems. ... Strategic Planning and Product Strategy ... in IT enhancement initiatives and projects as a team member or team leader by undertaking research, evaluations and testing ...

  7. Zdeněk Kopal: Numerical Analyst

    Science.gov (United States)

    Křížek, M.

    2015-07-01

    We give a brief overview of Zdeněk Kopal's life, his activities in the Czech Astronomical Society, his collaboration with Vladimír Vand, and his studies at Charles University, Cambridge, Harvard, and MIT. Then we survey Kopal's professional life. He published 26 monographs and 20 conference proceedings. We will concentrate on Kopal's extensive monograph Numerical Analysis (1955, 1961) that is widely accepted to be the first comprehensive textbook on numerical methods. It describes, for instance, methods for polynomial interpolation, numerical differentiation and integration, numerical solution of ordinary differential equations with initial or boundary conditions, and numerical solution of integral and integro-differential equations. Special emphasis will be laid on error analysis. Kopal himself applied numerical methods to celestial mechanics, in particular to the N-body problem. He also used Fourier analysis to investigate light curves of close binaries to discover their properties. This is, in fact, a problem from mathematical analysis.

  8. IT Operations Analyst | IDRC - International Development Research ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Contributes to the ISR work plan and participates in the setting of ISR priorities ... Assists the project manager and technical team in the development of ... in the preparation of briefing notes, technology summaries and analysis reports.

  9. The organically bound tritium: an analyst vision

    International Nuclear Information System (INIS)

    Ansoborlo, E.; Baglan, N.

    2009-01-01

    The authors report the work of a working group on tritium analysis. They recall the different physical forms of tritium: gas (HT, hydrogen-tritium), water vapour (HTO, tritiated water) or methane (CH3T), but also organic compounds (OBT, organically bound tritium), which are either exchangeable or non-exchangeable. They outline measurement techniques and methods, notably for determining the tritium volume activity, and discuss the possibilities for analysing and distinguishing exchangeable and non-exchangeable OBT.

  10. Financial analysts and their opinion about nuclear

    International Nuclear Information System (INIS)

    Vos, Patrick H. de

    1999-01-01

    Electrabel is a Belgian utility with business activities ranging from electricity generation and transmission (market share 88 %) to direct distribution of electricity to industrial customers. Electrabel is listed on the Brussels stock exchange and ranks among the three highest market capitalisations there. Electrabel owns the following NPPs: Doel 1, Tihange 1, Doel 2, Doel 3, Tihange 2, Doel 4, Tihange 3, Chooz B and Tricastin. Electrabel opted for an integrated communication strategy, that is to say that the release of information relating to its image, both inside and outside the company, to the media and financial circles, and even the marketing and logistics of communication, must present a consistent overall message; only the language is adapted to each target public. The communication policy aims mainly to provide communication that is as objective as possible with our discussion partners, that is to say on the basis of a long-term professional relationship in a climate of mutual confidence.

  11. Shakespeare for Analysts: Literature and Intelligence

    Science.gov (United States)

    2003-07-01

    helping to finance and at the same time providing the moral justification for his war: 29 Spiekerman, Shakespeare’s Political Realism, 131. 30... alchemy, Will change to virtue and to worthiness. Cassius Him and his worth and our great need of him You have right well conceited. Julius Caesar

  12. IT Service Desk Analyst | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    S/he participates in IMTD-led projects to represent users' interests and to determine impact on ... Recommends hardware and software changes and updates to the ... in the Service Desk incident management system in order to help determine, ...

  13. Analysts' earnings forecasts and international asset allocation

    NARCIS (Netherlands)

    Huijgen, Carel; Plantinga, Auke

    1999-01-01

    The aim of this paper is to investigate whether financial analysts’ earnings forecasts are informative from the viewpoint of allocating investments across different stock markets. Therefore we develop a country forecast indicator reflecting the analysts’ prospects for specific stock markets. The

  14. Intermediate Systems Analyst | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... bring to the System Development Group the necessary skills to understand in depth ... Participates in collaborative technical discussions with IDRC management, ... Responsible to monitor and oversee private sector consultant and part-time ...

  15. Therapeutic action and the analyst's responsibility.

    Science.gov (United States)

    Greenberg, Jay

    2015-02-01

    Models of the psychoanalytic situation can usefully be thought of as fictions. Viewed this way, the models can be understood as narrative structures that shape what we are able to see and how we are able to think about what happens between us and our analysands. Theories of therapeutic action are elements of what can be called a "controlling fiction," mediating between these theories and our very real responsibilities, both to our preferred method and to a suffering patient. This venture into comparative psychoanalysis is illustrated by a discussion of published case material. © 2015 by the American Psychoanalytic Association.

  16. Junior Information Management Analyst | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    These tasks are performed recognising that information is an important asset for ... gathering, documenting, and analysing IM business requirements;; applying ... content types and metadata requirements;; defining the security and access ...

  17. Can Social Networks Assist Analysts Fight Terrorism?

    Science.gov (United States)

    2011-06-01

    Facebook set up 53 AMBER alert pages, one for each of the 50 states, along with pages for the District of Columbia, Puerto Rico, and the U.S. Virgin ...was the commonplace name of massive blizzard-like storms that plagued the northeast during the 2009-2010 holidays . Ryan Ozimek and his team at PICnet

  18. Senior Analyst | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Primary Duties or Responsibilities Strategic Planning and Development ... Conveys the directions determined by the Board and provides details on IDRC operations ... the opportunities and challenges these present for IDRC's business model; ...

  19. Predictable Medea

    Directory of Open Access Journals (Sweden)

    Elisabetta Bertolino

    2010-01-01

    By focusing on the tragedy of the 'unpredictable' infanticide perpetrated by Medea, the paper speculates on the possibility of a non-violent ontological subjectivity for women victims of gendered violence and on whether it is possible to respond to violent actions in non-violent ways; it argues that Medea did not act in an unpredictable way, but rather through the very predictable subject of resentment and violence. 'Medea' represents the story of all of us who require justice as retribution against any wrong. The presupposition is that the empowered female subjectivity of women’s rights contains the same desire to master others as the current masculine legal and philosophical subject. The subject of women’s rights is grounded in the emotions of resentment and retribution and refuses the categories of the private by appropriating those of the righteous, masculine and public subject. The essay opposes the essentialised stereotypes of the feminine and the maternal with an ontological approach to people as singular, corporeal, vulnerable and dependent. There is therefore an emphasis on the excluded categories of the private. Forgiveness is taken into account as a category of the private and a possibility of responding to violence with newness. A violent act is seen in relation to the community of human beings rather than through an isolated setting, as in the case of the individual of human rights. In this context, forgiveness allows one to risk again and to be with. The result is also a rethinking of feminist actions, feminine subjectivity and the maternal. Overall the paper opens up the Arendtian categories of action and forgiveness and the Cavarerian unique and corporeal ontology of selfhood beyond gendered stereotypes.

  20. Fatigue crack growth and life prediction under mixed-mode loading

    Science.gov (United States)

    Sajith, S.; Murthy, K. S. R. K.; Robi, P. S.

    2018-04-01

    Fatigue crack growth life as a function of crack length is essential for the prevention of catastrophic failures from a damage tolerance perspective. In the damage tolerance design approach, principles of fracture mechanics are usually applied to predict the fatigue life of structural components, and numerical prediction of crack growth versus number of cycles is essential. For cracks under mixed mode I/II loading, the modified Paris law (da/dN = C(ΔKeq)^m) along with an equivalent stress intensity factor (ΔKeq) model is used for fatigue crack growth rate prediction. A large number of ΔKeq models are available for mixed mode I/II loading, and the selection of a proper ΔKeq model has a significant impact on fatigue life prediction. In the present investigation, the performance of ΔKeq models in fatigue life prediction is compared with respect to experimental findings, as there are no guidelines/suggestions available on the selection of these models for accurate and/or conservative predictions of fatigue life. Within the limitations of the available experimental data and currently available numerical simulation techniques, the results of the present study attempt to outline models that would provide accurate and conservative life predictions. Such a study aids numerical analysts and engineers in the proper selection of a model for numerical simulation of fatigue life. Moreover, the present investigation also suggests a procedure to enhance the accuracy of life prediction using the Paris law.
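    As a rough illustration of the life-prediction step, the Paris law above can be integrated numerically from an initial to a critical crack length. The sketch below assumes the basic mode I form with a constant geometry factor Y and illustrative material constants; it is not the paper's mixed-mode ΔKeq procedure.

```python
import math

def fatigue_life_cycles(a0, ac, delta_sigma, C, m, Y=1.0, steps=10_000):
    """Estimate fatigue life N by numerically integrating the Paris law
    da/dN = C * (DeltaK)^m, with DeltaK = Y * delta_sigma * sqrt(pi * a).

    a0, ac      -- initial and critical crack lengths (m)
    delta_sigma -- stress range (MPa)
    C, m        -- Paris constants (units consistent with MPa*sqrt(m))
    Y           -- geometry factor (assumed constant here)
    """
    da = (ac - a0) / steps
    cycles = 0.0
    for i in range(steps):
        a_mid = a0 + (i + 0.5) * da              # midpoint rule
        dK = Y * delta_sigma * math.sqrt(math.pi * a_mid)
        cycles += da / (C * dK ** m)             # dN = da / (C * dK^m)
    return cycles
```

    As expected from the integrand, a longer initial crack leaves fewer remaining cycles, which is the crack-length-versus-life relationship the abstract refers to.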

  1. NATO Guide for Judgement-Based Operational Analysis in Defence Decision Making (Guide OTAN pour l’analyse operationnelle basee sur le jugement dans la prise de decision de defense). Analyst-Oriented Volume: Code of Best Practice for Soft Operational Analysis

    Science.gov (United States)

    2012-06-01

    different stages of the cognitive system [3], such as the following: • The ease with which information can be recalled from memory affects how frequently... colouring is used to denote the three different text box types. The TG has restricted itself in referencing the material in the main text in order to...1: Procedure for Interpreting Problematic Situations and Identifying Their Nature. Recall from Chapter 4 that the analyst very often has to ‘prove

  2. PSORTb 3.0: improved protein subcellular localization prediction with refined localization subcategories and predictive capabilities for all prokaryotes.

    Science.gov (United States)

    Yu, Nancy Y; Wagner, James R; Laird, Matthew R; Melli, Gabor; Rey, Sébastien; Lo, Raymond; Dao, Phuong; Sahinalp, S Cenk; Ester, Martin; Foster, Leonard J; Brinkman, Fiona S L

    2010-07-01

    PSORTb has remained the most precise bacterial protein subcellular localization (SCL) predictor since it was first made available in 2003. However, the recall needs to be improved and no accurate SCL predictors yet make predictions for archaea, nor differentiate important localization subcategories, such as proteins targeted to a host cell or bacterial hyperstructures/organelles. Such improvements should preferably be encompassed in a freely available web-based predictor that can also be used as a standalone program. We developed PSORTb version 3.0 with improved recall, higher proteome-scale prediction coverage, and new refined localization subcategories. It is the first SCL predictor specifically geared for all prokaryotes, including archaea and bacteria with atypical membrane/cell wall topologies. It features an improved standalone program, with a new batch results delivery system complementing its web interface. We evaluated the most accurate SCL predictors using 5-fold cross validation plus we performed an independent proteomics analysis, showing that PSORTb 3.0 is the most accurate but can benefit from being complemented by Proteome Analyst predictions. http://www.psort.org/psortb (download open source software or use the web interface). psort-mail@sfu.ca Supplementary data are available at Bioinformatics online.
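    The 5-fold cross-validation used to score the predictors can be sketched generically. The `train` callback and the interleaved split below are hypothetical scaffolding for illustration, not PSORTb's actual evaluation harness.

```python
from typing import Callable, List, Sequence, Tuple

def k_fold_accuracy(data: Sequence[Tuple[object, str]],
                    train: Callable[[List[Tuple[object, str]]], Callable[[object], str]],
                    k: int = 5) -> float:
    """Average accuracy over k folds; `train` fits on k-1 folds and
    returns a predict function evaluated on the held-out fold."""
    folds = [list(data[i::k]) for i in range(k)]   # simple interleaved split
    accuracies = []
    for i in range(k):
        held_out = folds[i]
        train_set = [x for j, f in enumerate(folds) if j != i for x in f]
        predict = train(train_set)
        correct = sum(predict(seq) == label for seq, label in held_out)
        accuracies.append(correct / len(held_out))
    return sum(accuracies) / k
```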

  3. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    Science.gov (United States)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real time and presenting it in a manner that facilitates decision making. It provides cost savings by alerting and predicting when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provides insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site, and builds on a big data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
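    A minimal stand-in for the threshold-alerting idea: flag readings that drift beyond a z-score bound of a rolling baseline. The window size and threshold below are illustrative assumptions, not the study's actual pipeline.

```python
from collections import deque
import statistics

def detect_anomalies(readings, window=20, z_thresh=3.0):
    """Return indices of readings that deviate from the rolling mean of
    the previous `window` readings by more than z_thresh standard
    deviations -- a toy version of streaming sensor-degradation alerts."""
    history = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(readings):
        if len(history) == window:
            mu = statistics.fmean(history)
            sd = statistics.pstdev(history)
            if sd > 0 and abs(x - mu) / sd > z_thresh:
                anomalies.append(i)
        history.append(x)
    return anomalies
```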

  4. Predictable and avoidable: What’s next?

    Directory of Open Access Journals (Sweden)

    Ivo Pezzuto

    2014-09-01

    The author of this paper (Dr. Ivo Pezzuto) was one of the first authors to write, back in 2008, about the alleged "subprime mortgage loans fraud" that triggered the 2008 financial crisis, in combination with multiple other complex, highly interrelated, and concurrent factors. In that same 2008 working paper (available on SSRN and titled "Miraculous Financial Engineering or Toxic Finance? The Genesis of the U.S. Subprime Mortgage Loans Crisis and its Consequences on the Global Financial Markets and Real Economy"), he was also one of the first to report the high probability of a Eurozone debt crisis, due to a number of unsolved structural macroeconomic problems, the lack of a single crisis resolution scheme, current account imbalances, and, in some countries, housing bubbles/high private debt. In the book published in 2013 and titled "Predictable and Avoidable: Repairing Economic Dislocation and Preventing the Recurrence of Crisis", Dr. Pezzuto exposed the root causes of the financial crisis in order to enable readers to understand that the crisis we have seen was predictable and should have been avoidable, and that a recurrence can be avoided if lessons are learned and the right action is taken. Almost one year after the publication of "Predictable and Avoidable", the author decided to write this working paper to explore what has happened in the meantime to the financial markets and to the implementation of financial regulation. Most of all, with this working paper the author aims to provide an updated analysis, as strategist and scenario analyst, of the topics addressed in "Predictable and Avoidable", based on a forward-looking perspective and on potential "tail risk" scenarios. The topics reported in this paper relate to financial crises; government policy; financial regulation; corporate governance; credit risk management

  5. Evaluation and comparison of mammalian subcellular localization prediction methods

    Directory of Open Access Journals (Sweden)

    Fink J Lynn

    2006-12-01

    Background: Determination of the subcellular location of a protein is essential to understanding its biochemical function. This information can provide insight into the function of hypothetical or novel proteins. These data are difficult to obtain experimentally but have become especially important since many whole genome sequencing projects have been finished and many resulting protein sequences still lack detailed functional information. In order to address this paucity of data, many computational prediction methods have been developed. However, these methods have varying levels of accuracy and perform differently based on the sequences that are presented to the underlying algorithm. It is therefore useful to compare these methods and monitor their performance. Results: In order to perform a comprehensive survey of prediction methods, we selected only methods that accepted large batches of protein sequences, were publicly available, and were able to predict localization to at least nine of the major subcellular locations (nucleus, cytosol, mitochondrion, extracellular region, plasma membrane, Golgi apparatus, endoplasmic reticulum (ER), peroxisome, and lysosome). The selected methods were CELLO, MultiLoc, Proteome Analyst, pTarget and WoLF PSORT. These methods were evaluated using 3763 mouse proteins from SwissProt that represent the source of the training sets used in development of the individual methods. In addition, an independent evaluation set of 2145 mouse proteins from LOCATE with a bias towards the subcellular localizations underrepresented in SwissProt was used. The sensitivity and specificity were calculated for each method and compared to a theoretical value based on what might be observed by random chance. Conclusion: No individual method had a sufficient level of sensitivity across both evaluation sets that would enable reliable application to hypothetical proteins. All methods showed lower performance on the LOCATE
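    The per-location sensitivity and specificity the authors compute reduce to simple confusion-matrix counts. A minimal sketch (the label names are made up for illustration):

```python
def sensitivity_specificity(y_true, y_pred, positive):
    """Per-class sensitivity (recall of the class) and specificity
    (correct rejection rate of all other classes)."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    sens = tp / (tp + fn) if tp + fn else 0.0
    spec = tn / (tn + fp) if tn + fp else 0.0
    return sens, spec
```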

  6. Artificial Neural Network and Genetic Algorithm Hybrid Intelligence for Predicting Thai Stock Price Index Trend

    Directory of Open Access Journals (Sweden)

    Montri Inthachot

    2016-01-01

    This study investigated the use of Artificial Neural Network (ANN) and Genetic Algorithm (GA) for prediction of Thailand’s SET50 index trend. ANN is a widely accepted machine learning method that uses past data to predict future trend, while GA is an algorithm that can find better subsets of input variables for importing into ANN, hence enabling more accurate prediction by its efficient feature selection. The imported data were chosen technical indicators highly regarded by stock analysts, each represented by 4 input variables that were based on past time spans of 4 different lengths: 3-, 5-, 10-, and 15-day spans before the day of prediction. This import undertaking generated a big set of diverse input variables with an exponentially higher number of possible subsets that GA culled down to a manageable number of more effective ones. SET50 index data of the past 6 years, from 2009 to 2014, were used to evaluate this hybrid intelligence prediction accuracy, and the hybrid’s prediction results were found to be more accurate than those made by a method using only one input variable for one fixed length of past time span.

  7. Artificial Neural Network and Genetic Algorithm Hybrid Intelligence for Predicting Thai Stock Price Index Trend

    Science.gov (United States)

    Boonjing, Veera; Intakosum, Sarun

    2016-01-01

    This study investigated the use of Artificial Neural Network (ANN) and Genetic Algorithm (GA) for prediction of Thailand's SET50 index trend. ANN is a widely accepted machine learning method that uses past data to predict future trend, while GA is an algorithm that can find better subsets of input variables for importing into ANN, hence enabling more accurate prediction by its efficient feature selection. The imported data were chosen technical indicators highly regarded by stock analysts, each represented by 4 input variables that were based on past time spans of 4 different lengths: 3-, 5-, 10-, and 15-day spans before the day of prediction. This import undertaking generated a big set of diverse input variables with an exponentially higher number of possible subsets that GA culled down to a manageable number of more effective ones. SET50 index data of the past 6 years, from 2009 to 2014, were used to evaluate this hybrid intelligence prediction accuracy, and the hybrid's prediction results were found to be more accurate than those made by a method using only one input variable for one fixed length of past time span. PMID:27974883
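    The GA-driven search over input-variable subsets described above can be sketched as a bitmask genetic algorithm. Everything below (population size, truncation selection, one-point crossover, mutation rate) is a generic textbook GA, not the paper's exact configuration; `fitness` stands in for the ANN's prediction accuracy on a candidate subset.

```python
import random

def ga_select(n_features, fitness, pop_size=20, generations=30,
              p_mut=0.05, seed=0):
    """Evolve bitmask feature subsets; `fitness` maps a 0/1 tuple
    (1 = feature included) to a score to maximise."""
    rng = random.Random(seed)
    pop = [tuple(rng.randint(0, 1) for _ in range(n_features))
           for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=fitness, reverse=True)
        parents = ranked[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)       # one-point crossover
            child = a[:cut] + b[cut:]
            child = tuple(bit ^ (rng.random() < p_mut) for bit in child)
            children.append(child)
        pop = parents + children                     # elitist replacement
    return max(pop, key=fitness)
```

    With a trivial fitness such as `sum` (favour masks with more ones), the search quickly converges toward the all-ones mask, which is a cheap sanity check before plugging in an expensive model-based score.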

  8. Making detailed predictions makes (some) predictions worse

    Science.gov (United States)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  9. Economic decision making and the application of nonparametric prediction models

    Science.gov (United States)

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2008-01-01

    Sustained increases in energy prices have focused attention on gas resources in low-permeability shale or in coals that were previously considered economically marginal. Daily well deliverability is often relatively small, although the estimates of the total volumes of recoverable resources in these settings are often large. Planning and development decisions for extraction of such resources must be areawide because profitable extraction requires optimization of scale economies to minimize costs and reduce risk. For an individual firm, the decision to enter such plays depends on reconnaissance-level estimates of regional recoverable resources and on cost estimates to develop untested areas. This paper shows how simple nonparametric local regression models, used to predict technically recoverable resources at untested sites, can be combined with economic models to compute regional-scale cost functions. The context of the worked example is the Devonian Antrim-shale gas play in the Michigan basin. One finding relates to selection of the resource prediction model to be used with economic models. Models chosen because they can best predict aggregate volume over larger areas (many hundreds of sites) smooth out granularity in the distribution of predicted volumes at individual sites. This loss of detail affects the representation of economic cost functions and may affect economic decisions. Second, because some analysts consider unconventional resources to be ubiquitous, the selection and order of specific drilling sites may, in practice, be determined arbitrarily by extraneous factors. The analysis shows a 15-20% gain in gas volume when these simple models are applied to order drilling prospects strategically rather than to choose drilling locations randomly. Copyright © 2008 Society of Petroleum Engineers.
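    One of the simplest nonparametric local regression predictors of the kind described above is the Nadaraya-Watson kernel-weighted average; the Gaussian kernel and bandwidth below are illustrative assumptions, not the paper's actual model.

```python
import math

def local_regression(x_train, y_train, x0, bandwidth=1.0):
    """Predict y at x0 as a kernel-weighted average of training
    observations; nearby sites get exponentially larger weights."""
    weights = [math.exp(-((x - x0) / bandwidth) ** 2 / 2.0) for x in x_train]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, y_train)) / total
```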

  10. Predictive modeling of complications.

    Science.gov (United States)

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  11. Predicting outdoor sound

    CERN Document Server

    Attenborough, Keith; Horoshenkov, Kirill

    2014-01-01

    1. Introduction
    2. The Propagation of Sound Near Ground Surfaces in a Homogeneous Medium
    3. Predicting the Acoustical Properties of Outdoor Ground Surfaces
    4. Measurements of the Acoustical Properties of Ground Surfaces and Comparisons with Models
    5. Predicting Effects of Source Characteristics on Outdoor Sound
    6. Predictions, Approximations and Empirical Results for Ground Effect Excluding Meteorological Effects
    7. Influence of Source Motion on Ground Effect and Diffraction
    8. Predicting Effects of Mixed Impedance Ground
    9. Predicting the Performance of Outdoor Noise Barriers
    10. Predicting Effects of Vegetation, Trees and Turbulence
    11. Analytical Approximations including Ground Effect, Refraction and Turbulence
    12. Prediction Schemes
    13. Predicting Sound in an Urban Environment.

  12. Close Approach Prediction Analysis of the Earth Science Constellation with the Fengyun-1C Debris

    Science.gov (United States)

    Duncan, Matthew; Rand, David K.

    2008-01-01

    Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. Each day, close approach predictions are generated by a U.S. Department of Defense Joint Space Operations Center Orbital Safety Analyst using the high accuracy Space Object Catalog maintained by the Air Force's 1st Space Control Squadron. Prediction results and other ancillary data such as state vector information are sent to NASA/Goddard Space Flight Center's (GSFC's) Collision Risk Assessment analysis team for review. Collision analysis is performed and the GSFC team works with the ESC member missions to develop risk reduction strategies as necessary. This paper presents various close approach statistics for the ESC. The ESC missions have been affected by debris from the recent anti-satellite test which destroyed the Chinese Fengyun-1C satellite. The paper also presents the percentage of close approach events induced by the Fengyun-1C debris, and presents analysis results which predict the future effects on the ESC caused by this event. Specifically, the Fengyun-1C debris is propagated for twenty years using high-performance computing technology and close approach predictions are generated for the ESC. The percent increase in the total number of conjunction events is considered to be an estimate of the collision risk due to the Fengyun-1C break-up.

  13. Applied predictive control

    CERN Document Server

    Sunan, Huang; Heng, Lee Tong

    2002-01-01

    The presence of considerable time delays in the dynamics of many industrial processes, leading to difficult problems in the associated closed-loop control systems, is a well-recognized phenomenon. The performance achievable in conventional feedback control systems can be significantly degraded if an industrial process has a relatively large time delay compared with the dominant time constant. Under these circumstances, advanced predictive control is necessary to improve the performance of the control system significantly. The book is a focused treatment of the subject matter, including the fundamentals and some state-of-the-art developments in the field of predictive control. Three main schemes for advanced predictive control are addressed in this book: • Smith Predictive Control; • Generalised Predictive Control; • a form of predictive control based on Finite Spectrum Assignment. A substantial part of the book addresses application issues in predictive control, providing several interesting case studie...

  14. Predictable or not predictable? The MOV question

    International Nuclear Information System (INIS)

    Thibault, C.L.; Matzkiw, J.N.; Anderson, J.W.; Kessler, D.W.

    1994-01-01

    Over the past 8 years, the nuclear industry has struggled to understand the dynamic phenomena experienced during motor-operated valve (MOV) operation under differing flow conditions. For some valves and designs, operational functionality has been found to be predictable; for others, unpredictable. Although much has been accomplished over this period, especially on modeling valve dynamics, the unpredictability of many valves and designs still exists. A few valve manufacturers are focusing on improving design and fabrication techniques to enhance product reliability and predictability. However, this approach does not address these issues for installed and unpredictable valves. This paper presents some of the more promising techniques that Wyle Laboratories has explored with potential for transforming unpredictable valves into predictable valves and for retrofitting installed MOVs. These techniques include optimized valve tolerancing, surrogate material evaluation, and enhanced surface treatments.

  15. Could the outcome of the 2016 US elections have been predicted from past voting patterns?

    Science.gov (United States)

    Schmitz, Peter M. U.; Holloway, Jennifer P.; Dudeni-Tlhone, Nontembeko; Ntlangu, Mbulelo B.; Koen, Renee

    2018-05-01

    In South Africa, a team of analysts has for some years been using statistical techniques to predict election outcomes during election nights. The prediction method uses statistical clusters based on past voting patterns to predict final election outcomes from a small number of released vote counts. With the US presidential elections of November 2016 hitting the global media headlines directly after successful predictions had been made for the South African elections, the team decided to investigate adapting their method to forecast the final outcome of the US elections. In particular, it was felt that the time zone differences between states would affect the time at which results are released and thereby provide a window of opportunity for election night prediction using only the early results from the eastern side of the US. Testing the method on the US presidential elections would have two advantages: it would determine whether the core methodology could be generalised, and whether it would work to include a stronger spatial element in the modelling, since the early results released would be spatially biased due to time zone differences. This paper presents a high-level view of the overall methodology and how it was adapted to predict the results of the US presidential elections. A discussion on the clustering of spatial units within the US is also provided, and the spatial distribution of results together with the Electoral College prediction results from both a 'test-run' and the final 2016 presidential elections are given and analysed.
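    One way to sketch the cluster-based projection idea: fill in unreported districts using the observed swing of their cluster's already-reported districts relative to past vote shares. The data layout and the swing rule below are assumptions for illustration, not the team's actual model.

```python
from collections import defaultdict

def project_result(districts, reported):
    """Project final party vote shares from early returns.

    districts: dict id -> {"cluster": c, "expected_votes": n,
                           "past_share": {party: share}}
    reported:  dict id -> {party: votes} for districts already counted.
    """
    # Observed vs past-implied totals per cluster, from reported districts.
    obs = defaultdict(lambda: defaultdict(float))
    past = defaultdict(lambda: defaultdict(float))
    for d, votes in reported.items():
        cluster = districts[d]["cluster"]
        turnout = sum(votes.values())
        for party, v in votes.items():
            obs[cluster][party] += v
        for party, share in districts[d]["past_share"].items():
            past[cluster][party] += share * turnout
    # Sum actual counts; scale unreported districts by the cluster swing.
    totals = defaultdict(float)
    for d, info in districts.items():
        if d in reported:
            for party, v in reported[d].items():
                totals[party] += v
        else:
            cluster = info["cluster"]
            for party, share in info["past_share"].items():
                swing = (obs[cluster][party] / past[cluster][party]
                         if past[cluster].get(party) else 1.0)
                totals[party] += info["expected_votes"] * share * swing
    grand = sum(totals.values())
    return {party: v / grand for party, v in totals.items()}
```

    When a reported district votes exactly as it did historically, the swing is 1 and the projection reproduces the past shares, which is a useful sanity check on the bookkeeping.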

  16. Predictive systems ecology.

    Science.gov (United States)

    Evans, Matthew R; Bithell, Mike; Cornell, Stephen J; Dall, Sasha R X; Díaz, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J; Lewis, Simon L; Mace, Georgina M; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim; Norris, K J; Petchey, Owen; Smith, Matthew; Travis, Justin M J; Benton, Tim G

    2013-11-22

    Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly, usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of predictive systems ecology, explicitly to understand and predict the properties and behaviour of ecological systems. We discuss the necessary and desirable features of predictive systems ecology models. There are places where predictive systems ecology is already being practised, and we summarize a range of terrestrial and marine examples. Significant challenges remain, but we suggest that ecology would both benefit as a scientific discipline and increase its impact in society if it were to embrace the need to become more predictive.

  17. Seismology for rockburst prediction.

    CSIR Research Space (South Africa)

    De Beer, W

    2000-02-01

    Full Text Available Project GAP409 presents a method (SOOTHSAY) for predicting larger mining-induced seismic events in gold mines, as well as a pattern recognition algorithm (INDICATOR) for characterising the seismic response of rock to mining and inferring future... State. Defining the time series of a specific function on a catalogue as a prediction strategy, the algorithm currently has success rates of 53% and 65%, respectively, of large events claimed as being predicted in these two cases, with uncertainties...

  18. Predictability of Conversation Partners

    Science.gov (United States)

    Takaguchi, Taro; Nakamura, Mitsuhiro; Sato, Nobuo; Yano, Kazuo; Masuda, Naoki

    2011-08-01

    Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one’s conversation partners is defined as the degree to which one’s next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, some predictability exists in the data beyond the contribution of this long-tailed nature. In addition, an individual’s predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community—in the sense of an abundance of surrounding triangles—tend to have low predictability, and those bridging different communities tend to have high predictability.
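
The mutual-information measure of partner predictability can be sketched as below. The partner sequence is a hypothetical illustration, not the infrared-sensor data from the study.

```python
# Sketch: I(current partner; next partner) from a sequence of partner
# labels, estimated from empirical pair frequencies. Labels are invented.
from collections import Counter
from math import log2

def mutual_information(seq):
    """Mutual information between consecutive partner labels, in bits."""
    pairs = list(zip(seq, seq[1:]))
    n = len(pairs)
    p_xy = Counter(pairs)
    p_x = Counter(x for x, _ in pairs)
    p_y = Counter(y for _, y in pairs)
    mi = 0.0
    for (x, y), c in p_xy.items():
        pxy = c / n
        mi += pxy * log2(pxy / ((p_x[x] / n) * (p_y[y] / n)))
    return mi

partners = ["A", "B", "A", "B", "A", "C", "A", "B", "A", "B"]
print(round(mutual_information(partners), 3))
```

For a perfectly alternating sequence the next partner is fully determined by the current one, so the mutual information equals the entropy of the next-partner distribution; for an i.i.d. sequence it tends to zero.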

  19. Predictability of Conversation Partners

    Directory of Open Access Journals (Sweden)

    Taro Takaguchi

    2011-09-01

    Full Text Available Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one’s conversation partners is defined as the degree to which one’s next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, some predictability exists in the data beyond the contribution of this long-tailed nature. In addition, an individual’s predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community—in the sense of an abundance of surrounding triangles—tend to have low predictability, and those bridging different communities tend to have high predictability.

  20. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

    Computer architects and researchers in the real-time domain have started to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound of the worst-case execution time. To compare different approaches we would like to quantify time predictability; that means we need to measure it. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  1. Predicting scholars' scientific impact.

    Directory of Open Access Journals (Sweden)

    Amin Mazloumian

    Full Text Available We tested the underlying assumption that citation counts are reliable predictors of future success, analyzing complete citation data on the careers of ~150,000 scientists. Our results show that (i) among all citation indicators, the annual citation count at the time of prediction is the best predictor of future citations; (ii) future citations of a scientist's published papers can be predicted accurately (r² = 0.80 for a 1-year prediction, P < 0.001); but (iii) future citations of future work are hardly predictable.
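
A toy version of the regression check behind result (ii) might look like the following. The citation counts are synthetic, not the authors' ~150,000-career dataset, and r² here is simply the squared Pearson correlation of a one-predictor relationship.

```python
# Sketch: how well do current annual citations predict next-year citations?
# Synthetic data with an assumed strong linear relationship plus noise.
import random

def r_squared(x, y):
    """Squared Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

rng = random.Random(3)
current = [rng.randrange(0, 200) for _ in range(500)]      # citations this year
next_year = [0.9 * c + rng.gauss(0, 15) for c in current]  # assumed relation
print(round(r_squared(current, next_year), 2))
```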

  2. Nonfinancial Disclosure and Analyst Forecast Accuracy: Empirical Evidence on Corporate Social Responsibility Disclosure from Shenzhen Stock Exchange

    Institute of Scientific and Technical Information of China (English)

    李晚金; 张莉

    2014-01-01

    We examine the relationship between disclosure of nonfinancial information and analyst forecast accuracy, using the issuance of stand-alone corporate social responsibility (CSR) reports to proxy for disclosure of nonfinancial information. The multiple regression results show that higher-quality CSR report disclosure is associated with higher analyst forecast accuracy, and that the relationship is stronger in firms with more opaque financial disclosure. This suggests that the nonfinancial information in CSR reports not only has information content but also complements financial disclosure by mitigating the negative effect of financial opacity on forecast accuracy.

  3. An assessment of the validity of inelastic design analysis methods by comparisons of predictions with test results

    International Nuclear Information System (INIS)

    Corum, J.M.; Clinard, J.A.; Sartory, W.K.

    1976-01-01

    The use of computer programs that employ relatively complex constitutive theories and analysis procedures to perform inelastic design calculations on fast reactor system components introduces questions of validation and acceptance of the analysis results. We may ask ourselves, ''How valid are the answers?'' These questions, in turn, involve the concepts of verification of computer programs as well as qualification of the computer programs and of the underlying constitutive theories and analysis procedures. This paper addresses the latter: the qualification of the analysis methods for inelastic design calculations. Some of the work underway in the United States to provide the necessary information to evaluate inelastic analysis methods and computer programs is described, and typical comparisons of analysis predictions with inelastic structural test results are presented. It is emphasized throughout that rather than asking how valid, or correct, the analytical predictions are, we might more properly question whether or not the combination of the predictions and the associated high-temperature design criteria leads to an acceptable level of structural integrity. It is believed that in this context the analysis predictions are generally valid, even though exact correlations between predictions and actual behavior are not obtained and cannot be expected. Final judgment, however, must be reserved for the design analyst in each specific case. (author)

  4. The Prediction Value

    NARCIS (Netherlands)

    Koster, M.; Kurz, S.; Lindner, I.; Napel, S.

    2013-01-01

    We introduce the prediction value (PV) as a measure of players’ informational importance in probabilistic TU games. The latter combine a standard TU game and a probability distribution over the set of coalitions. Player i’s prediction value equals the difference between the conditional expectations

  5. Predictability of Stock Returns

    Directory of Open Access Journals (Sweden)

    Ahmet Sekreter

    2017-06-01

    Full Text Available Predictability of stock returns has been shown by empirical studies over time. This article collects the most important theories on forecasting stock returns and investigates the factors that affect the behavior of stock prices and the market as a whole. Estimation of these factors, and the way they are estimated, are the key issues in the predictability of stock returns.

  6. Predicting AD conversion

    DEFF Research Database (Denmark)

    Liu, Yawu; Mattila, Jussi; Ruiz, Miguel Ángel Muñoz

    2013-01-01

    To compare the accuracies of predicting AD conversion by using a decision support system (PredictAD tool) and current research criteria of prodromal AD as identified by combinations of episodic memory impairment of hippocampal type and visual assessment of medial temporal lobe atrophy (MTA) on MRI...

  7. Predicting Free Recalls

    Science.gov (United States)

    Laming, Donald

    2006-01-01

    This article reports some calculations on free-recall data from B. Murdock and J. Metcalfe (1978), with vocal rehearsal during the presentation of a list. Given the sequence of vocalizations, with the stimuli inserted in their proper places, it is possible to predict the subsequent sequence of recalls--the predictions taking the form of a…

  8. Archaeological predictive model set.

    Science.gov (United States)

    2015-03-01

    This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  9. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
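
The sampling-and-importance idea in this abstract can be sketched roughly as follows. The three-input linear model, the bin-based importance indicator, and all numbers are illustrative assumptions, not McKay's actual procedure; the point is only that ranking inputs by the variance of conditional means does not require a linearity assumption.

```python
# Sketch: propagate input uncertainty with Latin hypercube sampling and
# rank inputs by a variance-ratio importance indicator. Toy model only.
import random

def latin_hypercube(n, dims, rng):
    """n stratified samples in [0,1)^dims: one stratum per sample per dim."""
    cols = []
    for _ in range(dims):
        col = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))

def model(x):
    # Assumed toy model: x[0] dominates the output variance.
    return 4.0 * x[0] + 1.0 * x[1] + 0.1 * x[2]

rng = random.Random(0)
samples = latin_hypercube(2000, 3, rng)
y = [model(s) for s in samples]
mu = sum(y) / len(y)
var_y = sum((v - mu) ** 2 for v in y) / len(y)

def importance(idx, bins=10):
    """Variance of binned conditional means, as a fraction of total variance."""
    groups = [[] for _ in range(bins)]
    for s, v in zip(samples, y):
        groups[min(int(s[idx] * bins), bins - 1)].append(v)
    between = sum(len(g) * (sum(g) / len(g) - mu) ** 2 for g in groups if g)
    return between / len(y) / var_y

ranking = sorted(range(3), key=importance, reverse=True)
print(ranking)  # x[0] is expected to rank first
```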

  10. Ground motion predictions

    Energy Technology Data Exchange (ETDEWEB)

    Loux, P C [Environmental Research Corporation, Alexandria, VA (United States)

    1969-07-01

    Nuclear generated ground motion is defined and then related to the physical parameters that cause it. Techniques employed for prediction of ground motion peak amplitude, frequency spectra and response spectra are explored, with initial emphasis on the analysis of data collected at the Nevada Test Site (NTS). NTS postshot measurements are compared with pre-shot predictions. Applicability of these techniques to new areas, for example, Plowshare sites, must be questioned. Fortunately, the Atomic Energy Commission is sponsoring complementary studies to improve prediction capabilities primarily in new locations outside the NTS region. Some of these are discussed in the light of anomalous seismic behavior, and comparisons are given showing theoretical versus experimental results. In conclusion, current ground motion prediction techniques are applied to events off the NTS. Predictions are compared with measurements for the event Faultless and for the Plowshare events, Gasbuggy, Cabriolet, and Buggy I. (author)

  11. Ground motion predictions

    International Nuclear Information System (INIS)

    Loux, P.C.

    1969-01-01

    Nuclear generated ground motion is defined and then related to the physical parameters that cause it. Techniques employed for prediction of ground motion peak amplitude, frequency spectra and response spectra are explored, with initial emphasis on the analysis of data collected at the Nevada Test Site (NTS). NTS postshot measurements are compared with pre-shot predictions. Applicability of these techniques to new areas, for example, Plowshare sites, must be questioned. Fortunately, the Atomic Energy Commission is sponsoring complementary studies to improve prediction capabilities primarily in new locations outside the NTS region. Some of these are discussed in the light of anomalous seismic behavior, and comparisons are given showing theoretical versus experimental results. In conclusion, current ground motion prediction techniques are applied to events off the NTS. Predictions are compared with measurements for the event Faultless and for the Plowshare events, Gasbuggy, Cabriolet, and Buggy I. (author)

  12. Structural prediction in aphasia

    Directory of Open Access Journals (Sweden)

    Tessa Warren

    2015-05-01

    Full Text Available There is considerable evidence that young healthy comprehenders predict the structure of upcoming material, and that their processing is facilitated when they encounter material matching those predictions (e.g., Staub & Clifton, 2006; Yoshida, Dickey & Sturt, 2013. However, less is known about structural prediction in aphasia. There is evidence that lexical prediction may be spared in aphasia (Dickey et al., 2014; Love & Webb, 1977; cf. Mack et al, 2013. However, predictive mechanisms supporting facilitated lexical access may not necessarily support structural facilitation. Given that many people with aphasia (PWA exhibit syntactic deficits (e.g. Goodglass, 1993, PWA with such impairments may not engage in structural prediction. However, recent evidence suggests that some PWA may indeed predict upcoming structure (Hanne, Burchert, De Bleser, & Vashishth, 2015. Hanne et al. tracked the eyes of PWA (n=8 with sentence-comprehension deficits while they listened to reversible subject-verb-object (SVO and object-verb-subject (OVS sentences in German, in a sentence-picture matching task. Hanne et al. manipulated case and number marking to disambiguate the sentences’ structure. Gazes to an OVS or SVO picture during the unfolding of a sentence were assumed to indicate prediction of the structure congruent with that picture. According to this measure, the PWA’s structural prediction was impaired compared to controls, but they did successfully predict upcoming structure when morphosyntactic cues were strong and unambiguous. Hanne et al.’s visual-world evidence is suggestive, but their forced-choice sentence-picture matching task places tight constraints on possible structural predictions. Clearer evidence of structural prediction would come from paradigms where the content of upcoming material is not as constrained. The current study used self-paced reading study to examine structural prediction among PWA in less constrained contexts. PWA (n=17 who

  13. Prediction of bull fertility.

    Science.gov (United States)

    Utt, Matthew D

    2016-06-01

    Prediction of male fertility is an often sought-after endeavor for many species of domestic animals. This review will primarily focus on providing some examples of dependent and independent variables to stimulate thought about the approach and methodology of identifying the most appropriate of those variables to predict bull (bovine) fertility. Although the list of variables will continue to grow with advancements in science, the principles behind making predictions will likely not change significantly. The basic principle of prediction requires identifying a dependent variable that is an estimate of fertility and an independent variable or variables that may be useful in predicting the fertility estimate. Fertility estimates vary in which parts of the process leading to conception that they infer about and the amount of variation that influences the estimate and the uncertainty thereof. The list of potential independent variables can be divided into competence of sperm based on their performance in bioassays or direct measurement of sperm attributes. A good prediction will use a sample population of bulls that is representative of the population to which an inference will be made. Both dependent and independent variables should have a dynamic range in their values. Careful selection of independent variables includes reasonable measurement repeatability and minimal correlation among variables. Proper estimation and having an appreciation of the degree of uncertainty of dependent and independent variables are crucial for using predictions to make decisions regarding bull fertility. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Bootstrap prediction and Bayesian prediction under misspecified models

    OpenAIRE

    Fushiki, Tadayoshi

    2005-01-01

    We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...

  15. Prediction ranges. Annual review

    Energy Technology Data Exchange (ETDEWEB)

    Parker, J.C.; Tharp, W.H.; Spiro, P.S.; Keng, K.; Angastiniotis, M.; Hachey, L.T.

    1988-01-01

    Prediction ranges equip the planner with one more tool for improved assessment of the outcome of a course of action. One of their major uses is in financial evaluations, where corporate policy requires the performance of uncertainty analysis for large projects. This report gives an overview of the uses of prediction ranges, with examples; and risks and uncertainties in growth, inflation, and interest and exchange rates. Prediction ranges and standard deviations of 80% and 50% probability are given for various economic indicators in Ontario, Canada, and the USA, as well as for foreign exchange rates and Ontario Hydro interest rates. An explanatory note on probability is also included. 23 tabs.

  16. Wind power prediction models

    Science.gov (United States)

    Levy, R.; Mcginness, H.

    1976-01-01

    Investigations were performed to predict the power available from the wind at the Goldstone, California, antenna site complex. The background for power prediction was derived from a statistical evaluation of available wind speed data records at this location and at nearby locations similarly situated within the Mojave desert. In addition to a model for power prediction over relatively long periods of time, an interim simulation model that produces sample wind speeds is described. The interim model furnishes uncorrelated sample speeds at hourly intervals that reproduce the statistical wind distribution at Goldstone. A stochastic simulation model to provide speed samples representative of both the statistical speed distributions and correlations is also discussed.
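
The two simulator flavours described (uncorrelated hourly samples versus a correlated stochastic model) might be sketched as below. The Weibull parameters and the AR(1) correlation are assumptions for illustration, not fitted Goldstone values.

```python
# Sketch: an "interim" simulator drawing uncorrelated hourly wind speeds
# from an assumed Weibull distribution, and a stochastic AR(1) model that
# adds hour-to-hour correlation while keeping the same marginal.
import math
import random

rng = random.Random(42)
SHAPE, SCALE = 2.0, 6.0        # assumed Weibull parameters (m/s)

def interim_sample():
    """One uncorrelated hourly speed via Weibull inverse-CDF sampling."""
    return SCALE * (-math.log(1.0 - rng.random())) ** (1.0 / SHAPE)

def ar1_series(n, phi=0.8):
    """Correlated series: AR(1) in Gaussian space, mapped to the Weibull marginal."""
    z = rng.gauss(0, 1)
    out = []
    for _ in range(n):
        z = phi * z + math.sqrt(1 - phi * phi) * rng.gauss(0, 1)
        u = 0.5 * (1 + math.erf(z / math.sqrt(2)))   # Gaussian CDF -> uniform
        u = min(max(u, 1e-12), 1 - 1e-12)
        out.append(SCALE * (-math.log(1.0 - u)) ** (1.0 / SHAPE))
    return out

speeds = ar1_series(24)  # one simulated day of hourly speeds
```

The Gaussian-copula trick preserves the target speed distribution while the AR(1) coefficient controls how strongly consecutive hours resemble each other.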

  17. Protein Sorting Prediction

    DEFF Research Database (Denmark)

    Nielsen, Henrik

    2017-01-01

    Many computational methods are available for predicting protein sorting in bacteria. When comparing them, it is important to know that they can be grouped into three fundamentally different approaches: signal-based, global-property-based and homology-based prediction. In this chapter, the strengths and drawbacks of each of these approaches are described through many examples of methods that predict secretion, integration into membranes, or subcellular locations in general. The aim of this chapter is to provide a user-level introduction to the field with a minimum of computational theory.

  18. 'Red Flag' Predictions

    DEFF Research Database (Denmark)

    Hallin, Carina Antonia; Andersen, Torben Juul; Tveterås, Sigbjørn

    This conceptual article introduces a new way to predict firm performance based on aggregation of sensing among frontline employees about changes in operational capabilities to update strategic action plans and generate innovations. We frame the approach in the context of first- and second-generation prediction markets and outline its unique features as a third-generation prediction market. It is argued that frontline employees gain deep insights when they execute operational activities on an ongoing basis in the organization. The experiential learning from close interaction with internal and external...

  19. Towards Predictive Association Theories

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios; Tsivintzelis, Ioannis; Michelsen, Michael Locht

    2011-01-01

    Association equations of state like SAFT, CPA and NRHB have been previously applied to many complex mixtures. In this work we focus on two of these models, the CPA and the NRHB equations of state, and the emphasis is on the analysis of their predictive capabilities for a wide range of applications. We use the term predictive in two situations: (i) with no use of binary interaction parameters, and (ii) multicomponent calculations using binary interaction parameters based solely on binary data. It is shown that the CPA equation of state can satisfactorily predict CO2–water–glycols–alkanes VLE...

  20. Prediction of intermetallic compounds

    International Nuclear Information System (INIS)

    Burkhanov, Gennady S; Kiselyova, N N

    2009-01-01

    The problems of predicting not yet synthesized intermetallic compounds are discussed. It is noted that the use of classical physicochemical analysis in the study of multicomponent metallic systems is faced with the complexity of presenting multidimensional phase diagrams. One way of predicting new intermetallics with specified properties is the use of modern computational technology applying computer-based pattern (image) recognition methods. The algorithms used most often in these methods are briefly considered, and the efficiency of their use for predicting new compounds is demonstrated.

  1. Filtering and prediction

    CERN Document Server

    Fristedt, B; Krylov, N

    2007-01-01

    Filtering and prediction is about observing moving objects when the observations are corrupted by random errors. The main focus is then on filtering out the errors and extracting from the observations the most precise information about the object, which itself may or may not be moving in a somewhat random fashion. Next comes the prediction step where, using information about the past behavior of the object, one tries to predict its future path. The first three chapters of the book deal with discrete probability spaces, random variables, conditioning, Markov chains, and filtering of discrete Markov chains. The next three chapters deal with the more sophisticated notions of conditioning in nondiscrete situations, filtering of continuous-space Markov chains, and of Wiener process. Filtering and prediction of stationary sequences is discussed in the last two chapters. The authors believe that they have succeeded in presenting necessary ideas in an elementary manner without sacrificing the rigor too much. Such rig...

  2. CMAQ predicted concentration files

    Data.gov (United States)

    U.S. Environmental Protection Agency — CMAQ predicted ozone. This dataset is associated with the following publication: Gantt, B., G. Sarwar, J. Xing, H. Simon, D. Schwede, B. Hutzell, R. Mathur, and A....

  3. Methane prediction in collieries

    CSIR Research Space (South Africa)

    Creedy, DP

    1999-06-01

    Full Text Available The primary aim of the project was to assess the current status of research on methane emission prediction for collieries in South Africa in comparison with methods used and advances achieved elsewhere in the world....

  4. Climate Prediction Center - Outlooks

    Science.gov (United States)

    Climate Prediction Center web resources and services: Climate Diagnostics Bulletin (Tropics and Forecast sections).

  5. CMAQ predicted concentration files

    Data.gov (United States)

    U.S. Environmental Protection Agency — model predicted concentrations. This dataset is associated with the following publication: Muñiz-Unamunzaga, M., R. Borge, G. Sarwar, B. Gantt, D. de la Paz, C....

  6. Comparing Spatial Predictions

    KAUST Repository

    Hering, Amanda S.; Genton, Marc G.

    2011-01-01

    Under a general loss function, we develop a hypothesis test to determine whether a significant difference in the spatial predictions produced by two competing models exists on average across the entire spatial domain of interest. The null hypothesis

  7. Genomic prediction using subsampling

    OpenAIRE

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-01-01

    Background Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of subsampling bootstrap Markov chain in genomic prediction. Such method consists of fitting whole-genome regression models by subsampling observations in each rou...

  8. Predicting Online Purchasing Behavior

    OpenAIRE

    W.R BUCKINX; D. VAN DEN POEL

    2003-01-01

    This empirical study investigates the contribution of different types of predictors to the purchasing behaviour at an online store. We use logit modelling to predict whether or not a purchase is made during the next visit to the website using both forward and backward variable-selection techniques, as well as Furnival and Wilson’s global score search algorithm to find the best subset of predictors. We contribute to the literature by using variables from four different categories in predicting...
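
The logit-with-forward-selection approach might be sketched as follows. The synthetic visitor data, the gradient-ascent settings, and the stopping threshold are all assumptions; the study's actual predictor categories and the Furnival–Wilson global score search are not reproduced here.

```python
# Sketch: logistic regression ("logit") with greedy forward variable
# selection. Synthetic data: feature 0 drives purchase, feature 1 is noise.
import math
import random

def fit_logit(X, y, cols, iters=800, lr=0.5):
    """Gradient-ascent logistic fit on selected columns; returns (w, log-lik)."""
    w = [0.0] * (len(cols) + 1)                # intercept + one weight per col
    n = len(y)
    for _ in range(iters):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xi[c] for wj, c in zip(w[1:], cols))
            p = 1.0 / (1.0 + math.exp(-z))
            grad[0] += yi - p
            for j, c in enumerate(cols):
                grad[j + 1] += (yi - p) * xi[c]
        w = [wj + lr * g / n for wj, g in zip(w, grad)]
    ll = 0.0
    for xi, yi in zip(X, y):
        z = w[0] + sum(wj * xi[c] for wj, c in zip(w[1:], cols))
        p = min(max(1.0 / (1.0 + math.exp(-z)), 1e-12), 1 - 1e-12)
        ll += yi * math.log(p) + (1 - yi) * math.log(1 - p)
    return w, ll

def forward_select(X, y, n_features, min_gain=1.0):
    """Greedily add the feature giving the largest log-likelihood gain."""
    selected, remaining = [], list(range(n_features))
    _, base_ll = fit_logit(X, y, selected)     # intercept-only baseline
    while remaining:
        scores = {c: fit_logit(X, y, selected + [c])[1] for c in remaining}
        best = max(scores, key=scores.get)
        if scores[best] - base_ll < min_gain:  # assumed stopping threshold
            break
        selected.append(best)
        remaining.remove(best)
        base_ll = scores[best]
    return selected

rng = random.Random(1)
X = [[rng.random(), rng.random()] for _ in range(300)]
y = [1 if x[0] + 0.1 * rng.gauss(0, 1) > 0.5 else 0 for x in X]
sel = forward_select(X, y, 2)
```

Backward elimination works the same way in reverse: start from all features and drop the one whose removal costs the least likelihood.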

  9. Empirical Flutter Prediction Method.

    Science.gov (United States)

    1988-03-05

    been used in this way to discover species or subspecies of animals, and to discover different types of voter or consumer requiring different persuasions...respect to behavior or performance or response variables. Once this was done, corresponding clusters might be sought among descriptive or predictive or...jump in a response. The first sort of usage does not apply to the flutter prediction problem. Here the types of behavior are the different kinds of

  10. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed

    2016-03-10

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.
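
The disclosed comparison of a current hook load against a linear depth trend can be sketched as below. The readings, units, and alarm margin are invented for illustration.

```python
# Sketch: fit hook load vs. bit depth by ordinary least squares, then flag
# a possible stuck pipe when the current reading exceeds the model's
# normal value by an assumed margin. All numbers are hypothetical.
def fit_line(depths, loads):
    """OLS fit: load = a + b * depth; returns (a, b)."""
    n = len(depths)
    mx, my = sum(depths) / n, sum(loads) / n
    b = sum((x - mx) * (y - my) for x, y in zip(depths, loads)) / \
        sum((x - mx) ** 2 for x in depths)
    return my - b * mx, b

def stuck_pipe_likely(depths, loads, depth_now, load_now, margin=15.0):
    """True when the current hook load sits above the trend by > margin."""
    a, b = fit_line(depths, loads)
    normal = a + b * depth_now
    return load_now > normal + margin

depths = [1000, 1200, 1400, 1600, 1800]   # ft (hypothetical history)
loads = [150, 160, 170, 180, 190]         # klbf (hypothetical history)
print(stuck_pipe_likely(depths, loads, 2000, 240))  # reading well above trend
```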

  11. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed; Alsolami, Fawaz; Chikalov, Igor; Algharbi, Salem; Aboudi, Faisal; Khudiri, Musab

    2016-01-01

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.

  12. Genomic prediction using subsampling.

    Science.gov (United States)

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-03-24

    Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of subsampling bootstrap Markov chain in genomic prediction. Such method consists of fitting whole-genome regression models by subsampling observations in each round of a Markov Chain Monte Carlo. We evaluated the effect of subsampling bootstrap on prediction and computational parameters. Across datasets, we observed an optimal subsampling proportion of observations around 50% with replacement, and around 33% without replacement. Subsampling provided a substantial decrease in computation time, reducing the time to fit the model by half. On average, losses on predictive properties imposed by subsampling were negligible, usually below 1%. For each dataset, an optimal subsampling point that improves prediction properties was observed, but the improvements were also negligible. Combining subsampling with Gibbs sampling is an interesting ensemble algorithm. The investigation indicates that the subsampling bootstrap Markov chain algorithm substantially reduces computational burden associated with model fitting, and it may slightly enhance prediction properties.
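
A loose, bagging-style stand-in for the subsampling idea is sketched below on a toy one-predictor model. Real use couples the subsampling with Gibbs sampling over whole-genome markers, which this sketch does not attempt; the goal is only to show that refitting on ~50% subsamples per round cuts per-round cost while keeping the averaged estimate close to the full-data fit.

```python
# Sketch: at each chain round, refit a simple regression on a ~50%
# subsample drawn with replacement, then average the draws. Toy data only.
import random

def subsample_chain(x, y, rounds=400, frac=0.5, rng=None):
    """Average slope over rounds of subsampled refits; returns (mean, draws)."""
    rng = rng or random.Random(0)
    n = len(x)
    draws = []
    for _ in range(rounds):
        idx = [rng.randrange(n) for _ in range(int(frac * n))]  # with replacement
        sx = [x[i] for i in idx]
        sy = [y[i] for i in idx]
        mx, my = sum(sx) / len(sx), sum(sy) / len(sy)
        denom = sum((v - mx) ** 2 for v in sx) or 1e-12
        b = sum((v - mx) * (w - my) for v, w in zip(sx, sy)) / denom
        draws.append(b)
    return sum(draws) / len(draws), draws

rng = random.Random(7)
x = [rng.random() for _ in range(200)]
y = [2.0 * v + rng.gauss(0, 0.1) for v in x]     # true slope = 2.0
b_hat, draws = subsample_chain(x, y, rng=random.Random(1))
```

Each round touches only half the observations, so per-round work roughly halves, mirroring the computation-time reduction reported in the abstract.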

  13. Deep Visual Attention Prediction

    Science.gov (United States)

    Wang, Wenguan; Shen, Jianbing

    2018-05-01

In this work, we aim to predict human eye fixation in view-free scenes with an end-to-end deep learning architecture. Although Convolutional Neural Networks (CNNs) have made substantial improvements in human attention prediction, CNN-based attention models still need to leverage multi-scale features more efficiently. Our visual attention network is proposed to capture hierarchical saliency information, from deep, coarse layers with global saliency information to shallow, fine layers with local saliency responses. Our model is based on a skip-layer network structure, which predicts human attention from multiple convolutional layers with various receptive fields. Final saliency prediction is achieved via the cooperation of those global and local predictions. Our model is learned in a deep supervision manner, where supervision is directly fed into multi-level layers, instead of the previous approach of providing supervision only at the output layer and propagating it back to earlier layers. Our model thus incorporates multi-level saliency predictions within a single network, which significantly reduces the redundancy of previous approaches that learn multiple network streams with different input scales. Extensive experimental analysis on various challenging benchmark datasets demonstrates that our method yields state-of-the-art performance with competitive inference time.

  14. Comitê de Auditoria versus Conselho Fiscal Adaptado: a visão dos analistas de mercado e dos executivos das empresas que possuem ADRs Audit Committee versus Adapted Fiscal Council: the point of view of market analysts and executives of companies with ADRs

    Directory of Open Access Journals (Sweden)

    Fernanda Furuta

    2010-08-01

Full Text Available This study aims to obtain the opinions of executives of companies that operate in Brazil and trade their securities on the North American market, and of market analysts, on the formation of the Audit Committee or the adapted Fiscal Council. To that end, questionnaires were applied and interviews conducted. Most executives of companies that formed an Audit Committee indicated that the level of corporate governance was one of the factors that most influenced the decision to form one body or the other. On the other hand, most executives of companies that formed an adapted Fiscal Council indicated, in addition to the level of corporate governance, the fact of being audited by one of the Big4 and the company's classification by aggregate market value as factors that influenced their decisions. There was no consensus as to whether the Fiscal Council is more adaptable than the Audit Committee to the Brazilian business environment, whether the functions of the two bodies are distinct, and whether the costs associated with forming the Audit Committee are relevant. It can therefore be concluded that, in some respects, the perceptions of market analysts and company executives differ considerably.

  15. A hybrid clustering and classification approach for predicting crash injury severity on rural roads.

    Science.gov (United States)

    Hasheminejad, Seyed Hessam-Allah; Zahedi, Mohsen; Hasheminejad, Seyed Mohammad Hossein

    2018-03-01

As a threat to transportation systems, traffic crashes have a wide range of social consequences for governments. Traffic crashes are increasing in developing countries, and Iran, as a developing country, is not immune from this risk. Several studies in the literature predict traffic crash severity based on artificial neural networks (ANNs), support vector machines and decision trees. This paper investigates the crash injury severity of rural roads by using a hybrid clustering and classification approach to compare the performance of classification algorithms before and after applying the clustering. A novel rule-based genetic algorithm (GA) is proposed to predict crash injury severity, which is evaluated by performance criteria in comparison with classification algorithms such as ANNs. The results obtained from an analysis of 13,673 crashes (5,600 property damage, 778 fatal crashes, 4,690 slight injuries and 2,605 severe injuries) on rural roads in Tehran Province of Iran during 2011-2013 revealed that the proposed GA method outperforms the other classification algorithms on classification metrics such as precision (86%), recall (88%) and accuracy (87%). Moreover, the proposed GA method has the highest level of interpretability: it is easy to understand and provides feedback to analysts.

  16. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik

    2005-01-01

This document lists candidate prediction models for Work Package 3 (WP3) of the PSO project ``Intelligent wind power prediction systems'' (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines the possibilities w.r.t. different numerical weather predictions actually available to the project.

  17. Transionospheric propagation predictions

    Science.gov (United States)

    Klobucher, J. A.; Basu, S.; Basu, S.; Bernhardt, P. A.; Davies, K.; Donatelli, D. E.; Fremouw, E. J.; Goodman, J. M.; Hartmann, G. K.; Leitinger, R.

    1979-01-01

    The current status and future prospects of the capability to make transionospheric propagation predictions are addressed, highlighting the effects of the ionized media, which dominate for frequencies below 1 to 3 GHz, depending upon the state of the ionosphere and the elevation angle through the Earth-space path. The primary concerns are the predictions of time delay of signal modulation (group path delay) and of radio wave scintillation. Progress in these areas is strongly tied to knowledge of variable structures in the ionosphere ranging from the large scale (thousands of kilometers in horizontal extent) to the fine scale (kilometer size). Ionospheric variability and the relative importance of various mechanisms responsible for the time histories observed in total electron content (TEC), proportional to signal group delay, and in irregularity formation are discussed in terms of capability to make both short and long term predictions. The data base upon which predictions are made is examined for its adequacy, and the prospects for prediction improvements by more theoretical studies as well as by increasing the available statistical data base are examined.

  18. Predictable grammatical constructions

    DEFF Research Database (Denmark)

    Lucas, Sandra

    2015-01-01

My aim in this paper is to provide evidence from diachronic linguistics for the view that some predictable units are entrenched in grammar and consequently in human cognition, in a way that makes them functionally and structurally equal to nonpredictable grammatical units, suggesting that these predictable units should be considered grammatical constructions on a par with the nonpredictable constructions. Frequency has usually been seen as the only possible argument speaking in favor of viewing some formally and semantically fully predictable units as grammatical constructions. However, this paper ... semantically and formally predictable. Despite this difference, [méllo INF], like the other future periphrases, seems to be highly entrenched in the cognition (and grammar) of Early Medieval Greek language users, and consequently a grammatical construction. The syntactic evidence speaking in favor of [méllo ...

  19. Essays on Earnings Predictability

    DEFF Research Database (Denmark)

    Bruun, Mark

This dissertation addresses the prediction of corporate earnings. The thesis aims to examine whether the degree of precision in earnings forecasts can be increased by basing them on historical financial ratios. Furthermore, the intent of the dissertation is to analyze whether accounting standards ... forecasts are not more accurate than the simpler forecasts based on a historical time series of earnings. Secondly, the dissertation shows how accounting standards affect analysts’ earnings predictions. Accounting conservatism contributes to a more volatile earnings process, which lowers the accuracy of analysts’ earnings forecasts. Furthermore, the dissertation shows how the stock market’s reaction to the disclosure of information about corporate earnings depends on how well corporate earnings can be predicted. The dissertation indicates that the stock market’s reaction to the disclosure of earnings ...

  20. Pulverized coal devolatilization prediction

    International Nuclear Information System (INIS)

    Rojas, Andres F; Barraza, Juan M

    2008-01-01

The aim of this study was to predict the devolatilization of two bituminous coals at a low heating rate (50 °C/min) with the FG-DVC program (Functional Group Depolymerization, Vaporization and Crosslinking), and to compare the devolatilization profiles predicted by FG-DVC with those obtained in a thermogravimetric analyzer. The volatile release at a high heating rate (10^4 K/s) in a drop-tube furnace was also studied. The formation-rate profiles of tar, methane, carbon monoxide and carbon dioxide, and the elemental distribution of hydrogen, oxygen, nitrogen and sulphur in the devolatilization products, were obtained with the FG-DVC program at the low heating rate; the volatile release and R factor at the high heating rate were calculated. It was found that the program predicts bituminous coal devolatilization at the low heating rate; at the high heating rate, a volatile release of around 30% was obtained.

  1. Predicting Ideological Prejudice.

    Science.gov (United States)

    Brandt, Mark J

    2017-06-01

    A major shortcoming of current models of ideological prejudice is that although they can anticipate the direction of the association between participants' ideology and their prejudice against a range of target groups, they cannot predict the size of this association. I developed and tested models that can make specific size predictions for this association. A quantitative model that used the perceived ideology of the target group as the primary predictor of the ideology-prejudice relationship was developed with a representative sample of Americans ( N = 4,940) and tested against models using the perceived status of and choice to belong to the target group as predictors. In four studies (total N = 2,093), ideology-prejudice associations were estimated, and these observed estimates were compared with the models' predictions. The model that was based only on perceived ideology was the most parsimonious with the smallest errors.

  2. Inverse and Predictive Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Syracuse, Ellen Marie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-27

The LANL Seismo-Acoustic team has a strong capability in developing data-driven models that accurately predict a variety of observations. These models range from the simple – one-dimensional models that are constrained by a single dataset and can be used for quick and efficient predictions – to the complex – multidimensional models that are constrained by several types of data and result in more accurate predictions. While team members typically build models of geophysical characteristics of the Earth and source distributions at scales of 1 to 1000s of km, the techniques used are applicable to other types of physical characteristics at an even greater range of scales. The following cases provide a snapshot of some of the modeling work done by the Seismo-Acoustic team at LANL.

  3. Tide Predictions, California, 2014, NOAA

    Data.gov (United States)

    U.S. Environmental Protection Agency — The predictions from the web based NOAA Tide Predictions are based upon the latest information available as of the date of the user's request. Tide predictions...

  4. Predicting tile drainage discharge

    DEFF Research Database (Denmark)

    Iversen, Bo Vangsø; Kjærgaard, Charlotte; Petersen, Rasmus Jes

... used in the analysis. For the dynamic modelling, a simple linear reservoir model was used, where different outlets in the model represented tile drain as well as groundwater discharge outputs. This modelling was based on daily measured tile drain discharge values. The statistical predictive model was based on a polynomial regression predicting yearly tile drain discharge values using site-specific parameters such as soil type, catchment topography, etc. as predictors. Values of calibrated model parameters from the dynamic modelling were compared to the same site-specific parameters ...

  5. Linguistic Structure Prediction

    CERN Document Server

    Smith, Noah A

    2011-01-01

A major part of natural language processing now depends on the use of text data to build linguistic analyzers. We consider statistical, computational approaches to modeling linguistic structure. We seek to unify across many approaches and many kinds of linguistic structures. Assuming a basic understanding of natural language processing and/or machine learning, we seek to bridge the gap between the two fields. Approaches to decoding (i.e., carrying out linguistic structure prediction) and supervised and unsupervised learning of models that predict discrete structures as outputs are the focus. ...

  6. Predicting Anthracycline Benefit

    DEFF Research Database (Denmark)

    Bartlett, John M S; McConkey, Christopher C; Munro, Alison F

    2015-01-01

PURPOSE: Evidence supporting the clinical utility of predictive biomarkers of anthracycline activity is weak, with a recent meta-analysis failing to provide strong evidence for either HER2 or TOP2A. Having previously shown that duplication of chromosome 17 pericentromeric alpha satellite as measured ...

  7. Prediction of Antibody Epitopes

    DEFF Research Database (Denmark)

    Nielsen, Morten; Marcatili, Paolo

    2015-01-01

Antibodies recognize their cognate antigens in a precise and effective way. In order to do so, they target regions of the antigenic molecules that have specific features such as large exposed areas, presence of charged or polar atoms, specific secondary structure elements, and lack of similarity to self-proteins. Given the sequence or the structure of a protein of interest, several methods exploit such features to predict the residues that are more likely to be recognized by an immunoglobulin. Here, we present two methods (BepiPred and DiscoTope) to predict linear and discontinuous antibody ...

  8. Basis of predictive mycology.

    Science.gov (United States)

    Dantigny, Philippe; Guilmart, Audrey; Bensoussan, Maurice

    2005-04-15

For over 20 years, predictive microbiology has focused on food-pathogenic bacteria. Few studies have concerned modelling fungal development. On the one hand, most food mycologists are not familiar with modelling techniques; on the other hand, people involved in modelling are developing tools dedicated to bacteria. Therefore, there is a tendency to extend the use of models that were developed for bacteria to moulds. However, some mould specificities should be taken into account. The use of specific models for predicting germination and growth of fungi was advocated previously []. This paper provides a short review of fungal modelling studies.

  9. Dopamine reward prediction error coding

    OpenAIRE

    Schultz, Wolfram

    2016-01-01

Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less ...

  10. Steering smog prediction

    NARCIS (Netherlands)

    R. van Liere (Robert); J.J. van Wijk (Jack)

    1997-01-01

The use of computational steering for smog prediction is described. This application is representative of many underlying issues found in steering high-performance applications: high computing times, large data sets, and many different input parameters. After a short description of the ...

  11. Predicting Sustainable Work Behavior

    DEFF Research Database (Denmark)

    Hald, Kim Sundtoft

    2013-01-01

    Sustainable work behavior is an important issue for operations managers – it has implications for most outcomes of OM. This research explores the antecedents of sustainable work behavior. It revisits and extends the sociotechnical model developed by Brown et al. (2000) on predicting safe behavior...

  12. Gate valve performance prediction

    International Nuclear Information System (INIS)

    Harrison, D.H.; Damerell, P.S.; Wang, J.K.; Kalsi, M.S.; Wolfe, K.J.

    1994-01-01

    The Electric Power Research Institute is carrying out a program to improve the performance prediction methods for motor-operated valves. As part of this program, an analytical method to predict the stem thrust required to stroke a gate valve has been developed and has been assessed against data from gate valve tests. The method accounts for the loads applied to the disc by fluid flow and for the detailed mechanical interaction of the stem, disc, guides, and seats. To support development of the method, two separate-effects test programs were carried out. One test program determined friction coefficients for contacts between gate valve parts by using material specimens in controlled environments. The other test program investigated the interaction of the stem, disc, guides, and seat using a special fixture with full-sized gate valve parts. The method has been assessed against flow-loop and in-plant test data. These tests include valve sizes from 3 to 18 in. and cover a considerable range of flow, temperature, and differential pressure. Stem thrust predictions for the method bound measured results. In some cases, the bounding predictions are substantially higher than the stem loads required for valve operation, as a result of the bounding nature of the friction coefficients in the method

  13. Prediction method abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-12-31

This conference was held December 4-8, 1994, in Asilomar, California. The purpose of this meeting was to provide a forum for exchange of state-of-the-art information concerning the prediction of protein structure. Attention is focused on the following: comparative modeling; sequence-to-fold assignment; and ab initio folding.

  14. Predicting Intrinsic Motivation

    Science.gov (United States)

    Martens, Rob; Kirschner, Paul A.

    2004-01-01

Intrinsic motivation can be predicted from participants' perceptions of the social environment and the task environment (Ryan & Deci, 2000) in terms of control, relatedness and competence. To determine the degree of independence of these factors, 251 students in higher vocational education (physiotherapy and hotel management) indicated the…

  15. Predicting visibility of aircraft.

    Directory of Open Access Journals (Sweden)

    Andrew Watson

Full Text Available Visual detection of aircraft by human observers is an important element of aviation safety. To assess and ensure safety, it would be useful to be able to predict the visibility, to a human observer, of an aircraft of specified size, shape, distance, and coloration. Examples include assuring safe separation among aircraft and between aircraft and unmanned vehicles, design of airport control towers, and efforts to enhance or suppress the visibility of military and rescue vehicles. We have recently developed a simple metric of pattern visibility, the Spatial Standard Observer (SSO). In this report we examine whether the SSO can predict the visibility of simulated aircraft images. We constructed a set of aircraft images from three-dimensional computer graphic models, and measured the luminance contrast threshold for each image from three human observers. The data were well predicted by the SSO. Finally, we show how to use the SSO to predict visibility range for aircraft of arbitrary size, shape, distance, and coloration.

  16. Climate Prediction Center

    Science.gov (United States)


  17. Predicting Commissary Store Success

    Science.gov (United States)

    2014-12-01

... stores or if it is possible to predict that success. Multiple studies of private commercial grocery consumer preferences, habits and demographics have ... appropriate number of competitors due to the nature of international cultures and consumer preferences. 2. Missing Data. Four of the remaining stores ...

  18. Predicting Job Satisfaction.

    Science.gov (United States)

    Blai, Boris, Jr.

    Psychological theories about human motivation and accommodation to environment can be used to achieve a better understanding of the human factors that function in the work environment. Maslow's theory of human motivational behavior provided a theoretical framework for an empirically-derived method to predict job satisfaction and explore the…

  19. Ocean Prediction Center

    Science.gov (United States)


  20. Predicting Reasoning from Memory

    Science.gov (United States)

    Heit, Evan; Hayes, Brett K.

    2011-01-01

    In an effort to assess the relations between reasoning and memory, in 8 experiments, the authors examined how well responses on an inductive reasoning task are predicted from responses on a recognition memory task for the same picture stimuli. Across several experimental manipulations, such as varying study time, presentation frequency, and the…

  1. Predicting coronary heart disease

    DEFF Research Database (Denmark)

    Sillesen, Henrik; Fuster, Valentin

    2012-01-01

Atherosclerosis is the leading cause of death and disabling disease. Whereas risk factors are well known and constitute therapeutic targets, they are not useful for prediction of risk of future myocardial infarction, stroke, or death. Therefore, methods to identify atherosclerosis itself have been ...

  2. ANTHROPOMETRIC PREDICTIVE EQUATIONS FOR ...

    African Journals Online (AJOL)

Keywords: Anthropometry, Predictive Equations, Percentage Body Fat, Nigerian Women, Bioelectric Impedance ... such as Asians and Indians (Pranav et al., 2009) ... a sample size (n) of at least 30 is adjudged as sufficient for the ... of people, gender and age (Vogel et al., 1984).

  3. Predicting Pilot Retention

    Science.gov (United States)

    2012-06-15

... over the last 20 years. Airbus predicted that these trends would continue as emerging economies, especially in Asia, were creating a fast-growing ... US economy, pay differential and hiring by the major airlines contributed most to the decision to separate from the Air Force (Fullerton, 2003: 354).

  4. Predicting ideological prejudice

    NARCIS (Netherlands)

    Brandt, M.J.

    2018-01-01

    A major shortcoming of current models of ideological prejudice is that although they can anticipate the direction of the association between participants’ ideology and their prejudice against a range of target groups, they cannot predict the size of this association. I developed and tested models

  5. Validation predictions of a 13 m/s cross-wind fire for Fuego and the University of Waterloo dataset.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alexander L.; Evans, Gregory Herbert (Sandia National Laboratories, Livermore, CA); Gill, Walter; Jarboe, Daniel T. (Sandia National Laboratories, Livermore, CA)

    2008-03-01

    Detailed herein are the results of a validation comparison. The experiment involved a 2 meter diameter liquid pool of Jet-A fuel in a 13 m/s crosswind. The scenario included a large cylindrical blocking object just down-stream of the fire. It also included seven smaller calorimeters and extensive instrumentation. The experiments were simulated with Fuego. The model included several conduction regions to model the response of the calorimeters, the floor, and the large cylindrical blocking object. A blind comparison was used to compare the simulation predictions with the experimental data. The more upstream data compared very well with the simulation predictions. The more downstream data did not compare very well with the simulation predictions. Further investigation suggests that features omitted from the original model contributed to the discrepancies. Observations are made with respect to the scenario that are aimed at helping an analyst approach a comparable problem in a way that may help improve the potential for quantitative accuracy.

  6. Dopamine reward prediction error coding.

    Science.gov (United States)

    Schultz, Wolfram

    2016-03-01

Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
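The prediction-error logic in the opening sentences can be written as a standard Rescorla-Wagner style update; this is a textbook formalization, not code from the article, and the 0.1 learning rate is arbitrary.

```python
def update_value(value, reward, alpha=0.1):
    """One learning step driven by a reward prediction error:
    positive when more reward arrives than predicted, negative when less."""
    rpe = reward - value
    return value + alpha * rpe, rpe

# Repeated delivery of the same reward: the prediction converges on it,
# and the prediction error decays toward zero (a "fully predicted" reward).
value = 0.0
for _ in range(100):
    value, rpe = update_value(value, reward=1.0)
print(round(value, 2))   # 1.0
print(abs(rpe) < 0.01)   # True
```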

  7. Urban pluvial flood prediction

    DEFF Research Database (Denmark)

    Thorndahl, Søren Liedtke; Nielsen, Jesper Ellerbæk; Jensen, David Getreuer

    2016-01-01

Flooding produced by high-intensive local rainfall and drainage system capacity exceedance can have severe impacts in cities. In order to prepare cities for these types of flood events – especially in the future climate – it is valuable to be able to simulate these events numerically, both historically and in real-time. There is a rather untested potential in real-time prediction of urban floods. In this paper radar data observations with different spatial and temporal resolution, radar nowcasts of 0–2 h lead time, and numerical weather models with lead times up to 24 h are used as inputs to an integrated flood and drainage systems model in order to investigate the relative difference between different inputs in predicting future floods. The system is tested on the small town of Lystrup in Denmark, which was flooded in 2012 and 2014. Results show it is possible to generate detailed flood maps ...

  8. Predicting Bankruptcy in Pakistan

    Directory of Open Access Journals (Sweden)

    Abdul RASHID

    2011-09-01

Full Text Available This paper aims to identify the financial ratios that are most significant in bankruptcy prediction for the non-financial sector of Pakistan, based on a sample of companies which became bankrupt over the period 1996-2006. Twenty-four financial ratios covering four important financial attributes, namely profitability, liquidity, leverage, and turnover ratios, were examined for a five-year period prior to bankruptcy. The discriminant analysis produced a parsimonious model of three variables, viz. sales to total assets, EBIT to current liabilities, and cash flow ratio. Our estimates provide evidence that firms with a Z-value below zero fall into the “bankrupt” category, whereas firms with a Z-value above zero fall into the “non-bankrupt” category. The model achieved 76.9% prediction accuracy when applied to forecast bankruptcies on the underlying sample.
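The classification rule in the abstract (Z below zero means "bankrupt", above zero "non-bankrupt") is easy to sketch over the paper's three ratios. Note the discriminant weights and intercept below are placeholders, since the abstract does not report the estimated coefficients.

```python
def z_value(sales_to_assets, ebit_to_cl, cash_flow_ratio,
            weights=(1.0, 1.0, 1.0), intercept=0.0):
    """Linear discriminant score over the three retained ratios.
    The weights and intercept are illustrative, not the paper's estimates."""
    w1, w2, w3 = weights
    return intercept + w1 * sales_to_assets + w2 * ebit_to_cl + w3 * cash_flow_ratio

def classify(z):
    """Apply the paper's cutoff: negative Z -> bankrupt, positive -> non-bankrupt."""
    return "bankrupt" if z < 0 else "non-bankrupt"

print(classify(z_value(0.8, 0.5, 0.2)))    # non-bankrupt
print(classify(z_value(-0.4, -0.6, 0.1)))  # bankrupt
```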

  9. Predicting Lotto Numbers

    DEFF Research Database (Denmark)

    Jørgensen, Claus Bjørn; Suetens, Sigrid; Tyran, Jean-Robert

We investigate the “law of small numbers” using a unique panel data set on lotto gambling. Because we can track individual players over time, we can measure how they react to outcomes of recent lotto drawings. We can therefore test whether they behave as if they believe they can predict lotto numbers based on recent drawings. While most players pick the same set of numbers week after week without regard to numbers drawn or anything else, we find that those who do change act on average in the way predicted by the law of small numbers as formalized in recent behavioral theory. In particular, on average they move away from numbers that have recently been drawn, as suggested by the “gambler’s fallacy”, and move toward numbers that are on streak, i.e. have been drawn several weeks in a row, consistent with the “hot hand fallacy”.

  10. Comparing Spatial Predictions

    KAUST Repository

    Hering, Amanda S.

    2011-11-01

Under a general loss function, we develop a hypothesis test to determine whether a significant difference in the spatial predictions produced by two competing models exists on average across the entire spatial domain of interest. The null hypothesis is that of no difference, and a spatial loss differential is created based on the observed data, the two sets of predictions, and the loss function chosen by the researcher. The test assumes only isotropy and short-range spatial dependence of the loss differential but does allow it to be non-Gaussian, non-zero-mean, and spatially correlated. Constant and nonconstant spatial trends in the loss differential are treated in two separate cases. Monte Carlo simulations illustrate the size and power properties of this test, and an example based on daily average wind speeds in Oklahoma is used for illustration. Supplemental results are available online. © 2011 American Statistical Association and the American Society for Quality.
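The loss-differential construction can be sketched as follows. This toy version uses squared-error loss and a plain z statistic that assumes independent sites; the actual test corrects the variance for spatial correlation, which this sketch deliberately omits. All data are made up.

```python
import math

def loss_differential_test(obs, pred_a, pred_b):
    """Mean loss differential d_i = L(obs_i, a_i) - L(obs_i, b_i), with a
    naive z statistic (no spatial-correlation correction)."""
    d = [(o - a) ** 2 - (o - b) ** 2 for o, a, b in zip(obs, pred_a, pred_b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)
    z = mean / math.sqrt(var / n)
    return mean, z

obs    = [1.0, 2.0, 3.0, 4.0]
pred_a = [1.1, 2.1, 2.9, 4.2]  # model A: small errors
pred_b = [2.0, 3.0, 2.0, 5.0]  # model B: larger errors
mean, z = loss_differential_test(obs, pred_a, pred_b)
print(mean < 0 and z < 0)  # negative differential favors model A -> True
```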

  11. Chaos detection and predictability

    CERN Document Server

    Gottwald, Georg; Laskar, Jacques

    2016-01-01

    Distinguishing chaoticity from regularity in deterministic dynamical systems and specifying the subspace of the phase space in which instabilities are expected to occur is of utmost importance in areas as disparate as astronomy, particle physics and climate dynamics. To address these issues there exists a plethora of methods for chaos detection and predictability. The most commonly employed technique for investigating chaotic dynamics, i.e. the computation of Lyapunov exponents, however, may suffer a number of problems and drawbacks, for example when applied to noisy experimental data. In the last two decades, several novel methods have been developed for the fast and reliable determination of the regular or chaotic nature of orbits, aimed at overcoming the shortcomings of more traditional techniques. This set of lecture notes and tutorial reviews serves as an introduction to and overview of modern chaos detection and predictability techniques for graduate students and non-specialists. The book cover...

  12. Time-predictable architectures

    CERN Document Server

    Rochange, Christine; Uhrig, Sascha

    2014-01-01

    Building computers that can be used to design embedded real-time systems is the subject of this title. Real-time embedded software requires increasingly higher performance. The authors therefore consider processors that implement advanced mechanisms such as pipelining, out-of-order execution, branch prediction, cache memories, multi-threading, multicore architectures, etc. The authors of this book investigate the time predictability of such schemes.

  13. Cultural Resource Predictive Modeling

    Science.gov (United States)

    2017-10-01

    CR = cultural resource; CRM = cultural resource management; CRPM = Cultural Resource Predictive Modeling; DoD = Department of Defense; ESTCP = Environmental... To meet cultural resource management (CRM) legal obligations under NEPA and the NHPA, military installations need to demonstrate that CRM decisions are based on objective... the maxim “one size does not fit all,” and demonstrate that DoD installations have many different CRM needs that can and should be met through a variety

  14. Predictive Game Theory

    Science.gov (United States)

    Wolpert, David H.

    2005-01-01

    Probability theory governs the outcome of a game; there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy one must use a loss function (external to the game's players). This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  15. Predicting appointment breaking.

    Science.gov (United States)

    Bean, A G; Talaga, J

    1995-01-01

    The goal of physician referral services is to schedule appointments, but if too many patients fail to show up, the value of the service will be compromised. The authors found that appointment breaking can be predicted by the number of days to the scheduled appointment, the doctor's specialty, and the patient's age and gender. They also offer specific suggestions for modifying the marketing mix to reduce the incidence of no-shows.
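As an illustration of this kind of prediction, a logistic regression on synthetic appointment data is sketched below. The predictors (days to the scheduled appointment, patient age), the effect sizes, and the data are all invented for the example; this is not the authors' fitted model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
days_ahead = rng.uniform(0, 60, n)   # days until the scheduled appointment
age = rng.uniform(18, 80, n)
# Assumed data-generating process: longer lead times raise no-show odds,
# older patients are more likely to show up (signs chosen for illustration).
logit = -1.0 + 0.05 * days_ahead - 0.02 * age
no_show = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Fit logistic regression by gradient descent on standardized predictors.
X = np.column_stack([
    np.ones(n),
    (days_ahead - days_ahead.mean()) / days_ahead.std(),
    (age - age.mean()) / age.std(),
])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 1.0 * X.T @ (p - no_show) / n   # cross-entropy gradient step
# w[1] > 0 and w[2] < 0 recover the assumed directions of the two effects.
```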

  16. Adjusting estimative prediction limits

    OpenAIRE

    Masao Ueki; Kaoru Fueda

    2007-01-01

    This note presents a direct adjustment of the estimative prediction limit to reduce the coverage error from a target value to third-order accuracy. The adjustment is asymptotically equivalent to those of Barndorff-Nielsen & Cox (1994, 1996) and Vidoni (1998). It has a simpler form with a plug-in estimator of the coverage probability of the estimative limit at the target value. Copyright 2007, Oxford University Press.

  17. Space Weather Prediction

    Science.gov (United States)

    2014-10-31

    prominence eruptions and the ensuing coronal mass ejections. The ProMag is a spectro-polarimeter, consisting of a dual-beam polarization modulation unit...feeding a visible camera and an infrared camera. The instrument is designed to measure magnetic fields in solar prominences by simultaneous spectro ...as a result of coronal hole regions, we expect to improve UV predictions by incorporating an estimate of the Earth-side coronal hole regions.

  18. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty.
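One standard way to combine independent instrument error sources into a single uncertainty estimate is the root-sum-square method; the sketch below uses a hypothetical error budget (the component names and values are invented, not taken from this report).

```python
import math

def combined_uncertainty(components):
    """Root-sum-square combination of independent error components."""
    return math.sqrt(sum(c ** 2 for c in components))

# Hypothetical pretest error budget for a temperature channel (values in deg C):
sensor_cal = 0.5      # sensor calibration error
signal_cond = 0.3     # signal-conditioning error
acquisition = 0.2     # data-acquisition / quantization error
u = combined_uncertainty([sensor_cal, signal_cond, acquisition])
# u = sqrt(0.5**2 + 0.3**2 + 0.2**2) = sqrt(0.38) ≈ 0.62 deg C
```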

  19. Predictive Systems Toxicology

    KAUST Repository

    Kiani, Narsis A.; Shang, Ming-Mei; Zenil, Hector; Tegner, Jesper

    2018-01-01

    In this review we address to what extent computational techniques can augment our ability to predict toxicity. The first section provides a brief history of empirical observations on toxicity dating back to the dawn of Sumerian civilization. Interestingly, the concept of dose emerged very early on, leading up to the modern emphasis on kinetic properties, which in turn encodes the insight that toxicity is not solely a property of a compound but instead depends on the interaction with the host organism. The next logical step is the current conception of evaluating drugs from a personalized medicine point-of-view. We review recent work on integrating what could be referred to as classical pharmacokinetic analysis with emerging systems biology approaches incorporating multiple omics data. These systems approaches employ advanced statistical analytical data processing complemented with machine learning techniques and use both pharmacokinetic and omics data. We find that such integrated approaches not only provide improved predictions of toxicity but also enable mechanistic interpretations of the molecular mechanisms underpinning toxicity and drug resistance. We conclude the chapter by discussing some of the main challenges, such as how to balance the inherent tension between the predictive capacity of models, which in practice amounts to constraining the number of features in the models versus allowing for rich mechanistic interpretability, i.e. equipping models with numerous molecular features. This challenge also requires patient-specific predictions on toxicity, which in turn requires proper stratification of patients as regards how they respond, with or without adverse toxic effects. In summary, the transformation of the ancient concept of dose is currently successfully operationalized using rich integrative data encoded in patient-specific models.

  20. Predictive systems ecology

    OpenAIRE

    Evans, Matthew R.; Bithell, Mike; Cornell, Stephen J.; Dall, Sasha R. X.; D?az, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J.; Lewis, Simon L.; Mace, Georgina M.; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim

    2013-01-01

    Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of ...

  1. UXO Burial Prediction Fidelity

    Science.gov (United States)

    2017-07-01

    models to capture detailed projectile dynamics during the early phases of water entry are wasted with regard to sediment-penetration depth prediction...ordnance (UXO) migrates and becomes exposed over time in response to water and sediment motion. Such models need initial sediment penetration estimates...munition’s initial penetration depth into the sediment, the velocity of water at the water-sediment boundary (i.e., the bottom water velocity

  3. Predictive Surface Complexation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Sverjensky, Dimitri A. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Earth and Planetary Sciences

    2016-11-29

    Surface complexation plays an important role in the equilibria and kinetics of processes controlling the compositions of soilwaters and groundwaters, the fate of contaminants in groundwaters, and the subsurface storage of CO2 and nuclear waste. Over the last several decades, many dozens of individual experimental studies have addressed aspects of surface complexation that have contributed to an increased understanding of its role in natural systems. However, there has been no previous attempt to develop a model of surface complexation that can be used to link all the experimental studies in order to place them on a predictive basis. Overall, my research has successfully integrated the results of the work of many experimentalists published over several decades. For the first time in studies of the geochemistry of the mineral-water interface, a practical predictive capability for modeling has become available. The predictive correlations developed in my research now enable extrapolations of experimental studies to provide estimates of surface chemistry for systems not yet studied experimentally and for natural and anthropogenically perturbed systems.

  4. Predicting Human Cooperation.

    Directory of Open Access Journals (Sweden)

    John J Nay

    Full Text Available The Prisoner's Dilemma has been a subject of extensive research due to its importance in understanding the ever-present tension between individual self-interest and social benefit. A strictly dominant strategy in a Prisoner's Dilemma (defection, when played by both players, is mutually harmful. Repetition of the Prisoner's Dilemma can give rise to cooperation as an equilibrium, but defection is as well, and this ambiguity is difficult to resolve. The numerous behavioral experiments investigating the Prisoner's Dilemma highlight that players often cooperate, but the level of cooperation varies significantly with the specifics of the experimental predicament. We present the first computational model of human behavior in repeated Prisoner's Dilemma games that unifies the diversity of experimental observations in a systematic and quantitatively reliable manner. Our model relies on data we integrated from many experiments, comprising 168,386 individual decisions. The model is composed of two pieces: the first predicts the first-period action using solely the structural game parameters, while the second predicts dynamic actions using both game parameters and history of play. Our model is successful not merely at fitting the data, but in predicting behavior at multiple scales in experimental designs not used for calibration, using only information about the game structure. We demonstrate the power of our approach through a simulation analysis revealing how to best promote human cooperation.

  5. Predicting big bang deuterium

    Energy Technology Data Exchange (ETDEWEB)

    Hata, N.; Scherrer, R.J.; Steigman, G.; Thomas, D.; Walker, T.P. [Department of Physics, Ohio State University, Columbus, Ohio 43210 (United States)

    1996-02-01

    We present new upper and lower bounds to the primordial abundances of deuterium and ³He based on observational data from the solar system and the interstellar medium. Independent of any model for the primordial production of the elements we find (at the 95% C.L.): 1.5×10⁻⁵ ≤ (D/H)_P ≤ 10.0×10⁻⁵ and (³He/H)_P ≤ 2.6×10⁻⁵. When combined with the predictions of standard big bang nucleosynthesis, these constraints lead to a 95% C.L. bound on the primordial abundance of deuterium: (D/H)_best = (3.5 +2.7/−1.8)×10⁻⁵. Measurements of deuterium absorption in the spectra of high-redshift QSOs will directly test this prediction. The implications of this prediction for the primordial abundances of ⁴He and ⁷Li are discussed, as well as those for the universal density of baryons. © 1996 The American Astronomical Society.

  6. The Wealth of Nations and the Poverty of Analysts

    Science.gov (United States)

    Horowitz, Irving Louis

    2012-01-01

    Now that Colonel Muammar Gaddafi is dead and his forty-two years as despotic ruler of Libya and fomenter of international disorder have come to a permanent halt, it is a good time for governments--both in and beyond the NATO alliance--to review accommodations and agreements made with his regime. It is also time for the academic social policy…

  7. Vulnerability Analyst’s Guide to Geometric Target Description

    Science.gov (United States)

    1992-09-01

    5.3 Surrogacy ... 5.4 Specialized Targets ... 5.5 ... commercially available documents for other large-scale software. The documentation is not a BRL technical report, but can be obtained by contacting ...

  8. Cost Benefit Analysis of LH2 Pad B

    Science.gov (United States)

    Mott, Brittany

    2013-01-01

    This analysis is used to evaluate, from a cost and benefit perspective, potential outcomes when replacing the pressurization switches and the pressurization system to meet the needs of the LH2 storage system at Pad B. This also includes alternatives, tangible and intangible benefits, and the results of the analysis.

  9. A Methodological Approach for Training Analysts of Small Business Problems.

    Science.gov (United States)

    Mackness, J. R.

    1986-01-01

    Steps in a small business analysis are discussed: understand how company activities interact internally and with markets and suppliers; know the relative importance of controllable management variables; understand the social atmosphere within the company; analyze the operations of the company; define main problem areas; identify possible actions…

  10. Recursive engagement: the public as data analysts and outreach creators

    CERN Document Server

    Kalderon, Charles William; The ATLAS collaboration

    2018-01-01

    Two recent outreach projects are making use of public communities to enhance and build upon the first phases set up by physicists. “HiggsHunters” asked the public to search for displaced vertices in event displays, during which time a pool of trusted members arose in the associated discussion. The data this generated is now being analysed by schoolchildren, through which they can both learn the principles of scientific research and contribute directly to it. Also involving schoolchildren is “ATLAScraft”, a recreation of ATLAS and the wider CERN complex in Minecraft. Here, the basic layout was provided, but students subsequently researched and created their own mini-games to explain various aspects of the LHC and detector physics to others.

  11. James Moir (1874–1929) Pioneering Chemical Analyst in South ...

    African Journals Online (AJOL)

    NICO

    This was a starch KI paper moistened with a 50 % glycerine solution. A brown ... The NO produced in this side reaction is oxidized to NO2 and by cancelling one ..... tion either with FeCl3 or CrO3, evidently due to oxidation. Also, when treated ...

  12. Erythropoiesis stimulating agents and techniques: a challenge for doping analysts.

    Science.gov (United States)

    Jelkmann, W

    2009-01-01

    Recombinant human erythropoietin (rHuEPO) engineered in Chinese hamster ovary (CHO) cell cultures (Epoetin alfa and Epoetin beta) and its hyperglycosylated analogue Darbepoetin alfa are known to be misused by athletes. The drugs can be detected by isoelectric focusing (IEF) and immunoblotting of urine samples, because "EPO" is in reality a mixture of isoforms and the N-glycans of the recombinant products differ from those of the endogenous hormone. However, there is a plethora of novel erythropoiesis stimulating agents (ESAs). Since the originator Epoetins alfa and beta are no longer protected by patent in the European Union, rHuEPO biosimilars have entered the market. In addition, several companies in Asia, Africa and Latin America produce copied rHuEPOs for clinical purposes. While the amino acid sequence of all Epoetins is identical, the structure of their glycans differs depending on the mode of production. Some products contain more acidic and others more basic EPO isoforms. Epoetin delta is special, as it was engineered by homologous recombination in human fibrosarcoma cells (HT-1080), thus lacking N-glycolylneuraminic acid like native human EPO. ESAs under development include EPO fusion proteins, synthetic erythropoiesis stimulating protein (SEP) and peptidic (Hematide, CNTO 528) as well as non-peptidic EPO mimetics. Furthermore, preclinical and clinical trials have been performed with small orally active drugs that stimulate endogenous EPO production by activating the EPO promoter ("GATA-inhibitors": diazepane derivatives) or enhancer ("HIF-stabilizers": 2-oxoglutarate analogues). Direct EPO gene transfer, though prohibited, may become a problem in sports only in the future.

  13. Latvian bank analysts strike back at "overheating hysteria" / Gary Peach

    Index Scriptorium Estoniae

    Peach, Gary

    2007-01-01

    In contrast to the international press, economic analysts at Latvia's local banks, Parex and Hansabanka, are less pessimistic in their forecasts for the Latvian economy and its possible overheating

  14. Training Guide for the Management Analyst Industrial Engineer Technician

    Science.gov (United States)

    1979-07-01

    contemporary work operations, and blending traditional and modern organization concepts, the student develops the facility to analyze and create organization...training, the attendee will know the functions of a computer as it processes business data to produce information for improved management. He will...action which is most cost effective when considering proposed investments. Emphasis is placed on the adaptation of general business practices to

  15. Intelligence Analysts Need Training on How to Think

    National Research Council Canada - National Science Library

    Hanson, Andrew

    2008-01-01

    .... It has been an ongoing struggle to adapt to unconventional methods. Now that the US is in a new kind of war, it is important to train soldiers not only to win today, but win in the future as well...

  16. Analyst: Soldier fails to sway election / Joel Alas

    Index Scriptorium Estoniae

    Alas, Joel

    2007-01-01

    President Toomas Hendrik Ilves did not promulgate the law on the removal of prohibited structures because it conflicts with the constitution. In the assessment of political scientist Vello Pettai, the issue of the Tõnismäe Bronze Soldier is not important to voters

  17. Improving Information Operations with a Military Cultural Analyst

    Science.gov (United States)

    2005-01-25

    Communicating Across Cultures, (Belmont, CA: Wadsworth Publishing Company, 1996), 24. 44 Ibid. 45 Marieke de Mooij, Global Marketing and Advertising...United States Army Training and Doctrine Command, 1992. De Mooij, Marieke. Global Marketing and Advertising: Understanding Cultural Paradoxes

  18. Cost Estimating Cases: Educational Tools for Cost Analysts

    Science.gov (United States)

    1993-09-01

    only appropriate documentation should be provided. In other words, students should not submit all of the documentation possible using ACEIT, only that...case was their lack of understanding of the ACEIT software used to conduct the estimate. Specifically, many students misinterpreted the cost...estimating relationships (CERs) embedded in the software. Additionally, few of the students were able to properly organize the ACEIT documentation output

  19. Adversaries, Advocates, or Thoughtful Analysts? Some Lessons from Dance History.

    Science.gov (United States)

    Wagner, Ann

    1999-01-01

    Argues that the arts demand careful analysis when providing a rationale for the inclusion of the arts in educational programs and policies. Provides information on the content and context of dance opposition and provides examples from dance history of issues that need to be addressed. (CMK)

  20. Accounting for Systems Analysts in the 21st Century

    Science.gov (United States)

    Giordano, Thomas; McAleer, Brenda; Szakas, Joseph S.

    2010-01-01

    Computer Information System (CIS) majors are required to successfully complete an introductory accounting course. Given the current forces in the financial world, the appropriateness of this course warrants scrutiny as to whether it properly serves the student, and the degree to which it continues to meet the IS 2002 outcomes. The current business…

  1. Analysts : no need to panic over prices / Steven Paulikas

    Index Scriptorium Estoniae

    Paulikas, Steven

    2004-01-01

    After joining the EU, prices in the Baltic states have risen by an average of 5.7%; according to the banks' assessments, inflation is also expected to accelerate. Appendix: Thank you, accession: Consumer Price Index, percent increase, year-on-year

  2. Procedure Improvement in Blood Processing for Chromosome Aberration Analyst

    International Nuclear Information System (INIS)

    Noraisyah Mohd Yusof; Juliana Mahamad; Rahimah Abd Rahim; Yahaya Talib; Mohd Rodzi Ali

    2015-01-01

    Detection of chromosomes at metaphase of the cell cycle is performed either manually or automatically. The procedure for slide preparation published by the IAEA does not guarantee that the quality of the slide is suitable for automatic detection. The detection efficiency is reduced if there is cell debris on the slides. This paper describes the modifications made to the standard procedure. The period of hypotonic treatment of the cells was lengthened, the slides were pre-treated with RNase, and the frequency of rinsing during the chromosomal coloring process was increased. Results show that the metaphase images were better and clearer, and the number of metaphases that could be detected automatically also increased. In conclusion, modification of the current standard protocol helps to ease the process of chromosome aberration analysis at Nuclear Malaysia. (author)

  3. Disruption prediction at JET

    International Nuclear Information System (INIS)

    Milani, F.

    1998-12-01

    The sudden loss of plasma magnetic confinement, known as a disruption, is one of the major issues in a nuclear fusion machine such as JET (Joint European Torus). Disruptions pose very serious problems for the safety of the machine. The energy stored in the plasma is released to the machine structure in a few milliseconds, resulting in forces that at JET reach several meganewtons. The problem is even more severe in a nuclear fusion power station, where the forces are on the order of one hundred meganewtons. The events that occur during a disruption are still not well understood, even though some mechanisms that can lead to a disruption have been identified and can be used to predict them. Unfortunately, it is always a combination of these events that generates a disruption, and therefore it is not possible to use simple algorithms to predict it. This thesis analyses the possibility of using neural network algorithms to predict plasma disruptions in real time. This involves the determination of plasma parameters every few milliseconds. A plasma boundary reconstruction algorithm, XLOC, has been developed in collaboration with Dr. D. O'Brien and Dr. J. Ellis, capable of determining the plasma-wall distance every 2 milliseconds. The XLOC output has been used to develop a multilayer perceptron network to determine plasma parameters such as l_i and q_ψ, with which a machine operational space has been experimentally defined. If the limits of this operational space are breached, the disruption probability increases considerably. Another approach for predicting disruptions is to use neural network classification methods to define the JET operational space. Two methods have been studied. The first method uses a multilayer perceptron network with a softmax activation function for the output layer. This method can be used for classifying the input patterns into various classes. In this case the plasma input patterns have been divided between disrupting and safe patterns, giving the possibility of
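The classification approach described here, a network with a softmax output layer separating "safe" from "disrupting" patterns, can be sketched in miniature. The example below trains a single softmax layer on synthetic stand-ins for plasma parameters; the variable names, the decision rule, and the data are all invented for illustration and are not the thesis's model.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
li = rng.uniform(0.5, 2.0, n)     # toy internal-inductance values (assumed)
q95 = rng.uniform(2.0, 6.0, n)    # toy edge-safety-factor values (assumed)
# Invented linearly separable rule standing in for an operational-space limit:
labels = (li - 0.5 * q95 > 0).astype(int)   # 1 = "disrupting", 0 = "safe"

X = np.column_stack([np.ones(n), li, q95])
W = np.zeros((3, 2))              # one weight column per output class
for _ in range(5000):
    scores = X @ W
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    P = np.exp(scores)
    P /= P.sum(axis=1, keepdims=True)             # softmax class probabilities
    Y = np.eye(2)[labels]                         # one-hot targets
    W -= 0.1 * X.T @ (P - Y) / n                  # cross-entropy gradient step
accuracy = ((X @ W).argmax(axis=1) == labels).mean()
```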

  4. Predicting the Consequences of MMOD Penetrations on the International Space Station

    Science.gov (United States)

    Hyde, James; Christiansen, E.; Lear, D.; Evans

    2018-01-01

    The threat from micrometeoroid and orbital debris (MMOD) impacts on space vehicles is often quantified in terms of the probability of no penetration (PNP). However, for large spacecraft, especially those with multiple compartments, a penetration may have a number of possible outcomes. The extent of the damage (diameter of hole, crack length or penetration depth), the location of the damage relative to critical equipment or crew, crew response, and even the time of day of the penetration are among the many factors that can affect the outcome. For the International Space Station (ISS), a Monte-Carlo style software code called Manned Spacecraft Crew Survivability (MSCSurv) is used to predict the probability of several outcomes of an MMOD penetration-broadly classified as loss of crew (LOC), crew evacuation (Evac), loss of escape vehicle (LEV), and nominal end of mission (NEOM). By generating large numbers of MMOD impacts (typically in the billions) and tracking the consequences, MSCSurv allows for the inclusion of a large number of parameters and models as well as enabling the consideration of uncertainties in the models and parameters. MSCSurv builds upon the results from NASA's Bumper software (which provides the probability of penetration and critical input data to MSCSurv) to allow analysts to estimate the probability of LOC, Evac, LEV, and NEOM. This paper briefly describes the overall methodology used by NASA to quantify LOC, Evac, LEV, and NEOM with particular emphasis on describing in broad terms how MSCSurv works and its capabilities and most significant models.
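The Monte-Carlo tallying idea can be illustrated in a few lines: draw a large number of hypothetical penetrations, assign each an outcome, and estimate outcome probabilities by counting. The conditional outcome probabilities below are invented placeholders, not ISS values, and this is not the MSCSurv code.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000                           # number of simulated penetrations
outcomes = ["NEOM", "Evac", "LEV", "LOC"]
p_outcome = [0.90, 0.06, 0.03, 0.01]    # assumed conditional probabilities
draws = rng.choice(len(outcomes), size=n, p=p_outcome)
est = {o: float((draws == i).mean()) for i, o in enumerate(outcomes)}
# Monte-Carlo error in each estimated probability shrinks like 1/sqrt(n).
```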

  5. Using self-organizing maps to determine observation threshold limit predictions in highly variant data

    Science.gov (United States)

    Paganoni, C.A.; Chang, K.C.; Robblee, M.B.

    2006-01-01

    A significant data quality challenge for highly variant systems concerns the limited ability to quantify operationally reasonable limits on the data elements being collected and provide reasonable threshold predictions. In many instances, the number of influences that drive a resulting value or operational range is too large to enable physical sampling for each influencer, or is too complicated to accurately model in an explicit simulation. An alternative method to determine reasonable observation thresholds is to employ an automation algorithm that would emulate a human analyst visually inspecting data for limits. Using the visualization technique of self-organizing maps (SOM) on data having poorly understood relationships, a methodology for determining threshold limits was developed. To illustrate this approach, analysis of environmental influences that drive the abundance of a target indicator species (the pink shrimp, Farfantepenaeus duorarum) provided a real example of applicability. The relationship between salinity and temperature and abundance of F. duorarum is well documented, but the effect of changes in water quality upstream on pink shrimp abundance is not well understood. The highly variant nature surrounding catch of a specific number of organisms in the wild, and the data available from upstream hydrology measures for salinity and temperature, made this an ideal candidate for the approach to provide a determination about the influence of changes in hydrology on populations of organisms.
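A minimal version of the idea, using a small one-dimensional self-organizing map whose outermost units serve as candidate observation-threshold limits, is sketched below. The data distribution, map size, and training schedule are generic assumptions, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(5)
data = rng.gamma(shape=2.0, scale=10.0, size=5000)  # highly variant observations

n_units, n_steps = 10, 20000
units = np.arange(n_units)
w = np.linspace(data.min(), data.max(), n_units)    # initialize along data range
for t in range(n_steps):
    x = data[rng.integers(len(data))]               # present one observation
    bmu = int(np.argmin(np.abs(w - x)))             # best-matching unit
    lr = 0.5 * (1.0 - t / n_steps)                  # decaying learning rate
    h = np.exp(-((units - bmu) ** 2) / 2.0)         # neighborhood kernel
    w += lr * h * (x - w)                           # pull units toward x
w.sort()
low_limit, high_limit = w[0], w[-1]   # outermost units as candidate thresholds
```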

  6. Rationale and design of the participant, investigator, observer, and data-analyst-blinded randomized AGENDA trial on associations between gene-polymorphisms, endophenotypes for depression and antidepressive intervention: the effect of escitalopram versus placebo on the combined dexamethasone-corticotrophine releasing hormone test and other potential endophenotypes in healthy first-degree relatives of persons with depression

    Directory of Open Access Journals (Sweden)

    Paulson Olaf

    2009-08-01

    Full Text Available Abstract Background Endophenotypes are heritable markers, which are more prevalent in patients and their healthy relatives than in the general population. Recent studies point at disturbed regulation of the hypothalamic-pituitary-adrenocortical axis as a possible endophenotype for depression. We hypothesize that potential endophenotypes for depression may be affected by selective serotonin re-uptake inhibitor antidepressants in healthy first-degree relatives of depressed patients. The primary outcome measure is the change in plasma cortisol in the dexamethasone-corticotrophin releasing hormone test from baseline to the end of intervention. Methods The AGENDA trial is designed as a participant, investigator, observer, and data-analyst-blinded randomized trial. Participants are 80 healthy first-degree relatives of patients with depression. Participants are randomized to escitalopram 10 mg per day versus placebo for four weeks. Randomization is stratified by gender and age. The primary outcome measure is the change in plasma cortisol in the dexamethasone-corticotrophin releasing hormone test from entry before intervention to after four weeks of intervention. With the inclusion of 80 participants, 60% power is obtained to detect a clinically relevant difference in the primary outcome between the intervention and the placebo group. Secondary outcome measures are changes from baseline to four weeks in scores of: (1) cognition and (2) neuroticism. Tertiary outcome measures are changes from baseline to four weeks in scores of: (1) depression and anxiety symptoms; (2) subjective evaluations of depressive symptoms, perceived stress, quality of life, aggression, sleep, and pain; and (3) salivary cortisol at eight different time points during an ordinary day. Assessments are undertaken by assessors blinded to the randomization group. Trial registration Local Ethics Committee: H-KF 307413. Danish Medicines Agency: 2612-3162. EudraCT: 2006-001750-28. Danish Data Agency

  7. Genomic Prediction in Barley

    DEFF Research Database (Denmark)

    Edriss, Vahid; Cericola, Fabio; Jensen, Jens D

    2015-01-01

    to next generation. The main goal of this study was to assess the potential of using genomic prediction in a commercial barley breeding program. The data used in this study came from the Nordic Seed company, located in Denmark. Around 350 advanced lines were genotyped with the 9K Barley chip from Illumina....... Traits used in this study were grain yield, plant height and heading date. Heading date is the number of days after 1st June that it takes for a plant to head. Heritabilities were 0.33, 0.44 and 0.48 for yield, height and heading, respectively, for the average of nine plots. The GBLUP model was used for genomic...
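As a sketch of the idea behind GBLUP, the marker-based equivalent (RR-BLUP) can be run on simulated genotypes; all sizes, effect distributions, and the penalty value below are illustrative assumptions, not the study's data or settings.

```python
import numpy as np

rng = np.random.default_rng(42)
# toy sizes; the study itself genotyped ~350 lines on a 9K chip
n_train, n_test, n_markers = 80, 20, 500
X = rng.integers(0, 3, size=(n_train + n_test, n_markers)).astype(float)  # 0/1/2 genotype codes
true_effects = rng.normal(0.0, 0.05, n_markers)
y = X @ true_effects + rng.normal(0.0, 1.0, n_train + n_test)  # phenotype = genetic value + noise

Xtr, Xte, ytr = X[:n_train], X[n_train:], y[:n_train]

# RR-BLUP: shrink all marker effects jointly with a ridge penalty
# (equivalent to GBLUP with a genomic relationship matrix built from X)
lam = 1.0
beta = np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(n_markers), Xtr.T @ ytr)
gebv = Xte @ beta  # genomic estimated breeding values for untested lines
accuracy = np.corrcoef(gebv, y[n_train:])[0, 1]
```

Selecting candidate lines by their estimated breeding values instead of waiting for field trials is what lets breeders shorten the cycle.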

  8. Predicting Lotto Numbers

    DEFF Research Database (Denmark)

    Suetens, Sigrid; Galbo-Jørgensen, Claus B.; Tyran, Jean-Robert Karl

    2016-01-01

    We investigate the ‘law of small numbers’ using a data set on lotto gambling that allows us to measure players’ reactions to draws. While most players pick the same set of numbers week after week, we find that those who do change react on average as predicted by the law of small numbers...... as formalized in recent behavioral theory. In particular, players tend to bet less on numbers that have been drawn in the preceding week, as suggested by the ‘gambler’s fallacy’, and bet more on a number if it was frequently drawn in the recent past, consistent with the ‘hot-hand fallacy’....

  9. Predictable return distributions

    DEFF Research Database (Denmark)

    Pedersen, Thomas Quistgaard

    trace out the entire distribution. A univariate quantile regression model is used to examine stock and bond return distributions individually, while a multivariate model is used to capture their joint distribution. An empirical analysis on US data shows that certain parts of the return distributions......-of-sample analyses show that the relative accuracy of the state variables in predicting future returns varies across the distribution. A portfolio study shows that an investor with power utility can obtain economic gains by applying the empirical return distribution in portfolio decisions instead of imposing...
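The quantile-regression approach can be sketched on simulated data: each conditional quantile is fit by minimizing the pinball (check) loss, and a set of fitted quantiles traces out the return distribution. The state variable, coefficients, and noise model below are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
n = 1000
state = rng.normal(0.0, 1.0, n)  # a standardized predictor, e.g. a valuation ratio
# returns whose location and spread both depend on the state variable
ret = 0.5 * state + rng.normal(0.0, 1.0 + 0.3 * np.abs(state))

def pinball(beta, tau):
    """Average check loss for the linear quantile model q_tau = b0 + b1*state."""
    u = ret - (beta[0] + beta[1] * state)
    return np.mean(np.maximum(tau * u, (tau - 1.0) * u))

# fit the 10% and 90% conditional quantiles of the return distribution
q10 = minimize(lambda b: pinball(b, 0.10), x0=[0.0, 0.0], method="Nelder-Mead").x
q90 = minimize(lambda b: pinball(b, 0.90), x0=[0.0, 0.0], method="Nelder-Mead").x
```

Repeating the fit over a grid of quantile levels gives the full conditional distribution rather than just its mean.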

  10. Predicting Ground Illuminance

    Science.gov (United States)

    Lesniak, Michael V.; Tregoning, Brett D.; Hitchens, Alexandra E.

    2015-01-01

    Our Sun outputs 3.85 x 10^26 W of radiation, of which roughly 37% is in the visible band. It is directly responsible for nearly all natural illuminance experienced on Earth's surface, either in the form of direct/refracted sunlight or in reflected light bouncing off the surfaces and/or atmospheres of our Moon and the visible planets. Ground illuminance, defined as the amount of visible light intercepting a unit area of surface (from all incident angles), varies over 7 orders of magnitude from day to night. It is highly dependent on well-modeled factors such as the relative positions of the Sun, Earth, and Moon. It is also dependent on less predictable factors such as local atmospheric conditions and weather. Several models have been proposed to predict ground illuminance, including Brown (1952) and Shapiro (1982, 1987). The Brown model is a set of empirical data collected from observation points around the world that has been reduced to a smooth fit of illuminance against a single variable, solar altitude. It provides limited applicability to the Moon and for cloudy conditions via multiplicative reduction factors. The Shapiro model is a theoretical model that treats the atmosphere as a three-layer system of light reflectance and transmittance. It has different sets of reflectance and transmittance coefficients for various cloud types. In this paper we compare the models' predictions to ground illuminance data from an observing run at the White Sands Missile Range (data were obtained from the United Kingdom's Meteorology Office). Continuous illuminance readings were recorded under various cloud conditions, during both daytime and nighttime hours. We find that under clear skies, the Shapiro model tends to better fit the observations during daytime hours, with typical discrepancies under 10%. Under cloudy skies, both models tend to poorly predict ground illuminance. However, the Shapiro model, with typical average daytime discrepancies of 25% or less in many cases

  11. Predicting sports betting outcomes

    OpenAIRE

    Flis, Borut

    2014-01-01

    We wish to build a model which could predict the outcome of basketball games. The goal was to achieve sufficient accuracy to make a profit in sports betting. One learning example is a game in the NBA regular season. Every example has multiple features, which describe the opposing teams. We tried many methods, which return the probability of the home team winning and the probability of the away team winning. These probabilities are used for risk analysis. We used the best model in h...

  12. Predicting chaotic time series

    International Nuclear Information System (INIS)

    Farmer, J.D.; Sidorowich, J.J.

    1987-01-01

    We present a forecasting technique for chaotic data. After embedding a time series in a state space using delay coordinates, we "learn" the induced nonlinear mapping using local approximation. This allows us to make short-term predictions of the future behavior of a time series, using information based only on past values. We present an error estimate for this technique, and demonstrate its effectiveness by applying it to several examples, including data from the Mackey-Glass delay differential equation, Rayleigh-Benard convection, and Taylor-Couette flow
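The technique described (delay-coordinate embedding followed by local approximation from nearest neighbors) can be sketched on a simple chaotic system; the logistic map stands in here for the paper's examples, and the embedding dimension and neighbor count are arbitrary illustrative choices.

```python
import numpy as np

# chaotic test series from the logistic map (a stand-in for Mackey-Glass etc.)
N = 2000
x = np.empty(N)
x[0] = 0.4
for t in range(N - 1):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

m, k = 3, 5  # embedding dimension and number of neighbors (illustrative)
train = x[:1500]

def embed(series, m):
    # delay-coordinate vectors [x_t, x_{t-1}, ..., x_{t-m+1}]
    return np.stack([series[m - 1 - j : len(series) - j] for j in range(m)], axis=1)

states = embed(train, m)[:-1]  # keep only states whose successor is known
targets = train[m:]            # successor of each kept state

def predict_next(history):
    q = history[-m:][::-1]  # current delay vector
    nn = np.argsort(np.linalg.norm(states - q, axis=1))[:k]
    return targets[nn].mean()  # zeroth-order local approximation

# one-step-ahead forecasts over a held-out stretch
errs = [abs(predict_next(x[:t]) - x[t]) for t in range(1500, 1990)]
mean_err = float(np.mean(errs))
```

Because nearby states on a chaotic attractor have nearby short-term futures, even this neighbor-averaging scheme forecasts one step ahead far better than chance.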

  13. Lattice of quantum predictions

    Science.gov (United States)

    Drieschner, Michael

    1993-10-01

    What is the structure of reality? Physics is supposed to answer this question, but a purely empiristic view is not sufficient to explain its ability to do so. Quantum mechanics has forced us to think more deeply about what a physical theory is. There are preconditions every physical theory must fulfill. It has to contain, e.g., rules for empirically testable predictions. Those preconditions give physics a structure that is “a priori” in the Kantian sense. An example is given how the lattice structure of quantum mechanics can be understood along these lines.

  14. Foundations of predictive analytics

    CERN Document Server

    Wu, James

    2012-01-01

    Drawing on the authors' two decades of experience in applied modeling and data mining, Foundations of Predictive Analytics presents the fundamental background required for analyzing data and building models for many practical applications, such as consumer behavior modeling, risk and marketing analytics, and other areas. It also discusses a variety of practical topics that are frequently missing from similar texts. The book begins with the statistical and linear algebra/matrix foundation of modeling methods, from distributions to cumulant and copula functions to Cornish--Fisher expansion and o

  15. Prediction of regulatory elements

    DEFF Research Database (Denmark)

    Sandelin, Albin

    2008-01-01

    Finding the regulatory mechanisms responsible for gene expression remains one of the most important challenges for biomedical research. A major focus in cellular biology is to find functional transcription factor binding sites (TFBS) responsible for the regulation of a downstream gene. As wet-lab methods are time-consuming and expensive, it is not realistic to identify TFBS for all uncharacterized genes in the genome by purely experimental means. Computational methods aimed at predicting potential regulatory regions can increase the efficiency of wet-lab experiments significantly. Here, methods...

  16. Age and Stress Prediction

    Science.gov (United States)

    2000-01-01

    Genoa is a software product that predicts progressive aging and failure in a variety of materials. It is the result of a SBIR contract between the Glenn Research Center and Alpha Star Corporation. Genoa allows designers to determine if the materials they plan on applying to a structure are up to the task or if alternate materials should be considered. Genoa's two feature applications are its progressive failure simulations and its test verification. It allows for a reduction in inspection frequency, rapid design solutions, and manufacturing with low cost materials. It will benefit the aerospace, airline, and automotive industries, with future applications for other uses.

  17. Prediction of Biomolecular Complexes

    KAUST Repository

    Vangone, Anna

    2017-04-12

    Almost all processes in living organisms occur through specific interactions between biomolecules. Any dysfunction of those interactions can lead to pathological events. Understanding such interactions is therefore a crucial step in the investigation of biological systems and a starting point for drug design. In recent years, experimental studies have been devoted to unravelling the principles of biomolecular interactions; however, due to experimental difficulties in solving the three-dimensional (3D) structure of biomolecular complexes, the number of available, high-resolution experimental 3D structures does not fulfill the current needs. Therefore, complementary computational approaches to model such interactions are necessary to assist experimentalists, since a full understanding of how biomolecules interact (and consequently how they perform their function) only comes from 3D structures, which provide crucial atomic details about binding and recognition processes. In this chapter we review approaches to predict biomolecular complexes, introducing the concept of molecular docking, a technique which uses a combination of geometric, steric and energetic considerations to predict the 3D structure of a biological complex starting from the individual structures of its constituent parts. We provide a mini-guide to docking concepts, their potential and challenges, along with post-docking analysis and a list of related software.

  18. Nuclear criticality predictability

    International Nuclear Information System (INIS)

    Briggs, J.B.

    1999-01-01

    As a result of these efforts, a large portion of the tedious and redundant research and processing of critical experiment data has been eliminated. The necessary step in criticality safety analyses of validating computer codes with benchmark critical data is greatly streamlined, and valuable criticality safety experimental data are preserved. Criticality safety personnel in 31 different countries are now using the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments'. Much has been accomplished by the work of the ICSBEP. However, evaluation and documentation represent only one element of a successful Nuclear Criticality Safety Predictability Program, and this element only exists as a separate entity because this work was not completed in conjunction with the experimentation process. I believe, however, that the work of the ICSBEP has also served to unify the other elements of nuclear criticality predictability. All elements are interrelated, but for a time it seemed that communication between these elements was not adequate. The ICSBEP has highlighted gaps in data, has retrieved lost data, has helped to identify errors in cross section processing codes, and has helped bring the international criticality safety community together in a common cause as true friends and colleagues. It has been a privilege to associate with those who work so diligently to make the project a success. (J.P.N.)

  19. Ratchetting strain prediction

    International Nuclear Information System (INIS)

    Noban, Mohammad; Jahed, Hamid

    2007-01-01

    A time-efficient method for predicting ratchetting strain is proposed. The ratchetting strain at any cycle is determined by finding the ratchetting rate at only a few cycles. This determination is done by first defining the trajectory of the origin of stress in the deviatoric stress space and then incorporating this moving origin into a cyclic plasticity model. It is shown that at the beginning of the loading, the starting point of this trajectory coincides with the initial stress origin and approaches the mean stress, displaying a power-law relationship with the number of loading cycles. The method of obtaining this trajectory from a standard uniaxial asymmetric cyclic loading is presented. Ratchetting rates are calculated with the help of this trajectory and through the use of a constitutive cyclic plasticity model which incorporates deviatoric stresses and back stresses that are measured with respect to this moving frame. The proposed model is used to predict the ratchetting strain of two types of steels under single- and multi-step loadings. Results obtained agree well with the available experimental measurements
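The power-law relationship mentioned above suggests a simple extrapolation scheme: estimate the ratchetting rate at a few cycles, fit a power law, and sum the fitted per-cycle rate to get accumulated strain. The rates and exponent below are synthetic; this is only a sketch of the idea, not the paper's constitutive model.

```python
import numpy as np

# hypothetical ratchetting rates (strain increment per cycle) sampled at a few cycles
cycles = np.array([10.0, 50.0, 200.0, 1000.0])
rates = 1e-3 * cycles ** -0.4  # synthetic power-law decay

# fit log(rate) = log(A) + m*log(N) by linear least squares
m, logA = np.polyfit(np.log(cycles), np.log(rates), 1)
A = np.exp(logA)

# accumulated ratchetting strain after 5000 cycles = sum of the fitted per-cycle rate
N = np.arange(1, 5001)
ratchet_strain = float(np.sum(A * N ** m))
```

The appeal of such a scheme is exactly the time saving claimed in the abstract: rates at a handful of cycles determine the strain at any later cycle without simulating every cycle.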

  20. Predicting space climate change

    Science.gov (United States)

    Balcerak, Ernie

    2011-10-01

    Galactic cosmic rays and solar energetic particles can be hazardous to humans in space, damage spacecraft and satellites, pose threats to aircraft electronics, and expose aircrew and passengers to radiation. A new study shows that these threats are likely to increase in coming years as the Sun approaches the end of the period of high solar activity known as “grand solar maximum,” which has persisted through the past several decades. High solar activity can help protect the Earth by repelling incoming galactic cosmic rays. Understanding the past record can help scientists predict future conditions. Barnard et al. analyzed a 9300-year record of galactic cosmic ray and solar activity based on cosmogenic isotopes in ice cores as well as on neutron monitor data. They used this to predict future variations in galactic cosmic ray flux, near-Earth interplanetary magnetic field, sunspot number, and probability of large solar energetic particle events. The researchers found that the risk of space weather radiation events will likely increase noticeably over the next century compared with recent decades and that lower solar activity will lead to increased galactic cosmic ray levels. (Geophysical Research Letters, doi:10.1029/2011GL048489, 2011)

  1. Prediction of Biomolecular Complexes

    KAUST Repository

    Vangone, Anna; Oliva, Romina; Cavallo, Luigi; Bonvin, Alexandre M. J. J.

    2017-01-01

    Almost all processes in living organisms occur through specific interactions between biomolecules. Any dysfunction of those interactions can lead to pathological events. Understanding such interactions is therefore a crucial step in the investigation of biological systems and a starting point for drug design. In recent years, experimental studies have been devoted to unravel the principles of biomolecular interactions; however, due to experimental difficulties in solving the three-dimensional (3D) structure of biomolecular complexes, the number of available, high-resolution experimental 3D structures does not fulfill the current needs. Therefore, complementary computational approaches to model such interactions are necessary to assist experimentalists since a full understanding of how biomolecules interact (and consequently how they perform their function) only comes from 3D structures which provide crucial atomic details about binding and recognition processes. In this chapter we review approaches to predict biomolecular complexesBiomolecular complexes, introducing the concept of molecular dockingDocking, a technique which uses a combination of geometric, steric and energetics considerations to predict the 3D structure of a biological complex starting from the individual structures of its constituent parts. We provide a mini-guide about docking concepts, its potential and challenges, along with post-docking analysis and a list of related software.

  2. Predicting Alloreactivity in Transplantation

    Directory of Open Access Journals (Sweden)

    Kirsten Geneugelijk

    2014-01-01

    Full Text Available Human leukocyte Antigen (HLA) mismatching leads to severe complications after solid-organ transplantation and hematopoietic stem-cell transplantation. The alloreactive responses underlying the posttransplantation complications include both direct recognition of allogeneic HLA by HLA-specific alloantibodies and T cells and indirect T-cell recognition. However, the immunogenicity of HLA mismatches is highly variable; some HLA mismatches lead to severe clinical B-cell- and T-cell-mediated alloreactivity, whereas others are well tolerated. Definition of the permissibility of HLA mismatches prior to transplantation allows selection of donor-recipient combinations that will have a reduced chance to develop deleterious host-versus-graft responses after solid-organ transplantation and graft-versus-host responses after hematopoietic stem-cell transplantation. Therefore, several methods have been developed to predict permissible HLA-mismatch combinations. In this review we aim to give a comprehensive overview of the current knowledge regarding HLA-directed alloreactivity and several developed in vitro and in silico tools that aim to predict direct and indirect alloreactivity.

  3. Generalized Predictive Control and Neural Generalized Predictive Control

    Directory of Open Access Journals (Sweden)

    Sadhana CHIDRAWAR

    2008-12-01

    Full Text Available Model Predictive Control (MPC) relies on a predictive model of the plant; here an MPC scheme using a multilayer feed-forward network as the plant's linear model is presented. Using Newton-Raphson as the optimization algorithm, the number of iterations needed for convergence is significantly reduced compared with other techniques. This paper presents a detailed derivation of Generalized Predictive Control and Neural Generalized Predictive Control with Newton-Raphson as the minimization algorithm. Performance has been tested on three separate systems. Simulation results show the effect of the neural network on Generalized Predictive Control. The performance comparison of these three system configurations is given in terms of ISE and IAE.
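To illustrate the Newton-Raphson step inside a predictive controller, here is a one-step-horizon toy: a scalar linear plant, a quadratic GPC-style cost, and Newton iterations on the control input. The plant coefficients, control weight, and setpoint are all made up, and for a quadratic cost Newton actually converges in a single step.

```python
# toy plant: y[t+1] = a*y[t] + b*u[t]
a, b = 0.8, 0.5
lam, r = 0.1, 1.0  # control-effort weight and setpoint (hypothetical values)

def newton_u(y, u0=0.0, iters=5):
    """Minimize the one-step cost (a*y + b*u - r)**2 + lam*u**2 by Newton-Raphson."""
    u = u0
    for _ in range(iters):
        g = 2.0 * b * (a * y + b * u - r) + 2.0 * lam * u  # gradient dJ/du
        h = 2.0 * b * b + 2.0 * lam                        # Hessian d2J/du2 (constant)
        u -= g / h
    return u

y = 0.0
for _ in range(30):  # closed-loop simulation toward the setpoint
    y = a * y + b * newton_u(y)
```

With a nonlinear (e.g. neural-network) plant model the gradient and Hessian are no longer constant, which is where the iteration count that the paper discusses starts to matter.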

  4. Numerical prediction of rose growth

    NARCIS (Netherlands)

    Bernsen, E.; Bokhove, Onno; van der Sar, D.M.

    2006-01-01

    A new mathematical model is presented for the prediction of rose growth in a greenhouse. Given the measured ambient environmental conditions, the model consists of a local photosynthesis model, predicting the photosynthesis per unit leaf area, coupled to a global greenhouse model, which predicts the

  5. Confidence scores for prediction models

    DEFF Research Database (Denmark)

    Gerds, Thomas Alexander; van de Wiel, MA

    2011-01-01

    In medical statistics, many alternative strategies are available for building a prediction model based on training data. Prediction models are routinely compared by means of their prediction performance in independent validation data. If only one data set is available for training and validation,...

  6. Protein docking prediction using predicted protein-protein interface

    Directory of Open Access Journals (Sweden)

    Li Bin

    2012-01-01

    Full Text Available Abstract Background Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. Results We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we have developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, followed by a second round of docking with updated docking interface information to further improve docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves the docking prediction accuracy as compared with docking without using binding site prediction or using the binding site prediction as post-filtering. Conclusion We have developed PI-LZerD, a pairwise docking algorithm, which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy over alternative methods in the series of benchmark experiments including docking using actual docking interface site predictions as well as unbound docking cases.

  7. Protein docking prediction using predicted protein-protein interface.

    Science.gov (United States)

    Li, Bin; Kihara, Daisuke

    2012-01-10

    Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we have developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, followed by a second round of docking with updated docking interface information to further improve docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves the docking prediction accuracy as compared with docking without using binding site prediction or using the binding site prediction as post-filtering. We have developed PI-LZerD, a pairwise docking algorithm, which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy over alternative methods in the series of benchmark experiments including docking using actual docking interface site predictions as well as unbound docking cases.

  8. Epitope prediction methods

    DEFF Research Database (Denmark)

    Karosiene, Edita

    Analysis. The chapter provides detailed explanations on how to use different methods for T cell epitope discovery research, explaining how input should be given as well as how to interpret the output. In the last chapter, I present the results of a bioinformatics analysis of epitopes from the yellow fever...... peptide-MHC interactions. Furthermore, using yellow fever virus epitopes, we demonstrated the power of the %Rank score when compared with the binding affinity score of MHC prediction methods, suggesting that this score should be considered to be used for selecting potential T cell epitopes. In summary...... immune responses. Therefore, it is of great importance to be able to identify peptides that bind to MHC molecules, in order to understand the nature of immune responses and discover T cell epitopes useful for designing new vaccines and immunotherapies. MHC molecules in humans, referred to as human...

  9. Motor degradation prediction methods

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, J.R.; Kelly, J.F.; Delzingaro, M.J.

    1996-12-01

    Motor Operated Valve (MOV) squirrel cage AC motor rotors are susceptible to degradation under certain conditions. Premature failure can result due to high humidity/temperature environments, high running load conditions, extended periods at locked rotor conditions (i.e. > 15 seconds) or exceeding the motor's duty cycle by frequent starts or multiple valve stroking. Exposure to high heat and moisture due to packing leaks, pressure seal ring leakage or other causes can significantly accelerate the degradation. ComEd and Liberty Technologies have worked together to provide and validate a non-intrusive method using motor power diagnostics to evaluate MOV rotor condition and predict failure. These techniques have provided a quick, low radiation dose method to evaluate inaccessible motors, identify degradation and allow scheduled replacement of motors prior to catastrophic failures.

  10. Filter replacement lifetime prediction

    Science.gov (United States)

    Hamann, Hendrik F.; Klein, Levente I.; Manzer, Dennis G.; Marianno, Fernando J.

    2017-10-25

    Methods and systems for predicting a filter lifetime include building a filter effectiveness history based on contaminant sensor information associated with a filter; determining a rate of filter consumption with a processor based on the filter effectiveness history; and determining a remaining filter lifetime based on the determined rate of filter consumption. Methods and systems for increasing filter economy include measuring contaminants in an internal and an external environment; determining a cost of a corrosion rate increase if unfiltered external air intake is increased for cooling; determining a cost of increased air pressure to filter external air; and if the cost of filtering external air exceeds the cost of the corrosion rate increase, increasing an intake of unfiltered external air.
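The first claimed method (a consumption rate derived from an effectiveness history, then a remaining lifetime) reduces to a slope estimate plus an extrapolation. The weekly effectiveness readings below are invented for illustration, not sensor data from the patent.

```python
import numpy as np

# hypothetical filter effectiveness history: fraction of capacity left at weekly checks
weeks = np.arange(6)
effectiveness = np.array([1.00, 0.93, 0.87, 0.80, 0.74, 0.67])

# rate of filter consumption = (negated) least-squares slope of the history
rate = -np.polyfit(weeks, effectiveness, 1)[0]  # capacity lost per week

# remaining lifetime = remaining capacity / consumption rate
remaining_weeks = float(effectiveness[-1] / rate)
```

A linear fit over the whole history smooths out sensor noise that a two-point difference would amplify, which is the practical reason to build a history rather than differencing the last two readings.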

  11. Neurological abnormalities predict disability

    DEFF Research Database (Denmark)

    Poggesi, Anna; Gouw, Alida; van der Flier, Wiesje

    2014-01-01

    To investigate the role of neurological abnormalities and magnetic resonance imaging (MRI) lesions in predicting global functional decline in a cohort of initially independent-living elderly subjects. The Leukoaraiosis And DISability (LADIS) Study, involving 11 European centres, was primarily aimed...... at evaluating age-related white matter changes (ARWMC) as an independent predictor of the transition to disability (according to Instrumental Activities of Daily Living scale) or death in independent elderly subjects that were followed up for 3 years. At baseline, a standardized neurological examination.......0 years, 45 % males), 327 (51.7 %) presented at the initial visit with ≥1 neurological abnormality and 242 (38 %) reached the main study outcome. Cox regression analyses, adjusting for MRI features and other determinants of functional decline, showed that the baseline presence of any neurological...

  12. Motor degradation prediction methods

    International Nuclear Information System (INIS)

    Arnold, J.R.; Kelly, J.F.; Delzingaro, M.J.

    1996-01-01

    Motor Operated Valve (MOV) squirrel cage AC motor rotors are susceptible to degradation under certain conditions. Premature failure can result due to high humidity/temperature environments, high running load conditions, extended periods at locked rotor conditions (i.e. > 15 seconds) or exceeding the motor's duty cycle by frequent starts or multiple valve stroking. Exposure to high heat and moisture due to packing leaks, pressure seal ring leakage or other causes can significantly accelerate the degradation. ComEd and Liberty Technologies have worked together to provide and validate a non-intrusive method using motor power diagnostics to evaluate MOV rotor condition and predict failure. These techniques have provided a quick, low radiation dose method to evaluate inaccessible motors, identify degradation and allow scheduled replacement of motors prior to catastrophic failures

  13. Predictability in community dynamics.

    Science.gov (United States)

    Blonder, Benjamin; Moulton, Derek E; Blois, Jessica; Enquist, Brian J; Graae, Bente J; Macias-Fauria, Marc; McGill, Brian; Nogué, Sandra; Ordonez, Alejandro; Sandel, Brody; Svenning, Jens-Christian

    2017-03-01

    The coupling between community composition and climate change spans a gradient from no lags to strong lags. The no-lag hypothesis is the foundation of many ecophysiological models, correlative species distribution modelling and climate reconstruction approaches. Simple lag hypotheses have become prominent in disequilibrium ecology, proposing that communities track climate change following a fixed function or with a time delay. However, more complex dynamics are possible and may lead to memory effects and alternate unstable states. We develop graphical and analytic methods for assessing these scenarios and show that these dynamics can appear in even simple models. The overall implications are that (1) complex community dynamics may be common and (2) detailed knowledge of past climate change and community states will often be necessary yet sometimes insufficient to make predictions of a community's future state. © 2017 John Wiley & Sons Ltd/CNRS.

  14. Neonatal heart rate prediction.

    Science.gov (United States)

    Abdel-Rahman, Yumna; Jeremic, Aleksander; Tan, Kenneth

    2009-01-01

    Technological advances have caused a decrease in the number of infant deaths. Pre-term infants now have a substantially increased chance of survival. One of the mechanisms vital to saving the lives of these infants is continuous monitoring and early diagnosis. Continuous monitoring collects huge amounts of data with a great deal of information embedded in them. Statistical analysis can extract this information and use it to aid diagnosis and to understand development. In this study we have a large dataset containing over 180 pre-term infants whose heart rates were recorded over the length of their stay in the Neonatal Intensive Care Unit (NICU). We test two types of models, empirical Bayesian and autoregressive moving average, and then attempt to predict future values. The autoregressive moving average model showed better results but required more computation.
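An autoregressive fit of the kind the study compares can be sketched with plain least squares; the heart-rate series below is simulated (an AR(2) process around 160 bpm), so the coefficients and forecast are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic neonatal-style heart-rate series: AR(2) fluctuations around 160 bpm
n = 500
hr = np.empty(n)
hr[:2] = 160.0
for t in range(2, n):
    hr[t] = 160.0 + 0.6 * (hr[t-1] - 160.0) + 0.2 * (hr[t-2] - 160.0) + rng.normal(0.0, 2.0)

# fit AR(2) coefficients by least squares on the centered series
y = hr - 160.0
X = np.column_stack([y[1:-1], y[:-2]])  # lags 1 and 2
phi, *_ = np.linalg.lstsq(X, y[2:], rcond=None)

# one-step-ahead forecast of the next heart-rate sample
next_hr = float(160.0 + phi[0] * y[-1] + phi[1] * y[-2])
```

A full ARMA fit adds moving-average terms and needs iterative estimation, which matches the study's observation that the better-performing model costs more computation.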

  15. Chloride ingress prediction

    DEFF Research Database (Denmark)

    Frederiksen, Jens Mejer; Geiker, Mette Rica

    2008-01-01

    Prediction of chloride ingress into concrete is an important part of durability design of reinforced concrete structures exposed to chloride containing environment. This paper presents experimentally based design parameters for Portland cement concretes with and without silica fume and fly ash...... in marine atmospheric and submersed South Scandinavian environment. The design parameters are based on sequential measurements of 86 chloride profiles taken over ten years from 13 different types of concrete. The design parameters provide the input for an analytical model for chloride profiles as function...... of depth and time, when both the surface chloride concentration and the diffusion coefficient are allowed to vary in time. The model is presented in a companion paper....
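The analytical chloride profile the paper builds on is, in its simplest constant-parameter form, the error-function solution of Fick's second law. The surface concentration and diffusion coefficient below are illustrative numbers; the paper's model additionally lets both parameters vary in time.

```python
import math

def chloride(depth_mm, t_years, Cs=0.5, D=30.0):
    """Constant-parameter chloride profile C(x, t) = Cs * erfc(x / (2*sqrt(D*t))).

    Cs: surface chloride concentration (% of binder, hypothetical value)
    D:  apparent diffusion coefficient (mm^2/year, hypothetical value)
    """
    return Cs * math.erfc(depth_mm / (2.0 * math.sqrt(D * t_years)))

# concentration at several cover depths after 10 years of exposure
profile = [chloride(x, 10.0) for x in (0.0, 10.0, 20.0, 30.0)]
```

Durability design then asks at what cover depth and time the profile crosses the chloride threshold for reinforcement corrosion.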

  16. Strontium 90 fallout prediction

    International Nuclear Information System (INIS)

    Sarmiento, J.L.; Gwinn, E.

    1986-01-01

    An empirical formula is developed for predicting monthly sea-level strontium 90 fallout (F) in the northern hemisphere as a function of time (t), precipitation rate (P), latitude (φ), longitude (λ), and the sea-level concentration of strontium 90 in air (C): F(λ, φ, t) = C(t, φ)[v_d(φ) + v_w(λ, φ, t)], where v_w(λ, φ, t) = a(φ)[P(λ, φ, t)/P_0]^b(φ) is the wet removal, v_d(φ) is the dry removal, and P_0 is 1 cm/month. The constants v_d, a, and b are determined as functions of latitude by fitting land-based observations. The concentration of 90Sr in air is calculated as a function of the deseasonalized concentration at a reference latitude (C̄_ref), the ratio of the observations at the latitude of interest to the reference latitude (R), and a function representing the seasonal trend in the air concentration (1 + g): C̄(t, φ) = C̄_ref(t)R(φ)[1 + g(m, φ)]; m is the month. Zonal trends in C are shown to be relatively small. This formula can be used in conjunction with precipitation observations and/or estimates to predict fallout in the northern hemisphere for any month in the years 1954 to 1974. Error estimates are given; they do not include uncertainty due to errors in precipitation data
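The empirical fallout formula is straightforward to evaluate once the fitted constants are in hand; every numeric value below is hypothetical, since the paper determines the dry-removal velocity and the wet-removal constants by fitting land-based observations latitude by latitude.

```python
def fallout(C, v_d, a, b, P, P0=1.0):
    """Monthly sea-level 90Sr fallout F = C * (v_d + v_w), with v_w = a * (P/P0)**b.

    C:    90Sr concentration in air
    v_d:  dry-removal velocity (latitude-dependent in the paper)
    a, b: wet-removal constants (latitude-dependent in the paper)
    P:    precipitation rate in cm/month; P0 = 1 cm/month
    """
    return C * (v_d + a * (P / P0) ** b)

# e.g. a wet month (P = 4 cm) versus the reference month (P = 1 cm)
wet = fallout(C=2.0, v_d=0.3, a=1.2, b=0.8, P=4.0)
ref = fallout(C=2.0, v_d=0.3, a=1.2, b=0.8, P=1.0)
```

The sublinear exponent b < 1 captures the saturation of wet scavenging at high precipitation rates.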

  17. Plume rise predictions

    International Nuclear Information System (INIS)

    Briggs, G.A.

    1976-01-01

    Anyone involved with diffusion calculations becomes well aware of the strong dependence of maximum ground concentrations on the effective stack height, h_e. For most conditions chi_max is approximately proportional to h_e^-2, as has been recognized at least since 1936 (Bosanquet and Pearson). Making allowance for the gradual decrease in the ratio of vertical to lateral diffusion at increasing heights, the exponent is slightly larger, say chi_max approximately h_e^-2.3. In inversion breakup fumigation, the exponent is somewhat smaller; very crudely, chi_max approximately h_e^-1.5. In any case, for an elevated emission the dependence of chi_max on h_e is substantial. It is postulated that a really clever ignorant theoretician can disguise his ignorance with dimensionless constants. For most sources the effective stack height is considerably larger than the actual source height, h_s. For instance, for power plants with no downwash problems, h_e is more than twice h_s whenever the wind is less than 10 m/sec, which is most of the time. This is unfortunate for anyone who has to predict ground concentrations, for he is likely to have to calculate the plume rise, Δh. Using h_e = h_s + Δh instead of h_s may reduce chi_max by a factor of anywhere from 4 to infinity. Factors to be considered in making plume rise predictions are discussed
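
The quoted scaling makes the impact of plume rise easy to quantify. A minimal sketch assuming the chi_max ~ h_e^-2 dependence discussed above (function name and numbers are illustrative):

```python
def chi_max_ratio(h_s, delta_h, exponent=2.0):
    """Ratio of chi_max computed with effective height h_e = h_s + delta_h
    to chi_max computed with the physical stack height h_s alone,
    for the scaling chi_max ~ h_e**(-exponent)."""
    h_e = h_s + delta_h
    return (h_e / h_s) ** (-exponent)

# A plume rise equal to the stack height (h_e = 2 * h_s) cuts the
# predicted maximum ground concentration by a factor of 4.
reduction_factor = 1 / chi_max_ratio(100.0, 100.0)
```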

  18. Predictive coarse-graining

    Energy Technology Data Exchange (ETDEWEB)

    Schöberl, Markus, E-mail: m.schoeberl@tum.de [Continuum Mechanics Group, Technical University of Munich, Boltzmannstraße 15, 85748 Garching (Germany); Zabaras, Nicholas [Institute for Advanced Study, Technical University of Munich, Lichtenbergstraße 2a, 85748 Garching (Germany); Department of Aerospace and Mechanical Engineering, University of Notre Dame, 365 Fitzpatrick Hall, Notre Dame, IN 46556 (United States); Koutsourelakis, Phaedon-Stelios [Continuum Mechanics Group, Technical University of Munich, Boltzmannstraße 15, 85748 Garching (Germany)

    2017-03-15

    We propose a data-driven, coarse-graining formulation in the context of equilibrium statistical mechanics. In contrast to existing techniques which are based on a fine-to-coarse map, we adopt the opposite strategy by prescribing a probabilistic coarse-to-fine map. This corresponds to a directed probabilistic model where the coarse variables play the role of latent generators of the fine scale (all-atom) data. From an information-theoretic perspective, the framework proposed provides an improvement upon the relative entropy method and is capable of quantifying the uncertainty due to the information loss that unavoidably takes place during the coarse-graining process. Furthermore, it can be readily extended to a fully Bayesian model where various sources of uncertainties are reflected in the posterior of the model parameters. The latter can be used to produce not only point estimates of fine-scale reconstructions or macroscopic observables, but more importantly, predictive posterior distributions on these quantities. Predictive posterior distributions reflect the confidence of the model as a function of the amount of data and the level of coarse-graining. The issues of model complexity and model selection are seamlessly addressed by employing a hierarchical prior that favors the discovery of sparse solutions, revealing the most prominent features in the coarse-grained model. A flexible and parallelizable Monte Carlo – Expectation–Maximization (MC-EM) scheme is proposed for carrying out inference and learning tasks. A comparative assessment of the proposed methodology is presented for a lattice spin system and the SPC/E water model.

  19. Data-Based Predictive Control with Multirate Prediction Step

    Science.gov (United States)

    Barlow, Jonathan S.

    2010-01-01

    Data-based predictive control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. One challenge of MPC is that its computational requirements increase with the length of the prediction horizon. This paper develops a closed-loop dynamic output feedback controller that minimizes a multi-step-ahead receding-horizon cost function with multirate prediction step. One result is a reduced influence of prediction horizon and the number of system outputs on the computational requirements of the controller. Another result is an emphasis on portions of the prediction window that are sampled more frequently. A third result is the ability to include more outputs in the feedback path than in the cost function.

  20. Earthquake prediction with electromagnetic phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp [Hayakawa Institute of Seismo Electomagnetics, Co. Ltd., University of Electro-Communications (UEC) Incubation Center, 1-5-1 Chofugaoka, Chofu Tokyo, 182-8585 (Japan); Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo (Japan); Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062 (Japan); Fuji Security Systems. Co. Ltd., Iwato-cho 1, Shinjyuku-ku, Tokyo (Japan)

    2016-02-01

    Short-term earthquake (EQ) prediction is defined as prospective prediction on a time scale of about one week, which is considered one of the most important and urgent topics for human beings. If such short-term prediction is realized, casualties would be drastically reduced. Unlike conventional seismic measurement, we proposed the use of electromagnetic phenomena as precursors to EQs in prediction, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews short-term EQ prediction, including the myth of the impossibility of EQ prediction by seismometers, the reason why we are interested in electromagnetics, the history of seismo-electromagnetics, the ionospheric perturbation as the most promising candidate for EQ prediction, then the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

  1. Performance Prediction Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-25

    The Performance Prediction Toolkit (PPT) is a scalable co-design tool that contains hardware and middleware models, which accept proxy applications as input for runtime prediction. PPT relies on Simian, a parallel discrete event simulation engine in Python or Lua, that uses the process concept, where each computing unit (host, node, core) is a Simian entity. Processes perform their task through message exchanges to remain active, sleep, wake up, begin and end. The PPT hardware model of a compute core (such as a Haswell core) consists of a set of parameters, such as clock speed, memory hierarchy levels, their respective sizes, cache-lines, access times for different cache levels, average cycle counts of ALU operations, etc. These parameters are ideally read off a spec sheet or are learned using regression models trained on hardware counter (PAPI) data. The compute core model offers an API to the software model, a function called time_compute(), which takes as input a tasklist. A tasklist is an unordered set of ALU and other CPU-type operations (in particular virtual memory loads and stores). The PPT application model mimics the loop structure of the application and replaces the computational kernels with a call to the hardware model's time_compute() function, giving tasklists as input that model the compute kernel. A PPT application model thus consists of tasklists representing kernels and the higher-level loop structure that we like to think of as pseudo code. The key challenge for the hardware model's time_compute() function is to translate virtual memory accesses into actual cache hierarchy level hits and misses. PPT also contains another CPU core level hardware model, the Analytical Memory Model (AMM). The AMM solves this challenge soundly, whereas our previous alternatives explicitly included the L1, L2, L3 hit-rates as inputs to the tasklists. Explicit hit-rates inevitably only reflect the application modeler's best guess, perhaps informed by a few
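
The tasklist/time_compute() interaction described above can be sketched as follows. This is a toy illustration of the idea, not PPT's actual API; the class, parameter names, and cycle costs are all invented for the example:

```python
# Toy sketch of a PPT-style hardware model: a core with per-operation
# cycle costs exposes time_compute(), which prices an unordered tasklist.
CYCLES = {"iALU": 1.0, "fALU": 2.0, "L1_load": 4.0, "L2_load": 12.0}

class CoreModel:
    def __init__(self, clock_hz=2.5e9, cycles=CYCLES):
        self.clock_hz = clock_hz
        self.cycles = cycles

    def time_compute(self, tasklist):
        """tasklist: dict mapping operation name -> count.
        Returns the predicted execution time in seconds."""
        total_cycles = sum(self.cycles[op] * n for op, n in tasklist.items())
        return total_cycles / self.clock_hz

# An application model would call time_compute() in place of the real kernel.
core = CoreModel()
kernel_tasklist = {"fALU": 1_000_000, "L1_load": 500_000}
predicted_time = core.time_compute(kernel_tasklist)
```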

  2. Introduction: Long term prediction

    International Nuclear Information System (INIS)

    Beranger, G.

    2003-01-01

    Making a decision upon the right choice of a material appropriate to a given application should be based on taking into account several parameters, as follows: cost, standards, regulations, safety, recycling, chemical properties, supply, transformation, forming, assembly, mechanical and physical properties, as well as the behaviour in practical conditions. Data taken from a private communication (J.H. Davidson) are reproduced presenting the lifetime range of materials from a couple of minutes to half a million hours, corresponding to applications from missile technology up to high-temperature nuclear reactors or steam turbines. In the case of deep storage of nuclear waste the time required is completely different from these values, since we have to ensure the integrity of the storage system for several thousand years. The vitrified nuclear wastes should be stored in metallic canisters made of iron and carbon steels, stainless steels, copper and copper alloys, nickel alloys or titanium alloys. Some of these materials are passivating metals, i.e. they develop a thin protective film, 2 or 3 nm thick - the so-called passive films. These films prevent general corrosion of the metal over a large range of chemical conditions of the environment. Under some specific conditions, localized corrosion, such as pitting, occurs. Consequently, it is absolutely necessary to determine these chemical conditions and their stability in time to understand the behaviour of a given material. In other words, the corrosion system is constituted by the complex material/surface/medium. For high level nuclear wastes the main features for resolving the problem are concerned with: geological disposal; deep storage in clay; waste metallic canisters; backfill mixture (clay-gypsum) or concrete; long term behaviour; data needed for modelling and for predicting; and the choice of an appropriate solution among several metallic candidates. The analysis of the complex material/surface/medium is of great importance.

  3. Predictability of blocking

    International Nuclear Information System (INIS)

    Tosi, E.; Ruti, P.; Tibaldi, S.; D'Andrea, F.

    1994-01-01

    Tibaldi and Molteni (1990, hereafter referred to as TM) had previously investigated operational blocking predictability by the ECMWF model and the possible relationships between model systematic error and blocking in the winter season of the Northern Hemisphere, using seven years of ECMWF operational archives of analyses and day 1 to 10 forecasts. They showed that fewer blocking episodes than in the real atmosphere were generally simulated by the model, and that this deficiency increased with increasing forecast time. As a consequence of this, a major contribution to the systematic error in the winter season was shown to derive from the inability of the model to properly forecast blocking. In this study, the analysis performed in TM for the first seven winter seasons of the ECMWF operational model is extended to the subsequent five winters, during which model development, reflecting both resolution increases and parametrisation modifications, continued unabated. In addition the objective blocking index developed by TM has been applied to the observed data to study the natural low frequency variability of blocking. The ability to simulate blocking of some climate models has also been tested

  4. GABA predicts visual intelligence.

    Science.gov (United States)

    Cook, Emily; Hammett, Stephen T; Larsson, Jonas

    2016-10-06

    Early psychological researchers proposed a link between intelligence and low-level perceptual performance. It was recently suggested that this link is driven by individual variations in the ability to suppress irrelevant information, evidenced by the observation of strong correlations between perceptual surround suppression and cognitive performance. However, the neural mechanisms underlying such a link remain unclear. A candidate mechanism is neural inhibition by gamma-aminobutyric acid (GABA), but direct experimental support for GABA-mediated inhibition underlying suppression is inconsistent. Here we report evidence consistent with a global suppressive mechanism involving GABA underlying the link between sensory performance and intelligence. We measured visual cortical GABA concentration, visuo-spatial intelligence and visual surround suppression in a group of healthy adults. Levels of GABA were strongly predictive of both intelligence and surround suppression, with higher levels of intelligence associated with higher levels of GABA and stronger surround suppression. These results indicate that GABA-mediated neural inhibition may be a key factor determining cognitive performance and suggest a physiological mechanism linking surround suppression and intelligence. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Predictability in cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Chira, Camelia; Giuclea, Marius

    2014-01-01

    Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one also needs a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on asynchronous cellular automata have shown that such a feature really exists. It is the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for the 1-D, 2-D and 3-D asynchronous automata with neighborhood three, five and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighborhood three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case.
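
The intrinsic feature named above, the number of zero-one borders in the binary configuration, is simple to compute. A sketch, with the exponential ansatz written as an illustrative weight function (the constant c is a placeholder, not a value from the paper):

```python
def zero_one_borders(config, periodic=True):
    """Count adjacent unequal pairs (0-1 or 1-0) in a binary configuration."""
    n = len(config)
    pairs = range(n) if periodic else range(n - 1)
    return sum(config[i] != config[(i + 1) % n] for i in pairs)

def stationary_weight(config, c):
    """Illustrative exponential ansatz: weight proportional to
    c**borders, with c a model-dependent constant (placeholder)."""
    return c ** zero_one_borders(config)
```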

  6. Predictive Manufacturing: A Classification Strategy to Predict Product Failures

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Kulahci, Murat

    2018-01-01

    manufacturing analytics model that employs a big data approach to predicting product failures; third, we illustrate the issue of high dimensionality, along with statistically redundant information; and, finally, our proposed method will be compared against the well-known classification methods (SVM, K......-nearest neighbor, artificial neural networks). The results from real data show that our predictive manufacturing analytics approach, using genetic algorithms and Voronoi tessellations, is capable of predicting product failure with reasonable accuracy. The potential application of this method contributes...... to accurately predicting product failures, which would enable manufacturers to reduce production costs without compromising product quality....

  7. Predicting Near-Term Water Quality from Satellite Observations of Watershed Conditions

    Science.gov (United States)

    Weiss, W. J.; Wang, L.; Hoffman, K.; West, D.; Mehta, A. V.; Lee, C.

    2017-12-01

    Despite the strong influence of watershed conditions on source water quality, most water utilities and water resource agencies do not currently have the capability to monitor watershed sources of contamination with great temporal or spatial detail. Typically, knowledge of source water quality is limited to periodic grab sampling; automated monitoring of a limited number of parameters at a few select locations; and/or monitoring relevant constituents at a treatment plant intake. While important, such observations are not sufficient to inform proactive watershed or source water management at a monthly or seasonal scale. Satellite remote sensing data on the other hand can provide a snapshot of an entire watershed at regular, sub-monthly intervals, helping analysts characterize watershed conditions and identify trends that could signal changes in source water quality. Accordingly, the authors are investigating correlations between satellite remote sensing observations of watersheds and source water quality, at a variety of spatial and temporal scales and lags. While correlations between remote sensing observations and direct in situ measurements of water quality have been well described in the literature, there are few studies that link remote sensing observations across a watershed with near-term predictions of water quality. In this presentation, the authors will describe results of statistical analyses and discuss how these results are being used to inform development of a desktop decision support tool to support predictive application of remote sensing data. Predictor variables under evaluation include parameters that describe vegetative conditions; parameters that describe climate/weather conditions; and non-remote sensing, in situ measurements. Water quality parameters under investigation include nitrogen, phosphorus, organic carbon, chlorophyll-a, and turbidity.

  8. House Price Prediction Using LSTM

    OpenAIRE

    Chen, Xiaochen; Wei, Lai; Xu, Jiaxin

    2017-01-01

    In this paper, we use the house price data ranging from January 2004 to October 2016 to predict the average house price of November and December in 2016 for each district in Beijing, Shanghai, Guangzhou and Shenzhen. We apply Autoregressive Integrated Moving Average model to generate the baseline while LSTM networks to build prediction model. These algorithms are compared in terms of Mean Squared Error. The result shows that the LSTM model has excellent properties with respect to predict time...
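
Comparing forecasting models by Mean Squared Error, as done above, reduces to a one-line computation. A minimal sketch with made-up numbers standing in for the actual price series:

```python
def mse(y_true, y_pred):
    """Mean squared error between observed and predicted prices."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

actual   = [102.0, 105.0]   # e.g. observed Nov/Dec average prices (illustrative)
baseline = [100.0, 100.0]   # ARIMA-style baseline forecast (illustrative)
model    = [101.5, 104.0]   # LSTM forecast (illustrative)

# The model with the lower MSE is preferred under this criterion.
model_is_better = mse(actual, model) < mse(actual, baseline)
```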

  9. Long Range Aircraft Trajectory Prediction

    OpenAIRE

    Magister, Tone

    2009-01-01

    The subject of the paper is the improvement of aircraft future trajectory prediction accuracy for long-range airborne separation assurance. The strategic planning of safe aircraft flights and effective conflict avoidance tactics demand timely and accurate conflict detection based upon future four-dimensional airborne traffic situation prediction, which is only as accurate as each aircraft flight trajectory prediction. The improved kinematics model of aircraft relative flight considering flight ...

  10. Review of Nearshore Morphologic Prediction

    Science.gov (United States)

    Plant, N. G.; Dalyander, S.; Long, J.

    2014-12-01

    The evolution of the world's erodible coastlines will determine the balance between the benefits and costs associated with human and ecological utilization of shores, beaches, dunes, barrier islands, wetlands, and estuaries. So, we would like to predict coastal evolution to guide management and planning of human and ecological response to coastal changes. After decades of research investment in data collection, theoretical and statistical analysis, and model development, we have a number of empirical, statistical, and deterministic models that can predict the evolution of the shoreline, beaches, dunes, and wetlands over time scales of hours to decades, and even predict the evolution of geologic strata over the course of millennia. Comparisons of predictions to data have demonstrated that these models can have meaningful predictive skill. But these comparisons also highlight the deficiencies in fundamental understanding, formulations, or data that are responsible for prediction errors and uncertainty. Here, we review a subset of predictive models of the nearshore to illustrate tradeoffs in complexity, predictive skill, and sensitivity to input data and parameterization errors. We identify where future improvement in prediction skill will result from improved theoretical understanding, data collection, and model-data assimilation.

  11. PREDICTED PERCENTAGE DISSATISFIED (PPD) MODEL ...

    African Journals Online (AJOL)


    their low power requirements, are relatively cheap and are environment friendly. ... PREDICTED PERCENTAGE DISSATISFIED MODEL EVALUATION OF EVAPORATIVE COOLING ... The performance of direct evaporative coolers is a.

  12. Model Prediction Control For Water Management Using Adaptive Prediction Accuracy

    NARCIS (Netherlands)

    Tian, X.; Negenborn, R.R.; Van Overloop, P.J.A.T.M.; Mostert, E.

    2014-01-01

    In the field of operational water management, Model Predictive Control (MPC) has gained popularity owing to its versatility and flexibility. The MPC controller, which takes predictions, time delay and uncertainties into account, can be designed for multi-objective management problems and for

  13. Predictability and Prediction for an Experimental Cultural Market

    Science.gov (United States)

    Colbaugh, Richard; Glass, Kristin; Ormerod, Paul

    Individuals are often influenced by the behavior of others, for instance because they wish to obtain the benefits of coordinated actions or infer otherwise inaccessible information. In such situations this social influence decreases the ex ante predictability of the ensuing social dynamics. We claim that, interestingly, these same social forces can increase the extent to which the outcome of a social process can be predicted very early in the process. This paper explores this claim through a theoretical and empirical analysis of the experimental music market described and analyzed in [1]. We propose a very simple model for this music market, assess the predictability of market outcomes through formal analysis of the model, and use insights derived through this analysis to develop algorithms for predicting market share winners, and their ultimate market shares, in the very early stages of the market. The utility of these predictive algorithms is illustrated through analysis of the experimental music market data sets [2].

  14. Predicting epileptic seizures in advance.

    Directory of Open Access Journals (Sweden)

    Negin Moghim

    Epilepsy is the second most common neurological disorder, affecting 0.6-0.8% of the world's population. In this neurological disorder, abnormal activity of the brain causes seizures, the nature of which tend to be sudden. Antiepileptic Drugs (AEDs) are used as long-term therapeutic solutions that control the condition. Of those treated with AEDs, 35% become resistant to medication. The unpredictable nature of seizures poses risks for the individual with epilepsy. It is clearly desirable to find more effective ways of preventing seizures for such patients. The automatic detection of oncoming seizures, before their actual onset, can facilitate timely intervention and hence minimize these risks. In addition, advance prediction of seizures can enrich our understanding of the epileptic brain. In this study, drawing on the body of work behind automatic seizure detection and prediction from digitised Invasive Electroencephalography (EEG) data, a prediction algorithm, ASPPR (Advance Seizure Prediction via Pre-ictal Relabeling), is described. ASPPR facilitates the learning of predictive models targeted at recognizing patterns in EEG activity that are in a specific time window in advance of a seizure. It then exploits advanced machine learning coupled with the design and selection of appropriate features from EEG signals. Results, from evaluating ASPPR independently on 21 different patients, suggest that seizures for many patients can be predicted up to 20 minutes in advance of their onset. Compared to benchmark performance represented by a mean S1-Score (harmonic mean of Sensitivity and Specificity) of 90.6% for predicting seizure onset between 0 and 5 minutes in advance, ASPPR achieves mean S1-Scores of: 96.30% for prediction between 1 and 6 minutes in advance, 96.13% for prediction between 8 and 13 minutes in advance, 94.5% for prediction between 14 and 19 minutes in advance, and 94.2% for prediction between 20 and 25 minutes in advance.
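
The S1-Score used above as the evaluation metric is the harmonic mean of sensitivity and specificity and can be computed directly. A minimal sketch (the input values are illustrative):

```python
def s1_score(sensitivity, specificity):
    """Harmonic mean of sensitivity and specificity, both in [0, 1]."""
    if sensitivity + specificity == 0:
        return 0.0
    return 2 * sensitivity * specificity / (sensitivity + specificity)

# The harmonic mean is pulled toward the weaker of the two rates,
# so a predictor cannot score well by inflating only one of them.
s1 = s1_score(0.95, 0.93)
```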

  15. Validation of the Air Force Weather Agency Ensemble Prediction Systems

    Science.gov (United States)

    2014-03-27

    by Mr. Evan L. Kuchera. Also, I would like to express my gratitude to Mr. Jeff H. Zautner for painstakingly working with me to provide station...my fellow AFIT classmates, Capt Jeremy J. Hromsco, Capt Haley A. Homan, Capt Kyle R. Thurmond and 2Lt Coy C. Fischer for their support and...Codes. The raw METARs and SPECIs were decoded and provided for this research by Mr. Jeff Zautner, 14/WS Meteorologist, Tailored Product Analyst

  16. Quadratic prediction of factor scores

    NARCIS (Netherlands)

    Wansbeek, T

    1999-01-01

    Factor scores are naturally predicted by means of their conditional expectation given the indicators y. Under normality this expectation is linear in y, but in general it is an unknown function of y. It is discussed that under nonnormality factor scores can be more precisely predicted by a quadratic

  17. Predictions for Excited Strange Baryons

    Energy Technology Data Exchange (ETDEWEB)

    Fernando, Ishara P.; Goity, Jose L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-04-01

    An assessment is made of predictions for excited hyperon masses which follow from flavor symmetry and consistency with a 1/N_c expansion of QCD. Such predictions are based on presently established baryonic resonances. Low-lying hyperon resonances which do not seem to fit into the proposed scheme are discussed.

  18. Climate Prediction Center - Seasonal Outlook

    Science.gov (United States)

    PROGNOSTIC DISCUSSION FOR MONTHLY OUTLOOK NWS CLIMATE PREDICTION CENTER COLLEGE PARK MD INFLUENCE ON THE MONTHLY-AVERAGED CLIMATE. OUR MID-MONTH ASSESSMENT OF LOW-FREQUENCY CLIMATE VARIABILITY IS

  19. Dividend Predictability Around the World

    DEFF Research Database (Denmark)

    Rangvid, Jesper; Schmeling, Maik; Schrimpf, Andreas

    2014-01-01

    We show that dividend-growth predictability by the dividend yield is the rule rather than the exception in global equity markets. Dividend predictability is weaker, however, in large and developed markets where dividends are smoothed more, the typical firm is large, and volatility is lower. Our f...

  20. Dividend Predictability Around the World

    DEFF Research Database (Denmark)

    Rangvid, Jesper; Schmeling, Maik; Schrimpf, Andreas

    We show that dividend growth predictability by the dividend yield is the rule rather than the exception in global equity markets. Dividend predictability is weaker, however, in large and developed markets where dividends are smoothed more, the typical firm is large, and volatility is lower. Our f...

  1. Decadal climate prediction (project GCEP).

    Science.gov (United States)

    Haines, Keith; Hermanson, Leon; Liu, Chunlei; Putt, Debbie; Sutton, Rowan; Iwi, Alan; Smith, Doug

    2009-03-13

    Decadal prediction uses climate models forced by changing greenhouse gases, as in the Intergovernmental Panel on Climate Change assessments, but unlike longer range predictions they also require initialization with observations of the current climate. In particular, the upper-ocean heat content and circulation have a critical influence. Decadal prediction is still in its infancy and there is an urgent need to understand the important processes that determine predictability on these timescales. We have taken the first Hadley Centre Decadal Prediction System (DePreSys) and implemented it on several NERC institute compute clusters in order to study a wider range of initial condition impacts on decadal forecasting, eventually including the state of the land and cryosphere. The eScience methods are used to manage submission and output from the many ensemble model runs required to assess predictive skill. Early results suggest initial condition skill may extend for several years, even over land areas, but this depends sensitively on the definition used to measure skill, and alternatives are presented. The Grid for Coupled Ensemble Prediction (GCEP) system will allow the UK academic community to contribute to international experiments being planned to explore decadal climate predictability.
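
As the abstract notes, measured skill depends sensitively on the definition used. One common definition is an MSE-based skill score relative to a reference forecast such as climatology; a minimal sketch (function names are illustrative, not from the DePreSys/GCEP code):

```python
def mse(pred, obs):
    """Mean squared error of a forecast against observations."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def skill_score(forecast, reference, observed):
    """MSE skill score relative to a reference forecast:
    1 is perfect, 0 matches the reference, negative is worse than it."""
    return 1.0 - mse(forecast, observed) / mse(reference, observed)
```

Changing the reference (climatology, persistence, an uninitialized run) changes the measured skill, which is one way the same forecasts can look skilful under one definition and not another.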

  2. Prediction during natural language comprehension

    NARCIS (Netherlands)

    Willems, R.M.; Frank, S.L.; Nijhof, A.D.; Hagoort, P.; Bosch, A.P.J. van den

    2016-01-01

    The notion of prediction is studied in cognitive neuroscience with increasing intensity. We investigated the neural basis of 2 distinct aspects of word prediction, derived from information theory, during story comprehension. We assessed the effect of entropy of next-word probability distributions as

  3. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    Science.gov (United States)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

    Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated over a four-year period. Within the 50 ensemble members, which are initialized every 12 hours and are run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to 3 days lead time. At longer lead times the occurrence probabilities are overestimated by the EPS.
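
The occurrence-probability estimate described above rests on fitting a bivariate normal distribution to storm track positions. A self-contained sketch of that fit and the resulting density (a simplification of the paper's procedure; names are illustrative):

```python
import math

def fit_bivariate_normal(points):
    """Maximum-likelihood mean and covariance of 2-D track positions.
    points: list of (x, y). Returns (mx, my, sxx, syy, sxy)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    return mx, my, sxx, syy, sxy

def density(x, y, mx, my, sxx, syy, sxy):
    """Bivariate normal probability density at (x, y)."""
    det = sxx * syy - sxy ** 2
    dx, dy = x - mx, y - my
    q = (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
    return math.exp(-0.5 * q) / (2 * math.pi * math.sqrt(det))
```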

  4. Psychometric prediction of penitentiary recidivism.

    Science.gov (United States)

    Medina García, Pedro M; Baños Rivera, Rosa M

    2016-05-01

    Attempts to predict prison recidivism based on personality have not been very successful. This study aims to provide data on recidivism prediction based on the scores on a personality questionnaire. For this purpose, a predictive model combining the actuarial procedure with a posteriori probability was developed, consisting of the probabilistic calculation of the effective verification of the event once it has already occurred. The Cuestionario de Personalidad Situacional (CPS; Fernández, Seisdedos, & Mielgo, 1998) was applied to 978 male inmates classified as recidivists or non-recidivists. High predictive power was achieved, with an area under the curve (AUC) of 0.85 (p < .001; SE = 0.012; 95% CI [0.826, 0.873]). The answers to the CPS items made it possible to properly discriminate 77.3% of the participants. These data indicate the important role of the personality as a key factor in understanding delinquency and predicting recidivism.
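
The reported AUC of 0.85 can be interpreted through its Mann-Whitney form: the probability that a randomly chosen recidivist scores above a randomly chosen non-recidivist, with ties counted as one half. A minimal sketch of that computation (not the authors' actuarial procedure):

```python
def auc(pos_scores, neg_scores):
    """Mann-Whitney form of the AUC: fraction of (positive, negative)
    score pairs where the positive ranks higher, ties counting 0.5."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores
               for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))
```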

  5. Predictive Biomarkers for Asthma Therapy.

    Science.gov (United States)

    Medrek, Sarah K; Parulekar, Amit D; Hanania, Nicola A

    2017-09-19

    Asthma is a heterogeneous disease characterized by multiple phenotypes. Treatment of patients with severe disease can be challenging. Predictive biomarkers are measurable characteristics that reflect the underlying pathophysiology of asthma and can identify patients that are likely to respond to a given therapy. This review discusses current knowledge regarding predictive biomarkers in asthma. Recent trials evaluating biologic therapies targeting IgE, IL-5, IL-13, and IL-4 have utilized predictive biomarkers to identify patients who might benefit from treatment. Other work has suggested that using composite biomarkers may offer enhanced predictive capabilities in tailoring asthma therapy. Multiple biomarkers including sputum eosinophil count, blood eosinophil count, fractional concentration of nitric oxide in exhaled breath (FeNO), and serum periostin have been used to identify which patients will respond to targeted asthma medications. Further work is needed to integrate predictive biomarkers into clinical practice.

  6. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  7. Are abrupt climate changes predictable?

    Science.gov (United States)

    Ditlevsen, Peter

    2013-04-01

    It is taken for granted that the limited predictability in the initial value problem, the weather prediction, and the predictability of the statistics are two distinct problems. Lorenz (1975) dubbed these predictability of the first and the second kind, respectively. Predictability of the first kind in a chaotic dynamical system is limited due to the well-known critical dependence on initial conditions. Predictability of the second kind is possible in an ergodic system, where either the dynamics is known and the phase space attractor can be characterized by simulation, or the system can be observed for such long times that the statistics can be obtained from temporal averaging, assuming that the attractor does not change in time. For the climate system the distinction between predictability of the first and the second kind is fuzzy. This difficulty in distinguishing between predictability of the first and of the second kind is related to the lack of scale separation between fast and slow components of the climate system. The non-linear nature of the problem furthermore opens the possibility of multiple attractors, or multiple quasi-steady states. As the ice-core records show, the climate has been jumping between different quasi-stationary climates, stadials and interstadials, through the Dansgaard-Oeschger events. Such a jump happens very fast when a critical tipping point has been reached. The question is: can such a tipping point be predicted? This is a new kind of predictability: the third kind. If the tipping point is reached through a bifurcation, where the stability of the system is governed by some control parameter changing in a predictable way to a critical value, the tipping is predictable. If the sudden jump occurs because internal chaotic fluctuations (noise) push the system across a barrier, the tipping is as unpredictable as the triggering noise.
In order to hint at an answer to this question, a careful analysis of the high temporal resolution NGRIP isotope
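One common way to probe whether a bifurcation-type tipping point (the predictable case above) is approaching is to track sliding-window variance and lag-1 autocorrelation, both of which rise as the system loses stability. The sketch below is a generic illustration of these early-warning indicators, not the NGRIP analysis itself:

```python
import numpy as np

def early_warning_indicators(x, window):
    """Sliding-window variance and lag-1 autocorrelation, the standard
    early-warning indicators for a bifurcation-type tipping point."""
    x = np.asarray(x, dtype=float)
    var, ac1 = [], []
    for i in range(len(x) - window + 1):
        w = x[i:i + window]
        w = w - w.mean()                 # de-mean each window
        var.append(w.var())
        denom = (w[:-1] ** 2).sum()
        ac1.append((w[:-1] * w[1:]).sum() / denom if denom > 0 else 0.0)
    return np.array(var), np.array(ac1)

# Toy alternating series: variance 0.25, strongly anti-correlated lag-1.
var, ac1 = early_warning_indicators([0.0, 1.0, 0.0, 1.0, 0.0, 1.0], window=4)
```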

  8. Emerging approaches in predictive toxicology.

    Science.gov (United States)

    Zhang, Luoping; McHale, Cliona M; Greene, Nigel; Snyder, Ronald D; Rich, Ivan N; Aardema, Marilyn J; Roy, Shambhu; Pfuhler, Stefan; Venkatachalam, Sundaresan

    2014-12-01

    Predictive toxicology plays an important role in the assessment of toxicity of chemicals and the drug development process. While there are several well-established in vitro and in vivo assays that are suitable for predictive toxicology, recent advances in high-throughput analytical technologies and model systems are expected to have a major impact on the field of predictive toxicology. This commentary provides an overview of the state of the current science and a brief discussion on future perspectives for the field of predictive toxicology for human toxicity. Computational models for predictive toxicology, needs for further refinement and obstacles to expand computational models to include additional classes of chemical compounds are highlighted. Functional and comparative genomics approaches in predictive toxicology are discussed with an emphasis on successful utilization of recently developed model systems for high-throughput analysis. The advantages of three-dimensional model systems and stem cells and their use in predictive toxicology testing are also described. © 2014 Wiley Periodicals, Inc.

  9. Earthquake prediction by Kiana Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of the earliest desires of man, and scientists have long worked to achieve it. The results of these efforts can generally be divided into two methods of prediction: 1) statistical methods, and 2) empirical methods. In the first, earthquakes are predicted using statistics and probabilities, while the second utilizes a variety of precursors for earthquake prediction; the latter is time consuming and more costly. However, neither method has yet given fully satisfactory results. In this paper a new method, entitled the 'Kiana Method', is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method the electrical and magnetic precursors are measured in an area. Then the time and magnitude of a future earthquake are calculated using electrical formulas, in particular those for electrical capacitors. In this method, daily measurement of electrical resistance in an area establishes whether or not the area is liable to earthquake occurrence in the future. If the result is positive, the occurrence time and magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  10. Collective motion of predictive swarms.

    Directory of Open Access Journals (Sweden)

    Nathaniel Rupprecht

    Theoretical models of populations and swarms typically start with the assumption that the motion of agents is governed by local stimuli. However, an intelligent agent, with some understanding of the laws that govern its habitat, can anticipate the future and make predictions to gather resources more efficiently. Here we study a specific model of this kind, where agents aim to maximize their consumption of a diffusing resource by attempting to predict the future of the resource field and the actions of other agents. Once the agents make a prediction, they are attracted to move towards regions that have, and will have, denser resources. We find that the further the agents attempt to see into the future, the more their attempts at prediction fail, and the fewer resources they consume. We also study the case where predictive agents compete against non-predictive agents and find that the predictors perform better than the non-predictors only when their relative numbers are very small. We conclude that predictivity pays off either when the predictors do not see too far into the future or when the number of predictors is small.
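A minimal version of such a predictive agent can be sketched as follows: the resource field is rolled forward with a discrete diffusion update, and the agent steps toward the richer predicted neighbour. The 1-D periodic grid, diffusion constant, and greedy move rule are simplifying assumptions, not the paper's actual model:

```python
import numpy as np

def predict_field(field, tau, d=0.2):
    """Roll a 1-D periodic resource field forward tau steps with an
    explicit diffusion update (assumed stand-in for the paper's field)."""
    f = np.asarray(field, dtype=float).copy()
    for _ in range(tau):
        f = f + d * (np.roll(f, 1) - 2.0 * f + np.roll(f, -1))
    return f

def greedy_step(pos, field, tau):
    """Move one cell toward the richer *predicted* neighbour."""
    f = predict_field(field, tau)
    n = len(f)
    left, right = f[(pos - 1) % n], f[(pos + 1) % n]
    return (pos + 1) % n if right >= left else (pos - 1) % n

field = np.zeros(10)
field[5] = 1.0                       # single resource peak
new_pos = greedy_step(2, field, tau=2)
```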

  11. Benchmarking of RESRAD-OFFSITE : transition from RESRAD (onsite) to RESRAD-OFFSITE and comparison of the RESRAD-OFFSITE predictions with peercodes

    International Nuclear Information System (INIS)

    Yu, C.; Gnanapragasam, E.; Cheng, J.-J.; Biwer, B.

    2006-01-01

    The main purpose of this report is to document the benchmarking results and verification of the RESRAD-OFFSITE code as part of the quality assurance requirements of the RESRAD development program. This documentation will enable the U.S. Department of Energy (DOE) and its contractors, and the U.S. Nuclear Regulatory Commission (NRC) and its licensees and other stakeholders to use the quality-assured version of the code to perform dose analysis in a risk-informed and technically defensible manner to demonstrate compliance with the NRC's License Termination Rule, Title 10, Part 20, Subpart E, of the Code of Federal Regulations (10 CFR Part 20, Subpart E); DOE's 10 CFR Part 834, Order 5400.5, "Radiation Protection of the Public and the Environment"; and other Federal and State regulatory requirements as appropriate. The other purpose of this report is to document the differences and similarities between the RESRAD (onsite) and RESRAD-OFFSITE codes so that users (dose analysts and risk assessors) can make a smooth transition from use of the RESRAD (onsite) code to use of the RESRAD-OFFSITE code for performing both onsite and offsite dose analyses. The evolution of the RESRAD-OFFSITE code from the RESRAD (onsite) code is described in Chapter 1 to help the dose analyst and risk assessor make a smooth conceptual transition from the use of one code to that of the other. Chapter 2 provides a comparison of the predictions of RESRAD (onsite) and RESRAD-OFFSITE for an onsite exposure scenario. Chapter 3 documents the results of benchmarking RESRAD-OFFSITE's atmospheric transport and dispersion submodel against the U.S. Environmental Protection Agency's (EPA's) CAP88-PC (Clean Air Act Assessment Package-1988) and ISCLT3 (Industrial Source Complex-Long Term) models. Chapter 4 documents the comparison results of the predictions of the RESRAD-OFFSITE code and its submodels with the predictions of peer models. This report was prepared by Argonne National Laboratory's (Argonne

  12. Dividend Predictability Around the World

    DEFF Research Database (Denmark)

    Rangvid, Jesper; Schrimpf, Andreas

    The common perception in the literature, mainly based on U.S. data, is that current dividend yields are uninformative about future dividends. We show that this finding changes substantially when looking at a broad international panel of countries, as aggregate dividend growth rates are found...... that in countries where the quality of institutions is high, dividend predictability is weaker. These findings indicate that the apparent lack of dividend predictability in the U.S. does not, in general, extend to other countries. Rather, dividend predictability is driven by cross-country differences in firm...

  13. The Theory of Linear Prediction

    CERN Document Server

    Vaidyanathan, PP

    2007-01-01

    Linear prediction theory has had a profound impact in the field of digital signal processing. Although the theory dates back to the early 1940s, its influence can still be seen in applications today. The theory is based on very elegant mathematics and leads to many beautiful insights into statistical signal processing. Although prediction is only a part of the more general topics of linear estimation, filtering, and smoothing, this book focuses on linear prediction. This has enabled detailed discussion of a number of issues that are normally not found in texts. For example, the theory of vecto

  14. Practical aspects of geological prediction

    International Nuclear Information System (INIS)

    Mallio, W.J.; Peck, J.H.

    1981-01-01

    Nuclear waste disposal requires that geology be a predictive science. The prediction of future events rests on (1) recognizing the periodicity of geologic events; (2) defining a critical dimension of effect, such as the area of a drainage basin, the length of a fault trace, etc.; and (3) using our understanding of active processes to project the frequency and magnitude of future events in the light of geological principles. Of importance to nuclear waste disposal are longer-term processes such as continental denudation and removal of materials by glacial erosion. Constant testing of projections will allow the practical limits of predicting geological events to be defined. 11 refs

  15. Adaptive filtering prediction and control

    CERN Document Server

    Goodwin, Graham C

    2009-01-01

    Preface1. Introduction to Adaptive TechniquesPart 1. Deterministic Systems2. Models for Deterministic Dynamical Systems3. Parameter Estimation for Deterministic Systems4. Deterministic Adaptive Prediction5. Control of Linear Deterministic Systems6. Adaptive Control of Linear Deterministic SystemsPart 2. Stochastic Systems7. Optimal Filtering and Prediction8. Parameter Estimation for Stochastic Dynamic Systems9. Adaptive Filtering and Prediction10. Control of Stochastic Systems11. Adaptive Control of Stochastic SystemsAppendicesA. A Brief Review of Some Results from Systems TheoryB. A Summary o

  16. Predicting emergency diesel starting performance

    International Nuclear Information System (INIS)

    DeBey, T.M.

    1989-01-01

    The US Department of Energy effort to extend the operational lives of commercial nuclear power plants has examined methods for predicting the performance of specific equipment. This effort focuses on performance prediction as a means for reducing equipment surveillance, maintenance, and outages. Realizing these goals will result in nuclear plants that are more reliable, have lower maintenance costs, and have longer lives. This paper describes a monitoring system that has been developed to predict starting performance in emergency diesels. A prototype system has been built and tested on an engine at Sandia National Laboratories. 2 refs

  17. Predictive Models and Computational Embryology

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  18. Fatigue life prediction in composites

    CSIR Research Space (South Africa)

    Huston, RJ

    1994-01-01

    Full Text Available Because of the relatively large number of possible failure mechanisms in fibre reinforced composite materials, the prediction of fatigue life in a component is not a simple process. Several mathematical and statistical models have been proposed...

  19. Trading network predicts stock price.

    Science.gov (United States)

    Sun, Xiao-Qian; Shen, Hua-Wei; Cheng, Xue-Qi

    2014-01-16

    Stock price prediction is an important and challenging problem for studying financial markets. Existing studies are mainly based on the time series of stock price or the operating performance of the listed company. In this paper, we propose to predict stock price based on investors' trading behavior. For each stock, we characterize the daily trading relationship among its investors using a trading network. We then classify the nodes of the trading network into three roles according to their connectivity pattern. Strong Granger causality is found between stock price and trading relationship indices, i.e., the fraction of trading relationships among nodes with different roles. We further predict stock price by incorporating these trading relationship indices into a neural network based on the time series of stock price. Experimental results on 51 stocks in two Chinese stock exchanges demonstrate that the accuracy of stock price prediction is significantly improved by the inclusion of trading relationship indices.
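The trading-relationship indices (fractions of trades between node roles) might be computed along these lines. The degree-quantile role assignment below is an assumed stand-in for the paper's connectivity-pattern classification, not its actual definition:

```python
import numpy as np

def trading_indices(edges, n_nodes):
    """Assign each node a rough role by degree (0 = peripheral,
    1 = regular, 2 = hub -- an assumed proxy for the paper's roles)
    and return the fraction of trades between each role pair."""
    deg = np.zeros(n_nodes, dtype=int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    hi, lo = np.quantile(deg, [0.9, 0.5])
    role = np.where(deg >= hi, 2, np.where(deg >= lo, 1, 0))
    counts = np.zeros((3, 3))
    for u, v in edges:
        a, b = sorted((role[u], role[v]))   # unordered role pair
        counts[a, b] += 1
    return counts / len(edges)

# Toy network: node 0 is a hub trading with everyone else.
res = trading_indices([(0, 1), (0, 2), (0, 3), (1, 2)], n_nodes=4)
```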

  20. Prediction based on mean subset

    DEFF Research Database (Denmark)

    Øjelund, Henrik; Brown, P. J.; Madsen, Henrik

    2002-01-01

    Shrinkage methods have traditionally been applied in prediction problems. In this article we develop a shrinkage method (mean subset) that forms an average of regression coefficients from individual subsets of the explanatory variables. A Bayesian approach is taken to derive an expression of how...... the coefficient vectors from each subset should be weighted. It is not computationally feasible to calculate the mean subset coefficient vector for larger problems, and thus we suggest an algorithm to find an approximation to the mean subset coefficient vector. In a comprehensive Monte Carlo simulation study......, it is found that the proposed mean subset method has superior prediction performance to prediction based on the best subset method, and in some settings is also better than the ridge regression and lasso methods. The conclusions drawn from the Monte Carlo study are corroborated in an example in which prediction......
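An unweighted toy version of the mean-subset idea, averaging least-squares coefficient vectors over all nonempty variable subsets, can be written as below. The paper's Bayesian subset weights and its approximation algorithm for large problems are omitted here:

```python
import numpy as np
from itertools import combinations

def mean_subset_coefficients(X, y):
    """Equal-weight average of least-squares coefficients over all
    nonempty subsets of columns (simplified, unweighted sketch of the
    mean-subset method; the paper derives Bayesian weights instead)."""
    n, p = X.shape
    beta_sum = np.zeros(p)
    count = 0
    for k in range(1, p + 1):
        for subset in combinations(range(p), k):
            idx = list(subset)
            b, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
            full = np.zeros(p)
            full[idx] = b               # embed subset fit in full space
            beta_sum += full
            count += 1
    return beta_sum / count

# Orthogonal design, y depends only on the first column with slope 2.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
y = 2.0 * X[:, 0]
beta = mean_subset_coefficients(X, y)
```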

  1. EPRI MOV performance prediction program

    International Nuclear Information System (INIS)

    Hosler, J.F.; Damerell, P.S.; Eidson, M.G.; Estep, N.E.

    1994-01-01

    An overview of the EPRI Motor-Operated Valve (MOV) Performance Prediction Program is presented. The objectives of this Program are to better understand the factors affecting the performance of MOVs and to develop and validate methodologies to predict MOV performance. The Program involves valve analytical modeling, separate-effects testing to refine the models, and flow-loop and in-plant MOV testing to provide a basis for model validation. The ultimate product of the Program is an MOV Performance Prediction Methodology applicable to common gate, globe, and butterfly valves. The methodology predicts thrust and torque requirements at design-basis flow and differential pressure conditions, assesses the potential for gate valve internal damage, and provides test methods to quantify potential variations in actuator output thrust with loading condition. Key findings and their potential impact on MOV design and engineering application are summarized

  2. In silico prediction of genotoxicity.

    Science.gov (United States)

    Wichard, Jörg D

    2017-08-01

    The in silico prediction of genotoxicity has made considerable progress during the last years. The main driver for the pharmaceutical industry is the ICH M7 guideline about the assessment of DNA reactive impurities. An important component of this guideline is the use of in silico models as an alternative approach to experimental testing. The in silico prediction of genotoxicity provides an established and accepted method that defines the first step in the assessment of DNA reactive impurities. This was made possible by the growing amount of reliable Ames screening data, the attempts to understand the activity pathways and the subsequent development of computer-based prediction systems. This paper gives an overview of how the in silico prediction of genotoxicity is performed under the ICH M7 guideline. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. New Tool to Predict Glaucoma

    Science.gov (United States)

    Glaucoma can be difficult to detect and diagnose. Measurement ...

  4. Dynamical Predictability of Monthly Means.

    Science.gov (United States)

    Shukla, J.

    1981-12-01

    We have attempted to determine the theoretical upper limit of dynamical predictability of monthly means for prescribed nonfluctuating external forcings. We have extended the concept of 'classical' predictability, which primarily refers to the lack of predictability due mainly to the instabilities of synoptic-scale disturbances, to the predictability of time averages, which are determined by the predictability of low-frequency planetary waves. We have carried out 60-day integrations of a global general circulation model with nine different initial conditions but identical boundary conditions of sea surface temperature, snow, sea ice and soil moisture. Three of these initial conditions are the observed atmospheric conditions on 1 January of 1975, 1976 and 1977. The other six initial conditions are obtained by superimposing over the observed initial conditions a random perturbation comparable to the errors of observation. The root-mean-square (rms) error of random perturbations at all the grid points and all the model levels is 3 m s⁻¹ in the u and v components of wind. The rms vector wind error between the observed initial conditions is >15 m s⁻¹. It is hypothesized that for a given averaging period, if the rms error among the time averages predicted from largely different initial conditions becomes comparable to the rms error among the time averages predicted from randomly perturbed initial conditions, the time averages are dynamically unpredictable. We have carried out the analysis of variance to compare the variability, among the three groups, due to largely different initial conditions, and within each group due to random perturbations. It is found that the variances among the first 30-day means, predicted from largely different initial conditions, are significantly different from the variances due to random perturbations in the initial conditions, whereas the variances among 30-day means for days 31-60 are not distinguishable from the variances due to random initial
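The rms vector wind error used above to compare integrations is a direct computation over the u and v components. The sketch below uses two synthetic wind fields, not model output:

```python
import numpy as np

def rms_vector_wind_error(u1, v1, u2, v2):
    """Root-mean-square vector wind difference between two forecasts,
    sqrt(mean(du^2 + dv^2)) over all grid points."""
    du = np.asarray(u1, dtype=float) - np.asarray(u2, dtype=float)
    dv = np.asarray(v1, dtype=float) - np.asarray(v2, dtype=float)
    return float(np.sqrt(np.mean(du ** 2 + dv ** 2)))

# Two-point toy grid: differences of (3, 0) and (0, 4) m/s.
err = rms_vector_wind_error([3.0, 0.0], [0.0, 4.0], [0.0, 0.0], [0.0, 0.0])
```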

  5. Predictive coding in Agency Detection

    DEFF Research Database (Denmark)

    Andersen, Marc Malmdorf

    2017-01-01

    Agency detection is a central concept in the cognitive science of religion (CSR). Experimental studies, however, have so far failed to lend support to some of the most common predictions that follow from current theories on agency detection. In this article, I argue that predictive coding, a highly...... promising new framework for understanding perception and action, may solve pending theoretical inconsistencies in agency detection research, account for the puzzling experimental findings mentioned above, and provide hypotheses for future experimental testing. Predictive coding explains how the brain......, unbeknownst to consciousness, engages in sophisticated Bayesian statistics in an effort to constantly predict the hidden causes of sensory input. My fundamental argument is that most false positives in agency detection can be seen as the result of top-down interference in a Bayesian system generating high...

  6. Time-predictable Stack Caching

    DEFF Research Database (Denmark)

    Abbaspourseyedi, Sahar

    completely. Thus, in systems with hard deadlines the worst-case execution time (WCET) of the real-time software running on them needs to be bounded. Modern architectures use features such as pipelining and caches for improving the average performance. These features, however, make the WCET analysis more...... addresses, provides an opportunity to predict and tighten the WCET of accesses to data in caches. In this thesis, we introduce the time-predictable stack cache design and implementation within a time-predictable processor. We introduce several optimizations to our design for tightening the WCET while...... keeping the time-predictability of the design intact. Moreover, we provide a solution for reducing the cost of context switching in a system using the stack cache. In the design of these caches, we use custom hardware and compiler support for delivering time-predictable stack data accesses. Furthermore...

  7. NASA/MSFC prediction techniques

    International Nuclear Information System (INIS)

    Smith, R.E.

    1987-01-01

    The NASA/MSFC method of forecasting is more formal than NOAA's. The data are smoothed by the Lagrangian method and linear regression prediction techniques are used. The solar activity period is fixed at 11 years--the mean period of all previous cycles. Interestingly, the present prediction for the time of the next solar minimum is February or March of 1987, which, within the uncertainties of two methods, can be taken to be the same as the NOAA result

  8. Prediction of molecular crystal structures

    International Nuclear Information System (INIS)

    Beyer, Theresa

    2001-01-01

    The ab initio prediction of molecular crystal structures is a scientific challenge. Reliability of first-principle prediction calculations would show a fundamental understanding of crystallisation. Crystal structure prediction is also of considerable practical importance as different crystalline arrangements of the same molecule in the solid state (polymorphs) are likely to have different physical properties. A method of crystal structure prediction based on lattice energy minimisation has been developed in this work. The choice of the intermolecular potential and of the molecular model is crucial for the results of such studies and both of these criteria have been investigated. An empirical atom-atom repulsion-dispersion potential for carboxylic acids has been derived and applied in a crystal structure prediction study of formic, benzoic and the polymorphic system of tetrolic acid. As many experimental crystal structure determinations at different temperatures are available for the polymorphic system of paracetamol (acetaminophen), the influence of the variations of the molecular model on the crystal structure lattice energy minima has also been studied. The general problem of prediction methods based on the assumption that the experimental thermodynamically stable polymorph corresponds to the global lattice energy minimum is that more hypothetical low lattice energy structures are found within a few kJ mol⁻¹ of the global minimum than are likely to be experimentally observed polymorphs. This is illustrated by the results for molecule I, 3-oxabicyclo(3.2.0)hepta-1,4-diene, studied for the first international blind test for small organic crystal structures organised by the Cambridge Crystallographic Data Centre (CCDC) in May 1999. To reduce the number of predicted polymorphs, additional factors to thermodynamic criteria have to be considered. Therefore the elastic constants and vapour growth morphologies have been calculated for the lowest lattice energy

  9. Does Carbon Dioxide Predict Temperature?

    OpenAIRE

    Mytty, Tuukka

    2013-01-01

    Does carbon dioxide predict temperature? No it does not, in the time period of 1880-2004 with the carbon dioxide and temperature data used in this thesis. According to the Inter Governmental Panel on Climate Change(IPCC) carbon dioxide is the most important factor in raising the global temperature. Therefore, it is reasonable to assume that carbon dioxide truly predicts temperature. Because this paper uses observational data it has to be kept in mind that no causality interpretation can be ma...

  10. Prediction of molecular crystal structures

    Energy Technology Data Exchange (ETDEWEB)

    Beyer, Theresa

    2001-07-01

    The ab initio prediction of molecular crystal structures is a scientific challenge. Reliability of first-principle prediction calculations would show a fundamental understanding of crystallisation. Crystal structure prediction is also of considerable practical importance as different crystalline arrangements of the same molecule in the solid state (polymorphs) are likely to have different physical properties. A method of crystal structure prediction based on lattice energy minimisation has been developed in this work. The choice of the intermolecular potential and of the molecular model is crucial for the results of such studies and both of these criteria have been investigated. An empirical atom-atom repulsion-dispersion potential for carboxylic acids has been derived and applied in a crystal structure prediction study of formic, benzoic and the polymorphic system of tetrolic acid. As many experimental crystal structure determinations at different temperatures are available for the polymorphic system of paracetamol (acetaminophen), the influence of the variations of the molecular model on the crystal structure lattice energy minima has also been studied. The general problem of prediction methods based on the assumption that the experimental thermodynamically stable polymorph corresponds to the global lattice energy minimum is that more hypothetical low lattice energy structures are found within a few kJ mol⁻¹ of the global minimum than are likely to be experimentally observed polymorphs. This is illustrated by the results for molecule I, 3-oxabicyclo(3.2.0)hepta-1,4-diene, studied for the first international blind test for small organic crystal structures organised by the Cambridge Crystallographic Data Centre (CCDC) in May 1999. To reduce the number of predicted polymorphs, additional factors to thermodynamic criteria have to be considered. Therefore the elastic constants and vapour growth morphologies have been calculated for the lowest lattice energy

  11. Prediction of interannual climate variations

    International Nuclear Information System (INIS)

    Shukla, J.

    1993-01-01

    It has been known for some time that the behavior of the short-term fluctuations of the earth's atmosphere resembles that of a chaotic non-linear dynamical system, and that the day-to-day weather cannot be predicted beyond a few weeks. However, it has also been found that the interactions of the atmosphere with the underlying oceans and the land surfaces can produce fluctuations whose time scales are much longer than the limits of deterministic prediction of weather. It is, therefore, natural to ask whether it is possible that the seasonal and longer time averages of climate fluctuations can be predicted with sufficient skill to be beneficial for social and economic applications, even though the details of day-to-day weather cannot be predicted beyond a few weeks. The main objective of the workshop was to address this question by assessing the current state of knowledge on predictability of seasonal and interannual climate variability and to investigate various possibilities for its prediction. (orig./KW)

  12. Postprocessing for Air Quality Predictions

    Science.gov (United States)

    Delle Monache, L.

    2017-12-01

    In recent years, air quality (AQ) forecasting has made significant progress towards better predictions, with the goal of protecting the public from harmful pollutants. This progress is the result of improvements in weather and chemical transport models, their coupling, and more accurate emission inventories (e.g., with the development of new algorithms to account in near real-time for fires). Nevertheless, AQ predictions are still affected at times by significant biases, which stem from limitations in both weather and chemistry transport models. Those are the result of numerical approximations and the poor representation (and understanding) of important physical and chemical processes. Moreover, although the quality of emission inventories has been significantly improved, they are still one of the main sources of uncertainty in AQ predictions. For operational real-time AQ forecasting, a significant portion of these biases can be reduced with the implementation of postprocessing methods. We will review some of the techniques that have been proposed to reduce both systematic and random errors of AQ predictions, and to improve the correlation between predictions and observations of ground-level ozone and surface particulate matter less than 2.5 µm in diameter (PM2.5). These methods, which can be applied to both deterministic and probabilistic predictions, include simple bias-correction techniques, corrections inspired by the Kalman filter, regression methods, and the more recently developed analog-based algorithms. These approaches will be compared and contrasted, and the strengths and weaknesses of each will be discussed.
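
A Kalman-filter-inspired correction of the kind mentioned can be sketched as a recursive estimator of the systematic forecast bias. Everything below (the `ratio` parameter, the synthetic forecasts and observations) is an illustrative assumption, not the operational algorithm of any particular AQ system.

```python
import numpy as np

def kf_bias_correction(forecasts, observations, ratio=0.1):
    """Recursively estimate and remove the systematic forecast bias.

    `ratio` is the assumed ratio of bias-error to random-error variance;
    larger values make the bias estimate adapt faster.
    """
    bias = 0.0          # current bias estimate
    p = 1.0             # variance of the bias estimate
    corrected = []
    for f, o in zip(forecasts, observations):
        corrected.append(f - bias)           # correct BEFORE seeing the obs
        p = p + ratio                        # predict step (bias random walk)
        k = p / (p + 1.0)                    # Kalman gain (obs-error var = 1)
        bias = bias + k * ((f - o) - bias)   # innovation = raw forecast error
        p = (1.0 - k) * p
    return np.array(corrected)

rng = np.random.default_rng(0)
truth = 40 + 10 * np.sin(np.linspace(0, 6, 200))   # synthetic ozone signal
obs = truth + rng.normal(0, 2, 200)
fcst = truth + 5.0 + rng.normal(0, 2, 200)         # forecast with a +5 bias
corr = kf_bias_correction(fcst, obs)
print(np.mean(fcst - obs), np.mean(corr[50:] - obs[50:]))
```

After a short spin-up the recursive estimate converges to the imposed bias, so the corrected series has a near-zero mean error while the random error is untouched.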

  13. Network discovery, characterization, and prediction : a grand challenge LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, W. Philip, Jr.

    2010-11-01

    This report is the final summation of Sandia's Grand Challenge LDRD project No. 119351, 'Network Discovery, Characterization and Prediction' (the 'NGC'), which ran from FY08 to FY10. The aim of the NGC, in a nutshell, was to research, develop, and evaluate relevant analysis capabilities that address adversarial networks. Unlike some Grand Challenge efforts, that ambition created cultural subgoals as well as technical and programmatic ones, as the insistence on 'relevancy' required that the Sandia informatics research communities and the analyst user communities come to appreciate each other's needs and capabilities in a very deep and concrete way. The NGC generated a number of technical, programmatic, and cultural advances, detailed in this report. There were new algorithmic insights and research that resulted in fifty-three refereed publications and presentations; this report concludes with an abstract-annotated bibliography pointing to them all. The NGC generated three substantial prototypes that not only achieved their intended goals of testing our algorithmic integration, but also served as vehicles for customer education and program development. The NGC, as intended, has catalyzed future work in this domain; by the end it had already brought in as much new funding as had been invested in it. Finally, the NGC knit together previously disparate research staff and user expertise in a fashion that not only addressed our immediate research goals, but promises to have created an enduring cultural legacy of mutual understanding, in service of Sandia's national security responsibilities in cybersecurity and counterproliferation.

  14. Machine Learning Technologies Translates Vigilant Surveillance Satellite Big Data into Predictive Alerts for Environmental Stressors

    Science.gov (United States)

    Johnson, S. P.; Rohrer, M. E.

    2017-12-01

    The application of scientific research pertaining to satellite imaging and data processing has facilitated the development of dynamic methodologies and tools that utilize nanosatellites and analytical platforms to address the increasing scope, scale, and intensity of emerging environmental threats to national security. While the use of remotely sensed data to monitor the environment at local and global scales is not a novel proposition, the application of advances in nanosatellites and analytical platforms are capable of overcoming the data availability and accessibility barriers that have historically impeded the timely detection, identification, and monitoring of these stressors. Commercial and university-based applications of these technologies were used to identify and evaluate their capacity as security-motivated environmental monitoring tools. Presently, nanosatellites can provide consumers with 1-meter resolution imaging, frequent revisits, and customizable tasking, allowing users to define an appropriate temporal scale for high resolution data collection that meets their operational needs. Analytical platforms are capable of ingesting increasingly large and diverse volumes of data, delivering complex analyses in the form of interpretation-ready data products and solutions. The synchronous advancement of these technologies creates the capability of analytical platforms to deliver interpretable products from persistently collected high-resolution data that meet varying temporal and geographic scale requirements. In terms of emerging environmental threats, these advances translate into customizable and flexible tools that can respond to and accommodate the evolving nature of environmental stressors. This presentation will demonstrate the capability of nanosatellites and analytical platforms to provide timely, relevant, and actionable information that enables environmental analysts and stakeholders to make informed decisions regarding the prevention

  15. Predictive value of diminutive colonic adenoma trial: the PREDICT trial.

    Science.gov (United States)

    Schoenfeld, Philip; Shad, Javaid; Ormseth, Eric; Coyle, Walter; Cash, Brooks; Butler, James; Schindler, William; Kikendall, Walter J; Furlong, Christopher; Sobin, Leslie H; Hobbs, Christine M; Cruess, David; Rex, Douglas

    2003-05-01

    Diminutive adenomas (1-9 mm in diameter) are frequently found during colon cancer screening with flexible sigmoidoscopy (FS). This trial assessed the predictive value of these diminutive adenomas for advanced adenomas in the proximal colon. In a multicenter, prospective cohort trial, we matched 200 patients with normal FS and 200 patients with diminutive adenomas on FS for age and gender. All patients underwent colonoscopy. The presence of advanced adenomas (adenoma ≥ 10 mm in diameter, villous adenoma, adenoma with high grade dysplasia, and colon cancer) and adenomas (any size) was recorded. Before colonoscopy, patients completed questionnaires about risk factors for adenomas. The prevalence of advanced adenomas in the proximal colon was similar in patients with diminutive adenomas and patients with normal FS (6% vs. 5.5%, respectively) (relative risk, 1.1; 95% confidence interval [CI], 0.5-2.6). Diminutive adenomas on FS did not accurately predict advanced adenomas in the proximal colon: sensitivity, 52% (95% CI, 32%-72%); specificity, 50% (95% CI, 49%-51%); positive predictive value, 6% (95% CI, 4%-8%); and negative predictive value, 95% (95% CI, 92%-97%). Male gender (odds ratio, 1.63; 95% CI, 1.01-2.61) was associated with an increased risk of proximal colon adenomas. Diminutive adenomas on sigmoidoscopy may not accurately predict advanced adenomas in the proximal colon.
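
The four accuracy figures quoted above follow directly from a standard 2×2 screening table. A quick sketch, with invented counts chosen only to roughly reproduce the reported percentages (the trial's actual cell counts are not given in the abstract):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test accuracy measures from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),   # fraction of true cases flagged
        "specificity": tn / (tn + fp),   # fraction of non-cases cleared
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts approximating the reported 52% / 50% / 6% / 95%
m = diagnostic_metrics(tp=12, fp=189, fn=11, tn=188)
print({k: round(v, 2) for k, v in m.items()})
# → sensitivity 0.52, specificity 0.5, ppv 0.06, npv 0.94
```

The low PPV despite a reasonable NPV is exactly the pattern the trial reports: with a ~6% prevalence of proximal advanced adenomas, a weak marker yields many false positives.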

  16. Reward positivity: Reward prediction error or salience prediction error?

    Science.gov (United States)

    Heydari, Sepideh; Holroyd, Clay B

    2016-08-01

    The reward positivity is a component of the human ERP elicited by feedback stimuli in trial-and-error learning and guessing tasks. A prominent theory holds that the reward positivity reflects a reward prediction error signal that is sensitive to outcome valence, being larger for unexpected positive events relative to unexpected negative events (Holroyd & Coles, 2002). Although the theory has found substantial empirical support, most of these studies have utilized either monetary or performance feedback to test the hypothesis. However, in apparent contradiction to the theory, a recent study found that unexpected physical punishments also elicit the reward positivity (Talmi, Atkinson, & El-Deredy, 2013). The authors of this report argued that the reward positivity reflects a salience prediction error rather than a reward prediction error. To investigate this finding further, in the present study participants navigated a virtual T maze and received feedback on each trial under two conditions. In a reward condition, the feedback indicated that they would either receive a monetary reward or not and in a punishment condition the feedback indicated that they would receive a small shock or not. We found that the feedback stimuli elicited a typical reward positivity in the reward condition and an apparently delayed reward positivity in the punishment condition. Importantly, this signal was more positive to the stimuli that predicted the omission of a possible punishment relative to stimuli that predicted a forthcoming punishment, which is inconsistent with the salience hypothesis. © 2016 Society for Psychophysiological Research.

  17. Climate Prediction - NOAA's National Weather Service

    Science.gov (United States)

    Long range forecasts across the U.S.

  18. Weighted-Average Least Squares Prediction

    NARCIS (Netherlands)

    Magnus, Jan R.; Wang, Wendun; Zhang, Xinyu

    2016-01-01

    Prediction under model uncertainty is an important and difficult issue. Traditional prediction methods (such as pretesting) are based on model selection followed by prediction in the selected model, but the reported prediction and the reported prediction variance ignore the uncertainty from the

  19. Potential Predictability and Prediction Skill for Southern Peru Summertime Rainfall

    Science.gov (United States)

    WU, S.; Notaro, M.; Vavrus, S. J.; Mortensen, E.; Block, P. J.; Montgomery, R. J.; De Pierola, J. N.; Sanchez, C.

    2016-12-01

    The central Andes receive over 50% of annual climatological rainfall during the short period of January-March. This summertime rainfall exhibits strong interannual and decadal variability, including severe drought events that incur devastating societal impacts and cause agricultural communities and mining facilities to compete for limited water resources. An improved seasonal prediction skill of summertime rainfall would aid in water resource planning and allocation across the water-limited southern Peru. While various underlying mechanisms have been proposed by past studies for the drivers of interannual variability in summertime rainfall across southern Peru, such as the El Niño-Southern Oscillation (ENSO), Madden Julian Oscillation (MJO), and extratropical forcings, operational forecasts continue to be largely based on rudimentary ENSO-based indices, such as NINO3.4, justifying further exploration of predictive skill. In order to bridge this gap between the understanding of driving mechanisms and the operational forecast, we performed systematic studies on the predictability and prediction skill of southern Peru summertime rainfall by constructing statistical forecast models using best available weather station and reanalysis datasets. At first, by assuming the first two empirical orthogonal functions (EOFs) of summertime rainfall are predictable, the potential predictability skill was evaluated for southern Peru. Then, we constructed a simple regression model, based on the time series of tropical Pacific sea-surface temperatures (SSTs), and a more advanced Linear Inverse Model (LIM), based on the EOFs of tropical ocean SSTs and large-scale atmosphere variables from reanalysis. Our results show that the LIM model consistently outperforms the more rudimentary regression models on the forecast skill of domain averaged precipitation index and individual station indices. The improvement of forecast correlation skill ranges from 10% to over 200% for different
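
The comparison between a rudimentary single-index regression and a multi-predictor statistical forecast can be illustrated on synthetic data. This is not the authors' LIM; it is a leave-one-out cross-validated skill comparison under assumed climate modes (an ENSO-like index plus a second, extratropical-like mode), showing why adding predictors beyond NINO3.4 can raise correlation skill.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60                                   # 60 synthetic "summers"
nino = rng.normal(size=n)                # ENSO-like SST index
mode2 = rng.normal(size=n)               # a second, extratropical-like mode
rain = -0.6 * nino + 0.5 * mode2 + rng.normal(0, 0.5, n)

def loo_corr(X, y):
    """Leave-one-out cross-validated correlation skill of an OLS forecast."""
    preds = np.empty_like(y)
    for i in range(len(y)):
        m = np.ones(len(y), bool); m[i] = False
        beta, *_ = np.linalg.lstsq(X[m], y[m], rcond=None)
        preds[i] = X[i] @ beta
    return np.corrcoef(preds, y)[0, 1]

X1 = np.column_stack([np.ones(n), nino])          # rudimentary index model
X2 = np.column_stack([np.ones(n), nino, mode2])   # multi-predictor model
print(loo_corr(X1, rain), loo_corr(X2, rain))
```

When the second mode carries real signal, the multi-predictor model's out-of-sample correlation exceeds the single-index model's, mirroring the LIM-versus-NINO3.4 result described above.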

  20. Prediction of GNSS satellite clocks

    International Nuclear Information System (INIS)

    Broederbauer, V.

    2010-01-01

    This thesis deals with the characterisation and prediction of GNSS satellite clocks. A prerequisite for developing powerful algorithms for the prediction of clock corrections is a thorough study of the behaviour of the different clock types of the satellites. In this context the predicted part of the IGU clock corrections provided by the Analysis Centers (ACs) of the IGS was compared to the IGS Rapid clock solutions to determine reasonable estimates of the quality of already existing, well performing predictions. For the shortest investigated interval (three hours) all ACs obtain almost the same accuracy of 0.1 to 0.4 ns. For longer intervals the individual prediction results start to diverge; for a 12-hour interval the differences range from nearly 10 ns (GFZ, CODE) up to some tens of ns. Based on the estimated clock corrections provided via the IGS Rapid products, a simple quadratic polynomial turns out to be sufficient to describe the time series of Rubidium clocks. Cesium clocks, on the other hand, show a periodic behaviour (revolution period) with an amplitude of up to 6 ns. A clear correlation between these amplitudes and the Sun elevation angle above the orbital planes can be demonstrated. The variability of the amplitudes is supposed to be caused by temperature variations affecting the oscillator. To account for this periodic behaviour, a quadratic polynomial with an additional sine term was finally chosen as the prediction model for both the Cesium and the Rubidium clocks. The three polynomial parameters as well as the amplitude and phase shift of the periodic term are estimated within a least-squares adjustment by means of the program GNSS-VC/static. Input data are time series of the observed part of the IGU clock corrections. With the estimated parameters, clock corrections are predicted for various durations. The mean error of the prediction of Rubidium clock corrections for an interval of six hours reaches up to 1.5 ns. For the 12-hours
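
The prediction model described (quadratic polynomial plus a periodic term at the revolution period) stays linear in its parameters if the sinusoid is written as a sin/cos pair, so a plain least-squares adjustment suffices. A sketch on synthetic clock corrections; the period, noise level and coefficients are assumptions, not values from the thesis.

```python
import numpy as np

def design(t, period):
    """Design matrix: quadratic polynomial + one periodic term."""
    return np.column_stack([
        np.ones_like(t), t, t**2,        # polynomial part
        np.sin(2 * np.pi * t / period),  # periodic term, kept linear
        np.cos(2 * np.pi * t / period),  # via a sin/cos pair
    ])

rng = np.random.default_rng(0)
period = 12.0                            # assumed revolution period (h)

def truth(t):                            # synthetic Cesium-like clock (ns)
    return 2.0 + 0.5 * t + 0.01 * t**2 + 6.0 * np.sin(2 * np.pi * t / period)

t_fit = np.linspace(0.0, 24.0, 97)       # 24 h of 15-min clock corrections
x_fit = truth(t_fit) + rng.normal(0.0, 0.1, t_fit.size)

coef, *_ = np.linalg.lstsq(design(t_fit, period), x_fit, rcond=None)
amplitude = np.hypot(coef[3], coef[4])   # recovered periodic amplitude

t_pred = np.linspace(24.0, 30.0, 25)     # predict the next 6 h
err = design(t_pred, period) @ coef - truth(t_pred)
print(amplitude, np.max(np.abs(err)))
```

The amplitude and phase fall out of the sin/cos coefficients (`hypot` and `arctan2`), which is the usual trick for avoiding a nonlinear fit when the period itself is known.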

  1. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  2. Neural Elements for Predictive Coding

    Directory of Open Access Journals (Sweden)

    Stewart SHIPP

    2016-11-01

    Full Text Available Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backwards in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many ‘illusory’ instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forwards and backwards pathways should be completely separate, given their functional distinction; this aspect of circuitry – that neurons with extrinsically bifurcating axons do not project in both directions – has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic ‘canonical microcircuit’ and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made

  3. Neural Elements for Predictive Coding.

    Science.gov (United States)

    Shipp, Stewart

    2016-01-01

    Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backward in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many 'illusory' instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forward and backward pathways should be completely separate, given their functional distinction; this aspect of circuitry - that neurons with extrinsically bifurcating axons do not project in both directions - has only recently been confirmed. 
Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic 'canonical microcircuit' and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made possible by transgenic neural
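
The prediction/prediction-error exchange at the heart of these theories can be reduced to a toy update loop: a latent cause generates a top-down prediction, and the forward-flowing error refines the percept until the two agree. This is a generic sketch of the idea only (fixed generative weights, one level), not a circuit model from the article.

```python
import numpy as np

W = np.array([[1.0], [2.0]])      # generative (backward) weights
x = np.array([2.0, 4.0])          # sensory input, consistent with r = 2
r = np.zeros(1)                   # latent representation (the "percept")
lr = 0.1                          # inference rate

for _ in range(100):
    pred = W @ r                  # top-down prediction of the input
    err = x - pred                # bottom-up prediction error
    r = r + lr * (W.T @ err)      # update the percept to reduce the error

print(r, err)                     # r converges to 2, the error to ~0
```

Iterating this error-driven update performs gradient descent on the squared prediction error, so the percept settles on the least-squares cause of the sensory data; hierarchical versions simply stack such levels, each predicting the one below.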

  4. The prediction of the LWR plant accident based on the measured plant data

    International Nuclear Information System (INIS)

    Miettinen, J.; Schmuck, P.

    2005-01-01

    In case of an accident affecting a nuclear reactor, it is essential to anticipate the possible development of the situation in order to respond effectively: to be warned early and to obtain as much information on the plant as possible. The ASTRID (Assessment of Source Term for Emergency Response based on Installation Data) project consists in developing a methodology of expertise to structure the work of technical teams and to facilitate cross-competence communication among EP players, and a qualified computer tool that could be commonly used by the European countries to reliably predict the source term in case of an accident in a light water reactor, using the information available on the plant. In many accident conditions the team of analysts may be located far away from the plant experiencing the accident, and their decision making is based on the on-line plant data transmitted to the crisis centre at an interval of 30-600 seconds. The plant condition has to be diagnosed based on this information. In the ASTRID project, plant status diagnostics has been studied for the European reactor types, including BWR, PWR and VVER plants. The directly measured plant data may be used to estimate the size and location of a break in the primary system. The break size prediction may be based on the pressurizer level, reactor vessel level, primary pressure and, in the case of a steam generator tube rupture, the steam generator level. In the ASTRID project the break prediction concept was developed, and its validity for different plant types is presented in this paper, with the plant data created by plant-specific thermohydraulic simulation models. The tracking simulator attempts to follow the plant behaviour on-line, based on the measured plant data for the main process parameters and the most important boundary conditions.
When the plant state tracking fails, the plant may be experiencing an accident, and the tracking

  5. Predicting Ambulance Time of Arrival to the Emergency Department Using Global Positioning System and Google Maps

    Science.gov (United States)

    Fleischman, Ross J.; Lundquist, Mark; Jui, Jonathan; Newgard, Craig D.; Warden, Craig

    2014-01-01

    Objective To derive and validate a model that accurately predicts ambulance arrival time that could be implemented as a Google Maps web application. Methods This was a retrospective study of all scene transports in Multnomah County, Oregon, from January 1 through December 31, 2008. Scene and destination hospital addresses were converted to coordinates. ArcGIS Network Analyst was used to estimate transport times based on street network speed limits. We then created a linear regression model to improve the accuracy of these street network estimates using weather, patient characteristics, use of lights and sirens, daylight, and rush-hour intervals. The model was derived from a 50% sample and validated on the remainder. Significance of the covariates was determined by p-values, and model estimates were compared against transport times recorded by computer-aided dispatch. We then built a Google Maps-based web application to demonstrate application in real-world EMS operations. Results There were 48,308 included transports. Street network estimates of transport time were accurate within 5 minutes of actual transport time less than 16% of the time. Actual transport times were longer during daylight and rush-hour intervals and shorter with use of lights and sirens. Age under 18 years, gender, wet weather, and trauma system entry were not significant predictors of transport time. Our model predicted arrival time within 5 minutes 73% of the time. For lights and sirens transports, accuracy was within 5 minutes 77% of the time. Accuracy was identical in the validation dataset. Lights and sirens saved an average of 3.1 minutes for transports under 8.8 minutes, and 5.3 minutes for longer transports. Conclusions An estimate of transport time based only on a street network significantly underestimated transport times. A simple model incorporating few variables can predict ambulance time of arrival to the emergency department with good accuracy. This model could be linked to global positioning system data and an automated Google Maps web
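
The modeling step (a linear regression that corrects a raw street-network estimate using a few covariates) can be sketched on synthetic data. The coefficients and effect sizes below are invented for illustration, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
network_est = rng.uniform(5, 25, n)     # street-network estimate (minutes)
lights = rng.integers(0, 2, n)          # 1 = lights and sirens used
rush = rng.integers(0, 2, n)            # 1 = rush-hour interval
# Synthetic "actual" times: the network estimate is biased low,
# sirens shorten transports, rush hour lengthens them (invented effects)
actual = 1.2 * network_est + 2.0 - 3.0 * lights + 2.5 * rush \
         + rng.normal(0, 2, n)

X = np.column_stack([np.ones(n), network_est, lights, rush])
beta, *_ = np.linalg.lstsq(X, actual, rcond=None)   # fit the correction
pred = X @ beta

raw5 = np.mean(np.abs(network_est - actual) <= 5)   # raw estimate accuracy
within5 = np.mean(np.abs(pred - actual) <= 5)       # corrected accuracy
print(raw5, within5)
```

As in the study, the uncorrected street-network estimate is systematically off, while a regression with only a handful of covariates pushes the within-5-minutes accuracy sharply higher.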

  6. Quantifying prognosis with risk predictions.

    Science.gov (United States)

    Pace, Nathan L; Eberhart, Leopold H J; Kranke, Peter R

    2012-01-01

    Prognosis is a forecast, based on present observations in a patient, of their probable outcome from disease, surgery and so on. Research methods for the development of risk probabilities may not be familiar to some anaesthesiologists. We briefly describe methods for identifying risk factors and risk scores. A probability prediction rule assigns a risk probability to a patient for the occurrence of a specific event. Probability reflects the continuum between absolute certainty (Pi = 1) and certified impossibility (Pi = 0). Biomarkers and clinical covariates that modify risk are known as risk factors. The Pi as modified by risk factors can be estimated by identifying the risk factors and their weighting; these are usually obtained by stepwise logistic regression. The accuracy of probabilistic predictors can be separated into the concepts of 'overall performance', 'discrimination' and 'calibration'. Overall performance is the mathematical distance between predictions and outcomes. Discrimination is the ability of the predictor to rank order observations with different outcomes. Calibration is the correctness of prediction probabilities on an absolute scale. Statistical methods include the Brier score, coefficient of determination (Nagelkerke R2), C-statistic and regression calibration. External validation is the comparison of the actual outcomes to the predicted outcomes in a new and independent patient sample. External validation uses the statistical methods of overall performance, discrimination and calibration and is uniformly recommended before acceptance of the prediction model. Evidence from randomised controlled clinical trials should be obtained to show the effectiveness of risk scores for altering patient management and patient outcomes.
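
Two of the accuracy measures named above can be computed directly from their definitions: the Brier score (overall performance) and the C-statistic (discrimination). A small self-contained sketch with toy predictions, not data from any study:

```python
import numpy as np

def brier_score(p, y):
    """Mean squared distance between predicted probabilities and outcomes."""
    p, y = np.asarray(p, float), np.asarray(y, float)
    return np.mean((p - y) ** 2)

def c_statistic(p, y):
    """Probability that a random event case is ranked above a non-event
    (ties count one half); equivalent to the area under the ROC curve."""
    p, y = np.asarray(p, float), np.asarray(y, int)
    pos, neg = p[y == 1], p[y == 0]
    diff = pos[:, None] - neg[None, :]        # all event/non-event pairs
    return np.mean(diff > 0) + 0.5 * np.mean(diff == 0)

p = [0.9, 0.8, 0.7, 0.3, 0.2, 0.1]            # predicted risks
y = [1,   1,   0,   1,   0,   0]              # observed outcomes
print(brier_score(p, y))                      # → 0.18
print(c_statistic(p, y))                      # → 0.888...
```

A perfectly calibrated, perfectly discriminating predictor would score 0 on the Brier scale and 1 on the C-statistic; external validation repeats these computations on a new, independent patient sample.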

  7. PREDICTING DEMAND FOR COTTON YARNS

    Directory of Open Access Journals (Sweden)

    SALAS-MOLINA Francisco

    2017-05-01

    Full Text Available Predicting demand for fashion products is crucial for textile manufacturers. In an attempt to both avoid out-of-stocks and minimize holding costs, different forecasting techniques are used by production managers. Both linear and non-linear time-series analysis techniques are suitable options for forecasting purposes. However, demand for fashion products presents a number of particular characteristics such as short life-cycles, short selling seasons, high impulse purchasing, high volatility, low predictability, tremendous product variety and a high number of stock-keeping-units. In this paper, we focus on predicting demand for cotton yarns using a non-linear forecasting technique that has been fruitfully used in many areas, namely, random forests. To this end, we first identify a number of explanatory variables to be used as a key input to forecasting using random forests. We consider explanatory variables usually labeled either as causal variables, when some correlation is expected between them and the forecasted variable, or as time-series features, when extracted from time-related attributes such as seasonality. Next, we evaluate the predictive power of each variable by means of out-of-sample accuracy measurement. We experiment on a real data set from a textile company in Spain. The numerical results show that simple time-series features present more predictive ability than other more sophisticated explanatory variables.
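
A random-forest forecast driven by simple time-series features, in the spirit described above, might look like the following sketch. The monthly demand series, the feature choices (month-of-year and lagged demand) and the hold-out scheme are invented for illustration, not the paper's data or setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
months = np.arange(120)                        # 10 years of monthly demand
season = 10 * np.sin(2 * np.pi * months / 12)  # yearly seasonality
demand = 100 + season + rng.normal(0, 3, 120)

# Time-series features: calendar month plus two lagged-demand values
lag1, lag12 = demand[11:-1], demand[:-12]
X = np.column_stack([months[12:] % 12, lag1, lag12])
y = demand[12:]

train, test = slice(0, 96), slice(96, None)    # hold out the final year
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X[train], y[train])

mae = np.mean(np.abs(rf.predict(X[test]) - y[test]))
naive = np.mean(np.abs(X[test, 2] - y[test]))  # seasonal-naive baseline
print(mae, naive)
```

Out-of-sample error against a seasonal-naive baseline is the kind of accuracy measurement the paper uses to rank explanatory variables; feature importances from the fitted forest (`rf.feature_importances_`) give a complementary view of predictive power.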

  8. Lightning prediction using radiosonde data

    Energy Technology Data Exchange (ETDEWEB)

    Weng, L.Y.; Bin Omar, J.; Siah, Y.K.; Bin Zainal Abidin, I.; Ahmad, S.K. [Univ. Tenaga, Darul Ehsan (Malaysia). College of Engineering

    2008-07-01

    Lightning is a natural phenomenon in tropical regions. Malaysia experiences very high cloud-to-ground lightning density, posing both health and economic concerns to individuals and industries. In the commercial sector, power lines, telecommunication towers and buildings are most frequently hit by lightning. In the event that a power line is hit and the protection system fails, industries which rely on that power line would cease operations temporarily, resulting in significant monetary loss. Current technology is unable to prevent lightning occurrences. However, the ability to predict lightning would significantly reduce damages from direct and indirect lightning strikes. For that reason, this study focused on developing a method to predict lightning with radiosonde data using only a simple back propagation neural network model written in C code. The study was performed at the Kuala Lumpur International Airport (KLIA). In this model, the parameters related to wind were disregarded. Preliminary results indicate that this method shows some positive results in predicting lightning. However, a larger dataset is needed in order to obtain more accurate predictions. It was concluded that future work should include wind parameters to fully capture all properties of lightning formation and, subsequently, its prediction. 8 refs., 5 figs.
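
A minimal back-propagation network of the kind described (the original was written in C; this is a Python sketch) can be laid out in a few lines. The "radiosonde-like" features and the lightning/no-lightning labels below are entirely synthetic, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(400, 3))              # 3 stability-related features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)[:, None]  # synthetic label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units, trained by plain batch back-propagation
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(2000):
    h = sigmoid(X @ W1 + b1)               # forward pass: hidden layer
    p = sigmoid(h @ W2 + b2)               # forward pass: output
    d2 = (p - y) / len(X)                  # cross-entropy output gradient
    d1 = (d2 @ W2.T) * h * (1 - h)         # error back-propagated to hidden
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

acc = np.mean((p > 0.5) == (y > 0.5))      # training accuracy
print(acc)
```

On real radiosonde data the inputs would be derived stability indices and the label a lightning observation window, with held-out data used for evaluation; the training loop itself is unchanged.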

  9. Prediction, Regression and Critical Realism

    DEFF Research Database (Denmark)

    Næss, Petter

    2004-01-01

    This paper considers the possibility of prediction in land use planning, and the use of statistical research methods in analyses of relationships between urban form and travel behaviour. Influential writers within the tradition of critical realism reject the possibility of predicting social...... phenomena. This position is fundamentally problematic to public planning. Without at least some ability to predict the likely consequences of different proposals, the justification for public sector intervention into market mechanisms will be frail. Statistical methods like regression analyses are commonly...... seen as necessary in order to identify aggregate level effects of policy measures, but are questioned by many advocates of critical realist ontology. Using research into the relationship between urban structure and travel as an example, the paper discusses relevant research methods and the kinds...

  10. Intelligent Prediction of Ship Maneuvering

    Directory of Open Access Journals (Sweden)

    Miroslaw Lacki

    2016-09-01

    Full Text Available In this paper the author presents an idea of an intelligent ship maneuvering prediction system using neuroevolution. This may also be seen as a ship handling system that simulates the learning process of an autonomous control unit, created with an artificial neural network. The control unit observes input signals and calculates the values of the required parameters for vessel maneuvering in confined waters. In neuroevolution such units are treated as individuals in a population of artificial neural networks, which through environmental sensing and evolutionary algorithms learn to perform a given task efficiently. The main task of the system is to learn continuously and to predict the values of the navigational parameters of the vessel after a certain amount of time, taking into account the influence of its environment. The result of a prediction may be presented as a warning that makes the navigator aware of an incoming threat.

  11. Predictive Modeling in Race Walking

    Directory of Open Access Journals (Sweden)

    Krzysztof Wiktorowicz

    2015-01-01

    Full Text Available This paper presents the use of linear and nonlinear multivariable models as tools to support the training process of race walkers. These models are calculated using data collected from race walkers’ training events and are used to predict the result over a 3 km race based on training loads. The material consists of 122 training plans for 21 athletes. In order to choose the best model, the leave-one-out cross-validation method is used. The main contribution of the paper is to propose nonlinear modifications of linear models in order to achieve a smaller prediction error. It is shown that the best model is a modified LASSO regression with quadratic terms in the nonlinear part. This model has the smallest prediction error and a simplified structure, achieved by eliminating some of the predictors.
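
    The pipeline described above (a linear model, optional quadratic terms, model selection by leave-one-out cross-validation) can be sketched in a few lines. This is an illustrative reconstruction with made-up data, not the authors' code; ordinary least squares stands in for the regularized LASSO fit, and the training-load variables are hypothetical.

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_ols(rows, y):
    """Least-squares fit of y = b0 + b1*x1 + ... via the normal equations."""
    X = [[1.0] + list(r) for r in rows]
    k = len(X[0])
    XtX = [[sum(x[a] * x[b] for x in X) for b in range(k)] for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(len(X))) for a in range(k)]
    return gauss_solve(XtX, Xty)

def predict(beta, row):
    return beta[0] + sum(b * v for b, v in zip(beta[1:], row))

def loocv_rmse(rows, y):
    """Leave-one-out cross-validated root-mean-square prediction error."""
    sq = []
    for i in range(len(rows)):
        tr = [r for j, r in enumerate(rows) if j != i]
        ty = [v for j, v in enumerate(y) if j != i]
        beta = fit_ols(tr, ty)
        sq.append((predict(beta, rows[i]) - y[i]) ** 2)
    return (sum(sq) / len(sq)) ** 0.5

# hypothetical training loads (x) and race results (y) with a quadratic trend
loads = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
results = [2.0 + 0.5 * x + 0.1 * x * x for x in loads]
linear = loocv_rmse([(x,) for x in loads], results)
quadratic = loocv_rmse([(x, x * x) for x in loads], results)
```

    On this synthetic data the model with the quadratic term achieves an essentially zero LOOCV error while the purely linear one does not, mirroring the paper's finding that adding nonlinear terms to the linear model reduces prediction error.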

  12. Sentence-Level Attachment Prediction

    Science.gov (United States)

    Albakour, M.-Dyaa; Kruschwitz, Udo; Lucas, Simon

    Attachment prediction is the task of automatically identifying email messages that should contain an attachment. This can be useful to tackle the problem of sending out emails but forgetting to include the relevant attachment (something that happens all too often). A common Information Retrieval (IR) approach in analyzing documents such as emails is to treat the entire document as a bag of words. Here we propose a finer-grained analysis to address the problem. We aim at identifying individual sentences within an email that refer to an attachment. If we detect any such sentence, we predict that the email should have an attachment. Using part of the Enron corpus for evaluation we find that our finer-grained approach outperforms previously reported document-level attachment prediction in similar evaluation settings.
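
    A minimal detector in the spirit of this sentence-level approach might look as follows. This is purely illustrative: the paper trains a classifier on the Enron corpus, whereas this sketch uses a hand-written cue pattern (an assumption, not the paper's learned features).

```python
import re

# hypothetical cue pattern; the paper learns such evidence from data
CUES = re.compile(r"\b(attach(?:ed|ment|ing)?|enclos(?:ed|ure))\b", re.IGNORECASE)

def attachment_sentences(body):
    """Return the sentences of an email body that refer to an attachment."""
    sentences = re.split(r"(?<=[.!?])\s+", body.strip())
    return [s for s in sentences if CUES.search(s)]

def predict_has_attachment(body):
    """Predict that the email should carry an attachment if any sentence matches."""
    return bool(attachment_sentences(body))
```

    The point of the finer-grained design is visible here: the decision is made per sentence, and the email-level prediction is simply "any sentence fired".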

  13. BBN predictions for 4He

    International Nuclear Information System (INIS)

    Walker, T.P.

    1993-01-01

    The standard model of the hot big bang assumes a homogeneous and isotropic Universe with gravity described by General Relativity and strong and electroweak interactions described by the Standard Model of particle physics. The hot big bang model makes the unavoidable prediction that the production of primordial elements occurred about one minute after the big bang (referred to as big bang or primordial nucleosynthesis, BBN). This review concerns the range of the primordial abundance of 4He as predicted by standard BBN (i.e., primordial nucleosynthesis assuming a homogeneous distribution of baryons). In it the author discusses: (1) uncertainties in the calculation of Y_p (the mass fraction of primordial 4He); (2) the expected range of Y_p; (3) how the predictions stack up against the latest observations; and (4) the latest BBN bounds on Ω_B h^2 and N_ν. 13 refs., 2 figs

  14. Human motion simulation predictive dynamics

    CERN Document Server

    Abdel-Malek, Karim

    2013-01-01

    Simulate realistic human motion in a virtual world with an optimization-based approach to motion prediction. With this approach, motion is governed by human performance measures, such as speed and energy, which act as objective functions to be optimized. Constraints on joint torques and angles are imposed quite easily. Predicting motion in this way allows one to use avatars to study how and why humans move the way they do, given specific scenarios. It also enables avatars to react to infinitely many scenarios with substantial autonomy. With this approach it is possible to predict dynamic motion without having to integrate equations of motion -- rather than solving equations of motion, this approach solves for a continuous time-dependent curve characterizing joint variables (also called joint profiles) for every degree of freedom. Introduces rigorous mathematical methods for digital human modelling and simulation Focuses on understanding and representing spatial relationships (3D) of biomechanics Develops an i...

  15. Model predictive control using fuzzy decision functions

    NARCIS (Netherlands)

    Kaymak, U.; Costa Sousa, da J.M.

    2001-01-01

    Fuzzy predictive control integrates conventional model predictive control with techniques from fuzzy multicriteria decision making, translating the goals and the constraints to predictive control in a transparent way. The information regarding the (fuzzy) goals and the (fuzzy) constraints of the

  16. Evaluating the Predictive Value of Growth Prediction Models

    Science.gov (United States)

    Murphy, Daniel L.; Gaertner, Matthew N.

    2014-01-01

    This study evaluates four growth prediction models--projection, student growth percentile, trajectory, and transition table--commonly used to forecast (and give schools credit for) middle school students' future proficiency. Analyses focused on vertically scaled summative mathematics assessments, and two performance standards conditions (high…

  17. Ensemble method for dengue prediction.

    Science.gov (United States)

    Buczak, Anna L; Baugher, Benjamin; Moniz, Linda J; Bagley, Thomas; Babin, Steven M; Guven, Erhan

    2018-01-01

    In the 2015 NOAA Dengue Challenge, participants made three dengue target predictions for two locations (Iquitos, Peru, and San Juan, Puerto Rico) during four dengue seasons: 1) peak height (i.e., maximum weekly number of cases during a transmission season); 2) peak week (i.e., week in which the maximum weekly number of cases occurred); and 3) total number of cases reported during a transmission season. A dengue transmission season is the 12-month period commencing with the location-specific, historical week with the lowest number of cases. At the beginning of the Dengue Challenge, participants were provided with the same input data for developing the models, with the prediction testing data provided at a later date. Our approach used ensemble models created by combining three disparate types of component models: 1) two-dimensional Method of Analogues models incorporating both dengue and climate data; 2) additive seasonal Holt-Winters models with and without wavelet smoothing; and 3) simple historical models. Of the individual component models created, those with the best performance on the prior four years of data were incorporated into the ensemble models. There were separate ensembles for predicting each of the three targets at each of the two locations. Our ensemble models scored higher for peak height and total dengue case counts reported in a transmission season for Iquitos than all other models submitted to the Dengue Challenge. However, the ensemble models did not do nearly as well when predicting the peak week. The Dengue Challenge organizers scored the dengue predictions of the Challenge participant groups. Our ensemble approach was the best in predicting the total number of dengue cases reported for transmission season and peak height for Iquitos, Peru.
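
    One of the component models named above, the additive seasonal Holt-Winters method, can be sketched as follows. This is a textbook implementation with assumed smoothing constants and synthetic case counts, not the authors' tuned code; the wavelet-smoothing variant is omitted.

```python
def holt_winters_additive(series, period, alpha=0.3, beta=0.05, gamma=0.2, horizon=1):
    """Additive seasonal Holt-Winters forecast (textbook form)."""
    # initialize level, trend, and seasonal indices from the first two periods
    season = [series[i] - sum(series[:period]) / period for i in range(period)]
    level = sum(series[:period]) / period
    trend = (sum(series[period:2 * period]) - sum(series[:period])) / period ** 2
    for t in range(period, len(series)):
        prev_level = level
        s = season[t % period]
        level = alpha * (series[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % period] = gamma * (series[t] - level) + (1 - gamma) * s
    return [level + (h + 1) * trend + season[(len(series) + h) % period]
            for h in range(horizon)]

# hypothetical weekly case counts with a fixed seasonal pattern and no trend
cases = [10.0, 15.0, 10.0, 5.0] * 10
forecast = holt_winters_additive(cases, period=4, horizon=4)
```

    On a perfectly periodic series the forecast simply reproduces the seasonal pattern; on real dengue counts the level, trend, and seasonal terms each adapt to the data.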

  18. Evoked emotions predict food choice.

    Science.gov (United States)

    Dalenberg, Jelle R; Gutjar, Swetlana; Ter Horst, Gert J; de Graaf, Kees; Renken, Remco J; Jager, Gerry

    2014-01-01

    In the current study we show that non-verbal food-evoked emotion scores significantly improve food choice prediction over merely liking scores. Previous research has shown that liking measures correlate with choice. However, liking is no strong predictor for food choice in real life environments. Therefore, the focus within recent studies shifted towards using emotion-profiling methods that successfully can discriminate between products that are equally liked. However, it is unclear how well scores from emotion-profiling methods predict actual food choice and/or consumption. To test this, we proposed to decompose emotion scores into valence and arousal scores using Principal Component Analysis (PCA) and apply Multinomial Logit Models (MLM) to estimate food choice using liking, valence, and arousal as possible predictors. For this analysis, we used an existing data set comprised of liking and food-evoked emotions scores from 123 participants, who rated 7 unlabeled breakfast drinks. Liking scores were measured using a 100-mm visual analogue scale, while food-evoked emotions were measured using 2 existing emotion-profiling methods: a verbal and a non-verbal method (EsSense Profile and PrEmo, respectively). After 7 days, participants were asked to choose 1 breakfast drink from the experiment to consume during breakfast in a simulated restaurant environment. Cross validation showed that we were able to correctly predict individualized food choice (1 out of 7 products) for over 50% of the participants. This number increased to nearly 80% when looking at the top 2 candidates. Model comparisons showed that evoked emotions better predict food choice than perceived liking alone. However, the strongest predictive strength was achieved by the combination of evoked emotions and liking. Furthermore we showed that non-verbal food-evoked emotion scores more accurately predict food choice than verbal food-evoked emotions scores.
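
    The decomposition step, projecting multi-dimensional emotion scores onto a dominant valence-like axis, can be sketched with a small power-iteration PCA. This is an illustrative stand-in with toy two-dimensional "scores"; the study used standard PCA and then fed the components into multinomial logit models, neither of which is reproduced here.

```python
def first_principal_axis(data, iters=500):
    """Leading eigenvector of the sample covariance matrix via power iteration."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1) for b in range(d)]
         for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# toy 2-D "emotion scores": variance concentrated along the (1, 1) direction
scores = [(-2.0, -2.1), (-1.0, -0.9), (1.0, 1.1), (2.0, 1.9), (0.1, -0.1), (-0.1, 0.1)]
axis = first_principal_axis(scores)
```

    The recovered axis loads roughly equally on both toy variables, which is the kind of shared-variance direction that a valence component captures across many emotion ratings.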

  19. Ensemble method for dengue prediction.

    Directory of Open Access Journals (Sweden)

    Anna L Buczak

    Full Text Available In the 2015 NOAA Dengue Challenge, participants made three dengue target predictions for two locations (Iquitos, Peru, and San Juan, Puerto Rico) during four dengue seasons: 1) peak height (i.e., maximum weekly number of cases during a transmission season); 2) peak week (i.e., week in which the maximum weekly number of cases occurred); and 3) total number of cases reported during a transmission season. A dengue transmission season is the 12-month period commencing with the location-specific, historical week with the lowest number of cases. At the beginning of the Dengue Challenge, participants were provided with the same input data for developing the models, with the prediction testing data provided at a later date. Our approach used ensemble models created by combining three disparate types of component models: 1) two-dimensional Method of Analogues models incorporating both dengue and climate data; 2) additive seasonal Holt-Winters models with and without wavelet smoothing; and 3) simple historical models. Of the individual component models created, those with the best performance on the prior four years of data were incorporated into the ensemble models. There were separate ensembles for predicting each of the three targets at each of the two locations. Our ensemble models scored higher for peak height and total dengue case counts reported in a transmission season for Iquitos than all other models submitted to the Dengue Challenge. However, the ensemble models did not do nearly as well when predicting the peak week. The Dengue Challenge organizers scored the dengue predictions of the Challenge participant groups. Our ensemble approach was the best in predicting the total number of dengue cases reported for transmission season and peak height for Iquitos, Peru.

  20. Dinosaur fossils predict body temperatures.

    Directory of Open Access Journals (Sweden)

    James F Gillooly

    2006-07-01

    Full Text Available Perhaps the greatest mystery surrounding dinosaurs concerns whether they were endotherms, ectotherms, or some unique intermediate form. Here we present a model that yields estimates of dinosaur body temperature based on ontogenetic growth trajectories obtained from fossil bones. The model predicts that dinosaur body temperatures increased with body mass from approximately 25 degrees C at 12 kg to approximately 41 degrees C at 13,000 kg. The model also successfully predicts observed increases in body temperature with body mass for extant crocodiles. These results provide direct evidence that dinosaurs were reptiles that exhibited inertial homeothermy.
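
    As a back-of-the-envelope check, the two mass-temperature pairs quoted in the abstract imply a roughly log-linear relationship. The sketch below simply interpolates between those two quoted points; it is not the authors' growth-trajectory model, only a hedged illustration of the reported trend.

```python
import math

# the two (mass, temperature) points quoted in the abstract
M1, T1 = 12.0, 25.0        # kg, degrees C
M2, T2 = 13000.0, 41.0     # kg, degrees C

def body_temp_estimate(mass_kg):
    """Log-linear interpolation of body temperature against body mass."""
    slope = (T2 - T1) / (math.log(M2) - math.log(M1))
    return T1 + slope * (math.log(mass_kg) - math.log(M1))
```

    Any intermediate mass lands between 25 and 41 degrees C, consistent with the inertial-homeothermy picture of temperature rising smoothly with size.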

  1. Calorimetry end-point predictions

    International Nuclear Information System (INIS)

    Fox, M.A.

    1981-01-01

    This paper describes a portion of the work presently in progress at Rocky Flats in the field of calorimetry. In particular, calorimetry end-point predictions are outlined. The problems associated with end-point predictions and the progress made in overcoming these obstacles are discussed. The two major problems, noise and an accurate description of the heat function, are dealt with to obtain the most accurate results. Data are taken from an actual calorimeter and are processed by means of three different noise reduction techniques. The processed data are then utilized by one to four algorithms, depending on the accuracy desired, to determine the end-point.

  2. Prediction of eyespot infection risks

    Directory of Open Access Journals (Sweden)

    M. Váňová

    2012-12-01

    Full Text Available The objective of the study was to design a prediction model for eyespot (Tapesia yallundae) infection based on climatic factors (temperature, precipitation, air humidity). Data from the experiment years 1994-2002 were used to study correlations between the eyespot infection index and individual weather characteristics. The prediction model was constructed using multiple regression, with a separate parameter assigned to each factor, i.e. the frequency of days with optimum temperature, humidity, and precipitation. The correlation between relative air humidity and precipitation and the infection index is significant.

  3. Can we predict nuclear proliferation

    International Nuclear Information System (INIS)

    Tertrais, Bruno

    2011-01-01

    The author aims at improving nuclear proliferation prediction capacities, i.e. the capacities to identify countries likely to acquire nuclear weapons, to interpret sensitive activities, and to assess nuclear program modalities. He first proposes a retrospective assessment of counter-proliferation actions since 1945. Then, based on academic studies, he analyzes the causes and motivations of proliferation, notably the possible existence of a chain phenomenon (mechanisms leading from one program to another). He makes recommendations for a global approach to proliferation prediction, and proposes proliferation indices and indicators.

  4. CERAPP: Collaborative Estrogen Receptor Activity Prediction Project

    Data.gov (United States)

    U.S. Environmental Protection Agency — Data from a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) demonstrating using predictive computational...

  5. The Challenge of Weather Prediction

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education. The Challenge of Weather Prediction: Old and Modern Ways of Weather Forecasting. B N Goswami. Series Article, Volume 2, Issue 3, March 1997, pp 8-15.

  6. Predictability of weather and climate

    National Research Council Canada - National Science Library

    Palmer, Tim; Hagedorn, Renate

    2006-01-01

    ... and anthropogenic climate change are among those included. Ensemble systems for forecasting predictability are discussed extensively. Ed Lorenz, father of chaos theory, makes a contribution to theoretical analysis with a previously unpublished paper. This well-balanced volume will be a valuable resource for many years. High-quality chapter autho...

  7. Evaluation of environmental impact predictions

    International Nuclear Information System (INIS)

    Cunningham, P.A.; Adams, S.M.; Kumar, K.D.

    1977-01-01

    An analysis and evaluation of the ecological monitoring program at the Surry Nuclear Power Plant showed that predictions of potential environmental impact made in the Final Environmental Statement (FES), which were based on generally accepted ecological principles, were not completely substantiated by environmental monitoring data. The Surry Nuclear Power Plant (Units 1 and 2) was chosen for study because of the facility's relatively continuous operating history and the availability of environmental data adequate for analysis. Preoperational and operational fish monitoring data were used to assess the validity of the FES prediction that fish would congregate in the thermal plume during winter months and would avoid the plume during summer months. Analysis of monitoring data showed that fish catch per unit effort (CPE) was generally high in the thermal plume during winter months; however, the highest fish catches occurred in the plume during the summer. Possible explanations for differences between the FES prediction and results observed in analysis of monitoring data are discussed, and general recommendations are outlined for improving impact assessment predictions

  8. Using Predictability for Lexical Segmentation.

    Science.gov (United States)

    Çöltekin, Çağrı

    2017-09-01

    This study investigates a strategy based on predictability of consecutive sub-lexical units in learning to segment a continuous speech stream into lexical units using computational modeling and simulations. Lexical segmentation is one of the early challenges during language acquisition, and it has been studied extensively through psycholinguistic experiments as well as computational methods. However, despite strong empirical evidence, the explicit use of predictability of basic sub-lexical units in models of segmentation is underexplored. This paper presents an incremental computational model of lexical segmentation for exploring the usefulness of predictability for lexical segmentation. We show that the predictability cue is a strong cue for segmentation. Contrary to earlier reports in the literature, the strategy yields state-of-the-art segmentation performance with an incremental computational model that uses only this particular cue in a cognitively plausible setting. The paper also reports an in-depth analysis of the model, investigating the conditions affecting the usefulness of the strategy. Copyright © 2016 Cognitive Science Society, Inc.
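
    A minimal version of the predictability cue: estimate transitional probabilities between consecutive sub-lexical units from the unsegmented stream itself, and posit a word boundary wherever the next unit becomes hard to predict. This is a hedged sketch of the general idea (batch, bigram-based, with an assumed threshold), not the paper's incremental model.

```python
from collections import Counter

def segment(stream, threshold=0.75):
    """Insert a boundary after position i when P(stream[i+1] | stream[i]) < threshold."""
    pairs = Counter(zip(stream, stream[1:]))
    unigrams = Counter(stream[:-1])
    words, start = [], 0
    for i in range(len(stream) - 1):
        tp = pairs[stream[i], stream[i + 1]] / unigrams[stream[i]]
        if tp < threshold:          # low predictability -> likely word boundary
            words.append(stream[start:i + 1])
            start = i + 1
    words.append(stream[start:])
    return words
```

    On an artificial stream built from the "words" ba, di, and gu, within-word transitions are fully predictable while cross-word transitions are not, so the boundaries fall in the right places.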

  9. Solution Patterns Predicting Pythagorean Triples

    Science.gov (United States)

    Ezenweani, Ugwunna Louis

    2013-01-01

    Pythagoras Theorem is an old mathematical treatise that has traversed the school curricula from secondary to tertiary levels. The patterns it produced are quite interesting that many researchers have tried to generate a kind of predictive approach to identifying triples. Two attempts, namely Diophantine equation and Brahmagupta trapezium presented…

  10. Predicting response to epigenetic therapy

    DEFF Research Database (Denmark)

    Treppendahl, Marianne B; Sommer Kristensen, Lasse; Grønbæk, Kirsten

    2014-01-01

    of good pretreatment predictors of response is of great value. Many clinical parameters and molecular targets have been tested in preclinical and clinical studies with varying results, leaving room for optimization. Here we provide an overview of markers that may predict the efficacy of FDA- and EMA...

  11. Predicting Volleyball Serve-Reception

    NARCIS (Netherlands)

    Paulo, Ana; Zaal, Frank T J M; Fonseca, Sofia; Araujo, Duarte

    2016-01-01

    Serve and serve-reception performance have predicted success in volleyball. Given the impact of serve-reception on the game, we aimed at understanding what it is in the serve and receiver's actions that determines the selection of the type of pass used in serve-reception and its efficacy. Four

  12. Prediction of electric vehicle penetration.

    Science.gov (United States)

    2017-05-01

    The objective of this report is to present the current market status of plug-in electric vehicles (PEVs) and to predict their future penetration within the world and U.S. markets. The sales values for 2016 show a strong year of PEV sales both in the...

  13. Evoked Emotions Predict Food Choice

    NARCIS (Netherlands)

    Dalenberg, Jelle R.; Gutjar, Swetlana; ter Horst, Gert J.; de Graaf, Kees; Renken, Remco J.; Jager, Gerry

    2014-01-01

    In the current study we show that non-verbal food-evoked emotion scores significantly improve food choice prediction over merely liking scores. Previous research has shown that liking measures correlate with choice. However, liking is no strong predictor for food choice in real life environments.

  14. Framework for Traffic Congestion Prediction

    NARCIS (Netherlands)

    Zaki, J.F.W.; Ali-Eldin, A.M.T.; Hussein, S.E.; Saraya, S.F.; Areed, F.F.

    2016-01-01

    Traffic Congestion is a complex dilemma facing most major cities. It has undergone a lot of research since the early 80s in an attempt to predict traffic in the short-term. Recently, Intelligent Transportation Systems (ITS) became an integral part of traffic research which helped in modeling and

  15. Predicting Character Traits Through Reddit

    Science.gov (United States)

    2015-01-01

    and even employers. Companies like Netflix also use personality classification algorithms in order to provide users with predictions of movies to watch next.

  16. Prediction of natural gas consumption

    International Nuclear Information System (INIS)

    Zhang, R.L.; Walton, D.J.; Hoskins, W.D.

    1993-01-01

    Distributors of natural gas need to predict future consumption in order to purchase a sufficient supply on contract. Distributors that offer their customers equal payment plans need to predict the consumption of each customer 12 months in advance. Estimates of previous consumption are often used for months when meters are inaccessible or read bimonthly. Existing methods of predicting natural gas consumption, and a proposed new method for each local region, are discussed. The proposed model distinguishes the consumption load factors of summer from those of the other seasons by adjusting them with two introduced parameters. The problem is then reduced to a quadratic programming problem. However, since it is not necessary to use both parameters simultaneously, the problem can be solved with a simple iterative procedure. Results show that the new model improves on the two-equation model to a certain extent. Adjusting the heat load factor reduces the prediction error markedly, while adjusting the base load factor influences the error only marginally. 3 refs., 11 figs., 2 tabs
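
    The base-load/heat-load decomposition underlying such models is, in its simplest form, a regression of monthly consumption on heating demand. A hedged sketch with hypothetical numbers follows; the paper's model adds the two adjustment parameters and the iterative procedure, which are not reproduced here.

```python
def fit_base_heat(hdd, use):
    """Closed-form least squares for use = base + heat * HDD."""
    n = len(hdd)
    mh, mu = sum(hdd) / n, sum(use) / n
    heat = sum((h - mh) * (u - mu) for h, u in zip(hdd, use)) / \
           sum((h - mh) ** 2 for h in hdd)
    return mu - heat * mh, heat

# hypothetical monthly heating-degree-days and consumption volumes
hdd = [0.0, 10.0, 20.0, 30.0]
use = [40.0, 60.0, 80.0, 100.0]   # exactly base 40 plus 2 units per degree-day
base, heat = fit_base_heat(hdd, use)
```

    The base load is the weather-independent consumption (cooking, hot water); the heat load scales with heating-degree-days, which is why adjusting it has the larger effect on prediction error.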

  17. Prediction of Subsidence Depression Development

    Czech Academy of Sciences Publication Activity Database

    Doležalová, Hana; Kajzar, Vlastimil

    2017-01-01

    Roč. 6, č. 4 (2017), s. 208-214 E-ISSN 2391-9361. [Cross-border Exchange of Experience in Production Engineering Using Principles of Mathematics. Rybnik, 07.06.2017-09.06.2017] Institutional support: RVO:68145535 Keywords : undermining * prediction * regression analysis Subject RIV: DH - Mining, incl. Coal Mining OBOR OECD: Mining and mineral processing

  18. Bankruptcy Prediction with Rough Sets

    NARCIS (Netherlands)

    J.C. Bioch (Cor); V. Popova (Viara)

    2001-01-01

    The bankruptcy prediction problem can be considered an ordinal classification problem. The classical theory of Rough Sets describes objects by discrete attributes, and does not take into account the ordering of the attribute values. This paper proposes a modification of the Rough Set

  19. Climate Prediction Center - monthly Outlook

    Science.gov (United States)

    National Weather Service, Climate Prediction Center: official monthly climate outlooks (June 2018), produced with tools including Canonical Correlation Analysis (CCA), Ensemble Canonical Correlation Analysis (ECCA), and Optimal Climate Normals.

  20. Climate Prediction Center - Site Index

    Science.gov (United States)

    National Weather Service, Climate Prediction Center site index: bulletins and assessments including the Annual Winter Stratospheric Ozone report, the Climate Diagnostics Bulletin, hazards outlooks, and seasonal climate assessments (Dec. 1999-Feb. 2000; Mar-May 2000).

  1. Predictive medical information and underwriting.

    Science.gov (United States)

    Dodge, John H

    2007-01-01

    Medical underwriting involves the application of actuarial science by analyzing medical information to predict the future risk of a claim. The objective is that individuals with like risk are treated in a like manner so that the premium paid is proportional to the risk of future claim.

  2. Can Creativity Predict Cognitive Reserve?

    Science.gov (United States)

    Palmiero, Massimiliano; Di Giacomo, Dina; Passafiume, Domenico

    2016-01-01

    Cognitive reserve relies on the ability to effectively cope with aging and brain damage by using alternate processes to approach tasks when standard approaches are no longer available. In this study, the issue of whether creativity can predict cognitive reserve has been explored. Forty participants (mean age: 61 years) filled out: the Cognitive Reserve…

  3. A prediction for bubbling geometries

    OpenAIRE

    Okuda, Takuya

    2007-01-01

    We study the supersymmetric circular Wilson loops in N=4 Yang-Mills theory. Their vacuum expectation values are computed in the parameter region that admits smooth bubbling geometry duals. The results are a prediction for the supergravity action evaluated on the bubbling geometries for Wilson loops.

  4. Detecting failure of climate predictions

    Science.gov (United States)

    Runge, Michael C.; Stroeve, Julienne C.; Barrett, Andrew P.; McDonald-Madden, Eve

    2016-01-01

    The practical consequences of climate change challenge society to formulate responses that are more suited to achieving long-term objectives, even if those responses have to be made in the face of uncertainty [1, 2]. Such a decision-analytic focus uses the products of climate science as probabilistic predictions about the effects of management policies [3]. Here we present methods to detect when climate predictions are failing to capture the system dynamics. For a single model, we measure goodness of fit based on the empirical distribution function, and define failure when the distribution of observed values significantly diverges from the modelled distribution. For a set of models, the same statistic can be used to provide relative weights for the individual models, and we define failure when there is no linear weighting of the ensemble models that produces a satisfactory match to the observations. Early detection of failure of a set of predictions is important for improving model predictions and the decisions based on them. We show that these methods would have detected a range shift in northern pintail 20 years before it was actually discovered, and are increasingly giving more weight to those climate models that forecast a September ice-free Arctic by 2055.
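
    The single-model test described above, comparing the empirical distribution function of observations with the modelled distribution, is essentially a Kolmogorov-Smirnov-style statistic. A sketch in two-sample form, using the standard large-sample 5% critical value; this is illustrative, not the authors' exact procedure.

```python
def ks_statistic(obs, model):
    """Maximum gap between the two empirical distribution functions."""
    grid = sorted(set(obs) | set(model))
    edf = lambda sample, x: sum(1 for v in sample if v <= x) / len(sample)
    return max(abs(edf(obs, x) - edf(model, x)) for x in grid)

def prediction_failed(obs, model, coeff=1.36):
    """Flag failure when the EDF gap exceeds the approximate 5% KS threshold."""
    n, m = len(obs), len(model)
    return ks_statistic(obs, model) > coeff * ((n + m) / (n * m)) ** 0.5
```

    When observations drift away from what the model distribution allows, the EDF gap grows toward 1 and the failure flag trips, which is the early-warning behavior the paper is after.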

  5. Predicting severity of paranoid schizophrenia

    OpenAIRE

    Kolesnichenko Elena Vladimirovna

    2015-01-01

    Clinical symptoms, course and outcomes of paranoid schizophrenia are polymorphic. 206 cases of paranoid schizophrenia were investigated. Clinical predictors were collected from hospital records and interviews. The severity of schizophrenia was assessed quantitatively using special indexes. Schizoid, epileptoid, psychasthenic and conformal accentuations of personality in the premorbid period, early onset of psychosis, and paranoid and hallucinatory-paranoid variants of onset predicted more expressed ...

  6. Predictability of Mobile Phone Associations

    DEFF Research Database (Denmark)

    Jensen, Bjørn Sand; Larsen, Jan; Hansen, Lars Kai

    2010-01-01

    Prediction and understanding of human behavior is of high importance in many modern applications and research areas ranging from context-aware services and wireless resource allocation to social sciences. In this study we collect a novel dataset using standard mobile phones and analyze how the predictability of mobile sensors, acting as proxies for humans, changes with time scale and sensor type such as GSM and WLAN. Applying recent information theoretic methods, it is demonstrated that an upper bound on predictability is relatively high for all sensors given the complete history (typically above 90...). This is of vital interest in the development of context-aware services which rely on forecasting based on mobile phone sensors.

  7. Numerical prediction of slamming loads

    DEFF Research Database (Denmark)

    Seng, Sopheak; Jensen, Jørgen J; Pedersen, Preben T

    2012-01-01

    It is important to include the contribution of the slamming-induced response in the structural design of large vessels with a significant bow flare. At the same time it is a challenge to develop rational tools to determine the slamming-induced loads and the prediction of their occurrence. Today i...

  8. Predictive models of moth development

    Science.gov (United States)

    Degree-day models link ambient temperature to insect life-stages, making such models valuable tools in integrated pest management. These models increase management efficacy by predicting pest phenology. In Wisconsin, the top insect pest of cranberry production is the cranberry fruitworm, Acrobasis v...
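
    The degree-day idea is simple to state: accumulate daily heat above a developmental base temperature and predict a life-stage event when a required total is reached. A generic sketch with hypothetical thresholds follows; the abstract does not give the cranberry-fruitworm parameters, so the base temperature and required total here are assumptions.

```python
def degree_day(tmin, tmax, base):
    """Daily heat units from the simple averaging method."""
    return max(0.0, (tmin + tmax) / 2.0 - base)

def predict_event_day(daily_min_max, required_gdd, base=10.0):
    """First day on which accumulated degree-days reach the required total."""
    total = 0.0
    for day, (tmin, tmax) in enumerate(daily_min_max, start=1):
        total += degree_day(tmin, tmax, base)
        if total >= required_gdd:
            return day
    return None   # threshold not reached within the record

# hypothetical week of (min, max) temperatures in degrees C
weather = [(10.0, 20.0)] * 5          # 5 degree-days accumulate each day
```

    Management timing then follows directly: the predicted event day tells the grower when scouting or treatment should begin.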

  9. Prediction of Malaysian monthly GDP

    Science.gov (United States)

    Hin, Pooi Ah; Ching, Soo Huei; Yeing, Pan Wei

    2015-12-01

    The paper attempts to use a method based on the multivariate power-normal distribution to predict the Malaysian Gross Domestic Product for the next month. Letting r(t) be the vector consisting of the month-t values of m selected macroeconomic variables and GDP, we model the month-(t+1) GDP to be dependent on the present and l-1 past values r(t), r(t-1),…,r(t-l+1) via a conditional distribution which is derived from a [(m+1)l+1]-dimensional power-normal distribution. The 100(α/2)% and 100(1-α/2)% points of the conditional distribution may be used to form an out-of-sample prediction interval. This interval, together with the mean of the conditional distribution, may be used to predict the month-(t+1) GDP. The mean absolute percentage error (MAPE), estimated coverage probability and average length of the prediction interval are used as the criteria for selecting the suitable lag value l-1 and the subset from a pool of 17 macroeconomic variables. It is found that the relatively better models are those with 2 ≤ l ≤ 3 that involve one or two of the macroeconomic variables Market Indicative Yield, Oil Prices, Exchange Rate and Import Trade.
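
    The evaluation criteria named above (MAPE, estimated coverage probability, and average interval length) are straightforward to compute. A sketch with hypothetical numbers; the power-normal conditional distribution itself is not reproduced here.

```python
def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 / len(actual) * sum(abs((a - p) / a)
                                     for a, p in zip(actual, predicted))

def interval_stats(actual, lower, upper):
    """Estimated coverage probability and average length of prediction intervals."""
    hits = sum(1 for a, lo, hi in zip(actual, lower, upper) if lo <= a <= hi)
    avg_len = sum(hi - lo for lo, hi in zip(lower, upper)) / len(actual)
    return hits / len(actual), avg_len

# hypothetical out-of-sample check
cov, avg_len = interval_stats([1.0, 2.0, 3.0], [0.0, 0.0, 4.0], [2.0, 3.0, 5.0])
```

    Model selection then trades these off: low MAPE and short intervals are preferred, subject to the coverage probability staying near its nominal level.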

  10. Cast iron - a predictable material

    Directory of Open Access Journals (Sweden)

    Jorg C. Sturm

    2011-02-01

    Full Text Available High strength compacted graphite iron (CGI) or alloyed cast iron components are substituting previously used non-ferrous castings in automotive power train applications. The mechanical engineering industry has recognized the value in substituting forged or welded structures with stiff and light-weight cast iron castings. New products such as wind turbines have opened new markets for an entire suite of highly reliable ductile iron cast components. During the last 20 years, casting process simulation has developed from predicting hot spots and solidification into an integral assessment tool for foundries for the entire manufacturing route of castings. The support of the feeding-related layout of the casting is still one of the most important duties for casting process simulation. Depending on the alloy poured, different feeding behaviors and self-feeding capabilities need to be considered to provide a defect-free casting. Therefore, it is not enough to base the prediction of shrinkage defects solely on hot spots derived from temperature fields. To be able to quantitatively predict these defects, solidification simulation had to be combined with density and mass transport calculations, in order to evaluate the impact of the solidification morphology on the feeding behavior as well as to consider alloy-dependent feeding ranges. For cast iron foundries, the use of casting process simulation has become an important instrument to predict the robustness and reliability of their processes, especially since the influence of alloying elements, melting practice and metallurgy need to be considered to quantify the special shrinkage and solidification behavior of cast iron. This allows the prediction of local structures, phases and ultimately the local mechanical properties of cast irons, to assess casting quality in the foundry but also to make use of this quantitative information during design of the casting.
Casting quality issues related to thermally driven

  11. HUMAN DECISIONS AND MACHINE PREDICTIONS.

    Science.gov (United States)

    Kleinberg, Jon; Lakkaraju, Himabindu; Leskovec, Jure; Ludwig, Jens; Mullainathan, Sendhil

    2018-02-01

    Can machine learning improve human decision making? Bail decisions provide a good test case. Millions of times each year, judges make jail-or-release decisions that hinge on a prediction of what a defendant would do if released. The concreteness of the prediction task combined with the volume of data available makes this a promising machine-learning application. Yet comparing the algorithm to judges proves complicated. First, the available data are generated by prior judge decisions. We only observe crime outcomes for released defendants, not for those the judges detained. This makes it hard to evaluate counterfactual decision rules based on algorithmic predictions. Second, judges may have a broader set of preferences than the variable the algorithm predicts; for instance, judges may care specifically about violent crimes or about racial inequities. We deal with these problems using different econometric strategies, such as quasi-random assignment of cases to judges. Even accounting for these concerns, our results suggest potentially large welfare gains: one policy simulation shows crime reductions up to 24.7% with no change in jailing rates, or jailing rate reductions up to 41.9% with no increase in crime rates. Moreover, all categories of crime, including violent crimes, show reductions; and these gains can be achieved while simultaneously reducing racial disparities. These results suggest that while machine learning can be valuable, realizing this value requires integrating these tools into an economic framework: being clear about the link between predictions and decisions; specifying the scope of payoff functions; and constructing unbiased decision counterfactuals. JEL Codes: C10 (Econometric and statistical methods and methodology), C55 (Large datasets: Modeling and analysis), K40 (Legal procedure, the legal system, and illegal behavior).

  12. Ocean eddies and climate predictability.

    Science.gov (United States)

    Kirtman, Ben P; Perlin, Natalie; Siqueira, Leo

    2017-12-01

    A suite of coupled climate model simulations and experiments are used to examine how resolved mesoscale ocean features affect aspects of climate variability, air-sea interactions, and predictability. In combination with control simulations, experiments with the interactive ensemble coupling strategy are used to further amplify the role of the oceanic mesoscale field and the associated air-sea feedbacks and predictability. The basic intent of the interactive ensemble coupling strategy is to reduce the atmospheric noise at the air-sea interface, allowing an assessment of how noise affects the variability, and in this case, it is also used to diagnose predictability from the perspective of signal-to-noise ratios. The climate variability is assessed from the perspective of sea surface temperature (SST) variance ratios, and it is shown that, unsurprisingly, mesoscale variability significantly increases SST variance. Perhaps surprising is the fact that the presence of mesoscale ocean features even further enhances the SST variance in the interactive ensemble simulation beyond what would be expected from simple linear arguments. Changes in the air-sea coupling between simulations are assessed using pointwise convective rainfall-SST and convective rainfall-SST tendency correlations and again emphasize how the oceanic mesoscale alters the local association between convective rainfall and SST. Understanding the possible relationships between the SST-forced signal and the weather noise is critically important in climate predictability. We use the interactive ensemble simulations to diagnose this relationship, and we find that the presence of mesoscale ocean features significantly enhances this link particularly in ocean eddy rich regions. Finally, we use signal-to-noise ratios to show that the ocean mesoscale activity increases model estimated predictability in terms of convective precipitation and atmospheric upper tropospheric circulation.
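
    The signal-to-noise diagnostic described above can be sketched in a toy form: treat the ensemble mean at each time step as the SST-forced signal and each member's deviation from that mean as weather noise. The numbers below are illustrative assumptions, not model output.

```python
import statistics

# Hypothetical toy ensemble: three members, four time steps of an SST-driven
# quantity (e.g., convective precipitation anomalies at one grid point).
ensemble = [
    [0.1, 0.4, -0.2, 0.6],   # member 1
    [0.3, 0.5, -0.1, 0.4],   # member 2
    [0.2, 0.3, -0.3, 0.5],   # member 3
]

n_steps = len(ensemble[0])

# Signal: variance over time of the ensemble mean (the forced component).
means = [statistics.mean(m[t] for m in ensemble) for t in range(n_steps)]
signal_var = statistics.pvariance(means)

# Noise: variance of each member's deviation from the ensemble mean.
deviations = [m[t] - means[t] for m in ensemble for t in range(n_steps)]
noise_var = statistics.pvariance(deviations)

snr = signal_var / noise_var
print(f"signal-to-noise ratio: {snr:.2f}")
```

    A higher ratio corresponds to higher model-estimated predictability; the interactive ensemble strategy in the abstract works by shrinking the noise term.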

  13. Predicting steam generator crevice chemistry

    International Nuclear Information System (INIS)

    Burton, G.; Strati, G.

    2006-01-01

    'Full text:' Corrosion of steam cycle components produces insoluble material, mostly iron oxides, that are transported to the steam generator (SG) via the feedwater and deposited on internal surfaces such as the tubes, tube support plates and the tubesheet. The build-up of these corrosion products over time can lead to regions of restricted flow with water chemistry that may be significantly different, and potentially more corrosive to SG tube material, than the bulk steam generator water chemistry. The aim of the present work is to predict SG crevice chemistry using experimentation and modelling as part of AECL's overall strategy for steam generator life management. Hideout-return experiments are performed under CANDU steam generator conditions to assess the hideout of impurities in, and their return from, model crevices. The results are used to validate the ChemSolv model that predicts steam generator crevice impurity concentrations, and high temperature pH, based on process parameters (e.g., heat flux, primary side temperature) and blowdown water chemistry. The model has been incorporated into ChemAND, AECL's system health monitoring software for chemistry monitoring, analysis and diagnostics that has been installed at two domestic and one international CANDU station. ChemAND provides the station chemists with the only method to predict SG crevice chemistry. In one recent application, the software has been used to evaluate the crevice chemistry based on the elevated, but balanced, SG bulk water impurity concentrations present during reactor startup, in order to reduce hold times. The present paper will describe recent hideout-return experiments that are used for the validation of the ChemSolv model, station experience using the software, and improvements to predict the crevice electrochemical potential that will permit station staff to ensure that the SG tubes are in the 'safe operating zone' predicted by Lu (AECL). (author)

  14. Predicting outcome of status epilepticus.

    Science.gov (United States)

    Leitinger, M; Kalss, G; Rohracher, A; Pilz, G; Novak, H; Höfler, J; Deak, I; Kuchukhidze, G; Dobesberger, J; Wakonig, A; Trinka, E

    2015-08-01

    Status epilepticus (SE) is a frequent neurological emergency complicated by high mortality and often poor functional outcome in survivors. The aim of this study was to review available clinical scores to predict outcome. Literature review: PubMed search terms were "score", "outcome", and "status epilepticus" (April 9th 2015). Publications with abstracts available in English were included, with no other language restrictions or restrictions concerning the investigated patients. Two scores were identified: "Status Epilepticus Severity Score--STESS" and "Epidemiology based Mortality score in SE--EMSE". A comprehensive comparison of test parameters concerning performance, options, and limitations was performed. EMSE allows detailed individualization of risk factors and is significantly superior to STESS in a retrospective explorative study. In particular, EMSE is very good at detecting both good and bad outcome, whereas STESS's detection of bad outcome is limited by a ceiling effect and uncertainty about the correct cutoff value. EMSE can be adapted to different regions of the world and to advances in medicine, as new data emerge. In addition, we designed a reporting standard for status epilepticus to enhance acquisition and communication of outcome-relevant data. A data acquisition sheet, used from patient admission in the emergency room through the EEG lab to the intensive care unit, is provided for optimized data collection. STESS is easy to perform and predicts bad outcome, but has a low predictive value for good outcome. EMSE is superior to STESS in predicting good or bad outcome but needs marginally more time to perform. EMSE may prove very useful for risk stratification in interventional studies and is recommended for individual outcome prediction.
Prospective validation in different cohorts is needed for EMSE, whereas

  15. Multiphase, multicomponent phase behavior prediction

    Science.gov (United States)

    Dadmohammadi, Younas

    Accurate prediction of phase behavior of fluid mixtures in the chemical industry is essential for designing and operating a multitude of processes. Reliable generalized predictions of phase equilibrium properties, such as pressure, temperature, and phase compositions, offer an attractive alternative to costly and time consuming experimental measurements. The main purpose of this work was to assess the efficacy of recently generalized activity coefficient models based on binary experimental data to (a) predict binary and ternary vapor-liquid equilibrium systems, and (b) characterize liquid-liquid equilibrium systems. These studies were completed using a diverse binary VLE database consisting of 916 binary and 86 ternary systems involving 140 compounds belonging to 31 chemical classes. Specifically the following tasks were undertaken: First, a comprehensive assessment of the two common approaches (gamma-phi (gamma-ϕ) and phi-phi (ϕ-ϕ)) used for determining the phase behavior of vapor-liquid equilibrium systems is presented. Both the representation and predictive capabilities of these two approaches were examined, as delineated from internal and external consistency tests of 916 binary systems. For this purpose, the universal quasi-chemical (UNIQUAC) model and the Peng-Robinson (PR) equation of state (EOS) were used in this assessment. Second, the efficacy of the recently developed generalized UNIQUAC and nonrandom two-liquid (NRTL) models for predicting multicomponent VLE systems was investigated. Third, the abilities of recently modified NRTL models (mNRTL2 and mNRTL1) to characterize liquid-liquid equilibria (LLE) phase conditions and attributes, including phase stability, miscibility, and consolute point coordinates, were assessed. The results of this work indicate that the ϕ-ϕ approach represents the binary VLE systems considered within three times the error of the gamma-ϕ approach. A similar trend was observed for the generalized model predictions using
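
    The gamma-phi approach contrasted above can be sketched for a binary system at low pressure, where the vapor-phase fugacity coefficients are taken as unity and liquid-phase non-ideality enters through activity coefficients. The sketch below uses a two-parameter Margules model as a simpler stand-in for the UNIQUAC model used in the study; all parameter and saturation-pressure values are illustrative assumptions, not regressed data.

```python
import math

def margules_gammas(x1, a12, a21):
    """Two-parameter Margules activity coefficients for a binary liquid."""
    x2 = 1.0 - x1
    ln_g1 = x2**2 * (a12 + 2.0 * (a21 - a12) * x1)
    ln_g2 = x1**2 * (a21 + 2.0 * (a12 - a21) * x2)
    return math.exp(ln_g1), math.exp(ln_g2)

def bubble_pressure(x1, psat1, psat2, a12, a21):
    """Modified Raoult's law (gamma-phi, ideal vapor):
    P = x1*g1*Psat1 + x2*g2*Psat2, with y1 = x1*g1*Psat1 / P."""
    g1, g2 = margules_gammas(x1, a12, a21)
    p = x1 * g1 * psat1 + (1.0 - x1) * g2 * psat2
    y1 = x1 * g1 * psat1 / p
    return p, y1

# Illustrative numbers: saturation pressures in kPa, Margules parameters
# dimensionless; none of these come from the study's database.
p, y1 = bubble_pressure(x1=0.4, psat1=75.0, psat2=45.0, a12=0.5, a21=0.8)
print(f"bubble pressure = {p:.1f} kPa, vapor mole fraction y1 = {y1:.3f}")
```

    In the phi-phi approach, by contrast, both phases would be described by a single equation of state (such as PR) via fugacity coefficients, with no activity model.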

  16. Branch prediction in the pentium family

    DEFF Research Database (Denmark)

    Fog, Agner

    1998-01-01

    How the branch prediction mechanism in the Pentium has been uncovered with all its quirks, and the incredibly more effective branch prediction in the later versions.
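
    As a hedged illustration of the kind of mechanism involved, here is the textbook two-bit saturating-counter predictor that dynamic schemes like the Pentium's build on; the real hardware has the quirks the article documents, which this toy model does not reproduce.

```python
class TwoBitPredictor:
    """Two-bit saturating counter: states 0-1 predict 'not taken',
    states 2-3 predict 'taken'."""

    def __init__(self):
        self.state = 2  # start weakly taken

    def predict(self):
        return self.state >= 2

    def update(self, taken):
        # Saturate at 0 and 3 so one stray outcome can't flip the prediction.
        if taken:
            self.state = min(3, self.state + 1)
        else:
            self.state = max(0, self.state - 1)

# A loop branch (taken, taken, ..., not taken at exit) is mispredicted
# only once per loop exit once the counter has saturated.
p = TwoBitPredictor()
history = [True] * 7 + [False]
mispredicts = sum(1 for outcome in history
                  if p.predict() != outcome or p.update(outcome))
print(f"mispredictions: {mispredicts} of {len(history)}")
```

    Note `p.update(outcome)` returns `None` (falsy), so the generator both counts mispredictions and advances the counter in one pass.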

  17. Neural Networks for protein Structure Prediction

    DEFF Research Database (Denmark)

    Bohr, Henrik

    1998-01-01

    This is a review of neural network applications in bioinformatics, especially applications to protein structure prediction, e.g. prediction of secondary structures, prediction of surface structure, fold class recognition and prediction of the 3-dimensional structure of protein backbones...

  18. Semen analysis and prediction of natural conception

    NARCIS (Netherlands)

    Leushuis, Esther; van der Steeg, Jan Willem; Steures, Pieternel; Repping, Sjoerd; Bossuyt, Patrick M. M.; Mol, Ben Willem J.; Hompes, Peter G. A.; van der Veen, Fulco

    2014-01-01

    Do two semen analyses predict natural conception better than a single semen analysis and will adding the results of repeated semen analyses to a prediction model for natural pregnancy improve predictions? A second semen analysis does not add helpful information for predicting natural conception.

  19. Time-Predictable Virtual Memory

    DEFF Research Database (Denmark)

    Puffitsch, Wolfgang; Schoeberl, Martin

    2016-01-01

    Virtual memory is an important feature of modern computer architectures. For hard real-time systems, memory protection is a particularly interesting feature of virtual memory. However, current memory management units are not designed for time-predictability and therefore cannot be used...... in such systems. This paper investigates the requirements on virtual memory from the perspective of hard real-time systems and presents the design of a time-predictable memory management unit. Our evaluation shows that the proposed design can be implemented efficiently. The design allows address translation...... and address range checking in constant time of two clock cycles on a cache miss. This constant time is in strong contrast to the possible cost of a miss in a translation look-aside buffer in traditional virtual memory organizations. Compared to a platform without a memory management unit, these two additional...
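
    A hedged sketch of the constant-time translate-and-check idea: every lookup does the same fixed amount of work (one table read, one compare, one add), so worst-case timing is easy to bound, unlike a TLB miss that triggers a variable-length page-table walk. The segment-table layout and names below are illustrative assumptions, not the paper's actual hardware design.

```python
# Illustrative segment table: segment id -> (physical base, length in bytes).
SEGMENTS = {
    0: (0x1000, 0x400),
    1: (0x8000, 0x100),
}

def translate(segment, offset):
    """Translate (segment, offset) to a physical address in constant time,
    raising on an out-of-range access."""
    base, length = SEGMENTS[segment]
    if offset >= length:          # address-range check
        raise MemoryError("segment fault")
    return base + offset          # address translation

print(hex(translate(0, 0x10)))    # in-bounds access in segment 0
```

    The point of the design in the paper is that both steps fit in a fixed two-cycle budget on a cache miss, making the memory protection usable in hard real-time worst-case analysis.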

  20. Predicting responses from Rasch measures.

    Science.gov (United States)

    Linacre, John M

    2010-01-01

    There is a growing family of Rasch models for polytomous observations. Selecting a suitable model for an existing dataset, estimating its parameters and evaluating its fit is now routine. Problems arise when the model parameters are to be estimated from the current data, but used to predict future data. In particular, ambiguities in the nature of the current data, or overfit of the model to the current dataset, may mean that better fit to the current data may lead to worse fit to future data. The predictive power of several Rasch and Rasch-related models is discussed in the context of the Netflix Prize. Rasch-related models are proposed based on Singular Value Decomposition (SVD) and Boltzmann Machines.
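
    A minimal sketch of the SVD idea in the Netflix-Prize style: factor a person-by-item response matrix and reconstruct it from the leading singular component, a rank-1 "ability times difficulty" style smoothing of the observed responses. The matrix and the choice of rank are illustrative assumptions.

```python
import numpy as np

# Hypothetical polytomous responses: rows are persons, columns are items.
R = np.array([
    [2, 3, 1, 4],
    [1, 2, 1, 3],
    [3, 4, 2, 4],
], dtype=float)

U, s, Vt = np.linalg.svd(R, full_matrices=False)

# Keep only the leading singular component; the rank-1 reconstruction
# smooths the observations and can be used to predict future responses.
k = 1
R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

print(np.round(R_hat, 2))
```

    Unlike a Rasch model, the reconstruction is multiplicative rather than a logistic function of person and item parameters, which is one reason the article treats SVD as Rasch-related rather than Rasch proper.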