WorldWideScience

Sample records for analysts predict further

  1. Analyst-to-Analyst Variability in Simulation-Based Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Glickman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Romero, Vicente J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes findings from the culminating experiment of the LDRD project entitled, "Analyst-to-Analyst Variability in Simulation-Based Prediction". For this experiment, volunteer participants solving a given test problem in engineering and statistics were interviewed at different points in their solution process. These interviews are used to trace differing solutions to differing solution processes, and differing processes to differences in reasoning, assumptions, and judgments. The issue that the experiment was designed to illuminate -- our paucity of understanding of the ways in which humans themselves have an impact on predictions derived from complex computational simulations -- is a challenging and open one. Although solution of the test problem by analyst participants in this experiment has taken much more time than originally anticipated, and is continuing past the end of this LDRD, this project has provided a rare opportunity to explore analyst-to-analyst variability in significant depth, from which we derive evidence-based insights to guide further explorations in this important area.

  2. Essays on financial analysts' forecasts

    OpenAIRE

    Rodriguez, Marius del Giudice

    2006-01-01

    This dissertation contains three self-contained chapters dealing with specific aspects of financial analysts' earnings forecasts. After recent accounting scandals, much attention has turned to the incentives present in the career of professional financial analysts. The literature points to several reasons why financial analysts behave overoptimistically when providing their predictions. In particular, analysts may wish to maintain good relations with firm management, to please the underwriter...

  3. Analysts' forecast error: A robust prediction model and its short term trading

    NARCIS (Netherlands)

    Boudt, Kris; de Goeij, Peter; Thewissen, James; Van Campenhout, Geert

    We examine the profitability of implementing a short term trading strategy based on predicting the error in analysts' earnings per share forecasts using publicly available information. Since large earnings surprises may lead to extreme values in the forecast error series that disrupt their smooth

  4. Learning about Analysts

    DEFF Research Database (Denmark)

    Rudiger, Jesper; Vigier, Adrien

    2017-01-01

    We examine an analyst with career concerns making cheap talk recommendations to a sequence of traders, each of whom possesses noisy private information concerning the analyst's ability. Each period, the reputation of the analyst is updated based on the recommendation and price developments… An endogeneity problem thus arises, creating opportunities for the bad analyst to manipulate the market. We show that if, by a streak of good luck, the bad analyst builds up her reputation she can then momentarily hide her type. However, the capability of the bad analyst to manipulate the market has limits…

  5. Big Data, Data Analyst, and Improving the Competence of Librarian

    Directory of Open Access Journals (Sweden)

    Albertus Pramukti Narendra

    2016-01-01

    Full Text Available The issue of Big Data was raised as early as 1944 by Fremont Rider, an American librarian at Wesleyan University, who predicted that American university collections would reach 200 million volumes by 2040. Big Data brings multiple issues to the fore, among them its users, storage capacity, and the need for data analysts. In Indonesia, the data analyst is still a rare, and therefore urgently needed, profession. One of its distinctive tasks is to conduct visual analyses of diverse data resources and to present the results visually as interesting knowledge, a science enlivened by interactive visualization. Librarians are already equipped with the basics of information management, so they can seize this opportunity and develop themselves into data analysts. In developed countries it is common for librarians to serve as data analysts as well, enhancing themselves with the various skills required, such as cloud computing and smart computing. Ultimately, librarians with data-analyst competency are well placed to extract complex data resources and present them as interesting and discernible knowledge.

  6. Through the Eyes of Analysts: A Content Analysis of Analyst Report Narratives

    DEFF Research Database (Denmark)

    Nielsen, Christian

    2004-01-01

    This paper contributes to the ongoing debate on developing corporate reporting practices by analyzing the information content of fundamental analyst reports and comparing this with annual reporting practices. As there has been much critique of the lacking relevance of disclosures through corporate… information as relevant to their analyses and recommendations. The paper shows that background information about the company, i.e. about products, markets and the industry, along with the analysts' own analysis of financial and operating data, accounts for nearly 55% of the total disclosure in fundamental analyst reports, and the amount of financial data supplied is not related to the other disclosures in the reports. In comparison to business reporting practices, the fundamental analyst reports put relatively less weight on social and sustainability information, intellectual capital and corporate…

  7. Gender issues of financial analysts

    OpenAIRE

    Jingwen Ge

    2013-01-01

    Increased attention has been drawn to gender disparity in the workplace. This dissertation is dedicated to providing insight into gender issues among financial analysts. Thorough literature reviews are conducted on gender issues and on financial analysts, respectively, in order to comprehend the existing gender concerns in the business world and the role and functions of financial analysts. Research proposals are described to answer the following question: whether women financial analysts are more lik...

  9. A content analysis of analyst research: health care through the eyes of analysts.

    Science.gov (United States)

    Nielsen, Christian

    2008-01-01

    This article contributes to the understanding of how health care companies may communicate their business models by studying financial analysts' reports. The study examines the differences between the information conveyed in recurrent and fundamental analyst reports as well as whether the characteristics of the analysts and their environment affect their business model analyses. A medium-sized health care company in the medical-technology sector, internationally renowned for its state-of-the-art business reporting, was chosen as the basis for the study. An analysis of 111 fundamental and recurrent analyst reports on this company by each investment bank actively following it was conducted using a content analysis methodology. The study reveals that the recurrent analyses are concerned with evaluating the information disclosed by the health care company itself and not so much with digging up new information. It also indicates that while maintenance work might be focused on evaluating specific details, fundamental research is more concerned with extending the understanding of the general picture, i.e., the sustainability and performance of the overall business model. The amount of financial information disclosed in either type of report is not correlated to the other disclosures in the reports. In comparison to business reporting practices, the fundamental analyst reports put considerably less weight on social and sustainability, intellectual capital and corporate governance information, and they disclose much less comparable non-financial information. The suggestion made is that looking at the types of information financial analysts consider important and convey to their "customers," the investors and fund managers, constitutes a valuable indication to health care companies regarding the needs of the financial market users of their reports and other communications. There are some limitations to the possibility of applying statistical tests to the data-set as

  10. Desire and the female analyst.

    Science.gov (United States)

    Schaverien, J

    1996-04-01

    The literature on erotic transference and countertransference between female analyst and male patient is reviewed and discussed. It is known that female analysts are less likely than their male colleagues to act out sexually with their patients. It has been claimed that a) male patients do not experience sustained erotic transferences, and b) female analysts do not experience erotic countertransferences with female or male patients. These views are challenged and it is argued that, if there is less sexual acting out by female analysts, it is not because of an absence of eros in the therapeutic relationship. The literature review covers material drawn from psychoanalysis, feminist psychotherapy, Jungian analysis, as well as some sociological and cultural sources. It is organized under the following headings: the gender of the analyst, sexual acting out, erotic transference, maternal and paternal transference, gender and power, countertransference, incest taboo--mothers and sons and sexual themes in the transference.

  11. Analyst workbenches state of the art report

    CERN Document Server

    Rock-Evans, R

    1987-01-01

    Analyst Workbenches examines various aspects of analyst workbenches and the tasks and data that they should support. The major advances and state of the art in analyst workbenches are discussed. A comprehensive list of the available analyst workbenches, both the experimental and the commercial products, is provided. Comprised of three parts, this book begins by describing International Computers Ltd's approach to automating analysis and design. It then explains what business analysis really means, outlines the principal features of analyst workbenches, and considers the ways in which they can

  12. Investment Banking and Analyst Objectivity: Evidence from Forecasts and Recommendations of Analysts Affiliated with M&A Advisors

    OpenAIRE

    Kolasinski, Adam; Kothari, S.P.

    2004-01-01

    Previous research finds some evidence that analysts affiliated with equity underwriters issue more optimistic earnings growth forecasts and optimistic recommendations of client stock than unaffiliated analysts. Unfortunately, these studies are unable to discriminate between three competing hypotheses for the apparent optimism. Under the bribery hypothesis, underwriting clients, with the promise of underwriting fees, coax analysts to compromise their objectivity. The execution-related conflict...

  13. Key Performance Indicators and Analysts' Earnings Forecast Accuracy: An Application of Content Analysis

    OpenAIRE

    Alireza Dorestani; Zabihollah Rezaee

    2011-01-01

    We examine the association between the extent of change in key performance indicator (KPI) disclosures and the accuracy of forecasts made by analysts. KPIs are regarded as improving both the transparency and relevancy of public financial information. The results of using linear regression models show that, contrary to our prediction and the hypothesis of this paper, there is no significant association between the change in non-financial KPI disclosures and the accuracy of analysts' forecasts....
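The kind of association test this abstract describes can be sketched with a minimal least-squares regression of forecast error on the change in KPI disclosures, with a t-statistic on the slope. The data and variables below are invented for illustration; the study's actual models are more elaborate.

```python
import math

def simple_ols(x, y):
    """Intercept, slope and slope t-statistic for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    s2 = sum(e ** 2 for e in resid) / (n - 2)   # residual variance
    t = b / math.sqrt(s2 / sxx)                 # t-stat of the slope
    return a, b, t

# Hypothetical data: change in number of KPI disclosures (x) versus
# analysts' absolute forecast error (y) for eight firms.
x = [0, 1, 2, 3, 4, 5, 6, 7]
y = [0.10, 0.12, 0.09, 0.11, 0.13, 0.10, 0.12, 0.11]
a, b, t = simple_ols(x, y)
print(f"slope = {b:.4f}, t = {t:.2f}")  # |t| < 2: no significant association
```

With these made-up numbers the slope is tiny and its t-statistic well below 2, mirroring the paper's "no significant association" finding in spirit only.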

  14. Do Investors Learn About Analyst Accuracy?

    OpenAIRE

    Chang, Charles; Daouk, Hazem; Wang, Albert

    2008-01-01

    We study the impact of analyst forecasts on prices to determine whether investors learn about analyst accuracy. Our test market is the crude oil futures market. Prices rise when analysts forecast a decrease (increase) in crude supplies. In the 15 minutes following supply realizations, prices rise (fall) when forecasts have been too high (low). In both the initial price action relative to forecasts and in the subsequent reaction relative to realized forecast errors, the price response is stron...

  15. Implementation of the INEEL safety analyst training standard

    International Nuclear Information System (INIS)

    Hochhalter, E. E.

    2000-01-01

    The Idaho Nuclear Technology and Engineering Center (INTEC) safety analysis units at the Idaho National Engineering and Environmental Laboratory (INEEL) are in the process of implementing the recently issued INEEL Safety Analyst Training Standard (STD-1107). Safety analyst training and qualifications are integral to the development and maintenance of core safety analysis capabilities. The INEEL Safety Analyst Training Standard (STD-1107) was developed directly from the EFCOG Training Subgroup's draft safety analyst training plan template, but has been adapted to the needs and requirements of the INEEL safety analysis community. The implementation of this Safety Analyst Training Standard is part of the Integrated Safety Management System (ISMS) Phase II Implementation currently underway at the INEEL. The objective of this paper is to discuss (1) the INEEL Safety Analyst Training Standard, (2) the development of the safety analyst individual training plans, (3) the implementation issues encountered during this initial phase of implementation, (4) the solutions developed, and (5) the implementation activities remaining to be completed

  16. Do analysts disclose cash flow forecasts with earnings estimates when earnings quality is low?

    OpenAIRE

    Bilinski, P.

    2014-01-01

    Cash flows are incrementally useful to earnings in security valuation mainly when earnings quality is low. This suggests that when earnings quality decreases, analysts will be more likely to supplement their earnings forecasts with cash flow estimates. Contrary to this prediction, we find that analysts do not disclose cash flow forecasts when the quality of earnings is low. This is because cash flow forecast accuracy depends on the accuracy of the accrual estimates and the precision of accrua...

  17. AN ANALYST'S UNCERTAINTY AND FEAR.

    Science.gov (United States)

    Chused, Judith Fingert

    2016-10-01

    The motivations for choosing psychoanalysis as a profession are many and differ depending on the psychology of the analyst. However, common to most psychoanalysts is the desire to forge a helpful relationship with the individuals with whom they work therapeutically. This article presents an example of what happens when an analyst is confronted by a patient for whom being in a relationship and being helped are intolerable. © 2016 The Psychoanalytic Quarterly, Inc.

  18. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    Science.gov (United States)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics over raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here for how well analyst opportunities match recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  19. Procurement and execution of PCB analyses: Customer-analyst interactions

    International Nuclear Information System (INIS)

    Erickson, M.D.

    1993-01-01

    The practical application of PCB (polychlorinated biphenyl) analyses begins with a request for the analysis and concludes with provision of the requested analysis. The key to successful execution of this interaction is timely, professional communication between the requester and the analyst. Often PCB analyses are not satisfactorily executed, either because the requester failed to give adequate instructions or because the analyst simply "did what he/she was told." The request for and conduct of a PCB analysis represents a contract for the procurement of a product (information about the sample); if both parties recognize and abide by this contractual relationship, the process generally proceeds smoothly. Requesters may be corporate purchasing agents working from a scope of work, a sample management office, a field team leader, a project manager, a physician's office, or the analyst himself. The analyst with whom the requester communicates may be a laboratory supervisor, a sample-receiving department, a salesperson for the laboratory, or the analyst himself. The analyst conducting the analysis is often a team, with custody of the sample being passed from sample receiving to the extraction laboratory, to the cleanup laboratory, to the gas chromatography (GC) laboratory, to the data reduction person, to the package preparation person, to the quality control (QC) department for verification, and on to shipping. Where a team of analysts is involved, the requester needs a central point of contact to minimize confusion and frustration. For the requester-analyst interface to work smoothly, it must function as if it were a one-to-one interaction. This article addresses the pitfalls of the requester-analyst interaction and provides suggestions for improving the quality of the analytical product through the requester-analyst interface.

  20. The relevance of security analyst opinions for investment decisions

    NARCIS (Netherlands)

    Gerritsen, D.F.

    2014-01-01

    Security analysts analyze information regarding publicly traded companies after which they publish their opinion regarding these companies’ stocks. In this dissertation the published opinions of two different types of analysts are analyzed. Technical analysts derive a recommendation to buy, hold, or

  1. Gender heterogeneity in the sell-side analyst recommendation issuing process

    NARCIS (Netherlands)

    Bosquet, K.; de Goeij, P.C.; Smedts, K.

    Using analyst stock recommendations issued between January 1996 and December 2006 we show that the odds for female financial analysts to issue optimistic investment advice are 40% lower than for male analysts. Although 17% of our sample of analysts is female, 48% is employed by a top financial

  2. Entry Level Systems Analysts: What Does the Industry Want?

    Directory of Open Access Journals (Sweden)

    Donna M. Grant

    2016-06-01

    Full Text Available This study investigates the skill sets necessary for entry level systems analysts. Towards this end, the study combines two sources of data, namely, a content analysis of 200 systems analysts’ online job advertisements and a survey of 20 senior Information Systems (IS professionals. Based on Chi-square tests, the results reveal that most employers prefer entry level systems analysts with an undergraduate Computer Science degree. Furthermore, most of the employers prefer entry level systems analysts to have some years of experience as well as industry certifications. The results also reveal that there is a higher preference for entry level systems analysts who have non-technical and people skills (e.g., problem solving and oral communication. The empirical results from this study will inform IS educators as they develop future systems analysts. Additionally, the results will be useful to the aspiring systems analysts who need to make sure that they have the necessary job skills before graduating and entering the labor market.
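The Chi-square analysis of job-advertisement counts mentioned in this abstract can be sketched as follows; the 2x2 contingency table is hypothetical, not taken from the study.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]
    col = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            exp = row[i] * col[j] / n   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# Invented counts for 200 job ads: required degree vs. whether the ad
# also asks for an industry certification.
table = [[70, 30],   # CS degree required:    with / without certification
         [40, 60]]   # other degree required: with / without certification
stat = chi_square_2x2(table)
print(f"chi-square = {stat:.2f}")  # compare to 3.841 (df=1, alpha=0.05)
print("significant" if stat > 3.841 else "not significant")
```

A statistic above the 3.841 critical value (one degree of freedom, 5% level) would indicate that degree type and certification requirements are not independent in the ads sampled.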

  3. Training for spacecraft technical analysts

    Science.gov (United States)

    Ayres, Thomas J.; Bryant, Larry

    1989-01-01

    Deep space missions such as Voyager rely upon a large team of expert analysts who monitor activity in the various engineering subsystems of the spacecraft and plan operations. Senior team members generally come from the spacecraft designers, and new analysts receive on-the-job training. Neither of these methods will suffice for the creation of a new team in the middle of a mission, which may be the situation during the Magellan mission. New approaches are recommended, including electronic documentation, explicit cognitive modeling, and coached practice with archived data.

  4. Setting analyst: A practical harvest planning technique

    Science.gov (United States)

    Olivier R.M. Halleux; W. Dale Greene

    2001-01-01

    Setting Analyst is an ArcView extension that facilitates practical harvest planning for ground-based systems. By modeling the travel patterns of ground-based machines, it compares different harvesting settings based on projected average skidding distance, logging costs, and site disturbance levels. Setting Analyst uses information commonly available to consulting...

  5. Analyst, Policy and Strategy | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Working under the supervision of the Director, and with guidance from the Senior Analyst, the Analyst provides research, analysis, and advice on matters of policy and strategy for the Centre and its Board. He or she contributes to strategic and operational planning, corporate reporting, trend monitoring, and engagement with ...

  6. Residues in the analyst of the patient's symbiotic connection at a somatic level: unrepresented states in the patient and analyst.

    Science.gov (United States)

    Godsil, Geraldine

    2018-02-01

    This paper discusses the residues of a somatic countertransference that revealed its meaning several years after apparently successful analytic work had ended. Psychoanalytic and Jungian analytic ideas on primitive communication, dissociation and enactment are explored in the working through of a shared respiratory symptom between patient and analyst. Growth in the analyst was necessary so that the patient's communication at a somatic level could be understood. Bleger's concept that both the patient's and analyst's body are part of the setting was central in the working through. © 2018, The Society of Analytical Psychology.

  7. "This strange disease": adolescent transference and the analyst's sexual orientation.

    Science.gov (United States)

    Burton, John K; Gilmore, Karen

    2010-08-01

    The treatment of adolescents by gay analysts is uncharted territory regarding the impact of the analyst's sexuality on the analytic process. Since a core challenge of adolescence involves the integration of the adult sexual body, gender role, and reproductive capacities into evolving identity, and since adolescents seek objects in their environment to facilitate both identity formation and the establishment of autonomy from primary objects, the analyst's sexual orientation is arguably a potent influence on the outcome of adolescent development. However, because sexual orientation is a less visible characteristic of the analyst than gender, race, or age, for example, the line between reality and fantasy is less clearly demarcated. This brings up special considerations regarding discovery and disclosure in the treatment. To explore these issues, the case of a late adolescent girl in treatment with a gay male analyst is presented. In this treatment, the question of the analyst's sexual orientation, and the demand by the patient for the analyst's self-disclosure, became a transference nucleus around which the patient's individual dynamics and adolescent dilemmas could be explored and clarified.

  8. Financial Analyst | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    You will assist the Chief and the Senior Financial Analyst in the discharge of their responsibilities, and back up the Senior Financial Analyst, Treasury ... in the FAD Annual Work Plan), at the request of the Director, and ensure deliverables are met within assigned time lines as set by the FAD Manager leading the initiative;

  9. Improving the information environment for analysts

    DEFF Research Database (Denmark)

    Farooq, Omar; Nielsen, Christian

    2014-01-01

    …they have more information. Our results also show that intellectual capital disclosures related to employees and strategic statements are the most important disclosures for analysts. Research limitations/implications: More relevant methods, such as surveys or interviews with management, may be used to improve the information content of intellectual capital disclosure. Analysts probably deduce the intellectual capital of a firm from interaction with management rather than from financial statements. Practical implications: Firms in the biotechnology sector can improve their information environment by disclosing more information…

  10. Is customer satisfaction a relevant metric for financial analysts?

    OpenAIRE

    Ngobo , Paul-Valentin; Casta , Jean-François; Ramond , Olivier ,

    2012-01-01

    This study examines the effects of customer satisfaction on analysts' earnings forecast errors. Based on a sample of analysts following companies measured by the American Customer Satisfaction Index (ACSI), we find that customer satisfaction reduces earnings forecast errors. However, analysts respond to changes in customer satisfaction but not to the ACSI metric per se. Furthermore, the effects of customer satisfaction are asymmetric; fo...

  11. Using MetaboAnalyst 3.0 for Comprehensive Metabolomics Data Analysis.

    Science.gov (United States)

    Xia, Jianguo; Wishart, David S

    2016-09-07

    MetaboAnalyst (http://www.metaboanalyst.ca) is a comprehensive Web application for metabolomic data analysis and interpretation. MetaboAnalyst handles most of the common metabolomic data types from most kinds of metabolomics platforms (MS and NMR) and most kinds of metabolomics experiments (targeted, untargeted, quantitative). In addition to providing a variety of data processing and normalization procedures, MetaboAnalyst also supports a number of data analysis and data visualization tasks using a range of univariate and multivariate methods, such as PCA (principal component analysis), PLS-DA (partial least squares discriminant analysis), heatmap clustering and machine learning methods. MetaboAnalyst also offers a variety of tools for metabolomic data interpretation, including MSEA (metabolite set enrichment analysis), MetPA (metabolite pathway analysis), and biomarker selection via ROC (receiver operating characteristic) curve analysis, as well as time series and power analysis. This unit provides an overview of the main functional modules and the general workflow of the latest version of MetaboAnalyst (MetaboAnalyst 3.0), followed by eight detailed protocols. © 2016 by John Wiley & Sons, Inc.
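As a rough illustration of the autoscaling and PCA steps in such a workflow (this is not MetaboAnalyst's own code, which runs server-side; the matrix is invented), one can mean-center and unit-variance-scale a small sample-by-metabolite matrix and decompose it via SVD:

```python
import numpy as np

# Hypothetical intensity matrix: 6 samples (two groups of 3) x 4 metabolites.
X = np.array([
    [10.2, 5.1, 0.8, 3.3],
    [ 9.8, 5.3, 0.9, 3.1],
    [10.1, 5.0, 0.7, 3.4],
    [ 2.1, 8.9, 4.2, 1.0],
    [ 2.3, 9.1, 4.0, 1.2],
    [ 1.9, 8.8, 4.3, 0.9],
])

# Autoscaling (mean-center, unit variance per metabolite), one of the
# normalization options such tools typically offer.
Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

# PCA via singular value decomposition of the scaled matrix.
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * s                     # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)    # fraction of variance per component

print("PC1 variance explained: %.1f%%" % (100 * explained[0]))
```

Because the two invented groups differ on every metabolite, the first component captures nearly all the variance and the score plot would separate the groups cleanly.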

  12. Dynamics of analyst forecasts and emergence of complexity: Role of information disparity.

    Directory of Open Access Journals (Sweden)

    Chansoo Kim

    Full Text Available We report complex phenomena arising among financial analysts, who gather information and generate investment advice, and elucidate them with the help of a theoretical model. Understanding how analysts form their forecasts is important for better understanding the financial market. Carrying out a big-data analysis of the analyst forecast data from I/B/E/S covering nearly thirty years, we find skew distributions as evidence for the emergence of complexity, and show how information asymmetry or disparity affects how financial analysts form their forecasts. Here regulations, information dissemination throughout a fiscal year, and interactions among financial analysts are regarded as proxies for a lower level of information disparity. It is found that financial analysts with better access to information display contrasting behaviors: a few analysts become bolder and issue forecasts independent of other forecasts, while the majority issue more accurate forecasts and flock to each other. The main body of our sample of optimistic forecasts fits a log-normal distribution, with the tail displaying a power law. Based on the Yule process, we propose a model for the dynamics of issuing forecasts that incorporates interactions between analysts. Explaining the empirical data on analyst forecasts well, this provides an appealing instance of understanding social phenomena from the perspective of complex systems.
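A Yule-type process of the general kind the authors invoke can be sketched as follows: a "bold" analyst starts a new consensus cluster, while a "herding" analyst joins an existing cluster with probability proportional to its size, which is what produces heavy-tailed cluster sizes. The parameters and mechanics here are illustrative, not the paper's calibrated model.

```python
import random

random.seed(42)

def yule_clusters(n_forecasts, rho=0.3):
    """Each new forecast either starts a new 'consensus cluster' with
    probability rho (bold, independent analyst) or joins an existing
    cluster with probability proportional to its size (herding analyst)."""
    clusters = [1]
    total = 1
    for _ in range(n_forecasts - 1):
        if random.random() < rho:
            clusters.append(1)           # new independent forecast
        else:
            r = random.uniform(0, total) # size-biased cluster choice
            acc = 0.0
            for i, size in enumerate(clusters):
                acc += size
                if r <= acc:
                    clusters[i] += 1
                    break
        total += 1
    return clusters

sizes = yule_clusters(5000)
sizes.sort(reverse=True)
print("clusters:", len(sizes), "top five sizes:", sizes[:5])
```

With preferential attachment, a handful of clusters grow far larger than the rest, giving the power-law-like tail of the size distribution that the Yule process is known for.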

  13. Forecasting Hotspots-A Predictive Analytics Approach.

    Science.gov (United States)

    Maciejewski, R; Hafen, R; Rudolph, S; Larew, S G; Mitchell, M A; Cleveland, W S; Ebert, D S

    2011-04-01

    Current visual analytics systems provide users with the means to explore trends in their data. Linked views and interactive displays provide insight into correlations among people, events, and places in space and time. Analysts search for events of interest through statistical tools linked to visual displays, drill down into the data, and form hypotheses based upon the available information. However, current systems stop short of predicting events. In spatiotemporal data, analysts are searching for regions of space and time with unusually high incidences of events (hotspots). In the cases where hotspots are found, analysts would like to predict how these regions may grow in order to plan resource allocation and preventative measures. Furthermore, analysts would also like to predict where future hotspots may occur. To facilitate such forecasting, we have created a predictive visual analytics toolkit that provides analysts with linked spatiotemporal and statistical analytic views. Our system models spatiotemporal events through the combination of kernel density estimation for event distribution and seasonal trend decomposition by loess smoothing for temporal predictions. We provide analysts with estimates of error in our modeling, along with spatial and temporal alerts to indicate the occurrence of statistically significant hotspots. Spatial data are distributed based on a modeling of previous event locations, thereby maintaining a temporal coherence with past events. Such tools allow analysts to perform real-time hypothesis testing, plan intervention strategies, and allocate resources to correspond to perceived threats.
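The two modeling ingredients named in this abstract, kernel density estimation over event locations and a seasonal-trend temporal forecast, can be caricatured in a few lines. This sketch uses a 1-D Gaussian KDE and seasonal means standing in for loess-based decomposition; all data are invented.

```python
import math

def gaussian_kde(points, grid, bandwidth=1.0):
    """Kernel density estimate of event intensity along one spatial axis."""
    norm = 1.0 / (len(points) * bandwidth * math.sqrt(2 * math.pi))
    return [norm * sum(math.exp(-0.5 * ((g - p) / bandwidth) ** 2)
                       for p in points) for g in grid]

def seasonal_forecast(counts, period=7):
    """Crude stand-in for loess-based seasonal-trend decomposition:
    next-period forecast = overall mean + average seasonal offset."""
    mean = sum(counts) / len(counts)
    n_cycles = len(counts) // period
    seasonal = [0.0] * period
    for i, c in enumerate(counts):
        seasonal[i % period] += (c - mean) / n_cycles
    return [mean + seasonal[i] for i in range(period)]

# Hypothetical daily incident counts over 4 weeks, weekend-heavy.
counts = [2, 3, 2, 3, 4, 9, 8] * 4
print(seasonal_forecast(counts))   # weekend days forecast highest
```

A real system along these lines would replace the 1-D KDE with a 2-D spatial kernel and the seasonal means with a proper loess decomposition, then flag grid cells whose forecast intensity exceeds a significance threshold as hotspots.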

  14. Information Management Analyst | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Job Summary The Information Management Analyst is the technical resource person ... Performs systems configuration, testing and quality assurance for all IRO ... IMTD employees, identifying root cause of system issues and solving them or ...

  15. The Role of Analyst Conference Calls in Capital Markets

    NARCIS (Netherlands)

    E.M. Roelofsen (Erik)

    2010-01-01

Many firms conduct a conference call with analysts shortly after the quarterly earnings announcement. In these calls, management discusses the completed quarter, and analysts can ask questions. Due to SEC requirements, conference calls in the United States are virtually always live

  16. The patient who believes and the analyst who does not (1).

    Science.gov (United States)

    Lijtmaer, Ruth M

    2009-01-01

A patient's religious beliefs and practices challenge the clinical experience and self-knowledge of the analyst owing to a great complexity of factors, often evoking the analyst's resistances and countertransference reactions to spiritual and religious issues. The analyst's feelings about the patient's encounters with religion and other forms of healing experiences may result in impasses and communication breakdown for a variety of reasons. These reasons include the analyst's own unresolved issues around her role as a psychoanalyst, which incorporates in some way psychoanalysis's views of religious belief, and these old conflicts may be irritated by the religious themes expressed by the patient. Vignettes from the treatments of two patients provide examples of the analyst's countertransference conflicts, particularly envy in the case of a therapist who is an atheist.

  17. Self-disclosure, trauma and the pressures on the analyst.

    Science.gov (United States)

    West, Marcus

    2017-09-01

    This paper argues that self-disclosure is intimately related to traumatic experience and the pressures on the analyst not to re-traumatize the patient or repeat traumatic dynamics. The paper gives a number of examples of such pressures and outlines the difficulties the analyst may experience in adopting an analytic attitude - attempting to stay as closely as possible with what the patient brings. It suggests that self-disclosure may be used to try to disconfirm the patient's negative sense of themselves or the analyst, or to try to induce a positive sense of self or of the analyst which, whilst well-meaning, may be missing the point and may be prolonging the patient's distress. Examples are given of staying with the co-construction of the traumatic early relational dynamics and thus working through the traumatic complex; this attitude is compared and contrasted with some relational psychoanalytic attitudes. © 2017, The Society of Analytical Psychology.

  18. Effects of Motivation: Rewarding Hackers for Undetected Attacks Cause Analysts to Perform Poorly.

    Science.gov (United States)

    Maqbool, Zahid; Makhijani, Nidhi; Pammi, V S Chandrasekhar; Dutt, Varun

    2017-05-01

The aim of this study was to determine how monetary motivations influence the decision making of humans performing as security analysts and hackers in a cybersecurity game. Cyberattacks are increasing at an alarming rate. As cyberattacks often cause damage to existing cyber infrastructures, it is important to understand how monetary rewards may influence the decision making of hackers and analysts in the cyber world. Currently, only limited attention has been given to this area. In an experiment, participants were randomly assigned to three between-subjects conditions (n = 26 for each condition): equal payoff, where the magnitude of monetary rewards for hackers and defenders was the same; rewarding hacker, where the magnitude of monetary reward for the hacker's successful attack was 10 times the reward for the analyst's successful defense; and rewarding analyst, where the magnitude of monetary reward for the analyst's successful defense was 10 times the reward for the hacker's successful attack. In all conditions, half of the participants were human hackers playing against Nash analysts and half were human analysts playing against Nash hackers. Results revealed that monetary rewards for human hackers and analysts caused a decrease in attack and defend actions compared with the baseline. Furthermore, rewarding human hackers for undetected attacks made analysts deviate significantly from their optimal behavior. If hackers are rewarded for their undetected attack actions, this causes analysts to deviate from optimal defend proportions. Thus, analysts need to be trained not to become overenthusiastic in defending networks. Applications of our results are to networks where the influence of monetary rewards may cause information theft and system damage.

  19. The analyst's participation in the analytic process.

    Science.gov (United States)

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  20. The reality of the other: dreaming of the analyst.

    Science.gov (United States)

    Ferruta, Anna

    2009-02-01

    The author discusses the obstacles to symbolization encountered when the analyst appears in the first dream of an analysis: the reality of the other is represented through the seeming recognition of the person of the analyst, who is portrayed in undisguised form. The interpretation of this first dream gives rise to reflections on the meaning of the other's reality in analysis: precisely this realistic representation indicates that the function of the other in the construction of the psychic world has been abolished. An analogous phenomenon is observed in the countertransference, as the analyst's mental processes are occluded by an exclusively self-generated interpretation of the patient's psychic world. For the analyst too, the reality of the other proves not to play a significant part in the construction of her interpretation. A 'turning-point' dream after five years bears witness to the power of the transforming function performed by the other throughout the analysis, by way of the representation of characters who stand for the necessary presence of a third party in the construction of a personal psychic reality. The author examines the mutual denial of the other's otherness, as expressed by the vicissitudes of the transference and countertransference between analyst and patient, otherness being experienced as a disturbance of self-sufficient narcissistic functioning. The paper ends with an analysis of the transformations that took place in the analytic relationship.

  1. Do analysts anticipate and react to bankruptcy? Evidence

    OpenAIRE

    Coelho, Luís; Peixinho, Rúben

    2005-01-01

    Finance literature suggests that financial analysts are sophisticated agents that act as facilitators of market efficiency by releasing relevant information to the market. This paper uses a sample of four major US bankruptcies to explore if analysts are able to disclose information to the market that provides investors with material information for their investment decisions. In particular, we use a qualitative approach to analyse analysts’ reports in order to verify if these agents are ab...

  2. Human Functions, Machine Tools, and the Role of the Analyst

    Directory of Open Access Journals (Sweden)

    Gordon R. Middleton

    2015-09-01

    Full Text Available In an era of rapidly increasing technical capability, the intelligence focus is often on the modes of collection and tools of analysis rather than the analyst themselves. Data are proliferating and so are tools to help analysts deal with the flood of data and the increasingly demanding timeline for intelligence production, but the role of the analyst in such a data-driven environment needs to be understood in order to support key management decisions (e.g., training and investment priorities. This paper describes a model of the analytic process, and analyzes the roles played by humans and machine tools in each process element. It concludes that human analytic functions are as critical in the intelligence process as they have ever been, and perhaps even more so due to the advance of technology in the intelligence business. Human functions performed by analysts are critical in nearly every step in the process, particularly at the front end of the analytic process, in defining and refining the problem statement, and at the end of the process, in generating knowledge, presenting the story in understandable terms, tailoring the presentation of the results of the analysis to various audiences, as well as in determining when to initiate iterative loops in the process. The paper concludes with observations on the necessity of enabling expert analysts, tools to deal with big data, developing analysts with advanced analytic methods as well as with techniques for optimal use of advanced tools, and suggestions for further quantitative research.

  3. MetaboAnalyst 3.0--making metabolomics more meaningful.

    Science.gov (United States)

    Xia, Jianguo; Sinelnikov, Igor V; Han, Beomsoo; Wishart, David S

    2015-07-01

MetaboAnalyst (www.metaboanalyst.ca) is a web server designed to permit comprehensive metabolomic data analysis, visualization and interpretation. It supports a wide range of complex statistical calculations and high quality graphical rendering functions that require significant computational resources. First introduced in 2009, MetaboAnalyst has experienced more than a 50X growth in user traffic (>50 000 jobs processed each month). In order to keep up with the rapidly increasing computational demands and a growing number of requests to support translational and systems biology applications, we performed a substantial rewrite and major feature upgrade of the server. The result is MetaboAnalyst 3.0. By completely re-implementing the MetaboAnalyst suite using the latest web framework technologies, we have been able to substantially improve its performance, capacity and user interactivity. Three new modules have also been added, including: (i) a module for biomarker analysis based on the calculation of receiver operating characteristic curves; (ii) a module for sample size estimation and power analysis for improved planning of metabolomics studies; and (iii) a module to support integrative pathway analysis for both genes and metabolites. In addition, popular features found in existing modules have been significantly enhanced by upgrading the graphical output, expanding the compound libraries and by adding support for more diverse organisms. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
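The new biomarker module described above rests on receiver operating characteristic curves; the underlying AUC computation can be sketched as follows (hypothetical data and function names, unrelated to MetaboAnalyst's actual implementation):

```python
def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive scores above a random negative,
    counting ties as half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical metabolite intensities for diseased (1) vs. control (0) samples.
auc = roc_auc([0.9, 0.8, 0.7, 0.3, 0.2], [1, 1, 0, 1, 0])
print(auc)
```

An AUC of 0.5 corresponds to a useless biomarker, 1.0 to a perfect separator; ranking candidate metabolites by AUC is the standard first pass in this kind of analysis.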

  4. Physics-based and human-derived information fusion for analysts

    Science.gov (United States)

    Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael

    2017-05-01

Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.

  5. The analyst's body as tuning fork: embodied resonance in countertransference.

    Science.gov (United States)

    Stone, Martin

    2006-02-01

    This paper examines the phenomenon of embodied countertransference: where the analyst experiences a somatic reaction rather than the more common countertransference responses of thoughts, feelings, images, fantasies and dreams. Discussion of clinical material considers neurotic and syntonic aspects. The analogy is made of resonance with a tuning fork. Several questions are posed: Why does countertransference resonate in the bodies of some analysts but not all? Why do those analysts who are sensitive to this, experience it with some patients but not with others? And what are the conditions which are conducive to producing somatic responses? It proposes that somatic reactions are more likely to occur when a number of conditions come together: when working with patients exhibiting borderline, psychotic or severe narcissistic elements; where there has been early severe childhood trauma; and where there is fear of expressing strong emotions directly. In addition another theoretical factor is proposed, namely the typology of the analyst.

  6. The Determinants of Sell-side Analysts' Forecast Accuracy and Media Exposure

    OpenAIRE

    Sorogho, Samira Amadu

    2017-01-01

This study examines contributing factors to the differential forecasting abilities of sell-side analysts and the relation between the sentiments of these analysts and their media exposure. In particular, I investigate whether the level of optimism expressed in sell-side analysts' reports on fifteen constituents of primarily the S&P 500 Oil and Gas Industry enhances the media appearance of these analysts. Using a number of variables estimated from the I/B/E/S Detail history database, 15,455 a...

  7. Senior Systems Analyst | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... Systems Analyst will play a critical role as part of the Information Technology ... data analysis, and system design); the delivery of professional IM/IT advisory and ... with the current development activities ensuring resolution of those issues.

  8. The analyst's anxieties in the first interview: barriers against analytic presence.

    Science.gov (United States)

    Møller, Mette

    2014-06-01

To answer the questions of why more people do not enter analysis and how we might get more to do so, attention is drawn to anxieties in the analyst that become obstacles to the initiation of analysis. The main focus of the paper is how to understand why analysts, irrespective of patient characteristics, seem to have resistances against embarking on analysis. Being a meeting between strangers, the consultation activates strong emotional reactions in both parties. One way of coping is defensively to diagnose, assess and exclude instead of being present as an analyst. The analytic frame of a consultation is ambiguous, and a secure analytic function is needed in order to meet the openness and unpredictability of this frame. A fragile psychoanalytic identity is seen as central to analysts' failure to create an analytic practice; it takes years to develop and maintain a robust analytic function, and analytic work continues to cause disturbing emotional reactions in the analyst. Analysts' vulnerable identity is also linked to the history of psychoanalysis, which has fostered an ideal of analytic practice that is omnipotent and impossible to reach. It is therefore no wonder that attempts to reach a convinced recommendation of analysis can become diverted in the process of consultation. Confronting these inner impediments in order to strengthen the analytic identity is suggested as a better way to get more analytic patients than to keep looking for so-called analysability in patients. Copyright © 2014 Institute of Psychoanalysis.

  9. A Survey of Functional Behavior Assessment Methods Used by Behavior Analysts in Practice

    Science.gov (United States)

    Oliver, Anthony C.; Pratt, Leigh A.; Normand, Matthew P.

    2015-01-01

    To gather information about the functional behavior assessment (FBA) methods behavior analysts use in practice, we sent a web-based survey to 12,431 behavior analysts certified by the Behavior Analyst Certification Board. Ultimately, 724 surveys were returned, with the results suggesting that most respondents regularly use FBA methods, especially…

  10. SafetyAnalyst : software tools for safety management of specific highway sites

    Science.gov (United States)

    2010-07-01

    SafetyAnalyst provides a set of software tools for use by state and local highway agencies for highway safety management. SafetyAnalyst can be used by highway agencies to improve their programming of site-specific highway safety improvements. SafetyA...

  11. On the relation between forecast precision and trading profitability of financial analysts

    DEFF Research Database (Denmark)

    Marinelli, Carlo; Weissensteiner, Alex

    2014-01-01

We analyze the relation between earnings forecast accuracy and the expected profitability of financial analysts. Modeling forecast errors with a multivariate normal distribution, we provide a complete characterization of each analyst's payoff. In particular, closed-form expressions for the probability density function, for the expectation and, more generally, for moments of all orders are obtained. Our analysis shows that the relationship between forecast precision and trading profitability need not be monotonic, and that the impact of the correlation between the forecasts on the expected...
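The paper's closed-form results are not reproduced here, but the flavor of deriving an exact expectation under normally distributed forecast errors, and checking it by simulation, can be sketched (the payoff rule below, the positive part of the error, is a stand-in, not the authors' payoff function):

```python
import math
import random

def expected_positive_part(sigma):
    """Closed form E[max(X, 0)] for X ~ N(0, sigma^2): sigma / sqrt(2*pi)."""
    return sigma / math.sqrt(2 * math.pi)

def monte_carlo_positive_part(sigma, n=200_000, seed=1):
    """Monte Carlo estimate of the same expectation, for cross-checking."""
    rng = random.Random(seed)
    return sum(max(rng.gauss(0, sigma), 0.0) for _ in range(n)) / n

sigma = 0.5  # assumed forecast-error standard deviation
closed = expected_positive_part(sigma)
mc = monte_carlo_positive_part(sigma)
```

The same pattern, an exact moment formula validated against simulation, is how closed-form payoff characterizations of this kind are typically sanity-checked.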

  12. The demand for corporate financial reporting: A survey among financial analysts

    NARCIS (Netherlands)

    A. de Jong (Abe); G.M.H. Mertens (Gerard); A.M. van der Poel (Marieke); R. van Dijk (Ronald)

    2010-01-01

We examine financial analysts' views on corporate financial reporting issues by means of a survey among 306 analysts and interviews with 21 analysts, and compare their views with those of CFOs. Since CFOs believe that meeting or beating analysts' forecasts and managing

  13. Paired analyst recommendations and internet IPOs

    NARCIS (Netherlands)

    van der Goot, T.; van Giersbergen, N.

    2008-01-01

    The paper investigates analyst recommendations for internet firms that went public during 1997-2000. Our contribution to the literature is that we match recommendations for the same firm issued by different investment banks that have published the recommendations in an interval around the same date.

  14. Reflections: can the analyst share a traumatizing experience with a traumatized patient?

    Science.gov (United States)

    Lijtmaer, Ruth

    2010-01-01

This is a personal account of a dreadful event in the analyst's life that was similar to a patient's trauma. It is a reflection on how the analyst dealt with her own trauma, the patient's trauma, and the transference and countertransference dynamics. Included is a description of the analyst's inner struggles with self-disclosure, continuance of her professional work, and the need for persistent self-scrutiny. The meaning of objects in people's lives, particularly the concept of home, will be addressed.

  15. Using the living laboratory framework as a basis for understanding next-generation analyst work

    Science.gov (United States)

    McNeese, Michael D.; Mancuso, Vincent; McNeese, Nathan; Endsley, Tristan; Forster, Pete

    2013-05-01

    The preparation of next generation analyst work requires alternative levels of understanding and new methodological departures from the way current work transpires. Current work practices typically do not provide a comprehensive approach that emphasizes the role of and interplay between (a) cognition, (b) emergent activities in a shared situated context, and (c) collaborative teamwork. In turn, effective and efficient problem solving fails to take place, and practice is often composed of piecemeal, techno-centric tools that isolate analysts by providing rigid, limited levels of understanding of situation awareness. This coupled with the fact that many analyst activities are classified produces a challenging situation for researching such phenomena and designing and evaluating systems to support analyst cognition and teamwork. Through our work with cyber, image, and intelligence analysts we have realized that there is more required of researchers to study human-centered designs to provide for analyst's needs in a timely fashion. This paper identifies and describes how The Living Laboratory Framework can be utilized as a means to develop a comprehensive, human-centric, and problem-focused approach to next generation analyst work, design, and training. We explain how the framework is utilized for specific cases in various applied settings (e.g., crisis management analysis, image analysis, and cyber analysis) to demonstrate its value and power in addressing an area of utmost importance to our national security. Attributes of analyst work settings are delineated to suggest potential design affordances that could help improve cognitive activities and awareness. Finally, the paper puts forth a research agenda for the use of the framework for future work that will move the analyst profession in a viable manner to address the concerns identified.

  16. Financial Analyst II - External Funds Management | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The Financial Analyst position is essential for the administration and smooth ... Prepare monthly journal voucher to record the donor partnership revenue in ... of the business requirements for the development and enhancement of financial ...

  17. COMPREHENSIVE APPROACH OVER THE PROFESSIONAL JUDGMENT OF THE FINANCIAL ANALYST

    Directory of Open Access Journals (Sweden)

    Viorica Mirela ŞTEFAN-DUICU

    2016-06-01

Full Text Available Professional judgment is emblematic at the decisional level. This paper aims to highlight the valences of the professional judgment of the financial analyst by describing the components of the analyst's activity and by highlighting the typologies of the mechanisms involved. Within this paper we present the types of financial analysts, the responsibilities that guide their professional judgment, and the interdependent elements of their activity.

  18. Policy Analyst | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... reviews the plans produced to ensure that they are of the highest possible quality. ... The Policy Analyst plays a key role in information management. ... discussion and decision-making; prepares guidelines on issues relating to processes, ...

  19. Ground Truth Annotation in T Analyst

    DEFF Research Database (Denmark)

    2015-01-01

    This video shows how to annotate the ground truth tracks in the thermal videos. The ground truth tracks are produced to be able to compare them to tracks obtained from a Computer Vision tracking approach. The program used for annotation is T-Analyst, which is developed by Aliaksei Laureshyn, Ph...

  20. Analyst reluctance in conveying negative information to the market

    Directory of Open Access Journals (Sweden)

    Luca Piras

    2012-11-01

Full Text Available This paper investigates one of the main sources of financial markets' public information: financial analysts' reports. We analyze reports on the S&P 500 index through a multidisciplinary approach integrating behavioral finance with linguistic analysis to understand how financial phenomena are reflected in or distorted by language, i.e. whether financial and linguistic trends follow the same patterns, boosting each other, or diverge. In the latter case, language could conceal financial events, mitigating analysts' feelings and misleading investors. We therefore attempt to identify behavioral biases (mainly cognitive dissonances) present in analysts' reports. In doing so, we try to understand whether analysts try to hide the perception of negative price-sensitive events, eventually anticipating and controlling the market "mood". This study focuses on how analysts use linguistic strategies in order to minimize their risk of issuing wrong advice. Our preliminary results show reluctance to incorporate negative information in the reports. A slight asymmetry emerges between the use of the positive/negative keywords taken into account and the negative/positive trends of the index. In those weeks characterized by the index's poor performance, the frequency of keywords with a negative meaning is lower. Conversely, in the recovering weeks a higher use of keywords with a positive meaning does not clearly appear. A thorough investigation of market moods and analysis of the text of the reports enable us to assess whether and to what extent analysts have been willing to mitigate pessimism or emphasize confidence. Furthermore, we contribute to the existing literature by proposing a possible analysts' value function based on Prospect Theory [Kahneman and Tversky, 1979], where analysts try to maximize the value deriving from enhancing their reputation, taking into account the risks that may cause a reputational loss. This

  1. Teaching Bayesian Statistics To Intelligence Analysts: Lessons Learned

    Directory of Open Access Journals (Sweden)

    Hemangni Deshmukh

    2009-01-01

Full Text Available The Community must develop and integrate into regular use new tools that can assist analysts in filtering and correlating the vast quantities of information that threaten to overwhelm the analytic process… —Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction (the WMD Report). Unlike the other social sciences and, particularly, the physical sciences, where scientists get to choose the questions they wish to answer and experiments are carefully designed to confirm or negate hypotheses, intelligence analysis requires analysts to deal with the demands of decision makers and estimate the intentions of foreign actors, criminals or business competitors in an environment filled with uncertainty and even deliberate deception.

  2. Stock index adjustments, analyst coverage and institutional holdings: Evidence from China

    Directory of Open Access Journals (Sweden)

    Song Zhu

    2017-09-01

Full Text Available Using 231 pairs of matched firms from 2009 to 2012 in the Chinese stock market, we find that stock index adjustments significantly affect analyst coverage: addition to the stock index leads to more analyst coverage, while deletion from the stock index has no significant effect, indicating that stock index adjustments can significantly change the information environments of firms that are added to the index. An index adjustment also affects institutional holdings in consideration of new information (e.g., changes in fundamentals and information environments). Changes in institutional holdings are partially due to changes in analyst coverage, and both index funds and other types can change their portfolios in response to changes in the target firms' informativeness.

  3. The Role of Analysts as Gatekeepers: Enhancing Transparency and Curbing Earnings Management in Brazil

    Directory of Open Access Journals (Sweden)

    Antonio Lopo Martinez

    2011-07-01

Full Text Available This paper examines the relationship of analysts' coverage, forecasting errors and earnings management. It corroborates the role of analysts as gatekeepers by finding that analysts enhance transparency and reduce the scope of earnings management. To identify analysts' coverage we used I/B/E/S, from which we also obtained information on the consensus projections of analysts for listed Brazilian companies. The results indicated a negative correlation between the number of analysts covering firms and the magnitude of their discretionary accruals in absolute terms, indicating that more scrutiny inhibits earnings management. We also found a negative correlation between analysts' coverage and forecasting errors. Multivariate regressions showed statistically significant results in the same sense. Therefore, market analysts, despite the severe criticism they receive from the specialized press, actually have a beneficial effect on corporate governance by monitoring managers and inhibiting earnings management.

  4. When the analyst is ill: dimensions of self-disclosure.

    Science.gov (United States)

    Pizer, B

    1997-07-01

This article examines questions related to the "inescapable," the "inadvertent," and the "deliberate" personal disclosures by an analyst. Technical and personal considerations that influence the analyst's decision to disclose, as well as the inherent responsibilities and potential clinical consequences involved in self-disclosure, are explored, with particular attention to transference-countertransference dynamics, therapeutic goals, and the negotiation of resistance. The author describes her clinical work during a period of prolonged illness, with case vignettes that illustrate how self-disclosure may be regarded as both an occasional authentic requirement and a regular intrinsic component of clinical technique.

  5. An eye tracking study of bloodstain pattern analysts during pattern classification.

    Science.gov (United States)

    Arthur, R M; Hoogenboom, J; Green, R D; Taylor, M C; de Bruin, K G

    2018-05-01

    Bloodstain pattern analysis (BPA) is the forensic discipline concerned with the classification and interpretation of bloodstains and bloodstain patterns at the crime scene. At present, it is unclear exactly which stain or pattern properties and their associated values are most relevant to analysts when classifying a bloodstain pattern. Eye tracking technology has been widely used to investigate human perception and cognition. Its application to forensics, however, is limited. This is the first study to use eye tracking as a tool for gaining access to the mindset of the bloodstain pattern expert. An eye tracking method was used to follow the gaze of 24 bloodstain pattern analysts during an assigned task of classifying a laboratory-generated test bloodstain pattern. With the aid of an automated image-processing methodology, the properties of selected features of the pattern were quantified leading to the delineation of areas of interest (AOIs). Eye tracking data were collected for each AOI and combined with verbal statements made by analysts after the classification task to determine the critical range of values for relevant diagnostic features. Eye-tracking data indicated that there were four main regions of the pattern that analysts were most interested in. Within each region, individual elements or groups of elements that exhibited features associated with directionality, size, colour and shape appeared to capture the most interest of analysts during the classification task. The study showed that the eye movements of trained bloodstain pattern experts and their verbal descriptions of a pattern were well correlated.

  6. The Inefficient Use of Macroeconomic Information in Analysts' Earnings Forecasts in Emerging Markets

    NARCIS (Netherlands)

    G.J. de Zwart (Gerben); D.J.C. van Dijk (Dick)

    2008-01-01

    textabstractThis paper presents empirical evidence that security analysts do not efficiently use publicly available macroeconomic information in their earnings forecasts for emerging market stocks. Analysts completely ignore forecasts on political stability, while these provide valuable information

  7. Instruction in Information Structuring Improves Bayesian Judgment in Intelligence Analysts

    Directory of Open Access Journals (Sweden)

    David R. Mandel

    2015-04-01

    Full Text Available An experiment was conducted to test the effectiveness of brief instruction in information structuring (i.e., representing and integrating information) for improving the coherence of probability judgments and binary choices among intelligence analysts. Forty-three analysts were presented with comparable sets of Bayesian judgment problems before and immediately after instruction. After instruction, analysts' probability judgments were more coherent (i.e., more additive and compliant with Bayes' theorem). Instruction also improved the coherence of binary choices regarding category membership: after instruction, subjects were more likely to invariably choose the category to which they assigned the higher probability of a target's membership. The research provides a rare example of evidence-based validation of effectiveness in instruction to improve the statistical assessment skills of intelligence analysts. Such instruction could also be used to improve the assessment quality of other types of experts who are required to integrate statistical information or make probabilistic assessments.
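    The two coherence criteria mentioned in this record can be sketched in code. This is a hypothetical illustration, not the study's materials: additivity means complementary judgments sum to 1, and Bayesian compliance means a judged posterior matches the value implied by the prior and likelihoods.

    ```python
    # Hypothetical sketch of the two coherence checks described above.
    def bayes_posterior(prior, likelihood, likelihood_alt):
        """P(H|E) from P(H), P(E|H), and P(E|not-H) via Bayes' theorem."""
        evidence = prior * likelihood + (1 - prior) * likelihood_alt
        return prior * likelihood / evidence

    def is_additive(p, p_complement, tol=1e-6):
        """Complementary probability judgments should sum to 1."""
        return abs(p + p_complement - 1.0) < tol

    # An analyst judging P(H|E) = 0.8 is Bayes-compliant with these inputs:
    posterior = bayes_posterior(prior=0.5, likelihood=0.8, likelihood_alt=0.2)
    ```

    A judgment pair such as (0.7, 0.4) would fail the additivity check, which is the kind of incoherence the instruction was designed to reduce.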

  8. 17 CFR 200.17 - Chief Management Analyst.

    Science.gov (United States)

    2010-04-01

    ...) Organizational structures and delegations of authority; (d) Management information systems and concepts; and (e... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false Chief Management Analyst. 200...; CONDUCT AND ETHICS; AND INFORMATION AND REQUESTS Organization and Program Management General Organization...

  9. Senior Financial Analyst – External Funds Management | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Job Summary The Senior Financial Analyst, External Funds Management is responsible ... in accordance with the donor agreements and accounting principles. ... Assist the Manager in the development of the financial accounting structure for ...

  10. How Analysts Cognitively “Connect the Dots”

    Energy Technology Data Exchange (ETDEWEB)

    Bradel, Lauren; Self, Jessica S.; Endert, Alexander; Hossain, Shahriar M.; North, Chris; Ramakrishnan, Naren

    2013-06-04

    As analysts attempt to make sense of a collection of documents, such as intelligence analysis reports, they may wish to “connect the dots” between pieces of information that may initially seem unrelated. This process of synthesizing information requires users to make connections between pairs of documents, creating a conceptual story. We conducted a user study to analyze the process by which users connect pairs of documents and how they spatially arrange information. Users created conceptual stories that connected the dots using organizational strategies that ranged in complexity. We propose taxonomies for the cognitive connections and physical structures used when trying to “connect the dots” between two documents. We compared the user-created stories with a data-mining algorithm that constructs chains of documents using co-occurrence metrics. Using the insight gained into the storytelling process, we offer design considerations for the existing data-mining algorithm and corresponding tools to combine the power of data mining with the complex cognitive processing of analysts.

  11. Special Nuclear Material Gamma-Ray Signatures for Reachback Analysts

    Energy Technology Data Exchange (ETDEWEB)

    Karpius, Peter Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Myers, Steven Charles [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-29

    These are slides on special nuclear material gamma-ray signatures for reachback analysts for an LSS Spectroscopy course. The closing thoughts for this presentation are the following: SNM materials have definite spectral signatures that should be readily recognizable to analysts in both bare and shielded configurations. One can estimate burnup of plutonium using certain pairs of peaks that are a few keV apart. In most cases, one cannot reliably estimate uranium enrichment in an analogous way to the estimation of plutonium burnup. The origin of the most intense peaks from some SNM items may be indirect and from 'associated nuclides'. Indirect SNM signatures sometimes have commonalities with the natural gamma-ray background.

  12. Understanding NASA surface missions with the PDS Analyst's Notebook

    Science.gov (United States)

    Stein, T.

    2011-10-01

    Planetary data archives of surface missions contain data from numerous hosted instruments. Because of the nondeterministic nature of surface missions, it is not possible to assess the data without understanding the context in which they were collected. The PDS Analyst's Notebook (http://an.rsl.wustl.edu) provides access to Mars Exploration Rover (MER) [1] and Mars Phoenix Lander [2] data archives by integrating sequence information, engineering and science data, observation planning and targeting, and documentation into web-accessible pages to facilitate "mission replay." In addition, Lunar Apollo surface mission data archives and LCROSS mission data are available in the Analyst's Notebook concept, and a Notebook is planned for Mars Science Laboratory (MSL) mission.

  13. Learning patterns of life from intelligence analyst chat

    Science.gov (United States)

    Schneider, Michael K.; Alford, Mark; Babko-Malaya, Olga; Blasch, Erik; Chen, Lingji; Crespi, Valentino; HandUber, Jason; Haney, Phil; Nagy, Jim; Richman, Mike; Von Pless, Gregory; Zhu, Howie; Rhodes, Bradley J.

    2016-05-01

    Our Multi-INT Data Association Tool (MIDAT) learns patterns of life (POL) of a geographical area from video analyst observations called out in textual reporting. Typical approaches to learning POLs from video make use of computer vision algorithms to extract locations in space and time of various activities. Such approaches are subject to the detection and tracking performance of the video processing algorithms. Numerous examples of human analysts monitoring live video streams annotating or "calling out" relevant entities and activities exist, such as security analysis, crime-scene forensics, news reports, and sports commentary. This user description typically corresponds with textual capture, such as chat. Although the purpose of these text products is primarily to describe events as they happen, organizations typically archive the reports for extended periods. This archive provides a basis to build POLs. Such POLs are useful for diagnosis to assess activities in an area based on historical context, and for consumers of products, who gain an understanding of historical patterns. MIDAT combines natural language processing, multi-hypothesis tracking, and Multi-INT Activity Pattern Learning and Exploitation (MAPLE) technologies in an end-to-end lab prototype that processes textual products produced by video analysts, infers POLs, and highlights anomalies relative to those POLs with links to "tracks" of related activities performed by the same entity. MIDAT technologies perform well, achieving, for example, a 90% F1-value on extracting activities from the textual reports.
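    The 90% F1-value cited above is the standard harmonic mean of precision and recall for an extraction task. A minimal sketch, with illustrative counts (the tp/fp/fn values are not from the paper):

    ```python
    # Hedged sketch of how an F1-value like the 90% reported above is computed
    # from extraction counts; the specific counts here are illustrative only.
    def f1_score(tp, fp, fn):
        """Harmonic mean of precision and recall for extraction evaluation."""
        precision = tp / (tp + fp)   # fraction of extracted activities that are correct
        recall = tp / (tp + fn)      # fraction of true activities that were extracted
        return 2 * precision * recall / (precision + recall)

    # e.g., 9 correctly extracted activities, 1 spurious, 1 missed:
    score = f1_score(tp=9, fp=1, fn=1)
    ```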

  14. What's in a name: what analyst and patient call each other.

    Science.gov (United States)

    Barron, Grace Caroline

    2006-01-01

    Awkward moments often arise between patient and analyst involving the question, "What do we call each other?" The manner in which the dyad address each other contains material central to the patient's inner life. Names, like dreams, deserve a privileged status as providing a royal road into the paradoxical analytic relationship and the unconscious conflicts that feed it. Whether an analyst addresses the patient formally, informally, or not at all, awareness of the issues surrounding names is important.

  15. Nothing but the truth: self-disclosure, self-revelation, and the persona of the analyst.

    Science.gov (United States)

    Levine, Susan S

    2007-01-01

    The question of the analyst's self-disclosure and self-revelation inhabits every moment of every psychoanalytic treatment. All self-disclosures and revelations, however, are not equivalent, and differentiating among them allows us to define a construct that can be called the analytic persona. Analysts already rely on an unarticulated concept of an analytic persona that guides them, for instance, as they decide what constitutes appropriate boundaries. Clinical examples illustrate how self-disclosures and revelations from within and without the analytic persona feel different, for both patient and analyst. The analyst plays a specific role for each patient and is both purposefully and unconsciously different in this context than in other settings. To a great degree, the self is a relational phenomenon. Our ethics call for us to tell nothing but the truth and simultaneously for us not to tell the whole truth. The unarticulated working concept of an analytic persona that many analysts have refers to the self we step out of at the close of each session and the self we step into as the patient enters the room. Attitudes toward self-disclosure and self-revelation can be considered reflections of how we conceptualize this persona.

  16. Financial Analysts' Forecast Accuracy : Before and After the Introduction of AIFRS

    Directory of Open Access Journals (Sweden)

    Chee Seng Cheong

    2010-09-01

    Full Text Available We examine whether financial analysts’ forecast accuracy differs between the pre- and post-adoption of Australian Equivalents to the International Financial Reporting Standards (AIFRS). We find that forecast accuracy has improved after Australia adopted AIFRS. As a secondary objective, this paper also investigates the role of financial analysts in reducing information asymmetry in today’s Australian capital market. We find weak evidence that more analysts following a stock do not help to improve forecast accuracy by bringing more firm-specific information to the market.

  17. The Pentagon's Military Analyst Program

    Science.gov (United States)

    Valeri, Andy

    2014-01-01

    This article provides an investigatory overview of the Pentagon's military analyst program, what it is, how it was implemented, and how it constitutes a form of propaganda. A technical analysis of the program is applied using the theoretical framework of the propaganda model first developed by Noam Chomsky and Edward S. Herman. Definitions…

  18. Analysis of Skills Requirement for Entry-Level Programmer/Analysts in Fortune 500 Corporations

    Science.gov (United States)

    Lee, Choong Kwon; Han, Hyo-Joo

    2008-01-01

    This paper presents the most up-to-date skill requirements for programmer/analyst, one of the most demanded entry-level job titles in the Information Systems (IS) field. In the past, several researchers studied job skills for IS professionals, but few have focused especially on "programmer/analyst." The authors conducted an extensive empirical…

  19. A reply to behavior analysts writing about rules and rule-governed behavior.

    Science.gov (United States)

    Schlinger, H D

    1990-01-01

    Verbal stimuli called "rules" or "instructions" continue to be interpreted as discriminative stimuli despite recent arguments against this practice. Instead, it might be more fruitful for behavior analysts to focus on "contingency-specifying stimuli," which are function-altering. Moreover, rather than having a special term, "rule," for verbal stimuli whose only function is discriminative, perhaps behavior analysts should reserve the term, if at all, only for these function-altering contingency-specifying stimuli.

  20. Implementation status of performance demonstration program for steam generator tubing analysts in Korea

    International Nuclear Information System (INIS)

    Cho, Chan Hee; Lee, Hee Jong; Yoo, Hyun Ju; Nam, Min Woo; Hong, Sung Yull

    2013-01-01

    Some essential components in nuclear power plants are periodically inspected using non-destructive examinations, for example ultrasonic, eddy current and radiographic examinations, in order to determine their integrity. These components include nuclear power plant items such as vessels, containments, piping systems, pumps, valves, tubes and core support structures. Steam generator tubes have an important safety role because they constitute one of the primary barriers between the radioactive and non-radioactive sides of the nuclear power plant. There is potential that if a tube bursts while a plant is operating, radioactivity from the primary coolant system could escape directly to the atmosphere. Therefore, in-service inspections are critical in maintaining steam generator tube integrity. In general, eddy current testing is widely used for the inspection of steam generator tubes due to its high inspection speed and flaw detectability on non-magnetic tubes. However, it is not easy to analyze eddy current signals correctly because they are influenced by many factors. Therefore, the performance of eddy current data analysts for steam generator tubing should be demonstrated comprehensively. In Korea, the performance of steam generator tubing analysts has been demonstrated using the Qualified Data Analyst program. This paper describes the performance demonstration program for steam generator tubing analysts and its implementation results in Korea. The pass rate of domestic analysts for this program was 71.4%.

  1. Implementation status of performance demonstration program for steam generator tubing analysts in Korea

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Chan Hee; Lee, Hee Jong; Yoo, Hyun Ju; Nam, Min Woo [KHNP Central Research Institute, Daejeon (Korea, Republic of); Hong, Sung Yull [Yeungnam Univ., Gyeongsan (Korea, Republic of)

    2013-02-15

    Some essential components in nuclear power plants are periodically inspected using non-destructive examinations, for example ultrasonic, eddy current and radiographic examinations, in order to determine their integrity. These components include nuclear power plant items such as vessels, containments, piping systems, pumps, valves, tubes and core support structures. Steam generator tubes have an important safety role because they constitute one of the primary barriers between the radioactive and non-radioactive sides of the nuclear power plant. There is potential that if a tube bursts while a plant is operating, radioactivity from the primary coolant system could escape directly to the atmosphere. Therefore, in-service inspections are critical in maintaining steam generator tube integrity. In general, eddy current testing is widely used for the inspection of steam generator tubes due to its high inspection speed and flaw detectability on non-magnetic tubes. However, it is not easy to analyze eddy current signals correctly because they are influenced by many factors. Therefore, the performance of eddy current data analysts for steam generator tubing should be demonstrated comprehensively. In Korea, the performance of steam generator tubing analysts has been demonstrated using the Qualified Data Analyst program. This paper describes the performance demonstration program for steam generator tubing analysts and its implementation results in Korea. The pass rate of domestic analysts for this program was 71.4%.

  2. A reply to behavior analysts writing about rules and rule-governed behavior

    OpenAIRE

    Schlinger, Henry D.

    1990-01-01

    Verbal stimuli called “rules” or “instructions” continue to be interpreted as discriminative stimuli despite recent arguments against this practice. Instead, it might be more fruitful for behavior analysts to focus on “contingency-specifying stimuli,” which are function-altering. Moreover, rather than having a special term, “rule,” for verbal stimuli whose only function is discriminative, perhaps behavior analysts should reserve the term, if at all, only for these function-altering contingency-sp...

  3. Are security analysts rational? a literature review

    OpenAIRE

    Peixinho, Rúben; Coelho, Luís; Taffler, Richard J.

    2005-01-01

    Rational choice theory and bounded rationality constitute the basis for the discussion in several areas regarding human rationality. In finance, this discussion has taken place between the traditional finance and behavioural finance approaches, which have different perspectives concerning the rationality of market agents. This paper reviews several studies addressing rationality among security analysts. The analysis shows that analysts' systematic optimism seems to be inconsistent with rationality...

  4. Audience as analyst: Dennis Potter's The Singing Detective.

    Science.gov (United States)

    Jeffrey, W

    1997-06-01

    Author Dennis Potter has written an exceptional psychoanalytically informed television series in The Singing Detective. Potter succeeds by eschewing the usual portrayal of psychoanalysis in cinema and television as a therapy which the viewer observes; instead he creates, by means of the content and structure of the series, a production that forces the audience into the role of analyst. The story of the current life and the childhood of the protagonist, Philip Marlow, has depth and context which allow the audience to examine the personality of Marlow, including character pathology and traits, sexuality, fantasy, dreams, and delusions, from several metapsychological viewpoints. Potter allows the audience to use the dynamic, genetic, topographic, and, most unusual in drama, structural viewpoints. The audience can experience aspects of an analyst's experience, including the process of formulating and evaluating analytic hypotheses over time and coping with emotional reactions to material which at times has transference-like qualities.

  5. Fuzzy VIKOR approach for selection of big data analyst in procurement management

    Directory of Open Access Journals (Sweden)

    Surajit Bag

    2016-07-01

    Full Text Available Background: Big data and predictive analysis have been hailed as the fourth paradigm of science. Big data and analytics are critical to the future of business sustainability. The demand for data scientists is increasing with the dynamic nature of businesses, thus making it indispensable to manage big data, derive meaningful results and interpret management decisions. Objectives: The purpose of this study was to provide a brief conceptual review of big data and analytics and further illustrate the use of a multicriteria decision-making technique in selecting the right skilled candidate for big data and analytics in procurement management. Method: It is important for firms to select and recruit the right data analyst, both in terms of skill sets and scope of analysis. The nature of such a problem is complex multicriteria decision-making, which deals with both qualitative and quantitative factors. In the current study, an application of the Fuzzy VIsekriterijumska optimizacija i KOmpromisno Resenje (VIKOR) method was used to solve the big data analyst selection problem. Results: From this study, it was identified that Technical knowledge (C1), Intellectual curiosity (C4) and Business acumen (C5) are the strongest influential criteria and must be present in the candidate for the big data and analytics job. Conclusion: Fuzzy VIKOR is well suited to this kind of multiple-criteria decision-making scenario. This study will assist human resource managers and procurement managers in selecting the right workforce for big data analytics.
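    The core VIKOR ranking steps can be sketched as follows. This is a hypothetical crisp-VIKOR illustration, not the paper's implementation: the fuzzy variant the study uses defuzzifies triangular fuzzy ratings before these steps, and the scores and weights below are invented for demonstration.

    ```python
    import numpy as np

    def vikor(scores, weights, v=0.5):
        """Rank candidates by the VIKOR compromise index Q (lower is better).

        scores:  (candidates x criteria) matrix, all benefit criteria.
        weights: criteria weights summing to 1.
        v:       weight of the "majority rule" (group utility) strategy.
        """
        best = scores.max(axis=0)    # ideal value f* per criterion
        worst = scores.min(axis=0)   # anti-ideal value f- per criterion
        norm = (best - scores) / (best - worst)   # normalized distance to ideal
        S = (weights * norm).sum(axis=1)          # group utility per candidate
        R = (weights * norm).max(axis=1)          # individual regret per candidate
        Q = (v * (S - S.min()) / (S.max() - S.min())
             + (1 - v) * (R - R.min()) / (R.max() - R.min()))
        return Q

    # Illustrative example: 3 candidates rated on criteria like C1 (technical
    # knowledge), C4 (intellectual curiosity), C5 (business acumen).
    scores = np.array([[0.9, 0.8, 0.7],
                       [0.5, 0.6, 0.4],
                       [0.7, 0.7, 0.6]])
    weights = np.array([0.4, 0.3, 0.3])
    Q = vikor(scores, weights)
    ```

    A candidate who scores best on every criterion gets Q = 0 and ranks first; the compromise weight v trades off group utility against worst-case regret.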

  6. The body of the analyst and the analytic setting: reflections on the embodied setting and the symbiotic transference.

    Science.gov (United States)

    Lemma, Alessandra

    2014-04-01

    In this paper the author questions whether the body of the analyst may be helpfully conceptualized as an embodied feature of the setting and suggests that this may be especially helpful for understanding patients who develop a symbiotic transference and for whom any variance in the analyst's body is felt to be profoundly destabilizing. In such cases the patient needs to relate to the body of the analyst concretely and exclusively as a setting 'constant' and its meaning for the patient may thus remain inaccessible to analysis for a long time. When the separateness of the body of the analyst reaches the patient's awareness because of changes in the analyst's appearance or bodily state, it then mobilizes primitive anxieties in the patient. It is only when the body of the analyst can become a dynamic variable between them (i.e., part of the process) that it can be used by the patient to further the exploration of their own mind. Copyright © 2014 Institute of Psychoanalysis.

  7. Applied predictive analytics principles and techniques for the professional data analyst

    CERN Document Server

    Abbott, Dean

    2014-01-01

    Learn the art and science of predictive analytics - techniques that get results. Predictive analytics is what translates big data into meaningful, usable business information. Written by a leading expert in the field, this guide examines the science of the underlying algorithms as well as the principles and best practices that govern the art of predictive analytics. It clearly explains the theory behind predictive analytics, teaches the methods, principles, and techniques for conducting predictive analytics projects, and offers tips and tricks that are essential for successful predictive modeling.

  8. 78 FR 14359 - Verizon Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet...

    Science.gov (United States)

    2013-03-05

    ... Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small And... Management, Voice Over Internet Protocol, Small And Medium Business, San Antonio, TX; Amended Certification... Business Networks Services, Inc., Senior Analysts-Order Management, Voice Over Internet Protocol, Small and...

  9. COLAB: A Laboratory Environment for Studying Analyst Sensemaking and Collaboration

    National Research Council Canada - National Science Library

    Morrison, Clayton T; Cohen, Paul R

    2005-01-01

    COLAB is a laboratory for studying tools that facilitate collaboration and sensemaking among groups of human analysts as they build interpretations of unfolding situations based on accruing intelligence data...

  10. Development of a Nevada Statewide Database for Safety Analyst Software

    Science.gov (United States)

    2017-02-02

    Safety Analyst is a software package developed by the Federal Highway Administration (FHWA) and twenty-seven participating state and local agencies including the Nevada Department of Transportation (NDOT). The software package implemented many of the...

  11. Temenos regained: reflections on the absence of the analyst.

    Science.gov (United States)

    Abramovitch, Henry

    2002-10-01

    The importance of the temenos as a metaphor to conceptualize therapeutic containment is discussed. Jung drew the analogy between the consulting room and the temenos, at the centre of the Greek Temple as a sacred and inviolate place where the analysand might encounter the Self. Although Jung believed that whether called or not, the gods would appear, under certain conditions, patients may experience 'temenos lost', the loss of the holding function of the analytic space. Two cases are presented in which temenos issues played a central role. In one case, an unorthodox method was used to preserve the analytic container during the absence of the analyst and in the other, the impact of an extra-analytical encounter had a dramatic effect on the holding function of the temenos. A discussion is presented of the appropriate circumstances in which analysts may deviate from traditional analytic practice in order to preserve the temenos and transform a 'temenos lost' into a 'temenos regained'.

  12. Self-confidence in financial analysis: a study of younger and older male professional analysts.

    Science.gov (United States)

    Webster, R L; Ellis, T S

    2001-06-01

    Measures of reported self-confidence in performing financial analysis by 59 professional male analysts, 31 born between 1946 and 1964 and 28 born between 1965 and 1976, were investigated and reported. Self-confidence in one's ability is important in the securities industry because it affects recommendations and decisions to buy, sell, and hold securities. The respondents analyzed a set of multiyear corporate financial statements and reported their self-confidence in six separate financial areas. Data from the 59 male financial analysts were tallied and analyzed using both univariate and multivariate statistical tests. Rated self-confidence was not significantly different for the younger and the older men. These results are not consistent with a similar prior study of female analysts in which younger women showed significantly higher self-confidence than older women.

  13. Predicting unpredictability

    Science.gov (United States)

    Davis, Steven J.

    2018-04-01

    Analysts and markets have struggled to predict a number of phenomena, such as the rise of natural gas, in US energy markets over the past decade or so. Research shows the challenge may grow because the industry — and consequently the market — is becoming increasingly volatile.

  14. Accuracy and Consistency of Grass Pollen Identification by Human Analysts Using Electron Micrographs of Surface Ornamentation

    Directory of Open Access Journals (Sweden)

    Luke Mander

    2014-08-01

    Full Text Available Premise of the study: Humans frequently identify pollen grains at a taxonomic rank above species. Grass pollen is a classic case of this situation, which has led to the development of computational methods for identifying grass pollen species. This paper aims to provide context for these computational methods by quantifying the accuracy and consistency of human identification. Methods: We measured the ability of nine human analysts to identify 12 species of grass pollen using scanning electron microscopy images. These are the same images that were used in computational identifications. We have measured the coverage, accuracy, and consistency of each analyst, and investigated their ability to recognize duplicate images. Results: Coverage ranged from 87.5% to 100%. Mean identification accuracy ranged from 46.67% to 87.5%. The identification consistency of each analyst ranged from 32.5% to 87.5%, and each of the nine analysts produced considerably different identification schemes. The proportion of duplicate image pairs that were missed ranged from 6.25% to 58.33%. Discussion: The identification errors made by each analyst, which result in a decline in accuracy and consistency, are likely related to psychological factors such as the limited capacity of human memory, fatigue and boredom, recency effects, and positivity bias.

  15. Information seeking and use behaviour of economists and business analysts

    Directory of Open Access Journals (Sweden)

    Eric Thivant

    2005-01-01

    Full Text Available Introduction. The aim of this paper is to address the problem of information seeking and use in a professional context and to understand how activity can influence practices, taking as an example the research undertaken by economic analysts. We analyse the relationship between the situational approach described by Cheuk, the complexity of the work environment (with its social, technological and personal aspects), and information seeking and use strategies, drawing on Ellis and Wilson's models, with Bates's comments. Method. We interviewed eight economists using a questionnaire and the SICIA (Situation, Complexity and Information Activity) method. The SICIA method is a qualitative approach which underlines the relationship between situations, professional contexts and strategies. Both methods allow better understanding of how investment analysts find out what they need for their job. We can clarify their information sources and information-seeking practices, which are very particular because of their activities. We complete our analysis by interviewing analysts from financial institutions. Analysis. A qualitative mode of analysis was used to interpret the interviewees' comments within the research framework adopted. Results. We find similarity in the information seeking and use strategies used by these two groups, and environmental levels meet in most situations. But some differences can also be found, explained by the activity frameworks and goals. Conclusion. This study demonstrates that the activity and also the professional context (here the financial context) can directly influence practices.

  16. Seeing, mirroring, desiring: the impact of the analyst's pregnant body on the patient's body image.

    Science.gov (United States)

    Yakeley, Jessica

    2013-08-01

    The paper explores the impact of the analyst's pregnant body on the course of two analyses, of a young man and a young woman, specifically focusing on how each patient's visual perception and affective experience of being with the analyst's pregnant body affected their own body image and subjective experience of their body. The pre-verbal or 'subsymbolic' material evoked in the analyses contributed to a greater understanding of the patients' developmental experiences in infancy and adolescence, which had resulted in both patients carrying a profoundly distorted body image into adulthood. The analyst's pregnancy offered a therapeutic window in which a shift in the patient's body image could be initiated. Clinical material is presented in detail with reference to the psychoanalytic literature on the pregnant analyst, and that of the development of the body image, particularly focusing on the role of visual communication and the face. The author proposes a theory of psychic change, drawing on Bucci's multiple code theory, in which the patients' unconscious or 'subsymbolic' awareness of the analyst's pregnancy, manifest in their bodily responses, feeling states and dreams, as well as in the analyst's countertransference, could gradually be verbalized and understood within the transference. Thus visual perception, or 'external seeing', could gradually become 'internal seeing', or insight into unconscious phantasies, leading to a shift in the patients' internal object world towards a less persecutory state and a more realistic appraisal of their body image. Copyright © 2013 Institute of Psychoanalysis.

  17. Look who is talking now: analyst recommendations and internet IPOs

    NARCIS (Netherlands)

    van der Goot, T.; van Giersbergen, N.

    2009-01-01

    This paper investigates whether analyst recommendations are independent of their employer’s investment banking activities. Our sample consists of internet firms that went public during 1997-2000. The contribution of the paper to the literature is threefold. First, to account for missing

  18. [Concordance among analysts from Latin-American laboratories for rice grain appearance determination using a gallery of digital images].

    Science.gov (United States)

    Avila, Manuel; Graterol, Eduardo; Alezones, Jesús; Criollo, Beisy; Castillo, Dámaso; Kuri, Victoria; Oviedo, Norman; Moquete, Cesar; Romero, Marbella; Hanley, Zaida; Taylor, Margie

    2012-06-01

    The appearance of rice grain is a key aspect of quality determination. This analysis is performed mainly by expert analysts through visual observation; however, because of the subjective nature of the analysis, results may vary among analysts. To evaluate the concordance among analysts from Latin-American rice quality laboratories in assessing rice grain appearance from digital images, an inter-laboratory test was performed with ten analysts and images of 90 grains captured with a high-resolution scanner. Rice grains were classified into four categories: translucent, chalky, white belly, and damaged grain. Data were categorized using statistical parameters such as the mode and its frequency, the relative concordance, and the reproducibility parameter kappa. Additionally, a referential image gallery of typical grains for each category was constructed based on mode frequency. Results showed a kappa value of 0.49, corresponding to moderate reproducibility, attributable to subjectivity in the visual analysis of grain images. These results reveal the need to standardize evaluation criteria among analysts to improve confidence in the determination of rice grain appearance.
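
    The kappa reproducibility statistic the abstract reports can be computed directly from the raw rating counts. A minimal sketch, assuming Fleiss' kappa for a ten-analyst design (the abstract does not specify which kappa variant was used); the toy ratings matrix below is invented for illustration:

    ```python
    import numpy as np

    def fleiss_kappa(counts):
        """Fleiss' kappa for an (items x categories) matrix of rating counts.

        counts[i, j] = number of analysts who placed grain i in category j;
        every row must sum to the same number of raters n.
        """
        counts = np.asarray(counts, dtype=float)
        n = counts.sum(axis=1)[0]          # raters per item (constant)
        N = counts.shape[0]                # number of items
        # Per-item agreement: fraction of concordant rater pairs.
        P_i = (np.sum(counts**2, axis=1) - n) / (n * (n - 1))
        P_bar = P_i.mean()
        # Chance agreement from the marginal category proportions.
        p_j = counts.sum(axis=0) / (N * n)
        P_e = np.sum(p_j**2)
        return (P_bar - P_e) / (1 - P_e)

    # Toy example: 4 grains rated by 10 analysts into 4 categories
    # (translucent, chalky, white belly, damaged).
    ratings = np.array([
        [10, 0, 0, 0],   # unanimous
        [6, 3, 1, 0],    # split
        [0, 9, 1, 0],
        [2, 2, 3, 3],    # highly discordant
    ])
    print(round(fleiss_kappa(ratings), 2))   # ≈ 0.38, "fair to moderate" agreement
    ```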

  19. Which analysts benefited most from mandatory IFRS adoption in Europe?

    NARCIS (Netherlands)

    Beuselinck, Christof; Joos, Philip; Khurana, I.K.; van der Meulen, Sofie

    2017-01-01

    This study examines whether financial analysts' research structure and portfolio selection choices helped in improving relative earnings forecast accuracy around mandatory IFRS adoption in Europe. Using a sample of 68,665 one-year ahead forecasts for 1,980 publicly listed firms, we find that

  20. Meta-Analyst: software for meta-analysis of binary, continuous and diagnostic data

    Directory of Open Access Journals (Sweden)

    Schmid Christopher H

    2009-12-01

    Full Text Available Abstract Background Meta-analysis is increasingly used as a key source of evidence synthesis to inform clinical practice. The theory and statistical foundations of meta-analysis continually evolve, providing solutions to many new and challenging problems. In practice, most meta-analyses are performed in general statistical packages or dedicated meta-analysis programs. Results Herein, we introduce Meta-Analyst, a novel, powerful, intuitive, and free meta-analysis program for the meta-analysis of a variety of problems. Meta-Analyst is implemented in C# atop the Microsoft .NET framework, and features a graphical user interface. The software performs several meta-analysis and meta-regression models for binary and continuous outcomes, as well as analyses for diagnostic and prognostic test studies in the frequentist and Bayesian frameworks. Moreover, Meta-Analyst includes a flexible tool to edit and customize generated meta-analysis graphs (e.g., forest plots) and provides output in many formats (images, Adobe PDF, Microsoft Word-ready RTF). The software architecture employed allows for rapid changes to be made to either the Graphical User Interface (GUI) or to the analytic modules. We verified the numerical precision of Meta-Analyst by comparing its output with that from standard meta-analysis routines in Stata over a large database of 11,803 meta-analyses of binary outcome data and 6,881 meta-analyses of continuous outcome data from the Cochrane Library of Systematic Reviews. Results from analyses of diagnostic and prognostic test studies have been verified in a limited number of meta-analyses versus MetaDisc and MetaTest. Bayesian statistical analyses use the OpenBUGS calculation engine (and are thus as accurate as the standalone OpenBUGS software). Conclusion We have developed and validated a new program for conducting meta-analyses that combines the advantages of existing software for this task.
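
    The most basic model implemented by meta-analysis tools of this kind is inverse-variance fixed-effect pooling. A minimal sketch with invented study data, not output from Meta-Analyst itself:

    ```python
    import math

    def fixed_effect_pool(estimates, ses):
        """Inverse-variance fixed-effect pooling of per-study effect estimates
        (e.g., log odds ratios): each study is weighted by 1/SE^2."""
        weights = [1.0 / se**2 for se in ses]
        pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
        se_pooled = math.sqrt(1.0 / sum(weights))
        return pooled, se_pooled

    # Three hypothetical trials: (log odds ratio, standard error).
    studies = [(-0.40, 0.20), (-0.25, 0.15), (-0.10, 0.30)]
    log_or, se = fixed_effect_pool(*zip(*studies))
    lo, hi = log_or - 1.96 * se, log_or + 1.96 * se
    print(f"pooled OR = {math.exp(log_or):.2f}, "
          f"95% CI [{math.exp(lo):.2f}, {math.exp(hi):.2f}]")
    ```

    Random-effects models add a between-study variance term to each weight; the binary, continuous, and diagnostic routines in dedicated programs build on this same weighted-average core.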

  1. Macroeconomic predictions – Three essays on analysts' forecast quality

    OpenAIRE

    Orbe, Sebastian

    2013-01-01

    Macroeconomic expectation data are of great interest to different agents due to their importance as central input factors in various applications. To name but a few, politicians, capital market participants, as well as academics, incorporate these forecast data into their decision processes. Consequently, a sound understanding of the quality properties of macroeconomic forecast data, their quality determinants, as well as potential ways to improve macroeconomic predictions is desirable. ...

  2. The Effects of Bug-in-Ear Coaching on Pre-Service Behavior Analysts' Use of Functional Communication Training.

    Science.gov (United States)

    Artman-Meeker, Kathleen; Rosenberg, Nancy; Badgett, Natalie; Yang, Xueyan; Penney, Ashley

    2017-09-01

    Behavior analysts play an important role in supporting the behavior and learning of young children with disabilities in natural settings. However, there is very little research related specifically to developing the skills and competencies needed by pre-service behavior analysts. This study examined the effects of "bug-in-ear" (BIE) coaching on pre-service behavior analysts' implementation of functional communication training with pre-school children with autism in their classrooms. BIE coaching was associated with increases in the rate of functional communication training trials each intern initiated per session and in the fidelity with which interns implemented functional communication training. Adults created more intentional opportunities for children to communicate, and adults provided more systematic instruction around those opportunities.

  3. The analyst's desire in the clinic of anorexia

    OpenAIRE

    Silva, Mariana Benatto Pereira da; Pereira, Mario Eduardo Costa; Celeri, Eloísa Helena Valler

    2010-01-01

    The present work deals with the issue of the analyst's desire in the psychoanalytical treatment of anorexia. It analyzes important elements to establish transference in these cases, as the pursuit of death and the choice of refusing food as a way of controlling the demands of the Other. It then discusses the "analyst's desire" function in this clinic. Rejecting the definition of a treatment model and the structural categorization of anorexia, we can find in the cases of the girl of Angouleme ...

  4. Disclosure of Non-Financial Information: Relevant to Financial Analysts?

    OpenAIRE

    ORENS, Raf; LYBAERT, Nadine

    2013-01-01

    The decline in the relevance of financial statement information to value firms leads to calls from organizational stakeholders to convey non-financial information in order to be able to judge firms' financial performance and value. This literature review aims to report extant literature findings on the use of corporate non-financial information by sell-side financial analysts, the information intermediaries between corporate management and investors. Prior studies highlight that financial ana...

  5. Idealization of the analyst by the young adult.

    Science.gov (United States)

    Chused, J F

    1987-01-01

    Idealization is an intrapsychic process that serves many functions. In addition to its use defensively and for gratification of libidinal and aggressive drive derivatives, it can contribute to developmental progression, particularly during late adolescence and young adulthood. During an analysis, it is important to recognize all the determinants of idealization, including those related to the reworking of developmental conflicts. If an analyst understands idealization solely as a manifestation of pathology, he may interfere with his patient's use of it for the development of autonomous functioning.

  6. Analyzing the Qualitative Data Analyst: A Naturalistic Investigation of Data Interpretation

    Directory of Open Access Journals (Sweden)

    Wolff-Michael Roth

    2015-07-01

    Full Text Available Much qualitative research involves the analysis of verbal data. Although the possibility of conducting qualitative research in a rigorous manner is sometimes contested in debates over qualitative/quantitative methods, there are scholarly communities within which qualitative research is indeed data driven and enacted in rigorous ways. How might one teach rigorous approaches to the analysis of verbal data? In this study, 20 sessions were recorded in introductory graduate classes on qualitative research methods. The social scientist thought aloud while analyzing transcriptions that were handed to her immediately prior to the sessions and for which she had no background information. The students then assessed, sometimes showing the original video, the degree to which the analyst had recovered (the structures of) the original events. This study provides answers to the broad question: "How does an analyst recover an original event with a high degree of accuracy?" Implications are discussed for teaching qualitative data analysis. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1503119

  7. Examination of suspicious objects by virus analysts

    Science.gov (United States)

    Ananin, E. V.; Ananina, I. S.; Nikishova, A. V.

    2018-05-01

    The paper presents data on the urgency of virus threats. For an antivirus to work properly, data on every new virus implementation must be added to its database, which requires investigating all suspicious objects. This is a dangerous process and should be done in a virtual system; even then, it is not fully secure for the main system. The paper therefore proposes a diagram of a secure workplace for a virus analyst, including software for its protection, along with the settings needed to secure the process of investigating suspicious objects. The proposed approach minimizes the risks posed by the virus.

  8. Analyst Hype in IPOs: Explaining the Popularity of Bookbuilding

    OpenAIRE

    Francois Degeorge; Francois Derrien; Kent L. Womack

    2007-01-01

    The bookbuilding IPO procedure has captured significant market share from auction alternatives recently, despite the significantly lower costs related to the auction mechanism. In France, where both mechanisms were used in the 1990s, the ostensible advantages of bookbuilding were advertising-related benefits. Book-built issues were more likely to be followed and positively recommended by lead underwriters. Even nonunderwriters' analysts promote book-built issues more in order to curry favor w...

  9. Monday-Morning Quarterbacking: A Senior Analyst Uses His Early Work to Discuss Contemporary Child and Adolescent Psychoanalytic Technique.

    Science.gov (United States)

    Sugarman, Alan

    2015-01-01

    Contemporary child and adolescent psychoanalytic technique has evolved and changed a great deal in the last thirty years. This paper will describe the analysis of an adolescent girl from early in the author's career to demonstrate the ways in which technique has changed. The clinical material presented highlights six areas in which contemporary child and adolescent analysts practice and/or understand material and the clinical process differently than they did thirty years ago: (1) the contemporary perspective on mutative action, (2) the contemporary emphasis on mental organization, (3) the developmental lag in integrating the structural model, (4) the child analyst's multiple functions, (5) the child analyst's use of countertransference, and (6) the child analyst's work with parents. The author discusses how he would work differently with the patient now using his contemporary perspective. But he also wonders what might have been lost by not working in a more traditional manner, in particular the opportunity to analyze the patient's hypersensitivity to feeling hurt and mistreated so directly in the transference.

  10. Storing and managing information artifacts collected by information analysts using a computing device

    Science.gov (United States)

    Pike, William A; Riensche, Roderick M; Best, Daniel M; Roberts, Ian E; Whyatt, Marie V; Hart, Michelle L; Carr, Norman J; Thomas, James J

    2012-09-18

    Systems and computer-implemented processes for storage and management of information artifacts collected by information analysts using a computing device. The processes and systems can capture a sequence of interactive operation elements that are performed by the information analyst, who is collecting an information artifact from at least one of the plurality of software applications. The information artifact can then be stored together with the interactive operation elements as a snippet on a memory device, which is operably connected to the processor. The snippet comprises a view from an analysis application, data contained in the view, and the sequence of interactive operation elements stored as a provenance representation comprising operation element class, timestamp, and data object attributes for each interactive operation element in the sequence.
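
    The snippet structure the abstract describes (a view, the data it contains, and a provenance sequence of operation elements with class, timestamp, and data-object attributes) can be sketched as a simple data model. All names below are hypothetical illustrations, not taken from the patented system:

    ```python
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Any

    @dataclass
    class OperationElement:
        """One captured interaction, with the provenance attributes named in the abstract."""
        element_class: str          # e.g. "select", "copy", "annotate" (hypothetical classes)
        timestamp: datetime
        data_objects: list[str]     # identifiers of the data touched

    @dataclass
    class Snippet:
        """An information artifact plus the trail of operations that produced it."""
        view: str                   # which analysis-application view it came from
        data: Any                   # the artifact content itself
        provenance: list[OperationElement] = field(default_factory=list)

        def record(self, element_class, *data_objects):
            """Append one interactive operation to the provenance sequence."""
            self.provenance.append(OperationElement(
                element_class, datetime.now(timezone.utc), list(data_objects)))

    snip = Snippet(view="timeline", data="suspicious transfer flagged by analyst")
    snip.record("select", "txn:4411")
    snip.record("copy", "txn:4411", "note:7")
    print([op.element_class for op in snip.provenance])   # → ['select', 'copy']
    ```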

  11. Transformations in hallucinosis and the receptivity of the analyst.

    Science.gov (United States)

    Civitarese, Giuseppe

    2015-08-01

    Bion describes transformation in hallucinosis (TH) as a psychic defence present in elusive psychotic scenarios in which there is a total adherence to concrete reality: as the hallucinatory activity which physiologically infiltrates perception and allows us to know reality, setting it off against a background of familiarity; and then, surprisingly, as the ideal state of mind towards which the analyst has to move in order to intuit the facts of the analysis. When hallucinosis is followed by 'awakening', the analyst gains understanding from the experience and goes through a transformation that will inevitably be transmitted to the analytic field and to the patient. In this paper I illustrate Bion's concept and underline its eminently intersubjective nature. Then I differentiate it from two other technical devices: reverie, which unlike hallucinosis does not imply the persistence of a feeling of the real, and Ferro's transformation in dreaming, i.e., purposeful listening to everything that is said in the analysis as if it were the telling of a dream. Finally, I try to demonstrate the practical utility of the concept of transformation in hallucinosis in order to read the complex dynamics of a clinical vignette. Though not well known (only two references in English in the PEP archive), TH proves to be remarkably versatile and productive for thinking about psychoanalytic theory, technique and clinical work. Copyright © 2014 Institute of Psychoanalysis.

  12. Transference to the analyst as an excluded observer.

    Science.gov (United States)

    Steiner, John

    2008-02-01

    In this paper I briefly review some significant points in the development of ideas on transference, which owe so much to the discoveries of Freud. I then discuss some of the subsequent developments which were based on Freud's work and which have personally impressed me. In particular I mention Melanie Klein's elaboration of an internal world peopled by internal objects and her description of the mechanisms of splitting and projective identification, both of which profoundly affect our understanding of transference. Using some clinical material I try to illustrate an important transference situation which I do not think has been sufficiently emphasized, although it is part of the 'total situation' outlined by Klein. In this kind of transference the analyst finds himself in an observing position and is no longer the primary object to whom love and hate are directed. Instead he is put in the position of an excluded figure who can easily enact, rather than understand, the role he has been put in. In this situation he may try to regain his position as the patient's primary object in the transference, or avoid the transference altogether and make extra-transference interpretations, thereby enacting the role of a judgemental and critical super-ego. If he can tolerate the loss of a central role and understand the transference position he has been put in, the analyst can sometimes reduce enactments and release feelings to do with mourning and loss in both himself and his patient.

  13. Presentation of the results of a Bayesian automatic event detection and localization program to human analysts

    Science.gov (United States)

    Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.

    2016-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than the currently operating automatic localization program. However, given the CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, extensively used by the analysts, so that they can check the accuracy and robustness of the Bayesian approach. We expect the workload of the analysts to be reduced because NET-VISA performs better at finding missed events and at assembling a more complete set of stations than the current system, which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations raised by the automatic tests, which show an overall overlap improvement of 11% (meaning that the missed-events rate is cut by 42%), hold for the integrated interactive module as well. Analysts find new events that qualify for the CTBTO Reviewed Event Bulletin, beyond those analyzed through the standard procedures. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.

  14. Predictive maintenance primer

    International Nuclear Information System (INIS)

    Flude, J.W.; Nicholas, J.R.

    1991-04-01

    This Predictive Maintenance Primer provides utility plant personnel with a single-source reference to predictive maintenance analysis methods and technologies used successfully by utilities and other industries. It is intended to be a ready reference to personnel considering starting, expanding or improving a predictive maintenance program. This Primer includes a discussion of various analysis methods and how they overlap and interrelate. Additionally, eighteen predictive maintenance technologies are discussed in sufficient detail for the user to evaluate the potential of each technology for specific applications. This document is designed to allow inclusion of additional technologies in the future. To gather the information necessary to create this initial Primer the Nuclear Maintenance Applications Center (NMAC) collected experience data from eighteen utilities plus other industry and government sources. NMAC also contacted equipment manufacturers for information pertaining to equipment utilization, maintenance, and technical specifications. The Primer includes a discussion of six methods used by analysts to study predictive maintenance data. These are: trend analysis; pattern recognition; correlation; test against limits or ranges; relative comparison data; and statistical process analysis. Following the analysis methods discussions are detailed descriptions for eighteen technologies analysts have found useful for predictive maintenance programs at power plants and other industrial facilities. Each technology subchapter has a description of the operating principles involved in the technology, a listing of plant equipment where the technology can be applied, and a general description of the monitoring equipment. Additionally, these descriptions include a discussion of results obtained from actual equipment users and preferred analysis techniques to be used on data obtained from the technology. 5 refs., 30 figs
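
    Two of the six analysis methods the Primer lists, trend analysis and test against limits, can be illustrated on hypothetical monitoring data. The bearing-vibration readings and alarm limit below are invented for the sketch:

    ```python
    import numpy as np

    # Hypothetical monthly bearing-vibration readings (mm/s RMS).
    months = np.array([0, 1, 2, 3, 4, 5])
    vibration = np.array([2.1, 2.3, 2.2, 2.6, 2.8, 3.0])
    alarm_limit = 4.5

    # Test against limits: flag any reading already out of range.
    out_of_range = vibration > alarm_limit
    print("readings over limit:", int(out_of_range.sum()))

    # Trend analysis: fit a linear trend and extrapolate to the alarm limit
    # to estimate when maintenance will be needed.
    slope, intercept = np.polyfit(months, vibration, 1)
    months_to_alarm = (alarm_limit - intercept) / slope
    print(f"trend: {slope:.2f} mm/s per month; "
          f"limit reached near month {months_to_alarm:.1f}")
    ```

    Real programs refine this with the Primer's other methods, e.g. pattern recognition against known failure signatures and statistical process analysis to separate noise from genuine degradation.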

  15. The analyst's desire in the clinic of anorexia

    Directory of Open Access Journals (Sweden)

    Mariana Benatto Pereira da Silva

    2010-06-01

    Full Text Available The present work deals with the issue of the analyst's desire in the psychoanalytical treatment of anorexia. It analyzes elements important for establishing transference in these cases, such as the pursuit of death and the choice of refusing food as a way of controlling the demands of the Other. It then discusses the "analyst's desire" function in this clinic. Rejecting the definition of a treatment model and the structural categorization of anorexia, we find in the cases of the girl of Angouleme (Charcot) and Sidonie (M. Mannoni) possible subjective ways out of this psychopathological impasse, by means of this function.

  16. Senior systems analyst (m/f) | CRDI - Centre de recherches ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The senior systems analyst works under the supervision of the ... a basic functional analysis of the requirements, discussing their scope and rationale ... and the IT infrastructure, while putting themselves in the technicians' place, ...

  17. Translating the covenant: The behavior analyst as ambassador and translator.

    Science.gov (United States)

    Foxx, R M

    1996-01-01

    Behavior analysts should be sensitive to how others react to and interpret our language because it is inextricably related to our image. Our use of conceptual revision, with such terms as punishment, has created communicative confusion and hostility on the part of general and professional audiences we have attempted to influence. We must, therefore, adopt the role of ambassador and translator in the nonbehavioral world. A number of recommendations are offered for promoting, translating, and disseminating behavior analysis.

  18. Records management analyst (m/f) | CRDI - Centre de ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Summary of duties: The records management analyst is the technical resource within the Information and Records Management team. ... Also keeps informed of all best practices related to the management of the electronic and physical information and records that IDRC must ...

  19. Radiation litigation: Quality assurance and the radiation analyst

    International Nuclear Information System (INIS)

    Jose, D.E.

    1986-01-01

    This paper touches on three areas of interest to the radiation analyst; the dose issue, legal persuasion, and future legal issues. While many laboratory scientists would think that the actual dose received by the plaintiff's relevant organ would be an easy issue to resolve, that has not been the experience to date. All radiation cases are assumed to be ultrahazardous activity cases, even though they involve a dose well below yearly natural background. At some point the law needs to realize that such low dose cases are a waste of scarce judicial resources. Lawyers and scientists need to communicate with each other and work together to help improve the way the legal system processes these important cases

  20. Many analysts, one dataset: Making transparent how variations in analytical choices affect results

    NARCIS (Netherlands)

    Silberzahn, Raphael; Uhlmann, E.L.; Martin, D.P.; Anselmi, P.; Aust, F.; Awtrey, E.; Bahnik, Š.; Bai, F.; Bannard, C.; Bonnier, E.; Carlsson, R.; Cheung, F.; Christensen, G.; Clay, R.; Craig, M.A.; Dalla Rosa, A.; Dam, Lammertjan; Evans, M.H.; Flores Cervantes, I.; Fong, N.; Gamez-Djokic, M.; Glenz, A.; Gordon-McKeon, S.; Heaton, T.J.; Hederos, K.; Heene, M.; Hofelich Mohr, A.J.; Högden, F.; Hui, K.; Johannesson, M.; Kalodimos, J.; Kaszubowski, E.; Kennedy, D.M.; Lei, R.; Lindsay, T.A.; Liverani, S.; Madan, C.R.; Molden, D.; Molleman, Henricus; Morey, R.D.; Mulder, Laetitia; Nijstad, Bernard; Pope, N.G.; Pope, B.; Prenoveau, J.M.; Rink, Floortje; Robusto, E.; Roderique, H.; Sandberg, A.; Schlüter, E.; Schönbrodt, F.D.; Sherman, M.F.; Sommer, S.A.; Sotak, K.; Spain, S.; Spörlein, C.; Stafford, T.; Stefanutti, L.; Täuber, Susanne; Ullrich, J.; Vianello, M.; Wagenmakers, E.-J.; Witkowiak, M.; Yoon, S.; Nosek, B.A.

    2018-01-01

    Twenty-nine teams involving 61 analysts used the same dataset to address the same research question: whether soccer referees are more likely to give red cards to dark skin toned players than light skin toned players. Analytic approaches varied widely across teams, and estimated effect sizes ranged

  1. Are the People Backward? Algerian Symbolic Analysts and the Culture of the Masses

    Directory of Open Access Journals (Sweden)

    Thomas Serres

    2017-01-01

    Full Text Available This article studies representations of the Algerian population promoted by francophone intellectuals in a context of longstanding crisis and uncertainty. Borrowing the category of symbolic analysts from Robert Reich, it looks at the way in which novelists, scholars and journalists try to make sense of a critical situation by diagnosing the culture of the Algerian population as deviant or backward. Aiming to encourage social and political reform, these actors try to understand the characteristics of their "people", often by pointing to their so-called pre-modern or passive behaviors. This article analyzes two aspects of this activity: first, attempts to determine who is responsible for the ongoing crisis, and second, the reproduction of cultural prejudices in a context of increased transnationalization. Moreover, it argues that one can interpret the political and intellectual commitments of these analysts by drawing on the triad concept of "Naming, Blaming, Claiming", which has been used to study the publicization of disputes.

  2. Intermediate systems analyst (m/f) | CRDI - Centre de ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The intermediate systems analyst works under the supervision of the ... the implementation and maintenance of a wide range of systems while ensuring ... directly the external consultants under contract in order to facilitate the transition ...

  3. School-Wide PBIS: Extending the Impact of Applied Behavior Analysis. Why is This Important to Behavior Analysts?

    Science.gov (United States)

    Putnam, Robert F; Kincaid, Donald

    2015-05-01

    Horner and Sugai (2015) recently wrote a manuscript providing an overview of school-wide positive behavioral interventions and supports (PBIS) and why it is an example of applied behavior analysis at the scale of social importance. This paper will describe why school-wide PBIS is important to behavior analysts, how it helps promote applied behavior analysis in schools and other organizations, and how behavior analysts can use this framework to assist them in the promotion and implementation of applied behavior analysis both at the school and organizational level and at the classroom and individual level.

  4. Could the outcome of the 2016 US elections have been predicted from past voting patterns?

    CSIR Research Space (South Africa)

    Schmitz, Peter MU

    2017-07-01

    Full Text Available In South Africa, a team of analysts has for some years been using statistical techniques to predict election outcomes on election nights. The prediction method involves using statistical clusters based on past voting patterns...
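
    The cluster-based idea in the abstract can be illustrated with a toy model: districts are grouped by past voting pattern, and early declared results within each cluster stand in for that cluster's outstanding districts. All numbers are invented, and the simple bucketing below is a crude stand-in for the team's actual statistical clustering:

    ```python
    import numpy as np

    # Rows: districts; columns: share for party A in the last two elections.
    past = np.array([
        [0.62, 0.60], [0.58, 0.61], [0.31, 0.29],
        [0.28, 0.33], [0.45, 0.47], [0.48, 0.44],
    ])
    # Crude clustering: bucket districts by mean historical share for party A.
    cluster = np.digitize(past.mean(axis=1), bins=[0.4, 0.55])  # 0=low, 1=mid, 2=high

    # Early declared results, one per cluster so far (hypothetical).
    declared = {0: 0.30, 2: 0.63}

    predicted = []
    for c, hist_share in zip(cluster, past.mean(axis=1)):
        # Use the declared result from the district's cluster when available,
        # otherwise fall back to the district's own historical mean.
        predicted.append(declared.get(int(c), hist_share))
    print([round(p, 2) for p in predicted])   # → [0.63, 0.63, 0.3, 0.3, 0.46, 0.46]
    ```

    A production system would instead weight all declared districts in a cluster and update the estimates continuously as results arrive.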

  5. The Role of Teamwork in the Analysis of Big Data: A Study of Visual Analytics and Box Office Prediction.

    Science.gov (United States)

    Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy

    2017-03-01

    Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.

  6. Emotional Detachment of Partners and the Sanctity of the Relationship with the Analyst as the Most Powerful Curative Factor.

    Science.gov (United States)

    Gostečnik, Christian; Slavič, Tanja Repič; Lukek, Saša Poljak; Pate, Tanja; Cvetek, Robert

    2017-08-01

    The relationship between partners and the analyst is considered the most basic means of healing in contemporary psychoanalytic theories and analyses. It is also one of the most fundamental phenomena of psychoanalysis, so it comes as no surprise that it has always been an object of great interest as well as immense controversy. This same relationship, mutually co-created by the analyst and each individual and partner in analysis, also represents the core of sanctity and sacred space in contemporary psychoanalysis.

  7. The Effect of Ownership Structure and Investor Protection on Firm Value: Analyst Following as Moderating Variable

    Directory of Open Access Journals (Sweden)

    Desi Susilawati

    2017-12-01

    Full Text Available Research on the association between ownership structure and firm value is part of the corporate governance literature, which still reports contradictory conclusions and mixed results. This indicates an open question that needs empirical evidence. Because the influence of concentrated ownership on firm value still involves conflicts of interest, the role of analyst following can be regarded as an alternative corporate governance mechanism (Lang et al., 2004). The objectives of this research are to examine the interaction effect between concentrated ownership and analyst following, and the effect of investor protection, on firm value in five Asian countries. Asia is chosen for its unique characteristics: corporate ownership structures are more concentrated in families and boards of governance are weak (Choi, 2003). The data consist of 7,100 firm-year observations obtained from the Bloomberg and OSIRIS databases for the period 2011-2013 in five Asian countries: China, South Korea, Malaysia, Taiwan, and Thailand. Multiple regression analysis is used to test the hypotheses. The results show that concentrated ownership positively affects firm value. However, there is no empirical evidence that the interaction of concentrated ownership and analyst following positively affects firm value. As hypothesized, this research also shows that investor protection has a negative impact on firm value.

  8. Connecting Hazard Analysts and Risk Managers to Sensor Information.

    Science.gov (United States)

    Le Cozannet, Gonéri; Hosford, Steven; Douglas, John; Serrano, Jean-Jacques; Coraboeuf, Damien; Comte, Jérémie

    2008-06-11

    Hazard analysts and risk managers of natural perils, such as earthquakes, landslides and floods, need to access information from sensor networks surveying their regions of interest. However, information about these networks is currently difficult to obtain and is available in varying formats, thereby restricting access and possibly leading to decision-making based on limited information. In response to this issue, state-of-the-art interoperable catalogues are currently being developed within the framework of the Group on Earth Observations (GEO) workplan. This article provides an overview of the prototype catalogue that was developed to improve access to information about the sensor networks surveying geological hazards (geohazards), such as earthquakes, landslides and volcanoes.

  9. Proactive Spatiotemporal Resource Allocation and Predictive Visual Analytics for Community Policing and Law Enforcement.

    Science.gov (United States)

    Malik, Abish; Maciejewski, Ross; Towers, Sherry; McCullough, Sean; Ebert, David S

    2014-12-01

    In this paper, we present a visual analytics approach that provides decision makers with a proactive and predictive environment in order to assist them in making effective resource allocation and deployment decisions. The challenges involved with such predictive analytics processes include end-users' understanding, and the application of the underlying statistical algorithms at the right spatiotemporal granularity levels so that good prediction estimates can be established. In our approach, we provide analysts with a suite of natural scale templates and methods that enable them to focus and drill down to appropriate geospatial and temporal resolution levels. Our forecasting technique is based on the Seasonal Trend decomposition based on Loess (STL) method, which we apply in a spatiotemporal visual analytics context to provide analysts with predicted levels of future activity. We also present a novel kernel density estimation technique we have developed, in which the prediction process is influenced by the spatial correlation of recent incidents at nearby locations. We demonstrate our techniques by applying our methodology to Criminal, Traffic and Civil (CTC) incident datasets.

  10. It's the People, Stupid: The Role of Personality and Situational Variables in Predicting Decisionmaker Behavior

    National Research Council Canada - National Science Library

    Sticha, Paul J; Buede, Dennis M; Rees, Richard L

    2006-01-01

    .... The analyst builds Bayesian networks that integrate situational information with the Subject's personality and culture to provide a probabilistic prediction of the hypothesized actions a Subject might choose...

  11. Dividend Policy: A Survey of Malaysian Public Listed Companies and Security Analysts

    OpenAIRE

    Chong, David Voon Chee

    2003-01-01

    This dissertation revisits the dividend puzzle and attempts to answer the fundamental question of “why do companies pay dividends?” by using a simple regression model incorporating the major theories on dividend relevance. In addition, this research investigates if the opinions of companies (both dividend and non-dividend paying) and security analysts differ with respect to the various explanations for paying dividends. Finally, this research also explores the views and opinions of corporate ...

  12. Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool

    Science.gov (United States)

    Carpenter, Donald A.; Snyder, Johnny; Slauson, Gayla Jo; Bridge, Morgan K.

    2011-01-01

    This paper presents a statistical analysis to support the notion that the Information Systems Analyst (ISA) exam can be used as a program assessment tool in addition to measuring student performance. It compares ISA exam scores earned by students in one particular Computer Information Systems program with scores earned by the same students on the…

  13. The role of analytic neutrality in the use of the child analyst as a new object.

    Science.gov (United States)

    Chused, J F

    1982-01-01

    The analyses of two children and one adolescent were presented to illustrate the concept that the neutrality of the analyst can be used not only to (a) establish a working, analyzing, and observing alliance, (b) permit the development, recognition, and working through of the transference neurosis, but also to (c) develop a sense of autonomy and self-esteem which had been contaminated by the neediness and lack of true empathy of the primary objects during the practicing and rapprochement phases of separation-individuation. For the patients discussed above, many ego functions which should have had a degree of secondary autonomy were either inhibited, enmeshed in conflict, or experienced as nongenuine, part of a "false self." It was as if the experience with the neutral analyst permitted an "autonomous practicing" that had not been possible during the period of separation-individuation.

  14. Automatic theory generation from analyst text files using coherence networks

    Science.gov (United States)

    Shaffer, Steven C.

    2014-05-01

    This paper describes a three-phase process of extracting knowledge from analyst textual reports. Phase 1 involves performing natural language processing on the source text to extract subject-predicate-object triples. In phase 2, these triples are fed into a coherence network analysis process, using a genetic algorithm optimization. Finally, the highest-value subnetworks are processed into a semantic network graph for display. Initial work on a well-known data set (a Wikipedia article on Abraham Lincoln) has shown excellent results without any specific tuning. Next, we ran the process on the SYNthetic Counter-INsurgency (SYNCOIN) data set, developed at Penn State, yielding interesting and potentially useful results.
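
    The end-to-end idea (triples in, semantic network out) can be sketched in a few lines; the triples below are hypothetical examples of phase-1 output, and the genetic-algorithm coherence scoring of phase 2 is omitted:

```python
# Hypothetical subject-predicate-object triples, as a phase-1 NLP pass
# over the Lincoln article might yield them
triples = [
    ("Lincoln", "was", "president"),
    ("Lincoln", "issued", "Emancipation Proclamation"),
    ("Emancipation Proclamation", "freed", "slaves"),
]

# Phase-3 sketch: fold retained triples into a semantic-network
# adjacency map, keeping each predicate as an edge label
graph = {}
for subj, pred, obj in triples:
    graph.setdefault(subj, []).append((pred, obj))
```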

  15. A Graph is Worth a Thousand Words: How Overconfidence and Graphical Disclosure of Numerical Information Influence Financial Analysts Accuracy on Decision Making.

    Directory of Open Access Journals (Sweden)

    Ricardo Lopes Cardoso

    Full Text Available Previous research supports that graphs are relevant decision aids for tasks related to the interpretation of numerical information. Moreover, the literature shows that different types of graphical information can help or harm the accuracy of decision making by accountants and financial analysts. We conducted a 4×2 mixed-design experiment to examine the effects of numerical information disclosure on financial analysts' accuracy, and investigated the role of overconfidence in decision making. Results show that, compared to text, column graphs enhanced accuracy in decision making, followed by line graphs. No difference was found between table and textual disclosure. Overconfidence harmed accuracy, and both genders behaved overconfidently. Additionally, the type of disclosure (text, table, line graph, and column graph) did not affect the overconfidence of individuals, providing evidence that overconfidence is a personal trait. This study makes three contributions. First, it provides evidence from a larger sample (295 financial analysts, rather than a smaller sample of students) that graphs are relevant decision aids for tasks related to the interpretation of numerical information. Second, it uses text as a baseline comparison to test how different ways of disclosing information (line and column graphs, and tables) can enhance the understandability of information. Third, it brings an internal factor to this process: overconfidence, a personal trait that harms the decision-making process of individuals. At the end of this paper several research paths are highlighted for further study of the effect of internal factors (personal traits) on financial analysts' accuracy in decision making regarding numerical information presented in graphical form. In addition, we offer suggestions concerning some practical implications for professional accountants, auditors, financial analysts and standard setters.

  16. A Graph is Worth a Thousand Words: How Overconfidence and Graphical Disclosure of Numerical Information Influence Financial Analysts Accuracy on Decision Making.

    Science.gov (United States)

    Cardoso, Ricardo Lopes; Leite, Rodrigo Oliveira; de Aquino, André Carlos Busanelli

    2016-01-01

    Previous research supports that graphs are relevant decision aids for tasks related to the interpretation of numerical information. Moreover, the literature shows that different types of graphical information can help or harm the accuracy of decision making by accountants and financial analysts. We conducted a 4×2 mixed-design experiment to examine the effects of numerical information disclosure on financial analysts' accuracy, and investigated the role of overconfidence in decision making. Results show that, compared to text, column graphs enhanced accuracy in decision making, followed by line graphs. No difference was found between table and textual disclosure. Overconfidence harmed accuracy, and both genders behaved overconfidently. Additionally, the type of disclosure (text, table, line graph and column graph) did not affect the overconfidence of individuals, providing evidence that overconfidence is a personal trait. This study makes three contributions. First, it provides evidence from a larger sample (295 financial analysts, rather than a smaller sample of students) that graphs are relevant decision aids for tasks related to the interpretation of numerical information. Second, it uses text as a baseline comparison to test how different ways of disclosing information (line and column graphs, and tables) can enhance the understandability of information. Third, it brings an internal factor to this process: overconfidence, a personal trait that harms the decision-making process of individuals. At the end of this paper several research paths are highlighted for further study of the effect of internal factors (personal traits) on financial analysts' accuracy in decision making regarding numerical information presented in graphical form. In addition, we offer suggestions concerning some practical implications for professional accountants, auditors, financial analysts and standard setters.

  17. Evolution of Research on Interventions for Individuals with Autism Spectrum Disorder: Implications for Behavior Analysts

    Science.gov (United States)

    Smith, Tristram

    2012-01-01

    The extraordinary success of behavior-analytic interventions for individuals with autism spectrum disorder (ASD) has fueled the rapid growth of behavior analysis as a profession. One reason for this success is that for many years behavior analysts were virtually alone in conducting programmatic ASD intervention research. However, that era has…

  18. Prerequisites for Systems Analysts: Analytic and Management Demands of a New Approach to Educational Administration.

    Science.gov (United States)

    Ammentorp, William

    There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…

  19. THE MISSING FATHER FUNCTION IN PSYCHOANALYTIC THEORY AND TECHNIQUE: THE ANALYST'S INTERNAL COUPLE AND MATURING INTIMACY.

    Science.gov (United States)

    Diamond, Michael J

    2017-10-01

    This paper argues that recovering the "missing" paternal function in analytic space is essential for the patient's achievement of mature object relations. Emerging from the helpless infant's contact with primary caregivers, mature intimacy rests on establishing healthy triadic functioning based on an infant-with-mother-and-father. Despite a maternocentric bias in contemporary clinical theory, the emergence of triangularity and the inclusion of the paternal third as a separating element is vital in the analytic dyad. Effective technique requires the analyst's balanced interplay between the paternal (investigative) and the maternal (maximally receptive) modes of functioning, the good enough analytic couple within the analyst, to serve as the separating element that procreatively fertilizes the capacity for intimacy with a differentiated other. A clinical example illustrates how treatment is limited when the paternal function is minimized within more collusive, unconsciously symbiotic dyads. © 2017 The Psychoanalytic Quarterly, Inc.

  20. A Few Insights Into Romanian Information Systems Analysts and Designers Toolbox

    Directory of Open Access Journals (Sweden)

    Fotache Marin

    2017-06-01

    Full Text Available Information Systems (IS) analysts and designers have been key members of software development teams. From waterfall to the Rational Unified Process, from UML to agile development, IS modelers have faced many trends and buzzwords. Even though the topic of models and modeling tools in software development is important, there are few detailed studies identifying why developers, customers, and managers decide to use particular modeling methods and tools. Despite the popularity of the subject, studies showing which tools IS modelers prefer are scarce, and virtually nonexistent for the Romanian market. As Romania is an important IT outsourcing market, this paper investigated which methods and tools Romanian IS analysts and designers apply. In this context, the starting question of our research focuses on developers' preference between agile and non-agile methods in IT projects. As a result, the research questions targeted the main drivers in choosing specific methods and tools for IT projects deployed in Romanian companies. One of the main objectives of this paper was also to examine the relationship between the methodologies (agile or non-agile), diagrams, and other tools (we refer in our study to CASE features) and other variables/metrics of the system/software development project. The observational study was conducted based on a survey filled in by IS modelers in Romanian IT companies. The data collected were processed and analyzed using exploratory data analysis. The platform for data visualization and analysis was R.

  1. Conserving analyst attention units: use of multi-agent software and CEP methods to assist information analysis

    Science.gov (United States)

    Rimland, Jeffrey; McNeese, Michael; Hall, David

    2013-05-01

    Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state of the art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements, based both on a priori knowledge of the analyst's goals and on the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling collaborative context-aware reasoning in both human teams and hybrid human/software-agent teams.

  2. Analysts’ forecast error: A robust prediction model and its short term trading profitability

    NARCIS (Netherlands)

    Boudt, K.M.R.; de Goei, P.; Thewissen, J.; van Campenhout, G.

    2015-01-01

    This paper contributes to the empirical evidence on the investment horizon salient to trading based on predicting the error in analysts' earnings forecasts. An econometric framework is proposed that accommodates the stylized fact of extreme values in the forecast error series. We find that between

  3. A feasibility study for Arizona's roadway safety management process using the Highway Safety Manual and SafetyAnalyst : final report.

    Science.gov (United States)

    2016-07-01

    To enable implementation of the American Association of State Highway Transportation (AASHTO) Highway Safety Manual using : SaftetyAnalyst (an AASHTOWare software product), the Arizona Department of Transportation (ADOT) studied the data assessment :...

  4. The Insider Threat to Cybersecurity: How Group Process and Ignorance Affect Analyst Accuracy and Promptitude

    Science.gov (United States)

    2017-09-01


  5. Massive Predictive Modeling using Oracle R Enterprise

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    R is fast becoming the lingua franca for analyzing data via statistics, visualization, and predictive analytics. For enterprise-scale data, R users have three main concerns: scalability, performance, and production deployment. Oracle's R-based technologies - Oracle R Distribution, Oracle R Enterprise, Oracle R Connector for Hadoop, and the R package ROracle - address these concerns. In this talk, we introduce Oracle's R technologies, highlighting how each enables R users to achieve scalability and performance while making production deployment of R results a natural outcome of the data analyst/scientist efforts. The focus then turns to Oracle R Enterprise with code examples using the transparency layer and embedded R execution, targeting massive predictive modeling. One goal behind massive predictive modeling is to build models per entity, such as customers, zip codes, simulations, in an effort to understand behavior and tailor predictions at the entity level. Predictions...
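
    The "model per entity" pattern described above can be sketched in Python (the talk itself demonstrates it with Oracle R Enterprise); the customers, spend trends, and linear model below are invented for illustration:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Synthetic transactions: each customer has its own spend trend
rng = np.random.default_rng(1)
rows = []
for cust in range(5):
    slope = rng.uniform(0.5, 2.0)
    for week in range(20):
        rows.append({"customer": cust, "week": week,
                     "spend": 10 + slope * week + rng.normal(0, 0.1)})
df = pd.DataFrame(rows)

# Fit one independent model per entity, then predict at the entity level
models = {cust: LinearRegression().fit(grp[["week"]], grp["spend"])
          for cust, grp in df.groupby("customer")}
next_week = models[0].predict(pd.DataFrame({"week": [20]}))[0]
```

    At enterprise scale the per-entity fits are pushed into the database and parallelized, which is the point of the embedded-execution approach the talk covers.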

  6. BETWEEN PSYCHOANALYSIS AND TESTIMONIAL SPACE: THE ANALYST AS A WITNESS.

    Science.gov (United States)

    Gondar, Jô

    2017-04-01

    The aim of this article is to think of the place of the witness as a third place that the analyst, in the clinical space of trauma, is able to sustain. According to Ferenczi, in traumatic dreams a third is already being summoned. It is not the witness of the realm of law, nor the place of the father or the symbolic law. This is a third space that can be called potential, interstitial space, indeterminate and formless, where something that at first would be incommunicable circulates and gradually takes shape. This space allows and supports the literalness of a testimonial narrative, its hesitations, paradoxes and silences. More than a trauma theory, the notion of a potential space would be the great contribution of psychoanalysis to the treatment of trauma survivors, establishing the difference between the task of a psychoanalyst and the one of a truth commission.

  7. LES PREVISIONS DES ANALYSTES FINANCIERS ET LES INCORPORELS : LES IAS/IFRS APPORTENT-ELLES UNE AMELIORATION ?

    OpenAIRE

    Lenormand , Gaëlle; Touchais , Lionel

    2017-01-01

    Due to identification and measurement difficulties, the accounting system does not always adequately take intangibles into account. With IFRS, there are new accounting rules for these items. The article aims to analyze whether these changes convey more useful information for intangible assets, with an improvement in analysts' earnings forecasts. To test this question, we use a sample of 209 firms listed on Euronext over 9 years, with national GAAP from 200...

  8. Blurred Lines: Ethical Implications of Social Media for Behavior Analysts.

    Science.gov (United States)

    O'Leary, Patrick N; Miller, Megan M; Olive, Melissa L; Kelly, Amanda N

    2017-03-01

    Social networking has a long list of advantages: it enables access to a large group of people that would otherwise not be geographically convenient or possible to connect with; it reaches several different generations, particularly younger ones, which are not typically involved in discussion of current events; and these sites allow a cost-effective, immediate, and interactive way to engage with others. Given the vast number of individuals who use social media sites as a way to connect with others, it may not be possible to completely abstain from discussions and interactions on social media that pertain to our professional practice. This is all the more reason for behavior analysts to attend to the contingencies specific to these tools. This paper discusses potential ethical situations that may arise and offers a review of the Behavior Analyst Certification Board (BACB) guidelines pertaining to social networking, as well as suggestions for avoiding or resolving potential violations relating to online social behavior.

  9. Clearing and settlement of interbank card transactions: a MasterCard tutorial for Federal Reserve payments analysts

    OpenAIRE

    Susan Herbst-Murphy

    2013-01-01

    The Payment Cards Center organized a meeting at which senior officials from MasterCard shared information with Federal Reserve System payments analysts about the clearing and settlement functions that MasterCard performs for its client banks. These functions involve the transfer of information pertaining to card-based transactions (clearing) and the exchange of monetary value (settlement) that takes place between the banks whose customers are cardholders and those banks whose customers are ca...

  10. The Stewardship Role of Analyst Forecasts, and Discretionary Versus Non-Discretionary Accruals

    DEFF Research Database (Denmark)

    Christensen, Peter Ove; Frimor, Hans; Sabac, Florin

    We examine the interaction between discretionary and non-discretionary accruals in a stewardship setting. Contracting includes multiple rounds of renegotiation based on contractible accounting information and non-contractible but more timely non-accounting information. We show that accounting...... timely non-accounting information (analyst earnings forecasts) increases the ex ante value of the firm and reduces costly earnings management. There is an optimal level of reversible non-discretionary accrual noise introduced through revenue recognition policies. Tight rules-based accounting regulation...... regulation aimed at increasing earnings quality from a valuation perspective (earnings persistence) may have a significant impact on how firms rationally respond in terms of allowing accrual discretion in order to alleviate the impact on the stewardship role of earnings. Increasing the precision of more...

  11. The Stewardship Role of Analyst Forecasts, and Discretionary Versus Non-Discretionary Accruals

    DEFF Research Database (Denmark)

    Christensen, Peter Ove; Frimor, Hans; Sabac, Florin

    2013-01-01

    We examine the interaction between discretionary and non-discretionary accruals in a stewardship setting. Contracting includes multiple rounds of renegotiation based on contractible accounting information and non-contractible but more timely non-accounting information. We show that accounting...... timely non-accounting information (analyst earnings forecasts) increases the ex ante value of the firm and reduces costly earnings management. There is an optimal level of reversible non-discretionary accrual noise introduced through revenue recognition policies. Tight rules-based accounting regulation...... regulation aimed at increasing earnings quality from a valuation perspective (earnings persistence) may have a significant impact on how firms rationally respond in terms of allowing accrual discretion in order to alleviate the impact on the stewardship role of earnings. Increasing the precision of more...

  12. A Comparative Study of Automated Infrasound Detectors - PMCC and AFD with Analyst Review.

    Energy Technology Data Exchange (ETDEWEB)

    Park, Junghyun; Hayward, Chris; Zeiler, Cleat; Arrowsmith, Stephen John; Stump, Brian

    2015-08-01

    Automated detections calculated by the progressive multi-channel correlation (PMCC) method (Cansi, 1995) and the adaptive F detector (AFD) (Arrowsmith et al., 2009) are compared to the signals identified by five independent analysts. Each detector was applied to a four-hour time sequence recorded by the Korean infrasound array CHNAR. This array was used because it is composed of both small (<100 m) and large (~1000 m) aperture element spacings. The four-hour time sequence contained a number of easily identified signals under noise conditions whose average RMS amplitudes varied from 1.2 to 4.5 mPa (1 to 5 Hz), estimated with a running five-minute window. The effectiveness of the detectors was estimated for the small aperture, the large aperture, the small aperture combined with the large aperture, and the full array. The full and combined arrays performed best for AFD under all noise conditions, while the large aperture array had the poorest performance for both detectors. PMCC produced results similar to AFD under the lower noise conditions, but did not produce as dramatic an increase in detections using the full and combined arrays. Both the automated detectors and the analysts produced fewer detections under the higher noise conditions. Comparing the detection probabilities with Estimated Receiver Operating Characteristic (EROC) curves, we found that a smaller value of consistency for PMCC and a larger p-value for AFD gave the highest detection probability. These parameters produced greater changes in detection probability than estimates of the false alarm rate. Detection probability was impacted most by noise level, with low noise (average RMS amplitude of 1.7 mPa) having an average detection probability of ~40% and high noise (average RMS amplitude of 2.9 mPa) an average detection probability of ~23%.
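
    The core of an array F detector can be sketched as a beam-power-to-residual-power ratio across channels; this minimal version omits AFD's adaptive noise model and PMCC's correlation logic, and the array size, noise, and arrival below are invented:

```python
import numpy as np

def array_f_stat(x):
    """x: (channels, samples) window of time-aligned array data.
    Returns beam power relative to average residual power; the
    statistic is near 1 under incoherent noise and grows when a
    coherent arrival is present on all channels."""
    m, n = x.shape
    beam = x.mean(axis=0)                        # delay-and-sum beam
    beam_power = np.sum(beam ** 2) / n
    resid_power = np.sum((x - beam) ** 2) / (n * (m - 1))
    return m * beam_power / resid_power

rng = np.random.default_rng(3)
m, n = 4, 500
noise_win = rng.normal(0.0, 1.0, (m, n))         # incoherent noise only
arrival = np.sin(2 * np.pi * 5 * np.arange(n) / 100)
signal_win = noise_win + arrival                 # coherent arrival added

f_noise = array_f_stat(noise_win)
f_signal = array_f_stat(signal_win)
```

    A detection is declared when the statistic exceeds a threshold chosen from the F distribution for the desired false alarm rate.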

  13. Analyst Tools and Quality Control Software for the ARM Data System

    Energy Technology Data Exchange (ETDEWEB)

    Moore, S.T.

    2004-12-14

    ATK Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed a web-based data analysis and visualization tool, called NCVweb, that allows for easy viewing of ARM NetCDF files. NCVweb, along with our library of sharable Interactive Data Language procedures and functions, allows even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics and new diagnostic plots, and integrating this information into DQ HandS, the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software that runs in an automated fashion to flag these outliers.
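
    One generic automated outlier check of the kind described, flagging samples far from the series median in robust (MAD-based) standard deviations, can be sketched as follows; this is an illustrative method, not necessarily the Data Quality Office's actual algorithm:

```python
import numpy as np

def flag_outliers(values, z_thresh=5.0):
    """Flag samples whose deviation from the median exceeds z_thresh
    robust standard deviations, using the median absolute deviation
    (MAD) so the threshold itself is not skewed by the outliers."""
    resid = values - np.median(values)
    mad = np.median(np.abs(resid))
    return np.abs(resid) > z_thresh * 1.4826 * mad

data = np.sin(np.linspace(0, 10, 200))   # well-behaved instrument signal
data[50] += 10.0                         # injected spike
mask = flag_outliers(data)               # True only at the spike
```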

  14. Is Student Performance on the Information Systems Analyst Certification Exam Affected by Form of Delivery of Information Systems Coursework?

    Science.gov (United States)

    Haga, Wayne; Moreno, Abel; Segall, Mark

    2012-01-01

    In this paper, we compare the performance of Computer Information Systems (CIS) majors on the Information Systems Analyst (ISA) Certification Exam. The impact that the form of delivery of information systems coursework may have on the exam score is studied. Using a sample that spans three years, we test for significant differences between scores…

  15. Cyber Situation Awareness through Instance-Based Learning: Modeling the Security Analyst in a Cyber-Attack Scenario

    Science.gov (United States)

    2012-01-01

    ...program and obtain control of the machine (event 21st out of 25). During the course of this simple scenario, a security analyst is able to observe...

  16. Energy Predictions 2011

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-10-15

    Even as the recession begins to subside, the energy sector is still likely to experience challenging conditions as we enter 2011. It should be remembered how important a role energy plays in driving the global economy. Oil's price range serves as a simple yet global and unified measure of economic recovery, and the strength and sustainability of the recovery will impact the ways in which all forms of energy are produced and consumed. The report aims for a closer insight into these predictions: What will happen with M and A (mergers and acquisitions) in the energy industry? What are the prospects for renewables? Will the water-energy nexus grow in importance? How will technological leaps and bounds affect E and P (exploration and production) operations? What about electric cars? This is the second year Deloitte's Global Energy and Resources Group has published its predictions for the year ahead. The report is based on in-depth interviews with clients, industry analysts, and senior energy practitioners from Deloitte member firms around the world.

  17. Energy Predictions 2011

    International Nuclear Information System (INIS)

    2010-10-01

    Even as the recession begins to subside, the energy sector is still likely to experience challenging conditions as we enter 2011. It should be remembered how important a role energy plays in driving the global economy. Oil's price range serves as a simple yet global and unified measure of economic recovery, and the strength and sustainability of the recovery will impact the ways in which all forms of energy are produced and consumed. The report aims for a closer insight into these predictions: What will happen with M and A (mergers and acquisitions) in the energy industry? What are the prospects for renewables? Will the water-energy nexus grow in importance? How will technological leaps and bounds affect E and P (exploration and production) operations? What about electric cars? This is the second year Deloitte's Global Energy and Resources Group has published its predictions for the year ahead. The report is based on in-depth interviews with clients, industry analysts, and senior energy practitioners from Deloitte member firms around the world.

  18. Using Machine Learning to Predict MCNP Bias

    Energy Technology Data Exchange (ETDEWEB)

    Grechanuk, Pavel Aleksandrovi [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-09

    For many real-world applications in radiation transport where simulations are compared to experimental measurements, as in nuclear criticality safety, the bias (simulated minus experimental keff) in the calculation is an extremely important quantity used for code validation. The objective of this project is to accurately predict the bias of MCNP6 [1] criticality calculations using machine learning (ML) algorithms, with the intention of creating a tool that can complement current nuclear criticality safety methods. In the latest release of MCNP6, the Whisper tool is available to criticality safety analysts and includes a large catalogue of experimental benchmarks, sensitivity profiles, and nuclear data covariance matrices. These data, drawn from more than 1100 benchmark cases, are used in this study of ML algorithms for criticality safety bias prediction.
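
    The workflow described above (learn a mapping from benchmark features to keff bias, then check it on held-out benchmarks) can be sketched as follows. The features, the synthetic bias values, and the choice of gradient-boosted trees are illustrative assumptions, not details taken from the study, whose actual inputs come from Whisper's sensitivity profiles and covariance data.

```python
# Illustrative sketch: learn benchmark-features -> keff bias, evaluate on held-out cases.
# Features, bias values, and model choice are synthetic stand-ins, not the study's data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_benchmarks, n_features = 1100, 20                # ~1100 benchmark cases, as in the report
X = rng.normal(size=(n_benchmarks, n_features))    # stand-ins for sensitivity-profile summaries
coef = rng.normal(scale=1e-3, size=n_features)     # hypothetical feature-to-bias relationship
y = X @ coef + rng.normal(scale=1e-4, size=n_benchmarks)  # synthetic bias (keff units)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
mae = mean_absolute_error(y_te, model.predict(X_te))
print(f"held-out mean absolute bias error: {mae:.2e} (target spread {np.std(y):.2e})")
```

    A model of this kind is useful only if its held-out error is well below the spread of the biases themselves, which is the comparison the last line makes.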

  19. Data analyst technician: an innovative role for the pharmacy technician.

    Science.gov (United States)

    Ervin, K C; Skledar, S; Hess, M M; Ryan, M

    2001-10-01

    The development of an innovative role for the pharmacy technician is described. The role was based on a needs assessment and the expertise of the pharmacy technician selected. Initial responsibilities of the technician included chart reviews, benchmarking surveys, monthly financial impact analysis, initiative assessment, and quality improvement reporting. As the drug-use and disease-state management (DUDSM) program expanded, pharmacist activities increased, requiring the expansion of data analyst technician (DAT) duties. These new responsibilities included participation in patient assessment, data collection and interpretation, and formulary enforcement. Most recently, the technicians' expanded duties include maintenance of a physician compliance profiling database, quality improvement reporting and graphing, an active role in patient risk assessment and database management for adult vaccination, and support of financial impact monitoring for other institutions within the health system. This pharmacist-technician collaboration resulted in a threefold increase in patient assessments completed per day. In addition, as the DUDSM program continues to expand across the health system, an increase in DAT resources from 0.5 to 1.0 full-time equivalent was obtained. The role of the DAT has increased the efficiency of the DUDSM program and has provided an innovative role for the pharmacy technician.

  20. The Value of Different Customer Satisfaction and Loyalty Metrics in Predicting Business Performance

    OpenAIRE

    Neil A. Morgan; Lopo Leotte Rego

    2006-01-01

    Managers commonly use customer feedback data to set goals and monitor performance on metrics such as “Top 2 Box” customer satisfaction scores and “intention-to-repurchase” loyalty scores. However, analysts have advocated a number of different customer feedback metrics including average customer satisfaction scores and the number of “net promoters” among a firm's customers. We empirically examine which commonly used and widely advocated customer feedback metrics are most valuable in predicting...

  1. Interfacing a biosurveillance portal and an international network of institutional analysts to detect biological threats.

    Science.gov (United States)

    Riccardo, Flavia; Shigematsu, Mika; Chow, Catherine; McKnight, C Jason; Linge, Jens; Doherty, Brian; Dente, Maria Grazia; Declich, Silvia; Barker, Mike; Barboza, Philippe; Vaillant, Laetitia; Donachie, Alastair; Mawudeku, Abla; Blench, Michael; Arthur, Ray

    2014-01-01

    The Early Alerting and Reporting (EAR) project, launched in 2008, is aimed at improving global early alerting and risk assessment and evaluating the feasibility and opportunity of integrating the analysis of biological, chemical, radionuclear (CBRN), and pandemic influenza threats. At a time when no international collaborations existed in the field of event-based surveillance, EAR's innovative approach involved both epidemic intelligence experts and internet-based biosurveillance system providers in the framework of an international collaboration called the Global Health Security Initiative, which involved the ministries of health of the G7 countries and Mexico, the World Health Organization, and the European Commission. The EAR project pooled data from 7 major internet-based biosurveillance systems onto a common portal that was progressively optimized for biological threat detection under the guidance of epidemic intelligence experts from public health institutions in Canada, the European Centre for Disease Prevention and Control, France, Germany, Italy, Japan, the United Kingdom, and the United States. The group became the first end users of the EAR portal, constituting a network of analysts working with a common standard operating procedure and risk assessment tools on a rotation basis to constantly screen and assess public information on the web for events that could suggest an intentional release of biological agents. Following the first 2-year pilot phase, the EAR project was tested in its capacity to monitor biological threats, proving that its working model was feasible and demonstrating the high commitment of the countries and international institutions involved. During the testing period, analysts using the EAR platform did not miss intentional events of a biological nature and did not issue false alarms. Through the findings of this initial assessment, this article provides insights into how the field of epidemic intelligence can advance through an

  2. ON THE ANALYST'S IDENTIFICATION WITH THE PATIENT: THE CASE OF J.-B. PONTALIS AND G. PEREC.

    Science.gov (United States)

    Schwartz, Henry P

    2016-01-01

    The writer Georges Perec was in psychoanalysis with Jean-Bertrand Pontalis for four years in the early 1970s. In this essay, the author presents the exceptional interest this analyst took in this patient and the ways in which that interest manifested itself in his work, psychoanalytic and otherwise. Many correlative factors suggest that identificatory processes persisted beyond the treatment and were maintained into Pontalis's later life. While this paper is primarily intended to provide evidence to support this view of a specific case, the author closes by reflecting that this may be a more general phenomenon and the reasons for this. © 2016 The Psychoanalytic Quarterly, Inc.

  3. Counter-terrorism threat prediction architecture

    Science.gov (United States)

    Lehman, Lynn A.; Krause, Lee S.

    2004-09-01

    This paper evaluates the feasibility of constructing a system to support intelligence analysts engaged in counter-terrorism. It discusses the use of emerging techniques to evaluate a large-scale threat data repository (or Infosphere), comparing analyst-developed models to identify and discover potential threat-related activity, with an uncertainty metric used to evaluate the threat. The system will also employ psychological (or intent) modeling to incorporate combatant (i.e., terrorist) beliefs and intent. The paper explores the feasibility of constructing a hetero-hierarchical (a hierarchy of more than one kind or type, characterized by loose connection/feedback among elements of the hierarchy) agent-based framework, or "family of agents," to support "evidence retrieval," defined as combing, or searching, the threat data repository and returning information with an uncertainty metric. The counter-terrorism threat prediction architecture will be guided by a series of models constructed to represent threat operational objectives, potential targets, or terrorist objectives. The approach would compare model representations against information retrieved by the agent family to isolate or identify patterns that match within reasonable measures of proximity. The central areas of discussion are the construction of an agent framework to search the available threat-related information repository; the evaluation of results against models representing the cultural foundations, mindset, sociology, and emotional drive of typical threat combatants (i.e., the mind and objectives of a terrorist); and the development of evaluation techniques to compare result sets with the models representing threat behavior and threat targets. 
The applicability of concepts surrounding Modeling Field Theory (MFT) will be discussed as the basis of this research into development of proximity measures between the models and result sets and to provide feedback in support of model

  4. Behavior Analysts to the Front! A 15-Step Tutorial on Public Speaking.

    Science.gov (United States)

    Friman, Patrick C

    2014-10-01

    Mainstream prominence was Skinner's vision for behavior analysis. Unfortunately, it remains elusive, even as we approach the 110th anniversary of his birth. It can be achieved, however, and there are many routes. One that seems overlooked in many (most?) behavior analytic training programs is what I call the front of the room. The front of the room is a very powerful locus for influencing people. Mastering it can turn a commoner into a king; a middling man into a mayor; or a group of disorganized, dispirited people into an energized force marching into battle. The most powerful members of our species had their most memorable moments at the front of the room. If so much is available there, why is mastery of it in such short supply, not just in behavior analysts but in the population at large? In this paper, I address why, argue that the primary reason can be overcome, and supply 15 behaviorally based steps to take in pursuit of front of the room mastery.

  5. Teleconsultation in school settings: linking classroom teachers and behavior analysts through web-based technology.

    Science.gov (United States)

    Frieder, Jessica E; Peterson, Stephanie M; Woodward, Judy; Crane, Jaelee; Garner, Marlane

    2009-01-01

    This paper describes a technically driven, collaborative approach to assessing the function of problem behavior using web-based technology. A case example is provided to illustrate the process used in this pilot project. A school team conducted a functional analysis with a child who demonstrated challenging behaviors in a preschool setting. Behavior analysts at a university setting provided the school team with initial workshop trainings, on-site visits, e-mail and phone communication, as well as live web-based feedback on functional analysis sessions. The school personnel implemented the functional analysis with high fidelity and scored the data reliably. Outcomes of the project suggest that there is great potential for collaboration via the use of web-based technologies for ongoing assessment and development of effective interventions. However, an empirical evaluation of this model should be conducted before wide-scale adoption is recommended.

  6. Complexity Analysis of Industrial Organizations Based on a Perspective of Systems Engineering Analysts

    Directory of Open Access Journals (Sweden)

    I. H. Garbie

    2011-12-01

    Full Text Available Complexity in industrial organizations has become more difficult to analyze and resolve, and it demands greater attention from academics and practitioners. For these reasons, complexity in industrial organizations represents a new challenge for the coming decades. The analysis of industrial organization complexity remains a research topic of immense international interest, and such organizations require reductions in their complexity. In this paper, complexity in industrial organizations is analyzed from the perspective of a systems engineering analyst. In this perspective, the analysis is divided into different levels, defined as complexity levels. A framework for analyzing these levels is proposed, based on the complexity found in industrial organizations. The analysis addresses four main issues: industrial system vision, industrial system structure, industrial system operation, and industrial system evaluation. It shows that the complexity of industrial organizations remains an ill-structured, multi-dimensional problem.

  7. Training the next generation analyst using red cell analytics

    Science.gov (United States)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University has been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security, risk, and intelligence training.

  8. PSORTb 3.0: improved protein subcellular localization prediction with refined localization subcategories and predictive capabilities for all prokaryotes.

    Science.gov (United States)

    Yu, Nancy Y; Wagner, James R; Laird, Matthew R; Melli, Gabor; Rey, Sébastien; Lo, Raymond; Dao, Phuong; Sahinalp, S Cenk; Ester, Martin; Foster, Leonard J; Brinkman, Fiona S L

    2010-07-01

    PSORTb has remained the most precise bacterial protein subcellular localization (SCL) predictor since it was first made available in 2003. However, its recall needs to be improved, and no accurate SCL predictors yet make predictions for archaea or differentiate important localization subcategories, such as proteins targeted to a host cell or to bacterial hyperstructures/organelles. Such improvements should preferably be encompassed in a freely available web-based predictor that can also be used as a standalone program. We developed PSORTb version 3.0 with improved recall, higher proteome-scale prediction coverage, and new refined localization subcategories. It is the first SCL predictor specifically geared for all prokaryotes, including archaea and bacteria with atypical membrane/cell wall topologies. It features an improved standalone program, with a new batch results delivery system complementing its web interface. We evaluated the most accurate SCL predictors using 5-fold cross-validation and performed an independent proteomics analysis, showing that PSORTb 3.0 is the most accurate but can benefit from being complemented by Proteome Analyst predictions. http://www.psort.org/psortb (download open source software or use the web interface). psort-mail@sfu.ca Supplementary data are available at Bioinformatics online.

  9. Behavior analysts in the war on poverty: A review of the use of financial incentives to promote education and employment.

    Science.gov (United States)

    Holtyn, August F; Jarvis, Brantley P; Silverman, Kenneth

    2017-01-01

    Poverty is a pervasive risk factor underlying poor health. Many interventions that have sought to reduce health disparities associated with poverty have focused on improving health-related behaviors of low-income adults. Poverty itself could be targeted to improve health, but this approach would require programs that can consistently move poor individuals out of poverty. Governments and other organizations in the United States have tested a diverse range of antipoverty programs, generally on a large scale and in conjunction with welfare reform initiatives. This paper reviews antipoverty programs that used financial incentives to promote education and employment among welfare recipients and other low-income adults. The incentive-based, antipoverty programs had small or no effects on the target behaviors; they were implemented on large scales from the outset, without systematic development and evaluation of their components; and they did not apply principles of operant conditioning that have been shown to determine the effectiveness of incentive or reinforcement interventions. By applying basic principles of operant conditioning, behavior analysts could help address poverty and improve health through development of effective antipoverty programs. This paper describes a potential framework for a behavior-analytic antipoverty program, with the goal of illustrating that behavior analysts could be uniquely suited to make substantial contributions to the war on poverty. © 2017 Society for the Experimental Analysis of Behavior.

  10. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    International Nuclear Information System (INIS)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J.

    2000-09-01

    Digital I and C systems are widely used in the non-safety systems of NPPs and are expanding into safety-critical applications. The regulatory body is shifting its policy toward risk-informed regulation and may require probabilistic safety assessment for digital I and C systems. However, no established reliability prediction methodology yet covers digital I and C systems in both software and hardware. This survey report reviews a range of hardware reliability prediction methods for electronic systems; each method has both strong and weak points. The report provides the state of the art of prediction methods, focusing in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to help analysts. The report also surveys the state of the art of software tools that support reliability prediction.

  11. Methodologies of the hardware reliability prediction for PSA of digital I and C systems

    Energy Technology Data Exchange (ETDEWEB)

    Jung, H. S.; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Park, J

    2000-09-01

    Digital I and C systems are widely used in the non-safety systems of NPPs and are expanding into safety-critical applications. The regulatory body is shifting its policy toward risk-informed regulation and may require probabilistic safety assessment for digital I and C systems. However, no established reliability prediction methodology yet covers digital I and C systems in both software and hardware. This survey report reviews a range of hardware reliability prediction methods for electronic systems; each method has both strong and weak points. The report provides the state of the art of prediction methods, focusing in depth on the Bellcore and MIL-HDBK-217F methods. The reliability analysis models are reviewed and discussed to help analysts. The report also surveys the state of the art of software tools that support reliability prediction.
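
    As a rough illustration of the kind of calculation such handbooks support, the sketch below applies the generic "parts count" idea: the system failure rate is the sum of part base failure rates scaled by adjustment factors. All numeric rates and pi-factors are hypothetical placeholders, not values from the MIL-HDBK-217F or Bellcore tables.

```python
# Hypothetical "parts count" reliability sketch: system failure rate as the sum of part
# base failure rates scaled by a quality factor. Rates below are placeholders, not
# MIL-HDBK-217F or Bellcore table values.

def parts_count_failure_rate(parts):
    """parts: iterable of (lambda_base per 1e6 h, pi_quality, quantity)."""
    return sum(lam * pi_q * qty for lam, pi_q, qty in parts)

# A made-up digital I&C card bill of materials
card = [
    (0.10, 1.0, 1),     # processor
    (0.05, 1.0, 4),     # memory chips
    (0.002, 2.0, 120),  # passives at a lower quality grade
]
lam_total = parts_count_failure_rate(card)   # failures per 1e6 hours
mtbf_hours = 1e6 / lam_total
print(f"lambda = {lam_total:.2f} per 1e6 h, MTBF = {mtbf_hours:,.0f} h")
```

    The methods surveyed in the report differ mainly in how the base rates and pi-factors are derived, not in this summation structure.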

  12. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    Science.gov (United States)

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era plays an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selecting appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose using semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  13. Un estudio preliminar del fundamento pulsional de la "aptitud de analista" Preliminary study of the instinctual foundation of the "analyst's competence"

    Directory of Open Access Journals (Sweden)

    Osvaldo Delgado

    2007-12-01

    Full Text Available Este trabajo presenta algunas preguntas y desarrollos preliminares surgidos en un recorrido teórico realizado por los textos freudianos del término "aptitud". Se presentan las referencias a textos anteriores a 1920, aunque se privilegie, en función de los objetivos de la investigación en curso, el ordenamiento y relación de la "aptitud de analista" con los conceptos fundamentales de la segunda tópica freudiana. ¿Se puede elevar el término castellano aptitud y sus originales alemanes al estatuto de un concepto? Finalmente se planteará que la dimensión pulsional del término es lo que permite darle a la "aptitud de analista" un estatuto conceptual, ya que la aptitud como "tauglich" en el advenimiento de un nuevo analista implica una transmutación pulsional específica. La pregunta por cuál es la relación entre lo que porta el carácter y la recomposición de las alteraciones del yo en el período posterior al análisis quedará como orientación para otro trabajo. This work presents some questions and preliminary developments that arose during a theoretical examination of Freud's texts concerning the term "competence" ("aptitud"). References to texts earlier than 1920 are given; however, in keeping with the objectives of the ongoing investigation, the ordering and relation of the "analyst's competence" to the fundamental concepts of Freud's second topography are favored. Can the Spanish term "aptitud" and its German originals be raised to the status of a concept? The final consideration is that the instinctual dimension of the term is what permits giving the "analyst's competence" conceptual status, since "competence" as "tauglich" in the advent of a new analyst implies a specific instinctual transmutation. The question of the relation between what character carries and the recomposition of the alterations of the ego in the period following analysis will remain an orientation for further work.

  14. Financial Distress Prediction of Iranian Companies Using Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Moradi Mahdi

    2013-01-01

    Full Text Available Decision-making problems in the area of financial status evaluation are considered very important, as incorrect decisions in firms are very likely to cause financial crises and distress. Predicting the financial distress of factories and manufacturing companies is a concern of managers, investors, auditors, financial analysts, governmental officials, and employees. Therefore, the current study aims to predict the financial distress of Iranian companies. It applies support vector data description (SVDD) to the financial distress prediction problem in an attempt to suggest a new model with better explanatory power and stability. To serve this purpose, we use a grid-search technique with 3-fold cross-validation to find the optimal parameter values of the SVDD kernel function. To evaluate the prediction accuracy of SVDD, we compare its performance with fuzzy c-means (FCM). The experimental results show that SVDD outperforms the other method in the years before financial distress occurs. The data used in this research were obtained from the Iran Stock Market and Accounting Research Database. Based on data from 2000 to 2009, 70 pairs of companies listed on the Tehran Stock Exchange were selected as the initial data set.
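
    A minimal sketch of the approach described above, under stated assumptions: scikit-learn's OneClassSVM with an RBF kernel (equivalent to SVDD for that kernel) is fit to healthy firms only, with a 3-fold grid search over the kernel width. The synthetic "financial ratios" below stand in for the Tehran Stock Exchange data, which are not reproduced here.

```python
# Sketch of SVDD-style distress detection on synthetic stand-in data.
# OneClassSVM with an RBF kernel is equivalent to SVDD for that kernel.
import numpy as np
from sklearn.model_selection import KFold
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, size=(70, 5))     # 70 healthy firms, 5 financial ratios
distressed = rng.normal(4.0, 1.0, size=(70, 5))  # 70 matched distressed firms

# 3-fold grid search over the RBF width, scored by inlier rate on held-out healthy firms
best_gamma, best_score = None, -1.0
for gamma in (0.01, 0.1, 1.0):
    fold_scores = []
    for tr, va in KFold(n_splits=3, shuffle=True, random_state=0).split(healthy):
        clf = OneClassSVM(kernel="rbf", gamma=gamma, nu=0.1).fit(healthy[tr])
        fold_scores.append((clf.predict(healthy[va]) == 1).mean())
    if np.mean(fold_scores) > best_score:
        best_gamma, best_score = gamma, float(np.mean(fold_scores))

clf = OneClassSVM(kernel="rbf", gamma=best_gamma, nu=0.1).fit(healthy)
detect_rate = float((clf.predict(distressed) == -1).mean())  # distressed flagged as outliers
print(f"best gamma={best_gamma}, distressed detection rate={detect_rate:.2f}")
```

    Fitting only on the healthy class is what distinguishes this one-class setup from an ordinary binary classifier: distressed firms are identified as points falling outside the learned boundary.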

  15. Oncology Modeling for Fun and Profit! Key Steps for Busy Analysts in Health Technology Assessment.

    Science.gov (United States)

    Beca, Jaclyn; Husereau, Don; Chan, Kelvin K W; Hawkins, Neil; Hoch, Jeffrey S

    2018-01-01

    In evaluating new oncology medicines, two common modeling approaches are state transition (e.g., Markov and semi-Markov) and partitioned survival. Partitioned survival models have become more prominent in oncology health technology assessment processes in recent years. Our experience in conducting and evaluating models for economic evaluation has highlighted many important and practical pitfalls. As there is little guidance available on best practices for those who wish to conduct them, we provide guidance in the form of 'Key steps for busy analysts,' who may have very little time and require highly favorable results. Our guidance highlights the continued need for rigorous conduct and transparent reporting of economic evaluations regardless of the modeling approach taken, and the importance of modeling that better reflects reality, which includes better approaches to considering plausibility, estimating relative treatment effects, dealing with post-progression effects, and appropriate characterization of the uncertainty from modeling itself.

  16. How well do financial experts perform? A review of empirical research on performance of analysts, day-traders, forecasters, fund managers, investors, and stockbrokers

    OpenAIRE

    Andersson, Patric

    2004-01-01

    In this manuscript, empirical research on performance of various types of financial experts is reviewed. Financial experts are used as the umbrella term for financial analysts, stockbrokers, money managers, investors, and day-traders etc. The goal of the review is to find out about the abilities of financial experts to produce accurate forecasts, to issue profitable stock recommendations, as well as to make successful investments and trades. On the whole, the reviewed studies show discouragin...

  17. Fatigue crack growth and life prediction under mixed-mode loading

    Science.gov (United States)

    Sajith, S.; Murthy, K. S. R. K.; Robi, P. S.

    2018-04-01

    Fatigue crack growth life as a function of crack length is essential for the prevention of catastrophic failures from a damage tolerance perspective. In the damage tolerance design approach, principles of fracture mechanics are usually applied to predict the fatigue life of structural components, and numerical prediction of crack growth versus number of cycles is essential. For cracks under mixed mode I/II loading, the modified Paris law, da/dN = C(ΔKeq)^m, along with different equivalent stress intensity factor (ΔKeq) models, is used for fatigue crack growth rate prediction. A large number of ΔKeq models are available for mixed mode I/II loading, and the choice of ΔKeq model has a significant impact on fatigue life prediction. In the present investigation, the performance of ΔKeq models in fatigue life prediction is compared against experimental findings, as no guidelines or suggestions are available on selecting these models for accurate and/or conservative predictions of fatigue life. Within the limitations of the available experimental data and current numerical simulation techniques, the results of the present study attempt to identify models that provide accurate and conservative life predictions. Such a study aids numerical analysts and engineers in the proper selection of a model for numerical simulation of fatigue life. The present investigation also suggests a procedure to enhance the accuracy of life prediction using the Paris law.
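
    The crack-growth integration described above can be sketched numerically: given da/dN = C(ΔKeq)^m and a mode I stress intensity range ΔK = Y·Δσ·√(πa), the number of cycles follows from integrating dN = da / (C·ΔK^m) over crack length. The material constants, geometry factor, and loading below are illustrative, not values from the paper.

```python
# Illustrative Paris-law life integration: dN = da / (C * dK**m), dK = Y * dsigma * sqrt(pi*a).
# Constants C, m, Y, and the load range are hypothetical, not taken from the paper.
import math

def cycles_to_grow(a0, af, C, m, dsigma, Y=1.0, steps=10_000):
    """Midpoint-rule integration of the Paris law from crack length a0 to af (meters)."""
    da = (af - a0) / steps
    N, a = 0.0, a0
    for _ in range(steps):
        a_mid = a + 0.5 * da
        dK = Y * dsigma * math.sqrt(math.pi * a_mid)  # MPa*sqrt(m)
        N += da / (C * dK ** m)
        a += da
    return N

# Hypothetical aluminium-like constants; C in (m/cycle)/(MPa*sqrt(m))**m
N = cycles_to_grow(a0=1e-3, af=10e-3, C=1e-11, m=3.0, dsigma=100.0)
print(f"predicted life: {N:,.0f} cycles")
```

    Swapping in a mixed-mode ΔKeq model amounts to replacing the dK expression inside the loop, which is why the choice of ΔKeq model feeds directly into the predicted life.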

  18. Artificial Neural Network and Genetic Algorithm Hybrid Intelligence for Predicting Thai Stock Price Index Trend

    Science.gov (United States)

    Boonjing, Veera; Intakosum, Sarun

    2016-01-01

    This study investigated the use of Artificial Neural Network (ANN) and Genetic Algorithm (GA) for prediction of Thailand's SET50 index trend. ANN is a widely accepted machine learning method that uses past data to predict future trend, while GA is an algorithm that can find better subsets of input variables for importing into ANN, enabling more accurate prediction through efficient feature selection. The inputs were technical indicators highly regarded by stock analysts, each represented by 4 input variables based on past time spans of 4 different lengths: 3-, 5-, 10-, and 15-day spans before the day of prediction. This generated a large set of diverse input variables with an exponentially larger number of possible subsets, which GA culled down to a manageable number of more effective ones. SET50 index data of the past 6 years, from 2009 to 2014, were used to evaluate this hybrid intelligence prediction accuracy, and the hybrid's prediction results were found to be more accurate than those made by a method using only one input variable for one fixed length of past time span. PMID:27974883

  19. Artificial Neural Network and Genetic Algorithm Hybrid Intelligence for Predicting Thai Stock Price Index Trend

    Directory of Open Access Journals (Sweden)

    Montri Inthachot

    2016-01-01

    Full Text Available This study investigated the use of Artificial Neural Network (ANN and Genetic Algorithm (GA for prediction of Thailand’s SET50 index trend. ANN is a widely accepted machine learning method that uses past data to predict future trend, while GA is an algorithm that can find better subsets of input variables for importing into ANN, enabling more accurate prediction through efficient feature selection. The inputs were technical indicators highly regarded by stock analysts, each represented by 4 input variables based on past time spans of 4 different lengths: 3-, 5-, 10-, and 15-day spans before the day of prediction. This generated a large set of diverse input variables with an exponentially larger number of possible subsets, which GA culled down to a manageable number of more effective ones. SET50 index data of the past 6 years, from 2009 to 2014, were used to evaluate this hybrid intelligence prediction accuracy, and the hybrid’s prediction results were found to be more accurate than those made by a method using only one input variable for one fixed length of past time span.
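
    A compact sketch of the hybrid described above, with synthetic data standing in for the SET50 indicator series: a small genetic algorithm evolves bit-masks over candidate input variables, scoring each mask by a neural network's cross-validated accuracy. Population size, number of generations, mutation rate, and network architecture are illustrative choices, not those of the paper.

```python
# Sketch of the GA + ANN hybrid on synthetic data: a genetic algorithm evolves
# feature masks; fitness is an MLP's cross-validated accuracy on the masked inputs.
# Sizes, rates, and architecture are illustrative choices.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
n, d = 300, 12                                       # e.g. 3 indicators x 4 time spans
X = rng.normal(size=(n, d))
y = (X[:, 0] + X[:, 3] - X[:, 7] > 0).astype(int)    # only 3 inputs are informative

def fitness(mask):
    if not mask.any():
        return 0.0
    net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300, random_state=0)
    return float(cross_val_score(net, X[:, mask], y, cv=3).mean())

pop = rng.integers(0, 2, size=(6, d)).astype(bool)   # population of feature masks
for _ in range(3):                                   # a few generations
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:3]]      # truncation selection
    children = parents.copy()
    cut = int(rng.integers(1, d))
    children[:, cut:] = parents[::-1, cut:]          # single-point crossover
    children ^= rng.random(children.shape) < 0.1     # bit-flip mutation
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print("selected inputs:", np.flatnonzero(best), "cv accuracy:", round(fitness(best), 3))
```

    The GA never evaluates all 2^12 subsets; it only samples a handful per generation and keeps the masks whose networks validate best, which is the "culling" the abstract describes.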

  20. Engaging policy-makers, health system managers, and policy analysts in the knowledge synthesis process: a scoping review.

    Science.gov (United States)

    Tricco, Andrea C; Zarin, Wasifa; Rios, Patricia; Nincic, Vera; Khan, Paul A; Ghassemi, Marco; Diaz, Sanober; Pham, Ba'; Straus, Sharon E; Langlois, Etienne V

    2018-02-12

    It is unclear how to engage a wide range of knowledge users in research. We aimed to map the evidence on engaging knowledge users with an emphasis on policy-makers, health system managers, and policy analysts in the knowledge synthesis process through a scoping review. We used the Joanna Briggs Institute guidance for scoping reviews. Nine electronic databases (e.g., MEDLINE), two grey literature sources (e.g., OpenSIGLE), and reference lists of relevant systematic reviews were searched from 1996 to August 2016. We included any type of study describing strategies, barriers and facilitators, or assessing the impact of engaging policy-makers, health system managers, and policy analysts in the knowledge synthesis process. Screening and data abstraction were conducted by two reviewers independently with a third reviewer resolving discrepancies. Frequency and thematic analyses were conducted. After screening 8395 titles and abstracts followed by 394 full-texts, 84 unique documents and 7 companion reports fulfilled our eligibility criteria. All 84 documents were published in the last 10 years, and half were prepared in North America. The most common type of knowledge synthesis with knowledge user engagement was a systematic review (36%). The knowledge synthesis most commonly addressed an issue at the level of national healthcare system (48%) and focused on health services delivery (17%) in high-income countries (86%). Policy-makers were the most common (64%) knowledge users, followed by healthcare professionals (49%) and government agencies as well as patients and caregivers (34%). Knowledge users were engaged in conceptualization and design (49%), literature search and data collection (52%), data synthesis and interpretation (71%), and knowledge dissemination and application (44%). Knowledge users were most commonly engaged as key informants through meetings and workshops as well as surveys, focus groups, and interviews either in-person or by telephone and emails

  1. What Performance Analysts Need to Know About Research Trends in Association Football (2012-2016): A Systematic Review.

    Science.gov (United States)

    Sarmento, Hugo; Clemente, Filipe Manuel; Araújo, Duarte; Davids, Keith; McRobert, Allistair; Figueiredo, António

    2018-04-01

    Evolving patterns of match analysis research need to be systematically reviewed regularly since this area of work is burgeoning rapidly and studies can offer new insights to performance analysts if theoretically and coherently organized. The purpose of this paper was to conduct a systematic review of published articles on match analysis in adult male football, identify and organize common research topics, and synthesize the emerging patterns of work between 2012 and 2016, according to the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) guidelines. The Web of Science database was searched for relevant published studies using the following keywords: 'football' and 'soccer', each one associated with the terms 'match analysis', 'performance analysis', 'notational analysis', 'game analysis', 'tactical analysis' and 'patterns of play'. Of 483 studies initially identified, 77 were fully reviewed and their outcome measures extracted and analyzed. Results showed that research mainly focused on (1) performance at set pieces, i.e. corner kicks, free kicks, penalty kicks; (2) collective system behaviours, captured by established variables such as team centroid (geometrical centre of a set of players) and team dispersion (quantification of how far players are apart), as well as tendencies for team communication (establishing networks based on passing sequences), sequential patterns (predicting future passing sequences), and group outcomes (relationships between match-related statistics and final match scores); and (3) activity profile of players, i.e. playing roles, effects of fatigue, substitutions during matches, and the effects of environmental constraints on performance, such as heat and altitude. From the previous review, novel variables were identified that require new measurement techniques. It is evident that the complexity engendered during performance in competitive soccer requires an integrated approach that considers multiple aspects. 

  2. Interpreting Black-Box Classifiers Using Instance-Level Visual Explanations

    Energy Technology Data Exchange (ETDEWEB)

    Tamagnini, Paolo; Krause, Josua W.; Dasgupta, Aritra; Bertini, Enrico

    2017-05-14

    To realize the full potential of machine learning in diverse real-world domains, it is necessary for model predictions to be readily interpretable and actionable for the human in the loop. Analysts, who are the users but not the developers of machine learning models, often do not trust a model because of the lack of transparency in associating predictions with the underlying data space. To address this problem, we propose Rivelo, a visual analytic interface that enables analysts to understand the causes behind predictions of binary classifiers by interactively exploring a set of instance-level explanations. These explanations are model-agnostic, treating a model as a black box, and they help analysts in interactively probing the high-dimensional binary data space for detecting features relevant to predictions. We demonstrate the utility of the interface with a case study analyzing a random forest model on the sentiment of Yelp reviews about doctors.
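    The abstract does not spell out Rivelo's explanation algorithm. Purely as an illustrative sketch of one model-agnostic, instance-level strategy over binary features (greedy occlusion: repeatedly switch off the feature whose removal most reduces the black box's score), one might write the following; `predict_proba`, `toy_model`, and all other names are hypothetical stand-ins, not the tool's API.

```python
import numpy as np

def explain_instance(x, predict_proba, max_features=3):
    """Greedy occlusion: report the binary features whose removal most
    reduces the black box's positive-class score for this instance."""
    x = x.copy()
    score = predict_proba(x)
    explanation = []
    for _ in range(max_features):
        present = np.flatnonzero(x)  # features currently switched on
        if present.size == 0:
            break
        drops = []
        for j in present:
            trial = x.copy()
            trial[j] = 0  # occlude one feature
            drops.append((score - predict_proba(trial), int(j)))
        gain, best = max(drops)
        if gain <= 0:  # no remaining feature supports the prediction
            break
        x[best] = 0
        score -= gain
        explanation.append(best)
    return explanation

# Toy "black box" that keys on features 0 and 2
toy_model = lambda v: 0.2 + 0.4 * v[0] + 0.3 * v[2]
```

    For the instance `[1, 1, 1, 0]`, the sketch attributes the prediction to features 0 and 2, mirroring how such explanations point an analyst at the relevant slice of the data space.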

  3. Identifying the Education Needs of the Business Analyst: An Australian Study

    Directory of Open Access Journals (Sweden)

    Deborah Richards

    2014-06-01

    Full Text Available The Business Analyst (BA) plays a key role in ensuring that technology is appropriately used to achieve the organisation’s goals. This important mediating role is currently in high (unmet) demand in many English-speaking countries, and thus more people need to be trained for it. To determine the educational and/or training needs of a BA, we conducted a survey of the Information and Communication Technology industry in Australia. The survey items are based on prior studies of information systems educational requirements and the internationally developed Skills Framework for the Information Age (SFIA), which has been endorsed by the Australian Computer Society. From the literature we identified three types of skills: soft, business and technical. With the increasing importance of GreenIT and the pivotal role that the BA could play in green decision making, we added a fourth type of skill: green. The survey considers 85 skills, their importance, the level of attainment of each skill, skill gaps and types of skills. Results show that all soft skills were considered important, with the smallest knowledge gaps. Selected business skills and green skills were seen as important. Technical skills were considered less important, but were also where the largest knowledge gaps existed. Further, we asked respondents whether each skill should be acquired via an undergraduate or postgraduate degree and/or industry training and experience. We found that the workplace was considered the most appropriate place to acquire and/or develop all skills, except the ability to innovate. While we found that soft skills should be taught almost equally at the undergraduate and postgraduate level, business and green skills were more appropriate in a postgraduate degree. In contrast, technical skills were best acquired in an undergraduate program of study.

  4. Validation predictions of a 13 m/s cross-wind fire for Fuego and the University of Waterloo dataset.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alexander L.; Evans, Gregory Herbert (Sandia National Laboratories, Livermore, CA); Gill, Walter; Jarboe, Daniel T. (Sandia National Laboratories, Livermore, CA)

    2008-03-01

    Detailed herein are the results of a validation comparison. The experiment involved a 2 meter diameter liquid pool of Jet-A fuel in a 13 m/s crosswind. The scenario included a large cylindrical blocking object just down-stream of the fire. It also included seven smaller calorimeters and extensive instrumentation. The experiments were simulated with Fuego. The model included several conduction regions to model the response of the calorimeters, the floor, and the large cylindrical blocking object. A blind comparison was used to compare the simulation predictions with the experimental data. The more upstream data compared very well with the simulation predictions. The more downstream data did not compare very well with the simulation predictions. Further investigation suggests that features omitted from the original model contributed to the discrepancies. Observations are made with respect to the scenario that are aimed at helping an analyst approach a comparable problem in a way that may help improve the potential for quantitative accuracy.

  5. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    Science.gov (United States)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.
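    The record's actual Big Data pipeline and trained models are not reproduced here. As a minimal sketch of the alerting idea it describes (flag a sensor when its reading degrades past a threshold relative to recent behavior), the fragment below compares each reading with a rolling-median forecast and flags deviations beyond k robust standard deviations; the function name, window, and threshold are assumptions, not the study's implementation.

```python
import numpy as np

def detect_anomalies(series, window=5, k=3.0):
    """Flag indices whose reading deviates from a rolling-median forecast
    by more than k robust (MAD-based) standard deviations."""
    series = np.asarray(series, dtype=float)
    flagged = []
    for t in range(window, len(series)):
        hist = series[t - window:t]
        forecast = np.median(hist)  # robust to a spike inside the window
        scale = 1.4826 * np.median(np.abs(hist - forecast)) + 1e-9
        if abs(series[t] - forecast) > k * scale:
            flagged.append(t)
    return flagged
```

    A median forecast is chosen over a mean so that a single past spike does not contaminate the baseline for subsequent readings.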

  6. The ability of analysts' recommendations to predict optimistic and pessimistic forecasts.

    Directory of Open Access Journals (Sweden)

    Vahid Biglari

    Full Text Available Previous research shows that buy (growth) companies conduct income increasing earnings management in order to meet forecasts and generate positive forecast errors (FEs). This behavior, however, is not inherent in sell (non-growth) companies. Using the aforementioned background, this research hypothesizes that since sell companies are pressured to avoid income increasing earnings management, they are capable, and in fact more inclined, to pursue income decreasing Forecast Management (FM) with the purpose of generating positive FEs. Using a sample of 6553 firm-years of companies listed on the NYSE between the years 2005-2010, the study determines that sell companies conduct income decreasing FM to generate positive FEs. However, the frequency of positive FEs of sell companies does not exceed that of buy companies. Using the efficiency perspective, the study suggests that even though buy and sell companies have immense motivation to avoid negative FEs, they exploit different but efficient strategies, respectively, in order to meet forecasts. Furthermore, the findings illuminate the complexities behind informative and opportunistic forecasts that fall under the efficiency versus opportunistic theories in the literature.

  7. The Ability of Analysts' Recommendations to Predict Optimistic and Pessimistic Forecasts

    Science.gov (United States)

    Biglari, Vahid; Alfan, Ervina Binti; Ahmad, Rubi Binti; Hajian, Najmeh

    2013-01-01

    Previous research shows that buy (growth) companies conduct income increasing earnings management in order to meet forecasts and generate positive forecast errors (FEs). This behavior, however, is not inherent in sell (non-growth) companies. Using the aforementioned background, this research hypothesizes that since sell companies are pressured to avoid income increasing earnings management, they are capable, and in fact more inclined, to pursue income decreasing Forecast Management (FM) with the purpose of generating positive FEs. Using a sample of 6553 firm-years of companies listed on the NYSE between the years 2005–2010, the study determines that sell companies conduct income decreasing FM to generate positive FEs. However, the frequency of positive FEs of sell companies does not exceed that of buy companies. Using the efficiency perspective, the study suggests that even though buy and sell companies have immense motivation to avoid negative FEs, they exploit different but efficient strategies, respectively, in order to meet forecasts. Furthermore, the findings illuminate the complexities behind informative and opportunistic forecasts that fall under the efficiency versus opportunistic theories in the literature. PMID:24146741

  8. An assessment of the validity of inelastic design analysis methods by comparisons of predictions with test results

    International Nuclear Information System (INIS)

    Corum, J.M.; Clinard, J.A.; Sartory, W.K.

    1976-01-01

    The use of computer programs that employ relatively complex constitutive theories and analysis procedures to perform inelastic design calculations on fast reactor system components introduces questions of validation and acceptance of the analysis results. We may ask ourselves, "How valid are the answers?" These questions, in turn, involve the concepts of verification of computer programs as well as qualification of the computer programs and of the underlying constitutive theories and analysis procedures. This paper addresses the latter - the qualification of the analysis methods for inelastic design calculations. Some of the work underway in the United States to provide the necessary information to evaluate inelastic analysis methods and computer programs is described, and typical comparisons of analysis predictions with inelastic structural test results are presented. It is emphasized throughout that rather than asking ourselves how valid, or correct, the analytical predictions are, we might more properly question whether or not the combination of the predictions and the associated high-temperature design criteria leads to an acceptable level of structural integrity. It is believed that in this context the analysis predictions are generally valid, even though exact correlations between predictions and actual behavior are not obtained and cannot be expected. Final judgment, however, must be reserved for the design analyst in each specific case. (author)

  9. A Tracking Analyst for large 3D spatiotemporal data from multiple sources (case study: Tracking volcanic eruptions in the atmosphere)

    Science.gov (United States)

    Gad, Mohamed A.; Elshehaly, Mai H.; Gračanin, Denis; Elmongui, Hicham G.

    2018-02-01

    This research presents a novel Trajectory-based Tracking Analyst (TTA) that can track and link spatiotemporally variable data from multiple sources. The proposed technique uses trajectory information to determine the positions of time-enabled and spatially variable scatter data at any given time through a combination of along trajectory adjustment and spatial interpolation. The TTA is applied in this research to track large spatiotemporal data of volcanic eruptions (acquired using multi-sensors) in the unsteady flow field of the atmosphere. The TTA enables tracking injections into the atmospheric flow field, the reconstruction of the spatiotemporally variable data at any desired time, and the spatiotemporal join of attribute data from multiple sources. In addition, we were able to create a smooth animation of the volcanic ash plume at interactive rates. The initial results indicate that the TTA can be applied to a wide range of multiple-source data.

  10. IMPROVED GROUND TRUTH IN SOUTHERN ASIA USING IN-COUNTRY DATA, ANALYST WAVEFORM REVIEW, AND ADVANCED ALGORITHMS

    Energy Technology Data Exchange (ETDEWEB)

    Engdahl, Eric, R.; Bergman, Eric, A.; Myers, Stephen, C.; Ryall, Floriana

    2009-06-19

    A new catalog of seismicity at magnitudes above 2.5 for the period 1923-2008 in the Iran region is assembled from arrival times reported by global, regional, and local seismic networks. Using in-country data we have formed new events, mostly at lower magnitudes that were not previously included in standard global earthquake catalogs. The magnitude completeness of the catalog varies strongly through time, complete to about magnitude 4.2 prior to 1998 and reaching a minimum of about 3.6 during the period 1998-2005. Of the 25,722 events in the catalog, most of the larger events have been carefully reviewed for proper phase association, especially for depth phases and to eliminate outlier readings, and relocated. To better understand the quality of the data set of arrival times reported by Iranian networks that are central to this study, many waveforms for events in Iran have been re-picked by an experienced seismic analyst. Waveforms at regional distances in this region are often complex. For many events this makes arrival time picks difficult to make, especially for smaller magnitude events, resulting in reported times that can be substantially improved by an experienced analyst. Even when the signal/noise ratio is large, re-picking can lead to significant differences. Picks made by our analyst are compared with original picks made by the regional networks. In spite of the obvious outliers, the median (-0.06 s) and spread (0.51 s) are small, suggesting that reasonable confidence can be placed in the picks reported by regional networks in Iran. This new catalog has been used to assess focal depth distributions throughout Iran. 
A principal result of this study is that the geographic pattern of depth distributions revealed by the relatively small number of earthquakes (~167) with depths constrained by waveform modeling (+/- 4 km) are now in agreement with the much larger number of depths (~1229) determined using reanalysis of ISC arrival-times (+/-10 km), within their
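    The median (-0.06 s) and spread (0.51 s) quoted above are simple robust statistics of the pick residuals. Assuming "spread" denotes a median absolute deviation scaled to a Gaussian sigma (one common convention; the paper's exact definition is not stated here), such numbers can be computed as:

```python
import statistics

def pick_residual_stats(analyst_picks, network_picks):
    """Median offset and robust spread between re-picked and originally
    reported arrival times, in seconds. Spread is MAD scaled to sigma."""
    residuals = [a - n for a, n in zip(analyst_picks, network_picks)]
    med = statistics.median(residuals)
    mad = statistics.median(abs(r - med) for r in residuals)
    return med, 1.4826 * mad  # 1.4826 maps MAD to sigma for Gaussian data
```

    A MAD-based spread is preferred here because, as the abstract notes, the residuals contain obvious outliers that would inflate an ordinary standard deviation.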

  11. Could the outcome of the 2016 US elections have been predicted from past voting patterns?

    Science.gov (United States)

    Schmitz, Peter M. U.; Holloway, Jennifer P.; Dudeni-Tlhone, Nontembeko; Ntlangu, Mbulelo B.; Koen, Renee

    2018-05-01

    In South Africa, a team of analysts has for some years been using statistical techniques to predict election outcomes during election nights in South Africa. The prediction method involves using statistical clusters based on past voting patterns to predict final election outcomes, using a small number of released vote counts. With the US presidential elections in November 2016 hitting the global media headlines shortly after successful predictions had been made for the South African elections, the team decided to investigate adapting their method to forecast the final outcome in the US elections. In particular, it was felt that the time zone differences between states would affect the time at which results are released and thereby provide a window of opportunity for doing election night prediction using only the early results from the eastern side of the US. Testing the method on the US presidential elections would have two advantages: it would determine whether the core methodology could be generalised, and whether it would work to include a stronger spatial element in the modelling, since the early results released would be spatially biased due to time zone differences. This paper presents a high-level view of the overall methodology and how it was adapted to predict the results of the US presidential elections. A discussion on the clustering of spatial units within the US is also provided, and the spatial distribution of results together with the Electoral College prediction results from both a 'test-run' and the final 2016 presidential elections are given and analysed.
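    The team's actual statistical model is not given in this summary. As a bare-bones sketch of the underlying idea (project unreported districts from the vote shares observed so far within their cluster of historically similar districts), one might write something like the following; the data structures and names are hypothetical, and a real model would also weight by past turnout and quantify uncertainty.

```python
def project_totals(districts, reported):
    """districts: {name: (cluster_id, expected_total_votes)}
    reported:  {name: {candidate: votes}} for districts already counted.
    Unreported districts inherit their cluster's observed vote shares."""
    # Aggregate observed votes per cluster
    cluster_counts = {}
    for name, counts in reported.items():
        cluster, _ = districts[name]
        agg = cluster_counts.setdefault(cluster, {})
        for cand, votes in counts.items():
            agg[cand] = agg.get(cand, 0.0) + votes
    # Sum actual counts, projecting the missing districts
    totals = {}
    for name, (cluster, expected) in districts.items():
        if name in reported:
            counts = reported[name]
        else:  # scale cluster shares to this district's expected turnout
            agg = cluster_counts[cluster]  # assumes cluster has some results
            total = sum(agg.values())
            counts = {c: expected * v / total for c, v in agg.items()}
        for cand, votes in counts.items():
            totals[cand] = totals.get(cand, 0.0) + votes
    return totals
```

    This is also where the time-zone bias noted above bites: early on, some clusters may have no reporting districts at all, so a spatial element is needed to borrow strength across clusters.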

  12. Trabalho, saúde e gênero: estudo comparativo sobre analistas de sistemas Work and health: a gender study on systems analysts

    Directory of Open Access Journals (Sweden)

    Lys Esther Rocha

    2001-12-01

    Full Text Available OBJECTIVE: To assess the health effects of work among male and female systems analysts. METHODS: This exploratory cross-sectional study covered 553 systems analysts at two data processing companies in the metropolitan region of São Paulo, Brazil. Ergonomic analyses of the work, semi-structured interviews, and self-administered questionnaires were carried out. Data analysis was based on contingency tables with chi-square tests at the 5% significance level, and on prevalence ratios with their confidence intervals, by gender. RESULTS: Women made up 40.7% of the study group and were younger than the men. Having children was more common among the men, although the daily time devoted to household tasks was greater among the women. Men predominated in management positions. Discomfort factors reported with similar frequency by men and women were: work overload due to short deadlines; a high degree of responsibility; the mental demands of the work; and task complexity. Discomfort factors predominant among women were: uncomfortable posture; greater exposure to the computer; and obsolete equipment. Women reported a higher frequency of visual, muscular, and stress-related symptoms; greater dissatisfaction with work; and greater physical and mental fatigue. CONCLUSIONS: The study suggests that the health effects on systems analysts are associated with the demands of the work and with women's role in society. The results highlight the importance of studies on health, work, and gender that analyze the intersection between the productive and domestic spheres.

  13. Close Approach Prediction Analysis of the Earth Science Constellation with the Fengyun-1C Debris

    Science.gov (United States)

    Duncan, Matthew; Rand, David K.

    2008-01-01

    Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. Each day, close approach predictions are generated by a U.S. Department of Defense Joint Space Operations Center Orbital Safety Analyst using the high-accuracy Space Object Catalog maintained by the Air Force's 1st Space Control Squadron. Prediction results and other ancillary data such as state vector information are sent to NASA/Goddard Space Flight Center's (GSFC's) Collision Risk Assessment analysis team for review. Collision analysis is performed and the GSFC team works with the ESC member missions to develop risk reduction strategies as necessary. This paper presents various close approach statistics for the ESC. The ESC missions have been affected by debris from the recent anti-satellite test which destroyed the Chinese Fengyun-1C satellite. The paper also presents the percentage of close approach events induced by the Fengyun-1C debris, and presents analysis results which predict the future effects on the ESC caused by this event. Specifically, the Fengyun-1C debris is propagated for twenty years using high-performance computing technology and close approach predictions are generated for the ESC. The percent increase in the total number of conjunction events is considered to be an estimate of the collision risk due to the Fengyun-1C break-up.

  14. Economic decision making and the application of nonparametric prediction models

    Science.gov (United States)

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2008-01-01

    Sustained increases in energy prices have focused attention on gas resources in low-permeability shale or in coals that were previously considered economically marginal. Daily well deliverability is often relatively small, although the estimates of the total volumes of recoverable resources in these settings are often large. Planning and development decisions for extraction of such resources must be areawide because profitable extraction requires optimization of scale economies to minimize costs and reduce risk. For an individual firm, the decision to enter such plays depends on reconnaissance-level estimates of regional recoverable resources and on cost estimates to develop untested areas. This paper shows how simple nonparametric local regression models, used to predict technically recoverable resources at untested sites, can be combined with economic models to compute regional-scale cost functions. The context of the worked example is the Devonian Antrim-shale gas play in the Michigan basin. One finding relates to selection of the resource prediction model to be used with economic models. Models chosen because they can best predict aggregate volume over larger areas (many hundreds of sites) smooth out granularity in the distribution of predicted volumes at individual sites. This loss of detail affects the representation of economic cost functions and may affect economic decisions. Second, because some analysts consider unconventional resources to be ubiquitous, the selection and order of specific drilling sites may, in practice, be determined arbitrarily by extraneous factors. The analysis shows a 15-20% gain in gas volume when these simple models are applied to order drilling prospects strategically rather than to choose drilling locations randomly. Copyright © 2008 Society of Petroleum Engineers.
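    The paper's specific nonparametric local regression is not reproduced in the abstract. As an illustrative stand-in for the general class it names, a Nadaraya-Watson kernel smoother predicts the recoverable volume at an untested site as a distance-weighted average over drilled sites; the Gaussian kernel, bandwidth, and names below are assumptions for the sketch.

```python
import numpy as np

def local_estimate(site, drilled_sites, drilled_volumes, bandwidth=1.0):
    """Nadaraya-Watson kernel regression: weight each drilled site's
    volume by a Gaussian kernel of its distance to the untested site."""
    d2 = np.sum((np.asarray(drilled_sites) - np.asarray(site)) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))  # Gaussian kernel weights
    return float(np.dot(w, drilled_volumes) / w.sum())
```

    Such site-level predictions preserve the granularity that, as the paper notes, aggregate-optimized models smooth away, which is what makes them usable for ordering individual drilling prospects.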

  15. An approach to model validation and model-based prediction -- polyurethane foam case study.

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, Kevin J.; Rutherford, Brian Milne

    2003-07-01

    Enhanced software methodology and improved computing hardware have advanced the state of simulation technology to a point where large physics-based codes can be a major contributor in many systems analyses. This shift toward the use of computational methods has brought with it new research challenges in a number of areas including characterization of uncertainty, model validation, and the analysis of computer output. It is these challenges that have motivated the work described in this report. Approaches to and methods for model validation and (model-based) prediction have been developed recently in the engineering, mathematics and statistical literatures. In this report we have provided a fairly detailed account of one approach to model validation and prediction applied to an analysis investigating thermal decomposition of polyurethane foam. A model simulates the evolution of the foam in a high temperature environment as it transforms from a solid to a gas phase. The available modeling and experimental results serve as data for a case study focusing our model validation and prediction developmental efforts on this specific thermal application. We discuss several elements of the "philosophy" behind the validation and prediction approach: (1) We view the validation process as an activity applying to the use of a specific computational model for a specific application. We do acknowledge, however, that an important part of the overall development of a computational simulation initiative is the feedback provided to model developers and analysts associated with the application. (2) We utilize information obtained for the calibration of model parameters to estimate the parameters and quantify uncertainty in the estimates. We rely, however, on validation data (or data from similar analyses) to measure the variability that contributes to the uncertainty in predictions for specific systems or units (unit-to-unit variability). (3) We perform statistical

  16. Utilizing functional near-infrared spectroscopy for prediction of cognitive workload in noisy work environments.

    Science.gov (United States)

    Gabbard, Ryan; Fendley, Mary; Dar, Irfaan A; Warren, Rik; Kashou, Nasser H

    2017-10-01

    Occupational noise frequently occurs in the work environment in military intelligence, surveillance, and reconnaissance operations. This impacts cognitive performance by acting as a stressor, potentially interfering with the analysts' decision-making process. We investigated the effects of different noise stimuli on analysts' performance and workload in anomaly detection by simulating a noisy work environment. We utilized functional near-infrared spectroscopy (fNIRS) to quantify oxy-hemoglobin (HbO) and deoxy-hemoglobin concentration changes in the prefrontal cortex (PFC), as well as behavioral measures, which include eye tracking, reaction time, and accuracy rate. We hypothesized that noisy environments would have a negative effect on the participant in terms of anomaly detection performance due to the increase in workload, which would be reflected by an increase in PFC activity. We found that HbO for some of the channels analyzed was significantly different across noise types ([Formula: see text]). Our results also indicated that HbO activation for short-intermittent noise stimuli was greater in the PFC compared to long-intermittent noise. These approaches using fNIRS in conjunction with an understanding of the impact on human analysts in anomaly detection could potentially lead to better performance by optimizing work environments.

  17. A hybrid clustering and classification approach for predicting crash injury severity on rural roads.

    Science.gov (United States)

    Hasheminejad, Seyed Hessam-Allah; Zahedi, Mohsen; Hasheminejad, Seyed Mohammad Hossein

    2018-03-01

    As a threat to transportation systems, traffic crashes have a wide range of social consequences for governments. Traffic crashes are increasing in developing countries, and Iran as a developing country is not immune from this risk. Several studies in the literature predict traffic crash severity based on artificial neural networks (ANNs), support vector machines and decision trees. This paper attempts to investigate the crash injury severity of rural roads by using a hybrid clustering and classification approach to compare the performance of classification algorithms before and after applying the clustering. In this paper, a novel rule-based genetic algorithm (GA) is proposed to predict crash injury severity, which is evaluated by performance criteria in comparison with classification algorithms like ANN. The results obtained from analysis of 13,673 crashes (5600 property damage, 778 fatal crashes, 4690 slight injuries and 2605 severe injuries) on rural roads in Tehran Province of Iran during 2011-2013 revealed that the proposed GA method outperforms other classification algorithms based on classification metrics like precision (86%), recall (88%) and accuracy (87%). Moreover, the proposed GA method has the highest level of interpretation, is easy to understand and provides feedback to analysts.
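    The paper's rule-based GA is not reproduced here. As a toy sketch of the hybrid "cluster first, then classify within each cluster" idea it evaluates, the fragment below runs a tiny k-means and then uses the majority severity class per cluster as a deliberately simplistic stand-in classifier; everything (deterministic initialization, majority rule, names) is an assumption, not the authors' method, and it assumes no cluster empties during iteration.

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Tiny k-means with deterministic initialization (first k points)."""
    centers = X[:k].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

def fit_hybrid(X, y, k=2):
    """Cluster crashes first, then fit a per-cluster classifier
    (here: the majority injury-severity class within each cluster)."""
    centers, labels = kmeans(X, k)
    rules = {j: int(np.bincount(y[labels == j]).argmax()) for j in range(k)}
    return centers, rules

def predict_hybrid(X, centers, rules):
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    return [rules[int(j)] for j in labels]
```

    The point of the two-stage design is that classifiers trained within more homogeneous clusters can outperform a single classifier trained on all crashes, which is what the paper's before/after comparison measures.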

  18. Evaluating the results of a site-specific PSHA from the perspective of a risk analyst

    Science.gov (United States)

    Klügel, Jens-Uwe

    2016-04-01

    From 1998 to 2015, Swiss Nuclear Power Plants sponsored a set of comprehensive site-specific PSHA studies (PEGASOS, PEGASOS Refinement Project) to obtain the requested input for their plant-specific probabilistic risk assessments, following the US SSHAC procedures at their most elaborated level 4. The studies were performed by well-known earth scientists working completely independently from sponsors, under participatory review of the Swiss Nuclear Safety Inspectorate. Risk analysts of Swiss Nuclear Power Plants have recently been mandated to implement the final results of the studies in their risk assessment studies. This triggered an in-depth assessment of the results focussed on their practical applicability for risk studies. This assessment resulted in some important insights that are of interest for future PSHA studies performed for new nuclear power plants. The assessment included a review of the completeness of results with respect to risk applications, as well as plausibility checks of hazard results based on Black Swan Theory and known historical events. The key lessons and recommendations for more detailed project output specifications for future projects are presented in the paper. It was established that future PSHA projects shall provide the joint probability distribution of ground motion hazard and the associated strong motion duration as the output, to allow for a technically meaningful risk assessment. The recommendation of WENRA (West European Nuclear Regulators), published in their reference levels, to perform natural hazard assessment preferably on physical grounds (deterministic method) is also rationalized by recommending a holistic approach to hazard analysis that compares PSHA insights with the results of deterministic Seismic Hazard Analysis modelling.

  19. Evaluation and comparison of mammalian subcellular localization prediction methods

    Directory of Open Access Journals (Sweden)

    Fink J Lynn

    2006-12-01

    Full Text Available Abstract Background Determination of the subcellular location of a protein is essential to understanding its biochemical function. This information can provide insight into the function of hypothetical or novel proteins. These data are difficult to obtain experimentally but have become especially important since many whole genome sequencing projects have been finished and many resulting protein sequences are still lacking detailed functional information. In order to address this paucity of data, many computational prediction methods have been developed. However, these methods have varying levels of accuracy and perform differently based on the sequences that are presented to the underlying algorithm. It is therefore useful to compare these methods and monitor their performance. Results In order to perform a comprehensive survey of prediction methods, we selected only methods that accepted large batches of protein sequences, were publicly available, and were able to predict localization to at least nine of the major subcellular locations (nucleus, cytosol, mitochondrion, extracellular region, plasma membrane, Golgi apparatus, endoplasmic reticulum (ER), peroxisome, and lysosome). The selected methods were CELLO, MultiLoc, Proteome Analyst, pTarget and WoLF PSORT. These methods were evaluated using 3763 mouse proteins from SwissProt that represent the source of the training sets used in development of the individual methods. In addition, an independent evaluation set of 2145 mouse proteins from LOCATE with a bias towards the subcellular localization underrepresented in SwissProt was used. The sensitivity and specificity were calculated for each method and compared to a theoretical value based on what might be observed by random chance. Conclusion No individual method had a sufficient level of sensitivity across both evaluation sets that would enable reliable application to hypothetical proteins. All methods showed lower performance on the LOCATE

  20. Predictable and avoidable: What’s next?

    Directory of Open Access Journals (Sweden)

    Ivo Pezzuto

    2014-09-01

    Full Text Available The author of this paper (Dr. Ivo Pezzuto) was one of the first to write, back in 2008, about the alleged "subprime mortgage loans fraud" which triggered the 2008 financial crisis, in combination with multiple other complex, highly interrelated and concurrent factors. In that same 2008 working paper (available on SSRN and titled "Miraculous Financial Engineering or Toxic Finance? The Genesis of the U.S. Subprime Mortgage Loans Crisis and its Consequences on the Global Financial Markets and Real Economy"), the author was also one of the first to report the high probability of a Eurozone debt crisis, due to a number of unsolved structural macroeconomic problems, the lack of a single crisis resolution scheme, current account imbalances and, in some countries, housing bubbles/high private debt. In the book published in 2013 and titled "Predictable and Avoidable: Repairing Economic Dislocation and Preventing the Recurrence of Crisis", Dr. Pezzuto exposed the root causes of the financial crisis in order to enable readers to understand that the crisis was predictable and should have been avoidable, and that a recurrence can be avoided if lessons are learned and the right actions are taken. Almost one year after the publication of that book, the author has decided to write this working paper to explore what has happened in the meantime to the financial markets and to the implementation of financial regulation. Most of all, with this working paper the author aims to provide an updated analysis, as strategist and scenario analyst, of the topics addressed in "Predictable and Avoidable", based on a forward-looking perspective and on potential "tail risk" scenarios. The topics reported in this paper relate to financial crises; government policy; financial regulation; corporate governance; credit risk management

  1. Economic analyses to support decisions about HPV vaccination in low- and middle-income countries: a consensus report and guide for analysts.

    Science.gov (United States)

    Jit, Mark; Levin, Carol; Brisson, Marc; Levin, Ann; Resch, Stephen; Berkhof, Johannes; Kim, Jane; Hutubessy, Raymond

    2013-01-30

    Low- and middle-income countries need to consider economic issues such as cost-effectiveness, affordability and sustainability before introducing a program for human papillomavirus (HPV) vaccination. However, many such countries lack the technical capacity and data to conduct their own analyses. Analysts informing policy decisions should address the following questions: 1) Is an economic analysis needed? 2) Should analyses address costs, epidemiological outcomes, or both? 3) If costs are considered, what sort of analysis is needed? 4) If outcomes are considered, what sort of model should be used? 5) How complex should the analysis be? 6) How should uncertainty be captured? 7) How should model results be communicated? Selecting the appropriate analysis is essential to ensure that all the important features of the decision problem are correctly represented, but that the analyses are not more complex than necessary. This report describes the consensus of an expert group convened by the World Health Organization, prioritizing key issues to be addressed when considering economic analyses to support HPV vaccine introduction in these countries.

  2. The Regional Healthcare Ecosystem Analyst (RHEA): a simulation modeling tool to assist infectious disease control in a health system.

    Science.gov (United States)

    Lee, Bruce Y; Wong, Kim F; Bartsch, Sarah M; Yilmaz, S Levent; Avery, Taliser R; Brown, Shawn T; Song, Yeohan; Singh, Ashima; Kim, Diane S; Huang, Susan S

    2013-06-01

    As healthcare systems continue to expand and interconnect with each other through patient sharing, administrators, policy makers, infection control specialists, and other decision makers may have to take account of the entire healthcare 'ecosystem' in infection control. We developed a software tool, the Regional Healthcare Ecosystem Analyst (RHEA), that can accept user-inputted data to rapidly create a detailed agent-based simulation model (ABM) of the healthcare ecosystem (ie, all healthcare facilities, their adjoining community, and patient flow among the facilities) of any region to better understand the spread and control of infectious diseases. To demonstrate RHEA's capabilities, we fed extensive data from Orange County, California, USA, into RHEA to create an ABM of a healthcare ecosystem and simulate the spread and control of methicillin-resistant Staphylococcus aureus. Various experiments explored the effects of changing different parameters (eg, degree of transmission, length of stay, and bed capacity). Our model emphasizes how individual healthcare facilities are components of integrated and dynamic networks connected via patient movement and how occurrences in one healthcare facility may affect many other healthcare facilities. A decision maker can utilize RHEA to generate a detailed ABM of any healthcare system of interest, which in turn can serve as a virtual laboratory to test different policies and interventions.
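
A minimal sketch of the kind of agent-based patient-sharing dynamic RHEA models can be written in a few lines; everything below (parameters, the transmission rule, the ten index cases) is a hypothetical stand-in for illustration, not RHEA's actual model:

```python
import random

def simulate_patient_sharing(n_patients=200, n_facilities=5, days=30,
                             p_transfer=0.05, p_transmit=0.01, seed=42):
    """Patients move among facilities; each day a susceptible patient may
    acquire colonisation with probability 1-(1-p_transmit)^k, where k is
    the number of colonised patients sharing its facility."""
    rng = random.Random(seed)
    facility = [rng.randrange(n_facilities) for _ in range(n_patients)]
    colonised = [i < 10 for i in range(n_patients)]  # ten index cases
    for _ in range(days):
        for i in range(n_patients):            # inter-facility transfers
            if rng.random() < p_transfer:
                facility[i] = rng.randrange(n_facilities)
        pressure = [0] * n_facilities          # colonisation pressure per site
        for i in range(n_patients):
            if colonised[i]:
                pressure[facility[i]] += 1
        for i in range(n_patients):            # within-facility transmission
            if not colonised[i] and \
               rng.random() < 1 - (1 - p_transmit) ** pressure[facility[i]]:
                colonised[i] = True
    return sum(colonised)

final_colonised = simulate_patient_sharing()
```

Even at this toy scale, raising `p_transfer` couples the facilities more tightly, illustrating the paper's point that occurrences in one facility affect many others.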

  3. Benchmarking of RESRAD-OFFSITE : transition from RESRAD (onsite) to RESRAD-OFFSITE and comparison of the RESRAD-OFFSITE predictions with peercodes

    International Nuclear Information System (INIS)

    Yu, C.; Gnanapragasam, E.; Cheng, J.-J.; Biwer, B.

    2006-01-01

    The main purpose of this report is to document the benchmarking results and verification of the RESRAD-OFFSITE code as part of the quality assurance requirements of the RESRAD development program. This documentation will enable the U.S. Department of Energy (DOE) and its contractors, and the U.S. Nuclear Regulatory Commission (NRC) and its licensees and other stakeholders to use the quality-assured version of the code to perform dose analysis in a risk-informed and technically defensible manner to demonstrate compliance with the NRC's License Termination Rule, Title 10, Part 20, Subpart E, of the Code of Federal Regulations (10 CFR Part 20, Subpart E); DOE's 10 CFR Part 834, Order 5400.5, "Radiation Protection of the Public and the Environment"; and other Federal and State regulatory requirements as appropriate. The other purpose of this report is to document the differences and similarities between the RESRAD (onsite) and RESRAD-OFFSITE codes so that users (dose analysts and risk assessors) can make a smooth transition from use of the RESRAD (onsite) code to use of the RESRAD-OFFSITE code for performing both onsite and offsite dose analyses. The evolution of the RESRAD-OFFSITE code from the RESRAD (onsite) code is described in Chapter 1 to help the dose analyst and risk assessor make a smooth conceptual transition from the use of one code to that of the other. Chapter 2 provides a comparison of the predictions of RESRAD (onsite) and RESRAD-OFFSITE for an onsite exposure scenario. Chapter 3 documents the results of benchmarking RESRAD-OFFSITE's atmospheric transport and dispersion submodel against the U.S. Environmental Protection Agency's (EPA's) CAP88-PC (Clean Air Act Assessment Package-1988) and ISCLT3 (Industrial Source Complex-Long Term) models. Chapter 4 documents the comparison results of the predictions of the RESRAD-OFFSITE code and its submodels with the predictions of peer models. This report was prepared by Argonne National Laboratory's (Argonne

  4. NATO Guide for Judgement-Based Operational Analysis in Defence Decision Making (Guide OTAN pour l’analyse operationnelle basee sur le jugement dans la prise de decision de defense). Analyst-Oriented Volume: Code of Best Practice for Soft Operational Analysis

    Science.gov (United States)

    2012-06-01

    different stages of the cognitive system [3], such as the following: • The ease with which information can be recalled from memory affects how frequently... colouring is used to denote the three different text box types. The TG has restricted itself in referencing the material in the main text in order to...1: Procedure for Interpreting Problematic Situations and Identifying Their Nature. Recall from Chapter 4 that the analyst very often has to ‘prove

  5. Rationale and design of the participant, investigator, observer, and data-analyst-blinded randomized AGENDA trial on associations between gene-polymorphisms, endophenotypes for depression and antidepressive intervention: the effect of escitalopram versus placebo on the combined dexamethasone-corticotrophine releasing hormone test and other potential endophenotypes in healthy first-degree relatives of persons with depression

    DEFF Research Database (Denmark)

    Knorr, Ulla; Vinberg, Maj; Klose, Marianne

    2009-01-01

    from baseline to the end of intervention. METHODS: The AGENDA trial is designed as a participant, investigator, observer, and data-analyst-blinded randomized trial. Participants are 80 healthy first-degree relatives of patients with depression. Participants are randomized to escitalopram 10 mg per day...

  6. To Your Health: NLM update transcript - Improving the international response to infectious illness

    Science.gov (United States)

    ... data analysts to estimate and predict the eventual importance of the initial cases to public health ...

  7. Credit risk evaluation based on social media.

    Science.gov (United States)

    Yang, Yang; Gu, Jing; Zhou, Zongfang

    2016-07-01

    Social media has been playing an increasingly important role in the sharing of individuals' opinions on many financial issues, including credit risk in investment decisions. This paper analyzes whether these opinions, which are transmitted through social media, can accurately predict enterprises' future credit risk. We consider financial-statement-oriented evaluation results based on logit and probit approaches as the benchmarks. We then conduct textual analysis to retrieve both posts and their corresponding commentaries published on two of the most popular social media platforms for financial investors in China. Professional advice from financial analysts is also investigated in this paper. We surprisingly find that the opinions extracted from both posts and commentaries surpass the opinions of analysts in terms of credit risk prediction. Copyright © 2015 Elsevier Inc. All rights reserved.
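
The logit benchmark the authors start from can be sketched on synthetic data with scikit-learn; the three "financial ratio" features and the credit-event rule below are invented purely for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# hypothetical ratios: leverage, liquidity, profitability
X = rng.normal(size=(300, 3))
# synthetic "credit event" label, driven mainly by the first feature
y = (X[:, 0] + 0.5 * rng.normal(size=300) > 0.8).astype(int)

logit = LogisticRegression().fit(X, y)
default_prob = logit.predict_proba(X)[:, 1]  # fitted event probabilities
accuracy = logit.score(X, y)
```

A probit benchmark differs only in the link function; in practice one would score out-of-sample rather than in-sample as done here for brevity.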

  8. Household Consumption and Expenditures Surveys (HCES): a primer for food and nutrition analysts in low- and middle-income countries.

    Science.gov (United States)

    Fiedler, John L; Lividini, Keith; Bermudez, Odilia I; Smitz, Marc-Francois

    2012-09-01

    The dearth of 24-hour recall and observed-weighed food record data--what most nutritionists regard as the gold standard source of food consumption data--has long been an obstacle to evidence-based food and nutrition policy. There have been a steadily growing number of studies using household food acquisition and consumption data from a variety of multipurpose, nationally representative household surveys as a proxy measure to overcome this fundamental information gap. To describe the key characteristics of these increasingly available Household Consumption and Expenditures Surveys (HCES) in order to help familiarize food and nutrition analysts with the strengths and shortcomings of these data and thus encourage their use in low- and middle-income countries; and to identify common shortcomings that can be readily addressed in the near term in a country-by-country approach, as new HCES are fielded, thereby beginning a process of improving the potential of these surveys as sources of useful data for better understanding food- and nutrition-related issues. Common characteristics of key food and nutrition information that is available in HCES and some basic common steps in processing HCES data for food and nutrition analyses are described. The common characteristics of these surveys are documented, and their usefulness in addressing major food and nutrition issues, as well as their shortcomings, is demonstrated. Despite their limitations, the use of HCES data constitutes a generally unexploited opportunity to address the food consumption information gap by using survey data that most countries are already routinely collecting.
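
The "basic common steps" of HCES processing, merging acquisitions with the household roster and converting to per-capita daily amounts, might look like this in pandas; the toy rows and column names are assumptions, not a real HCES schema:

```python
import pandas as pd

# toy HCES extract: household food acquisitions over a 7-day recall period
acquisitions = pd.DataFrame({
    "household_id": [1, 1, 2, 2, 3],
    "food_item": ["maize flour", "oil", "maize flour", "rice", "rice"],
    "quantity_kg": [3.5, 0.5, 5.0, 2.0, 1.5],
})
roster = pd.DataFrame({"household_id": [1, 2, 3], "members": [4, 6, 2]})

# merge with the roster, then convert to per-capita daily amounts
merged = acquisitions.merge(roster, on="household_id")
merged["kg_per_capita_day"] = merged["quantity_kg"] / merged["members"] / 7
per_item = merged.groupby("food_item")["kg_per_capita_day"].mean()
```

Real analyses would also handle food away from home, edible portions, and conversion to nutrients, which is where the shortcomings discussed above arise.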

  9. Validation of the Air Force Weather Agency Ensemble Prediction Systems

    Science.gov (United States)

    2014-03-27

    by Mr. Evan L. Kuchera. Also, I would like to express my gratitude to Mr. Jeff H. Zaunter for painstakingly working with me to provide station...my fellow AFIT classmates, Capt Jeremy J. Hromsco, Capt Haley A. Homan, Capt Kyle R. Thurmond and 2Lt Coy C. Fischer for their support and...Codes. The raw METARs and SPECIs were decoded and provided for this research by Mr. Jeff Zautner, 14/WS Meteorologist, Tailored Product Analyst

  10. Predicting Near-Term Water Quality from Satellite Observations of Watershed Conditions

    Science.gov (United States)

    Weiss, W. J.; Wang, L.; Hoffman, K.; West, D.; Mehta, A. V.; Lee, C.

    2017-12-01

    Despite the strong influence of watershed conditions on source water quality, most water utilities and water resource agencies do not currently have the capability to monitor watershed sources of contamination with great temporal or spatial detail. Typically, knowledge of source water quality is limited to periodic grab sampling; automated monitoring of a limited number of parameters at a few select locations; and/or monitoring relevant constituents at a treatment plant intake. While important, such observations are not sufficient to inform proactive watershed or source water management at a monthly or seasonal scale. Satellite remote sensing data, on the other hand, can provide a snapshot of an entire watershed at regular, sub-monthly intervals, helping analysts characterize watershed conditions and identify trends that could signal changes in source water quality. Accordingly, the authors are investigating correlations between satellite remote sensing observations of watersheds and source water quality, at a variety of spatial and temporal scales and lags. While correlations between remote sensing observations and direct in situ measurements of water quality have been well described in the literature, there are few studies that link remote sensing observations across a watershed with near-term predictions of water quality. In this presentation, the authors will describe results of statistical analyses and discuss how these results are being used to inform development of a desktop decision support tool to support predictive application of remote sensing data. Predictor variables under evaluation include parameters that describe vegetative conditions; parameters that describe climate/weather conditions; and non-remote sensing, in situ measurements. Water quality parameters under investigation include nitrogen, phosphorus, organic carbon, chlorophyll-a, and turbidity.
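
The lag-and-correlate analysis described above can be sketched with NumPy; the series below are synthetic, with the water-quality signal deliberately built to respond two time steps after the watershed indicator:

```python
import numpy as np

def lagged_correlation(watershed_signal, water_quality, max_lag):
    """Pearson correlation between a watershed indicator and a downstream
    water-quality series for each lead time (lag)."""
    corrs = {}
    for lag in range(max_lag + 1):
        x = watershed_signal[:-lag] if lag else watershed_signal
        corrs[lag] = float(np.corrcoef(x, water_quality[lag:])[0, 1])
    return corrs

t = np.arange(48)
ndvi = np.sin(t / 4.0)              # synthetic vegetation index
turbidity = np.sin((t - 2) / 4.0)   # responds two steps later by design
corrs = lagged_correlation(ndvi, turbidity, max_lag=4)
best_lag = max(corrs, key=corrs.get)  # lead time with strongest correlation
```

The lag maximising the correlation is the usable forecasting lead time; with real data one would also guard against spurious correlation from shared seasonality.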

  11. Deployed Analyst Handbook

    Science.gov (United States)

    2016-06-01

    layered approach to data verification. One should use quality control measures throughout the data management process, from the entry of an initial...intermediate milestones aid in project management and ensure customer satisfaction).  Communications. One should establish recurrent communication with the...distribution across an area. ......................................71 CAA-2015094 vii Figure 22. Clustered Bars – Illustrate a rank order among

  12. Recommendations for strengthening the infrared technology component of any condition monitoring program

    Science.gov (United States)

    Nicholas, Jack R., Jr.; Young, R. K.

    1999-03-01

    This presentation provides insights of a long-term 'champion' of many condition monitoring technologies and a Level III infrared thermographer. The co-authors present recommendations based on their observations of infrared and other components of predictive, condition monitoring programs in manufacturing, utility and government defense and energy activities. As predictive maintenance service providers, trainers, informal observers and formal auditors of such programs, the co-authors provide a unique perspective that can be useful to practitioners, managers and customers of advanced programs. Each has over 30 years' experience in the field of machinery operation, maintenance, and support, the origins of which can be traced to and through the demanding requirements of the U.S. Navy nuclear submarine forces. They have over 10 years each of experience with programs in many different countries on 3 continents. Recommendations are provided on the following: (1) Leadership and Management Support (For survival); (2) Life Cycle View (For establishment of a firm and stable foundation for a program); (3) Training and Orientation (For thermographers as well as operators, managers and others); (4) Analyst Flexibility (To innovate, explore and develop their understanding of machinery condition); (5) Reports and Program Justification (For program visibility and continued expansion); (6) Commitment to Continuous Improvement of Capability and Productivity (Through application of updated hardware and software); (7) Mutual Support by Analysts (By those inside and outside of the immediate organization); (8) Use of Multiple Technologies and System Experts to Help Define Problems (Through the use of correlation analysis of data from up to 15 technologies. An example correlation analysis table for AC and DC motors is provided.); (9) Root Cause Analysis (Allows a shift from reactive to proactive stance for a program); (10) Master Equipment Identification and Technology Application (To

  13. Spreadsheets for business process management : Using process mining to deal with “events” rather than “numbers”?

    NARCIS (Netherlands)

    van der Aalst, Wil

    2018-01-01

    Purpose: Process mining provides a generic collection of techniques to turn event data into valuable insights, improvement ideas, predictions, and recommendations. This paper uses spreadsheets as a metaphor to introduce process mining as an essential tool for data scientists and business analysts.
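
The "events rather than numbers" idea can be made concrete with the most basic process-mining primitive, the directly-follows relation, computed here from a toy event log (an illustration of the concept, not the paper's own tooling):

```python
from collections import Counter

def directly_follows(event_log):
    """Count the directly-follows relation a -> b over all traces.
    event_log: dict case_id -> list of (timestamp, activity) pairs,
    the minimal 'event data' a process-mining technique consumes."""
    counts = Counter()
    for events in event_log.values():
        trace = [activity for _, activity in sorted(events)]
        for a, b in zip(trace, trace[1:]):
            counts[(a, b)] += 1
    return counts

log = {
    "order-1": [(1, "create"), (2, "check"), (3, "ship")],
    "order-2": [(1, "create"), (2, "ship"), (3, "check")],  # deviating case
    "order-3": [(1, "create"), (2, "check"), (3, "ship")],
}
dfg = directly_follows(log)
```

The resulting counts are the arcs of a directly-follows graph, from which discovery algorithms build process models and conformance checks flag deviating cases such as "order-2".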

  14. Predictive Factors in Conflict: Assessing the Likelihood of a Preemptive Strike by Israel on Iran Using a Computer Model

    Science.gov (United States)

    2013-03-01

    Proliferation Treaty OSINT Open Source Intelligence SAFF Safing, Arming, Fuzing, Firing SIAM Situational Influence Assessment Module SME Subject...expertise. One of the analysts can also be trained to tweak CAST logic as needed. In this initial build, only open-source intelligence (OSINT) will

  15. Can measures of the consumer debt burden reliably predict an economic slowdown?

    OpenAIRE

    C. Alan Garner

    1996-01-01

    Some analysts and business executives are becoming concerned that recent increases in the consumer debt burden may foreshadow an economic slowdown. Higher debt increases the risk that a household may experience financial distress in the event of an adverse economic shock, such as the loss of a job or large uninsured medical expenses. As the risk of financial distress rises, households may become less willing to spend on consumer goods, particularly big ticket items such as automobiles and hom...

  16. Psychology of communications

    International Nuclear Information System (INIS)

    Hunns, D.M.

    1980-01-01

    A theory is proposed relating to the structuring of mental models, and this theory is used to account for a number of human error mechanisms. Communications amongst operators and the systems around them are seen as a vital factor in the area of human error, and a technique, communications analysis, is proposed as one approach to systematically predicting the ways in which the actual system state and the operators' perceptions of that state can get out of step and lead to catastrophe. To be most effective, it is expected that the analyst would apply communications analysis with an interactive computer system. Of particular importance is the ability to trace the operator-system communication scenarios in various abnormal system configurations. (orig.)

  17. Plus C'est La Meme Chose? Questioning Crop Diversification as a Response to Agricultural Deregulation in Saskatchewan, Canada

    Science.gov (United States)

    Bradshaw, Ben

    2004-01-01

    In the context of declining government subsidization of agriculture, many analysts have predicted reversals in certain characteristic trends of post-1945 Western agriculture with positive implications for agroecosystem well-being. One example, investigated herein, is the suggestion that, in the absence of government safety nets, farmers will seek…

  18. Human factors analysis and design methods for nuclear waste retrieval systems. Volume III. User's guide for the computerized event-tree analysis technique

    International Nuclear Information System (INIS)

    Casey, S.M.; Deretsky, Z.

    1980-08-01

    This document provides detailed instructions for using the Computerized Event-Tree Analysis Technique (CETAT), a program designed to assist a human factors analyst in predicting event probabilities in complex man-machine configurations found in waste retrieval systems. The instructions contained herein describe how to (a) identify the scope of a CETAT analysis, (b) develop operator performance data, (c) enter an event-tree structure, (d) modify a data base, and (e) analyze event paths and man-machine system configurations. Designed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities, CETAT simplifies the tasks of the experienced systems analyst by organizing large amounts of data and performing cumbersome and time consuming arithmetic calculations. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and evaluate alternative equipment designs and operator tasks. As with any automated technique, however, the value of the output will be a function of the knowledge and skill of the analyst using the program
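
CETAT itself is not publicly available, but the core arithmetic it automates, multiplying branch probabilities along event-tree paths, can be sketched in a few lines; the branch probabilities below are arbitrary examples, not data from the report:

```python
def event_tree_paths(success_probs):
    """Enumerate all end states of a sequential event tree: each path is a
    tuple of True (success) / False (failure) outcomes mapped to its
    probability, assuming independent branch events."""
    paths = {(): 1.0}
    for p in success_probs:
        paths = {path + (ok,): prob * (p if ok else 1.0 - p)
                 for path, prob in paths.items() for ok in (True, False)}
    return paths

# three sequential operator/equipment events (illustrative probabilities)
paths = event_tree_paths([0.99, 0.9, 0.95])
all_fail = paths[(False, False, False)]  # probability every event fails
```

A tool like CETAT adds what this sketch omits: operator performance data, dependency between events, and bookkeeping for trees far too large to enumerate by hand.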

  19. An analyst's self-analysis.

    Science.gov (United States)

    Calder, K T

    1980-01-01

    I have told you why I selected the topic of self-analysis, and I have described my method for it: of recording primary data such as dreams, daydreams, memories, and symptoms and of recording associations to this primary data, followed by an attempt at analyzing this written material. I have described a dream, a memory and a daydream which is also a symptom, each of which primary data I found useful in understanding myself. Finally, I reached some conclusions regarding the uses of self-analysis, including self-analysis as a research tool.

  20. The Effect of Latent Binary Variables on the Uncertainty of the Prediction of a Dichotomous Outcome Using Logistic Regression Based Propensity Score Matching.

    Science.gov (United States)

    Szekér, Szabolcs; Vathy-Fogarassy, Ágnes

    2018-01-01

    Logistic regression based propensity score matching is a widely used method in case-control studies to select the individuals of the control group. This method creates a suitable control group if all factors affecting the output variable are known. However, if relevant latent variables exist as well, which are not taken into account during the calculations, the quality of the control group is uncertain. In this paper, we present a statistics-based study in which we try to determine the relationship between the accuracy of the logistic regression model and the uncertainty of the dependent variable of the control group defined by propensity score matching. Our analyses show that there is a linear correlation between the fit of the logistic regression model and the uncertainty of the output variable. In certain cases, a latent binary explanatory variable can result in a relative error of up to 70% in the prediction of the outcome variable. The observed phenomenon calls the attention of analysts to an important point, which must be taken into account when deducting conclusions.
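
The matching procedure under study can be sketched with scikit-learn; the data-generating process below is synthetic and purely illustrative, and a real analysis would add calipers and covariate-balance checks:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def propensity_match(X, treated):
    """For each treated individual, pick the control whose estimated
    propensity score is closest (1-NN matching on the score)."""
    scores = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
    treated_idx = np.where(treated == 1)[0]
    control_idx = np.where(treated == 0)[0]
    matches = {
        int(i): int(control_idx[np.argmin(np.abs(scores[control_idx] - scores[i]))])
        for i in treated_idx
    }
    return matches, scores

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))  # observed covariates only
treated = (X[:, 0] + rng.normal(scale=0.5, size=200) > 0.5).astype(int)
matches, scores = propensity_match(X, treated)
```

The paper's point maps directly onto this sketch: the noise term standing in for a latent variable is invisible to the logistic model, so the matched controls are only as good as the model's fit.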

  1. Using self-organizing maps to determine observation threshold limit predictions in highly variant data

    Science.gov (United States)

    Paganoni, C.A.; Chang, K.C.; Robblee, M.B.

    2006-01-01

    A significant data quality challenge for highly variant systems surrounds the limited ability to quantify operationally reasonable limits on the data elements being collected and provide reasonable threshold predictions. In many instances, the number of influences that drive a resulting value or operational range is too large to enable physical sampling for each influencer, or is too complicated to accurately model in an explicit simulation. An alternative method to determine reasonable observation thresholds is to employ an automation algorithm that would emulate a human analyst visually inspecting data for limits. Using the visualization technique of self-organizing maps (SOM) on data having poorly understood relationships, a methodology for determining threshold limits was developed. To illustrate this approach, analysis of environmental influences that drive the abundance of a target indicator species (the pink shrimp, Farfantepenaeus duorarum) provided a real example of applicability. The relationship between salinity and temperature and abundance of F. duorarum is well documented, but the effect of changes in water quality upstream on pink shrimp abundance is not well understood. The highly variant nature surrounding catch of a specific number of organisms in the wild, and the data available from upstream hydrology measures for salinity and temperature, made this an ideal candidate for the approach to provide a determination about the influence of changes in hydrology on populations of organisms.
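
A minimal 1-D self-organizing map, written from scratch rather than with the authors' tooling, illustrates how trained prototype vectors can bracket an observation range and suggest candidate threshold limits; all parameters and the synthetic "environmental variable" are illustrative assumptions:

```python
import numpy as np

def train_som(data, n_units=8, epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal 1-D self-organizing map: prototype vectors on a line,
    trained with a shrinking learning rate and Gaussian neighbourhood."""
    rng = np.random.default_rng(seed)
    w = rng.uniform(data.min(), data.max(), size=(n_units, data.shape[1]))
    for e in range(epochs):
        lr = lr0 * (1.0 - e / epochs)
        sigma = max(sigma0 * (1.0 - e / epochs), 0.5)
        for x in data[rng.permutation(len(data))]:
            bmu = int(np.argmin(np.linalg.norm(w - x, axis=1)))  # best unit
            grid_dist = np.abs(np.arange(n_units) - bmu)
            h = np.exp(-(grid_dist ** 2) / (2.0 * sigma ** 2))
            w += lr * h[:, None] * (x - w)   # pull units towards the sample
    return w

# synthetic observations standing in for a salinity-like variable
obs = np.random.default_rng(2).normal(20.0, 2.0, 500)[:, None]
units = train_som(obs)
low, high = float(units.min()), float(units.max())  # candidate threshold band
```

After training, the extreme units sit near the edges of the observed density, which is the sense in which a SOM can emulate an analyst eyeballing reasonable limits.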

  2. Stress Voiding in IC Interconnects - Rules of Evidence for Failure Analysts

    Energy Technology Data Exchange (ETDEWEB)

    FILTER, WILLIAM F.

    1999-09-17

    Mention the words "stress voiding", and everyone from technology engineer to manager to customer is likely to cringe. This IC failure mechanism elicits fear because it is insidious, capricious, and difficult to identify and arrest. There are reasons to believe that a damascene-copper future might be void-free. Nevertheless, engineers who continue to produce ICs with Al-alloy interconnects, or who assess the reliability of legacy ICs with long service life, need up-to-date insights and techniques to deal with stress voiding problems. Stress voiding need not be fearful. Not always predictable, neither is it inevitable. On the contrary, stress voids are caused by specific, avoidable processing errors. Analytical work, though often painful, can identify these errors when stress voiding occurs, and vigilance in monitoring the improved process can keep it from recurring. In this article, they show that a methodical, forensics approach to failure analysis can solve suspected cases of stress voiding. This approach uses new techniques, and patiently applies familiar ones, to develop evidence meeting strict standards of proof.

  3. The Potential of Economic Model Predictive Control for Spray Drying Plants

    DEFF Research Database (Denmark)

    Petersen, Lars Norbert; Poulsen, Niels Kjølstad; Niemann, Hans Henrik

    In 2015 the milk quota system in the European Union will be completely liberalized. As a result, analysts expect production of skimmed and whole milk powder to increase by 5-6% while its price will decline by about 6-7%. Multi-stage spray drying is the prime process for the production of food...... powders. The process is highly energy consuming and capacity depends among other factors on correct control of the dryer. Consequently efficient control and optimization of the spray drying process has become increasingly important to accommodate the future market challenges. The goal of the presentation...

  4. Predicting Ambulance Time of Arrival to the Emergency Department Using Global Positioning System and Google Maps

    Science.gov (United States)

    Fleischman, Ross J.; Lundquist, Mark; Jui, Jonathan; Newgard, Craig D.; Warden, Craig

    2014-01-01

    Objective To derive and validate a model that accurately predicts ambulance arrival time that could be implemented as a Google Maps web application. Methods This was a retrospective study of all scene transports in Multnomah County, Oregon, from January 1 through December 31, 2008. Scene and destination hospital addresses were converted to coordinates. ArcGIS Network Analyst was used to estimate transport times based on street network speed limits. We then created a linear regression model to improve the accuracy of these street network estimates using weather, patient characteristics, use of lights and sirens, daylight, and rush-hour intervals. The model was derived from a 50% sample and validated on the remainder. Significance of the covariates was determined by p-values; accuracy was assessed against transport times recorded by computer-aided dispatch. We then built a Google Maps-based web application to demonstrate application in real-world EMS operations. Results There were 48,308 included transports. Street network estimates of transport time were accurate within 5 minutes of actual transport time less than 16% of the time. Actual transport times were longer during daylight and rush-hour intervals and shorter with use of lights and sirens. Age under 18 years, gender, wet weather, and trauma system entry were not significant predictors of transport time. Our model predicted arrival time within 5 minutes 73% of the time. For lights and sirens transports, accuracy was within 5 minutes 77% of the time. Accuracy was identical in the validation dataset. Lights and sirens saved an average of 3.1 minutes for transports under 8.8 minutes, and 5.3 minutes for longer transports. Conclusions An estimate of transport time based only on a street network significantly underestimated transport times. A simple model incorporating few variables can predict ambulance time of arrival to the emergency department with good accuracy. This model could be linked to global positioning system data and an automated Google Maps web
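
The two-stage idea, a street-network estimate corrected by a linear regression on covariates such as lights-and-sirens use and rush hour, can be sketched on synthetic data; the coefficients and noise below are invented, not the study's:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
n = 1000
network_est = rng.uniform(5, 30, n)      # street-network estimate (minutes)
lights_sirens = rng.integers(0, 2, n)
rush_hour = rng.integers(0, 2, n)
# synthetic "actual" transport times: network estimate biased low,
# faster with lights and sirens, slower at rush hour
actual = (1.2 * network_est - 3.0 * lights_sirens
          + 2.0 * rush_hour + rng.normal(0, 2, n))

X = np.column_stack([network_est, lights_sirens, rush_hour])
pred = LinearRegression().fit(X, actual).predict(X)

# share of estimates within 5 minutes, before and after correction
within5_raw = float(np.mean(np.abs(network_est - actual) <= 5))
within5_model = float(np.mean(np.abs(pred - actual) <= 5))
```

As in the study, the raw network estimate is systematically off, and the regression correction sharply raises the within-5-minutes hit rate.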

  5. The prediction of the LWR plant accident based on the measured plant data

    International Nuclear Information System (INIS)

    Miettinen, J.; Schmuck, P.

    2005-01-01

    In case of an accident affecting a nuclear reactor, it is essential to anticipate the possible development of the situation in order to carry out emergency response actions efficiently, i.e. first to be warned early and to obtain, as far as possible, sufficient information on the plant. The ASTRID (Assessment of Source Term for Emergency Response based on Installation Data) project consists in developing a methodology of expertise to structure the work of technical teams and to facilitate cross-competence communication among emergency-preparedness players, and a qualified computer tool that could be used in common by European countries to reliably predict the source term in case of an accident in a light water reactor, using the information available on the plant. In many accident conditions the team of analysts may be located far from the plant experiencing the accident, and their decision making is based on on-line plant data transmitted to the crisis centre at intervals of 30-600 seconds. The plant condition has to be diagnosed from this information. In the ASTRID project the plant status diagnostics has been studied for the European reactor types, including BWR, PWR and VVER plants. The directly measured plant data may be used to estimate the size and location of a break in the primary system. The break-size prediction may be based on the pressurizer level, reactor vessel level, primary pressure and, in the case of a steam generator tube rupture, the steam generator level. In the ASTRID project the break-prediction concept was developed, and its validity for different plant types is presented in this paper, with the plant data created using plant-specific thermohydraulic simulation models. The tracking simulator attempts to follow the plant behaviour on-line based on the measured plant data for the main process parameters and the most important boundary conditions. 
When the plant state tracking fails, the plant may be experiencing an accident, and the tracking

  6. Comitê de Auditoria versus Conselho Fiscal Adaptado: a visão dos analistas de mercado e dos executivos das empresas que possuem ADRs Audit Committee versus Adapted Fiscal Council: the point of view of market analysts and executives of companies with ADRs

    Directory of Open Access Journals (Sweden)

    Fernanda Furuta

    2010-08-01

    Full Text Available This study aims to obtain the opinions of market analysts and of executives of companies that operate in Brazil and trade their securities in the North American market regarding the formation of an Audit Committee or an adapted Fiscal Council. To that end, questionnaires were applied and interviews conducted. Most executives of companies that formed an Audit Committee indicated the level of corporate governance as one of the factors that most influenced the decision to form one body or the other. On the other hand, most executives of companies that formed an adapted Fiscal Council indicated, in addition to the level of corporate governance, the fact of being audited by one of the Big4 and the company's classification by aggregate market value as factors that influenced their decisions. There was no consensus on whether the Fiscal Council is more adaptable than the Audit Committee to the Brazilian business environment, whether the functions of the two bodies are distinct, or whether the costs associated with forming an Audit Committee are relevant. It can therefore be concluded that, in some respects, the perceptions of market analysts and company executives differ considerably.

  7. A Pilot Comparative Study of Quantitative Ultrasound, Conventional Ultrasound, and MRI for Predicting Histology-Determined Steatosis Grade in Adult Nonalcoholic Fatty Liver Disease.

    Science.gov (United States)

    Paige, Jeremy S; Bernstein, Gregory S; Heba, Elhamy; Costa, Eduardo A C; Fereirra, Marilia; Wolfson, Tanya; Gamst, Anthony C; Valasek, Mark A; Lin, Grace Y; Han, Aiguo; Erdman, John W; O'Brien, William D; Andre, Michael P; Loomba, Rohit; Sirlin, Claude B

    2017-05-01

    The purpose of this study is to explore the diagnostic performance of two investigational quantitative ultrasound (QUS) parameters, attenuation coefficient and backscatter coefficient, in comparison with conventional ultrasound (CUS) and MRI-estimated proton density fat fraction (PDFF) for predicting histology-confirmed steatosis grade in adults with nonalcoholic fatty liver disease (NAFLD). In this prospectively designed pilot study, 61 adults with histology-confirmed NAFLD were enrolled from September 2012 to February 2014. Subjects underwent QUS, CUS, and MRI examinations within 100 days of clinical-care liver biopsy. QUS parameters (attenuation coefficient and backscatter coefficient) were estimated using a reference phantom technique by two analysts independently. Three-point ordinal CUS scores intended to predict steatosis grade (1, 2, or 3) were generated independently by two radiologists on the basis of CUS features. PDFF was estimated using an advanced chemical shift-based MRI technique. Using histologic examination as the reference standard, ROC analysis was performed. Optimal attenuation coefficient, backscatter coefficient, and PDFF cutoff thresholds were identified, and the accuracy of attenuation coefficient, backscatter coefficient, PDFF, and CUS to predict steatosis grade was determined. Interobserver agreement for attenuation coefficient, backscatter coefficient, and CUS was analyzed. CUS had 51.7% grading accuracy. The raw and cross-validated steatosis grading accuracies were 61.7% and 55.0%, respectively, for attenuation coefficient, 68.3% and 68.3% for backscatter coefficient, and 76.7% and 71.3% for MRI-estimated PDFF. Interobserver agreements were 53.3% for CUS (κ = 0.61), 90.0% for attenuation coefficient (κ = 0.87), and 71.7% for backscatter coefficient (κ = 0.82). In this pilot cohort, the QUS parameters and MRI-estimated PDFF predicted hepatic steatosis grade in patients with NAFLD more accurately than CUS.
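
The cutoff-threshold grading evaluated in this study can be sketched as follows. The thresholds and measurements below are invented for illustration; they are not the study's optimized cutoffs.

```python
import numpy as np

# Toy sketch: grade steatosis (1, 2, or 3) from a continuous parameter
# via two cutoffs. Cutoffs and data are invented, not the study's values.
def grade(value, cutoffs=(0.6, 0.8)):
    """Map a parameter (e.g. an attenuation coefficient) to grade 1, 2, or 3."""
    lo, hi = cutoffs
    if value < lo:
        return 1
    if value < hi:
        return 2
    return 3

measurements = np.array([0.45, 0.55, 0.65, 0.75, 0.85, 0.95])
histology = np.array([1, 1, 2, 2, 3, 3])          # reference standard grades
predicted = np.array([grade(v) for v in measurements])
accuracy = float(np.mean(predicted == histology))  # fraction graded correctly
```

In the study, cutoffs were chosen by ROC analysis on a derivation set and accuracy was cross-validated; the sketch simply applies fixed cutoffs.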

  8. Network discovery, characterization, and prediction : a grand challenge LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, W. Philip, Jr.

    2010-11-01

    This report is the final summation of Sandia's Grand Challenge LDRD Project No. 119351, 'Network Discovery, Characterization and Prediction' (the 'NGC'), which ran from FY08 to FY10. The aim of the NGC, in a nutshell, was to research, develop, and evaluate relevant analysis capabilities that address adversarial networks. Unlike in some Grand Challenge efforts, that ambition created cultural subgoals as well as technical and programmatic ones, as the insistence on 'relevancy' required that the Sandia informatics research communities and the analyst user communities come to appreciate each other's needs and capabilities in a very deep and concrete way. The NGC generated a number of technical, programmatic, and cultural advances, detailed in this report. There were new algorithmic insights and research that resulted in fifty-three refereed publications and presentations; this report concludes with an abstract-annotated bibliography pointing to them all. The NGC generated three substantial prototypes that not only achieved their intended goals of testing our algorithmic integration, but which also served as vehicles for customer education and program development. The NGC, as intended, has catalyzed future work in this domain; by the end it had already brought in, in new funding, as much funding as had been invested in it. Finally, the NGC knit together previously disparate research staff and user expertise in a fashion that not only addressed our immediate research goals, but which promises to have created an enduring cultural legacy of mutual understanding, in service of Sandia's national security responsibilities in cybersecurity and counterproliferation.

  9. Predicting the Consequences of MMOD Penetrations on the International Space Station

    Science.gov (United States)

    Hyde, James; Christiansen, E.; Lear, D.; Evans

    2018-01-01

    The threat from micrometeoroid and orbital debris (MMOD) impacts on space vehicles is often quantified in terms of the probability of no penetration (PNP). However, for large spacecraft, especially those with multiple compartments, a penetration may have a number of possible outcomes. The extent of the damage (diameter of hole, crack length or penetration depth), the location of the damage relative to critical equipment or crew, crew response, and even the time of day of the penetration are among the many factors that can affect the outcome. For the International Space Station (ISS), a Monte-Carlo style software code called Manned Spacecraft Crew Survivability (MSCSurv) is used to predict the probability of several outcomes of an MMOD penetration-broadly classified as loss of crew (LOC), crew evacuation (Evac), loss of escape vehicle (LEV), and nominal end of mission (NEOM). By generating large numbers of MMOD impacts (typically in the billions) and tracking the consequences, MSCSurv allows for the inclusion of a large number of parameters and models as well as enabling the consideration of uncertainties in the models and parameters. MSCSurv builds upon the results from NASA's Bumper software (which provides the probability of penetration and critical input data to MSCSurv) to allow analysts to estimate the probability of LOC, Evac, LEV, and NEOM. This paper briefly describes the overall methodology used by NASA to quantify LOC, Evac, LEV, and NEOM with particular emphasis on describing in broad terms how MSCSurv works and its capabilities and most significant models.
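
The Monte-Carlo outcome-tallying idea behind MSCSurv can be sketched as below. The hole-size distribution, location model, and classification rules here are toy assumptions invented for illustration; they are not NASA's actual models.

```python
import random
from collections import Counter

OUTCOMES = ("LOC", "Evac", "LEV", "NEOM")

def classify(hole_diameter_cm, near_crew):
    """Toy classification of a penetration's consequence (invented rules)."""
    if hole_diameter_cm > 4.0 and near_crew:
        return "LOC"   # loss of crew
    if hole_diameter_cm > 4.0:
        return "Evac"  # crew evacuation
    if near_crew:
        return "LEV"   # loss of escape vehicle (toy stand-in rule)
    return "NEOM"      # nominal end of mission

def simulate(n_impacts, seed=1):
    """Sample many penetrations and tally outcome probabilities."""
    rng = random.Random(seed)
    tally = Counter()
    for _ in range(n_impacts):
        hole = rng.expovariate(1.0)       # toy hole-size distribution (cm)
        near_crew = rng.random() < 0.10   # toy damage-location model
        tally[classify(hole, near_crew)] += 1
    return {k: tally[k] / n_impacts for k in OUTCOMES}

probs = simulate(100_000)
```

MSCSurv runs billions of such samples with far richer damage, location, and crew-response models, seeded by penetration probabilities from NASA's Bumper code; the sketch only shows the tallying structure.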

  10. Automated Sunspot Detection and Classification Using SOHO/MDI Imagery

    Science.gov (United States)

    2015-03-01

    to the geocentric North). 3. Focus and size of the solar disk is adjusted to fit an 18 cm diameter circle on the worksheet. 4. Analyst hand draws the...General Nature of the Sunspot,” The Astrophysical Journal 230, 905–913 (1979). 14. Wheatland, M. S., “A Bayesian Approach to Solar Flare Prediction,” The

  11. The Start of a Tech Revolution

    Science.gov (United States)

    Dyrli, Kurt O.

    2009-01-01

    We are at the start of a revolution in the use of computers, one that analysts predict will rival the development of the PC in its significance. Companies such as Google, HP, Amazon, Sun Microsystems, Sony, IBM, and Apple are orienting their entire business models toward this change, and software maker SAS has announced plans for a $70 million…

  12. Greenhouse energy consumption

    Science.gov (United States)

    Eric van Steenis

    2009-01-01

    Depending on location and luck, natural gas rates have gone from less than CAN$ 3.00 to more than CAN$ 20.00/gigajoule (GJ). Natural gas rates are currently around CAN$ 13.00/GJ, although industry "analysts" predict an increase. A gigajoule is equivalent to the energy released by the combustion of approximately 30 L (8 gal) of gasoline. It is also equivalent...

  13. Relatedness, national borders, perceptions of firms and the value of their innovations

    Science.gov (United States)

    Castor, Adam R.

    The main goal of this dissertation is to better understand how external corporate stakeholder perceptions of relatedness affect important outcomes for companies. In pursuit of this goal, I apply the lens of category studies. Categories not only help audiences to distinguish between members of different categories, they also convey patterns of relatedness. In turn, this may have implications for understanding how audiences search, what they attend to, and how the members are ultimately valued. In the first chapter, I apply insights from social psychology to show how the nationality of audience members affects the way that they cognitively group objects into similar categories. I find that the geographic location of stock market analysts affects the degree to which they will revise their earnings estimates for a given company in the wake of an earnings miss by another firm in the same industry. Foreign analysts revise their earnings estimates downward more so than do local analysts, suggesting that foreign analysts attribute the earnings miss more broadly and tend to lump companies located in the same country into larger groups than do local analysts. In the second chapter, I demonstrate that the structure of inter-category relationships can have consequential effects for the members of a focal category. Leveraging an experimental-like design, I study the outcomes of nanotechnology patents and the pattern of forward citations across multiple patent jurisdictions. I find that members of technology categories with many close category 'neighbors' are more broadly cited than members of categories with few category 'neighbors.' My findings highlight how category embeddedness and category system structure affect the outcomes of category members as well as the role that classification plays in the valuation of innovation. 
In the third chapter, I propose a novel and dynamic measure of corporate similarity that is constructed from the two-mode analyst and company coverage network

  14. Making detailed predictions makes (some) predictions worse

    Science.gov (United States)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  15. RELAP/SCDAPSIM Reactor System Simulator Development and Training for University and Reactor Applications

    International Nuclear Information System (INIS)

    Hohorst, J.K.; Allison, C.M.

    2010-01-01

    The RELAP/SCDAPSIM code, designed to predict the behaviour of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology development program called SDTP (SCDAP Development and Training Program). SDTP involves more than 60 organizations in 28 countries. One of the important applications of the code is for simulator training of university faculty and students, reactor analysts, and reactor operations and technical support staff. Examples of RELAP/SCDAPSIM-based system thermal hydraulic and severe accident simulator packages include the SAFSIM simulator developed by NECSA for the SAFARI research reactor in South Africa, university-developed simulators at the University of Mexico and Shanghai Jiao Tong University in China, and commercial VISA and RELSIM packages used for analyst and reactor operations staff training. This paper will briefly describe the different packages/facilities. (authors)

  16. RELAP/SCDAPSIM Reactor System Simulator Development and Training for University and Reactor Applications

    Energy Technology Data Exchange (ETDEWEB)

    Hohorst, J.K.; Allison, C.M. [Innovative Systems Software, 1242 South Woodruff Avenue, Idaho Falls, Idaho 83404 (United States)

    2010-07-01

    The RELAP/SCDAPSIM code, designed to predict the behaviour of reactor systems during normal and accident conditions, is being developed as part of an international nuclear technology development program called SDTP (SCDAP Development and Training Program). SDTP involves more than 60 organizations in 28 countries. One of the important applications of the code is for simulator training of university faculty and students, reactor analysts, and reactor operations and technical support staff. Examples of RELAP/SCDAPSIM-based system thermal hydraulic and severe accident simulator packages include the SAFSIM simulator developed by NECSA for the SAFARI research reactor in South Africa, university-developed simulators at the University of Mexico and Shanghai Jiao Tong University in China, and commercial VISA and RELSIM packages used for analyst and reactor operations staff training. This paper will briefly describe the different packages/facilities. (authors)

  17. G-Tunnel Welded Tuff Mining experiment evaluations

    International Nuclear Information System (INIS)

    Zimmerman, R.M.; Bellman, R.A. Jr.; Mann, K.L.; Zerga, D.P.; Fowler, M.; Johnson, J.R.

    1988-12-01

    Designers and analysts of radioactive waste repositories must be able to predict the mechanical behavior of the host rock. Sandia National Laboratories elected to conduct a mine-by in welded tuff so that predictive-type information could be obtained regarding the response of the rock to a drill and blast excavation process, where smooth blasting techniques were used. This report describes the results of the mining processes and presents and discusses the rock mass responses to the mining and ground support activities. 37 refs., 20 figs., 7 tabs

  18. Design retrofit to prevent damage due to heat transport pump operation under conditions of significant void

    Energy Technology Data Exchange (ETDEWEB)

    Lam, K F [Bruce Engineering Department, In-Service Nuclear Projects, Ontario Hydro, North York, ON (Canada)

    1991-04-01

    The purpose of this paper is to provide a general review of certain key design areas which address the safety concerns of HT pump operation under conditions of significant void. To illustrate the challenges confronting designers and analysts, some highlights from the design of a protective system to prevent damage to HT piping and pump supports at Bruce NGS 'A' are outlined. The effects of this protective system on reactor safety are also discussed. HT pump operation under conditions of significant void poses a major challenge to designers and analysts in ensuring that pump-induced vibration and its effects on the pump and piping are addressed. For an in-service station the search for a practical solution is often limited by existing station equipment design and layout. The diversity of the design verification process requires a major commitment of engineering resources to ensure all safety aspects meet the requirements of the regulatory body. Work currently under way at the Ontario Hydro Research Pump Test Complex on two-phase flow in pumps and piping may provide better prediction of vibration characteristics so that inherent conservatism in fatigue life prediction of HT system components can be reduced.

  19. Design retrofit to prevent damage due to heat transport pump operation under conditions of significant void

    International Nuclear Information System (INIS)

    Lam, K.F.

    1991-01-01

    The purpose of this paper is to provide a general review of certain key design areas which address the safety concerns of HT pump operation under conditions of significant void. To illustrate the challenges confronting designers and analysts, some highlights from the design of a protective system to prevent damage to HT piping and pump supports at Bruce NGS 'A' are outlined. The effects of this protective system on reactor safety are also discussed. HT pump operation under conditions of significant void poses a major challenge to designers and analysts in ensuring that pump-induced vibration and its effects on the pump and piping are addressed. For an in-service station the search for a practical solution is often limited by existing station equipment design and layout. The diversity of the design verification process requires a major commitment of engineering resources to ensure all safety aspects meet the requirements of the regulatory body. Work currently under way at the Ontario Hydro Research Pump Test Complex on two-phase flow in pumps and piping may provide better prediction of vibration characteristics so that inherent conservatism in fatigue life prediction of HT system components can be reduced.

  20. An Unobtrusive System to Measure, Assess, and Predict Cognitive Workload in Real-World Environments

    Science.gov (United States)

    Bracken, Bethany K.; Palmon, Noa; Elkin-Frankston, Seth; Irvin, Scott; Jenkins, Michael; Farry, Mike

    2017-01-01

    Across many careers, individuals face alternating periods of high and low attention and cognitive workload, which can result in impaired cognitive functioning and can be detrimental to job performance. For example, some professions (e.g., fire fighters, emergency medical personnel, doctors and nurses working in an emergency room, pilots) require long periods of low workload (boredom), followed by sudden, high-tempo operations during which they may be required to respond to an emergency and perform at peak cognitive levels. Conversely, other professions (e.g., air traffic controllers, market investors in financial industries, analysts) require long periods of high workload and multitasking during which the addition of just one more task results in cognitive overload, leading to mistakes. An unobtrusive system to measure, assess, and predict cognitive workload could warn individuals, their teammates, or their supervisors when steps should be taken to augment cognitive readiness. In this talk I will describe an approach to this problem that we have found to be successful across work domains, including: (1) a suite of unobtrusive, field-ready neurophysiological, physiological, and behavioral sensors that are chosen to best suit the target environment; (2) custom algorithms and statistical techniques to process and time-align raw data originating from the sensor suite; (3) probabilistic and statistical models designed to interpret the data into the human state of interest (e.g., cognitive workload, attention, fatigue); (4) machine-learning techniques to predict upcoming performance based on the current pattern of events; and (5) display of each piece of information depending on the needs of the target user, who may or may not want to drill down into the functioning of the system to determine how conclusions about human state and performance are determined. I will then focus in on our experimental results from our custom functional near-infrared spectroscopy sensor
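
Step (3) of the pipeline above — interpreting time-aligned sensor features as a workload state — can be sketched as a weighted fusion with a threshold. The features, weights, and threshold here are invented for illustration; the talk's actual models are probabilistic and learned from data.

```python
# Toy sketch of fusing normalized sensor features into a workload state.
# Feature names, weights, and the threshold are invented assumptions.
def workload_state(heart_rate_norm, pupil_norm, task_events_norm,
                   weights=(0.4, 0.3, 0.3), threshold=0.6):
    """Weighted fusion of normalized features into a 'high'/'low' label."""
    features = (heart_rate_norm, pupil_norm, task_events_norm)
    score = sum(w * f for w, f in zip(weights, features))
    return ("high" if score >= threshold else "low"), score

state, score = workload_state(0.9, 0.8, 0.7)  # busy operator -> "high"
```

A real system would replace the fixed weights with a trained probabilistic model and feed the state sequence into step (4)'s performance predictor.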

  1. Exubera. Inhale therapeutic systems.

    Science.gov (United States)

    Bindra, Sanjit; Cefalu, William T

    2002-05-01

    Inhale, in collaboration with Pfizer and Aventis Pharma (formerly Hoechst Marion Roussel; HMR), is developing an insulin formulation utilizing its pulmonary delivery technology for macromolecules for the potential treatment of type I and II diabetes. By July 2001, the phase III program had been completed and the companies had begun to assemble data for MAA and NDA filings; however, it was already clear at this time that additional data might be required for filing. By December 2001, it had been decided that the NDA should include an increased level of controlled, long-term pulmonary safety data in diabetic patients and a major study was planned to be completed in 2002, with the NDA filed thereafter (during 2002). US-05997848 was issued to Inhale Therapeutic Systems in December 1999, and corresponds to WO-09524183, filed in February 1995. Equivalent applications have appeared to date in Australia, Brazil, Canada, China, Czech Republic, Europe, Finland, Hungary, Japan, Norway, New Zealand, Poland and South Africa. This family of applications is specific to pulmonary delivery of insulin. In February 1999, Lehman Brothers gave this inhaled insulin a 60% probability of reaching market, with a possible launch date of 2001. The analysts estimated peak sales at $3 billion in 2011. In May 2000, Aventis predicted that estimated peak sales would be in excess of $1 billion. In February 2000, Merrill Lynch expected product launch in 2002 and predicted that it would be a multibillion-dollar product. Merrill Lynch analysts predicted, in September and November 2000, that the product would be launched by 2002, with sales in that year of €75 million, rising to €500 million in 2004. In April 2001, Merrill Lynch predicted that filing for this drug would occur in 2001. 
Following the report of the potential delay in regulatory filing, issued in July 2001, Deutsche Banc Alex Brown predicted a filing would take place in the fourth quarter of 2002 and launch would take place in the first

  2. Detector Fundamentals for Reachback Analysts

    Energy Technology Data Exchange (ETDEWEB)

    Karpius, Peter Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Myers, Steven Charles [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-03

    This presentation is a part of the DHS LSS spectroscopy course and provides an overview of the following concepts: detector system components, intrinsic and absolute efficiency, resolution and linearity, and operational issues and limits.

  3. Intermediate Infrastructure Analyst | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Provides feedback for the creation of service project business cases and ... and project plans to allow IDRC to move forward with a specific product strategy. ... or team leader by undertaking research, investigations, evaluations and testing of ...

  4. Info-gap decision theory decisions under severe uncertainty

    CERN Document Server

    Ben-Haim, Yakov

    2006-01-01

    Everyone makes decisions, but not everyone is a decision analyst. A decision analyst uses quantitative models and computational methods to formulate decision algorithms, assess decision performance, identify and evaluate options, determine trade-offs and risks, evaluate strategies for investigation, and so on. This book is written for decision analysts. The term "decision analyst" covers an extremely broad range of practitioners. Virtually all engineers involved in design (of buildings, machines, processes, etc.) or analysis (of safety, reliability, feasibility, etc.) are decision analysts,

  5. The Hydrograph Analyst, an Arcview GIS Extension That Integrates Point, Spatial, and Temporal Data Provides A Graphical User Interface for Hydrograph Analysis

    International Nuclear Information System (INIS)

    Jones, M.L.; O'Brien, G.M.; Jones, M.L.

    2000-01-01

    The Hydrograph Analyst (HA) is an ArcView GIS 3.2 extension developed by the authors to analyze hydrographs from a network of ground-water wells and springs in a regional ground-water flow model. ArcView GIS integrates geographic, hydrologic, and descriptive information and provides the base functionality needed for hydrograph analysis. The HA extends ArcView's base functionality by automating data integration procedures and by adding capabilities to visualize and analyze hydrologic data. Data integration procedures were automated by adding functionality to the View document's Document Graphical User Interface (DocGUI). A menu allows the user to query a relational database and select sites which are displayed as a point theme in a View document. An "Identify One to Many" tool is provided within the View DocGUI to retrieve all hydrologic information for a selected site and display it in a simple and concise tabular format. For example, the display could contain various records from many tables storing data for one site. Another HA menu allows the user to generate a hydrograph for sites selected from the point theme. Hydrographs generated by the HA are added as hydrograph documents and accessed by the user with the Hydrograph DocGUI, which contains tools and buttons for hydrograph analysis. The Hydrograph DocGUI has a "Select By Polygon" tool used for isolating particular points on the hydrograph inside a user-drawn polygon, or the user could isolate the same points by constructing a logical expression with the ArcView GIS "Query Builder" dialog that is also accessible in the Hydrograph DocGUI. Other buttons can be selected to alter the query applied to the active hydrograph. The selected points on the active hydrograph can be attributed (or flagged) individually or as a group using the "Flag" tool found on the Hydrograph DocGUI. 
The "Flag" tool activates a dialog box that prompts the user to select an attribute and "methods" or "conditions" that qualify

  6. Bootstrap prediction and Bayesian prediction under misspecified models

    OpenAIRE

    Fushiki, Tadayoshi

    2005-01-01

    We consider a statistical prediction problem under misspecified models. In a sense, Bayesian prediction is an optimal prediction method when an assumed model is true. Bootstrap prediction is obtained by applying Breiman's `bagging' method to a plug-in prediction. Bootstrap prediction can be considered to be an approximation to the Bayesian prediction under the assumption that the model is true. However, in applications, there are frequently deviations from the assumed model. In this paper, bo...
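
The bagged plug-in prediction described here can be sketched as follows: average a plug-in predictor over bootstrap resamples of the data. The data and the plug-in rule (a sample-mean predictor) are illustrative assumptions, not the paper's setting.

```python
import numpy as np

# Sketch of bootstrap ("bagged") prediction: average a plug-in predictor
# over bootstrap resamples. Data and the plug-in rule are invented.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=50)   # observed sample

def plug_in_prediction(sample):
    """Plug-in predictive mean: simply the sample mean here."""
    return sample.mean()

B = 2000  # number of bootstrap resamples
boot_preds = [
    plug_in_prediction(rng.choice(data, size=data.size, replace=True))
    for _ in range(B)
]
bagged_prediction = float(np.mean(boot_preds))   # bootstrap prediction
plain_prediction = float(plug_in_prediction(data))
```

For a linear statistic like the mean the two predictions nearly coincide; bagging matters more for nonlinear plug-in rules, where it smooths the predictive distribution toward the Bayesian one under the assumed model.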

  7. Modeling Human Behavior to Anticipate Insider Attacks

    Directory of Open Access Journals (Sweden)

    Ryan E Hohimer

    2011-01-01

    Full Text Available The insider threat ranks among the most pressing cyber-security challenges that threaten government and industry information infrastructures. To date, no systematic methods have been developed that provide a complete and effective approach to prevent data leakage, espionage, and sabotage. Current practice is forensic in nature, relegating to the analyst the bulk of the responsibility to monitor, analyze, and correlate an overwhelming amount of data. We describe a predictive modeling framework that integrates a diverse set of data sources from the cyber domain, as well as inferred psychological/motivational factors that may underlie malicious insider exploits. This comprehensive threat assessment approach provides automated support for the detection of high-risk behavioral "triggers" to help focus the analyst's attention and inform the analysis. Designed to be domain-independent, the system may be applied to many different threat and warning analysis/sense-making problems.

  8. Protein docking prediction using predicted protein-protein interface

    Directory of Open Access Journals (Sweden)

    Li Bin

    2012-01-01

    Full Text Available Abstract Background Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. Results We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we have developed earlier. PI-LZerD starts from performing docking prediction using the provided protein-protein binding interface prediction as constraints, which is followed by the second round of docking with updated docking interface information to further improve docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves the docking prediction accuracy as compared with docking without using binding site prediction or using the binding site prediction as post-filtering. Conclusion We have developed PI-LZerD, a pairwise docking algorithm, which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy over alternative methods in the series of benchmark experiments including docking using actual docking interface site predictions as well as unbound docking cases.

  9. Protein docking prediction using predicted protein-protein interface.

    Science.gov (United States)

    Li, Bin; Kihara, Daisuke

    2012-01-10

    Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we have developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, which is followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves the docking prediction accuracy as compared with docking without using binding site prediction or using the binding site prediction as post-filtering. We have developed PI-LZerD, a pairwise docking algorithm, which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy over alternative methods in a series of benchmark experiments, including docking using actual docking interface site predictions as well as unbound docking cases.

  10. Nonfinancial Disclosure and Analyst Forecast Accuracy: Empirical Evidence on Corporate Social Responsibility Disclosure from Shenzhen Stock Exchange

    Institute of Scientific and Technical Information of China (English)

    李晚金; 张莉

    2014-01-01

    We examine the relationship between disclosure of nonfinancial information and analyst forecast accuracy, using the issuance of stand-alone corporate social responsibility (CSR) reports to proxy for disclosure of nonfinancial information. The multiple regression analysis results show that higher-quality CSR report disclosure is associated with higher analyst forecast accuracy, and that the relationship is stronger in firms with more opaque financial disclosure, suggesting that this kind of nonfinancial information in social responsibility reports not only has information content but also complements financial disclosure by mitigating the negative effect of financial opacity on forecast accuracy.

  11. Incorporating a Time Horizon in Rate-of-Return Estimations: Discounted Cash Flow Model in Electric Transmission Rate Cases

    International Nuclear Information System (INIS)

    Chatterjee, Bishu; Sharp, Peter A.

    2006-01-01

    Electric transmission and other rate cases use a form of the discounted cash flow (DCF) model with a single long-term growth rate to estimate rates of return on equity (ROE). Such a constant-growth model cannot incorporate information about the time horizon over which analysts' estimates of earnings growth have predictive power. Only a non-constant growth model can explicitly recognize the importance of the time horizon in an ROE calculation. (author)
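
    For illustration, a non-constant growth DCF can be written as a two-stage dividend discount model: dividends grow at the analyst rate g1 over a finite horizon of n years (the span over which analyst growth estimates are assumed to have predictive power) and at a terminal rate g2 thereafter, with the implied ROE solved numerically. The sketch below uses invented numbers and function names; it is not the authors' model.

```python
# Two-stage dividend discount model (illustrative assumption, not the paper's model):
# dividends grow at g1 for n years, then at g2 forever; the implied return r is the
# discount rate that makes the present value of that stream equal the share price.
def two_stage_pv(r, d1, g1, g2, n):
    pv, d = 0.0, d1
    for t in range(1, n + 1):  # assumes n >= 1
        pv += d / (1 + r) ** t
        last = d
        d *= 1 + g1
    # Gordon-growth terminal value at year n, using the first terminal-stage dividend.
    pv += (last * (1 + g2) / (r - g2)) / (1 + r) ** n
    return pv

def implied_return(price, d1, g1, g2, n, hi=1.0):
    """Bisection for the r that equates PV to price (PV is decreasing in r > g2)."""
    lo = g2 + 1e-6  # r must exceed the terminal growth rate
    for _ in range(100):
        mid = (lo + hi) / 2
        if two_stage_pv(mid, d1, g1, g2, n) > price:
            lo = mid  # PV too high, so the discount rate must rise
        else:
            hi = mid
    return (lo + hi) / 2
```

    As a sanity check, when g1 = g2 the model collapses to the constant-growth (Gordon) formula r = D1/P + g: with price 40, D1 = 2, and g = 4%, the implied return is 9%.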

  12. Cost approach of health care entity intangible asset valuation.

    Science.gov (United States)

    Reilly, Robert F

    2012-01-01

    In the valuation synthesis and conclusion process, the analyst should consider the following questions: Do the selected valuation approach(es) and method(s) accomplish the analyst's assignment? Does the selected valuation approach and method actually quantify the desired objective of the intangible asset analysis? The analyst should also consider whether the selected valuation approach and method analyzes the appropriate bundle of legal rights, and whether sufficient empirical data were available to perform the selected valuation approach and method. The valuation synthesis should consider whether sufficient data were available to make the analyst comfortable with the value conclusion, and whether the selected approach and method will be understandable to the intended audience. In the valuation synthesis and conclusion, the analyst should also consider which approaches and methods deserve the greatest consideration with respect to the intangible asset's remaining useful life (RUL). The intangible asset RUL is a consideration in each valuation approach. In the income approach, the RUL may affect the projection period for the intangible asset income subject to either yield capitalization or direct capitalization. In the cost approach, the RUL may affect the total amount of obsolescence, if any, deducted from the estimated cost measure (that is, the intangible asset reproduction cost new or replacement cost new). In the market approach, the RUL may affect the selection, rejection, and/or adjustment of the comparable or guideline intangible asset sale and license transactional data.
The experienced valuation analyst will use professional judgment to weight the various value indications to conclude a final intangible asset value, based on: the analyst's confidence in the quantity and quality of available data; the analyst's level of due diligence performed on that data; the relevance of the valuation method to the intangible asset life cycle stage and...

  13. WALS Prediction

    NARCIS (Netherlands)

    Magnus, J.R.; Wang, W.; Zhang, Xinyu

    2012-01-01

    Abstract: Prediction under model uncertainty is an important and difficult issue. Traditional prediction methods (such as pretesting) are based on model selection followed by prediction in the selected model, but the reported prediction and the reported prediction variance ignore the uncertainty
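
    The contrast the abstract draws, select-then-predict versus averaging over candidate models, can be made concrete with generic model averaging. Note that WALS derives its weights differently (from a Bayesian treatment of the auxiliary regressors); the AIC-based weights below are a simpler, common stand-in used purely for illustration.

```python
import math

# Illustrative model-averaged prediction. WALS uses different (prior-based) weights;
# Akaike weights are shown here only to make the select-vs-average contrast concrete.
def aic_weights(aics):
    """Akaike weights: exp(-delta_AIC / 2), normalized to sum to 1."""
    best = min(aics)
    raw = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(raw)
    return [r / total for r in raw]

def averaged_prediction(predictions, aics):
    """Weighted average of each candidate model's prediction."""
    return sum(w * p for w, p in zip(aic_weights(aics), predictions))

def selected_prediction(predictions, aics):
    """Pretest-style alternative: predict from the single best model only."""
    return predictions[aics.index(min(aics))]
```

    With predictions [10.0, 12.0] and AICs [100.0, 102.0], selection reports 10.0 and ignores the second model entirely, while averaging reports about 10.54, letting the near-competitive model pull the prediction; a fuller treatment would also widen the reported prediction variance accordingly.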

  14. Building Fire Behavior Analyst (FBAN) capability and capacity: Lessons learned From Victoria, Australia's Bushfire Behavior Predictive Services Strategy

    Science.gov (United States)

    K. E. Gibos; A. Slijepcevic; T. Wells; L. Fogarty

    2015-01-01

    Wildland fire managers must frequently make meaning from chaos in order to protect communities and infrastructure from the negative impacts of fire. Fire management personnel are increasingly turning to science to support their experience-based decision-making processes and to provide clear, confident leadership for communities frequently exposed to risk from wildfire...

  15. Shades of Gray: Releasing the Cognitive Binds that Blind Us

    Science.gov (United States)

    2016-09-01

    resulting in a lack of analyst awareness at the crucial beginning stages of their careers. Some of the programs discovered in the research are...tradecraft toolkit of all intelligence analysts.”39 Heuer’s book is recommended reading for all prospective intelligence analysts by the National...www.hellenext.org/internship/disney-company-global-intelligence-analyst-intern-spring-2016/. 73 Walt Disney Company, “Careers,” accessed January 7, 2016

  16. G-tunnel welded tuff mining experiment preparations

    International Nuclear Information System (INIS)

    Zimmerman, R.M.; Bellman, R.A. Jr.; Mann, K.L.; Zerga, D.P.

    1991-01-01

    Designers and analysts of radioactive waste repositories must be able to predict the mechanical behavior of the host rock. Sandia National Laboratories elected to conduct a mine-by in welded tuff so that predictive-type information could be obtained regarding the response of the rock to a drill and blast excavation process, where smooth blasting techniques were used. Included in the study were evaluations of and recommendations for various measurement systems that might be used in future mine-by efforts. This report summarizes the preparations leading to the recording of data. 17 refs., 27 figs., 5 tabs

  17. Integrated Baseline System (IBS) Version 2.0: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  18. G-Tunnel welded tuff mining experiment data summary

    International Nuclear Information System (INIS)

    Zimmerman, R.M.; Bellman, R.A. Jr.; Mann, K.L.; Zerga, D.P.; Fowler, M.

    1990-03-01

    Designers and analysts of radioactive waste repositories must be able to predict the mechanical behavior of the host rock. Sandia National Laboratories elected to conduct a mine-by in welded tuff so that predictive-type information could be obtained regarding the response of the rock to a drill and blast excavation process, where smooth blasting techniques were used. Included in the study were evaluations of and recommendations for various measurement systems that might be used in future mine-by efforts. This report summarizes all of the data obtained in the welded tuff mining experiment. 6 refs., 29 figs., 12 tabs

  19. Data-Based Predictive Control with Multirate Prediction Step

    Science.gov (United States)

    Barlow, Jonathan S.

    2010-01-01

    Data-based predictive control is an emerging control method that stems from Model Predictive Control (MPC). MPC computes current control action based on a prediction of the system output a number of time steps into the future and is generally derived from a known model of the system. Data-based predictive control has the advantage of deriving predictive models and controller gains from input-output data. Thus, a controller can be designed from the outputs of complex simulation code or a physical system where no explicit model exists. If the output data happens to be corrupted by periodic disturbances, the designed controller will also have the built-in ability to reject these disturbances without the need to know them. When data-based predictive control is implemented online, it becomes a version of adaptive control. One challenge of MPC is computational requirements increasing with prediction horizon length. This paper develops a closed-loop dynamic output feedback controller that minimizes a multi-step-ahead receding-horizon cost function with multirate prediction step. One result is a reduced influence of prediction horizon and the number of system outputs on the computational requirements of the controller. Another result is an emphasis on portions of the prediction window that are sampled more frequently. A third result is the ability to include more outputs in the feedback path than in the cost function.
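
    The first step, deriving a predictor directly from input-output data, can be shown on a toy scalar system. The sketch below identifies a one-step ARX model y[k+1] ≈ a·y[k] + b·u[k] by least squares and then chains it over a horizon; it illustrates the data-based idea only, not the paper's multirate, multi-step formulation.

```python
# Minimal sketch: identify y[k+1] = a*y[k] + b*u[k] from input-output data,
# then chain the identified model to predict several steps ahead.
def fit_arx(y, u):
    """Least-squares fit of (a, b) via the 2x2 normal equations."""
    syy = suu = syu = sy1y = sy1u = 0.0
    for k in range(len(y) - 1):
        syy += y[k] * y[k]
        suu += u[k] * u[k]
        syu += y[k] * u[k]
        sy1y += y[k + 1] * y[k]
        sy1u += y[k + 1] * u[k]
    det = syy * suu - syu * syu  # nonzero when y and u are not collinear
    a = (sy1y * suu - sy1u * syu) / det
    b = (sy1u * syy - sy1y * syu) / det
    return a, b

def predict(y0, inputs, a, b):
    """Chain the identified one-step model over a prediction horizon."""
    y, out = y0, []
    for u in inputs:
        y = a * y + b * u
        out.append(y)
    return out

# Noise-free data generated with a = 0.5, b = 1.0 is recovered exactly.
a, b = fit_arx([0.0, 1.0, 0.5, 1.25, 0.625, 1.3125], [1.0, 0.0, 1.0, 0.0, 1.0])
```

    A controller built on such a predictor inherits whatever is in the data, which is why periodic disturbances present in the training records are rejected without being modeled explicitly.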

  20. Purposeful selection of variables in logistic regression

    Directory of Open Access Journals (Sweden)

    Williams David Keith

    2008-12-01

    Full Text Available Abstract Background The main problem in many model-building situations is to choose from a large set of covariates those that should be included in the "best" model. A decision to keep a variable in the model might be based on the clinical or statistical significance. There are several variable selection algorithms in existence. Those methods are mechanical and as such carry some limitations. Hosmer and Lemeshow describe a purposeful selection of covariates within which an analyst makes a variable selection decision at each step of the modeling process. Methods In this paper we introduce an algorithm which automates that process. We conduct a simulation study to compare the performance of this algorithm with three well documented variable selection procedures in SAS PROC LOGISTIC: FORWARD, BACKWARD, and STEPWISE. Results We show that the advantage of this approach appears when the analyst is interested in risk factor modeling and not just prediction. In addition to significant covariates, this variable selection procedure has the capability of retaining important confounding variables, resulting potentially in a slightly richer model. Application of the macro is further illustrated with the Hosmer and Lemeshow Worcester Heart Attack Study (WHAS) data. Conclusion If an analyst is in need of an algorithm that will help guide the retention of significant covariates as well as confounding ones, they should consider this macro as an alternative tool.
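
    The purposeful-selection loop the abstract describes (univariate screening at a liberal p-value, then backward elimination that retains confounders via a change-in-estimate check) can be sketched independently of SAS. Everything below is a hypothetical skeleton: `fit` is a stand-in for a logistic regression returning coefficients and p-values, and `toy_fit` fakes a data set in which smoking confounds the age effect. The 0.25/0.1 thresholds and the 20% change-in-estimate rule follow the Hosmer-Lemeshow convention.

```python
# Skeleton of purposeful selection (Hosmer-Lemeshow style), illustrative only.
# `fit(variables)` is assumed to return {variable: (coefficient, p_value)}.
def purposeful_select(candidates, fit, p_enter=0.25, p_keep=0.1, delta=0.20):
    # Step 1: keep variables that are univariately significant at the liberal p_enter.
    model = [v for v in candidates if fit([v])[v][1] < p_enter]
    # Step 2: backward elimination; a non-significant variable is retained anyway
    # if dropping it changes any remaining coefficient by more than `delta` (20%).
    changed = True
    while changed:
        changed = False
        full = fit(model)
        for v in sorted(model, key=lambda v: -full[v][1]):  # least significant first
            if full[v][1] <= p_keep:
                continue
            reduced = fit([w for w in model if w != v])
            confounder = any(
                abs(reduced[w][0] - full[w][0]) / abs(full[w][0]) > delta
                for w in model if w != v and full[w][0] != 0
            )
            if not confounder:
                model.remove(v)
                changed = True
                break
    return model

# Toy stand-in for a logistic fit (hypothetical numbers, no real data).
def toy_fit(variables):
    base = {"age": (0.8, 0.01), "smoking": (0.4, 0.20), "weight": (0.1, 0.60)}
    out = {v: base[v] for v in variables}
    # Emulate confounding: dropping "smoking" inflates the age coefficient.
    if "age" in variables and "smoking" not in variables:
        out["age"] = (1.2, 0.01)
    return out

kept = purposeful_select(["age", "smoking", "weight"], toy_fit)  # ["age", "smoking"]
```

    Here "smoking" survives despite its p-value of 0.20 because removing it shifts the age coefficient by 50%, which is exactly the confounder-retention behavior that distinguishes this procedure from plain stepwise selection.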

  1. Comparison of typical inelastic analysis predictions with benchmark problem experimental results

    International Nuclear Information System (INIS)

    Clinard, J.A.; Corum, J.M.; Sartory, W.K.

    1975-01-01

    The results of exemplary inelastic analyses are presented for a series of experimental benchmark problems. Consistent analytical procedures and constitutive relations were used in each of the analyses, and published material behavior data were used in all cases. Two finite-element inelastic computer programs were employed. These programs implement the analysis procedures and constitutive equations for Type 304 stainless steel that are currently used in many analyses of elevated-temperature nuclear reactor system components. The analysis procedures and constitutive relations are briefly discussed, and representative analytical results are presented and compared to the test data. The results that are presented demonstrate the feasibility of performing inelastic analyses, and they are indicative of the general level of agreement that the analyst might expect when using conventional inelastic analysis procedures. (U.S.)

  2. Technical difficulties and challenges for performing safety analysis on digital I and C systems

    International Nuclear Information System (INIS)

    Yih, Swu

    1996-01-01

    Performing safety analysis on digital I and C systems is an important task for nuclear safety analysts. The analysis results can not only confirm that the system is well developed but also provide crucial evidence for the licensing process. However, currently both I and C developers and regulators have difficulties in evaluating the safety of digital I and C systems. To investigate this problem, this paper proposes a frame-based model to analyze the working and failure mechanisms of software and its interaction with the environment. A valid isomorphic relationship between the logical (software) frame and the physical (hardware environment) frame is identified as a major factor that determines the safe behavior of the software. Failures that may potentially violate the isomorphic relations are also discussed. To perform safety analysis on digital I and C systems, analysts need to predict the effects incurred by such failures. However, due to the lack of continuity, regularity, and integrity, and the high complexity of software structure, software does not have a stable and predictable pattern of behavior, which in turn undermines the trustworthiness of software safety analysis results. Our model can explain many troublesome events experienced by computer-controlled systems. Implications and possible directions for improvement are also discussed. (author)

  3. A New Methodology for the Integration of Performance Materials into the Clothing Curriculum

    OpenAIRE

    Power, Jess

    2014-01-01

    This paper presents a model for integrating the study of performance materials into the clothing curriculum. In recent years there has been an increase in demand for stylish, functional and versatile sports apparel. Analysts predict this market will reach US$126.30 billion by 2015. This growth is attributed to dramatic lifestyle changes and increasing participation in sports/leisurely pursuits, particularly by women. The desire to own performance clothing for specific outdoor pursuits is increasing a...

  4. How to gamify? A method for designing gamification

    OpenAIRE

    Morschheuser, Benedikt; Werder, Karl; Hamari, Juho; Abe, Julian

    2017-01-01

    During recent years, gamification has become a popular method of enriching information technologies. Popular business analysts have made promising predictions about the penetration of gamification; however, it has also been estimated that most gamification efforts will fail due to poor understanding of how gamification should be designed and implemented. Therefore, in this paper we seek to advance the understanding of best practices related to the gamification design process. We approach this r...

  5. Proceedings of Conference on Variable-Resolution Modeling, Washington, DC, 5-6 May 1992

    Science.gov (United States)

    1992-05-01

    ...received the B.S. and M.S. degrees from Pusan National University, Korea, and Kyungpook National University, Korea...position in the Department of Electronics, National Fisheries University of Pusan, Pusan, Korea...research interests include artificial intelligence...with the data, or the modeler/analyst/gamer is forced to make up interactions such as fire allocation, detailed acquisition predictions, small unit...

  6. Mortality, integrity, and psychoanalysis (who are you to me? Who am I to you?).

    Science.gov (United States)

    Pinsky, Ellen

    2014-01-01

    The author narrates her experience of mourning her therapist's sudden death. The profession has neglected implications of the analyst's mortality: what is lost or vulnerable to loss? What is that vulnerability's function? The author's process of mourning included her writing and her becoming an analyst. Both pursuits inspired reflections on mortality in two overlapping senses: bodily (the analyst is mortal and can die) and character (the analyst is mortal and can err). The subject thus expands to include impaired character and ethical violations. Paradoxically, the analyst's human limitations threaten each psychoanalytic situation, but also enable it: human imperfection animates the work. The essay ends with a specific example of integrity. © 2014 The Psychoanalytic Quarterly, Inc.

  7. 78 FR 77194 - Data Collection Available for Public Comments

    Science.gov (United States)

    2013-12-20

    ... comments to Andrea Giles, Supervisory Financial Analyst, Office of Credit Risk Management, Small Business..., Supervisory Financial Analyst, 202-205-6301, [email protected] , or Curtis B. Rich, Management Analyst, 202... these institutions, SBA requires them to submit audited financial statements annually as well as interim...

  8. Lawyers, Accountants and Financial Analysts: The “Architects” of the New EU Regime of Corporate Accountability

    Directory of Open Access Journals (Sweden)

    David Monciardini

    2016-09-01

    Full Text Available International accounting rules are increasingly under pressure as they are considered inadequate to respond to major changes in the way business is conducted, how it creates value, and the context in which it operates. The paper identifies and juxtaposes two regulatory trends in this re-definition of traditional accounting: ‘accounting for intangible assets’ and ‘corporate social accountability’. They are partially overlapping and both demand to go beyond accounting for physical and financial assets. However, they are underpinned by different rationales and supported by competing professional claims. Deploying a reflexive socio-legal approach, the article outlines a preliminary symbolic ‘archaeology’ of these regulatory trends. Drawing on interviews and document analysis, it highlights the role of three professional communities in shaping regulatory changes: international accountants, activist-lawyers and financial analysts. Competing for the definition of what counts and what has value, they are generating an intriguing debate about the boundaries between business and society.

  9. Generalized Predictive Control and Neural Generalized Predictive Control

    Directory of Open Access Journals (Sweden)

    Sadhana CHIDRAWAR

    2008-12-01

    Full Text Available As Model Predictive Control (MPC) relies on prediction, a predictive control scheme using a multilayer feed-forward network as the plant's linear model is presented. Using Newton-Raphson as the optimization algorithm, the number of iterations needed for convergence is significantly reduced compared with other techniques. This paper presents a detailed derivation of Generalized Predictive Control and Neural Generalized Predictive Control with Newton-Raphson as the minimization algorithm. Taking three separate systems, the performance of each system has been tested. Simulation results show the effect of the neural network on Generalized Predictive Control. The performance comparison of these three system configurations is given in terms of ISE and IAE.
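
    The Newton-Raphson minimization the abstract refers to is easiest to see on a toy scalar plant. The sketch below minimizes a one-step GPC-style cost (tracking error plus a control penalty) for y[k+1] = a·y[k] + b·u; the model, the cost weights, and the function names are illustrative assumptions, not the paper's derivation.

```python
# Toy illustration of predictive-control cost minimization with Newton-Raphson
# for the scalar model y[k+1] = a*y[k] + b*u (illustrative assumption only).
def cost(u, y, r, a, b, lam):
    """One-step cost: squared tracking error plus a control-effort penalty."""
    y_next = a * y + b * u
    return (y_next - r) ** 2 + lam * u ** 2

def newton_minimize(y, r, a, b, lam, u0=0.0, iters=20):
    """Newton-Raphson on the cost's analytic first and second derivatives."""
    u = u0
    for _ in range(iters):
        g = 2 * b * (a * y + b * u - r) + 2 * lam * u  # gradient dJ/du
        h = 2 * b * b + 2 * lam                        # Hessian d2J/du2 (constant here)
        u -= g / h
    return u
```

    Because this cost is quadratic in u, the Newton update lands on the closed-form minimizer b·(r - a·y)/(b² + λ) in a single iteration; in the NGPC case the neural-network plant model makes the cost non-quadratic, so the iteration count, and its reduction, actually matters.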

  10. 78 FR 15796 - Data Collection Available for Public Comments

    Science.gov (United States)

    2013-03-12

    ... estimated burden and enhance the quality of the collections, to Rachel Newman Karton, Program Analyst..., Washington, DC 20416. FOR FURTHER INFORMATION CONTACT: Rachel Newman Karton, Program Analyst, 202-619-1618 rachel[email protected] Curtis B. Rich, Management Analyst, 202-205-7030 [email protected] . Title...

  11. Determinants of Financial Sustainability for Farm Credit Applications—A Delphi Study

    Directory of Open Access Journals (Sweden)

    Johannes I. F. Henning

    2016-01-01

    Full Text Available Farmers use credit from commercial credit providers to finance production activities. Commercial credit providers have to predict the financial sustainability of the enterprise to ensure that the borrower will have the ability to repay the loan. A Delphi study was conducted to explore what factors are used as indicators of the loan-repayment ability of farmers. The objective was not only to identify factors that are currently considered, but also to identify other personal attributes that may improve the accuracy in predicting the repayment ability of potential borrowers. The Delphi was applied to a panel consisting of nine credit analysts and credit managers from a commercial credit provider in South Africa. The results indicate that the current and past financial performance, account standing, collateral, and credit record of the farm are very important in the assessment of applications in terms of financial performance. Experience and success factors compared to competitors were found to be important, while age and education/qualification are regarded as less important in predicting repayment ability. The results also show that, although not currently objectively included in credit evaluations, credit analysts regard leadership and human relations; commitment and confidence; internal locus of control; self-efficacy; calculated risk taking; need for achievement; and opportunity seeking as important indicators of the ability of potential borrowers to repay their loans. Thus, credit analysts and managers also regard management abilities and entrepreneurial characteristics of potential borrowers as good indicators of repayment ability. Results from this research provide new indicator factors that can be used to extend existing credit evaluation instruments in order to more accurately predict repayment ability.

  12. Comparison of typical inelastic analysis predictions with benchmark problem experimental results

    International Nuclear Information System (INIS)

    Clinard, J.A.; Corum, J.M.; Sartory, W.K.

    1975-01-01

    The results of exemplary inelastic analyses for experimental benchmark problems on reactor components are presented. Consistent analytical procedures and constitutive relations were used in each of the analyses, and the material behavior data presented in the Appendix were used in all cases. Two finite-element inelastic computer programs were employed. These programs implement the analysis procedures and constitutive equations for type 304 stainless steel that are currently used in many analyses of elevated-temperature nuclear reactor system components. The analysis procedures and constitutive relations are briefly discussed, and representative analytical results are presented and compared to the test data. The results that are presented demonstrate the feasibility of performing inelastic analyses for the types of problems discussed, and they are indicative of the general level of agreement that the analyst might expect when using conventional inelastic analysis procedures. (U.S.)

  13. Capital Romance: Why Wall Street Fell in Love with Higher Education.

    Science.gov (United States)

    Ortmann, Andreas

    2001-01-01

    Investigates arguments market analysts employ to persuade investors to purchase stocks in for-profit higher education companies; describes role of market analysts; evaluates relative importance of reasons analysts typically give to persuade investors; compares reasons with modern economic theories of firms and markets and finds them to be…

  14. This art of psychoanalysis. Dreaming undreamt dreams and interrupted cries.

    Science.gov (United States)

    Ogden, Thomas H

    2004-08-01

    It is the art of psychoanalysis in the making, a process inventing itself as it goes, that is the subject of this paper. The author articulates succinctly how he conceives of psychoanalysis, and offers a detailed clinical illustration. He suggests that each analysand unconsciously (and ambivalently) is seeking help in dreaming his 'night terrors' (his undreamt and undreamable dreams) and his 'nightmares' (his dreams that are interrupted when the pain of the emotional experience being dreamt exceeds his capacity for dreaming). Undreamable dreams are understood as manifestations of psychotic and psychically foreclosed aspects of the personality; interrupted dreams are viewed as reflections of neurotic and other non-psychotic parts of the personality. The analyst's task is to generate conditions that may allow the analysand--with the analyst's participation--to dream the patient's previously undreamable and interrupted dreams. A significant part of the analyst's participation in the patient's dreaming takes the form of the analyst's reverie experience. In the course of this conjoint work of dreaming in the analytic setting, the analyst may get to know the analysand sufficiently well for the analyst to be able to say something that is true to what is occurring at an unconscious level in the analytic relationship. The analyst's use of language contributes significantly to the possibility that the patient will be able to make use of what the analyst has said for purposes of dreaming his own experience, thereby dreaming himself more fully into existence.

  15. Junior Information Management Analyst | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    These tasks are performed recognising that information is an important asset for ... gathering, documenting, and analysing IM business requirements;; applying ... content types and metadata requirements;; defining the security and access ...

  16. Intermediate Systems Analyst | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... bring to the System Development Group the necessary skills to understand in depth ... Participates in collaborative technical discussions with IDRC management, ... Responsible to monitor and oversee private sector consultant and part-time ...

  17. Therapeutic action and the analyst's responsibility.

    Science.gov (United States)

    Greenberg, Jay

    2015-02-01

    Models of the psychoanalytic situation can usefully be thought of as fictions. Viewed this way, the models can be understood as narrative structures that shape what we are able to see and how we are able to think about what happens between us and our analysands. Theories of therapeutic action are elements of what can be called a "controlling fiction," mediating between these theories and our very real responsibilities, both to our preferred method and to a suffering patient. This venture into comparative psychoanalysis is illustrated by a discussion of published case material. © 2015 by the American Psychoanalytic Association.

  18. Shakespeare for Analysts: Literature and Intelligence

    Science.gov (United States)

    2003-07-01

    helping to finance and at the same time providing the moral justification for his war (Spiekerman, Shakespeare’s Political Realism, 131)...alchemy, Will change to virtue and to worthiness. Cassius: Him and his worth and our great need of him You have right well conceited. Julius Caesar

  19. Zdeněk Kopal: Numerical Analyst

    Science.gov (United States)

    Křížek, M.

    2015-07-01

    We give a brief overview of Zdeněk Kopal's life, his activities in the Czech Astronomical Society, his collaboration with Vladimír Vand, and his studies at Charles University, Cambridge, Harvard, and MIT. Then we survey Kopal's professional life. He published 26 monographs and 20 conference proceedings. We will concentrate on Kopal's extensive monograph Numerical Analysis (1955, 1961) that is widely accepted to be the first comprehensive textbook on numerical methods. It describes, for instance, methods for polynomial interpolation, numerical differentiation and integration, numerical solution of ordinary differential equations with initial or boundary conditions, and numerical solution of integral and integro-differential equations. Special emphasis will be laid on error analysis. Kopal himself applied numerical methods to celestial mechanics, in particular to the N-body problem. He also used Fourier analysis to investigate light curves of close binaries to discover their properties. This is, in fact, a problem from mathematical analysis.

  20. Gamma-Ray Interactions for Reachback Analysts

    Energy Technology Data Exchange (ETDEWEB)

    Karpius, Peter Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Myers, Steven Charles [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-02

    This presentation is a part of the DHS LSS spectroscopy training course and presents an overview of the following concepts: identification and measurement of gamma rays; use of gamma counts and energies in research. Understanding the basic physics of how gamma rays interact with matter can clarify how certain features in a spectrum were produced.

  1. Technology Infrastructure, Analyst (Network, Video and ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    He/she participates in the business continuity and security of systems. ... Strategic Planning and Product Strategy ... in IT enhancement initiatives and projects as a team member or team leader by undertaking research, evaluations and testing ...

  2. Machine Learning Technologies Translate Vigilant Surveillance Satellite Big Data into Predictive Alerts for Environmental Stressors

    Science.gov (United States)

    Johnson, S. P.; Rohrer, M. E.

    2017-12-01

    The application of scientific research pertaining to satellite imaging and data processing has facilitated the development of dynamic methodologies and tools that utilize nanosatellites and analytical platforms to address the increasing scope, scale, and intensity of emerging environmental threats to national security. While the use of remotely sensed data to monitor the environment at local and global scales is not a novel proposition, the application of advances in nanosatellites and analytical platforms is capable of overcoming the data availability and accessibility barriers that have historically impeded the timely detection, identification, and monitoring of these stressors. Commercial and university-based applications of these technologies were used to identify and evaluate their capacity as security-motivated environmental monitoring tools. Presently, nanosatellites can provide consumers with 1-meter resolution imaging, frequent revisits, and customizable tasking, allowing users to define an appropriate temporal scale for high resolution data collection that meets their operational needs. Analytical platforms are capable of ingesting increasingly large and diverse volumes of data, delivering complex analyses in the form of interpretation-ready data products and solutions. The synchronous advancement of these technologies creates the capability of analytical platforms to deliver interpretable products from persistently collected high-resolution data that meet varying temporal and geographic scale requirements. In terms of emerging environmental threats, these advances translate into customizable and flexible tools that can respond to and accommodate the evolving nature of environmental stressors. This presentation will demonstrate the capability of nanosatellites and analytical platforms to provide timely, relevant, and actionable information that enables environmental analysts and stakeholders to make informed decisions regarding the prevention

  3. Influence of accounting concepts and regulatory rules on the funding of power reactor decommissioning costs

    International Nuclear Information System (INIS)

    Ferguson, J.S.

    1985-01-01

    Under normal circumstances, an evaluation of nuclear plant decommissioning costs by an engineering analyst will not produce the same results as an evaluation by a financial analyst. These analysts should understand evaluations based on each other's bases to ensure that their evaluation techniques are appropriate for the circumstances. The intent of this discussion is to enhance that understanding by describing the accounting and regulatory framework that is applicable to the decommissioning costs of U.S. nuclear power plants, and by explaining why evaluations of decommissioning costs prepared by engineering analysts often look different from evaluations prepared by financial analysts. Of major importance are the financial implications of several methods of funding the decommissioning costs. Since many owners of nuclear plants are subject to revenue rate regulation, financial implications often translate directly to regulatory implications

  4. Wall Street's assessment of plastic surgery--related technology: a clinical and financial analysis.

    Science.gov (United States)

    Krieger, L M; Shaw, W W

    2000-02-01

    Many plastic surgeons develop technologies that are manufactured by Wall Street-financed companies. Others participate in the stock market as investors. This study examines the bioengineered skin industry to determine whether it integrates clinical and financial information as Wall Street tenets would predict, and to see whether the financial performance of these companies provides any lessons for practicing plastic surgeons. In efficient markets, the assumptions on which independent financial analysts base their company sales and earnings projections are clinically reasonable, the volatility of a company's stock price does not irrationally differ from that of its industry sector, and the buy/sell recommendations of analysts are roughly congruent. For the companies in this study, these key financial parameters were compared with a benchmark index of 69 biotech companies of similar age and annual revenues (Student's t test). Five bioengineered skin companies were included in the study. Analysts estimated that each company would sell its product to between 24 and 45 percent of its target clinical population. The average stock price volatility was significantly higher for study companies than for those in the benchmark index (p companies were significantly less congruent than those for the benchmark companies (p invest in the stock market, because of their unique clinical experience, may sometimes be in the position to evaluate new technologies and companies better than Wall Street experts. Well-timed trades that use this expertise can result in opportunities for profit.

  5. Composable Analytic Systems for next-generation intelligence analysis

    Science.gov (United States)

    DiBona, Phil; Llinas, James; Barry, Kevin

    2015-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven system that increases flexibility and capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address the current and future intelligence analysis needs, as US forces engage threats in contested and denied environments.

  6. High Level Information Fusion (HLIF) with nested fusion loops

    Science.gov (United States)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analysts' efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  7. Predicting outdoor sound

    CERN Document Server

    Attenborough, Keith; Horoshenkov, Kirill

    2014-01-01

    1. Introduction  2. The Propagation of Sound Near Ground Surfaces in a Homogeneous Medium  3. Predicting the Acoustical Properties of Outdoor Ground Surfaces  4. Measurements of the Acoustical Properties of Ground Surfaces and Comparisons with Models  5. Predicting Effects of Source Characteristics on Outdoor Sound  6. Predictions, Approximations and Empirical Results for Ground Effect Excluding Meteorological Effects  7. Influence of Source Motion on Ground Effect and Diffraction  8. Predicting Effects of Mixed Impedance Ground  9. Predicting the Performance of Outdoor Noise Barriers  10. Predicting Effects of Vegetation, Trees and Turbulence  11. Analytical Approximations including Ground Effect, Refraction and Turbulence  12. Prediction Schemes  13. Predicting Sound in an Urban Environment.

  8. Prediction-error of Prediction Error (PPE)-based Reversible Data Hiding

    OpenAIRE

    Wu, Han-Zhou; Wang, Hong-Xia; Shi, Yun-Qing

    2016-01-01

    This paper presents a novel reversible data hiding (RDH) algorithm for gray-scaled images, in which the prediction-error of prediction error (PPE) of a pixel is used to carry the secret data. In the proposed method, the pixels to be embedded are firstly predicted with their neighboring pixels to obtain the corresponding prediction errors (PEs). Then, by exploiting the PEs of the neighboring pixels, the prediction of the PEs of the pixels can be determined. And, a sorting technique based on th...
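
    The two-stage idea in the abstract (compute prediction errors, then predict those errors from neighboring errors) can be sketched as follows. This is an illustrative reconstruction, not the authors' algorithm; the simple left/top two-neighbor predictor is an assumption:

```python
def prediction_errors(img):
    """First stage: predict each interior pixel from its left and top
    neighbors and record the prediction error (PE)."""
    pe = {}
    for r in range(1, len(img)):
        for c in range(1, len(img[0])):
            pred = (img[r][c - 1] + img[r - 1][c]) // 2
            pe[(r, c)] = img[r][c] - pred
    return pe

def prediction_error_of_pe(pe):
    """Second stage: predict each PE from already-computed neighboring PEs
    (scanned in raster order, so both neighbors are causal); the residual
    is the prediction-error of prediction error (PPE)."""
    ppe = {}
    for (r, c), e in pe.items():
        neighbors = [pe[k] for k in ((r, c - 1), (r - 1, c)) if k in pe]
        pred = sum(neighbors) // len(neighbors) if neighbors else 0
        ppe[(r, c)] = e - pred
    return ppe

# Toy 3x3 grayscale image:
img = [[10, 10, 10],
       [10, 12, 10],
       [10, 10, 20]]
pe = prediction_errors(img)
ppe = prediction_error_of_pe(img and pe)
```

    In an RDH scheme the secret bits would then be embedded by expanding or shifting the PPE histogram; that step is omitted here.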

  9. Predictability and Prediction for an Experimental Cultural Market

    Science.gov (United States)

    Colbaugh, Richard; Glass, Kristin; Ormerod, Paul

    Individuals are often influenced by the behavior of others, for instance because they wish to obtain the benefits of coordinated actions or infer otherwise inaccessible information. In such situations this social influence decreases the ex ante predictability of the ensuing social dynamics. We claim that, interestingly, these same social forces can increase the extent to which the outcome of a social process can be predicted very early in the process. This paper explores this claim through a theoretical and empirical analysis of the experimental music market described and analyzed in [1]. We propose a very simple model for this music market, assess the predictability of market outcomes through formal analysis of the model, and use insights derived through this analysis to develop algorithms for predicting market share winners, and their ultimate market shares, in the very early stages of the market. The utility of these predictive algorithms is illustrated through analysis of the experimental music market data sets [2].

  10. Collaborative human-machine analysis using a controlled natural language

    Science.gov (United States)

    Mott, David H.; Shemanski, Donald R.; Giammanco, Cheryl; Braines, Dave

    2015-05-01

    A key aspect of an analyst's task in providing relevant information from data is the reasoning about the implications of that data, in order to build a picture of the real world situation. This requires human cognition, based upon domain knowledge about individuals, events and environmental conditions. For a computer system to collaborate with an analyst, it must be capable of following a similar reasoning process to that of the analyst. We describe ITA Controlled English (CE), a subset of English to represent an analyst's domain knowledge and reasoning, in a form that is understandable by both analyst and machine. CE can be used to express domain rules, background data, assumptions and inferred conclusions, thus supporting human-machine interaction. A CE reasoning and modeling system can perform inferences from the data and provide the user with conclusions together with their rationale. We present a logical problem called the "Analysis Game", used for training analysts, which presents "analytic pitfalls" inherent in many problems. We explore an iterative approach to its representation in CE, where a person can develop an understanding of the problem solution by incremental construction of relevant concepts and rules. We discuss how such interactions might occur, and propose that such techniques could lead to better collaborative tools to assist the analyst and avoid the "pitfalls".

  11. Comparison of ArcGIS and SAS Geostatistical Analyst to Estimate Population-Weighted Monthly Temperature for US Counties.

    Science.gov (United States)

    Xiaopeng, Q I; Liang, Wei; Barker, Laurie; Lekiachvili, Akaki; Xingyou, Zhang

    Temperature changes are known to have significant impacts on human health. Accurate estimates of population-weighted average monthly air temperature for US counties are needed to evaluate temperature's association with health behaviours and disease, which are sampled or reported at the county level and measured on a monthly or 30-day basis. Most reported temperature estimates were calculated using ArcGIS, relatively few used SAS. We compared the performance of geostatistical models to estimate population-weighted average temperature in each month for counties in 48 states using ArcGIS v9.3 and SAS v9.2 on a CITGO platform. Monthly average temperature for Jan-Dec 2007 and elevation from 5435 weather stations were used to estimate the temperature at county population centroids. County estimates were produced with elevation as a covariate. Performance of models was assessed by comparing adjusted R², mean squared error, root mean squared error, and processing time. Prediction accuracy for split validation was above 90% for 11 months in ArcGIS and all 12 months in SAS. Cokriging in SAS achieved higher prediction accuracy and lower estimation bias as compared to cokriging in ArcGIS. County-level estimates produced by both packages were positively correlated (adjusted R² range = 0.95 to 0.99); accuracy and precision improved with elevation as a covariate. Both methods from ArcGIS and SAS are reliable for U.S. county-level temperature estimates; however, ArcGIS's merits in spatial data pre-processing and processing time may be important considerations for software selection, especially for multi-year or multi-state projects.
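
    The "population-weighted" part of such an estimate is a plain weighted mean over sub-units of a county. A minimal sketch with made-up values (not the study's station or census data):

```python
def population_weighted_mean(temps, populations):
    """Population-weighted average temperature across sub-units
    (e.g. census tracts or block groups within a county)."""
    total_pop = sum(populations)
    return sum(t * p for t, p in zip(temps, populations)) / total_pop

# Hypothetical tract-level January temperatures (degrees C) and populations:
avg = population_weighted_mean([1.0, 3.0], [1000, 3000])
print(avg)  # 2.5 -- the more populous, warmer tract dominates
```

    The kriging/cokriging step that produces the tract-level temperatures themselves is the part the packages differ on; the weighting afterward is identical either way.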

  12. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  13. Symplectic Geometric Algorithms for Hamiltonian Systems

    CERN Document Server

    Feng, Kang

    2010-01-01

    "Symplectic Geometry Algorithms for Hamiltonian Systems" will be useful not only for numerical analysts, but also for those in theoretical physics, computational chemistry, celestial mechanics, etc. The book generalizes and develops the generating function and Hamilton-Jacobi equation theory from the perspective of the symplectic geometry and symplectic algebra. It will be a useful resource for engineers and scientists in the fields of quantum theory, astrophysics, atomic and molecular dynamics, climate prediction, oil exploration, etc. Therefore a systematic research and development

  14. Will Saudi Arabia run out of money? : “they talk the talk, but can they walk the walk?”

    OpenAIRE

    Mundal, Bjørnar; Waaler, Frederik

    2015-01-01

    We examine when and if Saudi Arabia runs out of money in order to predict if Saudi Arabia will change its oil policy due to the low oil price we are facing today. This is done by running different static analyses, time series analysis and an analysis on the effects of cutting production based on elasticity of supply. We find that Saudi Arabia will run out of money if they continue without financial and strategic adjustments. However, our findings show that other analysts overestima...

  15. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process

  16. Princes, priests, and people is Saudi Arabia the next Iran

    OpenAIRE

    Waltermire, Bradley J.

    2005-01-01

    Renewed violent attacks in Saudi Arabia against the monarchy, combined with growing concern over royal corruption, have led some analysts to predict that Saudi Arabia is likely to be "the next Iran"-that Islamist revolutionaries will come to power in Riyadh. I test this theory through the lens of network analysis in order to measure the degree of state-society integration in Pahlavi Iran and Saudi Arabia. My analysis finds that a) the Saudi state is far more integrated in society through so...

  17. CASL Dakota Capabilities Summary

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Simmons, Chris [Univ. of Texas, Austin, TX (United States); Williams, Brian J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  18. Mechanical Characterization of 3D Woven Carbon Composite

    Science.gov (United States)

    2017-09-18

    Customer: US Army RDECOM-ARDEC Benet Labs. Attention: Andrew Littlefield. Purchase Order #: 4601885344. Analyst: R. Martin / M. Brady. Attachments: 2 graphs. Date: January 23, 2017. Material: MPT-007-006-001 Ply...

  19. Similarities and Differences Between Warped Linear Prediction and Laguerre Linear Prediction

    NARCIS (Netherlands)

    Brinker, Albertus C. den; Krishnamoorthi, Harish; Verbitskiy, Evgeny A.

    2011-01-01

    Linear prediction has been successfully applied in many speech and audio processing systems. This paper presents the similarities and differences between two classes of linear prediction schemes, namely, Warped Linear Prediction (WLP) and Laguerre Linear Prediction (LLP). It is shown that both
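
    Both WLP and LLP generalize ordinary linear prediction, in which a sample is estimated from its p predecessors. A minimal order-1 sketch of the plain (unwarped) case, included here only to fix ideas:

```python
def lp1_coefficient(x):
    """Least-squares order-1 linear prediction coefficient a, minimizing
    the sum over n of (x[n] - a * x[n-1])**2 for the signal x."""
    num = sum(x[n - 1] * x[n] for n in range(1, len(x)))
    den = sum(x[n - 1] ** 2 for n in range(1, len(x)))
    return num / den

# A decaying exponential x[n] = 0.9**n satisfies x[n] = 0.9 * x[n-1]
# exactly, so order-1 prediction recovers the coefficient 0.9.
x = [0.9 ** n for n in range(20)]
a = lp1_coefficient(x)
print(a)  # approximately 0.9
```

    WLP and LLP replace the unit delays implicit in this predictor with all-pass (warped) or Laguerre filter sections, which is where the two schemes differ.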

  20. Predictive Manufacturing: A Classification Strategy to Predict Product Failures

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Kulahci, Murat

    2018-01-01

    manufacturing analytics model that employs a big data approach to predicting product failures; third, we illustrate the issue of high dimensionality, along with statistically redundant information; and, finally, our proposed method will be compared against the well-known classification methods (SVM, K......-nearest neighbor, artificial neural networks). The results from real data show that our predictive manufacturing analytics approach, using genetic algorithms and Voronoi tessellations, is capable of predicting product failure with reasonable accuracy. The potential application of this method contributes...... to accurately predicting product failures, which would enable manufacturers to reduce production costs without compromising product quality....
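
    The Voronoi-tessellation idea can be illustrated with a nearest-centroid classifier: a set of labeled centroids partitions feature space into Voronoi cells, and a unit is predicted to fail if its measurements fall in a "failure" cell. This is a toy sketch under that interpretation, not the authors' genetic-algorithm-optimized method:

```python
def nearest_centroid_label(x, centroids):
    """Assign x the label of its closest centroid; the centroids induce a
    Voronoi tessellation of the feature space, one cell per centroid."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda c: dist2(x, c[0]))[1]

# Hypothetical 2-D process-measurement centroids:
centroids = [((0.0, 0.0), "pass"), ((5.0, 5.0), "fail")]
label = nearest_centroid_label((4.0, 4.5), centroids)
print(label)  # fail
```

    In the paper's setting, a genetic algorithm would search for centroid placements that best separate failing from passing units; here the centroids are simply given.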

  1. The Determinants of Sell-side Analysts’ Forecast Accuracy and Media Exposure

    Directory of Open Access Journals (Sweden)

    Samira Amadu Sorogho

    2017-09-01

    Full Text Available This study examines contributing factors to the differential forecasting abilities of sell-side analysts and the relation between the sentiments of these analysts and their media exposure. In particular, I investigate whether the level of optimism expressed in sell-side analysts' reports of fifteen constituents of primarily the S&P 500 Oil and Gas Industry enhances the media appearance of these analysts. Using a number of variables estimated from the I/B/E/S Detail history database, 15,455 analyst reports collected from Thomson Reuters Investext and analyst media appearances obtained from Dow Jones Factiva from 1999 to 2014, I run a multiple linear regression to determine the effect of independent variables on dependent variables. I find that an analyst's forecast accuracy (as measured by the errors inherent in his forecasts) is negatively associated with the analyst's level of media exposure, experience, brokerage size, the number of times he revises his forecasts in a year and the number of companies followed by the analyst, and positively associated with the analyst's level of optimism expressed in his reports, forecast horizon and the size of the company he follows.
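
    The multiple linear regression underlying the study is ordinary least squares. A self-contained sketch on synthetic data (the variable names and values are illustrative, not the I/B/E/S measures):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """Fit y ~ b0 + b1*x1 + ... by solving the normal equations X'X b = X'y."""
    rows = [[1.0] + list(x) for x in X]  # prepend an intercept column
    k = len(rows[0])
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    return solve(XtX, Xty)

# Exact-fit check: forecast error = 1 + 2*exposure + 3*horizon (made-up relation)
X = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 1)]
y = [1 + 2 * a + 3 * b for a, b in X]
coeffs = ols(X, y)
print(coeffs)  # approximately [1.0, 2.0, 3.0]
```

    The signs of the fitted coefficients are what carry the study's conclusions (negative for media exposure and experience, positive for optimism and forecast horizon).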

  2. Problems of internalization: a button is a button is-not.

    Science.gov (United States)

    Rockwell, Shelley

    2014-01-01

    Analysts hope to help the patient internalize a relationship with the analyst that contrasts with the original archaic object relation. In this paper, the author describes particular difficulties in working with a patient whose defenses and anxieties were bulimic, her movement toward internalization inevitably undone. Several issues are considered: how does the nonsymbolizing patient come to internalize the analyst's understanding, and when this does not hold, what is the nature of the patient's subsequent methods of dispersal? When the patient can maintain connection to the analyst as a good object, even fleetingly, in the depressive position, the possibility of internalization and symbolic communication is increased. © 2014 The Psychoanalytic Quarterly, Inc.

  3. Eid Mawlid al-Nabi, Eid al-Fitr and Eid al-Adha; optimism and impact on analysts’ recommendations: Evidence From MENA region

    Directory of Open Access Journals (Sweden)

    Harit Satt

    2017-06-01

    Full Text Available This study investigates the holiday effect on analyst recommendations in MENA-country stock markets between 2004 and 2015. The findings show that on Pre-Holidays, analysts tend to issue pessimistic recommendations, and issue optimistic recommendations on Post-Holidays. Prior literature on the day-of-the-week effect endorses our results, documenting an increase in stock prices during the week and a decrease in stock prices over the weekend. We argue that analysts can benefit from the upward trend in stock prices during Post-Holidays by issuing optimistic recommendations. Analysts may likewise benefit from the downward trend in stock prices by issuing pessimistic recommendations on Pre-Holidays.

  4. Applied predictive control

    CERN Document Server

    Sunan, Huang; Heng, Lee Tong

    2002-01-01

    The presence of considerable time delays in the dynamics of many industrial processes, leading to difficult problems in the associated closed-loop control systems, is a well-recognized phenomenon. The performance achievable in conventional feedback control systems can be significantly degraded if an industrial process has a relatively large time delay compared with the dominant time constant. Under these circumstances, advanced predictive control is necessary to improve the performance of the control system significantly. The book is a focused treatment of the subject matter, including the fundamentals and some state-of-the-art developments in the field of predictive control. Three main schemes for advanced predictive control are addressed in this book: • Smith Predictive Control; • Generalised Predictive Control; • a form of predictive control based on Finite Spectrum Assignment. A substantial part of the book addresses application issues in predictive control, providing several interesting case studie...

  5. Looking back to inform the future: The role of cognition in forest disturbance characterization from remote sensing imagery

    Science.gov (United States)

    Bianchetti, Raechel Anne

    Remotely sensed images have become a ubiquitous part of our daily lives. From novice users, aiding in search and rescue missions using tools such as TomNod, to trained analysts, synthesizing disparate data to address complex problems like climate change, imagery has become central to geospatial problem solving. Expert image analysts are continually faced with rapidly developing sensor technologies and software systems. In response to these cognitively demanding environments, expert analysts develop specialized knowledge and analytic skills to address increasingly complex problems. This study identifies the knowledge, skills, and analytic goals of expert image analysts tasked with identification of land cover and land use change. Analysts participating in this research are currently working as part of a national level analysis of land use change, and are well versed with the use of TimeSync, forest science, and image analysis. The results of this study benefit current analysts as it improves their awareness of their mental processes used during the image interpretation process. The study also can be generalized to understand the types of knowledge and visual cues that analysts use when reasoning with imagery for purposes beyond land use change studies. Here a Cognitive Task Analysis framework is used to organize evidence from qualitative knowledge elicitation methods for characterizing the cognitive aspects of the TimeSync image analysis process. Using a combination of content analysis, diagramming, semi-structured interviews, and observation, the study highlights the perceptual and cognitive elements of expert remote sensing interpretation. Results show that image analysts perform several standard cognitive processes, but flexibly employ these processes in response to various contextual cues. Expert image analysts' ability to think flexibly during their analysis process was directly related to their amount of image analysis experience. Additionally, results show

  6. Development of a regional ensemble prediction method for probabilistic weather prediction

    International Nuclear Information System (INIS)

    Nohara, Daisuke; Tamura, Hidetoshi; Hirakuchi, Hiromaru

    2015-01-01

    A regional ensemble prediction method has been developed to provide probabilistic weather prediction using a numerical weather prediction model. To obtain consistent perturbations with the synoptic weather pattern, both of initial and lateral boundary perturbations were given by differences between control and ensemble member of the Japan Meteorological Agency (JMA)'s operational one-week ensemble forecast. The method provides a multiple ensemble member with a horizontal resolution of 15 km for 48-hour based on a downscaling of the JMA's operational global forecast accompanied with the perturbations. The ensemble prediction was examined in the case of heavy snow fall event in Kanto area on January 14, 2013. The results showed that the predictions represent different features of high-resolution spatiotemporal distribution of precipitation affected by intensity and location of extra-tropical cyclone in each ensemble member. Although the ensemble prediction has model bias of mean values and variances in some variables such as wind speed and solar radiation, the ensemble prediction has a potential to append a probabilistic information to a deterministic prediction. (author)
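
    The perturbation construction described above (regional member = downscaled control + the global member-minus-control difference) reduces to a field-by-field addition. A toy sketch, assuming all three fields have already been interpolated to the regional grid:

```python
def perturbed_initial_condition(downscaled_control, global_member, global_control):
    """Build a regional ensemble member's initial field by adding the
    global member-vs-control difference (the perturbation) to the
    downscaled control state, point by point."""
    return [d + (m - c) for d, m, c in
            zip(downscaled_control, global_member, global_control)]

# Toy 1-D temperature field (K): the perturbation is +0.5 at every point.
field = perturbed_initial_condition([280.0, 281.0], [285.5, 286.5], [285.0, 286.0])
print(field)  # [280.5, 281.5]
```

    The same difference is applied at the lateral boundaries at each boundary-update time, which is what keeps the regional perturbations consistent with the synoptic-scale pattern of the driving global ensemble.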

  7. The lure of the symptom in psychoanalytic treatment.

    Science.gov (United States)

    Ogden, Thomas H; Gabbard, Glen O

    2010-06-01

    Psychoanalysis, which at its core is a search for truth, stands in a subversive position vis-à-vis the contemporary therapeutic culture that places a premium on symptomatic "cure." Nevertheless, analysts are vulnerable to succumbing to the internal and external pressures for the achievement of symptomatic improvement. In this communication we trace the evolution of Freud's thinking about the relationship between the aims of psychoanalysis and the alleviation of symptoms. We note that analysts today may recapitulate Freud's early struggles in their pursuit of symptom removal. We present an account of a clinical consultation in which the analytic pair were ensnared in an impasse that involved the analyst's preoccupation with the intransigence of one of the patient's symptoms. We suggest alternative ways of working with these clinical issues and offer some thoughts on how our own work as analysts and consultants to colleagues has been influenced by our understanding of what frequently occurs when the analyst becomes symptom-focused.

  8. The Divergent Paths of Behavior Analysis and Psychology: Vive la Différence!

    OpenAIRE

    Thyer, Bruce A.

    2014-01-01

    Twenty years ago I suggested that behavior analysts could effect a quiet and covert takeover of the American Psychological Association (APA). I gave as precedents the operation of similar initiatives in the nineteenth and twentieth centuries, the Darwinian-inspired X-Club, and the psychoanalytically-oriented Secret Ring. Through a conscientious program of working within established APA bylaws and rules, behavior analysts could ensure that behavior analysts were nominated for every significant ...

  9. The Temple Translator's Workstation Project

    National Research Council Canada - National Science Library

    Vanni, Michelle; Zajac, Remi

    1996-01-01

    .... The Temple Translator's Workstation is incorporated into a Tipster document management architecture and it allows both translator/analysts and monolingual analysts to use the machine- translation...

  10. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    Science.gov (United States)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated over a four-year period. Within the 50 ensemble members, which are initialized every 12 hours and are run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to 3 days lead time. At longer lead times the occurrence probabilities are overestimated by the EPS.
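The clustering and matching steps described above can be sketched as follows. The function names, the 50-member ensemble size, and the 95% chi-square matching criterion are illustrative assumptions, not details taken from the study:

```python
import numpy as np

def occurrence_probability(cluster_member_ids, ensemble_size=50):
    """Fraction of ensemble members predicting a storm in this cluster
    (illustrative; the ECMWF EPS described above has 50 members)."""
    return len(cluster_member_ids) / ensemble_size

def matches_observation(cluster_positions, obs_xy, chi2_95=5.99):
    """Fit a bivariate normal (mean + covariance) to the member storm-track
    positions and test whether the observed storm position falls inside the
    95% probability ellipse (chi-square with 2 degrees of freedom)."""
    pos = np.asarray(cluster_positions, dtype=float)
    mu = pos.mean(axis=0)
    cov = np.cov(pos, rowvar=False)
    d = np.asarray(obs_xy, dtype=float) - mu
    mahalanobis_sq = d @ np.linalg.inv(cov) @ d
    return bool(mahalanobis_sq <= chi2_95)
```

A cluster predicted by 25 of 50 members would then carry a 50% occurrence probability, and would count as a hit if the reanalysis storm lies inside the fitted ellipse.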

  11. Accurate and dynamic predictive model for better prediction in medicine and healthcare.

    Science.gov (United States)

    Alanazi, H O; Abdullah, A H; Qureshi, K N; Ismail, A S

    2018-05-01

Information and communication technologies (ICTs) have driven a shift toward new integrated operations and methods in all fields of life. The health sector has also adopted new technologies to improve its systems and provide better services to customers. Predictive models in health care have likewise been influenced by new technologies to predict different disease outcomes. However, existing predictive models still suffer from some limitations in predictive performance. In order to improve predictive performance, this paper proposes a predictive model that classifies disease predictions into different categories. To evaluate the model's performance, this paper uses traumatic brain injury (TBI) datasets. TBI is one of the most serious diseases worldwide and needs more attention due to its severe impact on human life. The proposed predictive model improves the predictive performance for TBI. The TBI data set was developed, and its features approved, by neurologists. The experimental results show that the proposed model achieves significant accuracy, sensitivity, and specificity.
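Accuracy, sensitivity, and specificity, the metrics reported for the model, can be computed from a confusion matrix as in this minimal sketch (the function name and the 0/1 label encoding are assumptions for illustration):

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, sensitivity (true-positive rate) and specificity
    (true-negative rate) for binary labels, with 1 = positive outcome."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn) if tp + fn else float("nan"),
        "specificity": tn / (tn + fp) if tn + fp else float("nan"),
    }
```

Sensitivity and specificity matter separately here because, for a condition like TBI, a false negative and a false positive carry very different clinical costs.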

  12. Computational prediction of protein-protein interactions in Leishmania predicted proteomes.

    Directory of Open Access Journals (Sweden)

    Antonio M Rezende

Full Text Available The trypanosomatid parasites Leishmania braziliensis, Leishmania major and Leishmania infantum are important human pathogens. Despite years of study and genome availability, no effective vaccine has been developed yet, and the chemotherapy is highly toxic. It is therefore clear that only interdisciplinary, integrated studies can succeed in the search for new targets for the development of vaccines and drugs. An essential part of this rationale is the study of protein-protein interaction (PPI) networks, which can provide a better understanding of complex protein interactions in biological systems. Thus, we modeled PPIs for trypanosomatids through computational methods, using sequence comparison against public databases of protein or domain interactions for interaction prediction (Interolog Mapping), and developed a dedicated combined system score to address the robustness of the predictions. The confidence of the network prediction approach was evaluated using gold-standard positive and negative datasets, and the AUC value obtained was 0.94. As a result, 39,420, 43,531 and 45,235 interactions were predicted for L. braziliensis, L. major and L. infantum, respectively. For each predicted network the top 20 proteins were ranked by the MCC topological index. In addition, information related to immunological potential, the degree of protein sequence conservation among orthologs, and the degree of identity compared to proteins of potential parasite hosts was integrated. This integration provides a better understanding of the predicted networks and makes them more useful for selecting new potential biological targets for drug and vaccine development. Network modularity, which is key when one is interested in destabilizing PPIs for drug or vaccine purposes, was analyzed along with multiple alignments of the predicted PPIs, revealing patterns associated with protein turnover. In addition, around 50% of hypothetical proteins present in the networks
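The interolog mapping idea behind these predictions (a pair of target-species proteins is predicted to interact when their orthologs are known to interact in a reference species) can be sketched as follows; the data structures and names are illustrative and not the authors' pipeline:

```python
def interolog_predictions(orthologs, reference_interactions):
    """Minimal interolog mapping sketch.

    orthologs: dict mapping each target-species protein to its ortholog
    in a reference species.
    reference_interactions: set of frozensets of reference-species
    proteins known to interact.
    Returns predicted target-species interactions as frozensets."""
    proteins = list(orthologs)
    predicted = set()
    for i, a in enumerate(proteins):
        for b in proteins[i + 1:]:
            if frozenset((orthologs[a], orthologs[b])) in reference_interactions:
                predicted.add(frozenset((a, b)))
    return predicted
```

A real pipeline would add a combined confidence score per predicted edge (as the abstract describes) rather than the binary decision shown here.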

  13. Pre-Conflict Management Tools: Winning the Peace

    National Research Council Canada - National Science Library

    Frank, Aaron B

    2005-01-01

    The Pre-Conflict Management Tools (PCMT) Program was developed to transform how intelligence analysts, policy analysts, operational planners, and decisionmakers interact when confronting highly complex strategic problems...

  14. Application of the Augmented Operator Function Model for Developing Cognitive Metrics in Persistent Surveillance

    Science.gov (United States)

    2013-09-26

...exploiting FMV data for real-time support. In fact, FMV analysts reported they did not multitask at all, while WAMI analysts reported monitoring...approach that is applicable to WAMI exploitation. Tools are needed to allow analysts to multitask, monitor multiple events within the AOI, and to

  15. Assessment of Computational Fluid Dynamics (CFD) Models for Shock Boundary-Layer Interaction

    Science.gov (United States)

    DeBonis, James R.; Oberkampf, William L.; Wolf, Richard T.; Orkwis, Paul D.; Turner, Mark G.; Babinsky, Holger

    2011-01-01

    A workshop on the computational fluid dynamics (CFD) prediction of shock boundary-layer interactions (SBLIs) was held at the 48th AIAA Aerospace Sciences Meeting. As part of the workshop numerous CFD analysts submitted solutions to four experimentally measured SBLIs. This paper describes the assessment of the CFD predictions. The assessment includes an uncertainty analysis of the experimental data, the definition of an error metric and the application of that metric to the CFD solutions. The CFD solutions provided very similar levels of error and in general it was difficult to discern clear trends in the data. For the Reynolds Averaged Navier-Stokes methods the choice of turbulence model appeared to be the largest factor in solution accuracy. Large-eddy simulation methods produced error levels similar to RANS methods but provided superior predictions of normal stresses.

  16. A psychoanalytical phenomenology of perversion.

    Science.gov (United States)

    Jiménez, Juan Pablo

    2004-02-01

    After stating that the current tasks of psychoanalytic research should fundamentally include the exploration of the analyst's mental processes in sessions with the patient, the author describes the analytical relation as one having an intersubjective nature. Seen from the outside, the analytical relation evidences two poles: a symmetric structural pole where both analyst and patient share a single world and a single approach to reality, and a functional asymmetric pole that defines the assignment of the respective roles. In the analysis of a perverse patient, the symmetry-asymmetry polarities acquire some very particular characteristics. Seen from the perspective of the analyst's subjectivity, perversion appears in the analyst's mind as a surreptitious and unexpected transgression of the basic agreement that facilitates and structures intersubjective encounters. It may go as far as altering the Aristotelian rules of logic. When coming into contact with the psychic reality of a perverse patient, what happens in the analyst's mind is that a world takes shape. This world is misleadingly coloured by an erotisation that sooner or later will acquire some characteristics of violence. The perverse nucleus, as a false reality, remains dangling in mid-air as an experience that is inaccessible to the analyst's empathy. The only way the analyst can reach it is from the 'periphery' of the patient's psychic reality, by trying in an indirect way to lead him back to his intersubjective roots. At this point, the author's intention is to explain this intersubjective phenomenon in terms of metapsychological and empirical research-based theories. Finally, some ideas on the psychogenesis of perversion are set forth.

  17. Predictive systems ecology.

    Science.gov (United States)

    Evans, Matthew R; Bithell, Mike; Cornell, Stephen J; Dall, Sasha R X; Díaz, Sandra; Emmott, Stephen; Ernande, Bruno; Grimm, Volker; Hodgson, David J; Lewis, Simon L; Mace, Georgina M; Morecroft, Michael; Moustakas, Aristides; Murphy, Eugene; Newbold, Tim; Norris, K J; Petchey, Owen; Smith, Matthew; Travis, Justin M J; Benton, Tim G

    2013-11-22

Human societies, and their well-being, depend to a significant extent on the state of the ecosystems that surround them. These ecosystems are changing rapidly, usually in response to anthropogenic changes in the environment. To determine the likely impact of environmental change on ecosystems and the best ways to manage them, it would be desirable to be able to predict their future states. We present a proposal to develop the paradigm of predictive systems ecology, explicitly to understand and predict the properties and behaviour of ecological systems. We discuss the necessary and desirable features of predictive systems ecology models. There are places where predictive systems ecology is already being practised, and we summarize a range of terrestrial and marine examples. Significant challenges remain, but we suggest that ecology would both benefit as a scientific discipline and increase its impact in society if it were to embrace the need to become more predictive.

  18. Predictable or not predictable? The MOV question

    International Nuclear Information System (INIS)

    Thibault, C.L.; Matzkiw, J.N.; Anderson, J.W.; Kessler, D.W.

    1994-01-01

Over the past 8 years, the nuclear industry has struggled to understand the dynamic phenomena experienced during motor-operated valve (MOV) operation under differing flow conditions. For some valves and designs, operational functionality has been found to be predictable; for others, unpredictable. Although much has been accomplished over this period, especially in modeling valve dynamics, the unpredictability of many valves and designs persists. A few valve manufacturers are focusing on improving design and fabrication techniques to enhance product reliability and predictability. However, this approach does not address these issues for installed and unpredictable valves. This paper presents some of the more promising techniques that Wyle Laboratories has explored with potential for transforming unpredictable valves into predictable ones and for retrofitting installed MOVs. These techniques include optimized valve tolerancing, surrogate material evaluation, and enhanced surface treatments.

  19. Neural Fuzzy Inference System-Based Weather Prediction Model and Its Precipitation Predicting Experiment

    Directory of Open Access Journals (Sweden)

    Jing Lu

    2014-11-01

Full Text Available We propose a weather prediction model in this article based on a neural network and fuzzy inference system (NFIS-WPM), and then apply it to predict daily fuzzy precipitation given meteorological premises for testing. The model consists of two parts: the first is the “fuzzy rule-based neural network”, which simulates sequential relations among fuzzy sets using an artificial neural network; the second is the “neural fuzzy inference system”, which builds on the first part but can learn new fuzzy rules from the previous ones according to the algorithm we propose. NFIS-WPM (High Pro) and NFIS-WPM (Ave) are improved versions of this model. The need for accurate weather prediction is apparent when considering its benefits. However, the excessive pursuit of accuracy in weather prediction makes some “accurate” prediction results meaningless, and numerical prediction models are often complex and time-consuming. By adapting this novel model to a precipitation prediction problem, we make the predicted precipitation outcomes more accurate and the prediction methods simpler than the complex numerical forecasting models, which occupy large computational resources, are time-consuming, and have a low predictive accuracy rate. Accordingly, we achieve more accurate predictive precipitation results than with traditional artificial neural networks, which have low predictive accuracy.
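The rule-based fuzzy inference at the heart of such a model can be illustrated with a minimal zero-order Sugeno-style sketch. The triangular memberships, the single humidity input, and the rule outputs are invented for illustration and are not the NFIS-WPM rules:

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def predict_precip(humidity, rules):
    """rules: list of ((a, b, c), output) pairs. Each rule fires with the
    membership degree of the input and proposes a crisp precipitation value;
    the prediction is the firing-strength-weighted average of rule outputs."""
    fired = [(tri(humidity, *params), out) for params, out in rules]
    total = sum(w for w, _ in fired)
    return sum(w * out for w, out in fired) / total if total else 0.0
```

With rules "low humidity, no rain" and "high humidity, 10 mm rain", an input between the two peaks yields an intermediate, smoothly blended prediction.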

  20. Prediction of residential radon exposure of the whole Swiss population: comparison of model-based predictions with measurement-based predictions.

    Science.gov (United States)

    Hauri, D D; Huss, A; Zimmermann, F; Kuehni, C E; Röösli, M

    2013-10-01

Radon plays an important role in human exposure to natural sources of ionizing radiation. The aim of this article is to compare two approaches to estimating mean radon exposure in the Swiss population: model-based predictions at the individual level, and measurement-based predictions based on measurements aggregated at the municipality level. A nationwide model was used to predict radon levels in each household and for each individual based on the corresponding tectonic unit, building age, building type, soil texture, degree of urbanization, and floor. Measurement-based predictions were carried out within a health impact assessment on residential radon and lung cancer. Mean measured radon levels were corrected for the average floor distribution and weighted with the population size of each municipality. Model-based predictions yielded a mean radon exposure of the Swiss population of 84.1 Bq/m³. Measurement-based predictions yielded an average exposure of 78 Bq/m³. This study demonstrates that the model- and the measurement-based predictions provide similar results. The advantage of the measurement-based approach is its simplicity, which is sufficient for assessing the exposure distribution in a population. The model-based approach allows predicting radon levels at specific sites, which is needed in an epidemiological study, and its results do not depend on how the measurement sites were selected. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
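The measurement-based aggregation described above, municipality mean radon levels weighted by population size, reduces to a weighted average. This sketch is illustrative and assumes the floor-distribution correction has already been applied to the municipality means:

```python
def population_weighted_mean(municipality_means, populations):
    """Population-weighted mean exposure across municipalities:
    sum(mean_i * pop_i) / sum(pop_i), where mean_i is the (corrected)
    mean measured radon level of municipality i in Bq/m^3."""
    total = sum(populations)
    return sum(m * p for m, p in zip(municipality_means, populations)) / total
```

For example, one small municipality averaging 100 Bq/m³ and one three times as populous averaging 60 Bq/m³ give a population mean of 70 Bq/m³, not the unweighted 80 Bq/m³.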

  1. Eid Mawlid al-Nabi, Eid al-Fitr and Eid al-Adha; optimism and impact on analysts’ recommendations: Evidence From MENA region

    OpenAIRE

    Harit Satt

    2017-01-01

This study investigates the holiday effect on analyst recommendations in the MENA countries’ stock markets between 2004 and 2015. The findings show that on pre-holidays analysts tend to issue pessimistic recommendations, and issue optimistic recommendations on post-holidays. Prior literature on the day-of-the-week effect endorses our results, documenting an increase in stock prices during the week and a decrease in stock prices over the weekend. We argue that analysts can benefit from the upw...

  2. Predictive modeling of complications.

    Science.gov (United States)

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  3. The organically bound tritium: an analyst vision

    International Nuclear Information System (INIS)

    Ansoborlo, E.; Baglan, N.

    2009-01-01

The authors report the work of a working group on tritium analysis. They recall the different physical forms of tritium: gas (HT, hydrogen-tritium), water vapour (HTO, tritiated water) or methane (CH3T), but also organic compounds (OBT, organically bound tritium), which are either exchangeable or non-exchangeable. They describe measurement techniques and methods, notably for determining the tritium volume activity, and discuss the possibilities of analysing and distinguishing exchangeable and non-exchangeable OBT.

  4. Financial analysts and their opinion about nuclear

    International Nuclear Information System (INIS)

    Vos, Patrick H. de

    1999-01-01

Electrabel is a Belgian utility with business activities ranging from electricity generation and transmission (market share 88%) to direct distribution of electricity to industrial customers. Electrabel is listed on the Brussels stock market and ranks among the three highest market capitalisations. Electrabel owns the following NPPs: Doel 1, Tihange 1, Doel 2, Doel 3, Tihange 2, Doel 4, Tihange 3, Chooz B and Tricastin. Electrabel opted for an integrated communication strategy, that is to say that the release of information relating to its image, both inside and outside the company, to the media and financial circles, and even the marketing and logistics of communication, must present a consistent message overall; only the language is adapted to each target public. The communication policy aims mainly to provide communication that is as objective as possible with our discussion partners, that is to say, on the basis of a long-term professional relationship in a climate of mutual confidence.

  5. IT Operations Analyst | IDRC - International Development Research ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Contributes to the ISR work plan and participates in the setting of ISR priorities ... Assists the project manager and technical team in the development of ... in the preparation of briefing notes, technology summaries and analysis reports.

  6. Michael Howard: Military Historian and Strategic Analyst.

    Science.gov (United States)

    1983-06-10

eighties it became obvious that "the growing acceptability of military studies" was a consequence of a careful nurturing during the embryo stage by...would decide the destiny of the nation. If the machine gun and the artillery piece meant the "battle went on for longer than expected; the casualties

  7. Analysts' earnings forecasts and international asset allocation

    NARCIS (Netherlands)

    Huijgen, Carel; Plantinga, Auke

    1999-01-01

    The aim of this paper is to investigate whether financial analysts’ earnings forecasts are informative from the viewpoint of allocating investments across different stock markets. Therefore we develop a country forecast indicator reflecting the analysts’ prospects for specific stock markets. The

  8. IT Service Desk Analyst | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    S/he participates in IMTD-led projects to represent users' interests and to determine impact on ... Recommends hardware and software changes and updates to the ... in the Service Desk incident management system in order to help determine, ...

  9. Can Social Networks Assist Analysts Fight Terrorism?

    Science.gov (United States)

    2011-06-01

Facebook set up 53 AMBER alert pages, one for each of the 50 states, along with pages for the District of Columbia, Puerto Rico, and the U.S. Virgin ...was the commonplace name of massive blizzard-like storms that plagued the northeast during the 2009-2010 holidays. Ryan Ozimek and his team at PICnet

  10. Senior Analyst | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Primary Duties or Responsibilities Strategic Planning and Development ... Conveys the directions determined by the Board and provides details on IDRC operations ... the opportunities and challenges these present for IDRC's business model; ...

  11. Gravity's ghost and big dog scientific discovery and social analysis in the twenty-first century

    CERN Document Server

    Collins, Harry

    2013-01-01

    Gravity's Ghost and Big Dog brings to life science's efforts to detect cosmic gravitational waves. These ripples in space-time are predicted by general relativity, and their discovery will not only demonstrate the truth of Einstein's theories but also transform astronomy. Although no gravitational wave has ever been directly detected, the previous five years have been an especially exciting period in the field. Here sociologist Harry Collins offers readers an unprecedented view of gravitational wave research and explains what it means for an analyst to do work of this kind.

  12. Endogenous scheduling preferences and congestion

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Small, Kenneth

    2010-01-01

    and leisure, but agglomeration economies at home and at work lead to scheduling preferences forming endogenously. Using bottleneck congestion technology, we obtain an equilibrium queuing pattern consistent with a general version of the Vickrey bottleneck model. However, the policy implications are different....... Compared to the predictions of an analyst observing untolled equilibrium and taking scheduling preferences as exogenous, we find that both the optimal capacity and the marginal external cost of congestion have changed. The benefits of tolling are greater, and the optimal time varying toll is different....

  13. Technical Equivalency Documentation for a Newly Acquired Alpha Spectroscopy System

    International Nuclear Information System (INIS)

    Hickman, D P; Fisher, S K; Hann, P R; Hume, R

    2007-01-01

The response of a recently acquired Canberra(trademark) Alpha Analyst 'Blue' system (Chamber Nos. 173-208) used by the Hazards Control, Radiation Safety Section, WBC/Spectroscopy Team has been studied with respect to an existing Canberra system. The existing Canberra system consists of thirty Alpha Analyst dual chambers, Model XXXX, comprising a total of sixty detectors (Chamber Nos. 101-124 and 137-172). The existing chambers were previously compared to an older system consisting of thirty-six Model 7401 alpha spectrometry chambers (Chamber Nos. 1-36). Chambers 101-124 and 137-172 are DOELAP accredited. The older system was previously DOELAP accredited for the routine alpha spectroscopy program used in LLNL's in vitro bioassay program. The newly acquired Alpha Analyst system operates on a network with software that controls and performs analysis for the current Alpha Analyst system (Chamber Nos. 101-124 and 137-172). This same software is used for both the current system and the newly acquired system and is DOELAP accredited. This document compares results from the existing alpha system with the newer Alpha Analyst system.

  14. Redeye: A Digital Library for Forensic Document Triage

    Energy Technology Data Exchange (ETDEWEB)

    Bogen, Paul Logasa [ORNL; McKenzie, Amber T [ORNL; Gillen, Rob [ORNL

    2013-01-01

Forensic document analysis has become an important aspect of the investigation of many different kinds of crimes, from money laundering to fraud and from cybercrime to smuggling. The current workflow for analysts includes powerful tools, such as Palantir and Analyst's Notebook, for moving from evidence to actionable intelligence, and tools for finding documents among the millions of files on a hard disk, such as FTK. However, sorting through collections of seized documents to filter the noise from the actual evidence often remains a highly labor-intensive manual effort. This paper presents the Redeye Analysis Workbench, a tool to help analysts move from manually sorting a collection of documents to performing intelligent document triage over a digital library. We discuss the tools and techniques we build upon, in addition to an in-depth discussion of our tool and how it addresses two major use cases we observed analysts performing. Finally, we also include a new layout algorithm for radial graphs that is used to visualize clusters of documents in our system.

  15. Predictive medicine

    NARCIS (Netherlands)

    Boenink, Marianne; ten Have, Henk

    2015-01-01

    In the last part of the twentieth century, predictive medicine has gained currency as an important ideal in biomedical research and health care. Research in the genetic and molecular basis of disease suggested that the insights gained might be used to develop tests that predict the future health

  16. Predictability of Conversation Partners

    Science.gov (United States)

    Takaguchi, Taro; Nakamura, Mitsuhiro; Sato, Nobuo; Yano, Kazuo; Masuda, Naoki

    2011-08-01

Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one’s conversation partners is defined as the degree to which one’s next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, a predictability also exists in the data, apart from the contribution of their long-tailed nature. In addition, an individual’s predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community—in the sense of an abundance of surrounding triangles—tend to have low predictability, and those bridging different communities tend to have high predictability.
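The uncertainty-reduction measure, mutual information between current and next partner normalized by the entropy of the next partner, can be estimated from a partner sequence with a simple plug-in estimator. This sketch is illustrative; the paper's exact estimator may differ:

```python
import math
from collections import Counter

def predictability(partners):
    """Fraction of the uncertainty about the next conversation partner
    removed by knowing the current one: I(next; current) / H(next),
    estimated from empirical bigram frequencies of the sequence."""
    pairs = list(zip(partners, partners[1:]))
    n = len(pairs)
    if n == 0:
        return 0.0
    p_now, p_next, p_joint = Counter(), Counter(), Counter(pairs)
    for x, y in pairs:
        p_now[x] += 1
        p_next[y] += 1
    mi = sum((c / n) * math.log2(c * n / (p_now[x] * p_next[y]))
             for (x, y), c in p_joint.items())
    h_next = -sum((c / n) * math.log2(c / n) for c in p_next.values())
    return mi / h_next if h_next else 0.0
```

A perfectly alternating sequence of two partners is fully predictable (value 1.0), while a sequence whose next partner is independent of the current one approaches 0; the 28.4% figure above corresponds to a value of about 0.284 on this scale.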

  17. Predictability of Conversation Partners

    Directory of Open Access Journals (Sweden)

    Taro Takaguchi

    2011-09-01

Full Text Available Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one’s conversation partners is defined as the degree to which one’s next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, a predictability also exists in the data, apart from the contribution of their long-tailed nature. In addition, an individual’s predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community—in the sense of an abundance of surrounding triangles—tend to have low predictability, and those bridging different communities tend to have high predictability.

  18. On becoming a psychoanalyst.

    Science.gov (United States)

    Gabbard, Glen O; Ogden, Thomas H

    2009-04-01

    One has the opportunity and responsibility to become an analyst in one's own terms in the course of the years of practice that follow the completion of formal analytic training. The authors discuss their understanding of some of the maturational experiences that have contributed to their becoming analysts in their own terms. They believe that the most important element in the process of their maturation as analysts has been the development of the capacity to make use of what is unique and idiosyncratic to each of them; each, when at his best, conducts himself as an analyst in a way that reflects his own analytic style; his own way of being with, and talking with, his patients; his own form of the practice of psychoanalysis. The types of maturational experiences that the authors examine include situations in which they have learned to listen to themselves speak with their patients and, in so doing, begin to develop a voice of their own; experiences of growth that have occurred in the context of presenting clinical material to a consultant; making self-analytic use of their experience with their patients; creating/discovering themselves as analysts in the experience of analytic writing (with particular attention paid to the maturational experience involved in writing the current paper); and responding to a need to keep changing, to be original in their thinking and behavior as analysts.

  19. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    Energy Technology Data Exchange (ETDEWEB)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.; Dowling, Michelle V.; Feng, Mi

    2017-10-09

Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools which help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigation are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools. We discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.

  20. A note on notes: note taking and containment.

    Science.gov (United States)

    Levine, Howard B

    2007-07-01

In extreme situations of massive projective identification, both the analyst and the patient may come to share a fantasy or belief that his or her own psychic reality will be annihilated if the psychic reality of the other is accepted or adopted (Britton 1998). In the example of Dr. M and his patient, the paradoxical dilemma around note taking had highly specific transference meanings; it was not simply an instance of the generalized human response of distracted attention that Freud (1912) had spoken of, nor was it the destabilization of analytic functioning that I tried to describe in my work with Mr. L. Whether such meanings will always exist in these situations remains a matter to be determined by further clinical experience. In reopening a dialogue about note taking during sessions, I have attempted to move the discussion away from categorical injunctions about what analysts should or should not do, and instead to foster a more nuanced, dynamic, and pair-specific consideration of the analyst's functioning in the immediate context of the analytic relationship. There is, of course, a wide variety of listening styles among analysts, and each analyst's mental functioning may be affected differently by each patient whom the analyst sees. I have raised many questions in the hopes of stimulating an expanded discussion that will allow us to share our experiences and perhaps reach additional conclusions. Further consideration may lead us to decide whether note taking may have very different meanings for other analysts and analyst-patient pairs, and whether it may serve useful functions in addition to the one that I have described.

  1. Structural prediction in aphasia

    Directory of Open Access Journals (Sweden)

    Tessa Warren

    2015-05-01

Full Text Available There is considerable evidence that young healthy comprehenders predict the structure of upcoming material, and that their processing is facilitated when they encounter material matching those predictions (e.g., Staub & Clifton, 2006; Yoshida, Dickey & Sturt, 2013). However, less is known about structural prediction in aphasia. There is evidence that lexical prediction may be spared in aphasia (Dickey et al., 2014; Love & Webb, 1977; cf. Mack et al., 2013). However, predictive mechanisms supporting facilitated lexical access may not necessarily support structural facilitation. Given that many people with aphasia (PWA) exhibit syntactic deficits (e.g., Goodglass, 1993), PWA with such impairments may not engage in structural prediction. However, recent evidence suggests that some PWA may indeed predict upcoming structure (Hanne, Burchert, De Bleser, & Vasishth, 2015). Hanne et al. tracked the eyes of PWA (n=8) with sentence-comprehension deficits while they listened to reversible subject-verb-object (SVO) and object-verb-subject (OVS) sentences in German, in a sentence-picture matching task. Hanne et al. manipulated case and number marking to disambiguate the sentences’ structure. Gazes to an OVS or SVO picture during the unfolding of a sentence were assumed to indicate prediction of the structure congruent with that picture. According to this measure, the PWA’s structural prediction was impaired compared to controls, but they did successfully predict upcoming structure when morphosyntactic cues were strong and unambiguous. Hanne et al.’s visual-world evidence is suggestive, but their forced-choice sentence-picture matching task places tight constraints on possible structural predictions. Clearer evidence of structural prediction would come from paradigms where the content of upcoming material is not as constrained. The current study used a self-paced reading study to examine structural prediction among PWA in less constrained contexts. PWA (n=17) who

  2. Ground motion predictions

    Energy Technology Data Exchange (ETDEWEB)

    Loux, P C [Environmental Research Corporation, Alexandria, VA (United States)

    1969-07-01

    Nuclear generated ground motion is defined and then related to the physical parameters that cause it. Techniques employed for prediction of ground motion peak amplitude, frequency spectra and response spectra are explored, with initial emphasis on the analysis of data collected at the Nevada Test Site (NTS). NTS postshot measurements are compared with pre-shot predictions. Applicability of these techniques to new areas, for example, Plowshare sites, must be questioned. Fortunately, the Atomic Energy Commission is sponsoring complementary studies to improve prediction capabilities primarily in new locations outside the NTS region. Some of these are discussed in the light of anomalous seismic behavior, and comparisons are given showing theoretical versus experimental results. In conclusion, current ground motion prediction techniques are applied to events off the NTS. Predictions are compared with measurements for the event Faultless and for the Plowshare events, Gasbuggy, Cabriolet, and Buggy I. (author)

  3. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
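The variance-ratio idea described above can be illustrated numerically: an input's importance indicator is the fraction of prediction variance explained by that input alone, Var(E[Y|Xi]) / Var(Y). The toy model and the plain Monte Carlo binning estimator below are illustrative assumptions; McKay's procedure uses replicated Latin hypercube sampling rather than this simple estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Toy model: the output is dominated by the first input; the third is inert.
    return 3.0 * x[:, 0] + 0.5 * x[:, 1] ** 2

n = 100_000
x = rng.uniform(0.0, 1.0, size=(n, 3))
y = model(x)

def variance_ratio(xi, y, bins=50):
    """Estimate Var(E[Y|Xi]) / Var(Y) by binning Xi into quantile bins."""
    edges = np.quantile(xi, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, xi, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    # Weighted spread of conditional means relative to total variance.
    return np.average((cond_means - y.mean()) ** 2, weights=counts) / y.var()

ratios = [variance_ratio(x[:, i], y) for i in range(3)]
```

For this toy model the first input's ratio is close to 1 while the inert third input's ratio is close to 0, which is the pattern an importance indicator should recover.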

  4. Ground motion predictions

    International Nuclear Information System (INIS)

    Loux, P.C.

    1969-01-01

    Nuclear generated ground motion is defined and then related to the physical parameters that cause it. Techniques employed for prediction of ground motion peak amplitude, frequency spectra and response spectra are explored, with initial emphasis on the analysis of data collected at the Nevada Test Site (NTS). NTS postshot measurements are compared with pre-shot predictions. Applicability of these techniques to new areas, for example, Plowshare sites, must be questioned. Fortunately, the Atomic Energy Commission is sponsoring complementary studies to improve prediction capabilities primarily in new locations outside the NTS region. Some of these are discussed in the light of anomalous seismic behavior, and comparisons are given showing theoretical versus experimental results. In conclusion, current ground motion prediction techniques are applied to events off the NTS. Predictions are compared with measurements for the event Faultless and for the Plowshare events, Gasbuggy, Cabriolet, and Buggy I. (author)

  5. 10 CFR 455.20 - Contents of State Plan.

    Science.gov (United States)

    2010-01-01

    ... priority given to fuel types that include oil, natural gas, and electricity, under § 455.131(c)(2); (f) The... assistance analysts. Such policies shall require that technical assistance analysts be free from financial...

  6. Cloud prediction of protein structure and function with PredictProtein for Debian.

    Science.gov (United States)

    Kaján, László; Yachdav, Guy; Vicedo, Esmeralda; Steinegger, Martin; Mirdita, Milot; Angermüller, Christof; Böhm, Ariane; Domke, Simon; Ertl, Julia; Mertes, Christian; Reisinger, Eva; Staniewski, Cedric; Rost, Burkhard

    2013-01-01

    We report the release of PredictProtein for the Debian operating system and derivatives, such as Ubuntu, Bio-Linux, and Cloud BioLinux. The PredictProtein suite is available as a standard set of open source Debian packages. The release covers the most popular prediction methods from the Rost Lab, including methods for the prediction of secondary structure and solvent accessibility (profphd), nuclear localization signals (predictnls), and intrinsically disordered regions (norsnet). We also present two case studies that successfully utilize PredictProtein packages for high performance computing in the cloud: the first analyzes protein disorder for whole organisms, and the second analyzes the effect of all possible single sequence variants in protein coding regions of the human genome.

  7. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; to the contrary, when verified on adequately complex systems, automated analysis could well become routine. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failures, and human errors. Automated analysis is extremely fast and frees the analyst from routine, hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system.

  8. Review of data mining applications for quality assessment in manufacturing industry: support vector machines

    Directory of Open Access Journals (Sweden)

    Rostami Hamidey

    2015-01-01

Full Text Available In many modern manufacturing industries, data that characterize the manufacturing process are electronically collected and stored in databases. Owing to advances in data collection systems and analysis tools, data mining (DM) has been widely applied for quality assessment (QA) in manufacturing industries. In DM, the choice of technique for analyzing a dataset and assessing its quality depends on the understanding of the analyst. On the other hand, with the advent of improved and efficient prediction techniques, the analyst needs to know which tool performs better for a particular type of dataset. Although a few review papers have recently been published discussing DM applications in manufacturing for QA, this paper provides an extensive review of the application of one particular DM technique, the support vector machine (SVM), to QA problems. The review offers a comprehensive analysis of the literature from several points of view: DM concepts, data preprocessing, DM applications for each quality task, SVM preliminaries, and application results. Summary tables and figures accompany the analyses. Finally, conclusions and future research directions are provided.
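To make the surveyed technique concrete, the sketch below trains an RBF-kernel SVM to separate conforming from defective parts in synthetic process data. The sensor channels, defect rule, and scikit-learn pipeline are illustrative assumptions, not taken from any study covered by the review.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Synthetic process readings: three hypothetical sensor channels.
n = 1000
X = rng.normal(size=(n, 3))
# Hypothetical defect rule: a nonlinear combination of readings drifts out of spec.
y = ((X[:, 0] ** 2 + 1.5 * X[:, 1] - 0.5 * X[:, 2]) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Standardize the features, then fit a soft-margin SVM with an RBF kernel.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)  # held-out accuracy
```

Kernel choice and the C/gamma trade-off are exactly the kind of analyst decisions the review discusses; in practice they are tuned by cross-validation rather than fixed as here.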

  9. MJO prediction skill of the subseasonal-to-seasonal (S2S) prediction models

    Science.gov (United States)

    Son, S. W.; Lim, Y.; Kim, D.

    2017-12-01

The Madden-Julian Oscillation (MJO), the dominant mode of tropical intraseasonal variability, provides the primary source of tropical and extratropical predictability on subseasonal to seasonal timescales. To better understand its predictability, this study conducts quantitative evaluation of MJO prediction skill in the state-of-the-art operational models participating in the subseasonal-to-seasonal (S2S) prediction project. Based on a bivariate correlation coefficient of 0.5, the S2S models exhibit MJO prediction skill ranging from 12 to 36 days. These prediction skills are affected by both the MJO amplitude and phase errors, the latter becoming more important with forecast lead times. Consistent with previous studies, the MJO events with stronger initial amplitude are typically better predicted. However, essentially no sensitivity to the initial MJO phase is observed. Overall MJO prediction skill and its inter-model spread are further related to the model mean biases in moisture fields and longwave cloud-radiation feedbacks. In most models, a dry bias quickly builds up in the deep tropics, especially across the Maritime Continent, weakening the horizontal moisture gradient. This likely dampens the organization and propagation of the MJO. Most S2S models also underestimate the longwave cloud-radiation feedbacks in the tropics, which may affect the maintenance of the MJO convective envelope. In general, the models with a smaller bias in horizontal moisture gradient and longwave cloud-radiation feedbacks show a higher MJO prediction skill, suggesting that improving those processes would enhance MJO prediction skill.
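The bivariate correlation used as the skill metric here combines the two MJO (RMM) index components into a single projection. A minimal sketch, with synthetic sinusoidal indices as an assumption for illustration:

```python
import numpy as np

def bivariate_correlation(obs, fcst):
    """Bivariate correlation between observed and forecast MJO indices.

    obs, fcst: arrays of shape (time, 2) holding the two RMM components.
    """
    num = np.sum(obs[:, 0] * fcst[:, 0] + obs[:, 1] * fcst[:, 1])
    den = np.sqrt(np.sum(obs ** 2) * np.sum(fcst ** 2))
    return num / den

# Idealized MJO: the two components trace a circle in phase space.
t = np.linspace(0.0, 4.0 * np.pi, 200)
obs = np.column_stack([np.cos(t), np.sin(t)])

perfect = bivariate_correlation(obs, obs)  # -> 1.0
lagged = np.column_stack([np.cos(t + np.pi / 6), np.sin(t + np.pi / 6)])
phase_err = bivariate_correlation(obs, lagged)  # -> cos(30 degrees)
```

In this idealized case a pure phase error of φ reduces the bivariate correlation to cos φ, consistent with the abstract's point that phase errors increasingly limit skill at longer forecast leads.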

  10. Weighted-Average Least Squares Prediction

    NARCIS (Netherlands)

    Magnus, Jan R.; Wang, Wendun; Zhang, Xinyu

    2016-01-01

    Prediction under model uncertainty is an important and difficult issue. Traditional prediction methods (such as pretesting) are based on model selection followed by prediction in the selected model, but the reported prediction and the reported prediction variance ignore the uncertainty from the

  11. Dopamine reward prediction error coding.

    Science.gov (United States)

    Schultz, Wolfram

    2016-03-01

Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
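The arithmetic of the prediction error described here is simply delta = received reward - predicted reward, and a prediction updated by a fraction of that error reproduces the three signatures listed (positive for an unexpected reward, near zero for a fully predicted reward, negative for an omitted one). The learning rate below is an assumed value for illustration:

```python
# Minimal reward-prediction-error (RPE) learning loop: the cue's predicted
# value is nudged by a fraction (alpha) of the error between received and
# predicted reward, so the error shrinks as the reward becomes predicted.
alpha = 0.1   # learning rate (an assumed value for illustration)
value = 0.0   # current reward prediction for a cue

def rpe(reward, predicted):
    """Reward prediction error: received minus predicted reward."""
    return reward - predicted

errors = []
for trial in range(100):
    delta = rpe(1.0, value)   # the cue is always followed by reward 1.0
    value += alpha * delta
    errors.append(delta)

first_error = errors[0]            # large positive error: unexpected reward
late_error = errors[-1]            # near zero: reward fully predicted
omission_error = rpe(0.0, value)   # negative error: expected reward omitted
```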

  12. Linux malware incident response an excerpt from malware forensic field guide for Linux systems

    CERN Document Server

    Malin, Cameron H; Aquilina, James M

    2013-01-01

Linux Malware Incident Response is a "first look" at the Malware Forensics Field Guide for Linux Systems, exhibiting the first steps in investigating Linux-based incidents. The Syngress Digital Forensics Field Guides series includes companions for any digital and computer forensic investigator and analyst. Each book is a "toolkit" with checklists for specific tasks, case studies of difficult situations, and expert analyst tips. This compendium of tools for computer forensics analysts and investigators is presented in a succinct outline format with cross-references to suppleme

  13. "When does making detailed predictions make predictions worse?": Correction to Kelly and Simmons (2016).

    Science.gov (United States)

    2016-10-01

Reports an error in "When Does Making Detailed Predictions Make Predictions Worse?" by Theresa F. Kelly and Joseph P. Simmons (Journal of Experimental Psychology: General, Advanced Online Publication, Aug 8, 2016, np). In the article, the symbols in Figure 2 were inadvertently altered in production. All versions of this article have been corrected. (The following abstract of the original article appeared in record 2016-37952-001.) In this article, we investigate whether making detailed predictions about an event worsens other predictions of the event. Across 19 experiments, 10,896 participants, and 407,045 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes useless or redundant information more accessible and thus more likely to be incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of events will and will not be susceptible to the negative effect of making detailed predictions. PsycINFO Database Record (c) 2016 APA, all rights reserved.

  14. Automatically explaining machine learning prediction results: a demonstration on type 2 diabetes risk prediction.

    Science.gov (United States)

    Luo, Gang

    2016-01-01

Predictive modeling is a key component of solutions to many healthcare problems. Among all predictive modeling approaches, machine learning methods often achieve the highest prediction accuracy, but they suffer from a long-standing open problem precluding their widespread use in healthcare. Most machine learning models give no explanation for their prediction results, whereas interpretability is essential for a predictive model to be adopted in typical healthcare settings. This paper presents the first complete method for automatically explaining results for any machine learning predictive model without degrading accuracy. We implemented the method in software. Using the electronic medical record data set from the Practice Fusion diabetes classification competition, containing patient records from all 50 states in the United States, we demonstrated the method on predicting type 2 diabetes diagnosis within the next year. For the champion machine learning model of the competition, our method explained prediction results for 87.4% of patients who were correctly predicted by the model to have a type 2 diabetes diagnosis within the next year. Our demonstration showed the feasibility of automatically explaining results for any machine learning predictive model without degrading accuracy.

  15. Predictive Modelling and Time: An Experiment in Temporal Archaeological Predictive Models

    OpenAIRE

    David Ebert

    2006-01-01

    One of the most common criticisms of archaeological predictive modelling is that it fails to account for temporal or functional differences in sites. However, a practical solution to temporal or functional predictive modelling has proven to be elusive. This article discusses temporal predictive modelling, focusing on the difficulties of employing temporal variables, then introduces and tests a simple methodology for the implementation of temporal modelling. The temporal models thus created ar...

  16. NOAA's National Air Quality Predictions and Development of Aerosol and Atmospheric Composition Prediction Components for the Next Generation Global Prediction System

    Science.gov (United States)

    Stajner, I.; Hou, Y. T.; McQueen, J.; Lee, P.; Stein, A. F.; Tong, D.; Pan, L.; Huang, J.; Huang, H. C.; Upadhayay, S.

    2016-12-01

NOAA provides operational air quality predictions using the National Air Quality Forecast Capability (NAQFC): ozone and wildfire smoke for the United States and airborne dust for the contiguous 48 states at http://airquality.weather.gov. NOAA's predictions of fine particulate matter (PM2.5) became publicly available in February 2016. Ozone and PM2.5 predictions are produced using a system that operationally links the Community Multiscale Air Quality (CMAQ) model with meteorological inputs from the North American mesoscale forecast Model (NAM). Smoke and dust predictions are provided using the Hybrid Single Particle Lagrangian Integrated Trajectory (HYSPLIT) model. Current NAQFC focus is on updating CMAQ to version 5.0.2, improving PM2.5 predictions, and updating emissions estimates, especially for NOx using recently observed trends. Wildfire smoke emissions from a newer version of the USFS BlueSky system are being included in a new configuration of the NAQFC NAM-CMAQ system, which is re-run for the previous 24 hours when the wildfires were observed from satellites, to better represent wildfire emissions prior to initiating predictions for the next 48 hours. In addition, NOAA is developing the Next Generation Global Prediction System (NGGPS) to represent the earth system for extended weather prediction. NGGPS will include a representation of atmospheric dynamics, physics, aerosols, and atmospheric composition, as well as coupling with ocean, wave, ice, and land components. NGGPS is being developed with broad community involvement, including community-developed components and academic research to develop and test potential improvements for inclusion in NGGPS. Several investigators at NOAA's research laboratories and in academia are working to improve the aerosol and gaseous chemistry representation for NGGPS, to develop and evaluate the representation of atmospheric composition, and to establish and improve the coupling with radiation and microphysics.

  17. Reward positivity: Reward prediction error or salience prediction error?

    Science.gov (United States)

    Heydari, Sepideh; Holroyd, Clay B

    2016-08-01

    The reward positivity is a component of the human ERP elicited by feedback stimuli in trial-and-error learning and guessing tasks. A prominent theory holds that the reward positivity reflects a reward prediction error signal that is sensitive to outcome valence, being larger for unexpected positive events relative to unexpected negative events (Holroyd & Coles, 2002). Although the theory has found substantial empirical support, most of these studies have utilized either monetary or performance feedback to test the hypothesis. However, in apparent contradiction to the theory, a recent study found that unexpected physical punishments also elicit the reward positivity (Talmi, Atkinson, & El-Deredy, 2013). The authors of this report argued that the reward positivity reflects a salience prediction error rather than a reward prediction error. To investigate this finding further, in the present study participants navigated a virtual T maze and received feedback on each trial under two conditions. In a reward condition, the feedback indicated that they would either receive a monetary reward or not and in a punishment condition the feedback indicated that they would receive a small shock or not. We found that the feedback stimuli elicited a typical reward positivity in the reward condition and an apparently delayed reward positivity in the punishment condition. Importantly, this signal was more positive to the stimuli that predicted the omission of a possible punishment relative to stimuli that predicted a forthcoming punishment, which is inconsistent with the salience hypothesis. © 2016 Society for Psychophysiological Research.

  18. Dopamine reward prediction error coding

    OpenAIRE

    Schultz, Wolfram

    2016-01-01

Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less...

  19. Station Set Residual: Event Classification Using Historical Distribution of Observing Stations

    Science.gov (United States)

    Procopio, Mike; Lewis, Jennifer; Young, Chris

    2010-05-01

Analysts working at the International Data Centre in support of treaty monitoring through the Comprehensive Nuclear-Test-Ban Treaty Organization spend a significant amount of time reviewing hypothesized seismic events produced by an automatic processing system. When reviewing these events to determine their legitimacy, analysts take a variety of approaches that rely heavily on training and past experience. One method used by analysts to gauge the validity of an event involves examining the set of stations involved in the detection of an event. In particular, leveraging past experience, an analyst can say that an event located in a certain part of the world is expected to be detected by Stations A, B, and C. Implicit in this statement is that such an event would usually not be detected by Stations X, Y, or Z. For some well understood parts of the world, the absence of one or more "expected" stations, or the presence of one or more "unexpected" stations, is correlated with a hypothesized event's legitimacy and with its survival to the event bulletin. The primary objective of this research is to formalize and quantify the difference between the observed set of stations detecting some hypothesized event and the expected set of stations historically associated with detecting similar nearby events close in magnitude. This Station Set Residual can be quantified in many ways, some of which are correlated with the analysts' determination of whether or not the event is valid. We propose that this Station Set Residual score can be used to screen out certain classes of "false" events produced by automatic processing with a high degree of confidence, reducing the analyst burden. Moreover, we propose that the visualization of the historically expected distribution of detecting stations can be immediately useful as an analyst aid during their review process.
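One simple way to formalize the comparison the abstract describes is to penalize each deviation from a station's historical detection frequency. The scoring function, station codes, and frequencies below are all hypothetical, a sketch of the idea rather than the formulation used at the International Data Centre.

```python
def station_set_residual(observed, historical_freq):
    """Score how far an event's detecting stations deviate from history.

    observed: set of station codes that detected the hypothesized event.
    historical_freq: dict mapping station code -> fraction of similar nearby
    historical events (of comparable magnitude) that the station detected.
    Lower scores mean the station set looks like past legitimate events.
    """
    score = 0.0
    for station, freq in historical_freq.items():
        if station in observed:
            score += 1.0 - freq   # penalty for a historically rare detector
        else:
            score += freq         # penalty for a missing expected station
    # Stations with no detection history at all are maximally surprising.
    score += sum(1.0 for s in observed if s not in historical_freq)
    return score

# Hypothetical history for one source region.
hist = {"A": 0.95, "B": 0.90, "C": 0.85, "X": 0.05}
good = station_set_residual({"A", "B", "C"}, hist)  # matches expectations
bad = station_set_residual({"X", "Y"}, hist)        # only unexpected stations
```

Under this toy scoring, an event detected by the historically expected stations scores far lower than one detected only by unexpected stations, which is the separation a screening threshold would exploit.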

  20. Model Prediction Control For Water Management Using Adaptive Prediction Accuracy

    NARCIS (Netherlands)

    Tian, X.; Negenborn, R.R.; Van Overloop, P.J.A.T.M.; Mostert, E.

    2014-01-01

    In the field of operational water management, Model Predictive Control (MPC) has gained popularity owing to its versatility and flexibility. The MPC controller, which takes predictions, time delay and uncertainties into account, can be designed for multi-objective management problems and for

  1. Transionospheric propagation predictions

    Science.gov (United States)

    Klobucher, J. A.; Basu, S.; Basu, S.; Bernhardt, P. A.; Davies, K.; Donatelli, D. E.; Fremouw, E. J.; Goodman, J. M.; Hartmann, G. K.; Leitinger, R.

    1979-01-01

    The current status and future prospects of the capability to make transionospheric propagation predictions are addressed, highlighting the effects of the ionized media, which dominate for frequencies below 1 to 3 GHz, depending upon the state of the ionosphere and the elevation angle through the Earth-space path. The primary concerns are the predictions of time delay of signal modulation (group path delay) and of radio wave scintillation. Progress in these areas is strongly tied to knowledge of variable structures in the ionosphere ranging from the large scale (thousands of kilometers in horizontal extent) to the fine scale (kilometer size). Ionospheric variability and the relative importance of various mechanisms responsible for the time histories observed in total electron content (TEC), proportional to signal group delay, and in irregularity formation are discussed in terms of capability to make both short and long term predictions. The data base upon which predictions are made is examined for its adequacy, and the prospects for prediction improvements by more theoretical studies as well as by increasing the available statistical data base are examined.
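The group path delay mentioned above follows, to first order, the standard dispersion relation dt = 40.3 * TEC / (c * f^2), which is why predicting TEC is central to predicting delay; the delay falls off with the square of frequency. The TEC value and frequency below are illustrative (10 TECU at the GPS L1 frequency):

```python
# First-order ionospheric group delay from total electron content (TEC).
C = 299_792_458.0  # speed of light, m/s

def group_delay_s(tec_el_per_m2, freq_hz):
    """First-order ionospheric group delay in seconds: 40.3 * TEC / (c * f^2)."""
    return 40.3 * tec_el_per_m2 / (C * freq_hz ** 2)

# 10 TECU (1 TECU = 1e16 electrons/m^2) at the GPS L1 frequency.
delay = group_delay_s(10 * 1e16, 1.57542e9)
range_error_m = C * delay  # equivalent excess range, roughly 1.6 m
```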

  2. Predictable grammatical constructions

    DEFF Research Database (Denmark)

    Lucas, Sandra

    2015-01-01

My aim in this paper is to provide evidence from diachronic linguistics for the view that some predictable units are entrenched in grammar, and consequently in human cognition, in a way that makes them functionally and structurally equal to nonpredictable grammatical units, suggesting that these predictable units should be considered grammatical constructions on a par with the nonpredictable constructions. Frequency has usually been seen as the only possible argument speaking in favor of viewing some formally and semantically fully predictable units as grammatical constructions. However, this paper ... semantically and formally predictable. Despite this difference, [méllo INF], like the other future periphrases, seems to be highly entrenched in the cognition (and grammar) of Early Medieval Greek language users, and consequently a grammatical construction. The syntactic evidence speaking in favor of [méllo ...

  3. Review of the Scientific Evidence of Using Population Normative Values for Post-Concussive Computerized Neurocognitive Assessments

    Science.gov (United States)

    2016-02-10

... R, Virginia Commonwealth University; Dr. Chris Giza, Professor of Pediatric Neurology and Neurosurgery, UCLA Brain Injury Research Center; October ...; Sara Higgins, MPH, Analyst, Grant Thornton LLP; Kendal Brown, MBA, Management Analyst, Information Innovators, Inc.; Margaret Welsh, Management ...

  4. Collaborative interactive visualization: exploratory concept

    Science.gov (United States)

    Mokhtari, Marielle; Lavigne, Valérie; Drolet, Frédéric

    2015-05-01

Dealing with an ever increasing amount of data is a challenge that military intelligence analysts, or teams of analysts, face day to day. Increased individual and collective comprehension comes through collaboration between people: the better the collaboration, the better the comprehension. Nowadays, various technologies support and enhance collaboration by allowing people to connect and collaborate in settings as varied as mobile devices, networked computers, display walls, and tabletop surfaces, to name just a few. A powerful collaboration system includes traditional and multimodal visualization features to achieve effective human communication. Interactive visualization strengthens collaboration because this approach is conducive to incrementally building a mental assessment of the data's meaning. The purpose of this paper is to present an overview of the envisioned collaboration architecture and the interactive visualization concepts underlying the Sensemaking Support System prototype developed to support analysts in the context of the Joint Intelligence Collection and Analysis Capability project at DRDC Valcartier. It presents the current version of the architecture, discusses future capabilities to help analysts in the accomplishment of their tasks, and finally recommends collaboration and visualization technologies that allow analysts to go a step further, both as individuals and as a team.

  5. Don't wag the dog: extending the reach of applied behavior analysis.

    Science.gov (United States)

    Normand, Matthew P; Kohn, Carolynn S

    2013-01-01

We argue that the field of behavior analysis would be best served if behavior analysts worked to extend the reach of behavioral services into a more diverse range of settings and with more varied populations, with an emphasis on the establishment of new career opportunities for graduating students. This is not a new proposal, but it is a tall order; it is not difficult to see why many would choose a surer route to gainful employment. Currently, the most fruitful career path for behavior analysts in practice is in the area of autism and developmental disabilities. For the continued growth of the field of behavior analysis, however, it is important to foster new career opportunities for those trained as behavior analysts. Toward this end, we identify several fields that seem well suited to behavior analysts and summarize the training requirements and likely professional outcomes for behavior analysts who pursue education and certification in these fields. These fields require relatively little additional formal training, in the hopes of minimizing the response effort necessary for individuals who have already completed a rigorous program of graduate study in behavior analysis.

  6. Financial Analysts’ Forecasts

    DEFF Research Database (Denmark)

    Stæhr, Simone

The primary focus is on financial analysts in the task of conducting earnings forecasts, while a secondary focus is on investors' abilities to interpret and make use of these forecasts. Simply put, financial analysts can be seen as information intermediators receiving inputs to their analyses from firm management and providing outputs to the investors. Amongst various outputs from the analysts are forecasts of earnings. According to decision theories, mostly from the literature in psychology, all humans are affected by cognitive constraints to some degree. These constraints may lead to unintentional biases in the decision making, and the magnitude of these constraints does sometimes vary with personal traits. Therefore, to the extent that financial analysts are subject to behavioral biases, their outputs to the investors are likely to be biased by their interpretation of information. Because investors need accuracy...

  7. Candidate Prediction Models and Methods

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov; Madsen, Henrik

    2005-01-01

This document lists candidate prediction models for Work Package 3 (WP3) of the PSO project called "Intelligent wind power prediction systems" (FU4101). The main focus is on the models transforming numerical weather predictions into predictions of power production. The document also outlines the possibilities w.r.t. different numerical weather predictions actually available to the project.

  8. Automatic titrator for high precision plutonium assay

    International Nuclear Information System (INIS)

    Jackson, D.D.; Hollen, R.M.

    1986-01-01

Highly precise assay of plutonium metal is required for accountability measurements. We have developed an automatic titrator for this determination which eliminates analyst bias and requires much less analyst time. The analyst is only required to enter sample data and start the titration. The automated instrument titrates the sample, locates the end point, and outputs the results as a paper tape printout. Precision of the titration is less than 0.03% relative standard deviation for a single determination at the 250-mg plutonium level. The titration time is less than 5 min.

  9. Prediction Markets

    DEFF Research Database (Denmark)

    Horn, Christian Franz; Ivens, Bjørn Sven; Ohneberg, Michael

    2014-01-01

In recent years, Prediction Markets gained growing interest as a forecasting tool among researchers as well as practitioners, which resulted in an increasing number of publications. In order to track the latest development of research, comprising the extent and focus of research, this article provides a comprehensive review and classification of the literature related to the topic of Prediction Markets. Overall, 316 relevant articles, published in the timeframe from 2007 through 2013, were identified and assigned to a herein presented classification scheme, differentiating between descriptive works, articles of a theoretical nature, application-oriented studies and articles dealing with the topic of law and policy. The analysis of the research results reveals that more than half of the literature pool deals with the application and actual function tests of Prediction Markets. The results...

  10. Adding propensity scores to pure prediction models fails to improve predictive performance

    Directory of Open Access Journals (Sweden)

    Amy S. Nowacki

    2013-08-01

Background. Propensity score usage seems to be growing in popularity, leading researchers to question the possible role of propensity scores in prediction modeling, despite the lack of a theoretical rationale. It is suspected that such requests are due to the lack of differentiation regarding the goals of predictive modeling versus causal inference modeling. Therefore, the purpose of this study is to formally examine the effect of propensity scores on predictive performance. Our hypothesis is that a multivariable regression model that adjusts for all covariates will perform as well as or better than those models utilizing propensity scores with respect to model discrimination and calibration. Methods. The most commonly encountered statistical scenarios for medical prediction (logistic and proportional hazards regression) were used to investigate this research question. Random cross-validation was performed 500 times to correct for optimism. The multivariable regression models adjusting for all covariates were compared with models that included adjustment for or weighting with the propensity scores. The methods were compared based on three predictive performance measures: (1) concordance indices; (2) Brier scores; and (3) calibration curves. Results. Multivariable models adjusting for all covariates had the highest average concordance index, the lowest average Brier score, and the best calibration. Propensity score adjustment and inverse probability weighting models without adjustment for all covariates performed worse than full models and failed to improve predictive performance with full covariate adjustment. Conclusion. Propensity score techniques did not improve prediction performance measures beyond multivariable adjustment. Propensity scores are not recommended if the analytical goal is pure prediction modeling.
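
    The two discrimination measures named in the abstract are easy to state precisely. As a hedged illustration (not the study's code; the outcome and probability vectors below are invented), the following Python computes a Brier score and a concordance index for binary outcomes and predicted probabilities:

    ```python
    def brier_score(y_true, y_prob):
        """Mean squared difference between predicted probability and
        the 0/1 outcome (lower is better)."""
        return sum((p - y) ** 2 for y, p in zip(y_true, y_prob)) / len(y_true)

    def concordance_index(y_true, y_prob):
        """Fraction of (event, non-event) pairs in which the event case
        received the higher predicted probability; ties count one half."""
        pairs = concordant = 0.0
        for yi, pi in zip(y_true, y_prob):
            for yj, pj in zip(y_true, y_prob):
                if yi == 1 and yj == 0:
                    pairs += 1
                    if pi > pj:
                        concordant += 1.0
                    elif pi == pj:
                        concordant += 0.5
        return concordant / pairs

    # hypothetical outcomes and model probabilities, for illustration only
    y = [1, 0, 1, 0]
    p = [0.9, 0.2, 0.6, 0.4]
    print(brier_score(y, p), concordance_index(y, p))
    ```

    A concordance index of 0.5 corresponds to random ranking and 1.0 to perfect discrimination, which is why the full-covariate models' higher average values in this study indicate better discrimination.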

  11. Mathematical Modelling for the Evaluation of Automated Speech Recognition Systems--Research Area 3.3.1 (c)

    Science.gov (United States)

    2016-01-07

    news. Both of these resemble typical activities of intelligence analysts in OSINT processing and production applications. We assessed two task...intelligence analysts in a number of OSINT processing and production applications. (5) Summary of the most important results In both settings

  12. 76 FR 66724 - Submission for OMB Review; Comment Request: New Proposed Collection, Neuropsychosocial Measures...

    Science.gov (United States)

    2011-10-27

    ... (b) shall-- (1) Incorporate behavioral, emotional, educational, and contextual consequences to enable... Ms. Jamelle E. Banks, Public Health Analyst, Office of Science Policy, Analysis and Communication...: October 20, 2011. Jamelle E. Banks, Public Health Analyst, Office of Science Policy, Analysis and...

  13. 76 FR 62077 - Submission for OBM Review; Comment Request; New Proposed Collection, Environmental Science...

    Science.gov (United States)

    2011-10-06

    ... under subsection (b) shall-- (1) incorporate behavioral, emotional, educational, and contextual... instruments, contact Ms. Jamelle E. Banks, Public Health Analyst, Office of Science Policy, Analysis and.... Dated: September 30, 2011. Jamelle E. Banks, Public Health Analyst, Office of Science Policy, Analysis...

  14. Global Logistics Management

    Science.gov (United States)

    2011-07-21

    Phillips, Richard Spencer, and Leigh Warner. Catherine Whittington served as the Board Staff Analyst. PROCESS The Task Group conducted more than...Chair) Mr. Pierre Chao Mr. William Phillips Mr. Richard Spencer Ms. Leigh Warner DBB Staff Analyst Catherine Whittington 2 Methodology

  15. 20 CFR 633.304 - Section 402 cost allocation.

    Science.gov (United States)

    2010-04-01

    ... accounted for as follows: (1) Administration. Administration costs consist of all direct and indirect costs... direct program administrative positions such as supervisors, program analysts, labor market analysts, and... to participants, classroom space and utility costs; job search assistance, labor market orientation...

  16. Development and validation of a risk model for prediction of hazardous alcohol consumption in general practice attendees: the predictAL study.

    Science.gov (United States)

    King, Michael; Marston, Louise; Švab, Igor; Maaroos, Heidi-Ingrid; Geerlings, Mirjam I; Xavier, Miguel; Benjamin, Vicente; Torres-Gonzalez, Francisco; Bellon-Saameno, Juan Angel; Rotar, Danica; Aluoja, Anu; Saldivia, Sandra; Correa, Bernardo; Nazareth, Irwin

    2011-01-01

Little is known about the risk of progression to hazardous alcohol use in people currently drinking at safe limits. We aimed to develop a prediction model (predictAL) for the development of hazardous drinking in safe drinkers. A prospective cohort study of adult general practice attendees in six European countries and Chile followed up over 6 months. We recruited 10,045 attendees between April 2003 and February 2005. 6193 European and 2462 Chilean attendees recorded AUDIT scores below 8 in men and 5 in women at recruitment and were used in modelling risk. 38 risk factors were measured to construct a risk model for the development of hazardous drinking using stepwise logistic regression. The model was corrected for overfitting and tested in an external population. The main outcome was hazardous drinking defined by an AUDIT score ≥8 in men and ≥5 in women. 69.0% of attendees were recruited, of whom 89.5% participated again after six months. The risk factors in the final predictAL model were sex, age, country, baseline AUDIT score, panic syndrome and lifetime alcohol problem. The predictAL model's average c-index across all six European countries was 0.839 (95% CI 0.805, 0.873). The Hedge's g effect size for the difference in log odds of predicted probability between safe drinkers in Europe who subsequently developed hazardous alcohol use and those who did not was 1.38 (95% CI 1.25, 1.51). External validation of the algorithm in Chilean safe drinkers resulted in a c-index of 0.781 (95% CI 0.717, 0.846) and Hedge's g of 0.68 (95% CI 0.57, 0.78). The predictAL risk model for development of hazardous consumption in safe drinkers compares favourably with risk algorithms for disorders in other medical settings and can be a useful first step in prevention of alcohol misuse.
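
    The Hedges' g effect size reported here is a standardized mean difference with a small-sample bias correction. A minimal sketch of the standard formula (not the authors' code; the sample data are made up):

    ```python
    from math import sqrt

    def hedges_g(a, b):
        """Standardized mean difference (a vs. b) using the pooled
        standard deviation, multiplied by Hedges' small-sample
        correction factor 1 - 3/(4*(na+nb) - 9)."""
        na, nb = len(a), len(b)
        ma, mb = sum(a) / na, sum(b) / nb
        va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
        vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
        s_pooled = sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
        correction = 1 - 3 / (4 * (na + nb) - 9)
        return correction * (ma - mb) / s_pooled

    # invented groups for illustration only
    print(hedges_g([2, 4, 6], [1, 3, 5]))
    ```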

  17. Development and validation of a risk model for prediction of hazardous alcohol consumption in general practice attendees: the predictAL study.

    Directory of Open Access Journals (Sweden)

    Michael King

Little is known about the risk of progression to hazardous alcohol use in people currently drinking at safe limits. We aimed to develop a prediction model (predictAL) for the development of hazardous drinking in safe drinkers. A prospective cohort study of adult general practice attendees in six European countries and Chile followed up over 6 months. We recruited 10,045 attendees between April 2003 and February 2005. 6193 European and 2462 Chilean attendees recorded AUDIT scores below 8 in men and 5 in women at recruitment and were used in modelling risk. 38 risk factors were measured to construct a risk model for the development of hazardous drinking using stepwise logistic regression. The model was corrected for overfitting and tested in an external population. The main outcome was hazardous drinking defined by an AUDIT score ≥8 in men and ≥5 in women. 69.0% of attendees were recruited, of whom 89.5% participated again after six months. The risk factors in the final predictAL model were sex, age, country, baseline AUDIT score, panic syndrome and lifetime alcohol problem. The predictAL model's average c-index across all six European countries was 0.839 (95% CI 0.805, 0.873). The Hedge's g effect size for the difference in log odds of predicted probability between safe drinkers in Europe who subsequently developed hazardous alcohol use and those who did not was 1.38 (95% CI 1.25, 1.51). External validation of the algorithm in Chilean safe drinkers resulted in a c-index of 0.781 (95% CI 0.717, 0.846) and Hedge's g of 0.68 (95% CI 0.57, 0.78). The predictAL risk model for development of hazardous consumption in safe drinkers compares favourably with risk algorithms for disorders in other medical settings and can be a useful first step in prevention of alcohol misuse.

  18. Genomic prediction of complex human traits: relatedness, trait architecture and predictive meta-models

    Science.gov (United States)

    Spiliopoulou, Athina; Nagy, Reka; Bermingham, Mairead L.; Huffman, Jennifer E.; Hayward, Caroline; Vitart, Veronique; Rudan, Igor; Campbell, Harry; Wright, Alan F.; Wilson, James F.; Pong-Wong, Ricardo; Agakov, Felix; Navarro, Pau; Haley, Chris S.

    2015-01-01

    We explore the prediction of individuals' phenotypes for complex traits using genomic data. We compare several widely used prediction models, including Ridge Regression, LASSO and Elastic Nets estimated from cohort data, and polygenic risk scores constructed using published summary statistics from genome-wide association meta-analyses (GWAMA). We evaluate the interplay between relatedness, trait architecture and optimal marker density, by predicting height, body mass index (BMI) and high-density lipoprotein level (HDL) in two data cohorts, originating from Croatia and Scotland. We empirically demonstrate that dense models are better when all genetic effects are small (height and BMI) and target individuals are related to the training samples, while sparse models predict better in unrelated individuals and when some effects have moderate size (HDL). For HDL sparse models achieved good across-cohort prediction, performing similarly to the GWAMA risk score and to models trained within the same cohort, which indicates that, for predicting traits with moderately sized effects, large sample sizes and familial structure become less important, though still potentially useful. Finally, we propose a novel ensemble of whole-genome predictors with GWAMA risk scores and demonstrate that the resulting meta-model achieves higher prediction accuracy than either model on its own. We conclude that although current genomic predictors are not accurate enough for diagnostic purposes, performance can be improved without requiring access to large-scale individual-level data. Our methodologically simple meta-model is a means of performing predictive meta-analysis for optimizing genomic predictions and can be easily extended to incorporate multiple population-level summary statistics or other domain knowledge. PMID:25918167
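
    The dense-versus-shrunk model comparison and the ensemble meta-model can be illustrated generically. This is a toy ridge-regression sketch on simulated data under assumptions of my own (sample sizes, penalties, and the least-squares blend are all invented; it is not the authors' pipeline, and in practice the blend weights would be fit on a held-out validation set):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 300, 60
    X = rng.normal(size=(n, p))
    beta = rng.normal(scale=0.3, size=p)            # many small effects
    y = X @ beta + rng.normal(scale=1.0, size=n)
    Xtr, ytr, Xte, yte = X[:200], y[:200], X[200:], y[200:]

    def ridge_coef(X, y, lam):
        # closed-form ridge solution: (X'X + lam*I)^{-1} X'y
        return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

    pred_a = Xte @ ridge_coef(Xtr, ytr, lam=1.0)    # lightly penalized model
    pred_b = Xte @ ridge_coef(Xtr, ytr, lam=100.0)  # heavily shrunk "score"

    # meta-model: blend the two base predictors with least-squares weights
    # (for illustration the weights are fit here on the evaluation split)
    W = np.column_stack([pred_a, pred_b])
    w, *_ = np.linalg.lstsq(W, yte, rcond=None)
    meta = W @ w

    def r2(y, yhat):
        return 1 - np.sum((y - yhat) ** 2) / np.sum((y - np.mean(y)) ** 2)

    print(r2(yte, pred_a), r2(yte, pred_b), r2(yte, meta))
    ```

    Because each base predictor lies in the span of the blend, the fitted meta-model can never score worse than the better of the two on the data used to fit the weights, which mirrors the paper's finding that the ensemble achieves higher accuracy than either model alone.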

  19. Predicting scholars' scientific impact.

    Directory of Open Access Journals (Sweden)

    Amin Mazloumian

We tested the underlying assumption that citation counts are reliable predictors of future success, analyzing complete citation data on the careers of ~150,000 scientists. Our results show that (i) among all citation indicators, the annual citations at the time of prediction is the best predictor of future citations, (ii) future citations of a scientist's published papers can be predicted accurately (r² = 0.80 for a 1-year prediction, P<0.001), but (iii) future citations of future work are hardly predictable.

  20. The Prediction Properties of Inverse and Reverse Regression for the Simple Linear Calibration Problem

    Science.gov (United States)

    Parker, Peter A.; Geoffrey, Vining G.; Wilson, Sara R.; Szarka, John L., III; Johnson, Nels G.

    2010-01-01

    The calibration of measurement systems is a fundamental but under-studied problem within industrial statistics. The origins of this problem go back to basic chemical analysis based on NIST standards. In today's world these issues extend to mechanical, electrical, and materials engineering. Often, these new scenarios do not provide "gold standards" such as the standard weights provided by NIST. This paper considers the classic "forward regression followed by inverse regression" approach. In this approach the initial experiment treats the "standards" as the regressor and the observed values as the response to calibrate the instrument. The analyst then must invert the resulting regression model in order to use the instrument to make actual measurements in practice. This paper compares this classical approach to "reverse regression," which treats the standards as the response and the observed measurements as the regressor in the calibration experiment. Such an approach is intuitively appealing because it avoids the need for the inverse regression. However, it also violates some of the basic regression assumptions.
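
    The two approaches can be contrasted in a few lines. In this hypothetical sketch (the instrument's slope, intercept, and noise level are invented), the classical method regresses readings on standards and inverts the fit, while reverse regression regresses standards on readings and uses the fit directly:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    standards = np.array([10.0, 20.0, 30.0, 40.0, 50.0])  # "gold standard" values
    # simulated instrument: reading = 5 + 2*standard + noise
    readings = 5.0 + 2.0 * standards + rng.normal(scale=0.5, size=standards.size)

    # classical calibration: fit reading ~ standard, then invert the fit
    b1, b0 = np.polyfit(standards, readings, 1)   # slope, intercept
    def classical_estimate(reading):
        return (reading - b0) / b1

    # reverse regression: fit standard ~ reading, use the fit directly
    c1, c0 = np.polyfit(readings, standards, 1)
    def reverse_estimate(reading):
        return c0 + c1 * reading

    # a new reading of 65 should correspond to a standard near 30
    print(classical_estimate(65.0), reverse_estimate(65.0))
    ```

    As the paper notes, the reverse fit is appealing because no inversion is needed, but it treats the (fixed) standards as the random response, violating a basic regression assumption.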

  1. Protein Sorting Prediction

    DEFF Research Database (Denmark)

    Nielsen, Henrik

    2017-01-01

Many computational methods are available for predicting protein sorting in bacteria. When comparing them, it is important to know that they can be grouped into three fundamentally different approaches: signal-based, global-property-based and homology-based prediction. In this chapter, the strengths and drawbacks of each of these approaches are described through many examples of methods that predict secretion, integration into membranes, or subcellular locations in general. The aim of this chapter is to provide a user-level introduction to the field with a minimum of computational theory.

  2. 'Red Flag' Predictions

    DEFF Research Database (Denmark)

    Hallin, Carina Antonia; Andersen, Torben Juul; Tveterås, Sigbjørn

This conceptual article introduces a new way to predict firm performance based on aggregation of sensing among frontline employees about changes in operational capabilities to update strategic action plans and generate innovations. We frame the approach in the context of first- and second-generation prediction markets and outline its unique features as a third-generation prediction market. It is argued that frontline employees gain deep insights when they execute operational activities on an ongoing basis in the organization. The experiential learning from close interaction with internal and external...

  3. Predictive value of diminutive colonic adenoma trial: the PREDICT trial.

    Science.gov (United States)

    Schoenfeld, Philip; Shad, Javaid; Ormseth, Eric; Coyle, Walter; Cash, Brooks; Butler, James; Schindler, William; Kikendall, Walter J; Furlong, Christopher; Sobin, Leslie H; Hobbs, Christine M; Cruess, David; Rex, Douglas

    2003-05-01

Diminutive adenomas (1-9 mm in diameter) are frequently found during colon cancer screening with flexible sigmoidoscopy (FS). This trial assessed the predictive value of these diminutive adenomas for advanced adenomas in the proximal colon. In a multicenter, prospective cohort trial, we matched 200 patients with normal FS and 200 patients with diminutive adenomas on FS for age and gender. All patients underwent colonoscopy. The presence of advanced adenomas (adenoma ≥10 mm in diameter, villous adenoma, adenoma with high grade dysplasia, and colon cancer) and adenomas (any size) was recorded. Before colonoscopy, patients completed questionnaires about risk factors for adenomas. The prevalence of advanced adenomas in the proximal colon was similar in patients with diminutive adenomas and patients with normal FS (6% vs. 5.5%, respectively) (relative risk, 1.1; 95% confidence interval [CI], 0.5-2.6). Diminutive adenomas on FS did not accurately predict advanced adenomas in the proximal colon: sensitivity, 52% (95% CI, 32%-72%); specificity, 50% (95% CI, 49%-51%); positive predictive value, 6% (95% CI, 4%-8%); and negative predictive value, 95% (95% CI, 92%-97%). Male gender (odds ratio, 1.63; 95% CI, 1.01-2.61) was associated with an increased risk of proximal colon adenomas. Diminutive adenomas on sigmoidoscopy may not accurately predict advanced adenomas in the proximal colon.
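
    The sensitivity, specificity, and predictive values reported above follow directly from the 2x2 screening table. A small sketch of the standard definitions (the counts below are invented, not the trial's data):

    ```python
    def screening_metrics(tp, fp, fn, tn):
        """Standard 2x2 diagnostic accuracy measures."""
        return {
            "sensitivity": tp / (tp + fn),   # P(test+ | disease)
            "specificity": tn / (tn + fp),   # P(test- | no disease)
            "ppv": tp / (tp + fp),           # P(disease | test+)
            "npv": tn / (tn + fn),           # P(no disease | test-)
        }

    # hypothetical counts for illustration
    print(screening_metrics(tp=9, fp=10, fn=1, tn=90))
    ```

    Note that PPV and NPV, unlike sensitivity and specificity, depend on disease prevalence, which is why the low 6% prevalence of proximal advanced adenomas drives the trial's low positive predictive value.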

  4. 77 FR 15037 - Agency Information Collection Activities: Proposed Collection; Comment Request-Special Nutrition...

    Science.gov (United States)

    2012-03-14

    ... Program Analyst, Office of Research and Analysis, Food and Nutrition Service/USDA, 3101 Park Center Drive... DEPARTMENT OF AGRICULTURE Food and Nutrition Service Agency Information Collection Activities... may be sent to: John Endahl, Senior Program Analyst, Office of Research and Analysis, Food and...

  5. Report To The Secretary Of Defense - Global Logistics Management

    Science.gov (United States)

    2011-07-01

    Spencer, and Leigh Warner. Catherine Whittington served as the Board’s Staff Analyst. PROCESS The Task Group conducted more than 30 interviews...Phillips Mr. Richard Spencer Ms. Leigh Warner DBB Staff Analyst Ms. Catherine Whittington 2 Methodology  Reviewed DoD Directives and

  6. Diagnosis in the Olap Context

    NARCIS (Netherlands)

    E.A.M. Caron (Emiel); H.A.M. Daniels (Hennie)

    2004-01-01

The purpose of OLAP (On-Line Analytical Processing) systems is to provide a framework for the analysis of multidimensional data. Many tasks related to analysing multidimensional data and making business decisions are still carried out manually by analysts (e.g. financial analysts,

  7. Building a Better Strategic Analyst: A Critical Review of the U.S. Army’s All Source Analyst Training Program

    Science.gov (United States)

    2008-05-15

    intelligence enterprise to describe the idea used in this monograph. 12 David Brooks, “The Elephantiasis of Reason,” The Atlantic Monthly. (January/February... Elephantiasis of Reason”. The Atlantic Monthly. (January/February, 2003). Boyd, Dennis & Bee, Helen. 2006. Lifespan Development. Fourth Edition. Allyn

  8. G-Tunnel pressurized slot-testing preparations

    International Nuclear Information System (INIS)

    Zimmerman, R.M.; Sifre-Soto, C.; Mann, K.L.; Bellman, R.A. Jr.; Luker, S.; Dodds, D.J.

    1992-04-01

Designers and analysts of radioactive waste repositories must be able to predict the mechanical behavior of the host rock. Sandia National Laboratories elected to conduct a development program on pressurized slot testing, which featured (1) development of an improved method to cut slots using a chain saw with diamond-tipped cutters, (2) measurements useful for determining in situ stresses normal to slots, (3) measurements applicable for determining the in situ modulus of deformation parallel to a drift surface, and (4) evaluations of the potentials of pressurized slot strength testing. This report describes the preparations leading to the measurements and evaluations.

  9. Nuclear criticality research at the University of New Mexico

    International Nuclear Information System (INIS)

    Busch, R.D.

    1997-01-01

Two projects at the University of New Mexico are briefly described. The university's Chemical and Nuclear Engineering Department has completed the final draft of a primer for MCNP4A, which it plans to publish soon. The primer was written to help an analyst who has little experience with the MCNP code to perform criticality safety analyses. In addition, the department has carried out a series of approach-to-critical experiments on the SHEBA-II, a UO2F2 solution critical assembly at Los Alamos National Laboratory. The results obtained differed slightly from what was predicted by the TWODANT code.

  10. Response of ventilation dampers to large airflow pulses

    International Nuclear Information System (INIS)

    Gregory, W.S.; Smith, P.R.

    1985-04-01

    The results of an experiment program to evaluate the response of ventilation system dampers to simulated tornado transients are reported. Relevant data, such as damper response time, flow rate and pressure drop, and flow/pressure vs blade angle, were obtained, and the response of one tornado protective damper to simulated tornado transients was evaluated. Empirical relationships that will allow the data to be integrated into flow dynamics codes were developed. These flow dynamics codes can be used by safety analysts to predict the response of nuclear facility ventilation systems to tornado depressurization. 3 refs., 21 figs., 6 tabs

  11. On the Usability of Spoken Dialogue Systems

    DEFF Research Database (Denmark)

    Larsen, Lars Bo

This work is centred on the methods and problems associated with defining and measuring the usability of Spoken Dialogue Systems (SDS). The starting point is the fact that speech-based interfaces have several times during the last 20 years fallen short of the high expectations and predictions held by industry, researchers and analysts. Several studies in the literature of SDS indicate that this can be ascribed to a lack of attention from the speech technology community towards the usability of such systems. The experimental results presented in this work are based on a field trial with the OVID home...

  12. A Tuned Single Parameter for Representing Conjunction Risk

    Science.gov (United States)

    Plakaloic, D.; Hejduk, M. D.; Frigm, R. C.; Newman, L. K.

    2011-01-01

    Satellite conjunction assessment risk analysis is a subjective enterprise that can benefit from quantitative aids and, to this end, NASA/GSFC has developed a fuzzy logic construct - called the F-value - to attempt to provide a statement of conjunction risk that amalgamates multiple indices and yields a more stable intra-event assessment. This construct has now sustained an extended tuning procedure against heuristic analyst assessment of event risk. The tuning effort has resulted in modifications to the calculation procedure and the adjustment of tuning coefficients, producing a construct with both more predictive force and a better statement of its error.

  13. Review of Nearshore Morphologic Prediction

    Science.gov (United States)

    Plant, N. G.; Dalyander, S.; Long, J.

    2014-12-01

The evolution of the world's erodible coastlines will determine the balance between the benefits and costs associated with human and ecological utilization of shores, beaches, dunes, barrier islands, wetlands, and estuaries. So, we would like to predict coastal evolution to guide management and planning of human and ecological response to coastal changes. After decades of research investment in data collection, theoretical and statistical analysis, and model development we have a number of empirical, statistical, and deterministic models that can predict the evolution of the shoreline, beaches, dunes, and wetlands over time scales of hours to decades, and even predict the evolution of geologic strata over the course of millennia. Comparisons of predictions to data have demonstrated that these models can have meaningful predictive skill. But these comparisons also highlight the deficiencies in fundamental understanding, formulations, or data that are responsible for prediction errors and uncertainty. Here, we review a subset of predictive models of the nearshore to illustrate tradeoffs in complexity, predictive skill, and sensitivity to input data and parameterization errors. We identify where future improvement in prediction skill will result from improved theoretical understanding, data collection, and model-data assimilation.

  14. The Prediction Value

    NARCIS (Netherlands)

    Koster, M.; Kurz, S.; Lindner, I.; Napel, S.

    2013-01-01

    We introduce the prediction value (PV) as a measure of players’ informational importance in probabilistic TU games. The latter combine a standard TU game and a probability distribution over the set of coalitions. Player i’s prediction value equals the difference between the conditional expectations

  15. Predicting Energy Consumption for Potential Effective Use in Hybrid Vehicle Powertrain Management Using Driver Prediction

    Science.gov (United States)

    Magnuson, Brian

A proof-of-concept software-in-the-loop study is performed to assess the accuracy of predicted net and charge-gaining energy consumption for potential effective use in optimizing powertrain management of hybrid vehicles. With promising results of improving fuel efficiency of a thermostatic control strategy for a series plug-in hybrid-electric vehicle by 8.24%, the route and speed prediction machine learning algorithms are redesigned and implemented for real-world testing in a stand-alone C++ code-base to ingest map data, learn and predict driver habits, and store driver data for fast startup and shutdown of the controller or computer used to execute the compiled algorithm. Speed prediction is performed using a multi-layer, multi-input, multi-output neural network using feed-forward prediction and gradient descent through back-propagation training. Route prediction utilizes a Hidden Markov Model with a recurrent forward algorithm for prediction and multi-dimensional hash maps to store state and state distribution constraining associations between atomic road segments and end destinations. Predicted energy is calculated using the predicted time-series speed and elevation profile over the predicted route and the road-load equation. Testing of the code-base is performed over a known road network spanning 24x35 blocks on the south hill of Spokane, Washington. A large set of training routes are traversed once to add randomness to the route prediction algorithm, and a subset of the training routes, testing routes, are traversed to assess the accuracy of the net and charge-gaining predicted energy consumption. Each test route is traveled a random number of times with varying speed conditions from traffic and pedestrians to add randomness to speed prediction. Prediction data is stored and analyzed in a post-process Matlab script. The aggregated results and analysis of all traversals of all test routes reflect the performance of the Driver Prediction algorithm.
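
    The energy calculation from a predicted speed and elevation profile can be sketched with a generic textbook road-load integration. This is not the thesis code; the vehicle mass and the drag and rolling-resistance coefficients below are placeholder values:

    ```python
    from math import atan, cos, sin

    def road_load_energy(speeds, elevations, dt=1.0, mass=1600.0,
                         crr=0.009, cda=0.6, rho=1.2, g=9.81):
        """Integrate traction power over a predicted speed/elevation profile.
        Returns (net_energy_J, charge_gaining_energy_J); negative traction
        power is counted as potentially recoverable (charge-gaining) energy."""
        net = regen = 0.0
        for i in range(1, len(speeds)):
            v = 0.5 * (speeds[i] + speeds[i - 1])      # mean speed over the step
            a = (speeds[i] - speeds[i - 1]) / dt       # acceleration
            dist = max(v * dt, 1e-9)
            theta = atan((elevations[i] - elevations[i - 1]) / dist)  # grade
            force = (mass * a                          # inertial term
                     + 0.5 * rho * cda * v * v         # aerodynamic drag
                     + mass * g * crr * cos(theta)     # rolling resistance
                     + mass * g * sin(theta))          # grade term
            power = force * v
            if power >= 0.0:
                net += power * dt
            else:
                regen += -power * dt
        return net, regen

    # steady 10 m/s on flat ground for 10 seconds of 1 s samples
    net, regen = road_load_energy([10.0] * 11, [0.0] * 11)
    print(net, regen)
    ```

    Splitting the positive and negative portions of the integral is what separates the predicted net consumption from the predicted charge-gaining energy described above.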

  16. Is Time Predictability Quantifiable?

    DEFF Research Database (Denmark)

    Schoeberl, Martin

    2012-01-01

Computer architects and researchers in the real-time domain start to investigate processors and architectures optimized for real-time systems. Optimized for real-time systems means time predictable, i.e., architectures where it is possible to statically derive a tight bound of the worst-case execution time. To compare different approaches we would like to quantify time predictability. That means we need to measure time predictability. In this paper we discuss the different approaches for these measurements and conclude that time predictability is practically not quantifiable. We can only compare the worst-case execution time bounds of different architectures.

  17. Modeling the impacts of environmental policies on agricultural imports

    NARCIS (Netherlands)

    Larson, B.A.; Scatasta, S.

    2005-01-01

For current policy debates in agricultural and food industries, policy analysts need to evaluate how proposed changes in domestic environmental regulations may alter agricultural trade in the future. Given the industry-specific nature of many policy issues, analysts need sector and

  18. Effects of prior interpretation on situation assessment in crime analysis

    NARCIS (Netherlands)

    Kerstholt, J.H.; Eikelboom, A.R.

    2007-01-01

    Purpose - To investigate the effects of prior case knowledge on the judgement of crime analysts. Design/methodology/approach - Explains that crime analysts assist when an investigation team has converged/agreed on a probable scenario, attributes this convergence to group-think, but points out this

  19. Methodology for Participatory Policy Analysis

    NARCIS (Netherlands)

    Geurts, J.L.A.; Joldersma, F.

    2001-01-01

    In the course of time it has become clear that policy analysts who use traditional formal modeling techniques have limited impact on policy making regarding complex policy problems. These kinds of problems require the analyst to combine scientific insights with subjective knowledge resources and to

  20. 77 FR 33683 - Privacy Act of 1974: Implementation of Exemptions; Department of Homeland Security, U.S. Customs...

    Science.gov (United States)

    2012-06-07

    ... principal types of users will access AFI: DHS analysts and DHS finished intelligence product users. Analysts... Border Protection, DHS/CBP--017 Analytical Framework for Intelligence (AFI) System of Records AGENCY... Framework for Intelligence (AFI) System of Records'' and this proposed rulemaking. In this proposed...

  1. Predicting AD conversion

    DEFF Research Database (Denmark)

    Liu, Yawu; Mattila, Jussi; Ruiz, Miguel Ángel Muñoz

    2013-01-01

    To compare the accuracies of predicting AD conversion by using a decision support system (PredictAD tool) and current research criteria of prodromal AD as identified by combinations of episodic memory impairment of hippocampal type and visual assessment of medial temporal lobe atrophy (MTA) on MRI...

  2. Filtering and prediction

    CERN Document Server

    Fristedt, B; Krylov, N

    2007-01-01

    Filtering and prediction is about observing moving objects when the observations are corrupted by random errors. The main focus is then on filtering out the errors and extracting from the observations the most precise information about the object, which itself may or may not be moving in a somewhat random fashion. Next comes the prediction step where, using information about the past behavior of the object, one tries to predict its future path. The first three chapters of the book deal with discrete probability spaces, random variables, conditioning, Markov chains, and filtering of discrete Markov chains. The next three chapters deal with the more sophisticated notions of conditioning in nondiscrete situations, filtering of continuous-space Markov chains, and of Wiener process. Filtering and prediction of stationary sequences is discussed in the last two chapters. The authors believe that they have succeeded in presenting necessary ideas in an elementary manner without sacrificing the rigor too much. Such rig...
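
    The filtering and prediction steps for a discrete Markov chain, as covered in the book's early chapters, can be condensed into a few lines. The two-state chain and noisy sensor below are invented for illustration, not taken from the book.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])   # transition matrix of the hidden two-state chain
E = np.array([[0.8, 0.2],
              [0.3, 0.7]])   # E[s, o] = P(observe o | state s): a noisy sensor

def filter_update(belief, obs):
    """Bayes filter: one prediction step through P, then condition on obs."""
    predicted = belief @ P
    posterior = predicted * E[:, obs]
    return posterior / posterior.sum()

def predict_ahead(belief, k):
    """Prediction: push the filtered belief k steps through the chain."""
    return belief @ np.linalg.matrix_power(P, k)

belief = np.array([0.5, 0.5])
for obs in [0, 0, 1]:          # corrupted observations of the moving object
    belief = filter_update(belief, obs)
print(belief, predict_ahead(belief, 3))
```

    Filtering extracts the best current-state estimate from corrupted observations; prediction then extrapolates that estimate along the chain's dynamics, exactly the split the book describes.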

  3. Practical application of decision support metrics for power plant risk-informed asset management

    International Nuclear Information System (INIS)

    Liming, James K.; Johnson, David H.; Kee, Ernest J.; Sun, Alice Y.; Young, Garry G.

    2003-01-01

    The objective of this paper is to provide electric utilities with a concept for developing and applying effective decision support metrics via integrated risk-informed asset management (RIAM) programs for power stations and generating companies. RIAM is a process by which analysts review historical performance and develop predictive logic models and data analyses to predict critical decision support figures-of-merit (or metrics) for generating station managers and electric utility company executives. These metrics include, but are not limited to, the following: profitability, net benefit, benefit-to-cost ratio, projected return on investment, projected revenue, projected costs, asset value, safety (catastrophic facility damage frequency and consequences, etc.), power production availability (capacity factor, etc.), efficiency (heat rate), and others. RIAM applies probabilistic safety assessment (PSA) techniques and generates predictions probabilistically so that metrics information can be supplied to managers in terms of probability distributions as well as point estimates. This enables the managers to apply the concept of 'confidence levels' in their critical decision-making processes. (author)
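
    The probabilistic-metric idea can be sketched with a small Monte Carlo: propagate uncertain inputs so that a metric such as net benefit arrives as a distribution with a confidence level, not just a point estimate. All dollar figures below are invented for illustration.

```python
import random

# Sketch: Monte Carlo propagation of uncertain revenue/cost into a net-benefit
# distribution, so a manager can read off a 'confidence level'. Invented numbers.
random.seed(1)
N = 100_000
net = []
for _ in range(N):
    revenue = random.gauss(120.0, 15.0)   # projected revenue, $M/yr (assumed)
    cost = random.gauss(90.0, 10.0)       # projected cost, $M/yr (assumed)
    net.append(revenue - cost)            # one net-benefit sample

net.sort()
mean = sum(net) / N
p05 = net[int(0.05 * N)]                  # 5th percentile of net benefit
conf_positive = sum(s > 0 for s in net) / N
print(f"mean net benefit {mean:.1f} $M, 5th percentile {p05:.1f} $M, "
      f"P(net benefit > 0) = {conf_positive:.2f}")
```

    The same machinery extends to the other listed metrics (benefit-to-cost ratio, return on investment) by changing the quantity sampled inside the loop.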

  4. Reflections of Financial Statement Users' Behavior on Earnings Management Practices from a Weton Perspective

    Directory of Open Access Journals (Sweden)

    Lilik Purwanti

    2015-12-01

    This research aimed to interpret earnings management practices as perceived by users of financial statements, based on weton. Using a qualitative approach, data were collected from a tax inspector, a credit analyst, and an investor. The results show that the tax inspector (Senin Wage) interprets earnings management practices as lipstick and earnings manipulation, while the credit analyst (Senin Pon) interprets them as a cosmetic. The investor (Selasa Paing) interprets them as earnings engineering. The behavior of the tax inspector does not reflect his character, while the behavior of the credit analyst and the investor reflects theirs. Family and working environment, experience, and positive thinking influence character building.

  5. Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.

    Science.gov (United States)

    Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A

    2018-01-01

    Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization, to depict e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.

  6. PredictSNP: robust and accurate consensus classifier for prediction of disease-related mutations.

    Directory of Open Access Journals (Sweden)

    Jaroslav Bendl

    2014-01-01

    Single nucleotide variants represent a prevalent form of genetic variation. Mutations in the coding regions are frequently associated with the development of various genetic diseases. Computational tools for predicting the effects of mutations on protein function are very important for the analysis of single nucleotide variants and their prioritization for experimental characterization. Many computational tools are already widely employed for this purpose. Unfortunately, their comparison and further improvement are hindered by large overlaps between the training datasets and benchmark datasets, which lead to biased and overly optimistic reported performances. In this study, we have constructed three independent datasets by removing all duplicates, inconsistencies and mutations previously used in the training of the evaluated tools. The benchmark dataset containing over 43,000 mutations was employed for the unbiased evaluation of eight established prediction tools: MAPP, nsSNPAnalyzer, PANTHER, PhD-SNP, PolyPhen-1, PolyPhen-2, SIFT and SNAP. The six best performing tools were combined into a consensus classifier, PredictSNP, resulting in significantly improved prediction performance; at the same time it returned results for all mutations, confirming that consensus prediction represents an accurate and robust alternative to the predictions delivered by individual tools. A user-friendly web interface enables easy access to all eight prediction tools, the consensus classifier PredictSNP and annotations from the Protein Mutant Database and the UniProt database. The web server and the datasets are freely available to the academic community at http://loschmidt.chemi.muni.cz/predictsnp.
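
    The consensus idea can be illustrated with a weighted majority vote. The tool names match the abstract, but the votes and the uniform weights are invented; this does not reproduce PredictSNP's actual weighting scheme.

```python
# Sketch of a consensus classifier: confidence-weighted majority vote over
# per-tool predictions. Votes and weights below are hypothetical.

def consensus(votes, weights):
    """votes: tool -> 'deleterious'/'neutral'; weights: tool -> reliability."""
    score = sum(weights[t] * (1 if v == "deleterious" else -1)
                for t, v in votes.items())
    return "deleterious" if score > 0 else "neutral"

votes = {"SIFT": "deleterious", "PolyPhen-2": "deleterious",
         "PhD-SNP": "neutral", "MAPP": "deleterious",
         "SNAP": "neutral", "PANTHER": "deleterious"}
weights = {t: 1.0 for t in votes}  # uniform weights: an assumption
print(consensus(votes, weights))   # → deleterious (4 of 6 tools agree)
```

    Because every tool always casts a vote, the consensus returns a result for every mutation, which is one reason the abstract reports full coverage.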

  7. Genomic prediction using subsampling.

    Science.gov (United States)

    Xavier, Alencar; Xu, Shizhong; Muir, William; Rainey, Katy Martin

    2017-03-24

    Genome-wide assisted selection is a critical tool for the genetic improvement of plants and animals. Whole-genome regression models in Bayesian framework represent the main family of prediction methods. Fitting such models with a large number of observations involves a prohibitive computational burden. We propose the use of subsampling bootstrap Markov chain in genomic prediction. Such method consists of fitting whole-genome regression models by subsampling observations in each round of a Markov Chain Monte Carlo. We evaluated the effect of subsampling bootstrap on prediction and computational parameters. Across datasets, we observed an optimal subsampling proportion of observations around 50% with replacement, and around 33% without replacement. Subsampling provided a substantial decrease in computation time, reducing the time to fit the model by half. On average, losses on predictive properties imposed by subsampling were negligible, usually below 1%. For each dataset, an optimal subsampling point that improves prediction properties was observed, but the improvements were also negligible. Combining subsampling with Gibbs sampling is an interesting ensemble algorithm. The investigation indicates that the subsampling bootstrap Markov chain algorithm substantially reduces computational burden associated with model fitting, and it may slightly enhance prediction properties.
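
    A simplified sketch of the subsampling idea, assuming a Gaussian linear model with a fixed ridge-type prior: each MCMC round draws the regression coefficients from a conditional posterior computed on a random 50%-with-replacement subsample, rescaled to the full data size. This illustrates the mechanism only; it is not the paper's whole-genome sampler.

```python
import numpy as np

# Subsampling bootstrap Markov chain, simplified: coefficient updates use a
# fresh ~50% subsample each round. Data, priors, and sizes are invented.
rng = np.random.default_rng(0)
n, p = 400, 50
X = rng.standard_normal((n, p))
beta_true = rng.normal(0, 0.3, p)
y = X @ beta_true + rng.standard_normal(n)

lam, sigma2 = 1.0, 1.0                   # fixed prior precision, residual variance
beta_draws = []
for it in range(400):
    idx = rng.choice(n, size=n // 2, replace=True)  # ~50% with replacement
    Xs, ys = X[idx], y[idx]
    # Conditional posterior of beta on the subsample, rescaled (x2) to full size:
    A = 2.0 * Xs.T @ Xs / sigma2 + lam * np.eye(p)
    mean = np.linalg.solve(A, 2.0 * Xs.T @ ys / sigma2)
    beta = rng.multivariate_normal(mean, np.linalg.inv(A))
    if it >= 200:                        # discard burn-in
        beta_draws.append(beta)
beta_hat = np.mean(beta_draws, axis=0)
print(float(np.corrcoef(beta_hat, beta_true)[0, 1]))
```

    Halving the observations roughly halves the cost of forming the sufficient statistics each round, which is the source of the computation-time savings the abstract reports, at a small cost in predictive accuracy.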

  8. A methodology to design heuristics for model selection based on the characteristics of data: Application to investigate when the Negative Binomial Lindley (NB-L) is preferred over the Negative Binomial (NB).

    Science.gov (United States)

    Shirazi, Mohammadali; Dhavala, Soma Sekhar; Lord, Dominique; Geedipally, Srinivas Reddy

    2017-10-01

    Safety analysts usually use post-modeling methods, such as Goodness-of-Fit statistics or the Likelihood Ratio Test, to decide between two or more competing distributions or models. Such metrics require all competing distributions to be fitted to the data before any comparisons can be made. Given the continuous growth in newly introduced statistical distributions, choosing the best one using such post-modeling methods is not a trivial task, in addition to all the theoretical or numerical issues the analyst may face during the analysis. Furthermore, and most importantly, these measures or tests do not provide any intuition about why a specific distribution (or model) is preferred over another (Goodness-of-Logic). This paper addresses these issues by proposing a methodology to design heuristics for model selection based on the characteristics of data, in terms of descriptive summary statistics, before fitting the models. The proposed methodology employs two analytic tools: (1) Monte-Carlo simulations and (2) machine learning classifiers, to design easy heuristics to predict the label of the 'most-likely-true' distribution for analyzing data. The proposed methodology was applied to investigate when the recently introduced Negative Binomial Lindley (NB-L) distribution is preferred over the Negative Binomial (NB) distribution. Heuristics were designed to select the 'most-likely-true' distribution between these two distributions, given a set of prescribed summary statistics of data. The proposed heuristics were successfully compared against classical tests for several real or observed datasets. Not only are they easy to use and free of any post-modeling inputs, but the analyst can also attain useful information about why the NB-L is preferred over the NB (or vice versa) when modeling data. Copyright © 2017 Elsevier Ltd. All rights reserved.
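
    The two analytic tools can be illustrated end to end: simulate labeled datasets from candidate distributions, summarize each by descriptive statistics, and learn a pre-modeling decision rule. For a runnable sketch, Poisson vs. negative binomial stand in for NB vs. NB-L (whose sampler is more involved), and a one-feature decision stump stands in for the paper's machine-learning classifiers.

```python
import numpy as np

rng = np.random.default_rng(7)

def summarize(x):
    """Descriptive summary statistics used as classifier features."""
    m = x.mean()
    return (x.var() / m, ((x - m) ** 3).mean() / x.std() ** 3)  # dispersion, skew

# Monte-Carlo step: simulate labeled datasets from each candidate distribution.
feats, labels = [], []
for _ in range(300):
    feats.append(summarize(rng.poisson(rng.uniform(1, 10), 500))); labels.append(0)
    feats.append(summarize(rng.negative_binomial(2, 0.3, 500))); labels.append(1)

# Classifier step: a one-feature decision stump on the dispersion index.
disp = np.array([f[0] for f in feats])
labels = np.array(labels)
thresholds = np.linspace(disp.min(), disp.max(), 200)
acc = np.array([((disp > t) == labels).mean() for t in thresholds])
best_t = thresholds[acc.argmax()]
print(f"heuristic: prefer the overdispersed model when var/mean > {best_t:.2f} "
      f"(training accuracy {acc.max():.2f})")
```

    The learned threshold is itself the heuristic: a rule on summary statistics that picks the 'most-likely-true' distribution before any model is fitted, and that explains *why* one model is preferred (here, overdispersion).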

  9. Environmental Policy Tools: A User’s Guide.

    Science.gov (United States)

    1995-09-01

  10. 77 FR 33422 - Utility Scale Wind Towers From the People's Republic of China: Preliminary Affirmative...

    Science.gov (United States)

    2012-06-06

    ... Trade Analyst, AD/CVD Operations, Office 3, through Melissa G. Skinner, Director, AD/CVD Operations... questionnaire to the GOC regarding the provision of electricity for less than adequate remuneration (LTAR) and.... Skinner, Director, AD/CVD Operations, Office 3, from Patricia M. Tran, International Trade Analyst, AD/CVD...

  11. Supporting exploration awareness for visual analytics

    NARCIS (Netherlands)

    Shrinivasan, Y.B.; Wijk, van J.J.

    2008-01-01

    While exploring data using information visualization, analysts try to make sense of the data, build cases, and present them to others. However, if the exploration is long or done in multiple sessions, it can be hard for analysts to remember all interesting visualizations and the relationships among

  12. 78 FR 76161 - Notice of Regulatory Waiver Requests Granted for the Third Quarter of Calendar Year 2013

    Science.gov (United States)

    2013-12-16

    ... annually in operating expenses. Contact: Shelley M. McCracken-Rania, Senior Financial Analyst, Office of...: Shelley M. McCracken-Rania, Senior Financial Analyst, Office of Healthcare Programs, Office of Housing... the needs for capitalized interest, up to $25 million. Contact: Shelley M. McCracken-Rania, Senior...

  13. Emerging approaches in predictive toxicology.

    Science.gov (United States)

    Zhang, Luoping; McHale, Cliona M; Greene, Nigel; Snyder, Ronald D; Rich, Ivan N; Aardema, Marilyn J; Roy, Shambhu; Pfuhler, Stefan; Venkatactahalam, Sundaresan

    2014-12-01

    Predictive toxicology plays an important role in the assessment of toxicity of chemicals and the drug development process. While there are several well-established in vitro and in vivo assays that are suitable for predictive toxicology, recent advances in high-throughput analytical technologies and model systems are expected to have a major impact on the field of predictive toxicology. This commentary provides an overview of the state of the current science and a brief discussion on future perspectives for the field of predictive toxicology for human toxicity. Computational models for predictive toxicology, needs for further refinement and obstacles to expand computational models to include additional classes of chemical compounds are highlighted. Functional and comparative genomics approaches in predictive toxicology are discussed with an emphasis on successful utilization of recently developed model systems for high-throughput analysis. The advantages of three-dimensional model systems and stem cells and their use in predictive toxicology testing are also described. © 2014 Wiley Periodicals, Inc.

  14. Earthquake prediction with electromagnetic phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp [Hayakawa Institute of Seismo Electomagnetics, Co. Ltd., University of Electro-Communications (UEC) Incubation Center, 1-5-1 Chofugaoka, Chofu Tokyo, 182-8585 (Japan); Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo (Japan); Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062 (Japan); Fuji Security Systems. Co. Ltd., Iwato-cho 1, Shinjyuku-ku, Tokyo (Japan)

    2016-02-01

    Short-term earthquake (EQ) prediction is defined as prospective prediction on a time scale of about one week, and it is considered one of the most important and urgent topics for humankind. If such short-term prediction were realized, casualties would be drastically reduced. Unlike conventional seismic measurement, we proposed the use of electromagnetic phenomena as precursors to EQs in prediction, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews this short-term EQ prediction, including the myth that EQ prediction by seismometers is impossible, the reason why we are interested in electromagnetics, the history of seismo-electromagnetics, the ionospheric perturbation as the most promising candidate for EQ prediction, the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

  15. Prediction of bull fertility.

    Science.gov (United States)

    Utt, Matthew D

    2016-06-01

    Prediction of male fertility is an often sought-after endeavor for many species of domestic animals. This review will primarily focus on providing some examples of dependent and independent variables to stimulate thought about the approach and methodology of identifying the most appropriate of those variables to predict bull (bovine) fertility. Although the list of variables will continue to grow with advancements in science, the principles behind making predictions will likely not change significantly. The basic principle of prediction requires identifying a dependent variable that is an estimate of fertility and an independent variable or variables that may be useful in predicting the fertility estimate. Fertility estimates vary in which parts of the process leading to conception that they infer about and the amount of variation that influences the estimate and the uncertainty thereof. The list of potential independent variables can be divided into competence of sperm based on their performance in bioassays or direct measurement of sperm attributes. A good prediction will use a sample population of bulls that is representative of the population to which an inference will be made. Both dependent and independent variables should have a dynamic range in their values. Careful selection of independent variables includes reasonable measurement repeatability and minimal correlation among variables. Proper estimation and having an appreciation of the degree of uncertainty of dependent and independent variables are crucial for using predictions to make decisions regarding bull fertility. Copyright © 2016 Elsevier B.V. All rights reserved.
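
    The screening advice above (dynamic range, measurement repeatability, minimal correlation among independent variables) can be sketched as a pre-regression check. The sperm variables and data below are invented for illustration.

```python
import numpy as np

# Pre-regression screening: flag candidate predictors of a fertility estimate
# that are nearly redundant with one another. All variables are hypothetical.
rng = np.random.default_rng(5)
n = 60
motility = rng.uniform(40, 90, n)                  # % motile sperm (assumed range)
morphology = rng.uniform(50, 95, n)                # % normal forms (assumed range)
velocity = 0.8 * motility + rng.normal(0, 2, n)    # nearly redundant with motility

X = np.column_stack([motility, morphology, velocity])
names = ["motility", "morphology", "velocity"]
corr = np.corrcoef(X, rowvar=False)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(corr[i, j]) > 0.9:                  # 0.9 cutoff: an assumption
            print(f"drop one of ({names[i]}, {names[j]}): r = {corr[i, j]:.2f}")
```

    Dropping one of a highly correlated pair before fitting keeps the regression coefficients interpretable, in line with the review's call for minimal correlation among variables.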

  16. Climate Prediction - NOAA's National Weather Service

    Science.gov (United States)

    Long range forecasts across the U.S., from the Climate Prediction pages of NOAA's National Weather Service.

  17. Presence, Mourning, and Beauty: Elements of Analytic Process.

    Science.gov (United States)

    Markman, Henry C

    2017-12-01

    Analyst and patient occasionally arrive at moments of heightened meaning and aliveness. These moments can be transformative and lead to psychic change in the patient. They give life and arouse hope, and feel "real" in a new way, though often entailing emotional turbulence. Specific internal work must be done by the analyst to allow for and foster these experiences. This involves a kind of mourning process in the analyst that allows for "presence" and "availability" as described by Gabriel Marcel, and for the "at-one-ment" described by Bion. These transforming moments can be viewed in an aesthetic realm, along the lines of Keats's "Beauty is truth, truth beauty." This embodies the analytic value of emotional truth. These moments are shared and their emergence is an intersubjective creation. Clinical illustrations show how the internal work of mourning by the analyst through directed introspection allows for presence and availability, and then for shared moments of beauty with the patient.

  18. 1. On note taking.

    Science.gov (United States)

    Plaut, Alfred B J

    2005-02-01

    In this paper the author explores the theoretical and technical issues relating to taking notes of analytic sessions, using an introspective approach. The paper discusses the lack of a consistent approach to note taking amongst analysts and sets out to demonstrate that systematic note taking can be helpful to the analyst. The author describes his discovery that an initial phase where as much data was recorded as possible did not prove to be reliably helpful in clinical work and initially actively interfered with recall in subsequent sessions. The impact of the nature of the analytic session itself and the focus of the analyst's interest on recall is discussed. The author then describes how he modified his note taking technique to classify information from sessions into four categories which enabled the analyst to select which information to record in notes. The characteristics of memory and its constructive nature are discussed in relation to the problems that arise in making accurate notes of analytic sessions.

  19. Internalization, separation-individuation, and the nature of therapeutic action.

    Science.gov (United States)

    Blatt, S J; Behrends, R S

    1987-01-01

    Based on the assumption that the mutative factors that facilitate growth in psychoanalysis involve the same fundamental mechanisms that lead to psychological growth in normal development, this paper considers the constant oscillation between gratification and deprivation leading to internalization as the central therapeutic mechanism of the psychoanalytic process. Patients experience the analytic process as a series of gratifying involvements and experienced incompatibilities that facilitate internalization, whereby the patient recovers lost or disrupted regulatory, gratifying interactions with the analyst, which are real or fantasied, by appropriating these interactions, transforming them into their own, enduring, self-generated functions and characteristics. Patients internalize not only the analyst's interpretive activity, but also the analyst's sensitivity, compassion and acceptance, and, in addition, their own activity in relation to the analyst such as free association. Both interpretation and the therapeutic relationship can contain elements of gratifying involvement and experienced incompatibility that lead to internalization and therefore both can be mutative factors in the therapeutic process.

  20. Tide Predictions, California, 2014, NOAA

    Data.gov (United States)

    U.S. Environmental Protection Agency — The predictions from the web based NOAA Tide Predictions are based upon the latest information available as of the date of the user's request. Tide predictions...

  1. Decadal climate prediction (project GCEP).

    Science.gov (United States)

    Haines, Keith; Hermanson, Leon; Liu, Chunlei; Putt, Debbie; Sutton, Rowan; Iwi, Alan; Smith, Doug

    2009-03-13

    Decadal prediction uses climate models forced by changing greenhouse gases, as in Intergovernmental Panel on Climate Change projections, but unlike longer-range predictions they also require initialization with observations of the current climate. In particular, the upper-ocean heat content and circulation have a critical influence. Decadal prediction is still in its infancy and there is an urgent need to understand the important processes that determine predictability on these timescales. We have taken the first Hadley Centre Decadal Prediction System (DePreSys) and implemented it on several NERC institute compute clusters in order to study a wider range of initial condition impacts on decadal forecasting, eventually including the state of the land and cryosphere. The eScience methods are used to manage submission and output from the many ensemble model runs required to assess predictive skill. Early results suggest initial condition skill may extend for several years, even over land areas, but this depends sensitively on the definition used to measure skill, and alternatives are presented. The Grid for Coupled Ensemble Prediction (GCEP) system will allow the UK academic community to contribute to international experiments being planned to explore decadal climate predictability.

  2. Automatic fault tree generation in the EPR PSA project

    International Nuclear Information System (INIS)

    Villatte, N.; Nonclercq, P.; Taupy, S.

    2012-01-01

    Tools (KB3 and Atelier EPS) have been developed at EDF to assist analysts in building fault trees for PSA (Probabilistic Safety Assessment) and importing them into RiskSpectrum, a Swedish code used at EDF for PSA. System modelling is performed using the KB3 software with a knowledge base describing generic classes of components with their behaviour and failure modes. Using these classes of components, the analyst can describe (using a graphical system editor): a simplified system diagram from the mechanical system drawings and functional descriptions, the missions of the studied system (in the form of high-level fault trees) and its different configurations for the missions. He can also add specific knowledge about the system. The analyst then chooses missions and configurations to specify and launch fault tree generation. From the system description, KB3 produces, by backward-chaining on rules, detailed system fault trees. These fault trees are finally imported into RiskSpectrum (converted by Atelier EPS into a format readable by RiskSpectrum). KB3 and Atelier EPS were used to create the majority of the fault trees for the EDF EPR Probabilistic Safety Analysis conducted from November 2009 to March 2010. 25 systems were modelled, and 127 fault trees were automatically generated in a rather short time by different analysts with the help of these tools. Feedback shows many advantages of using KB3 and Atelier EPS: homogeneity and consistency between the different generated fault trees, traceability and control of modelling and, last but not least, the automation of detailed fault tree creation relieves the human analyst of this tedious task so that he can focus his attention on more important tasks: modelling the failure of a function. This industrial application has also helped us gather interesting feedback from the analysts that should help us improve the handling of the tools. We propose in this paper indeed some

  3. Potential Predictability and Prediction Skill for Southern Peru Summertime Rainfall

    Science.gov (United States)

    WU, S.; Notaro, M.; Vavrus, S. J.; Mortensen, E.; Block, P. J.; Montgomery, R. J.; De Pierola, J. N.; Sanchez, C.

    2016-12-01

    The central Andes receive over 50% of annual climatological rainfall during the short period of January-March. This summertime rainfall exhibits strong interannual and decadal variability, including severe drought events that incur devastating societal impacts and cause agricultural communities and mining facilities to compete for limited water resources. An improved seasonal prediction skill of summertime rainfall would aid in water resource planning and allocation across the water-limited southern Peru. While various underlying mechanisms have been proposed by past studies for the drivers of interannual variability in summertime rainfall across southern Peru, such as the El Niño-Southern Oscillation (ENSO), Madden Julian Oscillation (MJO), and extratropical forcings, operational forecasts continue to be largely based on rudimentary ENSO-based indices, such as NINO3.4, justifying further exploration of predictive skill. In order to bridge this gap between the understanding of driving mechanisms and the operational forecast, we performed systematic studies on the predictability and prediction skill of southern Peru summertime rainfall by constructing statistical forecast models using best available weather station and reanalysis datasets. At first, by assuming the first two empirical orthogonal functions (EOFs) of summertime rainfall are predictable, the potential predictability skill was evaluated for southern Peru. Then, we constructed a simple regression model, based on the time series of tropical Pacific sea-surface temperatures (SSTs), and a more advanced Linear Inverse Model (LIM), based on the EOFs of tropical ocean SSTs and large-scale atmosphere variables from reanalysis. Our results show that the LIM model consistently outperforms the more rudimentary regression models on the forecast skill of domain averaged precipitation index and individual station indices. The improvement of forecast correlation skill ranges from 10% to over 200% for different
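
    The LIM at the core of the comparison can be sketched in a few lines, assuming the EOF amplitudes behave as a multivariate AR(1) process: estimate the lag-tau propagator G = C(tau) C(0)^{-1} from lagged covariances, then forecast x(t+tau) = G x(t). The synthetic data below stand in for the SST/reanalysis EOF amplitudes.

```python
import numpy as np

# Linear Inverse Model sketch on synthetic EOF-amplitude series.
rng = np.random.default_rng(3)
T, k, tau = 2000, 3, 1
A_true = np.diag([0.9, 0.7, 0.5])     # assumed true dynamics (unknown in practice)
x = np.zeros((T, k))
for t in range(1, T):
    x[t] = x[t - 1] @ A_true.T + rng.standard_normal(k)

X0, X1 = x[:-tau], x[tau:]
C0 = X0.T @ X0 / len(X0)              # lag-0 covariance
Ctau = X1.T @ X0 / len(X0)            # lag-tau covariance
G = Ctau @ np.linalg.inv(C0)          # LIM propagator: G = C(tau) C(0)^-1
forecast = x[-1] @ G.T                # forecast of the next state
print(np.round(np.diag(G), 2))
```

    Because G is fitted from the joint covariance structure of many predictors, a LIM can exploit relationships that a single NINO3.4-style regression index misses, which is consistent with the skill improvement reported above.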

  4. Predicting epileptic seizures in advance.

    Directory of Open Access Journals (Sweden)

    Negin Moghim

    Epilepsy is the second most common neurological disorder, affecting 0.6-0.8% of the world's population. In this neurological disorder, abnormal activity of the brain causes seizures, the nature of which tend to be sudden. Antiepileptic Drugs (AEDs are used as long-term therapeutic solutions that control the condition. Of those treated with AEDs, 35% become resistant to medication. The unpredictable nature of seizures poses risks for the individual with epilepsy. It is clearly desirable to find more effective ways of preventing seizures for such patients. The automatic detection of oncoming seizures, before their actual onset, can facilitate timely intervention and hence minimize these risks. In addition, advance prediction of seizures can enrich our understanding of the epileptic brain. In this study, drawing on the body of work behind automatic seizure detection and prediction from digitised Invasive Electroencephalography (EEG data, a prediction algorithm, ASPPR (Advance Seizure Prediction via Pre-ictal Relabeling, is described. ASPPR facilitates the learning of predictive models targeted at recognizing patterns in EEG activity that are in a specific time window in advance of a seizure. It then exploits advanced machine learning coupled with the design and selection of appropriate features from EEG signals. Results, from evaluating ASPPR independently on 21 different patients, suggest that seizures for many patients can be predicted up to 20 minutes in advance of their onset. Compared to benchmark performance represented by a mean S1-Score (harmonic mean of Sensitivity and Specificity of 90.6% for predicting seizure onset between 0 and 5 minutes in advance, ASPPR achieves mean S1-Scores of: 96.30% for prediction between 1 and 6 minutes in advance, 96.13% for prediction between 8 and 13 minutes in advance, 94.5% for prediction between 14 and 19 minutes in advance, and 94.2% for prediction between 20 and 25 minutes in advance.
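
    The pre-ictal relabeling at the heart of ASPPR can be sketched as a windowing rule: feature windows that fall within a chosen lead-time interval before a seizure onset are relabeled positive, so the classifier learns to fire in advance of onset. Window length, onset times, and the 5-10 minute horizon below are invented, not the paper's settings.

```python
# Sketch of pre-ictal relabeling for advance seizure prediction.

def relabel(n_windows, window_sec, onsets_sec, lead_from, lead_to):
    """Mark window i positive when it starts lead_from..lead_to seconds
    before any seizure onset; all other windows keep the negative label."""
    labels = [0] * n_windows
    for i in range(n_windows):
        t = i * window_sec
        if any(lead_from <= onset - t <= lead_to for onset in onsets_sec):
            labels[i] = 1
    return labels

# Hypothetical recording: 120 windows of 30 s, one seizure at t = 1800 s,
# relabeling the 5-10 minute pre-ictal horizon as positive.
labels = relabel(n_windows=120, window_sec=30,
                 onsets_sec=[1800], lead_from=300, lead_to=600)
print(sum(labels))  # → 11 windows relabeled as pre-ictal
```

    Shifting the [lead_from, lead_to] interval yields the family of models evaluated above (1-6 min, 8-13 min, 14-19 min, 20-25 min ahead), each trained on the same EEG features but a different relabeling.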

  5. Are abrupt climate changes predictable?

    Science.gov (United States)

    Ditlevsen, Peter

    2013-04-01

    It is taken for granted that the limited predictability in the initial value problem, the weather prediction, and the predictability of the statistics are two distinct problems. Lorenz (1975) dubbed these predictability of the first and the second kind, respectively. Predictability of the first kind in a chaotic dynamical system is limited due to the well-known critical dependence on initial conditions. Predictability of the second kind is possible in an ergodic system, where either the dynamics is known and the phase space attractor can be characterized by simulation, or the system can be observed for such long times that the statistics can be obtained from temporal averaging, assuming that the attractor does not change in time. For the climate system the distinction between predictability of the first and the second kind is fuzzy. This difficulty in distinguishing between predictability of the first and of the second kind is related to the lack of scale separation between fast and slow components of the climate system. The non-linear nature of the problem furthermore opens the possibility of multiple attractors, or multiple quasi-steady states. As the ice-core records show, the climate has been jumping between different quasi-stationary climates, stadials and interstadials, through the Dansgaard-Oeschger events. Such a jump happens very fast when a critical tipping point has been reached. The question is: can such a tipping point be predicted? This is a new kind of predictability: the third kind. If the tipping point is reached through a bifurcation, where the stability of the system is governed by some control parameter changing in a predictable way to a critical value, the tipping is predictable. If the sudden jump occurs because internal chaotic fluctuations (noise) push the system across a barrier, the tipping is as unpredictable as the triggering noise.
In order to hint at an answer to this question, a careful analysis of the high temporal resolution NGRIP isotope

  6. Predicting Free Recalls

    Science.gov (United States)

    Laming, Donald

    2006-01-01

    This article reports some calculations on free-recall data from B. Murdock and J. Metcalfe (1978), with vocal rehearsal during the presentation of a list. Given the sequence of vocalizations, with the stimuli inserted in their proper places, it is possible to predict the subsequent sequence of recalls--the predictions taking the form of a…

  7. Deep Visual Attention Prediction

    Science.gov (United States)

    Wang, Wenguan; Shen, Jianbing

    2018-05-01

    In this work, we aim to predict human eye fixation in view-free scenes based on an end-to-end deep learning architecture. Although Convolutional Neural Networks (CNNs) have brought substantial improvement to human attention prediction, CNN-based attention models can still be improved by efficiently leveraging multi-scale features. Our visual attention network is proposed to capture hierarchical saliency information, from deep, coarse layers with global saliency information to shallow, fine layers with local saliency response. Our model is based on a skip-layer network structure, which predicts human attention from multiple convolutional layers with various receptive fields. Final saliency prediction is achieved via the cooperation of those global and local predictions. Our model is learned in a deep supervision manner, where supervision is fed directly into multi-level layers, instead of the previous approach of providing supervision only at the output layer and propagating it back to earlier layers. Our model thus incorporates multi-level saliency predictions within a single network, which significantly reduces the redundancy of previous approaches that learn multiple network streams with different input scales. Extensive experimental analysis on various challenging benchmark datasets demonstrates that our method yields state-of-the-art performance with competitive inference time.

  8. A grey NGM(1,1, k) self-memory coupling prediction model for energy consumption prediction.

    Science.gov (United States)

    Guo, Xiaojun; Liu, Sifeng; Wu, Lifeng; Tang, Lingling

    2014-01-01

    Energy consumption prediction is an important issue for governments, energy sector investors, and other related corporations. Although there are several prediction techniques, selection of the most appropriate technique is of vital importance. For the approximately nonhomogeneous exponential data sequences that often emerge in the energy system, a novel grey NGM(1,1, k) self-memory coupling prediction model is put forward to improve predictive performance. It achieves an organic integration of the self-memory principle of dynamic systems with the grey NGM(1,1, k) model. The traditional grey model's sensitivity to initial values is overcome by the self-memory principle. In this study, the total energy, coal, and electricity consumption of China are used for demonstration with the proposed coupling prediction technique. The results show the superiority of the NGM(1,1, k) self-memory coupling prediction model when compared with results from the literature. Its excellent prediction performance lies in the fact that the proposed coupling model can take full advantage of systematic multi-time historical data and capture the stochastic fluctuation tendency. This work also makes a significant contribution to the enrichment of grey prediction theory and the extension of its application span.
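    As background, the base model that the NGM(1,1, k) variant extends is the classic GM(1,1) grey model. A minimal sketch of the standard textbook procedure (the function name is my own; this is not the authors' coupling model): accumulate the series, fit the development coefficient and grey input by least squares on the mean sequence, then invert the accumulation.

    ```python
    import numpy as np

    def gm11_forecast(x0, steps=1):
        """Classic GM(1,1) grey forecast (the base model that NGM(1,1, k) extends).

        x0 : 1-D sequence of positive observations.
        Returns fitted values over the sample plus `steps` forecasts beyond it.
        """
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)                          # accumulated generating operation (AGO)
        z1 = 0.5 * (x1[1:] + x1[:-1])               # mean sequence of consecutive AGO terms
        # Least-squares estimate of development coefficient a and grey input b
        # from the grey differential equation x0[k] = -a * z1[k] + b.
        B = np.column_stack([-z1, np.ones_like(z1)])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
        k = np.arange(len(x0) + steps)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # time-response function
        return np.diff(np.concatenate([[0.0], x1_hat]))     # inverse AGO back to x0 scale

    # A roughly exponential series: the model extrapolates the trend.
    series = [2.87, 3.28, 3.34, 3.77, 3.85, 4.17]
    fitted_and_forecast = gm11_forecast(series, steps=2)
    ```

    The AGO step is what makes grey models suited to short, noisy series: accumulation smooths fluctuations into a near-exponential curve that two parameters can capture.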

  9. The IntFOLD server: an integrated web resource for protein fold recognition, 3D model quality assessment, intrinsic disorder prediction, domain prediction and ligand binding site prediction.

    Science.gov (United States)

    Roche, Daniel B; Buenavista, Maria T; Tetchner, Stuart J; McGuffin, Liam J

    2011-07-01

    The IntFOLD server is a novel independent server that integrates several cutting-edge methods for the prediction of structure and function from sequence. Our guiding principles behind the server development were as follows: (i) to provide a simple unified resource that makes our prediction software accessible to all and (ii) to produce integrated output for predictions that can be easily interpreted. The output for predictions is presented as a simple table that summarizes all results graphically via plots and annotated 3D models. The raw machine readable data files for each set of predictions are also provided for developers, which comply with the Critical Assessment of Methods for Protein Structure Prediction (CASP) data standards. The server comprises an integrated suite of five novel methods: nFOLD4, for tertiary structure prediction; ModFOLD 3.0, for model quality assessment; DISOclust 2.0, for disorder prediction; DomFOLD 2.0, for domain prediction; and FunFOLD 1.0, for ligand binding site prediction. Predictions from the IntFOLD server were found to be competitive in several categories in the recent CASP9 experiment. The IntFOLD server is available at the following web site: http://www.reading.ac.uk/bioinf/IntFOLD/.

  10. 40 CFR 136.6 - Method modifications and analytical requirements.

    Science.gov (United States)

    2010-07-01

    ... modifications and analytical requirements. (a) Definitions of terms used in this section. (1) Analyst means the..., oil and grease, total suspended solids, total phenolics, turbidity, chemical oxygen demand, and.... Except as set forth in paragraph (b)(3) of this section, an analyst may modify an approved test procedure...

  11. 17 CFR 242.500 - Definitions.

    Science.gov (United States)

    2010-04-01

    .... market is not the principal trading market. Public appearance means any participation by a research... activities of research analysts or the content of research reports; and (ii) If the broker or dealer... broker or dealer from influencing the activities of research analysts and the content of research reports...

  12. Internal and External Crisis Early Warning and Monitoring.

    Science.gov (United States)

    1980-12-01

    very fundamental sense, an I&W analyst is an (applied) empirical scientist testing hypotheses. It is a canon of scientific research that, with the...creeping expropriation, taxation changes, etc. The I&W analyst is similarly concerned with a range of discrete risks; risk in general is equivalent to

  13. The Effect of a Workload-Preview on Task-Prioritization and Task-Performance

    Science.gov (United States)

    Minotra, Dev

    2012-01-01

    With increased volume and sophistication of cyber attacks in recent years, maintaining situation awareness and an effective task-prioritization strategy is critical to the work of cybersecurity analysts. However, the high mental workload associated with the cybersecurity analyst's task limits their ability to prioritize tasks.…

  14. Relating Actor Analysis Methods to Policy Problems

    NARCIS (Netherlands)

    Van der Lei, T.E.

    2009-01-01

    For a policy analyst the policy problem is the starting point for the policy analysis process. During this process the policy analyst structures the policy problem and makes a choice for an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy

  15. Team Collaboration: The Use of Behavior Principles for Serving Students with ASD

    Science.gov (United States)

    Donaldson, Amy L.; Stahmer, Aubyn C.

    2014-01-01

    Purpose: Speech-language pathologists (SLPs) and behavior analysts are key members of school-based teams that serve children with autism spectrum disorders (ASD). Behavior analysts approach assessment and intervention through the lens of applied behavior analysis (ABA). ABA-based interventions have been found effective for targeting skills across…

  16. Download this PDF file

    African Journals Online (AJOL)

    ; Young, S.I.; Ho, S.K.; Mizura, S.S. Pertanika 1988, 11, 39. 5. Fogg, A.G.; Summan, A.M. Analyst 1985, 108, 691. 6. Lindquist, J.; Farroha, S.M. Analyst 1975, 100, 377. 7. Wekesa, N.M.N.; Chhabra, S.C.; Thairu, H.M. Bull. Chem. Soc. Ethiop.

  17. Prediction skill of rainstorm events over India in the TIGGE weather prediction models

    Science.gov (United States)

    Karuna Sagar, S.; Rajeevan, M.; Vijaya Bhaskara Rao, S.; Mitra, A. K.

    2017-12-01

    Extreme rainfall events pose a serious threat of severe floods in many countries worldwide. Therefore, advance prediction of their occurrence and spatial distribution is essential. In this paper, an analysis has been made to assess the skill of numerical weather prediction models in predicting rainstorms over India. Using a gridded daily rainfall data set and objective criteria, 15 rainstorms were identified during the monsoon season (June to September). The analysis was made using three TIGGE (THe Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble) models. The models considered are the European Centre for Medium-Range Weather Forecasts (ECMWF), National Centre for Environmental Prediction (NCEP) and the UK Met Office (UKMO). Verification of the TIGGE models for 43 observed rainstorm days from 15 rainstorm events has been made for the period 2007-2015. The comparison reveals that rainstorm events are predictable up to 5 days in advance, though with a bias in spatial distribution and intensity. The statistical parameters, namely mean error (ME) or bias, root mean square error (RMSE) and correlation coefficient (CC), have been computed over the rainstorm region using the multi-model ensemble (MME) mean. The study reveals that the spread is large in ECMWF and UKMO, followed by the NCEP model. Though the ensemble spread is quite small in NCEP, the ensemble member averages are not well predicted. The rank histograms suggest that the forecasts under-predict. The modified Contiguous Rain Area (CRA) technique was used to verify the spatial as well as the quantitative skill of the TIGGE models. Overall, the contribution from the displacement and pattern errors to the total RMSE is found to be larger in magnitude. The volume error increases from the 24 hr forecast to the 48 hr forecast in all three models.
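    The three verification statistics named above have standard definitions; a minimal sketch (a hypothetical helper with illustrative rainfall values, not the TIGGE verification code):

    ```python
    import numpy as np

    def verification_stats(forecast, observed):
        """Mean error (bias), root mean square error, and Pearson correlation
        coefficient between forecast and observed values."""
        f = np.asarray(forecast, dtype=float)
        o = np.asarray(observed, dtype=float)
        me = float(np.mean(f - o))                      # bias: systematic over/under-prediction
        rmse = float(np.sqrt(np.mean((f - o) ** 2)))    # overall error magnitude
        cc = float(np.corrcoef(f, o)[0, 1])             # pattern agreement
        return me, rmse, cc

    # Illustrative daily rainfall totals (mm) at three grid points:
    me, rmse, cc = verification_stats([12.0, 30.0, 55.0], [10.0, 35.0, 50.0])
    ```

    The ME isolates systematic bias, while the RMSE also absorbs displacement and pattern errors; that distinction is what the CRA decomposition mentioned above separates out explicitly.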

  18. Predicting Well-Being in Europe?

    DEFF Research Database (Denmark)

    Hussain, M. Azhar

    2015-01-01

    Has the worst financial and economic crisis since the 1930s reduced the subjective wellbeing function's predictive power? Regression models for happiness are estimated for the three first rounds of the European Social Survey (ESS); 2002, 2004 and 2006. Several explanatory variables are significant...... happiness. Nevertheless, 73% of the predictions in 2008 and 57% of predictions in 2010 were within the margin of error. These correct prediction percentages are not unusually low - rather they are slightly higher than before the crisis. It is surprising that happiness predictions are not adversely affected...... by the crisis. On the other hand, results are consistent with the adaption hypothesis. The same exercise is conducted applying life satisfaction instead of happiness, but we reject, against expectation, that (more transient) happiness is harder to predict than life satisfaction. Fifteen ESS countries surveyed...

  19. Prediction of postoperative pain: a systematic review of predictive experimental pain studies

    DEFF Research Database (Denmark)

    Werner, Mads Utke; Mjöbo, Helena N; Nielsen, Per R

    2010-01-01

    Quantitative testing of a patient's basal pain perception before surgery has the potential to be of clinical value if it can accurately predict the magnitude of pain and requirement of analgesics after surgery. This review includes 14 studies that have investigated the correlation between...... preoperative responses to experimental pain stimuli and clinical postoperative pain and demonstrates that the preoperative pain tests may predict 4-54% of the variance in postoperative pain experience depending on the stimulation methods and the test paradigm used. The predictive strength is much higher than...

  20. Application of decline curve analysis to estimate recovery factors for carbon dioxide enhanced oil recovery

    Science.gov (United States)

    Jahediesfanjani, Hossein

    2017-07-17

    Introduction: In the decline curve analysis (DCA) method of estimating recoverable hydrocarbon volumes, the analyst uses historical production data from a well, lease, group of wells (or pattern), or reservoir and plots production rates against time or cumulative production for the analysis. The DCA of an individual well is founded on the same basis as the fluid-flow principles that are used for pressure-transient analysis of a single well in a reservoir domain and therefore can provide scientifically reasonable and accurate results. However, when used for a group of wells, a lease, or a reservoir, the DCA becomes more of an empirical method. Plots from the DCA reflect the reservoir response to the oil withdrawal (or production) under the prevailing operating and reservoir conditions, and they continue to be good tools for estimating recoverable hydrocarbon volumes and future production rates. For predicting the total recoverable hydrocarbon volume, the DCA results can help the analyst to evaluate the reservoir performance under any of the three phases of reservoir productive life (primary, secondary waterflood, or tertiary enhanced oil recovery) so long as the historical production data are sufficient to establish decline trends at the end of the three phases.
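    The workhorse of DCA is the Arps family of decline curves, which relate production rate to time through an initial rate, an initial decline, and a decline exponent. A minimal sketch (illustrative values, not drawn from this report):

    ```python
    import math

    def arps_rate(qi, di, b, t):
        """Arps decline-curve production rate at time t.

        qi : initial rate, di : initial decline (1/time), b : decline exponent
        (b = 0 exponential, 0 < b < 1 hyperbolic, b = 1 harmonic).
        """
        if b == 0.0:                                 # exponential decline
            return qi * math.exp(-di * t)
        return qi / (1.0 + b * di * t) ** (1.0 / b)  # hyperbolic / harmonic decline

    # Rate after 24 months for a well starting at 1000 bbl/month with
    # 10%/month initial decline and hyperbolic exponent b = 0.5:
    q24 = arps_rate(1000.0, 0.10, 0.5, 24.0)   # about 207 bbl/month
    ```

    Fitting qi, di, and b to the historical rates, then integrating the curve to an economic-limit rate, gives the estimated ultimate recovery that the report discusses.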

  1. ASN reputation system model

    Science.gov (United States)

    Hutchinson, Steve; Erbacher, Robert F.

    2015-05-01

    Network security monitoring is currently challenged by its reliance on human analysts and the inability of tools to generate indications and warnings for previously unknown attacks. We propose a reputation system based on IP address set membership within the Autonomous System Number (ASN) system. Essentially, a metric generated from the historic behavior, or misbehavior, of nodes within a given ASN can be used to predict future behavior and provide a mechanism to locate network activity requiring inspection. This will reinforce notifications and warnings and lead to inspection of ASNs known to be problematic, even if initial inspection leads to interpretation of the event as innocuous. We developed proof-of-concept capabilities to generate the IP address to ASN set membership and analyze the impact of the results. These results clearly show that while some ASNs are one-offs with individual or small numbers of misbehaving IP addresses, there are definitive ASNs with a history of long-term and widespread misbehaving IP addresses. These ASNs with long histories are what we are especially interested in; they will provide an additional correlation metric for the human analyst and lead to new tools to aid remediation of these IP address blocks.
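    The core idea of an ASN-level reputation metric can be sketched very simply. The following is a hypothetical toy (the function, event format, and ASN/IP values are all invented for illustration; the authors' tool is not described at this level of detail): score each ASN by the fraction of its observed IPs that were ever flagged.

    ```python
    from collections import defaultdict

    def asn_reputation(events):
        """Toy reputation metric over (asn, ip, flagged) observations:
        the fraction of distinct IPs seen in each ASN that were ever flagged."""
        seen = defaultdict(set)      # asn -> all distinct IPs observed
        bad = defaultdict(set)       # asn -> distinct IPs flagged as misbehaving
        for asn, ip, flagged in events:
            seen[asn].add(ip)
            if flagged:
                bad[asn].add(ip)
        return {asn: len(bad[asn]) / len(ips) for asn, ips in seen.items()}

    # Documentation-range addresses, purely illustrative:
    events = [
        ("AS64500", "198.51.100.1", True),
        ("AS64500", "198.51.100.2", True),
        ("AS64500", "198.51.100.3", False),
        ("AS64501", "203.0.113.7", False),
    ]
    rep = asn_reputation(events)   # AS64500 scores far worse than AS64501
    ```

    A production system would additionally weight by recency and observation volume, which is what distinguishes the "long history" ASNs the abstract highlights from one-off offenders.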

  2. Cost Benefit Analysis of LH2 Pad B

    Science.gov (United States)

    Mott, Brittany

    2013-01-01

    This analysis is used to evaluate, from a cost and benefit perspective, potential outcomes when replacing the pressurization switches and the pressurization system to meet the needs of the LH2 storage system at Pad B. This also includes alternatives, tangible and intangible benefits, and the results of the analysis.

  3. Improving Information Operations with a Military Cultural Analyst

    Science.gov (United States)

    2005-01-25

    Communicating Across Cultures, (Belmont, CA: Wadsworth Publishing Company, 1996), 24. 44 Ibid. 45 Marieke de Mooij, Global Marketing and Advertising...United States Army Training and Doctrine Command, 1992. De Mooij, Marieke. Global Marketing and Advertising: Understanding Cultural Paradoxes

  4. Intelligence Analysts Need Training on How to Think

    National Research Council Canada - National Science Library

    Hanson, Andrew

    2008-01-01

    .... It has been an ongoing struggle to adapt to unconventional methods. Now that the US is in a new kind of war, it is important to train soldiers not only to win today, but win in the future as well...

  5. Analyst: Soldier fails to sway election / Joel Alas

    Index Scriptorium Estoniae

    Alas, Joel

    2007-01-01

    President Toomas Hendrik Ilves declined to promulgate the act on the removal of a prohibited structure because it conflicts with the constitution. In the assessment of political scientist Vello Pettai, the issue of the Tõnismäe Bronze Soldier is not important to voters.

  6. Cost Estimating Cases: Educational Tools for Cost Analysts

    Science.gov (United States)

    1993-09-01

    only appropriate documentation should be provided. In other words, students should not submit all of the documentation possible using ACEIT, only that...case was their lack of understanding of the ACEIT software used to conduct the estimate. Specifically, many students misinterpreted the cost...estimating relationships (CERs) embedded in the software. Additionally, few of the students were able to properly organize the ACEIT documentation output

  7. When predictions take control: The effect of task predictions on task switching performance

    Directory of Open Access Journals (Sweden)

    Wout Duthoo

    2012-08-01

    Full Text Available In this paper, we aimed to investigate the role of self-generated predictions in the flexible control of behaviour. Therefore, we ran a task switching experiment in which participants were asked to try to predict the upcoming task in three conditions varying in switch rate (30%, 50% and 70%). Irrespective of their predictions, the colour of the target indicated which task participants had to perform. In line with previous studies (Mayr, 2006; Monsell & Mizon, 2006), the switch cost was attenuated as the switch rate increased. Importantly, a clear task repetition bias was found in all conditions, yet the task repetition prediction rate dropped from 78% to 66% to 49% with increasing switch probability in the three conditions. Irrespective of condition, the switch cost was strongly reduced in expectation of a task alternation compared to the cost of an unexpected task alternation following repetition predictions. Hence, our data suggest that the reduction in the switch cost with increasing switch probability is caused by a diminished expectancy for the task to repeat. Taken together, this paper highlights the importance of predictions in the flexible control of behaviour, and suggests a crucial role for task repetition expectancy in the context-sensitive adjustment of task switching performance.

  8. An Overview of Practical Applications of Protein Disorder Prediction and Drive for Faster, More Accurate Predictions.

    Science.gov (United States)

    Deng, Xin; Gumm, Jordan; Karki, Suman; Eickholt, Jesse; Cheng, Jianlin

    2015-07-07

    Protein disordered regions are segments of a protein chain that do not adopt a stable structure. Thus far, a variety of protein disorder prediction methods have been developed and have been widely used, not only in traditional bioinformatics domains, including protein structure prediction, protein structure determination and function annotation, but also in many other biomedical fields. The relationship between intrinsically-disordered proteins and some human diseases has played a significant role in disorder prediction in disease identification and epidemiological investigations. Disordered proteins can also serve as potential targets for drug discovery with an emphasis on the disordered-to-ordered transition in the disordered binding regions, and this has led to substantial research in drug discovery or design based on protein disordered region prediction. Furthermore, protein disorder prediction has also been applied to healthcare by predicting the disease risk of mutations in patients and studying the mechanistic basis of diseases. As the applications of disorder prediction increase, so too does the need to make quick and accurate predictions. To fill this need, we also present a new approach to predict protein residue disorder using wide sequence windows that is applicable on the genomic scale.

  9. An Overview of Practical Applications of Protein Disorder Prediction and Drive for Faster, More Accurate Predictions

    Directory of Open Access Journals (Sweden)

    Xin Deng

    2015-07-01

    Full Text Available Protein disordered regions are segments of a protein chain that do not adopt a stable structure. Thus far, a variety of protein disorder prediction methods have been developed and have been widely used, not only in traditional bioinformatics domains, including protein structure prediction, protein structure determination and function annotation, but also in many other biomedical fields. The relationship between intrinsically-disordered proteins and some human diseases has played a significant role in disorder prediction in disease identification and epidemiological investigations. Disordered proteins can also serve as potential targets for drug discovery with an emphasis on the disordered-to-ordered transition in the disordered binding regions, and this has led to substantial research in drug discovery or design based on protein disordered region prediction. Furthermore, protein disorder prediction has also been applied to healthcare by predicting the disease risk of mutations in patients and studying the mechanistic basis of diseases. As the applications of disorder prediction increase, so too does the need to make quick and accurate predictions. To fill this need, we also present a new approach to predict protein residue disorder using wide sequence windows that is applicable on the genomic scale.

  10. Users’ Information Requirements and Narrative Reporting: The Case of Iranian Companies

    Directory of Open Access Journals (Sweden)

    Bikram Chatterjee

    2010-06-01

    Full Text Available This paper investigates whether the narrative section of Iranian companies' annual reports satisfies the information requirements of financial analysts employed by institutional investors. Taking a group of stakeholders (i.e. financial analysts) as the sample, a questionnaire survey was conducted to identify their top three information needs from the narrative sections of company annual reports in each of three information categories: Present, Analytical and Prospective. Following this survey, a checklist was prepared to analyse whether Iranian companies are disclosing the information required by financial analysts. Overall, the results partially support stakeholder theory, as there is a general lack of information flow on the part of Iranian listed companies in meeting their stakeholders' information needs.

  11. Confidence scores for prediction models

    DEFF Research Database (Denmark)

    Gerds, Thomas Alexander; van de Wiel, MA

    2011-01-01

    In medical statistics, many alternative strategies are available for building a prediction model based on training data. Prediction models are routinely compared by means of their prediction performance in independent validation data. If only one data set is available for training and validation,...

  12. Predicting beta-turns and their types using predicted backbone dihedral angles and secondary structures.

    Science.gov (United States)

    Kountouris, Petros; Hirst, Jonathan D

    2010-07-31

    Beta-turns are secondary structure elements usually classified as coil. Their prediction is important, because of their role in protein folding and their frequent occurrence in protein chains. We have developed a novel method that predicts beta-turns and their types using information from multiple sequence alignments, predicted secondary structures and, for the first time, predicted dihedral angles. Our method uses support vector machines, a supervised classification technique, and is trained and tested on three established datasets of 426, 547 and 823 protein chains. We achieve a Matthews correlation coefficient of up to 0.49, when predicting the location of beta-turns, the highest reported value to date. Moreover, the additional dihedral information improves the prediction of beta-turn types I, II, IV, VIII and "non-specific", achieving correlation coefficients up to 0.39, 0.33, 0.27, 0.14 and 0.38, respectively. Our results are more accurate than other methods. We have created an accurate predictor of beta-turns and their types. Our method, called DEBT, is available online at http://comp.chem.nottingham.ac.uk/debt/.
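    Since the results above are reported as Matthews correlation coefficients, here is a minimal sketch of the metric itself (the confusion-matrix counts are illustrative, not taken from the paper):

    ```python
    import math

    def mcc(tp, tn, fp, fn):
        """Matthews correlation coefficient from a binary confusion matrix.
        Ranges from -1 (total disagreement) through 0 (chance) to +1 (perfect)."""
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        if denom == 0:
            return 0.0   # degenerate matrix: no information, treat as chance level
        return (tp * tn - fp * fn) / denom

    # A hypothetical beta-turn predictor scored over a test set:
    score = mcc(tp=120, tn=700, fp=80, fn=100)   # about 0.46
    ```

    The MCC is the standard choice for beta-turn prediction precisely because turns are a minority class: unlike raw accuracy, it is not inflated by the many easy non-turn residues.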

  13. A Grey NGM(1,1, k) Self-Memory Coupling Prediction Model for Energy Consumption Prediction

    Science.gov (United States)

    Guo, Xiaojun; Liu, Sifeng; Wu, Lifeng; Tang, Lingling

    2014-01-01

    Energy consumption prediction is an important issue for governments, energy sector investors, and other related corporations. Although there are several prediction techniques, selection of the most appropriate technique is of vital importance. For the approximately nonhomogeneous exponential data sequences that often emerge in the energy system, a novel grey NGM(1,1, k) self-memory coupling prediction model is put forward to improve predictive performance. It achieves an organic integration of the self-memory principle of dynamic systems with the grey NGM(1,1, k) model. The traditional grey model's sensitivity to initial values is overcome by the self-memory principle. In this study, the total energy, coal, and electricity consumption of China are used for demonstration with the proposed coupling prediction technique. The results show the superiority of the NGM(1,1, k) self-memory coupling prediction model when compared with results from the literature. Its excellent prediction performance lies in the fact that the proposed coupling model can take full advantage of systematic multi-time historical data and capture the stochastic fluctuation tendency. This work also makes a significant contribution to the enrichment of grey prediction theory and the extension of its application span. PMID:25054174

  14. A Grey NGM(1,1,k) Self-Memory Coupling Prediction Model for Energy Consumption Prediction

    Directory of Open Access Journals (Sweden)

    Xiaojun Guo

    2014-01-01

    Full Text Available Energy consumption prediction is an important issue for governments, energy sector investors, and other related corporations. Although there are several prediction techniques, selection of the most appropriate technique is of vital importance. For the approximately nonhomogeneous exponential data sequences that often emerge in the energy system, a novel grey NGM(1,1,k) self-memory coupling prediction model is put forward to improve predictive performance. It achieves an organic integration of the self-memory principle of dynamic systems with the grey NGM(1,1,k) model. The traditional grey model's sensitivity to initial values is overcome by the self-memory principle. In this study, the total energy, coal, and electricity consumption of China are used for demonstration with the proposed coupling prediction technique. The results show the superiority of the NGM(1,1,k) self-memory coupling prediction model when compared with results from the literature. Its excellent prediction performance lies in the fact that the proposed coupling model can take full advantage of systematic multi-time historical data and capture the stochastic fluctuation tendency. This work also makes a significant contribution to the enrichment of grey prediction theory and the extension of its application span.

  15. From Franchise to Programming: Jobs in Cable Television.

    Science.gov (United States)

    Stanton, Michael

    1985-01-01

    This article takes a look at some of the key jobs at every level of the cable industry. It discusses winning a franchise, building and running the system, and programming and production. Job descriptions include engineer, market analyst, programmers, financial analysts, strand mappers, customer service representatives, access coordinator, and studio…

  16. 78 FR 42583 - Data Collection Available for Public Comments

    Science.gov (United States)

    2013-07-16

    ... participate or seek to participate in the program and is used for portfolio risk management loan monitoring... Curtis B. Rich, Management Analyst, 202-205-7030 [email protected] . Title: ''Gulf Opportunity Pilot..., Management Analyst. [FR Doc. 2013-17003 Filed 7-15-13; 8:45 am] BILLING CODE 8025-01-P ...

  17. Measuring the Effectiveness of Visual Analytics and Data Fusion Techniques on Situation Awareness in Cyber-Security

    Science.gov (United States)

    Giacobe, Nicklaus A.

    2013-01-01

    Cyber-security involves monitoring a complex network of inter-related computers to prevent, identify, and remediate undesired actions. This work is performed in organizations by human analysts. These analysts monitor cyber-security sensors to develop and maintain situation awareness (SA) of both normal and abnormal activities that occur on…

  18. Viral IRES prediction system - a web server for prediction of the IRES secondary structure in silico.

    Directory of Open Access Journals (Sweden)

    Jun-Jie Hong

    Full Text Available The internal ribosomal entry site (IRES) functions as a cap-independent translation initiation site in eukaryotic cells. IRES elements have been applied as useful tools for bi-cistronic expression vectors. Current RNA structure prediction programs are unable to predict the potential IRES element precisely. We have designed a viral IRES prediction system (VIPS) to perform IRES secondary structure prediction. In order to obtain better results for the IRES prediction, the VIPS can evaluate and predict all four different groups of IRESs with higher accuracy. RNA secondary structure prediction, comparison, and pseudoknot prediction programs were implemented to form the three-stage procedure of the VIPS. The backbone of VIPS includes: the RNALfold program, aimed to predict local RNA secondary structures by the minimum free energy method; the RNA Align program, intended to compare predicted structures; and the pknotsRG program, used to calculate pseudoknot structures. VIPS was evaluated using the UTR database, the IRES database and the Virus database, and the accuracy rate of VIPS was assessed as 98.53%, 90.80%, 82.36% and 80.41% for IRES groups 1, 2, 3, and 4, respectively. This useful search approach for IRES structures will facilitate IRES-related studies. The VIPS on-line website service is available at http://140.135.61.250/vips/.

  19. Numerical prediction of rose growth

    NARCIS (Netherlands)

    Bernsen, E.; Bokhove, Onno; van der Sar, D.M.

    2006-01-01

    A new mathematical model is presented for the prediction of rose growth in a greenhouse. Given the measured ambient environmental conditions, the model consists of a local photosynthesis model, predicting the photosynthesis per unit leaf area, coupled to a global greenhouse model, which predicts the

  20. Seismology for rockburst prediction.

    CSIR Research Space (South Africa)

    De Beer, W

    2000-02-01

    Full Text Available project GAP409 presents a method (SOOTHSAY) for predicting larger mining induced seismic events in gold mines, as well as a pattern recognition algorithm (INDICATOR) for characterising the seismic response of rock to mining and inferring future... State. Defining the time series of a specific function on a catalogue as a prediction strategy, the algorithm currently has a success rate of 53% and 65%, respectively, of large events claimed as being predicted in these two cases, with uncertainties...

  1. Collective motion of predictive swarms.

    Directory of Open Access Journals (Sweden)

    Nathaniel Rupprecht

    Full Text Available Theoretical models of populations and swarms typically start with the assumption that the motion of agents is governed by the local stimuli. However, an intelligent agent, with some understanding of the laws that govern its habitat, can anticipate the future, and make predictions to gather resources more efficiently. Here we study a specific model of this kind, where agents aim to maximize their consumption of a diffusing resource, by attempting to predict the future of a resource field and the actions of other agents. Once the agents make a prediction, they are attracted to move towards regions that have, and will have, denser resources. We find that the further the agents attempt to see into the future, the more their attempts at prediction fail, and the less resources they consume. We also study the case where predictive agents compete against non-predictive agents and find the predictors perform better than the non-predictors only when their relative numbers are very small. We conclude that predictivity pays off either when the predictors do not see too far into the future or the number of predictors is small.

  2. Neural Networks for protein Structure Prediction

    DEFF Research Database (Denmark)

    Bohr, Henrik

    1998-01-01

    This is a review about neural network applications in bioinformatics. Especially the applications to protein structure prediction, e.g. prediction of secondary structures, prediction of surface structure, fold class recognition and prediction of the 3-dimensional structure of protein backbones...

  3. Human-machine interaction to disambiguate entities in unstructured text and structured datasets

    Science.gov (United States)

    Ward, Kevin; Davenport, Jack

    2017-05-01

    Creating entity network graphs is a manual, time-consuming process for an intelligence analyst. Beyond the traditional big data problems of information overload, individuals are often referred to by multiple names and shifting titles as they advance in their organizations over time, which quickly makes simple string or phonetic alignment methods for entities insufficient. Conversely, automated methods for relationship extraction and entity disambiguation typically produce questionable results with no way for users to vet results, correct mistakes or influence the algorithm's future results. We present an entity disambiguation tool, DRADIS, which aims to bridge the gap between human-centric and machine-centric methods. DRADIS automatically extracts entities from multi-source datasets and models them as a complex set of attributes and relationships. Entities are disambiguated across the corpus using a hierarchical model executed in Spark, allowing it to scale to operational-sized data. Resolution results are presented to the analyst complete with sourcing information for each mention and relationship, allowing analysts to quickly vet the correctness of results as well as correct mistakes. Corrected results are used by the system to refine the underlying model, allowing analysts to optimize the general model to better deal with their operational data. Providing analysts with the ability to validate and correct the model to produce a system they can trust enables them to better focus their time on producing higher-quality analysis products.

  4. The GIS and data solution for advanced business analysis

    Directory of Open Access Journals (Sweden)

    Carmen RADUT

    2009-12-01

    Full Text Available The GIS Business Analyst is a suite of Geographic Information System (GIS)-enabled tools, wizards, and data that provides business professionals with a complete solution for site evaluation, selective customer profiling, and trade area market analysis. Running simple reports, mapping the results, and performing complex probability models are among the capabilities The GIS Business Analyst offers in one affordable desktop analysis solution. Data and analyses produced by The GIS Business Analyst can be shared across departments, reducing redundant research and marketing efforts, speeding analysis of results, and increasing employee efficiency. The GIS Business Analyst is the first suite of tools for unlocking the intelligence of geography, demographic, consumer lifestyle, and business data. It is a valuable asset for business decision making such as analyzing market share and competition, determining new site expansions or reductions, and targeting new customers. The ability to analyze and visualize the geographic component of business data reveals trends, patterns, and opportunities hidden in tabular data. By combining information, such as sales data of the organization, customer information, and competitor locations, with geographic data, such as demographics, territories, or store locations, the GIS Business Analyst helps the user better understand the organization's market, customers, and competition. The business intelligence systems bring geographic information systems, marketing analysis tools, and demographic data products together to offer the user powerful ways to compete in today's business strategies.

  5. The learning curve, interobserver, and intraobserver agreement of endoscopic confocal laser endomicroscopy in the assessment of mucosal barrier defects.

    Science.gov (United States)

    Chang, Jeff; Ip, Matthew; Yang, Michael; Wong, Brendon; Power, Theresa; Lin, Lisa; Xuan, Wei; Phan, Tri Giang; Leong, Rupert W

    2016-04-01

    Confocal laser endomicroscopy can dynamically assess intestinal mucosal barrier defects and increased intestinal permeability (IP). These are functional features that do not have corresponding appearance on histopathology. As such, previous pathology training may not be beneficial in learning these dynamic features. This study aims to evaluate the diagnostic accuracy, learning curve, inter- and intraobserver agreement for identifying features of increased IP in experienced and inexperienced analysts and pathologists. A total of 180 endoscopic confocal laser endomicroscopy (Pentax EC-3870FK; Pentax, Tokyo, Japan) images of the terminal ileum, subdivided into 6 sets of 30 were evaluated by 6 experienced analysts, 13 inexperienced analysts, and 2 pathologists, after a 30-minute teaching session. Cell-junction enhancement, fluorescein leak, and cell dropout were used to represent increased IP and were either present or absent in each image. For each image, the diagnostic accuracy, confidence, and quality were assessed. Diagnostic accuracy was significantly higher for experienced analysts compared with inexperienced analysts from the first set (96.7% vs 83.1%, P 0.86 for experienced observers. Features representative of increased IP can be rapidly learned with high inter- and intraobserver agreement. Confidence and image quality were significant predictors of accurate interpretation. Previous pathology training did not have an effect on learning. Copyright © 2016 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  6. Psychotherapy in the aesthetic attitude.

    Science.gov (United States)

    Beebe, John

    2010-04-01

    Drawing upon the writings of Jungian analyst Joseph Henderson on unconscious attitudes toward culture that patients and analysts may bring to therapy, the author defines the aesthetic attitude as one of the basic ways that cultural experience is instinctively accessed and processed so that it can become part of an individual's self experience. In analytic treatment, the aesthetic attitude emerges as part of what Jung called the transcendent function to create new symbolic possibilities for the growth of consciousness. It can provide creative opportunities for new adaptation where individuation has become stuck in unconscious complexes, both personal and cultural. In contrast to formulations that have compared depth psychotherapy to religious ritual, philosophic discourse, and renewal of socialization, this paper focuses upon the considerations of beauty that make psychotherapy also an art. In psychotherapeutic work, the aesthetic attitude confronts both analyst and patient with the problem of taste, affects how the treatment is shaped and 'framed', and can grant a dimension of grace to the analyst's mirroring of the struggles that attend the patient's effort to be a more smoothly functioning human being. The patient may learn to extend the same grace to the analyst's fumbling attempts to be helpful. The author suggests that the aesthetic attitude is thus a help in the resolution of both countertransference and transference en route to psychological healing.

  7. Data Intensive Architecture for Scalable Cyber Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Bryan K.; Johnson, John R.; Critchlow, Terence J.

    2011-12-19

    Cyber analysts are tasked with the identification and mitigation of network exploits and threats. These compromises are difficult to identify due to the characteristics of cyber communication, the volume of traffic, and the duration of possible attack. In this paper, we describe a prototype implementation designed to provide cyber analysts an environment where they can interactively explore a month’s worth of cyber security data. This prototype utilized On-Line Analytical Processing (OLAP) techniques to present a data cube to the analysts. The cube provides a summary of the data, allowing trends to be easily identified as well as the ability to easily pull up the original records comprising an event of interest. The cube was built using SQL Server Analysis Services (SSAS), with the interface to the cube provided by Tableau. This software infrastructure was supported by a novel hardware architecture comprising a Netezza TwinFin® for the underlying data warehouse and a cube server with a FusionIO drive hosting the data cube. We evaluated this environment on a month’s worth of artificial, but realistic, data using multiple queries provided by our cyber analysts. As our results indicate, OLAP technology has progressed to the point where it is in a unique position to provide novel insights to cyber analysts, as long as it is supported by an appropriate data intensive architecture.
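
    An OLAP data cube of this kind can be approximated in miniature with a plain dictionary roll-up. The sketch below is illustrative only; the prototype itself used SSAS, Tableau, and a Netezza warehouse, none of which appear here.

```python
from collections import defaultdict

def rollup(records, dims):
    """Aggregate event counts along the chosen dimensions, mimicking one
    face of an OLAP data cube: adding a dimension "drills down" to finer
    detail, dropping one "rolls up" to a coarser summary."""
    cube = defaultdict(int)
    for rec in records:
        key = tuple(rec[d] for d in dims)
        cube[key] += 1
    return dict(cube)
```

    Summarizing over ("day",) surfaces trends, while re-running with ("day", "src") drills down toward the original records comprising an event of interest.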

  8. Prediction of resource volumes at untested locations using simple local prediction models

    Science.gov (United States)

    Attanasi, E.D.; Coburn, T.C.; Freeman, P.A.

    2006-01-01

    This paper shows how local spatial nonparametric prediction models can be applied to estimate volumes of recoverable gas resources at individual undrilled sites, at multiple sites on a regional scale, and to compute confidence bounds for regional volumes based on the distribution of those estimates. An approach that combines cross-validation, the jackknife, and bootstrap procedures is used to accomplish this task. Simulation experiments show that cross-validation can be applied beneficially to select an appropriate prediction model. The cross-validation procedure worked well for a wide range of different states of nature and levels of information. Jackknife procedures are used to compute individual prediction estimation errors at undrilled locations. The jackknife replicates also are used with a bootstrap resampling procedure to compute confidence bounds for the total volume. The method was applied to data (partitioned into a training set and target set) from the Devonian Antrim Shale continuous-type gas play in the Michigan Basin in Otsego County, Michigan. The analysis showed that the model estimate of total recoverable volumes at prediction sites is within 4 percent of the total observed volume. The model predictions also provide frequency distributions of the cell volumes at the production unit scale. Such distributions are the basis for subsequent economic analyses. © Springer Science+Business Media, LLC 2007.
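
    The bootstrap step for regional confidence bounds can be sketched as follows. This is a generic percentile bootstrap over per-site volume estimates, not the paper's exact procedure (which additionally combines cross-validation and jackknife replicates):

```python
import random

def bootstrap_total_bounds(volumes, n_boot=2000, alpha=0.10, seed=42):
    """Percentile-bootstrap confidence bounds for a regional total volume,
    given per-site estimates. Resamples sites with replacement and reads
    the (alpha/2, 1 - alpha/2) quantiles of the resampled totals."""
    rng = random.Random(seed)
    n = len(volumes)
    totals = []
    for _ in range(n_boot):
        sample = [volumes[rng.randrange(n)] for _ in range(n)]
        totals.append(sum(sample))
    totals.sort()
    lo = totals[int(alpha / 2 * n_boot)]
    hi = totals[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

    The resulting interval brackets the regional total implied by the site-level estimates; a narrower interval signals a more stable regional assessment.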

  9. Adaptive Outlier-tolerant Exponential Smoothing Prediction Algorithms with Applications to Predict the Temperature in Spacecraft

    OpenAIRE

    Hu Shaolin; Zhang Wei; Li Ye; Fan Shunxi

    2011-01-01

    The exponential smoothing prediction algorithm is widely used in spaceflight control and in process monitoring as well as in economical prediction. There are two key conundrums which are open: one is about the selective rule of the parameter in the exponential smoothing prediction, and the other is how to improve the bad influence of outliers on prediction. In this paper a new practical outlier-tolerant algorithm is built to select adaptively proper parameter, and the exponential smoothing pr...
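
    One common way to make exponential smoothing outlier-tolerant is to clip each residual against a running scale estimate before updating the level. The sketch below illustrates that general idea only; it does not reproduce the paper's adaptive parameter-selection rule, and the constants are arbitrary.

```python
def smooth(series, alpha=0.3, k=3.0):
    """Exponentially smooth `series`, damping points whose residual
    exceeds k times a running estimate of the typical residual size."""
    s = series[0]      # initial level
    scale = 1e-9       # running mean absolute residual
    out = [s]
    for x in series[1:]:
        r = x - s
        # Clip an outlying residual instead of letting it swing the level.
        if abs(r) > k * scale and scale > 1e-8:
            r = k * scale * (1 if r > 0 else -1)
        s = s + alpha * r
        scale = 0.9 * scale + 0.1 * abs(r)
        out.append(s)
    return out
```

    With the clip active, a single spurious spike barely moves the smoothed level; with clipping disabled (very large k), the same spike drags the level far off.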

  10. Prediction of interannual climate variations

    International Nuclear Information System (INIS)

    Shukla, J.

    1993-01-01

    It has been known for some time that the behavior of the short-term fluctuations of the earth's atmosphere resembles that of a chaotic non-linear dynamical system, and that the day-to-day weather cannot be predicted beyond a few weeks. However, it has also been found that the interactions of the atmosphere with the underlying oceans and the land surfaces can produce fluctuations whose time scales are much longer than the limits of deterministic prediction of weather. It is, therefore, natural to ask whether it is possible that the seasonal and longer time averages of climate fluctuations can be predicted with sufficient skill to be beneficial for social and economic applications, even though the details of day-to-day weather cannot be predicted beyond a few weeks. The main objective of the workshop was to address this question by assessing the current state of knowledge on predictability of seasonal and interannual climate variability and to investigate various possibilities for its prediction. (orig./KW)

  11. Dynamical Predictability of Monthly Means.

    Science.gov (United States)

    Shukla, J.

    1981-12-01

    We have attempted to determine the theoretical upper limit of dynamical predictability of monthly means for prescribed nonfluctuating external forcings. We have extended the concept of `classical' predictability, which primarily refers to the lack of predictability due mainly to the instabilities of synoptic-scale disturbances, to the predictability of time averages, which are determined by the predictability of low-frequency planetary waves. We have carried out 60-day integrations of a global general circulation model with nine different initial conditions but identical boundary conditions of sea surface temperature, snow, sea ice and soil moisture. Three of these initial conditions are the observed atmospheric conditions on 1 January of 1975, 1976 and 1977. The other six initial conditions are obtained by superimposing over the observed initial conditions a random perturbation comparable to the errors of observation. The root-mean-square (rms) error of random perturbations at all the grid points and all the model levels is 3 m s⁻¹ in u and v components of wind. The rms vector wind error between the observed initial conditions is >15 m s⁻¹. It is hypothesized that for a given averaging period, if the rms error among the time averages predicted from largely different initial conditions becomes comparable to the rms error among the time averages predicted from randomly perturbed initial conditions, the time averages are dynamically unpredictable. We have carried out the analysis of variance to compare the variability, among the three groups, due to largely different initial conditions, and within each group due to random perturbations. It is found that the variances among the first 30-day means, predicted from largely different initial conditions, are significantly different from the variances due to random perturbations in the initial conditions, whereas the variances among 30-day means for days 31-60 are not distinguishable from the variances due to random initial

  12. Predictive Analytics in Information Systems Research

    OpenAIRE

    Shmueli, Galit; Koppius, Otto

    2011-01-01

    textabstractThis research essay highlights the need to integrate predictive analytics into information systems research and shows several concrete ways in which this goal can be accomplished. Predictive analytics include empirical methods (statistical and other) that generate data predictions as well as methods for assessing predictive power. Predictive analytics not only assist in creating practically useful models, they also play an important role alongside explanatory modeling in theory bu...

  13. Prediction of intermetallic compounds

    International Nuclear Information System (INIS)

    Burkhanov, Gennady S; Kiselyova, N N

    2009-01-01

    The problems of predicting not yet synthesized intermetallic compounds are discussed. It is noted that the use of classical physicochemical analysis in the study of multicomponent metallic systems is faced with the complexity of presenting multidimensional phase diagrams. One way of predicting new intermetallics with specified properties is the use of modern processing technology with application of teaching of image recognition by the computer. The algorithms used most often in these methods are briefly considered and the efficiency of their use for predicting new compounds is demonstrated.

  14. Learned predictiveness and outcome predictability effects are not simply two sides of the same coin.

    Science.gov (United States)

    Thorwart, Anna; Livesey, Evan J; Wilhelm, Francisco; Liu, Wei; Lachnit, Harald

    2017-10-01

    The Learned Predictiveness effect refers to the observation that learning about the relationship between a cue and an outcome is influenced by the predictive relevance of the cue for other outcomes. Similarly, the Outcome Predictability effect refers to a recent observation that the previous predictability of an outcome affects learning about this outcome in new situations, too. We hypothesize that both effects may be two manifestations of the same phenomenon, and that stimuli that have been involved in highly predictive relationships may be learned about faster when they are involved in new relationships, regardless of their functional role in predictive learning as cues and outcomes. Four experiments manipulated both the relationships and the function of the stimuli. While we were able to replicate the standard effects, they did not survive a transfer to situations where the functional role of the stimuli changed, that is, where the outcome of the first phase becomes a cue in the second learning phase or the cue of the first phase becomes the outcome of the second phase. Furthermore, unlike learned predictiveness, there was little indication that the distribution of overt attention in the second phase was influenced by previous predictability. The results suggest that these 2 very similar effects are not manifestations of a more general phenomenon but rather independent of each other. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Estimating Model Prediction Error: Should You Treat Predictions as Fixed or Random?

    Science.gov (United States)

    Wallach, Daniel; Thorburn, Peter; Asseng, Senthold; Challinor, Andrew J.; Ewert, Frank; Jones, James W.; Rotter, Reimund; Ruane, Alexander

    2016-01-01

    Crop models are important tools for impact assessment of climate change, as well as for exploring management options under current climate. It is essential to evaluate the uncertainty associated with predictions of these models. We compare two criteria of prediction error: MSEP_fixed, which evaluates mean squared error of prediction for a model with fixed structure, parameters and inputs, and MSEP_uncertain(X), which evaluates mean squared error averaged over the distributions of model structure, inputs and parameters. Comparison of model outputs with data can be used to estimate the former. The latter has a squared bias term, which can be estimated using hindcasts, and a model variance term, which can be estimated from a simulation experiment. The separate contributions to MSEP_uncertain(X) can be estimated using a random effects ANOVA. It is argued that MSEP_uncertain(X) is the more informative uncertainty criterion, because it is specific to each prediction situation.
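
    The squared-bias plus model-variance split described above can be illustrated numerically. This is a minimal sketch on hypothetical data, not the random effects ANOVA of the paper: for each observation we take an ensemble of model predictions, and accumulate the squared bias of the ensemble mean and the within-ensemble variance.

```python
def msep_components(obs, ensemble_preds):
    """Split mean squared error of prediction into a squared-bias term
    (ensemble mean vs. observation) and a model-variance term
    (spread of the ensemble around its mean), averaged over cases."""
    n = len(obs)
    bias_sq = 0.0
    var = 0.0
    for y, preds in zip(obs, ensemble_preds):
        m = sum(preds) / len(preds)
        bias_sq += (m - y) ** 2
        var += sum((p - m) ** 2 for p in preds) / len(preds)
    return bias_sq / n, var / n
```

    An ensemble that is centered on the observation but widely spread contributes only to the variance term, while a tight but miscentered ensemble contributes only to the squared bias.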

  16. How (not) to Lie with Benefit-Cost Analysis

    OpenAIRE

    Scott Farrow

    2013-01-01

    Benefit-cost analysis is seen by some as a controversial activity in which the analyst can significantly bias the results. This note highlights some of the ways that analysts can "lie" in a benefit-cost analysis but more importantly, provides guidance on how not to lie and how to better inform public decisionmakers.

  17. Characteristics of the Navy Laboratory Warfare Center Technical Workforce

    Science.gov (United States)

    2013-09-29

    Mathematics and Information Science (M&IS): Actuarial Science (1510), Computer Science (1550), Gen. Math & Statistics (1501), Mathematics (1520), Operations…Admin., Network Systems & Data Communication Analysts, Actuaries, Mathematicians, Operations Research Analysts, Statisticians; Social Science (SS)…The workforce was sub-divided into six broad occupational groups: Life Science, Physical Science, Engineering, Mathematics, Computer Science and Information

  18. The Allocation of Visual Attention in Multimedia Search Interfaces

    Science.gov (United States)

    Hughes, Edith Allen

    2017-01-01

    Multimedia analysts are challenged by the massive numbers of unconstrained video clips generated daily. Such clips can include any possible scene and events, and generally have limited quality control. Analysts who must work with such data are overwhelmed by its volume and lack of computational tools to probe it effectively. Even with advances…

  19. Peak-summer East Asian rainfall predictability and prediction part II: extratropical East Asia

    Science.gov (United States)

    Yim, So-Young; Wang, Bin; Xing, Wen

    2016-07-01

    Part II of the present study focuses on northern East Asia (NEA: 26°N-50°N, 100°-140°E), exploring the source and limit of the predictability of the peak summer (July-August) rainfall. Prediction of NEA peak summer rainfall is extremely challenging because of the exposure of the NEA to midlatitude influence. By examining four coupled climate models' multi-model ensemble (MME) hindcast during 1979-2010, we found that the domain-averaged MME temporal correlation coefficient (TCC) skill is only 0.13. It is unclear whether the dynamical models' poor skills are due to limited predictability of the peak-summer NEA rainfall. In the present study we attempted to address this issue by applying the predictable mode analysis method using 35-year observations (1979-2013). Four empirical orthogonal modes of variability and associated major potential sources of variability are identified: (a) an equatorial western Pacific (EWP)-NEA teleconnection driven by EWP sea surface temperature (SST) anomalies, (b) a western Pacific subtropical high and Indo-Pacific dipole SST feedback mode, (c) a central Pacific-El Nino-Southern Oscillation mode, and (d) a Eurasian wave train pattern. Physically meaningful predictors for each principal component (PC) were selected based on analysis of the lead-lag correlations with the persistent and tendency fields of SST and sea-level pressure from March to June. A suite of physical-empirical (P-E) models is established to predict the four leading PCs. The peak summer rainfall anomaly pattern is then objectively predicted by using the predicted PCs and the corresponding observed spatial patterns. A 35-year cross-validated hindcast over the NEA yields a domain-averaged TCC skill of 0.36, which is significantly higher than the MME dynamical hindcast (0.13). The estimated maximum potential attainable TCC skill averaged over the entire domain is around 0.61, suggesting that the current dynamical prediction models may have considerable room for improvement
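
    The TCC skill quoted above is, at each grid point, the Pearson correlation between the hindcast and observed time series. A minimal implementation for one grid point might look like the following (generic formula; the study's cross-validation machinery is not shown):

```python
import math

def tcc(obs, pred):
    """Temporal correlation coefficient: Pearson correlation between
    observed and hindcast series at one grid point. Assumes neither
    series is constant (otherwise the denominator is zero)."""
    n = len(obs)
    mo = sum(obs) / n
    mp = sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    return cov / (so * sp)
```

    Domain-averaged skill, as reported in the abstract, is then the mean of this quantity over all grid points.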

  20. Predicting child maltreatment: A meta-analysis of the predictive validity of risk assessment instruments.

    Science.gov (United States)

    van der Put, Claudia E; Assink, Mark; Boekhout van Solinge, Noëlle F

    2017-11-01

    Risk assessment is crucial in preventing child maltreatment since it can identify high-risk cases in need of child protection intervention. Despite widespread use of risk assessment instruments in child welfare, it is unknown how well these instruments predict maltreatment and what instrument characteristics are associated with higher levels of predictive validity. Therefore, a multilevel meta-analysis was conducted to examine the predictive accuracy of (characteristics of) risk assessment instruments. A literature search yielded 30 independent studies (N=87,329) examining the predictive validity of 27 different risk assessment instruments. From these studies, 67 effect sizes could be extracted. Overall, a medium significant effect was found (AUC=0.681), indicating a moderate predictive accuracy. Moderator analyses revealed that onset of maltreatment can be better predicted than recurrence of maltreatment, which is a promising finding for early detection and prevention of child maltreatment. In addition, actuarial instruments were found to outperform clinical instruments. To bring risk and needs assessment in child welfare to a higher level, actuarial instruments should be further developed and strengthened by distinguishing risk assessment from needs assessment and by integrating risk assessment with case management. Copyright © 2017 Elsevier Ltd. All rights reserved.
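
    The AUC reported here has a direct probabilistic reading: the chance that a randomly chosen maltreatment case receives a higher risk score than a randomly chosen non-case. A small illustration of that computation via the rank-sum (Mann-Whitney) formulation, using hypothetical scores rather than data from the meta-analysis:

```python
def auc(scores, labels):
    """AUC as the probability that a random positive case outranks a
    random negative case; ties count as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

    On this scale, 0.5 is chance-level discrimination, so the pooled AUC of 0.681 sits roughly midway between chance and perfect separation.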

  1. Integrating geophysics and hydrology for reducing the uncertainty of groundwater model predictions and improved prediction performance

    DEFF Research Database (Denmark)

    Christensen, Nikolaj Kruse; Christensen, Steen; Ferre, Ty

    the integration of geophysical data in the construction of a groundwater model increases the prediction performance. We suggest that modelers should perform a hydrogeophysical “test-bench” analysis of the likely value of geophysics data for improving groundwater model prediction performance before actually...... and the resulting predictions can be compared with predictions from the ‘true’ model. By performing this analysis we expect to give the modeler insight into how the uncertainty of model-based prediction can be reduced.......A major purpose of groundwater modeling is to help decision-makers in efforts to manage the natural environment. Increasingly, it is recognized that both the predictions of interest and their associated uncertainties should be quantified to support robust decision making. In particular, decision...

  2. Aircraft noise prediction program theoretical manual: Rotorcraft System Noise Prediction System (ROTONET), part 4

    Science.gov (United States)

    Weir, Donald S.; Jumper, Stephen J.; Burley, Casey L.; Golub, Robert A.

    1995-01-01

    This document describes the theoretical methods used in the rotorcraft noise prediction system (ROTONET), which is a part of the NASA Aircraft Noise Prediction Program (ANOPP). The ANOPP code consists of an executive, database manager, and prediction modules for jet engine, propeller, and rotor noise. The ROTONET subsystem contains modules for the prediction of rotor airloads and performance with momentum theory and prescribed wake aerodynamics, rotor tone noise with compact chordwise and full-surface solutions to the Ffowcs-Williams-Hawkings equations, semiempirical airfoil broadband noise, and turbulence ingestion broadband noise. Flight dynamics, atmosphere propagation, and noise metric calculations are covered in NASA TM-83199, Parts 1, 2, and 3.

  3. Time-predictable Stack Caching

    DEFF Research Database (Denmark)

    Abbaspourseyedi, Sahar

    completely. Thus, in systems with hard deadlines the worst-case execution time (WCET) of the real-time software running on them needs to be bounded. Modern architectures use features such as pipelining and caches for improving the average performance. These features, however, make the WCET analysis more...... addresses, provides an opportunity to predict and tighten the WCET of accesses to data in caches. In this thesis, we introduce the time-predictable stack cache design and implementation within a time-predictable processor. We introduce several optimizations to our design for tightening the WCET while...... keeping the time-predictability of the design intact. Moreover, we provide a solution for reducing the cost of context switching in a system using the stack cache. In the design of these caches, we use custom hardware and compiler support for delivering time-predictable stack data accesses. Furthermore...

  4. Implementation of short-term prediction

    Energy Technology Data Exchange (ETDEWEB)

    Landberg, L; Joensen, A; Giebel, G [and others]

    1999-03-01

    This paper will give a general overview of the results from an EU JOULE funded project (`Implementing short-term prediction at utilities`, JOR3-CT95-0008). Reference will be given to specialised papers where applicable. The goal of the project was to implement wind farm power output prediction systems in operational environments at a number of utilities in Europe. Two models were developed, one by Risoe and one by the Technical University of Denmark (DTU). Both prediction models used HIRLAM predictions from the Danish Meteorological Institute (DMI). (au) EFP-94; EU-JOULE. 11 refs.

  5. Earthquake prediction by Kiana Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of the earliest desires of man. Scientists have worked hard for a long time to predict earthquakes. The results of these efforts can generally be divided into two methods of prediction: 1) the statistical method, and 2) the empirical method. In the first method, earthquakes are predicted using statistics and probabilities, while the second method utilizes a variety of precursors for earthquake prediction. The latter method is time consuming and more costly. However, the results of neither method have been fully satisfactory so far. In this paper a new method entitled the 'Kiana Method' is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method the electrical and magnetic precursors are measured in an area. Then, the time and the magnitude of a future earthquake are calculated using electrical, and in particular, electrical-capacitor formulas. In this method, by daily measurement of electrical resistance in an area we determine whether or not the area is prone to a future earthquake. If the result is positive, the occurrence time and the magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  6. Audiovisual biofeedback improves motion prediction accuracy.

    Science.gov (United States)

    Pollock, Sean; Lee, Danny; Keall, Paul; Kim, Taeho

    2013-04-01

    The accuracy of motion prediction, utilized to overcome the system latency of motion management radiotherapy systems, is hampered by irregularities present in the patients' respiratory pattern. Audiovisual (AV) biofeedback has been shown to reduce respiratory irregularities. The aim of this study was to test the hypothesis that AV biofeedback improves the accuracy of motion prediction. An AV biofeedback system combined with real-time respiratory data acquisition and MR images was implemented in this project. One-dimensional respiratory data from (1) the abdominal wall (30 Hz) and (2) the thoracic diaphragm (5 Hz) were obtained from 15 healthy human subjects across 30 studies. The subjects were required to breathe with and without the guidance of AV biofeedback during each study. The obtained respiratory signals were then implemented in a kernel density estimation prediction algorithm. For each of the 30 studies, five different prediction times ranging from 50 to 1400 ms were tested (150 predictions performed). Prediction error was quantified as the root mean square error (RMSE); the RMSE was calculated from the difference between the real and predicted respiratory data. The statistical significance of the prediction results was determined by the Student's t-test. Prediction accuracy was considerably improved by the implementation of AV biofeedback. Of the 150 respiratory predictions performed, prediction accuracy was improved 69% (103/150) of the time for abdominal wall data, and 78% (117/150) of the time for diaphragm data. The average reduction in RMSE due to AV biofeedback over unguided respiration was 26%. These results demonstrate that AV biofeedback improves prediction accuracy, which would result in increased efficiency of motion management techniques affected by system latencies used in radiotherapy.
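
    The RMSE metric used above can be sketched directly: the error is computed from paired real and predicted respiratory samples. A minimal illustration with toy traces (arbitrary units, not data from the study):

```python
import math

def rmse(real, predicted):
    """Root mean square error between paired real and predicted samples."""
    if len(real) != len(predicted):
        raise ValueError("signals must have equal length")
    return math.sqrt(sum((r - p) ** 2 for r, p in zip(real, predicted)) / len(real))

# Toy respiratory traces: one clean cycle vs. a slightly off prediction.
real = [0.0, 1.0, 0.0, -1.0]
pred = [0.1, 0.9, -0.1, -0.9]
print(round(rmse(real, pred), 6))  # 0.1
```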

  7. The trickle-down effect of predictability: Secondary task performance benefits from predictability in the primary task.

    Directory of Open Access Journals (Sweden)

    Magdalena Ewa Król

    Full Text Available Predictions optimize processing by reducing the allocation of attentional resources to expected or predictable sensory data. Our study demonstrates that these saved processing resources can then be used on concurrent stimuli, and in consequence improve their processing and encoding. We illustrate this "trickle-down" effect with a dual task, where the primary task varied in terms of predictability. The primary task involved detection of a pre-specified symbol that appeared at some point of a short video of a dot moving along a random, semi-predictable or predictable trajectory. The concurrent secondary task involved memorization of photographs representing either emotionally neutral or non-neutral (social or threatening) content. Performance in the secondary task was measured by a memory test. We found that participants allocated more attention to unpredictable (random and semi-predictable) stimuli than to predictable stimuli. Additionally, when the stimuli in the primary task were more predictable, participants performed better in the secondary task, as evidenced by higher sensitivity in the memory test. Finally, social or threatening stimuli were allocated more "looking time" and a larger number of saccades than neutral stimuli. This effect was stronger for the threatening stimuli than social stimuli. Thus, predictability of environmental input is used in optimizing the allocation of attentional resources, which trickles down and benefits the processing of concurrent stimuli.

  8. The trickle-down effect of predictability: Secondary task performance benefits from predictability in the primary task.

    Science.gov (United States)

    Król, Magdalena Ewa; Król, Michał

    2017-01-01

    Predictions optimize processing by reducing the allocation of attentional resources to expected or predictable sensory data. Our study demonstrates that these saved processing resources can then be used on concurrent stimuli, and in consequence improve their processing and encoding. We illustrate this "trickle-down" effect with a dual task, where the primary task varied in terms of predictability. The primary task involved detection of a pre-specified symbol that appeared at some point of a short video of a dot moving along a random, semi-predictable or predictable trajectory. The concurrent secondary task involved memorization of photographs representing either emotionally neutral or non-neutral (social or threatening) content. Performance in the secondary task was measured by a memory test. We found that participants allocated more attention to unpredictable (random and semi-predictable) stimuli than to predictable stimuli. Additionally, when the stimuli in the primary task were more predictable, participants performed better in the secondary task, as evidenced by higher sensitivity in the memory test. Finally, social or threatening stimuli were allocated more "looking time" and a larger number of saccades than neutral stimuli. This effect was stronger for the threatening stimuli than social stimuli. Thus, predictability of environmental input is used in optimizing the allocation of attentional resources, which trickles down and benefits the processing of concurrent stimuli.

  9. NOAA's National Air Quality Prediction and Development of Aerosol and Atmospheric Composition Prediction Components for NGGPS

    Science.gov (United States)

    Stajner, I.; McQueen, J.; Lee, P.; Stein, A. F.; Wilczak, J. M.; Upadhayay, S.; daSilva, A.; Lu, C. H.; Grell, G. A.; Pierce, R. B.

    2017-12-01

    NOAA's operational air quality predictions of ozone, fine particulate matter (PM2.5) and wildfire smoke over the United States and airborne dust over the contiguous 48 states are distributed at http://airquality.weather.gov. The National Air Quality Forecast Capability (NAQFC) providing these predictions was updated in June 2017. Ozone and PM2.5 predictions are now produced using the system linking the Community Multiscale Air Quality model (CMAQ) version 5.0.2 with meteorological inputs from the North American Mesoscale Forecast System (NAM) version 4. Predictions of PM2.5 include intermittent dust emissions and wildfire emissions from an updated version of BlueSky system. For the latter, the CMAQ system is initialized by rerunning it over the previous 24 hours to include wildfire emissions at the time when they were observed from the satellites. Post processing to reduce the bias in PM2.5 prediction was updated using the Kalman filter analog (KFAN) technique. Dust related aerosol species at the CMAQ domain lateral boundaries now come from the NEMS Global Aerosol Component (NGAC) v2 predictions. Further development of NAQFC includes testing of CMAQ predictions to 72 hours, Canadian fire emissions data from Environment and Climate Change Canada (ECCC) and the KFAN technique to reduce bias in ozone predictions. NOAA is developing the Next Generation Global Predictions System (NGGPS) with an aerosol and gaseous atmospheric composition component to improve and integrate aerosol and ozone predictions and evaluate their impacts on physics, data assimilation and weather prediction. Efforts are underway to improve cloud microphysics, investigate aerosol effects and include representations of atmospheric composition of varying complexity into NGGPS: from the operational ozone parameterization, GOCART aerosols, with simplified ozone chemistry, to CMAQ chemistry with aerosol modules. We will present progress on community building, planning and development of NGGPS.
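
    The KFAN post-processing mentioned above is an analog-based bias correction: the observed errors of past forecasts most similar to the current one are used to debias it. A minimal sketch of the analog-averaging idea (the operational KFAN additionally applies a Kalman filter to the analog errors; the function name and data here are illustrative assumptions):

```python
def analog_bias_correction(forecast, past_forecasts, past_observations, k=3):
    """Debias a forecast with the mean error of its k closest past analogs."""
    errors = [f - o for f, o in zip(past_forecasts, past_observations)]
    # Rank historical forecasts by similarity to the current forecast value.
    ranked = sorted(range(len(past_forecasts)),
                    key=lambda i: abs(past_forecasts[i] - forecast))
    analog_bias = sum(errors[i] for i in ranked[:k]) / k
    return forecast - analog_bias

# Illustrative history: forecasts near 10-12 consistently ran ~2 units high.
past_f = [10.0, 12.0, 20.0, 30.0]
past_o = [8.0, 10.0, 19.0, 29.0]
print(round(analog_bias_correction(11.0, past_f, past_o), 3))  # 9.333
```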

  10. Archaeological predictive model set.

    Science.gov (United States)

    2015-03-01

    This report is the documentation for Task 7 of the Statewide Archaeological Predictive Model Set. The goal of this project is to : develop a set of statewide predictive models to assist the planning of transportation projects. PennDOT is developing t...

  11. Where are you, my beloved? On absence, loss, and the enigma of telepathic dreams.

    Science.gov (United States)

    Eshel, Ofra

    2006-12-01

    The subject of dream telepathy (especially patients' telepathic dreams) and related phenomena in the psychoanalytic context has been a controversial, disturbing 'foreign body' ever since it was introduced into psychoanalysis by Freud in 1921. Telepathy, suffering (or intense feeling) at a distance (Greek: pathos + tele), is the transfer or communication of thoughts, impressions and information over distance between two people without the normal operation of the recognized sense organs. The author offers a comprehensive historical review of the psychoanalytic literature on this controversial issue, beginning with Freud's years-long struggles over the possibility of thought-transference and dream telepathy. She then describes her own analytic encounter over the years with five patients' telepathic dreams: dreams involving precise details of the time, place, sensory impressions, and experiential states that the analyst was in at that time, which the patients could not have known through ordinary sensory perception and communication. The author's ensuing explanation combines contributory factors involving the patient, archaic communication and the analyst. Each of these patients, in early childhood, had a mother who was emotionally absent-within-absence, due to the absence of a significant figure in her own life. This primary traumatic loss was imprinted in their nascent selves and inchoate relating to others, with a fixation on a nonverbal, archaic mode of communication. The patient's telepathic dream is formed as a search engine when the analyst is suddenly emotionally absent, in order to find the analyst and thus halt the process of abandonment and prevent collapse into the despair of the early traumatization. Hence, the telepathic dream embodies an enigmatic 'impossible' extreme of patient-analyst deep-level interconnectedness and unconscious communication in the analytic process. This paper is part of the author's endeavour to grasp the true experiential scope and

  12. Savannah River Site human error data base development for nonreactor nuclear facilities

    International Nuclear Information System (INIS)

    Benhardt, H.C.; Held, J.E.; Olsen, L.M.; Vail, R.E.; Eide, S.A.

    1994-01-01

    As part of an overall effort to upgrade and streamline methodologies for safety analyses of nonreactor nuclear facilities at the Savannah River Site (SRS), a human error data base has been developed and is presented in this report. The data base fulfills several needs of risk analysts supporting safety analysis report (SAR) development. First, it provides a single source for probabilities or rates for a wide variety of human errors associated with the SRS nonreactor nuclear facilities. Second, it provides a documented basis for human error probabilities or rates. And finally, it provides actual SRS-specific human error data to support many of the error probabilities or rates. Use of a single, documented reference source for human errors, supported by SRS-specific human error data, will improve the consistency and accuracy of human error modeling by SRS risk analysts. It is envisioned that SRS risk analysts will use this report as both a guide to identifying the types of human errors that may need to be included in risk models such as fault and event trees, and as a source for human error probabilities or rates. For each human error in this report, several different mean probabilities or rates are presented to cover a wide range of conditions and influencing factors. The risk analysts must decide which mean value is most appropriate for each particular application. If other types of human errors are needed for the risk models, the analyst must use other sources. Finally, if human errors are dominant in the quantified risk models (based on the values obtained from this report), then it may be appropriate to perform detailed human reliability analyses (HRAs) for the dominant events. This document does not provide guidance for such refined HRAs; in such cases experienced human reliability analysts should be involved

  13. Projective identification and consciousness alteration: a bridge between psychoanalysis and neuroscience?

    Science.gov (United States)

    Cimino, Cristiana; Correale, Antonello

    2005-02-01

    The authors claim that projective identification in the process of analysis should be considered in a circumscribed manner and seen as a very specific type of communication between the patient and the analyst, characterised through a modality that is simultaneously active, unconscious and discrete. In other words, the patient actively, though unconsciously and discretely--that is, in specific moments of the analysis--brings about particular changes in the analyst's state. From the analyst's side, the effect of this type of communication is a sudden change in his general state--a sense of passivity and coercion and a change in the state of consciousness. This altered consciousness can range from an almost automatic repetition of a relational script to a moderate or serious contraction of the field of attention to full-fledged changes in the analyst's sense of self. The authors propose the theory that this type of communication is, in fact, the expression of traumatic contents of experiences emerging from the non-declarative memory. These contents belong to a pre-symbolic and pre-representative area of the mind. They are made of inert fragments of psychic material that are felt rather than thought, which can thus be viewed as a kind of writing to be completed. These pieces of psychic material are the expression of traumatic experiences that in turn exercise a traumatic effect on the analyst, inducing an altered state of consciousness in him as well. Such material should be understood as belonging to an unrepressed unconscious. Restitution of these fragments to the patient in representable forms must take place gradually and without trying to accelerate the timing, in order to avoid the possibility that the restitution itself constitute an acting on the part of the analyst, which would thus be a traumatic response to the traumatic action of the analytic material.

  14. Continued slide seen for C.I.S. oil production

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    This paper reports that oil production in the Commonwealth of Independent States may dip to 7.7 million b/d next year. Robert Ebel of the Center for Strategic and International Studies, Washington, D.C., made that prediction before a meeting of the National Association of Petroleum Investment Analysts. Oil and Gas Journal's latest worldwide oil production figures peg the C.I.S. volume at 8.689 million b/d last August. Ebel said a September decree will allow oil prices to move in line with the market and with costs of production. That in turn will lead to development of a deregulated domestic oil market

  15. Psychohistory before Hitler: early military analyses of German national psychology.

    Science.gov (United States)

    Bendersky, J W

    1988-04-01

    As part of a grandiose post-World War I psychological project to predict the behavior of nations, the U.S. Military Intelligence Division (MID) utilized racial and social psychological theories to explain an alleged problematic German national character. Though unsuccessful, this project has major significance in the history of psychohistory. For the newly discovered MID files reveal that ideas, attitudes, and biases many psychohistorians subsequently identified as manifestations of a peculiar German national character had previously been held by American officers and reputable psychologists. What American analysts would, in 1940, view as symptoms of a maladjusted German mind, their predecessors had, in 1920, considered valid scientific concepts.

  16. A Case Study of Risk Informed Asset Management (RIAM) for Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lee, Gyoung Cheol; Jeong, Yong Hoon; Chang, Soon Heung; Chung, Dae Wook

    2006-01-01

    Recently, the concern for Nuclear Asset Management (NAM) is increasing in the nuclear industry. Asset management is the management of the financial assets of a company in order to maximize return. However, asset management in the nuclear industry must also consider nuclear safety and risk. Over the past several years, efforts have continued internationally to develop methods, processes, and tools that address safety concerns while maximizing financial assets. Risk Informed Asset Management (RIAM) is a methodology, process, and (eventually) software tool by which analysts review historical performance and develop predictive logic models and data analyses to provide plant managers and company decision-makers with critical quantitative performance indicators

  17. Branch prediction in the pentium family

    DEFF Research Database (Denmark)

    Fog, Agner

    1998-01-01

    How the branch prediction mechanism in the Pentium has been uncovered with all its quirks, and the incredibly more effective branch prediction in the later versions.
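
    Branch predictors of the kind uncovered in the Pentium are built from saturating counters. A textbook two-bit counter (an illustrative scheme, not Fog's reverse-engineered Pentium tables) tolerates a single deviating outcome before flipping its prediction:

```python
class TwoBitPredictor:
    """Two-bit saturating counter: states 0-1 predict not-taken, 2-3 taken."""

    def __init__(self):
        self.state = 1  # start weakly not-taken

    def predict(self):
        return self.state >= 2  # True means "branch taken"

    def update(self, taken):
        # Saturate at 0 and 3 so one odd outcome cannot flip a strong state.
        self.state = min(3, self.state + 1) if taken else max(0, self.state - 1)

# A loop branch that is always taken: only the first prediction misses.
p, hits = TwoBitPredictor(), 0
for outcome in [True] * 4:
    hits += p.predict() == outcome
    p.update(outcome)
print(hits)  # 3
```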

  18. The triadic intersubjective matrix in supervision: the use of disclosure to work through painful affects.

    Science.gov (United States)

    Brown, Lawrence J; Miller, Martin

    2002-08-01

    The use of the psychoanalyst's subjective reactions as a tool to better understand his/her patient has been a central feature of clinical thinking in recent decades. While there has been much discussion and debate about the analyst's use of countertransference in individual psychoanalysis, including possible disclosure of his/her feelings to the patient, the literature on supervision has been slower to consider such matters. The attention to parallel processes in supervision has been helpful in appreciating the impact of affects arising in either the analyst/patient or the supervisor/analyst dyads upon the analytic treatment and its supervision. This contribution addresses the ways in which overlapping aspects of the personalities of the supervisor, analyst and patient may intersect and create resistances in the treatment. That three-way intersection, described here as the triadic intersubjective matrix, is considered inevitable in all supervised treatments. A clinical example from the termination phase of a supervised analysis of an adolescent is offered to illustrate these points. Finally, the question of self-disclosure as an aspect of the supervisory alliance is also discussed.

  19. Prediction during sentence comprehension in aphasia

    Directory of Open Access Journals (Sweden)

    Michael Walsh Dickey

    2014-04-01

    Full Text Available Much recent psycholinguistic work has focused on prediction in language comprehension (Altmann & Kamide, 1999; Federmeier, 2007; Levy, 2008). Unimpaired adults predict upcoming words and phrases based on material in the preceding context, like verbs (Altmann & Kamide, 1999) or constraining sentence contexts (Federmeier, 2007). Several models have tied rapid prediction to the language production system (Federmeier, 2007; Pickering & Garrod, 2013; Dell & Chang, 2014). Evidence for this link comes from the fact that older adults with lower verbal fluency show less predictive behavior (Federmeier et al., 2010; DeLong et al., 2012). Prediction in aphasic language comprehension has not been widely investigated, even though constraining sentence contexts are strongly facilitative for naming in aphasia (e.g., Love & Webb, 1977). Mack et al. (2013) found in a visual-world task that people with aphasia (PWA) do not predict upcoming objects based on verbs (cf. Altmann & Kamide, 1999). This finding suggests that prediction may be reduced in aphasia. However, it is unclear whether reduced prediction was caused by language-production impairments: all the PWA in their study had non-fluent aphasia. The current study examined whether PWA show evidence of prediction based on constraining sentence contexts (e.g., Federmeier, 2007). Specifically, it tested whether they exhibited facilitation for highly predictable words in reading, using materials that have previously demonstrated strong predictability effects for unimpaired adults (Rayner et al., 2004). In addition, it tested whether differences in language-production ability among PWA accounted for differences in predictive behavior (viz. Pickering & Garrod, 2013; Dell & Chang, 2014). Eight PWA read sentences adapted from Rayner et al. (2004) in a self-paced reading task. The materials crossed word frequency with predictability: high- vs. low-frequency words (bottle/diaper) were preceded by contexts which made them

  20. Stuck pipe prediction

    KAUST Repository

    Alzahrani, Majed; Alsolami, Fawaz; Chikalov, Igor; Algharbi, Salem; Aboudi, Faisal; Khudiri, Musab

    2016-01-01

    Disclosed are various embodiments for a prediction application to predict a stuck pipe. A linear regression model is generated from hook load readings at corresponding bit depths. A current hook load reading at a current bit depth is compared with a normal hook load reading from the linear regression model. A current hook load greater than a normal hook load for a given bit depth indicates the likelihood of a stuck pipe.
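
    The disclosed comparison can be sketched as an ordinary least-squares fit of hook load against bit depth plus a threshold test; the function names and the tolerance parameter below are illustrative assumptions, not taken from the patent:

```python
def fit_line(depths, hook_loads):
    """Least-squares fit of hook load as a linear function of bit depth."""
    n = len(depths)
    mx, my = sum(depths) / n, sum(hook_loads) / n
    sxx = sum((x - mx) ** 2 for x in depths)
    sxy = sum((x - mx) * (y - my) for x, y in zip(depths, hook_loads))
    slope = sxy / sxx
    return slope, my - slope * mx

def stuck_pipe_warning(depths, hook_loads, depth_now, load_now, tolerance):
    """Flag a likely stuck pipe when the current hook load exceeds the
    regression-predicted normal load by more than `tolerance`."""
    slope, intercept = fit_line(depths, hook_loads)
    normal_load = slope * depth_now + intercept
    return load_now > normal_load + tolerance

# Illustrative readings: hook load rising smoothly with depth, then a spike.
depths, loads = [100.0, 200.0, 300.0, 400.0], [50.0, 60.0, 70.0, 80.0]
print(stuck_pipe_warning(depths, loads, 500.0, 101.0, tolerance=10.0))  # True
print(stuck_pipe_warning(depths, loads, 500.0, 95.0, tolerance=10.0))   # False
```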