WorldWideScience

Sample records for human error analysis

  1. Notes on human error analysis and prediction

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1978-11-01

    The notes comprise an introductory discussion of the role of human error analysis and prediction in industrial risk analysis. Following this introduction, different classes of human errors and their roles in industrial systems are discussed. Problems related to the prediction of human behaviour in reliability and safety analysis are formulated, and ''criteria for analyzability'' which must be met by industrial systems before a systematic analysis can be performed are suggested. The appendices contain illustrative case studies and a review of human error reports for the task of equipment calibration and testing as found in the US Licensee Event Reports. (author)

  2. Human Error Analysis by Fuzzy-Set

    International Nuclear Information System (INIS)

    Situmorang, Johnny

    1996-01-01

    In conventional HRA the probability of error is treated as a single exact value obtained by constructing an event tree; here, fuzzy-set theory is applied instead. Fuzzy-set theory treats the probability of error as a possibility described by a linguistic variable. Most parameters or variables in human engineering are defined verbally (good, fairly good, worst, etc.), each describing a range of probability values. As an example, the analysis quantifies human error in a calibration task, and the probability of miscalibration is found to be very low.
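
    The idea of representing an error probability as a linguistic variable rather than a point value can be sketched as follows. The membership functions and term supports below are invented for illustration; they are not taken from the paper.

```python
# Sketch: a human error probability as a fuzzy linguistic variable.
# The term definitions below are illustrative, not from the paper.

def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic terms for miscalibration probability (hypothetical supports).
TERMS = {
    "very low": (0.0, 0.001, 0.005),
    "low":      (0.001, 0.01, 0.05),
    "moderate": (0.01, 0.05, 0.1),
}

def centroid(term, steps=1000):
    """Defuzzify a linguistic term to one representative probability."""
    a, b, c = TERMS[term]
    xs = [a + (c - a) * i / steps for i in range(steps + 1)]
    num = sum(x * triangular(x, a, b, c) for x in xs)
    den = sum(triangular(x, a, b, c) for x in xs)
    return num / den

p = centroid("very low")  # single point value standing in for the fuzzy term
```

Each verbal judgement ("very low", "low", ...) thus covers a whole range of probabilities, and a crisp value is recovered only at the end by defuzzification.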

  3. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others]

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  4. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  5. Analysis of an Employees' Survey for Preventing Human Errors

    International Nuclear Information System (INIS)

    Sung, Chanho; Kim, Younggab; Joung, Sanghoun

    2013-01-01

    Human errors in nuclear power plants can cause events or incidents large and small. These events are one of the main contributors to reactor trips and may threaten the safety of nuclear plants. To prevent human errors, KHNP (Korea Hydro & Nuclear Power) introduced 'human-error prevention techniques' and has applied them to the main areas of work, such as plant operation, operation support, and maintenance and engineering. This paper proposes methods to prevent and reduce human errors in nuclear power plants by analyzing survey results covering both the utilization of the human-error prevention techniques and employees' awareness of human-error prevention. The analysis showed that employees' understanding and utilization of the techniques were generally high, and that the level of training and its effect on actual work were satisfactory. Employees also answered that the root causes of human error lay in the working environment, including tight schedules, manpower shortages, and excessive workloads, rather than in personal negligence or lack of knowledge; consideration of the working environment is therefore clearly needed. Based on this survey, the best methods of preventing human error are adequate personal equipment, substantial training and education, mental health checks before starting work, prohibition of performing multiple tasks at once, compliance with procedures, and enhanced job-site review. However, the most important and basic elements for preventing human error are the interest of the workers themselves and an organizational atmosphere with good communication between managers, workers, and colleagues.

  6. Human Error Assessment in Minefield Cleaning Operation Using Human Event Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Hajiakbari

    2015-12-01

    Full Text Available Background & objective: Human error is one of the main causes of accidents. Due to the unreliability of the human element and the high-risk nature of demining operations, this study aimed to assess and manage the human errors likely to occur in such operations. Methods: This study was performed at a demining site in a war zone in the west of Iran. After acquiring an initial familiarity with the operations, methods, and tools of minefield clearing, the job tasks related to clearing landmines were specified. These tasks were then studied using HTA, and the associated possible errors were assessed using ATHEANA. Results: The demining task comprised four main operations: primary detection, technical identification, investigation, and neutralization. Four main causes of accidents in such operations were found: walking on mines, leaving mines with no action taken, errors in the neutralizing operation, and environmental explosions. The probability of human error in mine clearance operations was calculated as 0.010. Conclusion: The main causes of human error in demining operations can be attributed to factors such as poor weather and operating conditions (e.g. outdoor work), inappropriate personal protective equipment, personality characteristics, insufficient accuracy in the work, and insufficient available time. To reduce the probability of human error in demining operations, these factors should be managed properly.
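
    One common way an overall task error probability like the reported 0.010 can be aggregated from subtask probabilities, assuming the subtask errors are independent, is P = 1 - ∏(1 - p_i). The subtask values below are invented for illustration; the study does not publish this breakdown.

```python
# Sketch: aggregating subtask HEPs into an overall task error probability,
# assuming independence between subtask errors. Values are illustrative.

def overall_hep(subtask_heps):
    """P(at least one error) = 1 - product of subtask success probabilities."""
    p_success = 1.0
    for p in subtask_heps:
        p_success *= (1.0 - p)
    return 1.0 - p_success

# Four hypothetical subtasks (detection, identification, investigation,
# neutralization) with small individual error probabilities:
p = overall_hep([0.004, 0.003, 0.002, 0.001])  # close to 0.01
```

For small probabilities the result is close to the plain sum of the subtask HEPs, which is why screening analyses often just add them.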

  7. Trial application of a technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Bley, D.C.; Cooper, S.E.; Parry, G.W.

    1996-01-01

    The new method for HRA, ATHEANA, has been developed based on a study of the operating history of serious accidents and an understanding of the reasons why people make errors. Previous publications associated with the project have dealt with the theoretical framework under which errors occur and the retrospective analysis of operational events. This is the first attempt to use ATHEANA in a prospective way, to select and evaluate human errors within the PSA context

  8. SHEAN (Simplified Human Error Analysis code) and automated THERP

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1993-01-01

    One of the most widely used human error analysis tools is THERP (Technique for Human Error Rate Prediction). Unfortunately, this tool has disadvantages. The Nuclear Regulatory Commission, recognizing these drawbacks, commissioned Dr. Swain, the author of THERP, to create a simpler, more consistent tool for deriving human error rates. That effort produced the Accident Sequence Evaluation Program Human Reliability Analysis Procedure (ASEP), which is more conservative than THERP but is a valuable screening tool. ASEP involves answering simple questions about the scenario in question and then looking up the appropriate human error rate in the indicated table (THERP also uses look-up tables, but four times as many). The advantages of ASEP are that human factors expertise is not required and the training needed to use the method is minimal. Although not originally envisioned by Dr. Swain, the ASEP approach practically begs to be computerized. WINCO did just that, calling the code SHEAN, for Simplified Human Error ANalysis. The code was written in Turbo Basic for IBM or IBM-compatible MS-DOS machines, for fast execution. WINCO is now in the process of comparing this code against THERP for various scenarios. This report provides a discussion of SHEAN.
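
    The question-then-lookup pattern that makes ASEP easy to computerize can be sketched as below. The questions and the rates in the table are placeholders, not the actual ASEP tables.

```python
# Sketch of the ASEP/SHEAN pattern: answer a few yes/no questions about
# the scenario, then look up a screening error rate in a table.
# Questions and rates are illustrative placeholders, not ASEP's tables.

ASEP_LIKE_TABLE = {
    # (step_is_proceduralized, under_time_pressure): nominal screening HEP
    (True, False): 0.02,
    (True, True): 0.05,
    (False, False): 0.1,
    (False, True): 0.25,
}

def screening_hep(proceduralized: bool, time_pressure: bool) -> float:
    """Answer two yes/no questions, then look up a screening error rate."""
    return ASEP_LIKE_TABLE[(proceduralized, time_pressure)]

rate = screening_hep(proceduralized=True, time_pressure=False)
```

Because the analyst only answers questions and the code does the table lookup, no human factors expertise is needed to run it, which matches the stated advantage of ASEP.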

  9. Applications of human error analysis to aviation and space operations

    International Nuclear Information System (INIS)

    Nelson, W.R.

    1998-01-01

    For the past several years at the Idaho National Engineering and Environmental Laboratory (INEEL) we have been working to apply methods of human error analysis to the design of complex systems. We have focused on adapting human reliability analysis (HRA) methods that were developed for Probabilistic Safety Assessment (PSA) for application to system design. We are developing methods so that human errors can be systematically identified during system design, the potential consequences of each error can be assessed, and potential corrective actions (e.g. changes to system design or procedures) can be identified. These applications lead to different requirements when compared with HRAs performed as part of a PSA. For example, because the analysis will begin early during the design stage, the methods must be usable when only partial design information is available. In addition, the ability to perform numerous ''what if'' analyses to identify and compare multiple design alternatives is essential. Finally, since the goals of such human error analyses focus on proactive design changes rather than the estimation of failure probabilities for PRA, there is more emphasis on qualitative evaluations of error relationships and causal factors than on quantitative estimates of error frequency. The primary vehicle we have used to develop and apply these methods has been a series of projects sponsored by the National Aeronautics and Space Administration (NASA) to apply human error analysis to aviation operations. The first NASA-sponsored project aimed to evaluate human errors caused by advanced cockpit automation. Our next aviation project focused on the development of methods and tools to apply human error analysis to the design of commercial aircraft. This project was performed by a consortium comprising INEEL, NASA, and the Boeing Commercial Airplane Group. The focus of the project was aircraft design and procedures that could lead to human errors during airplane maintenance.

  10. Interactive analysis of human error factors in NPP operation events

    International Nuclear Information System (INIS)

    Zhang Li; Zou Yanhua; Huang Weigang

    2010-01-01

    Interactions of human error factors in NPP operation events are analyzed. Of 645 WANO operation event reports from 1999 to 2008, 432 were found to be related to human errors. After classifying these errors by root cause or causal factor and applying SPSS for correlation analysis, we concluded: (1) Personnel work practices are restricted by many factors; forming good work practices is a systematic effort that needs support in many respects. (2) Verbal communications, personnel work practices, the man-machine interface, and written procedures and documents play major roles. These four factors interact and often come as a bundle: if improvements are made to one of them, synchronous measures are also necessary for the others. (3) Management direction and the decision process, which are related to management, have a significant interaction with personnel factors. (authors)
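
    The correlation step described above can be sketched as follows: each analyzed event is coded 1/0 for whether a given causal factor was present, and factor pairs are checked for co-occurrence with a Pearson correlation (the study used SPSS). The event codings below are invented.

```python
# Sketch: correlating presence/absence codings of two causal factors
# across events with a Pearson coefficient. Data are illustrative.

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Presence (1) / absence (0) of two factors over six hypothetical events.
verbal_comm   = [1, 0, 1, 1, 0, 1]
work_practice = [1, 0, 1, 0, 0, 1]
r = pearson(verbal_comm, work_practice)  # positive: the factors co-occur
```

A strongly positive coefficient for a factor pair is what the abstract means by factors that "come as a bundle".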

  11. Estimation of the human error probabilities in the human reliability analysis

    International Nuclear Information System (INIS)

    Liu Haibin; He Xuhong; Tong Jiejuan; Shen Shifei

    2006-01-01

    Human error data are an important issue in human reliability analysis (HRA). Bayesian parameter estimation, which can combine multiple sources of information such as NPP historical data and expert judgment, can be used to modify human error data so that they more truly reflect the real situation of the NPP. Using a numerical computation program developed by the authors, this paper presents some typical examples to illustrate the process of Bayesian parameter estimation in HRA and discusses the effect of different modification data on the estimation. (authors)
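
    A minimal sketch of this kind of update, assuming a conjugate Beta prior for the HEP: a prior built from generic data or expert judgment is combined with plant-specific evidence (errors observed over a number of task demands). All numbers are illustrative; the paper's own program is not public here.

```python
# Sketch: Bayesian (Beta-binomial) updating of a human error probability.
# Prior and observations below are illustrative only.

def update_beta(alpha: float, beta: float, errors: int, demands: int):
    """Conjugate update: returns the posterior (alpha, beta) parameters."""
    return alpha + errors, beta + (demands - errors)

def beta_mean(alpha: float, beta: float) -> float:
    """Posterior mean, usable as the updated HEP point estimate."""
    return alpha / (alpha + beta)

# Generic prior roughly centred on HEP = 0.01.
prior = (1.0, 99.0)
# Plant-specific records: 2 errors observed in 500 demands.
post = update_beta(*prior, errors=2, demands=500)
hep = beta_mean(*post)  # pulled toward the observed rate 2/500 = 0.004
```

The posterior mean sits between the generic prior and the plant-specific rate, which is exactly the "modification" of generic data toward the real situation of the plant that the abstract describes.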

  12. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-01-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, ''A Technique for Human Error Analysis'' (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and their associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  13. Human error in strabismus surgery: Quantification with a sensitivity analysis

    NARCIS (Netherlands)

    S. Schutte (Sander); J.R. Polling (Jan Roelof); F.C.T. van der Helm (Frans); H.J. Simonsz (Huib)

    2009-01-01

    Background: Reoperations are frequently necessary in strabismus surgery. The goal of this study was to analyze human-error related factors that introduce variability in the results of strabismus surgery in a systematic fashion. Methods: We identified the primary factors that influence

  14. Human error in strabismus surgery : Quantification with a sensitivity analysis

    NARCIS (Netherlands)

    Schutte, S.; Polling, J.R.; Van der Helm, F.C.T.; Simonsz, H.J.

    2008-01-01

    Background: Reoperations are frequently necessary in strabismus surgery. The goal of this study was to analyze human-error related factors that introduce variability in the results of strabismus surgery in a systematic fashion. Methods: We identified the primary factors that influence the outcome of

  15. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was then applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed most easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process remedies shortcomings in existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
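
    The ranking idea behind the TOPSIS step can be sketched as below in its plain (crisp) form; the paper uses a fuzzy variant, and the scores and weights here are invented.

```python
# Sketch of crisp TOPSIS: alternatives (error factors) scored on benefit
# criteria are ranked by relative closeness to the ideal solution.
# The scoring matrix and weights are illustrative.

import math

def topsis(matrix, weights):
    """matrix[i][j]: score of alternative i on benefit criterion j.
    Returns a closeness score in [0, 1] per alternative (higher = better)."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalise each column, then apply the criterion weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m)))
             for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    ideal = [max(v[i][j] for i in range(m)) for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_neg = math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical error factors scored on four criteria.
scores = topsis([[7, 8, 6, 5], [5, 6, 7, 8], [9, 5, 6, 7]],
                [0.3, 0.3, 0.2, 0.2])
best = scores.index(max(scores))  # index of the top-ranked factor
```

In the fuzzy variant, the crisp scores are replaced by fuzzy numbers built from linguistic ratings, but the closeness-to-ideal ranking logic is the same.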

  16. AGAPE-ET for human error analysis of emergency tasks and its application

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2002-01-01

    The paper presents a proceduralised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), covering both qualitative error analysis and quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. The AGAPE-ET method is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified considering the characteristics of the performance of that cognitive function and the influencing mechanism of the performance influencing factors (PIFs) on it. Error analysis items have then been determined from the identified error causes or error-likely situations, and a human error analysis procedure based on these items has been organised to cue and guide the analysts through the overall human error analysis. The basic scheme for quantifying the HEP consists of multiplying the BHEP assigned to the error analysis item by the weight obtained from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts need deal only with the relevant cognitive functions. The paper also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results.
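
    The quantification scheme as described, BHEP multiplied by a weight from the influencing-factors decision tree, can be sketched as below. The BHEP and the weights are invented for illustration; the actual IFDT values belong to the method's tables.

```python
# Sketch of the stated AGAPE-ET quantification: HEP = BHEP x IFDT weight(s).
# The BHEP and weights below are illustrative, not the method's tables.

def agape_et_hep(bhep: float, ifdt_weights) -> float:
    """Multiply the BHEP by each applicable influencing-factor weight."""
    hep = bhep
    for w in ifdt_weights:
        hep *= w
    return min(hep, 1.0)  # a probability cannot exceed 1

# E.g. a BHEP of 0.003 degraded by time stress (x5) and a poor HMI (x2):
hep = agape_et_hep(0.003, [5.0, 2.0])  # 0.03
```

Capping at 1.0 matters because several pessimistic weights applied to a large BHEP would otherwise produce an impossible probability.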

  17. PRA (probabilistic risk analysis) in the nuclear sector. Quantifying human error and human malice

    International Nuclear Information System (INIS)

    Heyes, A.G.

    1995-01-01

    Regardless of the regulatory style chosen ('command and control' or 'functional'), a vital prerequisite for coherent safety regulations in the nuclear power industry is the ability to assess accident risk. In this paper we present a critical analysis of current techniques of probabilistic risk analysis applied in the industry, with particular regard to the problems of quantifying risks arising from, or exacerbated by, human malice and/or human error. (Author)

  18. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    International Nuclear Information System (INIS)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Won Dea

    2014-01-01

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces based on computer technologies. MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are particularly important because operating actions in NPP Advanced MCRs are performed through them. Using soft controls such as mouse control and touch screens, operators select a specific screen, then choose the controller, and finally manipulate the given devices. Because the interfaces of soft controls differ from those of hardwired conventional controls, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for Advanced MCRs. In other words, new human error modes should be considered for interface management tasks, such as navigation tasks and icon (device) selection tasks on monitors, and a new HRA framework that takes these newly generated human error modes into account is required. In this paper, a conceptual framework of an HRA method for the evaluation of soft control execution human error in Advanced MCRs is suggested by analyzing soft control tasks.

  19. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    Energy Technology Data Exchange (ETDEWEB)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-08-15

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces based on computer technologies. MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are particularly important because operating actions in NPP Advanced MCRs are performed through them. Using soft controls such as mouse control and touch screens, operators select a specific screen, then choose the controller, and finally manipulate the given devices. Because the interfaces of soft controls differ from those of hardwired conventional controls, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for Advanced MCRs. In other words, new human error modes should be considered for interface management tasks, such as navigation tasks and icon (device) selection tasks on monitors, and a new HRA framework that takes these newly generated human error modes into account is required. In this paper, a conceptual framework of an HRA method for the evaluation of soft control execution human error in Advanced MCRs is suggested by analyzing soft control tasks.

  20. Analysis of Human Error Types and Performance Shaping Factors in the Next Generation Main Control Room

    International Nuclear Information System (INIS)

    Sin, Y. C.; Jung, Y. S.; Kim, K. H.; Kim, J. H.

    2008-04-01

    Main control rooms of nuclear power plants have been computerized and digitalized in new and modernized plants, as information and digital technologies have made great progress and matured. Human factors engineering issues in advanced MCRs were surveyed using two approaches: a model-based approach and a literature-survey-based approach. Human error types and performance shaping factors were then analyzed for three representative human errors. The results of the project can be used for task analysis, evaluation of human error probabilities, and analysis of performance shaping factors in HRA.

  21. Human Reliability Analysis Using the Cognitive Reliability and Error Analysis Method (CREAM)

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    Full Text Available Work accidents in the grinding and welding areas have ranked highest over the last five years at PT. X. These accidents are caused by human error, which occurs under the influence of the physical and non-physical working environment. This study uses scenarios to predict and reduce the likelihood of human error with the CREAM (Cognitive Reliability and Error Analysis Method) approach. CREAM is a human reliability analysis method for obtaining the Cognitive Failure Probability (CFP), which can be determined in two ways: the basic method and the extended method. The basic method yields only a general failure probability, whereas the extended method yields a CFP for each task. The results show that the factors influencing errors in grinding and welding work are the adequacy of the organization, the adequacy of the man-machine interface (MMI) and operational support, the availability of procedures/planning, and the adequacy of training and experience. The cognitive aspect with the highest error value in grinding work is planning (CFP 0.3), and in welding work it is execution (CFP 0.18). To reduce cognitive error in grinding and welding work, the recommendations are routine training, more detailed work instructions, and familiarization with the tools. Keywords: CREAM (Cognitive Reliability and Error Analysis Method), HRA (human reliability analysis), cognitive error
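
    The basic method of CREAM, which yields a general failure probability rather than a per-task CFP, can be sketched as below. The probability intervals follow Hollnagel's published control-mode tables; the mapping from CPC counts to a control mode is deliberately simplified (the real method uses a two-dimensional diagram), and the example judgements are invented.

```python
# Sketch of CREAM's basic method: the nine common performance conditions
# (CPCs) are each judged as improving, not significant, or reducing
# reliability; the counts select a control mode with an associated general
# failure-probability interval. Mapping below is a simplification.

CONTROL_MODE_INTERVALS = {
    "strategic":     (0.5e-5, 1e-2),
    "tactical":      (1e-3, 1e-1),
    "opportunistic": (1e-2, 0.5),
    "scrambled":     (1e-1, 1.0),
}

def control_mode(n_reduced: int, n_improved: int) -> str:
    """Very simplified mapping from CPC counts to a CREAM control mode."""
    if n_reduced >= 7:
        return "scrambled"
    if n_reduced >= 4:
        return "opportunistic"
    if n_reduced >= 1 and n_improved < n_reduced:
        return "tactical"
    return "strategic"

# E.g. two CPCs judged reducing, one improving, the rest not significant:
mode = control_mode(n_reduced=2, n_improved=1)  # "tactical"
low, high = CONTROL_MODE_INTERVALS[mode]
```

The extended method then refines this interval into a CFP per cognitive activity, which is how the study obtains per-task values such as 0.3 for planning and 0.18 for execution.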

  22. Human error identification using the Human Factors Analysis and Classification System technique (HFACS)

    Directory of Open Access Journals (Sweden)

    G. A. Shirali

    2013-12-01

    Results: In this study, 158 accident reports from the Ahvaz steel industry were analyzed using the HFACS technique. The analysis showed that most of the human errors were related, at the first level, to skill-based errors; at the second level, to the physical environment; at the third level, to inadequate supervision; and at the fourth level, to resource management. Conclusion: Studying and analyzing past events using the HFACS technique can identify the major and root causes of accidents and can be effective in preventing the repetition of such mishaps. It can also serve as a basis for developing strategies to prevent future events in the steel industry.

  23. Research on Human-Error Factors of Civil Aircraft Pilots Based on Grey Relational Analysis

    Directory of Open Access Journals (Sweden)

    Guo Yundong

    2018-01-01

    Full Text Available Considering that civil aviation accidents involve many human-error factors and show the features of typical grey systems, an index system of civil aviation accident human-error factors is built using the Human Factors Analysis and Classification System model. Using data on accidents that occurred worldwide between 2008 and 2011, the correlations between human-error factors are analyzed quantitatively with grey relational analysis. The results show that the main factors affecting pilot human error are, in order, preconditions for unsafe acts, unsafe supervision, organization, and unsafe acts. The second-level index related most closely to pilot human-error factors is the physical/mental limitations of pilots, followed by supervisory violations. The relevancy between first-level indexes and their corresponding second-level indexes, and among second-level indexes, can also be analyzed quantitatively.
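
    The core computation of grey relational analysis, ranking how closely each factor series tracks a reference series, can be sketched as below. This is a simplified single-pair form (the study computes Δmin/Δmax across all series and normalises first), and the series values are invented.

```python
# Sketch of grey relational analysis (GRA): the grey relational grade
# measures how closely a comparison series tracks a reference series.
# Simplified per-pair form; data are illustrative.

def grey_relational_grade(reference, series, rho=0.5):
    """Mean grey relational coefficient of `series` against `reference`.
    rho is the conventional distinguishing coefficient (usually 0.5)."""
    deltas = [abs(r - s) for r, s in zip(reference, series)]
    d_min, d_max = min(deltas), max(deltas)
    coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in deltas]
    return sum(coeffs) / len(coeffs)

# Normalised accident counts per year (reference) vs. counts attributed
# to two hypothetical factors.
reference = [1.0, 0.8, 0.9, 0.6]
factor_a  = [0.9, 0.7, 0.8, 0.5]   # tracks the reference closely
factor_b  = [0.2, 0.9, 0.1, 0.8]   # tracks it poorly
grade_a = grey_relational_grade(reference, factor_a)
grade_b = grey_relational_grade(reference, factor_b)
```

Ranking the factors by grade is what produces orderings like "preconditions for unsafe acts > unsafe supervision > organization > unsafe acts" in the study.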

  24. The recovery factors analysis of the human errors for research reactors

    International Nuclear Information System (INIS)

    Farcasiu, M.; Nitoi, M.; Apostol, M.; Turcu, I.; Florescu, Ghe.

    2006-01-01

    The results of many Probabilistic Safety Assessment (PSA) studies show a very significant contribution of human errors to the unavailability of systems in nuclear installations, and the treatment of human interactions is considered one of the major limitations of PSA. To identify the human actions that can affect system reliability or availability, Human Reliability Analysis (HRA) must be applied; recovery factor analysis of human actions is an important step in HRA. This paper presents how human error probabilities (HEP) can be reduced by using those elements that have the capacity to recover a human error. Recovery factor modelling aims to identify error-likely situations or situations that lead to the progression of an accident. The analysis was performed with the THERP method. The necessary information was obtained from the operating experience of the TRIGA research reactor of INR Pitesti; the required data were obtained from generic databases. (authors)

  25. Human reliability analysis during PSA at Trillo NPP: main characteristics and analysis of diagnostic errors

    International Nuclear Information System (INIS)

    Barquin, M.A.; Gomez, F.

    1998-01-01

    The design differences between Trillo NPP and the other Spanish nuclear power plants (basic Westinghouse and General Electric designs) became clear in the Human Reliability Analysis of the Probabilistic Safety Analysis (PSA) for Trillo NPP. The object of this paper is to describe the most significant characteristics of the Human Reliability Analysis carried out in the PSA, with special emphasis on possible diagnostic errors and their consequences, based on the characteristics of the Emergency Operations Manual for Trillo NPP. - In the case of human errors before the initiating event (type 1), the existence of four redundancies in most of the plant safety systems means that the impact of this type of error on the final results of the PSA is insignificant. However, in the case of common-cause errors, especially certain calibration errors, some actions are significant in the final equation for core damage. - The number of human actions that the operator has to carry out during the accidents modelled (type 3) is relatively small in comparison with other PSAs, basically owing to the high level of automation at Trillo NPP. - The Plant Operations Manual cannot strictly be considered a symptom-based procedure. The Operations Group must select the chapter of the Operations Manual to be followed after having diagnosed the perturbing event, using for this purpose an Emergency and Anomaly Decision Tree (M.O.3.0.1) based on the different indications, alarms, and symptoms present in the plant after the perturbing event. For this reason, it was decided to analyse the possible diagnosis errors. In the bibliography on diagnosis and commission errors available at the present time, there is no precise methodology for the analysis of this type of error and its incorporation into PSAs. The method used in the PSA for Trillo NPP to evaluate this type of interaction is to develop a Diagnosis Error Table, the object of which is to identify the situations in

  26. Analysis of an Employees' Survey for Preventing Human Errors

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Chanho; Kim, Younggab; Joung, Sanghoun [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Human errors in nuclear power plants can cause events or incidents large and small. These events are one of the main contributors to reactor trips and may threaten the safety of nuclear plants. To prevent human errors, KHNP (Korea Hydro & Nuclear Power) introduced 'human-error prevention techniques' and has applied them to the main areas of work, such as plant operation, operation support, and maintenance and engineering. This paper proposes methods to prevent and reduce human errors in nuclear power plants by analyzing survey results covering both the utilization of the human-error prevention techniques and employees' awareness of human-error prevention. The analysis showed that employees' understanding and utilization of the techniques were generally high, and that the level of training and its effect on actual work were satisfactory. Employees also answered that the root causes of human error lay in the working environment, including tight schedules, manpower shortages, and excessive workloads, rather than in personal negligence or lack of knowledge; consideration of the working environment is therefore clearly needed. Based on this survey, the best methods of preventing human error are adequate personal equipment, substantial training and education, mental health checks before starting work, prohibition of performing multiple tasks at once, compliance with procedures, and enhanced job-site review. However, the most important and basic elements for preventing human error are the interest of the workers themselves and an organizational atmosphere with good communication between managers, workers, and colleagues.

  7. Application of grey incidence analysis to connection between human errors and root cause

    International Nuclear Information System (INIS)

    Ren Yinxiang; Yu Ren; Zhou Gang; Chen Dengke

    2008-01-01

By introducing grey incidence analysis, the relative importance of the impact of root causes on human errors was investigated in this paper. On the basis of WANO statistical data and grey incidence analysis, lack of alternate examination, poor basic operation, shortage of theoretical knowledge, lax organization and management, and deficient regulations were found to be the root causes with the most important influence on human errors. Finally, approaches to reducing human errors are discussed. (authors)
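Grey incidence (grey relational) analysis ranks factor sequences by how closely they track a reference sequence. The sketch below follows the standard formulation from the grey-system literature; the factor names and the already-normalized scores are hypothetical illustrations, not the paper's WANO data.

```python
def grey_relational_grades(reference, factors, rho=0.5):
    """Rank factor sequences by their grey incidence with a reference sequence.

    Assumes the sequences are already normalized to comparable scales.
    rho is the distinguishing coefficient, conventionally 0.5.
    """
    # Absolute deviation of each factor sequence from the reference
    deltas = {name: [abs(r - x) for r, x in zip(reference, seq)]
              for name, seq in factors.items()}
    all_d = [d for seq in deltas.values() for d in seq]
    d_min, d_max = min(all_d), max(all_d)
    grades = {}
    for name, seq in deltas.items():
        # Grey relational coefficient at each point, then average into a grade
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in seq]
        grades[name] = sum(coeffs) / len(coeffs)
    return grades

# Hypothetical reference sequence (e.g. yearly human-error rates) and
# two candidate root-cause score sequences:
reference = [0.9, 0.8, 0.7, 0.6]
factors = {
    "lack of alternate examination": [0.8, 0.7, 0.65, 0.55],
    "deficient regulations":         [0.3, 0.9, 0.2, 0.8],
}
grades = grey_relational_grades(reference, factors)
# The factor whose sequence tracks the reference most closely receives the
# highest grade, i.e. the strongest incidence with human errors.
```

Ranking the resulting grades gives the relative influence ordering that the paper derives for its five root causes.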

  8. An advanced human reliability analysis methodology: analysis of cognitive errors focused on

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2001-01-01

Conventional Human Reliability Analysis (HRA) methods such as THERP/ASEP, HCR and SLIM have been criticised for their deficiency in analysing the cognitive errors that occur during the operator's decision-making process. To supplement the limitations of the conventional methods, an advanced HRA method, the so-called 2nd-generation HRA method, comprising both qualitative analysis and quantitative assessment of cognitive errors, is being developed based on the state-of-the-art theory of cognitive systems engineering and error psychology. The method was developed on the basis of a human decision-making model and the relation between cognitive functions and performance influencing factors. The application of the proposed method to two emergency operation tasks is presented

  9. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.

    2010-12-01

In this paper, we address the problem of reducing the occurrence of human errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21% of interruptions were caused by human error. We focus on Change Management, the process with the largest risk of human error, and identify the main instances of human error as the 4 Wrongs: request, time, configuration item, and command. Analysis of change records revealed that human error prevention by partial automation is highly relevant. We propose the HEP Framework, a framework for the execution of IT Service Delivery operations that reduces human error by addressing the 4 Wrongs using content integration, contextualization of operation patterns, partial automation of command execution, and controlled access to resources.

  10. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA), and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated against the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far for analyzing diagnosis error probability are suggested as a part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules

  11. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    Shen, S.-H.; Smidts, C.; Mosleh, A.

    1997-01-01

This paper presents a model-based human error taxonomy and data collection methodology. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of nuclear power plant operating crews during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics that extract generic characteristics of error-prone behaviors and error-prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures, with associated training, to minimize or preclude certain human errors. It also helps in the design of control rooms, and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  12. The treatment of commission errors in first generation human reliability analysis methods

    Energy Technology Data Exchange (ETDEWEB)

    Alvarengga, Marco Antonio Bayout; Fonseca, Renato Alves da, E-mail: bayout@cnen.gov.b, E-mail: rfonseca@cnen.gov.b [Comissao Nacional de Energia Nuclear (CNEN) Rio de Janeiro, RJ (Brazil); Melo, Paulo Fernando Frutuoso e, E-mail: frutuoso@nuclear.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

Human errors in human reliability analysis can be classified generically as errors of omission and errors of commission. Errors of omission relate to the omission of a human action that should have been performed but was not. Errors of commission relate to human actions that should not be performed but in fact are. Both involve specific types of cognitive error mechanisms; however, errors of commission are more difficult to model because they are characterized by non-anticipated actions that are performed instead of others that are omitted (omission errors), or that enter an operational task without being part of the normal sequence of that task. The identification of actions that are not supposed to occur depends on the operational context, which will influence or facilitate certain unsafe actions of the operator depending on the operational performance of its parameters and variables. The survey of operational contexts and associated unsafe actions is a characteristic of second-generation models, unlike first-generation models. This paper discusses how first-generation models can treat errors of commission in the steps of detection, diagnosis, decision-making and implementation in human information processing, particularly with the use of the THERP error quantification tables. (author)

  13. Human errors and mistakes

    International Nuclear Information System (INIS)

    Wahlstroem, B.

    1993-01-01

Human errors are a major contributor to the risks of industrial accidents. Accidents have provided important lessons, making it possible to build safer systems. To avoid human errors it is necessary to adapt systems to their operators. The complexity of modern industrial systems is, however, increasing the danger of system accidents. Models of the human operator have been proposed, but they are not able to give accurate predictions of human performance. Human errors can never be eliminated, but their frequency can be decreased by systematic efforts. The paper gives a brief summary of research on human error and concludes with suggestions for further work. (orig.)

  14. Analysis of measured data of human body based on error correcting frequency

    Science.gov (United States)

    Jin, Aiyan; Peipei, Gao; Shang, Xiaomei

    2014-04-01

Anthropometry is the measurement of all parts of the human body surface, and the measured data are the basis for analysis and study of the human body, for the establishment and modification of garment sizes, and for the operation of online clothing stores. In this paper, several groups of measured data are obtained, and the data errors are analyzed by examining the error frequency and applying the analysis-of-variance method of mathematical statistics. The paper also covers determination of the accuracy of the measured data and of the body parts that are difficult to measure, further study of the causes of data errors, and a summary of the key points for minimizing errors as far as possible. By analyzing the measured data on the basis of error frequency, the paper provides reference material for the development of the garment industry.
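The analysis-of-variance screening described above can be sketched as a one-way ANOVA over repeated measurements of the same body part: a large F statistic flags a measurer-dependent error source. The measurement values below are invented for illustration and are not the paper's data.

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA over a list of sample groups."""
    k = len(groups)                              # number of groups
    n = sum(len(g) for g in groups)              # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: spread of observations around their group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical example: three measurers, waist girth of one subject in cm
measurers = [
    [72.1, 72.4, 71.9],
    [72.0, 72.3, 72.2],
    [74.8, 75.1, 74.9],  # systematically high: a likely measurement-error source
]
F = one_way_anova_F(measurers)
# An F far above the critical value indicates the measurers differ
# systematically, i.e. the data contain a measurer-dependent error.
```

In practice one would compare F against the F-distribution critical value for (k-1, n-k) degrees of freedom at the chosen significance level.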

  15. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.; Rosu, D.; Loewenstern, D.; Buco, M. J.; Guo, S.; Lavrado, Rafael Coelho; Gupta, M.; De, P.; Madduri, V.; Singh, J. K.

    2010-01-01

In this paper, we address the problem of reducing the occurrence of human errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21% of interruptions were caused by human error.

  16. The application of two recently developed human reliability techniques to cognitive error analysis

    International Nuclear Information System (INIS)

    Gall, W.

    1990-01-01

    Cognitive error can lead to catastrophic consequences for manned systems, including those whose design renders them immune to the effects of physical slips made by operators. Four such events, pressurized water and boiling water reactor accidents which occurred recently, were analysed. The analysis identifies the factors which contributed to the errors and suggests practical strategies for error recovery or prevention. Two types of analysis were conducted: an unstructured analysis based on the analyst's knowledge of psychological theory, and a structured analysis using two recently-developed human reliability analysis techniques. In general, the structured techniques required less effort to produce results and these were comparable to those of the unstructured analysis. (author)

  17. A stochastic dynamic model for human error analysis in nuclear power plants

    Science.gov (United States)

    Delgado-Loperena, Dharma

Nuclear disasters like Three Mile Island and Chernobyl indicate that human performance is a critical safety issue, sending a clear message about the need to include environmental press and competence aspects in research. This investigation was undertaken to serve as a roadmap for studying human behavior through the formulation of a general solution equation. The theoretical model integrates models from two heretofore-disassociated disciplines (behavior specialists and technical specialists) that have historically studied the nature of error and human behavior independently; includes concepts derived from fractal and chaos theory; and suggests re-evaluation of the base theory regarding human error. The results of this research were based on a comprehensive analysis of patterns of error, with the omnipresent underlying structure of chaotic systems. The study of patterns led to a dynamic formulation that can serve any other formula used to study the consequences of human error. The literature search regarding error yielded insight into the need to include concepts rooted in chaos theory and strange attractors, heretofore unconsidered by mainstream researchers who investigated human error in nuclear power plants or who employed the ecological model in their work. The study of patterns obtained from the simulation of a steam generator tube rupture (SGTR) event provided a direct application to aspects of control room operations in nuclear power plants. In doing so, a conceptual foundation based on understanding the patterns of human error analysis can be gleaned, helping to reduce and prevent undesirable events.

  18. Analysis of human error and organizational deficiency in events considering risk significance

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Kim, Yoonik; Kim, Say Hyung; Kim, Chansoo; Chung, Chang Hyun; Jung, Won Dea

    2004-01-01

In this study, we analyzed human and organizational deficiencies in the trip events of Korean nuclear power plants. K-HPES items were used in the human error analysis, and the organizational factors of Jacobs and Haber were used for the organizational deficiency analysis. We proposed the use of the conditional core damage probability (CCDP) as a risk measure so that risk information can be considered in prioritizing K-HPES items and organizational factors. Until now, the risk significance of events has not been considered in human error and organizational deficiency analysis. Considering the risk significance of events in the analysis process is necessary for effective enhancement of nuclear power plant safety, by focusing on the causes of human error and the organizational deficiencies that are associated with significant risk

  19. Human Errors in Decision Making

    OpenAIRE

    Mohamad, Shahriari; Aliandrina, Dessy; Feng, Yan

    2005-01-01

The aim of this paper was to identify human errors in the decision-making process. The study focused on the research question of what human errors could be potential causes of decision failure during the evaluation of alternatives in the decision-making process. Two case studies were selected from the literature and analyzed to find the human errors that contributed to decision failure. The analysis of human errors was then linked with mental models in the alternative-evaluation step. The results o...

  20. Human error and the problem of causality in analysis of accidents

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1990-01-01

Present technology is characterized by complexity, rapid change and growing size of technical systems. This has caused increasing concern with the human involvement in system safety. Analyses of the major accidents during recent decades have concluded that human errors on the part of operators, designers or managers have played a major role. There are, however, several basic problems in the analysis of accidents and the identification of human error. This paper addresses the nature of causal explanations and the ambiguity of the rules applied for identification of the events to include in analysis...

  1. Review of human error analysis methodologies and case study for accident management

    International Nuclear Information System (INIS)

    Jung, Won Dae; Kim, Jae Whan; Lee, Yong Hee; Ha, Jae Joo

    1998-03-01

In this research, we tried to establish the requirements for the development of a new human error analysis (HEA) method. To achieve this goal, we performed a case study in the following steps: (1) review of the existing HEA methods; (2) selection of those methods considered appropriate for the analysis of operators' tasks in NPPs; (3) choice of tasks for the application. The methods selected for the case study were HRMS (Human Reliability Management System), PHECA (Potential Human Error Cause Analysis), and CREAM (Cognitive Reliability and Error Analysis Method), and the tasks chosen for the application were the 'bleed and feed operation' and 'decision-making for reactor cavity flooding' tasks. We measured the applicability of the selected methods to the NPP tasks, and evaluated the advantages and disadvantages of each method. All three methods turned out to be applicable for the prediction of human error. We concluded that both CREAM and HRMS are sufficiently applicable to NPP tasks; however, comparing the two, CREAM is considered more appropriate than HRMS from the viewpoint of the overall requirements. The requirements for the new HEA method obtained from the study can be summarized as follows: firstly, it should deal with cognitive error analysis; secondly, it should have an adequate classification system for NPP tasks; thirdly, the description of error causes and error mechanisms should be explicit; fourthly, it should maintain the consistency of the result by minimizing the ambiguity in each step of the analysis procedure; fifthly, it should be feasible with acceptable human resources. (author). 25 refs., 30 tabs., 4 figs

  2. A human error taxonomy and its application to an automatic method accident analysis

    International Nuclear Information System (INIS)

    Matthews, R.H.; Winter, P.W.

    1983-01-01

    Commentary is provided on the quantification aspects of human factors analysis in risk assessment. Methods for quantifying human error in a plant environment are discussed and their application to system quantification explored. Such a programme entails consideration of the data base and a taxonomy of factors contributing to human error. A multi-levelled approach to system quantification is proposed, each level being treated differently drawing on the advantages of different techniques within the fault/event tree framework. Management, as controller of organization, planning and procedure, is assigned a dominant role. (author)

  3. A Method and Support Tool for the Analysis of Human Error Hazards in Digital Devices

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kim, Seon Soo; Lee, Yong Hee

    2012-01-01

In recent years, many nuclear power plants have adopted modern digital I and C technologies, which are expected to significantly improve both the economic efficiency and the safety of the plants. However, the introduction of an advanced main control room (MCR) is accompanied by many changes in form and features arising from the new digital devices. The many user-friendly displays and new features of digital devices are not by themselves enough to prevent human errors in nuclear power plants (NPPs). It may be an urgent matter to find the human error potentials due to digital devices, and their detailed mechanisms, so that they can be considered during the design of digital devices and their interfaces. The characteristics of digital technologies and devices offer many opportunities for interface management, and they can be integrated into a compact single workstation in an advanced MCR such that workers can operate the plant with minimum burden under any operating condition. However, these devices may introduce new types of human errors, and thus we need a means to evaluate and prevent such errors, especially within digital devices for NPPs. This research suggests a new method named HEA-BIS (Human Error Analysis based on Interaction Segment) to confirm and detect human errors associated with digital devices. The method can be facilitated by support tools when used to ensure safety in applying digital devices in NPPs

  4. Working group of experts on rare events in human error analysis and quantification

    International Nuclear Information System (INIS)

    Goodstein, L.P.

    1977-01-01

    In dealing with the reference problem of rare events in nuclear power plants, the group has concerned itself with the man-machine system and, in particular, with human error analysis and quantification. The Group was requested to review methods of human reliability prediction, to evaluate the extent to which such analyses can be formalized and to establish criteria to be met by task conditions and system design which would permit a systematic, formal analysis. Recommendations are given on the Fessenheim safety system

  5. A Human Error Analysis with Physiological Signals during Utilizing Digital Devices

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Oh, Yeon Ju; Shin, Kwang Hyeon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

The introduction of an advanced MCR is accompanied by many changes in form and features by virtue of new digital technologies. There are various kinds of digital devices, such as flat panel displays, touch screens, and so on. The characteristics of these digital devices offer many opportunities for interface management, and they can be integrated into a compact single workstation in an advanced MCR so that workers can operate the plant with minimum burden under any operating condition. However, these devices may introduce new types of human errors, and thus we need a means to evaluate and prevent such errors, especially those related to the digital devices. Human errors have been retrospectively assessed in accident reviews and quantitatively evaluated through HRA for PSA. However, ergonomic verification and validation is an important process for defending against all human error potential in the NPP design. HRA is a crucial part of a PSA, and helps in preparing countermeasures for design by identifying potential human error items that affect the overall safety of NPPs. Various HRA techniques are available; however, they reveal shortcomings for HMI design in the digital era. - HRA techniques depend on PSFs: this means that the scope dealing with human factors is limited in advance, and thus all attributes of new digital devices may not be considered in HRA. - The data used in HRA are not close to the evaluation items, so human error analysis is not easy to apply to design through individual experiments and cases. - The results of HRA are not statistically meaningful, because accidents involving human errors in NPPs are rare and have been estimated as having an extremely low probability

  6. A Human Reliability Analysis of Post- Accident Human Errors in the Low Power and Shutdown PSA of KSNP

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Daeil; Kim, J. H.; Jang, S. C

    2007-03-15

Korea Atomic Energy Research Institute, using the ANS low power and shutdown (LPSD) probabilistic risk assessment (PRA) Standard, evaluated the LPSD PSA model of the KSNP, Yonggwang Units 5 and 6, and identified the items to be improved. The evaluation of the human reliability analysis (HRA) of the post-accident human errors in the LPSD PSA model for the KSNP showed that 10 of the 19 relevant supporting requirements in the ANS PRA Standard were identified as items to be improved. Thus, we newly carried out an HRA of the post-accident human errors in the LPSD PSA model for the KSNP. The following are the improvements over the previous HRA of post-accident human errors in the LPSD PSA model for the KSNP: interviews with operators and a site visit for the interpretation of the procedures, the modeling of operator actions, and the quantification of human errors; application of a limiting value to the combined post-accident human errors; and documentation of all the inputs and bases for the detailed quantifications and the dependency analysis using the quantification sheets. The assessment of the new HRA results for post-accident human errors using the ANS LPSD PRA Standard shows that above 80% of its supporting requirements for post-accident human errors were graded as Category II. The number of human errors re-estimated using the LPSD Korea Standard HRA method is 385; among them, the number of individual post-accident human errors is 253 and the number of dependent post-accident human errors is 135. The quantification results of the LPSD PSA model for the KSNP with the new HEPs show that the core damage frequency (CDF) is increased by 5.1% compared with the previous baseline CDF. It is expected that the results of this study will be greatly helpful in improving the PSA quality of the domestic nuclear power plants, because they have sufficient PSA quality to meet Category II of the Supporting Requirements for the post

  7. A Human Reliability Analysis of Post- Accident Human Errors in the Low Power and Shutdown PSA of KSNP

    International Nuclear Information System (INIS)

    Kang, Daeil; Kim, J. H.; Jang, S. C.

    2007-03-01

Korea Atomic Energy Research Institute, using the ANS low power and shutdown (LPSD) probabilistic risk assessment (PRA) Standard, evaluated the LPSD PSA model of the KSNP, Yonggwang Units 5 and 6, and identified the items to be improved. The evaluation of the human reliability analysis (HRA) of the post-accident human errors in the LPSD PSA model for the KSNP showed that 10 of the 19 relevant supporting requirements in the ANS PRA Standard were identified as items to be improved. Thus, we newly carried out an HRA of the post-accident human errors in the LPSD PSA model for the KSNP. The following are the improvements over the previous HRA of post-accident human errors in the LPSD PSA model for the KSNP: interviews with operators and a site visit for the interpretation of the procedures, the modeling of operator actions, and the quantification of human errors; application of a limiting value to the combined post-accident human errors; and documentation of all the inputs and bases for the detailed quantifications and the dependency analysis using the quantification sheets. The assessment of the new HRA results for post-accident human errors using the ANS LPSD PRA Standard shows that above 80% of its supporting requirements for post-accident human errors were graded as Category II. The number of human errors re-estimated using the LPSD Korea Standard HRA method is 385; among them, the number of individual post-accident human errors is 253 and the number of dependent post-accident human errors is 135. The quantification results of the LPSD PSA model for the KSNP with the new HEPs show that the core damage frequency (CDF) is increased by 5.1% compared with the previous baseline CDF. It is expected that the results of this study will be greatly helpful in improving the PSA quality of the domestic nuclear power plants, because they have sufficient PSA quality to meet Category II of the Supporting Requirements for the post

  8. Systematic Analysis of Video Data from Different Human-Robot Interaction Studies: A Categorisation of Social Signals During Error Situations

    OpenAIRE

Manuel Giuliani; Nicole Mirnig; Gerald Stollnberger; Susanne Stadler; Roland Buchner; Manfred Tscheligi

    2015-01-01

Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. For that, we analyzed 201 videos of five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows tha...

  9. Human errors in NPP operations

    International Nuclear Information System (INIS)

    Sheng Jufang

    1993-01-01

Based on the operational experience of nuclear power plants (NPPs), the importance of studying human performance problems is described. Statistical analysis of the significance and frequency of the root causes and error modes in a large number of human-error-related events demonstrates that defects in operation and maintenance procedures, workplace factors, communication, and training practices are the primary root causes, while omission, transposition, and quantitative mistakes are the most frequent error modes. Recommendations for domestic research on human performance problems in NPPs are suggested

  10. Analysis of Human Errors in Japanese Nuclear Power Plants using JHPES/JAESS

    International Nuclear Information System (INIS)

    Kojima, Mitsuhiro; Mimura, Masahiro; Yamaguchi, Osamu

    1998-01-01

CRIEPI (Central Research Institute of Electric Power Industry) / HFC (Human Factors Research Center) developed J-HPES (the Japanese version of the Human Performance Enhancement System), based on the HPES originally developed by INPO, to analyze events resulting from human errors. J-HPES was systematized into a computer program named JAESS (J-HPES Analysis and Evaluation Support System), and both systems were distributed to all Japanese electric power companies so that they could analyze events themselves. CRIEPI / HFC also used J-HPES / JAESS to analyze the incidents in Japanese nuclear power plants (NPPs) that were officially reported and identified as human-error related. These incidents have numbered 188 cases over the last 30 years. An outline of this analysis is given, and some preliminary findings are shown. (authors)

  11. A human error analysis methodology, AGAPE-ET, for emergency tasks in nuclear power plants and its application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

This report presents a procedurised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), for both the qualitative error analysis and the quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. AGAPE-ET is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified, considering the characteristics of the performance of each cognitive function and the influencing mechanism of PIFs on the cognitive function. Error analysis items have then been determined from the identified error causes or error-likely situations to cue or guide the analyst through the overall human error analysis. A human error analysis procedure based on the error analysis items is organised. The basic scheme for the quantification of HEP consists of multiplying the BHEP assigned to the error analysis item by the weight obtained from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by the structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analyst need only address the relevant cognitive functions. The report also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results. 42 refs., 7 figs., 36 tabs. (Author)
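The multiplicative quantification scheme described in the abstract can be sketched as follows. The BHEP value and IFDT weight below are invented for illustration; the actual values come from AGAPE-ET's tables and decision trees.

```python
def quantify_hep(bhep, ifdt_weight, cap=1.0):
    """Combine a basic HEP with an IFDT weight, capped at probability 1.

    bhep: basic human error probability assigned to the error analysis item.
    ifdt_weight: multiplier read from the influencing factors decision tree
    of the cognitive function involved (hypothetical value here).
    """
    return min(bhep * ifdt_weight, cap)

# Hypothetical example: a diagnosis-related error analysis item with
# BHEP = 1e-3, under unfavourable influencing factors (weight 5).
hep = quantify_hep(1e-3, 5.0)
# The cap keeps the result a valid probability even for extreme weights.
```

The cap is a sanity bound rather than part of the published scheme; without it, a large weight applied to a large BHEP could exceed 1.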

  12. Human error analysis project (HEAP) - The fourth pilot study: verbal data for analysis of operator performance

    International Nuclear Information System (INIS)

    Braarud, Per Oeyvind; Droeyvoldsmo, Asgeir; Hollnagel, Erik

    1997-06-01

This report is the second report from Pilot Study No. 4 within the Human Error Analysis Project (HEAP). The overall objective of HEAP is to provide a better understanding and explicit modelling of how and why ''cognitive errors'' occur. This study investigated the contributions of different verbal data sources to the analysis of control room operator performance. The operator's concurrent verbal report, the retrospective verbal report, and a process expert's comments were compared for their contribution to an operator performance measure. The study examined verbal protocols both for single operators and for teams. The main findings were that all three verbal data sources could be used to study performance. There was a relatively high overlap between the data sources, but also a unique contribution from each source, and there was a common pattern in the types of operator activities the data sources gave information about. Overall, the operator's concurrent protocol contained slightly more information on the operator's activities than the other two verbal sources. The study also showed that concurrent verbal protocols are feasible and useful for the analysis of a team's activities during a scenario. (author)

  13. Information Management System Development for the Characterization and Analysis of Human Error in Naval Aviation Maintenance Related Mishaps

    National Research Council Canada - National Science Library

    Wood, Brian

    2000-01-01

    .... The Human Factors Analysis and Classification System-Maintenance Extension taxonomy, an effective framework for classifying and analyzing the presence of maintenance errors that lead to mishaps...

  14. Application of human error theory in case analysis of wrong procedures.

    Science.gov (United States)

    Duthie, Elizabeth A

    2010-06-01

    The goal of this study was to contribute to the emerging body of literature about the role of human behaviors and cognitive processes in the commission of wrong procedures. Case analysis of 5 wrong procedures in operative and nonoperative settings using James Reason's human error theory was performed. The case analysis showed that cognitive underspecification, cognitive flips, automode processing, and skill-based errors were contributory to wrong procedures. Wrong-site procedures accounted for the preponderance of the cases. Front-line supervisory staff used corrective actions that focused on the performance of the individual without taking into account cognitive factors. System fixes using human cognition concepts have a greater chance of achieving sustainable safety outcomes than those that are based on the traditional approach of counseling, education, and disciplinary action for staff.

  15. Systematic analysis of dependent human errors from the maintenance history at Finnish NPPs - A status report

    Energy Technology Data Exchange (ETDEWEB)

    Laakso, K. [VTT Industrial Systems (Finland)

    2002-12-01

    Operating experience has shown missed-detection events, where faults have passed inspections and functional tests and persisted into the operating periods that followed outage maintenance activities. The causes of these failures have often been complex event sequences involving human and organisational factors. Common cause and other dependent failures of safety systems in particular may contribute significantly to the reactor core damage risk. The topic has been addressed in the Finnish studies of human common cause failures, in which experiences of latent human errors have been searched for and analysed in detail from the maintenance history. A review of the bulk of the analysis results from the Olkiluoto and Loviisa plant sites shows that instrumentation and control and electrical equipment are more prone to failure events caused by human error than other maintenance areas, and that plant modifications and also predetermined preventive maintenance are significant sources of common cause failures. Most errors stem from the refuelling and maintenance outage period at both sites, and less than half of the dependent errors were identified during the same outage. The dependent human errors originating from modifications could be reduced by a more tailored specification and coverage of their start-up testing programmes. Improvements could also be achieved by more case-specific planning of the installation inspection and functional testing of complicated maintenance work, or of work objects of higher importance to plant safety and availability. Better use and analysis of condition monitoring information for maintenance steering could also help. Feedback from discussions of the analysis results with plant experts and professionals remains crucial in developing the final conclusions and recommendations that meet the specific development needs at the plants. (au)

  16. A Human Error Analysis Procedure for Identifying Potential Error Modes and Influencing Factors for Test and Maintenance Activities

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun

    2010-01-01

    Periodic or non-periodic test and maintenance (T and M) activities in large, complex systems such as nuclear power plants (NPPs) are essential for sustaining stable and safe operation. On the other hand, it has also been pointed out that erroneous human actions during T and M activities can incur unplanned reactor trips (RTs) or power derating, make safety-related systems unavailable, or degrade the reliability of components. The contribution of human errors during normal and abnormal activities of NPPs to unplanned RTs is known to be about 20% of the total events. This paper introduces a procedure for predictively analysing human error potential when maintenance personnel perform T and M tasks based on a work procedure or their work plan. The procedure helps the plant maintenance team prepare for plausible human errors. It focuses on the recurrent error forms (or modes) in execution-based errors, such as wrong object, omission, too little, and wrong action

  17. An Analysis and Quantification Method of Human Errors of Soft Controls in Advanced MCRs

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jae Whan; Jang, Seung Cheol

    2011-01-01

    In this work, a method was proposed for quantifying human errors that may occur during operation executions using soft controls. The soft controls of advanced main control rooms (MCRs) have totally different features from conventional controls, and thus they may have different human error modes and occurrence probabilities. It is important to define the human error modes and to quantify the error probabilities in order to evaluate the reliability of the system and prevent errors. This work suggests a modified K-HRA method for quantifying the error probability

  18. Trend analysis of human error events and assessment of their proactive prevention measure at Rokkasho reprocessing plant

    International Nuclear Information System (INIS)

    Yamazaki, Satoru; Tanaka, Izumi; Wakabayashi, Toshio

    2012-01-01

    A trend analysis of human error events is important for preventing their recurrence. We propose a new method for identifying common characteristics from the results of trend analysis, such as latent weaknesses of the organization, and a management process for strategic error prevention. In this paper, we describe a trend analysis method for the human error events that have accumulated in the organization, and the utilization of its results to prevent accidents proactively. Although the systematic analysis of human error events, the monitoring of their overall trend, and the utilization of the analysed results have been examined for plant operation, such information has never been fully utilized. Sharing information on human error events and analysing their causes leads to the clarification of problems in management and human factors. The new method was applied to the human error events that occurred in the Rokkasho reprocessing plant from October 2010. Results revealed that the output of the method is effective in judging the error prevention plan, and that the number of human error events was reduced to about 50% of that observed in 2009 and 2010. (author)

  19. Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions

    Science.gov (United States)

    2018-03-20

    within report documents. The information presented was obtained through a request to use the U.S. Army Combat Readiness Center's Risk Management ... controlled flight into terrain (13 accidents), fueling errors by improper techniques (7 accidents), and a variety of maintenance errors (10 accidents). The ... and 9 of the 10 maintenance accidents. (Table 4 of the report tabulates frequencies based on the source of human error.)

  20. Detailed semantic analyses of human error incidents occurring at nuclear power plants. Extraction of periodical transition of error occurrence patterns by applying multivariate analysis

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Suzuki, Kunihiko; Takano, Kenichi; Kojima, Mitsuhiro

    2000-01-01

    Analysing and evaluating human error incidents with an emphasis on human factors is essential for preventing their recurrence. Detailed and structured analyses of all incidents reported at domestic nuclear power plants (NPPs) during the last 31 years have been conducted based on J-HPES, in which a total of 193 human error cases were identified. The results obtained from the analyses have been stored in the J-HPES database. In a previous study, applying multivariate analysis to the above case studies suggested that several occurrence patterns of how errors occur at NPPs could be identified. It was also clarified that the causes related to each human error differ depending on the period of occurrence. This paper describes the results obtained with respect to the periodical transition of human error occurrence patterns. Applying multivariate analysis to the above data suggested two types of occurrence patterns for each human error type: the first consists of common occurrence patterns that do not depend on the period, and the second of patterns influenced by periodical characteristics. (author)

  1. ATHEANA: A Technique for Human Error Analysis: An Overview of Its Methodological Basis

    International Nuclear Information System (INIS)

    Wreathall, John; Ramey-Smith, Ann

    1998-01-01

    The U.S. NRC has developed a new human reliability analysis (HRA) method, called A Technique for Human Event Analysis (ATHEANA), to provide a way of modeling the so-called 'errors of commission' - that is, situations in which operators terminate or disable engineered safety features (ESFs) or similar equipment during accident conditions, thereby putting the plant at an increased risk of core damage. In its reviews of operational events, NRC has found that these errors of commission occur with a relatively high frequency (as high as 2 or 3 per year), but are noticeably missing from the scope of most current probabilistic risk assessments (PRAs). This new method was developed through a formalized approach that describes what can occur when operators behave rationally but have inadequate knowledge or poor judgement. In particular, the method is based on models of decision-making and response planning that have been used extensively in the aviation field, and on the analysis of major accidents in both the nuclear and non-nuclear fields. Other papers at this conference present summaries of these event analyses in both the nuclear and non-nuclear fields. This paper presents an overview of ATHEANA and summarizes how the method structures the analysis of operationally significant events, and helps HRA analysts identify and model potentially risk-significant errors of commission in plant PRAs. (authors)

  2. A trend analysis of human error events for proactive prevention of accidents. Methodology development and effective utilization

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Ebisu, Mitsuhiro; Aikawa, Takeshi; Matsubara, Katsuyuki

    2006-01-01

    This paper describes methods for analysing the human error events that have accumulated in an individual plant and for utilizing the results to prevent accidents proactively. Firstly, a categorization framework for the trigger actions and causal factors of human error events was reexamined, and the procedure for analysing human error events was reviewed based on the framework. Secondly, a method for identifying the common characteristics of the trigger action data and of the causal factor data accumulated by analysing human error events was clarified. In addition, to utilize the results of trend analysis effectively, methods to develop teaching material for safety education, to develop checkpoints for error prevention, and to introduce an error management process for strategic error prevention were proposed. (author)

  3. Human reliability data, human error and accident models--illustration through the Three Mile Island accident analysis

    International Nuclear Information System (INIS)

    Le Bot, Pierre

    2004-01-01

    Our first objective is to provide a panorama of Human Reliability data used in EDF's Safety Probabilistic Studies, and then, since these concepts are at the heart of Human Reliability and its methods, to go over the notion of human error and the understanding of accidents. We are not sure today that it is actually possible to provide in this field a foolproof and productive theoretical framework. Consequently, the aim of this article is to suggest potential paths of action and to provide information on EDF's progress along those paths which enables us to produce the most potentially useful Human Reliability analyses while taking into account current knowledge in Human Sciences. The second part of this article illustrates our point of view as EDF researchers through the analysis of the most famous civil nuclear accident, the Three Mile Island unit accident in 1979. Analysis of this accident allowed us to validate our positions regarding the need to move, in the case of an accident, from the concept of human error to that of systemic failure in the operation of systems such as a nuclear power plant. These concepts rely heavily on the notion of distributed cognition and we will explain how we applied it. These concepts were implemented in the MERMOS Human Reliability Probabilistic Assessment methods used in the latest EDF Probabilistic Human Reliability Assessment. Besides the fact that it is not very productive to focus exclusively on individual psychological error, the design of the MERMOS method and its implementation have confirmed two things: the significance of qualitative data collection for Human Reliability, and the central role held by Human Reliability experts in building knowledge about emergency operation, which in effect consists of Human Reliability data collection. The latest conclusion derived from the implementation of MERMOS is that, considering the difficulty in building 'generic' Human Reliability data in the field we are involved in, the best

  4. Human Error Analysis in a Permit to Work System: A Case Study in a Chemical Plant

    Directory of Open Access Journals (Sweden)

    Mehdi Jahangiri

    2016-03-01

    Conclusion: The SPAR-H method applied in this study could analyze and quantify the potential human errors and extract the required measures for reducing the error probabilities in PTW system. Some suggestions to reduce the likelihood of errors, especially in the field of modifying the performance shaping factors and dependencies among tasks are provided.
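
    The SPAR-H quantification applied in studies like the one above can be sketched as follows. The base formula and the correction for three or more negative PSFs follow the published SPAR-H method; the nominal HEP and multiplier values below are illustrative assumptions, not figures from this study.

```python
# Minimal sketch of SPAR-H quantification: nominal HEP adjusted by
# performance shaping factor (PSF) multipliers. Numbers are illustrative.
def spar_h_hep(nominal_hep, psf_multipliers):
    """Adjust a nominal HEP by PSF multipliers per the SPAR-H method."""
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    # Count negative (degrading, >1) PSFs.
    negative = sum(1 for m in psf_multipliers if m > 1.0)
    if negative >= 3:
        # SPAR-H correction keeps the adjusted HEP below 1.0 when several
        # negative PSFs would otherwise push it past unity.
        return (nominal_hep * composite) / (nominal_hep * (composite - 1.0) + 1.0)
    return min(nominal_hep * composite, 1.0)

# Action-type task (nominal HEP 0.001) with two negative PSFs:
print(spar_h_hep(0.001, [10.0, 2.0]))
```

    With fewer than three negative PSFs the adjustment is a plain product, capped at 1.0.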

  5. A basic framework for the analysis of the human error potential due to the computerization in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Y. H.

    1999-01-01

    Computerization and the vivid benefits expected of it in nuclear power plant design cannot be realized without verifying the inherent safety problems, and the human error aspect is among the verification issues. The verification spans from the perception of changes in operating functions, such as automation, to the unfamiliar experience of operators due to interface changes. A new framework for human error analysis should therefore capture both the positive and the negative effects of computerization. This paper suggests a basic framework for error identification through a review of existing human error studies and the experience of computerization in nuclear power plants

  6. Results of a nuclear power plant Application of a new technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Forester, J.A.; Whitehead, D.W.; Kolaczkowski, A.M.; Thompson, C.M.

    1997-01-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing contexts (EFCs) that can make unsafe actions (UAs) more likely, and identify ways to improve the method and its documentation. A set of criteria to evaluate the "success" of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed that consisted of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant. Personnel from the plant included two individuals from its PRA staff and two individuals from its training staff. Both individuals from training are currently licensed operators, and one of them was a senior reactor operator "on shift" until a few months before the demonstration. The demonstration was conducted over a 5-month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs, are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project

  7. The common mode failures analysis of the redundant system with dependent human error

    International Nuclear Information System (INIS)

    Kim, M.K.; Chang, S.H.

    1983-01-01

    Common mode failures (CMFs) have been a serious concern in nuclear power plants. There is a broad category of failure mechanisms that can cause common mode failures. This paper is a theoretical investigation of the effect of CMFs on the unavailability of a redundant system. It is assumed that the total CMFs consist of the potential CMFs and the dependent human error CMFs. The higher the human error dependency, the more the total CMFs are affected by the dependent human error. If the human error dependence is low, the system unavailability depends strongly on the potential CMFs rather than on mechanical failure or the dependent human error. It is also shown that the total CMFs are the dominant factor in the unavailability of the redundant system. (Author)
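
    A hedged numerical sketch of the kind of model discussed above follows: the unavailability of a two-train redundant system where the total CMF contribution combines potential CMFs with dependent human error CMFs. The decomposition and all numbers are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative two-train redundant system model: the system is unavailable
# if both trains fail independently, or if a common mode failure occurs.
# The split of total CMFs into potential and dependent-human-error parts
# mirrors the paper's assumption; values below are made up.
def system_unavailability(q_train, q_cmf_potential, q_human, dependence):
    """`dependence` in [0, 1] is the fraction of the human error
    probability that acts as a common cause across both trains."""
    q_cmf_human = dependence * q_human                  # dependent human error CMF
    q_single = q_train + (1.0 - dependence) * q_human   # per-train failure prob.
    return q_single ** 2 + q_cmf_potential + q_cmf_human

# Higher dependence makes the human error CMF term dominate unavailability:
low = system_unavailability(1e-3, 1e-5, 1e-3, 0.1)
high = system_unavailability(1e-3, 1e-5, 1e-3, 0.9)
print(low, high)
```

    The comparison illustrates the abstract's point: with high dependence, the single-train CMF term swamps the squared independent-failure term, so redundancy buys little.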

  8. Game Design Principles based on Human Error

    Directory of Open Access Journals (Sweden)

    Guilherme Zaffari

    2016-03-01

    This paper presents the results of the authors' research on incorporating Human Error, through design principles, into video game design. In general, designers must consider Human Error factors throughout video game interface development; however, for the core design, adaptations are needed, since challenge is an important factor for fun, and from the perspective of Human Error a challenge can be considered a flaw in the system. The research used Human Error classifications, data triangulation via predictive human error analysis, and expanded flow theory to design a set of principles that match the design of playful challenges with the principles of Human Error. From the results, it was possible to conclude that the application of Human Error in game design has a positive effect on player experience, allowing the player to interact only with errors associated with the intended aesthetics of the game.

  9. Review of advances in human reliability analysis of errors of commission, Part 1: EOC identification

    International Nuclear Information System (INIS)

    Reer, Bernhard

    2008-01-01

    In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 1 is presented in this article. Emerging HRA methods addressing the problem of EOC identification are: A Technique for Human Event Analysis (ATHEANA), the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the Misdiagnosis Tree Analysis (MDTA) method, and the Commission Errors Search and Assessment (CESA) method. Most of the EOCs referred to in predictive studies comprise the stop of running or the inhibition of anticipated functions; a few comprise the start of a function. The CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios and uses procedures and importance measures as key sources of input information, provides a formalized way of identifying relatively important scenarios with EOC opportunities. In its implementation, however, attention should be paid to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions

  10. Multidisciplinary framework for human reliability analysis with an application to errors of commission and dependencies

    International Nuclear Information System (INIS)

    Barriere, M.T.; Luckas, W.J.; Wreathall, J.; Cooper, S.E.; Bley, D.C.; Ramey-Smith, A.

    1995-08-01

    Since the early 1970s, human reliability analysis (HRA) has been considered to be an integral part of probabilistic risk assessments (PRAs). Nuclear power plant (NPP) events, from Three Mile Island through the mid-1980s, showed the importance of human performance to NPP risk. Recent events demonstrate that human performance continues to be a dominant source of risk. In light of these observations, the current limitations of existing HRA approaches become apparent when the role of humans is examined explicitly in the context of real NPP events. The development of new or improved HRA methodologies to more realistically represent human performance is recognized by the Nuclear Regulatory Commission (NRC) as a necessary means to increase the utility of PRAs. To accomplish this objective, an Improved HRA Project, sponsored by the NRC's Office of Nuclear Regulatory Research (RES), was initiated in late February, 1992, at Brookhaven National Laboratory (BNL) to develop an improved method for HRA that more realistically assesses the human contribution to plant risk and can be fully integrated with PRA. This report describes the research efforts including the development of a multidisciplinary HRA framework, the characterization and representation of errors of commission, and an approach for addressing human dependencies. The implications of the research and necessary requirements for further development also are discussed

  11. Multidisciplinary framework for human reliability analysis with an application to errors of commission and dependencies

    Energy Technology Data Exchange (ETDEWEB)

    Barriere, M.T.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States); Cooper, S.E. [Science Applications International Corp., Reston, VA (United States); Bley, D.C. [PLG, Inc., Newport Beach, CA (United States); Ramey-Smith, A. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-08-01

    Since the early 1970s, human reliability analysis (HRA) has been considered to be an integral part of probabilistic risk assessments (PRAs). Nuclear power plant (NPP) events, from Three Mile Island through the mid-1980s, showed the importance of human performance to NPP risk. Recent events demonstrate that human performance continues to be a dominant source of risk. In light of these observations, the current limitations of existing HRA approaches become apparent when the role of humans is examined explicitly in the context of real NPP events. The development of new or improved HRA methodologies to more realistically represent human performance is recognized by the Nuclear Regulatory Commission (NRC) as a necessary means to increase the utility of PRAs. To accomplish this objective, an Improved HRA Project, sponsored by the NRC's Office of Nuclear Regulatory Research (RES), was initiated in late February, 1992, at Brookhaven National Laboratory (BNL) to develop an improved method for HRA that more realistically assesses the human contribution to plant risk and can be fully integrated with PRA. This report describes the research efforts including the development of a multidisciplinary HRA framework, the characterization and representation of errors of commission, and an approach for addressing human dependencies. The implications of the research and necessary requirements for further development also are discussed.

  12. Web-Based Information Management System for the Investigation, Reporting, and Analysis of Human Error in Naval Aviation Maintenance

    National Research Council Canada - National Science Library

    Boex, Anthony

    2001-01-01

    .... The Human Factors Analysis and Classification System-Maintenance Extension (HFACS-ME) taxonomy, a framework for classifying and analyzing the presence of maintenance errors that lead to mishaps, is the foundation of this tool...

  13. Human reliability analysis of errors of commission: a review of methods and applications

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2007-06-15

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addressed both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). As a result of a thorough investigation of various PSA/HRA applications, this paper furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because it provides a formalized way of identifying relatively important scenarios with EOC opportunities; (2) an EOC identification guidance like CESA, which is strongly based on procedural guidance and importance measures of systems or components affected by inappropriate actions, should however pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; (3) orientations of advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  14. Human reliability analysis of errors of commission: a review of methods and applications

    International Nuclear Information System (INIS)

    Reer, B.

    2007-06-01

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addressed both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). As a result of a thorough investigation of various PSA/HRA applications, this paper furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because it provides a formalized way of identifying relatively important scenarios with EOC opportunities; (2) an EOC identification guidance like CESA, which is strongly based on procedural guidance and importance measures of systems or components affected by inappropriate actions, should however pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; (3) orientations of advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  15. Review of advances in human reliability analysis of errors of commission-Part 2: EOC quantification

    International Nuclear Information System (INIS)

    Reer, Bernhard

    2008-01-01

    In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 2 is presented in this article. Emerging HRA methods in this field are: ATHEANA, MERMOS, the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the MDTA method and CREAM. The essential advanced features are on the conceptual side, especially to envisage the modeling of multiple contexts for an EOC to be quantified (ATHEANA, MERMOS and MDTA), in order to explicitly address adverse conditions. There is promising progress in providing systematic guidance to better account for cognitive demands and tendencies (GRS, CREAM), and EOC recovery (MDTA). Problematic issues are associated with the implementation of multiple context modeling and the assessment of context-specific error probabilities. Approaches for task or error opportunity scaling (CREAM, GRS) and the concept of reference cases (ATHEANA outlook) provide promising orientations for achieving progress towards data-based quantification. Further development work is needed and should be carried out in close connection with large-scale applications of existing approaches

  16. Human Error and General Aviation Accidents: A Comprehensive, Fine-Grained Analysis Using HFACS

    National Research Council Canada - National Science Library

    Wiegmann, Douglas; Faaborg, Troy; Boquet, Albert; Detwiler, Cristy; Holcomb, Kali; Shappell, Scott

    2005-01-01

    ... of both commercial and general aviation (GA) accidents. These analyses have helped to identify general trends in the types of human factors issues and aircrew errors that have contributed to civil aviation accidents...

  17. Cause analysis and preventives for human error events in Daya Bay NPP

    International Nuclear Information System (INIS)

    Huang Weigang; Zhang Li

    1998-01-01

    Daya Bay Nuclear Power Plant was put into commercial operation in 1994. By 1996, 368 human error events had occurred in the operating and maintenance areas, accounting for 39% of total events. These events occurred mainly during maintenance, testing, equipment isolation and returning systems on-line, in particular during refuelling and maintenance. The author analyses root causes of human error events, which are mainly operator omission or error; procedure deficiency; procedure not followed; lack of training; communication failures; and inadequate work management. The protective measures and treatment principles for human error events are also discussed, and several examples of their application are given. Finally, it is argued that the key to preventing human error events lies in coordination and management, in the person in charge of the work, and in good work habits of staff

  18. Human error data collection analysis program undertaken since 1982 by Electricite de France with INPO

    International Nuclear Information System (INIS)

    Ghertman, F.; Dietz, P.

    1985-01-01

    The concern with reducing the frequency and importance of events that degrade the availability, safety and security of nuclear power plants led Electricite de France, in cooperation with INPO (Institute of Nuclear Power Operations), to launch a Human Error Collection and Analysis Program. Owing to the difficulties encountered in developing such a program, it was decided to begin with a pilot data collection limited to a six-month period (October 1982 to April 1983) and three nuclear power plants (three US units and two French units). This pilot data collection followed four steps: (1) elaboration of the collection methodology; (2) sensitization and related training of the power plant personnel; (3) data collection in the power plants; and (4) analysis of the data and results. Each of these steps is discussed in the paper

  19. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  20. A qualitative description of human error

    International Nuclear Information System (INIS)

    Li Zhaohuan

    1992-11-01

    Human error contributes significantly to the risk of reactor operation. Insight and an analytical model are the main parts of human reliability analysis, comprising the concept of human error, its nature, its generation mechanism, its classification, and human performance influence factors. For an operating reactor, human error is defined as a task-human-machine mismatch. A human error event focuses on the erroneous action and the unfavourable result. With respect to the time available for performing a task, operations are divided into time-limited and time-open; the HCR (human cognitive reliability) model is suited only to time-limited operations. The basic cognitive process consists of information gathering, cognition/thinking, decision making and action, and an erroneous human action may be generated at any stage of this process. Natural ways to classify human errors are presented, and human performance influence factors, including personal, organizational and environmental factors, are also listed

  1. A qualitative description of human error

    Energy Technology Data Exchange (ETDEWEB)

    Zhaohuan, Li [Academia Sinica, Beijing, BJ (China). Inst. of Atomic Energy

    1992-11-01

    Human error contributes significantly to the risk of reactor operation. Insight and an analytical model are the main parts of human reliability analysis, comprising the concept of human error, its nature, its generation mechanism, its classification, and human performance influence factors. For an operating reactor, human error is defined as a task-human-machine mismatch. A human error event focuses on the erroneous action and the unfavourable result. With respect to the time available for performing a task, operations are divided into time-limited and time-open; the HCR (human cognitive reliability) model is suited only to time-limited operations. The basic cognitive process consists of information gathering, cognition/thinking, decision making and action, and an erroneous human action may be generated at any stage of this process. Natural ways to classify human errors are presented, and human performance influence factors, including personal, organizational and environmental factors, are also listed.

  2. Human error probability evaluation as part of reliability analysis of digital protection system of advanced pressurized water reactor - APR 1400

    International Nuclear Information System (INIS)

    Varde, P. V.; Lee, D. Y.; Han, J. B.

    2003-03-01

    A case study on human reliability analysis has been performed as part of the reliability analysis of the digital protection system of the APR 1400 reactor. The protection system automatically actuates the shutdown system of the reactor when demanded. However, the safety analysis takes credit for operator action as a diverse means of tripping the reactor in a, though low-probability, ATWS scenario. Based on the available information, two cases, viz., human error in tripping the reactor and calibration error for instrumentation in the protection system, have been analyzed. Wherever applicable, a parametric study has also been performed

  3. Practical Insights from Initial Studies Related to Human Error Analysis Project (HEAP)

    International Nuclear Information System (INIS)

    Follesoe, Knut; Kaarstad, Magnhild; Droeivoldsmo, Asgeir; Hollnagel, Erik; Kirwan, Barry

    1996-01-01

    This report presents practical insights made from an analysis of the three initial studies in the Human Error Analysis Project (HEAP), and the first study in the US NRC Staffing Project. These practical insights relate to our understanding of diagnosis in Nuclear Power Plant (NPP) emergency scenarios and, in particular, the factors that influence whether a diagnosis will succeed or fail. The insights reported here focus on three inter-related areas: (1) the diagnostic strategies and styles that have been observed in single operator and team-based studies; (2) the qualitative aspects of the key operator support systems, namely VDU interfaces, alarms, training and procedures, that have affected the outcome of diagnosis; and (3) the overall success rates of diagnosis and the error types that have been observed in the various studies. With respect to diagnosis, certain patterns have emerged from the various studies, depending on whether operators were alone or in teams, and on their familiarity with the process. Some aspects of the interface and alarm systems were found to contribute to diagnostic failures while others supported performance and recovery. Similar results were found for training and experience. Furthermore, the availability of procedures did not preclude the need for some diagnosis. With respect to HRA and PSA, it was possible to record the failure types seen in the studies, and in some cases to give crude estimates of the failure likelihood for certain scenarios. Although these insights are interim in nature, they do show the type of information that can be derived from these studies. More importantly, they clarify aspects of our understanding of diagnosis in NPP emergencies, including implications for risk assessment, operator support systems development, and for research into diagnosis in a broader range of fields than the nuclear power industry. (author)

  4. A theory of human error

    Science.gov (United States)

    Mcruer, D. T.; Clement, W. F.; Allen, R. W.

    1981-01-01

    Human errors tend to be treated in terms of clinical and anecdotal descriptions, from which remedial measures are difficult to derive. Correction of the sources of human error requires an attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A comprehensive analytical theory of the cause-effect relationships governing propagation of human error is indispensable to a reconstruction of the underlying and contributing causes. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation, maritime, automotive, and process control operations is highlighted. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.

  5. The role of human error in risk analysis: Application to pre- and post-maintenance procedures of process facilities

    International Nuclear Information System (INIS)

    Noroozi, Alireza; Khakzad, Nima; Khan, Faisal; MacKinnon, Scott; Abbassi, Rouzbeh

    2013-01-01

    Human factors play an important role in the safe operation of a facility. Human factors include the systematic application of information about human characteristics and behavior to increase the safety of a process system. A significant proportion of human errors occur during the maintenance phase. However, the quantification of human error probabilities in the maintenance phase has not been given the amount of attention it deserves. This paper focuses on a human factors analysis in pre- and post-pump maintenance operations. The procedures for removing process equipment from service (pre-maintenance) and returning the equipment to service (post-maintenance) are considered for possible failure scenarios. For each scenario, human error probability is calculated for each activity using the Success Likelihood Index Method (SLIM). Consequences are also assessed in this methodology. The risk assessment is conducted for each component and the overall risk is estimated by adding individual risks. The present study is aimed at highlighting the importance of considering human error in quantitative risk analyses. The developed methodology has been applied to a case study of an offshore process facility
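    The SLIM quantification step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the PSF names, ratings, weights, and the two calibration anchors are hypothetical values chosen only to show the mechanics of SLI = Σ wᵢrᵢ and log₁₀(HEP) = a·SLI + b.

```python
# Illustrative sketch of the Success Likelihood Index Method (SLIM).
# All ratings, weights, and anchor-task values below are hypothetical.
import math

def sli(ratings, weights):
    """Success Likelihood Index: weighted sum of PSF ratings (weights sum to 1)."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(ratings, weights))

def calibrate(sli1, hep1, sli2, hep2):
    """Solve log10(HEP) = a*SLI + b from two anchor tasks with known HEPs."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

def hep(sli_value, a, b):
    """Convert an SLI back to a human error probability."""
    return 10 ** (a * sli_value + b)

# Anchors: an easy task (SLI 9, HEP 1e-4) and a hard one (SLI 2, HEP 1e-1).
a, b = calibrate(9.0, 1e-4, 2.0, 1e-1)
# Task of interest: ratings for e.g. procedures, time pressure, training.
task_sli = sli(ratings=[6.0, 4.0, 7.0], weights=[0.5, 0.3, 0.2])
print(f"SLI = {task_sli:.2f}, HEP = {hep(task_sli, a, b):.2e}")
```

    In a full SLIM study the anchors come from tasks with known error rates and the weights from expert elicitation; the log-linear calibration is the part that turns judgment-based indices into probabilities.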

  6. Study on a new framework of Human Reliability Analysis to evaluate soft control execution error in advanced MCRs of NPPs

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Jung, Wondea; Seong, Poong Hyun

    2016-01-01

    Highlights: • The operating environment of MCRs in NPPs has changed by adopting new HSIs. • Operating actions in NPP advanced MCRs are performed by soft control. • A new HRA framework should be considered in the HRA for advanced MCRs. • An HRA framework for evaluating soft control execution human error is suggested. • The suggested method will be helpful for analyzing human reliability in advanced MCRs. - Abstract: Since the Three Mile Island (TMI)-2 accident, human error has been recognized as one of the main causes of Nuclear Power Plant (NPP) accidents, and numerous studies related to Human Reliability Analysis (HRA) have been carried out. Most of these methods were developed considering the conventional type of Main Control Rooms (MCRs). However, the operating environment of MCRs in NPPs has changed with the adoption of new Human-System Interfaces (HSIs) that are based on computer-based technologies. The MCRs that include these digital technologies, such as large display panels, computerized procedures, and soft controls, are called advanced MCRs. Among the many features of advanced MCRs, soft controls are a particularly important feature because operating actions in NPP advanced MCRs are performed by soft control. Due to the differences in interfaces between soft control and hardwired conventional type control, different Human Error Probabilities (HEPs) and a new HRA framework should be considered in the HRA for advanced MCRs. To this end, a new framework of an HRA method for evaluating soft control execution human error is suggested by performing a soft control task analysis and reviewing the literature regarding widely accepted human error taxonomies. Moreover, since most current HRA databases deal with operation in conventional MCRs and are not explicitly designed to deal with digital HSIs, an empirical analysis of human error and error recovery considering soft controls under an advanced MCR mockup was carried out to collect human error data, which is

  7. Systematic analysis of video data from different human-robot interaction studies: a categorization of social signals during error situations.

    Science.gov (United States)

    Giuliani, Manuel; Mirnig, Nicole; Stollnberger, Gerald; Stadler, Susanne; Buchner, Roland; Tscheligi, Manfred

    2015-01-01

    Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. For that, we analyzed 201 videos of five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction. Technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but they often smile, when in an error situation with the robot. Another result is that the participants sometimes stop moving at the beginning of error situations. We also found that the participants talked more in the case of social norm violations and less during technical failures. Finally, the participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking), when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human-robot interaction systems. The builders need to consider including modules for recognition and classification of head movements to the robot input channels. The evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.

  8. New method of classifying human errors at nuclear power plants and the analysis results of applying this method to maintenance errors at domestic plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi; Miyazaki, Takamasa; Gofuku, Akio; Iida, Hiroyasu

    2007-01-01

    Since many of the adverse events that have occurred in nuclear power plants in Japan and abroad have been related to maintenance or operation, it is necessary to plan preventive measures based on detailed analyses of human errors made by maintenance workers or operators. Therefore, before planning preventive measures, we developed a new method of analyzing human errors. Since each human error is an unsafe action caused by some misjudgement made by a person, we decided to classify them into six categories according to the stage in the judgment process in which the error was made. By further classifying each error into either an omission-type or commission-type, we produced 12 categories of errors. Then, we divided them into the two categories of basic error tendencies and individual error tendencies, and categorized background factors into four categories: imperfect planning; imperfect facilities or tools; imperfect environment; and imperfect instructions or communication. We thus defined the factors in each category to make it easy to identify factors that caused the error. Then, using this method, we studied the characteristics of human errors involving maintenance workers and planners, since many maintenance errors have occurred. Among the human errors made by workers (worker errors) during the implementation stage, the following three types were prevalent, together accounting for approximately 80%: commission-type 'projection errors', omission-type 'comprehension errors' and commission-type 'action errors'. The most common among the individual factors of worker errors was 'repetition or habit' (schema), based on the assumption of a typical situation, and half of the 'repetition or habit' cases (schema) were not influenced by any background factors. The most common background factor that contributed to the individual factor was 'imperfect work environment', followed by 'insufficient knowledge'. Approximately 80% of the individual factors were 'repetition or habit' or

  9. A model-based and computer-aided approach to analysis of human errors in nuclear power plants

    International Nuclear Information System (INIS)

    Yoon, Wan C.; Lee, Yong H.; Kim, Young S.

    1996-01-01

    Since the operator's mission in NPPs is increasingly defined by cognitive tasks such as monitoring, diagnosis and planning, the focus of human error analysis should also move from external actions to internal decision-making processes. While more elaborate analysis of cognitive aspects of human errors will help understand their causes and derive effective countermeasures, a lack of framework and an arbitrary resolution of description may hamper the effectiveness of such analysis. This paper presents new model-based schemes of event description and error classification as well as an interactive computerized support system. The schemes and the support system were produced in an effort to develop an improved version of HPES. The use of a decision-making model enables the analyst to document cognitive aspects of human performance explicitly and at a proper resolution. The stage-specific terms used in the proposed schemes make the task of characterizing human errors easier for field analysts and give them more confidence in the result. The support system was designed to help the analyst achieve a contextually well-integrated analysis throughout the different parts of HPES

  10. Human error mechanisms in complex work environments

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1988-01-01

    Human error taxonomies have been developed from analysis of industrial incident reports as well as from psychological experiments. In this paper the results of the two approaches are reviewed and compared. It is found, in both cases, that a fairly small number of basic psychological mechanisms will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations. The implications for system safety are briefly mentioned, together with the implications for system design. (author)

  11. Human error mechanisms in complex work environments

    International Nuclear Information System (INIS)

    Rasmussen, Jens (Danmarks Tekniske Hoejskole, Copenhagen)

    1988-01-01

    Human error taxonomies have been developed from analysis of industrial incident reports as well as from psychological experiments. In this paper the results of the two approaches are reviewed and compared. It is found, in both cases, that a fairly small number of basic psychological mechanisms will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations. The implications for system safety are briefly mentioned, together with the implications for system design. (author)

  12. Monte Carlo simulation of expert judgments on human errors in chemical analysis--a case study of ICP-MS.

    Science.gov (United States)

    Kuselman, Ilya; Pennecchi, Francesca; Epstein, Malka; Fajgelj, Ales; Ellison, Stephen L R

    2014-12-01

    Monte Carlo simulation of expert judgments on human errors in a chemical analysis was used for determination of distributions of the error quantification scores (scores of likelihood and severity, and scores of effectiveness of a laboratory quality system in prevention of the errors). The simulation was based on modeling of an expert behavior: confident, reasonably doubting and irresolute expert judgments were taken into account by means of different probability mass functions (pmfs). As a case study, 36 scenarios of human errors which may occur in elemental analysis of geological samples by ICP-MS were examined. Characteristics of the score distributions for three pmfs of an expert behavior were compared. Variability of the scores, as standard deviation of the simulated score values from the distribution mean, was used for assessment of the score robustness. A range of the score values, calculated directly from elicited data and simulated by a Monte Carlo method for different pmfs, was also discussed from the robustness point of view. It was shown that robustness of the scores, obtained in the case study, can be assessed as satisfactory for the quality risk management and improvement of a laboratory quality system against human errors. Copyright © 2014 Elsevier B.V. All rights reserved.
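    The simulation idea above can be sketched in a few lines. This is a toy reconstruction under stated assumptions: a 1-5 scoring scale and invented probability mass functions (pmfs) standing in for the paper's confident, reasonably doubting, and irresolute expert models; the standard deviation of the simulated scores plays the role of the robustness measure.

```python
# Minimal sketch of Monte Carlo simulation of expert scoring behaviour.
# The score scale and the pmfs below are hypothetical illustrations.
import random

random.seed(0)

SCORES = [1, 2, 3, 4, 5]            # e.g. severity scores for an error scenario
PMFS = {                            # expert behaviour models (each sums to 1)
    "confident":  [0.00, 0.05, 0.10, 0.70, 0.15],
    "doubting":   [0.05, 0.15, 0.40, 0.30, 0.10],
    "irresolute": [0.20, 0.20, 0.20, 0.20, 0.20],
}

def simulate(pmf, n=10_000):
    """Draw n scores from the pmf; return mean and standard deviation."""
    draws = random.choices(SCORES, weights=pmf, k=n)
    mean = sum(draws) / n
    var = sum((d - mean) ** 2 for d in draws) / n
    return mean, var ** 0.5         # std dev serves as a variability measure

for behaviour, pmf in PMFS.items():
    mean, std = simulate(pmf)
    print(f"{behaviour:10s} mean={mean:.2f} std={std:.2f}")
```

    The wider the pmf (the more irresolute the expert), the larger the simulated score spread, which is exactly the robustness question the study examines.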

  13. Human errors related to maintenance and modifications

    International Nuclear Information System (INIS)

    Laakso, K.; Pyy, P.; Reiman, L.

    1998-01-01

    The focus in human reliability analysis (HRA) relating to nuclear power plants has traditionally been on human performance in disturbance conditions. On the other hand, some studies and incidents have shown that maintenance errors, which have taken place earlier in plant history, may also have an impact on the severity of a disturbance, e.g. if they disable safety-related equipment. Especially common cause and other dependent failures of safety systems may significantly contribute to the core damage risk. The first aim of the study was to identify and give examples of multiple human errors which have penetrated the various error detection and inspection processes of plant safety barriers. Another objective was to generate numerical safety indicators to describe and forecast the effectiveness of maintenance. A more general objective was to identify needs for further development of maintenance quality and planning. In the first phase of this operational experience feedback analysis, human errors recognisable in connection with maintenance were looked for by reviewing about 4400 failure and repair reports and some special reports which cover two nuclear power plant units on the same site during 1992-94. A special effort was made to study dependent human errors since they are generally the most serious ones. An in-depth root cause analysis was made for 14 dependent errors by interviewing plant maintenance foremen and by thoroughly analysing the errors. A simpler treatment was given to maintenance-related single errors. The results were shown as a distribution of errors among operating states, inter alia as regards the following matters: in what operational state the errors were committed and detected; in what operational and working condition the errors were detected; and what component and error type they were related to. These results were presented separately for single and dependent maintenance-related errors. As regards dependent errors, observations were also made

  14. Human Error and Organizational Management

    Directory of Open Access Journals (Sweden)

    Alecxandrina DEACONU

    2009-01-01

    Full Text Available The concern for performance is a topic that raises interest in the business environment but also in other areas that, even if they seem distant from this world, are aware of, interested in, or conditioned by economic development. As individual performance is very much influenced by the human resource, we chose to analyze in this paper the mechanisms that generate, consciously or not, human error nowadays. Moreover, the extremely tense Romanian context, where failure is rather a rule than an exception, made us investigate the phenomenon of generating a human error and the ways to diminish its effects.

  15. Human Error Analysis Project (HEAP) - The Fourth Pilot Study: Scoring and Analysis of Raw Data Types

    International Nuclear Information System (INIS)

    Hollnagel, Erik; Braarud, Per Oeyvind; Droeivoldsmo, Asgeir; Follesoe, Knut; Helgar, Stein; Kaarstad, Magnhild

    1996-01-01

    Pilot study No. 4 rounded off the series of pilot studies by looking at the important issue of the quality of the various data sources. The preceding experiments had clearly shown that it was necessary to use both concurrent and interrupted verbal protocols, and also that information about eye movements was very valuable. The effort and resources needed to analyse a combination of the different data sources are, however, significant, and it was therefore important to find out whether one or more of the data sources could replace another. To settle this issue, pilot study No. 4 looked specifically at the quality of information provided by different data sources. The main hypotheses were that information about operators' diagnosis and decision making would be provided by verbal protocols, expert commentators, and auto-confrontation protocols, that the data sources would be valid, and that they would complement each other. The study used three main data sources: (1) concurrent verbal protocols, which were the operators' verbalisations during the experiment; (2) expert commentator reports, which were descriptions by process experts of the operators' performance; and (3) auto-confrontation, which were the operators' comments on their performance based on a replay of the performance recording minus the concurrent verbal protocol. Additional data sources were eye movement recordings, process data, alarms, etc. The three main data sources were treated as independent variables and applied according to an experimental design that facilitated the test of the main hypotheses. The pilot study produced altogether 59 verbal protocols, some of which were in Finnish. After translation into English, each protocol was analysed and scored according to a specific scheme. The scoring was designed to facilitate the evaluation of the experimental hypotheses. Due to the considerable work involved, the analysis process has only been partly completed, and no firm results

  16. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action
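    For orientation, the non-fuzzy starting point that the abstract builds on, CREAM's "basic method", can be sketched as follows. The nine Common Performance Conditions (CPCs) are each judged as improving, not significantly affecting, or reducing performance reliability, and the counts select a control mode with an associated action-failure-probability interval. The simple threshold rules below stand in for Hollnagel's two-dimensional control-mode diagram, and the example CPC assessments are invented; the paper's fuzzy and weighted extensions go beyond this sketch.

```python
# Simplified sketch of the CREAM basic method (crisp, unweighted CPCs).
# Threshold rules approximate the original 2-D control-mode diagram.

def control_mode(cpcs):
    """cpcs: nine assessments, each 'improved', 'not significant' or 'reduced'."""
    reduced = cpcs.count("reduced")
    improved = cpcs.count("improved")
    if reduced == 0 and improved > 3:
        return "strategic", (5e-6, 1e-2)      # failure probability interval
    if reduced <= 2:
        return "tactical", (1e-3, 1e-1)
    if reduced <= 5:
        return "opportunistic", (1e-2, 0.5)
    return "scrambled", (0.1, 1.0)

# Hypothetical assessment of the nine CPCs for one action:
cpcs = ["improved", "not significant", "reduced", "reduced",
        "not significant", "improved", "not significant",
        "not significant", "not significant"]
mode, (lo, hi) = control_mode(cpcs)
print(f"mode={mode}, failure probability in [{lo:g}, {hi:g}]")
```

    The fuzzy developments discussed in the abstract replace these crisp counts and fixed intervals with membership functions and CPC-specific weights.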

  17. The analysis of human error as causes in the maintenance of machines: a case study in mining companies

    Directory of Open Access Journals (Sweden)

    Kovacevic, Srdja

    2016-12-01

    Full Text Available This paper describes the two-step method used to analyse the factors and aspects influencing human error during the maintenance of mining machines. The first step is the cause-effect analysis, supported by brainstorming, where five factors and 21 aspects are identified. During the second step, the group fuzzy analytic hierarchy process is used to rank the identified factors and aspects. A case study is done on mining companies in Serbia. The key aspects are ranked according to an analysis that included experts who assess risks in mining companies (a maintenance engineer, a technologist, an ergonomist, a psychologist, and an organisational scientist. Failure to follow technical maintenance instructions, poor organisation of the training process, inadequate diagnostic equipment, and a lack of understanding of the work process are identified as the most important causes of human error.
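    The ranking step of the second phase can be illustrated with a crisp AHP sketch. Note the simplification: the paper uses a *group fuzzy* AHP, which aggregates fuzzy pairwise judgments from several experts; the code below shows only the underlying geometric-mean priority computation, with an invented Saaty-scale comparison matrix over three of the identified aspects.

```python
# Crisp-AHP sketch: rank human-error aspects by geometric-mean priorities.
# The comparison matrix values and aspect names are hypothetical.
import math

ASPECTS = ["instructions not followed", "poor training", "bad diagnostics"]
# Saaty-scale pairwise comparison matrix: A[i][j] = importance of i over j.
A = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

def priorities(matrix):
    """Row geometric means, normalised to a priority (weight) vector."""
    gmeans = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

for aspect, w in sorted(zip(ASPECTS, priorities(A)), key=lambda t: -t[1]):
    print(f"{aspect:28s} weight={w:.3f}")
```

    In the group fuzzy variant, each matrix entry becomes a fuzzy number combining several experts' judgments, and the defuzzified weights produce the final ranking of factors and aspects.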

  18. Analysis of Human Errors in Industrial Incidents and Accidents for Improvement of Work Safety

    DEFF Research Database (Denmark)

    Leplat, J.; Rasmussen, Jens

    1984-01-01

    Methods for the analysis of work accidents are discussed, and a description is given of the use of a causal situation analysis in terms of a 'variation tree' in order to explain the course of events of the individual cases and to identify possible improvements. The difficulties in identifying...... 'causes' of accidents are discussed, and it is proposed to analyze accident reports with the specific aim of identifying the potential for future improvements rather than causes of past events. In contrast to traditional statistical analysis of work accident data, which typically give very general...... recommendations, the method proposed identifies very explicit countermeasures. Improvements require a change in human decisions during equipment design, work planning, or the execution itself. The use of a model of human behavior drawing a distinction between automated skill-based behavior, rule-based 'know...

  19. Human decision error (HUMDEE) trees

    International Nuclear Information System (INIS)

    Ostrom, L.T.

    1993-01-01

    Graphical presentations of human actions in incident and accident sequences have been used for many years. However, for the most part, human decision making has been underrepresented in these trees. This paper presents a method of incorporating the human decision process into graphical presentations of incident/accident sequences, in the form of logic trees called Human Decision Error Trees, or HUMDEE for short. The primary benefit of HUMDEE trees is that they graphically illustrate what else the individuals involved in the event could have done to prevent either the initiation or continuation of the event. HUMDEE trees also present the alternate paths available at the operator decision points in the incident/accident sequence. This is different from the Technique for Human Error Rate Prediction (THERP) event trees. These trees have many uses: they can be used in incident/accident investigations to show what other courses of action were available, and for training operators. The trees also have a consequence component, so that not only the decision but also its consequence can be explored
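    The idea of a decision-point tree with alternate paths and consequences can be shown with a toy data structure. This is only in the spirit of HUMDEE, not the paper's notation, and the scenario content is invented: each node records a decision point, the actions available there, and the consequence (or further subtree) each action leads to.

```python
# Toy decision-point tree in the spirit of HUMDEE: alternate paths at each
# operator decision point, each ending in a consequence. Scenario invented.
from dataclasses import dataclass, field

@dataclass
class DecisionNode:
    decision_point: str
    branches: dict = field(default_factory=dict)  # action -> consequence or subtree

tree = DecisionNode(
    "alarm received: high tank level",
    {
        "acknowledge and investigate": DecisionNode(
            "level still rising?",
            {"close inlet valve": "event terminated",
             "wait for confirmation": "tank overflow"},
        ),
        "dismiss as spurious": "tank overflow",
    },
)

def enumerate_paths(node, prefix=()):
    """Yield every action sequence together with its final consequence."""
    for action, outcome in node.branches.items():
        path = prefix + (action,)
        if isinstance(outcome, DecisionNode):
            yield from enumerate_paths(outcome, path)
        else:
            yield path, outcome

for path, outcome in enumerate_paths(tree):
    print(" -> ".join(path), "=>", outcome)
```

    Enumerating the paths makes explicit what the abstract emphasises: which alternative decisions would have prevented the event, and what consequence each decision carries.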

  20. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  1. Development of an FAA-EUROCONTROL technique for the analysis of human error in ATM : final report.

    Science.gov (United States)

    2002-07-01

    Human error has been identified as a dominant risk factor in safety-oriented industries such as air traffic control (ATC). However, little is known about the factors leading to human errors in current air traffic management (ATM) systems. The first s...

  2. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1982-01-01

Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEPs) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes, collected using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history, or Operational Sequence Diagram, and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determines HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of such data to date.
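At its simplest, an HEP for a task element is the observed error count divided by the number of opportunities found in the manipulation history. The sketch below uses a made-up log schema (it is not the EPRI Performance Measurement System format):

```python
from collections import Counter

def human_error_probability(log):
    """Estimate an HEP per task element from a manipulation log.

    `log` is a list of (task_element, outcome) records, where outcome
    is 'ok', 'omission', or 'commission' (hypothetical schema).
    """
    attempts, errors = Counter(), Counter()
    for task, outcome in log:
        attempts[task] += 1
        if outcome in ("omission", "commission"):
            errors[task] += 1
    # HEP = observed errors / opportunities for each task element.
    return {t: errors[t] / attempts[t] for t in attempts}

# Invented example log: four attempts at one task, two at another.
log = [("open valve", "ok"), ("open valve", "omission"),
       ("open valve", "ok"), ("open valve", "ok"),
       ("reset trip", "commission"), ("reset trip", "ok")]
print(human_error_probability(log))
```

Real studies additionally need confidence bounds, since opportunity counts per task element in simulator data are often small.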

  3. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1981-01-01

Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEPs) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes, collected using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history, or Operational Sequence Diagram, and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determined HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of such data to date.

  4. Chernobyl - system accident or human error?

    International Nuclear Information System (INIS)

    Stang, E.

    1996-01-01

Did human error cause the Chernobyl disaster? The standard point of view is that operator error was the root cause of the disaster. This was also the view of the Soviet Accident Commission. The paper analyses the operator errors at Chernobyl in a system context. The reactor operators committed errors that depended upon many other failures, which together made up a complex accident scenario. The analysis is based on Charles Perrow's analysis of technological disasters. Failure possibility is an inherent property of high-risk industrial installations. The Chernobyl accident consisted of a chain of events that were both extremely improbable and difficult to predict. It is not reasonable to put the blame for the disaster on the operators. (author)

  5. Information Management System Development for the Investigation, Reporting, and Analysis of Human Error in Naval Aviation Maintenance

    National Research Council Canada - National Science Library

    Nelson, Douglas

    2001-01-01

    The purpose of this research is to evaluate and refine a safety information management system that will facilitate data collection, organization, query, analysis and reporting of maintenance errors...

  6. Human Errors and Bridge Management Systems

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, A. S.

    on basis of reliability profiles for bridges without human errors are extended to include bridges with human errors. The first rehabilitation distributions for bridges without and with human errors are combined into a joint first rehabilitation distribution. The methodology presented is illustrated...... for reinforced concrete bridges....

  7. Trend analysis and comparison of operators' human error events occurred at overseas and domestic nuclear power plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2006-01-01

Human errors by operators at overseas and domestic nuclear power plants during the period from 2002 to 2005 were compared and their trends analyzed. The most frequently cited cause of such errors was 'insufficient team monitoring' (inadequate superiors' and other crews' instructions and supervision) both at overseas and domestic plants, followed by 'insufficient self-checking' (lack of cautions by the operator himself). A comparison of the effects of the errors on the operations of plants in Japan and the United States showed that the drop in plant output and plant shutdowns at plants in Japan were approximately one-tenth of those in the United States. The ratio of automatic reactor trips to the total number of human errors reported is about 6% for both Japanese and American plants. Looking at changes in the incidence of human errors by years of occurrence, although a distinctive trend cannot be identified for domestic nuclear power plants due to insufficient reported cases, 'inadequate self-checking' as a factor contributing to human errors at overseas nuclear power plants has decreased significantly over the past four years. Regarding changes in the effects of human errors on the operations of plants during the four-year period, events leading to an automatic reactor trip have tended to increase at American plants. Conceivable factors behind this increasing tendency included lack of operating experience by a team (e.g., plant transients and reactor shutdowns and startups) and excessive dependence on training simulators. (author)

  8. Analysis of Task Types and Error Types of the Human Actions Involved in the Human-related Unplanned Reactor Trip Events

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun; Jung, Won Dea

    2008-02-01

This report provides the task types and error types involved in the unplanned reactor trip events that occurred during 1986 - 2006. The events that were caused by the secondary system of the nuclear power plants amount to 67 %, and the remaining 33 % were caused by the primary system. The contribution of the activities of the plant personnel was identified in the following order: corrective maintenance (25.7 %), planned maintenance (22.8 %), planned operation (19.8 %), periodic preventive maintenance (14.9 %), response to a transient (9.9 %), and design/manufacturing/installation (9.9 %). According to the analysis of error modes, the error modes such as control failure (22.2 %), wrong object (18.5 %), omission (14.8 %), wrong action (11.1 %), and inadequate (8.3 %) take up about 75 % of all the unplanned trip events. The analysis of the cognitive functions involved showed that the planning function makes the highest contribution to the human actions leading to unplanned reactor trips, followed by the observation function (23.4 %), the execution function (17.8 %), and the interpretation function (10.3 %). The results of this report are to be used as important bases for development of the error reduction measures or development of the error mode prediction system for the test and maintenance tasks in nuclear power plants

  9. Analysis of Task Types and Error Types of the Human Actions Involved in the Human-related Unplanned Reactor Trip Events

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Park, Jin Kyun; Jung, Won Dea

    2008-02-15

This report provides the task types and error types involved in the unplanned reactor trip events that occurred during 1986 - 2006. The events that were caused by the secondary system of the nuclear power plants amount to 67 %, and the remaining 33 % were caused by the primary system. The contribution of the activities of the plant personnel was identified in the following order: corrective maintenance (25.7 %), planned maintenance (22.8 %), planned operation (19.8 %), periodic preventive maintenance (14.9 %), response to a transient (9.9 %), and design/manufacturing/installation (9.9 %). According to the analysis of error modes, the error modes such as control failure (22.2 %), wrong object (18.5 %), omission (14.8 %), wrong action (11.1 %), and inadequate (8.3 %) take up about 75 % of all the unplanned trip events. The analysis of the cognitive functions involved showed that the planning function makes the highest contribution to the human actions leading to unplanned reactor trips, followed by the observation function (23.4 %), the execution function (17.8 %), and the interpretation function (10.3 %). The results of this report are to be used as important bases for development of the error reduction measures or development of the error mode prediction system for the test and maintenance tasks in nuclear power plants.

  10. Collection of offshore human error probability data

    International Nuclear Information System (INIS)

    Basra, Gurpreet; Kirwan, Barry

    1998-01-01

    Accidents such as Piper Alpha have increased concern about the effects of human errors in complex systems. Such accidents can in theory be predicted and prevented by risk assessment, and in particular human reliability assessment (HRA), but HRA ideally requires qualitative and quantitative human error data. A research initiative at the University of Birmingham led to the development of CORE-DATA, a Computerised Human Error Data Base. This system currently contains a reasonably large number of human error data points, collected from a variety of mainly nuclear-power related sources. This article outlines a recent offshore data collection study, concerned with collecting lifeboat evacuation data. Data collection methods are outlined and a selection of human error probabilities generated as a result of the study are provided. These data give insights into the type of errors and human failure rates that could be utilised to support offshore risk analyses

  11. The cost of human error intervention

    International Nuclear Information System (INIS)

    Bennett, C.T.; Banks, W.W.; Jones, E.D.

    1994-03-01

DOE has directed that cost-benefit analyses be conducted as part of the review process for all new DOE orders. This new policy will have the effect of ensuring that DOE analysts can justify the implementation costs of the orders that they develop. We would like to argue that a cost-benefit analysis is merely one phase of a complete risk management program -- one that would more than likely start with a probabilistic risk assessment. The safety community defines risk as the probability of failure times the severity of consequence. An engineering definition of failure can be considered in terms of physical performance, as in mean-time-between-failure; or, it can be thought of in terms of human performance, as in probability of human error. The severity of consequence of a failure can be measured along any one of a number of dimensions -- economic, political, or social. Clearly, an analysis along one dimension cannot be directly compared to another, but a set of cost-benefit analyses, based on a series of cost dimensions, can be extremely useful to managers who must prioritize their resources. Over the last two years, DOE has been developing a series of human factors orders, directed at lowering the probability of human error -- or at least changing the distribution of those errors. The following discussion presents a series of cost-benefit analyses using historical events in the nuclear industry. However, we would first like to discuss some of the analytic cautions that must be considered when we deal with human error.
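The risk definition above (probability of failure times severity of consequence) makes the cost-benefit framing easy to sketch: the benefit of an intervention is the reduction in expected cost, and the net benefit subtracts the intervention's own cost. All numbers below are invented for illustration.

```python
def expected_cost(p_error, severity_cost):
    """Risk as probability of failure times severity of consequence."""
    return p_error * severity_cost

def net_benefit(p_before, p_after, severity_cost, intervention_cost):
    """Reduction in expected cost from an intervention, minus its cost."""
    return (expected_cost(p_before, severity_cost)
            - expected_cost(p_after, severity_cost)) - intervention_cost

# Hypothetical numbers: an intervention halves a 1e-3 error probability
# for an event with a $10M consequence, at a cost of $2,000.
print(net_benefit(1e-3, 5e-4, 10_000_000, 2_000))  # positive -> worthwhile
```

As the abstract cautions, a single such figure along one cost dimension (economic, here) cannot be directly compared with analyses along political or social dimensions.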

  12. The action characterization matrix: A link between HERA (Human Events Reference for ATHEANA) and ATHEANA (a technique for human error analysis)

    International Nuclear Information System (INIS)

    Hahn, H.A.

    1997-01-01

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavior science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. ATHEANA is being developed in the context of nuclear power plant (NPP) PRAs, and much of the language used to describe the method and provide examples of its application are specific to that industry. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. Los Alamos National Laboratory's (LANL) Human Factors Group has recently joined the ATHEANA project team; LANL is responsible for further developing the database structure and for analyzing additional exemplar operational events for entry into the database. The Action Characterization Matrix (ACM) is conceived as a bridge between the HERA database structure and ATHEANA. Specifically, the ACM allows each unsafe action or human failure event to be characterized according to its representation along each of six different dimensions: system status, initiator status, unsafe action mechanism, information processing stage, equipment/material conditions, and performance shaping factors. This report describes the development of the ACM and provides details on the structure and content of its dimensions

  13. Errors from Image Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wood, William Monford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-02-23

This report presents a systematic study of the standard analysis of rod-pinch radiographs for obtaining quantitative measurements of areal mass densities, and makes suggestions for improving the methodology of obtaining quantitative information from radiographed objects.

  14. Human Error Mechanisms in Complex Work Environments

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1988-01-01

    will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations...

  15. Socializing the human factors analysis and classification system: incorporating social psychological phenomena into a human factors error classification system.

    Science.gov (United States)

    Paletz, Susannah B F; Bearman, Christopher; Orasanu, Judith; Holbrook, Jon

    2009-08-01

The presence of social psychological pressures on pilot decision making was assessed using qualitative analyses of critical incident interviews. Social psychological phenomena have long been known to influence attitudes and behavior but have not been highlighted in accident investigation models. Using a critical incident method, 28 pilots who flew in Alaska were interviewed. The participants were asked to describe a situation involving weather when they were pilot in command and found their skills challenged. They were asked to describe the incident in detail but were not explicitly asked to identify social pressures. Pressures were extracted from transcripts in a bottom-up manner and then clustered into themes. Of the 28 pilots, 16 described social psychological pressures on their decision making, specifically, informational social influence, the foot-in-the-door persuasion technique, normalization of deviance, and impression management and self-consistency motives. We believe accident and incident investigations can benefit from explicit inclusion of common social psychological pressures. We recommend specific ways of incorporating these pressures into the Human Factors Analysis and Classification System.

  16. Trend analysis of nuclear reactor automatic trip events subjected to operator's human error at United States nuclear power plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2009-01-01

Trends in nuclear reactor automatic trip events due to human errors during plant operating mode have been analyzed by extracting 20 events which took place in the United States during the period of seven years from 2002 to 2008, cited in the LERs (Licensee Event Reports) submitted to the US Nuclear Regulatory Commission (NRC). The yearly number of events was relatively large before 2005 and decreased thereafter. A period of stable operation, in which the yearly number was kept very small, continued for about three years, and then the yearly number turned to increase again. Before 2005, automatic trip events occurred more frequently during periodic inspections or start-up/shut-down operations. The recent trends, however, indicate that trip events became more frequent due to human errors during daily operations. Human errors were mostly caused by the overconfidence and carelessness of operators throughout the whole period. The trends in the yearly number of events might be explained as follows. The decrease in automatic trip events is attributed to the sharing of trouble information, which led to improvement of the manuals and training for the operations that have a higher potential risk of automatic trip. Then, while the period of stable operation continued, some operators came to pay less attention to preventing human errors and lost interest in the training, which in turn led to automatic trip events caused by mis-operation. From these analyses of trouble experiences in the US, the following lessons were learned for preventing similar troubles in Japan: operators should be thoroughly trained in the basic actions that prevent human errors, and it should be further emphasized that they should prepare by imagining actual plant operations, even though simulator training gives them successful experiences. (author)

  17. Detailed analysis of inversions predicted between two human genomes: errors, real polymorphisms, and their origin and population distribution.

    Science.gov (United States)

    Vicente-Salvador, David; Puig, Marta; Gayà-Vidal, Magdalena; Pacheco, Sarai; Giner-Delgado, Carla; Noguera, Isaac; Izquierdo, David; Martínez-Fundichely, Alexander; Ruiz-Herrera, Aurora; Estivill, Xavier; Aguado, Cristina; Lucas-Lledó, José Ignacio; Cáceres, Mario

    2017-02-01

    The growing catalogue of structural variants in humans often overlooks inversions as one of the most difficult types of variation to study, even though they affect phenotypic traits in diverse organisms. Here, we have analysed in detail 90 inversions predicted from the comparison of two independently assembled human genomes: the reference genome (NCBI36/HG18) and HuRef. Surprisingly, we found that two thirds of these predictions (62) represent errors either in assembly comparison or in one of the assemblies, including 27 misassembled regions in HG18. Next, we validated 22 of the remaining 28 potential polymorphic inversions using different PCR techniques and characterized their breakpoints and ancestral state. In addition, we determined experimentally the derived allele frequency in Europeans for 17 inversions (DAF = 0.01-0.80), as well as the distribution in 14 worldwide populations for 12 of them based on the 1000 Genomes Project data. Among the validated inversions, nine have inverted repeats (IRs) at their breakpoints, and two show nucleotide variation patterns consistent with a recurrent origin. Conversely, inversions without IRs have a unique origin and almost all of them show deletions or insertions at the breakpoints in the derived allele mediated by microhomology sequences, which highlights the importance of mechanisms like FoSTeS/MMBIR in the generation of complex rearrangements in the human genome. Finally, we found several inversions located within genes and at least one candidate to be positively selected in Africa. Thus, our study emphasizes the importance of careful analysis and validation of large-scale genomic predictions to extract reliable biological conclusions. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Research trend on human error reduction

    International Nuclear Information System (INIS)

    Miyaoka, Sadaoki

    1990-01-01

Human error has been a problem in all industries. In 1988, the Bureau of Mines, Department of the Interior, USA, carried out a worldwide survey on human error in all industries in relation to fatal accidents in mines. The results differed according to the methods of collecting data, but the proportion of total accidents attributable to human error ranged widely, from 20∼85%, and was 35% on average. The rate of occurrence of accidents and troubles in Japanese nuclear power stations is shown; the rate of occurrence of human error is 0∼0.5 cases/reactor-year, which has not varied much. Therefore, the proportion attributable to human error has tended to increase, and reducing human error has become important for lowering the rate of occurrence of accidents and troubles hereafter. After the TMI accident in 1979 in the USA, research on the man-machine interface became active, and after the Chernobyl accident in 1986 in the USSR, the problem of organization and management has been studied. In Japan, 'Safety 21' was drawn up by the Advisory Committee for Energy, and the annual reports on nuclear safety also pointed out the importance of human factors. The state of the research on human factors in Japan and abroad and three targets to reduce human error are reported. (K.I.)

  19. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
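The NUREG/CR-1278 dependence model referred to above maps each level to a conditional probability via simple formulas in the basic HEP N: ZD gives N, LD gives (1+19N)/20, MD gives (1+6N)/7, HD gives (1+N)/2, and CD gives 1.0. A small sketch:

```python
def conditional_hep(basic_hep, level):
    """Conditional HEP of a subsequent action given its dependence
    level on a preceding action, per the THERP dependence equations
    in NUREG/CR-1278."""
    n = basic_hep
    return {
        "ZD": n,                  # zero dependence: unchanged
        "LD": (1 + 19 * n) / 20,  # low dependence
        "MD": (1 + 6 * n) / 7,    # moderate dependence
        "HD": (1 + n) / 2,        # high dependence
        "CD": 1.0,                # complete dependence: failure certain
    }[level]

# For a basic HEP of 0.01, show how the conditional value grows
# with the judged level of dependence.
for level in ("ZD", "LD", "MD", "HD", "CD"):
    print(level, round(conditional_hep(0.01, level), 4))
```

Note how even "low" dependence dominates a small basic HEP: the conditional value is pulled toward 1/20 regardless of N, which is why the dependence judgment can matter more than the basic HEP itself.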

  20. Analysis of Medication Error Reports

    Energy Technology Data Exchange (ETDEWEB)

    Whitney, Paul D.; Young, Jonathan; Santell, John; Hicks, Rodney; Posse, Christian; Fecht, Barbara A.

    2004-11-15

    In medicine, as in many areas of research, technological innovation and the shift from paper based information to electronic records has created a climate of ever increasing availability of raw data. There has been, however, a corresponding lag in our abilities to analyze this overwhelming mass of data, and classic forms of statistical analysis may not allow researchers to interact with data in the most productive way. This is true in the emerging area of patient safety improvement. Traditionally, a majority of the analysis of error and incident reports has been carried out based on an approach of data comparison, and starts with a specific question which needs to be answered. Newer data analysis tools have been developed which allow the researcher to not only ask specific questions but also to “mine” data: approach an area of interest without preconceived questions, and explore the information dynamically, allowing questions to be formulated based on patterns brought up by the data itself. Since 1991, United States Pharmacopeia (USP) has been collecting data on medication errors through voluntary reporting programs. USP’s MEDMARXsm reporting program is the largest national medication error database and currently contains well over 600,000 records. Traditionally, USP has conducted an annual quantitative analysis of data derived from “pick-lists” (i.e., items selected from a list of items) without an in-depth analysis of free-text fields. In this paper, the application of text analysis and data analysis tools used by Battelle to analyze the medication error reports already analyzed in the traditional way by USP is described. New insights and findings were revealed including the value of language normalization and the distribution of error incidents by day of the week. The motivation for this effort is to gain additional insight into the nature of medication errors to support improvements in medication safety.
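One of the findings mentioned, the distribution of error incidents by day of the week, is a simple aggregation once report dates are available. A minimal sketch with invented dates (not MEDMARX data):

```python
from collections import Counter
from datetime import date

WEEKDAYS = ("Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun")

def incidents_by_weekday(report_dates):
    """Count error reports per day of week, Monday through Sunday."""
    counts = Counter(WEEKDAYS[d.weekday()] for d in report_dates)
    return {day: counts[day] for day in WEEKDAYS}

# Hypothetical report dates for illustration.
reports = [date(2004, 11, 15), date(2004, 11, 16),
           date(2004, 11, 16), date(2004, 11, 19)]
print(incidents_by_weekday(reports))
```

The text-mining side of the analysis (language normalization of free-text fields) is a much larger task; this only illustrates the pick-list-style quantitative summary.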

  1. A strategy to the development of a human error analysis method for accident management in nuclear power plants using industrial accident dynamics

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kim, Jae Whan; Jung, Won Dae; Ha, Jae Ju

    1998-06-01

This technical report describes the early progress of the establishment of a human error analysis (HEA) method as part of a human reliability analysis (HRA) method for the assessment of the human error potential in a given accident management strategy. First, we review the shortcomings and limitations of the existing HRA methods through an example application. Because existing HRA methods are biased toward the quantitative aspect, we focused on the qualitative aspect, i.e., human error analysis (HEA), in proposing a strategy for the new method. For the establishment of a new HEA method, we discuss the basic theories of and approaches to human error in industry, and propose three basic requirements that should be maintained as prerequisites for an HEA method in practice. Finally, we test IAD (Industrial Accident Dynamics), which has been widely utilized in industrial fields, in order to determine whether IAD can be easily modified and extended to nuclear power plant applications. We try to apply IAD to the same example case and develop a new taxonomy of the performance shaping factors in accident management and their influence matrix, which could enhance the IAD method as an HEA method. (author). 33 refs., 17 tabs., 20 figs

  2. Human error in remote Afterloading Brachytherapy

    International Nuclear Information System (INIS)

    Quinn, M.L.; Callan, J.; Schoenfeld, I.; Serig, D.

    1994-01-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error

  3. Generalizing human error rates: A taxonomic approach

    International Nuclear Information System (INIS)

    Buffardi, L.; Fleishman, E.; Allen, J.

    1989-01-01

    It is well established that human error plays a major role in malfunctioning of complex, technological systems and in accidents associated with their operation. Estimates of the rate of human error in the nuclear industry range from 20-65% of all system failures. In response to this, the Nuclear Regulatory Commission has developed a variety of techniques for estimating human error probabilities for nuclear power plant personnel. Most of these techniques require the specification of the range of human error probabilities for various tasks. Unfortunately, very little objective performance data on error probabilities exist for nuclear environments. Thus, when human reliability estimates are required, for example in computer simulation modeling of system reliability, only subjective estimates (usually based on experts' best guesses) can be provided. The objective of the current research is to provide guidelines for the selection of human error probabilities based on actual performance data taken in other complex environments and applying them to nuclear settings. A key feature of this research is the application of a comprehensive taxonomic approach to nuclear and non-nuclear tasks to evaluate their similarities and differences, thus providing a basis for generalizing human error estimates across tasks. In recent years significant developments have occurred in classifying and describing tasks. Initial goals of the current research are to: (1) identify alternative taxonomic schemes that can be applied to tasks, and (2) describe nuclear tasks in terms of these schemes. Three standardized taxonomic schemes (Ability Requirements Approach, Generalized Information-Processing Approach, Task Characteristics Approach) are identified, modified, and evaluated for their suitability in comparing nuclear and non-nuclear power plant tasks. An agenda for future research and its relevance to nuclear power plant safety is also discussed

  4. Risk assessment of component failure modes and human errors using a new FMECA approach: application in the safety analysis of HDR brachytherapy

    International Nuclear Information System (INIS)

    Giardina, M; Castiglia, F; Tomarchio, E

    2014-01-01

    Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events. (paper)
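
The paper's actual membership functions and rule base are not given in the abstract, so the following is a minimal, hypothetical sketch of how a fuzzy rule-based RPN can be computed: 1-10 severity/occurrence/detectability ratings are fuzzified with triangular membership functions, combined with Mamdani-style min rules, and defuzzified by a weighted average. All shapes, rules, and output values are illustrative assumptions.

```python
# Illustrative fuzzy rule-based RPN (hypothetical rule base; the paper's
# actual membership functions and rules are not reproduced here).

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(score):
    """Map a 1-10 rating to memberships in {low, medium, high}."""
    return {"low": tri(score, 0, 1, 5.5),
            "medium": tri(score, 1, 5.5, 10),
            "high": tri(score, 5.5, 10, 11)}

# Each output level carries a representative crisp RPN value (assumed).
OUT = {"low": 100, "medium": 500, "high": 900}

def rule_out(s_lvl, o_lvl, d_lvl):
    # Hypothetical rule table: output level = the worst antecedent level.
    order = ["low", "medium", "high"]
    return order[max(order.index(s_lvl), order.index(o_lvl), order.index(d_lvl))]

def fuzzy_rpn(severity, occurrence, detectability):
    """Fire all rules (min AND), then defuzzify by weighted average."""
    S, O, D = fuzzify(severity), fuzzify(occurrence), fuzzify(detectability)
    num = den = 0.0
    for s in S:
        for o in O:
            for d in D:
                w = min(S[s], O[o], D[d])
                if w > 0:
                    num += w * OUT[rule_out(s, o, d)]
                    den += w
    return num / den if den else 0.0
```

With these assumed rules, a failure mode rated (9, 9, 9) defuzzifies to a high RPN while (1, 1, 1) stays low; it is the ranking of failure modes, not the absolute number, that drives the criticality ordering.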

  5. Human error theory: relevance to nurse management.

    Science.gov (United States)

    Armitage, Gerry

    2009-03-01

    Describe, discuss and critically appraise human error theory and consider its relevance for nurse managers. Healthcare errors are a persistent threat to patient safety. Effective risk management and clinical governance depends on understanding the nature of error. This paper draws upon a wide literature from published works, largely from the field of cognitive psychology and human factors. Although the content of this paper is pertinent to any healthcare professional; it is written primarily for nurse managers. Error is inevitable. Causation is often attributed to individuals, yet causation in complex environments such as healthcare is predominantly multi-factorial. Individual performance is affected by the tendency to develop prepacked solutions and attention deficits, which can in turn be related to local conditions and systems or latent failures. Blame is often inappropriate. Defences should be constructed in the light of these considerations and to promote error wisdom and organizational resilience. Managing and learning from error is seen as a priority in the British National Health Service (NHS), this can be better achieved with an understanding of the roots, nature and consequences of error. Such an understanding can provide a helpful framework for a range of risk management activities.

  6. SLIM-MAUD: an approach to assessing human error probabilities using structured expert judgment. Volume II. Detailed analysis of the technical issues

    International Nuclear Information System (INIS)

    Embrey, D.E.; Humphreys, P.; Rosa, E.A.; Kirwan, B.; Rea, K.

    1984-07-01

    This two-volume report presents the procedures and analyses performed in developing an approach for structuring expert judgments to estimate human error probabilities. Volume I presents an overview of work performed in developing the approach: SLIM-MAUD (Success Likelihood Index Methodology, implemented through the use of an interactive computer program called MAUD-Multi-Attribute Utility Decomposition). Volume II provides a more detailed analysis of the technical issues underlying the approach

  7. Understanding Human Error in Naval Aviation Mishaps.

    Science.gov (United States)

    Miranda, Andrew T

    2018-04-01

    To better understand the external factors that influence the performance and decisions of aviators involved in Naval aviation mishaps. Mishaps in complex activities, ranging from aviation to nuclear power operations, are often the result of interactions between multiple components within an organization. The Naval aviation mishap database contains relevant information, both in quantitative statistics and qualitative reports, that permits analysis of such interactions to identify how the working atmosphere influences aviator performance and judgment. Results from 95 severe Naval aviation mishaps that occurred from 2011 through 2016 were analyzed using Bayes' theorem probability formula. Then a content analysis was performed on a subset of relevant mishap reports. Out of the 14 latent factors analyzed, the Bayes' application identified 6 that impacted specific aspects of aviator behavior during mishaps. Technological environment, misperceptions, and mental awareness impacted basic aviation skills. The remaining 3 factors were used to inform a content analysis of the contextual information within mishap reports. Teamwork failures were the result of plan continuation aggravated by diffused responsibility. Resource limitations and risk management deficiencies impacted judgments made by squadron commanders. The application of Bayes' theorem to historical mishap data revealed the role of latent factors within Naval aviation mishaps. Teamwork failures were seen to be considerably damaging to both aviator skill and judgment. Both the methods and findings have direct application for organizations interested in understanding the relationships between external factors and human error. It presents real-world evidence to promote effective safety decisions.
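
The abstract does not give the study's counts, so the sketch below uses hypothetical numbers purely to illustrate the Bayes' theorem step: estimating the posterior probability of a latent factor given an observed error type from marginal and conditional frequencies in a mishap database.

```python
# Hypothetical counts (illustrative only; not the study's actual data).
# We estimate P(latent factor | skill-based error) with Bayes' theorem.

def bayes(p_evidence_given_factor, p_factor, p_evidence):
    """P(factor | evidence) = P(evidence | factor) * P(factor) / P(evidence)."""
    return p_evidence_given_factor * p_factor / p_evidence

# Example: out of 95 mishaps, suppose 40 cite a degraded technological
# environment, 60 involve a skill-based error, and 32 cite both.
n, n_factor, n_error, n_both = 95, 40, 60, 32
p = bayes(n_both / n_factor, n_factor / n, n_error / n)
# p is the posterior probability that the factor was present given the error
```

Note that with complete cross-tabulated counts the posterior reduces to n_both / n_error; the Bayes form matters when the conditional and marginal frequencies come from different parts of the database.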

  8. A chance to avoid mistakes human error

    International Nuclear Information System (INIS)

    Amaro, Pablo; Obeso, Eduardo; Gomez, Ruben

    2010-01-01

Trying to give an answer to the lack of public information in the industry regarding the different tools used in the nuclear industry to minimize human error, a group of workers from different sections of the St. Maria de Garona NPP (Quality Assurance / Organization and Human Factors) decided to embark on a challenging and exciting project: write a book collecting all the knowledge accumulated during their daily activities, often while reviewing external information received from different organizations within the nuclear industry (INPO, WANO...), but also while visiting different NPPs, holding meetings and participating in training courses related to Human and Organizational Factors. The main objective of the book is to present, in a practical way, the different tools that are used and fostered in the nuclear industry, so that their assimilation and implementation in other industries becomes possible and achievable in an efficient manner. One year of work, and our project is a reality. We presented an abstract during the last Spanish Nuclear Society meeting in Sevilla, last October... and best of all, the book is on the market for everybody at the web-site: www.bubok.com. The book is structured in the following areas: 'Errare humanum est': presents to the reader what human error is, its origin and the different barriers. The message is that the reader sees error as something continuously present in our lives... even more frequently than we think. By studying its origin, barriers can be established to avoid or at least minimize it. 'Error's bitter face': shows the possible consequences of human errors. What better than presenting real experiences that have occurred in the industry. In the book, accidents in the nuclear industry, like those at Three Mile Island NPP and Chernobyl NPP, and incidents like that at Davis Besse NPP in the past, help the reader to make a reflection about the

  9. Analysis of human error in occupational accidents in the power plant industries using combining innovative FTA and meta-heuristic algorithms

    Directory of Open Access Journals (Sweden)

    M. Omidvari

    2015-09-01

Full Text Available Introduction: Occupational accidents are among the main issues in industries. It is necessary to identify the main root causes of accidents for their control. Several models have been proposed for determining the root causes of accidents. FTA is one of the most widely used models, which can graphically establish the root causes of accidents. The non-linear function is one of the main challenges in FTA compliance, and in order to obtain the exact number, meta-heuristic algorithms can be used. Material and Method: The present research was done in power plant industries in the construction phase. In this study, a pattern for the analysis of human error in work-related accidents was provided by combining neural network algorithms with the FTA analytical model. Finally, using this pattern, the potential rate of all causes was determined. Result: The results showed that training, age, and non-compliance with safety principles in the workplace were the most important factors influencing human error in occupational accidents. Conclusion: According to the obtained results, it can be concluded that human errors can be greatly reduced by training, right choice of workers with regard to the type of occupations, and provision of appropriate safety conditions in the work place.
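
The combined FTA/neural-network fitting described above is beyond a short example, but the fault-tree arithmetic underneath it is simple. This sketch, with hypothetical basic events and probabilities, shows how AND/OR gates combine independent basic-event probabilities into a top-event probability.

```python
# Basic fault-tree gate arithmetic (illustrative; the paper's neural-network
# and meta-heuristic fitting is not reproduced here). Assumes independence.

def and_gate(*probs):
    """AND gate: all inputs must occur; probabilities multiply."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    """OR gate: at least one input occurs; complement of all-absent."""
    out = 1.0
    for p in probs:
        out *= (1 - p)
    return 1 - out

# Hypothetical top event: human error occurs if (inadequate training AND
# poor supervision) OR non-compliance with safety rules.
top = or_gate(and_gate(0.2, 0.3), 0.05)
```

In a full analysis the basic-event probabilities would be estimated from accident data (here is where the paper's meta-heuristic step enters), and the same gate algebra would propagate them to the top event.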

  10. Technique for human-error sequence identification and signification

    International Nuclear Information System (INIS)

    Heslinga, G.

    1988-01-01

The aim of the present study was to investigate whether the event-tree technique can be used for the analysis of sequences of human errors that could cause initiating events. The scope of the study was limited to a consideration of the performance of procedural actions. The event-tree technique was modified to adapt it for this study and will be referred to as the 'Technique for Human-Error-Sequence Identification and Signification' (THESIS). The event trees used in this manner, i.e. THESIS event trees, appear to present additional problems if they are applied to human performance instead of technical systems. These problems, referred to as the 'Man-Related Features' of THESIS, are: the human capability to choose among several procedures, the ergonomics of the panel layout, human actions of a continuous nature, dependence between human errors, the human capability to recover possible errors, the influence of memory during the recovery attempt, variability in human performance, and correlations between human-error probabilities. The influence of these problems on the applicability of THESIS was assessed by means of mathematical analysis, field studies and laboratory experiments (author). 130 refs.; 51 figs.; 24 tabs

  11. Evaluation of human error estimation for nuclear power plants

    International Nuclear Information System (INIS)

    Haney, L.N.; Blackman, H.S.

    1987-01-01

    The dominant risk for severe accident occurrence in nuclear power plants (NPPs) is human error. The US Nuclear Regulatory Commission (NRC) sponsored an evaluation of Human Reliability Analysis (HRA) techniques for estimation of human error in NPPs. Twenty HRA techniques identified by a literature search were evaluated with criteria sets designed for that purpose and categorized. Data were collected at a commercial NPP with operators responding in walkthroughs of four severe accident scenarios and full scope simulator runs. Results suggest a need for refinement and validation of the techniques. 19 refs

  12. Using HET taxonomy to help stop human error

    OpenAIRE

    Li, Wen-Chin; Harris, Don; Stanton, Neville A.; Hsu, Yueh-Ling; Chang, Danny; Wang, Thomas; Young, Hong-Tsu

    2010-01-01

    Flight crews make positive contributions to the safety of aviation operations. Pilots have to assess continuously changing situations, evaluate potential risks, and make quick decisions. However, even well-trained and experienced pilots make errors. Accident investigations have identified that pilots’ performance is influenced significantly by the design of the flightdeck interface. This research applies hierarchical task analysis (HTA) and utilizes the Human Error Template (HET) taxonomy to ...

  13. Development of an integrated system for estimating human error probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, J.L.; Hahn, H.A.; Morzinski, J.A.

    1998-12-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project had as its main objective the development of a Human Reliability Analysis (HRA), knowledge-based expert system that would provide probabilistic estimates for potential human errors within various risk assessments, safety analysis reports, and hazard assessments. HRA identifies where human errors are most likely, estimates the error rate for individual tasks, and highlights the most beneficial areas for system improvements. This project accomplished three major tasks. First, several prominent HRA techniques and associated databases were collected and translated into an electronic format. Next, the project started a knowledge engineering phase where the expertise, i.e., the procedural rules and data, were extracted from those techniques and compiled into various modules. Finally, these modules, rules, and data were combined into a nearly complete HRA expert system.

  14. Human errors in operation - what to do with them?

    International Nuclear Information System (INIS)

    Michalek, J.

    2009-01-01

'It is human to make errors!' This saying of our predecessors is still current and will continue to be valid in the future, for as long as humans are human. Errors cannot be completely eliminated from human activities. On average, a human makes two simple errors per hour. For example, how many typing errors do we make while typing on a computer keyboard? How many times do we make mistakes in writing the date in the first days of a new year? Such errors have no major consequences; however, in certain situations human errors are very unpleasant and may also be very costly; they may even endanger human lives. (author)

  15. SHERPA: A systematic human error reduction and prediction approach

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1986-01-01

This paper describes a Systematic Human Error Reduction and Prediction Approach (SHERPA) which is intended to provide guidelines for human error reduction and quantification in a wide range of human-machine systems. The approach utilizes as its basis current cognitive models of human performance. The first module in SHERPA performs task and human error analyses, which identify likely error modes, together with guidelines for the reduction of these errors by training, procedures and equipment redesign. The second module uses a SARAH approach to quantify the probability of occurrence of the errors identified earlier, and provides cost-benefit analyses to assist in choosing the appropriate error reduction approaches in the third module

  16. Operator error and emotions. Operator error and emotions - a major cause of human failure

    Energy Technology Data Exchange (ETDEWEB)

    Patterson, B.K. [Human Factors Practical Incorporated (Canada); Bradley, M. [Univ. of New Brunswick, Saint John, New Brunswick (Canada); Artiss, W.G. [Human Factors Practical (Canada)

    2000-07-01

    This paper proposes the idea that a large proportion of the incidents attributed to operator and maintenance error in a nuclear or industrial plant are actually founded in our human emotions. Basic psychological theory of emotions is briefly presented and then the authors present situations and instances that can cause emotions to swell and lead to operator and maintenance error. Since emotional information is not recorded in industrial incident reports, the challenge is extended to industry, to review incident source documents for cases of emotional involvement and to develop means to collect emotion related information in future root cause analysis investigations. Training must then be provided to operators and maintainers to enable them to know one's emotions, manage emotions, motivate one's self, recognize emotions in others and handle relationships. Effective training will reduce the instances of human error based in emotions and enable a cooperative, productive environment in which to work. (author)

  17. Operator error and emotions. Operator error and emotions - a major cause of human failure

    International Nuclear Information System (INIS)

    Patterson, B.K.; Bradley, M.; Artiss, W.G.

    2000-01-01

    This paper proposes the idea that a large proportion of the incidents attributed to operator and maintenance error in a nuclear or industrial plant are actually founded in our human emotions. Basic psychological theory of emotions is briefly presented and then the authors present situations and instances that can cause emotions to swell and lead to operator and maintenance error. Since emotional information is not recorded in industrial incident reports, the challenge is extended to industry, to review incident source documents for cases of emotional involvement and to develop means to collect emotion related information in future root cause analysis investigations. Training must then be provided to operators and maintainers to enable them to know one's emotions, manage emotions, motivate one's self, recognize emotions in others and handle relationships. Effective training will reduce the instances of human error based in emotions and enable a cooperative, productive environment in which to work. (author)

  18. A human reliability analysis (HRA) method for identifying and assessing the error of commission (EOC) from a diagnosis failure

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Yun; Kang, Dae Il

    2005-01-01

The study deals with a method for systematically identifying and assessing the EOC events that might be caused by a diagnosis failure or misdiagnosis of the expected events in accident scenarios of nuclear power plants. The method for EOC identification and assessment consists of three steps: analysis of the potential for a diagnosis failure (or misdiagnosis), identification of the EOC events from the diagnosis failure, and quantitative assessment of the identified EOC events. As a tool for analysing a diagnosis failure, the MisDiagnosis Tree Analysis (MDTA) technique is proposed, with a taxonomy of misdiagnosis causes. Also, guidance on the identification of EOC events is given, along with the classification system and data for quantitative assessment. As an application of the proposed method, EOC identification and assessment for the Younggwang 3 and 4 plants and their impact on the plant risk were performed. As the result, six events or event sequences were considered for diagnosis failures and about 20 new Human Failure Events (HFEs) involving EOCs were identified. According to the assessment of the risk impact of the identified HFEs, they increase the CDF by 11.4 % of the current CDF value, which corresponds to 10.2 % of the new CDF. The small loss of coolant accident (SLOCA) turned out to be the major contributor to the increase of CDF, resulting in a 9.2 % increase of the current CDF.

  19. Error Analysis in Mathematics. Technical Report #1012

    Science.gov (United States)

    Lai, Cheng-Fei

    2012-01-01

    Error analysis is a method commonly used to identify the cause of student errors when they make consistent mistakes. It is a process of reviewing a student's work and then looking for patterns of misunderstanding. Errors in mathematics can be factual, procedural, or conceptual, and may occur for a number of reasons. Reasons why students make…

  20. An Error Analysis on TFL Learners’ Writings

    Directory of Open Access Journals (Sweden)

    Arif ÇERÇİ

    2016-12-01

Full Text Available The main purpose of the present study is to identify and represent TFL learners’ writing errors through error analysis. All the learners started learning Turkish as a foreign language at the A1 (beginner) level and completed the process by obtaining the C1 (advanced) certificate in TÖMER at Gaziantep University. The data of the present study were collected from 14 students’ writings in proficiency exams for each level. The data were grouped as grammatical, syntactic, spelling, punctuation, and word choice errors. The ratio and categorical distributions of the identified errors were analyzed through error analysis. The data were also analyzed through statistical procedures in an effort to determine whether error types differ according to the levels of the students. The errors in this study are limited to linguistic and intralingual developmental errors

  1. Human error in maintenance: An investigative study for the factories of the future

    International Nuclear Information System (INIS)

    Dhillon, B S

    2014-01-01

    This paper presents a study of human error in maintenance. Many different aspects of human error in maintenance considered useful for the factories of the future are studied, including facts, figures, and examples; occurrence of maintenance error in equipment life cycle, elements of a maintenance person's time, maintenance environment and the causes for the occurrence of maintenance error, types and typical maintenance errors, common maintainability design errors and useful design guidelines to reduce equipment maintenance errors, maintenance work instructions, and maintenance error analysis methods

  2. Perancangan Fasilitas Kerja untuk Mereduksi Human Error

    Directory of Open Access Journals (Sweden)

    Harmein Nasution

    2012-01-01

Full Text Available Work equipment and environments which are not designed ergonomically can cause physical exhaustion in workers. As a result of that physical exhaustion, many defects in the production lines can happen due to human error, and it also causes musculoskeletal complaints. To overcome those effects, we employed methods for analyzing the workers' posture based on the SNQ (Standard Nordic Questionnaire), PLIBEL, QEC (Quick Exposure Check) and biomechanics. Moreover, we applied those methods in designing rolling machines and the egrek grip ergonomically, so that the defects on those production lines can be minimized.

  4. Human Errors - A Taxonomy for Describing Human Malfunction in Industrial Installations

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1982-01-01

This paper describes the definition and the characteristics of human errors. Different types of human behavior are classified, and their relation to different error mechanisms is analyzed. The effect of conditioning factors related to affective, motivating aspects of the work situation as well as physiological factors is also taken into consideration. The taxonomy for event analysis, including human malfunction, is presented. Possibilities for the prediction of human error are discussed. The need for careful studies in actual work situations is expressed. Such studies could provide a better...

  5. Effects of human errors on the determination of surveillance test interval

    International Nuclear Information System (INIS)

    Chung, Dae Wook; Koo, Bon Hyun

    1990-01-01

This paper incorporates the effects of human error relevant to the periodic test on the unavailability of the safety system as well as the component unavailability. Two types of possible human error during the test are considered. One is the possibility that a good safety system is inadvertently left in a bad state after the test (Type A human error), and the other is the possibility that a bad safety system goes undetected upon the test (Type B human error). An event tree model is developed for the steady-state unavailability of the safety system to determine the effects of human errors on the component unavailability and the test interval. We perform a reliability analysis of the safety injection system (SIS) by applying the aforementioned two types of human error to the safety injection pumps. Results of various sensitivity analyses show that: (1) the appropriate test interval decreases and steady-state unavailability increases as the probabilities of both types of human errors increase, and they are far more sensitive to Type A human error than to Type B; and (2) the SIS unavailability increases slightly as the probability of Type B human error increases, and significantly as the probability of Type A human error increases. Therefore, to avoid underestimation, the effects of human error should be incorporated in system reliability analyses which aim at the relaxation of surveillance test intervals, and Type A human error has the more important effect on the unavailability and surveillance test interval
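
The event-tree model itself is not given in the abstract. As a rough illustration of the trade-off it studies, the sketch below uses a simplified steady-state unavailability formula (an assumption, not the paper's model): Type A errors add a roughly constant term, Type B errors stretch the mean undetected-failure time, and the test itself costs downtime, so an optimal test interval exists and shrinks as the miss probability grows.

```python
# Simplified steady-state unavailability model (a sketch, not the paper's
# event-tree model): component failure rate lam (per hour), test interval T
# (hours), per-test downtime tau (hours), Type A error probability pA (good
# system left inoperable until the next test), Type B probability pB (failed
# state missed by a test).

def unavailability(T, lam=1e-4, tau=2.0, pA=0.01, pB=0.05):
    # Undetected-failure downtime: T/2 on average, plus T*pB/(1-pB) extra
    # for failures missed by one or more successive tests.
    u_failure = lam * (T / 2 + T * pB / (1 - pB))
    u_type_a = pA        # system left bad for roughly a whole interval
    u_test = tau / T     # system down during the test itself
    return u_failure + u_type_a + u_test

def best_interval(**kw):
    """Crude grid search for the interval minimizing unavailability."""
    hours = range(50, 5001, 50)
    return min(hours, key=lambda T: unavailability(T, **kw))
```

In this simplified model only pB moves the optimum (pA shifts the whole curve upward without changing the minimizing interval), so it does not capture the paper's finding that Type A error dominates; the event-tree treatment is needed for that.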

  6. Human errors, countermeasures for their prevention and evaluation

    International Nuclear Information System (INIS)

    Kohda, Takehisa; Inoue, Koichi

    1992-01-01

Accidents originating in human errors have continued to occur, as in recent large accidents such as the TMI accident and the Chernobyl accident. The proportion of accidents originating in human errors is unexpectedly high; therefore, even as the reliability and safety of hardware are improved, a comparable improvement in human reliability cannot be expected. Human errors arise from the difference between the function required of people and the function actually accomplished by them, and the results exert adverse effects on systems. Human errors are classified into design errors, manufacturing errors, operation errors, maintenance errors, checkup errors and general handling errors. In terms of behavior, human errors are classified into forgetting to act, failing to act, doing what must not be done, mistaking the order, and acting at an improper time. The factors in human error occurrence are circumstantial factors, personal factors and stress factors. As methods of analyzing and evaluating human errors, system engineering methods such as probabilistic risk assessment are used. The technique for human error rate prediction, the method for human cognitive reliability, the confusion matrix and SLIM-MAUD are also used. (K.I.)

  7. Analysis of error patterns in clinical radiotherapy

    International Nuclear Information System (INIS)

    Macklis, Roger; Meier, Tim; Barrett, Patricia; Weinhous, Martin

    1996-01-01

Purpose: Until very recently, prescription errors and adverse treatment events have rarely been studied or reported systematically in oncology. We wished to understand the spectrum and severity of radiotherapy errors that take place on a day-to-day basis in a high-volume academic practice and to understand the resource needs and quality assurance challenges placed on a department by rapid upswings in contract-based clinical volumes requiring additional operating hours, procedures, and personnel. The goal was to define clinical benchmarks for operating safety and to detect error-prone treatment processes that might function as 'early warning' signs. Methods: A multi-tiered prospective and retrospective system for clinical error detection and classification was developed, with formal analysis of the antecedents and consequences of all deviations from prescribed treatment delivery, no matter how trivial. A department-wide record-and-verify system was operational during this period and was used as one method of treatment verification and error detection. Brachytherapy discrepancies were analyzed separately. Results: During the analysis year, over 2000 patients were treated with over 93,000 individual fields. A total of 59 errors affecting a total of 170 individual treated fields were reported or detected during this period. After review, all of these errors were classified as Level 1 (minor discrepancy with essentially no potential for negative clinical implications). This total treatment delivery error rate (170/93,332, or 0.18%) is significantly better than corresponding error rates reported for other hospital and oncology treatment services, perhaps reflecting the relatively sophisticated error avoidance and detection procedures used in modern clinical radiation oncology. Error rates were independent of linac model and manufacturer, time of day (normal operating hours versus late evening or early morning) or clinical machine volumes. There was some relationship to

  8. Error analysis of nuclear power plant operator cognitive behavior

    International Nuclear Information System (INIS)

    He Xuhong; Zhao Bingquan; Chen Yulong

    2001-01-01

A nuclear power plant is a complex human-machine system integrating many advanced machines, electronic devices and automatic controls. It demands of operators high cognitive ability and correct analysis skills. The author divides the operator's cognitive process into five stages for analysis. With this cognitive model, operators' cognitive errors are analysed to identify their root causes and the stages at which errors happen. The results of the analysis serve as a basis for the design of control rooms and for the training and evaluation of operators

  9. A Comparative Study on Error Analysis

    DEFF Research Database (Denmark)

    Wu, Xiaoli; Zhang, Chun

    2015-01-01

Title: A Comparative Study on Error Analysis. Subtitle: Belgian (L1) and Danish (L1) learners’ use of Chinese (L2) comparative sentences in written production. Xiaoli Wu, Chun Zhang. Abstract: Making errors is an inevitable and necessary part of learning. The collection, classification and analysis of errors in the written and spoken production of L2 learners has a long tradition in L2 pedagogy. Yet, in teaching and learning Chinese as a foreign language (CFL), only a handful of studies have been made either to define the ‘error’ in a pedagogically insightful way or to empirically investigate the occurrence of errors in either linguistic or pedagogical terms. The purpose of the current study is to demonstrate the theoretical and practical relevance of the error analysis approach in CFL by investigating two cases: (1) Belgian (L1) learners’ use of Chinese (L2) comparative sentences in written production...

  10. Error estimation in plant growth analysis

    Directory of Open Access Journals (Sweden)

    Andrzej Gregorczyk

    2014-01-01

Full Text Available A scheme is presented for the calculation of errors of dry matter values which occur during approximation of data with growth curves, determined by the analytical method (logistic function) and by the numerical method (Richards function). Formulae are then shown which describe the absolute errors of the growth characteristics: Growth rate (GR), Relative growth rate (RGR), Unit leaf rate (ULR) and Leaf area ratio (LAR). Calculation examples concerning the growth course of oats and maize plants are given. A critical analysis of the estimation of the obtained results has been carried out. The purposefulness of the joint application of statistical methods and error calculus in plant growth analysis has been ascertained.
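
The abstract does not reproduce the paper's formulae; as a generic illustration of the error-calculus idea, this sketch propagates absolute dry-matter measurement errors into the relative growth rate via first-order partial derivatives. The functional form and the sample numbers are assumptions, not the paper's exact expressions.

```python
# Illustrative error propagation for RGR = (ln W2 - ln W1) / (t2 - t1),
# with absolute dry-matter errors dW1, dW2 (hypothetical values).

import math

def rgr_with_error(W1, W2, t1, t2, dW1, dW2):
    dt = t2 - t1
    rgr = (math.log(W2) - math.log(W1)) / dt
    # |d(RGR)/dW1| = 1/(W1*dt) and |d(RGR)/dW2| = 1/(W2*dt), so a
    # worst-case absolute error bound sums the two contributions.
    err = (dW1 / W1 + dW2 / W2) / dt
    return rgr, err

# Example: dry matter grows from 2.0 g to 8.0 g over 14 days,
# each weighing uncertain by 0.1 g and 0.2 g respectively.
rgr, err = rgr_with_error(W1=2.0, W2=8.0, t1=0, t2=14, dW1=0.1, dW2=0.2)
```

The same partial-derivative recipe extends to GR, ULR and LAR; only the derivatives change.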

  11. Basic considerations in predicting error probabilities in human task performance

    International Nuclear Information System (INIS)

    Fleishman, E.A.; Buffardi, L.C.; Allen, J.A.; Gaskins, R.C. III

    1990-04-01

It is well established that human error plays a major role in the malfunctioning of complex systems. This report takes a broad look at the study of human error and addresses the conceptual, methodological, and measurement issues involved in defining and describing errors in complex systems. In addition, a review of existing sources of human reliability data and approaches to human performance data base development is presented. Alternative task taxonomies, which are promising for establishing the comparability of nuclear and non-nuclear tasks, are also identified. Based on such taxonomic schemes, various data base prototypes for generalizing human error rates across settings are proposed. 60 refs., 3 figs., 7 tabs

  12. Human error probability estimation using licensee event reports

    International Nuclear Information System (INIS)

    Voska, K.J.; O'Brien, J.N.

    1984-07-01

    The objective of this report is to present a method for using field data from nuclear power plants to estimate human error probabilities (HEPs). These HEPs are then used in probabilistic risk assessment activities. This method of estimating HEPs is one of four being pursued in NRC-sponsored research; the other three are structured expert judgment, analysis of training simulator data, and performance modeling. The type of field data analyzed in this report is from Licensee Event Reports (LERs), which are analyzed using a method specifically developed for that purpose. However, any type of field data on human errors could be analyzed using this method with minor adjustments. This report assesses the practicality, acceptability, and usefulness of estimating HEPs from LERs and comprehensively presents the method for use.
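
    A minimal sketch of the kind of estimation involved, assuming the simplest case where LER review yields an error count and an estimated number of opportunities; the Jeffreys-prior variant is a common PRA convention for sparse counts, not necessarily this report's method, and the counts are hypothetical.

```python
def hep_point(errors, opportunities):
    """Naive point estimate: observed errors / estimated opportunities."""
    return errors / opportunities

def hep_jeffreys(errors, opportunities):
    """Posterior-mean estimate under a Jeffreys Beta(0.5, 0.5) prior,
    often preferred when the observed error count is small."""
    return (errors + 0.5) / (opportunities + 1.0)

# Hypothetical LER-derived counts: 3 miscalibrations in 1200 calibrations
print(hep_point(3, 1200))
print(hep_jeffreys(3, 1200))
```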

  13. Development of Human Factor Management Requirements and Human Error Classification for the Prevention of Railway Accident

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Park, Chan Woo; Shin, Seung Ryoung

    2008-08-01

    Railway accident analysis results show that accidents caused by human factors are not decreasing, whereas H/W-related accidents are steadily decreasing. Efficient management of human factors requires much expertise in design, working conditions, safety culture and staffing. But current safety management activities for safety-critical work are focused on training, due to limited resources and information. In order to improve railway safety, human factors management requirements for safety-critical workers and a human error classification are proposed in this report. To this end, accident analyses, the status of safety measures on human factors, the safety management system for safety-critical workers, and current safety planning are analyzed

  14. Comprehensive analysis of a medication dosing error related to CPOE.

    Science.gov (United States)

    Horsky, Jan; Kuperman, Gilad J; Patel, Vimla L

    2005-01-01

    This case study of a serious medication error demonstrates the necessity of a comprehensive methodology for the analysis of failures in interaction between humans and information systems. The authors used a novel approach to analyze a dosing error related to computer-based ordering of potassium chloride (KCl). The method included a chronological reconstruction of events and their interdependencies from provider order entry usage logs, semistructured interviews with involved clinicians, and interface usability inspection of the ordering system. Information collected from all sources was compared and evaluated to understand how the error evolved and propagated through the system. In this case, the error was the product of faults in interaction among human and system agents that methods limited in scope to their distinct analytical domains would not identify. The authors characterized errors in several converging aspects of the drug ordering process: confusing on-screen laboratory results review, system usability difficulties, user training problems, and suboptimal clinical system safeguards that all contributed to a serious dosing error. The results of the authors' analysis were used to formulate specific recommendations for interface layout and functionality modifications, suggest new user alerts, propose changes to user training, and address error-prone steps of the KCl ordering process to reduce the risk of future medication dosing errors.

  15. Analysis of field errors in existing undulators

    International Nuclear Information System (INIS)

    Kincaid, B.M.

    1990-01-01

    The Advanced Light Source (ALS) and other third-generation synchrotron light sources have been designed for optimum performance with undulator insertion devices. The performance requirements for these new undulators are explored, with emphasis on the effects of errors on source spectral brightness. Analysis of magnetic field data for several existing hybrid undulators is presented, decomposing errors into systematic and random components. An attempt is made to identify the sources of these errors, and recommendations are made for designing future insertion devices. 12 refs., 16 figs

  16. Understanding Teamwork in Trauma Resuscitation through Analysis of Team Errors

    Science.gov (United States)

    Sarcevic, Aleksandra

    2009-01-01

    An analysis of human errors in complex work settings can lead to important insights into the workspace design. This type of analysis is particularly relevant to safety-critical, socio-technical systems that are highly dynamic, stressful and time-constrained, and where failures can result in catastrophic societal, economic or environmental…

  17. Influence of organizational culture on human error

    International Nuclear Information System (INIS)

    Friedlander, M.A.; Evans, S.A.

    1996-01-01

    Much has been written in contemporary business literature during the last decade describing the role that corporate culture plays in virtually every aspect of a firm's success. In 1990 Kotter and Heskett wrote, "We found that firms with cultures that emphasized all of the key managerial constituencies (customers, stockholders, and employees) and leadership from managers at all levels out-performed firms that did not have those cultural traits by a huge margin. Over an eleven-year period, the former increased revenues by an average of 682 percent versus 166 percent for the latter, expanded their workforce by 282 percent versus 36 percent, grew their stock prices by 901 percent versus 74 percent, and improved their net incomes by 756 percent versus 1 percent." Since the mid-1980s, several electric utilities have documented their efforts to undertake strategic culture change. In almost every case, these efforts have yielded dramatic improvements in "bottom-line" operational and financial results (e.g., Western Resources, Arizona Public Service, San Diego Gas & Electric, and Electricity Trust of South Australia). Given the body of evidence that indicates a relationship between a high-performing organizational culture and the financial and business success of a firm, Pennsylvania Power & Light Company undertook a study to identify the relationship between organizational culture and the frequency, severity, and nature of human error at the Susquehanna Steam Electric Station. The underlying proposition for this assessment is that organizational culture is an independent variable that transforms external events into organizational performance

  18. Error propagation analysis for a sensor system

    International Nuclear Information System (INIS)

    Yeater, M.L.; Hockenbury, R.W.; Hawkins, J.; Wilkinson, J.

    1976-01-01

    As part of a program to develop reliability methods for operational use with reactor sensors and protective systems, error propagation analyses are being made for each model. An example is a sensor system computer simulation model, in which the sensor system signature is convoluted with a reactor signature to show the effect of each in revealing or obscuring information contained in the other. The error propagation analysis models the system and signature uncertainties and sensitivities, whereas the simulation models the signatures and by extensive repetitions reveals the effect of errors in various reactor input or sensor response data. In the approach for the example presented, the errors accumulated by the signature (set of ''noise'' frequencies) are successively calculated as it is propagated stepwise through a system comprised of sensor and signal processing components. Additional modeling steps include a Fourier transform calculation to produce the usual power spectral density representation of the product signature, and some form of pattern recognition algorithm
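
    The report's stepwise accumulation of errors through a chain of sensor and signal-processing components can be illustrated generically. The sketch below assumes independent, multiplicative stage errors combined in quadrature, which is a textbook propagation rule rather than the report's specific model, and the stage values are hypothetical.

```python
import math

def chain_relative_error(stage_rel_errors):
    """Combined relative error of a signature propagated through a chain of
    independent multiplicative stages: quadrature sum of the stage errors."""
    return math.sqrt(sum(e * e for e in stage_rel_errors))

# Hypothetical stages: sensor, amplifier, digitizer, FFT windowing
stages = [0.02, 0.01, 0.005, 0.015]
print(f"combined relative error: {chain_relative_error(stages):.4f}")
```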

  19. Design of a Computer-Aided System for Analyzing Human Error in Indonesian Railways

    Directory of Open Access Journals (Sweden)

    Wiwik Budiawan

    2013-06-01

    Full Text Available Train accidents occurring in rapid succession in Indonesia have reached a critical level. According to data from the Directorate General of Railways, a total of 611 train accidents occurred over the last five years (2005-2009). Many factors contribute to these accidents, including rolling stock, infrastructure, the human operator (human error), external factors, and natural causes. Human error is one of the factors that can potentially cause a train accident and is cited as the main cause of train accidents in Indonesia. However, it is not clear how this analysis is performed. Human error studies by the National Transportation Safety Committee (KNKT) are still relatively limited and are not supported by a systematic method. Several analysis methods have been developed to date, but few have been adapted to the railway mode of transport. The Human Factors Analysis and Classification System (HFACS) is a human error analysis method that was adapted here to the Indonesian railway system. To improve the reliability of human error analysis, HFACS was then implemented as a web-based application accessible from computers and smartphones. The results of this research can be used by KNKT as a method for analyzing railway accidents, particularly those related to human error. Keywords: human error, HFACS, CAS, railways

  20. Nursing Errors in Intensive Care Unit by Human Error Identification in Systems Tool: A Case Study

    Directory of Open Access Journals (Sweden)

    Nezamodini

    2016-03-01

    Full Text Available Background Although health services are designed and implemented to improve human health, errors in health services are a very common and sometimes fatal phenomenon in this field. Medical errors and their costs are global issues with serious consequences for the patient community; they are preventable and require serious attention. Objectives The current study aimed to identify possible nursing errors by applying the human error identification in systems tool (HEIST) in the intensive care units (ICUs) of hospitals. Patients and Methods This descriptive research was conducted in the intensive care unit of a hospital in Khuzestan province in 2013. Data were collected over a period of four months through observation of and interviews with nine nurses in this unit. Human error classification was based on the Rouse and Rouse and Swain and Guttmann models. Following the HEIST worksheets, the guide questions were answered and error causes were identified after the determination of the type of errors. Results In total, 527 errors were detected. 'Performing an operation on the wrong path' had the highest frequency (150), and the second, with a frequency of 136, was 'doing tasks later than the deadline'. Management causes, with a frequency of 451, ranked first among the identified error causes. Errors mostly occurred in the system observation stage, and among the performance shaping factors (PSFs), time was the most influential factor in the occurrence of human errors. Conclusions Finally, in order to prevent the occurrence and reduce the consequences of the identified errors, the following suggestions were proposed: appropriate training courses, applying work guidelines and monitoring their implementation, increasing the number of work shifts, hiring professional workforce, and equipping the work space with appropriate facilities and equipment.

  1. Can human error theory explain non-adherence?

    Science.gov (United States)

    Barber, Nick; Safdar, A; Franklin, Bryoney D

    2005-08-01

    To apply human error theory to explain non-adherence and examine how well it fits. Patients who were taking chronic medication were telephoned and asked whether they had been adhering to their medicine; if not, the reasons were explored and analysed according to a human error theory. Of 105 patients, 87 were contacted by telephone and took part in the study. Forty-two recalled being non-adherent, 17 of them in the last 7 days; 11 of the 42 were intentionally non-adherent. The errors could be described by human error theory, and it explained unintentional non-adherence well; however, the application of 'rules' was difficult when considering mistakes. The consideration of error-producing conditions and latent failures also revealed useful contributing factors. Human error theory offers a new and valuable way of understanding non-adherence, and could inform interventions. However, the theory needs further development to explain intentional non-adherence.

  2. Savannah River Site human error data base development for nonreactor nuclear facilities

    International Nuclear Information System (INIS)

    Benhardt, H.C.; Held, J.E.; Olsen, L.M.; Vail, R.E.; Eide, S.A.

    1994-01-01

    As part of an overall effort to upgrade and streamline methodologies for safety analyses of nonreactor nuclear facilities at the Savannah River Site (SRS), a human error data base has been developed and is presented in this report. The data base fulfills several needs of risk analysts supporting safety analysis report (SAR) development. First, it provides a single source for probabilities or rates for a wide variety of human errors associated with the SRS nonreactor nuclear facilities. Second, it provides a documented basis for human error probabilities or rates. And finally, it provides actual SRS-specific human error data to support many of the error probabilities or rates. Use of a single, documented reference source for human errors, supported by SRS-specific human error data, will improve the consistency and accuracy of human error modeling by SRS risk analysts. It is envisioned that SRS risk analysts will use this report as both a guide to identifying the types of human errors that may need to be included in risk models such as fault and event trees, and as a source for human error probabilities or rates. For each human error in this report, five different mean probabilities or rates are presented to cover a wide range of conditions and influencing factors. The risk analysts must decide which mean value is most appropriate for each particular application. If other types of human errors are needed for the risk models, the analyst must use other sources. Finally, if human errors are dominant in the quantified risk models (based on the values obtained from this report), then it may be appropriate to perform detailed human reliability analyses (HRAs) for the dominant events. This document does not provide guidance for such refined HRAs; in such cases experienced human reliability analysts should be involved

  3. Risk Management and the Concept of Human Error

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1995-01-01

    by a stochastic coincidence of faults and human errors, but by a systemic erosion of the defenses due to decision making under competitive pressure in a dynamic environment. The presentation will discuss the nature of human error and the risk management problems found in a dynamic, competitive society facing...

  4. Inherent safety, ethics and human error.

    Science.gov (United States)

    Papadaki, Maria

    2008-02-11

    A patient is to have the damaged left kidney removed. To safeguard correctness of action, several layers of expert checks have been performed prior to the operation, which results in the removal of the fully functional right kidney. Nobody asked the patient. The patient did not volunteer providing "unnecessary" information. The experts know everything ... An untidy house made out of flammable materials. A careless smoker left his lit cigarette unattended. A blow of wind and the house goes up in flames. Would better construction materials have prevented the accident in spite of the carelessness of the inhabitant? A tricky medical condition which is expected to provoke a patient's fast health deterioration and their slow death. The doctor takes the initiative and responsibility of performing a risky operation. The patient's life is saved and their health is re-established. This work is not, as initially intended, the result of a thorough investigation of accidents, nor does it contain a systematic collection of data that can support the conclusions or the suggestions made. It is in the main a compilation of personal views. These views have been established from the correlation of the results of numerous accident investigation reports with the causes of small and insignificant incidents. These incidents are related to the education of university students, regulations within an academic environment, and independent personal experience working in different countries and with people of different cultures. The analysis that follows, however, should not be perceived as a mere reference to university students and/or to a university environment. University is the place where the fundamental scientific and engineering principles are germinated, while current and past university students are the future and current production and design engineers, respectively.
    The places where the presented incidents have occurred are not always relevant to the conclusions, thus they are not

  5. Development and evaluation of a computer-aided system for analyzing human error in railway operations

    International Nuclear Information System (INIS)

    Kim, Dong San; Baek, Dong Hyun; Yoon, Wan Chul

    2010-01-01

    As human error has been recognized as one of the major contributors to accidents in safety-critical systems, there has been a strong need for techniques that can analyze human error effectively. Although many techniques have been developed so far, much room for improvement remains. As human error analysis is a cognitively demanding and time-consuming task, it is particularly necessary to develop a computerized system supporting this task. This paper presents a computer-aided system for analyzing human error in railway operations, called Computer-Aided System for Human Error Analysis and Reduction (CAS-HEAR). It helps analysts find multiple levels of error causes and their causal relations by using predefined links between contextual factors and causal factors, as well as links between causal factors. In addition, it is based on a complete accident model; hence, it helps analysts conduct a thorough analysis without missing any important part of human error analysis. A prototype of CAS-HEAR was evaluated by nine field investigators from six railway organizations in Korea. Its overall usefulness in human error analysis was confirmed, although development of a simplified version and some modification of the contextual factors and causal factors are required in order to ensure its practical use.

  6. The probability and the management of human error

    International Nuclear Information System (INIS)

    Duffey, R.B.; Saull, J.W.

    2004-01-01

    Embedded within modern technological systems, human error is the largest, and indeed dominant, contributor to accident cause. The consequences dominate the risk profiles for nuclear power and for many other technologies. We need to quantify the probability of human error for the system as an integral contribution within the overall system failure, as it is generally not separable or predictable for actual events. We also need to provide a means to manage and effectively reduce the failure (error) rate. The fact that humans learn from their mistakes allows a new determination of the dynamic probability and human failure (error) rate in technological systems. The result is consistent with and derived from the available world data for modern technological systems. Comparisons are made to actual data from large technological systems and recent catastrophes. Best-estimate values and relationships can be derived for both the human error rate and the probability. We describe the potential for new approaches to the management of human error and safety indicators, based on the principles of error state exclusion and of the systematic effect of learning. A new equation is given for the probability of human error (λ) that combines the influences of early inexperience, learning from experience (ε) and stochastic occurrences, with a finite minimum rate: λ = 5×10⁻⁵ + ((1/ε) - 5×10⁻⁵) exp(-3ε). The future failure rate is entirely determined by the experience: thus the past defines the future
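
    The abstract's learning-curve equation can be transcribed directly; the experience variable ε is assumed here to be in the authors' nondimensional accumulated-experience units.

```python
import math

LAMBDA_MIN = 5.0e-5  # finite minimum error rate from the abstract's equation

def human_error_rate(experience):
    """Learning-curve human error rate:
    lambda = 5e-5 + ((1/eps) - 5e-5) * exp(-3 * eps),
    where eps is accumulated (nondimensional) experience."""
    return LAMBDA_MIN + (1.0 / experience - LAMBDA_MIN) * math.exp(-3.0 * experience)

# Rate falls steeply with experience and flattens at LAMBDA_MIN
for eps in (0.1, 0.5, 1.0, 2.0, 10.0):
    print(f"eps={eps:5.1f}  lambda={human_error_rate(eps):.6f}")
```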

  7. Numeracy, Literacy and Newman's Error Analysis

    Science.gov (United States)

    White, Allan Leslie

    2010-01-01

    Newman (1977, 1983) defined five specific literacy and numeracy skills as crucial to performance on mathematical word problems: reading, comprehension, transformation, process skills, and encoding. Newman's Error Analysis (NEA) provided a framework for considering the reasons that underlay the difficulties students experienced with mathematical…

  8. Human error as a source of disturbances in Swedish nuclear power plants

    International Nuclear Information System (INIS)

    Sokolowski, E.

    1985-01-01

    Events involving human errors at the Swedish nuclear power plants are registered and periodically analyzed. The philosophy behind the scheme for data collection and analysis is discussed. Human errors cause about 10% of the disturbances registered. Only a small part of these errors are committed by operators in the control room. These and other findings differ from those in other countries. Possible reasons are put forward

  9. Analysis of the interface tracking errors

    International Nuclear Information System (INIS)

    Cerne, G.; Tiselj, I.; Petelin, S.

    2001-01-01

    An important limitation of the interface-tracking algorithm is the grid density, which determines the space scale of the surface tracking. In this paper the analysis of the interface tracking errors, which occur in a dispersed flow, is performed for the VOF interface tracking method. A few simple two-fluid tests are proposed for the investigation of the interface tracking errors and their grid dependence. When the grid density becomes too coarse to follow the interface changes, the errors can be reduced either by using denser nodalization or by switching to the two-fluid model during the simulation. Both solutions are analyzed and compared on a simple vortex-flow test. (author)

  10. Analysis of gross error rates in operation of commercial nuclear power stations

    International Nuclear Information System (INIS)

    Joos, D.W.; Sabri, Z.A.; Husseiny, A.A.

    1979-01-01

    Experience in operation of US commercial nuclear power plants is reviewed over a 25-month period. The reports accumulated in that period on events of human error and component failure are examined to evaluate gross operator error rates. The impact of such errors on plant operation and safety is examined through the use of proper taxonomies of error, tasks and failures. Four categories of human errors are considered; namely, operator, maintenance, installation and administrative. The computed error rates are used to examine appropriate operator models for evaluation of operator reliability. Human error rates are found to be significant to a varying degree in both BWRs and PWRs. This emphasizes the importance of considering human factors in the safety and reliability analysis of nuclear systems. The results also indicate that human errors, and especially operator errors, do indeed follow the exponential reliability model. (Auth.)
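
    The exponential reliability model the authors test implies a constant error rate λ, for which the maximum-likelihood estimate is simply the error count divided by exposure time; the count and hours in this sketch are hypothetical, not taken from the study.

```python
import math

def mle_error_rate(n_errors, exposure_hours):
    """MLE of a constant (exponential-model) error rate: lambda = k / T."""
    return n_errors / exposure_hours

def reliability(rate, hours):
    """Probability of error-free operation over `hours`: R(t) = exp(-lambda*t)."""
    return math.exp(-rate * hours)

# Hypothetical: 14 operator errors over ~18000 plant-hours of observation
lam = mle_error_rate(14, 18000.0)
print(f"lambda = {lam:.2e} per hour, R(720 h) = {reliability(lam, 720):.3f}")
```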

  11. Investigations on human error hazards in recent unintended trip events of Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Jang, Tong Il; Lee, Yong Hee; Shin, Kwang Hyeon [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    According to the Operational Performance Information System (OPIS), which the KINS (Korea Institute of Nuclear Safety) operates to improve public understanding, unintended trip events caused mainly by human error amounted to 38 cases (18.7%) from 2000 to 2011. Although the Nuclear Power Plant (NPP) industry in Korea has been making efforts to reduce the human errors which have contributed to a large share of trip events, the human error rate may keep increasing. Interestingly, digital-based I&C systems are one of the factors reducing unintended reactor trips. Human errors, however, have occurred due to these digital-based I&C systems, because such systems demand new or changed behaviors from NPP operators. Therefore, it is necessary that investigations of human errors adopt a new methodology to find not only tangible behavior but also intangible behavior, such as organizational behavior. In this study we investigated human errors to find latent factors, such as decisions and conditions, in all of the unintended reactor trip events during the last dozen years. To find them, we applied HFACS (Human Factors Analysis and Classification System), a commonly utilized tool for investigating human contributions to aviation accidents under a widespread evaluation scheme. The objective of this study is to find the latent factors behind human errors in nuclear reactor trip events. Therefore, a method to investigate unintended trip events caused by human errors and the results will be discussed in more detail.

  12. Investigations on human error hazards in recent unintended trip events of Korean nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Jang, Tong Il; Lee, Yong Hee; Shin, Kwang Hyeon

    2012-01-01

    According to the Operational Performance Information System (OPIS), which the KINS (Korea Institute of Nuclear Safety) operates to improve public understanding, unintended trip events caused mainly by human error amounted to 38 cases (18.7%) from 2000 to 2011. Although the Nuclear Power Plant (NPP) industry in Korea has been making efforts to reduce the human errors which have contributed to a large share of trip events, the human error rate may keep increasing. Interestingly, digital-based I&C systems are one of the factors reducing unintended reactor trips. Human errors, however, have occurred due to these digital-based I&C systems, because such systems demand new or changed behaviors from NPP operators. Therefore, it is necessary that investigations of human errors adopt a new methodology to find not only tangible behavior but also intangible behavior, such as organizational behavior. In this study we investigated human errors to find latent factors, such as decisions and conditions, in all of the unintended reactor trip events during the last dozen years. To find them, we applied HFACS (Human Factors Analysis and Classification System), a commonly utilized tool for investigating human contributions to aviation accidents under a widespread evaluation scheme. The objective of this study is to find the latent factors behind human errors in nuclear reactor trip events. Therefore, a method to investigate unintended trip events caused by human errors and the results will be discussed in more detail

  13. Error analysis of stochastic gradient descent ranking.

    Science.gov (United States)

    Chen, Hong; Tang, Yi; Li, Luoqing; Yuan, Yuan; Li, Xuelong; Tang, Yuanyan

    2013-06-01

    Ranking is always an important task in machine learning and information retrieval, e.g., collaborative filtering, recommender systems, drug discovery, etc. A kernel-based stochastic gradient descent algorithm with the least squares loss is proposed for ranking in this paper. The implementation of this algorithm is simple, and an expression of the solution is derived via a sampling operator and an integral operator. An explicit convergence rate for learning a ranking function is given in terms of suitable choices of the step size and the regularization parameter. The analysis technique used here is capacity independent and is novel in the error analysis of ranking learning. Experimental results on real-world data have shown the effectiveness of the proposed algorithm in ranking tasks, which verifies the theoretical analysis of the ranking error.
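
    A minimal sketch of a kernel-based SGD ranker with a least-squares pairwise loss, in the spirit of the algorithm described; the Gaussian kernel, step size, and regularization values are illustrative choices, not those of the paper.

```python
import math
import random

def gauss_kernel(a, b, sigma=0.5):
    """Gaussian kernel on scalars."""
    return math.exp(-((a - b) ** 2) / (2 * sigma ** 2))

def train_pairwise_rank(xs, ys, steps=2000, eta=0.2, reg=0.01, seed=0):
    """Kernel SGD for ranking with a least-squares pairwise loss:
    minimize ((f(xi) - f(xj)) - (yi - yj))^2 over random pairs, with L2
    regularization. Returns dual coefficients alpha such that
    f(x) = sum_k alpha[k] * K(xs[k], x)."""
    rng = random.Random(seed)
    n = len(xs)
    alpha = [0.0] * n

    def f(x):
        return sum(a * gauss_kernel(xk, x) for a, xk in zip(alpha, xs))

    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i == j:
            continue
        residual = (f(xs[i]) - f(xs[j])) - (ys[i] - ys[j])
        for k in range(n):                # shrinkage from the L2 penalty
            alpha[k] *= 1.0 - eta * reg
        alpha[i] -= eta * residual        # push the score gap toward the target
        alpha[j] += eta * residual
    return alpha

xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = list(xs)                             # true relevance increases with x
alpha = train_pairwise_rank(xs, ys)

def score(x):
    return sum(a * gauss_kernel(xk, x) for a, xk in zip(alpha, xs))

print(score(1.0) > score(0.0))
```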

  14. The Human Bathtub: Safety and Risk Predictions Including the Dynamic Probability of Operator Errors

    International Nuclear Information System (INIS)

    Duffey, Romney B.; Saull, John W.

    2006-01-01

    Reactor safety and risk are dominated by the potential for and major contribution of human error in the design, operation, control, management, regulation and maintenance of the plant, and hence its contribution to all accidents. Given the possibility of accidents and errors, we now need to determine the outcome (error) probability, or the chance of failure. Conventionally, reliability engineering is associated with the failure rate of components, systems, or mechanisms, not of human beings interacting with a technological system. The probability of failure requires a prior knowledge of the total number of outcomes, which for any predictive purposes we do not know or have. Analysis of failure rates due to human error and the rate of learning allow a new determination of the dynamic human error rate in technological systems, consistent with and derived from the available world data. The basis for the analysis is the 'learning hypothesis' that humans learn from experience, and consequently the accumulated experience defines the failure rate. A new 'best' equation has been derived for the human error, outcome or failure rate, which allows for calculation and prediction of the probability of human error. We also provide comparisons to the empirical Weibull parameter fitting used in and by conventional reliability engineering and probabilistic safety analysis methods. These new analyses show that arbitrary Weibull fitting parameters and typical empirical hazard function techniques cannot be used to predict the dynamics of human errors and outcomes in the presence of learning. Comparisons of these new insights show agreement with human error data from the world's commercial airlines, the two shuttle failures, and from nuclear plant operator actions and transient control behavior observed in transients in both plants and simulators.
The results demonstrate that the human error probability (HEP) is dynamic, and that it may be predicted using the learning hypothesis and the minimum

  15. FRamework Assessing Notorious Contributing Influences for Error (FRANCIE): Perspective on Taxonomy Development to Support Error Reporting and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lon N. Haney; David I. Gertman

    2003-04-01

    Beginning in the 1980s, a primary focus of human reliability analysis was the estimation of human error probabilities. However, detailed qualitative modeling with comprehensive representation of contextual variables was often lacking. This was likely due to the lack of comprehensive error and performance shaping factor taxonomies, and the limited data available on observed error rates and their relationship to specific contextual variables. In the mid-90s Boeing, America West Airlines, NASA Ames Research Center and INEEL partnered in a NASA-sponsored Advanced Concepts grant to assess the state of the art in human error analysis, identify future needs for human error analysis, and develop an approach addressing these needs. The needs identified included a method to identify and prioritize task and contextual characteristics affecting human reliability, as well as comprehensive taxonomies to support detailed qualitative modeling and to structure meaningful data collection efforts across domains. A result was the development of the FRamework Assessing Notorious Contributing Influences for Error (FRANCIE), with a taxonomy for airline maintenance tasks. The assignment of performance shaping factors to generic errors by experts proved to be valuable for qualitative modeling. Performance shaping factors and error types from such detailed approaches can be used to structure error reporting schemes. In a recent NASA Advanced Human Support Technology grant, FRANCIE was refined and two new taxonomies for use on space missions were developed. The development, sharing, and use of error taxonomies, and the refinement of approaches for increased fidelity of qualitative modeling, are offered as a means to help direct useful data collection strategies.

  16. Sensitivity of risk parameters to human errors for a PWR

    International Nuclear Information System (INIS)

    Samanta, P.; Hall, R.E.; Kerr, W.

    1980-01-01

Sensitivities of the risk parameters (emergency safety system unavailabilities, accident sequence probabilities, release category probabilities, and core melt probability) were investigated for changes in the human error rates within the general methodological framework of the Reactor Safety Study for a Pressurized Water Reactor (PWR). The impact of individual human errors was assessed both in terms of their structural importance to core melt and their reliability importance on core melt probability. The Human Error Sensitivity Assessment of a PWR (HESAP) computer code was written for the purpose of this study.

  17. Human error probability quantification using fuzzy methodology in nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, Claudio Souza do

    2010-01-01

This work obtains Human Error Probability (HEP) estimates for operator actions in response to hypothetical emergency situations at the IEA-R1 Research Reactor at IPEN. A Performance Shaping Factors (PSF) evaluation was also carried out in order to classify the PSFs according to their level of influence on the operators' actions and to determine the actual states of these PSFs in the plant. Both the HEP estimation and the PSF evaluation were based on specialist evaluation using interviews and questionnaires; the specialist group was composed of selected IEA-R1 operators. The specialists' knowledge was represented as linguistic variables, and group evaluation values were obtained through Fuzzy Logic and Fuzzy Set Theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative for use in Human Reliability Analysis (HRA). (author)
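The fuzzy treatment this record describes can be illustrated with a minimal sketch: expert linguistic ratings are mapped to triangular fuzzy numbers, averaged, and defuzzified into a crisp HEP. The linguistic scale, anchor values, and expert ratings below are illustrative assumptions, not data from the paper.

```python
# Illustrative sketch: each linguistic term maps to a triangular fuzzy
# number (a, m, b) on a log10(HEP) axis. The terms and anchors are assumed.
TERMS = {
    "very low": (-5.0, -4.0, -3.0),
    "low":      (-4.0, -3.0, -2.0),
    "medium":   (-3.0, -2.0, -1.0),
    "high":     (-2.0, -1.5, -1.0),
}

def aggregate(ratings):
    """Average the experts' fuzzy numbers component-wise."""
    n = len(ratings)
    a = sum(TERMS[r][0] for r in ratings) / n
    m = sum(TERMS[r][1] for r in ratings) / n
    b = sum(TERMS[r][2] for r in ratings) / n
    return a, m, b

def defuzzify(tri):
    """Centroid of a triangular fuzzy number: the mean of its vertices."""
    return sum(tri) / 3.0

ratings = ["low", "low", "medium", "very low"]  # hypothetical expert panel
log_hep = defuzzify(aggregate(ratings))
hep = 10 ** log_hep
print(f"estimated HEP = {hep:.2e}")
```

For the hypothetical panel above, the aggregated fuzzy number is (-4, -3, -2) on the log scale, giving a crisp estimate of 1e-3; a real application would use a calibrated scale and a richer defuzzification.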

  18. Human error prediction and countermeasures based on CREAM in spent nuclear fuel (SNF) transportation

    International Nuclear Information System (INIS)

    Kim, Jae San

    2007-02-01

Since the 1980s, in order to secure the storage capacity for spent nuclear fuel (SNF) at NPPs, SNF assemblies have been transported on-site from one unit to a nearby unit. In the future, however, the amount of spent fuel will approach capacity in the areas used, and some of these SNFs will have to be transported to an off-site spent fuel repository. Most SNF materials used at NPPs will be transported by general cargo ships from abroad, and these SNFs will be stored in an interim storage facility. In the process of transporting SNF, human interactions will involve inspecting and preparing the cask and spent fuel, loading the cask onto the vehicle or ship, and transferring, storing, or monitoring the cask. The transportation of SNF involves a number of activities that depend on reliable human performance. In the case of cask transport, human errors may include spent fuel bundle misidentification or cask transport accidents, among others. Reviews of accident events during transport of Radioactive Material (RAM) throughout the world indicate that human error is the major cause of more than 65% of significant events. For the safety of SNF transportation, it is very important to predict human error and to deduce a method that minimizes it. This study examines the effects of human factors on the safety of transporting spent nuclear fuel (SNF). It predicts and identifies the possible human errors in the SNF transport process (loading, transfer, and storage of the SNF). After evaluating the human error modes in each transport process, countermeasures to minimize human error are deduced. The human errors in SNF transportation were analyzed using Hollnagel's Cognitive Reliability and Error Analysis Method (CREAM). After determining the important factors for each process, countermeasures to minimize human error are provided in three parts: system design, operational environment, and human ability.

  19. A method for analysing incidents due to human errors on nuclear installations

    International Nuclear Information System (INIS)

    Griffon, M.

    1980-01-01

This paper deals with the development of a methodology adapted to a detailed analysis of incidents considered to be due to human errors. An identification of the human errors and a search for their possible multiple causes is then needed. The causes are categorized into eight classes: education and training of personnel, installation design, work organization, time and work duration, physical environment, social environment, history of the plant, and performance of the operator. The method is illustrated by the analysis of a handling incident generated by multiple human errors. (author)

  20. The psychological background about human error and safety in NPP

    International Nuclear Information System (INIS)

    Zhang Li

    1992-01-01

Human error is one of the factors that cause accidents in NPPs, and the in-situ psychological background plays an important role in inducing it. The author analyzes the structure of a worker's psychological background on the job and gives a few examples of typical psychological backgrounds resulting in human errors. Finally, the paper points out that the fundamental way to eliminate the psychological backgrounds unfavourable to safe production is to establish a safety culture, with its specific characteristics, in the NPP.

  1. Exploring human error in military aviation flight safety events using post-incident classification systems.

    Science.gov (United States)

    Hooper, Brionny J; O'Hare, David P A

    2013-08-01

Human error classification systems theoretically allow researchers to analyze post-accident data in an objective and consistent manner. The Human Factors Analysis and Classification System (HFACS) framework is one such practical analysis tool that has been widely used to classify human error in aviation; the Cognitive Error Taxonomy (CET) is another. It has been postulated that the focus on interrelationships within HFACS can facilitate the identification of the underlying causes of pilot error, while the CET provides increased granularity at the level of unsafe acts. The aim was to analyze the influence of factors at higher organizational levels on the unsafe acts of front-line operators and to compare the errors of fixed-wing and rotary-wing operations. This study analyzed 288 aircraft incidents involving human error from an Australasian military organization, occurring between 2001 and 2008. Action errors accounted for almost twice the proportion of rotary-wing (44%) compared to fixed-wing (23%) incidents. Both classificatory systems showed significant relationships between precursor factors such as the physical environment, mental and physiological states, crew resource management, training, and personal readiness, and skill-based, but not decision-based, acts. The CET analysis showed different predisposing factors for different aspects of skill-based behaviors. Skill-based errors in military operations are more prevalent in rotary-wing incidents and are related to higher-level supervisory processes in the organization. The Cognitive Error Taxonomy provides increased granularity to HFACS analyses of unsafe acts.

  2. Intervention strategies for the management of human error

    Science.gov (United States)

    Wiener, Earl L.

    1993-01-01

This report examines the management of human error in the cockpit. The principles probably apply to other applications in the aviation realm (e.g., air traffic control, dispatch, weather) as well as to other high-risk systems outside of aviation (e.g., shipping, high-technology medical procedures, military operations, nuclear power production). Management of human error is distinguished from error prevention: it is a more encompassing term, which includes not only the prevention of error but also means of preventing an error, once made, from adversely affecting system output. Such techniques include: traditional human factors engineering, improvement of feedback and feedforward of information from system to crew, 'error-evident' displays which make erroneous input more obvious to the crew, trapping of errors within a system, goal-sharing between humans and machines (also called 'intent-driven' systems), paperwork management, and behaviorally based approaches, including procedures, standardization, checklist design, training, cockpit resource management, etc. Fifteen guidelines for the design and implementation of intervention strategies are included.

  3. Modeling Human Error Mechanism for Soft Control in Advanced Control Rooms (ACRs)

    International Nuclear Information System (INIS)

    Aljneibi, Hanan Salah Ali; Ha, Jun Su; Kang, Seongkeun; Seong, Poong Hyun

    2015-01-01

To achieve the switch from conventional analog-based design to digital design in ACRs, a large number of manual operating controls and switches have to be replaced by a few common multi-function devices, collectively called the soft control system. The soft controls in APR-1400 ACRs are classified into safety-grade and non-safety-grade soft controls; each was designed using different and independent input devices. Operations using soft controls require operators to perform new tasks that were not necessary with conventional controls, such as navigating computerized displays to monitor plant information and control devices. These computerized displays and soft controls may make operations more convenient, but they may also cause new types of human error. In this study, the human error mechanism during soft control is studied and modeled for use in the analysis and enhancement of human performance (or reduction of human errors) during NPP operation. The developed model could contribute to many applications for improving human performance, HMI designs, and operators' training programs in ACRs. The model is based on the following assumptions: a human operator has a certain amount of capacity in cognitive resources, and if the resources required by the operating tasks are greater than the resources invested by the operator, human error (or poor human performance) is likely to occur (especially a 'slip'); good HMI (Human-Machine Interface) design decreases the required resources; the operator's skillfulness decreases the required resources; and high vigilance increases the invested resources

  4. Modeling Human Error Mechanism for Soft Control in Advanced Control Rooms (ACRs)

    Energy Technology Data Exchange (ETDEWEB)

    Aljneibi, Hanan Salah Ali [Khalifa Univ., Abu Dhabi (United Arab Emirates); Ha, Jun Su; Kang, Seongkeun; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2015-10-15

To achieve the switch from conventional analog-based design to digital design in ACRs, a large number of manual operating controls and switches have to be replaced by a few common multi-function devices, collectively called the soft control system. The soft controls in APR-1400 ACRs are classified into safety-grade and non-safety-grade soft controls; each was designed using different and independent input devices. Operations using soft controls require operators to perform new tasks that were not necessary with conventional controls, such as navigating computerized displays to monitor plant information and control devices. These computerized displays and soft controls may make operations more convenient, but they may also cause new types of human error. In this study, the human error mechanism during soft control is studied and modeled for use in the analysis and enhancement of human performance (or reduction of human errors) during NPP operation. The developed model could contribute to many applications for improving human performance, HMI designs, and operators' training programs in ACRs. The model is based on the following assumptions: a human operator has a certain amount of capacity in cognitive resources, and if the resources required by the operating tasks are greater than the resources invested by the operator, human error (or poor human performance) is likely to occur (especially a 'slip'); good HMI (Human-Machine Interface) design decreases the required resources; the operator's skillfulness decreases the required resources; and high vigilance increases the invested resources.
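The resource-based assumptions stated in this record can be rendered as a toy model: an error becomes likely when the cognitive resources a soft-control task requires exceed those the operator invests. All scores below are assumed, unitless values, not quantities from the study.

```python
# Toy rendering of the stated assumptions. Scores are assumed and unitless.

def required_resources(task_load, hmi_quality, skillfulness):
    """Good HMI design and operator skill both reduce the demand."""
    return task_load - hmi_quality - skillfulness

def invested_resources(capacity, vigilance):
    """High vigilance raises the share of capacity actually invested."""
    return capacity * vigilance

def error_likely(task_load, hmi_quality, skillfulness, capacity, vigilance):
    """Error is likely when required resources exceed invested resources."""
    return required_resources(task_load, hmi_quality, skillfulness) > \
           invested_resources(capacity, vigilance)

# A demanding task on a poor HMI with a low-vigilance operator:
print(error_likely(10, 1, 2, 10, 0.5))   # required 7 > invested 5
# The same task after an HMI redesign, with full vigilance:
print(error_likely(10, 4, 2, 10, 0.9))   # required 4 < invested 9
```

The two calls show the model's qualitative behavior: improving the HMI or raising vigilance flips the prediction from error-likely to error-unlikely, which is exactly the lever the record attributes to HMI design and training.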

  5. Human medial frontal cortex activity predicts learning from errors.

    Science.gov (United States)

    Hester, Robert; Barre, Natalie; Murphy, Kevin; Silk, Tim J; Mattingley, Jason B

    2008-08-01

    Learning from errors is a critical feature of human cognition. It underlies our ability to adapt to changing environmental demands and to tune behavior for optimal performance. The posterior medial frontal cortex (pMFC) has been implicated in the evaluation of errors to control behavior, although it has not previously been shown that activity in this region predicts learning from errors. Using functional magnetic resonance imaging, we examined activity in the pMFC during an associative learning task in which participants had to recall the spatial locations of 2-digit targets and were provided with immediate feedback regarding accuracy. Activity within the pMFC was significantly greater for errors that were subsequently corrected than for errors that were repeated. Moreover, pMFC activity during recall errors predicted future responses (correct vs. incorrect), despite a sizeable interval (on average 70 s) between an error and the next presentation of the same recall probe. Activity within the hippocampus also predicted future performance and correlated with error-feedback-related pMFC activity. A relationship between performance expectations and pMFC activity, in the absence of differing reinforcement value for errors, is consistent with the idea that error-related pMFC activity reflects the extent to which an outcome is "worse than expected."

  6. Error begat error: design error analysis and prevention in social infrastructure projects.

    Science.gov (United States)

    Love, Peter E D; Lopez, Robert; Edwards, David J; Goh, Yang M

    2012-09-01

Design errors contribute significantly to cost and schedule growth in social infrastructure projects and to engineering failures, which can result in accidents and loss of life. Despite considerable research addressing error causation in construction projects, design errors remain prevalent. This paper identifies the underlying conditions that contribute to design errors in social infrastructure projects (e.g. hospitals, education, and law-and-order type buildings). A systemic model of error causation is proposed and subsequently used to develop a learning framework for design error prevention. The research suggests that a multitude of strategies should be adopted in congruence to prevent design errors from occurring and so ensure that safety and project performance are ameliorated. Copyright © 2011. Published by Elsevier Ltd.

  7. Derivation of main drivers affecting the possibility of human errors during low power and shutdown operation

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Seong, Poong Hyun; Park, Jin Kyun; Kim, Jae Whan

    2016-01-01

In order to estimate the possibility of human error and identify its nature, human reliability analysis (HRA) methods have been implemented. Various HRA methods have been developed so far: the technique for human error rate prediction (THERP), the cause-based decision tree (CBDT), the cognitive reliability and error analysis method (CREAM), and so on. Most HRA methods have been developed with a focus on full power operation of NPPs, even though human performance may affect the safety of the system more during low power and shutdown (LPSD) operation than during full power operation. In this regard, it is necessary to conduct research toward developing an HRA method for use in LPSD operation. As the first step of the study, the main drivers which affect the possibility of human error have been derived. Drivers, commonly called performance shaping factors (PSFs), are aspects of the human's individual characteristics, environment, organization, or task that specifically decrement or improve human performance, thus respectively increasing or decreasing the likelihood of human errors.

  8. Derivation of main drivers affecting the possibility of human errors during low power and shutdown operation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Park, Jin Kyun; Kim, Jae Whan [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

In order to estimate the possibility of human error and identify its nature, human reliability analysis (HRA) methods have been implemented. Various HRA methods have been developed so far: the technique for human error rate prediction (THERP), the cause-based decision tree (CBDT), the cognitive reliability and error analysis method (CREAM), and so on. Most HRA methods have been developed with a focus on full power operation of NPPs, even though human performance may affect the safety of the system more during low power and shutdown (LPSD) operation than during full power operation. In this regard, it is necessary to conduct research toward developing an HRA method for use in LPSD operation. As the first step of the study, the main drivers which affect the possibility of human error have been derived. Drivers, commonly called performance shaping factors (PSFs), are aspects of the human's individual characteristics, environment, organization, or task that specifically decrement or improve human performance, thus respectively increasing or decreasing the likelihood of human errors.

  9. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    Directory of Open Access Journals (Sweden)

    Rabiul Islam

    2018-03-01

Full Text Available Background: Maintenance operations on-board ships are highly demanding. They are intensive activities requiring high man-machine interaction in challenging and evolving conditions: weather, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error and, consequently, possibly causing injuries or fatalities to personnel. An effective human error probability model is required to better manage maintenance on-board ships, and would assist in developing and maintaining effective risk management protocols. Thus, the objective of this study is to develop a human error probability model considering the various internal and external factors affecting seafarers' performance. Methods: The human error probability model is developed using probability theory applied to a Bayesian network. The model is tested using data received through a questionnaire survey of more than 200 experienced seafarers, each with more than 5 years of experience. The model developed in this study is used to find the reliability of human performance on particular maintenance activities. Results: The developed methodology is tested on the maintenance of a marine engine's cooling water pump for the engine department and an anchor windlass for the deck department. In the considered case studies, human error probabilities are estimated in various scenarios, and the results are compared between the scenarios and the different seafarer categories; the results of the case studies for both departments are also compared. Conclusion: The developed model is effective in assessing human error probabilities. These probabilities would get dynamically updated as and when new information is available on changes in either internal (i.e., training, experience, and fatigue) or external (i.e., environmental and operational) conditions.
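The Bayesian-network approach this record describes can be sketched with a minimal two-parent network: the human error probability (HEP) for a maintenance task depends on an internal factor (fatigue) and an external one (harsh weather). The structure and every number below are illustrative assumptions, not the paper's model or data.

```python
# Minimal sketch of a two-parent Bayesian network for maintenance HEP.
# All probabilities are assumed values for illustration.

P_FATIGUE = 0.3          # P(operator fatigued), internal factor
P_HARSH   = 0.2          # P(harsh weather), external factor

# Conditional probability table: P(error | fatigue, harsh weather)
CPT = {
    (True,  True):  0.15,
    (True,  False): 0.06,
    (False, True):  0.04,
    (False, False): 0.01,
}

def p_error():
    """Marginalize over both parent nodes to get the unconditional HEP."""
    total = 0.0
    for f in (True, False):
        pf = P_FATIGUE if f else 1 - P_FATIGUE
        for h in (True, False):
            ph = P_HARSH if h else 1 - P_HARSH
            total += pf * ph * CPT[(f, h)]
    return total

def p_error_given(harsh):
    """HEP updated once the weather state is observed as evidence."""
    total = 0.0
    for f in (True, False):
        pf = P_FATIGUE if f else 1 - P_FATIGUE
        total += pf * CPT[(f, harsh)]
    return total

print(f"baseline HEP: {p_error():.4f}")
print(f"HEP given harsh weather: {p_error_given(True):.4f}")
```

This "dynamic updating" on evidence is the property the record highlights: observing a change in an internal or external condition revises the error probability without rebuilding the model.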

  10. Prediction of human errors by maladaptive changes in event-related brain networks

    NARCIS (Netherlands)

    Eichele, T.; Debener, S.; Calhoun, V.D.; Specht, K.; Engel, A.K.; Hugdahl, K.; Cramon, D.Y. von; Ullsperger, M.

    2008-01-01

Humans engaged in monotonous tasks are susceptible to occasional errors that may lead to serious consequences, but little is known about brain activity patterns preceding errors. Using functional MRI and applying independent component analysis followed by deconvolution of hemodynamic responses, we

  11. Demonstration Integrated Knowledge-Based System for Estimating Human Error Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, Jack L.

    1999-04-21

Human Reliability Analysis (HRA) currently comprises at least 40 different methods that are used to analyze, predict, and evaluate human performance in probabilistic terms. Systematic HRAs allow analysts to examine human-machine relationships, identify error-likely situations, and provide estimates of relative frequencies for human errors on critical tasks, highlighting the most beneficial areas for system improvements. Unfortunately, each HRA method has a different philosophical approach, thereby producing estimates of human error probabilities (HEPs) that are a better or worse match to the error-likely situation of interest. Poor selection of methodology, or the improper application of techniques, can produce invalid HEP estimates, and such erroneous estimation of potential human failure could have severe consequences in terms of the estimated occurrence of injury, death, and/or property damage.

  12. Selection of anchor values for human error probability estimation

    International Nuclear Information System (INIS)

    Buffardi, L.C.; Fleishman, E.A.; Allen, J.A.

    1989-01-01

    There is a need for more dependable information to assist in the prediction of human errors in nuclear power environments. The major objective of the current project is to establish guidelines for using error probabilities from other task settings to estimate errors in the nuclear environment. This involves: (1) identifying critical nuclear tasks, (2) discovering similar tasks in non-nuclear environments, (3) finding error data for non-nuclear tasks, and (4) establishing error-rate values for the nuclear tasks based on the non-nuclear data. A key feature is the application of a classification system to nuclear and non-nuclear tasks to evaluate their similarities and differences in order to provide a basis for generalizing human error estimates across tasks. During the first eight months of the project, several classification systems have been applied to a sample of nuclear tasks. They are discussed in terms of their potential for establishing task equivalence and transferability of human error rates across situations
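One simple way to picture the anchor-value idea in this record is a similarity-weighted transfer: the error rate for a nuclear task is estimated from observed rates on analogous non-nuclear tasks, weighted by how similar each task is under the classification system. The rates and similarity scores below are assumed values for illustration, not project data.

```python
# Speculative sketch of anchor-value transfer. All numbers are assumed.

ANALOGS = [
    # (observed error rate on a non-nuclear task, similarity score in [0, 1])
    (2e-3, 0.9),
    (5e-3, 0.6),
    (1e-2, 0.3),
]

def anchored_estimate(analogs):
    """Similarity-weighted average of the non-nuclear error rates."""
    num = sum(rate * sim for rate, sim in analogs)
    den = sum(sim for _, sim in analogs)
    return num / den

print(f"anchored HEP estimate: {anchored_estimate(ANALOGS):.2e}")
```

The weighting makes the closest analog dominate the estimate, which matches the record's premise that task equivalence must be established before error rates can be generalized across settings.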

  13. SCHEME (Soft Control Human error Evaluation MEthod) for advanced MCR HRA

    International Nuclear Information System (INIS)

    Jang, Inseok; Jung, Wondea; Seong, Poong Hyun

    2015-01-01

Many HRA methods have been developed in relation to NPP maintenance and operation: the Technique for Human Error Rate Prediction (THERP), Korean Human Reliability Analysis (K-HRA), the Human Error Assessment and Reduction Technique (HEART), A Technique for Human Event Analysis (ATHEANA), the Cognitive Reliability and Error Analysis Method (CREAM), and Simplified Plant Analysis Risk Human Reliability Assessment (SPAR-H). Most of these methods were developed considering the conventional type of Main Control Room (MCR). They are still used for HRA in advanced MCRs even though the operating environment of advanced MCRs in NPPs has changed considerably with the adoption of new human-system interfaces such as computer-based soft controls. Among the many features of advanced MCRs, soft controls are important because operating actions in NPP advanced MCRs are performed through them. Consequently, the conventional methods may not sufficiently consider the features of soft control execution human errors. To this end, a new framework of an HRA method for evaluating soft control execution human error is suggested, based on a soft control task analysis and literature reviews of widely accepted human error taxonomies. In this study, the framework of an HRA method for evaluating soft control execution human error in advanced MCRs is developed. First, the factors which an HRA method in advanced MCRs should encompass are derived based on the literature review and the soft control task analysis. Based on the derived factors, an execution HRA framework for advanced MCRs is developed, mainly focusing on the features of soft control. Moreover, since most current HRA databases deal with operation in conventional MCRs and are not explicitly designed to deal with digital HSIs, an HRA database is developed under lab-scale simulation

  14. Bringing organizational factors to the fore of human error management

    International Nuclear Information System (INIS)

    Embrey, D.

    1991-01-01

    Human performance problems account for more than half of all significant events at nuclear power plants, even when these did not necessarily lead to severe accidents. In dealing with the management of human error, both technical and organizational factors need to be taken into account. Most important, a long-term commitment from senior management is needed. (author)

  15. Error detection in spoken human-machine interaction

    NARCIS (Netherlands)

    Krahmer, E.J.; Swerts, M.G.J.; Theune, M.; Weegels, M.F.

    2001-01-01

    Given the state of the art of current language and speech technology, errors are unavoidable in present-day spoken dialogue systems. Therefore, one of the main concerns in dialogue design is how to decide whether or not the system has understood the user correctly. In human-human communication,

  16. Error detection in spoken human-machine interaction

    NARCIS (Netherlands)

    Krahmer, E.; Swerts, M.; Theune, Mariet; Weegels, M.

    Given the state of the art of current language and speech technology, errors are unavoidable in present-day spoken dialogue systems. Therefore, one of the main concerns in dialogue design is how to decide whether or not the system has understood the user correctly. In human-human communication,

  17. Calculating method on human error probabilities considering influence of management and organization

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui; Shen Zupei

    1996-01-01

This paper is concerned with how management and organizational influences can be factored into the quantification of human error probabilities in risk assessments, using a three-level Influence Diagram (ID), originally only a tool for the construction and representation of decision trees or event trees. An analytical model of human error causation has been set up with three influence levels, introducing a method for quantitative assessment of the ID which can be applied to quantifying probabilities of human errors in risk assessments, especially the quantification of complex event trees (systems) in engineering decision-making analysis. A numerical case study is provided to illustrate the approach.
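A three-level influence diagram of this kind can be sketched by chaining conditional tables: a management state shapes an organizational state, which in turn conditions the human error probability. The states and every probability below are assumptions made for illustration, not values from the paper.

```python
# Illustrative three-level quantification. All numbers are assumed.

P_MGMT = {"good": 0.7, "poor": 0.3}                  # level 3: management
P_ORG_GIVEN_MGMT = {                                 # level 2: organization
    "good": {"healthy": 0.9, "degraded": 0.1},
    "poor": {"healthy": 0.4, "degraded": 0.6},
}
HEP_GIVEN_ORG = {"healthy": 1e-3, "degraded": 5e-3}  # level 1: direct HEP

def overall_hep():
    """Sum over management and organizational states to get the HEP."""
    total = 0.0
    for m, pm in P_MGMT.items():
        for o, po in P_ORG_GIVEN_MGMT[m].items():
            total += pm * po * HEP_GIVEN_ORG[o]
    return total

print(f"management-adjusted HEP: {overall_hep():.2e}")
```

With these assumed numbers the adjusted HEP is 2e-3, double the healthy-organization value, showing how upper-level influences propagate down to the error probability used in an event tree.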

  18. Human factors interventions to reduce human errors and improve productivity in maintenance tasks

    International Nuclear Information System (INIS)

    Isoda, Hachiro; Yasutake, J.Y.

    1992-01-01

    This paper describes work in progress to develop interventions to reduce human errors and increase maintenance productivity in nuclear power plants. The effort is part of a two-phased Human Factors research program being conducted jointly by the Central Research Institute of Electric Power Industry (CRIEPI) in Japan and the Electric Power Research Institute (EPRI) in the United States. The overall objective of this joint research program is to identify critical maintenance tasks and to develop, implement and evaluate interventions which have high potential for reducing human errors or increasing maintenance productivity. As a result of the Phase 1 effort, ten critical maintenance tasks were identified. For these tasks, over 25 candidate interventions were identified for potential development. After careful analysis, seven interventions were selected for development during Phase 2. This paper describes the methodology used to analyze and identify the most critical tasks, the process of identifying and developing selected interventions and some of the initial results. (author)

  19. An Error Analysis of Structured Light Scanning of Biological Tissue

    DEFF Research Database (Denmark)

    Jensen, Sebastian Hoppe Nesgaard; Wilm, Jakob; Aanæs, Henrik

    2017-01-01

This paper presents an error analysis and correction model for four structured light methods applied to three common types of biological tissue: skin, fat, and muscle. Despite its many advantages, structured light is based on the assumption of direct reflection at the object surface only. This assumption is violated by most biological material, e.g. human skin, which exhibits subsurface scattering. In this study, we find that in general, structured light scans of biological tissue deviate significantly from the ground truth. We show that a large portion of this error can be predicted with a simple, statistical linear model based on the scan geometry. As such, scans can be corrected without introducing any specially designed pattern strategy or hardware. We can effectively reduce the error in a structured light scanner applied to biological tissue by as much as a factor of two or three.

  20. Normalization of Deviation: Quotation Error in Human Factors.

    Science.gov (United States)

    Lock, Jordan; Bearman, Chris

    2018-05-01

    Objective The objective of this paper is to examine quotation error in human factors. Background Science progresses through building on the work of previous research. This requires accurate quotation. Quotation error has a number of adverse consequences: loss of credibility, loss of confidence in the journal, and a flawed basis for academic debate and scientific progress. Quotation error has been observed in a number of domains, including marine biology and medicine, but there has been little or no previous study of this form of error in human factors, a domain that specializes in the causes and management of error. Methods A study was conducted examining the quotation accuracy of 187 extracts from 118 published articles that cited a control article (Vaughan's 1996 book: The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA). Results Of the extracts studied, 12.8% (n = 24) were classed as inaccurate and 87.2% (n = 163) as accurate. A second dimension, agreement, was also examined, with 96.3% (n = 180) agreeing with the control article and only 3.7% (n = 7) disagreeing. The categories of accuracy and agreement form a two-by-two matrix. Conclusion Rather than simply blaming individuals for quotation error, systemic factors should also be considered. Vaughan's theory, normalization of deviance, is one systemic theory that can account for quotation error. Application Quotation error is occurring in human factors and should receive more attention. According to Vaughan's theory, the normal everyday systems that promote scholarship may also allow mistakes, mishaps, and quotation error to occur.

  1. Guidelines for system modeling: pre-accident human errors, rev.0

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, W. D.; Lee, Y. H.; Hwang, M. J.; Yang, J. E.

    2004-01-01

    The evaluation results of the Human Reliability Analysis (HRA) of pre-accident human errors in the probabilistic safety assessment (PSA) for the Korea Standard Nuclear Power Plant (KSNP), performed against the ASME PRA standard, show that more than 50% of the 10 items to be improved are related to their identification and screening analysis. We therefore developed a guideline for modeling pre-accident human errors to help the system analyst resolve these items. The guideline consists of modeling criteria for pre-accident human errors (identification, qualitative screening, and common restoration errors) and detailed guidance for pre-accident human errors relating to the testing, maintenance, and calibration work of nuclear power plants (NPPs). The system analyst applies the guideline to the system he or she is responsible for, and the HRA analyst reviews the results of that application. We applied the guideline to the auxiliary feedwater system of the KSNP to demonstrate its usefulness; the application resolved more than 50% of the items to be improved for pre-accident human errors of that system. The guideline developed in this study can be used for other NPPs as well as the KSNP. It is expected that use of a detailed procedure for the quantification of pre-accident human errors, to be developed in the future, together with this guideline will greatly enhance PSA quality in the HRA of pre-accident human errors

  2. Guidelines for system modeling: pre-accident human errors, rev.0

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, W. D.; Lee, Y. H.; Hwang, M. J.; Yang, J. E

    2004-01-01

    The evaluation results of the Human Reliability Analysis (HRA) of pre-accident human errors in the probabilistic safety assessment (PSA) for the Korea Standard Nuclear Power Plant (KSNP), performed against the ASME PRA standard, show that more than 50% of the 10 items to be improved are related to their identification and screening analysis. We therefore developed a guideline for modeling pre-accident human errors to help the system analyst resolve these items. The guideline consists of modeling criteria for pre-accident human errors (identification, qualitative screening, and common restoration errors) and detailed guidance for pre-accident human errors relating to the testing, maintenance, and calibration work of nuclear power plants (NPPs). The system analyst applies the guideline to the system he or she is responsible for, and the HRA analyst reviews the results of that application. We applied the guideline to the auxiliary feedwater system of the KSNP to demonstrate its usefulness; the application resolved more than 50% of the items to be improved for pre-accident human errors of that system. The guideline developed in this study can be used for other NPPs as well as the KSNP. It is expected that use of a detailed procedure for the quantification of pre-accident human errors, to be developed in the future, together with this guideline will greatly enhance PSA quality in the HRA of pre-accident human errors.

  3. Some aspects of statistical modeling of human-error probability

    International Nuclear Information System (INIS)

    Prairie, R.R.

    1982-01-01

    Human reliability analyses (HRA) are often performed as part of risk assessment and reliability projects. Recent events in nuclear power have shown the potential importance of the human element. There are several ongoing efforts in the US and elsewhere whose purpose is to model human error so that the human contribution can be incorporated into an overall risk assessment associated with one or more aspects of nuclear power. The effort described here uses the HRA event tree to quantify and model the human contribution to risk. As an example, risk analyses are being prepared on several nuclear power plants as part of the Interim Reliability Assessment Program (IREP). In this process the risk analyst selects the elements of his fault tree to which human error could contribute and then solicits the human factors analyst to perform an HRA on each such element
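
    The HRA event-tree quantification this record describes can be illustrated with a minimal calculation. This is a hypothetical sketch, not the IREP procedure: each subtask contributes a basic human error probability (HEP) that may be caught by a recovery action, and the tree combines the unrecovered contributions, assuming independence between subtasks.

```python
def hra_event_tree(steps):
    """Probability of at least one unrecovered human error across
    sequential subtasks. Each step is a (hep, p_recovery) pair:
    hep is the basic human error probability and p_recovery is the
    chance the error is caught and corrected. Steps are assumed
    independent, as in a simple HRA event tree."""
    p_all_ok = 1.0
    for hep, p_recovery in steps:
        p_unrecovered = hep * (1.0 - p_recovery)
        p_all_ok *= 1.0 - p_unrecovered
    return 1.0 - p_all_ok

# Hypothetical calibration task: read procedure, set point, verify.
steps = [(0.003, 0.5), (0.01, 0.8), (0.003, 0.0)]
print(hra_event_tree(steps))  # total human-error contribution of the task
```

    The per-step HEPs and recovery probabilities above are illustrative placeholders; in practice they would come from a handbook such as THERP tables.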

  4. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Seong, Poong Hyun; Kang, Hyun Gook [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Na, Man Gyun [Chosun Univ., Gwangju (Korea, Republic of); Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of); Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Jung, Yoensub [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of)

    2013-04-15

    This paper gives an overview of methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). The significance of reducing human error during an accident in nuclear power plants (NPPs) needs no emphasis: unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower the public acceptance of nuclear power. We have to recognize that human errors are always possible, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve this situation through advanced information and communication technologies, on the basis of lessons learned from our experience. As important lessons, the authors explain key issues associated with automation, the man-machine interface, operator support systems, and procedures. Based on this investigation, we outline the concept and technical factors needed to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. With regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to guide the future direction of related research and ultimately enhance the safety of NPPs.

  5. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    International Nuclear Information System (INIS)

    Seong, Poong Hyun; Kang, Hyun Gook; Na, Man Gyun; Kim, Jong Hyun; Heo, Gyunyoung; Jung, Yoensub

    2013-01-01

    This paper gives an overview of methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). The significance of reducing human error during an accident in nuclear power plants (NPPs) needs no emphasis: unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower the public acceptance of nuclear power. We have to recognize that human errors are always possible, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve this situation through advanced information and communication technologies, on the basis of lessons learned from our experience. As important lessons, the authors explain key issues associated with automation, the man-machine interface, operator support systems, and procedures. Based on this investigation, we outline the concept and technical factors needed to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. With regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to guide the future direction of related research and ultimately enhance the safety of NPPs

  6. ADVANCED MMIS TOWARD SUBSTANTIAL REDUCTION IN HUMAN ERRORS IN NPPS

    Directory of Open Access Journals (Sweden)

    POONG HYUN SEONG

    2013-04-01

    Full Text Available This paper gives an overview of methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). The significance of reducing human error during an accident in nuclear power plants (NPPs) needs no emphasis: unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower the public acceptance of nuclear power. We have to recognize that human errors are always possible, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve this situation through advanced information and communication technologies, on the basis of lessons learned from our experience. As important lessons, the authors explain key issues associated with automation, the man-machine interface, operator support systems, and procedures. Based on this investigation, we outline the concept and technical factors needed to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. With regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to guide the future direction of related research and ultimately enhance the safety of NPPs.

  7. Human error mode identification for NPP main control room operations using soft controls

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jaewhan; Jang, Seung-Cheol

    2011-01-01

    The operation environment of main control rooms (MCRs) in modern nuclear power plants (NPPs) has considerably changed over the years. Advanced MCRs, which have been designed by adapting digital and computer technologies, have simpler interfaces using large display panels, computerized displays, soft controls, computerized procedure systems, and so on. The actions for the NPP operations are performed using soft controls in advanced MCRs. Soft controls have different features from conventional controls. Operators need to navigate the screens to find indicators and controls and manipulate controls using a mouse, touch screens, and so on. Due to these different interfaces, different human errors should be considered in the human reliability analysis (HRA) for advanced MCRs. In this work, human errors that could occur during operation executions using soft controls were analyzed. This work classified the human errors in soft controls into six types, and the reasons that affect the occurrence of the human errors were also analyzed. (author)

  8. Findings from analysing and quantifying human error using current methods

    International Nuclear Information System (INIS)

    Dang, V.N.; Reer, B.

    1999-01-01

    In human reliability analysis (HRA), the scarcity of data means that, at best, judgement must be applied to transfer to the domain of the analysis what data are available for similar tasks. In particular, for the quantification of tasks involving decisions, the analyst has to choose among quantification approaches that all depend to a significant degree on expert judgement. The use of expert judgement can be made more reliable by eliciting relative rather than absolute judgements. These approaches, which are based on multiple-criteria decision theory, focus on ranking the tasks to be analysed by difficulty. While they remedy, at least partially, the poor performance of experts in estimating probabilities, they nevertheless require calibration of the relative scale on which the actions are ranked in order to obtain the probabilities of interest. This paper presents some results from a comparison of current HRA methods performed within the framework of a study of SLIM calibration options. The HRA quantification methods THERP, HEART, and INTENT were applied to derive calibration human error probabilities for two groups of operator actions. (author)

  9. Human errors in test and maintenance of nuclear power plants. Nordic project work

    International Nuclear Information System (INIS)

    Andersson, H.; Liwaang, B.

    1985-08-01

    The present report is a summary of the NKA/LIT-1 project performed during the period 1981-1985. The report summarizes work on the influence of human error in test and calibration activities in nuclear power plants, reviews problems regarding the optimization of test intervals and the organization of test and maintenance activities, and analyses the human error contribution to the overall risk in test and maintenance tasks. (author)

  10. Error Analysis of Determining Airplane Location by Global Positioning System

    OpenAIRE

    Hajiyev, Chingiz; Burat, Alper

    1999-01-01

    This paper studies the error analysis of determining airplane location by the global positioning system (GPS) using a statistical testing method. The Newton-Raphson method positions the airplane at the intersection point of four spheres. Absolute errors, relative errors and standard deviations have been calculated. The results show that the positioning error of the airplane varies with the coordinates of the GPS satellites and the airplane.
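
    The positioning step this record describes — Newton-Raphson iteration on the intersection of four range spheres — can be sketched as follows. This is an illustrative reconstruction with synthetic satellite coordinates, not the authors' code: the receiver position and clock bias are found by repeatedly linearizing the four pseudorange equations and solving the resulting 4x4 linear system.

```python
import math

def solve4(A, b):
    """Gaussian elimination with partial pivoting for a 4x4 system."""
    n = 4
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        pivot = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[pivot] = M[pivot], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def newton_gps(sats, pseudoranges, guess=(0.0, 0.0, 0.0, 0.0), iters=20):
    """Solve ||p - s_i|| + b = r_i for position p = (x, y, z) and clock
    bias b by Newton-Raphson: linearize, solve J*dx = -F, update."""
    x = list(guess)
    for _ in range(iters):
        F, J = [], []
        for (sx, sy, sz), r in zip(sats, pseudoranges):
            d = math.sqrt((x[0] - sx) ** 2 + (x[1] - sy) ** 2 + (x[2] - sz) ** 2)
            F.append(d + x[3] - r)
            J.append([(x[0] - sx) / d, (x[1] - sy) / d, (x[2] - sz) / d, 1.0])
        dx = solve4(J, [-f for f in F])
        x = [xi + di for xi, di in zip(x, dx)]
    return x

# Synthetic check: four satellites, a known position and clock bias.
sats = [(10.0, 0.0, 0.0), (0.0, 10.0, 0.0), (0.0, 0.0, 10.0), (10.0, 10.0, 10.0)]
true_p, true_b = (1.0, 2.0, 3.0), 0.5
ranges = [math.dist(true_p, s) + true_b for s in sats]
print(newton_gps(sats, ranges))  # recovers position and bias
```

    Real GPS solutions work in Earth-centered coordinates with the clock bias scaled by the speed of light; the toy coordinates here only show the structure of the iteration.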

  11. The Countermeasures against the Human Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Kwon, Ki Chun; Lee, Jung Woon; Lee, Hyun; Jang, Tong Il

    2009-10-15

    Human error is a major contributor to failures of the nuclear power facilities that are essential for accident prevention, so long-term, comprehensive countermeasures based on research in ergonomics and human factors are urgently required. Continued attention to the hardware of nuclear facilities has brought steady and definite improvement; attention must now turn to the human factors of the people who work in those facilities, since ensuring their safe performance also secures the economic and industrial benefits of nuclear power. The purpose of this research is to establish comprehensive medium- and long-term preventive measures, from a human factors engineering perspective, that minimize the possibility of human error in nuclear power plants and other nuclear facilities and thereby ensure their safety.

  12. The Countermeasures against the Human Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kwon, Ki Chun; Lee, Jung Woon; Lee, Hyun; Jang, Tong Il

    2009-10-01

    Human error is a major contributor to failures of the nuclear power facilities that are essential for accident prevention, so long-term, comprehensive countermeasures based on research in ergonomics and human factors are urgently required. Continued attention to the hardware of nuclear facilities has brought steady and definite improvement; attention must now turn to the human factors of the people who work in those facilities, since ensuring their safe performance also secures the economic and industrial benefits of nuclear power. The purpose of this research is to establish comprehensive medium- and long-term preventive measures, from a human factors engineering perspective, that minimize the possibility of human error in nuclear power plants and other nuclear facilities and thereby ensure their safety

  13. Quality assurance and human error effects on the structural safety

    International Nuclear Information System (INIS)

    Bertero, R.; Lopez, R.; Sarrate, M.

    1991-01-01

    Statistical surveys show that the frequency of structural failures is much larger than that expected by the codes. Evidence exists that human error (especially during the design process) is the main cause of the difference between the failure probability admitted by codes and reality. In this paper, the attenuation of human error effects using the tools of quality assurance is analyzed. In particular, the importance of the independent design review is highlighted, and different approaches are discussed. The experience from the Atucha II project, as well as US and German practice on independent design review, is summarized. (Author)

  14. A critique of recent models for human error rate assessment

    International Nuclear Information System (INIS)

    Apostolakis, G.E.

    1988-01-01

    This paper critically reviews two groups of models for assessing human error rates under accident conditions. The first group, which includes the US Nuclear Regulatory Commission (NRC) handbook model and the human cognitive reliability (HCR) model, considers as fundamental the time that is available to the operators to act. The second group, which is represented by the success likelihood index methodology multiattribute utility decomposition (SLIM-MAUD) model, relies on ratings of the human actions with respect to certain qualitative factors and the subsequent derivation of error rates. These models are evaluated with respect to two criteria: the treatment of uncertainties and the internal coherence of the models. In other words, this evaluation focuses primarily on normative aspects of these models. The principal findings are as follows: (1) Both of the time-related models provide human error rates as a function of the available time for action and the prevailing conditions. However, the HCR model ignores the important issue of state-of-knowledge uncertainties, dealing exclusively with stochastic uncertainty, whereas the model presented in the NRC handbook handles both types of uncertainty. (2) SLIM-MAUD provides a highly structured approach for the derivation of human error rates under given conditions. However, the treatment of the weights and ratings in this model is internally inconsistent. (author)

  15. Analysis of errors in forensic science

    Directory of Open Access Journals (Sweden)

    Mingxiao Du

    2017-01-01

    Full Text Available Reliability of expert testimony is one of the foundations of judicial justice. Both expert bias and scientific errors affect the reliability of expert opinion, which in turn affects the trustworthiness of the findings of fact in legal proceedings. Expert bias can be eliminated by replacing experts; however, it may be more difficult to eliminate scientific errors. From the perspective of statistics, errors in the operation of forensic science include systematic errors, random errors, and gross errors. In general, process repetition and abiding by the standard ISO/IEC 17025:2005, general requirements for the competence of testing and calibration laboratories, are common measures used to reduce errors that originate from experts and equipment, respectively. For example, to reduce gross errors, the laboratory can ensure that a test is repeated several times by different experts. In applying forensic principles and methods, the Federal Rules of Evidence 702 mandate that judges consider factors such as peer review to ensure the reliability of the expert testimony. As the scientific principles and methods may not undergo professional review by specialists in a certain field, peer review serves as an exclusive standard. This study also examines two types of statistical errors. As false-positive errors involve a higher possibility of unfair decision-making, they should receive more attention than false-negative errors.

  16. HUMAN ERROR QUANTIFICATION USING PERFORMANCE SHAPING FACTORS IN THE SPAR-H METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Harold S. Blackman; David I. Gertman; Ronald L. Boring

    2008-09-01

    This paper describes a cognitively based human reliability analysis (HRA) quantification technique for estimating the human error probabilities (HEPs) associated with operator and crew actions at nuclear power plants. The method described here, Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method, was developed to aid in characterizing and quantifying human performance at nuclear power plants. The intent was to develop a defensible method that would consider all factors that may influence performance. In the SPAR-H approach, calculation of HEP rates is especially straightforward, starting with pre-defined nominal error rates for cognitive vs. action-oriented tasks, and incorporating performance shaping factor multipliers upon those nominal error rates.
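
    The calculation style this record describes — a nominal error rate scaled by performance shaping factor (PSF) multipliers — can be sketched as follows. The nominal rates and the adjustment formula are the published SPAR-H values; the example multipliers are illustrative and not tied to any specific plant analysis.

```python
def spar_h_hep(nominal_hep, psf_multipliers):
    """SPAR-H-style HEP: the nominal error rate times the composite of
    the PSF multipliers. When three or more negative PSFs (multiplier
    > 1) apply, SPAR-H uses an adjustment formula that keeps the
    resulting probability below 1.0."""
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    negative = sum(1 for m in psf_multipliers if m > 1.0)
    if negative >= 3:
        return (nominal_hep * composite) / (nominal_hep * (composite - 1.0) + 1.0)
    return min(nominal_hep * composite, 1.0)

# SPAR-H nominal rates: 1e-2 for diagnosis tasks, 1e-3 for action tasks.
print(spar_h_hep(0.01, [2.0, 2.0]))          # two moderately negative PSFs
print(spar_h_hep(0.01, [10.0, 10.0, 10.0]))  # three strongly negative PSFs
```

    Note how the adjustment matters: multiplying 0.01 by a composite PSF of 1000 naively gives 10, while the adjusted value stays a valid probability just below 1.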

  17. Assessing human error during collecting a hydrocarbon sample of ...

    African Journals Online (AJOL)

    This paper reports an assessment of the hydrocarbon sample collection standard operating procedure (SOP) using THERP. Using the Performance Shaping Factors (PSFs) from THERP, the human errors made while collecting a hydrocarbon sample at a petrochemical refinery plant were analyzed and assessed. Twenty-two ...

  18. Impact of human error on lumber yield in rough mills

    Science.gov (United States)

    Urs Buehlmann; R. Edward Thomas; R. Edward Thomas

    2002-01-01

    Rough sawn, kiln-dried lumber contains characteristics such as knots and bark pockets that are considered by most people to be defects. When using boards to produce furniture components, these defects are removed to produce clear, defect-free parts. Currently, human operators identify and locate the unusable board areas containing defects. Errors in determining a...

  19. Process error rates in general research applications to the Human ...

    African Journals Online (AJOL)

    Objective. To examine process error rates in applications for ethics clearance of health research. Methods. Minutes of 586 general research applications made to a human health research ethics committee (HREC) from April 2008 to March 2009 were examined. Rates of approval were calculated and reasons for requiring ...

  20. Preventing marine accidents caused by technology-induced human error

    OpenAIRE

    Bielić, Toni; Hasanspahić, Nermin; Čulin, Jelena

    2017-01-01

    The objective of embedding technology on board ships, to improve safety, is not fully accomplished. The paper studies marine accidents caused by human error resulting from improper human-technology interaction. The aim of the paper is to propose measures to prevent reoccurrence of such accidents. This study analyses the marine accident reports issued by Marine Accidents Investigation Branch covering the period from 2012 to 2014. The factors that caused these accidents are examined and categor...

  1. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    Human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of a Probabilistic Safety Assessment (PSA). Several methods for analyzing human error are in use, such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H), and new methods for human reliability analysis (HRA) are under development at this time. This paper presents a dynamic HRA method for assessing human failure events and performs an estimation of the human error probability for the filtered containment venting system (FCVS). The action associated with implementation of containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze the FCVS-related operator action. The distributions of the required time and the available time were developed with the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for estimating human error probabilities, and it can be applied to any operator action, including the severe accident management strategy.
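
    The time-based failure criterion this record uses — the operator fails when the required time exceeds the available time — can be sketched with a simple Monte Carlo estimate. The distributions below are illustrative assumptions; the paper derived them from MAAP runs and Latin hypercube sampling rather than the simple sampling shown here.

```python
import random

def p_nonresponse(required_sampler, available_sampler, n=100_000, seed=1):
    """Estimate P(T_required > T_available) by Monte Carlo sampling.
    Both arguments are callables that draw one sample from their
    respective time distribution given a random.Random instance."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if required_sampler(rng) > available_sampler(rng)
    )
    return failures / n

# Illustrative choices: lognormal required time with median ~20 min,
# normally distributed available time around 30 min.
required = lambda rng: rng.lognormvariate(3.0, 0.4)   # exp(3) is about 20.1
available = lambda rng: rng.gauss(30.0, 3.0)
print(p_nonresponse(required, available))
```

    Shrinking the available-time distribution (e.g. a faster accident progression) raises the estimated non-response probability, which is the qualitative behavior a time-reliability HRA model is meant to capture.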

  2. [Analysis of intrusion errors in free recall].

    Science.gov (United States)

    Diesfeldt, H F A

    2017-06-01

    Extra-list intrusion errors during five trials of the eight-word list-learning task of the Amsterdam Dementia Screening Test (ADST) were investigated in 823 consecutive psychogeriatric patients (87.1% suffering from major neurocognitive disorder). Almost half of the participants (45.9%) produced one or more intrusion errors on the verbal recall test. Correct responses were lower when subjects made intrusion errors, but learning slopes did not differ between subjects who committed intrusion errors and those who did not. Bivariate regression analyses revealed that participants who committed intrusion errors were more deficient on measures of eight-word recognition memory, delayed visual recognition and tests of executive control (the Behavioral Dyscontrol Scale and the ADST-Graphical Sequences as measures of response inhibition). Using hierarchical multiple regression, only free recall and delayed visual recognition retained an independent effect in the association with intrusion errors, such that deficient scores on tests of episodic memory were sufficient to explain the occurrence of intrusion errors. Measures of inhibitory control did not add significantly to the explanation of intrusion errors in free recall, which makes insufficient strength of memory traces rather than a primary deficit in inhibition the preferred account for intrusion errors in free recall.

  3. Fault tree model of human error based on error-forcing contexts

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Jang, Seung Cheol; Ha, Jae Joo

    2004-01-01

    In safety-critical systems such as nuclear power plants, safety-feature actuation is fully automated. In an emergency, the human operator can also play the role of a backup for the automated systems. That is, failure of safety-feature-actuation signal generation implies the concurrent failure of the automated systems and of manual actuation. The human operator's manual actuation failure is largely affected by error-forcing contexts (EFC), of which failures of sensors and automated systems are the most important. The sensors, the automated actuation system and the human operators are correlated in a complex manner, which makes it hard to develop a proper model. In this paper, we explain the condition-based human reliability assessment (CBHRA) method, which treats these complicated conditions in a practical way. In this study, we apply the CBHRA method to the manual actuation of safety features such as reactor trip and safety injection in Korean Standard Nuclear Power Plants

  4. Teacher knowledge of error analysis in differential calculus

    Directory of Open Access Journals (Sweden)

    Eunice K. Moru

    2014-12-01

    Full Text Available The study investigated teacher knowledge of error analysis in differential calculus. Two teachers were the sample of the study: one a subject specialist and the other a mathematics education specialist. Questionnaires and interviews were used for data collection. The findings of the study reflect that the teachers’ knowledge of error analysis was characterised by the following assertions, which are backed up with some evidence: (1) teachers identified the errors correctly, (2) the generalised error identification resulted in opaque analysis, (3) some of the identified errors were not interpreted from multiple perspectives, (4) teachers’ evaluation of errors was either local or global and (5) in remedying errors accuracy and efficiency were emphasised more than conceptual understanding. The implications of the findings of the study for teaching include engaging in error analysis continuously as this is one way of improving knowledge for teaching.

  5. THERP and HEART integrated methodology for human error assessment

    Science.gov (United States)

    Castiglia, Francesco; Giardina, Mariarosa; Tomarchio, Elio

    2015-11-01

    THERP and HEART integrated methodology is proposed to investigate accident scenarios that involve operator errors during high-dose-rate (HDR) treatments. The new approach has been modified on the basis of fuzzy set concept with the aim of prioritizing an exhaustive list of erroneous tasks that can lead to patient radiological overexposures. The results allow for the identification of human errors that are necessary to achieve a better understanding of health hazards in the radiotherapy treatment process, so that it can be properly monitored and appropriately managed.

  6. When soft controls get slippery: User interfaces and human error

    International Nuclear Information System (INIS)

    Stubler, W.F.; O'Hara, J.M.

    1998-01-01

    Many types of products and systems that have traditionally featured physical control devices are now being designed with soft controls--input formats appearing on computer-based display devices and operated by a variety of input devices. A review of complex human-machine systems found that soft controls are particularly prone to some types of errors and may affect overall system performance and safety. This paper discusses the application of design approaches for reducing the likelihood of these errors and for enhancing usability, user satisfaction, and system performance and safety

  7. Human error: An essential problem of nuclear power plants

    International Nuclear Information System (INIS)

    Smidt, D.

    1981-01-01

    The author first defines the part played by man in the nuclear power plant and then deals in more detail with the structure of his false behaviour in tactical and strategic respects. The discussion of tactical errors and their avoidance is followed by a report on the actual state of plant technology and possible improvements. Subsequently, strategic errors are studied, including the conclusions to be drawn so far (the joint between plant and man, personnel selection and education). If the joints between man and machine are designed accordingly, and the physiological strengths and weaknesses of man are fully realized and taken into account, human error need not be an essential problem in nuclear power plants. (GL) [de

  8. Effects of digital human-machine interface characteristics on human error in nuclear power plants

    International Nuclear Information System (INIS)

    Li Pengcheng; Zhang Li; Dai Licao; Huang Weigang

    2011-01-01

    In order to identify the effects of digital human-machine interface characteristics on human error in nuclear power plants, the new characteristics of digital human-machine interfaces are identified by comparison with traditional analog control systems in the aspects of information display, user-interface interaction and management, control systems, alarm systems and the procedure system. The negative effects of these characteristics on human error, such as increased cognitive load and workload, mode confusion, and loss of situation awareness, are identified through field research and interviews with operators. For the adverse effects noted above, corresponding prevention and control measures are provided to support the prevention and minimization of human errors and the optimization of human-machine interface design. (authors)

  9. Fixturing error measurement and analysis using CMMs

    International Nuclear Information System (INIS)

    Wang, Y; Chen, X; Gindy, N

    2005-01-01

    The influence of the fixture on the errors of a machined surface can be very significant. The machined surface errors generated during machining can be measured using a coordinate measurement machine (CMM) through the displacements of three coordinate systems on a fixture-workpiece pair in relation to the deviation of the machined surface. The surface errors consist of component movement, component twist, and the deviation between the actual machined surface and the defined tool path. A turbine blade fixture for a grinding operation is used as a case study.

  10. Self-assessment of human performance errors in nuclear operations

    International Nuclear Information System (INIS)

    Chambliss, K.V.

    1996-01-01

    One of the most important approaches to improving nuclear safety is to have an effective self-assessment process in place, whose cornerstone is the identification and correction of human performance errors. Experience has shown that significant events usually have had precursors of human performance errors. If these precursors are left uncorrected or not understood, the symptoms recur and result in unanticipated events of greater safety significance. The Institute of Nuclear Power Operations (INPO) has been championing the cause of promoting excellence in human performance in the nuclear industry. INPO's report, "Excellence in Human Performance," emphasizes the importance of several factors that play a role in human performance. They include individual, supervisory, and organizational behaviors; real-time feedback that results in specific behavior to produce safe and reliable performance; and proactive measures that remove obstacles to excellent human performance. Zack Pate, chief executive officer and president of INPO, in his report "The Control Room," provides an excellent discussion of serious events in the nuclear industry since 1994 and compares them with the results from a recent study by the National Transportation Safety Board of airline accidents in the 12-yr period from 1978 to 1990 to draw some common themes that relate to human performance issues in the control room.

  11. Representation of human behaviour in probabilistic safety analysis

    International Nuclear Information System (INIS)

    Whittingham, R.B.

    1991-01-01

    This paper provides an overview of the representation of human behaviour in probabilistic safety assessment. Human performance problems which may result in errors leading to accidents are considered in terms of methods of identification using task analysis, screening analysis of critical errors, representation and quantification of human errors in fault trees and event trees and error reduction measures. (author) figs., tabs., 43 refs

  12. An empirical study on the human error recovery failure probability when using soft controls in NPP advanced MCRs

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Jung, Wondea; Seong, Poong Hyun

    2014-01-01

    Highlights: • Many researchers have tried to understand the human recovery process or its steps. • Modeling the human recovery process is not sufficient for application to HRA. • The operating environment of MCRs in NPPs has changed with the adoption of new HSIs. • Recovery failure probability in a soft control operation environment is investigated. • The recovery failure probabilities reported here would be important evidence for expert judgment. - Abstract: It is well known that probabilistic safety assessments (PSAs) today consider not just hardware failures and environmental events that can impact upon risk, but also human error contributions. Consequently, the focus of reliability and performance management has been on the prevention of human errors and failures rather than on the recovery of human errors. However, the recovery of human errors is as important as their prevention for the safe operation of nuclear power plants (NPPs). For this reason, many researchers have tried to characterize the human recovery process. However, modeling the human recovery process is not sufficient for application to human reliability analysis (HRA), which requires human error and recovery probabilities. In this study, therefore, human error recovery failure probabilities based on predefined human error modes were investigated by conducting experiments in an operation mockup of advanced/digital main control rooms (MCRs) in NPPs. To this end, 48 subjects majoring in nuclear engineering participated in the experiments. Using an accident scenario developed from tasks in the standard post-trip actions (SPTA) and the steam generator tube rupture (SGTR) procedure, together with predominant soft control tasks derived from the loss of coolant accident (LOCA) and the excess steam demand event (ESDE), all error detection and recovery data based on human error modes were checked against the performance sheet, and a statistical analysis of error recovery/detection was then performed.
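
Experiments of this kind ultimately reduce to counting recovery opportunities and failures. A minimal sketch of the resulting point estimate, with a normal-approximation confidence interval (the counts below are invented for illustration, not the study's data):

```python
import math

# Hedged sketch: recovery failure probability = unrecovered errors / total
# errors, with a Wald (normal-approximation) confidence interval.
# The counts are illustrative placeholders.

def recovery_failure_probability(unrecovered, total, z=1.96):
    p = unrecovered / total
    half_width = z * math.sqrt(p * (1 - p) / total)
    # Clip the interval to the valid probability range [0, 1].
    return p, (max(0.0, p - half_width), min(1.0, p + half_width))

p, ci = recovery_failure_probability(12, 80)
print(round(p, 3), [round(x, 3) for x in ci])
```

With small error counts, as such experiments typically produce, the interval is wide; this is one reason the abstract frames the probabilities as evidence for expert judgment rather than definitive values.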

  13. The using of the control room automation against human errors

    International Nuclear Information System (INIS)

    Kautto, A.

    1993-01-01

    Control room automation developed very strongly during the 1980s at IVO (Imatran Voima Oy). This work expanded greatly with the building of the full-scope training simulator for the Loviisa plant. Important milestones have been, for example, the testing of the Critical Function Monitoring System, a concept developed by Combustion Engineering Inc., in the Loviisa training simulator in 1982; the replacement of the process and simulator computers in Loviisa in 1989 and 1990; and the introduction of computer-based procedures in operator training in 1993. With the development of automation and procedures it is possible to minimize the probability of human error. However, it is not possible to totally eliminate the risks caused by human errors. (orig.)

  14. Detailed semantic analyses of human error incidents occurring at nuclear power plant in USA (interim report). Characteristics of human error incidents occurring in the period from 1992 to 1996

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Tsuge, Tadashi; Sano, Toshiaki; Takano, Kenichi; Gouda, Hidenori

    2001-01-01

    CRIEPI has been conducting detailed analyses of all human error incidents at domestic nuclear power plants (NPPs) collected from Japanese Licensee Event Reports (LERs), using J-HPES (the Japanese version of HPES) as the analysis method. Results obtained by the analyses have been stored in the J-HPES database. Since 1999, human error incidents have also been selected from U.S. LERs and analyzed using J-HPES. In this report, the results, which classify error actions, causes, and preventive measures, are summarized for U.S. human error cases occurring in the period from 1992 to 1996. The classification suggested that the categories of error action were almost the same as those of Japanese human error cases. Therefore, problems in the process of error action and checkpoints for preventing errors can be extracted by analyzing both U.S. and domestic human error cases. It was also suggested that the interrelations between error actions, causes, and organizational factors could be identified. Taking these suggestions into consideration, we will continue to analyze U.S. human error cases. (author)

  15. Role of data and judgment in modeling human errors

    International Nuclear Information System (INIS)

    Carnino, A.

    1986-01-01

    Human beings are not a simple component, which is why predicting human behaviour quantitatively is so difficult. For human reliability analysis, the data sources that can be used are the following: operating experience and incident reports, data banks, data from the literature, data collected from simulators, and data established by expert judgement. The factors important for conducting a good human reliability analysis are then discussed, including the uncertainties to be associated with the estimates. (orig.)

  16. Quantification of human error and common-mode failures in man-machine systems

    International Nuclear Information System (INIS)

    Lisboa, J.J.

    1988-01-01

    Quantification of human performance, particularly the determination of human error, is essential for realistic assessment of the overall performance of man-machine systems. This paper presents an analysis of human errors in nuclear power plant systems when measured against common-mode failures (CMFs). The human errors evaluated are improper testing, inadequate maintenance strategy, and miscalibration. The methodology presented in the paper represents a positive contribution to power plant system availability by identifying sources of common-mode failure when operational functions are involved. It is also applicable to other complex systems such as chemical plants and the aircraft and motor industries; in fact, any large man-made man-machine system could be included.

  17. Asteroid orbital error analysis: Theory and application

    Science.gov (United States)

    Muinonen, K.; Bowell, Edward

    1992-01-01

    We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation gives the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
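
The linearized propagation step described above is a standard covariance transform: a quantity derived from the orbital elements inherits the covariance J C Jᵀ, where J is the Jacobian of the mapping. A toy sketch (the matrices are illustrative, not from the paper):

```python
import numpy as np

# Hedged sketch of linearized error propagation: if the orbital elements
# have covariance C, a derived quantity y = f(elements) (e.g. a sky-plane
# position) has covariance J C J^T for Jacobian J = df/d(elements).
# The numbers below are toy values for illustration only.

def propagate_covariance(C, J):
    return J @ C @ J.T

C = np.diag([1e-6, 4e-6])      # toy 2-element covariance at epoch
J = np.array([[1.0, 0.5],
              [0.0, 2.0]])     # toy Jacobian of the mapping
Cy = propagate_covariance(C, J)
print(Cy)
```

The eigenvectors and eigenvalues of the propagated covariance give the orientation and axes of the positional uncertainty ellipsoid the abstract refers to.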

  18. Hierarchical learning induces two simultaneous, but separable, prediction errors in human basal ganglia.

    Science.gov (United States)

    Diuk, Carlos; Tsai, Karin; Wallis, Jonathan; Botvinick, Matthew; Niv, Yael

    2013-03-27

    Studies suggest that dopaminergic neurons report a unitary, global reward prediction error signal. However, learning in complex real-life tasks, in particular tasks that show hierarchical structure, requires multiple prediction errors that may coincide in time. We used functional neuroimaging to measure prediction error signals in humans performing such a hierarchical task involving simultaneous, uncorrelated prediction errors. Analysis of signals in a priori anatomical regions of interest in the ventral striatum and the ventral tegmental area indeed evidenced two simultaneous, but separable, prediction error signals corresponding to the two levels of hierarchy in the task. This result suggests that suitably designed tasks may reveal a more intricate pattern of firing in dopaminergic neurons. Moreover, the need for downstream separation of these signals implies possible limitations on the number of different task levels that we can learn about simultaneously.

  19. Analysis of errors of radiation relay, (1)

    International Nuclear Information System (INIS)

    Koyanagi, Takami; Nakajima, Sinichi

    1976-01-01

    The statistical error of a liquid level controlled by a radiation relay is analysed and a method of minimizing the error is proposed. This method reduces to the problem of optimally setting the time constant of the radiation relay. The equations for obtaining the value of the time constant are presented, and the numerical results are shown in a table and plotted in a figure. The optimum time constant of the upper-level control relay is entirely different from that of the lower-level control relay. (auth.)

  20. Applying lessons learned to enhance human performance and reduce human error for ISS operations

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W.R.

    1998-09-01

    A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long-duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) has developed a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper describes previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS.

  1. Identification of failure sequences sensitive to human error

    International Nuclear Information System (INIS)

    1987-06-01

    This report, prepared by the participants of the technical committee meeting on ''Identification of Failure Sequences Sensitive to Human Error'', addresses the subjects discussed during the meeting and the conclusions reached by the committee. Chapter 1 reviews the INSAG recommendations and the main elements of the IAEA Programme in the area of the human element. Chapter 2 reviews the role of human actions in nuclear power plant safety from insights gained through operational experience. Chapter 3 is concerned with the relationship between probabilistic safety assessment and human performance associated with severe accident sequences. Chapter 4 addresses the role of simulators in training for accident conditions. Chapter 5 presents the conclusions and future trends. The seven papers presented by members of this technical committee are also included in this technical document. A separate abstract was prepared for each of these papers.

  2. Modelling the basic error tendencies of human operators

    International Nuclear Information System (INIS)

    Reason, J.

    1988-01-01

    The paper outlines the primary structural features of human cognition: a limited, serial workspace interacting with a parallel distributed knowledge base. It is argued that the essential computational features of human cognition - to be captured by an adequate operator model - reside in the mechanisms by which stored knowledge structures are selected and brought into play. Two such computational 'primitives' are identified: similarity-matching and frequency-gambling. These two retrieval heuristics, it is argued, shape both the overall character of human performance (i.e. its heavy reliance on pattern-matching) and its basic error tendencies ('strong-but-wrong' responses, confirmation, similarity and frequency biases, and cognitive 'lock-up'). The various features of human cognition are integrated with a dynamic operator model capable of being represented in software form. This computer model, when run repeatedly with a variety of problem configurations, should produce a distribution of behaviours which, in total, simulate the general character of operator performance. (author)

  4. An assessment of the risk significance of human errors in selected PSAs and operating events

    International Nuclear Information System (INIS)

    Palla, R.L. Jr.; El-Bassioni, A.

    1991-01-01

    Sensitivity studies based on Probabilistic Safety Assessments (PSAs) for a pressurized water reactor and a boiling water reactor are described. In each case human errors modeled in the PSAs were categorized according to such factors as error type, location, timing, and plant personnel involved. Sensitivity studies were then conducted by varying the error rates in each category and evaluating the corresponding change in total core damage frequency and accident sequence frequency. Insights obtained are discussed and reasons for differences in risk sensitivity between plants are explored. A separate investigation into the role of human error in risk-important operating events is also described. This investigation involved the analysis of data from the USNRC Accident Sequence Precursor program to determine the effect of operator-initiated events on accident precursor trends, and to determine whether improved training can be correlated to current trends. The findings of this study are also presented. 5 refs., 15 figs., 1 tab
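
The sensitivity procedure described above can be sketched with a toy rare-event cutset model: scale the HEPs of one error category and recompute a core-damage-frequency surrogate (the cutsets and probabilities below are invented for illustration, not taken from the PSAs studied):

```python
# Hedged sketch of an HEP-category sensitivity study on minimal cutsets,
# using the rare-event approximation (CDF ~ sum of cutset products).
# All events and values are illustrative placeholders.

cutsets = [
    {"pump_fails": 1e-3, "op_miscalibration": 5e-3},   # hardware + pre-accident error
    {"valve_fails": 2e-4, "op_omits_step": 1e-2},      # hardware + post-accident error
]
pre_accident_errors = {"op_miscalibration"}            # the category being varied

def cdf(cutsets, scale=1.0):
    total = 0.0
    for cs in cutsets:
        prod = 1.0
        for event, p in cs.items():
            # Scale only HEPs in the chosen category, capped at 1.
            prod *= min(1.0, p * scale) if event in pre_accident_errors else p
        total += prod
    return total

base = cdf(cutsets)
print(cdf(cutsets, scale=10) / base)  # relative change in CDF for a 10x HEP increase
```

Repeating this for each category (error type, location, timing, personnel) yields the per-category risk sensitivities the abstract describes.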

  5. Complications: acknowledging, managing, and coping with human error.

    Science.gov (United States)

    Helo, Sevann; Moulton, Carol-Anne E

    2017-08-01

    Errors are inherent in medicine due to the imperfectness of human nature. Health care providers may have a difficult time accepting their fallibility, acknowledging mistakes, and disclosing errors. Fear of litigation, shame, blame, and concern about reputation are just some of the barriers preventing physicians from being more candid with their patients, despite the supporting body of evidence that patients cite poor communication and lack of transparency as primary drivers to file a lawsuit in the wake of a medical complication. Proper error disclosure includes a timely explanation of what happened, who was involved, why the error occurred, and how it will be prevented in the future. Medical mistakes afford the opportunity for individuals and institutions to be candid about their weaknesses while improving patient care processes. When a physician takes the Hippocratic Oath they take on a tremendous sense of responsibility for the care of their patients, and often bear the burden of their mistakes in isolation. Physicians may struggle with guilt, shame, and a crisis of confidence, which may thwart efforts to identify areas for improvement that can lead to meaningful change. Coping strategies for providers include discussing the event with others, seeking professional counseling, and implementing quality improvement projects. Physicians and health care organizations need to find adaptive ways to deal with complications that will benefit patients, providers, and their institutions.

  6. The contributions of human factors on human error in Malaysia aviation maintenance industries

    Science.gov (United States)

    Padil, H.; Said, M. N.; Azizan, A.

    2018-05-01

    Aviation maintenance is a multitasking activity in which individuals perform varied tasks under constant pressure to meet deadlines, as well as under challenging work conditions. These situational characteristics, combined with human factors, can lead to various types of human-related errors. The primary objective of this research is to develop a structural relationship model that incorporates human factors and organizational factors and their impact on human errors in aviation maintenance. Towards that end, a questionnaire was developed and administered to Malaysian aviation maintenance professionals. A Structural Equation Modelling (SEM) approach was used in this study, utilizing AMOS software. Results showed a significant relationship between human factors and human errors, which was tested in the model. Human factors had a partial effect on organizational factors, while organizational factors had a direct and positive impact on human errors. It was also revealed that organizational factors contributed to human errors when coupled with the human factors construct. This study has contributed to the advancement of knowledge on human factors affecting safety, has provided guidelines for improving human factors performance relating to aviation maintenance activities, and could be used as a reference for improving safety performance in Malaysian aviation maintenance companies.

  7. Thresholds for human detection of patient setup errors in digitally reconstructed portal images of prostate fields

    International Nuclear Information System (INIS)

    Phillips, Brooke L.; Jiroutek, Michael R.; Tracton, Gregg; Elfervig, Michelle; Muller, Keith E.; Chaney, Edward L.

    2002-01-01

    Purpose: Computer-assisted methods to analyze electronic portal images for the presence of treatment setup errors should be studied in controlled experiments before use in the clinical setting. Validation experiments using images that contain known errors usually report the smallest errors that can be detected by the image analysis algorithm. This paper offers human error-detection thresholds as one benchmark for evaluating the smallest errors detected by algorithms. Unfortunately, reliable data describing human performance are lacking. The most rigorous benchmarks for human performance are obtained under conditions that favor error detection. To establish such benchmarks, controlled observer studies were carried out to determine the thresholds of detectability for in-plane and out-of-plane translation and rotation setup errors introduced into digitally reconstructed portal radiographs (DRPRs) of prostate fields. Methods and Materials: Seventeen observers comprising radiation oncologists, radiation oncology residents, physicists, and therapy students participated in a two-alternative forced choice experiment involving 378 DRPRs computed using the National Library of Medicine Visible Human data sets. An observer viewed three images at a time displayed on adjacent computer monitors. Each image triplet included a reference digitally reconstructed radiograph displayed on the central monitor and two DRPRs displayed on the flanking monitors. One DRPR was error free. The other DRPR contained a known in-plane or out-of-plane error in the placement of the treatment field over a target region in the pelvis. The range for each type of error was determined from pilot observer studies based on a Probit model for error detection. The smallest errors approached the limit of human visual capability. The observer was told what kind of error was introduced, and was asked to choose the DRPR that contained the error. Observer decisions were recorded and analyzed using repeated-measures methods.

  8. Dose error analysis for a scanned proton beam delivery system

    International Nuclear Information System (INIS)

    Coutrakon, G; Wang, N; Miller, D W; Yang, Y

    2010-01-01

    All particle beam scanning systems are subject to dose delivery errors due to errors in position, energy and intensity of the delivered beam. In addition, finite scan speeds, beam spill non-uniformities, and delays in detector, detector electronics and magnet responses all contribute errors in delivery. In this paper, we present dose errors for an 8 × 10 × 8 cm³ target of uniform water-equivalent density with an 8 cm spread-out Bragg peak and a prescribed dose of 2 Gy. Lower doses are also analyzed and presented later in the paper. Beam energy errors and errors due to limitations of the scanning system hardware have been included in the analysis. By using Gaussian-shaped pencil beams derived from measurements in the research room of the James M Slater Proton Treatment and Research Center at Loma Linda, CA and executing treatment simulations multiple times, statistical dose errors have been calculated in each 2.5 mm cubic voxel in the target. These errors were calculated by delivering multiple treatments to the same volume and calculating the rms variation in delivered dose at each voxel in the target. The variations in dose were the result of random beam delivery errors such as proton energy, spot position and intensity fluctuations. The results show that with reasonable assumptions of random beam delivery errors, the spot scanning technique yielded an rms dose error in each voxel of less than 2% or 3% of the 2 Gy prescribed dose. These calculated errors are within acceptable clinical limits for radiation therapy.
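
The per-voxel rms statistic described above can be sketched as a small Monte Carlo: deliver the same plan many times with random errors and take the rms deviation from the prescription in each voxel (the flattened geometry and the 1% noise level below are illustrative, not the paper's beam model):

```python
import numpy as np

# Hedged sketch of the rms-dose-error statistic: repeat the delivery many
# times with random per-delivery noise and compute, per voxel, the rms
# deviation from the prescribed dose. Dimensions and noise are illustrative.

rng = np.random.default_rng(0)
prescribed = 2.0                      # Gy
n_voxels, n_deliveries = 1000, 200

# Each delivery: prescribed dose perturbed by ~1% random delivery noise.
doses = prescribed * (1.0 + 0.01 * rng.standard_normal((n_deliveries, n_voxels)))

rms_error = np.sqrt(np.mean((doses - prescribed) ** 2, axis=0))
percent = 100.0 * rms_error / prescribed
print(percent.mean())   # close to 1% by construction
```

In the paper's full simulation the noise enters through correlated physical quantities (energy, spot position, intensity) rather than an independent per-voxel perturbation, but the rms reduction is the same.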

  9. Human errors during the simulations of an SGTR scenario: Application of the HERA system

    International Nuclear Information System (INIS)

    Jung, Won Dea; Whaley, April M.; Hallbert, Bruce P.

    2009-01-01

    Due to the need for data for Human Reliability Analysis (HRA), a number of data collection efforts have been undertaken in several different organizations. As a part of this effort, a human error analysis focused on a set of simulator records of a Steam Generator Tube Rupture (SGTR) scenario was performed using the Human Event Repository and Analysis (HERA) system. This paper summarizes the process and results of the HERA analysis, including discussion of the usability of the HERA system for human error analysis of simulator data. Five simulated records of an SGTR scenario were analyzed with the HERA analysis process in order to scrutinize the causes and mechanisms of the human-related events. From this study, the authors confirmed that HERA is a serviceable system that can qualitatively analyze human performance from simulator data. It was possible to identify the human-related events in the simulator data that affected system safety, not only negatively but also positively. It was also possible to scrutinize the Performance Shaping Factors (PSFs) and the relevant contributory factors for each identified human event.

  10. Human error views : a framework for benchmarking organizations and measuring the distance between academia and industry

    NARCIS (Netherlands)

    Karanikas, Nektarios

    2015-01-01

    The paper presents a framework that, through structured analysis of accident reports, explores the differences between practice and academic literature, as well as amongst organizations, regarding their views on human error. The framework is based on the hypothesis that the wording of accident reports

  11. Human error identification for laparoscopic surgery: Development of a motion economy perspective.

    Science.gov (United States)

    Al-Hakim, Latif; Sevdalis, Nick; Maiping, Tanaphon; Watanachote, Damrongpan; Sengupta, Shomik; Dissaranan, Charuspong

    2015-09-01

    This study postulates that traditional human error identification techniques fail to consider motion economy principles and, accordingly, their applicability in operating theatres may be limited. This study addresses this gap in the literature with a dual aim. First, it identifies the principles of motion economy that suit the operative environment and second, it develops a new error mode taxonomy for human error identification techniques which recognises motion economy deficiencies affecting the performance of surgeons and predisposing them to errors. A total of 30 principles of motion economy were developed and categorised into five areas. A hierarchical task analysis was used to break down main tasks of a urological laparoscopic surgery (hand-assisted laparoscopic nephrectomy) to their elements and the new taxonomy was used to identify errors and their root causes resulting from violation of motion economy principles. The approach was prospectively tested in 12 observed laparoscopic surgeries performed by 5 experienced surgeons. A total of 86 errors were identified and linked to the motion economy deficiencies. Results indicate the developed methodology is promising. Our methodology allows error prevention in surgery and the developed set of motion economy principles could be useful for training surgeons on motion economy principles. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  12. Quantification of human errors in level-1 PSA studies in NUPEC/JINS

    International Nuclear Information System (INIS)

    Hirano, M.; Hirose, M.; Sugawara, M.; Hashiba, T.

    1991-01-01

    THERP (Technique for Human Error Rate Prediction) method is mainly adopted to evaluate the pre-accident and post-accident human error rates. Performance shaping factors are derived by taking Japanese operational practice into account. Several examples of human error rates with calculational procedures are presented. The important human interventions of typical Japanese NPPs are also presented. (orig./HP)

  13. Collection and classification of human error and human reliability data from Indian nuclear power plants for use in PSA

    International Nuclear Information System (INIS)

    Subramaniam, K.; Saraf, R.K.; Sanyasi Rao, V.V.S.; Venkat Raj, V.; Venkatraman, R.

    2000-01-01

    Complex systems such as NPPs involve a large number of Human Interactions (HIs) in every phase of plant operations. Human Reliability Analysis (HRA), in the context of a PSA, attempts to model the HIs and evaluate/predict their impact on safety and reliability using human error/human reliability data. A large number of HRA techniques have been developed for modelling and integrating HIs into PSA, but there is a significant lack of HRA data. In the face of insufficient data, human reliability analysts have had to resort to expert judgement methods in order to extend the insufficient data sets. In this situation, the generation of data from plant operating experience assumes importance. The development of an HRA data bank for Indian nuclear power plants was therefore initiated as part of the programme of work on HRA. Later, with the establishment of the coordinated research programme (CRP) on collection of human reliability data and use in PSA by IAEA in 1994-95, the development was carried out under the aegis of the IAEA research contract No. 8239/RB. The work described in this report covers the activities of development of a data taxonomy and a human error reporting form (HERF) based on it, data structuring, review and analysis of plant event reports, collection of data on human errors, analysis of the data and calculation of human error probabilities (HEPs). Analysis of plant operating experience does yield a good amount of qualitative data, but obtaining quantitative data on human reliability in the form of HEPs is seen to be more difficult. The difficulties have been highlighted and some ways to bring about improvements in the data situation have been discussed. The implementation of a data system for HRA is described and useful features that can be incorporated in future systems are also discussed. (author)

  14. Analysis of Errors in a Special Perturbations Satellite Orbit Propagator

    Energy Technology Data Exchange (ETDEWEB)

    Beckerman, M.; Jones, J.P.

    1999-02-01

    We performed an analysis of error densities for the Special Perturbations orbit propagator using data for 29 satellites in orbits of interest to Space Shuttle and International Space Station collision avoidance. We find that the along-track errors predominate. These errors increase monotonically over each 36-hour prediction interval. The predicted positions in the along-track direction progressively either leap ahead of or lag behind the actual positions. Unlike the along-track errors, the radial and cross-track errors oscillate about their nearly zero mean values. As the number of observations per fit interval declines, the along-track prediction errors, and the amplitudes of the radial and cross-track errors, increase.

  15. Analysis of the "naming game" with learning errors in communications.

    Science.gov (United States)

    Lou, Yang; Chen, Guanrong

    2015-07-16

    The naming game simulates the process of naming an object by a population of agents organized in a certain communication network. By pair-wise iterative interactions, the population reaches consensus asymptotically. We study the naming game with communication errors during pair-wise conversations, with error rates in a uniform probability distribution. First, a model of the naming game with learning errors in communications (NGLE) is proposed. Then, a strategy for agents to prevent learning errors is suggested. To that end, three typical topologies of communication networks, namely random-graph, small-world and scale-free networks, are employed to investigate the effects of various learning errors. Simulation results on these models show that 1) learning errors slightly affect the convergence speed but distinctly increase the requirement for memory of each agent during lexicon propagation; 2) the maximum number of different words held by the population increases linearly as the error rate increases; 3) without applying any strategy to eliminate learning errors, there is a threshold of the learning errors which impairs convergence. The new findings may help to better understand the role of learning errors in the naming game, as well as in human language development, from a network science perspective.
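
    The NGLE dynamics can be illustrated with a toy simulation; everything below (a complete-graph topology, a hearer who stores a corrupted copy of the word with probability `error_rate`) is a simplified sketch of the paper's model, not its actual implementation:

```python
import random

def naming_game(n_agents=5, steps=5000, error_rate=0.0, seed=1):
    """Toy naming game on a complete graph. Each step a random
    speaker/hearer pair interacts; with probability error_rate the
    hearer mis-learns the word (a 'learning error')."""
    rng = random.Random(seed)
    lexicons = [[] for _ in range(n_agents)]
    invented = 0
    for _ in range(steps):
        s, h = rng.sample(range(n_agents), 2)
        if not lexicons[s]:                   # speaker invents a new word
            lexicons[s].append("w%d" % invented)
            invented += 1
        word = rng.choice(lexicons[s])
        heard = word + "'" if rng.random() < error_rate else word
        if heard in lexicons[h]:              # success: both collapse to the word
            lexicons[s] = [heard]
            lexicons[h] = [heard]
        else:                                 # failure: hearer memorises the word
            lexicons[h].append(heard)
    return lexicons
```

    With `error_rate=0` a small population collapses to a single shared word well within a few thousand interactions; raising the rate inflates the agents' lexicons, mirroring the memory-requirement finding above.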

  16. ERROR ANALYSIS ON INFORMATION AND TECHNOLOGY STUDENTS’ SENTENCE WRITING ASSIGNMENTS

    Directory of Open Access Journals (Sweden)

    Rentauli Mariah Silalahi

    2015-03-01

    Full Text Available Students’ error analysis is very important for helping EFL teachers to develop their teaching materials, assessments and methods. However, it takes much time and effort for teachers to carry out such an error analysis of their students’ language. This study seeks to identify the common errors made by one class of 28 freshman students studying English in their first semester at an IT university. The data were collected from their writing assignments over eight consecutive weeks. The errors found were classified into 24 types; the ten most common errors committed by the students concerned articles, prepositions, spelling, word choice, subject-verb agreement, auxiliary verbs, plural forms, verb forms, capital letters, and meaningless sentences. The findings about the students’ frequency of committing errors were then contrasted with their midterm test results, and in order to find out the reasons behind the error recurrence, the students were given some questions to answer in a questionnaire format. Most of the students admitted that carelessness was the major reason for their errors, with lack of understanding coming next. This study suggests that EFL teachers devote time to continuously checking their students’ language and giving corrections so that the students can learn from their errors and stop committing the same ones.

  17. Task types and error types involved in the human-related unplanned reactor trip events

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun

    2008-01-01

    In this paper, the contributions of the task types and error types involved in the human-related unplanned reactor trip events that have occurred between 1986 and 2006 in Korean nuclear power plants are analysed in order to establish a strategy for reducing the human-related unplanned reactor trips. Classification systems for the task types, error modes, and cognitive functions are developed or adopted from the currently available taxonomies, and the relevant information is extracted from the event reports or judged on the basis of an event description. According to the analyses from this study, the contributions of the task types are as follows: corrective maintenance (25.7%), planned maintenance (22.8%), planned operation (19.8%), periodic preventive maintenance (14.9%), response to a transient (9.9%), and design/manufacturing/installation (6.9%). According to the analysis of the error modes, error modes such as control failure (22.2%), wrong object (18.5%), omission (14.8%), wrong action (11.1%), and inadequate (8.3%) take up about 75% of the total unplanned trip events. The analysis of the cognitive functions involved in the events indicated that the planning function had the highest contribution (46.7%) to the human actions leading to unplanned reactor trips. This analysis concludes that in order to significantly reduce human-induced or human-related unplanned reactor trips, an aid system (in support of maintenance personnel) for evaluating possible (negative) impacts of planned actions or erroneous actions, as well as an appropriate human error prediction technique, should be developed.

  18. Task types and error types involved in the human-related unplanned reactor trip events

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Park, Jin Kyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-12-15

    In this paper, the contributions of the task types and error types involved in the human-related unplanned reactor trip events that have occurred between 1986 and 2006 in Korean nuclear power plants are analysed in order to establish a strategy for reducing the human-related unplanned reactor trips. Classification systems for the task types, error modes, and cognitive functions are developed or adopted from the currently available taxonomies, and the relevant information is extracted from the event reports or judged on the basis of an event description. According to the analyses from this study, the contributions of the task types are as follows: corrective maintenance (25.7%), planned maintenance (22.8%), planned operation (19.8%), periodic preventive maintenance (14.9%), response to a transient (9.9%), and design/manufacturing/installation (6.9%). According to the analysis of the error modes, error modes such as control failure (22.2%), wrong object (18.5%), omission (14.8%), wrong action (11.1%), and inadequate (8.3%) take up about 75% of the total unplanned trip events. The analysis of the cognitive functions involved in the events indicated that the planning function had the highest contribution (46.7%) to the human actions leading to unplanned reactor trips. This analysis concludes that in order to significantly reduce human-induced or human-related unplanned reactor trips, an aid system (in support of maintenance personnel) for evaluating possible (negative) impacts of planned actions or erroneous actions, as well as an appropriate human error prediction technique, should be developed.

  19. An investigation on unintended reactor trip events in terms of human error hazards of Korean nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Lee, Yong Hee; Jang, Tong Il; Oh, Yeon Ju; Shin, Kwang Hyeon

    2014-01-01

    Highlights: • A methodology to identify human error hazards has been established. • The proposed methodology is a preventive approach that identifies not only human error causes but also their hazards. • Using the HFACS framework we tried to find not the causations but all of the hazards and the relationships among them. • We determined countermeasures against human errors by dealing with latent factors such as organizational influences. - Abstract: A new approach for finding the hazards of human errors, and not just their causes, is currently required in the nuclear industry, because pinpointing the causes of human errors is practically impossible owing to the multiplicity of causes in each case. This study therefore aims at identifying the relationships among human error hazards and determining strategies for preventing human error events by means of a reanalysis of the reactor trip events in Korean NPPs. We investigated human errors to find latent factors, such as decisions and conditions, in all of the unintended reactor trip events during the last dozen years. In this study, we applied the HFACS (Human Factors Analysis and Classification System), a tool commonly utilized for investigating human contributions to aviation accidents under a widespread evaluation scheme. Using the HFACS framework, we tried to find not the causations but all of the hazards and their relationships in terms of organizational factors. Through this trial, we derived not only meaningful frequencies of each hazard but also the correlations among them. Considering these correlations, we then suggested useful strategies to prevent human error events. The method for investigating unintended reactor trips caused by human errors and the results will be discussed in more detail.

  20. An error taxonomy system for analysis of haemodialysis incidents.

    Science.gov (United States)

    Gu, Xiuzhu; Itoh, Kenji; Suzuki, Satoshi

    2014-12-01

    This paper describes the development of a haemodialysis error taxonomy system for analysing incidents and predicting the safety status of a dialysis organisation. The error taxonomy system was developed by adapting to haemodialysis situations an error taxonomy system that assumed no specific specialty. It was applied to 1,909 incident reports collected from two dialysis facilities in Japan. Over 70% of haemodialysis incidents were reported as problems or complications related to the dialyser, circuit, medication and setting of dialysis conditions. Approximately 70% of errors took place immediately before or after the four hours of haemodialysis therapy. The error types most frequently made in the dialysis unit were omission and qualitative errors. Failures or complications classified under staff human factors, communication, task and organisational factors were found in most dialysis incidents. Devices/equipment/materials, medicines and clinical documents were most likely to be involved in errors. Haemodialysis nurses were involved in more incidents related to medicines and documents, whereas dialysis technologists made more errors with devices/equipment/materials. This error taxonomy system is able not only to investigate incidents and adverse events occurring in the dialysis setting but also to estimate the safety-related status of an organisation, such as its reporting culture. © 2014 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  1. Analysis of Medication Errors in Simulated Pediatric Resuscitation by Residents

    Directory of Open Access Journals (Sweden)

    Evelyn Porter

    2014-07-01

    Full Text Available Introduction: The objective of our study was to estimate the incidence of prescribing medication errors specifically made by a trainee and to identify factors associated with these errors during the simulated resuscitation of a critically ill child. Methods: We analyzed data from the simulated resuscitations for the occurrence of prescribing medication errors. We performed univariate analysis of each variable against the medication error rate and a separate multiple logistic regression analysis on the significant univariate variables to assess the associations between the selected variables. Results: We reviewed 49 simulated resuscitations. The final medication error rate for the simulation was 26.5% (95% CI 13.7%-39.3%). On univariate analysis, statistically significant findings for decreased prescribing medication error rates included senior residents in charge, presence of a pharmacist, sleeping more than 8 hours prior to the simulation, and a visual analog scale score showing more confidence in caring for critically ill children. Multiple logistic regression analysis using the above significant variables showed only the presence of a pharmacist to remain significantly associated with decreased medication error, with an odds ratio of 0.09 (95% CI 0.01-0.64). Conclusion: Our results indicate that the presence of a clinical pharmacist during the resuscitation of a critically ill child reduces the medication errors made by resident physician trainees.
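
    The reported effect size (an odds ratio with a 95% CI) can be recomputed from a 2x2 table; the function below and the counts in the usage note are hypothetical illustrations, not the study's raw data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = errors with pharmacist,    b = no errors with pharmacist,
    c = errors without pharmacist, d = no errors without pharmacist."""
    odds_ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(odds_ratio) - z * se)
    hi = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, (lo, hi)
```

    For instance, hypothetical counts `odds_ratio_ci(2, 18, 11, 18)` give an odds ratio well below 1, i.e. fewer prescribing errors when a pharmacist is present.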

  2. Error Analysis on Plane-to-Plane Linear Approximate Coordinate ...

    Indian Academy of Sciences (India)

    Abstract. In this paper, an error analysis is performed for the linear approximate transformation between two tangent planes on the celestial sphere in a simple case. The results demonstrate that the error from the linear transformation does not meet the requirement of high-precision astrometry under some conditions, so the ...

  3. A strategy for minimizing common mode human error in executing critical functions and tasks

    International Nuclear Information System (INIS)

    Beltracchi, L.; Lindsay, R.W.

    1992-01-01

    Human error in the execution of critical functions and tasks can be costly. The Three Mile Island and Chernobyl accidents are examples of the results of human error in the nuclear industry; similar errors could no doubt be cited from other industries. This paper discusses a strategy to minimize common mode human error in the execution of critical functions and tasks. The strategy consists of the use of human redundancy, and also of diversity in human cognitive behavior: skill-, rule-, and knowledge-based behavior. The authors contend that the use of diversity in human cognitive behavior is possible, and that it minimizes common mode error.

  4. Modelling the basic error tendencies of human operators

    Energy Technology Data Exchange (ETDEWEB)

    Reason, J.

    1988-01-01

    The paper outlines the primary structural features of human cognition: a limited, serial workspace interacting with a parallel distributed knowledge base. It is argued that the essential computational features of human cognition - to be captured by an adequate operator model - reside in the mechanisms by which stored knowledge structures are selected and brought into play. Two such computational 'primitives' are identified: similarity-matching and frequency-gambling. These two retrieval heuristics, it is argued, shape both the overall character of human performance (i.e. its heavy reliance on pattern-matching) and its basic error tendencies ('strong-but-wrong' responses, confirmation, similarity and frequency biases, and cognitive 'lock-up'). The various features of human cognition are integrated with a dynamic operator model capable of being represented in software form. This computer model, when run repeatedly with a variety of problem configurations, should produce a distribution of behaviours which, in toto, simulate the general character of operator performance.

  5. The Relationship between Human Operators' Psycho-physiological Condition and Human Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Arryum; Jang, Inseok; Kang, Hyungook; Seong, Poonghyun

    2013-01-01

    The safe operation of nuclear power plants (NPPs) is substantially dependent on the performance of the human operators who operate the systems. In this environment, human errors caused by inappropriate performance of operators have been considered critical, since they may lead to serious problems in safety-critical plants. In order to provide meaningful insights to prevent human errors and enhance human performance, operators' physiological conditions such as stress and workload have been investigated. Physiological measurements are considered reliable tools to assess stress and workload. T. Q. Tran et al. and J. B. Brooking et al. pointed out that operators' workload can be assessed using eye tracking, galvanic skin response, electroencephalograms (EEGs), heart rate, respiration and other measurements. The purpose of this study is to investigate the effect of human operators' tension level and knowledge level on the number of human errors. For this study, experiments were conducted in a mimic of the main control room (MCR) in an NPP. It utilized the compact nuclear simulator (CNS), which is modeled on the three-loop Pressurized Water Reactor, 993 MWe, Kori units 3 and 4 in Korea, and the subjects were asked to follow the tasks described in the emergency operating procedures (EOP). During the simulation, three kinds of physiological measurements were utilized: electrocardiogram (ECG), EEG and nose temperature. Also, subjects were divided into three groups based on their knowledge of plant operation. The results show that subjects who are tense make fewer errors. In addition, subjects at a higher knowledge level tend to be tense and make fewer errors. For the ECG data, subjects who make fewer human errors tend to be located in the higher tension area of high SNS activity and low PSNS activity. The results for the EEG data are similar to the ECG results: the beta power ratio of subjects who make fewer errors was higher. Since beta power ratio is

  6. A human error probability estimate methodology based on fuzzy inference and expert judgment on nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Mesquita, R.N. de

    2009-01-01

    Recent studies point to human error as an important factor in many industrial and nuclear accidents: Three Mile Island (1979), Bhopal (1984), Chernobyl and Challenger (1986) are classical examples. The human contribution to these accidents may be better understood and analyzed by using Human Reliability Analysis (HRA), which has been taken as an essential part of Probabilistic Safety Analysis (PSA) of nuclear plants. Both HRA and PSA depend on Human Error Probabilities (HEPs) for a quantitative analysis. These probabilities are strongly affected by the Performance Shaping Factors (PSFs), which have a direct effect on human behavior and thus shape HEPs according to the specific environmental conditions and the personal characteristics of the individuals responsible for the actions. This PSF dependence raises a serious data availability problem, as the few existing databases are either too generic or too specific. Besides this, most nuclear plants do not keep historical records of human error occurrences. Therefore, in order to overcome this data shortage, a methodology based on fuzzy inference and expert judgment was employed in this paper to determine human error occurrence probabilities and to evaluate the PSFs for actions performed by operators in a nuclear power plant (the IEA-R1 nuclear reactor). The obtained HEP values were compared with reference tabled data from the current literature in order to show the coherence and validity of the approach. This comparison leads to the conclusion that the results of this work can be employed in both HRA and PSA, enabling efficient assessment of plant safety conditions and of potential improvements to operational procedures and local working conditions (author)
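
    The fuzzy-inference step can be sketched in miniature: one PSF rating in, one HEP out, with two Mamdani rules and centroid defuzzification. The membership functions, ranges and rule base below are invented for illustration and are not the ones elicited in the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_hep(psf_rating):
    """Map a PSF rating in [0, 10] (0 = favourable, 10 = poor) to an HEP
    in [0, 0.1] with two Mamdani rules:
    'favourable PSF -> low HEP', 'poor PSF -> high HEP'."""
    mu_low = tri(psf_rating, -5.0, 0.0, 10.0)    # degree 'conditions favourable'
    mu_high = tri(psf_rating, 0.0, 10.0, 15.0)   # degree 'conditions poor'
    # Clip each output set at its rule strength, then defuzzify by
    # centroid over a discretised HEP range.
    num = den = 0.0
    for i in range(101):
        hep = 0.001 * i                          # grid over 0 .. 0.1
        out_low = min(mu_low, tri(hep, -0.05, 0.0, 0.1))
        out_high = min(mu_high, tri(hep, 0.0, 0.1, 0.15))
        mu = max(out_low, out_high)
        num += hep * mu
        den += mu
    return num / den if den else 0.0
```

    A worse PSF rating yields a higher defuzzified HEP, which is the qualitative behaviour the method relies on.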

  7. Deadline pressure and human error: a study of human failures on a particle accelerator at Brookhaven National Laboratory

    International Nuclear Information System (INIS)

    Tiagha, E.A.

    1982-01-01

    The decline in industrial efficiency may be linked to decreased reliability of complex automatic systems. This decline threatens the viability of complex organizations in industrialized economies. Industrial engineering techniques that minimize system failure by increasing the reliability of system hardware are well developed in comparison with those available to reduce human operator errors. The problem of system reliability and the associated costs of breakdown can be reduced if we understand how highly skilled technical personnel function in complex operations and systems. The purpose of this research is to investigate how human errors are affected by deadline pressures, technical communication and other socio-dynamic factors. Through the analysis of a technologically complex particle accelerator prototype at Brookhaven National Laboratory, two failure mechanisms were identified: (1) physical defects in the production process and (2) human operator errors. Two instruments were used to collect information on human failures: objective laboratory data and a human failure questionnaire. The results on human failures from the objective data were used to test the deadline hypothesis and also to validate the human failure questionnaire. To explain why the human failures occurred, data were collected from a four-part, closed-choice questionnaire administered to two groups of scientists, engineers, and technicians working together against a deadline to produce an engineering prototype of a particle accelerator.

  8. Errors of DWPF frit analysis: Final report

    International Nuclear Information System (INIS)

    Schumacher, R.F.

    1993-01-01

    Glass frit will be a major raw material for the operation of the Defense Waste Processing Facility. The frit will be controlled by certificate of conformance and a confirmatory analysis from a commercial analytical laboratory. The following effort provides additional quantitative information on the variability of frit chemical analyses at two commercial laboratories. Identical samples of IDMS Frit 202 were chemically analyzed at two commercial laboratories and at three different times over a period of four months. The SRL-ADS analyses, after correction with the reference standard and normalization, provided confirmatory information, but did not detect the low silica level in one of the frit samples. A methodology utilizing elliptical limits for confirming the certificate of conformance or confirmatory analysis was introduced and recommended for use when the analysis values are close to but not within the specification limits. It was also suggested that the lithia specification limits might be reduced as long as CELS is used to confirm the analysis.

  9. Quantification of the effects of dependence on human error probabilities

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1980-01-01

    In estimating the probabilities of human error in the performance of a series of tasks in a nuclear power plant, the situation-specific characteristics of the series must be considered. A critical factor not to be overlooked in this estimation is the dependence or independence that pertains to any of the several pairs of task performances. In discussing the quantification of the effects of dependence, the event tree symbology described will be used. In any series of tasks, the only dependence considered for quantification in this document will be that existing between the task of interest and the immediately preceding task. Tasks performed earlier in the series may have some effect on the end task, but this effect is considered negligible
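
    The dependence adjustment discussed here is quantified in THERP (NUREG/CR-1278, co-authored by Swain) with five discrete dependence levels. A minimal sketch of those standard conditional-probability formulas; the function name is ours:

```python
# Conditional HEP of task N given failure on task N-1, adjusted for the
# level of dependence between the two tasks (THERP, NUREG/CR-1278).
def conditional_hep(basic_hep: float, dependence: str) -> float:
    """Return P(failure on task N | failure on task N-1)."""
    formulas = {
        "ZD": lambda p: p,                  # zero dependence
        "LD": lambda p: (1 + 19 * p) / 20,  # low dependence
        "MD": lambda p: (1 + 6 * p) / 7,    # moderate dependence
        "HD": lambda p: (1 + p) / 2,        # high dependence
        "CD": lambda p: 1.0,                # complete dependence
    }
    return formulas[dependence](basic_hep)
```

    For example, a basic HEP of 0.01 rises to 0.505 for the dependent task under high dependence, and to 1.0 under complete dependence.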

  10. Sensitivity analysis of periodic errors in heterodyne interferometry

    International Nuclear Information System (INIS)

    Ganguly, Vasishta; Kim, Nam Ho; Kim, Hyo Soo; Schmitz, Tony

    2011-01-01

    Periodic errors in heterodyne displacement measuring interferometry occur due to frequency mixing in the interferometer. These nonlinearities are typically characterized as first- and second-order periodic errors which cause a cyclical (non-cumulative) variation in the reported displacement about the true value. This study implements an existing analytical periodic error model in order to identify sensitivities of the first- and second-order periodic errors to the input parameters, including rotational misalignments of the polarizing beam splitter and mixing polarizer, non-orthogonality of the two laser frequencies, ellipticity in the polarizations of the two laser beams, and different transmission coefficients in the polarizing beam splitter. A local sensitivity analysis is first conducted to examine the sensitivities of the periodic errors with respect to each input parameter about the nominal input values. Next, a variance-based approach is used to study the global sensitivities of the periodic errors by calculating the Sobol' sensitivity indices using Monte Carlo simulation. The effect of variation in the input uncertainty on the computed sensitivity indices is examined. It is seen that the first-order periodic error is highly sensitive to non-orthogonality of the two linearly polarized laser frequencies, while the second-order error is most sensitive to the rotational misalignment between the laser beams and the polarizing beam splitter. A particle swarm optimization technique is finally used to predict the possible setup imperfections based on experimentally generated values for periodic errors.

  11. Sensitivity analysis of periodic errors in heterodyne interferometry

    Science.gov (United States)

    Ganguly, Vasishta; Kim, Nam Ho; Kim, Hyo Soo; Schmitz, Tony

    2011-03-01

    Periodic errors in heterodyne displacement measuring interferometry occur due to frequency mixing in the interferometer. These nonlinearities are typically characterized as first- and second-order periodic errors which cause a cyclical (non-cumulative) variation in the reported displacement about the true value. This study implements an existing analytical periodic error model in order to identify sensitivities of the first- and second-order periodic errors to the input parameters, including rotational misalignments of the polarizing beam splitter and mixing polarizer, non-orthogonality of the two laser frequencies, ellipticity in the polarizations of the two laser beams, and different transmission coefficients in the polarizing beam splitter. A local sensitivity analysis is first conducted to examine the sensitivities of the periodic errors with respect to each input parameter about the nominal input values. Next, a variance-based approach is used to study the global sensitivities of the periodic errors by calculating the Sobol' sensitivity indices using Monte Carlo simulation. The effect of variation in the input uncertainty on the computed sensitivity indices is examined. It is seen that the first-order periodic error is highly sensitive to non-orthogonality of the two linearly polarized laser frequencies, while the second-order error is most sensitive to the rotational misalignment between the laser beams and the polarizing beam splitter. A particle swarm optimization technique is finally used to predict the possible setup imperfections based on experimentally generated values for periodic errors.
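
    The variance-based step can be illustrated with a Monte Carlo estimator of first-order Sobol' indices; the interferometer model itself is not reproduced here, so the linear test function in the usage note is our own toy stand-in:

```python
import random

def sobol_first_order(f, n_inputs, n_samples=20000, seed=0):
    """Estimate first-order Sobol' indices S_i of f on U(0,1)^n_inputs
    with the Saltelli-style estimator S_i = E[f(B)(f(AB_i) - f(A))] / Var[f]."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(n_inputs)] for _ in range(n_samples)]
    B = [[rng.random() for _ in range(n_inputs)] for _ in range(n_samples)]
    fA = [f(x) for x in A]
    fB = [f(x) for x in B]
    mean = sum(fA) / n_samples
    var = sum((y - mean) ** 2 for y in fA) / n_samples
    indices = []
    for i in range(n_inputs):
        # AB_i: matrix A with column i replaced by column i of B.
        fAB = [f(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        num = sum(yb * (yab - ya) for ya, yb, yab in zip(fA, fB, fAB)) / n_samples
        indices.append(num / var)
    return indices
```

    For the toy model f(x) = 2*x1 + x2 with uniform inputs the analytic indices are S1 = 0.8 and S2 = 0.2, which the estimator recovers to within Monte Carlo error.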

  12. A methodology for analysing human errors of commission in accident scenarios for risk assessment

    International Nuclear Information System (INIS)

    Kim, J. H.; Jung, W. D.; Park, J. K

    2003-01-01

    As concern over the impact of operators' inappropriate interventions, so-called Errors Of Commission (EOCs), on plant safety has been raised, interest in the identification and analysis of EOC events from the risk assessment perspective has increased accordingly. To this purpose, we propose a new methodology for identifying and analysing human errors of commission that might be caused by failures in situation assessment and decision making during accident progression following an initiating event. The proposed methodology was applied to the accident scenarios of the YGN 3 and 4 NPPs, which resulted in about 10 EOC situations that need careful attention.

  13. Attitude Determination Error Analysis System (ADEAS) mathematical specifications document

    Science.gov (United States)

    Nicholson, Mark; Markley, F.; Seidewitz, E.

    1988-01-01

    The mathematical specifications of Release 4.0 of the Attitude Determination Error Analysis System (ADEAS), which provides a general-purpose linear error analysis capability for various spacecraft attitude geometries and determination processes, are presented. The analytical basis of the system is presented, and detailed equations are provided for both three-axis-stabilized and spin-stabilized attitude sensor models.

  14. Data Analysis & Statistical Methods for Command File Errors

    Science.gov (United States)

    Meshkat, Leila; Waggoner, Bruce; Bryant, Larry

    2014-01-01

    This paper explains current work on modeling for managing the risk of command file errors. It is focused on analyzing actual data from a JPL spaceflight mission to build models for evaluating and predicting error rates as a function of several key variables. We constructed a rich dataset by considering the number of errors and the number of files radiated, including the number of commands and blocks in each file, as well as subjective estimates of workload and operational novelty. We have assessed these data using different curve-fitting and distribution-fitting techniques, such as multiple regression analysis and maximum likelihood estimation, to see how much of the variability in the error rates can be explained by them. We have also used goodness-of-fit testing strategies and principal component analysis to further assess our data. Finally, we constructed a model of expected error rates based on what these statistics bore out as critical drivers of the error rate. This model allows project management to evaluate the error rate against a theoretically expected rate as well as anticipate future error rates.
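
    The multiple-regression step can be sketched with ordinary least squares via the normal equations; the predictors and the noise-free synthetic data below are illustrative stand-ins, not the mission's dataset:

```python
def ols(X, y):
    """Solve the normal equations (X^T X) beta = X^T y by Gaussian
    elimination; each row of X should start with 1.0 for the intercept."""
    k = len(X[0])
    # Build the augmented system [X^T X | X^T y].
    M = [[sum(r[i] * r[j] for r in X) for j in range(k)] +
         [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(k)]
    for col in range(k):                       # forward elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, k):
            f = M[r][col] / M[col][col]
            M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    beta = [0.0] * k
    for i in reversed(range(k)):               # back substitution
        beta[i] = (M[i][k] - sum(M[i][j] * beta[j]
                                 for j in range(i + 1, k))) / M[i][i]
    return beta

# Synthetic, noise-free data: error_rate = 0.5 + 2*files + 3*workload.
data = [(f, w) for f in range(4) for w in range(4)]
X = [[1.0, f, w] for f, w in data]
y = [0.5 + 2 * f + 3 * w for f, w in data]
```

    On noise-free data the fitted coefficients recover the generating ones exactly (up to floating point); real error-rate data would of course add residual scatter.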

  15. Basic human error probabilities in advanced MCRs when using soft control

    International Nuclear Information System (INIS)

    Jang, In Seok; Seong, Poong Hyun; Kang, Hyun Gook; Lee, Seung Jun

    2012-01-01

    In a report on one of the renowned HRA methods, Technique for Human Error Rate Prediction (THERP), it is pointed out that 'The paucity of actual data on human performance continues to be a major problem for estimating HEPs and performance times in nuclear power plant (NPP) task'. However, another critical difficulty is that most current HRA databases deal with operation in conventional types of MCRs. With the adoption of new human-system interfaces that are based on computer-based technologies, the operation environment of MCRs in NPPs has changed. MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, soft controls, and so on, are called advanced MCRs. Because of the different interfaces, different Basic Human Error Probabilities (BHEPs) should be considered in human reliability analyses (HRAs) for advanced MCRs. This study carries out an empirical analysis of human error considering soft controls. The aim of this work is not only to compile a database using the simulator for advanced MCRs but also to compare BHEPs with those of a conventional MCR database.
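    A minimal sketch of how a basic human error probability might be tallied from simulator trials, with a normal-approximation confidence interval; the counts are hypothetical and the interval form is an assumption for illustration, not the study's method:

    ```python
    from math import sqrt

    def bhep(errors, opportunities, z=1.645):
        """Point estimate and normal-approximation 90% interval for a basic
        human error probability computed from simulator tallies."""
        p = errors / opportunities
        half = z * sqrt(p * (1.0 - p) / opportunities)
        return p, max(0.0, p - half), min(1.0, p + half)

    # Hypothetical tally: 7 wrong-device selections in 400 soft-control operations.
    p, lo, hi = bhep(7, 400)
    print(f"BHEP = {p:.4f} (90% CI {lo:.4f}-{hi:.4f})")
    ```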

  16. The development of human behaviour analysis techniques -The development of human factors technologies-

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Cheon, Se Woo; Shu, Sang Moon; Park, Geun Ok; Lee, Yong Hee; Lee, Han Yeong; Park, Jae Chang; Lee, Eu Jin; Lee, Seung Hee

    1994-04-01

    This project has two major areas: the development of an operator task simulation software and the development of human error analysis and application technologies. In this second year of the project, for the development of an operator task simulation software, we studied the following: - analysis of the characteristics of operator tasks, - development of operator task structures: Macro Structures, - development of an operator task simulation analyzer, - analysis of performance measures. And the following for the development of human error analysis and application technologies: - analysis of human error mechanisms, - analysis of human error characteristics in tasks, - analysis of human error occurrence in Korean Nuclear Power Plants, - establishment of an experimental environment for human error data collection with the Compact Nuclear Simulator, - basic design of a Multimedia-based Human Error Representing System. (Author)

  17. Grinding Method and Error Analysis of Eccentric Shaft Parts

    Science.gov (United States)

    Wang, Zhiming; Han, Qiushi; Li, Qiguang; Peng, Baoying; Li, Weihua

    2017-12-01

    Eccentric shaft parts are widely used in RV reducers and various mechanical transmissions, and precision grinding technology for such parts is now in demand. In this paper, the X-C linkage model of eccentric shaft grinding is studied. By an inversion method, the contour curve of the wheel envelope is deduced, with the distance from the center of the eccentric circle held constant. Simulation software for eccentric shaft grinding was developed, and the correctness of the model is proved; the influence of the X-axis feed error, the C-axis feed error, and the wheel radius error on the grinding process is analyzed, and a corresponding error calculation model is proposed. The simulation analysis is carried out to provide the basis for contour error compensation.

  18. The Concept of Human Error and the Design of Reliable Human-Machine Systems

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1995-01-01

    The concept of human error is unreliable as a basis for design of reliable human-machine systems. Humans are basically highly adaptive, and 'errors' are closely related to the process of adaptation and learning. Therefore, reliability of system operation depends on an interface that is not designed so as to support a pre-conceived operating procedure but, instead, makes visible the deep, functional structure of the system together with the boundaries of acceptable operation, in a way that allows operators to 'touch' the boundaries and to learn to cope with the effects of errors in a reversible way. The concepts behind such 'ecological' interfaces are discussed, and it is argued that a 'typology' of visualization concepts is a pressing research need.

  19. An Analysis of Medication Errors at the Military Medical Center: Implications for a Systems Approach for Error Reduction

    National Research Council Canada - National Science Library

    Scheirman, Katherine

    2001-01-01

    An analysis was accomplished of all inpatient medication errors at a military academic medical center during the year 2000, based on the causes of medication errors as described by current research in the field...

  20. Human error recovery failure probability when using soft controls in computerized control rooms

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Wondea

    2014-01-01

    Many studies categorize the recovery process into three phases: detection of the problem situation, explanation of problem causes or countermeasures against the problem, and end of recovery. Although the focus of recovery research has been on categorizing recovery phases and modeling the recovery process, research on human recovery failure probabilities has not been pursued actively; only a few studies of recovery failure probabilities have been carried out empirically. In summary, the research performed so far has several problems in terms of its use in human reliability analysis (HRA). By adopting new human-system interfaces that are based on computer-based technologies, the operation environment of MCRs in NPPs has changed from conventional MCRs to advanced MCRs. Because of the different interfaces between conventional and advanced MCRs, different recovery failure probabilities should be considered in the HRA for advanced MCRs. Therefore, this study carries out an empirical analysis of human error recovery probabilities under an advanced MCR mockup called the compact nuclear simulator (CNS). The aim of this work is not only to compile a recovery failure probability database using the simulator for advanced MCRs but also to collect recovery failure probabilities for defined human error modes in order to compare which human error mode has the highest recovery failure probability. The results show that the recovery failure probability for wrong screen selection was lowest among the human error modes, which means that most human errors related to wrong screen selection can be recovered. On the other hand, the recovery failure probabilities of operation selection omission and delayed operation were 1.0. These results imply that once subjects omitted one task in the procedure, they had difficulty finding and recovering their errors without a supervisor's assistance. Also, wrong screen selection had an effect on delayed operation. That is, wrong screen

  1. Reliability and error analysis on xenon/CT CBF

    International Nuclear Information System (INIS)

    Zhang, Z.

    2000-01-01

    This article provides a quantitative error analysis of a simulation model of xenon/CT CBF in order to investigate the behavior and effect of different types of errors, such as CT noise, motion artifacts, lower percentage of xenon supply, lower tissue enhancement, etc. A mathematical model is built to simulate these errors. By adjusting the initial parameters of the simulation model, we can scale the Gaussian noise, control the percentage of xenon supply, and change the tissue enhancement with different kVp settings. The motion artifact is treated separately by geometrically shifting the sequential CT images. The input function is chosen from an end-tidal xenon curve of a practical study. Four levels of cerebral blood flow, 10, 20, 50, and 80 cc/100 g/min, are examined under different error environments, and the corresponding CT images are generated following the currently popular timing protocol. The simulated studies are fed to a regular xenon/CT CBF system for calculation and evaluation. A quantitative comparison is given to reveal the behavior and effect of individual error sources. Mixed-error testing is also provided to inspect the combined effect of errors. The experiment shows that CT noise is still a major error source. The motion artifact affects the CBF results more geometrically than quantitatively. Lower xenon supply has a lesser effect on the results but will reduce the signal-to-noise ratio. Lower xenon enhancement will lower the flow values in all areas of the brain. (author)

  2. Ionospheric error analysis in gps measurements

    Directory of Open Access Journals (Sweden)

    G. Pugliano

    2008-06-01

    The results of an experiment aimed at evaluating the effects of the ionosphere on GPS positioning applications are presented in this paper. Specifically, the study, based upon a differential approach, was conducted utilizing GPS measurements acquired by various receivers located at increasing inter-distances. The experimental research was developed upon the basis of two groups of baselines: the first group comprises "short" baselines (less than 10 km); the second group is characterized by greater distances (up to 90 km). The obtained results were compared both on the basis of the geometric characteristics, for six different baseline lengths, using 24 hours of data, and on temporal variations, by examining two periods of differing ionospheric activity, coinciding respectively with the maximum of the 23rd solar cycle and with conditions of low ionospheric activity. The analysis revealed variations in terms of inter-distance as well as different performances primarily owing to temporal modifications in the state of the ionosphere.

  3. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2004-03-01

    The paper describes a technique denoted as Sub-Sample-Size Bounding (SSSB), which is useable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)

  4. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    International Nuclear Information System (INIS)

    Reer, B.

    2004-01-01

    The paper describes a technique denoted as Sub-Sample-Size Bounding (SSSB), which is useable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)

  5. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage because it does not treat operator error as the sole contributor to human failure within a system but as a combination of all underlying factors.

  6. Scaling prediction errors to reward variability benefits error-driven learning in humans.

    Science.gov (United States)

    Diederen, Kelly M J; Schultz, Wolfram

    2015-09-01

    Effective error-driven learning requires individuals to adapt learning to environmental reward variability. The adaptive mechanism may involve decays in learning rate across subsequent trials, as shown previously, and rescaling of reward prediction errors. The present study investigated the influence of prediction error scaling and, in particular, the consequences for learning performance. Participants explicitly predicted reward magnitudes that were drawn from different probability distributions with specific standard deviations. By fitting the data with reinforcement learning models, we found scaling of prediction errors, in addition to the learning rate decay shown previously. Importantly, the prediction error scaling was closely related to learning performance, defined as accuracy in predicting the mean of reward distributions, across individual participants. In addition, participants who scaled prediction errors relative to standard deviation also presented with more similar performance for different standard deviations, indicating that increases in standard deviation did not substantially decrease "adapters'" accuracy in predicting the means of reward distributions. However, exaggerated scaling beyond the standard deviation resulted in impaired performance. Thus efficient adaptation makes learning more robust to changing variability. Copyright © 2015 the American Physiological Society.
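    The scaling mechanism can be sketched with a simple delta-rule learner in which the prediction error is optionally divided by the reward standard deviation; the parameter values and learner form are illustrative assumptions, not the paper's fitted reinforcement learning model:

    ```python
    import random

    def learn(rewards, sigma, alpha=0.3, scale=True):
        """Delta-rule estimate of the mean reward; optionally rescales the
        prediction error by the reward standard deviation."""
        estimate = 0.0
        for r in rewards:
            pe = r - estimate              # reward prediction error
            if scale:
                pe /= sigma                # adaptive scaling to variability
            estimate += alpha * pe
        return estimate

    random.seed(0)
    mean, sigma = 50.0, 10.0
    rewards = [random.gauss(mean, sigma) for _ in range(500)]

    # Scaling damps trial-to-trial fluctuations when the spread is large,
    # stabilizing the final estimate of the distribution mean.
    print(learn(rewards, sigma, scale=True), learn(rewards, sigma, scale=False))
    ```

    Dividing by a larger standard deviation acts like a smaller effective learning rate, which is one way learning can remain robust to changing variability.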

  7. Formal Analysis of Soft Errors using Theorem Proving

    Directory of Open Access Journals (Sweden)

    Sofiène Tahar

    2013-07-01

    Modeling and analysis of soft errors in electronic circuits has traditionally been done using computer simulations. Computer simulations cannot guarantee the correctness of the analysis because they utilize approximate real-number representations and pseudo-random numbers, and are thus not well suited for analyzing safety-critical applications. In this paper, we present a higher-order-logic theorem-proving-based method for modeling and analysis of soft errors in electronic circuits. Our developed infrastructure includes formalized continuous random variable pairs, their Cumulative Distribution Function (CDF) properties, and independent standard uniform and Gaussian random variables. We illustrate the usefulness of our approach by modeling and analyzing soft errors in commonly used dynamic random access memory sense amplifier circuits.

  8. QUALITATIVE DATA AND ERROR MEASUREMENT IN INPUT-OUTPUT-ANALYSIS

    NARCIS (Netherlands)

    NIJKAMP, P; OOSTERHAVEN, J; OUWERSLOOT, H; RIETVELD, P

    1992-01-01

    This paper is a contribution to the rapidly emerging field of qualitative data analysis in economics. Ordinal data techniques and error measurement in input-output analysis are here combined in order to test the reliability of a low level of measurement and precision of data by means of a stochastic

  9. Multiple sequential failure model: A probabilistic approach to quantifying human error dependency

    International Nuclear Information System (INIS)

    Samanta

    1985-01-01

    This paper presents a probabilistic approach to quantifying human error dependency when multiple tasks are performed. Dependent human failures are dominant contributors to risks from nuclear power plants. An overview of the Multiple Sequential Failure (MSF) model and its use in probabilistic risk assessments (PRAs), depending on the available data, is given. A small-scale psychological experiment was conducted on the nature of human dependency, and the interpretation of the experimental data by the MSF model shows remarkable accommodation of the dependent failure data. The model, which provides a unique method for quantifying dependent failures in human reliability analysis, can be used in conjunction with any of the general methods currently used for performing the human reliability aspect of PRAs.
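    The abstract does not give the MSF model's equations; the following generic sketch merely illustrates how sequential dependence, in which each failure raises the next task's conditional failure probability, inflates a joint failure probability relative to the independence assumption:

    ```python
    def dependent_failure_prob(p0, k, n):
        """P(all n sequential tasks fail) when each failure multiplies the
        next task's conditional failure probability by k (capped at 1).
        Independence corresponds to k = 1. Illustrative model only."""
        prob, p = 1.0, p0
        for _ in range(n):
            prob *= p
            p = min(1.0, k * p)
        return prob

    p_indep = dependent_failure_prob(0.01, 1, 3)   # independence: 0.01**3
    p_dep = dependent_failure_prob(0.01, 10, 3)    # strong dependence
    print(p_indep, p_dep)
    ```

    Even this toy model shows why ignoring dependency can understate multi-task failure probabilities by orders of magnitude.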

  10. ATHEANA: "a technique for human error analysis" entering the implementation phase

    International Nuclear Information System (INIS)

    Taylor, J.; O'Hara, J.; Luckas, W.

    1997-01-01

    Probabilistic Risk Assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA) is a critical element of PRA, yet limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification. The purpose of the Brookhaven National Laboratory (BNL) project, entitled 'Improved HRA Method Based on Operating Experience', is to develop a new method for HRA which is supported by the analysis of risk-significant operating experience. This approach will allow a more realistic assessment and representation of the human contribution to plant risk, and thereby increase the utility of PRA. The project's completed, ongoing, and future efforts fall into four phases: (1) Assessment phase (FY 92/93); (2) Analysis and Characterization phase (FY 93/94); (3) Development phase (FY 95/96); and (4) Implementation phase (FY 96/97, ongoing)

  11. Identification and Assessment of Human Errors in Postgraduate Endodontic Students of Kerman University of Medical Sciences by Using the SHERPA Method

    Directory of Open Access Journals (Sweden)

    Saman Dastaran

    2016-03-01

    Introduction: Human errors are the cause of many accidents, both industrial and medical; therefore, finding an approach for identifying and reducing them is very important. Since no study had been done on human errors in the dental field, this study aimed to identify and assess human errors made by postgraduate endodontic students of Kerman University of Medical Sciences by using the SHERPA method. Methods: This cross-sectional study was performed during the year 2014. Data were collected by task observation and by interviewing postgraduate endodontic students. Overall, 10 critical tasks, which were most likely to cause harm to patients, were determined. Next, Hierarchical Task Analysis (HTA) was conducted, and human errors in each task were identified using the Systematic Human Error Reduction and Prediction Approach (SHERPA) technique worksheets. Results: After analyzing the SHERPA worksheets, 90 human errors were identified, comprising action errors (67.7%), checking errors (13.3%), selection errors (8.8%), retrieval errors (5.5%), and communication errors (4.4%). Thus, most were action errors and the fewest were communication errors. Conclusions: The results of the study showed that the highest percentage of errors and the highest level of risk were associated with action errors; therefore, to reduce the occurrence of such errors and limit their consequences, control measures including periodic training in work procedures, provision of work checklists, development of guidelines, and establishment of a systematic and standardized reporting system should be put in place. Given the results of this study, the control of recovery errors, with the highest percentage of undesirable risk, and of action errors, with the highest frequency of errors, should be the priority of control

  12. Comparison of risk sensitivity to human errors in the Oconee and LaSalle PRAs

    International Nuclear Information System (INIS)

    Wong, S.; Higgins, J.

    1991-01-01

    This paper describes the comparative analyses of plant risk sensitivity to human errors in the Oconee and La Salle Probabilistic Risk Assessments (PRAs). These analyses were performed to determine the reasons for the observed differences in the sensitivity of core melt frequency (CMF) to changes in human error probabilities (HEPs). Plant-specific design features, PRA methods, and the level of detail and assumptions in the human error modeling were evaluated to assess their influence on risk estimates and sensitivities.

  13. An empirical study on the basic human error probabilities for NPP advanced main control room operation using soft control

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Harbi, Mohamed Ali Salem Al; Lee, Seung Jun; Kang, Hyun Gook; Seong, Poong Hyun

    2013-01-01

    Highlights: ► The operation environment of MCRs in NPPs has changed by adopting new HSIs. ► The operation action in NPP Advanced MCRs is performed by soft control. ► Different basic human error probabilities (BHEPs) should be considered. ► BHEPs in a soft control operation environment are investigated empirically. ► This work will be helpful to verify if soft control has positive or negative effects. -- Abstract: By adopting new human–system interfaces that are based on computer-based technologies, the operation environment of main control rooms (MCRs) in nuclear power plants (NPPs) has changed. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, soft controls, and so on, are called Advanced MCRs. Among the many features in Advanced MCRs, soft controls are an important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, touch screens, and so on, operators can select a specific screen, then choose the controller, and finally manipulate the devices. However, because of the different interfaces between soft control and hardwired conventional type control, different basic human error probabilities (BHEPs) should be considered in the Human Reliability Analysis (HRA) for advanced MCRs. Although there are many HRA methods to assess human reliabilities, such as Technique for Human Error Rate Prediction (THERP), Accident Sequence Evaluation Program (ASEP), Human Error Assessment and Reduction Technique (HEART), Human Event Repository and Analysis (HERA), Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR), Cognitive Reliability and Error Analysis Method (CREAM), and so on, these methods have been applied to conventional MCRs, and they do not consider the new features of advance MCRs such as soft controls. As a result, there is an insufficient database for assessing human reliabilities in advanced

  14. Error analysis for mesospheric temperature profiling by absorptive occultation sensors

    Directory of Open Access Journals (Sweden)

    M. J. Rieder

    An error analysis for mesospheric profiles retrieved from absorptive occultation data has been performed, starting with realistic error assumptions as would apply to intensity data collected by available high-precision UV photodiode sensors. Propagation of statistical errors was investigated through the complete retrieval chain, from measured intensity profiles to atmospheric density, pressure, and temperature profiles. We assumed unbiased errors, as the occultation method is essentially self-calibrating, and straight-line propagation of occulted signals, as we focus on heights of 50–100 km, where refractive bending of the sensed radiation is negligible. Throughout the analysis the errors were characterized at each retrieval step by their mean profile, their covariance matrix, and their probability density function (pdf). This furnishes, compared to a variance-only estimation, a much improved insight into the error propagation mechanism. We applied the procedure to a baseline analysis of the performance of a recently proposed solar UV occultation sensor (SMAS – Sun Monitor and Atmospheric Sounder) and provide, using a reasonable exponential atmospheric model as background, results on error standard deviations and error correlation functions of density, pressure, and temperature profiles. Two different sensor photodiode assumptions are discussed: diamond diodes (DD) with 0.03% and silicon diodes (SD) with 0.1% (unattenuated) intensity measurement noise at a 10 Hz sampling rate. A factor-of-2 margin was applied to these noise values in order to roughly account for unmodeled cross-section uncertainties. Within the entire height domain (50–100 km) we find temperature to be retrieved to better than 0.3 K (DD) / 1 K (SD) accuracy, respectively, at 2 km height resolution. The results indicate that absorptive occultations acquired by a SMAS-type sensor could provide mesospheric profiles of fundamental variables such as temperature with

  15. Error analysis for mesospheric temperature profiling by absorptive occultation sensors

    Directory of Open Access Journals (Sweden)

    M. J. Rieder

    2001-01-01

    An error analysis for mesospheric profiles retrieved from absorptive occultation data has been performed, starting with realistic error assumptions as would apply to intensity data collected by available high-precision UV photodiode sensors. Propagation of statistical errors was investigated through the complete retrieval chain, from measured intensity profiles to atmospheric density, pressure, and temperature profiles. We assumed unbiased errors, as the occultation method is essentially self-calibrating, and straight-line propagation of occulted signals, as we focus on heights of 50–100 km, where refractive bending of the sensed radiation is negligible. Throughout the analysis the errors were characterized at each retrieval step by their mean profile, their covariance matrix, and their probability density function (pdf). This furnishes, compared to a variance-only estimation, a much improved insight into the error propagation mechanism. We applied the procedure to a baseline analysis of the performance of a recently proposed solar UV occultation sensor (SMAS – Sun Monitor and Atmospheric Sounder) and provide, using a reasonable exponential atmospheric model as background, results on error standard deviations and error correlation functions of density, pressure, and temperature profiles. Two different sensor photodiode assumptions are discussed: diamond diodes (DD) with 0.03% and silicon diodes (SD) with 0.1% (unattenuated) intensity measurement noise at a 10 Hz sampling rate. A factor-of-2 margin was applied to these noise values in order to roughly account for unmodeled cross-section uncertainties. Within the entire height domain (50–100 km) we find temperature to be retrieved to better than 0.3 K (DD) / 1 K (SD) accuracy, respectively, at 2 km height resolution. The results indicate that absorptive occultations acquired by a SMAS-type sensor could provide mesospheric profiles of fundamental variables such as temperature with
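    The retrieval-chain error propagation described above can be illustrated with a Monte Carlo sketch over a toy Beer-Lambert inversion; the cross section, path length, and noise level below are invented stand-ins, not the SMAS instrument parameters:

    ```python
    import math
    import random

    random.seed(1)

    # Toy retrieval chain: measured transmission -> absorber number density.
    true_transmission = 0.7
    cross_section = 2.0e-21       # cm^2, assumed value
    path_length = 1.0e6           # cm, assumed value
    noise_sd = 0.001              # 0.1% intensity noise (silicon-diode case)

    def retrieve_density(transmission):
        # Beer-Lambert inversion: n = -ln(T) / (sigma * L)
        return -math.log(transmission) / (cross_section * path_length)

    # Monte Carlo propagation of measurement noise through the retrieval step.
    samples = [retrieve_density(random.gauss(true_transmission, noise_sd))
               for _ in range(20000)]
    mc_mean = sum(samples) / len(samples)
    mc_sd = (sum((s - mc_mean) ** 2 for s in samples) / len(samples)) ** 0.5
    print(f"density = {mc_mean:.3e} +/- {mc_sd:.3e} cm^-3")
    ```

    Repeating this at each retrieval step, and tracking covariances rather than single variances, is the essence of the characterization described in the abstract.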

  16. Sleep quality, posttraumatic stress, depression, and human errors in train drivers: a population-based nationwide study in South Korea.

    Science.gov (United States)

    Jeon, Hong Jin; Kim, Ji-Hae; Kim, Bin-Na; Park, Seung Jin; Fava, Maurizio; Mischoulon, David; Kang, Eun-Ho; Roh, Sungwon; Lee, Dongsoo

    2014-12-01

    Human error is defined as an unintended error that is attributable to humans rather than machines and that must be avoided to prevent accidents. We aimed to investigate the association between sleep quality and human errors among train drivers. Cross-sectional. Population-based. A sample of 5,480 subjects who were actively working as train drivers were recruited in South Korea. The participants were 4,634 drivers who completed all questionnaires (response rate 84.6%). None. The Pittsburgh Sleep Quality Index (PSQI), the Center for Epidemiologic Studies Depression Scale (CES-D), the Impact of Event Scale-Revised (IES-R), the State-Trait Anxiety Inventory (STAI), and the Korean Occupational Stress Scale (KOSS). Of 4,634 train drivers, 349 (7.5%) showed more than one human error per 5 y. Human errors were associated with poor sleep quality, higher PSQI total scores, short sleep duration at night, and longer sleep latency. Among train drivers with poor sleep quality, those who experienced severe posttraumatic stress showed a significantly higher number of human errors than those without. Multiple logistic regression analysis showed that human errors were significantly associated with poor sleep quality and posttraumatic stress, whereas there were no significant associations with depression, trait and state anxiety, and work stress after adjusting for age, sex, education years, marital status, and career duration. Poor sleep quality was found to be associated with more human errors in train drivers, especially in those who experienced severe posttraumatic stress. © 2014 Associated Professional Sleep Societies, LLC.
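    With a single binary predictor, a logistic-regression coefficient reduces to the log odds ratio of the 2x2 table, which gives a compact way to illustrate the kind of association tested above; the data below are simulated, not the Korean cohort, and the actual study fitted a multiple regression adjusting for covariates:

    ```python
    import math
    import random

    random.seed(3)

    # Simulated drivers: a poor-sleep indicator and whether an error occurred,
    # with higher error probability under poor sleep (invented rates).
    N = 5000
    poor_sleep = [random.random() < 0.3 for _ in range(N)]
    error = [random.random() < (0.15 if s else 0.05) for s in poor_sleep]

    # 2x2 table counts.
    a = sum(1 for s, y in zip(poor_sleep, error) if s and y)        # exposed, error
    b = sum(1 for s, y in zip(poor_sleep, error) if s and not y)
    c = sum(1 for s, y in zip(poor_sleep, error) if not s and y)    # unexposed, error
    d = sum(1 for s, y in zip(poor_sleep, error) if not s and not y)

    # Log odds ratio = univariable logistic-regression coefficient.
    log_or = math.log((a * d) / (b * c))
    print("odds ratio for poor sleep:", math.exp(log_or))
    ```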

  17. Sensitivity of risk parameters to human errors in reactor safety study for a PWR

    International Nuclear Information System (INIS)

    Samanta, P.K.; Hall, R.E.; Swoboda, A.L.

    1981-01-01

    Sensitivities of the risk parameters (emergency safety system unavailabilities, accident sequence probabilities, release category probabilities, and core melt probability) to changes in the human error rates were investigated within the general methodological framework of the Reactor Safety Study (RSS) for a Pressurized Water Reactor (PWR). The impact of individual human errors was assessed both in terms of their structural importance to core melt and their reliability importance on core melt probability. The Human Error Sensitivity Assessment of a PWR (HESAP) computer code was written for the purpose of this study. The code employed a point-estimate approach and ignored the smoothing technique applied in the RSS. It computed the point estimates for the system unavailabilities from the median values of the component failure rates and proceeded in terms of point values to obtain the point estimates for the accident sequence probabilities, core melt probability, and release category probabilities. The sensitivity measure used was the ratio of the top event probability before and after the perturbation of the constituent events. Core melt probability per reactor year showed a significant increase with increases in the human error rates but did not show a similar decrease with decreases in the human error rates, owing to the dominance of the hardware failures. When the Minimum Human Error Rate (M.H.E.R.) used is increased to 10^-3, the base case results begin to show sensitivity to human errors. This effort now allows the evaluation of new error rate data along with proposed changes in the man-machine interface.
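    The sensitivity measure described, the ratio of the top-event probability before and after perturbing the human error rates, can be illustrated on a toy point-estimate fault tree; the structure and numbers below are invented, and the hardware-only term shows why decreases in HEPs can have a bounded effect:

    ```python
    def core_melt_prob(hep, hardware=1.0e-3, independent=5.0e-6):
        """Toy point-estimate fault tree: core melt if a human error coincides
        with hardware failure A, or independent hardware failure B occurs
        (rare-event addition). Illustrative numbers only."""
        return hep * hardware + independent

    base = core_melt_prob(hep=0.01)
    up = core_melt_prob(hep=0.1)       # human error rates x 10
    down = core_melt_prob(hep=0.001)   # human error rates / 10

    # Sensitivity = ratio of the top-event probability after/before perturbation.
    # Raising HEPs inflates the ratio strongly; lowering them is floored by the
    # hardware-only term, mirroring the asymmetry reported in the abstract.
    print(up / base, down / base)
    ```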

  18. Error Analysis Of Clock Time (T), Declination (δ) And Latitude ...

    African Journals Online (AJOL)

    ), latitude (Φ), longitude (λ) and azimuth (A), which are aimed at establishing fixed positions and orientations of survey points and lines on the earth's surface. The paper attempts an analysis of the individual and combined effects of error in time ...

  19. Analysis of possible systematic errors in the Oslo method

    International Nuclear Information System (INIS)

    Larsen, A. C.; Guttormsen, M.; Buerger, A.; Goergen, A.; Nyhus, H. T.; Rekstad, J.; Siem, S.; Toft, H. K.; Tveten, G. M.; Wikan, K.; Krticka, M.; Betak, E.; Schiller, A.; Voinov, A. V.

    2011-01-01

    In this work, we have reviewed the Oslo method, which enables the simultaneous extraction of the level density and γ-ray transmission coefficient from a set of particle-γ coincidence data. Possible errors and uncertainties have been investigated. Typical data sets from various mass regions as well as simulated data have been tested against the assumptions behind the data analysis.

  20. Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.jp; Nakamura, Mitsuhiro; Mizowaki, Takashi; Hiraoka, Masahiro [Department of Radiation Oncology and Image-applied Therapy, Kyoto University, 54 Shogoin-Kawaharacho, Sakyo, Kyoto 606-8507 (Japan)

    2016-09-15

    Purpose: The purpose of this technical note is to introduce variance component analysis to the estimation of systematic and random components in setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications to setup error analysis in radiotherapy.
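
A minimal sketch of the ANOVA-based estimator for balanced setup-error data under a one-factor random effects model (patients as the random factor, fractions as replicates). The data values below are invented for illustration.

```python
# Hedged sketch: estimate systematic (interpatient) and random components
# of setup error via one-way ANOVA mean squares. Numbers are made up.
import statistics

# setup[i][j] = setup error (mm) of patient i at fraction j, balanced design.
setup = [
    [1.2, 0.8, 1.0, 1.4],
    [-0.5, -0.1, -0.3, -0.7],
    [0.2, 0.6, 0.4, 0.0],
]
k = len(setup)      # number of patients
n = len(setup[0])   # fractions per patient

patient_means = [statistics.fmean(row) for row in setup]
grand_mean = statistics.fmean(patient_means)

# Mean squares of the one-way ANOVA.
ms_between = n * sum((m - grand_mean) ** 2 for m in patient_means) / (k - 1)
ms_within = sum(
    (x - m) ** 2 for row, m in zip(setup, patient_means) for x in row
) / (k * (n - 1))

# Variance components: random error and systematic (between-patient) error.
var_random = ms_within
var_systematic = max((ms_between - ms_within) / n, 0.0)
```

Note how the method-of-moments estimator subtracts `ms_within` before dividing; using the spread of patient means directly would overestimate the systematic component, which is the overestimation the note attributes to the conventional method.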

  1. Technical Note: Introduction of variance component analysis to setup error analysis in radiotherapy

    International Nuclear Information System (INIS)

    Matsuo, Yukinori; Nakamura, Mitsuhiro; Mizowaki, Takashi; Hiraoka, Masahiro

    2016-01-01

    Purpose: The purpose of this technical note is to introduce variance component analysis to the estimation of systematic and random components in setup error of radiotherapy. Methods: Balanced data according to the one-factor random effect model were assumed. Results: Analysis-of-variance (ANOVA)-based computation was applied to estimate the values and their confidence intervals (CIs) for systematic and random errors and the population mean of setup errors. The conventional method overestimates systematic error, especially in hypofractionated settings. The CI for systematic error becomes much wider than that for random error. The ANOVA-based estimation can be extended to a multifactor model considering multiple causes of setup errors (e.g., interpatient, interfraction, and intrafraction). Conclusions: Variance component analysis may lead to novel applications to setup error analysis in radiotherapy.

  2. Detecting errors in micro and trace analysis by using statistics

    DEFF Research Database (Denmark)

    Heydorn, K.

    1993-01-01

    By assigning a standard deviation to each step in an analytical method it is possible to predict the standard deviation of each analytical result obtained by this method. If the actual variability of replicate analytical results agrees with the expected, the analytical method is said to be in statistical control. Significant deviations between analytical results from different laboratories reveal the presence of systematic errors, and agreement between different laboratories indicates the absence of systematic errors. This statistical approach, referred to as the analysis of precision, was applied...
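
The "analysis of precision" idea sketched above can be illustrated as follows: per-step standard deviations are combined in quadrature into a predicted standard deviation, and replicate results are tested against it. The step values and replicate data here are hypothetical.

```python
# Sketch: compare observed replicate variability with the a priori
# predicted standard deviation of the analytical method. All numbers
# are illustrative assumptions.
import math

# Standard deviations assigned to each step of the method
# (e.g. sampling, dissolution, counting), combined in quadrature.
step_sds = [0.02, 0.015, 0.01]
sd_predicted = math.sqrt(sum(s * s for s in step_sds))

replicates = [1.01, 0.98, 1.03, 0.99, 1.02]
n = len(replicates)
mean = sum(replicates) / n
s2_observed = sum((x - mean) ** 2 for x in replicates) / (n - 1)

# Chi-square-type statistic with n-1 degrees of freedom: values within
# the chi2 acceptance region indicate the method is in statistical control.
t_stat = (n - 1) * s2_observed / sd_predicted ** 2
```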

  3. Effective use of pre-job briefing as tool for the prevention of human error

    International Nuclear Information System (INIS)

    Schlump, Ansgar

    2015-01-01

    There is a fundamental demand to minimise the risks for workers and facilities while executing maintenance work. To ensure that facilities are secure and reliable, any deviation from normal operating behaviour has to be avoided. Accurate planning is the basis for minimising mistakes and making work more secure. All workers involved should understand how the work should be done and what is expected, in order to avoid human errors. Especially in nuclear power plants, human performance tools (HPT) have proved to be an effective instrument for minimising human errors. These human performance tools consist of numerous different tools that complement each other (e.g. pre-job briefing). The safety culture of the plants is also characterised by these tools. Choosing the right HP tool is often a difficult task for the work planner: on the one hand, he wants to avoid mistakes during the execution of work, but on the other hand he does not want to irritate the workers with unnecessary requirements. The proposed concept uses a simple risk analysis that takes into account the complexity of the task, past experience, and the consequences of failure. One main result of this risk analysis is a recommendation on the level of detail of the pre-job briefing, to reduce the risks for the involved staff to a minimum.

  4. Determining Bounds on Assumption Errors in Operational Analysis

    Directory of Open Access Journals (Sweden)

    Neal M. Bengtson

    2014-01-01

    The technique of operational analysis (OA) is used in the study of systems performance, mainly for estimating mean values of various measures of interest, such as the number of jobs at a device and response times. The basic principles of operational analysis allow errors in assumptions to be quantified over a time period. The assumptions used to derive the operational analysis relationships are studied. Using Karush-Kuhn-Tucker (KKT) conditions, bounds on error measures of these OA relationships are found. Examples of these bounds are given for representative performance measures to show limits on the difference between true performance values and those estimated by operational analysis relationships. A technique for finding tolerance limits on the bounds is demonstrated with a simulation example.
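
For readers unfamiliar with operational analysis, the basic relationships it builds on can be computed directly from measured quantities, as in this sketch (all measured values are hypothetical):

```python
# Minimal sketch of basic operational-analysis relationships:
# the utilization law and Little's law, from hypothetical measurements.

T = 60.0           # observation period (s)
completions = 120  # jobs completed at the device (C)
busy_time = 45.0   # time the device was busy (B, s)
area = 180.0       # accumulated job-seconds in system, integral of N(t) (W)

throughput = completions / T            # X = C / T
utilization = busy_time / T             # U = B / T
service_time = busy_time / completions  # S = B / C
mean_jobs = area / T                    # N = W / T
response_time = area / completions      # R = W / C; Little's law: N = X * R
```

These laws hold exactly for the measured period; it is the *assumptions* (e.g. flow balance, homogeneity) used to derive predictive relationships from them whose errors the paper bounds.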

  5. Development of an Experimental Measurement System for Human Error Characteristics and a Pilot Test

    International Nuclear Information System (INIS)

    Jang, Tong-Il; Lee, Hyun-Chul; Moon, Kwangsu

    2017-01-01

    Selected items from individual and team characteristics were measured and evaluated in a pilot test using the experimental measurement system for human error characteristics. This is one of the processes used to produce input data for the Eco-DBMS. Through the pilot test, methods to measure and acquire physiological data were also tried, and a data format and quantification methods for the database were developed. In this study, a pilot test measuring the stress and tension level, as well as team cognitive characteristics, was performed using the human error characteristics measurement and experimental evaluation system. In an experiment measuring the stress level, physiological characteristics were measured using EEG in a simulated unexpected situation. Although this was a pilot experiment, it validated that relevant results can be obtained for evaluating the human-error-coping effects of workers' FFD management guidelines and of responses to unexpected situations against guidelines. In following research, additional experiments covering other human error characteristics will be conducted. Furthermore, the human error characteristics measurement and experimental evaluation system will be utilized to validate various human error coping solutions, such as human factors criteria, designs, and guidelines, as well as to supplement the human error characteristics database.

  6. An Approach to Human Error Hazard Detection of Unexpected Situations in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sangjun; Oh, Yeonju; Shin, Youmin; Lee, Yong-Hee [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The Fukushima accident was a typically complex event involving extreme situations induced by the succeeding earthquake, tsunami, explosions, and human errors. From a human engineering point of view, its causes include deficiencies in system build-up, procedures and response manuals, education and training, team capability, and operator discharge. In particular, the guidelines of currently operating NPPs do not sufficiently include countermeasures against human errors in extreme situations. Therefore, this paper describes a trial to detect the hazards of human errors in extreme situations, and to define countermeasures that can properly respond to those hazards when an individual, team, organization, or working entity encounters an extreme situation in an NPP. We propose an approach to analyzing and extracting human error hazards so that additional countermeasures to human errors in unexpected situations can be suggested. These might be utilized to develop contingency guidelines, especially for reducing human error accidents in NPPs. The trial application in this study is currently limited, since it is not easy to find accident cases described in enough detail to enumerate the proposed steps. Therefore, we will try to analyze as many cases as possible, and consider other environmental factors and human error conditions.

  7. An Approach to Human Error Hazard Detection of Unexpected Situations in NPPs

    International Nuclear Information System (INIS)

    Park, Sangjun; Oh, Yeonju; Shin, Youmin; Lee, Yong-Hee

    2015-01-01

    The Fukushima accident was a typically complex event involving extreme situations induced by the succeeding earthquake, tsunami, explosions, and human errors. From a human engineering point of view, its causes include deficiencies in system build-up, procedures and response manuals, education and training, team capability, and operator discharge. In particular, the guidelines of currently operating NPPs do not sufficiently include countermeasures against human errors in extreme situations. Therefore, this paper describes a trial to detect the hazards of human errors in extreme situations, and to define countermeasures that can properly respond to those hazards when an individual, team, organization, or working entity encounters an extreme situation in an NPP. We propose an approach to analyzing and extracting human error hazards so that additional countermeasures to human errors in unexpected situations can be suggested. These might be utilized to develop contingency guidelines, especially for reducing human error accidents in NPPs. The trial application in this study is currently limited, since it is not easy to find accident cases described in enough detail to enumerate the proposed steps. Therefore, we will try to analyze as many cases as possible, and consider other environmental factors and human error conditions.

  8. BAYES-HEP: Bayesian belief networks for estimation of human error probability

    International Nuclear Information System (INIS)

    Karthick, M.; Senthil Kumar, C.; Paul, Robert T.

    2017-01-01

    Human errors contribute a significant portion of risk in safety-critical applications, and methods for estimating human error probability have been a topic of research for over a decade. The scarce data available on human errors and the large uncertainty involved in predicting human error probabilities make the task difficult. This paper presents a Bayesian belief network (BBN) model for human error probability estimation in safety-critical functions of a nuclear power plant. The developed BBN model would help estimate HEP with limited human intervention. A step-by-step illustration of the application of the method and its evaluation is provided with a relevant case study, and the model is expected to provide useful insights into risk assessment studies.
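
The core mechanics of such a model can be illustrated with a tiny network: influencing factors as parent nodes, an error node with a conditional probability table, and marginalization over the parents to obtain an unconditional HEP. The node names and every number below are invented, not the CPTs of the cited model.

```python
# Hedged, minimal illustration of estimating a human error probability
# (HEP) from a two-parent Bayesian network. All values are hypothetical.

p_stress_high = 0.3
p_training_poor = 0.2

# Conditional probability of error given (high stress, poor training).
p_error = {
    (True, True): 0.05,
    (True, False): 0.01,
    (False, True): 0.02,
    (False, False): 0.001,
}

# Marginalize over the parent nodes to get the unconditional HEP.
hep = 0.0
for stress in (True, False):
    for poor_training in (True, False):
        p_parents = (p_stress_high if stress else 1 - p_stress_high) * (
            p_training_poor if poor_training else 1 - p_training_poor
        )
        hep += p_parents * p_error[(stress, poor_training)]
```

The appeal of the BBN framing is that sparse evidence updates the parent distributions rather than requiring direct observation of rare error events.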

  9. A system engineer's Perspective on Human Errors For a more Effective Management of Human Factors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Yong-Hee; Jang, Tong-Il; Lee, Soo-Kil

    2007-01-01

    The management of human factors in nuclear power plants (NPPs) has become a burdensome task during the operating period that follows design and construction. Almost every study on major accidents emphasizes the prominent importance of human errors. Regardless of regulatory requirements such as the Periodic Safety Review, the management of human factors is a main issue in reducing human errors and enhancing plant performance. However, it is not easy to find a more effective perspective on human errors from which to establish an engineering implementation plan for preventing them. This paper describes a system engineer's perspective on human errors and discusses its application to a recent study of human error events in Korean NPPs.

  10. Predicting positional error of MLC using volumetric analysis

    International Nuclear Information System (INIS)

    Hareram, E.S.

    2008-01-01

    IMRT normally uses multiple beamlets (small beam widths) within a field, so it is imperative to maintain the positional accuracy of the MLC in order to deliver the computed integrated dose accurately. Manufacturers have reported high precision for MLC devices, with leaf positional accuracy nearing 0.1 mm, but measuring and rectifying errors at this level of accuracy is very difficult. Various methods are used to check MLC position, and volumetric analysis is one such technique. A volumetric approach was adopted in our method using a Primus machine and a 0.6 cc chamber at 5 cm depth in Perspex. An MLC positional error of 1 mm introduced a dose error of 20%, making this method more sensitive than others.
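
A back-of-the-envelope sketch shows why a small leaf-position error matters for narrow beamlets: to first order, the fractional fluence (and hence dose) error scales as the positional error divided by the beamlet width. The beamlet width below is an illustrative assumption, not a measurement from the cited study.

```python
# Rough first-order estimate: fractional fluence error of a narrow
# beamlet due to a leaf-position error. Numbers are illustrative.

beamlet_width_mm = 5.0   # assumed beamlet width
leaf_error_mm = 1.0      # MLC leaf positional error

fractional_error = leaf_error_mm / beamlet_width_mm  # 0.2, i.e. 20%
```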

  11. Students’ Written Production Error Analysis in the EFL Classroom Teaching: A Study of Adult English Learners Errors

    Directory of Open Access Journals (Sweden)

    Ranauli Sihombing

    2016-12-01

    Error analysis has become one of the most interesting issues in the study of Second Language Acquisition. It cannot be denied that some teachers do not know much about error analysis and the related theories of how an L1, L2 or foreign language is acquired. In addition, students often feel upset when they find a gap between themselves and their teachers regarding the errors the students make and the teachers' understanding of error correction. The present research aims to investigate what errors adult English learners make in written production of English. The significance of the study is to identify the errors students make in writing so that teachers can find solutions to them for better English language teaching and learning, especially in teaching English to adults. The study employed a qualitative method and was undertaken at an airline education center in Bandung. The results showed that syntax errors are more frequent than morphology errors, especially verb phrase errors. It is important for teachers to know the theory of second language acquisition in order to understand how students learn and produce their language. In addition, it is advantageous for teachers to know what errors students frequently make in their learning, so that they can offer solutions for better English language learning achievement. DOI: https://doi.org/10.24071/llt.2015.180205

  12. Critical slowing down and error analysis in lattice QCD simulations

    International Nuclear Information System (INIS)

    Virotta, Francesco

    2012-01-01

    In this work we investigate the critical slowing down of lattice QCD simulations. We perform a preliminary study in the quenched approximation where we find that our estimate of the exponential auto-correlation time scales as τ_exp(a) ∝ a^(-5), where a is the lattice spacing. In unquenched simulations with O(a) improved Wilson fermions we do not obtain a scaling law but find results compatible with the behavior that we find in the pure gauge theory. The discussion is supported by a large set of ensembles both in pure gauge and in the theory with two degenerate sea quarks. We have moreover investigated the effect of slow algorithmic modes in the error analysis of the expectation value of typical lattice QCD observables (hadronic matrix elements and masses). In the context of simulations affected by slow modes we propose and test a method to obtain reliable estimates of statistical errors. The method is supposed to help in the typical algorithmic setup of lattice QCD, namely when the total statistics collected is of O(10)·τ_exp. This is the typical case when simulating close to the continuum limit, where the computational costs for producing two independent data points can be extremely large. We finally discuss the scale setting in N_f = 2 simulations using the kaon decay constant f_K as physical input. The method is explained together with a thorough discussion of the error analysis employed. A description of the publicly available code used for the error analysis is included.
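
The error-analysis issue described above can be sketched with a crude integrated-autocorrelation-time estimate on a synthetic correlated series; this is in the spirit of, but far simpler than, the Gamma-method-style analysis the thesis uses. The AR(1) series and the window rule are illustrative assumptions.

```python
# Sketch: estimate the integrated autocorrelation time of a Monte Carlo
# time series with a simple self-consistent summation window, and use it
# to correct the naive statistical error. Entirely synthetic data.
import random

random.seed(1)

# Hypothetical correlated series: AR(1) with lag-1 autocorrelation ~0.9.
x, series = 0.0, []
for _ in range(20000):
    x = 0.9 * x + random.gauss(0.0, 1.0)
    series.append(x)

n = len(series)
mean = sum(series) / n

def autocov(t):
    # Sample autocovariance at lag t.
    return sum(
        (series[i] - mean) * (series[i + t] - mean) for i in range(n - t)
    ) / (n - t)

c0 = autocov(0)
tau_int, w = 0.5, 0
while w < n // 2:
    w += 1
    tau_int += autocov(w) / c0
    if w >= 5.0 * tau_int:  # simple self-consistent window cut-off
        break

# The naive error underestimates the true one by a factor sqrt(2*tau_int).
naive_err = (c0 / n) ** 0.5
corrected_err = naive_err * (2.0 * tau_int) ** 0.5
```

When the run length is only O(10)·τ_exp, even this correction is unreliable because the tail of the autocovariance (the slow modes) is poorly sampled, which is the regime the proposed method addresses.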

  13. Disasters of endoscopic surgery and how to avoid them: error analysis.

    Science.gov (United States)

    Troidl, H

    1999-08-01

    For every innovation there are two sides to consider. For endoscopic surgery, the positive side is more comfort for the patient; the negative side is new complications, even disasters, such as injuries to organs (e.g., the bowel), vessels, and the common bile duct. These disasters are rare and seldom reported in the scientific world, at conferences, at symposiums, and in publications. Today there are many methods for testing an innovation (controlled clinical trials, consensus conferences, audits, and confidential inquiries). Reporting "complications," however, does not help to avoid them. We need real methods for avoiding negative outcomes. Failure analysis is the method of choice in industry: if an airplane crashes, error analysis starts immediately. Humans make errors, and making errors means punishment. Failure analysis means rigorously and objectively investigating a clinical situation to find clinically relevant information for avoiding these negative events in the future. Error analysis has four important steps: (1) What was the clinical situation? (2) What happened? (3) Most important: Why did it happen? (4) How do we avoid the negative event or disaster in the future? Error analysis has decisive advantages: it is easy to perform; it supplies clinically relevant information that helps avoid such events; it requires no money; it can be done everywhere; and the information is available in a short time. The other side of the coin is that error analysis is of course retrospective, it may not be objective, and, most important, it will probably have legal consequences. To be more effective in medicine and surgery we must handle our errors using a different approach. According to Sir Karl Popper: "The situation is that we have to learn from our errors. To cover up failure is therefore the biggest intellectual sin."

  14. Development of safety analysis and constraint detection techniques for process interaction errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Tsai, Shang-Lin; Tseng, Wan-Hui

    2011-01-01

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, and it may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation by a logic process; we call them 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions, and they can also be used to assist in safety-case construction.
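
The pre-/post-condition check at the heart of the static analysis can be sketched in a few lines: represent each process by condition sets and flag pairs whose conditions assign conflicting values. The condition vocabulary and process names below are invented for illustration and are not the paper's formalism.

```python
# Hedged sketch: flag candidate semantic interaction errors by finding a
# post-condition of one process that conflicts with a pre-condition of
# another. Processes and conditions are hypothetical.

processes = {
    "software_logic": {"pre": {"valve=closed"}, "post": {"pump=on"}},
    "operator_action": {"pre": {"pump=off"}, "post": {"valve=open"}},
}

def conflicts(cond_a, cond_b):
    # Two condition sets conflict on a variable they assign differently.
    assign_a = dict(c.split("=") for c in cond_a)
    assign_b = dict(c.split("=") for c in cond_b)
    return [
        var for var in assign_a
        if var in assign_b and assign_a[var] != assign_b[var]
    ]

# Check every ordered process pair.
findings = []
for name_a, a in processes.items():
    for name_b, b in processes.items():
        if name_a != name_b:
            for var in conflicts(a["post"], b["pre"]):
                findings.append((name_a, name_b, var))
```

Each finding is only a *candidate* for the fault tree template; whether it is a credible semantic interaction error still requires domain review.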

  15. Development of safety analysis and constraint detection techniques for process interaction errors

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Chin-Feng, E-mail: csfanc@saturn.yzu.edu.tw [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China); Tsai, Shang-Lin; Tseng, Wan-Hui [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China)

    2011-02-15

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, and it may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation by a logic process; we call them 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions, and they can also be used to assist in safety-case construction.

  16. Human reliability analysis in Loviisa probabilistic safety analysis

    International Nuclear Information System (INIS)

    Illman, L.; Isaksson, J.; Makkonen, L.; Vaurio, J.K.; Vuorio, U.

    1986-01-01

    The human reliability analysis in the Loviisa PSA project is carried out for three major groups of errors in human actions: (A) errors made before an initiating event, (B) errors that initiate a transient and (C) errors made during transients. Recovery possibilities are also included in each group. The methods used or planned for each group are described. A simplified THERP approach is used for group A, with emphasis on test and maintenance error recovery aspects and dependencies between redundancies. For group B, task analyses and human factors assessments are made for startup, shutdown and operational transients, with emphasis on potential common cause initiators. For group C, both misdiagnosis and slow decision making are analyzed, as well as errors made in carrying out necessary or backup actions. New or advanced features of the methodology are described.

  17. Post-event human decision errors: operator action tree/time reliability correlation

    Energy Technology Data Exchange (ETDEWEB)

    Hall, R E; Fragola, J; Wreathall, J

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations.
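
A time reliability correlation of the kind described can be sketched by modeling the operator's response time as a lognormal random variable, so that the probability of non-response falls as available time grows. The median and log-standard-deviation below are invented parameters, not values from the report.

```python
# Illustrative sketch of a time-reliability correlation: probability of
# operator non-response given the time available. Parameters hypothetical.
import math

def p_nonresponse(t_available_min, median_min=5.0, sigma=1.0):
    # P(response time > t) for a lognormal with given median and log-SD.
    z = (math.log(t_available_min) - math.log(median_min)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# More available time -> lower probability of a decision error.
hep_10min = p_nonresponse(10.0)
hep_60min = p_nonresponse(60.0)
```

This captures the report's key simplification: available time, not the detail of the cognitive process, dominates the estimated error probability.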

  18. Post-event human decision errors: operator action tree/time reliability correlation

    International Nuclear Information System (INIS)

    Hall, R.E.; Fragola, J.; Wreathall, J.

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations.

  19. Human dorsal striatum encodes prediction errors during observational learning of instrumental actions.

    Science.gov (United States)

    Cooper, Jeffrey C; Dunne, Simon; Furey, Teresa; O'Doherty, John P

    2012-01-01

    The dorsal striatum plays a key role in the learning and expression of instrumental reward associations that are acquired through direct experience. However, not all learning about instrumental actions requires direct experience. Instead, humans and other animals are also capable of acquiring instrumental actions by observing the experiences of others. In this study, we investigated the extent to which human dorsal striatum is involved in observational as well as experiential instrumental reward learning. Human participants were scanned with fMRI while they observed a confederate over a live video performing an instrumental conditioning task to obtain liquid juice rewards. Participants also performed a similar instrumental task for their own rewards. Using a computational model-based analysis, we found reward prediction errors in the dorsal striatum not only during the experiential learning condition but also during observational learning. These results suggest a key role for the dorsal striatum in learning instrumental associations, even when those associations are acquired purely by observing others.
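
The prediction-error regressor in such model-based analyses is typically generated by a simple reinforcement-learning update of this form; the reward sequence and learning rate below are invented for illustration, not taken from the study.

```python
# Minimal Rescorla-Wagner-style sketch of reward prediction errors,
# the trial-by-trial quantity regressed against BOLD signal in
# model-based fMRI analyses. All values are hypothetical.

alpha = 0.2            # learning rate (assumed)
value = 0.0            # learned value of the observed action
rewards = [1, 1, 0, 1, 0, 1, 1, 1]

prediction_errors = []
for r in rewards:
    delta = r - value          # prediction error on this trial
    prediction_errors.append(delta)
    value += alpha * delta     # value update

# Prediction errors shrink in magnitude as the value estimate converges
# on the underlying reward rate; the same update applies whether the
# outcome is experienced directly or observed.
```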

  20. Incident reporting: Its role in aviation safety and the acquisition of human error data

    Science.gov (United States)

    Reynard, W. D.

    1983-01-01

    The rationale for aviation incident reporting systems is presented and contrasted with some of the shortcomings of accident investigation procedures. The history of the United States' Aviation Safety Reporting System (ASRS) is outlined and the program's character explained. The planning elements that resulted in the ASRS program's voluntary, confidential, and non-punitive design are discussed. Immunity from enforcement action and from misuse of the volunteered data is explained and evaluated. Report generation techniques and the ASRS data analysis process are described; in addition, examples of the ASRS program's output and accomplishments are detailed. Finally, the value of incident reporting for the acquisition of safety information, particularly human error data, is explored.

  1. Accommodating error analysis in comparison and clustering of molecular fingerprints.

    OpenAIRE

    Salamon, H.; Segal, M. R.; Ponce de Leon, A.; Small, P. M.

    1998-01-01

    Molecular epidemiologic studies of infectious diseases rely on pathogen genotype comparisons, which usually yield patterns comprising sets of DNA fragments (DNA fingerprints). We use a highly developed genotyping system, IS6110-based restriction fragment length polymorphism analysis of Mycobacterium tuberculosis, to develop a computational method that automates comparison of large numbers of fingerprints. Because error in fragment length measurements is proportional to fragment length and is ...

  2. Radiological error: analysis, standard setting, targeted instruction and teamworking

    International Nuclear Information System (INIS)

    FitzGerald, Richard

    2005-01-01

    Diagnostic radiology does not have objective benchmarks for acceptable levels of missed diagnoses [1]. Until now, data collection of radiological discrepancies has been very time consuming. The culture within the specialty did not encourage it. However, public concern about patient safety is increasing. There have been recent innovations in compiling radiological interpretive discrepancy rates which may facilitate radiological standard setting. However standard setting alone will not optimise radiologists' performance or patient safety. We must use these new techniques in radiological discrepancy detection to stimulate greater knowledge sharing, targeted instruction and teamworking among radiologists. Not all radiological discrepancies are errors. Radiological discrepancy programmes must not be abused as an instrument for discrediting individual radiologists. Discrepancy rates must not be distorted as a weapon in turf battles. Radiological errors may be due to many causes and are often multifactorial. A systems approach to radiological error is required. Meaningful analysis of radiological discrepancies and errors is challenging. Valid standard setting will take time. Meanwhile, we need to develop top-up training, mentoring and rehabilitation programmes. (orig.)

  3. Automation of Commanding at NASA: Reducing Human Error in Space Flight

    Science.gov (United States)

    Dorn, Sarah J.

    2010-01-01

    Automation has been implemented in many different industries to improve efficiency and reduce human error. Reducing or eliminating human interaction in tasks has been proven to increase productivity in manufacturing and to lessen the risk of human mistakes in the airline industry. Human space flight requires flight controllers to monitor multiple systems and react quickly when failures occur, so NASA is interested in implementing techniques that can assist in these tasks. Using automation to handle some of these responsibilities could reduce the number of errors flight controllers encounter due to standard human error characteristics. This paper investigates the possibility of reducing human error in the critical area of manned space flight at NASA.

  4. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    OpenAIRE

    Rabiul Islam; Faisal Khan; Rouzbeh Abbassi; Vikram Garaniya

    2018-01-01

    Background: Maintenance operations on-board ships are highly demanding. They are intensive activities requiring high man-machine interaction in challenging and evolving conditions. The evolving conditions include weather, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error, and, consequently, can cause injuries or fatalities to personne...

  5. Errors in data interpretation from genetic variation of human analytes

    OpenAIRE

    Howie, Heather L.; Delaney, Meghan; Wang, Xiaohong; Er, Lay See; Kapp, Linda; Lebedev, Jenna N.; Zimring, James C.

    2017-01-01

    In recent years, the extent of our vulnerability to misinterpretation due to poorly characterized reagents has become an issue of great concern. Antibody reagents have been identified as a major source of error, contributing to the "reproducibility crisis." In the current report, we define an additional dimension of the crisis; in particular, we define variation of the targets being analyzed. We report that natural variation in the immunoglobulin "constant" region alters the reactivity with c...

  6. Critical slowing down and error analysis in lattice QCD simulations

    Energy Technology Data Exchange (ETDEWEB)

    Virotta, Francesco

    2012-02-21

    In this work we investigate the critical slowing down of lattice QCD simulations. We perform a preliminary study in the quenched approximation where we find that our estimate of the exponential auto-correlation time scales as τ_exp(a) ∝ a^(-5), where a is the lattice spacing. In unquenched simulations with O(a) improved Wilson fermions we do not obtain a scaling law but find results compatible with the behavior that we find in the pure gauge theory. The discussion is supported by a large set of ensembles both in pure gauge and in the theory with two degenerate sea quarks. We have moreover investigated the effect of slow algorithmic modes in the error analysis of the expectation value of typical lattice QCD observables (hadronic matrix elements and masses). In the context of simulations affected by slow modes we propose and test a method to obtain reliable estimates of statistical errors. The method is supposed to help in the typical algorithmic setup of lattice QCD, namely when the total statistics collected is of O(10) τ_exp. This is the typical case when simulating close to the continuum limit where the computational costs for producing two independent data points can be extremely large. We finally discuss the scale setting in N_f = 2 simulations using the kaon decay constant f_K as physical input. The method is explained together with a thorough discussion of the error analysis employed. A description of the publicly available code used for the error analysis is included.
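The windowed estimate of the autocorrelation time discussed above can be sketched in a few lines. This is a generic Madras-Sokal-style windowed estimator with an AR(1) test chain, not the authors' published analysis code; the window factor and chain parameters are illustrative choices.

```python
import random

def tau_int(series, max_lag=200, window_factor=5.0):
    """Integrated autocorrelation time with a self-consistent
    (Madras-Sokal style) summation window W >= window_factor * tau."""
    n = len(series)
    mean = sum(series) / n
    x = [v - mean for v in series]
    var = sum(v * v for v in x) / n
    tau = 0.5
    for t in range(1, min(max_lag, n // 2)):
        # normalized autocorrelation at lag t
        c = sum(x[i] * x[i + t] for i in range(n - t)) / ((n - t) * var)
        tau += c
        if t >= window_factor * tau:   # stop once the window is "long enough"
            break
    return tau

# AR(1) test chain: the exact value is tau_int = (1 + rho) / (2 * (1 - rho))
random.seed(1)
rho = 0.9
chain = [0.0]
for _ in range(20000):
    chain.append(rho * chain[-1] + random.gauss(0.0, 1.0))
print(tau_int(chain))  # exact value for rho = 0.9 is 9.5; the estimate fluctuates around it
```

For a chain of length of only O(10) τ_exp, as the abstract notes, such direct estimates become unreliable and the slow-mode correction proposed in the thesis is needed.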

  7. Error analysis of short term wind power prediction models

    International Nuclear Information System (INIS)

    De Giorgi, Maria Grazia; Ficarella, Antonio; Tarantino, Marco

    2011-01-01

    The integration of wind farms in power networks has become an important problem: the electricity produced cannot be stored economically, yet production must follow market demand. Short- to long-range wind forecasting over different time horizons is therefore becoming an important part of wind farm management. Time series modelling of wind speeds rests on the assumption that all causative factors are implicitly accounted for in the sequence of the process itself; hence time series modelling can stand in for physical modelling. Auto Regressive Moving Average (ARMA) models, which perform a linear mapping between inputs and outputs, and Artificial Neural Networks (ANNs) and Adaptive Neuro-Fuzzy Inference Systems (ANFIS), which perform a non-linear mapping, provide a robust approach to wind power prediction. In this work, these models are developed to forecast the power production of a wind farm with three wind turbines, using real load data and comparing different prediction periods. For the first time, this comparative analysis brings together various forecasting methods and time horizons with a detailed performance analysis of the normalised mean error and its statistical distribution, so as to identify methods whose error distributions are narrower and which are therefore less likely to produce large prediction errors. (author)

  8. Error analysis of short term wind power prediction models

    Energy Technology Data Exchange (ETDEWEB)

    De Giorgi, Maria Grazia; Ficarella, Antonio; Tarantino, Marco [Dipartimento di Ingegneria dell' Innovazione, Universita del Salento, Via per Monteroni, 73100 Lecce (Italy)

    2011-04-15

    The integration of wind farms in power networks has become an important problem: the electricity produced cannot be stored economically, yet production must follow market demand. Short- to long-range wind forecasting over different time horizons is therefore becoming an important part of wind farm management. Time series modelling of wind speeds rests on the assumption that all causative factors are implicitly accounted for in the sequence of the process itself; hence time series modelling can stand in for physical modelling. Auto Regressive Moving Average (ARMA) models, which perform a linear mapping between inputs and outputs, and Artificial Neural Networks (ANNs) and Adaptive Neuro-Fuzzy Inference Systems (ANFIS), which perform a non-linear mapping, provide a robust approach to wind power prediction. In this work, these models are developed to forecast the power production of a wind farm with three wind turbines, using real load data and comparing different prediction periods. For the first time, this comparative analysis brings together various forecasting methods and time horizons with a detailed performance analysis of the normalised mean error and its statistical distribution, so as to identify methods whose error distributions are narrower and which are therefore less likely to produce large prediction errors. (author)
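As a minimal illustration of the ARMA end of this model family, the sketch below fits an AR(1) model by least squares to a synthetic persistent series and scores one-step forecasts with a normalised mean absolute error. The data, model order, and rated capacity are invented for illustration, not the farm data or full ARMA/ANN/ANFIS models of the paper.

```python
import random

def fit_ar1(series):
    """Least-squares fit of x[t] = a * x[t-1] + b, the simplest member
    of the ARMA family used for wind power prediction."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

def nmae(actual, predicted, capacity):
    """Mean absolute error normalised by the rated capacity."""
    return sum(abs(p - v) for v, p in zip(actual, predicted)) / (
        len(actual) * capacity)

# synthetic persistent "wind power" series (illustrative, not farm data)
random.seed(2)
power = [5.0]
for _ in range(500):
    power.append(max(0.0, 0.8 * power[-1] + 1.0 + random.gauss(0.0, 0.5)))

train, test = power[:400], power[400:]
a, b = fit_ar1(train)
preds = [a * test[i - 1] + b for i in range(1, len(test))]
print(round(nmae(test[1:], preds, capacity=10.0), 3))  # small: persistence is captured
```

In the paper's setting the same NMAE score would be computed for each method and horizon, and the full error distribution (not just its mean) compared across methods.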

  9. A Generalized Process Model of Human Action Selection and Error and its Application to Error Prediction

    Science.gov (United States)

    2014-07-01

    Macmillan & Creelman, 2005). This is quite a high degree of discriminability and it means that when the decision model predicts a probability of... ROC analysis. Pattern Recognition Letters, 27(8), 861-874. Retrieved from Google Scholar. Macmillan, N. A., & Creelman, C. D. (2005). Detection

  10. Neural prediction errors reveal a risk-sensitive reinforcement-learning process in the human brain.

    Science.gov (United States)

    Niv, Yael; Edlund, Jeffrey A; Dayan, Peter; O'Doherty, John P

    2012-01-11

    Humans and animals are exquisitely, though idiosyncratically, sensitive to risk or variance in the outcomes of their actions. Economic, psychological, and neural aspects of this are well studied when information about risk is provided explicitly. However, we must normally learn about outcomes from experience, through trial and error. Traditional models of such reinforcement learning focus on learning about the mean reward value of cues and ignore higher order moments such as variance. We used fMRI to test whether the neural correlates of human reinforcement learning are sensitive to experienced risk. Our analysis focused on anatomically delineated regions of a priori interest in the nucleus accumbens, where blood oxygenation level-dependent (BOLD) signals have been suggested as correlating with quantities derived from reinforcement learning. We first provide unbiased evidence that the raw BOLD signal in these regions corresponds closely to a reward prediction error. We then derive from this signal the learned values of cues that predict rewards of equal mean but different variance and show that these values are indeed modulated by experienced risk. Moreover, a close neurometric-psychometric coupling exists between the fluctuations of the experience-based evaluations of risky options that we measured neurally and the fluctuations in behavioral risk aversion. This suggests that risk sensitivity is integral to human learning, illuminating economic models of choice, neuroscientific models of affective learning, and the workings of the underlying neural mechanisms.
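A common way to model the risk sensitivity described above is a temporal-difference rule with asymmetric learning rates for positive and negative prediction errors. The sketch below is one such toy model; the parameter values and outcome distributions are illustrative, not those fitted to the fMRI data in the study.

```python
import random

def risk_sensitive_value(outcomes, alpha_plus=0.1, alpha_minus=0.2):
    """TD-style value learning with asymmetric learning rates: when
    alpha_minus > alpha_plus, negative surprises weigh more, so the
    learned value of a variable option is depressed (risk aversion).
    Returns the long-run average of the learned value."""
    v, trace = 0.0, []
    for r in outcomes:
        delta = r - v                                  # reward prediction error
        v += (alpha_plus if delta > 0 else alpha_minus) * delta
        trace.append(v)
    tail = trace[len(trace) // 2:]                     # discard the transient
    return sum(tail) / len(tail)

random.seed(3)
safe = [1.0] * 4000                                    # certain reward of 1
risky = [random.choice([0.0, 2.0]) for _ in range(4000)]  # same mean, higher variance

v_safe = risk_sensitive_value(safe)
v_risky = risk_sensitive_value(risky)
print(v_safe > v_risky)  # True: the equal-mean risky option is valued lower
```

This is exactly the kind of experience-based devaluation of equal-mean, higher-variance cues that the abstract reports in the nucleus accumbens BOLD signal.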

  11. Critical slowing down and error analysis in lattice QCD simulations

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, Stefan [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Sommer, Rainer; Virotta, Francesco [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC

    2010-09-15

    We study the critical slowing down towards the continuum limit of lattice QCD simulations with Hybrid Monte Carlo type algorithms. In particular for the squared topological charge we find it to be very severe with an effective dynamical critical exponent of about 5 in pure gauge theory. We also consider Wilson loops which we can demonstrate to decouple from the modes which slow down the topological charge. Quenched observables are studied and a comparison to simulations of full QCD is made. In order to deal with the slow modes in the simulation, we propose a method to incorporate the information from slow observables into the error analysis of physical observables and arrive at safer error estimates. (orig.)

  12. Critical slowing down and error analysis in lattice QCD simulations

    International Nuclear Information System (INIS)

    Schaefer, Stefan; Sommer, Rainer; Virotta, Francesco

    2010-09-01

    We study the critical slowing down towards the continuum limit of lattice QCD simulations with Hybrid Monte Carlo type algorithms. In particular for the squared topological charge we find it to be very severe with an effective dynamical critical exponent of about 5 in pure gauge theory. We also consider Wilson loops which we can demonstrate to decouple from the modes which slow down the topological charge. Quenched observables are studied and a comparison to simulations of full QCD is made. In order to deal with the slow modes in the simulation, we propose a method to incorporate the information from slow observables into the error analysis of physical observables and arrive at safer error estimates. (orig.)

  13. Evaluation of Analysis by Cross-Validation, Part II: Diagnostic and Optimization of Analysis Error Covariance

    Directory of Open Access Journals (Sweden)

    Richard Ménard

    2018-02-01

    Full Text Available We present a general theory of estimation of analysis error covariances based on cross-validation as well as a geometric interpretation of the method. In particular, we use the variance of passive observation-minus-analysis residuals and show that the true analysis error variance can be estimated, without relying on the optimality assumption. This approach is used to obtain near optimal analyses that are then used to evaluate the air quality analysis error using several different methods at active and passive observation sites. We compare the estimates according to the method of Hollingsworth-Lönnberg, Desroziers et al., a new diagnostic we developed, and the perceived analysis error computed from the analysis scheme, to conclude that, as long as the analysis is near optimal, all estimates agree within a certain error margin.
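The core cross-validation idea, that the variance of passive observation-minus-analysis residuals minus the observation error variance recovers the true analysis error variance even when the analysis is not optimal, can be checked in a scalar toy model. All numbers below are illustrative; this is not the paper's air quality assimilation system.

```python
import random

random.seed(4)
sigma_b, sigma_o = 2.0, 1.0   # background and observation error std devs
w = 0.6                       # deliberately suboptimal gain (optimal would be 0.8)

n = 100000
sum_sq_resid = 0.0
sum_sq_err = 0.0
for _ in range(n):
    truth = random.gauss(0.0, 5.0)
    background = truth + random.gauss(0.0, sigma_b)
    obs_active = truth + random.gauss(0.0, sigma_o)    # assimilated observation
    obs_passive = truth + random.gauss(0.0, sigma_o)   # withheld for cross-validation
    analysis = background + w * (obs_active - background)
    sum_sq_resid += (obs_passive - analysis) ** 2
    sum_sq_err += (analysis - truth) ** 2

est_var = sum_sq_resid / n - sigma_o**2   # var(O - A) at passive sites minus obs error
true_var = sum_sq_err / n                 # the actual analysis error variance
print(abs(est_var - true_var) < 0.05)     # the diagnostic recovers it without optimality
```

Because the passive observation's error is independent of the analysis, var(O - A) decomposes exactly into observation error variance plus analysis error variance, with no optimality assumption needed.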

  14. Analytical sensitivity analysis of geometric errors in a three axis machine tool

    International Nuclear Information System (INIS)

    Park, Sung Ryung; Yang, Seung Han

    2012-01-01

    In this paper, an analytical method is used to perform a sensitivity analysis of geometric errors in a three axis machine tool. First, an error synthesis model is constructed for evaluating the position volumetric error due to the geometric errors, and then an output variable is defined, such as the magnitude of the position volumetric error. Next, the global sensitivity analysis is executed using an analytical method. Finally, the sensitivity indices are calculated using the quantitative values of the geometric errors
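A minimal sketch of the idea, assuming a toy linear error synthesis model rather than the paper's full homogeneous-transformation formulation; the geometric error magnitudes and the elasticity-style sensitivity index are illustrative assumptions.

```python
import math

def volumetric_error(g, x, y, z):
    """Toy linear error synthesis: tool-tip deviation components built from a
    few positioning and squareness errors (illustrative only)."""
    dx = g["ex_pos"] + g["sq_xy"] * y + g["sq_xz"] * z
    dy = g["ey_pos"] + g["sq_yz"] * z
    dz = g["ez_pos"]
    return math.sqrt(dx * dx + dy * dy + dz * dz)

# hypothetical error magnitudes: positioning errors in metres, squareness in rad
g0 = {"ex_pos": 5e-6, "ey_pos": 4e-6, "ez_pos": 3e-6,
      "sq_xy": 2e-8, "sq_xz": 1e-8, "sq_yz": 3e-8}

point = (0.3, 0.4, 0.5)              # workpiece position in metres
base = volumetric_error(g0, *point)
sens = {}
for k, v in g0.items():
    perturbed = dict(g0)
    perturbed[k] = v * 1.01          # +1% perturbation of one geometric error
    # normalised (elasticity-style) sensitivity index of the output variable
    sens[k] = (volumetric_error(perturbed, *point) - base) / (0.01 * base)
print(max(sens, key=sens.get))       # the dominant geometric error at this point
```

Ranking such indices over the work volume identifies which geometric errors dominate the position volumetric error and so deserve compensation first.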

  15. SIMULATED HUMAN ERROR PROBABILITY AND ITS APPLICATION TO DYNAMIC HUMAN FAILURE EVENTS

    Energy Technology Data Exchange (ETDEWEB)

    Herberger, Sarah M.; Boring, Ronald L.

    2016-10-01

    Objectives: Human reliability analysis (HRA) methods typically analyze human failure events (HFEs) at the overall task level. For dynamic HRA, it is important to model human activities at the subtask level. There exists a disconnect between the dynamic subtask level and the static task level that presents issues when modeling dynamic scenarios. For example, the SPAR-H method is typically used to calculate the human error probability (HEP) at the task level. As demonstrated in this paper, quantification in SPAR-H does not translate to the subtask level. Methods: Two different discrete distributions were generated for each SPAR-H Performance Shaping Factor (PSF) to define the frequency of PSF levels. The first was a uniform, or uninformed, distribution that assumed each PSF level was equally likely. The second, non-continuous distribution took the frequency of each PSF level as identified from an assessment of the HERA database. These two approaches were created to identify the resulting distribution of the HEP. The resulting HEP that appears closer to the known distribution, a log-normal centered on 1E-3, is the more desirable. Each approach then has median, average and maximum HFE calculations applied. To calculate these three values, three events, A, B and C, are generated from the subtask-level PSF level frequencies. The median HFE selects the median PSF level from each PSF and calculates the HEP. The average HFE takes the mean PSF level, and the maximum takes the maximum PSF level. The same data set of subtask HEPs yields starkly different HEPs when aggregated to the HFE level in SPAR-H. Results: Assuming that each PSF level in each HFE is equally likely creates an unrealistic distribution of the HEP that is centered at 1. Next, the observed frequency of PSF levels was applied, with the resulting HEP behaving log-normally and a majority of the values under a 2.5% HEP. The median, average and maximum HFE calculations did yield
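The aggregation issue can be illustrated with a SPAR-H-style toy calculation in which HEP = NHEP × the product of PSF multipliers, capped at 1. The multiplier values below are invented for illustration, not the official SPAR-H tables or the HERA-derived frequencies.

```python
import statistics
from math import prod

NHEP = 1e-3   # nominal HEP for an action task (SPAR-H style)

def hep(psfs):
    """HEP = NHEP x product of PSF multipliers, capped at 1.0."""
    return min(1.0, NHEP * prod(psfs.values()))

# three subtasks with invented PSF multiplier levels
subtasks = [
    {"time": 1, "stress": 2, "complexity": 1, "experience": 1},
    {"time": 10, "stress": 2, "complexity": 2, "experience": 1},
    {"time": 1, "stress": 5, "complexity": 1, "experience": 3},
]

def aggregate(level):
    """Collapse the subtask PSF levels into one HFE-level PSF set."""
    return {k: level([s[k] for s in subtasks]) for k in subtasks[0]}

median_hep = hep(aggregate(statistics.median))
mean_hep = hep(aggregate(lambda v: sum(v) / len(v)))
max_hep = hep(aggregate(max))
print(median_hep, mean_hep, max_hep)   # same subtasks, starkly different HFE-level HEPs
```

With these numbers the median rule gives 2E-3, the mean rule about 2.7E-2, and the maximum rule 0.3, two orders of magnitude apart from the same subtask data, which is the disconnect the paper describes.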

  16. Using a Delphi Method to Identify Human Factors Contributing to Nursing Errors.

    Science.gov (United States)

    Roth, Cheryl; Brewer, Melanie; Wieck, K Lynn

    2017-07-01

    The purpose of this study was to identify human factors associated with nursing errors. Using a Delphi technique, this study used feedback from a panel of nurse experts (n = 25) on an initial qualitative survey questionnaire followed by summarizing the results with feedback and confirmation. Synthesized factors regarding causes of errors were incorporated into a quantitative Likert-type scale, and the original expert panel participants were queried a second time to validate responses. The list identified 24 items as most common causes of nursing errors, including swamping and errors made by others that nurses are expected to recognize and fix. The responses provided a consensus top 10 errors list based on means with heavy workload and fatigue at the top of the list. The use of the Delphi survey established consensus and developed a platform upon which future study of nursing errors can evolve as a link to future solutions. This list of human factors in nursing errors should serve to stimulate dialogue among nurses about how to prevent errors and improve outcomes. Human and system failures have been the subject of an abundance of research, yet nursing errors continue to occur. © 2016 Wiley Periodicals, Inc.

  17. Human errors evaluation for muster in emergency situations applying human error probability index (HEPI, in the oil company warehouse in Hamadan City

    Directory of Open Access Journals (Sweden)

    2012-12-01

    Full Text Available Introduction: An emergency situation is one of the factors influencing human error. The aim of this research was to evaluate human error in an emergency situation of fire and explosion at the oil company warehouse in Hamadan city by applying the human error probability index (HEPI). Material and Method: First, the scenario of an emergency fire-and-explosion situation at the oil company warehouse was designed, and a maneuver against it was then performed. The scaled muster questionnaire for the maneuver was completed in the next stage. Collected data were analyzed to calculate the probability of success for the 18 actions required in an emergency situation, from the starting point of the muster until the final action of reaching a temporary safe shelter. Result: The results showed that the highest probability of error occurrence was related to making the workplace safe (evaluation phase, 32.4%) and the lowest to detecting the alarm (awareness phase, 1.8%). The highest severity of error was in the evaluation phase and the lowest in the awareness and recovery phases. The maximum risk level was related to evaluating exit routes, selecting one route and choosing an alternative exit route; the minimum risk level was related to the four evaluation phases. Conclusion: To reduce the risk of error in the exit phases of an emergency situation, the following actions are recommended based on the findings of this study: periodic evaluation of the exit phases and their modification if necessary, conducting more maneuvers, and analyzing the results with sufficient feedback to the employees.

  18. Analysis of human performance in KHNP NPPs

    International Nuclear Information System (INIS)

    Tae, Sung Eun

    2004-01-01

    The most important thing in the management of a nuclear power plant is safety. One of the key ways to enhance safety is to analyze human performance and reflect the results in practical plant operation. KHNP NPPs have experienced human errors in the fields of operation and maintenance. These human errors need to be analyzed, and corrective actions addressing their causes should be taken to prevent the same or similar events. We therefore introduce the procedure of K-HPES (KHNP-Human Performance Enhancement System) and the results of the analysis of HPES reports produced in 2002 and 2003.

  19. Error performance analysis in downlink cellular networks with interference management

    KAUST Repository

    Afify, Laila H.

    2015-05-01

    Modeling aggregate network interference in cellular networks has recently gained immense attention both in academia and industry. While stochastic geometry based models have succeeded in accounting for the cellular network geometry, they mostly abstract away many important wireless communication system aspects (e.g., modulation techniques, signal recovery techniques). Recently, a novel stochastic geometry model, based on the Equivalent-in-Distribution (EiD) approach, succeeded in capturing the aforementioned communication system aspects and extending the analysis to averaged error performance, however at the expense of increased modeling complexity. Inspired by the EiD approach, the analysis developed in [1] takes into consideration the key system parameters while providing a simple, tractable analysis. In this paper, we extend this framework to study the effect of different interference management techniques in downlink cellular networks. The accuracy of the proposed analysis is verified via Monte Carlo simulations.

  20. Investigating the causes of human error-induced incidents in the maintenance operations of petrochemical industry by using HFACS

    Directory of Open Access Journals (Sweden)

    Mohammadreza Azhdari

    2017-03-01

    Full Text Available Background & Objectives: Maintenance is an important tool for the petrochemical industries to prevent accidents and increase operational and process safety. The purpose of this study was to identify the possible causes of incidents caused by human error in petrochemical maintenance activities by using the Human Factors Analysis and Classification System (HFACS). Methods: This cross-sectional analysis was conducted in Zagros Petrochemical Company, Asaluyeh, Iran. A checklist of human error-induced incidents was developed based on the four HFACS main levels and nineteen sub-groups. The hierarchical task analysis (HTA) technique was used to identify maintenance activities and tasks. The main causes of possible incidents were identified by checklist and recorded. Corrective and preventive actions were defined depending on priority. Results: The content analysis of worksheets for 444 activities located 37.6% of the causes at the level of unsafe acts, 27.5% at the level of unsafe supervision, 20.9% at the level of preconditions for unsafe acts, and 14% at the level of organizational influences. The HFACS sub-groups with the highest frequency were errors (24.36%), inadequate supervision (14.89%) and violations (13.26%). Conclusion: Reducing the rate of the detected errors is crucial to preventing and reducing the identified incidents. The findings of this study showed that appropriate controlling measures, such as periodic training on work procedures and improved supervision, decrease human error-induced incidents in petrochemical industry maintenance.

  1. Error Analysis for Fourier Methods for Option Pricing

    KAUST Repository

    Häppölä, Juho

    2016-01-06

    We provide a bound for the error committed when using a Fourier method to price European options when the underlying follows an exponential Levy dynamic. The price of the option is described by a partial integro-differential equation (PIDE). Applying a Fourier transformation to the PIDE yields an ordinary differential equation that can be solved analytically in terms of the characteristic exponent of the Levy process. Then, a numerical inverse Fourier transform allows us to obtain the option price. We present a novel bound for the error and use this bound to set the parameters for the numerical method. We analyze the properties of the bound for a dissipative and pure-jump example. The bound presented is independent of the asymptotic behaviour of option prices at extreme asset prices. The error bound can be decomposed into a product of terms resulting from the dynamics and the option payoff, respectively. The analysis is supplemented by numerical examples that demonstrate results comparable to and superior to the existing literature.
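As a self-contained illustration of Fourier-based pricing, the sketch below uses Gil-Pelaez inversion of the characteristic function of log S_T, taking the Gaussian (Black-Scholes) special case of an exponential Lévy model so the result can be checked against the closed form. It is not the paper's PIDE-based method or its error bound; a Lévy characteristic exponent would slot into `phi` in the same way.

```python
import cmath
import math

def bs_call_closed(S, K, r, sigma, T):
    """Closed-form Black-Scholes call price, used as the reference."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def bs_call_fourier(S, K, r, sigma, T, u_max=60.0, n=4000):
    """Call price via Gil-Pelaez inversion of the CF of log S_T."""
    m = math.log(S) + (r - 0.5 * sigma**2) * T   # mean of log S_T
    s2 = sigma**2 * T                            # variance of log S_T
    phi = lambda u: cmath.exp(1j * u * m - 0.5 * s2 * u * u)
    x = math.log(K)
    du = u_max / n

    def tail_prob(cf):
        # P(log S_T > ln K) = 1/2 + (1/pi) * int_0^inf Re[e^{-iux} cf(u)/(iu)] du
        total = 0.0
        for j in range(n):                       # midpoint rule
            u = (j + 0.5) * du
            total += (cmath.exp(-1j * u * x) * cf(u) / (1j * u)).real
        return 0.5 + total * du / math.pi

    p2 = tail_prob(phi)                               # exercise prob, = N(d2)
    p1 = tail_prob(lambda u: phi(u - 1j) / phi(-1j))  # share-measure prob, = N(d1)
    return S * p1 - K * math.exp(-r * T) * p2

c_ref = bs_call_closed(100.0, 100.0, 0.05, 0.2, 1.0)
c_fft = bs_call_fourier(100.0, 100.0, 0.05, 0.2, 1.0)
print(abs(c_ref - c_fft) < 1e-3)  # True: the transform route reproduces the closed form
```

The paper's contribution is precisely a rigorous bound on the discretization and truncation errors (here `n` and `u_max`) that such quadratures introduce.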

  2. Human Error and the International Space Station: Challenges and Triumphs in Science Operations

    Science.gov (United States)

    Harris, Samantha S.; Simpson, Beau C.

    2016-01-01

    Any system with a human component is inherently risky. Studies in human factors and psychology have repeatedly shown that human operators will inevitably make errors, regardless of how well they are trained. Onboard the International Space Station (ISS) where crew time is arguably the most valuable resource, errors by the crew or ground operators can be costly to critical science objectives. Operations experts at the ISS Payload Operations Integration Center (POIC), located at NASA's Marshall Space Flight Center in Huntsville, Alabama, have learned that from payload concept development through execution, there are countless opportunities to introduce errors that can potentially result in costly losses of crew time and science. To effectively address this challenge, we must approach the design, testing, and operation processes with two specific goals in mind. First, a systematic approach to error and human centered design methodology should be implemented to minimize opportunities for user error. Second, we must assume that human errors will be made and enable rapid identification and recoverability when they occur. While a systematic approach and human centered development process can go a long way toward eliminating error, the complete exclusion of operator error is not a reasonable expectation. The ISS environment in particular poses challenging conditions, especially for flight controllers and astronauts. Operating a scientific laboratory 250 miles above the Earth is a complicated and dangerous task with high stakes and a steep learning curve. While human error is a reality that may never be fully eliminated, smart implementation of carefully chosen tools and techniques can go a long way toward minimizing risk and increasing the efficiency of NASA's space science operations.

  3. Error Analysis of Galerkin's Method for Semilinear Equations

    Directory of Open Access Journals (Sweden)

    Tadashi Kawanago

    2012-01-01

    Full Text Available We establish a general existence result for Galerkin's approximate solutions of abstract semilinear equations and conduct an error analysis. Our results may be regarded as some extension of a precedent work (Schultz 1969. The derivation of our results is, however, different from the discussion in his paper and is essentially based on the convergence theorem of Newton’s method and some techniques for deriving it. Some of our results may be applicable for investigating the quality of numerical verification methods for solutions of ordinary and partial differential equations.
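A concrete instance of Galerkin's method for a simple linear problem (not the abstract semilinear setting of the paper) shows the approximation error shrinking as the subspace grows; the right-hand side is an arbitrary illustrative choice.

```python
import math

def galerkin_coeffs(f, N, m=2000):
    """Galerkin solution of -u'' = f on (0, pi) with u(0) = u(pi) = 0 in the
    sine basis: the stiffness matrix is diagonal, so each coefficient is
    b_k = <f, sin(kx)> / (k^2 * pi/2), with the inner product by quadrature."""
    h = math.pi / m
    coeffs = []
    for k in range(1, N + 1):
        inner = sum(f((j + 0.5) * h) * math.sin(k * (j + 0.5) * h)
                    for j in range(m)) * h
        coeffs.append(inner / (k * k * math.pi / 2.0))
    return coeffs

def u_N(coeffs, x):
    """Evaluate the N-term Galerkin approximation at x."""
    return sum(b * math.sin((k + 1) * x) for k, b in enumerate(coeffs))

f = lambda x: x
exact = lambda x: x * (math.pi**2 - x**2) / 6.0   # solves -u'' = x with the BCs

def max_err(N):
    c = galerkin_coeffs(f, N)
    return max(abs(u_N(c, 0.01 * i * math.pi) - exact(0.01 * i * math.pi))
               for i in range(1, 100))

print(max_err(2) > max_err(20))  # True: the error shrinks as the subspace grows
```

The paper's concern is establishing existence and error estimates for such approximate solutions in the harder semilinear case, where the diagonal structure above is no longer available and Newton-type convergence arguments are needed.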

  4. A classification scheme of erroneous behaviors for human error probability estimations based on simulator data

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea

    2017-01-01

    Because it has been indicated that empirical data supporting the estimates used in human reliability analysis (HRA) is insufficient, several databases have been constructed recently. To generate quantitative estimates from human reliability data, it is important to appropriately sort the erroneous behaviors found in the reliability data. Therefore, this paper proposes a scheme to classify the erroneous behaviors identified by the HuREX (Human Reliability data Extraction) framework through a review of the relevant literature. A case study of the human error probability (HEP) calculations is conducted to verify that the proposed scheme can be successfully implemented for the categorization of the erroneous behaviors and to assess whether the scheme is useful for the HEP quantification purposes. Although continuously accumulating and analyzing simulator data is desirable to secure more reliable HEPs, the resulting HEPs were insightful in several important ways with regard to human reliability in off-normal conditions. From the findings of the literature review and the case study, the potential and limitations of the proposed method are discussed. - Highlights: • A taxonomy of erroneous behaviors is proposed to estimate HEPs from a database. • The cognitive models, procedures, HRA methods, and HRA databases were reviewed. • HEPs for several types of erroneous behaviors are calculated as a case study.
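Once erroneous behaviors are sorted into categories, one simple way to turn counts into HEPs is a Bayesian point estimate; the sketch below uses a Jeffreys Beta(0.5, 0.5) prior with invented counts. The paper does not specify this estimator or these categories; both are illustrative assumptions.

```python
def hep_jeffreys(errors, opportunities):
    """Posterior-mean HEP under a Jeffreys Beta(0.5, 0.5) prior, a common
    way to get stable estimates from sparse simulator counts (it never
    returns exactly zero even when no errors were observed)."""
    return (errors + 0.5) / (opportunities + 1.0)

# invented (errors, opportunities) counts for three behavior categories
counts = {
    "omission": (3, 412),
    "wrong object selected": (1, 298),
    "delayed response": (7, 350),
}
for behaviour, (k, n) in counts.items():
    print(f"{behaviour}: HEP = {hep_jeffreys(k, n):.4f}")
```

The scheme proposed in the paper matters precisely here: the HEP obtained depends directly on how the numerator (erroneous behaviors) and denominator (opportunities) are categorized from the HuREX simulator records.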

  5. Republished error management: Descriptions of verbal communication errors between staff. An analysis of 84 root cause analysis-reports from Danish hospitals

    DEFF Research Database (Denmark)

    Rabøl, Louise Isager; Andersen, Mette Lehmann; Østergaard, Doris

    2011-01-01

    Introduction Poor teamwork and communication between healthcare staff are correlated to patient safety incidents. However, the organisational factors responsible for these issues are unexplored. Root cause analyses (RCA) use human factors thinking to analyse the systems behind severe patient safety... and characteristics of verbal communication errors such as handover errors and error during teamwork. Results Raters found description of verbal communication errors in 44 reports (52%). These included handover errors (35 (86%)), communication errors between different staff groups (19 (43%)), misunderstandings (13 (30%)), communication errors between junior and senior staff members (11 (25%)), hesitance in speaking up (10 (23%)) and communication errors during teamwork (8 (18%)). The kappa values were 0.44-0.78. Unproceduralized communication and information exchange via telephone, related to transfer between...

  6. The application of SHERPA (Systematic Human Error Reduction and Prediction Approach) in the development of compensatory cognitive rehabilitation strategies for stroke patients with left and right brain damage.

    Science.gov (United States)

    Hughes, Charmayne M L; Baber, Chris; Bienkiewicz, Marta; Worthington, Andrew; Hazell, Alexa; Hermsdörfer, Joachim

    2015-01-01

    Approximately 33% of stroke patients have difficulty performing activities of daily living, often committing errors during the planning and execution of such activities. The objective of this study was to evaluate the ability of the human error identification (HEI) technique SHERPA (Systematic Human Error Reduction and Prediction Approach) to predict errors during the performance of daily activities in stroke patients with left and right hemisphere lesions. Using SHERPA we successfully predicted 36 of the 38 observed errors, with analysis indicating that the proportion of predicted and observed errors was similar for all sub-tasks and severity levels. HEI results were used to develop compensatory cognitive strategies that clinicians could employ to reduce or prevent errors from occurring. This study provides evidence for the reliability and validity of SHERPA in the design of cognitive rehabilitation strategies in stroke populations.

  7. Proactive error analysis of ultrasound-guided axillary brachial plexus block performance.

    LENUS (Irish Health Repository)

    O'Sullivan, Owen

    2012-07-13

    Detailed description of the tasks anesthetists undertake during the performance of a complex procedure, such as ultrasound-guided peripheral nerve blockade, allows elements that are vulnerable to human error to be identified. We have applied 3 task analysis tools to one such procedure, namely, ultrasound-guided axillary brachial plexus blockade, with the intention that the results may form a basis to enhance training and performance of the procedure.

  8. Research review and development trends of human reliability analysis techniques

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Dai Licao

    2011-01-01

    Human reliability analysis (HRA) methods are reviewed. The theoretical basis of human reliability analysis, human error mechanisms, the key elements of HRA methods, and the existing HRA methods are introduced and assessed in turn. Their shortcomings, current research hotspots, and difficult problems are identified. Finally, the trends of human reliability analysis methods are examined. (authors)

  9. Error analysis of mathematical problems on TIMSS: A case of Indonesian secondary students

    Science.gov (United States)

    Priyani, H. A.; Ekawati, R.

    2018-01-01

    Indonesian students' competence in solving mathematical problems is still considered weak, as indicated by the results of international assessments such as TIMSS. This might be caused by the various types of errors made. Hence, this study aimed at identifying students' errors in solving TIMSS mathematical problems on the topic of numbers, which is considered a fundamental concept in mathematics. This study applied descriptive qualitative analysis. The subjects were the three students with the most errors on the test indicators, selected from 34 eighth-grade students. Data were obtained through a paper-and-pencil test and student interviews. The error analysis indicated that in solving the Applying-level problem, the type of error students made was operational. For the Reasoning-level problem, three types of errors were made: conceptual errors, operational errors and principal errors. Meanwhile, analysis of the causes of students' errors showed that students did not comprehend the mathematical problems given.

  10. Coping with human errors through system design: Implications for ecological interface design

    DEFF Research Database (Denmark)

    Rasmussen, Jens; Vicente, Kim J.

    1989-01-01

    Research during recent years has revealed that human errors are not stochastic events which can be removed through improved training programs or optimal interface design. Rather, errors tend to reflect either systematic interference between various models, rules, and schemata, or the effects of the adaptive mechanisms involved in learning. In terms of design implications, these findings suggest that reliable human-system interaction will be achieved by designing interfaces which tend to minimize the potential for control interference and support recovery from errors. In other words, the focus should be on control of the effects of errors rather than on the elimination of errors per se. In this paper, we propose a theoretical framework for interface design that attempts to satisfy these objectives. The goal of our framework, called ecological interface design, is to develop a meaningful representation...

  11. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis, incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject within the framework established by general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach.

  12. Quantitative estimation of the human error probability during soft control operations

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jaewhan; Jung, Wondea

    2013-01-01

    Highlights: ► An HRA method to evaluate execution HEP for soft control operations was proposed. ► The soft control tasks were analyzed and design-related influencing factors were identified. ► An application to evaluate the effects of soft controls was performed. - Abstract: In this work, a method is proposed for quantifying the human errors that can occur during operation executions using soft controls. The soft controls of advanced main control rooms have entirely different features from conventional controls, and thus they may exhibit different human error modes and occurrence probabilities. Identifying these error modes and quantifying their probabilities is important for evaluating system reliability and preventing errors. This work suggests an evaluation framework for quantifying the execution error probability when using soft controls. In the application, the human error probabilities of soft controls were either higher or lower than those of conventional controls, depending on the design quality of the advanced main control room

  13. From human error to organizational failure: a historical perspective

    International Nuclear Information System (INIS)

    Guarnieri, F.; Cambon, J.; Boissieres, I.

    2008-01-01

    This article hinges around three main parts. The first part revisits the foundations of the human factor approach, introducing its basic assumptions as well as some of the methods that have been developed. The second part accounts for the reasons why organizational factors drew attention in the first place, underlining two major points: the limits of the human factor approach, but also the original contribution and innovative aspect of sociology. Finally, the third part describes the key principles and hypotheses on which the organizational factor approach is founded, and gives a brief overview of the methods that have recently been implemented in industry. (authors)

  14. Error Analysis of CM Data Products Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision-making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  15. Improving patient safety in radiotherapy through error reporting and analysis

    International Nuclear Information System (INIS)

    Findlay, Ú.; Best, H.; Ottrey, M.

    2016-01-01

    Aim: To improve patient safety in radiotherapy (RT) through the analysis and publication of radiotherapy errors and near misses (RTE). Materials and methods: RTE are submitted on a voluntary basis by NHS RT departments throughout the UK to the National Reporting and Learning System (NRLS) or directly to Public Health England (PHE). RTE are analysed by PHE staff using frequency trend analysis based on the classification and pathway coding from Towards Safer Radiotherapy (TSRT). PHE, in conjunction with the Patient Safety in Radiotherapy Steering Group, publishes learning from these events on a triannual basis, summarised biennially, so that their occurrence might be mitigated. Results: Since the introduction of this initiative in 2010, over 30,000 RTE reports have been submitted. The number of RTE reported in each biennial cycle has grown, ranging from 680 (2010) to 12,691 (2016). The vast majority of the RTE reported are lower-level events, thus not affecting the outcome of patient care. Of the level 1 and 2 incidents reported, it is known that the majority affected only one fraction of a course of treatment. This means that corrective action could be taken over the remaining treatment fractions, so the incident did not have a significant impact on the patient or the outcome of their treatment. Analysis of the RTE reports demonstrates that generation of error is not confined to one professional group or to any particular point in the pathway. It also indicates that the pattern of errors is replicated across service providers in the UK. Conclusion: Use of the terminology, classification and coding of TSRT, together with implementation of the national voluntary reporting system described within this report, allows clinical departments to compare their local analysis to the national picture. Further opportunities to improve learning from this dataset must be exploited through development of the analysis and of proactive risk management strategies

  16. An experimental approach to validating a theory of human error in complex systems

    Science.gov (United States)

    Morris, N. M.; Rouse, W. B.

    1985-01-01

    The problem of 'human error' is pervasive in engineering systems in which the human is involved. In contrast to the common engineering approach of dealing with error probabilistically, the present research seeks to alleviate problems associated with error by gaining a greater understanding of causes and contributing factors from a human information processing perspective. The general approach involves identifying conditions which are hypothesized to contribute to errors, and experimentally creating the conditions in order to verify the hypotheses. The conceptual framework which serves as the basis for this research is discussed briefly, followed by a description of upcoming research. Finally, the potential relevance of this research to design, training, and aiding issues is discussed.

  17. Error analysis and prevention of cosmic ion-induced soft errors in static CMOS RAMS

    International Nuclear Information System (INIS)

    Diehl, S.E.; Ochoa, A. Jr.; Dressendorfer, P.V.; Koga, R.; Kolasinski, W.A.

    1982-06-01

    Cosmic ray interactions with memory cells are known to cause temporary, random, bit errors in some designs. The sensitivity of polysilicon gate CMOS static RAM designs to logic upset by impinging ions has been studied using computer simulations and experimental heavy ion bombardment. Results of the simulations are confirmed by experimental upset cross-section data. Analytical models have been extended to determine and evaluate design modifications which reduce memory cell sensitivity to cosmic ions. A simple design modification, the addition of decoupling resistance in the feedback path, is shown to produce static RAMs immune to cosmic ray-induced bit errors

  18. Development of a new cause classification method considering plant ageing and human errors for adverse events which occurred in nuclear power plants and some results of its application

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa

    2007-01-01

    The adverse events which occur in nuclear power plants are analyzed to prevent similar events, and in the analysis of each event, the cause is classified by a cause classification method. This paper presents a new cause classification method which is improved in several respects: (1) all causes are systematically classified into three major categories, namely machine system, operation system and plant outside causes; (2) the causes in the operation system are classified against the management activities normally performed in a nuclear power plant; (3) the content of ageing is defined in detail for further analysis; (4) human errors are divided and defined by error stage; (5) human errors can be related to background factors; and so on. This new method is applied to the adverse events which occurred in domestic and overseas nuclear power plants in 2005. The results clarify that operation system errors account for about 60% of all causes, of which approximately 60% are maintenance errors and about 40% are workers' human errors, and that the prevention of maintenance errors, especially workers' human errors, is crucial. (author)

  19. Estimation error algorithm at analysis of beta-spectra

    International Nuclear Information System (INIS)

    Bakovets, N.V.; Zhukovskij, A.I.; Zubarev, V.N.; Khadzhinov, E.M.

    2005-01-01

    This work describes an algorithm for estimating errors in the analysis of beta spectra, and compares the theoretical and experimental errors obtained in the processing of beta-channel data. (authors)

  20. Error analysis to improve the speech recognition accuracy on ...

    Indian Academy of Sciences (India)

    dictionary plays a key role in the speech recognition accuracy. .... Sophisticated microphone is used for the recording speech corpus in a noise free environment. .... values, word error rate (WER) and error-rate will be calculated as follows:.

  1. An Empirical Study on Human Performance according to the Physical Environment (Potential Human Error Hazard) in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, In Seok; Seong, Proong Hyun

    2014-01-01

    The management of the physical environment for safety is especially important in the nuclear industry. Even when physical environmental conditions such as lighting and noise satisfy management standards, they can act as background factors that cause human error and affect human performance. Because the consequences of human error and degraded human performance attributable to the physical environment are severe, the requirement standards should be supplemented with specific criteria. In particular, in order to avoid human errors caused by an extremely low or rapidly changing illumination intensity and by masking effects such as power disconnection, plans for a better visual environment and better task performance should be made, as a careful study on efficient ways to manage and maintain the better conditions is conducted

  2. Seismic-load-induced human errors and countermeasures using computer graphics in plant-operator communication

    International Nuclear Information System (INIS)

    Hara, Fumio

    1988-01-01

    This paper highlights the importance of seismic-load-induced human errors in plant operation by delineating the characteristics of human task performance under seismic loads. It focuses on man-machine communication via multidimensional data like that conventionally displayed on large panels in a plant control room. It demonstrates a countermeasure to human errors using a computer graphics technique that conveys the global state of plant operation to operators through cartoon-like, colored graphs in the form of faces whose different facial expressions show the plant safety status. (orig.)

  3. Psychological scaling of expert estimates of human error probabilities: application to nuclear power plant operation

    International Nuclear Information System (INIS)

    Comer, K.; Gaddy, C.D.; Seaver, D.A.; Stillwell, W.G.

    1985-01-01

    The US Nuclear Regulatory Commission and Sandia National Laboratories sponsored a project to evaluate psychological scaling techniques for use in generating estimates of human error probabilities. The project evaluated two techniques: direct numerical estimation and paired comparisons. Expert estimates were found to be consistent across and within judges. Convergent validity was good, in comparison to estimates in a handbook of human reliability. Predictive validity could not be established because of the lack of actual relative frequencies of error (which will be a difficulty inherent in validation of any procedure used to estimate HEPs). Application of expert estimates in probabilistic risk assessment and in human factors is discussed

  4. Accommodating error analysis in comparison and clustering of molecular fingerprints.

    Science.gov (United States)

    Salamon, H; Segal, M R; Ponce de Leon, A; Small, P M

    1998-01-01

    Molecular epidemiologic studies of infectious diseases rely on pathogen genotype comparisons, which usually yield patterns comprising sets of DNA fragments (DNA fingerprints). We use a highly developed genotyping system, IS6110-based restriction fragment length polymorphism analysis of Mycobacterium tuberculosis, to develop a computational method that automates comparison of large numbers of fingerprints. Because error in fragment length measurements is proportional to fragment length and is positively correlated for fragments within a lane, an align-and-count method that compensates for relative scaling of lanes reliably counts matching fragments between lanes. Results of a two-step method we developed to cluster identical fingerprints agree closely with 5 years of computer-assisted visual matching among 1,335 M. tuberculosis fingerprints. Fully documented and validated methods of automated comparison and clustering will greatly expand the scope of molecular epidemiology.
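
    The align-and-count comparison described above can be sketched in a few lines. Everything below is a hypothetical reconstruction for illustration, not the authors' validated method: the fragment lengths, the 2% proportional tolerance, and the scale-factor grid search are all invented.

    ```python
    # Hypothetical sketch of an "align-and-count" comparison between two DNA
    # fingerprint lanes (lists of fragment lengths in base pairs). Assumption:
    # measurement error is proportional to fragment length (tolerance =
    # rel_tol * length), and a single multiplicative scale factor corrects
    # for relative scaling between lanes. All names and values illustrative.

    def count_matches(lane_a, lane_b, scale=1.0, rel_tol=0.02):
        """Count fragments in lane_a that match a fragment in scaled lane_b."""
        unused = sorted(f * scale for f in lane_b)
        matches = 0
        for frag in sorted(lane_a):
            for i, cand in enumerate(unused):
                if abs(cand - frag) <= rel_tol * frag:
                    matches += 1
                    del unused[i]      # each fragment matches at most once
                    break
        return matches

    def align_and_count(lane_a, lane_b, rel_tol=0.02):
        """Search a small grid of scale factors; return the best match count."""
        scales = [0.95 + 0.005 * k for k in range(21)]   # 0.95 .. 1.05
        return max(count_matches(lane_a, lane_b, s, rel_tol) for s in scales)

    lane1 = [1200, 2500, 4800, 9000]
    lane2 = [1220, 2540, 4890, 9150]    # same pattern with slight scale drift
    print(align_and_count(lane1, lane2))
    ```

    Compensating for scale before counting is what makes the positively correlated within-lane error tolerable: a rigid per-fragment tolerance without alignment would miss patterns that are identical up to lane-wide drift.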

  5. Error-rate performance analysis of opportunistic regenerative relaying

    KAUST Repository

    Tourki, Kamel

    2011-09-01

    In this paper, we investigate an opportunistic relaying scheme where the selected relay assists the source-destination (direct) communication. In our study, we consider a regenerative opportunistic relaying scheme in which the direct path can be considered unusable, and we take into account the effect of possibly erroneously detected and transmitted data at the best relay. We first derive the exact statistics of each hop in terms of probability density function (PDF). The PDFs are then used to determine accurate closed-form expressions for the end-to-end bit-error rate (BER) of binary phase-shift keying (BPSK) modulation, where the detector may use maximum ratio combining (MRC) or selection combining (SC). Finally, we validate our analysis by showing that performance simulation results coincide with our analytical results over a linear network (LN) architecture, considering Rayleigh fading channels. © 2011 IEEE.
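
    The paper's closed-form end-to-end expressions are not reproduced in this record. As an illustration of the kind of result involved, the textbook average BER of BPSK over a single Rayleigh fading branch, Pb = (1/2)(1 − sqrt(g/(1+g))) with g the average SNR, can be checked against Monte Carlo simulation. This is a generic baseline sketch, not the paper's opportunistic-relaying analysis, which additionally models detection errors at the relay.

    ```python
    # Closed-form vs Monte Carlo average BER for BPSK over a single Rayleigh
    # fading branch. Generic textbook baseline; all parameters illustrative.
    import math, random

    def ber_bpsk_rayleigh(avg_snr):
        """Average BER of BPSK over Rayleigh fading, average SNR in linear units."""
        return 0.5 * (1.0 - math.sqrt(avg_snr / (1.0 + avg_snr)))

    def ber_monte_carlo(avg_snr, n=200_000, seed=1):
        rng = random.Random(seed)
        errors = 0
        for _ in range(n):
            # Rayleigh fading: instantaneous SNR is exponential with mean avg_snr
            snr = max(rng.expovariate(1.0 / avg_snr), 1e-12)
            # BPSK over AWGN at this instantaneous SNR: send +1, error if r < 0
            noise = rng.gauss(0.0, math.sqrt(1.0 / (2.0 * snr)))
            errors += (1.0 + noise) < 0.0
        return errors / n

    g = 10.0   # average SNR of 10 (linear), i.e. 10 dB
    print(ber_bpsk_rayleigh(g), ber_monte_carlo(g))
    ```

    At 10 dB the analytic value is about 0.023; the simulated estimate should agree to within sampling error, mirroring the validation strategy described in the abstract.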

  6. Kitchen Physics: Lessons in Fluid Pressure and Error Analysis

    Science.gov (United States)

    Vieyra, Rebecca Elizabeth; Vieyra, Chrystian; Macchia, Stefano

    2017-02-01

    Although the advent and popularization of the "flipped classroom" tends to center around at-home video lectures, teachers are increasingly turning to at-home labs for enhanced student engagement. This paper describes two simple at-home experiments that can be accomplished in the kitchen. The first experiment analyzes the density of four liquids using a waterproof case and a smartphone barometer in a container, sink, or tub. The second experiment determines the relationship between pressure and temperature of an ideal gas in a constant volume container placed momentarily in a refrigerator freezer. These experiences provide a ripe opportunity both for learning fundamental physics concepts as well as to investigate a variety of error analysis techniques that are frequently overlooked in introductory physics courses.

  7. Error analysis of acceleration control loops of a synchrotron

    International Nuclear Information System (INIS)

    Zhang, S.Y.; Weng, W.T.

    1991-01-01

    For beam control during acceleration, it is conventional to derive the frequency from an external reference, be it a field marker or an external oscillator, to provide phase and radius feedback loops to ensure the phase stability, radial position and emittance integrity of the beam. The open and closed loop behaviors of both feedback control and their response under the possible frequency, phase and radius errors are derived from fundamental principles and equations. The stability of the loops is investigated under a wide range of variations of the gain and time delays. Actual system performance of the AGS Booster is analyzed and compared to commissioning experiences. Such analysis is useful for setting design criteria and tolerances for new proton synchrotrons. 4 refs., 13 figs

  8. The error performance analysis over cyclic redundancy check codes

    Science.gov (United States)

    Yoon, Hee B.

    1991-06-01

    Burst errors are generated in digital communication networks by various unpredictable conditions; they occur at high error rates, for short durations, and can impact services. To completely describe a burst error one has to know the bit pattern, which is impossible in practice on working systems. Therefore, under memoryless binary symmetric channel (MBSC) assumptions, performance evaluation or estimation schemes for digital signal 1 (DS1) transmission systems carrying live traffic are an interesting and important problem. This study presents some analytical methods leading to efficient algorithms for detecting burst errors using cyclic redundancy check (CRC) codes. The definition of a burst error is introduced using three different models, of which the mathematical model is used in this study. A probability density function f(b) for burst errors of length b is proposed. The performance of CRC-n codes is evaluated and analyzed using f(b) in a computer simulation model of burst errors within CRC blocks. The simulation results show that the mean block burst error tends to approach the burst-error pattern generated by random bit errors.
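
    A key property underlying CRC-based burst detection is that a CRC of width n detects every burst of length at most n. A minimal sketch, using the generic CRC-16/CCITT polynomial purely for illustration rather than the specific scheme used in DS1 framing:

    ```python
    # Demonstration that a CRC catches burst errors no longer than the CRC
    # width. CRC-16/CCITT-FALSE parameters (poly 0x1021, init 0xFFFF);
    # the payload bytes are invented for illustration.

    def crc16_ccitt(data: bytes, crc=0xFFFF):
        for byte in data:
            crc ^= byte << 8
            for _ in range(8):
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
        return crc

    msg = bytearray(b"framed DS1 payload under test")
    good = crc16_ccitt(bytes(msg))

    # Inject a 16-bit burst error: flip 16 consecutive bits across two bytes.
    msg[5] ^= 0xFF
    msg[6] ^= 0xFF
    assert crc16_ccitt(bytes(msg)) != good   # burst of length <= 16 is detected
    print("burst detected")
    ```

    Bursts longer than the CRC width can escape detection with small probability, which is why characterizing the burst-length distribution f(b), as the study does, matters for estimating real detection performance.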

  9. SPACE-BORNE LASER ALTIMETER GEOLOCATION ERROR ANALYSIS

    Directory of Open Access Journals (Sweden)

    Y. Wang

    2018-05-01

    This paper reviews the development of space-borne laser altimetry technology over the past 40 years. Taking the ICESat satellite as an example, a rigorous space-borne laser altimeter geolocation model is studied, and an error propagation equation is derived. The influence of the main error sources, such as the platform positioning error, attitude measurement error, pointing angle measurement error and range measurement error, on the geolocation accuracy of the laser spot is analysed by simulated experiments. The reasons for the different influences on geolocation accuracy in different directions are discussed, and, to satisfy the accuracy required of laser control points, a design index for each error source is put forward.
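
    The error propagation equation itself is not reproduced in this record, but the first-order idea can be sketched: each independent error source is propagated through the geolocation geometry and the contributions are combined root-sum-square. The planar near-nadir geometry and every numeric value below are invented assumptions for illustration, not ICESat design figures.

    ```python
    # Simplified first-order error-propagation sketch (planar geometry,
    # near-nadir pointing) for a space-borne laser altimeter footprint.
    # The rigorous model involves full orbit/attitude geometry; all numbers
    # here are illustrative.
    import math

    def footprint_horizontal_error(range_m, point_err_rad, range_err_m,
                                   pos_err_m, off_nadir_rad=0.0):
        """Root-sum-square of independent first-order error contributions."""
        e_pointing = range_m * point_err_rad              # angle error sweeps the spot
        e_range = range_err_m * math.sin(off_nadir_rad)   # range error projects horizontally
        return math.sqrt(e_pointing**2 + e_range**2 + pos_err_m**2)

    arcsec = math.pi / (180 * 3600)
    err = footprint_horizontal_error(
        range_m=600_000,             # ~600 km altitude (assumed)
        point_err_rad=1.5 * arcsec,  # attitude + pointing knowledge (assumed)
        range_err_m=0.1,
        pos_err_m=3.0,               # orbit position knowledge (assumed)
        off_nadir_rad=math.radians(0.3))
    print(f"{err:.2f} m")
    ```

    Running the numbers this way makes the abstract's observation concrete: at a 600 km range, the pointing-angle term dominates the horizontal budget, which is why pointing knowledge usually receives the tightest design index.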

  10. Extracting and Converting Quantitative Data into Human Error Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Tuan Q. Tran; Ronald L. Boring; Jeffrey C. Joe; Candice D. Griffith

    2007-08-01

    This paper discusses a proposed method using a combination of advanced statistical approaches (e.g., meta-analysis, regression, structural equation modeling) that will not only convert different empirical results into a common metric for scaling individual PSF effects, but will also examine the complex interrelationships among PSFs. Furthermore, the paper discusses how the derived statistical estimates (i.e., effect sizes) can be mapped onto an HRA method (e.g., SPAR-H) to generate HEPs that can then be used in probabilistic risk assessment (PRA). The paper concludes with a discussion of the benefits of using academic literature to assist HRA analysts in generating sound HEPs, and HRA developers in validating current HRA models and formulating new ones.

  11. Medication errors as malpractice-a qualitative content analysis of 585 medication errors by nurses in Sweden.

    Science.gov (United States)

    Björkstén, Karin Sparring; Bergqvist, Monica; Andersén-Karlsson, Eva; Benson, Lina; Ulfvarson, Johanna

    2016-08-24

    Many studies address the prevalence of medication errors, but few address medication errors serious enough to be regarded as malpractice. Other studies have analyzed the individual and system contributory factors leading to a medication error. Nurses have a key role in medication administration, and there are contradictory reports on nurses' work experience in relation to the risk and type of medication errors. All medication errors where a nurse was held responsible for malpractice (n = 585) during 11 years in Sweden were included. A qualitative content analysis was performed, and the errors were classified according to type and to individual and system contributory factors. In order to test for possible differences related to nurses' work experience, and for associations within and between the errors and contributory factors, Fisher's exact test was used, and Cohen's kappa (k) was performed to estimate the magnitude and direction of the associations. There were a total of 613 medication errors in the 585 cases, the most common being "Wrong dose" (41 %), "Wrong patient" (13 %) and "Omission of drug" (12 %). In 95 % of the cases, an average of 1.4 individual contributory factors was found; the most common being "Negligence, forgetfulness or lack of attentiveness" (68 %), "Proper protocol not followed" (25 %), "Lack of knowledge" (13 %) and "Practice beyond scope" (12 %). In 78 % of the cases, an average of 1.7 system contributory factors was found; the most common being "Role overload" (36 %), "Unclear communication or orders" (30 %) and "Lack of adequate access to guidelines or unclear organisational routines" (30 %). The errors "Wrong patient due to mix-up of patients" and "Wrong route" and the contributory factors "Lack of knowledge" and "Negligence, forgetfulness or lack of attentiveness" were more common in less experienced nurses. The experienced nurses were more prone to "Practice beyond scope of practice" and to make errors in spite of "Lack of adequate
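
    Cohen's kappa, one of the two association measures the study uses, is easy to sketch for a 2x2 agreement table: observed agreement is corrected for the agreement expected by chance. The counts below are invented for illustration and are not the study's data.

    ```python
    # Pure-Python Cohen's kappa for a square agreement table. Rows index one
    # classification, columns the other; counts here are invented.

    def cohens_kappa(table):
        """table[i][j]: count of cases classified i by one scheme, j by the other."""
        n = sum(sum(row) for row in table)
        p_observed = sum(table[i][i] for i in range(len(table))) / n
        row_tot = [sum(row) for row in table]
        col_tot = [sum(col) for col in zip(*table)]
        p_chance = sum(r * c for r, c in zip(row_tot, col_tot)) / n**2
        return (p_observed - p_chance) / (1 - p_chance)

    # e.g. rows: contributory factor present yes/no; cols: error type yes/no
    table = [[20, 10],
             [15, 55]]
    print(round(cohens_kappa(table), 3))
    ```

    Kappa runs from 1 (perfect agreement) through 0 (chance level) to negative values (systematic disagreement), which is what lets the study report both the magnitude and the direction of each association.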

  12. Advancing Usability Evaluation through Human Reliability Analysis

    International Nuclear Information System (INIS)

    Ronald L. Boring; David I. Gertman

    2005-01-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis to heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues
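
    The usability error probability (UEP) idea above can be illustrated with a generic SPAR-H-style calculation: a nominal error probability is scaled by the product of performance-shaping-factor multipliers, here relabeled as usability-heuristic multipliers. The heuristic names, multiplier values, and the choice of the action-task nominal HEP are illustrative assumptions; SPAR-H's own worksheets define the real values.

    ```python
    # SPAR-H-style UEP sketch: nominal HEP times the product of multipliers,
    # with the SPAR-H adjustment applied when several degrading factors
    # compound. Heuristics and multipliers below are invented.
    from math import prod

    NOMINAL_HEP = 0.001   # SPAR-H nominal value for action-type tasks

    def usability_error_probability(multipliers):
        composite = prod(multipliers)
        hep = NOMINAL_HEP * composite
        # Adjustment keeps the result a valid probability when three or more
        # negative (multiplier > 1) factors are present:
        if sum(m > 1 for m in multipliers) >= 3:
            hep = (NOMINAL_HEP * composite) / (NOMINAL_HEP * (composite - 1) + 1)
        return hep

    heuristic_multipliers = {
        "visibility of system status": 10,  # violated -> degrading multiplier
        "match with real world": 1,         # nominal
        "error prevention": 5,              # violated
        "consistency and standards": 2,     # partially violated
    }
    print(usability_error_probability(list(heuristic_multipliers.values())))
    ```

    As the paper notes, the resulting UEP is not a literal error probability, but it gives heuristic findings a common quantitative scale so they can be ranked in a consequence matrix.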

  13. The development of human factors technologies -The development of human behaviour analysis techniques-

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Heui; Park, Keun Ok; Chun, Se Woo; Suh, Sang Moon; Park, Jae Chang

    1995-07-01

    In order to contribute to human error reduction through studies on human-machine interaction in nuclear power plants, this project has the objectives of developing SACOM (Simulation Analyzer with a Cognitive Operator Model) and techniques for human error analysis and application. This year we studied the following: 1) Site investigation of operator tasks, 2) Development and revision of the operator task micro structure, 3) Development of knowledge representation software and a SACOM prototype, 4) Development of performance assessment methodologies in task simulation and analysis of the effects of performance shaping factors; and 1) Classification of error shaping factors (ESFs) and development of software for ESF evaluation, 2) Analysis of human error occurrences and revision of the analysis procedure, 3) An experiment for human error data collection using a compact nuclear simulator, 4) Development of a prototype database system for the analyzed information on trip cases. 55 figs, 23 tabs, 33 refs. (Author)

  14. Analysis of error functions in speckle shearing interferometry

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2001-01-01

    Electronic Speckle Pattern Shearing Interferometry (ESPSI), or shearography, has successfully been used in NDT for slope (∂w/∂x and/or ∂w/∂y) measurement, while strain measurement (∂u/∂x, ∂v/∂y, ∂u/∂y and ∂v/∂x) is still under investigation. This method is well accepted in industrial applications, especially in the aerospace industry, and demand for it is increasing due to the complexity of test materials and objects. ESPSI has been applied successfully in NDT only for qualitative measurement, whilst quantitative measurement is the current aim of many manufacturers. Industrial use of such equipment proceeds without considering the errors arising from numerous sources, including wavefront divergence. The majority of commercial systems are operated with diverging object illumination wavefronts, without considering the curvature of the object illumination wavefront or the object geometry when calculating the interferometer fringe function and quantifying data. This thesis reports a novel approach to quantified maximum phase change difference analysis for the derivative out-of-plane (OOP) and in-plane (IP) cases that arise from a divergent illumination wavefront compared to collimated illumination. The theoretical maximum phase difference is formulated in terms of several dependent variables: the object distance, illuminated diameter, center of the illuminated area, camera distance and illumination angle. The relative maximum phase change difference that may contribute to error in the measurement analysis within the scope of this research is defined as the difference between the maximum phase difference measured with a divergent illumination wavefront and that of a collimated illumination wavefront, taken at the edge of the illuminated area. Experimental validation using test objects for derivative out-of-plane and derivative in-plane deformation, using a single illumination wavefront

  15. The Relationship between Human Operators' Psycho-physiological Condition and Human Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Arryum; Jang, Inseok; Kang, Hyungook; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2013-05-15

    The safe operation of nuclear power plants (NPPs) is substantially dependent on the performance of the human operators who operate the systems. In this environment, human errors caused by inappropriate operator performance are considered critical, since they may lead to serious problems in safety-critical plants. In order to provide meaningful insights to prevent human errors and enhance human performance, operators' physiological conditions such as stress and workload have been investigated, and physiological measurements are considered reliable tools for assessing them. T. Q. Tran et al. and J. B. Brooking et al. pointed out that operators' workload can be assessed using eye tracking, galvanic skin response, electroencephalograms (EEGs), heart rate, respiration and other measurements. The purpose of this study is to investigate the effect of the human operators' tension level and knowledge level on the number of human errors. The experiments were conducted in a mimic of a main control room (MCR) in an NPP, using the compact nuclear simulator (CNS), which is modeled on the three-loop pressurized water reactor, 993 MWe, Kori units 3 and 4 in Korea; the subjects were asked to follow the tasks described in the emergency operating procedures (EOP). During the simulation, three kinds of physiological measurement were utilized: electrocardiogram (ECG), EEG and nose temperature. The subjects were also divided into three groups based on their knowledge of plant operation. The results show that subjects who are tense make fewer errors. In addition, subjects at a higher knowledge level tend to be tense and make fewer errors. For the ECG data, subjects who make fewer human errors tend to be located in the higher-tension area of high SNS activity and low PSNS activity. The EEG results are similar: the beta power ratio of subjects who made fewer errors was higher. Since beta

  16. Catching errors with patient-specific pretreatment machine log file analysis.

    Science.gov (United States)

    Rangaraj, Dharanipathy; Zhu, Mingyao; Yang, Deshan; Palaniswaamy, Geethpriya; Yaddanapudi, Sridhar; Wooten, Omar H; Brame, Scott; Mutic, Sasa

    2013-01-01

    A robust, efficient, and reliable quality assurance (QA) process is highly desired for modern external beam radiation therapy treatments. Here, we report the results of a semiautomatic, pretreatment, patient-specific QA process based on dynamic machine log file analysis, clinically implemented for intensity modulated radiation therapy (IMRT) treatments delivered by high energy linear accelerators (Varian 2100/2300 EX, Trilogy, iX-D, Varian Medical Systems Inc, Palo Alto, CA). The multileaf collimator (MLC) machine log files are called Dynalog by Varian. Using an in-house computer program called "Dynalog QA," we automatically compare the beam delivery parameters in the log files, which are generated during pretreatment point dose verification measurements, with the treatment plan to determine any discrepancies in IMRT delivery. Fluence maps are constructed and compared between the delivered and planned beams. Since clinical introduction in June 2009, 912 machine log file analysis QA checks were performed by the end of 2010. Among these, 14 errors causing dosimetric deviation were detected and required further investigation and intervention. These errors were the result of human operating mistakes, flawed treatment planning, and data modification during plan file transfer. Minor errors were also reported in 174 other log file analyses, some of which stemmed from false positives and unreliable results; their origins are discussed herein. It has been demonstrated that machine log file analysis is a robust, efficient, and reliable QA process capable of detecting errors originating from human mistakes, flawed planning, and data transfer problems. The possibility of detecting these errors is low using point and planar dosimetric measurements. Copyright © 2013 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  17. Safety coaches in radiology: decreasing human error and minimizing patient harm

    Energy Technology Data Exchange (ETDEWEB)

    Dickerson, Julie M.; Adams, Janet M. [Cincinnati Children's Hospital Medical Center, Department of Radiology, MLC 5031, Cincinnati, OH (United States); Koch, Bernadette L.; Donnelly, Lane F. [Cincinnati Children's Hospital Medical Center, Department of Radiology, MLC 5031, Cincinnati, OH (United States); Cincinnati Children's Hospital Medical Center, Department of Pediatrics, Cincinnati, OH (United States); Goodfriend, Martha A. [Cincinnati Children's Hospital Medical Center, Department of Quality Improvement, Cincinnati, OH (United States)

    2010-09-15

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program. (orig.)

  18. Safety coaches in radiology: decreasing human error and minimizing patient harm

    International Nuclear Information System (INIS)

    Dickerson, Julie M.; Adams, Janet M.; Koch, Bernadette L.; Donnelly, Lane F.; Goodfriend, Martha A.

    2010-01-01

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program. (orig.)

  19. Safety coaches in radiology: decreasing human error and minimizing patient harm.

    Science.gov (United States)

    Dickerson, Julie M; Koch, Bernadette L; Adams, Janet M; Goodfriend, Martha A; Donnelly, Lane F

    2010-09-01

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program.

  20. Analysis of error-correction constraints in an optical disk

    Science.gov (United States)

    Roberts, Jonathan D.; Ryley, Alan; Jones, David M.; Burke, David

    1996-07-01

    The compact disk read-only memory (CD-ROM) is a mature storage medium with complex error control. It comprises four levels of Reed Solomon codes allied to a sequence of sophisticated interleaving strategies and 8:14 modulation coding. New storage media are being developed and introduced that place still further demands on signal processing for error correction. It is therefore appropriate to explore thoroughly the limit of existing strategies to assess future requirements. We describe a simulation of all stages of the CD-ROM coding, modulation, and decoding. The results of decoding the burst error of a prescribed number of modulation bits are discussed in detail. Measures of residual uncorrected error within a sector are displayed by C1, C2, P, and Q error counts and by the status of the final cyclic redundancy check (CRC). Where each data sector is encoded separately, it is shown that error-correction performance against burst errors depends critically on the position of the burst within a sector. The C1 error measures the burst length, whereas C2 errors reflect the burst position. The performance of Reed Solomon product codes is shown by the P and Q statistics. It is shown that synchronization loss is critical near the limits of error correction. An example is given of miscorrection that is identified by the CRC check.
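    The abstract's finding that burst-error performance depends on interleaving and on the burst's position can be illustrated with a minimal block interleaver. This is a generic sketch under assumed dimensions, not the CD-ROM's actual cross-interleaved Reed Solomon (CIRC) scheme:

```python
# Minimal illustration of block interleaving (NOT the actual CD-ROM
# CIRC scheme): a burst of b consecutive channel errors is spread so
# that each codeword sees at most ceil(b / depth) of them.

def interleave(data, depth):
    """Write row-by-row into a depth x width array, read column-by-column."""
    width = len(data) // depth
    rows = [data[i * width:(i + 1) * width] for i in range(depth)]
    return [rows[r][c] for c in range(width) for r in range(depth)]

def deinterleave(data, depth):
    """Inverse of interleave: read the columns back into rows."""
    width = len(data) // depth
    cols = [data[i * depth:(i + 1) * depth] for i in range(width)]
    return [cols[c][r] for r in range(depth) for c in range(width)]

# Four "codewords" of 8 symbols each, interleaved to depth 4.
depth, width = 4, 8
stream = interleave(list(range(32)), depth)

# A burst corrupting 4 consecutive channel symbols...
for i in range(10, 14):
    stream[i] = -1

# ...lands as exactly one error per codeword after deinterleaving,
# which a modest Reed Solomon code can correct.
received = deinterleave(stream, depth)
codewords = [received[i * width:(i + 1) * width] for i in range(depth)]
errors_per_codeword = [cw.count(-1) for cw in codewords]
print(errors_per_codeword)
```

    Shifting the burst start changes which symbol of each codeword is hit, which is the positional dependence the paper measures through its C1/C2 statistics.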

  1. Accounting for measurement error in human life history trade-offs using structural equation modeling.

    Science.gov (United States)

    Helle, Samuli

    2018-03-01

    Revealing causal effects from correlative data is very challenging and a contemporary problem in human life history research owing to the lack of experimental approach. Problems with causal inference arising from measurement error in independent variables, whether related either to inaccurate measurement technique or validity of measurements, seem not well-known in this field. The aim of this study is to show how structural equation modeling (SEM) with latent variables can be applied to account for measurement error in independent variables when the researcher has recorded several indicators of a hypothesized latent construct. As a simple example of this approach, measurement error in lifetime allocation of resources to reproduction in Finnish preindustrial women is modelled in the context of the survival cost of reproduction. In humans, lifetime energetic resources allocated in reproduction are almost impossible to quantify with precision and, thus, typically used measures of lifetime reproductive effort (e.g., lifetime reproductive success and parity) are likely to be plagued by measurement error. These results are contrasted with those obtained from a traditional regression approach where the single best proxy of lifetime reproductive effort available in the data is used for inference. As expected, the inability to account for measurement error in women's lifetime reproductive effort resulted in the underestimation of its underlying effect size on post-reproductive survival. This article emphasizes the advantages that the SEM framework can provide in handling measurement error via multiple-indicator latent variables in human life history studies. © 2017 Wiley Periodicals, Inc.
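    The underestimation the abstract reports is the classical regression-dilution (attenuation) effect, which a short simulation can reproduce. The effect size, sample size, and noise variances below are illustrative assumptions, not values from the Finnish data:

```python
import numpy as np

# Simulation of regression-dilution bias: measurement error in an
# independent variable shrinks the estimated slope toward zero,
# mirroring the underestimation the abstract describes. All numbers
# here are illustrative, not taken from the preindustrial data set.
rng = np.random.default_rng(0)
n = 100_000
true_effort = rng.normal(0.0, 1.0, n)            # latent construct
outcome = 0.5 * true_effort + rng.normal(0.0, 1.0, n)

noisy_effort = true_effort + rng.normal(0.0, 1.0, n)  # observed proxy

def slope(x, y):
    """OLS slope of y on x."""
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

b_true = slope(true_effort, outcome)    # close to the true 0.5
b_noisy = slope(noisy_effort, outcome)  # attenuated toward zero by the
                                        # reliability var(x)/(var(x)+var(e)) = 0.5
print(round(b_true, 2), round(b_noisy, 2))
```

    SEM with multiple indicators per latent construct recovers something close to the undiluted slope, which is the advantage the article argues for.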

  2. Team errors: definition and taxonomy

    International Nuclear Information System (INIS)

    Sasou, Kunihide; Reason, James

    1999-01-01

    In error analysis and error management, the focus is usually upon individuals who have made errors. In large complex systems, however, most people work in teams or groups. Considering this working environment, insufficient emphasis has been given to 'team errors'. This paper discusses the definition of team errors and their taxonomy. These notions are also applied to events that have occurred in the nuclear power, aviation and shipping industries. The paper also discusses the relations between team errors and Performance Shaping Factors (PSFs). As a result, the proposed definition and taxonomy are found to be useful in categorizing team errors. The analysis also reveals that deficiencies in communication and resource/task management, excessive authority gradient, and excessive professional courtesy cause team errors. Handling human errors as team errors provides an opportunity to reduce human errors

  3. A study on fatigue measurement of operators for human error prevention in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Ju, Oh Yeon; Il, Jang Tong; Meiling, Luo; Hee, Lee Young [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    The identification and analysis of individual factors of operators, which are among the various causes of adverse effects on human performance, is not easy in NPPs. Individual factors for operators include work type (including shift work), environment, personality, qualification, training, education, cognition, fatigue, job stress, workload, etc. Research at the Finnish Institute of Occupational Health (FIOH) reported that 'burnout (extreme fatigue)' is related to alcohol-dependent habits and must be dealt with using a stress management program. The USNRC (U.S. Nuclear Regulatory Commission) developed FFD (Fitness for Duty) requirements for improving task efficiency and preventing human errors. 'Managing Fatigue' in 10CFR26 presents requirements to control operator fatigue in NPPs. The committee explained that excessive fatigue is due to stressful work environments, working hours, shifts, sleep disorders, and unstable circadian rhythms. In addition, the International Labour Organization (ILO) developed and suggested a checklist to manage fatigue and job stress. In Korea, a systematic evaluation is presented in the Final Safety Analysis Report (FSAR) chapter 18, Human Factors, in the licensing process; however, it focuses mostly on interface design such as the HMI (Human Machine Interface), not on individual factors. In particular, because Korea is in the process of exporting the NPP to the UAE, the development of a fatigue management technique is important and urgent in order to present technical standards and FFD criteria to the UAE. It is also anticipated that the domestic regulatory body will apply the FFD program as a regulatory requirement, so preparation for that situation is required. In this paper, previous research is surveyed to find fatigue measurement and evaluation methods for operators in high-reliability industries. This study also reviews the NRC report and discusses the causal factors and

  4. A study on fatigue measurement of operators for human error prevention in NPPs

    International Nuclear Information System (INIS)

    Ju, Oh Yeon; Il, Jang Tong; Meiling, Luo; Hee, Lee Young

    2012-01-01

    The identification and analysis of individual factors of operators, which are among the various causes of adverse effects on human performance, is not easy in NPPs. Individual factors for operators include work type (including shift work), environment, personality, qualification, training, education, cognition, fatigue, job stress, workload, etc. Research at the Finnish Institute of Occupational Health (FIOH) reported that 'burnout (extreme fatigue)' is related to alcohol-dependent habits and must be dealt with using a stress management program. The USNRC (U.S. Nuclear Regulatory Commission) developed FFD (Fitness for Duty) requirements for improving task efficiency and preventing human errors. 'Managing Fatigue' in 10CFR26 presents requirements to control operator fatigue in NPPs. The committee explained that excessive fatigue is due to stressful work environments, working hours, shifts, sleep disorders, and unstable circadian rhythms. In addition, the International Labour Organization (ILO) developed and suggested a checklist to manage fatigue and job stress. In Korea, a systematic evaluation is presented in the Final Safety Analysis Report (FSAR) chapter 18, Human Factors, in the licensing process; however, it focuses mostly on interface design such as the HMI (Human Machine Interface), not on individual factors. In particular, because Korea is in the process of exporting the NPP to the UAE, the development of a fatigue management technique is important and urgent in order to present technical standards and FFD criteria to the UAE. It is also anticipated that the domestic regulatory body will apply the FFD program as a regulatory requirement, so preparation for that situation is required. In this paper, previous research is surveyed to find fatigue measurement and evaluation methods for operators in high-reliability industries. This study also reviews the NRC report and discusses the causal factors and management

  5. Basic design of multimedia system for the representation of human error cases in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Park, Geun Ok [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have developed a multimedia system for the representation of human error cases so that education and training on human errors can be done effectively. The following were the major topics during the basic design: 1. Establishment of a basic concept for representing human error cases using multimedia; 2. Establishment of a design procedure for the multimedia system; 3. Establishment of a hardware and software environment for operating the multimedia system; 4. Design of multimedia input and output interfaces. In order to verify the results of this basic design, we implemented it for an incident triggered by an operator's misaction which occurred at Uljin NPP Unit 1. (Author) 12 refs., 30 figs.

  6. Generalization error analysis: deep convolutional neural network in mammography

    Science.gov (United States)

    Richter, Caleb D.; Samala, Ravi K.; Chan, Heang-Ping; Hadjiiski, Lubomir; Cha, Kenny

    2018-02-01

    We conducted a study to gain understanding of the generalizability of deep convolutional neural networks (DCNNs) given their inherent capability to memorize data. We examined empirically a specific DCNN trained for classification of masses on mammograms. Using a data set of 2,454 lesions from 2,242 mammographic views, a DCNN was trained to classify masses into malignant and benign classes using transfer learning from ImageNet LSVRC-2010. We performed experiments with varying amounts of label corruption and types of pixel randomization to analyze the generalization error for the DCNN. Performance was evaluated using the area under the receiver operating characteristic curve (AUC) with an N-fold cross validation. Comparisons were made between the convergence times, the inference AUCs for both the training set and the test set of the original image patches without corruption, and the root-mean-squared difference (RMSD) in the layer weights of the DCNN trained with different amounts and methods of corruption. Our experiments observed trends which revealed that the DCNN overfitted by memorizing corrupted data. More importantly, this study improved our understanding of DCNN weight updates when learning new patterns or new labels. Although we used a specific classification task with the ImageNet as example, similar methods may be useful for analysis of the DCNN learning processes, especially those that employ transfer learning for medical image analysis where sample size is limited and overfitting risk is high.

  7. A Conceptual Framework for Predicting Error in Complex Human-Machine Environments

    Science.gov (United States)

    Freed, Michael; Remington, Roger; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    We present a Goals, Operators, Methods, and Selection Rules-Model Human Processor (GOMS-MHP) style model-based approach to the problem of predicting human habit capture errors. Habit captures occur when the model fails to allocate limited cognitive resources to retrieve task-relevant information from memory. Lacking the unretrieved information, decision mechanisms act in accordance with implicit default assumptions, resulting in error when relied upon assumptions prove incorrect. The model helps interface designers identify situations in which such failures are especially likely.

  8. Analysis of Students' Errors on Linear Programming at Secondary ...

    African Journals Online (AJOL)

    The purpose of this study was to identify secondary school students' errors on linear programming at 'O' level. It is based on the fact that students' errors inform teaching hence an essential tool for any serious mathematics teacher who intends to improve mathematics teaching. The study was guided by a descriptive survey ...

  9. THE PRACTICAL ANALYSIS OF FINITE ELEMENTS METHOD ERRORS

    Directory of Open Access Journals (Sweden)

    Natalia Bakhova

    2011-03-01

    The most important practical questions of reliably estimating finite element method errors are considered. Rules for determining the necessary calculation accuracy are developed. Methods and approaches are offered that allow the best final results to be obtained at an economical expenditure of computing work. Keywords: error, given accuracy, finite element method, Lagrangian and Hermitian elements.

  10. Evaluation and Error Analysis for a Solar Thermal Receiver

    International Nuclear Information System (INIS)

    Pfander, M.

    2001-01-01

    In the following study a complete balance over the REFOS receiver module, mounted on the tower power plant CESA-1 at the Plataforma Solar de Almeria (PSA), is carried out. Additionally, an error inspection of the various measurement techniques used in the REFOS project is made. In particular, the flux measurement system Prohermes, which is used to determine the total entry power of the receiver module and is known as a major error source, is analysed in detail. Simulations and experiments on the particular instruments are used to determine and quantify possible error sources. After discovering the origin of the errors, they are reduced and included in the error calculation. The ultimate result is presented as an overall efficiency of the receiver module in dependence on the flux density at the receiver module's entry plane and the receiver operating temperature. (Author) 26 refs

  11. Evaluation and Error Analysis for a Solar thermal Receiver

    Energy Technology Data Exchange (ETDEWEB)

    Pfander, M.

    2001-07-01

    In the following study a complete balance over the REFOS receiver module, mounted on the tower power plant CESA-1 at the Plataforma Solar de Almeria (PSA), is carried out. Additionally, an error inspection of the various measurement techniques used in the REFOS project is made. In particular, the flux measurement system Prohermes, which is used to determine the total entry power of the receiver module and is known as a major error source, is analysed in detail. Simulations and experiments on the particular instruments are used to determine and quantify possible error sources. After discovering the origin of the errors, they are reduced and included in the error calculation. The ultimate result is presented as an overall efficiency of the receiver module in dependence on the flux density at the receiver module's entry plane and the receiver operating temperature. (Author) 26 refs.

  12. Analysis for Human-related Events during the Overhaul

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Tae; Kim, Min Chull; Choi, Dong Won; Lee, Durk Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2011-10-15

    Since 2008, the frequency of events due to human error has been decreasing among the 20 operating Nuclear Power Plants (NPPs), excluding the NPP in the commissioning stage (Shin-Kori unit 1). However, events due to human error during overhaul (O/H) still occur annually (see Table I). An analysis of human-related events during O/H was performed. From the analysis, similar problems were identified for each event, and organizational and safety-culture factors were also identified

  13. A system dynamic simulation model for managing the human error in power tools industries

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2017-10-01

    In the modern and competitive life of today, every organization faces situations in which work does not proceed as planned because problems occur that cause delays. Human error is often cited as the culprit. Errors made by employees force them to spend additional time identifying and checking for the error, which in turn can affect the normal operations of the company as well as the company's reputation. Employees are a key element of the organization in running all of its activities; hence, employee work performance is a crucial factor in organizational success. The purpose of this study is to identify the factors that cause increasing errors made by employees in the organization by using a system dynamics approach. The target of this study is employees of the Regional Material Field team in the purchasing department of the power tools industry. Questionnaires were distributed to the respondents to obtain their perceptions of the root causes of errors made by employees in the company. A system dynamics model was developed to simulate the factors behind the increasing errors made by employees and their impact. The findings of this study show that the increase in errors made by employees was generally caused by workload, work capacity, job stress, motivation, and employee performance. This problem could be mitigated by increasing the number of employees in the organization.
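    The kind of feedback loop such a system dynamics model captures can be sketched with a toy Euler-integrated stock-flow simulation: errors create rework, rework raises workload, workload raises stress, and stress raises the error rate. The structure and every coefficient below are illustrative assumptions, not the authors' model:

```python
# Toy system-dynamics sketch of the workload/stress/error feedback the
# abstract describes. Structure and coefficients are invented for
# illustration; they are not the paper's calibrated model.
def simulate(weeks=52, staff=10, dt=1.0):
    backlog = 100.0                             # stock: outstanding tasks
    history = []
    for _ in range(weeks):
        capacity = staff * 10.0                 # tasks per week
        workload = backlog / capacity           # dimensionless pressure
        stress = min(1.0, 0.5 * workload)       # saturates at 1
        error_rate = 0.02 + 0.10 * stress       # errors per task
        completed = min(backlog, capacity)
        rework = completed * error_rate         # failed tasks come back
        new_tasks = 95.0                        # constant demand
        backlog += dt * (new_tasks + rework - completed)
        history.append(backlog)
    return history

low_staff = simulate(staff=10)
high_staff = simulate(staff=12)
# More staff lowers workload, hence stress and rework, so the backlog
# stabilizes instead of growing through the reinforcing loop.
print(round(low_staff[-1], 1), round(high_staff[-1], 1))
```

    The reinforcing loop is the point: with too few staff, rework alone pushes inflow above capacity and the backlog diverges, which matches the study's conclusion about increasing headcount.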

  14. The current approach to human error and blame in the NHS.

    Science.gov (United States)

    Ottewill, Melanie

    There is a large body of research to suggest that serious errors are widespread throughout medicine. The traditional response to these adverse events has been to adopt a 'person approach' - blaming the individual seen as 'responsible'. The culture of medicine is highly complicit in this response. Such an approach results in enormous personal costs to the individuals concerned and does little to address the root causes of errors and thus prevent their recurrence. Other industries, such as aviation, where safety is a paramount concern and which have similar structures to the medical profession, have, over the past decade or so, adopted a 'systems' approach to error, recognizing that human error is ubiquitous and inevitable and that systems need to be developed with this in mind. This approach has been highly successful, but has necessitated, first and foremost, a cultural shift. It is in the best interests of patients, and medical professionals alike, that such a shift is embraced in the NHS.

  15. Spectrogram Image Analysis of Error Signals for Minimizing Impulse Noise

    Directory of Open Access Journals (Sweden)

    Jeakwan Kim

    2016-01-01

    This paper presents a theoretical and experimental study on the spectrogram image analysis of error signals for minimizing impulse input noises in the active suppression of noise. Impulse inputs with specific wave patterns are applied as primary noises to a one-dimensional duct with a length of 1800 mm. The convergence speed of the adaptive feedforward algorithm, based on the least mean square approach, was controlled by a normalized step size incorporated into the algorithm. Variations of the step size govern the stability as well as the convergence speed; for this reason, a normalized step size is introduced as a new method for the control of impulse noise. Spectrogram images, which indicate the degree of attenuation of the impulse input noises, are used to represent the attenuation achieved with the new method. The algorithm is extensively investigated in both simulation and real-time control experiments. It is demonstrated that the suggested algorithm works with good stability and performance against impulse noises. The results of this study can be used in practical active noise control systems.
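    The stabilizing role of the normalized step size can be sketched with a plain normalized-LMS cancellation loop. This is a generic adaptive-noise-cancellation toy, not the paper's duct ANC setup (no secondary-path model, no spectrogram analysis); the filter length, step size, and signals are all invented:

```python
import numpy as np

# Sketch of LMS with a normalized step size (NLMS). Dividing the
# update by the instantaneous input power bounds the effective step,
# so an impulsive sample in the reference cannot destabilize the
# adaptation -- the property the paper exploits for impulse noise.
rng = np.random.default_rng(1)
taps = 8
primary = np.zeros(taps)                   # unknown path to identify
primary[2] = 1.0                           # a pure delay, for simplicity
w = np.zeros(taps)                         # adaptive filter weights
mu, eps = 0.5, 1e-8                        # normalized step size, guard

x = rng.normal(0, 1, 5000)                 # reference signal
x[1000] = 50.0                             # an impulse in the reference
errors = []
for n in range(taps, len(x)):
    xn = x[n - taps:n][::-1]               # reference tap vector
    d = primary @ xn                       # "noise" at the error sensor
    e = d - w @ xn                         # residual after cancellation
    w += mu * e * xn / (xn @ xn + eps)     # power-normalized update
    errors.append(e * e)

# Residual power falls by orders of magnitude despite the impulse.
print(float(np.mean(errors[:200])) > float(np.mean(errors[-200:])))
```

    With a fixed (unnormalized) step size, the same impulse would multiply into the weight update directly and could blow the filter up; the normalization is what keeps stability and convergence speed decoupled.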

  16. ERM model analysis for adaptation to hydrological model errors

    Science.gov (United States)

    Baymani-Nezhad, M.; Han, D.

    2018-05-01

    Hydrological conditions change continuously, and these phenomena introduce errors into flood forecasting models, leading to unrealistic results. To overcome these difficulties, a concept called model updating has been proposed in hydrological studies. Real-time model updating is one of the challenging processes in the hydrological sciences and has not been entirely solved due to lack of knowledge about the future state of the catchment under study. In the flood forecasting process, errors propagated from the rainfall-runoff model are considered the main source of uncertainty in the forecasting model. Hence, to mitigate these errors, several methods have been proposed to update rainfall-runoff models, such as parameter updating, model state updating, and correction of input data. The current study investigates the ability of rainfall-runoff model parameters to cope with three types of error common in hydrological modelling: timing, shape, and volume errors. The new lumped model, the ERM model, was selected for this study, and its parameters were evaluated for use in model updating to cope with the stated errors. Investigation of ten events shows that the ERM model parameters can be updated to cope with the errors without the need to recalibrate the model.
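    The parameter-updating idea, applied to the simplest of the three error types (volume), can be sketched with a one-parameter runoff model re-fitted against newly observed flows instead of being fully recalibrated. The ERM model itself is far more elaborate; the model form, coefficients, and data below are invented for illustration:

```python
import numpy as np

# Toy illustration of parameter updating to remove a volume error:
# a one-parameter runoff model q = c * r is re-fitted to observed
# flows from a new event. This only shows the updating idea, not the
# ERM model's structure.
rng = np.random.default_rng(2)
rain = rng.gamma(2.0, 5.0, 100)                # synthetic rainfall, mm/step
observed = 0.35 * rain + rng.normal(0, 0.5, 100)  # synthetic observed flow

c_old = 0.25                                   # pre-event calibration
volume_error = observed.sum() - (c_old * rain).sum()

# Least-squares update of the single parameter from the new event,
# regression through the origin: c = (r . q) / (r . r).
c_new = (rain @ observed) / (rain @ rain)
residual_volume = observed.sum() - (c_new * rain).sum()

# The updated parameter removes most of the volume bias.
print(round(volume_error, 1), round(residual_volume, 1))
```

    Timing and shape errors need updates to routing or storage parameters rather than a gain, which is why the study examines each parameter's ability to absorb each error type separately.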

  17. Effective training based on the cause analysis of operation errors

    International Nuclear Information System (INIS)

    Fujita, Eimitsu; Noji, Kunio; Kobayashi, Akira.

    1991-01-01

    The authors have investigated typical error types through their training experience and analyzed their causes. Error types observed in simulator training are: (1) lack of knowledge, or inability to apply it to actual operation; (2) defective mastery of skill-based operation; (3) rote operation or stereotyped manner; (4) mind-set or lack of redundant verification; (5) lack of teamwork; (6) misjudgement of overall plant conditions by the operation chief, who directs a reactor operator and a turbine operator in the training. The paper describes training methods used by BWR utilities in Japan to overcome these error types

  18. Standardizing the practice of human reliability analysis

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    The practice of human reliability analysis (HRA) within the nuclear industry varies greatly in terms of posited mechanisms that shape human performance, methods of characterizing and analytically modeling human behavior, and the techniques that are employed to estimate the frequency with which human error occurs. This variation has been a source of contention among HRA practitioners regarding the validity of results obtained from different HRA methods. It has also resulted in attempts to develop standard methods and procedures for conducting HRAs. For many of the same reasons, the practice of HRA has not been standardized or has been standardized only to the extent that individual analysts have developed heuristics and consistent approaches in their practice of HRA. From the standpoint of consumers and regulators, this has resulted in a lack of clear acceptance criteria for the assumptions, modeling, and quantification of human errors in probabilistic risk assessments

  19. An analysis of errors of commission in the Nordic nuclear power plants based on plant operating experience

    International Nuclear Information System (INIS)

    Pyy, P.; Bento, J.P.; Flodin, Y.

    2001-12-01

    The report presents the methodology followed, the material used, and the conclusions drawn in a study of active human failures. First, the report discusses the concept of active human failures in the context of human errors. A simplified methodology applicable to the analysis of operating experience and to documenting all kinds of human failures is then presented. The material and analysis procedure used in the three parts of the study are also discussed. Finally, selected highlights of the results are presented, together with common conclusions and recommendations. (au)

  20. How we learn to make decisions: rapid propagation of reinforcement learning prediction errors in humans.

    Science.gov (United States)

    Krigolson, Olav E; Hassall, Cameron D; Handy, Todd C

    2014-03-01

    Our ability to make decisions is predicated upon our knowledge of the outcomes of the actions available to us. Reinforcement learning theory posits that actions followed by a reward or punishment acquire value through the computation of prediction errors-discrepancies between the predicted and the actual reward. A multitude of neuroimaging studies have demonstrated that rewards and punishments evoke neural responses that appear to reflect reinforcement learning prediction errors [e.g., Krigolson, O. E., Pierce, L. J., Holroyd, C. B., & Tanaka, J. W. Learning to become an expert: Reinforcement learning and the acquisition of perceptual expertise. Journal of Cognitive Neuroscience, 21, 1833-1840, 2009; Bayer, H. M., & Glimcher, P. W. Midbrain dopamine neurons encode a quantitative reward prediction error signal. Neuron, 47, 129-141, 2005; O'Doherty, J. P. Reward representations and reward-related learning in the human brain: Insights from neuroimaging. Current Opinion in Neurobiology, 14, 769-776, 2004; Holroyd, C. B., & Coles, M. G. H. The neural basis of human error processing: Reinforcement learning, dopamine, and the error-related negativity. Psychological Review, 109, 679-709, 2002]. Here, we used the brain ERP technique to demonstrate that not only do rewards elicit a neural response akin to a prediction error but also that this signal rapidly diminished and propagated to the time of choice presentation with learning. Specifically, in a simple, learnable gambling task, we show that novel rewards elicited a feedback error-related negativity that rapidly decreased in amplitude with learning. Furthermore, we demonstrate the existence of a reward positivity at choice presentation, a previously unreported ERP component that has a similar timing and topography as the feedback error-related negativity that increased in amplitude with learning. The pattern of results we observed mirrored the output of a computational model that we implemented to compute reward
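    The diminishing feedback signal and growing choice-time signal the authors report follow directly from a simple prediction-error update. A minimal Rescorla-Wagner-style sketch, with an arbitrary learning rate and trial count:

```python
# Rescorla-Wagner sketch of the pattern the ERP data mirror: as the
# value of a rewarded choice is learned, the prediction error at
# feedback shrinks while the prediction available at choice
# presentation grows. Learning rate and trial count are arbitrary.
alpha, reward = 0.2, 1.0
value = 0.0                               # predicted value of the option
feedback_pe, choice_value = [], []
for trial in range(30):
    choice_value.append(value)            # signal at choice onset
    pe = reward - value                   # prediction error at feedback
    feedback_pe.append(pe)
    value += alpha * pe                   # learning update

# Feedback-locked error decays toward 0; choice-locked value grows to 1,
# the "propagation" of the learning signal described in the abstract.
print(round(feedback_pe[0], 2), round(feedback_pe[-1], 2))
print(round(choice_value[0], 2), round(choice_value[-1], 2))
```

    The opposite amplitude trends of the feedback error-related negativity and the reward positivity are exactly these two traces read at two time points in the trial.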

  1. The application of human error prevention tool in Tianwan nuclear power station

    International Nuclear Information System (INIS)

    Qiao Zhiguo

    2013-01-01

    This paper mainly discusses the application and popularization of human error prevention tool in Tianwan nuclear power station, including the study on project implementation background, main contents and innovation, performance management, innovation practice and development, and performance of innovation application. (authors)

  2. Taking human error into account in the design of nuclear reactor centres

    International Nuclear Information System (INIS)

    Prouillac; Lerat; Janoir.

    1982-05-01

    The role of the operator in the centralized management of pressurized water reactors is studied. Different types of human error likely to arise, the means of their prevention and methods of mitigating their consequences are presented. Some possible improvements are outlined

  3. The human fallibility of scientists : Dealing with error and bias in academic research

    NARCIS (Netherlands)

    Veldkamp, Coosje

    2017-01-01

    Recent studies have highlighted that not all published findings in the scientific literature are trustworthy, suggesting that currently implemented control mechanisms such as high standards for the reporting of

  4. US-LHC IR magnet error analysis and compensation

    International Nuclear Information System (INIS)

    Wei, J.; Ptitsin, V.; Pilat, F.; Tepikian, S.; Gelfand, N.; Wan, W.; Holt, J.

    1998-01-01

    This paper studies the impact of the insertion-region (IR) magnet field errors on LHC collision performance. Compensation schemes including magnet orientation optimization, body-end compensation, tuning shims, and local nonlinear correction are shown to be highly effective

  5. Analysis of error in Monte Carlo transport calculations

    International Nuclear Information System (INIS)

    Booth, T.E.

    1979-01-01

    The Monte Carlo method for neutron transport calculations suffers, in part, because of the inherent statistical errors associated with the method. Without an estimate of these errors in advance of the calculation, it is difficult to decide what estimator and biasing scheme to use. Recently, integral equations have been derived that, when solved, predicted errors in Monte Carlo calculations in nonmultiplying media. The present work allows error prediction in nonanalog Monte Carlo calculations of multiplying systems, even when supercritical. Nonanalog techniques such as biased kernels, particle splitting, and Russian Roulette are incorporated. Equations derived here allow prediction of how much a specific variance reduction technique reduces the number of histories required, to be weighed against the change in time required for calculation of each history. 1 figure, 1 table
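The inherent statistical error of a Monte Carlo estimate can be quantified from the sample variance of the tally; a minimal sketch with a toy integrand (not the transport calculations of the paper):

```python
import math
import random

def mc_estimate(f, n, seed=0):
    """Monte Carlo estimate of the integral of f over [0, 1] together with
    its one-sigma statistical error, computed from the sample variance."""
    rng = random.Random(seed)
    samples = [f(rng.random()) for _ in range(n)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    std_err = math.sqrt(var / n)   # statistical error shrinks like 1/sqrt(n)
    return mean, std_err

mean, err = mc_estimate(lambda x: x * x, 10_000)   # exact integral is 1/3
```

Variance-reduction techniques such as splitting or Russian roulette change the per-history cost and variance, which is exactly the trade-off the equations in the paper are meant to predict in advance.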

  6. Error Analysis for Fourier Methods for Option Pricing

    KAUST Repository

Häppölä, Juho

    2016-01-01

    We provide a bound for the error committed when using a Fourier method to price European options when the underlying follows an exponential Levy dynamic. The price of the option is described by a partial integro-differential equation (PIDE

  7. Human error data collection as a precursor to the development of a human reliability assessment capability in air traffic management

    International Nuclear Information System (INIS)

    Kirwan, Barry; Gibson, W. Huw; Hickling, Brian

    2008-01-01

    Quantified risk and safety assessments are now required for safety cases for European air traffic management (ATM) services. Since ATM is highly human-dependent for its safety, this suggests a need for formal human reliability assessment (HRA), as carried out in other industries such as nuclear power. Since the fundamental aspect of HRA is human error data, in the form of human error probabilities (HEPs), it was decided to take a first step towards development of an ATM HRA approach by deriving some HEPs in an ATM context. This paper reports a study, which collected HEPs via analysing the results of a real-time simulation involving air traffic controllers (ATCOs) and pilots, with a focus on communication errors. This study did indeed derive HEPs that were found to be concordant with other known communication human error data. This is a first step, and shows promise for HRA in ATM, since HEPs have been derived which could be used in safety assessments, although these HEPs are for only one (albeit critical) aspect of ATCOs' tasks (communications). The paper discusses options and potential ways forward for the development of a full HRA capability in ATM
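At their simplest, HEPs of the kind derived in such studies are observed error counts divided by opportunity counts; a sketch with a normal-approximation confidence interval (the counts below are hypothetical, not data from the study):

```python
import math

def hep_with_ci(errors, opportunities, z=1.96):
    """Point estimate of a human error probability plus a normal-approximation
    95% confidence interval (hypothetical counts, not the study's data)."""
    p = errors / opportunities
    half = z * math.sqrt(p * (1 - p) / opportunities)
    return p, (max(0.0, p - half), p + half)

# e.g. 12 communication errors observed in 2000 transmissions (hypothetical)
hep, (lo, hi) = hep_with_ci(12, 2000)
```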

  8. Error Analysis of Inertial Navigation Systems Using Test Algorithms

    OpenAIRE

    Vaispacher, Tomáš; Bréda, Róbert; Adamčík, František

    2015-01-01

The content of this contribution is the issue of inertial sensor errors, the specification of inertial measurement units, and the generation of test signals for an Inertial Navigation System (INS). Given the different levels of navigation tasks, part of this contribution is a comparison of actual types of Inertial Measurement Units. Based on this comparison, a way of treating inertial sensor errors and modelling them for low-cost inertial navigation applications is proposed. The last part is ...

  9. Error Analysis of Variations on Larsen's Benchmark Problem

    International Nuclear Information System (INIS)

    Azmy, YY

    2001-01-01

Error norms for three variants of Larsen's benchmark problem are evaluated using three numerical methods for solving the discrete ordinates approximation of the neutron transport equation in multidimensional Cartesian geometry. The three variants of Larsen's test problem are concerned with the incoming flux boundary conditions: unit incoming flux on the left and bottom edges (Larsen's configuration); unit incoming flux only on the left edge; unit incoming flux only on the bottom edge. The three methods considered are the Diamond Difference (DD) method and the constant-approximation versions of the Arbitrarily High Order Transport method of the Nodal (AHOT-N) and Characteristic (AHOT-C) types. The cell-wise error is computed as the difference between the cell-averaged flux computed by each method and the exact value; then the L1, L2, and L∞ error norms are calculated. The results of this study demonstrate that while the integral error norms, i.e. L1 and L2, converge to zero with mesh refinement, the pointwise L∞ norm does not, due to solution discontinuity across the singular characteristic. Little difference is observed between the error-norm behavior of the three methods, in spite of the fact that AHOT-C is locally exact, suggesting that numerical diffusion across the singular characteristic is the major source of error on the global scale. However, AHOT-C possesses a given accuracy in a larger fraction of computational cells than DD
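The three error norms are computed from the cell-wise differences between the computed and exact cell-averaged fields; a sketch on toy data (not the actual benchmark fluxes):

```python
def error_norms(computed, exact):
    """L1, L2, and L-infinity norms of the cell-wise error between a
    computed cell-averaged field and the exact one (toy data, not the
    benchmark fluxes of the paper)."""
    errs = [abs(c - e) for c, e in zip(computed, exact)]
    n = len(errs)
    l1 = sum(errs) / n                         # integral (averaged) norm
    l2 = (sum(e * e for e in errs) / n) ** 0.5  # integral (averaged) norm
    linf = max(errs)                           # pointwise norm
    return l1, l2, linf

l1, l2, linf = error_norms([1.0, 0.9, 0.5], [1.0, 1.0, 1.0])
```

A single large cell error (here 0.5, standing in for the discontinuity across the singular characteristic) pins L∞ even while the averaged L1 and L2 norms shrink with refinement.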

  10. WORKING MEMORY STRUCTURE REVEALED IN ANALYSIS OF RECALL ERRORS

    Directory of Open Access Journals (Sweden)

    Regina V Ershova

    2017-12-01

We analyzed working memory errors stemming from 193 Russian college students taking the Tarnow Unchunkable Test utilizing double-digit items on a visual display. In three-item trials with at most one error per trial, single incorrect tens and ones digits (“singlets”) were overrepresented and made up the majority of errors, indicating a base-10 organization. These errors indicate that there are separate memory maps for each position and that there are pointers that can move primarily within these maps. Several pointers make up a pointer collection. The number of pointer collections possible is the working memory capacity limit. A model for self-organizing maps is constructed in which the organization is created by turning common pointer collections into maps, thereby replacing a pointer collection with a single pointer. The factors 5 and 11 were underrepresented in the errors, presumably because base-10 properties beyond positional order were used for error correction, perhaps reflecting the existence of additional maps of integers divisible by 5 and integers divisible by 11.

  11. Support of protective work of human error in a nuclear power plant

    International Nuclear Information System (INIS)

    Yoshizawa, Yuriko

    1999-01-01

The nuclear power plant human factor group of the Tokyo Electric Power Co., Ltd. supports various human error prevention activities conducted at the nuclear power plant. Its main research themes are studies on human factors in the operation of a nuclear power plant and on recovery, together with common basic studies on human factors. In addition, on the basis of the information obtained, assistance to the human error prevention work conducted at the nuclear power plant, as well as development for its practical use, was promoted. In particular, for activities sharing hazard information, various forms of assistance were promoted: a proposal for a case-analysis method to understand hazard information faithfully rather than superficially, construction of a database to conveniently share such hazard information, and a non-accident business survey to provide hints for effective promotion of the prevention work. Introduced here are the assistance and investigation for effective sharing of hazard information across the various human error prevention activities mainly conducted in the nuclear power plant. (G.K.)

  12. Analysis of family-wise error rates in statistical parametric mapping using random field theory.

    Science.gov (United States)

    Flandin, Guillaume; Friston, Karl J

    2017-11-01

This technical report revisits the analysis of family-wise error rates in statistical parametric mapping-using random field theory-reported in (Eklund et al. []: arXiv 1511.01863). Contrary to the understandable spin that these sorts of analyses attract, a review of their results suggests that they endorse the use of parametric assumptions-and random field theory-in the analysis of functional neuroimaging data. We briefly rehearse the advantages parametric analyses offer over nonparametric alternatives and then unpack the implications of (Eklund et al. []: arXiv 1511.01863) for parametric procedures. Hum Brain Mapp, 2017. © 2017 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  13. Error Analysis of Brailled Instructional Materials Produced by Public School Personnel in Texas

    Science.gov (United States)

    Herzberg, Tina

    2010-01-01

    In this study, a detailed error analysis was performed to determine if patterns of errors existed in braille transcriptions. The most frequently occurring errors were the insertion of letters or words that were not contained in the original print material; the incorrect usage of the emphasis indicator; and the incorrect formatting of titles,…

  14. Error analysis for 1-1/2-loop semiscale system isothermal test data

    International Nuclear Information System (INIS)

    Feldman, E.M.; Naff, S.A.

    1975-05-01

    An error analysis was performed on the measurements made during the isothermal portion of the Semiscale Blowdown and Emergency Core Cooling (ECC) Project. A brief description of the measurement techniques employed, identification of potential sources of errors, and quantification of the errors associated with data is presented. (U.S.)

  15. Development of a framework to estimate human error for diagnosis tasks in advanced control room

    International Nuclear Information System (INIS)

Kim, Ar Ryum; Jang, In Seok; Seong, Poong Hyun

    2014-01-01

In an emergency situation at a nuclear power plant (NPP), diagnosis of the occurring events is crucial for managing or controlling the plant to a safe and stable condition. If the operators fail to diagnose the occurring events or relevant situations, their responses can eventually be inappropriate or inadequate. Accordingly, many studies have been performed to identify the causes of diagnosis errors and to estimate the probability of diagnosis error. D.I. Gertman et al. asserted that 'the cognitive failures stem from erroneous decision-making, poor understanding of rules and procedures, and inadequate problem solving and these failures may be due to quality of data and people's capacity for processing information'. Many researchers have also asserted that the human-system interface (HSI), procedures, training, and available time are critical factors causing diagnosis errors. As advanced main control rooms are being adopted in NPPs, the operators obtain the plant data via computer-based HSIs and procedures. In this regard, using simulation data, diagnosis errors and their causes were identified. From this study, some useful insights for reducing operators' diagnosis errors in advanced main control rooms were provided

  16. In-plant reliability data base for nuclear plant components: a feasibility study on human error information

    International Nuclear Information System (INIS)

    Borkowski, R.J.; Fragola, J.R.; Schurman, D.L.; Johnson, J.W.

    1984-03-01

    This report documents the procedure and final results of a feasibility study which examined the usefulness of nuclear plant maintenance work requests in the IPRDS as tools for understanding human error and its influence on component failure and repair. Developed in this study were (1) a set of criteria for judging the quality of a plant maintenance record set for studying human error; (2) a scheme for identifying human errors in the maintenance records; and (3) two taxonomies (engineering-based and psychology-based) for categorizing and coding human error-related events

  17. Republished error management: Descriptions of verbal communication errors between staff. An analysis of 84 root cause analysis-reports from Danish hospitals

    DEFF Research Database (Denmark)

    Rabøl, Louise Isager; Andersen, Mette Lehmann; Østergaard, Doris

    2011-01-01

Introduction Poor teamwork and communication between healthcare staff are correlated with patient safety incidents. However, the organisational factors responsible for these issues are unexplored. Root cause analyses (RCA) use human factors thinking to analyse the systems behind severe patient safety...... (30%)), communication errors between junior and senior staff members (11 (25%)), hesitance in speaking up (10 (23%)) and communication errors during teamwork (8 (18%)). The kappa values were 0.44-0.78. Unproceduralized communication and information exchange via telephone, related to transfer between...... incidents. The RCARs' rich descriptions of the incidents revealed the organisational factors and needs related to these errors....

  18. Error analysis of the freshmen Criminology students’ grammar in the written English

    Directory of Open Access Journals (Sweden)

    Maico Demi Banate Aperocho

    2017-12-01

This study identifies the various syntactical errors of fifty (50) freshmen B.S. Criminology students of the University of Mindanao in Davao City. Specifically, this study aims to answer the following: (1) What are the common errors present in the argumentative essays of the respondents? (2) What are the reasons for these errors? This study is descriptive-qualitative. It also uses error analysis to point out the syntactical errors present in the compositions of the participants. The fifty essays are subjected to error analysis. Errors are classified based on Chanquoy's Classification of Writing Errors. Furthermore, Hourani's Common Reasons of Grammatical Errors Checklist was also used to determine the common reasons for the identified syntactical errors. To create a meaningful interpretation of the data and to solicit further ideas from the participants, a focus group discussion was also conducted. Findings show that students' most common errors are on the grammatical aspect. In the grammatical aspect, students more frequently committed errors in the verb aspect (tense, subject agreement, and auxiliary and linker choice) compared to the spelling and punctuation aspects. Moreover, there are three main reasons for committing errors in the paragraph: mother tongue interference, incomprehensibility of the grammar rules, and incomprehensibility of the writing mechanics. Despite the difficulty of learning English as a second language, students are still very motivated to master the concepts and applications of the language.

  19. Human reliability analysis of control room operators

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L.; Carvalho, Paulo Victor R.; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2005-07-01

Human reliability is the probability that a person correctly performs a system-required action in a required time period and performs no extraneous action that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. Significant progress has been made in the HRA field during the last years, mainly in the nuclear area. Some first-generation HRA methods were developed, such as THERP (Technique for Human Error Rate Prediction). Now, an array of so-called second-generation methods is emerging as alternatives, for instance ATHEANA (A Technique for Human Event Analysis). The ergonomics approach has as its tool the ergonomic work analysis. It focuses on the study of operators' activities in physical and mental form, considering at the same time the observed characteristics of the operators and the elements of the work environment as they are presented to and perceived by the operators. The aim of this paper is to propose a methodology to analyze the human reliability of the operators of an industrial plant control room, using a framework that includes the approaches used by ATHEANA and THERP and ergonomic work analysis. (author)

  20. Error analysis in predictive modelling demonstrated on mould data.

    Science.gov (United States)

    Baranyi, József; Csernus, Olívia; Beczner, Judit

    2014-01-17

The purpose of this paper was to develop a predictive model for the effect of temperature and water activity on the growth rate of Aspergillus niger and to determine the sources of the error when the model is used for prediction. Parallel mould growth curves, derived from the same spore batch, were generated and fitted to determine their growth rate. The variances of replicate ln(growth-rate) estimates were used to quantify the experimental variability, inherent to the method of determining the growth rate. The environmental variability was quantified by the variance of the respective means of replicates. The idea is analogous to the "within group" and "between groups" variability concepts of ANOVA procedures. A (secondary) model, with temperature and water activity as explanatory variables, was fitted to the natural logarithm of the growth rates determined by the primary model. The model error and the experimental and environmental errors were ranked according to their contribution to the total error of prediction. Our method can readily be applied to analysing the error structure of predictive models of bacterial growth, too. © 2013.
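The "within group" and "between groups" decomposition the authors borrow from ANOVA can be sketched as follows (toy replicate values, not the mould data):

```python
from statistics import mean, pvariance

def decompose_variability(groups):
    """Split the variability of replicate measurements into an experimental
    (within-group) and an environmental (between-group) component, in the
    spirit of one-way ANOVA (toy numbers, not the mould data)."""
    within = mean(pvariance(g) for g in groups)     # experimental error
    between = pvariance([mean(g) for g in groups])  # environmental error
    return within, between

# Replicate ln(growth-rate) values at three temperature/water-activity settings
groups = [[0.10, 0.12, 0.11], [0.50, 0.52, 0.48], [0.90, 0.91, 0.89]]
within, between = decompose_variability(groups)
```

Here the environmental component dominates, which is the situation where a secondary model in the explanatory variables is worth fitting.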

  1. A Benefit/Cost/Deficit (BCD) model for learning from human errors

    International Nuclear Information System (INIS)

    Vanderhaegen, Frederic; Zieba, Stephane; Enjalbert, Simon; Polet, Philippe

    2011-01-01

This paper proposes an original model for interpreting human errors, mainly violations, in terms of benefits, costs and potential deficits. This BCD model is then used as an input framework to learn from human errors, and two systems based on this model are developed: a case-based reasoning system and an artificial neural network system. These systems are used to predict a specific human car-driving violation: not respecting the priority-to-the-right rule, which is a decision to remove a barrier. Both prediction systems learn from previous violation occurrences, using the BCD model and four criteria: safety, for identifying the deficit or the danger; and opportunity for action, driver comfort, and time spent, for identifying the benefits or the costs. The application of the learning systems to predicting car-driving violations gives a correct-prediction rate of over 80% after 10 iterations. These results are validated for non-respect of the priority-to-the-right rule.

  2. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for assessing task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs

  3. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for assessing task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  4. Selection of the important performance influencing factors for the assessment of human error under accident management situations in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, J. H.; Jung, W. J.

    1999-01-01

This paper introduces the process and final results of selecting the important Performance Influencing Factors (PIFs) under emergency operation and accident management situations in nuclear power plants, for use in the assessment of human errors. We collected two types of PIF taxonomies: the full-set PIF lists developed mainly for human error analysis, and the PIFs used for human reliability analysis (HRA) in probabilistic safety assessment (PSA). Five PIF taxonomies of the first type and ten from HRA methodologies (CREAM, SLIM, INTENT, ...) were collected in this research. By reviewing and analyzing the PIFs selected for HRA methodologies, a criterion could be established for the selection of appropriate PIFs under emergency operation and accident management situations. Based on this selection criterion, a new PIF taxonomy was proposed for the assessment of human error under emergency operation and accident management situations in nuclear power plants

  5. Analysis and improvement of gas turbine blade temperature measurement error

    International Nuclear Information System (INIS)

    Gao, Shan; Wang, Lixin; Feng, Chi; Daniel, Ketui

    2015-01-01

    Gas turbine blade components are easily damaged; they also operate in harsh high-temperature, high-pressure environments over extended durations. Therefore, ensuring that the blade temperature remains within the design limits is very important. In this study, measurement errors in turbine blade temperatures were analyzed, taking into account detector lens contamination, the reflection of environmental energy from the target surface, the effects of the combustion gas, and the emissivity of the blade surface. In this paper, each of the above sources of measurement error is discussed, and an iterative computing method for calculating blade temperature is proposed. (paper)

  6. Analysis and improvement of gas turbine blade temperature measurement error

    Science.gov (United States)

    Gao, Shan; Wang, Lixin; Feng, Chi; Daniel, Ketui

    2015-10-01

    Gas turbine blade components are easily damaged; they also operate in harsh high-temperature, high-pressure environments over extended durations. Therefore, ensuring that the blade temperature remains within the design limits is very important. In this study, measurement errors in turbine blade temperatures were analyzed, taking into account detector lens contamination, the reflection of environmental energy from the target surface, the effects of the combustion gas, and the emissivity of the blade surface. In this paper, each of the above sources of measurement error is discussed, and an iterative computing method for calculating blade temperature is proposed.

  7. Undesirable effects of covariance matrix techniques for error analysis

    International Nuclear Information System (INIS)

    Seibert, D.

    1994-01-01

Regression with χ² constructed from covariance matrices should not be used for some combinations of covariance matrices and fitting functions. Using the technique for unsuitable combinations can amplify systematic errors. This amplification is uncontrolled, and can produce arbitrarily inaccurate results that might not be ruled out by a χ² test. In addition, this technique can give incorrect (artificially small) errors for fit parameters. I give a test for this instability and a more robust (but computationally more intensive) method for fitting correlated data
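A classic illustration of this instability (not an example taken from the paper) is Peelle's pertinent puzzle: a covariance-matrix χ² fit of a constant to two positively correlated measurements can land below both data points.

```python
def gls_constant(y, cov):
    """Generalized-least-squares fit of a constant to two measurements with
    2x2 covariance matrix cov, i.e. the chi-square minimizer
    c = (1^T V^-1 y) / (1^T V^-1 1)."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]  # inverse of cov
    num = sum(inv[i][j] * y[j] for i in range(2) for j in range(2))
    den = sum(inv[i][j] for i in range(2) for j in range(2))
    return num / den

# Peelle's pertinent puzzle: measurements 1.5 and 1.0, each with a 10%
# uncorrelated error plus a fully correlated 20% normalization error.
y = [1.5, 1.0]
cov = [[0.15**2 + 0.04 * 1.5 * 1.5, 0.04 * 1.5 * 1.0],
       [0.04 * 1.5 * 1.0, 0.10**2 + 0.04 * 1.0 * 1.0]]
fit = gls_constant(y, cov)   # about 0.88, below both measurements
```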

  8. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis in determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using a logistic regression and stepwise variable selection was proposed, and the effects of the PSF on HEPs related with the soft controls were estimated through the methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, the procedure quality, practice level, and the operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitation of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
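The statistical machinery described, a logistic regression whose fitted coefficient gives the multiplicative effect of a PSF on the error odds, can be sketched on synthetic data (not the experimental data of the study; the PSF indicator and rates below are invented for illustration):

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=3000):
    """Logistic regression (intercept + one binary PSF indicator) fitted by
    gradient ascent on the average log-likelihood; exp(b1) estimates the
    multiplicative effect of the PSF on the error odds."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, yv in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (yv - p)       # gradient w.r.t. intercept
            g1 += (yv - p) * x   # gradient w.r.t. PSF coefficient
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic error opportunities: x=1 marks poor procedure quality, y=1 an error.
rng = random.Random(1)
xs = [i % 2 for i in range(600)]
ys = [1 if rng.random() < (0.20 if x else 0.05) else 0 for x in xs]
b0, b1 = fit_logistic(xs, ys)
odds_multiplier = math.exp(b1)   # > 1: the PSF raises the error odds
```

This mirrors the multiplicative form reported in the abstract: each significant PSF contributes a factor exp(b) to the error odds.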

  9. An Enhancement of Campaign Posters for Human Error Prevention in NPPs

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Lee, Yong Hee; Kwon, Soon Il

    2010-01-01

Accidents in high reliability systems such as nuclear power plants (NPPs) give rise not only to a loss of property and life, but also to social problems. One of the most frequently used techniques to grasp the current situation of hazard factors in NPPs is event investigation analysis based on the INPO's Human Performance Enhancement System (HPES) and, in Korea, the Korean Human Performance Enhancement System (K-HPES). There are many methods and approaches for an HE assessment that are valuable for investigating the causes of undesirable events and for counter-plans to prevent their recurrence in NPPs. They differ from each other according to the objectives of the analysis: the explanation of the event, the investigation of the causes, the allocation of responsibility, and the establishment of the counter-plan. Event databases include a plant's own events and information from various sources such as the IAEA and regulatory bodies, and also from the INPO and WANO. As many as 111 reactor trips occurred in the past 5 years ('01∼'05), and 26 of them occurred due to HE. The human error rate did not decrease in 2004, so the KHNP started to make efforts to decrease HEs, creating as many as 40 posters for human performance improvement in 2006. The INPO has been using a traditional form of poster; in addition, the Central Research Institute of Electric Power Industry (CRIEPI) developed a type of caution report, comprising a poster name, a serial number, a figure, work situations, the point at issue, and a countermeasure. The posters KHNP developed in 2006 convey specific information related to HE events; however, they are not engaging enough, because most people are favorably disposed toward a simple poster with many illustrations. Therefore, we stressed the need for workers' receptiveness rather than mere notification of information

  10. Application of DFM in human reliability analysis

    International Nuclear Information System (INIS)

    Yu Shaojie; Zhao Jun; Tong Jiejuan

    2011-01-01

Combining DFM with ATHEANA, the possibility of identifying EFCs and UAs using DFM is studied; a Steam Generator Tube Rupture (SGTR) accident is then modeled and solved. Through inductive analysis, 26 Prime Implicants (PIs) are obtained and the meaning of the results is interpreted; one of the PIs is similar to the accident scenario of a human failure event at a nuclear power plant. Finally, this paper discusses methods of quantifying PIs, the analysis of errors of commission (EOCs), and related topics. (authors)

  11. Flight Technical Error Analysis of the SATS Higher Volume Operations Simulation and Flight Experiments

    Science.gov (United States)

    Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirm the utility of the simulation platform for comparative Human in the Loop (HITL) studies of SATS HVO and Baseline operations.

  12. Plant specification of a generic human-error data through a two-stage Bayesian approach

    International Nuclear Information System (INIS)

    Heising, C.D.; Patterson, E.I.

    1984-01-01

Expert judgement concerning human performance in nuclear power plants is quantitatively coupled with actuarial data on such performance in order to derive plant-specific human-error rate probability distributions. The coupling procedure consists of a two-stage application of Bayes' theorem to information which is grouped by type. The first information type contains expert judgement concerning human performance at nuclear power plants in general. Data collected on human performance at a group of similar plants forms the second information type. The third information type consists of data on human performance in a specific plant which has the same characteristics as the group members. The first and second information types are coupled in the first application of Bayes' theorem to derive a probability distribution for population performance. This distribution is then combined with the third information type in a second application of Bayes' theorem to determine a plant-specific human-error rate probability distribution. The two-stage Bayesian procedure thus provides a means to quantitatively couple sparse data with expert judgement in order to obtain a human performance probability distribution based upon available information. Example calculations for a group of like reactors are also given. (author)
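With a conjugate Beta prior standing in for the expert-judgement distribution, the coupling reduces to successive Bayesian updates; a simplified sketch with hypothetical counts (the paper's actual two-stage procedure derives a population-variability distribution at the first stage rather than simply pooling counts):

```python
def beta_update(alpha, beta, errors, demands):
    """One conjugate Bayesian update of a Beta(alpha, beta) distribution
    over a human error rate, given observed errors in a number of demands."""
    return alpha + errors, beta + demands - errors

# Stage 0: expert judgement encoded as a Beta prior (hypothetical values).
alpha, beta = 0.5, 50.0
# Stage 1: pooled data from a group of similar plants (hypothetical counts).
alpha, beta = beta_update(alpha, beta, errors=4, demands=400)
# Stage 2: data from the specific plant of interest (hypothetical counts).
alpha, beta = beta_update(alpha, beta, errors=1, demands=150)

posterior_mean = alpha / (alpha + beta)   # plant-specific HEP estimate
```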

  13. Linguistic Error Analysis on Students' Thesis Proposals

    Science.gov (United States)

    Pescante-Malimas, Mary Ann; Samson, Sonrisa C.

    2017-01-01

    This study identified and analyzed the common linguistic errors encountered by Linguistics, Literature, and Advertising Arts majors in their Thesis Proposal classes in the First Semester 2016-2017. The data were the drafts of the thesis proposals of the students from the three different programs. A total of 32 manuscripts were analyzed which was…

  14. Reading and Spelling Error Analysis of Native Arabic Dyslexic Readers

    Science.gov (United States)

    Abu-rabia, Salim; Taha, Haitham

    2004-01-01

    This study was an investigation of reading and spelling errors of dyslexic Arabic readers ("n"=20) compared with two groups of normal readers: a young readers group, matched with the dyslexics by reading level ("n"=20) and an age-matched group ("n"=20). They were tested on reading and spelling of texts, isolated…

  15. Analysis of Students' Error in Learning of Quadratic Equations

    Science.gov (United States)

    Zakaria, Effandi; Ibrahim; Maat, Siti Mistima

    2010-01-01

    The purpose of the study was to determine the students' error in learning quadratic equation. The samples were 30 form three students from a secondary school in Jambi, Indonesia. Diagnostic test was used as the instrument of this study that included three components: factorization, completing the square and quadratic formula. Diagnostic interview…

  16. Frame-based safety analysis approach for decision-based errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Yihb, Swu

    1997-01-01

    A frame-based approach is proposed to analyze decision-based errors made by automatic controllers or human operators due to erroneous reference frames. An integrated framework, the Two Frame Model (TFM), is first proposed to model the dynamic interaction between the physical process and the decision-making process. Two important issues, consistency and competing processes, are raised. Consistency between the physical and logic frames makes a TFM-based system work properly. Loss of consistency refers to the failure mode in which the logic frame does not accurately reflect the state of the controlled processes. Once such failure occurs, hazards may arise. Among potential hazards, the competing effect between the controller and the controlled process is the most severe one, which may jeopardize a defense-in-depth design. When the logic and physical frames are inconsistent, conventional safety analysis techniques are inadequate. We propose Frame-based Fault Tree Analysis (FFTA) and Frame-based Event Tree Analysis (FETA) under TFM to deduce the context for decision errors and to separately generate the evolution of the logical frame as opposed to that of the physical frame. This multi-dimensional analysis approach, different from the conventional correctness-centred approach, provides a panoramic view in scenario generation. Case studies using the proposed techniques are also given to demonstrate their usage and feasibility.

  17. A method to deal with installation errors of wearable accelerometers for human activity recognition

    International Nuclear Information System (INIS)

    Jiang, Ming; Wang, Zhelong; Shang, Hong; Li, Hongyi; Wang, Yuechao

    2011-01-01

    Human activity recognition (HAR) by using wearable accelerometers has gained significant interest in recent years in a range of healthcare areas, including inferring metabolic energy expenditure, predicting falls, measuring gait parameters and monitoring daily activities. The implementation of HAR relies heavily on the correctness of sensor fixation. The installation errors of wearable accelerometers may dramatically decrease the accuracy of HAR. In this paper, a method is proposed to improve the robustness of HAR to the installation errors of accelerometers. The method first calculates a transformation matrix by using Gram–Schmidt orthonormalization in order to eliminate the sensor's orientation error and then employs a low-pass filter with a cut-off frequency of 10 Hz to eliminate the main effect of the sensor's misplacement. The experimental results showed that the proposed method obtained a satisfactory performance for HAR. The average accuracy rate from ten subjects was 95.1% when there were no installation errors, and 91.9% when installation errors were involved in the wearable accelerometers.
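
The two steps named in the abstract, Gram–Schmidt orthonormalization for the orientation error and a 10 Hz low-pass for misplacement, can be sketched in pure Python. The sensor axes and sample rate below are hypothetical, and a first-order IIR filter stands in for whatever filter design the authors actually used.

```python
import math

def gram_schmidt(rows):
    """Orthonormalize the rows of a 3x3 matrix (classical Gram–Schmidt)."""
    basis = []
    for v in rows:
        w = list(v)
        for b in basis:
            dot = sum(vi * bi for vi, bi in zip(v, b))
            w = [wi - dot * bi for wi, bi in zip(w, b)]
        norm = math.sqrt(sum(wi * wi for wi in w))
        basis.append([wi / norm for wi in w])
    return basis

def lowpass(samples, fc=10.0, fs=100.0):
    """First-order IIR low-pass with cut-off fc [Hz] at sample rate fs [Hz]."""
    rc = 1.0 / (2 * math.pi * fc)
    alpha = (1.0 / fs) / (rc + 1.0 / fs)
    out, acc = [], samples[0]
    for x in samples:
        acc += alpha * (x - acc)
        out.append(acc)
    return out

# Tilted, slightly non-orthogonal estimated sensor axes -> orthonormal frame
R = gram_schmidt([[0.9, 0.1, 0.0], [0.1, 1.1, 0.0], [0.0, 0.0, 1.0]])
```

Readings multiplied by `R` are mapped back into a consistent body frame before the filtered signal is passed to the activity classifier.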

  18. Identification and Evaluation of Human Errors in the Medication Process Using the Extended CREAM Technique

    Directory of Open Access Journals (Sweden)

    Iraj Mohammadfam

    2017-10-01

    Background: The medication process is a powerful instrument for curing patients, and carrying it out correctly plays an important role in the treatment and provision of care. Medication errors can occur at any stage of this complicated process; avoiding them requires appropriate decision-making, cognition, and performance by the hospital staff. Objectives: The present study aimed at identifying and evaluating the nature and causes of human errors in the medication process in a hospital using the extended CREAM method. Methods: This was a qualitative, cross-sectional study conducted in a hospital in Hamadan. First, the medication process was selected as a critical issue based on the opinions of experts, specialists, and experienced individuals in the nursing and medical departments. Then, the process was decomposed into steps and substeps using hierarchical task analysis (HTA) and evaluated using the extended CREAM technique to estimate the probability of human errors. Results: Based on the findings achieved through the basic CREAM method, the highest CFPt was in the step of administering medicine to patients (0.056). Moreover, the results revealed that the highest CFPt values were in the substeps of calculating the dose of medicine and determining the method of prescription, and identifying the patient (0.0796 and 0.0785, respectively). The lowest CFPt was related to transcribing the prescribed medicine from the file to the medicine worksheet (0.0106). Conclusions: Considering the critical consequences of human errors in the medication process, holding pharmacological retraining classes, applying the principles of executing pharmaceutical orders, increasing medical personnel, reducing overtime work, organizing work shifts, and using error reporting systems are of paramount importance.
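
The extended-CREAM arithmetic behind a CFPt value is, at its core, a nominal cognitive failure probability scaled by common performance condition (CPC) weighting factors. A minimal sketch, with hypothetical nominal values and weights (not the paper's):

```python
NOMINAL_CFP = {                 # hypothetical CFP0 per cognitive failure type
    "execution_error": 3.0e-3,
    "interpretation_error": 1.0e-2,
}

def adjusted_cfp(failure_mode, cpc_weights):
    """CFPt = CFP0 * product of the CPC weighting factors, capped at 1."""
    cfp = NOMINAL_CFP[failure_mode]
    for w in cpc_weights:
        cfp *= w
    return min(cfp, 1.0)

# e.g. inadequate time (x5), poor shift organization (x1.2), good training (x0.8)
print(adjusted_cfp("execution_error", [5.0, 1.2, 0.8]))   # roughly 1.4e-2
```

Steps whose CPC weights multiply out highest (here, dose calculation or patient identification under time pressure) are the ones that dominate the ranking reported above.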

  19. Human Factors Risk Analyses of a Doffing Protocol for Ebola-Level Personal Protective Equipment: Mapping Errors to Contamination.

    Science.gov (United States)

    Mumma, Joel M; Durso, Francis T; Ferguson, Ashley N; Gipson, Christina L; Casanova, Lisa; Erukunuakpor, Kimberly; Kraft, Colleen S; Walsh, Victoria L; Zimring, Craig; DuBose, Jennifer; Jacob, Jesse T

    2018-03-05

    Doffing protocols for personal protective equipment (PPE) are critical for keeping healthcare workers (HCWs) safe during care of patients with Ebola virus disease. We assessed the relationship between errors and self-contamination during doffing. Eleven HCWs experienced with doffing Ebola-level PPE participated in simulations in which HCWs donned PPE marked with surrogate viruses (ɸ6 and MS2), completed a clinical task, and were assessed for contamination after doffing. Simulations were video recorded, and a failure modes and effects analysis and fault tree analyses were performed to identify errors during doffing, quantify their risk (risk index), and predict contamination data. Fifty-one types of errors were identified, many having the potential to spread contamination. Hand hygiene and removing the powered air purifying respirator (PAPR) hood had the highest total risk indexes (111 and 70, respectively) and number of types of errors (9 and 13, respectively). ɸ6 was detected on 10% of scrubs and the fault tree predicted a 10.4% contamination rate, likely occurring when the PAPR hood inadvertently contacted scrubs during removal. MS2 was detected on 10% of hands, 20% of scrubs, and 70% of inner gloves and the predicted rates were 7.3%, 19.4%, 73.4%, respectively. Fault trees for MS2 and ɸ6 contamination suggested similar pathways. Ebola-level PPE can both protect and put HCWs at risk for self-contamination throughout the doffing process, even among experienced HCWs doffing with a trained observer. Human factors methodologies can identify error-prone steps, delineate the relationship between errors and self-contamination, and suggest remediation strategies.
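
The fault-tree prediction of a contamination rate reduces to OR-gate arithmetic over independent error pathways: the top-event probability is 1 - prod(1 - p_i). A sketch with hypothetical pathway probabilities (not the study's data):

```python
def or_gate(probabilities):
    """Top-event probability for independent basic events under an OR gate."""
    q = 1.0
    for p in probabilities:
        q *= 1.0 - p
    return 1.0 - q

# e.g. three hypothetical doffing error pathways that can each deposit
# virus on scrubs:
pathways = [0.06, 0.03, 0.02]
print(f"predicted contamination rate: {or_gate(pathways):.3f}")
```

A predicted rate like the paper's 10.4% for scrubs would emerge from summing (in this OR-gate sense) the per-pathway probabilities assigned to each error-prone doffing step.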

  20. Human Factors Process Task Analysis: Liquid Oxygen Pump Acceptance Test Procedure at the Advanced Technology Development Center

    Science.gov (United States)

    Diorio, Kimberly A.; Voska, Ned (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: Describe mission; Define System; Identify human-machine; List human actions; Identify potential errors; Identify factors that effect error; Determine likelihood of error; Determine potential effects of errors; Evaluate risk; Generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.

  1. Spatial-temporal analysis of wind power forecast errors for West-Coast Norway

    Energy Technology Data Exchange (ETDEWEB)

    Revheim, Paal Preede; Beyer, Hans Georg [Agder Univ. (UiA), Grimstad (Norway). Dept. of Engineering Sciences

    2012-07-01

    In this paper the spatial-temporal structure of forecast errors for wind power in West-Coast Norway is analyzed. Starting from the qualitative analysis of the forecast error reduction, with respect to single-site data, for the lumped conditions of groups of sites, the spatial and temporal correlations of the wind power forecast errors within and between the same groups are studied in detail. Based on this, time-series regression models to be used to analytically describe the error reduction are set up. The models give an expected reduction in forecast error of between 48.4% and 49%. (orig.)
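
The error reduction from lumping sites can be illustrated with a simpler closed form than the paper's regression models: if single-site errors share a standard deviation sigma and pairwise correlation rho, the error of the group mean has std sigma*sqrt((1 + (n-1)*rho)/n). The numbers below are hypothetical, not the Norwegian data.

```python
import math

def group_error_std(sigma, n, rho):
    """Error std of the mean of n equally correlated single-site errors."""
    return sigma * math.sqrt((1 + (n - 1) * rho) / n)

# e.g. 10 sites with pairwise forecast-error correlation 0.2:
reduction = 1 - group_error_std(1.0, 10, 0.2)
print(f"forecast error reduction: {reduction:.1%}")
```

Even modest inter-site correlations cap the smoothing benefit, which is why the spatial correlation structure is the quantity worth modelling.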

  2. Human error considerations and annunciator effects in determining optimal test intervals for periodically inspected standby systems

    International Nuclear Information System (INIS)

    McWilliams, T.P.; Martz, H.F.

    1981-01-01

    This paper incorporates the effects of four types of human error in a model for determining the optimal time between periodic inspections which maximizes the steady-state availability of standby safety systems. Such safety systems are characteristic of nuclear power plant operations. The system is modeled by means of an infinite state-space Markov chain. The purpose of the paper is to demonstrate techniques for computing the steady-state availability A and the optimal periodic inspection interval tau* for the system. The model can be used to investigate the effects of human error probabilities on optimal availability, to study the benefits of annunciating the standby system, and to determine optimal inspection intervals. Several examples which are representative of nuclear power plant applications are presented.
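
A much-reduced version of such a model (not the paper's Markov chain) already shows how an optimal interval arises: unavailability grows with the interval through undetected random failures, shrinks with it through less frequent test downtime, and carries a floor from per-test human error. All parameter values are hypothetical.

```python
import math

def mean_unavailability(tau, lam=1e-4, t_test=2.0, g=1e-2):
    """lam: failure rate [1/h]; t_test: test downtime [h];
    g: per-test human error probability (component left disabled until next test)."""
    return lam * tau / 2 + t_test / tau + g / 2

# This simplified trade-off has a closed-form optimum: tau* = sqrt(2*t_test/lam)
tau_star = math.sqrt(2 * 2.0 / 1e-4)
tau_grid = min(range(10, 2000), key=lambda t: mean_unavailability(float(t)))
print(tau_star, tau_grid)   # grid search agrees with the analytic optimum
```

Note that the human error term `g/2` is flat in tau in this sketch, so it lowers the achievable availability without shifting tau*; richer models like the paper's let it interact with the interval as well.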

  3. Dynamic Error Analysis Method for Vibration Shape Reconstruction of Smart FBG Plate Structure

    Directory of Open Access Journals (Sweden)

    Hesheng Zhang

    2016-01-01

    Shape reconstruction of aerospace plate structures is an important issue for the safe operation of aerospace vehicles. One way to achieve such reconstruction is by constructing a smart fiber Bragg grating (FBG) plate structure with discretely distributed FBG sensor arrays and using reconstruction algorithms, in which error analysis of the reconstruction algorithm is a key link. Considering that traditional error analysis methods can only deal with static data, a new dynamic-data error analysis method is proposed based on the LMS algorithm for shape reconstruction of smart FBG plate structures. First, the smart FBG structure and the orthogonal-curved-network-based reconstruction method are introduced. Second, a dynamic error analysis model is proposed for dynamic reconstruction error analysis. Third, parameter identification is performed for the proposed dynamic error analysis model based on the least mean square (LMS) algorithm. Finally, an experimental verification platform is constructed and an experimental dynamic reconstruction analysis is carried out. Experimental results show that the dynamic characteristics of the reconstruction performance for plate structures can be obtained accurately with the proposed dynamic error analysis method. The proposed method can also be used for other data acquisition and data processing systems as a general error analysis method.
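
The LMS identification step can be sketched independently of the FBG application: fit weights w so that w·x tracks a reference signal d by stochastic gradient steps on the instantaneous error. The data below are synthetic, not the paper's.

```python
import random

def lms(samples, n_taps=3, mu=0.05):
    """Least-mean-squares identification of weights from (x, d) pairs."""
    w = [0.0] * n_taps
    for x, d in samples:
        y = sum(wi * xi for wi, xi in zip(w, x))
        e = d - y                                        # instantaneous error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]   # gradient step
    return w

random.seed(0)
true_w = [0.5, -0.3, 0.8]
data = []
for _ in range(2000):
    x = [random.uniform(-1, 1) for _ in range(3)]
    data.append((x, sum(a * b for a, b in zip(true_w, x))))
print(lms(data))   # converges toward [0.5, -0.3, 0.8]
```

In the paper's setting, x would be the measured strain signals and d the reference shape, with the identified weights characterizing the dynamic reconstruction error.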

  4. Error analysis of terrestrial laser scanning data by means of spherical statistics and 3D graphs.

    Science.gov (United States)

    Cuartero, Aurora; Armesto, Julia; Rodríguez, Pablo G; Arias, Pedro

    2010-01-01

    This paper presents a complete analysis of the positional errors of terrestrial laser scanning (TLS) data based on spherical statistics and 3D graphs. Spherical statistics are preferred because of the 3D vectorial nature of the spatial error. Error vectors have three metric elements (one module and two angles) that were analyzed by spherical statistics. A study case is presented and discussed in detail. Errors were calculated using 53 check points (CPs), whose coordinates were measured by a digitizer with submillimetre accuracy. The positional accuracy was analyzed by both the conventional method (modular error analysis) and the proposed method (angular error analysis) by means of 3D graphics and numerical spherical statistics. Two packages in the R programming language were written to produce the graphics automatically. The results indicate that the proposed method is advantageous as it offers a more complete analysis of the positional accuracy, covering the angular error component, the uniformity of the vector distribution, and error isotropy, in addition to the modular error component given by linear statistics.
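
The spherical-statistics treatment reduces each 3D error vector to a module and two angles, and summarizes directions by the mean resultant length (near 0 for isotropic errors, near 1 for a dominant direction). A pure-Python sketch with made-up check-point errors:

```python
import math

def spherical(v):
    """(module, colatitude, azimuth) of a 3D error vector."""
    x, y, z = v
    module = math.sqrt(x * x + y * y + z * z)
    return module, math.acos(z / module), math.atan2(y, x)

def resultant_length(vectors):
    """Mean resultant length in [0, 1]: ~0 uniform directions, ~1 concentrated."""
    unit = [[c / spherical(v)[0] for c in v] for v in vectors]
    sums = [sum(u[i] for u in unit) for i in range(3)]
    return math.sqrt(sum(s * s for s in sums)) / len(unit)

errors = [(0.002, 0.001, 0.004), (0.001, -0.002, 0.003), (-0.001, 0.001, 0.005)]
print(resultant_length(errors))   # near 1: these errors share a dominant direction
```

A resultant length well above what a uniform direction distribution would give is exactly the kind of anisotropy the modular (linear) analysis cannot see.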

  5. Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment

    Science.gov (United States)

    Prive, N. C.; Errico, Ronald M.

    2015-01-01

    The spectra of analysis and forecast error are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASAGMAO). A global numerical weather prediction model, the Global Earth Observing System version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low wave number errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.

  6. DOI resolution measurement and error analysis with LYSO and APDs

    International Nuclear Information System (INIS)

    Lee, Chae-hun; Cho, Gyuseong

    2008-01-01

    Spatial resolution degradation in PET occurs at the edge of the Field Of View (FOV) due to parallax error. To improve spatial resolution at the edge of the FOV, Depth-Of-Interaction (DOI) PET has been investigated and several methods for DOI positioning have been proposed. In this paper, a DOI-PET detector module using two 8x4 array avalanche photodiodes (APDs) (Hamamatsu, S8550) and a 2 cm long LYSO scintillation crystal was proposed and its DOI characteristics were investigated experimentally. In order to measure DOI positions, the signals from the two APDs were compared. Energy resolution was obtained from the sum of the two APDs' signals and the DOI positioning error was calculated. Finally, an optimum DOI step size for a 2 cm long LYSO crystal was suggested to help design a DOI-PET.

  7. Time Error Analysis of SOE System Using Network Time Protocol

    International Nuclear Information System (INIS)

    Keum, Jong Yong; Park, Geun Ok; Park, Heui Youn

    2005-01-01

    To determine the accuracy of time in the fully digitalized SOE (Sequence of Events) system, we used a formal specification of the Network Time Protocol (NTP) Version 3, which is used to synchronize timekeeping among a set of distributed computers. By constructing a simple experimental environment and carrying out Internet time-synchronization experiments, we analyzed the time errors of the local clocks of the SOE system synchronized with a time server via computer networks.
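
The offset/delay computation at the heart of such an analysis uses the four timestamps of one NTP client/server exchange (client send t0, server receive t1, server send t2, client receive t3), per NTPv3 (RFC 1305):

```python
def ntp_offset_delay(t0, t1, t2, t3):
    """Clock offset and round-trip delay from one timestamp exchange (RFC 1305)."""
    offset = ((t1 - t0) + (t2 - t3)) / 2   # estimated local-clock offset
    delay = (t3 - t0) - (t2 - t1)          # round-trip network delay
    return offset, delay

# Client clock 0.5 s behind the server, 20 ms each-way network delay
# (timestamps hypothetical):
print(ntp_offset_delay(100.000, 100.520, 100.521, 100.041))
```

The offset estimate is exact only when the network delay is symmetric; asymmetric paths bias it by half the asymmetry, which bounds the time error achievable by an SOE system synchronized this way.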

  8. Error analysis of pupils in calculating with fractions

    OpenAIRE

    Uranič, Petra

    2016-01-01

    In this thesis I examine the correlation between the frequency of errors that seventh grade pupils make in their calculations with fractions and their level of understanding of fractions. Fractions are a relevant and demanding theme in the mathematics curriculum. Although we use fractions on a daily basis, pupils find learning fractions to be very difficult. They generally do not struggle with the concept of fractions itself, but they frequently have problems with mathematical operations ...

  9. Magnetic error analysis of recycler pbar injection transfer line

    Energy Technology Data Exchange (ETDEWEB)

    Yang, M.J.; /Fermilab

    2007-06-01

    Detailed study of the Fermilab Recycler Ring anti-proton injection line became feasible with its BPM system upgrade, though the beamline has been in existence and operational since the year 2000. Previous attempts were not fruitful due to limitations in the BPM system. Among the objectives are the assessment of the beamline optics and the detection of error fields. In particular, the field region of the permanent Lambertson magnets at both ends of the R22 transfer line will be scrutinized.

  10. Analysis of Periodic Errors for Synthesized-Reference-Wave Holography

    Directory of Open Access Journals (Sweden)

    V. Schejbal

    2009-12-01

    Synthesized-reference-wave holographic techniques offer relatively simple and cost-effective measurement of antenna radiation characteristics and reconstruction of complex aperture fields using near-field intensity-pattern measurement. These methods allow the advantages of probe-compensation methods for amplitude and phase near-field measurements to be exploited for planar and cylindrical scanning, including accuracy analyses. The paper analyzes periodic errors, which can arise during scanning, using both theoretical results and numerical simulations.

  11. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down—defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  12. Slow Learner Errors Analysis in Solving Fractions Problems in Inclusive Junior High School Class

    Science.gov (United States)

    Novitasari, N.; Lukito, A.; Ekawati, R.

    2018-01-01

    A slow learner, whose IQ is between 71 and 89, will have difficulties in solving mathematics problems that often lead to errors. These errors can be analyzed to determine where they occur and of what types they are. This research is a qualitative descriptive study which aims to describe the locations, types, and causes of the errors a slow learner makes in solving fraction problems in an inclusive junior high school class. The subject of this research is one slow-learning seventh-grade student, selected through direct observation by the researcher and through discussion with the mathematics teacher and the special tutor who handles the slow learner students. The data collection methods used in this study are written tasks and semi-structured interviews. The collected data were analyzed with Newman's Error Analysis (NEA). Results show that there are four locations of errors, namely comprehension, transformation, process skills, and encoding errors, and four types of errors, namely concept, principle, algorithm, and counting errors. The results of this error analysis will help teachers identify the causes of the errors made by slow learners.
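
Newman's procedure locates an error at the first failed stage of a fixed hierarchy. A minimal sketch of that decision cascade (the pass/fail flags would in practice come from the written work and the interview):

```python
# NEA stages in order; an error is attributed to the first stage not passed.
NEA_STAGES = ["reading", "comprehension", "transformation",
              "process skills", "encoding"]

def locate_error(passed):
    """Return the first NEA stage the pupil failed, or None if all passed."""
    for stage in NEA_STAGES:
        if not passed.get(stage, True):
            return stage
    return None

print(locate_error({"reading": True, "comprehension": True,
                    "transformation": False}))   # → transformation
```

The ordering matters: a pupil who cannot transform the problem will usually also fail the later stages, so only the earliest failure is recorded as the error location.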

  13. Human error as the root cause of severe accidents at nuclear reactors

    International Nuclear Information System (INIS)

    Kovács Zoltán; Rýdzi, Stanislav

    2017-01-01

    A root cause is a factor inducing an undesirable event. It is feasible for root causes to be eliminated through technological process improvements. Human error was the root cause of all severe accidents at nuclear power plants. The TMI accident was caused by a series of human errors. The Chernobyl disaster occurred after a badly performed test of the turbogenerator at a reactor with design deficiencies, and in addition, the operators ignored the safety principles and disabled the safety systems. At Fukushima the tsunami risk was underestimated and the project failed to consider the specific issues of the site. The paper describes the severe accidents and points out the human errors that caused them. Also, provisions that might have eliminated those severe accidents are suggested. The fact that each severe accident occurred on a different type of reactor is relevant – no severe accident ever occurred twice at the same reactor type. The lessons learnt from the severe accidents and the safety measures implemented on reactor units all over the world seem to be effective. (orig.)

  14. Subsecond dopamine fluctuations in human striatum encode superposed error signals about actual and counterfactual reward

    Science.gov (United States)

    Kishida, Kenneth T.; Saez, Ignacio; Lohrenz, Terry; Witcher, Mark R.; Laxton, Adrian W.; Tatter, Stephen B.; White, Jason P.; Ellis, Thomas L.; Phillips, Paul E. M.; Montague, P. Read

    2016-01-01

    In the mammalian brain, dopamine is a critical neuromodulator whose actions underlie learning, decision-making, and behavioral control. Degeneration of dopamine neurons causes Parkinson’s disease, whereas dysregulation of dopamine signaling is believed to contribute to psychiatric conditions such as schizophrenia, addiction, and depression. Experiments in animal models suggest the hypothesis that dopamine release in human striatum encodes reward prediction errors (RPEs) (the difference between actual and expected outcomes) during ongoing decision-making. Blood oxygen level-dependent (BOLD) imaging experiments in humans support the idea that RPEs are tracked in the striatum; however, BOLD measurements cannot be used to infer the action of any one specific neurotransmitter. We monitored dopamine levels with subsecond temporal resolution in humans (n = 17) with Parkinson’s disease while they executed a sequential decision-making task. Participants placed bets and experienced monetary gains or losses. Dopamine fluctuations in the striatum fail to encode RPEs, as anticipated by a large body of work in model organisms. Instead, subsecond dopamine fluctuations encode an integration of RPEs with counterfactual prediction errors, the latter defined by how much better or worse the experienced outcome could have been. How dopamine fluctuations combine the actual and counterfactual is unknown. One possibility is that this process is the normal behavior of reward processing dopamine neurons, which previously had not been tested by experiments in animal models. Alternatively, this superposition of error terms may result from an additional yet-to-be-identified subclass of dopamine neurons. PMID:26598677
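
The superposition the abstract describes can be written down as one line of arithmetic, with the caveat the authors themselves raise: how the two error terms actually combine is unknown, so the weights and sign convention below are purely illustrative.

```python
def superposed_error(actual, expected, best_alternative, w_rpe=1.0, w_cf=1.0):
    """RPE combined with a counterfactual term; weights/sign are illustrative."""
    rpe = actual - expected                 # reward prediction error
    cpe = best_alternative - actual         # counterfactual prediction error
    return w_rpe * rpe - w_cf * cpe

# Winning $20 when $50 was expected and $100 was on the table:
print(superposed_error(actual=20, expected=50, best_alternative=100))   # -110
```

Under this toy convention a win that falls far short of both expectation and the best available alternative yields a strongly negative signal, whereas a win that matches the best alternative reduces to the plain RPE.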

  15. Human error and the associated recovery probabilities for soft control being used in the advanced MCRs of NPPs

    International Nuclear Information System (INIS)

    Jang, Inseok; Jung, Wondea; Seong, Poong Hyun

    2016-01-01

    Highlights: • The operating environment of MCRs in NPPs has changed with the adoption of digital HSIs. • Most current HRA databases are not explicitly designed to deal with digital HSIs. • An empirical analysis for a new HRA DB using an advanced MCR mockup is carried out. • It is expected that the results can be used for advanced MCR HRA. - Abstract: Since the Three Mile Island (TMI)-2 accident, human error has been recognized as one of the main causes of Nuclear Power Plant (NPP) accidents, and numerous studies related to Human Reliability Analysis (HRA) have been carried out. Most of these studies focused on the conventional Main Control Room (MCR) environment. However, the operating environment of MCRs in NPPs has changed with the adoption of new human-system interfaces (HSIs) largely based on up-to-date digital technologies. MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called advanced MCRs. Among the many features of advanced MCRs, soft controls are particularly important because operating actions in advanced MCRs are performed through them. Due to the difference in interfaces between soft controls and hardwired conventional controls, different HEPs should be used in the HRA for advanced MCRs. Unfortunately, most current HRA databases deal with operations in conventional MCRs and are not explicitly designed to deal with a digital Human System Interface (HSI). For this reason, empirical human error probabilities and the associated error recovery probabilities were collected from the mockup of an advanced MCR equipped with soft controls. To this end, small-scale experiments were conducted in which 48 graduate students from the department of nuclear engineering at the Korea Advanced Institute of Science and Technology (KAIST) participated, and accident scenarios were designed with respect to the typical Design Basis Accidents (DBAs) in NPPs, such as Steam Generator Tube Rupture

  16. Does the A-not-B error in adult pet dogs indicate sensitivity to human communication?

    Science.gov (United States)

    Kis, Anna; Topál, József; Gácsi, Márta; Range, Friederike; Huber, Ludwig; Miklósi, Adám; Virányi, Zsófia

    2012-07-01

    Recent dog-infant comparisons have indicated that the experimenter's communicative signals in object hide-and-search tasks increase the probability of perseverative (A-not-B) errors in both species (Topál et al. 2009). These behaviourally similar results, however, might reflect different mechanisms in dogs and in children. Similar errors may occur if the motor response of retrieving the object during the A trials cannot be inhibited in the B trials or if the experimenter's movements and signals toward the A hiding place in the B trials ('sham-baiting') distract the dogs' attention. In order to test these hypotheses, we tested dogs similarly to Topál et al. (2009) but eliminated the motor search in the A trials and 'sham-baiting' in the B trials. We found that neither an inability to inhibit previously rewarded motor response nor insufficiencies in their working memory and/or attention skills can explain dogs' erroneous choices. Further, we replicated the finding that dogs have a strong tendency to commit the A-not-B error after ostensive-communicative hiding and demonstrated the crucial effect of socio-communicative cues as the A-not-B error diminishes when location B is ostensively enhanced. These findings further support the hypothesis that the dogs' A-not-B error may reflect a special sensitivity to human communicative cues. Such object-hiding and search tasks provide a typical case for how susceptibility to human social signals could (mis)lead domestic dogs.

  17. A Sensitivity Study of Human Errors in Optimizing Surveillance Test Interval (STI) and Allowed Outage Time (AOT) of Standby Safety System

    International Nuclear Information System (INIS)

    Chung, Dae Wook; Shin, Won Ky; You, Young Woo; Yang, Hui Chang

    1998-01-01

    In most cases, the surveillance test intervals (STIs), allowed outage times (AOTs) and testing strategies of safety components in a nuclear power plant are prescribed in the plant technical specifications. In general, it is required that a standby safety system be redundant (i.e., composed of multiple components) and that these components be tested by either a staggered or a sequential test strategy. In this study, a linear model is presented to incorporate the effects of human errors associated with testing into the evaluation of unavailability. The average unavailabilities of 1/4 and 2/4 redundant systems are computed considering human error and testing strategy. The adverse effects of testing on system unavailability, such as component wear and test-induced transients, have been modelled. The final outcome of this study is the optimized human error domain obtained from a 3-D human error sensitivity analysis by selecting a finely classified segment. The results of the sensitivity analysis show that the STI and AOT can be optimized provided the human error probability is maintained within an allowable range. (authors)

  18. Comparative evaluation of three cognitive error analysis methods through an application to accident management tasks in NPPs

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kim, Jae Whan; Ha, Jae Joo; Yoon, Wan C.

    1999-01-01

    This study was performed to comparatively evaluate selected Human Reliability Analysis (HRA) methods which mainly focus on cognitive error analysis, and to derive the requirements of a new human error analysis (HEA) framework for Accident Management (AM) in nuclear power plants (NPPs). In order to achieve this goal, we carried out a case study of human error analysis on an AM task in NPPs. In the study we evaluated three cognitive HEA methods, HRMS, CREAM and PHECA, which were selected through a review of the seven currently available cognitive HEA methods. The task of reactor cavity flooding was chosen for the application study as a typical AM task in NPPs. From the study, we derived seven requirement items for a new HEA method for AM in NPPs. We could also evaluate the applicability of the three cognitive HEA methods to AM tasks. CREAM is considered to be more appropriate than the others for the analysis of AM tasks, whereas PHECA is regarded as less appropriate both as a predictive HEA technique and for the analysis of AM tasks. In addition, the advantages and disadvantages of each method are described. (author)

  19. AN ERROR ANALYSIS OF ARGUMENTATIVE ESSAY (CASE STUDY AT UNIVERSITY MUHAMMADIYAH OF METRO)

    Directory of Open Access Journals (Sweden)

    Fenny - Thresia

    2015-10-01

    Full Text Available The purpose of this study was to analyze the students' errors in writing argumentative essays. The researcher focused on errors of verb, concord and learner language. This study took 20 third-semester students as the research subjects. The data were taken from observation and documentation. Based on the results of the data analysis, some errors are still found in the students' argumentative essays in English. The most frequent errors are verb errors, followed by concord errors; learner-language errors are the least frequent. From the 20 samples taken, verb errors appear in 12 items (60%), concord errors in 8 items (40%), and learner-language errors in 7 items (35%). As a result, verbs account for the largest number of common errors.

  20. Statistical evaluation of major human errors during the development of new technological systems

    International Nuclear Information System (INIS)

    Campbell, G; Ott, K.O.

    1979-01-01

    Statistical procedures are presented to evaluate major human errors during the development of a new system: errors that have led, or can lead, to accidents or major failures. The first procedure aims at estimating the average residual occurrence rate for accidents or major failures after several have occurred. The procedure is based solely on the historical record. Certain idealizations are introduced that allow the application of a sound statistical evaluation procedure. These idealizations are realized in practice to a sufficient degree that the proposed estimation procedure yields meaningful results, even for situations with a sparse data base represented by very few accidents. Under the assumption that the possible human-error-related failure times have exponential distributions, the statistical technique of isotonic regression is proposed to estimate the failure rates due to human design error at the failure times of the system. The last value in the sequence of estimates gives the residual accident chance. In addition, the actual situation is tested against the hypothesis that the failure rate of the system remains constant over time. This test determines the chance that a decreasing failure rate is incidental, rather than an indication of an actual learning process. Both techniques can be applied not merely to a single system but to an entire series of similar systems that a technology would generate, enabling the assessment of technological improvement. For the purpose of illustration, the nuclear decay of isotopes was chosen as an example, since the assumptions of the model are rigorously satisfied in this case. This application shows satisfactory agreement of the estimated and actual failure rates (which are exactly known in this example), although the estimation was deliberately based on a sparse historical record.
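
The isotonic-regression step can be sketched as follows. This is a hedged illustration of the general technique (pool-adjacent-violators enforcing a non-increasing failure rate, the signature of a learning process); the inter-arrival data and the crude per-interval rate estimates are invented for the example.

```python
# Sketch of the isotonic-regression idea: raw rate estimates at each failure
# time are pooled (PAVA) so the fitted failure rate is non-increasing.
# Data and the 1/gap rate estimator are illustrative, not the paper's.

def pava_decreasing(y, w=None):
    """Pool-adjacent-violators for a NON-INCREASING weighted L2 fit."""
    w = [1.0] * len(y) if w is None else list(w)
    vals, wts, sizes = [], [], []
    for yi, wi in zip(y, w):
        vals.append(yi); wts.append(wi); sizes.append(1)
        # pool neighbours while the non-increasing constraint is violated
        while len(vals) > 1 and vals[-2] < vals[-1]:
            v = (vals[-2] * wts[-2] + vals[-1] * wts[-1]) / (wts[-2] + wts[-1])
            wn, sn = wts[-2] + wts[-1], sizes[-2] + sizes[-1]
            vals[-2:], wts[-2:], sizes[-2:] = [v], [wn], [sn]
    out = []
    for v, s in zip(vals, sizes):
        out.extend([v] * s)
    return out

# inter-arrival times between successive major failures (illustrative, years)
gaps = [0.5, 1.0, 0.8, 2.0, 4.0]
raw_rates = [1.0 / g for g in gaps]     # crude rate estimate per interval
fitted = pava_decreasing(raw_rates)
print(fitted[-1])                       # residual rate after the last event
```

The last fitted value plays the role of the residual accident chance; testing whether the unconstrained sequence is genuinely decreasing rather than constant is the hypothesis test mentioned in the abstract.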

  1. Operator errors

    International Nuclear Information System (INIS)

    Knuefer; Lindauer

    1980-01-01

    At spectacular events, a combination of component failure and human error is often found. The Rasmussen Report and the German Risk Assessment Study in particular show, for pressurised water reactors, that human error must not be underestimated. Although operator errors, as a form of human error, can never be eliminated entirely, they can be minimized and their effects kept within acceptable limits if thorough training of personnel is combined with an adequate design of the plant against accidents. In contrast to the investigation of engineering errors, the investigation of human errors has so far been carried out with relatively small budgets. Intensified investigations in this field appear to be a worthwhile effort. (orig.)

  2. ERROR ANALYSIS IN THE TRAVEL WRITING MADE BY THE STUDENTS OF ENGLISH STUDY PROGRAM

    Directory of Open Access Journals (Sweden)

    Vika Agustina

    2015-05-01

    Full Text Available This study was conducted to identify the kinds of errors in the surface strategy taxonomy and to determine the dominant type of errors made by fifth-semester students of the English Department of one state university in Malang, Indonesia, in producing their travel writing. This study is a document analysis, since it analyses written materials, in this case travel writing texts. The analysis finds that the grammatical errors made by the students, based on surface strategy taxonomy theory, consist of four types: (1) omission, (2) addition, (3) misformation and (4) misordering. The most frequent misformation errors are in the use of tense forms; the next most frequent are omissions of noun/verb inflections; finally, many clauses contain an unnecessary added phrase.

  3. Error Floor Analysis of Coded Slotted ALOHA over Packet Erasure Channels

    DEFF Research Database (Denmark)

    Ivanov, Mikhail; Graell i Amat, Alexandre; Brannstrom, F.

    2014-01-01

    We present a framework for the analysis of the error floor of coded slotted ALOHA (CSA) for finite frame lengths over the packet erasure channel. The error floor is caused by stopping sets in the corresponding bipartite graph, whose enumeration is, in general, not a trivial problem. We therefore identify the most dominant stopping sets for the distributions of practical interest. The derived analytical expressions allow us to accurately predict the error floor at low to moderate channel loads and characterize the unequal error protection inherent in CSA.
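
A toy Monte-Carlo sketch can make the stopping-set mechanism concrete. The setup below (every user repeats its packet in two random slots, peeling/SIC decoding) is my simplification, not the paper's framework: at low load the surviving losses come from stopping sets, most dominantly two users choosing the same pair of slots.

```python
# Toy CSA simulation (assumptions mine): degree-2 repetitions, iterative
# peeling decoder. Frames stuck in a stopping set leave users undecoded,
# which produces the error floor at low load.
import random

def simulate(n_users, n_slots, trials, rng):
    lost = 0
    for _ in range(trials):
        choices = [rng.sample(range(n_slots), 2) for _ in range(n_users)]
        decoded = [False] * n_users
        progress = True
        while progress:
            progress = False
            occupancy = {}          # slot -> undecoded users transmitting there
            for u, slots in enumerate(choices):
                if not decoded[u]:
                    for s in slots:
                        occupancy.setdefault(s, []).append(u)
            for users in occupancy.values():
                if len(users) == 1 and not decoded[users[0]]:
                    decoded[users[0]] = True    # singleton slot: decode + cancel
                    progress = True
        lost += decoded.count(False)
    return lost / (trials * n_users)

rng = random.Random(4)
print(simulate(n_users=10, n_slots=50, trials=2000, rng=rng))
```

The dominant stopping set here is two users picking an identical slot pair, which happens for a given pair of users with probability 1/C(n_slots, 2); summing such terms over the dominant sets is the analytical error-floor prediction the abstract refers to.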

  4. The development of human factors technologies -The development of human behaviour analysis techniques-

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Heui; Park, Keun Ok; Chun, Se Woo; Suh, Sang Moon; Park, Jae Chang [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    In order to contribute to human error reduction through studies on human-machine interaction in nuclear power plants, this project has the objectives of developing SACOM (Simulation Analyzer with a Cognitive Operator Model) and techniques for human error analysis and application. This year we studied the following. For the development of SACOM: (1) site investigation of operator tasks; (2) development and revision of the operator task micro structure; (3) development of knowledge representation software and a SACOM prototype; (4) development of performance assessment methodologies in task simulation and analysis of the effects of performance shaping factors. For human error analysis and application techniques: (1) classification of error shaping factors (ESFs) and development of software for ESF evaluation; (2) analysis of human error occurrences and revision of the analysis procedure; (3) an experiment on human error data collection using a compact nuclear simulator; (4) development of a prototype database system for the analyzed information on trip cases. 55 figs, 23 tabs, 33 refs. (Author).

  5. Analysis of Student Errors on Division of Fractions

    Science.gov (United States)

    Maelasari, E.; Jupri, A.

    2017-02-01

    This study aims to describe the types of errors that typically occur when students carry out division operations on fractions, and to describe the causes of the students' mistakes. This research used a descriptive qualitative method and involved 22 fifth-grade students at one particular elementary school in Kuningan, Indonesia. The results showed that students' erroneous answers were caused by applying the same procedures to multiplication and division operations, by confusion when changing mixed fractions into common fractions, and by carelessness in calculation. From the students' written work on the fraction problems, we found an influence between the teaching methods used and the students' responses, and some student responses were beyond the researchers' predictions. We conclude that the teaching method is not the only important thing that must be prepared; the teacher should also prepare predictions of students' answers to the problems that will be given in the learning process. This could be a reflection for teachers to improve and to achieve the expected learning goals.

  6. Error Probability Analysis of Hardware Impaired Systems with Asymmetric Transmission

    KAUST Repository

    Javed, Sidrah; Amin, Osama; Ikki, Salama S.; Alouini, Mohamed-Slim

    2018-01-01

    Error probability study of the hardware impaired (HWI) systems highly depends on the adopted model. Recent models have proved that the aggregate noise is equivalent to improper Gaussian signals. Therefore, considering the distinct noise nature and self-interfering (SI) signals, an optimal maximum likelihood (ML) receiver is derived. This renders the conventional minimum Euclidean distance (MED) receiver a sub-optimal receiver, because it is based on the assumptions of ideal hardware transceivers and proper Gaussian noise in communication systems. Next, the average error probability performance of the proposed optimal ML receiver is analyzed and tight bounds and approximations are derived for various adopted systems, including transmitter and receiver I/Q imbalanced systems with or without transmitter distortions, as well as transmitter- or receiver-only impaired systems. Motivated by recent studies that shed light on the benefit of improper Gaussian signaling in mitigating the HWIs, asymmetric quadrature amplitude modulation or phase shift keying is optimized and adapted for transmission. Finally, different numerical and simulation results are presented to support the superiority of the proposed ML receiver over the MED receiver, the tightness of the derived bounds, and the effectiveness of asymmetric transmission in dampening HWIs and improving overall system performance.
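
A small numeric experiment illustrates the ML-versus-MED gap under improper Gaussian noise. The model below (QPSK with I/Q-correlated noise of known covariance) is an assumed simplification, not the paper's exact HWI model: with Gaussian noise the ML rule reduces to a Mahalanobis metric, while MED ignores the noise covariance.

```python
# Toy comparison (assumptions mine): symbol error rate of the Euclidean
# (MED) detector vs the covariance-aware (ML/Mahalanobis) detector when
# the additive noise is improper, i.e. the I and Q components correlate.
import numpy as np

rng = np.random.default_rng(3)
sym = np.array([1+1j, 1-1j, -1+1j, -1-1j]) / np.sqrt(2)   # unit-energy QPSK
cov = 0.3 * np.array([[1.0, 0.8], [0.8, 1.0]])            # improper noise
L = np.linalg.cholesky(cov)
Cinv = np.linalg.inv(cov)

N = 20000
tx = rng.integers(0, 4, N)
noise = L @ rng.normal(size=(2, N))
rx = sym[tx] + noise[0] + 1j * noise[1]

def detect(rx, W):
    """Pick the symbol minimizing the quadratic form d^T W d on (Re, Im)."""
    dr = rx.real[:, None] - sym.real[None, :]
    di = rx.imag[:, None] - sym.imag[None, :]
    metric = W[0, 0] * dr**2 + 2 * W[0, 1] * dr * di + W[1, 1] * di**2
    return np.argmin(metric, axis=1)

ser_med = np.mean(detect(rx, np.eye(2)) != tx)   # Euclidean metric
ser_ml = np.mean(detect(rx, Cinv) != tx)         # full-covariance (ML) metric
print(f"SER: MED = {ser_med:.4f}, ML = {ser_ml:.4f}")
```

The ML detector exploits the low-noise direction created by the I/Q correlation, which is the same intuition behind the asymmetric constellations the paper optimizes.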

  8. Patient safety in the clinical laboratory: a longitudinal analysis of specimen identification errors.

    Science.gov (United States)

    Wagar, Elizabeth A; Tamashiro, Lorraine; Yasin, Bushra; Hilborne, Lee; Bruckner, David A

    2006-11-01

    Patient safety is an increasingly visible and important mission for clinical laboratories. Attention is being paid by accreditation and regulatory organizations to improving processes related to patient identification and specimen labeling, because errors in these areas that jeopardize patient safety are common and avoidable through improvement in the total testing process. The objective was to assess patient identification and specimen labeling improvement after multiple implementation projects, using longitudinal statistical tools. Specimen errors were categorized by a multidisciplinary health care team. Patient identification errors were grouped into 3 categories: (1) specimen/requisition mismatch, (2) unlabeled specimens, and (3) mislabeled specimens. Specimens with these types of identification errors were compared preimplementation and postimplementation for 3 patient safety projects: (1) reorganization of phlebotomy (4 months); (2) introduction of an electronic event reporting system (10 months); and (3) activation of an automated processing system (14 months), over a 24-month period, using trend analysis and Student t test statistics. Of 16,632 total specimen errors, mislabeled specimens, requisition mismatches, and unlabeled specimens represented 1.0%, 6.3%, and 4.6% of errors, respectively. The Student t test showed a significant decrease in the most serious error, mislabeled specimens, across the 3 patient safety projects. Trend analysis demonstrated decreases in all 3 error types over 26 months. Applying performance-improvement strategies that focus longitudinally on specimen labeling errors can significantly reduce errors, thereby improving patient safety. This is an important area in which laboratory professionals, working in interdisciplinary teams, can improve safety and outcomes of care.
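
The pre/post comparison of a labeling-error proportion can be sketched with a two-proportion z-test, a close relative of the t statistics used in the study. The counts below are hypothetical, not the paper's data.

```python
# Hypothetical counts, not the study's data: mislabeled specimens out of
# all specimens before and after an intervention, compared with a
# two-proportion z-test (stdlib only).
from math import erf, sqrt

def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_two_sided

z, p = two_proportion_z(60, 300_000, 25, 300_000)
print(f"z = {z:.2f}, two-sided p = {p:.5f}")
```

With rare events like mislabeling, the denominators matter: the same absolute drop is far more convincing over hundreds of thousands of specimens than over a few thousand.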

  9. Incremental Volumetric Remapping Method: Analysis and Error Evaluation

    International Nuclear Information System (INIS)

    Baptista, A. J.; Oliveira, M. C.; Rodrigues, D. M.; Menezes, L. F.; Alves, J. L.

    2007-01-01

    In this paper the error associated with the remapping problem is analyzed. A range of numerical results that assess the performance of three different remapping strategies, applied to FE meshes of the kind typically used in sheet metal forming simulation, are evaluated. One of the selected strategies is the previously presented Incremental Volumetric Remapping (IVR) method, which was implemented in the in-house code DD3TRIM. The IVR method is founded on the premise that the state variables at all points associated with the Gauss volume of a given element are equal to the state variable quantities at the corresponding Gauss point. Hence, in a typical remapping procedure between a donor and a target mesh, the variables to be associated with a target Gauss volume (and point) are determined by a weighted average. The weight function is the percentage of the Gauss volume of each donor element that is located inside the target Gauss volume. The calculation of the intersecting volumes between the donor and target Gauss volumes is attained incrementally, for each target Gauss volume, by means of a discrete approach. The other two remapping strategies selected are based on the interpolation/extrapolation of variables using the finite element shape functions or moving least squares interpolants. The performance of the three remapping strategies is addressed with two tests. The first remapping test was taken from the literature: it consists of remapping a rotating symmetrical mesh successively, throughout N increments, over an angular span of 90 deg. The second remapping error evaluation test consists of remapping an irregular-element-shape target mesh from a given regular-element-shape donor mesh and proceeding with the inverse operation; in this second test the computational effort is also measured. The results showed that the error level associated with IVR can be very low, with a stable evolution along the number of remapping procedures, when compared with the
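
The weighted-average idea behind IVR can be sketched in one dimension, where interval overlaps stand in for the intersecting Gauss volumes. This is a deliberately simplified analogue, not the 3-D implementation in DD3TRIM; all names and numbers are mine.

```python
# 1-D analogue of the IVR weighted average: the value at a target cell is
# the overlap-length-weighted average of donor cell values (intervals
# stand in for Gauss volumes).

def overlap(a, b):
    """Length of the intersection of intervals a = (a0, a1), b = (b0, b1)."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def remap(donor_cells, donor_vals, target_cells):
    """Target value = overlap-weighted average of donor values."""
    out = []
    for t in target_cells:
        wsum = vsum = 0.0
        for d, v in zip(donor_cells, donor_vals):
            w = overlap(t, d)           # donor volume inside the target cell
            wsum += w
            vsum += w * v
        out.append(vsum / wsum if wsum > 0 else 0.0)
    return out

donor = [(0.0, 1.0), (1.0, 2.0), (2.0, 3.0)]
vals = [10.0, 20.0, 30.0]
target = [(0.5, 1.5), (1.5, 2.5)]
print(remap(donor, vals, target))       # each target straddles two donors
```

Because the weights are geometric overlaps, the scheme is conservative in the averaging sense: a target cell fully covered by donors never takes a value outside the range of the donor values it intersects.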

  10. ANALYSIS AND CORRECTION OF SYSTEMATIC HEIGHT MODEL ERRORS

    Directory of Open Access Journals (Sweden)

    K. Jacobsen

    2016-06-01

    Full Text Available The geometry of digital height models (DHM) determined with optical satellite stereo combinations depends upon the image orientation, influenced by the satellite camera, the system calibration and the attitude registration. As a standard these days, the image orientation is available in the form of rational polynomial coefficients (RPC). Usually a bias correction of the RPC based on ground control points is required; in most cases the bias correction requires an affine transformation, sometimes only shifts, in image or object space. For some satellites and some cases, e.g. caused by a small base length, such an image orientation does not reach the possible accuracy of height models. As reported e.g. by Yong-hua et al. 2015 and Zhang et al. 2015, the Chinese stereo satellite ZiYuan-3 (ZY-3) in particular has a limited calibration accuracy and an attitude recording of just 4 Hz, which may not be satisfactory. Zhang et al. 2015 tried to improve the attitude based on the color sensor bands of ZY-3, but the color images are not always available, nor is detailed satellite orientation information. There is a tendency toward systematic deformation in a Pléiades tri-stereo combination with small base length: the small base length magnifies small systematic errors in object space. But systematic height model errors have also been detected in some other satellite stereo combinations. The largest influence is unsatisfactory leveling of the height models, but low-frequency height deformations can also be seen. A tilt of the DHM can in theory be eliminated by ground control points (GCP), but often the GCP accuracy and distribution are not optimal, not allowing a correct leveling of the height model. In addition, a model deformation at GCP locations may prevent optimal DHM leveling. Supported by reference height models, better accuracy has been reached. As reference height model the Shuttle Radar Topography Mission (SRTM) digital surface model (DSM) or the new AW3D30 DSM, based on ALOS
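
The tilt/offset correction can be sketched as a least-squares affine fit to the height residuals at GCPs, subtracted from the DHM. The data below are synthetic and the function name is mine; this is only the leveling step, not the full RPC bias correction.

```python
# Sketch with synthetic data: fit an affine trend a + b*x + c*y to the
# DHM-minus-GCP height residuals and subtract it to level the model.
import numpy as np

def fit_affine_bias(x, y, dh):
    """Least-squares plane through the GCP height residuals."""
    A = np.column_stack([np.ones_like(x), x, y])
    coeff, *_ = np.linalg.lstsq(A, dh, rcond=None)
    return coeff                                  # (a, b, c)

rng = np.random.default_rng(0)
x = rng.uniform(0, 1000, 50)                      # GCP coordinates, m
y = rng.uniform(0, 1000, 50)
tilt = 0.5 + 0.002 * x - 0.001 * y                # synthetic systematic error, m
dh = tilt + rng.normal(0, 0.05, 50)               # plus GCP/DHM noise
a, b, c = fit_affine_bias(x, y, dh)
corrected = dh - (a + b * x + c * y)
print(f"a={a:.3f} b={b:.5f} c={c:.5f} "
      f"residual RMS={np.sqrt(np.mean(corrected ** 2)):.3f} m")
```

As the abstract notes, the fit is only as good as the GCP quality and distribution: clustered or biased control points let low-frequency deformations survive the leveling.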

  11. Human-simulation-based learning to prevent medication error: A systematic review.

    Science.gov (United States)

    Sarfati, Laura; Ranchon, Florence; Vantard, Nicolas; Schwiertz, Vérane; Larbre, Virginie; Parat, Stéphanie; Faudel, Amélie; Rioufol, Catherine

    2018-01-31

    In the past 2 decades, there has been an increasing interest in simulation-based learning programs to prevent medication error (ME). To improve knowledge, skills, and attitudes in prescribers, nurses, and pharmaceutical staff, these methods enable training without directly involving patients. However, best practices for simulation for healthcare providers are as yet undefined. By analysing the current state of experience in the field, the present review aims to assess whether human simulation in healthcare helps to reduce ME. A systematic review was conducted on Medline from 2000 to June 2015, associating the terms "Patient Simulation," "Medication Errors," and "Simulation Healthcare." Reports of technology-based simulation were excluded, to focus exclusively on human simulation in nontechnical skills learning. Twenty-one studies assessing simulation-based learning programs were selected, focusing on pharmacy, medicine or nursing students, or concerning programs aimed at reducing administration or preparation errors, managing crises, or learning communication skills for healthcare professionals. The studies varied in design, methodology, and assessment criteria. Few demonstrated that simulation was more effective than didactic learning in reducing ME. This review highlights a lack of long-term assessment and real-life extrapolation, with limited scenarios and participant samples. These various experiences, however, help in identifying the key elements required for an effective human simulation-based learning program for ME prevention: i.e., scenario design, debriefing, and perception assessment. The performance of these programs depends on their ability to reflect reality and on professional guidance. Properly regulated simulation is a good way to train staff in events that happen only exceptionally, as well as in standard daily activities. By integrating human factors, simulation seems to be effective in preventing iatrogenic risk related to ME, if the program is

  12. Faces in places: humans and machines make similar face detection errors.

    Directory of Open Access Journals (Sweden)

    Bernard Marius 't Hart

    Full Text Available The human visual system seems to be particularly efficient at detecting faces. This efficiency sometimes comes at the cost of wrongfully seeing faces in arbitrary patterns, including famous examples such as a rock configuration on Mars or a toast's roast patterns. In machine vision, face detection has made considerable progress and has become a standard feature of many digital cameras. The arguably most widespread algorithm for such applications (the "Viola-Jones" algorithm) achieves high detection rates at high computational efficiency. To what extent do the patterns that the algorithm mistakenly classifies as faces also fool humans? We selected three kinds of stimuli from real-life, first-person perspective movies based on the algorithm's output: correct detections ("real faces"), false positives ("illusory faces") and correctly rejected locations ("non faces"). Observers were shown pairs of these for 20 ms and had to direct their gaze to the location of the face. We found that illusory faces were mistaken for faces more frequently than non faces. In addition, rotation of the real face yielded more errors, while rotation of the illusory face yielded fewer errors. Using colored stimuli increases overall performance, but does not change the pattern of results. When replacing the eye movement by a manual response, however, the preference for illusory faces over non faces disappeared. Taken together, our data show that humans make similar face-detection errors as the Viola-Jones algorithm when directing their gaze to briefly presented stimuli. In particular, the relative spatial arrangement of oriented filters seems to be of relevance. This suggests that efficient face detection in humans is likely to be pre-attentive and based on rather simple features such as those encoded in the early visual system.

  13. Stochastic and sensitivity analysis of shape error of inflatable antenna reflectors

    Science.gov (United States)

    San, Bingbing; Yang, Qingshan; Yin, Liwei

    2017-03-01

    Inflatable antennas are promising candidates to realize future satellite communications and space observations since they are lightweight, low-cost and small-packaged-volume. However, due to their high flexibility, inflatable reflectors are difficult to manufacture accurately, which may result in undesirable shape errors, and thus affect their performance negatively. In this paper, the stochastic characteristics of shape errors induced during manufacturing process are investigated using Latin hypercube sampling coupled with manufacture simulations. Four main random error sources are involved, including errors in membrane thickness, errors in elastic modulus of membrane, boundary deviations and pressure variations. Using regression and correlation analysis, a global sensitivity study is conducted to rank the importance of these error sources. This global sensitivity analysis is novel in that it can take into account the random variation and the interaction between error sources. Analyses are parametrically carried out with various focal-length-to-diameter ratios (F/D) and aperture sizes (D) of reflectors to investigate their effects on significance ranking of error sources. The research reveals that RMS (Root Mean Square) of shape error is a random quantity with an exponent probability distribution and features great dispersion; with the increase of F/D and D, both mean value and standard deviation of shape errors are increased; in the proposed range, the significance ranking of error sources is independent of F/D and D; boundary deviation imposes the greatest effect with a much higher weight than the others; pressure variation ranks the second; error in thickness and elastic modulus of membrane ranks the last with very close sensitivities to pressure variation. Finally, suggestions are given for the control of the shape accuracy of reflectors and allowable values of error sources are proposed from the perspective of reliability.
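
The sampling-plus-ranking workflow can be sketched as follows. The response model is a toy stand-in for the membrane FE simulation, with the boundary-deviation coefficient made dominant to mirror the paper's finding; only the Latin hypercube construction and the correlation-based ranking are the point here.

```python
# Toy global sensitivity workflow: Latin hypercube samples of four error
# sources drive an invented linear response, and sources are ranked by
# |correlation| with the RMS shape error. Coefficients are assumptions.
import numpy as np

def latin_hypercube(n, k, rng):
    """n samples in k dims; each marginal gets one sample per 1/n stratum."""
    perms = rng.permuted(np.tile(np.arange(n), (k, 1)), axis=1).T   # (n, k)
    return (perms + rng.uniform(size=(n, k))) / n

rng = np.random.default_rng(1)
X = latin_hypercube(500, 4, rng)    # thickness, modulus, boundary, pressure
rms = (0.1 * X[:, 0] + 0.1 * X[:, 1] + 1.0 * X[:, 2] + 0.3 * X[:, 3]
       + rng.normal(0, 0.02, 500))  # stand-in for the FE shape-error output
names = ["thickness", "modulus", "boundary", "pressure"]
corrs = [abs(np.corrcoef(X[:, j], rms)[0, 1]) for j in range(4)]
ranking = sorted(zip(names, corrs), key=lambda t: -t[1])
print(ranking)
```

Because every marginal is stratified, the hypercube covers each input range evenly with far fewer runs than plain Monte Carlo, which is why it pairs well with expensive manufacture simulations.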

  14. Contribution of Error Analysis to Foreign Language Teaching

    Directory of Open Access Journals (Sweden)

    Vacide ERDOĞAN

    2014-01-01

    Full Text Available It is inevitable that learners make mistakes in the process of foreign language learning. However, what is questioned by language teachers is why students go on making the same mistakes even when such mistakes have been repeatedly pointed out to them. Yet not all mistakes are the same; sometimes they seem to be deeply ingrained, but at other times students correct themselves with ease. Thus, researchers and teachers of foreign languages came to realize that the mistakes a person makes in the process of constructing a new system of language need to be analyzed carefully, for they possibly hold some of the keys to the understanding of second language acquisition. In this respect, the aim of this study is to point out the significance of learners' errors, for they provide evidence of how language is learned and what strategies or procedures the learners are employing in the discovery of the language.

  15. Study on analysis from sources of error for Airborne LIDAR

    Science.gov (United States)

    Ren, H. C.; Yan, Q.; Liu, Z. J.; Zuo, Z. Q.; Xu, Q. Q.; Li, F. F.; Song, C.

    2016-11-01

    With the advancement of aerial photogrammetry, obtaining geo-spatial information of high spatial and temporal resolution has become feasible, and Airborne LIDAR measurement techniques provide a new technical means for it, with unique advantages and broad application prospects. Airborne LIDAR is increasingly becoming a new kind of earth observation technology: mounted on an aviation platform, it emits and receives laser pulses to obtain high-precision, high-density three-dimensional point cloud coordinates and intensity information. In this paper, we briefly describe airborne laser radar systems, analyze in detail several error sources in Airborne LIDAR data, and put forward corresponding methods to avoid or eliminate them. Taking into account practical engineering applications, some recommendations are developed for these designs, which has crucial theoretical and practical significance in Airborne LIDAR data processing.

  16. Systematic analysis of video data from different human–robot interaction studies: a categorization of social signals during error situations

    Science.gov (United States)

    Giuliani, Manuel; Mirnig, Nicole; Stollnberger, Gerald; Stadler, Susanne; Buchner, Roland; Tscheligi, Manfred

    2015-01-01

    Human–robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human–robot interaction experiments. For that, we analyzed 201 videos of five human–robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction. Technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but they often smile, when in an error situation with the robot. Another result is that the participants sometimes stop moving at the beginning of error situations. We also found that the participants talked more in the case of social norm violations and less during technical failures. Finally, the participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking), when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human–robot interaction systems. The builders need to consider including modules for recognition and classification of head movements to the robot input channels. The evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies. PMID:26217266

  17. Hebbian errors in learning: an analysis using the Oja model.

    Science.gov (United States)

    Rădulescu, Anca; Cox, Kingsley; Adams, Paul

    2009-06-21

    Recent work on long term potentiation in brain slices shows that Hebb's rule is not completely synapse-specific, probably due to intersynapse diffusion of calcium or other factors. We previously suggested that such errors in Hebbian learning might be analogous to mutations in evolution. We examine this proposal quantitatively, extending the classical Oja unsupervised model of learning by a single linear neuron to include Hebbian inspecificity. We introduce an error matrix E, which expresses possible crosstalk between updating at different connections. When there is no inspecificity, this gives the classical result of convergence to the first principal component of the input distribution (PC1). We show the modified algorithm converges to the leading eigenvector of the matrix EC, where C is the input covariance matrix. In the most biologically plausible case when there are no intrinsically privileged connections, E has diagonal elements Q and off-diagonal elements (1-Q)/(n-1), where Q, the quality, is expected to decrease with the number of inputs n and with a synaptic parameter b that reflects synapse density, calcium diffusion, etc. We study the dependence of the learning accuracy on b, n and the amount of input activity or correlation (analytically and computationally). We find that the error increases (learning becomes gradually less useful) with increases in b, particularly for intermediate (i.e., biologically realistic) correlation strength, although some useful learning always occurs up to the trivial limit Q=1/n. We discuss the relation of our results to Hebbian unsupervised learning in the brain. When the mechanism lacks specificity, the network fails to learn the expected, and typically most useful, result, especially when the input correlation is weak. Hebbian crosstalk would reflect the very high density of synapses along dendrites, and inevitably degrades learning.
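
The convergence claim can be checked numerically with the expected (averaged) dynamics of the crosstalk-modified Oja rule. The covariance below is random and the step size and iteration count are arbitrary choices of mine; the check is whether the weight vector aligns with the leading eigenvector of EC rather than of C.

```python
# Numerical check of the stated result (parameters illustrative): averaged
# Oja dynamics with crosstalk matrix E converge to the leading eigenvector
# of E @ C. E has diagonal Q and off-diagonal (1-Q)/(n-1), as in the paper.
import numpy as np

n, Q = 4, 0.8                                  # n inputs, quality Q
E = ((Q - (1 - Q) / (n - 1)) * np.eye(n)
     + (1 - Q) / (n - 1) * np.ones((n, n)))
rng = np.random.default_rng(2)
A = rng.normal(size=(n, n))
C = A @ A.T / n                                # a random input covariance
w = rng.normal(size=n)
w /= np.linalg.norm(w)
for _ in range(20000):                         # expected-update Oja iteration
    w += 0.05 * (E @ C @ w - (w @ C @ w) * w)
    w /= np.linalg.norm(w)

vals, vecs = np.linalg.eig(E @ C)              # EC has a real spectrum here
lead = np.real(vecs[:, np.argmax(np.real(vals))])
alignment = abs(w @ lead)
print(f"|cos| between w and leading eigenvector of EC: {alignment:.4f}")
```

Setting Q = 1 makes E the identity and recovers the classical convergence to PC1 of C, so the same script doubles as a sanity check of the error-free case.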

  18. Descriptions of verbal communication errors between staff. An analysis of 84 root cause analysis-reports from Danish hospitals

    DEFF Research Database (Denmark)

    Rabøl, Louise Isager; Andersen, Mette Lehmann; Østergaard, Doris

    2011-01-01

    Introduction Poor teamwork and communication between healthcare staff are correlated to patient safety incidents. However, the organisational factors responsible for these issues are unexplored. Root cause analyses (RCA) use human factors thinking to analyse the systems behind severe patient safety incidents. The objective of this study is to review RCA reports (RCAR) for characteristics of verbal communication errors between hospital staff in an organisational perspective. Method Two independent raters analysed 84 RCARs, conducted in six Danish hospitals between 2004 and 2006, for descriptions and characteristics of verbal communication errors such as handover errors and errors during teamwork. Results Raters found descriptions of verbal communication errors in 44 reports (52%). These included handover errors (35 (86%)), communication errors between different staff groups (19 (43%)), and misunderstandings (13

  19. The study of error for analysis in dynamic image from the error of count rates in NaI (Tl) scintillation camera

    International Nuclear Information System (INIS)

    Oh, Joo Young; Kang, Chun Goo; Kim, Jung Yul; Oh, Ki Baek; Kim, Jae Sam; Park, Hoon Hee

    2013-01-01

    This study aimed to evaluate the effect of T1/2 on count rates in the analysis of dynamic scans using a NaI (Tl) scintillation camera, and to suggest a new quality control method based on this effect. We produced a point source with 99mTcO4- of 18.5 to 185 MBq in 2 mL syringes, and acquired 30 frames of dynamic images of 10 to 60 seconds each using an Infinia gamma camera (GE, USA). In the second experiment, 90 frames of dynamic images were acquired from a 74 MBq point source by 5 gamma cameras (Infinia 2, Forte 2, Argus 1). There were no significant differences in the average count rates of the sources with 18.5 to 92.5 MBq in the analysis of 10 to 60 seconds/frame with 10 second intervals in the first experiment (p>0.05), but average count rates were significantly low for sources over 111 MBq at 60 seconds/frame (p<0.01). According to the linear regression of the count rates of the 5 gamma cameras acquired over 90 minutes, the counting efficiency of the fourth gamma camera was lowest, at 0.0064%, while its gradient and coefficient of variation were highest, at 0.0042 and 0.229 respectively. We could not find abnormal fluctuation in the χ2 test of count rates (p>0.02), and Levene's F-test showed homogeneity of variance among the gamma cameras (p>0.05). In the correlation analysis, the only significant relationship was a negative correlation between counting efficiency and gradient (r=-0.90, p<0.05). Lastly, calculating the T1/2 error from a change of gradient of -0.25% to +0.25% showed that if T1/2 is relatively long, or the gradient is high, the error increases correspondingly. When estimating the value for the 4th camera, which has the highest gradient, we could not see a T1/2 error within 60 minutes. In conclusion, strict quality management of radiation measurement is necessary for scintillation gamma cameras in the medical field. Especially, we found a
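    The half-life estimation underlying this kind of analysis can be sketched as follows: fit a linear regression to the log of the frame counts, convert the gradient to T1/2, then perturb the gradient by -0.25% to +0.25% as in the abstract. The initial count rate and frame scheme below are hypothetical; only the 99mTc half-life is a physical constant.

```python
import numpy as np

T_HALF = 6.0067 * 3600.0           # physical half-life of 99mTc, in seconds
lam = np.log(2.0) / T_HALF         # decay constant

rng = np.random.default_rng(1)
t = np.arange(90) * 60.0           # 90 one-minute frames, as in the study
rate0 = 5.0e4                      # hypothetical initial count rate (cps)
counts = rng.poisson(rate0 * 60.0 * np.exp(-lam * t))  # Poisson counts per frame

# Linear regression of log counts vs. time: the gradient is -lambda
slope, intercept = np.polyfit(t, np.log(counts), 1)
t_half_est = np.log(2.0) / -slope
rel_err = (t_half_est - T_HALF) / T_HALF
print(f"estimated T1/2 = {t_half_est / 3600.0:.3f} h (rel. error {rel_err:+.4%})")

# Effect of a -0.25% to +0.25% change in the gradient on the estimated T1/2
for dg in (-0.0025, 0.0, 0.0025):
    print(f"gradient change {dg:+.2%}: "
          f"T1/2 = {np.log(2.0) / (-slope * (1.0 + dg)) / 3600.0:.3f} h")
```

    Because T1/2 is inversely proportional to the gradient, a fixed fractional change in the gradient maps directly onto a fractional change in the estimated half-life, which is why cameras with a drifting gradient need strict quality control.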

  20. Visual Analysis of Humans

    CERN Document Server

    Moeslund, Thomas B

    2011-01-01

    This unique text/reference provides a coherent and comprehensive overview of all aspects of video analysis of humans. Broad in coverage and accessible in style, the text presents original perspectives collected from preeminent researchers gathered from across the world. In addition to presenting state-of-the-art research, the book reviews the historical origins of the different existing methods, and predicts future trends and challenges. This title: features a Foreword by Professor Larry Davis; contains contributions from an international selection of leading authorities in the field; includes

  1. Using human error theory to explore the supply of non-prescription medicines from community pharmacies.

    Science.gov (United States)

    Watson, M C; Bond, C M; Johnston, M; Mearns, K

    2006-08-01

    The importance of theory in underpinning interventions to promote effective professional practice is gaining recognition. The Medical Research Council framework for complex interventions has assisted in promoting awareness and adoption of theory into study design. Human error theory has previously been used by high risk industries but its relevance to healthcare settings and patient safety requires further investigation. This study used this theory as a framework to explore non-prescription medicine supply from community pharmacies. The relevance to other healthcare settings and behaviours is discussed. A 25% random sample was made of 364 observed consultations for non-prescription medicines. Each of the 91 consultations was assessed by two groups: a consensus group (stage 1) to identify common problems with the consultation process, and an expert group (stages 2 and 3) to apply human error theory to these consultations. Paired assessors (most of whom were pharmacists) categorised the perceived problems occurring in each consultation (stage 1). During stage 2 paired assessors from an expert group (comprising patient safety experts, community pharmacists and psychologists) considered whether each consultation was compliant with professional guidelines for the supply of pharmacy medicines. Each non-compliant consultation identified during stage 2 was then categorised as a slip/lapse, mistake, or violation using human error theory (stage 3). During stage 1 most consultations (n = 75, 83%) were deemed deficient in information exchange. At stage 2, paired assessors varied in attributing non-compliance to specific error types. Where agreement was achieved, the error type most often selected was "violation" (n = 27, 51.9%, stage 3). Consultations involving product requests were less likely to be guideline compliant than symptom presentations (OR 0.30, 95% CI 0.10 to 0.95, p = 0.05). The large proportion of consultations classified as violations suggests that either

  2. Boundary error analysis and categorization in the TRECVID news story segmentation task

    NARCIS (Netherlands)

    Arlandis, J.; Over, P.; Kraaij, W.

    2005-01-01

    In this paper, an error analysis based on boundary error popularity (frequency), including semantic boundary categorization, is applied in the context of the news story segmentation task from TRECVID. Clusters of systems were defined based on the input resources they used, including video, audio and

  3. An Analysis of College Students' Attitudes towards Error Correction in EFL Context

    Science.gov (United States)

    Zhu, Honglin

    2010-01-01

    This article is based on a survey of college students' attitudes towards error correction by their teachers in the process of teaching and learning, and is intended to improve language teachers' understanding of the nature of error correction. Based on the analysis, the article expounds some principles and techniques that can be applied in the process…

  4. Analysis of Errors and Misconceptions in the Learning of Calculus by Undergraduate Students

    Science.gov (United States)

    Muzangwa, Jonatan; Chifamba, Peter

    2012-01-01

    This paper analyses errors and misconceptions in an undergraduate course in Calculus. The study is based on a group of 10 BEd. Mathematics students at Great Zimbabwe University. Data was gathered through the use of two exercises on Calculus 1 and 2. The analysis of the results from the tests showed that a majority of the errors were due…

  5. Development of the Human Error Management Criteria and the Job Aptitude Evaluation Criteria for Rail Safety Personnel

    Energy Technology Data Exchange (ETDEWEB)

    Koo, In Soo; Seo, Sang Mun; Park, Geun Ok (and others)

    2008-08-15

    It has been estimated that up to 90% of all workplace accidents have human error as a cause. Human error has been widely recognized as a key factor in almost all highly publicized accidents, including the Daegu subway fire of February 18, 2003, which killed 198 people and injured 147. Because most human behavior is 'unintentional', carried out automatically, the root causes of human error should be carefully investigated and regulated by a legal authority. The final goal of this study is to set up regulatory guidance to be used by the Korean rail organizations concerned with safety management, namely: - to develop the regulatory guidance for managing human error, - to develop the regulatory guidance for managing qualifications of rail drivers, - to develop the regulatory guidance for evaluating the aptitude of safety-related personnel.

  6. Development of the Human Error Management Criteria and the Job Aptitude Evaluation Criteria for Rail Safety Personnel

    International Nuclear Information System (INIS)

    Koo, In Soo; Seo, Sang Mun; Park, Geun Ok

    2008-08-01

    It has been estimated that up to 90% of all workplace accidents have human error as a cause. Human error has been widely recognized as a key factor in almost all highly publicized accidents, including the Daegu subway fire of February 18, 2003, which killed 198 people and injured 147. Because most human behavior is 'unintentional', carried out automatically, the root causes of human error should be carefully investigated and regulated by a legal authority. The final goal of this study is to set up regulatory guidance to be used by the Korean rail organizations concerned with safety management, namely: - to develop the regulatory guidance for managing human error, - to develop the regulatory guidance for managing qualifications of rail drivers, - to develop the regulatory guidance for evaluating the aptitude of safety-related personnel.

  7. An Analysis of Students Error In Solving PISA 2012 And Its Scaffolding

    Directory of Open Access Journals (Sweden)

    Yurizka Melia Sari

    2017-08-01

    Full Text Available Based on the PISA survey in 2012, Indonesia placed only 64th out of 65 participating countries. The survey suggests that students' abilities in reasoning, spatial orientation, and problem solving are lower compared with other participating countries, especially in South East Asia. Nevertheless, the PISA results do not clearly elicit the students' difficulties in solving PISA problems, such as the location and the types of students' errors. Therefore, analyzing students' errors in solving PISA problems would be an essential countermeasure to help students solve mathematics problems and to develop scaffolding. Based on the data analysis, it was found that there are 5 types of error made by the subjects: reading error, comprehension error, transformation error, process skill error, and encoding error. The most common mistake the subjects made was encoding error, with a percentage of 26%, while reading errors were the fewest, at only 12%. The types of scaffolding given were explaining the problem carefully, making a summary of new words and finding their meaning, restructuring problem-solving strategies, and reviewing the results of the completion of the problem.

  8. Problems of accuracy and sources of error in trace analysis of elements

    International Nuclear Information System (INIS)

    Porat, Ze'ev.

    1995-07-01

    The technological developments in the field of analytical chemistry in recent years facilitate trace analysis of materials at sub-ppb levels. This provides important information regarding the presence of various trace elements in the human body, in drinking water and in the environment. However, it also exposes the measurements to more severe problems of contamination and inaccuracy due to the high sensitivity of the analytical methods. The sources of error are numerous and can be grouped into three main categories: (a) impurities from various sources; (b) loss of material during sample processing; (c) problems of calibration and interference. These difficulties are discussed here in detail, together with some practical solutions and examples. (authors) 8 figs., 2 tabs., 18 refs.

  9. Problems of accuracy and sources of error in trace analysis of elements

    Energy Technology Data Exchange (ETDEWEB)

    Porat, Ze'ev

    1995-07-01

    The technological developments in the field of analytical chemistry in recent years facilitate trace analysis of materials at sub-ppb levels. This provides important information regarding the presence of various trace elements in the human body, in drinking water and in the environment. However, it also exposes the measurements to more severe problems of contamination and inaccuracy due to the high sensitivity of the analytical methods. The sources of error are numerous and can be grouped into three main categories: (a) impurities from various sources; (b) loss of material during sample processing; (c) problems of calibration and interference. These difficulties are discussed here in detail, together with some practical solutions and examples. (authors) 8 figs., 2 tabs., 18 refs.

  10. Two-component model application for error calculus in the environmental monitoring data analysis

    International Nuclear Information System (INIS)

    Carvalho, Maria Angelica G.; Hiromoto, Goro

    2002-01-01

    Analysis and interpretation of the results of an environmental monitoring program is often based on the evaluation of the mean value of a particular set of data, which is strongly affected by the analytical errors associated with each measurement. A model proposed by Rocke and Lorenzato assumes two error components, one additive and one multiplicative, to deal with lower and higher concentration values in a single model. In this communication, an application of this method for re-evaluation of the errors reported in a large set of results of total alpha measurements in an environmental sample is presented. The results show that the mean values calculated taking the new errors into account are higher than those obtained with the original errors, indicating that the analytical errors reported before were underestimated in the region of lower concentrations. (author)
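    The two-component model can be written as x = mu*exp(eta) + eps, with eta and eps zero-mean Gaussian: the additive term eps dominates near the detection limit, the multiplicative term at high concentrations. A minimal sketch of the resulting concentration-dependent standard deviation follows; the parameter values are assumed for illustration, not taken from the paper.

```python
import numpy as np

def rocke_lorenzato_sd(mu, sigma_eps, sigma_eta):
    """SD of x = mu * exp(eta) + eps under the Rocke-Lorenzato two-component
    model, with eta ~ N(0, sigma_eta^2) and eps ~ N(0, sigma_eps^2)."""
    s2 = sigma_eta ** 2
    # Var(exp(eta)) for a zero-mean lognormal factor is exp(s2)*(exp(s2)-1)
    return np.sqrt(sigma_eps ** 2 + mu ** 2 * np.exp(s2) * (np.exp(s2) - 1.0))

# Assumed parameters (sample units): additive noise 0.05, ~10% proportional noise
sigma_eps, sigma_eta = 0.05, 0.10
for mu in (0.01, 0.1, 1.0, 10.0):
    print(f"mu = {mu:6.2f}  SD = {rocke_lorenzato_sd(mu, sigma_eps, sigma_eta):.4f}")
```

    At low concentration the SD flattens out at sigma_eps instead of shrinking toward zero, which is exactly why constant-relative-error models underestimate the uncertainty of low-concentration results.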

  11. Analysis technique for controlling system wavefront error with active/adaptive optics

    Science.gov (United States)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
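    The core fitting step such a tool performs can be sketched generically: given a linear optics model in the form of an actuator influence matrix, solve a least-squares problem for the actuator commands that minimize the residual WFE. The matrices below are random stand-ins, not SigFit output, and the residual of the fit serves as the error estimate mentioned above.

```python
import numpy as np

rng = np.random.default_rng(3)
n_nodes, n_act = 500, 12   # wavefront sample points and actuators (hypothetical)

# Influence matrix: column j is the WFE produced by unit command on actuator j
A = rng.normal(size=(n_nodes, n_act))
# Disturbance: a correctable part in the span of A plus uncorrectable noise
w_dist = A @ rng.normal(size=n_act) + 0.05 * rng.normal(size=n_nodes)

# Least-squares actuator commands that cancel the correctable part of the WFE
c, *_ = np.linalg.lstsq(A, -w_dist, rcond=None)
residual = w_dist + A @ c   # fitting error: the part the actuators cannot remove

rms_before = np.sqrt(np.mean(w_dist ** 2))
rms_after = np.sqrt(np.mean(residual ** 2))
print(f"RMS WFE: {rms_before:.3f} -> {rms_after:.3f}")
```

    The residual RMS is the quantity an error estimate reports for influence-function fitting: it bounds how well any command vector can correct the given disturbance.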

  12. Detection method of nonlinearity errors by statistical signal analysis in heterodyne Michelson interferometer.

    Science.gov (United States)

    Hu, Juju; Hu, Haijiang; Ji, Yinghua

    2010-03-15

    Periodic nonlinearity ranging from a few nanometers to tens of nanometers limits the use of the heterodyne interferometer in high accuracy measurement. A novel method is studied to detect the nonlinearity errors, based on electrical subdivision and statistical signal analysis, in a heterodyne Michelson interferometer. With the micropositioning platform moving at uniform velocity, the method detects the nonlinearity errors using regression analysis and Jackknife estimation. Based on the analysis of the simulations, the method can estimate the influence of nonlinearity errors and other noises on dimensional measurement in the heterodyne Michelson interferometer.
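    The regression-plus-Jackknife step can be illustrated with a generic sketch (the displacement record below is simulated, not from the paper): fit the slope of a uniform-velocity trace contaminated by a small periodic nonlinearity, then apply leave-one-out Jackknife resampling to obtain a bias-corrected slope estimate and its standard error.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
t = np.linspace(0.0, 1.0, n)
# Hypothetical uniform-velocity displacement (arbitrary units) plus a small
# periodic nonlinearity term and white measurement noise
x = 50.0 * t + 2e-3 * np.sin(2 * np.pi * 25 * t) + rng.normal(0.0, 1e-3, n)

def slope(tt, xx):
    """Least-squares gradient of xx vs. tt."""
    return np.polyfit(tt, xx, 1)[0]

theta = slope(t, x)
# Leave-one-out Jackknife: recompute the slope with each sample deleted
loo = np.array([slope(np.delete(t, i), np.delete(x, i)) for i in range(n)])
jack_mean = loo.mean()
jack_se = np.sqrt((n - 1) / n * np.sum((loo - jack_mean) ** 2))
theta_jack = n * theta - (n - 1) * jack_mean   # bias-corrected estimate
print(f"slope {theta:.5f}, jackknife estimate {theta_jack:.5f}, SE {jack_se:.2e}")
```

    Samples whose deletion shifts the slope the most are the ones most distorted by the periodic nonlinearity, so the Jackknife spread gives a handle on its influence without modelling the nonlinearity explicitly.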

  13. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared, with a focus on dependency determination, between the Institute Jozef Stefan human reliability analysis (IJS-HRA) and the standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by the development of more detailed guidelines for human reliability analysis, with many practical examples for all steps of the process of evaluating human performance.
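    To see why dependency determination is such a sensitive step, consider the THERP dependency equations used by SPAR-H (per NUREG/CR-6883, to the best of this sketch's accuracy): the assigned dependency level can move a conditional human error probability by orders of magnitude.

```python
def conditional_hep(p, level):
    """Conditional HEP of a dependent task given nominal HEP p and a
    dependency level, using the THERP equations adopted by SPAR-H."""
    table = {
        "zero":     lambda q: q,
        "low":      lambda q: (1 + 19 * q) / 20,
        "moderate": lambda q: (1 + 6 * q) / 7,
        "high":     lambda q: (1 + q) / 2,
        "complete": lambda q: 1.0,
    }
    return table[level](p)

p = 1e-3  # nominal HEP of the second (dependent) human failure event
for level in ("zero", "low", "moderate", "high", "complete"):
    print(f"{level:9s} -> conditional HEP = {conditional_hep(p, level):.5f}")
```

    A nominal HEP of 1e-3 becomes roughly 0.05 under low dependency and 0.5 under high dependency, so two analysts who assign different levels to the same pair of events can disagree by a factor of several hundred; this is the subjectivity the paper quantifies.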

  14. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2007-01-01

    The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on