WorldWideScience

Sample records for human error probability

  1. The probability and the management of human error

    International Nuclear Information System (INIS)

    Dufey, R.B.; Saull, J.W.

    2004-01-01

    Embedded within modern technological systems, human error is the largest, and indeed dominant, contributor to accident causation, and its consequences dominate the risk profiles for nuclear power and for many other technologies. We need to quantify the probability of human error for the system as an integral contribution within the overall system failure, as it is generally not separable or predictable for actual events. We also need a means to manage and effectively reduce the failure (error) rate. The fact that humans learn from their mistakes allows a new determination of the dynamic probability and human failure (error) rate in technological systems. The result is consistent with and derived from the available world data for modern technological systems. Comparisons are made to actual data from large technological systems and recent catastrophes. Best-estimate values and relationships can be derived for both the human error rate and the probability. We describe the potential for new approaches to the management of human error and safety indicators, based on the principles of error state exclusion and of the systematic effect of learning. A new equation is given for the probability of human error (λ) that combines the influences of early inexperience, learning from experience (ε), and stochastic occurrences with a finite minimum rate: λ = 5×10⁻⁵ + ((1/ε) − 5×10⁻⁵) exp(−3ε). The future failure rate is entirely determined by the experience: thus the past defines the future
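
The learning-curve equation in this abstract can be evaluated directly. A minimal sketch, reading the published form as λ(ε) = 5×10⁻⁵ + (1/ε − 5×10⁻⁵)·exp(−3ε), with ε the accumulated experience in the paper's (unstated here) units:

```python
import math

def human_error_rate(experience):
    """Learning-curve error rate from the Duffey-Saull abstract:
    lambda(eps) = lam_min + (1/eps - lam_min) * exp(-3*eps),
    with a finite minimum rate lam_min = 5e-5."""
    lam_min = 5e-5
    return lam_min + (1.0 / experience - lam_min) * math.exp(-3.0 * experience)

# The rate falls with accumulated experience toward the minimum 5e-5.
for eps in (0.1, 1.0, 5.0, 10.0):
    print(f"eps={eps:5.1f}  lambda={human_error_rate(eps):.6f}")
```

The exponential term makes early inexperience dominate at small ε, while the constant 5×10⁻⁵ floor captures the stochastic residual rate that learning cannot remove.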

  2. Collection of offshore human error probability data

    International Nuclear Information System (INIS)

    Basra, Gurpreet; Kirwan, Barry

    1998-01-01

    Accidents such as Piper Alpha have increased concern about the effects of human errors in complex systems. Such accidents can in theory be predicted and prevented by risk assessment, and in particular human reliability assessment (HRA), but HRA ideally requires qualitative and quantitative human error data. A research initiative at the University of Birmingham led to the development of CORE-DATA, a Computerised Human Error Data Base. This system currently contains a reasonably large number of human error data points, collected from a variety of mainly nuclear-power related sources. This article outlines a recent offshore data collection study, concerned with collecting lifeboat evacuation data. Data collection methods are outlined and a selection of human error probabilities generated as a result of the study are provided. These data give insights into the type of errors and human failure rates that could be utilised to support offshore risk analyses

  3. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    Directory of Open Access Journals (Sweden)

    Rabiul Islam

    2018-03-01

    Full Text Available Background: Maintenance operations on-board ships are highly demanding. They are intensive activities requiring extensive man–machine interaction under challenging and evolving conditions, such as weather, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chance of error and, consequently, the risk of injuries or fatalities to personnel. An effective human error probability model is required to better manage maintenance on-board ships; such a model would assist in developing and maintaining effective risk management protocols. Thus, the objective of this study is to develop a human error probability model considering the various internal and external factors affecting seafarers' performance. Methods: The human error probability model is developed using probability theory applied to a Bayesian network. The model is tested using data received through a questionnaire survey of >200 experienced seafarers with >5 years of experience, and is used to estimate the reliability of human performance on particular maintenance activities. Results: The developed methodology is tested on the maintenance of a marine engine's cooling water pump for the engine department and an anchor windlass for the deck department. In the considered case studies, human error probabilities are estimated in various scenarios and the results are compared between the scenarios and the different seafarer categories; the results of the case studies for both departments are also compared. Conclusion: The developed model is effective in assessing human error probabilities. These probabilities would be dynamically updated as and when new information is available on changes in either internal (i.e., training, experience, and fatigue) or external (i.e., environmental and operational) conditions

  4. BAYES-HEP: Bayesian belief networks for estimation of human error probability

    International Nuclear Information System (INIS)

    Karthick, M.; Senthil Kumar, C.; Paul, Robert T.

    2017-01-01

    Human errors contribute a significant portion of risk in safety-critical applications, and methods for the estimation of human error probability have been a topic of research for over a decade. The scarce data available on human errors and the large uncertainty involved in the prediction of human error probabilities make the task difficult. This paper presents a Bayesian belief network (BBN) model for human error probability estimation in safety-critical functions of a nuclear power plant. The developed BBN model would help to estimate HEPs with limited human intervention. A step-by-step illustration of the application of the method and its subsequent evaluation is provided with a relevant case study, and the model is expected to provide useful insights into risk assessment studies
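
The BBN idea can be illustrated with a hand-rolled two-parent network (hypothetical structure and numbers, not the BAYES-HEP model itself): two performance-shaping factors feed an error node, and the HEP is the marginal probability of error over the parents' priors.

```python
# Priors on the parent nodes (assumed values for illustration).
p_high_stress = 0.3
p_good_training = 0.8

# Conditional probability table: P(error | stress, training).
cpt = {
    (True,  True):  0.010,
    (True,  False): 0.100,
    (False, True):  0.001,
    (False, False): 0.020,
}

def marginal_hep():
    """Marginalize the error node over both parent states."""
    hep = 0.0
    for stress in (True, False):
        for trained in (True, False):
            p_parents = (p_high_stress if stress else 1 - p_high_stress) * \
                        (p_good_training if trained else 1 - p_good_training)
            hep += p_parents * cpt[(stress, trained)]
    return hep

print(f"Marginal HEP = {marginal_hep():.5f}")
```

The same structure updates naturally when evidence arrives: fixing a parent to an observed state reduces the sum to the matching CPT rows, which is what makes BBNs attractive when data are scarce.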

  5. Quantitative estimation of the human error probability during soft control operations

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jaewhan; Jung, Wondea

    2013-01-01

    Highlights: ► An HRA method to evaluate execution HEPs for soft control operations was proposed. ► Soft control tasks were analyzed and design-related influencing factors were identified. ► An application to evaluate the effects of soft controls was performed. - Abstract: In this work, a method was proposed for quantifying human errors that can occur during operation executions using soft controls. Soft controls in advanced main control rooms have totally different features from conventional controls, and thus they may have different human error modes and occurrence probabilities. It is important to identify the human error modes and quantify the error probability in order to evaluate the reliability of the system and prevent errors. This work suggests an evaluation framework for quantifying the execution error probability when using soft controls. In the application, it was observed that the human error probabilities of soft controls were either higher or lower than those of the conventional controls, depending on the design quality of the advanced main control rooms

  6. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1982-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEP) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes collected, using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determines HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date
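
The counting step behind such simulator-derived HEPs is simple: errors observed over demands, with an uncertainty bound. A minimal sketch with invented counts (not NUREG/CR-1278 or EPRI data), using a crude normal-approximation interval:

```python
import math

def hep_point_estimate(errors, opportunities):
    """Raw HEP for a task element: observed errors over demands."""
    return errors / opportunities

def hep_interval(errors, opportunities, z=1.96):
    """Approximate 95% normal interval on the estimate, clipped to [0, 1].
    For very rare errors a Bayesian or exact binomial interval is better."""
    p = errors / opportunities
    half = z * math.sqrt(p * (1 - p) / opportunities)
    return max(0.0, p - half), min(1.0, p + half)

# Illustrative counts only: 3 omission errors in 120 simulated demands.
p = hep_point_estimate(3, 120)
lo, hi = hep_interval(3, 120)
print(f"HEP = {p:.4f}  (95% CI {lo:.4f}-{hi:.4f})")
```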

  7. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1981-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEP) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes collected, using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determined HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date

  8. Human error recovery failure probability when using soft controls in computerized control rooms

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Wondea

    2014-01-01

    Much of the literature categorizes the recovery process into three phases: detection of the problem situation, explanation of problem causes or countermeasures, and end of recovery. Although the focus of recovery research has been on categorizing recovery phases and modeling the recovery process, research on human recovery failure probabilities has not been performed actively, and only a few studies have addressed recovery failure probabilities empirically. In summary, the research performed so far has several limitations for use in human reliability analysis (HRA). By adopting new human-system interfaces based on computer-based technologies, the operation environment of MCRs in NPPs has changed from conventional MCRs to advanced MCRs. Because of the different interfaces between conventional and advanced MCRs, different recovery failure probabilities should be considered in the HRA for advanced MCRs. Therefore, this study carries out an empirical analysis of human error recovery probabilities using an advanced MCR mockup called the compact nuclear simulator (CNS). The aim of this work is not only to compile a recovery failure probability database using the simulator for advanced MCRs but also to collect recovery failure probabilities according to defined human error modes, in order to compare which human error mode has the highest recovery failure probability. The results show that the recovery failure probability for wrong screen selection was the lowest among the human error modes, which means that most human errors related to wrong screen selection can be recovered. On the other hand, the recovery failure probabilities of operation selection omission and delayed operation were 1.0. These results imply that once subjects omitted a task in the procedure, they had difficulty finding and recovering their errors without a supervisor's assistance. Also, wrong screen selection had an effect on delayed operation.

  9. Selection of anchor values for human error probability estimation

    International Nuclear Information System (INIS)

    Buffardi, L.C.; Fleishman, E.A.; Allen, J.A.

    1989-01-01

    There is a need for more dependable information to assist in the prediction of human errors in nuclear power environments. The major objective of the current project is to establish guidelines for using error probabilities from other task settings to estimate errors in the nuclear environment. This involves: (1) identifying critical nuclear tasks, (2) discovering similar tasks in non-nuclear environments, (3) finding error data for non-nuclear tasks, and (4) establishing error-rate values for the nuclear tasks based on the non-nuclear data. A key feature is the application of a classification system to nuclear and non-nuclear tasks to evaluate their similarities and differences in order to provide a basis for generalizing human error estimates across tasks. During the first eight months of the project, several classification systems have been applied to a sample of nuclear tasks. They are discussed in terms of their potential for establishing task equivalence and transferability of human error rates across situations

  10. Demonstration Integrated Knowledge-Based System for Estimating Human Error Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, Jack L.

    1999-04-21

    Human Reliability Analysis (HRA) is currently comprised of at least 40 different methods that are used to analyze, predict, and evaluate human performance in probabilistic terms. Systematic HRAs allow analysts to examine human-machine relationships, identify error-likely situations, and provide estimates of relative frequencies for human errors on critical tasks, highlighting the most beneficial areas for system improvements. Unfortunately, each HRA method has a different philosophical approach, thereby producing estimates of human error probabilities (HEPs) that are a better or worse match to the error-likely situation of interest. Poor selection of methodology, or improper application of techniques, can produce invalid HEP estimates, and such erroneous estimation of potential human failure could have severe consequences in terms of the estimated occurrence of injury, death, and/or property damage.

  11. Calculating method on human error probabilities considering influence of management and organization

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui; Shen Zupei

    1996-01-01

    This paper is concerned with how management and organizational influences can be factored into the quantification of human error probabilities in risk assessments, using a three-level Influence Diagram (ID), originally only a tool for constructing and representing models of decision-making trees or event trees. An analytical model of human error causation has been set up with three influence levels, introducing a method for quantitative assessment of the ID that can be applied to quantifying the probabilities of human errors in risk assessments, especially in the quantification of complex event trees (systems) for engineering decision-making analysis. A numerical case study is provided to illustrate the approach

  12. Human error probability estimation using licensee event reports

    International Nuclear Information System (INIS)

    Voska, K.J.; O'Brien, J.N.

    1984-07-01

    The objective of this report is to present a method for using field data from nuclear power plants to estimate human error probabilities (HEPs), which are then used in probabilistic risk activities. This method of estimating HEPs is one of four being pursued in NRC-sponsored research; the other three are structured expert judgment, analysis of training simulator data, and performance modeling. The type of field data analyzed in this report is from Licensee Event Reports (LERs), which are analyzed using a method specifically developed for that purpose. However, any type of field data on human errors could be analyzed using this method with minor adjustments. This report assesses the practicality, acceptability, and usefulness of estimating HEPs from LERs and comprehensively presents the method for use

  13. The Human Bathtub: Safety and Risk Predictions Including the Dynamic Probability of Operator Errors

    International Nuclear Information System (INIS)

    Duffey, Romney B.; Saull, John W.

    2006-01-01

    Reactor safety and risk are dominated by the potential for, and major contribution of, human error in the design, operation, control, management, regulation and maintenance of the plant, and hence in all accidents. Given the possibility of accidents and errors, we now need to determine the outcome (error) probability, or the chance of failure. Conventionally, reliability engineering is associated with the failure rate of components, systems, or mechanisms, not of human beings in and interacting with a technological system. The probability of failure requires prior knowledge of the total number of outcomes, which for any predictive purpose we do not know or have. Analysis of failure rates due to human error and the rate of learning allows a new determination of the dynamic human error rate in technological systems, consistent with and derived from the available world data. The basis for the analysis is the 'learning hypothesis' that humans learn from experience, and consequently the accumulated experience defines the failure rate. A new 'best' equation has been derived for the human error, outcome or failure rate, which allows for calculation and prediction of the probability of human error. We also provide comparisons to the empirical Weibull parameter fitting used in and by conventional reliability engineering and probabilistic safety analysis methods. These new analyses show that arbitrary Weibull fitting parameters and typical empirical hazard function techniques cannot be used to predict the dynamics of human errors and outcomes in the presence of learning. Comparisons of these new insights show agreement with human error data from the world's commercial airlines, the two shuttle failures, and from nuclear plant operator actions and transient control behavior observed in both plants and simulators. The results demonstrate that the human error probability (HEP) is dynamic, and that it may be predicted using the learning hypothesis and the minimum

  14. Human error evaluation for muster in emergency situations applying the human error probability index (HEPI) in the oil company warehouse in Hamadan City

    Directory of Open Access Journals (Sweden)

    2012-12-01

    Full Text Available Introduction: An emergency situation is one of the factors influencing human error. The aim of this research was to evaluate human error in an emergency situation of fire and explosion at the oil company warehouse in Hamadan city, applying the human error probability index (HEPI). Material and Method: First, the scenario of an emergency fire and explosion situation at the oil company warehouse was designed and a muster maneuver was then performed. The scaled muster questionnaire for the maneuver was completed in the next stage. Collected data were analyzed to calculate the probability of success for the 18 actions required in an emergency situation, from the starting point of the muster to the final action of reaching the temporary safe shelter. Result: The results showed that the highest probability of error occurrence was related to making the workplace safe (evaluation phase), with 32.4%, and the lowest probability of error occurrence was in detecting the alarm (awareness phase), with 1.8%. The highest severity of error was in the evaluation phase and the lowest severity was in the awareness and recovery phases. The maximum risk level was related to evaluating exit routes, selecting one route, and choosing an alternative exit route, and the minimum risk level was related to the four evaluation phases. Conclusion: To reduce the risk of error in the exit phases of an emergency situation, the following actions are recommended based on the findings of this study: periodic evaluation of the exit phases and modifying them if necessary, and conducting more maneuvers and analyzing their results along with sufficient feedback to the employees.

  15. An empirical study on the human error recovery failure probability when using soft controls in NPP advanced MCRs

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Jung, Wondea; Seong, Poong Hyun

    2014-01-01

    Highlights: • Many researchers have tried to understand the human recovery process or its steps. • Modeling the human recovery process is not sufficient for application to HRA. • The operation environment of MCRs in NPPs has changed by adopting new HSIs. • Recovery failure probability in a soft control operation environment is investigated. • Recovery failure probability here would be important evidence for expert judgment. - Abstract: It is well known that probabilistic safety assessments (PSAs) today consider not just hardware failures and environmental events that can impact upon risk, but also human error contributions. Consequently, the focus of reliability and performance management has been on the prevention of human errors and failures rather than on the recovery of human errors. However, the recovery of human errors is as important as their prevention for the safe operation of nuclear power plants (NPPs). For this reason, many researchers have tried to characterize the human recovery process or its steps. However, modeling the human recovery process is not sufficient for application to human reliability analysis (HRA), which requires human error and recovery probabilities. In this study, therefore, human error recovery failure probabilities based on predefined human error modes were investigated by conducting experiments in an operation mockup of advanced/digital main control rooms (MCRs) in NPPs. To this end, 48 subjects majoring in nuclear engineering participated in the experiments. In the experiments, using a developed accident scenario based on tasks from the standard post trip action (SPTA), the steam generator tube rupture (SGTR), and predominant soft control tasks derived from the loss of coolant accident (LOCA) and the excess steam demand event (ESDE), all error detection and recovery data based on human error modes were checked with the performance sheet and the statistical analysis of error recovery/detection was then

  16. Basic human error probabilities in advanced MCRs when using soft control

    International Nuclear Information System (INIS)

    Jang, In Seok; Seong, Poong Hyun; Kang, Hyun Gook; Lee, Seung Jun

    2012-01-01

    In a report on one of the renowned HRA methods, the Technique for Human Error Rate Prediction (THERP), it is pointed out that 'the paucity of actual data on human performance continues to be a major problem for estimating HEPs and performance times in nuclear power plant (NPP) tasks'. However, another critical difficulty is that most current HRA databases deal with operation in conventional types of MCRs. With the adoption of new human-system interfaces based on computer-based technologies, the operation environment of MCRs in NPPs has changed. MCRs incorporating these digital and computer technologies, such as large display panels, computerized procedures, soft controls, and so on, are called advanced MCRs. Because of the different interfaces, different Basic Human Error Probabilities (BHEPs) should be considered in human reliability analyses (HRAs) for advanced MCRs. This study carries out an empirical analysis of human error considering soft controls. The aim of this work is not only to compile a database using the simulator for advanced MCRs but also to compare BHEPs with those of a conventional MCR database

  17. A human error probability estimate methodology based on fuzzy inference and expert judgment on nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Mesquita, R.N. de

    2009-01-01

    Recent studies point to human error as an important factor in many industrial and nuclear accidents: Three Mile Island (1979), Bhopal (1984), Chernobyl and Challenger (1986) are classic examples. Human contribution to these accidents may be better understood and analyzed by using Human Reliability Analysis (HRA), which has been taken as an essential part of the Probabilistic Safety Analysis (PSA) of nuclear plants. Both HRA and PSA depend on Human Error Probabilities (HEPs) for quantitative analysis. These probabilities are strongly affected by Performance Shaping Factors (PSFs), which have a direct effect on human behavior and thus shape HEPs according to the specific environmental conditions and individual characteristics behind these actions. This PSF dependence raises a serious data-availability problem, as it renders the scarce existing databases either too generic or too specific. Besides this, most nuclear plants do not keep historical records of human error occurrences. Therefore, in order to overcome this data shortage, a methodology based on fuzzy inference and expert judgment was employed in this paper to determine human error occurrence probabilities and to evaluate PSFs for actions performed by operators in a nuclear power plant (the IEA-R1 nuclear reactor). The obtained HEP values were compared with reference data tabled in the current literature in order to show the coherence and validity of the approach. This comparison leads to the conclusion that the results of this work can be employed in both HRA and PSA, enabling efficient prospection of potential improvements in plant safety conditions, operational procedures and local working conditions (author)
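
The fuzzy-inference step can be sketched as follows (illustrative membership functions, rules, and nominal HEPs; none are taken from the IEA-R1 study). A single PSF rating in [0, 10] is fuzzified, two rules fire, and the HEP is the membership-weighted average of each rule's nominal HEP, a simplified Sugeno-style defuzzification:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_hep(psf_rating):
    """Blend two rule consequents by their firing strengths."""
    mu_favorable = tri(psf_rating, -1.0, 0.0, 6.0)   # good working conditions
    mu_adverse   = tri(psf_rating, 4.0, 10.0, 11.0)  # poor working conditions
    # Rule consequents: assumed nominal HEPs for each condition.
    weights = [(mu_favorable, 1e-3), (mu_adverse, 1e-1)]
    total = sum(mu for mu, _ in weights)
    return sum(mu * hep for mu, hep in weights) / total

for rating in (0.0, 5.0, 10.0):
    print(f"PSF rating {rating:4.1f} -> HEP {estimate_hep(rating):.4f}")
```

Expert judgment enters through the membership shapes and the consequent HEPs; the interpolation between anchors is what lets sparse expert data cover a continuum of PSF states.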

  18. Expert estimation of human error probabilities in nuclear power plant operations: a review of probability assessment and scaling

    International Nuclear Information System (INIS)

    Stillwell, W.G.; Seaver, D.A.; Schwartz, J.P.

    1982-05-01

    This report reviews probability assessment and psychological scaling techniques that could be used to estimate human error probabilities (HEPs) in nuclear power plant operations. The techniques rely on expert opinion and can be used to estimate HEPs where data do not exist or are inadequate. These techniques have been used in various other contexts and have been shown to produce reasonably accurate probabilities. Some problems do exist, and limitations are discussed. Additional topics covered include methods for combining estimates from multiple experts, the effects of training on probability estimates, and some ideas on structuring the relationship between performance shaping factors and HEPs. Preliminary recommendations are provided along with cautions regarding the costs of implementing the recommendations. Additional research is required before definitive recommendations can be made

  19. A framework to assess diagnosis error probabilities in the advanced MCR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Kim, Jong Hyun [Chosun University, Gwangju (Korea, Republic of); Jang, Inseok; Park, Jinkyun [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    The Institute of Nuclear Power Operations (INPO)'s operating experience database revealed that about 48% of the total events in the world's NPPs over two years (2010-2011) happened due to human errors. The purpose of human reliability analysis (HRA) methods is to evaluate the potential for, and mechanisms of, human errors that may affect plant safety. Accordingly, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), standardized plant analysis risk-human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM), and so on. Many researchers have asserted that procedures, alarms, and displays are critical factors affecting operators' generic activities, especially diagnosis activities. None of the various HRA methods was explicitly designed to deal with digital systems, and SCHEME (Soft Control Human error Evaluation MEthod) considers only the probability of soft control execution errors in the advanced MCR. The necessity of developing HRA methods for various conditions of NPPs has therefore been raised. In this research, a framework to estimate diagnosis error probabilities in the advanced MCR is suggested, in three steps. The first step is to investigate diagnosis errors and calculate their probabilities. The second step is to quantitatively estimate PSFs' weightings in the advanced MCR. The third step is to suggest an updated TRC model to assess the nominal diagnosis error probabilities. Additionally, the proposed framework was applied using full-scope simulation: experiments conducted in a domestic full-scope simulator and in HAMMLAB were used as the data source. In total, eighteen tasks were analyzed and twenty-three crews participated.

  20. Estimation of the human error probabilities in the human reliability analysis

    International Nuclear Information System (INIS)

    Liu Haibin; He Xuhong; Tong Jiejuan; Shen Shifei

    2006-01-01

    Human error data is an important issue in human reliability analysis (HRA). Bayesian parameter estimation, which can combine multiple sources of information, such as historical NPP data and expert judgment, to modify the human error data, yields human error data that more truly reflect the real situation of the NPP. This paper, using a numerical computation program developed by the authors, presents some typical examples to illustrate the process of Bayesian parameter estimation in HRA and discusses the effect of different modification data on the Bayesian parameter estimation. (authors)
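
The modification step described here is commonly done with a conjugate Beta-binomial update: a generic prior on the HEP is combined with plant-specific error counts. A minimal sketch with illustrative prior parameters and counts (not the paper's data):

```python
def update_hep(prior_alpha, prior_beta, errors, demands):
    """Beta(prior_alpha, prior_beta) prior on the HEP, binomial evidence of
    `errors` failures in `demands` trials; returns posterior parameters and
    the posterior-mean HEP."""
    a = prior_alpha + errors
    b = prior_beta + (demands - errors)
    return a, b, a / (a + b)

# Generic prior with mean HEP 0.01 (Beta(1, 99)); plant evidence: 2 errors in 500 demands.
a, b, hep = update_hep(1.0, 99.0, 2, 500)
print(f"Posterior Beta({a:.0f}, {b:.0f}), mean HEP = {hep:.4f}")
```

The posterior mean pulls the generic value toward the plant's observed rate, which is exactly the "data reflecting the real situation of the NPP" effect the abstract describes; expert judgment can enter through the choice of prior parameters.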

  1. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    The human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of Probabilistic Safety Assessment (PSA). As methods for analyzing human error, techniques such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) are used, and new methods for human reliability analysis (HRA) are under development at this time. This paper presents a dynamic HRA method for assessing human failure events, and an estimation of the human error probability for the filtered containment venting system (FCVS) is performed. The action associated with implementation of containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze the FCVS-related operator action. The distributions of the required time and the available time were developed by the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for human error estimation and it can be applied to any kind of operator action, including the severe accident management strategy.
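
The core of such a time-based HEP is P(required time > available time). The paper derives the two distributions from MAAP runs and Latin hypercube sampling; the sketch below simply assumes both are lognormal with invented parameters and uses plain Monte Carlo sampling:

```python
import math
import random

def hep_time_based(n=100_000, seed=1):
    """Monte Carlo estimate of P(T_required > T_available) under assumed
    lognormal time distributions (parameters are illustrative only)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        required = rng.lognormvariate(math.log(20.0), 0.4)   # minutes to act
        available = rng.lognormvariate(math.log(45.0), 0.3)  # minutes before damage
        if required > available:
            failures += 1
    return failures / n

print(f"Estimated HEP = {hep_time_based():.4f}")
```

Because both times are lognormal here, the overlap probability also has a closed form via the normal CDF of the log-ratio; the sampling version generalizes to distributions produced numerically by a severe-accident code.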

  2. Human error probability quantification using fuzzy methodology in nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, Claudio Souza do

    2010-01-01

    This work obtains Human Error Probability (HEP) estimates for operators' actions in response to hypothetical emergency situations at the IEA-R1 Research Reactor at IPEN. A Performance Shaping Factors (PSF) evaluation was also carried out in order to classify the PSFs according to their level of influence on the operators' actions and to determine their actual states in the plant. Both the HEP estimation and the PSF evaluation were based on specialist evaluation using interviews and questionnaires; the specialist group was composed of selected IEA-R1 operators. The representation of the specialists' knowledge as linguistic variables and the group evaluation values were obtained through fuzzy logic and fuzzy set theory. The obtained HEP values show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative for use in Human Reliability Analysis (HRA). (author)

  3. Psychological scaling of expert estimates of human error probabilities: application to nuclear power plant operation

    International Nuclear Information System (INIS)

    Comer, K.; Gaddy, C.D.; Seaver, D.A.; Stillwell, W.G.

    1985-01-01

    The US Nuclear Regulatory Commission and Sandia National Laboratories sponsored a project to evaluate psychological scaling techniques for use in generating estimates of human error probabilities. The project evaluated two techniques: direct numerical estimation and paired comparisons. Expert estimates were found to be consistent across and within judges. Convergent validity was good, in comparison to estimates in a handbook of human reliability. Predictive validity could not be established because of the lack of actual relative frequencies of error (which will be a difficulty inherent in validation of any procedure used to estimate HEPs). Application of expert estimates in probabilistic risk assessment and in human factors is discussed
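For the paired-comparison technique evaluated here, a standard way to turn judges' pairwise proportions into an interval scale is Thurstone's Case V model; the sketch below uses hypothetical data, and the study's actual scaling procedure may have differed in detail.

```python
from statistics import NormalDist

def thurstone_scale(p):
    """Thurstone Case V scaling of paired-comparison judgments.

    p[i][j] is the proportion of judges who rated task i as MORE
    error-prone than task j. Returns one scale value per task; larger
    means judged more error-prone. Proportions are clamped away from
    0/1 before the probit transform.
    """
    nd = NormalDist()
    n = len(p)
    z = [[nd.inv_cdf(min(max(p[i][j], 0.01), 0.99)) for j in range(n)]
         for i in range(n)]
    return [sum(row) / n for row in z]

# Hypothetical pooled judgments for three tasks (diagonal is 0.5 by convention)
props = [
    [0.50, 0.80, 0.90],
    [0.20, 0.50, 0.70],
    [0.10, 0.30, 0.50],
]
scale = thurstone_scale(props)
```

The resulting values are on an interval scale only; converting them to HEPs requires anchoring against tasks whose probabilities are known, which is where handbook comparisons of the kind made in this study come in.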

  4. A Quantum Theoretical Explanation for Probability Judgment Errors

    Science.gov (United States)

    Busemeyer, Jerome R.; Pothos, Emmanuel M.; Franco, Riccardo; Trueblood, Jennifer S.

    2011-01-01

    A quantum probability model is introduced and used to explain human probability judgment errors including the conjunction and disjunction fallacies, averaging effects, unpacking effects, and order effects on inference. On the one hand, quantum theory is similar to other categorization and memory models of cognition in that it relies on vector…

  5. Basic considerations in predicting error probabilities in human task performance

    International Nuclear Information System (INIS)

    Fleishman, E.A.; Buffardi, L.C.; Allen, J.A.; Gaskins, R.C. III

    1990-04-01

    It is well established that human error plays a major role in the malfunctioning of complex systems. This report takes a broad look at the study of human error and addresses the conceptual, methodological, and measurement issues involved in defining and describing errors in complex systems. In addition, a review of existing sources of human reliability data and approaches to human performance data base development is presented. Alternative task taxonomies, which are promising for establishing the comparability of nuclear and non-nuclear tasks, are also identified. Based on such taxonomic schemes, various data base prototypes for generalizing human error rates across settings are proposed. 60 refs., 3 figs., 7 tabs

  6. Human Error Analysis by Fuzzy-Set

    International Nuclear Information System (INIS)

    Situmorang, Johnny

    1996-01-01

    In conventional HRA the probability of error is treated as a single, exact value obtained by constructing an event tree; here, fuzzy set theory is used instead. Fuzzy set theory treats the probability of error as a possibility characterized by a linguistic variable. Many parameters and variables in human engineering are defined verbally (good, fairly good, worst, etc.), each describing a range of probability values. As an example, the analysis quantifies the human error in a calibration task, for which the probability of miscalibration is found to be very low
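The linguistic-variable idea can be sketched numerically: each verbal term is given a membership function over the (log) probability range, expert votes are aggregated, and a point HEP is recovered by centroid defuzzification. The term names, triangular shapes, and ranges below are all assumptions for illustration, not those of this analysis.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms mapped onto log10(HEP)
TERMS = {
    "very low": (-6.0, -5.0, -4.0),
    "low":      (-5.0, -4.0, -3.0),
    "medium":   (-4.0, -3.0, -2.0),
    "high":     (-3.0, -2.0, -1.0),
}

def fuzzy_hep(votes, grid=500):
    """Aggregate expert votes (term -> count) as a weighted union of the
    term memberships, then defuzzify by centroid to a point HEP."""
    total = sum(votes.values())
    lo, hi = -6.0, -1.0
    num = den = 0.0
    for k in range(grid + 1):
        x = lo + (hi - lo) * k / grid
        mu = max(tri(x, *TERMS[t]) * n / total for t, n in votes.items())
        num += x * mu
        den += mu
    return 10 ** (num / den)

hep = fuzzy_hep({"very low": 3, "low": 2})  # five hypothetical experts
```

The defuzzified value lands between the "very low" and "low" peaks, weighted toward the majority opinion, which is the qualitative behavior a fuzzy aggregation is meant to capture.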

  7. Quantification of the effects of dependence on human error probabilities

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1980-01-01

    In estimating the probabilities of human error in the performance of a series of tasks in a nuclear power plant, the situation-specific characteristics of the series must be considered. A critical factor not to be overlooked in this estimation is the dependence or independence that pertains to any of the several pairs of task performances. In discussing the quantification of the effects of dependence, the event tree symbology described will be used. In any series of tasks, the only dependence considered for quantification in this document will be that existing between the task of interest and the immediately preceding task. Tasks performed earlier in the series may have some effect on the end task, but this effect is considered negligible
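The standard quantification of this pairwise dependence is the set of equations from the Handbook (NUREG/CR-1278), which map the basic HEP N onto a conditional HEP at five discrete dependence levels. A direct transcription:

```python
def conditional_hep(basic_hep, level):
    """THERP conditional HEP for failure on a task, given failure on the
    immediately preceding task (NUREG/CR-1278 dependence equations)."""
    n = basic_hep
    return {
        "ZD": n,                  # zero dependence
        "LD": (1 + 19 * n) / 20,  # low dependence
        "MD": (1 + 6 * n) / 7,    # moderate dependence
        "HD": (1 + n) / 2,        # high dependence
        "CD": 1.0,                # complete dependence
    }[level]
```

For a small basic HEP, even low dependence pulls the conditional probability up to about 0.05, which is why the pairing with the immediately preceding task dominates the quantification.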

  8. Some aspects of statistical modeling of human-error probability

    International Nuclear Information System (INIS)

    Prairie, R.R.

    1982-01-01

    Human reliability analyses (HRA) are often performed as part of risk assessment and reliability projects. Recent events in nuclear power have shown the potential importance of the human element. There are several on-going efforts in the US and elsewhere with the purpose of modeling human error such that the human contribution can be incorporated into an overall risk assessment associated with one or more aspects of nuclear power. The effort described here uses the HRA event tree to quantify and model the human contribution to risk. As an example, risk analyses are being prepared on several nuclear power plants as part of the Interim Reliability Assessment Program (IREP). In this process the risk analyst selects the elements of his fault tree to which human error could contribute. He then solicits the HF analyst to perform an HRA on each such element

  9. An empirical study on the basic human error probabilities for NPP advanced main control room operation using soft control

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Harbi, Mohamed Ali Salem Al; Lee, Seung Jun; Kang, Hyun Gook; Seong, Poong Hyun

    2013-01-01

    Highlights: ► The operation environment of MCRs in NPPs has changed by adopting new HSIs. ► Operating actions in NPP Advanced MCRs are performed by soft controls. ► Different basic human error probabilities (BHEPs) should be considered. ► BHEPs in a soft control operation environment are investigated empirically. ► This work will be helpful to verify whether soft control has positive or negative effects. -- Abstract: By adopting new human–system interfaces that are based on computer-based technologies, the operation environment of main control rooms (MCRs) in nuclear power plants (NPPs) has changed. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, soft controls, and so on, are called Advanced MCRs. Among the many features in Advanced MCRs, soft controls are an important feature because operating actions in NPP Advanced MCRs are performed through them. Using soft controls such as mouse control, touch screens, and so on, operators can select a specific screen, then choose the controller, and finally manipulate the devices. However, because of the different interfaces between soft control and hardwired conventional type control, different basic human error probabilities (BHEPs) should be considered in the Human Reliability Analysis (HRA) for advanced MCRs. Although there are many HRA methods to assess human reliabilities, such as the Technique for Human Error Rate Prediction (THERP), Accident Sequence Evaluation Program (ASEP), Human Error Assessment and Reduction Technique (HEART), Human Event Repository and Analysis (HERA), Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR), Cognitive Reliability and Error Analysis Method (CREAM), and so on, these methods have been applied to conventional MCRs, and they do not consider the new features of advanced MCRs such as soft controls. As a result, there is an insufficient database for assessing human reliabilities in advanced

  10. A framework to estimate probability of diagnosis error in NPP advanced MCR

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Kim, Jong Hyun; Jang, Inseok; Seong, Poong Hyun

    2018-01-01

    Highlights: •As a new type of MCR has been installed in NPPs, the work environment has changed considerably. •A new framework to estimate operators’ diagnosis error probabilities should be proposed. •Diagnosis error data were extracted from the full-scope simulator of the advanced MCR. •Using Bayesian inference, a TRC model was updated for use in the advanced MCR. -- Abstract: Recently, a new type of main control room (MCR) has been adopted in nuclear power plants (NPPs). The new MCR, known as the advanced MCR, consists of digitalized human-system interfaces (HSIs), computer-based procedures (CPs), and soft controls, while the conventional MCR includes many alarm tiles, analog indicators, hard-wired control devices, and paper-based procedures. These changes significantly affect the generic activities of the MCR operators, in relation to diagnostic activities. The aim of this paper is to suggest a framework to estimate the probabilities of diagnosis errors in the advanced MCR by updating a time reliability correlation (TRC) model. Using Bayesian inference, the TRC model was updated with the probabilities of diagnosis errors. Here, the diagnosis error data were collected from a full-scope simulator of the advanced MCR. To do this, diagnosis errors were determined based on an information processing model and their probabilities were calculated. However, these calculated probabilities of diagnosis errors were largely affected by context factors such as procedures, HSI, training, and others, known as PSFs (Performance Shaping Factors). In order to obtain the nominal diagnosis error probabilities, the weightings of PSFs were also evaluated. Then, with the nominal diagnosis error probabilities, the TRC model was updated. This led to the proposal of a framework to estimate the nominal probabilities of diagnosis errors in the advanced MCR.
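A heavily simplified sketch of the Bayesian updating step: treat the diagnosis error probability at one fixed available time as a Bernoulli parameter with a Beta prior centered on the nominal value, and update it with simulator counts. The beta-binomial form, the prior weight, and all numbers are assumptions; the paper updates a full TRC curve, not a single point.

```python
def update_diagnosis_hep(prior_hep, prior_weight, errors, trials):
    """Posterior mean diagnosis error probability from a beta-binomial update.

    The prior HEP (e.g. a nominal TRC value at the available diagnosis
    time) is encoded as Beta(prior_hep * w, (1 - prior_hep) * w), where
    w = prior_weight acts as a pseudo-observation count; simulator data
    (errors out of trials) then update it.
    """
    alpha = prior_hep * prior_weight + errors
    beta = (1.0 - prior_hep) * prior_weight + (trials - errors)
    return alpha / (alpha + beta)

# Hypothetical numbers: nominal value 0.1; 2 diagnosis errors in 40 trials
posterior = update_diagnosis_hep(0.1, prior_weight=20.0, errors=2, trials=40)
```

The posterior mean sits between the nominal value (0.1) and the observed rate (0.05), weighted by how much pseudo-data the prior is given.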

  11. Generalizing human error rates: A taxonomic approach

    International Nuclear Information System (INIS)

    Buffardi, L.; Fleishman, E.; Allen, J.

    1989-01-01

    It is well established that human error plays a major role in malfunctioning of complex, technological systems and in accidents associated with their operation. Estimates of the rate of human error in the nuclear industry range from 20-65% of all system failures. In response to this, the Nuclear Regulatory Commission has developed a variety of techniques for estimating human error probabilities for nuclear power plant personnel. Most of these techniques require the specification of the range of human error probabilities for various tasks. Unfortunately, very little objective performance data on error probabilities exist for nuclear environments. Thus, when human reliability estimates are required, for example in computer simulation modeling of system reliability, only subjective estimates (usually based on experts' best guesses) can be provided. The objective of the current research is to provide guidelines for the selection of human error probabilities based on actual performance data taken in other complex environments and applying them to nuclear settings. A key feature of this research is the application of a comprehensive taxonomic approach to nuclear and non-nuclear tasks to evaluate their similarities and differences, thus providing a basis for generalizing human error estimates across tasks. In recent years significant developments have occurred in classifying and describing tasks. Initial goals of the current research are to: (1) identify alternative taxonomic schemes that can be applied to tasks, and (2) describe nuclear tasks in terms of these schemes. Three standardized taxonomic schemes (Ability Requirements Approach, Generalized Information-Processing Approach, Task Characteristics Approach) are identified, modified, and evaluated for their suitability in comparing nuclear and non-nuclear power plant tasks. An agenda for future research and its relevance to nuclear power plant safety is also discussed

  12. Sensitivity of risk parameters to human errors for a PWR

    International Nuclear Information System (INIS)

    Samanta, P.; Hall, R.E.; Kerr, W.

    1980-01-01

    Sensitivities of the risk parameters, emergency safety system unavailabilities, accident sequence probabilities, release category probabilities and core melt probability were investigated for changes in the human error rates within the general methodological framework of the Reactor Safety Study for a Pressurized Water Reactor (PWR). The impact of individual human errors was assessed both in terms of their structural importance to core melt and their reliability importance on core melt probability. The Human Error Sensitivity Assessment of a PWR (HESAP) computer code was written for the purpose of this study

  13. Development of an integrated system for estimating human error probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, J.L.; Hahn, H.A.; Morzinski, J.A.

    1998-12-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project had as its main objective the development of a Human Reliability Analysis (HRA), knowledge-based expert system that would provide probabilistic estimates for potential human errors within various risk assessments, safety analysis reports, and hazard assessments. HRA identifies where human errors are most likely, estimates the error rate for individual tasks, and highlights the most beneficial areas for system improvements. This project accomplished three major tasks. First, several prominent HRA techniques and associated databases were collected and translated into an electronic format. Next, the project started a knowledge engineering phase where the expertise, i.e., the procedural rules and data, were extracted from those techniques and compiled into various modules. Finally, these modules, rules, and data were combined into a nearly complete HRA expert system.

  14. Human error probability evaluation as part of reliability analysis of digital protection system of advanced pressurized water reactor - APR 1400

    International Nuclear Information System (INIS)

    Varde, P. V.; Lee, D. Y.; Han, J. B.

    2003-03-01

    A case study on human reliability analysis has been performed as part of the reliability analysis of the digital protection system of the APR 1400. The protection system automatically actuates the shutdown system of the reactor when demanded. However, the safety analysis takes credit for operator action as a diverse means of tripping the reactor in the (albeit low-probability) ATWS scenario. Based on the available information, two cases, viz., human error in tripping the reactor and calibration error for instrumentation in the protection system, have been analyzed. Wherever applicable, a parametric study has also been performed

  15. Savannah River Site human error data base development for nonreactor nuclear facilities

    International Nuclear Information System (INIS)

    Benhardt, H.C.; Held, J.E.; Olsen, L.M.; Vail, R.E.; Eide, S.A.

    1994-01-01

    As part of an overall effort to upgrade and streamline methodologies for safety analyses of nonreactor nuclear facilities at the Savannah River Site (SRS), a human error data base has been developed and is presented in this report. The data base fulfills several needs of risk analysts supporting safety analysis report (SAR) development. First, it provides a single source for probabilities or rates for a wide variety of human errors associated with the SRS nonreactor nuclear facilities. Second, it provides a documented basis for human error probabilities or rates. And finally, it provides actual SRS-specific human error data to support many of the error probabilities or rates. Use of a single, documented reference source for human errors, supported by SRS-specific human error data, will improve the consistency and accuracy of human error modeling by SRS risk analysts. It is envisioned that SRS risk analysts will use this report both as a guide to identifying the types of human errors that may need to be included in risk models such as fault and event trees, and as a source for human error probabilities or rates. For each human error in this report, several different mean probabilities or rates are presented to cover a wide range of conditions and influencing factors. The risk analysts must decide which mean value is most appropriate for each particular application. If other types of human errors are needed for the risk models, the analyst must use other sources. Finally, if human errors are dominant in the quantified risk models (based on the values obtained from this report), then it may be appropriate to perform detailed human reliability analyses (HRAs) for the dominant events. This document does not provide guidance for such refined HRAs; in such cases experienced human reliability analysts should be involved

  16. Effects of human errors on the determination of surveillance test interval

    International Nuclear Information System (INIS)

    Chung, Dae Wook; Koo, Bon Hyun

    1990-01-01

    This paper incorporates the effects of human error relevant to the periodic test on the unavailability of the safety system as well as the component unavailability. Two types of possible human error during the test are considered. One is the possibility that a good safety system is inadvertently left in a bad state after the test (Type A human error) and the other is the possibility that a bad safety system is undetected upon the test (Type B human error). An event tree model is developed for the steady-state unavailability of the safety system to determine the effects of human errors on the component unavailability and the test interval. We perform the reliability analysis of the safety injection system (SIS) by applying the aforementioned two types of human error to the safety injection pumps. Results of various sensitivity analyses show that: 1) the appropriate test interval decreases and steady-state unavailability increases as the probabilities of both types of human errors increase, and they are far more sensitive to Type A human error than Type B; and 2) the SIS unavailability increases slightly as the probability of Type B human error increases, and significantly as the probability of Type A human error increases. Therefore, to avoid underestimation, the effects of human error should be incorporated in the system reliability analysis which aims at the relaxation of surveillance test intervals, and Type A human error has the more important effect on the unavailability and surveillance test interval
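The qualitative findings can be reproduced with a first-order unavailability model (an assumed textbook-style approximation, not the paper's event-tree model): a Type A error adds directly to unavailability, while a Type B error only stretches the detection of random failures.

```python
def system_unavailability(lam, T, p_a, p_b):
    """First-order mean unavailability of a periodically tested standby
    component (illustrative approximation; parameters are hypothetical).

    lam : random failure rate (per hour)
    T   : surveillance test interval (hours)
    p_a : Type A error prob. (good system left disabled after the test)
    p_b : Type B error prob. (failed system not detected by the test)

    An undetected failure survives ~1/(1 - p_b) test intervals on average,
    so Type B stretches the exposure time, while Type A adds directly.
    """
    return lam * T / 2.0 / (1.0 - p_b) + p_a

# Hypothetical safety injection pump, quarterly testing
base = system_unavailability(1e-5, 2190.0, 0.0, 0.0)
with_errors = system_unavailability(1e-5, 2190.0, 5e-3, 5e-3)
```

Even with equal probabilities for the two error types, the Type A term dominates in this sketch, consistent with the sensitivity results reported above.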

  17. Human error and the associated recovery probabilities for soft control being used in the advanced MCRs of NPPs

    International Nuclear Information System (INIS)

    Jang, Inseok; Jung, Wondea; Seong, Poong Hyun

    2016-01-01

    Highlights: • The operation environment of MCRs in NPPs has changed by adopting digital HSIs. • Most current HRA databases are not explicitly designed to deal with digital HSI. • Empirical analyses for a new HRA DB under an advanced MCR mockup are carried out. • It is expected that the results can be used for advanced MCR HRA. - Abstract: Since the Three Mile Island (TMI)-2 accident, human error has been recognized as one of the main causes of Nuclear Power Plant (NPP) accidents, and numerous studies related to Human Reliability Analysis (HRA) have been carried out. Most of these studies focused on the conventional Main Control Room (MCR) environment. However, the operating environment of MCRs in NPPs has changed with the adoption of new human-system interfaces (HSI) largely based on up-to-date digital technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called advanced MCRs. Among the many features of advanced MCRs, soft controls are particularly important because operating actions in advanced MCRs are performed through them. Due to the difference in interfaces between soft controls and hardwired conventional controls, different HEPs should be used in the HRA for advanced MCRs. Unfortunately, most current HRA databases deal with operations in conventional MCRs and are not explicitly designed to deal with a digital Human System Interface (HSI). For this reason, empirical human error and the associated error recovery probabilities were collected from the mockup of an advanced MCR equipped with soft controls. To this end, small-scale experiments were conducted in which 48 graduate students from the department of nuclear engineering at the Korea Advanced Institute of Science and Technology (KAIST) participated, and accident scenarios were designed with respect to the typical Design Basis Accidents (DBAs) in NPPs, such as Steam Generator Tube Rupture

  18. Sensitivity of risk parameters to human errors in reactor safety study for a PWR

    International Nuclear Information System (INIS)

    Samanta, P.K.; Hall, R.E.; Swoboda, A.L.

    1981-01-01

    Sensitivities of the risk parameters, emergency safety system unavailabilities, accident sequence probabilities, release category probabilities and core melt probability were investigated for changes in the human error rates within the general methodological framework of the Reactor Safety Study (RSS) for a Pressurized Water Reactor (PWR). The impact of individual human errors was assessed both in terms of their structural importance to core melt and their reliability importance on core melt probability. The Human Error Sensitivity Assessment of a PWR (HESAP) computer code was written for the purpose of this study. The code employed a point estimate approach and ignored the smoothing technique applied in the RSS. It computed the point estimates for the system unavailabilities from the median values of the component failure rates and proceeded in terms of point values to obtain the point estimates for the accident sequence probabilities, core melt probability, and release category probabilities. The sensitivity measure used was the ratio of the top event probability before and after the perturbation of the constituent events. Core melt probability per reactor year showed a significant increase with an increase in the human error rates, but did not show a similar decrease with a decrease in the human error rates, due to the dominance of the hardware failures. When the Minimum Human Error Rate (M.H.E.R.) used is increased to 10^-3, the base case starts to show sensitivity to human errors. This effort now allows the evaluation of new error rate data along with proposed changes in the man-machine interface
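The point-estimate approach and the ratio sensitivity measure can be sketched on a toy cutset model; the cutsets and probabilities below are hypothetical, not taken from the study.

```python
def top_event_prob(cutsets, probs):
    """Rare-event point estimate: sum over minimal cutsets of the product
    of their basic-event probabilities."""
    total = 0.0
    for cs in cutsets:
        term = 1.0
        for event in cs:
            term *= probs[event]
        total += term
    return total

def sensitivity_ratio(cutsets, probs, human_events, factor):
    """HESAP-style measure: top-event probability with every human error
    probability scaled by `factor`, divided by the unperturbed value."""
    perturbed = {e: min(1.0, p * factor) if e in human_events else p
                 for e, p in probs.items()}
    return top_event_prob(cutsets, perturbed) / top_event_prob(cutsets, probs)

# Hypothetical cutsets: h* are hardware failures, o* are human errors
cutsets = [("h1",), ("h2", "o1"), ("o1", "o2")]
probs = {"h1": 1e-5, "h2": 1e-3, "o1": 3e-3, "o2": 3e-3}

up = sensitivity_ratio(cutsets, probs, {"o1", "o2"}, 10.0)
down = sensitivity_ratio(cutsets, probs, {"o1", "o2"}, 0.1)
```

On this toy model the ratio rises sharply when the human error probabilities are increased but is floored by the hardware-only cutset when they are decreased, mirroring the asymmetry reported above.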

  19. The cost of human error intervention

    International Nuclear Information System (INIS)

    Bennett, C.T.; Banks, W.W.; Jones, E.D.

    1994-03-01

    DOE has directed that cost-benefit analyses be conducted as part of the review process for all new DOE orders. This new policy will have the effect of ensuring that DOE analysts can justify the implementation costs of the orders that they develop. We would like to argue that a cost-benefit analysis is merely one phase of a complete risk management program -- one that would more than likely start with a probabilistic risk assessment. The safety community defines risk as the probability of failure times the severity of consequence. An engineering definition of failure can be considered in terms of physical performance, as in mean-time-between-failure; or, it can be thought of in terms of human performance, as in probability of human error. The severity of consequence of a failure can be measured along any one of a number of dimensions -- economic, political, or social. Clearly, an analysis along one dimension cannot be directly compared to another, but a set of cost-benefit analyses, based on a series of cost dimensions, can be extremely useful to managers who must prioritize their resources. Over the last two years, DOE has been developing a series of human factors orders, directed at lowering the probability of human error -- or at least changing the distribution of those errors. The following discussion presents a series of cost-benefit analyses using historical events in the nuclear industry. However, we would first like to discuss some of the analytic cautions that must be considered when we deal with human error

  20. An Analysis and Quantification Method of Human Errors of Soft Controls in Advanced MCRs

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jae Whan; Jang, Seung Cheol

    2011-01-01

    In this work, a method was proposed for quantifying human errors that may occur during operation executions using soft control. Soft controls of advanced main control rooms (MCRs) have totally different features from conventional controls, and thus they may have different human error modes and occurrence probabilities. It is important to define the human error modes and to quantify the error probability for evaluating the reliability of the system and preventing errors. This work suggests a modified K-HRA method for quantifying error probability

  1. SHERPA: A systematic human error reduction and prediction approach

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1986-01-01

    This paper describes a Systematic Human Error Reduction and Prediction Approach (SHERPA) which is intended to provide guidelines for human error reduction and quantification in a wide range of human-machine systems. The approach utilizes as its basis current cognitive models of human performance. The first module in SHERPA performs task and human error analyses, which identify likely error modes, together with guidelines for the reduction of these errors by training, procedures and equipment redesign. The second module uses a SARAH approach to quantify the probability of occurrence of the errors identified earlier, and provides cost-benefit analyses to assist in choosing the appropriate error reduction approaches in the third module

  2. Fisher classifier and its probability of error estimation

    Science.gov (United States)

    Chittineni, C. B.

    1979-01-01

    Computationally efficient expressions are derived for estimating the probability of error using the leave-one-out method. The optimal threshold for the classification of patterns projected onto Fisher's direction is derived. A simple generalization of the Fisher classifier to multiple classes is presented. Computational expressions are developed for estimating the probability of error of the multiclass Fisher classifier.
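A minimal version of the procedure on synthetic 2-D data: compute Fisher's direction w = Sw⁻¹(m1 − m0), project, classify against a threshold (the midpoint of the projected class means is used here as a simplification of the paper's optimal threshold), and estimate the probability of error by leave-one-out.

```python
def mean(xs):
    n = len(xs)
    return [sum(x[i] for x in xs) / n for i in range(len(xs[0]))]

def scatter(xs, m):
    """2x2 within-class scatter matrix about mean m."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for x in xs:
        d = (x[0] - m[0], x[1] - m[1])
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

def fisher_direction(class0, class1):
    """Fisher's direction w = Sw^-1 (m1 - m0) and a midpoint threshold."""
    m0, m1 = mean(class0), mean(class1)
    s0, s1 = scatter(class0, m0), scatter(class1, m1)
    sw = [[s0[i][j] + s1[i][j] + (1e-9 if i == j else 0.0) for j in range(2)]
          for i in range(2)]  # small ridge guards against singular scatter
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    dm = (m1[0] - m0[0], m1[1] - m0[1])
    w = ((sw[1][1] * dm[0] - sw[0][1] * dm[1]) / det,
         (-sw[1][0] * dm[0] + sw[0][0] * dm[1]) / det)
    t = 0.5 * (w[0] * (m0[0] + m1[0]) + w[1] * (m0[1] + m1[1]))
    return w, t

def loo_error(class0, class1):
    """Leave-one-out estimate of the classifier's probability of error."""
    errors = 0
    for label, group, other in ((0, class0, class1), (1, class1, class0)):
        for k in range(len(group)):
            held, rest = group[k], group[:k] + group[k + 1:]
            a, b = (rest, other) if label == 0 else (other, rest)
            w, t = fisher_direction(a, b)
            pred = 1 if w[0] * held[0] + w[1] * held[1] > t else 0
            errors += pred != label
    return errors / (len(class0) + len(class1))

# Hypothetical, well-separated 2-D data
c0 = [(0.0, 0.2), (0.3, -0.1), (-0.2, 0.1), (0.1, 0.4), (0.2, 0.0)]
c1 = [(2.0, 2.1), (1.8, 2.3), (2.2, 1.9), (1.9, 2.0), (2.1, 2.2)]
err = loo_error(c0, c1)
```

The paper's contribution is an efficient closed-form expression for this leave-one-out estimate; the sketch above simply refits the classifier once per held-out sample.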

  3. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Data manual. Part 2: Human error probability (HEP) data; Volume 5, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Reece, W.J.; Gilbert, B.G.; Richards, R.E. [EG and G Idaho, Inc., Idaho Falls, ID (United States)

    1994-09-01

    This data manual contains a hard copy of the information in the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) Version 3.5 database, which is sponsored by the US Nuclear Regulatory Commission. NUCLARR was designed as a tool for risk analysis. Many of the nuclear reactors in the US and several outside the US are represented in the NUCLARR database. NUCLARR includes both human error probability estimates for workers at the plants and hardware failure data for nuclear reactor equipment. Aggregations of these data yield valuable reliability estimates for probabilistic risk assessments and human reliability analyses. The data manual is organized to permit manual searches of the information if the computerized version is not available. Originally, the manual was published in three parts. In this revision the introductory material located in the original Part 1 has been incorporated into the text of Parts 2 and 3. The user can now find introductory material either in the original Part 1, or in Parts 2 and 3 as revised. Part 2 contains the human error probability data, and Part 3, the hardware component reliability data.

  4. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Data manual. Part 2: Human error probability (HEP) data; Volume 5, Revision 4

    International Nuclear Information System (INIS)

    Reece, W.J.; Gilbert, B.G.; Richards, R.E.

    1994-09-01

    This data manual contains a hard copy of the information in the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) Version 3.5 database, which is sponsored by the US Nuclear Regulatory Commission. NUCLARR was designed as a tool for risk analysis. Many of the nuclear reactors in the US and several outside the US are represented in the NUCLARR database. NUCLARR includes both human error probability estimates for workers at the plants and hardware failure data for nuclear reactor equipment. Aggregations of these data yield valuable reliability estimates for probabilistic risk assessments and human reliability analyses. The data manual is organized to permit manual searches of the information if the computerized version is not available. Originally, the manual was published in three parts. In this revision the introductory material located in the original Part 1 has been incorporated into the text of Parts 2 and 3. The user can now find introductory material either in the original Part 1, or in Parts 2 and 3 as revised. Part 2 contains the human error probability data, and Part 3, the hardware component reliability data

  5. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Data manual, Part 2: Human Error Probability (HEP) Data. Volume 5, Revision 4

    International Nuclear Information System (INIS)

    Reece, W.J.; Gilbert, B.G.; Richards, R.E.

    1994-09-01

    This data manual contains a hard copy of the information in the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) Version 3.5 database, which is sponsored by the US Nuclear Regulatory Commission. NUCLARR was designed as a tool for risk analysis. Many of the nuclear reactors in the US and several outside the US are represented in the NUCLARR database. NUCLARR includes both human error probability estimates for workers at the plants and hardware failure data for nuclear reactor equipment. Aggregations of these data yield valuable reliability estimates for probabilistic risk assessments and human reliability analyses. The data manual is organized to permit manual searches of the information if the computerized version is not available. Originally, the manual was published in three parts. In this revision the introductory material located in the original Part 1 has been incorporated into the text of Parts 2 and 3. The user can now find introductory material either in the original Part 1, or in Parts 2 and 3 as revised. Part 2 contains the human error probability data, and Part 3, the hardware component reliability data

  6. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Seong, Poong Hyun

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis for determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using logistic regression and stepwise variable selection was proposed, and the effects of the PSFs on HEPs related to soft controls were estimated through the methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, the procedure quality, practice level, and operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitations of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
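The estimation idea (binary error outcomes regressed on PSF indicators, with effects reported multiplicatively as odds ratios) can be sketched on synthetic data. The PSF labels, effect sizes, and the plain gradient-descent fit are all assumptions for illustration; the study used stepwise variable selection on real experimental data.

```python
import math
import random

def fit_logistic(X, y, steps=1500, lr=2.0):
    """Plain batch-gradient-descent logistic regression (no regularization)."""
    w = [0.0] * (len(X[0]) + 1)  # intercept + one weight per PSF
    n = len(X)
    for _ in range(steps):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = 1.0 / (1.0 + math.exp(-z)) - yi
            grad[0] += err / n
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj / n
        w = [wj - lr * g for wj, g in zip(w, grad)]
    return w

# Synthetic error-opportunity data; PSF labels and effect sizes are assumed:
# PSF1 "poor procedure quality" (odds x4), PSF2 "low practice level" (odds x2),
# nominal HEP ~0.05 when both PSFs are favorable.
rng = random.Random(0)
X, y = [], []
for _ in range(2500):
    x1, x2 = float(rng.random() < 0.5), float(rng.random() < 0.5)
    logit = math.log(0.05 / 0.95) + math.log(4.0) * x1 + math.log(2.0) * x2
    p = 1.0 / (1.0 + math.exp(-logit))
    X.append([x1, x2])
    y.append(1.0 if rng.random() < p else 0.0)

w = fit_logistic(X, y)
psf_multipliers = [math.exp(wj) for wj in w[1:]]  # odds ratio per PSF
```

Exponentiating the fitted coefficients recovers the multiplicative PSF contributions (approximately the planted odds ratios, up to sampling noise), which is the "multiplicative form" referred to in the abstract.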

  7. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies in the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and the acceptance guidelines of Regulatory Guide 1.174. HEP dependencies resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on the dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) for judging the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
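
The NUREG/CR-1278 dependence model referenced above assigns a standard conditional HEP expression to each of the five dependence levels. A small sketch of those expressions, where N is the basic HEP of the dependent action:

```python
# NUREG/CR-1278 (THERP) conditional probability expressions for the five
# dependence levels; N is the basic HEP of the second (dependent) action.
THERP_COND = {
    "ZD": lambda n: n,                    # zero dependence
    "LD": lambda n: (1 + 19 * n) / 20,    # low dependence
    "MD": lambda n: (1 + 6 * n) / 7,      # moderate dependence
    "HD": lambda n: (1 + n) / 2,          # high dependence
    "CD": lambda n: 1.0,                  # complete dependence
}

def conditional_hep(basic_hep, level):
    """Conditional HEP of a second action, given failure of the first."""
    return THERP_COND[level](basic_hep)
```

For example, a basic HEP of 1E-3 becomes about 0.5 under high dependence, which is why the judged dependence level can dominate the resulting CDF impact.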

  8. A classification scheme of erroneous behaviors for human error probability estimations based on simulator data

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea

    2017-01-01

    Because it has been indicated that empirical data supporting the estimates used in human reliability analysis (HRA) is insufficient, several databases have been constructed recently. To generate quantitative estimates from human reliability data, it is important to appropriately sort the erroneous behaviors found in the reliability data. Therefore, this paper proposes a scheme to classify the erroneous behaviors identified by the HuREX (Human Reliability data Extraction) framework through a review of the relevant literature. A case study of the human error probability (HEP) calculations is conducted to verify that the proposed scheme can be successfully implemented for the categorization of the erroneous behaviors and to assess whether the scheme is useful for the HEP quantification purposes. Although continuously accumulating and analyzing simulator data is desirable to secure more reliable HEPs, the resulting HEPs were insightful in several important ways with regard to human reliability in off-normal conditions. From the findings of the literature review and the case study, the potential and limitations of the proposed method are discussed. - Highlights: • A taxonomy of erroneous behaviors is proposed to estimate HEPs from a database. • The cognitive models, procedures, HRA methods, and HRA databases were reviewed. • HEPs for several types of erroneous behaviors are calculated as a case study.

  9. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumar Pradhan

    2013-05-01

    A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The method minimizes the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, subject to the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed-sample-size procedure. The results are applied to the cases in which the random variate follows a normal law as well as a Bernoullian law.
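
The test underlying this abstract is Wald's sequential probability ratio test, whose stopping boundaries are set from the two error probabilities. A minimal sketch for Bernoulli data; splitting the pre-assigned sum equally between the two error probabilities is a naive illustrative choice, whereas the paper optimizes how the sum is allocated:

```python
import math
import random

def sprt_bernoulli(samples, p0, p1, alpha, beta):
    """Wald SPRT for H0: p = p0 vs H1: p = p1.
    Returns the decision ('H0', 'H1', or 'continue') and the samples used."""
    upper = math.log((1 - beta) / alpha)   # accept H1 at or above this LLR
    lower = math.log(beta / (1 - alpha))   # accept H0 at or below this LLR
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "continue", len(samples)

random.seed(7)
s = 0.10                  # pre-assigned sum of the two error probabilities
alpha = beta = s / 2      # naive equal split (illustrative only)
data = [random.random() < 0.1 for _ in range(10000)]  # true p equals p0
decision, n_used = sprt_bernoulli(data, p0=0.1, p1=0.3, alpha=alpha, beta=beta)
```

The SPRT usually terminates after far fewer observations than the comparable fixed-sample-size test, which is the comparison the paper makes.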

  10. Intervention strategies for the management of human error

    Science.gov (United States)

    Wiener, Earl L.

    1993-01-01

    This report examines the management of human error in the cockpit. The principles probably apply to other applications in the aviation realm (e.g. air traffic control, dispatch, weather, etc.) as well as to other high-risk systems outside of aviation (e.g. shipping, high-technology medical procedures, military operations, nuclear power production). Management of human error is distinguished from error prevention. It is a more encompassing term, which includes not only the prevention of error, but also means of preventing an error, once made, from adversely affecting system output. Such techniques include: traditional human factors engineering, improvement of feedback and feedforward of information from system to crew, 'error-evident' displays which make erroneous input more obvious to the crew, trapping of errors within a system, goal-sharing between humans and machines (also called 'intent-driven' systems), paperwork management, and behaviorally based approaches, including procedures, standardization, checklist design, training, cockpit resource management, etc. Fifteen guidelines for the design and implementation of intervention strategies are included.

  11. Human Error Assessment in Minefield Cleaning Operation Using Human Event Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Hajiakbari

    2015-12-01

    Background & objective: Human error is one of the main causes of accidents. Due to the unreliability of the human element and the high-risk nature of demining operations, this study aimed to assess and manage the human errors likely to occur in such operations. Methods: This study was performed at a demining site in war zones located in the west of Iran. After acquiring an initial familiarity with the operations, methods, and tools of clearing minefields, the job tasks related to clearing landmines were specified. Next, these tasks were studied using HTA, and the related possible errors were assessed using ATHEANA. Results: The de-mining task was composed of four main operations: primary detection, technical identification, investigation, and neutralization. Four main causes were found for accidents occurring in such operations: walking on the mines, leaving mines with no action, errors in the neutralizing operation, and environmental explosion. The probability of human error in mine clearance operations was calculated as 0.010. Conclusion: The main causes of human error in de-mining operations can be attributed to various factors such as poor weather and operating conditions like outdoor work, inappropriate personal protective equipment, personality characteristics, insufficient accuracy in the work, and insufficient available time. To reduce the probability of human error in de-mining operations, the aforementioned factors should be managed properly.
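
An overall HEP like the study's 0.010 can be built up from the per-mode error probabilities as the chance that at least one failure mode occurs. A sketch with the four failure modes named above; the individual probabilities are illustrative placeholders, not values from the paper:

```python
# Hypothetical per-opportunity probabilities for the four failure modes
# identified in the study (values are illustrative, not from the paper).
failure_modes = {
    "walking on a mine": 4e-3,
    "leaving a mine with no action": 3e-3,
    "error in the neutralizing operation": 2e-3,
    "environmental explosion": 1e-3,
}

def overall_hep(mode_probs):
    """Probability that at least one of the independent failure modes occurs."""
    p_none = 1.0
    for p in mode_probs.values():
        p_none *= 1.0 - p
    return 1.0 - p_none

total = overall_hep(failure_modes)  # close to the simple sum for small probabilities
```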

  12. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Error detection codes are mechanisms that enable robust delivery of data over unreliable communication channels and devices. Unreliable channels and devices are error-prone objects, and error detection codes allow such errors to be detected. There are two classes of error detecting codes - classical codes and security-oriented codes. The classical codes detect a high percentage of errors; however, they have a high probability of missing an error caused by algebraic manipulation. In turn, security-oriented codes are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code in the case of error injection into the encoding device. The complexity of the encoding function also plays an important role in security-oriented codes. Encoding functions with less computational complexity and a low probability of masking are the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution. In particular, increasing the computational complexity decreases the difference between the maximum and the average value of the error masking probability. Our results show that functions with greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, with a complex encoding function the probability of algebraic manipulation is reduced. The paper discusses an approach to measuring the error masking
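
For the classical linear codes mentioned above, error masking has a concrete form: an error pattern goes undetected exactly when it is itself a nonzero codeword. A sketch that counts masked error patterns for the [7,4] Hamming code (a classical, not security-oriented, code, used here only to illustrate the masking probability):

```python
from itertools import product

# Generator matrix of the [7,4] Hamming code in systematic form.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(msg):
    """Codeword c = m * G over GF(2)."""
    return tuple(sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G))

codewords = {encode(m) for m in product([0, 1], repeat=4)}

# For a linear code, an additive error e is masked (undetected) exactly when
# e is a nonzero codeword, so over uniformly random nonzero error patterns
# the masking probability is (2**k - 1) / (2**n - 1).
n = 7
undetected = sum(1 for e in product([0, 1], repeat=n) if any(e) and e in codewords)
masking_prob = undetected / (2 ** n - 1)
```

Security-oriented codes trade this uniform structure away so that no single error pattern is masked with high probability under algebraic manipulation.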

  13. ERF/ERFC, Calculation of Error Function, Complementary Error Function, Probability Integrals

    International Nuclear Information System (INIS)

    Vogel, J.E.

    1983-01-01

    1 - Description of problem or function: ERF and ERFC are used to compute values of the error function and complementary error function for any real number. They may be used to compute other related functions such as the normal probability integrals. 4. Method of solution: The error function and complementary error function are approximated by rational functions. Three such rational approximations are used, depending on the range of x. In the first region the error function is computed directly and the complementary error function is computed via the identity erfc(x)=1.0-erf(x). In the other two regions the complementary error function is computed directly and the error function is computed from the identity erf(x)=1.0-erfc(x). The error function and complementary error function are real-valued functions of any real argument. The range of the error function is (-1,1). The range of the complementary error function is (0,2). 5. Restrictions on the complexity of the problem: The user is cautioned against using ERF to compute the complementary error function by using the identity erfc(x)=1.0-erf(x). This subtraction may cause partial or total loss of significance for certain values of x
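
The loss-of-significance caution at the end of this record is easy to demonstrate with the standard library's erf/erfc pair: for large x, erf(x) rounds to 1.0 in double precision, so the subtraction loses every significant digit, while the directly computed complementary function keeps full precision:

```python
import math

x = 10.0
naive = 1.0 - math.erf(x)   # catastrophic cancellation: erf(10) rounds to 1.0
direct = math.erfc(x)       # computed directly, retains full precision

# naive is exactly 0.0, while the true value of erfc(10) is about 2.1e-45
```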

  14. Procedures for using expert judgment to estimate human-error probabilities in nuclear power plant operations

    International Nuclear Information System (INIS)

    Seaver, D.A.; Stillwell, W.G.

    1983-03-01

    This report describes and evaluates several procedures for using expert judgment to estimate human-error probabilities (HEPs) in nuclear power plant operations. These HEPs are currently needed for several purposes, particularly for probabilistic risk assessments. Data do not exist for estimating these HEPs, so expert judgment can provide these estimates in a timely manner. Five judgmental procedures are described here: paired comparisons, ranking and rating, direct numerical estimation, indirect numerical estimation and multiattribute utility measurement. These procedures are evaluated in terms of several criteria: quality of judgments, difficulty of data collection, empirical support, acceptability, theoretical justification, and data processing. Situational constraints such as the number of experts available, the number of HEPs to be estimated, the time available, the location of the experts, and the resources available are discussed in regard to their implications for selecting a procedure for use
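
Once direct numerical estimates have been elicited from several experts, they must be aggregated into a single HEP. One common convention (an assumption here, not a procedure prescribed by this report) is to average in log space, i.e. take the geometric mean, because HEPs are judged on a ratio scale:

```python
import math

def aggregate_heps(estimates):
    """Geometric mean of expert HEP estimates (average in log10 space)."""
    logs = [math.log10(p) for p in estimates]
    return 10 ** (sum(logs) / len(logs))

# Hypothetical judgments from four experts for one task.
experts = [3e-3, 1e-2, 5e-3, 2e-3]
hep = aggregate_heps(experts)
```

The geometric mean is far less sensitive than the arithmetic mean to a single expert giving an estimate an order of magnitude away from the others.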

  15. SLIM-MAUD: an approach to assessing human error probabilities using structured expert judgment. Volume II. Detailed analysis of the technical issues

    International Nuclear Information System (INIS)

    Embrey, D.E.; Humphreys, P.; Rosa, E.A.; Kirwan, B.; Rea, K.

    1984-07-01

    This two-volume report presents the procedures and analyses performed in developing an approach for structuring expert judgments to estimate human error probabilities. Volume I presents an overview of work performed in developing the approach: SLIM-MAUD (Success Likelihood Index Methodology, implemented through the use of an interactive computer program called MAUD-Multi-Attribute Utility Decomposition). Volume II provides a more detailed analysis of the technical issues underlying the approach
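
The SLIM part of SLIM-MAUD rests on a simple quantitative core: a Success Likelihood Index (SLI) is formed as a weighted sum of PSF ratings, and log10(HEP) is assumed linear in the SLI, calibrated from tasks with known HEPs. A sketch with hypothetical weights, ratings, and anchor tasks:

```python
import math

def sli(weights, ratings):
    """Success Likelihood Index: weighted sum of PSF ratings (weights sum to 1)."""
    return sum(w * r for w, r in zip(weights, ratings))

def calibrate(sli1, hep1, sli2, hep2):
    """Solve log10(HEP) = a*SLI + b from two anchor tasks with known HEPs."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

# Hypothetical anchors: a well-supported task and a poorly supported one.
a, b = calibrate(sli1=9.0, hep1=1e-4, sli2=2.0, hep2=1e-1)
task_sli = sli([0.4, 0.3, 0.3], [6.0, 5.0, 7.0])   # weighted ratings -> 6.0
task_hep = 10 ** (a * task_sli + b)
```

A higher SLI (better conditions) yields a lower HEP, which is why `a` comes out negative.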

  16. SIMULATED HUMAN ERROR PROBABILITY AND ITS APPLICATION TO DYNAMIC HUMAN FAILURE EVENTS

    Energy Technology Data Exchange (ETDEWEB)

    Herberger, Sarah M.; Boring, Ronald L.

    2016-10-01

    Objectives: Human reliability analysis (HRA) methods typically analyze human failure events (HFEs) at the overall task level. For dynamic HRA, it is important to model human activities at the subtask level. There exists a disconnect between the dynamic subtask level and the static task level that presents issues when modeling dynamic scenarios. For example, the SPAR-H method is typically used to calculate the human error probability (HEP) at the task level. As demonstrated in this paper, quantification in SPAR-H does not translate to the subtask level. Methods: Two different discrete distributions were generated for each SPAR-H Performance Shaping Factor (PSF) to define the frequency of PSF levels. The first was a uniform, or uninformed, distribution that assumed the frequency of each PSF level was equally likely. The second, non-uniform distribution took the frequency of each PSF level as identified from an assessment of the HERA database. These two approaches were created to identify the resulting distribution of the HEP. The resulting HEP distribution that appears closer to the known distribution, a log-normal centered on 1E-3, is the more desirable. Median, average, and maximum HFE calculations are then applied to each approach. To calculate these three values, three events, A, B, and C, are generated from the PSF level frequencies and composed of subtasks. The median HFE selects the median PSF level from each PSF and calculates the HEP. The average HFE takes the mean PSF level, and the maximum takes the maximum PSF level. The same data set of subtask HEPs yields starkly different HEPs when aggregated to the HFE level in SPAR-H. Results: Assuming that each PSF level in each HFE is equally likely creates an unrealistic distribution of the HEP that is centered at 1. Next, the observed frequency of PSF levels was applied, with the resulting HEP behaving log-normally with a majority of the values under 2.5% HEP. The median, average and maximum HFE calculations did yield
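
The SPAR-H quantification that the abstract samples over multiplies a nominal HEP by the PSF multipliers, with an adjustment factor applied when three or more PSFs are negative (multiplier > 1) so the result stays below 1. A sketch of that calculation:

```python
# SPAR-H-style quantification: HEP = NHEP * product of PSF multipliers,
# with the adjustment HEP = NHEP*comp / (NHEP*(comp - 1) + 1) applied when
# three or more PSFs are negative (multiplier > 1), to keep HEP <= 1.
NHEP_ACTION = 1e-3       # SPAR-H nominal HEP for action tasks
NHEP_DIAGNOSIS = 1e-2    # SPAR-H nominal HEP for diagnosis tasks

def spar_h_hep(nhep, multipliers):
    comp = 1.0
    negative = 0
    for m in multipliers:
        comp *= m
        if m > 1.0:
            negative += 1
    if negative >= 3:
        return nhep * comp / (nhep * (comp - 1.0) + 1.0)
    return min(nhep * comp, 1.0)

hep = spar_h_hep(NHEP_ACTION, [10.0, 2.0, 5.0])  # three degraded PSFs
```

Sampling the multipliers from the two PSF-level distributions described in the abstract and pushing each draw through this function reproduces the kind of HEP distributions being compared.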

  17. Closed Form Aliasing Probability For Q-ary Symmetric Errors

    Directory of Open Access Journals (Sweden)

    Geetani Edirisooriya

    1996-01-01

    In Built-In Self-Test (BIST) techniques, test data reduction can be achieved using Linear Feedback Shift Registers (LFSRs). A faulty circuit may escape detection due to the loss of information inherent in data compaction schemes. This is referred to as aliasing. The probability of aliasing in Multiple-Input Shift-Registers (MISRs) has been studied under various bit error models. By modeling the signature analyzer as a Markov process, we show that the closed form expression previously derived for the aliasing probability of MISRs with primitive polynomials under the q-ary symmetric error model holds for all MISRs irrespective of their feedback polynomials, and for group cellular automata signature analyzers as well. If the erroneous behaviour of a circuit can be modelled with q-ary symmetric errors, then the test circuit complexity and propagation delay associated with the signature analyzer can be minimized by using a set of m single-bit LFSRs without increasing the probability of aliasing.
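
For a single-bit LFSR of width m, the aliasing probability under random errors is about 2^-m: by linearity, a fault aliases exactly when the signature of the error stream is zero. A Monte Carlo sketch for an 8-bit signature register (the polynomial choice and trial counts are illustrative):

```python
import random

POLY = 0x1D   # low byte of x^8 + x^4 + x^3 + x^2 + 1

def signature(bits, poly=POLY, width=8):
    """Serial signature analysis: clock an error-bit stream through an LFSR."""
    state = 0
    mask = (1 << width) - 1
    for b in bits:
        fb = ((state >> (width - 1)) & 1) ^ b
        state = (state << 1) & mask
        if fb:
            state ^= poly
    return state

# By linearity of the compactor, a fault aliases exactly when the signature
# of the error stream is zero; for random error bits this happens with
# probability about 2**-width = 1/256.
random.seed(0)
trials = 20000
aliased = sum(
    1 for _ in range(trials)
    if signature([random.getrandbits(1) for _ in range(64)]) == 0
)
rate = aliased / trials
```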

  18. Technique for human-error sequence identification and signification

    International Nuclear Information System (INIS)

    Heslinga, G.

    1988-01-01

    The aim of the present study was to investigate whether the event-tree technique can be used for the analysis of sequences of human errors that could cause initiating events. The scope of the study was limited to a consideration of the performance of procedural actions. The event-tree technique was modified to adapt it for this study and will be referred to as the 'Technique for Human-Error-Sequence Identification and Signification' (THESIS). The event trees used in this manner, i.e. THESIS event trees, appear to present additional problems if they are applied to human performance instead of technical systems. These problems, referred to as the 'Man-Related Features' of THESIS, are: the human capability to choose among several procedures, the ergonomics of the panel layout, human actions of a continuous nature, dependence between human errors, the human capability to recover possible errors, the influence of memory during the recovery attempt, variability in human performance, and correlations between human-error probabilities. The influence of these problems on the applicability of THESIS was assessed by means of mathematical analysis, field studies and laboratory experiments (author). 130 refs.; 51 figs.; 24 tabs

  19. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  20. The statistical significance of error probability as determined from decoding simulations for long codes

    Science.gov (United States)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
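
The situation this abstract addresses, very few observed errors in a large number of decoding trials, can still be handled with an exact (Clopper-Pearson) upper confidence bound on the error probability, computed here by bisection on the binomial tail. This is a standard construction, not necessarily the extension proposed in the paper:

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), summed in log space for stability."""
    total = 0.0
    for i in range(k + 1):
        log_term = (math.lgamma(n + 1) - math.lgamma(i + 1)
                    - math.lgamma(n - i + 1)
                    + i * math.log(p) + (n - i) * math.log1p(-p))
        total += math.exp(log_term)
    return total

def upper_bound(k, n, conf=0.95, tol=1e-10):
    """Clopper-Pearson upper confidence bound on the error probability
    after observing k errors in n independent decoding trials."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if binom_cdf(k, n, mid) > 1 - conf:
            lo = mid          # tail still too heavy: the bound must be larger
        else:
            hi = mid
    return (lo + hi) / 2

# Two observed decoding errors in a million trials:
ub = upper_bound(2, 10 ** 6)
```

With zero observed errors the bound reduces to the familiar "rule of three", roughly 3/n at 95% confidence, which illustrates how much even two observed errors tighten the statement one can make.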

  1. HUMAN RELIABILITY ANALYSIS DENGAN PENDEKATAN COGNITIVE RELIABILITY AND ERROR ANALYSIS METHOD (CREAM

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    Work accidents in the grinding and welding areas have ranked highest over the last five years at PT. X. These accidents are caused by human error, which occurs under the influence of the physical and non-physical working environment. This study uses scenarios to predict and reduce the likelihood of human error with the CREAM (Cognitive Reliability and Error Analysis Method) approach. CREAM is a human reliability analysis method for obtaining the Cognitive Failure Probability (CFP), which can be determined in two ways: the basic method and the extended method. The basic method yields only a general failure probability, whereas the extended method yields a CFP for each task. The results show that the factors influencing the occurrence of errors in grinding and welding work are the adequacy of the organization, the adequacy of the Man-Machine Interface (MMI) and operational support, the availability of procedures/plans, and the adequacy of training and experience. The cognitive aspect with the highest error value in grinding work is planning, with a CFP of 0.3, and in welding work it is the cognitive aspect of execution, with a CFP of 0.18. To reduce the cognitive error values in grinding and welding work, the recommendations are to provide routine training, more detailed work instructions, and familiarization with the tools. Keywords: CREAM (cognitive reliability and error analysis method), HRA (human reliability analysis), cognitive error
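
In the CREAM extended method referenced above, a nominal CFP for the dominant cognitive activity is adjusted multiplicatively by weighting factors determined from the common performance conditions (CPCs). A sketch of that adjustment; the nominal value and weights below are illustrative placeholders, not the published CREAM tables:

```python
# CREAM extended-method sketch: nominal CFP (CFP0) for a cognitive activity,
# multiplied by CPC weighting factors. Values are illustrative placeholders.
def adjusted_cfp(cfp0, cpc_weights):
    cfp = cfp0
    for w in cpc_weights:
        cfp *= w
    return min(cfp, 1.0)   # a probability cannot exceed 1

# e.g. a planning task under inadequate procedures and limited training:
cfp = adjusted_cfp(0.01, [5.0, 2.0, 1.0])
```

Weights above 1 correspond to CPCs that degrade performance; weights below 1 to CPCs that support it.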

  2. Probability of undetected error after decoding for a concatenated coding scheme

    Science.gov (United States)

    Costello, D. J., Jr.; Lin, S.

    1984-01-01

    A concatenated coding scheme for error control in data communications is analyzed. In this scheme, the inner code is used for both error correction and detection, while the outer code is used only for error detection. A retransmission is requested if the outer code detects the presence of errors after the inner code decoding. The probability of undetected error is derived and bounded. A particular example, proposed for the NASA telecommand system, is analyzed.

  3. The recovery factors analysis of the human errors for research reactors

    International Nuclear Information System (INIS)

    Farcasiu, M.; Nitoi, M.; Apostol, M.; Turcu, I.; Florescu, Ghe.

    2006-01-01

    The results of many Probabilistic Safety Assessment (PSA) studies show a very significant contribution of human errors to the unavailability of systems in nuclear installations. The treatment of human interactions is considered one of the major limitations in the context of PSA. To identify those human actions that can affect system reliability or availability, Human Reliability Analysis (HRA) must be applied. The analysis of recovery factors for human actions is an important step in HRA. This paper presents how human error probabilities (HEPs) can be reduced using those elements that have the capacity to recover a human error. The modeling of recovery factors aims to identify error-likely situations or situations that lead to the development of an accident. The analysis was performed with the THERP method. The necessary information was obtained from the operating experience of the TRIGA research reactor of INR Pitesti. The required data were obtained from generic databases. (authors)
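
In THERP terms, a recovery factor reduces the effective HEP because the error only matters if every recovery opportunity also fails. A minimal sketch (the non-recovery probabilities are illustrative assumptions):

```python
# Effect of recovery factors: the unrecovered error probability is the basic
# HEP times the probability that each recovery opportunity (a checker, an
# alarm, a later procedural step) also fails.
def unrecovered_hep(basic_hep, nonrecovery_probs):
    p = basic_hep
    for q in nonrecovery_probs:
        p *= q
    return p

# Hypothetical: checker misses 10% of errors, alarm ignored 50% of the time.
hep = unrecovered_hep(3e-3, [0.1, 0.5])
```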

  4. HUMAN ERROR QUANTIFICATION USING PERFORMANCE SHAPING FACTORS IN THE SPAR-H METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Harold S. Blackman; David I. Gertman; Ronald L. Boring

    2008-09-01

    This paper describes a cognitively based human reliability analysis (HRA) quantification technique for estimating the human error probabilities (HEPs) associated with operator and crew actions at nuclear power plants. The method described here, Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method, was developed to aid in characterizing and quantifying human performance at nuclear power plants. The intent was to develop a defensible method that would consider all factors that may influence performance. In the SPAR-H approach, calculation of HEP rates is especially straightforward, starting with pre-defined nominal error rates for cognitive vs. action-oriented tasks, and incorporating performance shaping factor multipliers upon those nominal error rates.

  5. Comparison of risk sensitivity to human errors in the Oconee and LaSalle PRAs

    International Nuclear Information System (INIS)

    Wong, S.; Higgins, J.

    1991-01-01

    This paper describes the comparative analyses of plant risk sensitivity to human errors in the Oconee and La Salle Probabilistic Risk Assessments (PRAs). These analyses were performed to determine the reasons for the observed differences in the sensitivity of core melt frequency (CMF) to changes in human error probabilities (HEPs). Plant-specific design features, PRA methods, and the level of detail and assumptions in the human error modeling were evaluated to assess their influence on risk estimates and sensitivities

  6. Plant specification of a generic human-error data through a two-stage Bayesian approach

    International Nuclear Information System (INIS)

    Heising, C.D.; Patterson, E.I.

    1984-01-01

    Expert judgement concerning human performance in nuclear power plants is quantitatively coupled with actuarial data on such performance in order to derive plant-specific human-error rate probability distributions. The coupling procedure consists of a two-stage application of Bayes' theorem to information which is grouped by type. The first information type contains expert judgement concerning human performance at nuclear power plants in general. Data collected on human performance at a group of similar plants forms the second information type. The third information type consists of data on human performance in a specific plant which has the same characteristics as the group members. The first and second information types are coupled in the first application of Bayes' theorem to derive a probability distribution for population performance. This distribution is then combined with the third information type in a second application of Bayes' theorem to determine a plant-specific human-error rate probability distribution. The two stage Bayesian procedure thus provides a means to quantitatively couple sparse data with expert judgement in order to obtain a human performance probability distribution based upon available information. Example calculations for a group of like reactors are also given. (author)
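
The two-stage coupling described above has a particularly simple form if each stage is a conjugate Beta update of a binomial error count. A sketch of that structure; all prior parameters and counts below are illustrative, not taken from the paper:

```python
# Two-stage Bayesian update with a conjugate Beta prior on the HEP.
# Stage 1: generic expert prior updated with pooled data from a group of
#          similar plants -> population distribution.
# Stage 2: population distribution updated with plant-specific data.
def beta_update(a, b, errors, opportunities):
    """Posterior Beta parameters after observing binomial error data."""
    return a + errors, b + (opportunities - errors)

a0, b0 = 0.5, 499.5                                        # expert prior, mean 1e-3
a1, b1 = beta_update(a0, b0, errors=4, opportunities=3000)  # stage 1: group data
a2, b2 = beta_update(a1, b1, errors=1, opportunities=800)   # stage 2: plant data
plant_hep_mean = a2 / (a2 + b2)
```

The posterior mean moves from the generic prior toward the plant's own experience as plant-specific evidence accumulates, which is exactly the coupling of sparse data with expert judgement that the abstract describes.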

  7. Human errors and mistakes

    International Nuclear Information System (INIS)

    Wahlstroem, B.

    1993-01-01

    Human errors are a major contributor to the risks of industrial accidents. Accidents have provided important lessons, making it possible to build safer systems. In avoiding human errors it is necessary to adapt the systems to their operators. The complexity of modern industrial systems is, however, increasing the danger of system accidents. Models of the human operator have been proposed, but the models are not able to give accurate predictions of human performance. Human errors can never be eliminated, but their frequency can be decreased by systematic efforts. The paper gives a brief summary of research on human error and concludes with suggestions for further work. (orig.)

  8. Study on relationship of performance shaping factor in human error probability with prevalent stress of PUSPATI TRIGA reactor operators

    Science.gov (United States)

    Rahim, Ahmad Nabil Bin Ab; Mohamed, Faizal; Farid, Mohd Fairus Abdul; Fazli Zakaria, Mohd; Sangau Ligam, Alfred; Ramli, Nurhayati Binti

    2018-01-01

    Human performance can be affected by prevalent stress, measured here using the Depression, Anxiety and Stress Scale (DASS). From the respondents' feedback it can be summarized that the main factor causing the highest prevalence of stress is working conditions that require operators to handle critical situations and make prompt critical decisions. Examining the relationship between prevalent stress and performance shaping factors, PSFFitness and PSFWork Process showed positive Pearson correlations, with scores of .763 and .826 and significance levels of p = .028 and p = .012, respectively. These are positive correlations, with good significance values, between prevalent stress and the human performance shaping factors (PSFs) related to fitness and to work processes and procedures. The higher the stress level of the respondents, the higher the scores selected for these PSFs. This is because higher levels of stress lead to deteriorating physical health and worsened cognition. In addition, a lack of understanding of the work procedures can also be a factor that increases stress. The higher these values, the higher the probability that human error will occur. Thus, monitoring the stress levels of RTP operators is important to ensure the safety of the RTP.

  9. AGAPE-ET for human error analysis of emergency tasks and its application

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2002-01-01

    The paper presents a proceduralised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), covering both qualitative error analysis and quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. The AGAPE-ET method is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified, considering the characteristics of the performance of each cognitive function and the influencing mechanism of the performance influencing factors (PIFs) on the cognitive function. Error analysis items have then been determined from the identified error causes or error-likely situations, and a human error analysis procedure based on the error analysis items is organised to cue or guide the analysts through the overall human error analysis. The basic scheme for the quantification of HEP consists of multiplying the BHEP assigned to the error analysis item by the weight from the influencing factors decision tree (IFDT) constituted for each cognitive function. The method is characterised by the structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts need only address the necessary cognitive functions. The paper also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results

  10. Quality assurance and human error effects on the structural safety

    International Nuclear Information System (INIS)

    Bertero, R.; Lopez, R.; Sarrate, M.

    1991-01-01

    Statistical surveys show that the frequency of structural failures is much larger than that expected by the codes. Evidence exists that human error (especially during the design process) is the main cause of the difference between the failure probability admitted by codes and the reality. In this paper, the attenuation of human error effects using the tools of quality assurance is analyzed. In particular, the importance of the independent design review is highlighted, and different approaches are discussed. The experience from the Atucha II project, as well as US and German practice on independent design review, is summarized. (Author)

  11. Analysis of Human Error Types and Performance Shaping Factors in the Next Generation Main Control Room

    International Nuclear Information System (INIS)

    Sin, Y. C.; Jung, Y. S.; Kim, K. H.; Kim, J. H.

    2008-04-01

    The main control rooms of nuclear power plants have been computerized and digitalized in new and modernized plants, as information and digital technologies make great progress and become mature. Human factors engineering issues in advanced MCRs were surveyed using two approaches: a model-based approach and a literature-survey-based approach. Human error types and performance shaping factors were then analyzed for three human errors. The results of the project can be used for task analysis, evaluation of human error probabilities, and analysis of performance shaping factors in HRA

  12. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    Energy Technology Data Exchange (ETDEWEB)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-08-15

    The operating environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces based on computer technologies. MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are particularly important because operating actions in NPP Advanced MCRs are performed through them. Using soft controls such as mouse input and touch screens, operators select a specific screen, choose the controller, and finally manipulate the given devices. Because the interfaces of soft controls differ from conventional hardwired controls, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for Advanced MCRs. In other words, new human error modes should be considered for interface management tasks, such as navigation and icon (device) selection on monitors, and a new HRA framework taking these newly generated error modes into account is needed. In this paper, a conceptual framework for an HRA method for the evaluation of soft control execution human error in Advanced MCRs is suggested by analyzing soft control tasks.

  13. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    International Nuclear Information System (INIS)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Won Dea

    2014-01-01

    The operating environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces based on computer technologies. MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are particularly important because operating actions in NPP Advanced MCRs are performed through them. Using soft controls such as mouse input and touch screens, operators select a specific screen, choose the controller, and finally manipulate the given devices. Because the interfaces of soft controls differ from conventional hardwired controls, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for Advanced MCRs. In other words, new human error modes should be considered for interface management tasks, such as navigation and icon (device) selection on monitors, and a new HRA framework taking these newly generated error modes into account is needed. In this paper, a conceptual framework for an HRA method for the evaluation of soft control execution human error in Advanced MCRs is suggested by analyzing soft control tasks.

  14. The role of human error in risk analysis: Application to pre- and post-maintenance procedures of process facilities

    International Nuclear Information System (INIS)

    Noroozi, Alireza; Khakzad, Nima; Khan, Faisal; MacKinnon, Scott; Abbassi, Rouzbeh

    2013-01-01

    Human factors play an important role in the safe operation of a facility. Human factors engineering includes the systematic application of information about human characteristics and behavior to increase the safety of a process system. A significant proportion of human errors occur during the maintenance phase; however, the quantification of human error probabilities in this phase has not received the attention it deserves. This paper focuses on a human factors analysis of pre- and post-maintenance pump operations. The procedures for removing process equipment from service (pre-maintenance) and returning it to service (post-maintenance) are considered for possible failure scenarios. For each scenario, the human error probability of each activity is calculated using the Success Likelihood Index Method (SLIM). Consequences are also assessed in this methodology. The risk assessment is conducted for each component, and the overall risk is estimated by summing the individual risks. The present study aims to highlight the importance of considering human error in quantitative risk analyses. The developed methodology has been applied to a case study of an offshore process facility.
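    SLIM, as used in the study above, maps a weighted sum of performance shaping factor ratings (the Success Likelihood Index, SLI) onto an HEP through a log-linear calibration between two anchor tasks of known HEP. The sketch below assumes the common form log10(HEP) = a·SLI + b; all weights, ratings, and anchor values are illustrative, not the paper's.

```python
import math

def slim_calibrate(sli1, hep1, sli2, hep2):
    """Solve log10(HEP) = a*SLI + b from two anchor tasks with known HEPs."""
    a = (math.log10(hep2) - math.log10(hep1)) / (sli2 - sli1)
    b = math.log10(hep1) - a * sli1
    return a, b

def slim_hep(sli, a, b):
    """Convert an SLI back to a human error probability."""
    return 10.0 ** (a * sli + b)

def sli(weights, ratings):
    """Success Likelihood Index: weighted sum of PSF ratings (weights sum to 1)."""
    return sum(w * r for w, r in zip(weights, ratings))

# Illustrative anchors: an easy task (SLI=1) with HEP 1e-4 and a hard task
# (SLI=0) with HEP 1e-1.
a, b = slim_calibrate(0.0, 1e-1, 1.0, 1e-4)
task_sli = sli([0.4, 0.3, 0.3], [0.8, 0.5, 0.6])  # three PSFs, rated 0..1
print(slim_hep(task_sli, a, b))  # ≈ 1.1e-3
```

The higher the SLI (more favourable performance shaping factors), the lower the resulting HEP, as the negative slope of the calibration implies.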

  15. Human Errors in Decision Making

    OpenAIRE

    Mohamad, Shahriari; Aliandrina, Dessy; Feng, Yan

    2005-01-01

    The aim of this paper was to identify human errors in the decision-making process. The study focused on the research question: what human errors can potentially cause decision failure during the evaluation of alternatives in the decision-making process? Two case studies were selected from the literature and analyzed to find the human errors that contribute to decision failure. The analysis of human errors was then linked with mental models in the evaluation-of-alternatives step. The results o...

  16. The using of the control room automation against human errors

    International Nuclear Information System (INIS)

    Kautto, A.

    1993-01-01

    Control room automation developed strongly during the 1980s at IVO (Imatran Voima Oy). This work expanded considerably with the building of the full-scope training simulator for the Loviisa plant. Important milestones have been, for example, the testing of the Critical Function Monitoring System, a concept developed by Combustion Engineering, Inc., in the Loviisa training simulator in 1982; the replacement of the process and simulator computers in Loviisa in 1989 and 1990; and the introduction of computer-based procedures in operator training in 1993. By developing automation and procedures it is possible to minimize the probability of human error. However, it is not possible to totally eliminate the risks caused by human errors. (orig.)

  17. A Human Error Analysis with Physiological Signals during Utilizing Digital Devices

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Oh, Yeon Ju; Shin, Kwang Hyeon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

    The introduction of advanced MCRs is accompanied by many changes in form and features by virtue of new digital technologies. There are various kinds of digital devices, such as flat panel displays, touch screens, and so on. The characteristics of these digital devices offer many opportunities for interface management, and they can be integrated into a compact single workstation in an advanced MCR so that workers can operate the plant with minimum burden under any operating condition. However, these devices may introduce new types of human errors, so we need a means to evaluate and prevent such errors, especially those related to digital devices. Human errors have been retrospectively assessed in accident reviews and quantitatively evaluated through HRA for PSA. However, ergonomic verification and validation is an important process for addressing all human error potential in NPP design. HRA is a crucial part of a PSA and helps prepare countermeasures in design by identifying potential human error items that affect the overall safety of NPPs. Various HRA techniques are available; however, they reveal shortcomings of HMI design in the digital era: HRA techniques depend on PSFs, which means the scope of human factors considered is limited in advance, so the attributes of new digital devices may not all be covered in HRA; the data used in HRA are not close to the evaluation items, so human error analysis is not easy to apply to design through individual experiments and cases; and the results of HRA are not statistically meaningful because accidents involving human errors in NPPs are rare and have been estimated as having an extremely low probability

  18. A theory of human error

    Science.gov (United States)

    Mcruer, D. T.; Clement, W. F.; Allen, R. W.

    1981-01-01

    Human errors tend to be treated in terms of clinical and anecdotal descriptions, from which remedial measures are difficult to derive. Correction of the sources of human error requires an attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A comprehensive analytical theory of the cause-effect relationships governing propagation of human error is indispensable to a reconstruction of the underlying and contributing causes. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation, maritime, automotive, and process control operations is highlighted. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.

  19. On the average capacity and bit error probability of wireless communication systems

    KAUST Repository

    Yilmaz, Ferkan

    2011-12-01

    Analyses of the average binary error probabilities and average capacity of wireless communication systems over generalized fading channels have been considered separately in the past. This paper introduces a novel moment generating function-based unified expression for both the average binary error probability and the average capacity of single- and multiple-link communication with maximal ratio combining. It is worth noting that the generic unified expression offered in this paper can be easily calculated and is applicable to a wide variety of fading scenarios; the mathematical formalism is illustrated with the generalized Gamma fading distribution in order to validate the correctness of our newly derived results. © 2011 IEEE.
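    As a sanity-check companion to the unified MGF approach, the textbook special case below compares the closed-form average BER of BPSK over Rayleigh fading with brute-force Monte Carlo averaging over the fading distribution; this is a standard result, not the paper's generalized Gamma analysis.

```python
import math, random

random.seed(1)

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def ber_rayleigh_closed(avg_snr):
    """Closed-form average BER of BPSK over Rayleigh fading."""
    return 0.5 * (1 - math.sqrt(avg_snr / (1 + avg_snr)))

def ber_rayleigh_mc(avg_snr, n=200_000):
    """Monte Carlo average of the conditional BER Q(sqrt(2*snr*g))."""
    total = 0.0
    for _ in range(n):
        g = random.expovariate(1.0)  # |h|^2 is exponential for Rayleigh fading
        total += q_func(math.sqrt(2 * avg_snr * g))
    return total / n

snr = 10.0  # average SNR (linear scale)
print(ber_rayleigh_closed(snr), ber_rayleigh_mc(snr))
```

The two values should agree to a few decimal places, which is exactly the kind of numerical validation the paper performs (against generalized Gamma fading) for its unified expression.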

  20. Applications of human error analysis to aviation and space operations

    International Nuclear Information System (INIS)

    Nelson, W.R.

    1998-01-01

    For the past several years at the Idaho National Engineering and Environmental Laboratory (INEEL) we have been working to apply methods of human error analysis to the design of complex systems. We have focused on adapting human reliability analysis (HRA) methods that were developed for Probabilistic Safety Assessment (PSA) for application to system design. We are developing methods so that human errors can be systematically identified during system design, the potential consequences of each error can be assessed, and potential corrective actions (e.g. changes to system design or procedures) can be identified. These applications lead to different requirements when compared with HRAs performed as part of a PSA. For example, because the analysis will begin early during the design stage, the methods must be usable when only partial design information is available. In addition, the ability to perform numerous ''what if'' analyses to identify and compare multiple design alternatives is essential. Finally, since the goals of such human error analyses focus on proactive design changes rather than the estimation of failure probabilities for PRA, there is more emphasis on qualitative evaluations of error relationships and causal factors than on quantitative estimates of error frequency. The primary vehicle we have used to develop and apply these methods has been a series of projects sponsored by the National Aeronautics and Space Administration (NASA) to apply human error analysis to aviation operations. The first NASA-sponsored project had the goal of evaluating human errors caused by advanced cockpit automation. Our next aviation project focused on the development of methods and tools to apply human error analysis to the design of commercial aircraft. This project was performed by a consortium comprising INEEL, NASA, and Boeing Commercial Airplane Group. The focus of the project was aircraft design and procedures that could lead to human errors during airplane maintenance

  1. Human Error Analysis in a Permit to Work System: A Case Study in a Chemical Plant

    Directory of Open Access Journals (Sweden)

    Mehdi Jahangiri

    2016-03-01

    Conclusion: The SPAR-H method applied in this study could analyze and quantify the potential human errors and extract the measures required for reducing the error probabilities in the PTW system. Some suggestions for reducing the likelihood of errors, especially regarding the modification of performance shaping factors and dependencies among tasks, are provided.

  2. Fast Outage Probability Simulation for FSO Links with a Generalized Pointing Error Model

    KAUST Repository

    Ben Issaid, Chaouki

    2017-02-07

    Over the past few years, free-space optical (FSO) communication has gained significant attention. In fact, FSO can provide cost-effective and unlicensed links with high bandwidth capacity and a low error rate, making it an exciting alternative to traditional wireless radio-frequency communication systems. However, system performance is affected not only by the presence of atmospheric turbulence, which occurs due to random fluctuations in the air refractive index, but also by the existence of pointing errors. Metrics such as the outage probability, which quantifies the probability that the instantaneous signal-to-noise ratio is smaller than a given threshold, can be used to analyze the performance of this system. In this work, we consider weak and strong turbulence regimes, and we study the outage probability of an FSO communication system under a generalized pointing error model with both a nonzero boresight component and different horizontal and vertical jitter effects. More specifically, we use an importance sampling approach based on the exponential twisting technique to offer fast and accurate results.
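    The variance-reduction idea (sample from a twisted distribution centred on the rare outage region and reweight by the likelihood ratio) can be illustrated on a toy Gaussian model; this is not the authors' FSO estimator, just the mechanics of importance sampling by mean shifting, which is the Gaussian special case of exponential twisting.

```python
import math, random

random.seed(0)

def outage_mc(threshold, n, shift=0.0):
    """Estimate P(Z < threshold) for Z ~ N(0,1), sampling from N(shift,1)
    and reweighting each hit by the likelihood ratio N(0,1)/N(shift,1)."""
    acc = 0.0
    for _ in range(n):
        z = random.gauss(shift, 1.0)
        if z < threshold:
            acc += math.exp(-shift * z + 0.5 * shift * shift)  # likelihood ratio
    return acc / n

t = -4.0  # deep-outage threshold; the true probability is about 3.2e-5
naive = outage_mc(t, 100_000)             # plain MC rarely sees the event
twisted = outage_mc(t, 100_000, shift=t)  # sampling centred on the rare region
print(naive, twisted)
```

With the shifted proposal, roughly half the samples land in the rare region, so the twisted estimate is stable at sample sizes where the naive estimate sees only a handful of hits.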

  3. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  4. A qualitative description of human error

    International Nuclear Information System (INIS)

    Li Zhaohuan

    1992-11-01

    Human error contributes significantly to the risk of reactor operation. Insights and analytical models are the main parts of human reliability analysis; these comprise the concept of human error, its nature, its generation mechanism, its classification, and human performance influencing factors. For an operating reactor, human error is defined as a task-human-machine mismatch. A human error event is characterized by an erroneous action and an unfavorable result. From the time limitation on performing a task, operations are divided into time-limited and time-open; the HCR (human cognitive reliability) model is suited only to time-limited operations. The basic cognitive process consists of information gathering, cognition/thinking, decision making, and action. A human erroneous action may be generated at any stage of this process. Natural ways to classify human errors are presented. Human performance influencing factors, including personal, organizational, and environmental factors, are also listed

  5. A qualitative description of human error

    Energy Technology Data Exchange (ETDEWEB)

    Zhaohuan, Li [Academia Sinica, Beijing, BJ (China). Inst. of Atomic Energy

    1992-11-01

    Human error contributes significantly to the risk of reactor operation. Insights and analytical models are the main parts of human reliability analysis; these comprise the concept of human error, its nature, its generation mechanism, its classification, and human performance influencing factors. For an operating reactor, human error is defined as a task-human-machine mismatch. A human error event is characterized by an erroneous action and an unfavorable result. From the time limitation on performing a task, operations are divided into time-limited and time-open; the HCR (human cognitive reliability) model is suited only to time-limited operations. The basic cognitive process consists of information gathering, cognition/thinking, decision making, and action. A human erroneous action may be generated at any stage of this process. Natural ways to classify human errors are presented. Human performance influencing factors, including personal, organizational, and environmental factors, are also listed.

  6. Undetected error probability for data services in a terrestrial DAB single frequency network

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.; Veldhuis, Raymond N.J.; Veldhuis, R.N.J.; Cronie, H.S.

    2007-01-01

    DAB (Digital Audio Broadcasting) is the European successor of FM radio. Besides audio services, other services such as traffic information can be provided. An important parameter for data services is the probability of non-recognized or undetected errors in the system. To derive this probability, we

  7. Sporadic error probability due to alpha particles in dynamic memories of various technologies

    International Nuclear Information System (INIS)

    Edwards, D.G.

    1980-01-01

    The sensitivity of MOS memory components to errors induced by alpha particles is expected to increase with integration level. The soft error rate of a 65-kbit VMOS memory has been compared experimentally with that of three field-proven 16-kbit designs. The technological and design advantages of the VMOS RAM ensure an error rate which is lower than those of the 16-kbit memories. Calculation of the error probability for the 65-kbit RAM and comparison with the measurements show that, for large duty cycles, single-particle hits lead to sensing errors, while for small duty cycles cell errors caused by multiple hits predominate. (Auth.)

  8. Human error probability quantification using fuzzy methodology in nuclear plants; Aplicacao da metodologia fuzzy na quantificacao da probabilidade de erro humano em instalacoes nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Claudio Souza do

    2010-07-01

    This work obtains Human Error Probability (HEP) estimates for operator actions in response to hypothesized emergency situations at the IEA-R1 research reactor at IPEN. An evaluation of Performance Shaping Factors (PSFs) was also carried out in order to classify them according to their level of influence on the operators' actions and to determine the actual states of these PSFs at the plant. Both the HEP estimation and the PSF evaluation were based on expert judgment elicited through interviews and questionnaires. The expert group was composed of selected IEA-R1 operators. Expert knowledge was represented through linguistic variables, and group evaluation values were obtained through Fuzzy Logic and Fuzzy Set Theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative for use in Human Reliability Analysis (HRA). (author)
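    A minimal sketch of the fuzzy pipeline described above: expert linguistic judgments are mapped to triangular fuzzy numbers, aggregated, and defuzzified to a crisp HEP. The linguistic scale and the judgments below are invented for illustration and are not the scale used in the work.

```python
# Hypothetical linguistic scale: terms map to triangular fuzzy numbers
# (low, mode, high) on a log10(HEP) axis. Scale and judgments are invented.
SCALE = {
    "very unlikely": (-5.0, -4.0, -3.0),
    "unlikely":      (-4.0, -3.0, -2.0),
    "likely":        (-3.0, -2.0, -1.0),
}

def aggregate(judgments):
    """Average the experts' triangular fuzzy numbers component-wise."""
    n = len(judgments)
    return tuple(sum(SCALE[j][i] for j in judgments) / n for i in range(3))

def defuzzify(tri):
    """Centroid of a triangular fuzzy number (mean of its three points)."""
    return sum(tri) / 3.0

judgments = ["unlikely", "unlikely", "very unlikely"]
log_hep = defuzzify(aggregate(judgments))
print(10 ** log_hep)  # crisp HEP estimate
```

Working on a log10 axis keeps the triangular arithmetic simple while spanning the orders of magnitude typical of HEPs.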

  9. Human Errors and Bridge Management Systems

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, A. S.

    on basis of reliability profiles for bridges without human errors are extended to include bridges with human errors. The first rehabilitation distributions for bridges without and with human errors are combined into a joint first rehabilitation distribution. The methodology presented is illustrated...... for reinforced concrete bridges....

  10. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA) and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated using the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far for analyzing diagnosis error probability are suggested as part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules

  11. On the average capacity and bit error probability of wireless communication systems

    KAUST Repository

    Yilmaz, Ferkan; Alouini, Mohamed-Slim

    2011-01-01

    Analysis of the average binary error probabilities and average capacity of wireless communications systems over generalized fading channels have been considered separately in the past. This paper introduces a novel moment generating function

  12. Game Design Principles based on Human Error

    Directory of Open Access Journals (Sweden)

    Guilherme Zaffari

    2016-03-01

    This paper presents the results of the authors' research regarding the incorporation of Human Error, through design principles, into video game design. In general, designers must consider Human Error factors throughout video game interface development; however, where the core design is concerned, adaptations are needed, since challenge is an important factor for fun and, from the perspective of Human Error, challenge can be considered a flaw in the system. The research utilized Human Error classifications, data triangulation via predictive human error analysis, and the expanded flow theory to design a set of principles that match the design of playful challenges with the principles of Human Error. From the results, it was possible to conclude that the application of Human Error in game design has a positive effect on player experience, allowing the player to encounter only errors associated with the intended aesthetics of the game.

  13. Human errors in NPP operations

    International Nuclear Information System (INIS)

    Sheng Jufang

    1993-01-01

    Based on the operational experience of nuclear power plants (NPPs), the importance of studying human performance problems is described. Statistical analysis of the significance and frequency of various root causes and error modes from a large number of human-error-related events demonstrates that defects in operation/maintenance procedures, workplace factors, communication, and training practices are the primary root causes, while omission, transposition, and quantitative mistakes are the most frequent error modes. Recommendations for domestic research on human performance problems in NPPs are suggested

  14. Human errors, countermeasures for their prevention and evaluation

    International Nuclear Information System (INIS)

    Kohda, Takehisa; Inoue, Koichi

    1992-01-01

    Accidents originating in human error have continued to occur, as in recent large accidents such as the TMI accident and the Chernobyl accident. The proportion of accidents originating in human error is unexpectedly high; hence, although the reliability and safety of hardware will continue to improve, a comparable improvement in human reliability cannot be expected. Human errors arise from the difference between the function required of humans and the function actually accomplished by them, and the results exert adverse effects on systems. Human errors are classified into design, manufacture, operation, maintenance, checkup, and general handling errors. In terms of behavior, human errors are classified into forgetting to do, failing to do, doing what must not be done, mistaking the order, and doing something at an improper time. The factors in human error occurrence are circumstantial, personal, and stress factors. As methods of analyzing and evaluating human errors, systems engineering methods such as probabilistic risk assessment are used. The technique for human error rate prediction, the human cognitive reliability method, the confusion matrix, and SLIM-MAUD are also used. (K.I.)

  15. Research trend on human error reduction

    International Nuclear Information System (INIS)

    Miyaoka, Sadaoki

    1990-01-01

    Human error has been a problem in all industries. In 1988, the Bureau of Mines, US Department of the Interior, carried out a worldwide survey on human error in all industries in relation to fatal accidents in mines. The results differed according to the data collection methods, but the proportion of total accidents attributable to human error ranged widely, from 20∼85%, and was 35% on average. The rate of occurrence of accidents and troubles in Japanese nuclear power stations is shown; the rate of occurrence of human error is 0∼0.5 cases/reactor-year, which did not vary much. Therefore, the proportion of the total attributable to human error has tended to increase, and reducing human error has become important for lowering the rate of occurrence of accidents and troubles hereafter. After the TMI accident in 1979 in the USA, research on man-machine interfaces became active, and after the Chernobyl accident in 1986 in the USSR, problems of organization and management have been studied. In Japan, 'Safety 21' was drawn up by the Advisory Committee for Energy, and the annual reports on nuclear safety also pointed out the importance of human factors. The state of research on human factors in Japan and abroad and three targets for reducing human error are reported. (K.I.)

  16. Error Probability Analysis of Hardware Impaired Systems with Asymmetric Transmission

    KAUST Repository

    Javed, Sidrah; Amin, Osama; Ikki, Salama S.; Alouini, Mohamed-Slim

    2018-01-01

    The error probability of hardware-impaired (HWI) systems depends strongly on the adopted model. Recent models have shown that the aggregate noise is equivalent to improper Gaussian signals. Therefore, considering the distinct noise nature and self-interfering (SI) signals, an optimal maximum likelihood (ML) receiver is derived. This renders the conventional minimum Euclidean distance (MED) receiver sub-optimal, because it is based on the assumptions of ideal hardware transceivers and proper Gaussian noise in communication systems. Next, the average error probability performance of the proposed optimal ML receiver is analyzed, and tight bounds and approximations are derived for various systems, including transmitter and receiver I/Q-imbalanced systems with or without transmitter distortions, as well as transmitter-only or receiver-only impaired systems. Motivated by recent studies that shed light on the benefit of improper Gaussian signaling in mitigating HWIs, asymmetric quadrature amplitude modulation or phase shift keying is optimized and adapted for transmission. Finally, different numerical and simulation results are presented to support the superiority of the proposed ML receiver over the MED receiver, the tightness of the derived bounds, and the effectiveness of asymmetric transmission in dampening HWIs and improving overall system performance.

  17. Error Probability Analysis of Hardware Impaired Systems with Asymmetric Transmission

    KAUST Repository

    Javed, Sidrah

    2018-04-26

    The error probability of hardware-impaired (HWI) systems depends strongly on the adopted model. Recent models have shown that the aggregate noise is equivalent to improper Gaussian signals. Therefore, considering the distinct noise nature and self-interfering (SI) signals, an optimal maximum likelihood (ML) receiver is derived. This renders the conventional minimum Euclidean distance (MED) receiver sub-optimal, because it is based on the assumptions of ideal hardware transceivers and proper Gaussian noise in communication systems. Next, the average error probability performance of the proposed optimal ML receiver is analyzed, and tight bounds and approximations are derived for various systems, including transmitter and receiver I/Q-imbalanced systems with or without transmitter distortions, as well as transmitter-only or receiver-only impaired systems. Motivated by recent studies that shed light on the benefit of improper Gaussian signaling in mitigating HWIs, asymmetric quadrature amplitude modulation or phase shift keying is optimized and adapted for transmission. Finally, different numerical and simulation results are presented to support the superiority of the proposed ML receiver over the MED receiver, the tightness of the derived bounds, and the effectiveness of asymmetric transmission in dampening HWIs and improving overall system performance.

  18. Minimum Probability of Error-Based Equalization Algorithms for Fading Channels

    Directory of Open Access Journals (Sweden)

    Janos Levendovszky

    2007-06-01

    Novel channel equalizer algorithms are introduced for wireless communication systems to combat channel distortions resulting from multipath propagation. The novel algorithms are based on newly derived bounds on the probability of error (PE) and guarantee better performance than the traditional zero-forcing (ZF) or minimum mean square error (MMSE) algorithms. The new equalization methods require channel state information, which is obtained by a fast adaptive channel identification algorithm. As a result, the combined convergence time needed for channel identification and PE minimization still remains smaller than the convergence time of traditional adaptive algorithms, yielding real-time equalization. The performance of the new algorithms is tested by extensive simulations on standard mobile channels.
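    For context, the zero-forcing baseline that the proposed minimum-PE equalizers are compared against can be sketched for a simple two-tap channel; this noiseless toy example is an illustration under stated assumptions, not the paper's algorithm.

```python
# Zero-forcing (ZF) equalization sketch for a channel H(z) = 1 + a*z^-1.
# The ZF equalizer 1/H(z) is truncated to the geometric FIR series (-a)^k.

def convolve(x, h):
    """Plain linear convolution of two sequences."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

a = 0.5
channel = [1.0, a]
eq = [(-a) ** k for k in range(12)]  # truncated inverse filter

symbols = [1, -1, -1, 1, 1, -1, 1, 1]
received = convolve(symbols, channel)           # ISI-distorted signal
equalized = convolve(received, eq)[: len(symbols)]
decisions = [1 if s > 0 else -1 for s in equalized]
print(decisions == symbols)  # True in this noiseless sketch
```

ZF inverts the channel exactly (up to truncation) but amplifies noise near channel spectral nulls, which is precisely why MMSE and, further, minimum-PE criteria can outperform it on noisy fading channels.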

  19. The relative impact of sizing errors on steam generator tube failure probability

    International Nuclear Information System (INIS)

    Cizelj, L.; Dvorsek, T.

    1998-01-01

    Outside Diameter Stress Corrosion Cracking (ODSCC) at tube support plates is currently the major degradation mechanism affecting steam generator tubes made of Inconel 600. This has led to the development and licensing of degradation-specific maintenance approaches, which address the two main failure modes of the degraded tubing: tube rupture and excessive leakage through degraded tubes. A methodology for assessing the efficiency of a given set of possible maintenance approaches has already been proposed by the authors. It pointed out the better performance of degradation-specific over generic approaches in (1) lower probability of single and multiple steam generator tube rupture (SGTR), (2) lower estimated accidental leak rates, and (3) fewer tubes plugged. A sensitivity analysis was also performed, pointing out the relative contributions of uncertain input parameters to the tube rupture probabilities. The dominant contribution was assigned to the uncertainties inherent in the regression models used to correlate defect size and tube burst pressure. The uncertainties, which can be estimated from in-service inspections, are further analysed in this paper. Defect growth was found to have a significant and, to some extent, unrealistic impact on the probability of single tube rupture. Since the defect growth estimates were based on past inspection records, they depend strongly on sizing errors. Therefore, an attempt was made to filter out the sizing errors and arrive at more realistic estimates of defect growth. The impact of different assumptions regarding sizing errors on the tube rupture probability was studied using a realistic numerical example. The data used were obtained from a series of inspection results from the Krsko NPP, with two Westinghouse D-4 steam generators. The results obtained are considered useful in the safety assessment and maintenance of affected steam generators. (author)

  20. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    Shen, S.-H.; Smidts, C.; Mosleh, A.

    1997-01-01

This paper presents a model-based human error taxonomy and data collection methodology. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of a nuclear power plant operating crew during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics, such as 'error in selection of problem solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics that extract generic characteristics of error-prone behaviors and error-prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures, with associated training, to minimize or preclude certain human errors. It also helps in the design of control rooms and in the assessment of human error probabilities within the probabilistic risk assessment framework. (orig.)

  1. Human error data collection as a precursor to the development of a human reliability assessment capability in air traffic management

    International Nuclear Information System (INIS)

    Kirwan, Barry; Gibson, W. Huw; Hickling, Brian

    2008-01-01

Quantified risk and safety assessments are now required for safety cases for European air traffic management (ATM) services. Since ATM is highly human-dependent for its safety, this suggests a need for formal human reliability assessment (HRA), as carried out in other industries such as nuclear power. Since the fundamental ingredient of HRA is human error data, in the form of human error probabilities (HEPs), it was decided to take a first step towards the development of an ATM HRA approach by deriving some HEPs in an ATM context. This paper reports a study that collected HEPs by analysing the results of a real-time simulation involving air traffic controllers (ATCOs) and pilots, with a focus on communication errors. The study did indeed derive HEPs that were found to be concordant with other known communication human error data. This is a first step, and shows promise for HRA in ATM, since the derived HEPs could be used in safety assessments, although they cover only one (albeit critical) aspect of ATCOs' tasks (communications). The paper discusses options and potential ways forward for the development of a full HRA capability in ATM

  2. A human error analysis methodology, AGAPE-ET, for emergency tasks in nuclear power plants and its application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

This report presents a procedurised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), for both qualitative error analysis and quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. AGAPE-ET is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified considering the characteristics of the performance of that function and the influencing mechanism of performance influencing factors (PIFs) on it. Error analysis items were then determined from the identified error causes or error-likely situations to cue and guide the analyst through the overall human error analysis, and a human error analysis procedure based on these items is organised. The basic scheme for the quantification of HEP consists of multiplying the basic HEP (BHEP) assigned to the error analysis item by the weight from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analyst need only consider the relevant cognitive functions. The report also presents the application of AGAPE-ET to 31 nuclear emergency tasks and the results. 42 refs., 7 figs., 36 tabs. (Author)
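The quantification scheme just described reduces, numerically, to HEP = BHEP × IFDT weight. The sketch below uses hypothetical numbers; the actual BHEPs and tree weights are defined by the AGAPE-ET tables and are not reproduced here.

```python
def agape_et_hep(bhep, ifdt_weight):
    """AGAPE-ET style quantification: multiply the basic HEP assigned
    to an error analysis item by the weight obtained from the
    influencing factors decision tree (IFDT), capping at 1.0 since
    a probability cannot exceed one."""
    return min(bhep * ifdt_weight, 1.0)

# Hypothetical numbers: a nominal BHEP of 3e-3 for a diagnosis-related
# item, degraded tenfold by adverse performance influencing factors.
print(agape_et_hep(3e-3, 10.0))
```

With a nominal BHEP of 3e-3 and a weight of 10, the resulting HEP is 3e-2; the cap only matters when large weights meet already-high basic probabilities.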

  3. Analysis of Employee's Survey for Preventing Human-Errors

    International Nuclear Information System (INIS)

    Sung, Chanho; Kim, Younggab; Joung, Sanghoun

    2013-01-01

Human errors in nuclear power plants can cause events or incidents, large and small. These events or incidents are among the main contributors to reactor trips and can threaten plant safety. To prevent human errors, KHNP (Korea Hydro and Nuclear Power) introduced human-error prevention techniques and has applied them to key areas such as plant operation, operation support, and maintenance and engineering. This paper proposes methods to prevent and reduce human errors in nuclear power plants by analysing the results of a survey covering both the utilization of the prevention techniques and employees' awareness of human-error prevention. The analysis showed that employees' understanding and utilization of the techniques were generally high, and that the level of training and its effect on actual work were satisfactory. Employees also answered that the root causes of human error lay in the working environment, including tight schedules, manpower shortages, and excessive workload, rather than in personal negligence or lack of knowledge; consideration of the working environment is therefore clearly needed. Based on this survey, the best methods of preventing human error at present are personal equipment, substantial training and education, mental-health checks before starting work, prohibition of multitasking, compliance with procedures, and enhanced job-site review. The most important and basic factors for preventing human error, however, are the interest of the workers themselves and an organizational atmosphere of open communication between managers and workers.

  4. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions

  5. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  6. Human error in remote Afterloading Brachytherapy

    International Nuclear Information System (INIS)

    Quinn, M.L.; Callan, J.; Schoenfeld, I.; Serig, D.

    1994-01-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error

  7. The calculation of average error probability in a digital fibre optical communication system

    Science.gov (United States)

    Rugemalira, R. A. M.

    1980-03-01

    This paper deals with the problem of determining the average error probability in a digital fibre optical communication system, in the presence of message dependent inhomogeneous non-stationary shot noise, additive Gaussian noise and intersymbol interference. A zero-forcing equalization receiver filter is considered. Three techniques for error rate evaluation are compared. The Chernoff bound and the Gram-Charlier series expansion methods are compared to the characteristic function technique. The latter predicts a higher receiver sensitivity
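For intuition about why different error-rate evaluation techniques predict different receiver sensitivities, it helps to compare an exact tail probability with a bound on it. The illustration below assumes a plain additive-Gaussian decision statistic, far simpler than the shot-noise model treated in the paper: the Chernoff bound is safe but loose, so a sensitivity estimate based on it is pessimistic relative to an exact (e.g. characteristic-function) calculation.

```python
import math

def q_function(x):
    """Exact Gaussian tail: Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def chernoff_bound(x):
    """Chernoff bound on the same tail: Q(x) <= exp(-x^2 / 2)."""
    return math.exp(-x * x / 2.0)

# The gap between bound and exact value widens the error-rate estimate
# by roughly an order of magnitude at typical operating points.
for d in (2.0, 4.0, 6.0):
    print(d, q_function(d), chernoff_bound(d))
```

At d = 6 the exact tail is near 1e-9 while the Chernoff bound is near 1.5e-8, which is the kind of discrepancy that shows up as a difference in predicted receiver sensitivity.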

  8. Findings from analysing and quantifying human error using current methods

    International Nuclear Information System (INIS)

    Dang, V.N.; Reer, B.

    1999-01-01

In human reliability analysis (HRA), the scarcity of data means that, at best, judgement must be applied to transfer to the domain of the analysis what data are available for similar tasks. In particular, for the quantification of tasks involving decisions, the analyst has to choose among quantification approaches that all depend to a significant degree on expert judgement. The use of expert judgement can be made more reliable by eliciting relative rather than absolute judgements. These approaches, which are based on multiple-criterion decision theory, focus on ranking the tasks to be analysed by difficulty. While they remedy, at least partially, the poor performance of experts in estimating probabilities, they nevertheless require calibration of the relative scale on which the actions are ranked in order to obtain the probabilities of interest. This paper presents results from a comparison of current HRA methods performed in the framework of a study of SLIM calibration options. The HRA quantification methods THERP, HEART, and INTENT were applied to derive calibration human error probabilities for two groups of operator actions. (author)
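The calibration step mentioned above is, in the standard SLIM formulation, a log-linear mapping log10(HEP) = a·SLI + b whose two constants are fixed by anchor tasks with known HEPs (for example, values derived from THERP or HEART, as in the study). The anchor numbers below are invented for illustration only.

```python
import math

def calibrate_slim(sli1, hep1, sli2, hep2):
    """Fit log10(HEP) = a*SLI + b from two anchor tasks whose
    human error probabilities are known."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

def slim_hep(sli, a, b):
    """HEP for a task ranked at success likelihood index `sli`."""
    return 10.0 ** (a * sli + b)

# Invented anchors: an easy action (SLI=0.9, HEP=1e-4) and a difficult
# one (SLI=0.2, HEP=1e-1); intermediate tasks interpolate on a log scale.
a, b = calibrate_slim(0.9, 1e-4, 0.2, 1e-1)
print(slim_hep(0.55, a, b))   # ~3.2e-3, the log-midpoint of the anchors
```

The choice of anchors dominates the result, which is exactly why the quality of the calibration HEPs supplied by other methods matters.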

  9. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action
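The contextual idea behind CREAM can be sketched crudely in code. The failure-probability intervals below are the four control modes of basic CREAM (after Hollnagel); the mapping from counts of performance-reducing and performance-improving common performance conditions (CPCs) to a mode is a deliberately simplified stand-in for the two-dimensional chart used in the actual method, and says nothing about the fuzzy extension proposed in the paper.

```python
# Failure-probability intervals for the four control modes of basic
# CREAM (after Hollnagel, 1998).
CONTROL_MODES = [
    ("strategic",     (0.5e-5, 1e-2)),
    ("tactical",      (1e-3,   1e-1)),
    ("opportunistic", (1e-2,   0.5)),
    ("scrambled",     (1e-1,   1.0)),
]

def control_mode(n_reduced, n_improved):
    """Pick a control mode from the net count of CPCs rated as
    reducing vs. improving performance reliability. The thresholds
    below are illustrative only, not the method's actual chart."""
    score = n_reduced - n_improved
    if score <= 1:
        return CONTROL_MODES[0]
    if score <= 4:
        return CONTROL_MODES[1]
    if score <= 7:
        return CONTROL_MODES[2]
    return CONTROL_MODES[3]

mode, (lo, hi) = control_mode(3, 0)
print(mode, lo, hi)   # tactical 0.001 0.1
```

The paper's extension replaces exactly this crisp counting with fuzzy ratings and unequal CPC weights, turning the interval lookup into a graded estimate.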

  10. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    OpenAIRE

    Rabiul Islam; Faisal Khan; Rouzbeh Abbassi; Vikram Garaniya

    2018-01-01

    Background: Maintenance operations on-board ships are highly demanding. Maintenance operations are intensive activities requiring high man–machine interactions in challenging and evolving conditions. The evolving conditions are weather conditions, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather condition affects seafarers' performance, increasing the chances of error, and, consequently, can cause injuries or fatalities to personne...

  11. Can human error theory explain non-adherence?

    Science.gov (United States)

    Barber, Nick; Safdar, A; Franklin, Bryoney D

    2005-08-01

To apply human error theory to explain non-adherence and examine how well it fits. Patients who were taking chronic medication were telephoned and asked whether they had been adhering to their medicine; if not, the reasons were explored and analysed according to human error theory. Of 105 patients, 87 were contacted by telephone and took part in the study. Forty-two recalled being non-adherent, 17 of them in the last 7 days; 11 of the 42 were intentionally non-adherent. The errors could be described by human error theory, which explained unintentional non-adherence well; however, the application of 'rules' was difficult when considering mistakes. The consideration of error-producing conditions and latent failures also revealed useful contributing factors. Human error theory offers a new and valuable way of understanding non-adherence, and could inform interventions. However, the theory needs further development to explain intentional non-adherence.

  12. Human errors in operation - what to do with them?

    International Nuclear Information System (INIS)

    Michalek, J.

    2009-01-01

'It is human to make errors!' This saying of our predecessors is still valid and will remain so for as long as humans are human. Errors cannot be completely eliminated from human activities: on average, a person makes two simple errors per hour. For example, how many typing errors do we make at the computer keyboard? How many times do we write the wrong date in the first days of a new year? Such errors have no major consequences, but in certain situations human errors are very unpleasant, may be very costly, and may even endanger human lives. (author)

  13. Human Error and Organizational Management

    Directory of Open Access Journals (Sweden)

    Alecxandrina DEACONU

    2009-01-01

Full Text Available The concern for performance is a topic that raises interest in the business environment but also in other areas that – even if they seem distant from this world – are aware of, interested in, or conditioned by economic development. As individual performance is very much influenced by the human resource, we chose to analyze in this paper the mechanisms that generate – consciously or not – human error nowadays. Moreover, the extremely tense Romanian context, where failure is rather the rule than the exception, made us investigate the phenomenon of human error and the ways to diminish its effects.

  14. Modeling the probability distribution of positional errors incurred by residential address geocoding

    Directory of Open Access Journals (Sweden)

    Mazumdar Soumya

    2007-01-01

Full Text Available Abstract Background The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Results Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (>15 km) outliers occurred among the 60%-matched geocoding errors; outliers occurred for the other two types of geocoding errors as well, but were much smaller. E911 geocoding was more accurate (median error length = 44 m) than 100%-matched automated geocoding (median error length = 168 m). The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that could not be fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Conclusion Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.
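The error-length statistics quoted above come from vector differences between geocoded and verified positions. A minimal sketch of that computation, on made-up projected coordinates in metres (the mixture-of-t fitting itself is beyond a short example):

```python
import numpy as np

def positional_errors(geocoded_xy, true_xy):
    """Vector differences (easting, northing) between geocoded and
    true positions, plus their Euclidean lengths, in the projected
    units of the inputs (metres here)."""
    vectors = np.asarray(geocoded_xy, float) - np.asarray(true_xy, float)
    lengths = np.hypot(vectors[:, 0], vectors[:, 1])
    return vectors, lengths

# Made-up coordinates standing in for matched geocodes and their
# orthophoto-verified true locations.
geo  = np.array([[100.0, 200.0], [350.0,  90.0]])
true = np.array([[ 60.0, 170.0], [350.0, 210.0]])
vec, length = positional_errors(geo, true)
print(np.median(length))   # 85.0 (median of error lengths 50 m and 120 m)
```

Keeping the full 2-D error vectors, rather than just their lengths, is what makes distributional features such as the Greek-cross shape visible in the first place.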

  15. Human errors related to maintenance and modifications

    International Nuclear Information System (INIS)

    Laakso, K.; Pyy, P.; Reiman, L.

    1998-01-01

The focus in human reliability analysis (HRA) relating to nuclear power plants has traditionally been on human performance in disturbance conditions. On the other hand, some studies and incidents have shown that maintenance errors, which took place earlier in plant history, may also affect the severity of a disturbance, e.g. if they disable safety-related equipment. In particular, common cause and other dependent failures of safety systems may contribute significantly to the core damage risk. The first aim of the study was to identify and give examples of multiple human errors which have penetrated the various error detection and inspection processes of plant safety barriers. Another objective was to generate numerical safety indicators to describe and forecast the effectiveness of maintenance. A more general objective was to identify needs for further development of maintenance quality and planning. In the first phase of this operational experience feedback analysis, human errors recognisable in connection with maintenance were looked for by reviewing about 4400 failure and repair reports and some special reports covering two nuclear power plant units on the same site during 1992-94. A special effort was made to study dependent human errors, since they are generally the most serious ones. An in-depth root cause analysis was made for 14 dependent errors by interviewing plant maintenance foremen and thoroughly analysing the errors; maintenance-related single errors were treated more simply. The results were shown as distributions of errors over operational states, covering, among other things, the following: in what operational state the errors were committed and detected; in what operational and working condition they were detected; and what component and error type they were related to. These results were presented separately for single and dependent maintenance-related errors. As regards dependent errors, observations were also made

  16. Notes on human error analysis and prediction

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1978-11-01

The notes comprise an introductory discussion of the role of human error analysis and prediction in industrial risk analysis. Following this introduction, different classes of human errors and their role in industrial systems are discussed. Problems related to the prediction of human behaviour in reliability and safety analysis are formulated, and "criteria for analyzability" which must be met by industrial systems so that a systematic analysis can be performed are suggested. The appendices contain illustrative case stories and a review of human error reports for the task of equipment calibration and testing, as found in the US Licensee Event Reports. (author)

  17. Human Errors - A Taxonomy for Describing Human Malfunction in Industrial Installations

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1982-01-01

This paper describes the definition and the characteristics of human errors. Different types of human behavior are classified, and their relation to different error mechanisms is analyzed. The effect of conditioning factors related to affective, motivating aspects of the work situation, as well as physiological factors, is also taken into consideration. The taxonomy for event analysis, including human malfunction, is presented. Possibilities for the prediction of human error are discussed. The need for careful studies in actual work situations is expressed. Such studies could provide a better...

  18. Study on a new framework of Human Reliability Analysis to evaluate soft control execution error in advanced MCRs of NPPs

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Jung, Wondea; Seong, Poong Hyun

    2016-01-01

Highlights: • The operating environment of MCRs in NPPs has changed with the adoption of new HSIs. • Operating actions in NPP advanced MCRs are performed by soft control. • A new HRA framework should be considered in the HRA for advanced MCRs. • An HRA framework for evaluating soft control execution human error is suggested. • The suggested method will be helpful for analyzing human reliability in advanced MCRs. - Abstract: Since the Three Mile Island (TMI)-2 accident, human error has been recognized as one of the main causes of Nuclear Power Plant (NPP) accidents, and numerous studies related to Human Reliability Analysis (HRA) have been carried out. Most of these methods were developed with the conventional type of Main Control Room (MCR) in mind. However, the operating environment of MCRs in NPPs has changed with the adoption of new Human-System Interfaces (HSIs) based on computer technologies. MCRs that include such digital technologies, such as large display panels, computerized procedures, and soft controls, are called advanced MCRs. Among the many features of advanced MCRs, soft controls are particularly important because operating actions in NPP advanced MCRs are performed through them. Due to the differences in interface between soft control and hardwired conventional control, different Human Error Probabilities (HEPs) and a new HRA framework should be considered in the HRA for advanced MCRs. To this end, a new framework of an HRA method for evaluating soft control execution human error is suggested, based on a soft control task analysis and a review of widely accepted human error taxonomies. Moreover, since most current HRA databases deal with operation in conventional MCRs and are not explicitly designed to deal with digital HSIs, an empirical analysis of human error and error recovery with soft controls was carried out in an advanced MCR mockup to collect human error data, which is

  19. Collection and classification of human error and human reliability data from Indian nuclear power plants for use in PSA

    International Nuclear Information System (INIS)

    Subramaniam, K.; Saraf, R.K.; Sanyasi Rao, V.V.S.; Venkat Raj, V.; Venkatraman, R.

    2000-01-01

Complex systems such as NPPs involve a large number of Human Interactions (HIs) in every phase of plant operations. Human Reliability Analysis (HRA), in the context of a PSA, attempts to model the HIs and evaluate/predict their impact on safety and reliability using human error/human reliability data. A large number of HRA techniques have been developed for modelling and integrating HIs into PSA, but there is a significant lack of HRA data. In the face of insufficient data, human reliability analysts have had to resort to expert judgement methods in order to extend the insufficient data sets. In this situation, the generation of data from plant operating experience assumes importance. The development of an HRA data bank for Indian nuclear power plants was therefore initiated as part of the programme of work on HRA. Later, with the establishment of the coordinated research programme (CRP) on collection of human reliability data and use in PSA by the IAEA in 1994-95, the development was carried out under the aegis of IAEA research contract No. 8239/RB. The work described in this report covers the development of a data taxonomy and a human error reporting form (HERF) based on it, data structuring, review and analysis of plant event reports, collection of data on human errors, analysis of the data, and calculation of human error probabilities (HEPs). Analysis of plant operating experience does yield a good amount of qualitative data, but obtaining quantitative data on human reliability in the form of HEPs is more difficult. The difficulties are highlighted and some ways to improve the data situation are discussed. The implementation of a data system for HRA is described, and useful features that can be incorporated in future systems are also discussed. (author)
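When operating experience does yield counts, the basic quantitative step is HEP = number of errors / number of opportunities, ideally with an uncertainty interval, since errors are rare. The counts below are invented, and the Wilson score interval is one reasonable choice of interval; the report itself does not prescribe a particular method.

```python
import math

def hep_with_ci(n_errors, n_opportunities, z=1.96):
    """Point estimate HEP = errors/opportunities plus an approximate
    95% Wilson score interval, which behaves better than the plain
    normal approximation when errors are rare, as they are in HRA data."""
    n = n_opportunities
    p = n_errors / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2.0 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1.0 - p) / n + z * z / (4.0 * n * n))
    return p, (max(centre - half, 0.0), min(centre + half, 1.0))

# Invented counts: 3 miscalibrations recorded in 1200 calibration tasks.
hep, (lo, hi) = hep_with_ci(3, 1200)
print(hep)       # 0.0025
print(lo, hi)    # roughly 8.5e-4 to 7.3e-3
```

The interval spanning almost an order of magnitude around the point estimate illustrates why small event counts make quantitative HEPs so much harder to obtain than qualitative insights.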

  20. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Scaling prediction errors to reward variability benefits error-driven learning in humans.

    Science.gov (United States)

    Diederen, Kelly M J; Schultz, Wolfram

    2015-09-01

    Effective error-driven learning requires individuals to adapt learning to environmental reward variability. The adaptive mechanism may involve decays in learning rate across subsequent trials, as shown previously, and rescaling of reward prediction errors. The present study investigated the influence of prediction error scaling and, in particular, the consequences for learning performance. Participants explicitly predicted reward magnitudes that were drawn from different probability distributions with specific standard deviations. By fitting the data with reinforcement learning models, we found scaling of prediction errors, in addition to the learning rate decay shown previously. Importantly, the prediction error scaling was closely related to learning performance, defined as accuracy in predicting the mean of reward distributions, across individual participants. In addition, participants who scaled prediction errors relative to standard deviation also presented with more similar performance for different standard deviations, indicating that increases in standard deviation did not substantially decrease "adapters'" accuracy in predicting the means of reward distributions. However, exaggerated scaling beyond the standard deviation resulted in impaired performance. Thus efficient adaptation makes learning more robust to changing variability. Copyright © 2015 the American Physiological Society.
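A minimal way to see why scaling prediction errors to reward variability stabilizes performance: an efficient mean-estimator's final error grows in proportion to the reward spread, so accuracy measured in units of the standard deviation is roughly constant across distributions. The simulation below is an illustration of that point only, not the authors' model or data.

```python
import random

def estimate_mean(rewards):
    """Delta-rule estimate of the mean with a 1/t learning-rate decay,
    equivalent to the running sample average."""
    v = 0.0
    for t, r in enumerate(rewards, start=1):
        v += (r - v) / t   # prediction error weighted by a decaying rate
    return v

rng = random.Random(7)
for sigma in (5.0, 15.0, 35.0):
    rewards = [rng.gauss(50.0, sigma) for _ in range(500)]
    err = abs(estimate_mean(rewards) - 50.0)
    # Error expressed in units of sigma is roughly constant across
    # variabilities: the benefit of judging accuracy on a scaled basis.
    print(sigma, err / sigma)
```

In raw reward units the error triples as sigma triples, but divided by sigma it stays put, mirroring the finding that "adapters" kept similar performance across standard deviations.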

  2. Analysis of Employee's Survey for Preventing Human-Errors

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Chanho; Kim, Younggab; Joung, Sanghoun [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

Human errors in nuclear power plants can cause events or incidents, large and small. These events or incidents are among the main contributors to reactor trips and can threaten plant safety. To prevent human errors, KHNP (Korea Hydro and Nuclear Power) introduced human-error prevention techniques and has applied them to key areas such as plant operation, operation support, and maintenance and engineering. This paper proposes methods to prevent and reduce human errors in nuclear power plants by analysing the results of a survey covering both the utilization of the prevention techniques and employees' awareness of human-error prevention. The analysis showed that employees' understanding and utilization of the techniques were generally high, and that the level of training and its effect on actual work were satisfactory. Employees also answered that the root causes of human error lay in the working environment, including tight schedules, manpower shortages, and excessive workload, rather than in personal negligence or lack of knowledge; consideration of the working environment is therefore clearly needed. Based on this survey, the best methods of preventing human error at present are personal equipment, substantial training and education, mental-health checks before starting work, prohibition of multitasking, compliance with procedures, and enhanced job-site review. The most important and basic factors for preventing human error, however, are the interest of the workers themselves and an organizational atmosphere of open communication between managers and workers.

  3. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Guide to data processing and revision: Part 2, Human error probability data entry and revision procedures

    International Nuclear Information System (INIS)

    Gilmore, W.E.; Gertman, D.I.; Gilbert, B.G.; Reece, W.J.

    1988-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) is an automated data base management system for processing and storing human error probability (HEP) and hardware component failure data (HCFD). The NUCLARR system software resides on an IBM (or compatible) personal micro-computer. Users can perform data base searches to furnish HEP estimates and HCFD rates. In this manner, the NUCLARR system can be used to support a variety of risk assessment activities. This volume, Volume 3 of a 5-volume series, presents the procedures used to process HEP and HCFD for entry in NUCLARR and describes how to modify the existing NUCLARR taxonomy in order to add either equipment types or action verbs. Volume 3 also specifies the various roles of the administrative staff on assignment to the NUCLARR Clearinghouse who are tasked with maintaining the data base, dealing with user requests, and processing NUCLARR data. 5 refs., 34 figs., 3 tabs
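
    The record-and-query workflow described above can be pictured with a small sketch. This is not the actual NUCLARR software or schema; the field names and sample values are illustrative assumptions only, loosely mirroring the equipment-type/action-verb taxonomy the guide mentions.

```python
from dataclasses import dataclass

@dataclass
class HepRecord:
    # Hypothetical fields loosely mirroring a NUCLARR-style taxonomy entry
    equipment_type: str   # e.g. "motor-operated valve"
    action_verb: str      # e.g. "calibrate"
    hep: float            # point estimate of the human error probability
    source: str           # citation for the data point

def query(records, equipment_type=None, action_verb=None):
    """Return all records matching the given taxonomy fields."""
    hits = records
    if equipment_type is not None:
        hits = [r for r in hits if r.equipment_type == equipment_type]
    if action_verb is not None:
        hits = [r for r in hits if r.action_verb == action_verb]
    return hits

# Illustrative sample data points (values invented for the sketch)
records = [
    HepRecord("motor-operated valve", "restore", 3e-3, "handbook estimate"),
    HepRecord("motor-operated valve", "calibrate", 1e-2, "plant A report"),
    HepRecord("circuit breaker", "restore", 5e-3, "plant B report"),
]
```

    A search restricted to the equipment type alone returns every action verb recorded for that equipment, mirroring the taxonomy-based retrieval a risk analyst would use to furnish HEP estimates.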

  4. The contributions of human factors on human error in Malaysia aviation maintenance industries

    Science.gov (United States)

    Padil, H.; Said, M. N.; Azizan, A.

    2018-05-01

    Aviation maintenance is a multitasking activity in which individuals perform varied tasks under constant pressure to meet deadlines as well as challenging work conditions. These situational characteristics combined with human factors can lead to various types of human related errors. The primary objective of this research is to develop a structural relationship model that incorporates human factors, organizational factors, and their impact on human errors in aviation maintenance. Towards that end, a questionnaire was developed and administered to Malaysian aviation maintenance professionals. A Structural Equation Modelling (SEM) approach was used in this study utilizing AMOS software. Results showed a significant relationship between human factors and human errors in the tested model. Human factors had a partial effect on organizational factors, while organizational factors had a direct and positive impact on human errors. It was also revealed that organizational factors contributed to human errors when coupled with the human factors construct. This study has contributed to the advancement of knowledge on human factors affecting safety and has provided guidelines for improving human factors performance relating to aviation maintenance activities, and could be used as a reference for improving safety performance in Malaysian aviation maintenance companies.

  5. Human reliability analysis of errors of commission: a review of methods and applications

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2007-06-15

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addressed both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). As a result of a thorough investigation of various PSA/HRA applications, this paper furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because it provides a formalized way of identifying relatively important scenarios with EOC opportunities; (2) EOC identification guidance like CESA, which is strongly based on procedural guidance and on important measures of systems or components affected by inappropriate actions, should nevertheless pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; and (3) orientations for advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  6. Human reliability analysis of errors of commission: a review of methods and applications

    International Nuclear Information System (INIS)

    Reer, B.

    2007-06-01

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addressed both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). As a result of a thorough investigation of various PSA/HRA applications, this paper furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because it provides a formalized way of identifying relatively important scenarios with EOC opportunities; (2) EOC identification guidance like CESA, which is strongly based on procedural guidance and on important measures of systems or components affected by inappropriate actions, should nevertheless pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; and (3) orientations for advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  7. Human error mechanisms in complex work environments

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1988-01-01

    Human error taxonomies have been developed from analysis of industrial incident reports as well as from psychological experiments. In this paper the results of the two approaches are reviewed and compared. It is found, in both cases, that a fairly small number of basic psychological mechanisms will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations. The implications for system safety are briefly mentioned, together with the implications for system design. (author)

  8. Human error mechanisms in complex work environments

    International Nuclear Information System (INIS)

    Rasmussen, Jens (Danmarks Tekniske Hoejskole, Copenhagen)

    1988-01-01

    Human error taxonomies have been developed from analysis of industrial incident reports as well as from psychological experiments. In this paper the results of the two approaches are reviewed and compared. It is found, in both cases, that a fairly small number of basic psychological mechanisms will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations. The implications for system safety are briefly mentioned, together with the implications for system design. (author)

  9. Human decision error (HUMDEE) trees

    International Nuclear Information System (INIS)

    Ostrom, L.T.

    1993-01-01

    Graphical presentations of human actions in incident and accident sequences have been used for many years. However, for the most part, human decision making has been underrepresented in these trees. This paper presents a method of incorporating the human decision process into graphical presentations of incident/accident sequences. This presentation is in the form of logic trees. These trees are called Human Decision Error Trees, or HUMDEE for short. The primary benefit of HUMDEE trees is that they graphically illustrate what else the individuals involved in the event could have done to prevent either the initiation or continuation of the event. HUMDEE trees also present the alternate paths available at the operator decision points in the incident/accident sequence. This is different from the Technique for Human Error Rate Prediction (THERP) event trees. There are many uses of these trees. They can be used for incident/accident investigations to show what other courses of action were available, and for training operators. The trees also have a consequence component, so that not only the decision but also its consequences can be explored

  10. A Sensitivity Study of Human Errors in Optimizing Surveillance Test Interval (STI) and Allowed Outage Time (AOT) of Standby Safety System

    International Nuclear Information System (INIS)

    Chung, Dae Wook; Shin, Won Ky; You, Young Woo; Yang, Hui Chang

    1998-01-01

    In most cases, the surveillance test intervals (STIs), allowed outage times (AOTs) and testing strategies of safety components in a nuclear power plant are prescribed in the plant technical specifications. In general, a standby safety system is required to be redundant (i.e., composed of multiple components), and these components are tested under either a staggered or a sequential test strategy. In this study, a linear model is presented to incorporate the effects of test-related human errors into the evaluation of unavailability. The average unavailabilities of 1/4 and 2/4 redundant systems are computed considering human error and testing strategy. The adverse effects of testing on system unavailability, such as component wear and test-induced transients, have been modelled. The final outcome of this study is an optimized human error domain, obtained from a three-dimensional human error sensitivity analysis by selecting finely classified segments. The results of the sensitivity analysis show that the STI and AOT can be optimized provided the human error probability is maintained within an allowable range. (authors)
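
    The trade-off the abstract describes can be illustrated with a toy linear model. The functional form and all parameter values below are illustrative assumptions, not the study's actual model: random failures accumulate between tests, the test itself takes the train out of service, and a test-induced human error (assumed to persist until the next test) adds a floor that caps how low the unavailability can go.

```python
import math

def mean_unavailability(T, lam=1e-5, t_test=2.0, hep=1e-3):
    """Mean unavailability of one periodically tested standby train.
    T      : surveillance test interval (h)
    lam    : standby failure rate (/h) -- random failures accrue as lam*T/2
    t_test : downtime per test (h)     -- contributes t_test/T
    hep    : probability a test leaves the train misaligned (floor term)
    """
    return lam * T / 2 + t_test / T + hep

def best_interval(lam=1e-5, t_test=2.0):
    # Classic trade-off between random failures and test downtime:
    # T* = sqrt(2 * t_test / lam); the hep floor does not move it here.
    return math.sqrt(2 * t_test / lam)

def allowable_hep(target, lam=1e-5, t_test=2.0):
    """Largest hep that keeps unavailability at the optimal STI below a
    target -- a 1-D slice of the abstract's sensitivity-analysis idea."""
    base = mean_unavailability(best_interval(lam, t_test), lam, t_test, hep=0.0)
    return max(0.0, target - base)
```

    In this simple model the human error probability does not shift the optimal interval, but it determines whether any interval can meet a given unavailability target, which is one way to read "optimized provided the human error probability is maintained within an allowable range".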

  11. Human Error Mechanisms in Complex Work Environments

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1988-01-01

    will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations...

  12. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    André C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability in the way that has been observed.
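
    One way to see how Bayesian updating can reproduce the inverse-S probability weighting the article alludes to (a sketch of the general idea only, not the article's specific model): treat a stated probability as if it were estimated from a small number of hypothetical observations and combine it with a uniform prior.

```python
def perceived_probability(p, n=10):
    """
    Treat a stated probability p as if it came from n hypothetical
    Bernoulli observations (k = p*n successes). Updating a uniform
    Beta(1,1) prior gives the posterior mean (k + 1) / (n + 2), which
    shrinks extreme probabilities toward 1/2 -- small probabilities are
    overweighted and large ones underweighted, the inverse-S pattern
    reported in the probability-bias literature.
    """
    k = p * n
    return (k + 1) / (n + 2)
```

    With n = 10, a stated 1% chance is perceived as roughly 9%, while a stated 99% chance is perceived as roughly 91%: an "error" under the usual reading, but a sensible inference if stated probabilities are themselves uncertain.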

  13. Effects of digital human-machine interface characteristics on human error in nuclear power plants

    International Nuclear Information System (INIS)

    Li Pengcheng; Zhang Li; Dai Licao; Huang Weigang

    2011-01-01

    In order to identify the effects of digital human-machine interface characteristics on human error in nuclear power plants, the new characteristics of digital human-machine interfaces are identified by comparison with traditional analog control systems in the areas of information display, user interface interaction and management, control systems, alarm systems and the procedure system. The negative effects of these characteristics on human error, such as increased cognitive load and workload, mode confusion and loss of situation awareness, are then identified through field research and interviews with operators. For the adverse effects noted above, corresponding prevention and control measures are provided to support the prevention and minimization of human errors and the optimization of human-machine interface design. (authors)

  14. Human error considerations and annunciator effects in determining optimal test intervals for periodically inspected standby systems

    International Nuclear Information System (INIS)

    McWilliams, T.P.; Martz, H.F.

    1981-01-01

    This paper incorporates the effects of four types of human error into a model for determining the optimal time between periodic inspections which maximizes the steady-state availability of standby safety systems. Such safety systems are characteristic of nuclear power plant operations. The system is modeled by means of an infinite state-space Markov chain. The purpose of the paper is to demonstrate techniques for computing the steady-state availability A and the optimal periodic inspection interval tau* for the system. The model can be used to investigate the effects of human error probabilities on optimal availability, to study the benefits of annunciating the standby system, and to determine optimal inspection intervals. Several examples which are representative of nuclear power plant applications are presented
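
    A numerical toy version of this optimization (illustrative only: the paper's actual model is an infinite state-space Markov chain with four error types, and every parameter value here is an assumption). Two representative error types, an inspection that misses a failed state and an inspection that disables a good system, are enough to show how human error probabilities shift the optimal interval.

```python
def unavailability(tau, lam=1e-4, t_insp=1.0, p_miss=0.1, p_disable=1e-3):
    """
    Approximate steady-state unavailability of an inspected standby system.
    tau       : inspection interval (h)
    lam       : standby failure rate (/h)
    t_insp    : system downtime per inspection (h)
    p_miss    : probability an inspection fails to detect a failed state
    p_disable : probability an inspection leaves a good system disabled
                (assumed caught at the next inspection)
    """
    # Mean latent downtime per failure: half an interval on average, plus
    # extra whole intervals while detection keeps failing (geometric in p_miss).
    latent = tau / 2 + tau * p_miss / (1 - p_miss)
    return lam * latent + t_insp / tau + p_disable

def optimal_interval(**kw):
    """Grid search for the interval (in whole hours) minimizing unavailability."""
    return min(range(10, 5000), key=lambda t: unavailability(float(t), **kw))
```

    Raising the missed-detection probability makes each undetected failure linger longer, so the optimum shifts toward more frequent inspection, exactly the kind of sensitivity the paper's examples explore.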

  15. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2004-03-01

    The paper describes a technique denoted as Sub-Sample-Size Bounding (SSSB), which is useable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)

  16. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    International Nuclear Information System (INIS)

    Reer, B.

    2004-01-01

    The paper describes a technique denoted as Sub-Sample-Size Bounding (SSSB), which is useable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)

  17. SHEAN (Simplified Human Error Analysis code) and automated THERP

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1993-01-01

    One of the most widely used human error analysis tools is THERP (Technique for Human Error Rate Prediction). Unfortunately, this tool has disadvantages. The Nuclear Regulatory Commission, realizing these drawbacks, commissioned Dr. Swain, the author of THERP, to create a simpler, more consistent tool for deriving human error rates. That effort produced the Accident Sequence Evaluation Program Human Reliability Analysis Procedure (ASEP), which is more conservative than THERP, but a valuable screening tool. ASEP involves answering simple questions about the scenario in question, and then looking up the appropriate human error rate in the indicated table (THERP also uses look-up tables, but four times as many). The advantages of ASEP are that human factors expertise is not required, and the training to use the method is minimal. Although not originally envisioned by Dr. Swain, the ASEP approach actually begs to be computerized. That WINCO did, calling the code SHEAN, for Simplified Human Error ANalysis. The code was done in TURBO Basic for IBM or IBM-compatible MS-DOS, for fast execution. WINCO is now in the process of comparing this code against THERP for various scenarios. This report provides a discussion of SHEAN
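
    The look-up-table style of ASEP/SHEAN, answering a few simple questions and reading off a screening HEP, can be sketched as follows. The table keys and HEP values below are placeholders invented for illustration; they are not the real ASEP screening values.

```python
# Illustrative screening table (placeholder values, NOT the real ASEP
# numbers), keyed by (analysis phase, time-available category).
SCREENING_HEP = {
    ("post-accident", "ample"):   0.01,
    ("post-accident", "limited"): 0.1,
    ("pre-accident",  "ample"):   0.03,
    ("pre-accident",  "limited"): 0.3,
}

def screen_hep(phase, time_available):
    """Answer two simple questions, look up a conservative screening HEP."""
    try:
        return SCREENING_HEP[(phase, time_available)]
    except KeyError:
        raise ValueError(f"no table entry for {(phase, time_available)!r}")
```

    The appeal of this structure is exactly what the abstract claims for ASEP: no human factors expertise is needed to use it, and computerizing it (as SHEAN did) is little more than encoding the tables and the guiding questions.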

  18. Post-event human decision errors: operator action tree/time reliability correlation

    International Nuclear Information System (INIS)

    Hall, R.E.; Fragola, J.; Wreathall, J.

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations
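
    A common functional form for a time reliability correlation of this kind is a lognormal non-response curve; the sketch below uses that form with illustrative parameters, not the values fitted in the report.

```python
import math

def nonresponse_probability(t, t_median=300.0, sigma=1.0):
    """
    Lognormal time-reliability correlation (a common functional form for
    operator-action-tree style models; t_median and sigma are illustrative).
    Returns P(crew has NOT correctly responded by time t seconds).
    """
    if t <= 0:
        return 1.0
    z = math.log(t / t_median) / sigma
    phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF
    return 1 - phi
```

    The curve captures the framework's central assumption: the probability of a decision error is driven by the time available, falling from near 1 for very short windows toward 0 as the available time grows.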

  19. Post-event human decision errors: operator action tree/time reliability correlation

    Energy Technology Data Exchange (ETDEWEB)

    Hall, R E; Fragola, J; Wreathall, J

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations.

  20. Human error theory: relevance to nurse management.

    Science.gov (United States)

    Armitage, Gerry

    2009-03-01

    Describe, discuss and critically appraise human error theory and consider its relevance for nurse managers. Healthcare errors are a persistent threat to patient safety. Effective risk management and clinical governance depend on understanding the nature of error. This paper draws upon a wide literature from published works, largely from the field of cognitive psychology and human factors. Although the content of this paper is pertinent to any healthcare professional, it is written primarily for nurse managers. Error is inevitable. Causation is often attributed to individuals, yet causation in complex environments such as healthcare is predominantly multi-factorial. Individual performance is affected by the tendency to develop prepacked solutions and attention deficits, which can in turn be related to local conditions and systems or latent failures. Blame is often inappropriate. Defences should be constructed in the light of these considerations and to promote error wisdom and organizational resilience. Managing and learning from error is seen as a priority in the British National Health Service (NHS); this can be better achieved with an understanding of the roots, nature and consequences of error. Such an understanding can provide a helpful framework for a range of risk management activities.

  1. Applying lessons learned to enhance human performance and reduce human error for ISS operations

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W.R.

    1998-09-01

    A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) has developed a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper describes previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS.

  2. Guidelines for system modeling: pre-accident human errors, rev.0

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, W. D.; Lee, Y. H.; Hwang, M. J.; Yang, J. E.

    2004-01-01

    The evaluation results of the Human Reliability Analysis (HRA) of pre-accident human errors in the probabilistic safety assessment (PSA) for the Korea Standard Nuclear Power Plant (KSNP), performed using the ASME PRA standard, show that more than 50% of the 10 items to be improved are related to their identification and screening analysis. Thus, we developed a guideline for modeling pre-accident human errors to help the system analyst resolve some of these items. The developed guideline consists of modeling criteria for pre-accident human errors (identification, qualitative screening, and common restoration errors) and detailed guidelines for pre-accident human errors relating to the testing, maintenance, and calibration work of nuclear power plants (NPPs). The system analyst uses the developed guideline and applies it to the system he or she is responsible for; the HRA analyst then reviews the application results. We applied the developed guideline to the auxiliary feedwater system of the KSNP to show its usefulness. The application results show that more than 50% of the items to be improved for pre-accident human errors of the auxiliary feedwater system are resolved. The guideline for modeling pre-accident human errors developed in this study can be used for other NPPs as well as the KSNP. It is expected that both the use of the detailed procedure, to be developed in the future, for the quantification of pre-accident human errors and the guideline developed in this study will greatly enhance the PSA quality in the HRA of pre-accident human errors.

  3. Guidelines for system modeling: pre-accident human errors, rev.0

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, W. D.; Lee, Y. H.; Hwang, M. J.; Yang, J. E.

    2004-01-01

    The evaluation results of the Human Reliability Analysis (HRA) of pre-accident human errors in the probabilistic safety assessment (PSA) for the Korea Standard Nuclear Power Plant (KSNP), performed using the ASME PRA standard, show that more than 50% of the 10 items to be improved are related to their identification and screening analysis. Thus, we developed a guideline for modeling pre-accident human errors to help the system analyst resolve some of these items. The developed guideline consists of modeling criteria for pre-accident human errors (identification, qualitative screening, and common restoration errors) and detailed guidelines for pre-accident human errors relating to the testing, maintenance, and calibration work of nuclear power plants (NPPs). The system analyst uses the developed guideline and applies it to the system he or she is responsible for; the HRA analyst then reviews the application results. We applied the developed guideline to the auxiliary feedwater system of the KSNP to show its usefulness. The application results show that more than 50% of the items to be improved for pre-accident human errors of the auxiliary feedwater system are resolved. The guideline for modeling pre-accident human errors developed in this study can be used for other NPPs as well as the KSNP. It is expected that both the use of the detailed procedure, to be developed in the future, for the quantification of pre-accident human errors and the guideline developed in this study will greatly enhance the PSA quality in the HRA of pre-accident human errors

  4. Risk Management and the Concept of Human Error

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1995-01-01

    by a stochastic coincidence of faults and human errors, but by a systemic erosion of the defenses due to decision making under competitive pressure in a dynamic environment. The presentation will discuss the nature of human error and the risk management problems found in a dynamic, competitive society facing...

  5. Design of a computer aided system for analyzing human error in Indonesian railways

    Directory of Open Access Journals (Sweden)

    Wiwik Budiawan

    2013-06-01

    Full Text Available Train accidents, occurring in rapid succession in Indonesia, have reached a critical level. According to data from the Directorate General of Railways, there were a total of 611 train accidents over the last five years (2005-2009). Many factors contribute to accidents, including rolling stock, infrastructure, the human operator (human error), external factors, and natural causes. Human error is one of the factors with the potential to cause a train accident and is cited as the main cause of train accidents in Indonesia. However, it is not clear how this analysis is performed. The human error studies conducted by the National Transportation Safety Committee (KNKT) are still relatively limited and are not supported by a systematic method. Several methods have been developed to date, but few have been adapted for the railway transport mode. The Human Factors Analysis and Classification System (HFACS) is a human error analysis method that was developed and adapted here to the Indonesian railway system. To improve the reliability of human error analysis, HFACS was then implemented as a web-based application accessible from computers and smartphones. The results of this research can be used by KNKT as a method for analyzing railway accidents, particularly those related to human error. Keywords: human error, HFACS, CAS, railways

  6. A chance to avoid mistakes human error

    International Nuclear Information System (INIS)

    Amaro, Pablo; Obeso, Eduardo; Gomez, Ruben

    2010-01-01

    In response to the lack of public information in the industry about the different tools used in the nuclear field to minimize human error, a group of workers from different sections of the St. Maria de Garona NPP (Quality Assurance / Organization and Human Factors) decided to embark on a challenging and exciting project: write a book collecting all the knowledge accumulated during their daily activities, often while reading external information received from different organizations within the nuclear industry (INPO, WANO...), but also while visiting different NPPs, holding meetings and participating in training courses related to human and organizational factors. The main objective of the book is to present to industry in general, in a practical way, the different tools that are used and fostered in the nuclear industry, so that their assimilation and implementation in other industries becomes possible and achievable in an efficient manner. After one year of work, our project is a reality. We presented an abstract at the last Spanish Nuclear Society meeting in Sevilla, last October... and best of all, the book is on the market for everybody at the web-site www.bubok.com. The book is structured in the following areas: 'Errare humanum est': presents to the reader what human error is, its origin and the different barriers. The message is that the reader should see error as something continuously present in our lives... even more frequently than we think. By studying its origin, barriers can be established to avoid or at least minimize it. 'Error's bitter face': shows the possible consequences of human errors. What better than presenting real experiences that have occurred in the industry? In the book, accidents in the nuclear industry, like those at Three Mile Island NPP and Chernobyl NPP, and incidents like that at Davis Besse NPP in the past, help the reader to reflect on the

  7. Nursing Errors in Intensive Care Unit by Human Error Identification in Systems Tool: A Case Study

    Directory of Open Access Journals (Sweden)

    Nezamodini

    2016-03-01

    Full Text Available Background Although health services are designed and implemented to improve human health, errors in health services are a very common, and sometimes even fatal, phenomenon. Medical errors and their costs are global issues with serious consequences for the patient community; they are preventable and require serious attention. Objectives The current study aimed to identify possible nursing errors by applying the human error identification in systems tool (HEIST) in the intensive care units (ICUs) of hospitals. Patients and Methods This descriptive research was conducted in the intensive care unit of a hospital in Khuzestan province in 2013. Data were collected over a period of four months through observation of and interviews with nine nurses in this unit. Human error classification was based on the Rouse and Rouse and Swain and Guttmann models. Following the HEIST worksheets, the guide questions were answered and, after the types of errors were determined, the error causes were identified. Results In total 527 errors were detected. Performing an operation on the wrong path had the highest frequency (150), followed by performing tasks later than the deadline (136). Management causes ranked first among the identified causes, with a frequency of 451. Errors mostly occurred in the system observation stage, and among the performance shaping factors (PSFs), time was the most influential factor in the occurrence of human errors. Conclusions Finally, in order to prevent the occurrence and reduce the consequences of the identified errors, the following measures were proposed: appropriate training courses, applying work guidelines and monitoring their implementation, increasing the number of work shifts, hiring professional workforce, and equipping the work space with appropriate facilities and equipment.

  8. Studying and comparing spectrum efficiency and error probability in GMSK and DBPSK modulation schemes

    Directory of Open Access Journals (Sweden)

    Juan Mario Torres Nova

    2008-09-01

    Full Text Available Gaussian minimum shift keying (GMSK) and differential binary phase shift keying (DBPSK) are two digital modulation schemes which are frequently used in radio communication systems; however, their benefits (spectral efficiency, low bit error rate, low inter-symbol interference, etc.) are interdependent, and optimising one parameter creates problems for another. For example, the GMSK scheme succeeds in reducing bandwidth by introducing a Gaussian filter into an MSK (minimum shift keying) modulator, in exchange for increased inter-symbol interference in the system. The DBPSK scheme leads to lower error probability while occupying more bandwidth; it likewise facilitates synchronous data transmission due to the receiver's bit delay when recovering a signal.
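
    The error-probability trade-off mentioned above can be illustrated with the standard textbook bit-error-rate expressions for coherent BPSK and differentially detected DBPSK over an AWGN channel. These are general results, not formulas taken from the article; a minimal sketch:

```python
import math

def q_function(x: float) -> float:
    """Gaussian tail probability Q(x) via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def ber_bpsk(ebn0_db: float) -> float:
    """Coherent BPSK over AWGN: Pb = Q(sqrt(2*Eb/N0))."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    return q_function(math.sqrt(2.0 * ebn0))

def ber_dbpsk(ebn0_db: float) -> float:
    """Differentially detected DBPSK over AWGN: Pb = 0.5*exp(-Eb/N0)."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    return 0.5 * math.exp(-ebn0)

# DBPSK avoids carrier recovery at the receiver but pays a small SNR
# penalty versus coherent BPSK at the same Eb/N0.
for snr_db in (4, 8, 10):
    print(f"{snr_db:2d} dB  BPSK={ber_bpsk(snr_db):.2e}  DBPSK={ber_dbpsk(snr_db):.2e}")
```

    At practical SNRs the DBPSK curve sits slightly above the coherent BPSK curve, which is the error-probability side of the trade-off the abstract describes.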

  9. Uncovering the Best Skill Multimap by Constraining the Error Probabilities of the Gain-Loss Model

    Science.gov (United States)

    Anselmi, Pasquale; Robusto, Egidio; Stefanutti, Luca

    2012-01-01

    The Gain-Loss model is a probabilistic skill multimap model for assessing learning processes. In practical applications, more than one skill multimap could be plausible, while none corresponds to the true one. The article investigates whether constraining the error probabilities is a way of uncovering the best skill assignment among a number of…

  10. Normalization of Deviation: Quotation Error in Human Factors.

    Science.gov (United States)

    Lock, Jordan; Bearman, Chris

    2018-05-01

    Objective The objective of this paper is to examine quotation error in human factors. Background Science progresses through building on the work of previous research. This requires accurate quotation. Quotation error has a number of adverse consequences: loss of credibility, loss of confidence in the journal, and a flawed basis for academic debate and scientific progress. Quotation error has been observed in a number of domains, including marine biology and medicine, but there has been little or no previous study of this form of error in human factors, a domain that specializes in the causes and management of error. Methods A study was conducted examining the quotation accuracy of 187 extracts from 118 published articles that cited a control article (Vaughan's 1996 book: The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA). Results Of the extracts studied, 12.8% (n = 24) were classed as inaccurate and 87.2% (n = 163) as accurate. A second dimension, agreement, was also examined, with 96.3% (n = 180) agreeing with the control article and only 3.7% (n = 7) disagreeing. The categories of accuracy and agreement form a two-by-two matrix. Conclusion Rather than simply blaming individuals for quotation error, systemic factors should also be considered. Vaughan's theory, normalization of deviance, is one systemic theory that can account for quotation error. Application Quotation error is occurring in human factors and should receive more attention. According to Vaughan's theory, the normal everyday systems that promote scholarship may also allow mistakes, mishaps, and quotation error to occur.

  11. Monte Carlo simulation of expert judgments on human errors in chemical analysis--a case study of ICP-MS.

    Science.gov (United States)

    Kuselman, Ilya; Pennecchi, Francesca; Epstein, Malka; Fajgelj, Ales; Ellison, Stephen L R

    2014-12-01

    Monte Carlo simulation of expert judgments on human errors in a chemical analysis was used to determine distributions of the error quantification scores (scores of likelihood and severity, and scores of the effectiveness of a laboratory quality system in preventing the errors). The simulation was based on modeling expert behavior: confident, reasonably doubting and irresolute expert judgments were taken into account by means of different probability mass functions (pmfs). As a case study, 36 scenarios of human errors which may occur in elemental analysis of geological samples by ICP-MS were examined. Characteristics of the score distributions for the three pmfs of expert behavior were compared. Variability of the scores, as the standard deviation of the simulated score values from the distribution mean, was used to assess score robustness. A range of the score values, calculated directly from elicited data and simulated by the Monte Carlo method for different pmfs, was also discussed from the robustness point of view. It was shown that the robustness of the scores obtained in the case study can be assessed as satisfactory for quality risk management and improvement of a laboratory quality system against human errors.
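
    The simulation approach described above can be sketched as follows. The pmfs below are illustrative stand-ins for the confident, reasonably doubting and irresolute expert-behavior models, not the study's actual distributions; the standard deviation of the simulated scores plays the role of the robustness measure:

```python
import random

# Illustrative pmfs over an ordinal 1..5 severity score for a single
# error scenario (hypothetical values, not the paper's elicited data).
PMFS = {
    "confident":  {3: 0.90, 2: 0.05, 4: 0.05},
    "doubting":   {3: 0.60, 2: 0.20, 4: 0.20},
    "irresolute": {1: 0.20, 2: 0.20, 3: 0.20, 4: 0.20, 5: 0.20},
}

def simulate_scores(pmf, n_trials=100_000, seed=1):
    """Monte Carlo draws from a score pmf; returns (mean, standard deviation).
    The standard deviation serves as the score-robustness measure."""
    rng = random.Random(seed)
    values = list(pmf)
    draws = rng.choices(values, weights=[pmf[v] for v in values], k=n_trials)
    mean = sum(draws) / n_trials
    sd = (sum((d - mean) ** 2 for d in draws) / n_trials) ** 0.5
    return mean, sd

for name, pmf in PMFS.items():
    mean, sd = simulate_scores(pmf)
    print(f"{name:10s} mean={mean:.3f} sd={sd:.3f}")
```

    The more the expert's probability mass spreads across score values, the larger the simulated standard deviation, i.e., the less robust the score.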

  12. Detailed semantic analyses of human error incidents occurring at nuclear power plant in USA (interim report). Characteristics of human error incidents occurring in the period from 1992 to 1996

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Tsuge, Tadashi; Sano, Toshiaki; Takano, Kenichi; Gouda, Hidenori

    2001-01-01

    CRIEPI has been conducting detailed analyses of all human error incidents at domestic nuclear power plants (NPPs) collected from Japanese Licensee Event Reports (LERs) using J-HPES (Japanese version of HPES) as an analysis method. Results obtained by the analyses have been stored in J-HPES database. Since 1999, human error incidents have been selected from U.S. LERs, and they are analyzed using J-HPES. In this report, the results, which classified error action, cause, and preventive measure, are summarized for U.S. human error cases occurring in the period from 1992 to 1996. It was suggested as a result of classification that the categories of error action were almost the same as those of Japanese human error cases. Therefore, problems in the process of error action and checkpoints for preventing errors will be extracted by analyzing both U.S. and domestic human error cases. It was also suggested that the interrelations between error actions, causes, and organizational factors could be identified. While taking these suggestions into consideration, we will continue to analyze U.S. human error cases. (author)

  13. Chernobyl - system accident or human error?

    International Nuclear Information System (INIS)

    Stang, E.

    1996-01-01

    Did human error cause the Chernobyl disaster? The standard point of view is that operator error was the root cause of the disaster. This was also the view of the Soviet Accident Commission. The paper analyses the operator errors at Chernobyl in a system context. The reactor operators committed errors that depended upon a lot of other failures that made up a complex accident scenario. The analysis is based on Charles Perrow's analysis of technological disasters. Failure possibility is an inherent property of high-risk industrial installations. The Chernobyl accident consisted of a chain of events that were both extremely improbable and difficult to predict. It is not reasonable to put the blame for the disaster on the operators. (author)

  14. Exploiting Outage and Error Probability of Cooperative Incremental Relaying in Underwater Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hina Nasir

    2016-07-01

    Full Text Available This paper makes a two-fold contribution to Underwater Wireless Sensor Networks (UWSNs): a performance analysis of incremental relaying in terms of outage and error probability, and, based on that analysis, the proposition of two new cooperative routing protocols. For the first contribution, a three-step procedure is carried out: a system model is presented, the number of available relays is determined, and, based on the cooperative incremental retransmission methodology, closed-form expressions for outage and error probability are derived. For the second contribution, Adaptive Cooperation in Energy (ACE) efficient depth based routing and Enhanced-ACE (E-ACE) are presented. In the proposed model, a feedback mechanism indicates the success or failure of data transmission. If direct transmission is successful, there is no need for relaying by cooperative relay nodes. In case of failure, the available relays retransmit the data one by one till the desired signal quality is achieved at the destination. Simulation results show that ACE and E-ACE significantly improve network performance, i.e., throughput, when compared with other incremental relaying protocols like Cooperative Automatic Repeat reQuest (CARQ). E-ACE and ACE achieve 69% and 63% more throughput respectively as compared to CARQ in harsh underwater environments.
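
    The incremental relaying mechanism described above (relays retransmit only after direct-link failure) can be sketched with a simplified Monte Carlo model assuming i.i.d. Rayleigh fading and no diversity combining. This illustrates the mechanism only; it is not the paper's closed-form analysis or its underwater channel model:

```python
import math
import random

def outage_probability(n_relays, snr_db=10.0, threshold_db=5.0,
                       n_trials=200_000, seed=42):
    """Monte Carlo outage estimate for cooperative incremental relaying:
    the source transmits directly, and relays retransmit one by one only
    if the direct link's instantaneous SNR falls below the threshold."""
    rng = random.Random(seed)
    mean_snr = 10.0 ** (snr_db / 10.0)
    threshold = 10.0 ** (threshold_db / 10.0)
    outages = 0
    for _ in range(n_trials):
        # Rayleigh fading -> exponentially distributed instantaneous SNR.
        if mean_snr * rng.expovariate(1.0) >= threshold:
            continue  # direct transmission succeeded; no relaying needed
        if not any(mean_snr * rng.expovariate(1.0) >= threshold
                   for _ in range(n_relays)):
            outages += 1  # direct link and every relay retransmission failed
    return outages / n_trials

for m in (0, 1, 2):
    print(f"relays={m}  outage={outage_probability(m):.4f}")
```

    Each added relay multiplies the outage probability by another independent link-failure probability, which is why incremental relaying improves reliability while consuming retransmission energy only when the direct link fails.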

  15. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection.
However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
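
    The two-pass consistency logic described above can be sketched in a simple signal detection setting: a deterministic (MAP) observer necessarily repeats its response when the identical stimulus-plus-noise sample is shown twice, while a posterior probability matching observer does not. The equal-variance Gaussian model below is a hypothetical illustration, not the study's experimental design:

```python
import math
import random

def two_pass_agreement(d_prime, n_trials=100_000, seed=0):
    """Two-pass consistency: each stimulus-plus-noise sample is judged
    twice; returns the fraction of identical responses for a posterior
    probability matching observer and for a deterministic (MAP) observer."""
    rng = random.Random(seed)
    matching_agree = 0
    for _ in range(n_trials):
        s = rng.randrange(2)                    # signal absent (0) or present (1)
        x = s * d_prime + rng.gauss(0.0, 1.0)   # the observation, reused on both passes
        # Posterior P(s=1 | x) for equal priors and unit-variance Gaussians.
        p = 1.0 / (1.0 + math.exp(-d_prime * (x - d_prime / 2.0)))
        r1 = rng.random() < p                   # matching observer, pass 1
        r2 = rng.random() < p                   # matching observer, pass 2
        matching_agree += r1 == r2
    # A MAP observer (respond 1 iff p > 0.5) always repeats its response
    # to the identical sample, so its agreement is exactly 1.0.
    return {"matching": matching_agree / n_trials, "deterministic": 1.0}

print(two_pass_agreement(d_prime=0.5))
```

    At low sensitivity (small d') the posterior hovers near 0.5, so the matching observer's two responses to the same sample frequently disagree, which is the inconsistency the abstract reports.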

  16. Automation of Commanding at NASA: Reducing Human Error in Space Flight

    Science.gov (United States)

    Dorn, Sarah J.

    2010-01-01

    Automation has been implemented in many different industries to improve efficiency and reduce human error. Reducing or eliminating the human interaction in tasks has been proven to increase productivity in manufacturing and lessen the risk of mistakes by humans in the airline industry. Human space flight requires the flight controllers to monitor multiple systems and react quickly when failures occur so NASA is interested in implementing techniques that can assist in these tasks. Using automation to control some of these responsibilities could reduce the number of errors the flight controllers encounter due to standard human error characteristics. This paper will investigate the possibility of reducing human error in the critical area of manned space flight at NASA.

  17. An Approach to Human Error Hazard Detection of Unexpected Situations in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sangjun; Oh, Yeonju; Shin, Youmin; Lee, Yong-Hee [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The Fukushima accident was a typical complex event, including extreme situations induced by the succeeding earthquake, tsunami, explosions, and human errors. From a human factors engineering point of view, its causes include deficiencies in the response manuals and procedures, in education and training, in team capability, and in the performance of duties by the operators. In particular, the guidelines of currently operating NPPs do not sufficiently include countermeasures to human errors in extreme situations. Therefore, this paper describes a trial to detect the hazards of human errors in extreme situations, and to define countermeasures that can properly respond to those human error hazards when an individual, team, organization, or other working entity encounters an extreme situation in an NPP. In this paper we propose an approach to analyzing and extracting human error hazards, in order to suggest additional countermeasures to human errors in unexpected situations. They might be utilized to develop contingency guidelines, especially for reducing human error accidents in NPPs. However, the trial application in this study is currently limited, since it is not easy to find accident cases described in enough detail to enumerate the proposed steps. Therefore, we will try to analyze as many cases as possible, and to consider other environmental factors and human error conditions.

  18. An Approach to Human Error Hazard Detection of Unexpected Situations in NPPs

    International Nuclear Information System (INIS)

    Park, Sangjun; Oh, Yeonju; Shin, Youmin; Lee, Yong-Hee

    2015-01-01

    The Fukushima accident was a typical complex event, including extreme situations induced by the succeeding earthquake, tsunami, explosions, and human errors. From a human factors engineering point of view, its causes include deficiencies in the response manuals and procedures, in education and training, in team capability, and in the performance of duties by the operators. In particular, the guidelines of currently operating NPPs do not sufficiently include countermeasures to human errors in extreme situations. Therefore, this paper describes a trial to detect the hazards of human errors in extreme situations, and to define countermeasures that can properly respond to those human error hazards when an individual, team, organization, or other working entity encounters an extreme situation in an NPP. In this paper we propose an approach to analyzing and extracting human error hazards, in order to suggest additional countermeasures to human errors in unexpected situations. They might be utilized to develop contingency guidelines, especially for reducing human error accidents in NPPs. However, the trial application in this study is currently limited, since it is not easy to find accident cases described in enough detail to enumerate the proposed steps. Therefore, we will try to analyze as many cases as possible, and to consider other environmental factors and human error conditions

  19. Does the A-not-B error in adult pet dogs indicate sensitivity to human communication?

    Science.gov (United States)

    Kis, Anna; Topál, József; Gácsi, Márta; Range, Friederike; Huber, Ludwig; Miklósi, Adám; Virányi, Zsófia

    2012-07-01

    Recent dog-infant comparisons have indicated that the experimenter's communicative signals in object hide-and-search tasks increase the probability of perseverative (A-not-B) errors in both species (Topál et al. 2009). These behaviourally similar results, however, might reflect different mechanisms in dogs and in children. Similar errors may occur if the motor response of retrieving the object during the A trials cannot be inhibited in the B trials or if the experimenter's movements and signals toward the A hiding place in the B trials ('sham-baiting') distract the dogs' attention. In order to test these hypotheses, we tested dogs similarly to Topál et al. (2009) but eliminated the motor search in the A trials and 'sham-baiting' in the B trials. We found that neither an inability to inhibit previously rewarded motor response nor insufficiencies in their working memory and/or attention skills can explain dogs' erroneous choices. Further, we replicated the finding that dogs have a strong tendency to commit the A-not-B error after ostensive-communicative hiding and demonstrated the crucial effect of socio-communicative cues as the A-not-B error diminishes when location B is ostensively enhanced. These findings further support the hypothesis that the dogs' A-not-B error may reflect a special sensitivity to human communicative cues. Such object-hiding and search tasks provide a typical case for how susceptibility to human social signals could (mis)lead domestic dogs.

  20. Human reliability assessment in a 99Mo/99mTc generator production facility using the standardized plant analysis risk-human (SPAR-H) technique.

    Science.gov (United States)

    Eyvazlou, Meysam; Dadashpour Ahangar, Ali; Rahimi, Azin; Davarpanah, Mohammad Reza; Sayyahi, Seyed Soheil; Mohebali, Mehdi

    2018-02-13

    Reducing human error is an important factor for enhancing safety protocols in various industries. Hence, analysis of the likelihood of human error in nuclear industries such as radiopharmaceutical production facilities has become more essential. This cross-sectional descriptive study was conducted to quantify the probability of human errors in a 99Mo/99mTc generator production facility in Iran. First, through expert interviews, the production process of the 99Mo/99mTc generator was analyzed using hierarchical task analysis (HTA). The standardized plant analysis risk-human (SPAR-H) method was then applied in order to calculate the probability of human error. Twenty tasks were determined using HTA. All eight performance shaping factors (PSFs) were evaluated for each task. The mean probability of human error was 0.320. The highest and lowest probabilities of human error in the 99Mo/99mTc generator production process, related to the 'loading the generator with the molybdenum solution' task and the 'generator elution' task, were 0.858 and 0.059, respectively. Measures for reducing the human error probability (HEP) were suggested, derived from the levels of the PSFs evaluated in this study.
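
    The SPAR-H quantification step described above multiplies a nominal HEP by the eight PSF multipliers, with an adjustment factor applied when three or more PSFs are rated negatively (per the published SPAR-H method, NUREG/CR-6883). A minimal sketch with illustrative multiplier values, not the study's actual ratings:

```python
def spar_h_hep(nominal_hep, psf_multipliers):
    """SPAR-H: HEP = NHEP * composite PSF multiplier, with the adjustment
    factor applied when three or more PSFs are rated negatively
    (multiplier > 1) so the result cannot exceed 1.0."""
    composite = 1.0
    negative_psfs = 0
    for m in psf_multipliers:
        composite *= m
        negative_psfs += m > 1
    if negative_psfs >= 3:
        return (nominal_hep * composite
                / (nominal_hep * (composite - 1.0) + 1.0))
    return min(nominal_hep * composite, 1.0)

# Action task (nominal HEP = 0.001) with illustrative multipliers for the
# eight PSFs: available time, stress, complexity, experience/training,
# procedures, ergonomics/HMI, fitness for duty, work processes.
print(spar_h_hep(0.001, [1, 2, 2, 1, 1, 1, 1, 1]))   # two negative PSFs
print(spar_h_hep(0.001, [1, 5, 5, 1, 10, 1, 1, 1]))  # three negative PSFs
```

    The adjustment factor is what keeps heavily penalized tasks, such as the 0.858 loading task reported above, bounded below a probability of 1.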

  1. FRamework Assessing Notorious Contributing Influences for Error (FRANCIE): Perspective on Taxonomy Development to Support Error Reporting and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lon N. Haney; David I. Gertman

    2003-04-01

    Beginning in the 1980s a primary focus of human reliability analysis was estimation of human error probabilities. However, detailed qualitative modeling with comprehensive representation of contextual variables often was lacking. This was likely due to the lack of comprehensive error and performance shaping factor taxonomies, and the limited data available on observed error rates and their relationship to specific contextual variables. In the mid 90s Boeing, America West Airlines, NASA Ames Research Center and INEEL partnered in a NASA sponsored Advanced Concepts grant to: assess the state of the art in human error analysis, identify future needs for human error analysis, and develop an approach addressing these needs. Identified needs included the need for a method to identify and prioritize task and contextual characteristics affecting human reliability. Other needs identified included developing comprehensive taxonomies to support detailed qualitative modeling and to structure meaningful data collection efforts across domains. A result was the development of the FRamework Assessing Notorious Contributing Influences for Error (FRANCIE) with a taxonomy for airline maintenance tasks. The assignment of performance shaping factors to generic errors by experts proved to be valuable to qualitative modeling. Performance shaping factors and error types from such detailed approaches can be used to structure error reporting schemes. In a recent NASA Advanced Human Support Technology grant FRANCIE was refined, and two new taxonomies for use on space missions were developed. The development, sharing, and use of error taxonomies, and the refinement of approaches for increased fidelity of qualitative modeling is offered as a means to help direct useful data collection strategies.

  2. Two-dimensional errors

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    This chapter addresses the extension of previous work in one-dimensional (linear) error theory to two-dimensional error analysis. The topics of the chapter include the definition of two-dimensional error, the probability ellipse, the probability circle, elliptical (circular) error evaluation, the application to position accuracy, and the use of control systems (points) in measurements
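
    The probability circle discussed in the chapter can be illustrated with the circular error probable (CEP), the radius containing 50% of a zero-mean circular bivariate normal error distribution, for which the standard closed form CEP = sigma * sqrt(2 ln 2) can be checked by Monte Carlo:

```python
import math
import random

def circular_error_probable(sigma, n_trials=200_000, seed=7):
    """Monte Carlo estimate of the radius of the 50% probability circle
    for a zero-mean circular bivariate normal error distribution."""
    rng = random.Random(seed)
    radii = sorted(math.hypot(rng.gauss(0.0, sigma), rng.gauss(0.0, sigma))
                   for _ in range(n_trials))
    return radii[n_trials // 2]  # sample median of the radial error

sigma = 1.0
cep_mc = circular_error_probable(sigma)
cep_analytic = sigma * math.sqrt(2.0 * math.log(2.0))  # ~1.1774 * sigma
print(f"Monte Carlo CEP = {cep_mc:.4f}, analytic CEP = {cep_analytic:.4f}")
```

    When the two error components have unequal standard deviations, the 50% contour becomes the probability ellipse, and the simple closed form above no longer applies.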

  3. SCHEME (Soft Control Human error Evaluation MEthod) for advanced MCR HRA

    International Nuclear Information System (INIS)

    Jang, Inseok; Jung, Wondea; Seong, Poong Hyun

    2015-01-01

    Many HRA methods have been developed, including the Technique for Human Error Rate Prediction (THERP), Korean Human Reliability Analysis (K-HRA), Human Error Assessment and Reduction Technique (HEART), A Technique for Human Event Analysis (ATHEANA), Cognitive Reliability and Error Analysis Method (CREAM), and Simplified Plant Analysis Risk Human Reliability Assessment (SPAR-H), in relation to NPP maintenance and operation. Most of these methods were developed for the conventional type of Main Control Rooms (MCRs). They are still used for HRA in advanced MCRs even though the operating environment of advanced MCRs in NPPs has been considerably changed by the adoption of new human-system interfaces such as computer-based soft controls. Among the many features of advanced MCRs, soft controls are particularly important because operating actions in NPP advanced MCRs are performed through them. Consequently, the conventional methods may not sufficiently consider the features of human errors in soft control execution. To this end, a new framework of an HRA method for evaluating soft control execution human errors is suggested, based on a soft control task analysis and a literature review of widely accepted human error taxonomies. In this study, the framework of an HRA method for evaluating soft control execution human error in advanced MCRs is developed. First, the factors which an HRA method for advanced MCRs should encompass are derived based on the literature review and the soft control task analysis. Based on the derived factors, an execution HRA framework for advanced MCRs is developed, focusing mainly on the features of soft controls. Moreover, since most current HRA databases deal with operation in the conventional type of MCR and are not explicitly designed to deal with digital HSIs, an HRA database is developed under lab-scale simulation

  4. ADVANCED MMIS TOWARD SUBSTANTIAL REDUCTION IN HUMAN ERRORS IN NPPS

    Directory of Open Access Journals (Sweden)

    POONG HYUN SEONG

    2013-04-01

    Full Text Available This paper aims to give an overview of methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through an advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also greatly lower public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve this situation through advanced information and communication technologies, on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, the man-machine interface, operator support systems, and procedures. Based on this investigation, we outline the concepts and technical factors needed to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to lead the future direction of related research and ultimately to supplement the safety of NPPs.

  5. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Seong, Poong Hyun; Kang, Hyun Gook [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Na, Man Gyun [Chosun Univ., Gwangju (Korea, Republic of); Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of); Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Jung, Yoensub [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of)

    2013-04-15

    This paper aims to give an overview of methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through an advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also greatly lower public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve this situation through advanced information and communication technologies, on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, the man-machine interface, operator support systems, and procedures. Based on this investigation, we outline the concepts and technical factors needed to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to lead the future direction of related research and ultimately to supplement the safety of NPPs.

  6. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    International Nuclear Information System (INIS)

    Seong, Poong Hyun; Kang, Hyun Gook; Na, Man Gyun; Kim, Jong Hyun; Heo, Gyunyoung; Jung, Yoensub

    2013-01-01

    This paper aims to give an overview of methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through an advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also greatly lower public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve this situation through advanced information and communication technologies, on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, the man-machine interface, operator support systems, and procedures. Based on this investigation, we outline the concepts and technical factors needed to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to lead the future direction of related research and ultimately to supplement the safety of NPPs

  7. The Relationship between Human Operators' Psycho-physiological Condition and Human Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Arryum; Jang, Inseok; Kang, Hyungook; Seong, Poonghyun

    2013-01-01

    The safe operation of nuclear power plants (NPPs) is substantially dependent on the performance of the human operators who operate the systems. In this environment, human errors caused by inappropriate operator performance have been considered critical, since they may lead to serious problems in safety-critical plants. In order to provide meaningful insights to prevent human errors and enhance human performance, operators' physiological conditions such as stress and workload have been investigated. Physiological measurements are considered reliable tools for assessing stress and workload. T. Q. Tran et al. and J. B. Brooking et al. pointed out that operators' workload can be assessed using eye tracking, galvanic skin response, electroencephalograms (EEGs), heart rate, respiration and other measurements. The purpose of this study is to investigate the effect of human operators' tension level and knowledge level on the number of human errors. For this study, experiments were conducted in a mimic of the main control room (MCR) of an NPP. It utilized the compact nuclear simulator (CNS), which is modeled on the three-loop Pressurized Water Reactor, 993 MWe, Kori units 3 and 4 in Korea, and the subjects were asked to follow the tasks described in the emergency operating procedures (EOPs). During the simulation, three kinds of physiological measurement were utilized: electrocardiogram (ECG), EEG and nose temperature. Also, subjects were divided into three groups based on their knowledge of plant operation. The results show that subjects who are tense make fewer errors. In addition, subjects at a higher knowledge level tend to be tense and make fewer errors. In the ECG data, subjects who make fewer human errors tend to be located in the higher-tension area of high SNS activity and low PSNS activity. The results for the EEG data are similar to the ECG results: the beta power ratio of subjects who made fewer errors was higher.
    Since the beta power ratio is

  8. Quantification of human errors in level-1 PSA studies in NUPEC/JINS

    International Nuclear Information System (INIS)

    Hirano, M.; Hirose, M.; Sugawara, M.; Hashiba, T.

    1991-01-01

The THERP (Technique for Human Error Rate Prediction) method is mainly adopted to evaluate pre-accident and post-accident human error rates. Performance shaping factors are derived by taking Japanese operational practice into account. Several examples of human error rates with their calculational procedures are presented. The important human interventions at typical Japanese NPPs are also presented. (orig./HP)

  9. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning To be rational is to be able to make deductions...3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip...fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  10. Estimating the probability that the Taser directly causes human ventricular fibrillation.

    Science.gov (United States)

    Sun, H; Haemmerich, D; Rahko, P S; Webster, J G

    2010-04-01

    This paper describes the first methodology and results for estimating the order of probability for Tasers directly causing human ventricular fibrillation (VF). The probability of an X26 Taser causing human VF was estimated using: (1) current density near the human heart estimated by using 3D finite-element (FE) models; (2) prior data of the maximum dart-to-heart distances that caused VF in pigs; (3) minimum skin-to-heart distances measured in erect humans by echocardiography; and (4) dart landing distribution estimated from police reports. The estimated mean probability of human VF was 0.001 for data from a pig having a chest wall resected to the ribs and 0.000006 for data from a pig with no resection when inserting a blunt probe. The VF probability for a given dart location decreased with the dart-to-heart horizontal distance (radius) on the skin surface.
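The four inputs combine by the law of total probability over dart landing positions; a minimal sketch with invented numbers (not the paper's data):

```python
# Law-of-total-probability sketch: P(VF) = sum over landing radii r of
# P(dart lands at r) * P(VF | dart at r). All numbers are invented for
# illustration; they are not the paper's data.

def total_vf_probability(landing_dist, vf_given_radius):
    """landing_dist: {radius_cm: P(dart lands there)} (must sum to 1).
    vf_given_radius: {radius_cm: P(VF | dart at that radius)}."""
    assert abs(sum(landing_dist.values()) - 1.0) < 1e-9
    return sum(p * vf_given_radius[r] for r, p in landing_dist.items())

# Hypothetical landing distribution and conditional VF probabilities that
# fall off with horizontal distance from the heart, as the paper reports.
landing = {0: 0.01, 2: 0.04, 4: 0.15, 8: 0.30, 16: 0.50}
vf_given = {0: 1e-2, 2: 3e-3, 4: 5e-4, 8: 1e-5, 16: 1e-7}

print(total_vf_probability(landing, vf_given))  # ~3e-4 with these inputs
```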

  11. Modeling Human Error Mechanism for Soft Control in Advanced Control Rooms (ACRs)

    Energy Technology Data Exchange (ETDEWEB)

    Aljneibi, Hanan Salah Ali [Khalifa Univ., Abu Dhabi (United Arab Emirates); Ha, Jun Su; Kang, Seongkeun; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2015-10-15

To achieve the switch from conventional analog-based designs to digital designs in ACRs, a large number of manual operating controls and switches have to be replaced by a few common multi-function devices, collectively called the soft control system. The soft controls in APR-1400 ACRs are classified into safety-grade and non-safety-grade soft controls, each designed with different and independent input devices. Operation using soft controls requires operators to perform new tasks that were not necessary with conventional controls, such as navigating computerized displays to monitor plant information and control devices. These computerized displays and soft controls may make operations more convenient, but they may also cause new types of human error. In this study, the human error mechanism during soft control is studied and modeled for use in analyzing and enhancing human performance (or reducing human errors) during NPP operation. The developed model is based on the following assumptions: a human operator has a certain amount of cognitive resource capacity, and if the resources required by operating tasks exceed the resources invested by the operator, human error (or poor human performance) is likely to occur (especially 'slips'); good HMI (human-machine interface) design decreases the required resources; an operator's skillfulness decreases the required resources; and high vigilance increases the invested resources. The model could contribute to many applications for improving human performance (or reducing human errors), HMI designs, and operators' training programs in ACRs.

  12. Modeling Human Error Mechanism for Soft Control in Advanced Control Rooms (ACRs)

    International Nuclear Information System (INIS)

    Aljneibi, Hanan Salah Ali; Ha, Jun Su; Kang, Seongkeun; Seong, Poong Hyun

    2015-01-01

To achieve the switch from conventional analog-based designs to digital designs in ACRs, a large number of manual operating controls and switches have to be replaced by a few common multi-function devices, collectively called the soft control system. The soft controls in APR-1400 ACRs are classified into safety-grade and non-safety-grade soft controls, each designed with different and independent input devices. Operation using soft controls requires operators to perform new tasks that were not necessary with conventional controls, such as navigating computerized displays to monitor plant information and control devices. These computerized displays and soft controls may make operations more convenient, but they may also cause new types of human error. In this study, the human error mechanism during soft control is studied and modeled for use in analyzing and enhancing human performance (or reducing human errors) during NPP operation. The developed model is based on the following assumptions: a human operator has a certain amount of cognitive resource capacity, and if the resources required by operating tasks exceed the resources invested by the operator, human error (or poor human performance) is likely to occur (especially 'slips'); good HMI (human-machine interface) design decreases the required resources; an operator's skillfulness decreases the required resources; and high vigilance increases the invested resources. The model could contribute to many applications for improving human performance (or reducing human errors), HMI designs, and operators' training programs in ACRs

  13. The Concept of Human Error and the Design of Reliable Human-Machine Systems

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1995-01-01

The concept of human error is unreliable as a basis for the design of reliable human-machine systems. Humans are basically highly adaptive, and 'errors' are closely related to the process of adaptation and learning. Therefore, reliability of system operation depends on an interface that is not designed to support a pre-conceived operating procedure but, instead, makes visible the deep, functional structure of the system, together with the boundaries of acceptable operation, in a way that allows operators to 'touch' the boundaries and learn to cope with the effects of errors in a reversible way. The concepts behind such 'ecological' interfaces are discussed, and it is argued that a 'typology' of visualization concepts is a pressing research need.

  14. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.
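The utility-attraction duality the abstract describes can be sketched numerically. In this line of work a behavioral probability is commonly written as a utility factor plus an attraction factor, with the attraction factors summing to zero and a typical magnitude of 1/4; the softmax form of the utility factor, the utilities, and the attraction values below are all illustrative assumptions, not the paper's data.

```python
# Sketch of a behavioral probability as p(A) = f(A) + q(A): a utility
# factor f plus an attraction factor q, with sum(q) = 0 ("alternation").
# Utilities, beta, and attraction values are invented for illustration.
import math

def utility_factors(utilities, beta=1.0):
    """Softmax-style utility factors f(A) from raw utilities (assumed form)."""
    weights = [math.exp(beta * u) for u in utilities]
    z = sum(weights)
    return [w / z for w in weights]

def behavioral_probabilities(utilities, attractions):
    assert abs(sum(attractions)) < 1e-9  # attraction factors sum to zero
    f = utility_factors(utilities)
    return [fi + qi for fi, qi in zip(f, attractions)]

# Two prospects with equal utility; the second is more attractive, shifting
# choice by the typical magnitude of 1/4 reported for attraction factors.
p = behavioral_probabilities([1.0, 1.0], [-0.25, +0.25])
print(p)  # [0.25, 0.75]
```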

  15. The treatment of commission errors in first generation human reliability analysis methods

    Energy Technology Data Exchange (ETDEWEB)

    Alvarengga, Marco Antonio Bayout; Fonseca, Renato Alves da, E-mail: bayout@cnen.gov.b, E-mail: rfonseca@cnen.gov.b [Comissao Nacional de Energia Nuclear (CNEN) Rio de Janeiro, RJ (Brazil); Melo, Paulo Fernando Frutuoso e, E-mail: frutuoso@nuclear.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

Human errors in human reliability analysis can be classified generically as errors of omission and errors of commission. Errors of omission involve the omission of a human action that should have been performed but does not occur. Errors of commission involve human actions that should not be performed but in fact are. Both involve specific types of cognitive error mechanisms; however, errors of commission are more difficult to model, because they are characterized by unanticipated actions that are performed instead of others that are omitted (omission errors), or that enter an operational task without being part of its normal sequence. Identifying actions that are not supposed to occur depends on the operational context, which will influence or facilitate certain unsafe operator actions depending on the behavior of its parameters and variables. Surveying operational contexts and their associated unsafe actions is a characteristic of second-generation models, unlike first-generation models. This paper discusses how first-generation models can treat errors of commission in the detection, diagnosis, decision-making and implementation steps of human information processing, particularly with the use of the THERP error quantification tables. (author)
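A first-generation THERP-style quantification can be sketched as a nominal HEP, a product of performance shaping factors, and THERP's standard conditional-probability formulas for dependence between successive actions. The nominal value and PSF multiplier below are placeholders, not actual handbook entries.

```python
# THERP-style sketch: nominal HEP from handbook tables, scaled by
# performance shaping factors (PSFs), then conditioned on the previous
# action with THERP's standard dependence formulas. Table values here
# are placeholders, not actual THERP entries.

DEPENDENCE = {
    "zero":     lambda hep: hep,
    "low":      lambda hep: (1 + 19 * hep) / 20,
    "moderate": lambda hep: (1 + 6 * hep) / 7,
    "high":     lambda hep: (1 + hep) / 2,
    "complete": lambda hep: 1.0,
}

def conditional_hep(nominal_hep, psf_multipliers, dependence="zero"):
    """Nominal HEP x product of PSF multipliers, capped at 1.0, then
    adjusted for dependence on the preceding task."""
    hep = nominal_hep
    for m in psf_multipliers:
        hep *= m
    hep = min(hep, 1.0)
    return DEPENDENCE[dependence](hep)

# Hypothetical: omitting a procedure step (nominal 3e-3), doubled by a
# stress PSF, highly dependent on an earlier failed check.
print(conditional_hep(3e-3, [2.0], "high"))  # (1 + 0.006) / 2 = 0.503
```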

  16. Identification and Evaluation of Human Errors in the Medication Process Using the Extended CREAM Technique

    Directory of Open Access Journals (Sweden)

    Iraj Mohammadfam

    2017-10-01

Background: The medication process is a powerful instrument for curing patients, and following its steps correctly plays an important role in treating and providing care to patients. Medication error can occur at any stage of this complicated process, and avoiding it requires appropriate decision-making, cognition, and performance by hospital staff. Objectives: The present study aimed to identify and evaluate the nature and causes of human errors in the medication process in a hospital using the extended CREAM method. Methods: This was a qualitative, cross-sectional study conducted in a hospital in Hamadan. First, the medication process was selected as a critical issue based on the opinions of experts, specialists, and experienced individuals in the nursing and medical departments. The process was then decomposed into its steps and substeps using hierarchical task analysis (HTA) and evaluated with the extended CREAM technique to estimate the probability of human errors. Results: Based on the findings of the basic CREAM method, the highest CFPt was in the step of administering medicine to patients (0.056). The highest substep CFPt values were for calculating the dose of medicine and determining the method of prescription, and for identifying the patient (0.0796 and 0.0785, respectively), while the lowest CFPt was for transcribing the prescribed medicine from the file to the medicine worksheet (0.0106). Conclusions: Considering the critical consequences of human errors in the medication process, holding pharmacological retraining classes, applying the principles for executing pharmaceutical orders, increasing medical personnel, reducing overtime work, organizing work shifts, and using error reporting systems are of paramount importance.
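CFPt figures like those quoted above come from multiplying a nominal cognitive failure probability (CFP0) by the weighting factors of the common performance conditions (CPCs). A minimal sketch, with invented nominal values and weights rather than the published CREAM tables:

```python
# Extended-CREAM-style sketch: CFPt = CFP0 * product of CPC weighting
# factors, capped at 1.0. Nominal values and weights below are invented
# placeholders, not the published CREAM tables.

NOMINAL_CFP = {              # hypothetical CFP0 per cognitive function
    "observation": 7e-2,
    "interpretation": 1e-2,
    "planning": 1e-2,
    "execution": 3e-3,
}

def adjusted_cfp(function, cpc_weights):
    """Adjust the nominal CFP of a cognitive function by CPC weights."""
    cfp = NOMINAL_CFP[function]
    for w in cpc_weights:
        cfp *= w
    return min(cfp, 1.0)

# Hypothetical substep 'calculate the dose': an execution error under time
# pressure (weight 5.0) but with adequate training (weight 0.8).
print(adjusted_cfp("execution", [5.0, 0.8]))  # 3e-3 * 4.0 = 0.012
```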

  17. A Human Error Analysis Procedure for Identifying Potential Error Modes and Influencing Factors for Test and Maintenance Activities

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun

    2010-01-01

Periodic and non-periodic test and maintenance (T and M) activities in large, complex systems such as nuclear power plants (NPPs) are essential for sustaining their stable and safe operation. On the other hand, it has also been noted that human erroneous actions occurring during T and M activities may incur unplanned reactor trips (RTs) or power derating, make safety-related systems unavailable, or degrade component reliability. Human errors during normal and abnormal NPP activities are known to contribute about 20% of all unplanned RT events. This paper introduces a procedure for predictively analyzing human error potential when maintenance personnel perform T and M tasks based on a work procedure or work plan. The procedure helps a plant maintenance team prepare for plausible human errors, focusing on the recurrent error forms (or modes) of execution-based errors, such as wrong object, omission, too little, and wrong action

  18. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-01-01

This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method ''A Technique for Human Error Analysis'' (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst

  19. A Human Reliability Analysis of Post- Accident Human Errors in the Low Power and Shutdown PSA of KSNP

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Daeil; Kim, J. H.; Jang, S. C

    2007-03-15

Korea Atomic Energy Research Institute evaluated the LPSD PSA model of the KSNP, Yonggwang Units 5 and 6, using the ANS low power and shutdown (LPSD) probabilistic risk assessment (PRA) Standard, and identified items to be improved. The evaluation of the human reliability analysis (HRA) of post-accident human errors in the LPSD PSA model for the KSNP showed that 10 of the 19 supporting-requirement items in the ANS PRA Standard needed improvement. We therefore newly carried out an HRA of post-accident human errors in the LPSD PSA model for the KSNP. Compared with the previous HRA, the improvements are: interviews with operators and a site visit for the interpretation of the procedures, the modeling of operator actions, and the quantification of human errors; application of a limiting value to the combined post-accident human errors; and documentation of all inputs and bases for the detailed quantifications and the dependency analysis using quantification sheets. The assessment of the new HRA results against the ANS LPSD PRA Standard shows that over 80% of its supporting-requirement items for post-accident human errors were graded as Category II. The number of human errors re-estimated using the LPSD Korea Standard HRA method is 385; among them, 253 are individual and 135 are dependent post-accident human errors. The quantification results of the LPSD PSA model for the KSNP with the new HEPs show that the core damage frequency (CDF) increased by 5.1% compared with the previous baseline CDF. These results are expected to be greatly helpful in improving PSA quality for the domestic nuclear power plants, because they have sufficient PSA quality to meet the Category II of Supporting Requirements for the post

  20. A Human Reliability Analysis of Post- Accident Human Errors in the Low Power and Shutdown PSA of KSNP

    International Nuclear Information System (INIS)

    Kang, Daeil; Kim, J. H.; Jang, S. C.

    2007-03-01

Korea Atomic Energy Research Institute evaluated the LPSD PSA model of the KSNP, Yonggwang Units 5 and 6, using the ANS low power and shutdown (LPSD) probabilistic risk assessment (PRA) Standard, and identified items to be improved. The evaluation of the human reliability analysis (HRA) of post-accident human errors in the LPSD PSA model for the KSNP showed that 10 of the 19 supporting-requirement items in the ANS PRA Standard needed improvement. We therefore newly carried out an HRA of post-accident human errors in the LPSD PSA model for the KSNP. Compared with the previous HRA, the improvements are: interviews with operators and a site visit for the interpretation of the procedures, the modeling of operator actions, and the quantification of human errors; application of a limiting value to the combined post-accident human errors; and documentation of all inputs and bases for the detailed quantifications and the dependency analysis using quantification sheets. The assessment of the new HRA results against the ANS LPSD PRA Standard shows that over 80% of its supporting-requirement items for post-accident human errors were graded as Category II. The number of human errors re-estimated using the LPSD Korea Standard HRA method is 385; among them, 253 are individual and 135 are dependent post-accident human errors. The quantification results of the LPSD PSA model for the KSNP with the new HEPs show that the core damage frequency (CDF) increased by 5.1% compared with the previous baseline CDF. These results are expected to be greatly helpful in improving PSA quality for the domestic nuclear power plants, because they have sufficient PSA quality to meet the Category II of Supporting Requirements for the post

  1. Human reliability analysis during PSA at Trillo NPP: main characteristics and analysis of diagnostic errors

    International Nuclear Information System (INIS)

    Barquin, M.A.; Gomez, F.

    1998-01-01

The design differences between Trillo NPP and the other Spanish nuclear power plants (basic Westinghouse and General Electric designs) became clear in the human reliability analysis of the Probabilistic Safety Analysis (PSA) for Trillo NPP. The object of this paper is to describe the most significant characteristics of that human reliability analysis, with special emphasis on possible diagnostic errors and their consequences, based on the characteristics of the Emergency Operations Manual for Trillo NPP. - In the case of human errors before the initiating event (type 1), the existence of four redundancies in most of the plant safety systems means that the impact of this type of error on the final results of the PSA is insignificant. However, in the case of common-cause errors, especially certain calibration errors, some actions are significant in the final core damage equation. - The number of human actions that the operator has to carry out during the accidents modelled (type 3) is relatively small compared with other PSAs, basically due to the high level of automation at Trillo NPP. - The Plant Operations Manual cannot strictly be considered a symptom-based procedure. The operations group must select the chapter of the Operations Manual to follow after having diagnosed the perturbing event, using for this purpose an Emergency and Anomaly Decision Tree (M.O.3.0.1) based on the different indications, alarms and symptoms present in the plant after the perturbing event. For this reason, it was decided to analyse the possible diagnosis errors. The literature on diagnosis and commission errors available at present offers no precise methodology for analysing this type of error and incorporating it into PSAs.
The method used in the PSA for Trillo NPP to evaluate this type of interaction is to develop a Diagnosis Error Table, the object of which is to identify the situations in

  2. Error detection in spoken human-machine interaction

    NARCIS (Netherlands)

    Krahmer, E.J.; Swerts, M.G.J.; Theune, M.; Weegels, M.F.

    2001-01-01

    Given the state of the art of current language and speech technology, errors are unavoidable in present-day spoken dialogue systems. Therefore, one of the main concerns in dialogue design is how to decide whether or not the system has understood the user correctly. In human-human communication,

  3. Error detection in spoken human-machine interaction

    NARCIS (Netherlands)

    Krahmer, E.; Swerts, M.; Theune, Mariet; Weegels, M.

    Given the state of the art of current language and speech technology, errors are unavoidable in present-day spoken dialogue systems. Therefore, one of the main concerns in dialogue design is how to decide whether or not the system has understood the user correctly. In human-human communication,

  4. Optimal design methods for a digital human-computer interface based on human reliability in a nuclear power plant

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Zhang, Li; Xie, Tian; Wu, Daqing; Li, Min; Wang, Yiqun; Peng, Yuyuan; Peng, Jie; Zhang, Mengjia; Li, Peiyao; Ma, Congmin; Wu, Xing

    2017-01-01

Highlights: • A complete optimization process is established for digital human-computer interfaces of NPPs. • A quick-convergence search method is proposed. • The authors propose an affinity error probability mapping function to test human reliability. - Abstract: This is the second in a series of papers describing optimal design methods for the digital human-computer interface of a nuclear power plant (NPP), from three different perspectives, based on human reliability. The purpose of the series is to explore different optimization methods from varying perspectives. The present paper mainly discusses the optimal design method for the quantity of components of the same factor. During monitoring, the quantity of components places a heavy burden on operators, so human errors are easily triggered. To solve this problem, the authors propose an optimization process, a quick-convergence search method and an affinity error probability mapping function. Two balanceable parameter values of the affinity error probability function are obtained by experiments. The experimental results show that the affinity error probability mapping function for the human-computer interface has very good sensitivity and stability, and that the quick-convergence search method for fuzzy segments divided by component quantity performs better than a general algorithm.

  5. Evaluation of human error estimation for nuclear power plants

    International Nuclear Information System (INIS)

    Haney, L.N.; Blackman, H.S.

    1987-01-01

The dominant risk for severe accident occurrence in nuclear power plants (NPPs) is human error. The US Nuclear Regulatory Commission (NRC) sponsored an evaluation of Human Reliability Analysis (HRA) techniques for the estimation of human error in NPPs. Twenty HRA techniques identified by a literature search were evaluated with criteria sets designed for that purpose, and categorized. Data were collected at a commercial NPP, with operators responding in walkthroughs of four severe accident scenarios and full-scope simulator runs. The results suggest a need for refinement and validation of the techniques. 19 refs

  6. The Relationship between Human Operators' Psycho-physiological Condition and Human Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Arryum; Jang, Inseok; Kang, Hyungook; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2013-05-15

The safe operation of nuclear power plants (NPPs) depends substantially on the performance of the human operators who run their systems. In this environment, human errors caused by inappropriate operator performance are considered critical, since they may lead to serious problems in safety-critical plants. To provide meaningful insights for preventing human errors and enhancing human performance, operators' physiological conditions such as stress and workload have been investigated, and physiological measurements are considered reliable tools for assessing them: T. Q. Tran et al. and J. B. Brooking et al. pointed out that operators' workload can be assessed using eye tracking, galvanic skin response, electroencephalograms (EEGs), heart rate, respiration and other measurements. The purpose of this study is to investigate the effect of the human operators' tension level and knowledge level on the number of human errors. The experiments were conducted in a mimic of an NPP main control room (MCR), using the compact nuclear simulator (CNS), which is modeled on the three-loop, 993 MWe pressurized water reactors Kori units 3 and 4 in Korea; the subjects were asked to follow the tasks described in the emergency operating procedures (EOPs). During the simulation, three kinds of physiological measurement were used: electrocardiogram (ECG), EEG and nose temperature. The subjects were also divided into three groups based on their knowledge of plant operation. The results show that subjects who are tense make fewer errors. In addition, subjects at a higher knowledge level tend to be tense and make fewer errors. In the ECG data, subjects who make fewer human errors tend to fall in the higher-tension area of high SNS activity and low PSNS activity. The EEG results are similar: the beta power ratio of subjects who made fewer errors was higher. Since beta

  7. A system engineer's Perspective on Human Errors For a more Effective Management of Human Factors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Yong-Hee; Jang, Tong-Il; Lee, Soo-Kil

    2007-01-01

The management of human factors in nuclear power plants (NPPs) has become a burden during the operating period that follows design and construction. Almost every study of major accidents emphasizes the prominent importance of human errors. Regulatory requirements such as the Periodic Safety Review aside, the management of human factors is a main issue in reducing human errors and enhancing plant performance. However, it is not easy to find a more effective perspective on human errors from which to establish an engineering implementation plan for preventing them. This paper describes a system engineer's perspective on human errors and discusses its application to a recent study of human error events in Korean NPPs

  8. Development of a framework to estimate human error for diagnosis tasks in advanced control room

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, In Seok; Seong, Proong Hyun

    2014-01-01

In emergency situations at nuclear power plants (NPPs), a diagnosis of the occurring events is crucial for managing or controlling the plant to a safe and stable condition. If the operators fail to diagnose the occurring events or relevant situations, their responses may eventually be inappropriate or inadequate. Accordingly, much research has been performed to identify the causes of diagnosis errors and to estimate the probability of diagnosis error. D. I. Gertman et al. asserted that 'the cognitive failures stem from erroneous decision-making, poor understanding of rules and procedures, and inadequate problem solving and this failures may be due to quality of data and people's capacity for processing information'. Many researchers have also asserted that the human-system interface (HSI), procedures, training and available time are critical factors causing diagnosis error. As advanced main control rooms are adopted in NPPs, operators obtain plant data via computer-based HSIs and procedures. In this regard, diagnosis errors and their causes were identified using simulation data. From this study, some useful insights for reducing operators' diagnosis errors in advanced main control rooms are provided

  9. Quantitative evaluation of the impact of human reliability in risk assessment for nuclear power plants

    International Nuclear Information System (INIS)

    Samanta, P.K.

    1981-01-01

The role of human beings in the safe operation of a nuclear power plant has been a matter of concern. This study describes methods for the quantitative description of that role and its impact on the risk from nuclear power plants. The impact of human errors was calculated by observing the changes in risk parameters, such as core melt probability, release category probabilities, accident sequence probabilities and system unavailabilities, as the human error contribution to unavailability was varied within the framework of risk assessment methodology. It was found that for operational pressurized water reactors the opportunity for reducing core melt probability by reducing human error rates, without a simultaneous reduction of hardware failures, is limited, but that core melt probability would increase significantly as human error rates increased. More importantly, most of the dominant accident sequences showed a significant increase in their probabilities with an increase in human error rates. Release categories resulting in high consequences showed a much larger sensitivity to human errors than categories resulting in low consequences. A combination of structural importance and reliability importance measures was used to describe the importance of individual errors
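The sensitivity exercise the abstract describes can be sketched with a toy two-train fault tree in which human error rates are scaled while hardware failure rates stay fixed; the tree structure and all rates below are invented for illustration, not the study's model.

```python
# Toy sensitivity sketch: scale the human error contributions in a
# two-train fault tree and watch the top event (core melt) respond.
# All rates and the tree structure are invented for illustration.

def p_or(*ps):
    """Probability that at least one of several independent events occurs."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def core_melt(scale):
    hw_a, he_a = 1e-3, 2e-3 * scale   # train A: hardware, human error
    hw_b, he_b = 1e-3, 2e-3 * scale   # train B: hardware, human error
    init = 1e-1                        # initiating event frequency
    # Core melt requires the initiator AND failure of both trains.
    return init * p_or(hw_a, he_a) * p_or(hw_b, he_b)

# Halving or multiplying the human error rates shifts the top-event
# probability far more in the upward direction, echoing the finding.
for s in (0.1, 1.0, 10.0):
    print(s, core_melt(s))
```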

  10. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
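The definition lends itself to a one-function illustration: for a model M that places normal predictive mass on values an evidence set E rules out, say negative values of a nonnegative quantity, the leakage is simply the mass on the impossible region. The numbers below are invented.

```python
# Probability leakage sketch: the predictive mass a normal regression
# model assigns to the region y < 0 when evidence E says y cannot be
# negative. Parameter values are invented for illustration.
import math

def normal_cdf(x, mu, sigma):
    """CDF of a normal distribution via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def leakage_below_zero(mu, sigma):
    """P_model(y < 0), which E declares impossible."""
    return normal_cdf(0.0, mu, sigma)

# A predicted mean of 2 units with sd 1 leaks about 2.3% of its
# probability into the impossible region; the model cannot be
# empirically calibrated there.
print(leakage_below_zero(2.0, 1.0))
```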

  11. A stochastic dynamic model for human error analysis in nuclear power plants

    Science.gov (United States)

    Delgado-Loperena, Dharma

Nuclear disasters like Three Mile Island and Chernobyl indicate that human performance is a critical safety issue, sending a clear message about the need to include environmental press and competence aspects in research. This investigation was undertaken to serve as a roadmap for studying human behavior through the formulation of a general solution equation. The theoretical model integrates models from two heretofore-disassociated disciplines (behavior specialists and technical specialists) that historically have independently studied the nature of error and human behavior, including concepts derived from fractal and chaos theory, and suggests re-evaluation of base theory regarding human error. The results of this research were based on comprehensive analysis of patterns of error, with the omnipresent underlying structure of chaotic systems. The study of patterns led to a dynamic formulation that can serve as a basis for other formulas used to study the consequences of human error. The search of the literature regarding error yielded insight into the need to include concepts rooted in chaos theory and strange attractors, heretofore unconsidered by mainstream researchers who investigated human error in nuclear power plants or those who employed the ecological model in their work. The study of patterns obtained from the simulation of a steam generator tube rupture (SGTR) event provided a direct application to aspects of control room operations in nuclear power plants. In doing so, a conceptual foundation based on understanding the patterns of human error analysis can be gleaned, helping to reduce and prevent undesirable events.

  12. Operator error and emotions. Operator error and emotions - a major cause of human failure

    International Nuclear Information System (INIS)

    Patterson, B.K.; Bradley, M.; Artiss, W.G.

    2000-01-01

    This paper proposes the idea that a large proportion of the incidents attributed to operator and maintenance error in a nuclear or industrial plant are actually founded in our human emotions. Basic psychological theory of emotions is briefly presented and then the authors present situations and instances that can cause emotions to swell and lead to operator and maintenance error. Since emotional information is not recorded in industrial incident reports, the challenge is extended to industry, to review incident source documents for cases of emotional involvement and to develop means to collect emotion related information in future root cause analysis investigations. Training must then be provided to operators and maintainers to enable them to know one's emotions, manage emotions, motivate one's self, recognize emotions in others and handle relationships. Effective training will reduce the instances of human error based in emotions and enable a cooperative, productive environment in which to work. (author)

  13. Operator error and emotions. Operator error and emotions - a major cause of human failure

    Energy Technology Data Exchange (ETDEWEB)

    Patterson, B.K. [Human Factors Practical Incorporated (Canada); Bradley, M. [Univ. of New Brunswick, Saint John, New Brunswick (Canada); Artiss, W.G. [Human Factors Practical (Canada)

    2000-07-01

    This paper proposes the idea that a large proportion of the incidents attributed to operator and maintenance error in a nuclear or industrial plant are actually founded in our human emotions. Basic psychological theory of emotions is briefly presented and then the authors present situations and instances that can cause emotions to swell and lead to operator and maintenance error. Since emotional information is not recorded in industrial incident reports, the challenge is extended to industry, to review incident source documents for cases of emotional involvement and to develop means to collect emotion related information in future root cause analysis investigations. Training must then be provided to operators and maintainers to enable them to know one's emotions, manage emotions, motivate one's self, recognize emotions in others and handle relationships. Effective training will reduce the instances of human error based in emotions and enable a cooperative, productive environment in which to work. (author)

  14. A probabilistic analysis method to evaluate the effect of human factors on plant safety

    International Nuclear Information System (INIS)

    Ujita, H.

    1987-01-01

A method to evaluate the effect of human factors on probabilistic safety analysis (PSA) is developed. The main features of the method are as follows: 1. A time-dependent multibranch tree is constructed to treat the time dependency of human error probability. 2. A sensitivity analysis is done to determine uncertainty in the PSA due to the branch time of human error occurrence, the human error data source, the extraneous act probability, and the human recovery probability. The method is applied to a large-break loss-of-coolant accident of a boiling water reactor (BWR-5). As a result, core melt probability and risk do not depend on the number of time branches, which means that a small number of branches is sufficient. These values depend on the first branch time and the human error probability

  15. The common mode failure analysis of the redundant system with dependent human error

    International Nuclear Information System (INIS)

    Kim, M.K.; Chang, S.H.

    1983-01-01

Common mode failures (CMFs) have been a serious concern in nuclear power plants. There is a broad category of failure mechanisms that can cause common mode failures. This paper is a theoretical investigation of the effect of CMFs on the unavailability of a redundant system. It is assumed that the total CMFs consist of the potential CMFs and the dependent human error CMFs. As the human error dependency becomes higher, the total CMFs are more affected by the dependent human error. If the human error dependency is lower, the system unavailability depends strongly on the potential CMFs rather than on the mechanical failure or the dependent human error. It is also shown that the total CMFs are a dominant factor in the unavailability of the redundant system. (Author)
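The qualitative conclusion can be illustrated with a toy two-train unavailability model; the decomposition and all numbers below are hypothetical, not taken from the paper.

```python
# Sketch: unavailability of a two-train redundant system when common-cause
# contributions (potential CMFs plus a dependent human error share) are
# added to independent failures.

def system_unavailability(q_indep, q_potential_cmf, hep, dependence):
    """Two-train system: independent failures must defeat both trains,
    while common-cause terms defeat both at once. 'dependence' in [0, 1]
    scales how much of the human error acts as a common-cause failure."""
    q_ccf = q_potential_cmf + dependence * hep
    return q_indep ** 2 + q_ccf

# At low dependence the potential CMF term dominates; as dependence rises,
# the dependent human error quickly dominates the system unavailability.
for dep in (0.0, 0.1, 0.5, 1.0):
    print(dep, system_unavailability(1e-3, 1e-6, 1e-3, dep))
```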

  16. A comparative evaluation of five human reliability assessment techniques

    International Nuclear Information System (INIS)

    Kirwan, B.

    1988-01-01

A field experiment was undertaken to evaluate the accuracy, usefulness, and resource requirements of five human reliability quantification techniques: the Technique for Human Error Rate Prediction (THERP), Paired Comparisons, the Human Error Assessment and Reduction Technique (HEART), the Success Likelihood Index Method (SLIM) with Multi-Attribute Utility Decomposition (MAUD), and Absolute Probability Judgement. This was achieved by assessing technique predictions against a set of known human error probabilities, and by comparing their predictions on a set of five realistic Probabilistic Risk Assessment (PRA) human errors. On a combined measure of accuracy, THERP and Absolute Probability Judgement performed best, whilst HEART showed indications of accuracy and was lower in resource usage than the other techniques. HEART and THERP both appear to benefit from using trained assessors in order to obtain the best results. SLIM and Paired Comparisons require further research on achieving a robust calibration relationship between their scale values and absolute probabilities. (author)

  17. A strategy for minimizing common mode human error in executing critical functions and tasks

    International Nuclear Information System (INIS)

    Beltracchi, L.; Lindsay, R.W.

    1992-01-01

    Human error in execution of critical functions and tasks can be costly. The Three Mile Island and the Chernobyl Accidents are examples of results from human error in the nuclear industry. There are similar errors that could no doubt be cited from other industries. This paper discusses a strategy to minimize common mode human error in the execution of critical functions and tasks. The strategy consists of the use of human redundancy, and also diversity in human cognitive behavior: skill-, rule-, and knowledge-based behavior. The authors contend that the use of diversity in human cognitive behavior is possible, and it minimizes common mode error

  18. Human error in maintenance: An investigative study for the factories of the future

    International Nuclear Information System (INIS)

    Dhillon, B S

    2014-01-01

    This paper presents a study of human error in maintenance. Many different aspects of human error in maintenance considered useful for the factories of the future are studied, including facts, figures, and examples; occurrence of maintenance error in equipment life cycle, elements of a maintenance person's time, maintenance environment and the causes for the occurrence of maintenance error, types and typical maintenance errors, common maintainability design errors and useful design guidelines to reduce equipment maintenance errors, maintenance work instructions, and maintenance error analysis methods

  19. Development of an Experimental Measurement System for Human Error Characteristics and a Pilot Test

    International Nuclear Information System (INIS)

    Jang, Tong-Il; Lee, Hyun-Chul; Moon, Kwangsu

    2017-01-01

Some items among individual and team characteristics were selected, and a pilot test was performed to measure and evaluate them using the experimental measurement system for human error characteristics. This is one of the processes for producing input data to the Eco-DBMS. Through the pilot test, methods were also tried for measuring and acquiring the physiological data and for developing data formats and quantification methods for the database. In this study, a pilot test to measure the stress and tension levels and team cognitive characteristics among human error characteristics was performed using the human error characteristics measurement and experimental evaluation system. In an experiment measuring the stress level, physiological characteristics were measured using EEG in a simulated unexpected situation. As the results show, although this experiment was a pilot, it was validated that relevant results can be obtained for evaluating the human error coping effects of workers' FFD management guidelines and of guidelines against unexpected situations. In subsequent research, additional experiments covering other human error characteristics will be conducted. Furthermore, the human error characteristics measurement and experimental evaluation system will be utilized to validate various human error coping solutions, such as human factors criteria, designs, and guidelines, as well as to supplement the human error characteristics database.

  20. A critique of recent models for human error rate assessment

    International Nuclear Information System (INIS)

    Apostolakis, G.E.

    1988-01-01

    This paper critically reviews two groups of models for assessing human error rates under accident conditions. The first group, which includes the US Nuclear Regulatory Commission (NRC) handbook model and the human cognitive reliability (HCR) model, considers as fundamental the time that is available to the operators to act. The second group, which is represented by the success likelihood index methodology multiattribute utility decomposition (SLIM-MAUD) model, relies on ratings of the human actions with respect to certain qualitative factors and the subsequent derivation of error rates. These models are evaluated with respect to two criteria: the treatment of uncertainties and the internal coherence of the models. In other words, this evaluation focuses primarily on normative aspects of these models. The principal findings are as follows: (1) Both of the time-related models provide human error rates as a function of the available time for action and the prevailing conditions. However, the HCR model ignores the important issue of state-of-knowledge uncertainties, dealing exclusively with stochastic uncertainty, whereas the model presented in the NRC handbook handles both types of uncertainty. (2) SLIM-MAUD provides a highly structured approach for the derivation of human error rates under given conditions. However, the treatment of the weights and ratings in this model is internally inconsistent. (author)
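The time-related family of models reviewed above can be sketched with an HCR-style time-reliability curve: the probability of crew non-response falls with available time via a three-parameter Weibull in normalized time. The parameter values below are illustrative defaults quoted in the HRA literature for skill-, rule-, and knowledge-based behaviour, not values from this review.

```python
# Sketch of the HCR time-reliability idea: P(crew has not responded by t)
# as a three-parameter Weibull in time normalized by the median response
# time. Parameter values are illustrative only.
import math

def hcr_nonresponse(t, t_median, c_gamma, c_eta, c_beta):
    """Probability the crew has not responded by time t."""
    x = (t / t_median - c_gamma) / c_eta
    if x <= 0:
        return 1.0  # below the time-delay threshold, no response credited
    return math.exp(-x ** c_beta)

# Illustrative (assumed) parameter sets per cognitive behaviour type.
params = {
    "skill": (0.7, 0.407, 1.2),
    "rule": (0.6, 0.601, 0.9),
    "knowledge": (0.5, 0.791, 0.8),
}
for behaviour, (cg, ce, cb) in params.items():
    p = hcr_nonresponse(t=120.0, t_median=60.0, c_gamma=cg, c_eta=ce, c_beta=cb)
    print(f"{behaviour}: P(non-response at t = 2 * median) = {p:.3f}")
```

Note the review's criticism applies directly to this sketch: the curve captures only stochastic variability in response time, with no state-of-knowledge uncertainty on the parameters.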

  1. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.

    2010-12-01

In this paper, we address the problem of reducing the occurrence of human errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21% of interruptions were caused by human error. We focus on Change Management, the process with the largest risk of human error, and identify the main instances of human errors as the 4 Wrongs: request, time, configuration item, and command. Analysis of change records revealed that human error prevention by partial automation is highly relevant. We propose the HEP Framework, a framework for execution of IT Service Delivery operations that reduces human error by addressing the 4 Wrongs using content integration, contextualization of operation patterns, partial automation of command execution, and controlled access to resources.

  2. Using a Delphi Method to Identify Human Factors Contributing to Nursing Errors.

    Science.gov (United States)

    Roth, Cheryl; Brewer, Melanie; Wieck, K Lynn

    2017-07-01

The purpose of this study was to identify human factors associated with nursing errors. Using a Delphi technique, this study gathered feedback from a panel of nurse experts (n = 25) on an initial qualitative survey questionnaire, followed by summarization of the results with feedback and confirmation. Synthesized factors regarding causes of errors were incorporated into a quantitative Likert-type scale, and the original expert panel participants were queried a second time to validate responses. The list identified 24 items as the most common causes of nursing errors, including swamping and errors made by others that nurses are expected to recognize and fix. The responses provided a consensus top 10 errors list based on means, with heavy workload and fatigue at the top of the list. The use of the Delphi survey established consensus and developed a platform upon which future study of nursing errors can evolve as a link to future solutions. This list of human factors in nursing errors should serve to stimulate dialogue among nurses about how to prevent errors and improve outcomes. Human and system failures have been the subject of an abundance of research, yet nursing errors continue to occur. © 2016 Wiley Periodicals, Inc.

  3. Using HET taxonomy to help stop human error

    OpenAIRE

    Li, Wen-Chin; Harris, Don; Stanton, Neville A.; Hsu, Yueh-Ling; Chang, Danny; Wang, Thomas; Young, Hong-Tsu

    2010-01-01

    Flight crews make positive contributions to the safety of aviation operations. Pilots have to assess continuously changing situations, evaluate potential risks, and make quick decisions. However, even well-trained and experienced pilots make errors. Accident investigations have identified that pilots’ performance is influenced significantly by the design of the flightdeck interface. This research applies hierarchical task analysis (HTA) and utilizes the Human Error Template (HET) taxonomy to ...

  4. Human error prediction and countermeasures based on CREAM in spent nuclear fuel (SNF) transportation

    International Nuclear Information System (INIS)

    Kim, Jae San

    2007-02-01

Since the 1980s, in order to secure the storage capacity of spent nuclear fuel (SNF) at NPPs, SNF assemblies have been transported on-site from one unit to another unit nearby. However, in the future the amount of spent fuel will approach capacity in the areas used, and some of these SNFs will have to be transported to an off-site spent fuel repository. Most SNF materials used at NPPs will be transported by general cargo ships from abroad, and these SNFs will be stored in an interim storage facility. In the process of transporting SNF, human interactions will involve inspecting and preparing the cask and spent fuel, loading the cask onto the vehicle or ship, transferring the cask, as well as storing or monitoring the cask. The transportation of SNF involves a number of activities that depend on reliable human performance. In the case of the transport of a cask, human errors may include spent fuel bundle misidentification or cask transport accidents, among others. Reviews of accident events when transporting Radioactive Material (RAM) throughout the world indicate that human error is the major cause of more than 65% of significant events. For the safety of SNF transportation, it is very important to predict human errors and to deduce methods that minimize them. This study examines the effects of human factors on the safety of transporting spent nuclear fuel (SNF). It predicts and identifies the possible human errors in the SNF transport process (loading, transfer and storage of the SNF). After evaluating the human error modes in each transport process, countermeasures to minimize human error are deduced. The human errors in SNF transportation were analyzed using Hollnagel's Cognitive Reliability and Error Analysis Method (CREAM). After determining the important factors for each process, countermeasures to minimize human error are provided in three parts: System design, Operational environment, and Human ability

  5. Human Error and the International Space Station: Challenges and Triumphs in Science Operations

    Science.gov (United States)

    Harris, Samantha S.; Simpson, Beau C.

    2016-01-01

    Any system with a human component is inherently risky. Studies in human factors and psychology have repeatedly shown that human operators will inevitably make errors, regardless of how well they are trained. Onboard the International Space Station (ISS) where crew time is arguably the most valuable resource, errors by the crew or ground operators can be costly to critical science objectives. Operations experts at the ISS Payload Operations Integration Center (POIC), located at NASA's Marshall Space Flight Center in Huntsville, Alabama, have learned that from payload concept development through execution, there are countless opportunities to introduce errors that can potentially result in costly losses of crew time and science. To effectively address this challenge, we must approach the design, testing, and operation processes with two specific goals in mind. First, a systematic approach to error and human centered design methodology should be implemented to minimize opportunities for user error. Second, we must assume that human errors will be made and enable rapid identification and recoverability when they occur. While a systematic approach and human centered development process can go a long way toward eliminating error, the complete exclusion of operator error is not a reasonable expectation. The ISS environment in particular poses challenging conditions, especially for flight controllers and astronauts. Operating a scientific laboratory 250 miles above the Earth is a complicated and dangerous task with high stakes and a steep learning curve. While human error is a reality that may never be fully eliminated, smart implementation of carefully chosen tools and techniques can go a long way toward minimizing risk and increasing the efficiency of NASA's space science operations.

  6. Cause analysis and preventives for human error events in Daya Bay NPP

    International Nuclear Information System (INIS)

    Huang Weigang; Zhang Li

    1998-01-01

Daya Bay Nuclear Power Plant was put into commercial operation in 1994. Up to 1996, there were 368 human error events in the operating and maintenance areas, accounting for 39% of total events. These events occurred mainly in the processes of maintenance, test, equipment isolation and returning systems on-line, in particular in refuelling and maintenance. The authors analyse the root causes of human error events, which are mainly: operator omission or error; procedure deficiency; procedure not followed; lack of training; communication failures; and inadequate work management. The protective measures and treatment principles for human error events are also discussed, and several examples of applying them are given. Finally, it is put forward that the key to preventing human error events lies in coordination and management, the person in charge of the work, and the good work habits of staff

  7. Detailed semantic analyses of human error incidents occurring at nuclear power plants. Extraction of periodical transition of error occurrence patterns by applying multivariate analysis

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Suzuki, Kunihiko; Takano, Kenichi; Kojima, Mitsuhiro

    2000-01-01

It is essential for preventing the recurrence of human error incidents to analyze and evaluate them with an emphasis on human factors. Detailed and structured analyses of all incidents at domestic nuclear power plants (NPPs) reported during the last 31 years have been conducted based on J-HPES, in which a total of 193 human error cases were identified. The results obtained by the analyses have been stored in the J-HPES database. In a previous study, by applying multivariate analysis to the above case studies, it was suggested that several occurrence patterns could be identified of how errors occur at NPPs. It was also clarified that the causes related to each human error differ depending on the age of their occurrence. This paper describes the obtained results with respect to the periodical transition of human error occurrence patterns. By applying multivariate analysis to the above data, it was suggested that there are two types of error occurrence patterns for each human error type. The first type consists of common occurrence patterns that do not depend on the age, and the second type is influenced by periodical characteristics. (author)

  8. New method of classifying human errors at nuclear power plants and the analysis results of applying this method to maintenance errors at domestic plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi; Miyazaki, Takamasa; Gofuku, Akio; Iida, Hiroyasu

    2007-01-01

Since many of the adverse events that have occurred in nuclear power plants in Japan and abroad have been related to maintenance or operation, it is necessary to plan preventive measures based on detailed analyses of human errors made by maintenance workers or operators. Therefore, before planning preventive measures, we developed a new method of analyzing human errors. Since each human error is an unsafe action caused by some misjudgement made by a person, we decided to classify them into six categories according to the stage in the judgment process in which the error was made. By further classifying each error into either an omission type or a commission type, we produced 12 categories of errors. Then, we divided them into the two categories of basic error tendencies and individual error tendencies, and categorized background factors into four categories: imperfect planning; imperfect facilities or tools; imperfect environment; and imperfect instructions or communication. We thus defined the factors in each category to make it easy to identify the factors that caused an error. Using this method, we studied the characteristics of human errors involving maintenance workers and planners, since many maintenance errors have occurred. Among the human errors made by workers (worker errors) during the implementation stage, the following three types were prevalent, accounting for approximately 80%: commission-type 'projection errors', omission-type 'comprehension errors' and commission-type 'action errors'. The most common among the individual factors of worker errors was 'repetition or habit' (schema), based on the assumption of a typical situation, and half of the 'repetition or habit' (schema) cases were not influenced by any background factors. The most common background factor that contributed to the individual factor was 'imperfect work environment', followed by 'insufficient knowledge'. Approximately 80% of the individual factors were 'repetition or habit' or

  9. The dependence level analysis between the human actions in NPP Operation

    International Nuclear Information System (INIS)

    Farcasiu, M.; Nitoi, M.; Apostol, M.; Florescu, G.; Prisecaru, Ilie

    2009-01-01

Human Reliability Analysis (HRA) is an important method in Probabilistic Safety Assessment (PSA) studies and offers a basis for concrete improvement of man-machine-organization interfaces, reliability and safety. An important step in HRA is the analysis of the dependence level between human actions performed by the same person or between actions performed by different persons, a step in the quantitative analysis of human error probabilities. The purpose of this paper is to develop a model to analyze the dependence level between human actions in Nuclear Power Plant (NPP) operation. The model estimates conditional human error probabilities (CHEP) and joint human error probabilities (JHEP). The sensitivity analyses performed determine how sensitive human performance is to systematic variations in the dependence level between human actions. The human error probabilities estimated in this paper are adequate values for integration both in HRA and in PSA studies performed for NPPs. This type of analysis helps in finding and analyzing ways of reducing the likelihood of human errors, so that the impact of the human factor on system availability, reliability and safety can be realistically estimated. In order to demonstrate the usability of this model, an analysis is performed of the dependencies between the human actions necessary to mitigate the consequences of LOCA events, particularly for the case of the Cernavoda NPP. (authors)
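The dependence-level machinery can be sketched with the standard THERP conditional-probability formulas (NUREG/CR-1278), which CHEP/JHEP estimates of this kind typically build on; the nominal HEP below is hypothetical.

```python
# Sketch: THERP conditional human error probabilities (NUREG/CR-1278).
# CHEP = probability the second action fails given the first one failed,
# at the five standard dependence levels.

def conditional_hep(hep, level):
    formulas = {
        "zero": lambda p: p,
        "low": lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high": lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[level](hep)

hep = 1e-3  # nominal HEP of the second action (illustrative value)
for level in ("zero", "low", "moderate", "high", "complete"):
    chep = conditional_hep(hep, level)
    # Joint probability that both actions fail: HEP1 * CHEP (HEP1 = hep here).
    print(f"{level}: CHEP = {chep:.5f}, JHEP = {hep * chep:.2e}")
```

Even "low" dependence raises a 1e-3 HEP to about 0.05, which is why the dependence level, not the nominal HEP, often dominates the joint probability.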

  10. Symbol Error Probability of DF Relay Selection over Arbitrary Nakagami-m Fading Channels

    Directory of Open Access Journals (Sweden)

    George C. Alexandropoulos

    2013-01-01

We present a new analytical expression for the moment generating function (MGF) of the end-to-end signal-to-noise ratio of dual-hop decode-and-forward (DF) relaying systems with relay selection when operating over Nakagami-m fading channels. The derived MGF expression, which is valid for arbitrary values of the fading parameters of both hops, is subsequently utilized to evaluate the average symbol error probability (ASEP) of M-ary phase shift keying modulation for the considered DF relaying scheme under various asymmetric fading conditions. It is shown that the MGF-based ASEP performance evaluation results are in excellent agreement with equivalent ones obtained by means of computer simulations, thus validating the correctness of the presented MGF expression.
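The MGF-based evaluation step can be sketched for the simplest case, a single Nakagami-m link with BPSK, where the ASEP is (1/pi) times the integral over [0, pi/2] of M_gamma(-1/sin^2 theta); the paper's dual-hop relay-selection MGF is more involved, but the numerical step is the same.

```python
# Sketch: MGF-based average BPSK symbol error probability over one
# Nakagami-m link, with the Nakagami-m SNR MGF
# M_gamma(s) = (1 - s * avg_snr / m)^(-m).
import math

def mgf_nakagami(s, avg_snr, m):
    return (1.0 - s * avg_snr / m) ** (-m)

def asep_bpsk(avg_snr, m, n=2000):
    """Midpoint-rule evaluation of (1/pi) * int_0^{pi/2} M(-1/sin^2 t) dt."""
    h = (math.pi / 2) / n
    total = 0.0
    for k in range(n):
        theta = (k + 0.5) * h
        total += mgf_nakagami(-1.0 / math.sin(theta) ** 2, avg_snr, m)
    return total * h / math.pi

# Sanity check against the Rayleigh (m = 1) closed form:
# P_e = 0.5 * (1 - sqrt(avg_snr / (1 + avg_snr))).
snr = 10.0
print(asep_bpsk(snr, m=1))
print(0.5 * (1 - math.sqrt(snr / (1 + snr))))
```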

  11. Analysis of measured data of human body based on error correcting frequency

    Science.gov (United States)

    Jin, Aiyan; Peipei, Gao; Shang, Xiaomei

    2014-04-01

Anthropometry is the measurement of all parts of the human body surface, and the measured data are the basis for analysis and study of the human body, for the establishment and modification of garment sizes, and for the formulation and implementation of online clothing stores. In this paper, several groups of measured data are obtained, and the data errors are analyzed by examining the error frequency and using the analysis of variance method from mathematical statistics. Determination of the accuracy of the measured data and of the difficulty of measuring parts of the human body, further study of the causes of data errors, and a summary of the key points for minimizing errors are also covered in the paper. This paper analyses the measured data based on error frequency and, in a way, provides reference elements to promote the development of the garment industry.

  12. Human error as a source of disturbances in Swedish nuclear power plants

    International Nuclear Information System (INIS)

    Sokolowski, E.

    1985-01-01

    Events involving human errors at the Swedish nuclear power plants are registered and periodically analyzed. The philosophy behind the scheme for data collection and analysis is discussed. Human errors cause about 10% of the disturbances registered. Only a small part of these errors are committed by operators in the control room. These and other findings differ from those in other countries. Possible reasons are put forward

  13. Quantification of human error and common-mode failures in man-machine systems

    International Nuclear Information System (INIS)

    Lisboa, J.J.

    1988-01-01

    Quantification of human performance, particularly the determination of human error, is essential for realistic assessment of overall system performance of man-machine systems. This paper presents an analysis of human errors in nuclear power plant systems when measured against common-mode failures (CMF). Human errors evaluated are improper testing, inadequate maintenance strategy, and miscalibration. The methodology presented in the paper represents a positive contribution to power plant systems availability by identifying sources of common-mode failure when operational functions are involved. It is also applicable to other complex systems such as chemical plants, aircraft and motor industries; in fact, any large man-created, man-machine system could be included

  14. A method for analysing incidents due to human errors on nuclear installations

    International Nuclear Information System (INIS)

    Griffon, M.

    1980-01-01

This paper deals with the development of a methodology adapted to a detailed analysis of incidents considered to be due to human errors. An identification of human errors and a search for their possible multiple causes is then needed. They are categorized into eight classes: education and training of personnel, installation design, work organization, time and work duration, physical environment, social environment, history of the plant and performance of the operator. The method is illustrated by the analysis of a handling incident generated by multiple human errors. (author)

  15. Research on Human-Error Factors of Civil Aircraft Pilots Based On Grey Relational Analysis

    Directory of Open Access Journals (Sweden)

    Guo Yundong

    2018-01-01

In consideration of the fact that civil aviation accidents involve many human-error factors and show the features of typical grey systems, an index system of civil aviation accident human-error factors is built using the human factors analysis and classification system model. With the data of accidents that happened worldwide between 2008 and 2011, the correlation between human-error factors can be analyzed quantitatively using the method of grey relational analysis. Research results show that the order of the main factors affecting pilot human-error factors is: preconditions for unsafe acts, unsafe supervision, organization, and unsafe acts. The factor related most closely with the second-level indexes and pilot human-error factors is the physical/mental limitations of pilots, followed by supervisory violations. The relevancy between the first-level indexes and the corresponding second-level indexes, and the relevancy between second-level indexes, can also be analyzed quantitatively.
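The grey relational analysis step can be sketched as follows; the reference and comparison series here are hypothetical, not the accident data used in the paper.

```python
# Sketch: grey relational grades of candidate factor series against a
# reference series (standard Deng grey relational coefficient, rho = 0.5).

def grey_relational_grades(reference, candidates, rho=0.5):
    # Absolute differences between each candidate series and the reference.
    deltas = [[abs(c - r) for c, r in zip(series, reference)]
              for series in candidates]
    flat = [d for row in deltas for d in row]
    d_min, d_max = min(flat), max(flat)
    grades = []
    for row in deltas:
        # Grey relational coefficient per point, then the mean as the grade.
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

reference = [1.0, 0.8, 0.9, 1.0]
candidates = [
    [0.9, 0.7, 0.95, 0.9],   # hypothetical factor A
    [0.5, 0.4, 0.6, 0.5],    # hypothetical factor B
]
print(grey_relational_grades(reference, candidates))
# A higher grade means a closer relationship to the reference series.
```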

  16. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was then applied to evaluate the error factors. Results show that (1) adverse physiological states, (2) physical/mental limitations, and (3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process remedies shortcomings in existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
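
    The ranking step can be illustrated with a crisp TOPSIS sketch; the fuzzy variant used in the paper replaces the crisp scores with triangular fuzzy numbers before the same distance-to-ideal calculation. The criteria, weights, and scores below are all invented.

```python
import numpy as np

def topsis(decision, weights, benefit):
    """Rank alternatives by closeness to the ideal solution (crisp TOPSIS)."""
    D = np.asarray(decision, float)
    D = D / np.sqrt((D ** 2).sum(axis=0))      # vector-normalize each criterion column
    V = D * np.asarray(weights, float)         # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    nadir = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - nadir) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)             # closeness coefficient, higher = better

# Three hypothetical error factors scored on four invented criteria;
# which criteria count as "benefit" type is an assumption for illustration.
cc = topsis([[7, 5, 8, 3],
             [4, 6, 5, 6],
             [8, 8, 6, 4]],
            weights=[0.3, 0.3, 0.2, 0.2],
            benefit=[True, True, False, False])
print(cc)
```

    Sorting the factors by closeness coefficient gives the priority order for improvement, which is the role TOPSIS plays in the study above.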

  17. Analysis of human error and organizational deficiency in events considering risk significance

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Kim, Yoonik; Kim, Say Hyung; Kim, Chansoo; Chung, Chang Hyun; Jung, Won Dea

    2004-01-01

    In this study, we analyzed human and organizational deficiencies in the trip events of Korean nuclear power plants. K-HPES items were used in human error analysis, and the organizational factors by Jacobs and Haber were used for organizational deficiency analysis. We proposed the use of CCDP as a risk measure to consider risk information in prioritizing K-HPES items and organizational factors. Until now, the risk significance of events has not been considered in human error and organizational deficiency analysis. Considering the risk significance of events in the process of analysis is necessary for effective enhancement of nuclear power plant safety by focusing on causes of human error and organizational deficiencies that are associated with significant risk

  18. The Countermeasures against the Human Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kwon, Ki Chun; Lee, Jung Woon; Lee, Hyun; Jang, Tong Il

    2009-10-01

    Failures of nuclear power facilities due to human error are central to accident prevention, so long-term, comprehensive countermeasures grounded in ergonomics and human-factors research are urgently required. Past efforts concentrated on the hardware aspects of nuclear facilities and brought definite improvements; attention must now turn to the human factors of the people engaged with these facilities, to ensure their safety as well as their economic and industrial performance, and improvement on this point is urgently required. The purpose of this research is to establish comprehensive medium- and long-term preventive measures that minimize the possibility of human error in nuclear power plants and other nuclear facilities by ensuring safety in its human-engineering aspects

  19. The Countermeasures against the Human Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Kwon, Ki Chun; Lee, Jung Woon; Lee, Hyun; Jang, Tong Il

    2009-10-15

    Failures of nuclear power facilities due to human error are central to accident prevention, so long-term, comprehensive countermeasures grounded in ergonomics and human-factors research are urgently required. Past efforts concentrated on the hardware aspects of nuclear facilities and brought definite improvements; attention must now turn to the human factors of the people engaged with these facilities, to ensure their safety as well as their economic and industrial performance, and improvement on this point is urgently required. The purpose of this research is to establish comprehensive medium- and long-term preventive measures that minimize the possibility of human error in nuclear power plants and other nuclear facilities by ensuring safety in its human-engineering aspects.

  20. Bayesian network models for error detection in radiotherapy plans

    International Nuclear Information System (INIS)

    Kalet, Alan M; Ford, Eric C; Phillips, Mark H; Gennari, John H

    2015-01-01

    The purpose of this study is to design and develop a probabilistic network for detecting errors in radiotherapy plans for use at the time of initial plan verification. Our group has initiated a multi-pronged approach to reduce these errors. We report on our development of Bayesian models of radiotherapy plans. Bayesian networks consist of joint probability distributions that define the probability of one event, given some set of other known information. Using the networks, we find the probability of obtaining certain radiotherapy parameters, given a set of initial clinical information. A low probability in a propagated network then corresponds to potential errors to be flagged for investigation. To build our networks we first interviewed medical physicists and other domain experts to identify the relevant radiotherapy concepts and their associated interdependencies and to construct a network topology. Next, to populate the network’s conditional probability tables, we used the Hugin Expert software to learn parameter distributions from a subset of de-identified data derived from a radiation oncology based clinical information database system. These data represent 4990 unique prescription cases over a 5 year period. Under test case scenarios with approximately 1.5% introduced error rates, network performance produced areas under the ROC curve of 0.88, 0.98, and 0.89 for the lung, brain and female breast cancer error detection networks, respectively. Comparison of the brain network to human experts performance (AUC of 0.90 ± 0.01) shows the Bayes network model performs better than domain experts under the same test conditions. Our results demonstrate the feasibility and effectiveness of comprehensive probabilistic models as part of decision support systems for improved detection of errors in initial radiotherapy plan verification procedures. (paper)
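
    The flagging logic can be sketched without a full network: a toy conditional probability P(dose | site), estimated from counts, stands in for the learned conditional probability tables. All plan data below are invented, not from the clinical database described above.

```python
from collections import Counter

# Hypothetical historical plans: (treatment_site, prescribed_dose_Gy)
plans = [("lung", 60), ("lung", 60), ("lung", 66), ("brain", 54),
         ("brain", 54), ("brain", 60), ("breast", 50), ("breast", 50)]

counts = Counter(plans)
site_totals = Counter(site for site, _ in plans)

def p_dose_given_site(dose, site):
    """Conditional probability of a dose given the treatment site, from counts."""
    return counts[(site, dose)] / site_totals[site]

def flag(dose, site, threshold=0.1):
    """Flag a plan parameter whose conditional probability is suspiciously low."""
    return p_dose_given_site(dose, site) < threshold

print(flag(60, "lung"))   # common combination -> not flagged
print(flag(50, "lung"))   # never seen for lung -> flagged
```

    A real Bayesian network generalizes this by chaining many such conditional tables, so that a low propagated probability for the full parameter set triggers investigation.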

  1. Error Probability of Binary and M-ary Signals with Spatial Diversity in Nakagami-q (Hoyt) Fading Channels

    Directory of Open Access Journals (Sweden)

    Duong Trung Q

    2007-01-01

    We analyze the exact average symbol error probability (SEP) of binary and M-ary signals with spatial diversity in Nakagami-q (Hoyt) fading channels. Maximal-ratio combining and orthogonal space-time block coding are considered as the diversity techniques for single-input multiple-output and multiple-input multiple-output systems, respectively. We obtain the average SEP in terms of the Lauricella multivariate hypergeometric function. The analysis is verified by comparison with Monte Carlo simulations, and we further show that our general SEP expressions particularize to the previously known results for the Rayleigh (q = 1) and single-input single-output (SISO) Nakagami-q cases.
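
    The Monte Carlo verification idea can be sketched for the simplest special case, BPSK over SISO Rayleigh fading (Nakagami-q with q = 1), where the average bit error probability has a classical closed form. The SNR value is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
snr_db = 10.0
gamma = 10 ** (snr_db / 10)             # average SNR per symbol
n = 200_000

# Rayleigh channel gains with unit average power, and unit-variance complex noise.
h = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
bits = rng.integers(0, 2, n)
s = 2 * bits - 1                        # BPSK symbols +/-1
r = h * s * np.sqrt(gamma) + noise
decided = (np.real(np.conj(h) * r) > 0).astype(int)   # coherent matched-filter detection
sim = np.mean(decided != bits)

theory = 0.5 * (1 - np.sqrt(gamma / (1 + gamma)))     # closed-form Rayleigh BPSK BER
print(sim, theory)
```

    The simulated and analytical values agree to within Monte Carlo noise, which is the style of check the abstract describes for the more general Hoyt expressions.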

  2. Investigations on human error hazards in recent unintended trip events of Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Jang, Tong Il; Lee, Yong Hee; Shin, Kwang Hyeon [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    According to the Operational Performance Information System (OPIS), which is operated by KINS (Korea Institute of Nuclear Safety) to improve public understanding, unintended trip events caused mainly by human error amounted to 38 cases (18.7%) from 2000 to 2011. Although the Nuclear Power Plant (NPP) industry in Korea has been making efforts to reduce the human errors that contribute heavily to trip events, the human error rate may keep increasing. Interestingly, digital-based I and C systems are one of the factors reducing unintended reactor trips; however, they have also caused human errors, because such systems demand new or changed behaviors from NPP operators. Investigations of human error therefore need a methodology that captures not only tangible behavior but also intangible behavior, such as organizational behavior. In this study we investigated human errors to find latent factors, such as decisions and conditions, in all of the unintended reactor trip events of the last twelve years. To find them, we applied HFACS (Human Factors Analysis and Classification System), a widely used tool for investigating human contributions to aviation accidents. The objective of this study is to find the latent factors behind human errors in nuclear reactor trip events; the investigation method and its results are discussed in more detail.

  3. Investigations on human error hazards in recent unintended trip events of Korean nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Jang, Tong Il; Lee, Yong Hee; Shin, Kwang Hyeon

    2012-01-01

    According to the Operational Performance Information System (OPIS), which is operated by KINS (Korea Institute of Nuclear Safety) to improve public understanding, unintended trip events caused mainly by human error amounted to 38 cases (18.7%) from 2000 to 2011. Although the Nuclear Power Plant (NPP) industry in Korea has been making efforts to reduce the human errors that contribute heavily to trip events, the human error rate may keep increasing. Interestingly, digital-based I and C systems are one of the factors reducing unintended reactor trips; however, they have also caused human errors, because such systems demand new or changed behaviors from NPP operators. Investigations of human error therefore need a methodology that captures not only tangible behavior but also intangible behavior, such as organizational behavior. In this study we investigated human errors to find latent factors, such as decisions and conditions, in all of the unintended reactor trip events of the last twelve years. To find them, we applied HFACS (Human Factors Analysis and Classification System), a widely used tool for investigating human contributions to aviation accidents. The objective of this study is to find the latent factors behind human errors in nuclear reactor trip events; the investigation method and its results are discussed in more detail

  4. Fault tree model of human error based on error-forcing contexts

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Jang, Seung Cheol; Ha, Jae Joo

    2004-01-01

    In safety-critical systems such as nuclear power plants, safety-feature actuation is fully automated. In an emergency, the human operator can also act as a backup for the automated systems; that is, failure of safety-feature-actuation signal generation implies the concurrent failure of the automated systems and of manual actuation. The operator's manual-actuation failure is largely affected by error-forcing contexts (EFC), among which failures of sensors and of the automated systems are the most important. The sensors, the automated actuation system and the human operators are correlated in a complex manner, which makes it hard to develop a proper model. In this paper, we explain the condition-based human reliability assessment (CBHRA) method, which treats these complicated conditions in a practical way, and apply it to the manual actuation of safety features such as reactor trip and safety injection in Korean Standard Nuclear Power Plants
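
    The condition-based idea can be sketched as a probability-weighted sum over error-forcing contexts feeding an AND gate. Every probability below is invented for illustration and is not taken from the paper.

```python
# Operator failure probability depends on the error-forcing context,
# here simplified to how many sensor channels have failed.
p_auto_fail = 1e-3                      # automated actuation fails
contexts = {
    # context: (P(context | automation failed), P(operator error | context))
    "sensors_ok":       (0.90, 0.01),
    "one_sensor_fail":  (0.09, 0.10),
    "all_sensors_fail": (0.01, 1.00),   # no valid indication -> manual backup fails
}
# Total probability: weight each context-specific error probability by its likelihood.
p_manual_fail = sum(p_ctx * p_err for p_ctx, p_err in contexts.values())
# AND gate: the actuation signal fails only if automation AND the operator both fail.
p_signal_fail = p_auto_fail * p_manual_fail
print(p_manual_fail, p_signal_fail)
```

    Treating the operator's failure probability as context-dependent, rather than a single constant, is the essence of conditioning the human-error event on the EFCs in the fault tree.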

  5. The psychological background about human error and safety in NPP

    International Nuclear Information System (INIS)

    Zhang Li

    1992-01-01

    Human error is one of the factors causing accidents in NPPs, and the in-situ psychological background plays an important role in inducing it. The author analyzes the structure of one's psychological background at work and gives a few examples of typical psychological backgrounds resulting in human errors. Finally, the paper points out that the fundamental way to eliminate the unfavourable psychological background of safe production is to establish a safety culture in the NPP along with its characteristics

  6. Review of human error analysis methodologies and case study for accident management

    International Nuclear Information System (INIS)

    Jung, Won Dae; Kim, Jae Whan; Lee, Yong Hee; Ha, Jae Joo

    1998-03-01

    In this research, we tried to establish the requirements for the development of a new human error analysis (HEA) method. To achieve this goal, we performed a case study in the following steps: (1) review of the existing HEA methods; (2) selection of those methods considered appropriate for analyzing operators' tasks in NPPs; (3) choice of tasks for the application. The methods selected for the case study were HRMS (Human Reliability Management System), PHECA (Potential Human Error Cause Analysis) and CREAM (Cognitive Reliability and Error Analysis Method), and the tasks chosen for the application were the 'bleed and feed operation' and 'decision-making for reactor cavity flooding' tasks. We measured the applicability of the selected methods to NPP tasks and evaluated the advantages and disadvantages of each. All three methods turned out to be applicable for the prediction of human error. We concluded that both CREAM and HRMS are sufficiently applicable to NPP tasks; comparing the two, CREAM is considered more appropriate than HRMS from the viewpoint of the overall requirements. The requirements for the new HEA method obtained from the study can be summarized as follows: first, it should deal with cognitive error analysis; second, it should have an adequate classification system for NPP tasks; third, the description of error causes and error mechanisms should be explicit; fourth, it should maintain consistency of results by minimizing ambiguity in each step of the analysis procedure; fifth, it should be usable with acceptable human resources. (author). 25 refs., 30 tabs., 4 figs

  7. Development and evaluation of a computer-aided system for analyzing human error in railway operations

    International Nuclear Information System (INIS)

    Kim, Dong San; Baek, Dong Hyun; Yoon, Wan Chul

    2010-01-01

    As human error has been recognized as one of the major contributors to accidents in safety-critical systems, there has been a strong need for techniques that can analyze human error effectively. Although many techniques have been developed so far, much room for improvement remains. Since human error analysis is a cognitively demanding and time-consuming task, it is particularly necessary to develop a computerized system supporting it. This paper presents a computer-aided system for analyzing human error in railway operations, called the Computer-Aided System for Human Error Analysis and Reduction (CAS-HEAR). It supports analysts in finding multiple levels of error causes and their causal relations by using predefined links between contextual factors and causal factors, as well as links among causal factors. In addition, it is based on a complete accident model; hence, it helps analysts conduct a thorough analysis without missing any important part of human error analysis. A prototype of CAS-HEAR was evaluated by nine field investigators from six railway organizations in Korea. Its overall usefulness in human error analysis was confirmed, although the development of a simplified version and some modification of the contextual and causal factors are required to ensure its practical use.

  8. Exact Symbol Error Probability of Cross-QAM in AWGN and Fading Channels

    Directory of Open Access Journals (Sweden)

    Zhang Xi-chun

    2010-01-01

    The exact symbol error probability (SEP) performance of M-ary cross quadrature amplitude modulation (QAM) in the additive white Gaussian noise (AWGN) channel and in fading channels, including Rayleigh, Nakagami-m, Rice, and Nakagami-q (Hoyt) channels, is analyzed. The obtained closed-form SEP expressions contain a finite sum of single integrals with finite limits and an integrand composed of elementary (exponential, trigonometric, and/or power) functions, thus readily enabling numerical evaluation. In particular, the Gaussian Q-function is a special case of these integrals and is included in the SEP expressions. Simple and very precise approximations, which contain only the Gaussian Q-function for the AWGN channel and three terms of the single integrals mentioned above for fading channels, respectively, are also given. The analytical expressions show excellent agreement with simulation results, and numerical evaluation with the proposed expressions reveals that cross QAM obtains at least a 1.1 dB gain compared to rectangular QAM when SEP < 0.3 in all the considered channels.
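
    The agreement between analytical SEP expressions and simulation can be reproduced in miniature for square 16-QAM in AWGN, whose SEP has a standard Q-function form (cross QAM changes the constellation, not the verification approach). The SNR value is arbitrary.

```python
import numpy as np
from math import erfc, sqrt

Q = lambda x: 0.5 * erfc(x / sqrt(2))   # Gaussian Q-function

M, snr_db = 16, 14.0
es_n0 = 10 ** (snr_db / 10)             # symbol SNR Es/N0
rng = np.random.default_rng(1)
n = 200_000

levels = np.array([-3, -1, 1, 3]) / np.sqrt(10)        # unit average symbol energy
i = rng.integers(0, 4, n)
q = rng.integers(0, 4, n)
s = levels[i] + 1j * levels[q]
r = s + (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2 * es_n0)
det = lambda x: np.abs(x[:, None] - levels[None, :]).argmin(axis=1)  # nearest level
sim = np.mean((det(r.real) != i) | (det(r.imag) != q))

# Standard square-QAM SEP: per-dimension error p, combined over I and Q.
p = 2 * (1 - 1 / np.sqrt(M)) * Q(np.sqrt(3 * es_n0 / (M - 1)))
theory = 1 - (1 - p) ** 2
print(sim, theory)
```

    The cross-QAM expressions in the paper are verified the same way, with the square constellation replaced by the cross-shaped one.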

  9. Human medial frontal cortex activity predicts learning from errors.

    Science.gov (United States)

    Hester, Robert; Barre, Natalie; Murphy, Kevin; Silk, Tim J; Mattingley, Jason B

    2008-08-01

    Learning from errors is a critical feature of human cognition. It underlies our ability to adapt to changing environmental demands and to tune behavior for optimal performance. The posterior medial frontal cortex (pMFC) has been implicated in the evaluation of errors to control behavior, although it has not previously been shown that activity in this region predicts learning from errors. Using functional magnetic resonance imaging, we examined activity in the pMFC during an associative learning task in which participants had to recall the spatial locations of 2-digit targets and were provided with immediate feedback regarding accuracy. Activity within the pMFC was significantly greater for errors that were subsequently corrected than for errors that were repeated. Moreover, pMFC activity during recall errors predicted future responses (correct vs. incorrect), despite a sizeable interval (on average 70 s) between an error and the next presentation of the same recall probe. Activity within the hippocampus also predicted future performance and correlated with error-feedback-related pMFC activity. A relationship between performance expectations and pMFC activity, in the absence of differing reinforcement value for errors, is consistent with the idea that error-related pMFC activity reflects the extent to which an outcome is "worse than expected."

  10. Trial application of a technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Bley, D.C.; Cooper, S.E.; Parry, G.W.

    1996-01-01

    The new method for HRA, ATHEANA, has been developed based on a study of the operating history of serious accidents and an understanding of the reasons why people make errors. Previous publications associated with the project have dealt with the theoretical framework under which errors occur and the retrospective analysis of operational events. This is the first attempt to use ATHEANA in a prospective way, to select and evaluate human errors within the PSA context

  11. Human error and the problem of causality in analysis of accidents

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1990-01-01

    Present technology is characterized by complexity, rapid change and growing size of technical systems. This has caused increasing concern with the human involvement in system safety. Analyses of the major accidents during recent decades have concluded that human errors on the part of operators, designers or managers have played a major role. There are, however, several basic problems in the analysis of accidents and the identification of human error. This paper addresses the nature of causal explanations and the ambiguity of the rules applied for identification of the events to include in analysis...

  12. Dynamic encoding of speech sequence probability in human temporal cortex.

    Science.gov (United States)

    Leonard, Matthew K; Bouchard, Kristofer E; Tang, Claire; Chang, Edward F

    2015-05-06

    Sensory processing involves identification of stimulus features, but also integration with the surrounding sensory and cognitive context. Previous work in animals and humans has shown fine-scale sensitivity to context in the form of learned knowledge about the statistics of the sensory environment, including relative probabilities of discrete units in a stream of sequential auditory input. These statistics are a defining characteristic of one of the most important sequential signals humans encounter: speech. For speech, extensive exposure to a language tunes listeners to the statistics of sound sequences. To address how speech sequence statistics are neurally encoded, we used high-resolution direct cortical recordings from human lateral superior temporal cortex as subjects listened to words and nonwords with varying transition probabilities between sound segments. In addition to their sensitivity to acoustic features (including contextual features, such as coarticulation), we found that neural responses dynamically encoded the language-level probability of both preceding and upcoming speech sounds. Transition probability first negatively modulated neural responses, followed by positive modulation of neural responses, consistent with coordinated predictive and retrospective recognition processes, respectively. Furthermore, transition probability encoding was different for real English words compared with nonwords, providing evidence for online interactions with high-order linguistic knowledge. These results demonstrate that sensory processing of deeply learned stimuli involves integrating physical stimulus features with their contextual sequential structure. Despite not being consciously aware of phoneme sequence statistics, listeners use this information to process spoken input and to link low-level acoustic representations with linguistic information about word identity and meaning. Copyright © 2015 the authors.

  13. An experimental approach to validating a theory of human error in complex systems

    Science.gov (United States)

    Morris, N. M.; Rouse, W. B.

    1985-01-01

    The problem of 'human error' is pervasive in engineering systems in which the human is involved. In contrast to the common engineering approach of dealing with error probabilistically, the present research seeks to alleviate problems associated with error by gaining a greater understanding of causes and contributing factors from a human information processing perspective. The general approach involves identifying conditions which are hypothesized to contribute to errors, and experimentally creating the conditions in order to verify the hypotheses. The conceptual framework which serves as the basis for this research is discussed briefly, followed by a description of upcoming research. Finally, the potential relevance of this research to design, training, and aiding issues is discussed.

  14. Human error mode identification for NPP main control room operations using soft controls

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jaewhan; Jang, Seung-Cheol

    2011-01-01

    The operation environment of main control rooms (MCRs) in modern nuclear power plants (NPPs) has considerably changed over the years. Advanced MCRs, which have been designed by adapting digital and computer technologies, have simpler interfaces using large display panels, computerized displays, soft controls, computerized procedure systems, and so on. The actions for the NPP operations are performed using soft controls in advanced MCRs. Soft controls have different features from conventional controls. Operators need to navigate the screens to find indicators and controls and manipulate controls using a mouse, touch screens, and so on. Due to these different interfaces, different human errors should be considered in the human reliability analysis (HRA) for advanced MCRs. In this work, human errors that could occur during operation executions using soft controls were analyzed. This work classified the human errors in soft controls into six types, and the reasons that affect the occurrence of the human errors were also analyzed. (author)

  15. Type I error probability spending for post-market drug and vaccine safety surveillance with binomial data.

    Science.gov (United States)

    Silva, Ivair R

    2018-01-15

    Type I error probability spending functions are commonly used for designing sequential analyses of binomial data in clinical trials, and they are quickly emerging for the near-continuous sequential analysis of post-market drug and vaccine safety surveillance. It is well known that in clinical trials, when the null hypothesis is not rejected, it is still important to minimize the sample size; in post-market drug and vaccine safety surveillance, that is not important. In post-market safety surveillance, especially when the surveillance involves identification of potential signals, the meaningful statistical performance measure to minimize is the expected sample size when the null hypothesis is rejected. The present paper shows that, instead of the convex Type I error spending shape conventionally used in clinical trials, a concave shape is more suitable for post-market drug and vaccine safety surveillance. This is shown for both continuous and group sequential analysis. Copyright © 2017 John Wiley & Sons, Ltd.
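
    The convex-versus-concave contrast can be sketched with the power family of spending functions, alpha * t**rho, a common illustrative choice (the paper's specific functions may differ): rho > 1 spends Type I error late, rho < 1 spends it early.

```python
alpha = 0.05

def spent(t, rho):
    """Cumulative Type I error probability spent by information fraction t (power family)."""
    return alpha * t ** rho

looks = [0.2, 0.4, 0.6, 0.8, 1.0]
for rho, label in [(3.0, "convex"), (0.5, "concave")]:
    cum = [spent(t, rho) for t in looks]
    inc = [b - a for a, b in zip([0.0] + cum[:-1], cum)]   # alpha spent at each look
    print(label, [round(x, 4) for x in inc])
```

    Both shapes spend exactly alpha by the final look; the concave shape front-loads the spending, favoring early rejection, which matches the surveillance goal of minimizing expected sample size when the null is rejected.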

  16. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. Here, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment; the nature of human error; classification of errors in man-machine systems; practical aspects; human reliability modelling in complex situations; quantification and examination of human reliability; judgement-based approaches; holistic techniques; and decision analytic approaches. (UK)

  17. An Empirical Study on Human Performance according to the Physical Environment (Potential Human Error Hazard) in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, In Seok; Seong, Proong Hyun

    2014-01-01

    Managing the physical environment for safety is particularly important in the nuclear industry. Even when physical environmental factors such as lighting and noise satisfy management standards, they can still act as background factors that cause human error and affect human performance. Because the consequences of human error and degraded human performance driven by the physical environment can be severe, the requirement standards should include specific criteria. In particular, in order to avoid human errors caused by extremely low or rapidly changing illumination intensity and by masking effects such as those accompanying a power disconnection, plans for a better visual environment and better task performance should be made, and a careful study on efficient ways to manage and maintain better conditions should be conducted

  18. Understanding Human Error in Naval Aviation Mishaps.

    Science.gov (United States)

    Miranda, Andrew T

    2018-04-01

    To better understand the external factors that influence the performance and decisions of aviators involved in Naval aviation mishaps. Mishaps in complex activities, ranging from aviation to nuclear power operations, are often the result of interactions between multiple components within an organization. The Naval aviation mishap database contains relevant information, both in quantitative statistics and qualitative reports, that permits analysis of such interactions to identify how the working atmosphere influences aviator performance and judgment. Results from 95 severe Naval aviation mishaps that occurred from 2011 through 2016 were analyzed using Bayes' theorem probability formula. Then a content analysis was performed on a subset of relevant mishap reports. Out of the 14 latent factors analyzed, the Bayes' application identified 6 that impacted specific aspects of aviator behavior during mishaps. Technological environment, misperceptions, and mental awareness impacted basic aviation skills. The remaining 3 factors were used to inform a content analysis of the contextual information within mishap reports. Teamwork failures were the result of plan continuation aggravated by diffused responsibility. Resource limitations and risk management deficiencies impacted judgments made by squadron commanders. The application of Bayes' theorem to historical mishap data revealed the role of latent factors within Naval aviation mishaps. Teamwork failures were seen to be considerably damaging to both aviator skill and judgment. Both the methods and findings have direct application for organizations interested in understanding the relationships between external factors and human error. It presents real-world evidence to promote effective safety decisions.
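
    The Bayes' theorem step can be sketched directly; the rates below are invented for illustration, not taken from the Naval mishap database.

```python
def bayes(p_e_given_f, p_f, p_e):
    """Posterior P(factor | error) = P(error | factor) * P(factor) / P(error)."""
    return p_e_given_f * p_f / p_e

# Hypothetical rates estimated from mishap counts:
p_factor = 0.30              # P(teamwork failure present in a mishap)
p_error = 0.40               # P(mishap involved a skill-based error)
p_error_given_factor = 0.80  # P(skill-based error | teamwork failure present)

posterior = bayes(p_error_given_factor, p_factor, p_error)
print(posterior)  # P(teamwork failure | skill-based error) = 0.6
```

    Comparing such posteriors against the base rate P(factor) is one way a latent factor is identified as impacting a specific aspect of aviator behavior.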

  19. Review of advances in human reliability analysis of errors of commission-Part 2: EOC quantification

    International Nuclear Information System (INIS)

    Reer, Bernhard

    2008-01-01

    In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 2 is presented in this article. Emerging HRA methods in this field are: ATHEANA, MERMOS, the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the MDTA method and CREAM. The essential advanced features are on the conceptual side, especially to envisage the modeling of multiple contexts for an EOC to be quantified (ATHEANA, MERMOS and MDTA), in order to explicitly address adverse conditions. There is promising progress in providing systematic guidance to better account for cognitive demands and tendencies (GRS, CREAM), and EOC recovery (MDTA). Problematic issues are associated with the implementation of multiple context modeling and the assessment of context-specific error probabilities. Approaches for task or error opportunity scaling (CREAM, GRS) and the concept of reference cases (ATHEANA outlook) provide promising orientations for achieving progress towards data-based quantification. Further development work is needed and should be carried out in close connection with large-scale applications of existing approaches

  20. Trend analysis of human error events and assessment of their proactive prevention measure at Rokkasho reprocessing plant

    International Nuclear Information System (INIS)

    Yamazaki, Satoru; Tanaka, Izumi; Wakabayashi, Toshio

    2012-01-01

    A trend analysis of human error events is important for preventing the recurrence of human error events. We propose a new method for identifying common characteristics from the results of trend analysis, such as the latent weaknesses of an organization, and a management process for strategic error prevention. In this paper, we describe a trend analysis method for human error events that have been accumulated in the organization and the utilization of the results of trend analysis to prevent accidents proactively. Although the systematic analysis of human error events, the monitoring of their overall trend, and the utilization of the analyzed results have been examined for plant operation, such information has never been utilized completely. Sharing information on human error events and analyzing their causes lead to the clarification of problems in management and human factors. This new method was applied to the human error events that occurred in the Rokkasho reprocessing plant from October 2010. Results revealed that the output of this method is effective in judging the error prevention plan and that the number of human error events was reduced to about 50% of that observed in 2009 and 2010. (author)

  1. Interactive analysis of human error factors in NPP operation events

    International Nuclear Information System (INIS)

    Zhang Li; Zou Yanhua; Huang Weigang

    2010-01-01

    Interactions of human error factors in NPP operation events are introduced, and 645 WANO operation event reports from 1999 to 2008 were analyzed, among which 432 were found to be related to human errors. After classifying these errors by root causes or causal factors and applying SPSS for correlation analysis, we concluded: (1) Personnel work practices are restricted by many factors; forming good personnel work practices is systematic work which needs support in many aspects. (2) Verbal communications, personnel work practices, man-machine interface, and written procedures and documents play great roles. They are four interacting factors which often come in a bundle; if improvements need to be made to one of them, synchronous measures are also necessary for the others. (3) Management direction and decision process, which are related to management, have a significant interaction with personnel factors. (authors)

  2. Coping with human errors through system design: Implications for ecological interface design

    DEFF Research Database (Denmark)

    Rasmussen, Jens; Vicente, Kim J.

    1989-01-01

    Research during recent years has revealed that human errors are not stochastic events which can be removed through improved training programs or optimal interface design. Rather, errors tend to reflect either systematic interference between various models, rules, and schemata, or the effects of the adaptive mechanisms involved in learning. In terms of design implications, these findings suggest that reliable human-system interaction will be achieved by designing interfaces which tend to minimize the potential for control interference and support recovery from errors. In other words, the focus should be on control of the effects of errors rather than on the elimination of errors per se. In this paper, we propose a theoretical framework for interface design that attempts to satisfy these objectives. The goal of our framework, called ecological interface design, is to develop a meaningful representation...

  3. Support of protective work of human error in a nuclear power plant

    International Nuclear Information System (INIS)

    Yoshizawa, Yuriko

    1999-01-01

    The nuclear power plant human factors group of the Tokyo Electric Power Co., Ltd. supports various human error prevention work conducted at nuclear power plants. Its main research themes are studies on human factors in nuclear power plant operation, on error recovery, and common basic studies on human factors. In addition, on the basis of the information obtained, the group assists the human error prevention work conducted at the plants and develops it for practical use. In particular, for activities that share hazard information, various forms of assistance have been promoted: a proposal of a case-analysis method for understanding hazard information substantively rather than superficially, construction of a database for conveniently sharing such hazard information, and a survey of day-to-day work not involving accidents, for hints on effective promotion of the prevention work. This paper introduces the assistance and investigation for effective sharing of hazard information in the human error prevention activities conducted mainly at nuclear power plants. (G.K.)

  4. [Medical errors: inevitable but preventable].

    Science.gov (United States)

    Giard, R W

    2001-10-27

    Medical errors are increasingly reported in the lay press. Studies have shown dramatic error rates of 10 percent or even higher. From a methodological point of view, studying the frequency and causes of medical errors is far from simple. Clinical decisions on diagnostic or therapeutic interventions are always taken within a clinical context. Reviewing outcomes of interventions without taking into account both the intentions and the arguments for a particular action will limit the conclusions from a study on the rate and preventability of errors. The interpretation of the preventability of medical errors is fraught with difficulties and probably highly subjective. Blaming the doctor personally does not do justice to the actual situation and especially the organisational framework. Attention to and improvement of the organisational aspects of error are far more important than litigating the person. To err is and will remain human, and if we want to reduce the incidence of faults we must be able to learn from our mistakes. That requires an open attitude towards medical mistakes, a continuous effort in their detection, a sound analysis and, where feasible, the institution of preventive measures.

  5. Operator errors

    International Nuclear Information System (INIS)

    Knuefer; Lindauer

    1980-01-01

    In spectacular events, a combination of component failure and human error is often found. The Rasmussen Report and the German Risk Assessment Study in particular show for pressurised water reactors that human error must not be underestimated. Although operator errors as a form of human error can never be eliminated entirely, they can be minimized and their effects kept within acceptable limits if a thorough training of personnel is combined with an adequate design of the plant against accidents. Contrary to the investigation of engineering errors, the investigation of human errors has so far been carried out with relatively small budgets. Intensified investigations in this field appear to be a worthwhile effort. (orig.)

  6. Human factors interventions to reduce human errors and improve productivity in maintenance tasks

    International Nuclear Information System (INIS)

    Isoda, Hachiro; Yasutake, J.Y.

    1992-01-01

    This paper describes work in progress to develop interventions to reduce human errors and increase maintenance productivity in nuclear power plants. The effort is part of a two-phased Human Factors research program being conducted jointly by the Central Research Institute of Electric Power Industry (CRIEPI) in Japan and the Electric Power Research Institute (EPRI) in the United States. The overall objective of this joint research program is to identify critical maintenance tasks and to develop, implement and evaluate interventions which have high potential for reducing human errors or increasing maintenance productivity. As a result of the Phase 1 effort, ten critical maintenance tasks were identified. For these tasks, over 25 candidate interventions were identified for potential development. After careful analysis, seven interventions were selected for development during Phase 2. This paper describes the methodology used to analyze and identify the most critical tasks, the process of identifying and developing selected interventions and some of the initial results. (author)

  7. Self-assessment of human performance errors in nuclear operations

    International Nuclear Information System (INIS)

    Chambliss, K.V.

    1996-01-01

    One of the most important approaches to improving nuclear safety is to have an effective self-assessment process in place, whose cornerstone is the identification and improvement of human performance errors. Experience has shown that significant events usually have had precursors of human performance errors. If these precursors are left uncorrected or not understood, the symptoms recur and result in unanticipated events of greater safety significance. The Institute of Nuclear Power Operations (INPO) has been championing the cause of promoting excellence in human performance in the nuclear industry. INPO's report, "Excellence in Human Performance," emphasizes the importance of several factors that play a role in human performance. They include individual, supervisory, and organizational behaviors; real-time feedback that results in specific behavior to produce safe and reliable performance; and proactive measures that remove obstacles from excellent human performance. Zack Pate, chief executive officer and president of INPO, in his report, "The Control Room," provides an excellent discussion of serious events in the nuclear industry since 1994 and compares them with the results from a recent study by the National Transportation Safety Board of airline accidents in the 12-yr period from 1978 to 1990 to draw some common themes that relate to human performance issues in the control room

  8. Exploring human error in military aviation flight safety events using post-incident classification systems.

    Science.gov (United States)

    Hooper, Brionny J; O'Hare, David P A

    2013-08-01

    Human error classification systems theoretically allow researchers to analyze postaccident data in an objective and consistent manner. The Human Factors Analysis and Classification System (HFACS) framework is one such practical analysis tool that has been widely used to classify human error in aviation. The Cognitive Error Taxonomy (CET) is another. It has been postulated that the focus on interrelationships within HFACS can facilitate the identification of the underlying causes of pilot error. The CET provides increased granularity at the level of unsafe acts. The aim was to analyze the influence of factors at higher organizational levels on the unsafe acts of front-line operators and to compare the errors of fixed-wing and rotary-wing operations. This study analyzed 288 aircraft incidents involving human error from an Australasian military organization occurring between 2001 and 2008. Action errors accounted for almost twice (44%) the proportion of rotary wing compared to fixed wing (23%) incidents. Both classificatory systems showed significant relationships between precursor factors such as the physical environment, mental and physiological states, crew resource management, training and personal readiness, and skill-based, but not decision-based, acts. The CET analysis showed different predisposing factors for different aspects of skill-based behaviors. Skill-based errors in military operations are more prevalent in rotary wing incidents and are related to higher level supervisory processes in the organization. The Cognitive Error Taxonomy provides increased granularity to HFACS analyses of unsafe acts.

  9. A Conceptual Framework for Predicting Error in Complex Human-Machine Environments

    Science.gov (United States)

    Freed, Michael; Remington, Roger; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    We present a Goals, Operators, Methods, and Selection Rules-Model Human Processor (GOMS-MHP) style model-based approach to the problem of predicting human habit capture errors. Habit captures occur when the model fails to allocate limited cognitive resources to retrieve task-relevant information from memory. Lacking the unretrieved information, decision mechanisms act in accordance with implicit default assumptions, resulting in error when relied upon assumptions prove incorrect. The model helps interface designers identify situations in which such failures are especially likely.

  10. A Comparison of Error Bounds for a Nonlinear Tracking System with Detection Probability Pd < 1

    Science.gov (United States)

    Tong, Huisi; Zhang, Hao; Meng, Huadong; Wang, Xiqin

    2012-01-01

    Error bounds for nonlinear filtering are very important for performance evaluation and sensor management. This paper presents a comparative study of three error bounds for tracking filtering when the detection probability is less than unity. One of these bounds is the random finite set (RFS) bound, which is deduced within the framework of finite set statistics. The others, the information reduction factor (IRF) posterior Cramer-Rao lower bound (PCRLB) and the enumeration method (ENUM) PCRLB, are introduced within the framework of finite vector statistics. In this paper, we deduce two propositions and prove that the RFS bound is equal to the ENUM PCRLB, while it is tighter than the IRF PCRLB, when the target exists from the beginning to the end. Considering the disappearance of existing targets and the appearance of new targets, the RFS bound becomes tighter than both the IRF PCRLB and the ENUM PCRLB over time, by incorporating the uncertainty of target existence. The theory is illustrated by two nonlinear tracking applications: ballistic object tracking and bearings-only tracking. The simulation studies confirm the theory and reveal the relationship among the three bounds. PMID:23242274

  11. Classification of resistance to passive motion using minimum probability of error criterion.

    Science.gov (United States)

    Chan, H C; Manry, M T; Kondraske, G V

    1987-01-01

    Neurologists diagnose many muscular and nerve disorders by classifying the resistance to passive motion of patients' limbs. Over the past several years, a computer-based instrument has been developed for automated measurement and parameterization of this resistance. In the device, a voluntarily relaxed lower extremity is moved at constant velocity by a motorized driver. The torque exerted on the extremity by the machine is sampled, along with the angle of the extremity. In this paper a computerized technique is described for classifying a patient's condition as 'Normal' or 'Parkinson disease' (rigidity), from the torque versus angle curve for the knee joint. A Legendre polynomial, fit to the curve, is used to calculate a set of eight normally distributed features of the curve. The minimum probability of error approach is used to classify the curve as being from a normal or Parkinson disease patient. Data collected from 44 different subjects were processed and the results were compared with an independent physician's subjective assessment of rigidity. There is agreement in better than 95% of the cases when all of the features are used.
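    The minimum-probability-of-error decision rule described above can be sketched as a Bayes classifier over normally distributed features; the per-class statistics and priors below are invented for illustration, not the study's Legendre-coefficient features.

```python
import numpy as np

def classify(x, stats, priors):
    """Pick the class maximizing log P(x|class) + log P(class),
    assuming independent Gaussian features within each class."""
    best_label, best_score = None, -np.inf
    for label, (mu, var) in stats.items():
        log_lik = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
        score = log_lik + np.log(priors[label])
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Hypothetical per-class means and variances of two curve features:
stats = {
    "normal":    (np.array([0.2, 1.0]), np.array([0.05, 0.2])),
    "parkinson": (np.array([0.8, 2.5]), np.array([0.05, 0.3])),
}
priors = {"normal": 0.5, "parkinson": 0.5}
print(classify(np.array([0.75, 2.4]), stats, priors))  # -> parkinson
```

    With equal priors and equal misclassification costs, maximizing this posterior score minimizes the probability of error.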

  12. Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions

    Science.gov (United States)

    2018-03-20

    The information presented was obtained through a request to use the U.S. Army Combat Readiness Center's Risk Management ... The human error contributions identified included controlled flight into terrain (13 accidents), fueling errors by improper techniques (7 accidents), and a variety of maintenance errors (10 accidents), ... and 9 of the 10 maintenance accidents. (Table 4 of the report tabulates frequencies based on the source of human error.)

  13. In-plant reliability data base for nuclear plant components: a feasibility study on human error information

    International Nuclear Information System (INIS)

    Borkowski, R.J.; Fragola, J.R.; Schurman, D.L.; Johnson, J.W.

    1984-03-01

    This report documents the procedure and final results of a feasibility study which examined the usefulness of nuclear plant maintenance work requests in the IPRDS as tools for understanding human error and its influence on component failure and repair. Developed in this study were (1) a set of criteria for judging the quality of a plant maintenance record set for studying human error; (2) a scheme for identifying human errors in the maintenance records; and (3) two taxonomies (engineering-based and psychology-based) for categorizing and coding human error-related events

  14. Human error: An essential problem of nuclear power plants

    International Nuclear Information System (INIS)

    Smidt, D.

    1981-01-01

    The author first defines the part played by man in the nuclear power plant and then deals in more detail with the structure of his false behavior in tactical and strategic respects. The discussion of tactical errors and their avoidance is followed by a report on the actual state of plant technology and possible improvements. Subsequently, a study of the strategic errors, including the conclusions to be drawn so far (the interface between plant and man, personnel selection and education), is made. If the interfaces between man and machine are designed accordingly, and the physiological strengths and weaknesses of man are fully realized and taken into account, human error need not be an essential problem in nuclear power plants. (GL) [de

  15. A trend analysis of human error events for proactive prevention of accidents. Methodology development and effective utilization

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Ebisu, Mitsuhiro; Aikawa, Takeshi; Matsubara, Katsuyuki

    2006-01-01

    This paper describes methods for analyzing human error events that have been accumulated in an individual plant and for utilizing the results to prevent accidents proactively. Firstly, a categorization framework for trigger actions and causal factors of human error events was reexamined, and the procedure to analyze human error events was reviewed based on the framework. Secondly, a method for identifying the common characteristics of trigger action data and of causal factor data accumulated by analyzing human error events was clarified. In addition, to utilize the results of trend analysis effectively, methods to develop teaching material for safety education, to develop checkpoints for error prevention, and to introduce an error management process for strategic error prevention were proposed. (author)

  16. Application of grey incidence analysis to connection between human errors and root cause

    International Nuclear Information System (INIS)

    Ren Yinxiang; Yu Ren; Zhou Gang; Chen Dengke

    2008-01-01

    By introducing grey incidence analysis, the relative importance of the impact of root causes upon human errors was researched in this paper. On the basis of WANO statistical data and grey incidence analysis, lack of alternate examination, bad basic operation, shortage of theoretical knowledge, laxity of organization and management, and deficiency of regulations are the root causes with an important influence on human errors. Finally, the question of how to reduce human errors is discussed. (authors)

  17. Science, practice, and human errors in controlling Clostridium botulinum in heat-preserved food in hermetic containers.

    Science.gov (United States)

    Pflug, Irving J

    2010-05-01

    The incidence of botulism in canned food in the last century is reviewed along with the background science; a few conclusions are reached based on analysis of published data. There are two primary aspects to botulism control: the design of an adequate process and the delivery of the adequate process to containers of food. The probability that the designed process will not be adequate to control Clostridium botulinum is very small, probably less than 1.0 x 10(-6), based on containers of food, whereas the failure of the operator of the processing equipment to deliver the specified process to containers of food may be of the order of 1 in 40, to 1 in 100, based on processing units (retort loads). In the commercial food canning industry, failure to deliver the process will probably be of the order of 1.0 x 10(-4) to 1.0 x 10(-6) when U.S. Food and Drug Administration (FDA) regulations are followed. Botulism incidents have occurred in food canning plants that have not followed the FDA regulations. It is possible but very rare to have botulism result from postprocessing contamination. It may thus be concluded that botulism incidents in canned food are primarily the result of human failure in the delivery of the designed or specified process to containers of food that, in turn, result in the survival, outgrowth, and toxin production of C. botulinum spores. Therefore, efforts in C. botulinum control should be concentrated on reducing human errors in the delivery of the specified process to containers of food.
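    The two probabilities quantified above combine as independent failure modes; a back-of-envelope sketch using the abstract's order-of-magnitude figures shows why delivery failures dominate the overall risk.

```python
# Order-of-magnitude sketch combining the two control aspects above;
# values are the abstract's rough figures, per container of food.
p_design_fail = 1.0e-6     # designed process inadequate to control C. botulinum
p_delivery_fail = 1.0e-5   # specified process not delivered (FDA-regulated case)
p_total = 1 - (1 - p_design_fail) * (1 - p_delivery_fail)
print(f"total = {p_total:.2e}, delivery share = {p_delivery_fail / p_total:.0%}")
```

    Even at the favorable end of the delivery-failure range, the human delivery step contributes roughly ten times the risk of process design, which is the abstract's argument for concentrating control effort there.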

  18. Learning time-dependent noise to reduce logical errors: real time error rate estimation in quantum error correction

    Science.gov (United States)

    Huo, Ming-Xia; Li, Ying

    2017-12-01

    Quantum error correction is important to quantum information processing, which allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from the knowledge of error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. Any adaptation of the quantum error correction code or its implementation circuit is not required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. surface code. A Gaussian processes algorithm is used to estimate and predict error rates based on error correction data in the past. We find that using these estimated error rates, the probability of error correction failures can be significantly reduced by a factor increasing with the code distance.
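    The estimation step of the protocol can be sketched as plain Gaussian-process regression over past error-correction rounds; the RBF kernel, hyperparameters, and simulated drift below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def gp_predict(t_train, y_train, t_query, length=5.0, sigma_f=0.1, sigma_n=0.02):
    """GP posterior mean with an RBF kernel (zero prior mean)."""
    def k(a, b):
        return sigma_f ** 2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = k(t_train, t_train) + sigma_n ** 2 * np.eye(len(t_train))
    alpha = np.linalg.solve(K, y_train)
    return k(t_query, t_train) @ alpha

rng = np.random.default_rng(0)
t = np.arange(20.0)                                # past error-correction rounds
true_rate = 0.05 + 0.001 * t                       # slowly drifting physical error rate
obs = true_rate + 0.01 * rng.standard_normal(20)   # noisy per-round rate estimates
pred = gp_predict(t, obs, np.array([20.0]))        # forecast for the next round
print(f"predicted error rate for the next round: {pred[0]:.3f}")
```

    Feeding such forecasts back into the decoder, rather than assuming a fixed error rate, is what reduces the logical failure probability in the paper's scheme.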

  19. Estimation of long-term probabilities for inadvertent intrusion into radioactive waste management areas

    International Nuclear Information System (INIS)

    Eedy, W.; Hart, D.

    1988-05-01

    The risk to human health from radioactive waste management sites can be calculated as the product of the probability of accidental exposure (intrusion) times the probability of a health effect from such exposure. This report reviews the literature and evaluates methods used to predict the probabilities for unintentional intrusion into radioactive waste management areas in Canada over a 10,000-year period. Methods to predict such probabilities are available. They generally assume a long-term stability in terms of existing resource uses and society in the management area. The major potential for errors results from the unlikeliness of these assumptions holding true over such lengthy periods of prediction

  20. THERP and HEART integrated methodology for human error assessment

    Science.gov (United States)

    Castiglia, Francesco; Giardina, Mariarosa; Tomarchio, Elio

    2015-11-01

    THERP and HEART integrated methodology is proposed to investigate accident scenarios that involve operator errors during high-dose-rate (HDR) treatments. The new approach has been modified on the basis of fuzzy set concept with the aim of prioritizing an exhaustive list of erroneous tasks that can lead to patient radiological overexposures. The results allow for the identification of human errors that are necessary to achieve a better understanding of health hazards in the radiotherapy treatment process, so that it can be properly monitored and appropriately managed.
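    The HEART side of the integrated method rests on the standard HEART quantification formula: a generic error probability is multiplied by a factor for each error-producing condition (EPC), weighted by its assessed proportion of affect (APOA). The task and numbers below are illustrative, not from the paper, and the fuzzy prioritization layer is omitted.

```python
# Sketch of standard HEART quantification; generic error probabilities,
# EPC multipliers, and APOA weights below are invented for illustration.
def heart_hep(generic_ep, epcs):
    """HEP = GEP * product over EPCs of ((max_effect - 1) * apoa + 1)."""
    hep = generic_ep
    for max_effect, apoa in epcs:
        hep *= (max_effect - 1.0) * apoa + 1.0
    return min(hep, 1.0)  # a probability cannot exceed 1

# e.g. a routine task (GEP = 0.003) under time shortage (x11, weight 0.4)
# and operator inexperience (x3, weight 0.2):
hep = heart_hep(0.003, [(11, 0.4), (3, 0.2)])
print(f"HEP = {hep:.4f}")  # 0.003 * 5.0 * 1.4 = 0.0210
```

    Ranking tasks by such HEP values is what lets the method prioritize the erroneous tasks most likely to lead to patient overexposure.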

  1. An investigation on unintended reactor trip events in terms of human error hazards of Korean nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Lee, Yong Hee; Jang, Tong Il; Oh, Yeon Ju; Shin, Kwang Hyeon

    2014-01-01

    Highlights: • A methodology to identify human error hazards has been established. • The proposed methodology is a preventive approach that identifies not only human error causes but also their hazards. • Using the HFACS framework we tried to find not causations but all of the hazards and the relationships among them. • We determined countermeasures against human errors by dealing with latent factors such as organizational influences. - Abstract: A new approach for finding the hazards of human errors, and not just their causes, is currently required in the nuclear industry, because finding the causes of human errors is practically impossible owing to the multiplicity of causes in each case. Thus, this study aims at identifying the relationships among human error hazards and determining strategies for preventing human error events by means of a reanalysis of the reactor trip events in Korean NPPs. We investigated human errors to find latent factors, such as decisions and conditions, in all of the unintended reactor trip events during the last dozen years. We applied HFACS (Human Factors Analysis and Classification System), a commonly utilized tool for investigating human contributions to aviation accidents under a widespread evaluation scheme. Using the HFACS framework, we tried to find not the causations but all of the hazards and their relationships in terms of organizational factors. Through this trial, we obtained not only meaningful frequencies of each hazard but also their correlations. Considering these correlations, we suggest useful strategies to prevent human error events. The method to investigate unintended nuclear reactor trips caused by human errors and the results are discussed in more detail

  2. Human reliability analysis data obtainment through fuzzy logic in nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Mesquita, R.N. de

    2012-01-01

    Highlights: ► Human Error Probability estimates from operators' reactions to emergency situations. ► Human Reliability Analysis input data obtainment through fuzzy logic inference. ► Evaluation of the level of influence of Performance Shaping Factors on operators' actions. - Abstract: Human error has been recognized as an important factor in the occurrence of many industrial and nuclear accidents. Human error data are scarcely available for different reasons, among which lapses in historical database registry methodology are an important one. Human Reliability Analysis (HRA) is a usual tool employed to estimate the probability that an operator will reasonably perform a system-required task in the required time without degrading the system. This analysis requires specific Human Error Probability estimates for most of its procedure. This work obtains Human Error Probability (HEP) estimates from operators' actions in response to hypothetical emergency situations on the Research Reactor IEA-R1 at IPEN, Brazil. Through the proposed methodology, HRA can be performed even with a shortage of related human error statistical data. An evaluation of Performance Shaping Factors (PSFs) was also done in order to classify and estimate their level of influence on operators' actions and to determine their actual state in the plant. Both HEP estimation and PSF evaluation were based on expert judgment using interviews and questionnaires. The expert group was established from selected IEA-R1 operators, and their evaluations were put into a knowledge representation system using linguistic variables and group evaluation values obtained through Fuzzy Logic and Fuzzy Set theory. The HEP values obtained show good agreement with published data, corroborating the proposed methodology as a good alternative for HRA.
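    A minimal sketch of how linguistic expert ratings can be turned into a crisp HEP with fuzzy sets: triangular membership functions on a log10(HEP) axis, clipped by the panel's support for each term and defuzzified by centroid. The terms, supports, and weights below are invented; the paper's actual inference system may differ.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

terms = {  # linguistic terms over log10(HEP); supports are hypothetical
    "low":    (-5.0, -4.0, -3.0),
    "medium": (-4.0, -3.0, -2.0),
    "high":   (-3.0, -2.0, -1.0),
}
# Hypothetical aggregated expert support for each term:
weights = {"low": 0.2, "medium": 0.6, "high": 0.2}

x = np.linspace(-6.0, 0.0, 601)
mu = np.zeros_like(x)
for term, (a, b, c) in terms.items():
    mu = np.maximum(mu, np.minimum(tri(x, a, b, c), weights[term]))

centroid = np.sum(mu * x) / np.sum(mu)  # centroid defuzzification on the grid
print(f"defuzzified log10(HEP) = {centroid:.2f}, HEP = {10 ** centroid:.1e}")
```

    With the symmetric supports above the centroid falls at -3, i.e. an HEP of about 1e-3; shifting the panel's support toward "high" would pull the crisp estimate upward accordingly.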

  3. Human error identification for laparoscopic surgery: Development of a motion economy perspective.

    Science.gov (United States)

    Al-Hakim, Latif; Sevdalis, Nick; Maiping, Tanaphon; Watanachote, Damrongpan; Sengupta, Shomik; Dissaranan, Charuspong

    2015-09-01

    This study postulates that traditional human error identification techniques fail to consider motion economy principles and, accordingly, their applicability in operating theatres may be limited. This study addresses this gap in the literature with a dual aim. First, it identifies the principles of motion economy that suit the operative environment and second, it develops a new error mode taxonomy for human error identification techniques which recognises motion economy deficiencies affecting the performance of surgeons and predisposing them to errors. A total of 30 principles of motion economy were developed and categorised into five areas. A hierarchical task analysis was used to break down main tasks of a urological laparoscopic surgery (hand-assisted laparoscopic nephrectomy) to their elements and the new taxonomy was used to identify errors and their root causes resulting from violation of motion economy principles. The approach was prospectively tested in 12 observed laparoscopic surgeries performed by 5 experienced surgeons. A total of 86 errors were identified and linked to the motion economy deficiencies. Results indicate the developed methodology is promising. Our methodology allows error prevention in surgery and the developed set of motion economy principles could be useful for training surgeons on motion economy principles. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. A measurement error approach to assess the association between dietary diversity, nutrient intake, and mean probability of adequacy.

    Science.gov (United States)

    Joseph, Maria L; Carriquiry, Alicia

    2010-11-01

    Collection of dietary intake information requires time-consuming and expensive methods, making it inaccessible to many resource-poor countries. Quantifying the association between simple measures of usual dietary diversity and usual nutrient intake/adequacy would allow inferences to be made about the adequacy of micronutrient intake at the population level for a fraction of the cost. In this study, we used secondary data from a dietary intake study carried out in Bangladesh to assess the association between 3 food group diversity indicators (FGI) and calcium intake; and the association between these same 3 FGI and a composite measure of nutrient adequacy, mean probability of adequacy (MPA). By implementing Fuller's error-in-the-equation measurement error model (EEM) and simple linear regression (SLR) models, we assessed these associations while accounting for the error in the observed quantities. Significant associations were detected between usual FGI and usual calcium intakes, when the more complex EEM was used. The SLR model detected significant associations between FGI and MPA as well as for variations of these measures, including the best linear unbiased predictor. Through simulation, we support the use of the EEM. In contrast to the EEM, the SLR model does not account for the possible correlation between the measurement errors in the response and predictor. The EEM performs best when the model variables are not complex functions of other variables observed with error (e.g. MPA). When observation days are limited and poor estimates of the within-person variances are obtained, the SLR model tends to be more appropriate.
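    The core reason an errors-in-variables model is needed can be shown in a few lines: regressing on an error-prone predictor attenuates the slope, and the simplest method-of-moments correction (a stripped-down stand-in for the EEM machinery used in the paper) recovers it. The data are simulated and all variances are invented.

```python
import numpy as np

# Simulated "usual intake" study: y depends on the true predictor, but we
# only observe a noisy version of it (within-person measurement error).
rng = np.random.default_rng(42)
n = 5000
x_true = rng.normal(4.0, 1.0, n)            # e.g. usual food-group diversity
y = 0.5 * x_true + rng.normal(0.0, 0.3, n)  # e.g. an MPA-like response
sigma_u2 = 0.5                              # assumed known error variance
x_obs = x_true + rng.normal(0.0, np.sqrt(sigma_u2), n)

naive = np.cov(x_obs, y)[0, 1] / np.var(x_obs, ddof=1)   # attenuated slope
reliability = (np.var(x_obs, ddof=1) - sigma_u2) / np.var(x_obs, ddof=1)
corrected = naive / reliability                          # deattenuated slope
print(f"naive slope {naive:.2f}, corrected {corrected:.2f} (true 0.50)")
```

    The naive slope shrinks by the reliability ratio (here roughly 1/1.5), which is why an SLR on observed intakes understates the diet-adequacy association unless the error structure is modeled.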

  5. Prediction of human errors by maladaptive changes in event-related brain networks

    NARCIS (Netherlands)

    Eichele, T.; Debener, S.; Calhoun, V.D.; Specht, K.; Engel, A.K.; Hugdahl, K.; Cramon, D.Y. von; Ullsperger, M.

    2008-01-01

    Humans engaged in monotonous tasks are susceptible to occasional errors that may lead to serious consequences, but little is known about brain activity patterns preceding errors. Using functional MRI and applying independent component analysis followed by deconvolution of hemodynamic responses, we

  6. An MEG signature corresponding to an axiomatic model of reward prediction error.

    Science.gov (United States)

    Talmi, Deborah; Fuentemilla, Lluis; Litvak, Vladimir; Duzel, Emrah; Dolan, Raymond J

    2012-01-02

    Optimal decision-making is guided by evaluating the outcomes of previous decisions. Prediction errors are theoretical teaching signals which integrate two features of an outcome: its inherent value and prior expectation of its occurrence. To uncover the magnetic signature of prediction errors in the human brain we acquired magnetoencephalographic (MEG) data while participants performed a gambling task. Our primary objective was to use formal criteria, based upon an axiomatic model (Caplin and Dean, 2008a), to determine the presence and timing profile of MEG signals that express prediction errors. We report analyses at the sensor level, implemented in SPM8, time locked to outcome onset. We identified, for the first time, a MEG signature of prediction error, which emerged approximately 320 ms after an outcome and expressed as an interaction between outcome valence and probability. This signal followed earlier, separate signals for outcome valence and probability, which emerged approximately 200 ms after an outcome. Strikingly, the time course of the prediction error signal, as well as the early valence signal, resembled the Feedback-Related Negativity (FRN). In simultaneously acquired EEG data we obtained a robust FRN, but the win and loss signals that comprised this difference wave did not comply with the axiomatic model. Our findings motivate an explicit examination of the critical issue of timing embodied in computational models of prediction errors as seen in human electrophysiological data. Copyright © 2011 Elsevier Inc. All rights reserved.
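The axiomatic criteria referenced above can be illustrated in a few lines. This is a hypothetical sketch of the core requirement, a valence-by-probability interaction, not the authors' MEG analysis; the payoff values are invented.

```python
# Hypothetical sketch of the axiomatic requirement (after Caplin and Dean):
# a signal qualifies as a reward prediction error only if it reflects both
# the outcome's value and its prior expectation.
def prediction_error(outcome_value, win_prob, win_value=1.0, loss_value=-1.0):
    expected = win_prob * win_value + (1 - win_prob) * loss_value
    return outcome_value - expected

# Axiom-style checks: wins are more surprising when they were unlikely,
# and losses are more surprising (more negative) when they were unlikely.
assert prediction_error(1.0, win_prob=0.2) > prediction_error(1.0, win_prob=0.8)
assert prediction_error(-1.0, win_prob=0.8) < prediction_error(-1.0, win_prob=0.2)
```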

  7. Probable sources of errors in radiation therapy (abstract)

    International Nuclear Information System (INIS)

    Khan, U.H.

    1998-01-01

    It is a fact that some errors always occur in dose-volume prescription, management of the radiation beam, derivation of exposure, planning of the treatment and, finally, the treatment of the patient (a three-dimensional subject). This paper highlights all the sources of error and the relevant methods to decrease or eliminate them, thus improving the overall therapeutic efficiency and accuracy. Radiotherapy is a comprehensive teamwork of the radiotherapist, medical radiation physicist, medical technologist and the patient. All the links in the whole chain of radiotherapy are equally important and are duly considered in the paper. The decision for palliative or radical treatment is based on the nature and extent of the disease, its site, stage, grade, the length of the history of the condition, biopsy reports, etc. This may entail certain uncertainties in the volume of the tumor, the quality and quantity of radiation, and the dose fractionation, which may be under- or over-estimated. An effort has been made to guide the radiotherapist in avoiding the pitfalls in the arena of radiotherapy. (author)

  8. Derivation of main drivers affecting the possibility of human errors during low power and shutdown operation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Park, Jin Kyun; Kim, Jae Whan [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In order to estimate the possibility of human error and identify its nature, human reliability analysis (HRA) methods have been implemented. For this, various HRA methods have been developed so far: the technique for human error rate prediction (THERP), the cause based decision tree (CBDT), the cognitive reliability and error analysis method (CREAM) and so on. Most HRA methods have been developed with a focus on full power operation of NPPs, even though human performance may affect the safety of the system more during low power and shutdown (LPSD) operation than during full power operation. In this regard, it is necessary to conduct research on developing an HRA method to be used in LPSD operation. As the first step of the study, the main drivers which affect the possibility of human error have been derived. Drivers, which are commonly called performance shaping factors (PSFs), are aspects of the human's individual characteristics, environment, organization, or task that specifically decrement or improve human performance, thus respectively increasing or decreasing the likelihood of human errors.

  9. Derivation of main drivers affecting the possibility of human errors during low power and shutdown operation

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Seong, Poong Hyun; Park, Jin Kyun; Kim, Jae Whan

    2016-01-01

    In order to estimate the possibility of human error and identify its nature, human reliability analysis (HRA) methods have been implemented. For this, various HRA methods have been developed so far: the technique for human error rate prediction (THERP), the cause based decision tree (CBDT), the cognitive reliability and error analysis method (CREAM) and so on. Most HRA methods have been developed with a focus on full power operation of NPPs, even though human performance may affect the safety of the system more during low power and shutdown (LPSD) operation than during full power operation. In this regard, it is necessary to conduct research on developing an HRA method to be used in LPSD operation. As the first step of the study, the main drivers which affect the possibility of human error have been derived. Drivers, which are commonly called performance shaping factors (PSFs), are aspects of the human's individual characteristics, environment, organization, or task that specifically decrement or improve human performance, thus respectively increasing or decreasing the likelihood of human errors.

  10. Errors in practical measurement in surveying, engineering, and technology

    International Nuclear Information System (INIS)

    Barry, B.A.; Morris, M.D.

    1991-01-01

    This book discusses statistical measurement, error theory, and statistical error analysis. The topics of the book include an introduction to measurement, measurement errors, the reliability of measurements, probability theory of errors, measures of reliability, reliability of repeated measurements, propagation of errors in computing, errors and weights, practical application of the theory of errors in measurement, two-dimensional errors and includes a bibliography. Appendices are included which address significant figures in measurement, basic concepts of probability and the normal probability curve, writing a sample specification for a procedure, classification, standards of accuracy, and general specifications of geodetic control surveys, the geoid, the frequency distribution curve and the computer and calculator solution of problems
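The propagation-of-errors topic listed above follows the standard first-order formula; a minimal sketch for a product of two measured quantities (the figures are illustrative):

```python
# First-order error propagation for f = L * W with independent measurement
# errors: sigma_f^2 = (W * sL)^2 + (L * sW)^2.
import math

def area_error(length, width, s_len, s_wid):
    # df/dL = W and df/dW = L, combined in quadrature
    return math.hypot(width * s_len, length * s_wid)

# A 100 m x 50 m parcel, each side measured to +/- 0.05 m:
sigma_a = area_error(100.0, 50.0, 0.05, 0.05)
print(round(sigma_a, 2))  # 5.59 (square metres)
```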

  11. Human errors in test and maintenance of nuclear power plants. Nordic project work

    International Nuclear Information System (INIS)

    Andersson, H.; Liwaang, B.

    1985-08-01

    The present report is a summary of the NKA/LIT-1 project performed over the period 1981-1985. The report summarizes work on human error influence in test and calibration activities in nuclear power plants, reviews problems regarding optimization of the test intervals and the organization of test and maintenance activities, and analyses the human error contribution to the overall risk in test and maintenance tasks. (author)

  12. Assessing human error during collecting a hydrocarbon sample of ...

    African Journals Online (AJOL)

    This paper reports an assessment of the hydrocarbon sample collection standard operating procedure (SOP) using THERP. Using the Performance Shaping Factors (PSFs) from THERP, the human errors made while collecting a hydrocarbon sample at a petrochemical refinery plant were analyzed and assessed. Twenty-two ...

  13. Bringing organizational factors to the fore of human error management

    International Nuclear Information System (INIS)

    Embrey, D.

    1991-01-01

    Human performance problems account for more than half of all significant events at nuclear power plants, even when these did not necessarily lead to severe accidents. In dealing with the management of human error, both technical and organizational factors need to be taken into account. Most important, a long-term commitment from senior management is needed. (author)

  14. Analysis of Human Errors in Japanese Nuclear Power Plants using JHPES/JAESS

    International Nuclear Information System (INIS)

    Kojima, Mitsuhiro; Mimura, Masahiro; Yamaguchi, Osamu

    1998-01-01

    CRIEPI (Central Research Institute for Electric Power Industries) / HFC (Human Factors research Center) developed J-HPES (the Japanese version of the Human Performance Enhancement System), based on the HPES originally developed by INPO, to analyze events resulting from human errors. J-HPES was systematized into a computer program named JAESS (J-HPES Analysis and Evaluation Support System), and both systems were distributed to all Japanese electric power companies so that they could analyze events by themselves. CRIEPI / HFC also used J-HPES / JAESS to analyze the incidents in Japanese nuclear power plants (NPPs) that were officially reported and identified as human error related. These incidents number 188 cases over the last 30 years. An outline of this analysis is given, and some preliminary findings are shown. (authors)

  15. Preventing marine accidents caused by technology-induced human error

    OpenAIRE

    Bielić, Toni; Hasanspahić, Nermin; Čulin, Jelena

    2017-01-01

    The objective of embedding technology on board ships, to improve safety, is not fully accomplished. The paper studies marine accidents caused by human error resulting from improper human-technology interaction. The aim of the paper is to propose measures to prevent reoccurrence of such accidents. This study analyses the marine accident reports issued by Marine Accidents Investigation Branch covering the period from 2012 to 2014. The factors that caused these accidents are examined and categor...

  16. Human errors and violations in computer and information security: the viewpoint of network administrators and security specialists.

    Science.gov (United States)

    Kraemer, Sara; Carayon, Pascale

    2007-03-01

    This paper describes human errors and violations of end users and network administrators in computer and information security. This information is summarized in a conceptual framework for examining the human and organizational factors contributing to computer and information security. This framework includes human error taxonomies to describe the work conditions that contribute adversely to computer and information security, i.e. to security vulnerabilities and breaches. The issue of human error and violation in computer and information security was explored through a series of 16 interviews with network administrators and security specialists. The interviews were audio-taped, transcribed, and analyzed by coding specific themes in a node structure. The result is an expanded framework that classifies types of human error and identifies specific human and organizational factors that contribute to computer and information security. Network administrators tended to view errors created by end users as more intentional than unintentional, while viewing errors created by network administrators as more unintentional than intentional. Organizational factors, such as communication, security culture, policy, and organizational structure, were the most frequently cited factors associated with computer and information security.

  17. Human reliability analysis for probabilistic safety assessments - review of methods and issues

    International Nuclear Information System (INIS)

    Srinivas, G.; Guptan, Rajee; Malhotra, P.K.; Ghadge, S.G.; Chandra, Umesh

    2011-01-01

    It is well known that the two major events in world nuclear power plant operating history, namely Three Mile Island and Chernobyl, were human failure events. Subsequent to these two events, several significant changes have been incorporated into plant design, control room design and operator training to reduce the possibility of human errors during plant transients. Still, the contribution of human error to risk in nuclear power plant operations has been a topic of continued attention for research, development and analysis. Probabilistic Safety Assessments attempt to capture all potential human errors, with a scientifically computed failure probability, through Human Reliability Analysis. Different countries follow several methods to quantify the human error probability. This paper reviews the various popular methods being followed, critically examines them with reference to their criticisms and brings out issues for future research. (author)

  18. An advanced human reliability analysis methodology: analysis of cognitive errors focused on

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2001-01-01

    The conventional Human Reliability Analysis (HRA) methods such as THERP/ASEP, HCR and SLIM have been criticised for their deficiency in analysing cognitive errors that occur during an operator's decision-making process. In order to overcome the limitations of the conventional methods, an advanced HRA method, the so-called second-generation HRA method, covering both qualitative analysis and quantitative assessment of cognitive errors, has been developed based on the state-of-the-art theory of cognitive systems engineering and error psychology. The method was developed on the basis of a human decision-making model and the relation between cognitive functions and performance influencing factors. The application of the proposed method to two emergency operation tasks is presented

  19. Application of human error theory in case analysis of wrong procedures.

    Science.gov (United States)

    Duthie, Elizabeth A

    2010-06-01

    The goal of this study was to contribute to the emerging body of literature about the role of human behaviors and cognitive processes in the commission of wrong procedures. Case analysis of 5 wrong procedures in operative and nonoperative settings using James Reason's human error theory was performed. The case analysis showed that cognitive underspecification, cognitive flips, automode processing, and skill-based errors were contributory to wrong procedures. Wrong-site procedures accounted for the preponderance of the cases. Front-line supervisory staff used corrective actions that focused on the performance of the individual without taking into account cognitive factors. System fixes using human cognition concepts have a greater chance of achieving sustainable safety outcomes than those that are based on the traditional approach of counseling, education, and disciplinary action for staff.

  20. Deadline pressure and human error: a study of human failures on a particle accelerator at Brookhaven National Laboratory

    International Nuclear Information System (INIS)

    Tiagha, E.A.

    1982-01-01

    The decline in industrial efficiency may be linked to decreased reliability of complex automatic systems. This decline threatens the viability of complex organizations in industrialized economies. Industrial engineering techniques that minimize system failure by increasing the reliability of systems hardware are well developed in comparison with those available to reduce human operator errors. The problem of system reliability and the associated costs of breakdown can be reduced if we understand how highly skilled technical personnel function in complex operations and systems. The purpose of this research is to investigate how human errors are affected by deadline pressures, technical communication and other socio-dynamic factors. Through the analysis of a technologically complex particle accelerator prototype at Brookhaven National Laboratory, two failure mechanisms: (1) physical defects in the production process and (2) human operator errors were identified. Two instruments were used to collect information on human failures: objective laboratory data and a human failure questionnaire. The results of human failures from the objective data were used to test for the deadline hypothesis and also to validate the human failure questionnaire. To explain why the human failures occurred, data were collected from a four-part, closed choice questionnaire administered to two groups of scientists, engineers, and technicians, working together against a deadline to produce an engineering prototype of a particle accelerator

  1. Human reliability analysis data obtainment through fuzzy logic in nuclear plants

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, C.S. do, E-mail: claudio.souza@ctmsp.mar.mil.br [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Av. Professor Lineu Prestes 2468, 05508-000 Sao Paulo, SP (Brazil); Mesquita, R.N. de, E-mail: rnavarro@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN - SP), Av. Professor Lineu Prestes 2242, 05508-000 Sao Paulo, SP (Brazil)

    2012-09-15

    Highlights: ► Human Error Probability estimates from operators' reactions to emergency situations. ► Human Reliability Analysis input data obtained through fuzzy logic inference. ► Evaluation of the influence level of Performance Shaping Factors on operators' actions. - Abstract: Human error has been recognized as an important factor in the occurrence of many industrial and nuclear accidents. Human error data are scarcely available for various reasons, among which lapses in historical database registry methodology are an important one. Human Reliability Analysis (HRA) is the usual tool employed to estimate the probability that an operator will reasonably perform a system-required task in the required time without degrading the system. This analysis requires specific Human Error Probability estimates for most of its procedure. This work obtains Human Error Probability (HEP) estimates from operators' actions in response to hypothetical emergency situations on the IEA-R1 research reactor at IPEN, Brazil. Through the proposed methodology, HRA can be performed even with a shortage of related human error statistical data. An evaluation of the Performance Shaping Factors (PSFs) was also carried out in order to classify and estimate their level of influence on the operators' actions and to determine their actual state in the plant. Both the HEP estimation and the PSF evaluation were based on expert judgment using interviews and questionnaires. The expert group was established from selected IEA-R1 operators, and their evaluations were put into a knowledge representation system which used linguistic variables and group evaluation values obtained through Fuzzy Logic and Fuzzy Set theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative for HRA.
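The fuzzy-inference step described in this abstract can be sketched roughly as follows. The linguistic terms, triangular supports, and defuzzification choice below are invented for illustration and are not the authors' actual knowledge base.

```python
# Hypothetical sketch: experts rate an action with linguistic terms; each
# term is a triangular fuzzy number over log10(HEP); expert ratings are
# averaged and defuzzified (centroid) to a crisp HEP. Values are invented.
import numpy as np

TERMS = {  # (a, b, c) triangular fuzzy numbers on log10(HEP)
    "very unlikely": (-5.0, -4.0, -3.0),
    "unlikely":      (-4.0, -3.0, -2.0),
    "possible":      (-3.0, -2.0, -1.0),
    "likely":        (-2.0, -1.0,  0.0),
}

def tri(x, a, b, c):
    # triangular membership function
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_hep(ratings):
    x = np.linspace(-6, 0, 601)
    mu = np.mean([tri(x, *TERMS[r]) for r in ratings], axis=0)  # average experts
    centroid = np.sum(x * mu) / np.sum(mu)                      # defuzzify
    return 10 ** centroid

hep = fuzzy_hep(["unlikely", "possible", "unlikely"])
print(f"{hep:.1e}")  # on the order of 1e-3
```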

  2. A human error taxonomy and its application to an automatic method accident analysis

    International Nuclear Information System (INIS)

    Matthews, R.H.; Winter, P.W.

    1983-01-01

    Commentary is provided on the quantification aspects of human factors analysis in risk assessment. Methods for quantifying human error in a plant environment are discussed and their application to system quantification explored. Such a programme entails consideration of the data base and a taxonomy of factors contributing to human error. A multi-levelled approach to system quantification is proposed, each level being treated differently drawing on the advantages of different techniques within the fault/event tree framework. Management, as controller of organization, planning and procedure, is assigned a dominant role. (author)

  3. Human reliability data, human error and accident models--illustration through the Three Mile Island accident analysis

    International Nuclear Information System (INIS)

    Le Bot, Pierre

    2004-01-01

    Our first objective is to provide a panorama of Human Reliability data used in EDF's Safety Probabilistic Studies, and then, since these concepts are at the heart of Human Reliability and its methods, to go over the notion of human error and the understanding of accidents. We are not sure today that it is actually possible to provide in this field a foolproof and productive theoretical framework. Consequently, the aim of this article is to suggest potential paths of action and to provide information on EDF's progress along those paths which enables us to produce the most potentially useful Human Reliability analyses while taking into account current knowledge in Human Sciences. The second part of this article illustrates our point of view as EDF researchers through the analysis of the most famous civil nuclear accident, the Three Mile Island unit accident in 1979. Analysis of this accident allowed us to validate our positions regarding the need to move, in the case of an accident, from the concept of human error to that of systemic failure in the operation of systems such as a nuclear power plant. These concepts rely heavily on the notion of distributed cognition and we will explain how we applied it. These concepts were implemented in the MERMOS Human Reliability Probabilistic Assessment methods used in the latest EDF Probabilistic Human Reliability Assessment. Besides the fact that it is not very productive to focus exclusively on individual psychological error, the design of the MERMOS method and its implementation have confirmed two things: the significance of qualitative data collection for Human Reliability, and the central role held by Human Reliability experts in building knowledge about emergency operation, which in effect consists of Human Reliability data collection. The latest conclusion derived from the implementation of MERMOS is that, considering the difficulty in building 'generic' Human Reliability data in the field we are involved in, the best

  4. Advancing Usability Evaluation through Human Reliability Analysis

    International Nuclear Information System (INIS)

    Ronald L. Boring; David I. Gertman

    2005-01-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis to heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues
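The UEP arithmetic described above can be sketched as a nominal error probability scaled by per-heuristic multipliers, the way SPAR-H scales a nominal HEP by performance shaping factors. The multiplier values below are invented placeholders, not the calibrated SPAR-H tables.

```python
# Sketch of the described arithmetic: a nominal human error probability
# scaled by multipliers attached to usability heuristics acting as
# performance shaping factors. Multiplier values are illustrative only.
NOMINAL_HEP = 0.01  # e.g. a diagnosis-type task's nominal value

heuristic_multipliers = {
    "visibility of system status": 1.0,  # nominal
    "error prevention": 5.0,             # degraded
    "consistency and standards": 2.0,    # slightly degraded
}

uep = NOMINAL_HEP
for m in heuristic_multipliers.values():
    uep *= m

print(round(uep, 3))  # 0.1 -- a ranking figure, not a literal probability
```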

  5. Task types and error types involved in the human-related unplanned reactor trip events

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun

    2008-01-01

    In this paper, the contributions of the task types and error types involved in the human-related unplanned reactor trip events that occurred between 1986 and 2006 in Korean nuclear power plants are analysed in order to establish a strategy for reducing human-related unplanned reactor trips. Classification systems for the task types, error modes, and cognitive functions are developed or adopted from currently available taxonomies, and the relevant information is extracted from the event reports or judged on the basis of the event descriptions. According to the analyses in this study, the contributions of the task types are as follows: corrective maintenance (25.7%), planned maintenance (22.8%), planned operation (19.8%), periodic preventive maintenance (14.9%), response to a transient (9.9%), and design/manufacturing/installation (6.9%). According to the analysis of the error modes, error modes such as control failure (22.2%), wrong object (18.5%), omission (14.8%), wrong action (11.1%), and inadequate (8.3%) account for about 75% of the total unplanned trip events. The analysis of the cognitive functions involved in the events indicated that the planning function had the highest contribution (46.7%) to the human actions leading to unplanned reactor trips. This analysis concludes that, in order to significantly reduce human-induced or human-related unplanned reactor trips, an aid system (in support of maintenance personnel) for evaluating the possible (negative) impacts of planned or erroneous actions, as well as an appropriate human error prediction technique, should be developed
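As a quick check on the Pareto structure reported in this abstract, the five leading error modes do account for roughly 75% of the events:

```python
# The five leading error modes reported in the abstract, tallied:
error_modes = {
    "control failure": 22.2,
    "wrong object": 18.5,
    "omission": 14.8,
    "wrong action": 11.1,
    "inadequate": 8.3,
}
top_five = sum(error_modes.values())
print(round(top_five, 1))  # 74.9 -- i.e. about 75% of the trip events
```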

  6. Task types and error types involved in the human-related unplanned reactor trip events

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Park, Jin Kyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-12-15

    In this paper, the contributions of the task types and error types involved in the human-related unplanned reactor trip events that occurred between 1986 and 2006 in Korean nuclear power plants are analysed in order to establish a strategy for reducing human-related unplanned reactor trips. Classification systems for the task types, error modes, and cognitive functions are developed or adopted from currently available taxonomies, and the relevant information is extracted from the event reports or judged on the basis of the event descriptions. According to the analyses in this study, the contributions of the task types are as follows: corrective maintenance (25.7%), planned maintenance (22.8%), planned operation (19.8%), periodic preventive maintenance (14.9%), response to a transient (9.9%), and design/manufacturing/installation (6.9%). According to the analysis of the error modes, error modes such as control failure (22.2%), wrong object (18.5%), omission (14.8%), wrong action (11.1%), and inadequate (8.3%) account for about 75% of the total unplanned trip events. The analysis of the cognitive functions involved in the events indicated that the planning function had the highest contribution (46.7%) to the human actions leading to unplanned reactor trips. This analysis concludes that, in order to significantly reduce human-induced or human-related unplanned reactor trips, an aid system (in support of maintenance personnel) for evaluating the possible (negative) impacts of planned or erroneous actions, as well as an appropriate human error prediction technique, should be developed.

  7. The human error rate assessment and optimizing system HEROS - a new procedure for evaluating and optimizing the man-machine interface in PSA

    International Nuclear Information System (INIS)

    Richei, A.; Hauptmanns, U.; Unger, H.

    2001-01-01

    A new procedure allowing the probabilistic evaluation and optimization of the man-machine system is presented. This procedure and the resulting expert system HEROS, which is an acronym for Human Error Rate Assessment and Optimizing System, is based on the fuzzy set theory. Most of the well-known procedures employed for the probabilistic evaluation of human factors involve the use of vague linguistic statements on performance shaping factors to select and to modify basic human error probabilities from the associated databases. This implies a large portion of subjectivity. Vague statements are expressed here in terms of fuzzy numbers or intervals which allow mathematical operations to be performed on them. A model of the man-machine system is the basis of the procedure. A fuzzy rule-based expert system was derived from ergonomic and psychological studies. Hence, it does not rely on a database, whose transferability to situations different from its origin is questionable. In this way, subjective elements are eliminated to a large extent. HEROS facilitates the importance analysis for the evaluation of human factors, which is necessary for optimizing the man-machine system. HEROS is applied to the analysis of a simple diagnosis task of the operating personnel in a nuclear power plant

  8. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    The human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that come together for an error to occur in the human tasks performed under normal operating conditions and in those carried out after an abnormal event. Additionally, the analysis of various accidents in history has found that the human component has been a contributing cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed to include additional performance shaping factors, and the interactions between them, in their models. Thus, by the mid-1990s, what are considered the second generation methodologies emerged. Among these is the methodology A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of additional deviations from the nominal scenario considered in the accident sequences of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required the independent evaluation of the two related human failure events. Gathering the new human error probabilities thus involves quantifying the nominal scenario and the cases of significant deviations considered for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences extracted the more specific factors with the highest contribution to the human error probabilities. (Author)
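The dependency evaluation mentioned in this abstract is conventionally done with the THERP conditioning equations, which raise the probability of a second human failure event according to its assumed dependence on the first:

```python
# Standard THERP dependence equations: the conditional HEP of the second of
# two related human failure events, given failure of the first, for each
# assumed dependence level.
def conditional_hep(basic_hep, dependence):
    formulas = {
        "zero":     lambda p: p,
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[dependence](basic_hep)

# Even a very reliable action (HEP = 0.001) degrades sharply once it
# depends on a preceding failure:
for level in ("zero", "low", "moderate", "high", "complete"):
    print(level, conditional_hep(0.001, level))
```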

  9. Identification of failure sequences sensitive to human error

    International Nuclear Information System (INIS)

    1987-06-01

    This report, prepared by the participants of the technical committee meeting on ''Identification of Failure Sequences Sensitive to Human Error'', addresses the subjects discussed during the meeting and the conclusions reached by the committee. Chapter 1 reviews the INSAG recommendations and the main elements of the IAEA programme in the area of the human element. In Chapter 2 the role of human actions in nuclear power plant safety is reviewed from insights of operational experience. Chapter 3 is concerned with the relationship between probabilistic safety assessment and human performance associated with severe accident sequences. Chapter 4 addresses the role of simulators in training for accident conditions. Chapter 5 presents the conclusions and future trends. The seven papers presented by members of this technical committee are also included in this technical document. A separate abstract was prepared for each of these papers

  10. PRA (probabilistic risk analysis) in the nuclear sector. Quantifying human error and human malice

    International Nuclear Information System (INIS)

    Heyes, A.G.

    1995-01-01

    Regardless of the regulatory style chosen ('command and control' or 'functional'), a vital prerequisite for coherent safety regulation in the nuclear power industry is the ability to assess accident risk. In this paper we present a critical analysis of current techniques of probabilistic risk analysis applied in the industry, with particular regard to the problems of quantifying risks arising from, or exacerbated by, human malice and/or human error. (Author)

  11. A Method and Support Tool for the Analysis of Human Error Hazards in Digital Devices

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kim, Seon Soo; Lee, Yong Hee

    2012-01-01

    In recent years, many nuclear power plants have adopted modern digital I and C technologies, which are expected to significantly improve both their economical efficiency and safety. However, the introduction of an advanced main control room (MCR) is accompanied by many changes in the forms and features of the new digital devices. User-friendly displays and new features in digital devices are not by themselves enough to prevent human errors in nuclear power plants (NPPs). It may be an urgent matter to find the human error potentials due to digital devices, and their detailed mechanisms, so that they can be considered during the design of digital devices and their interfaces. The characteristics of digital technologies and devices offer many opportunities for interface management and can be integrated into a compact single workstation in an advanced MCR, such that workers can operate the plant with minimum burden under any operating condition. However, these devices may introduce new types of human errors, and thus we need a means to evaluate and prevent such errors, especially within digital devices for NPPs. This research suggests a new method named HEA-BIS (Human Error Analysis based on Interaction Segment) to confirm and detect human errors associated with digital devices. This method can be facilitated by support tools when used to ensure safety in applying digital devices in NPPs

  12. Safety coaches in radiology: decreasing human error and minimizing patient harm

    Energy Technology Data Exchange (ETDEWEB)

    Dickerson, Julie M.; Adams, Janet M. [Cincinnati Children's Hospital Medical Center, Department of Radiology, MLC 5031, Cincinnati, OH (United States); Koch, Bernadette L.; Donnelly, Lane F. [Cincinnati Children's Hospital Medical Center, Department of Radiology, MLC 5031, Cincinnati, OH (United States); Cincinnati Children's Hospital Medical Center, Department of Pediatrics, Cincinnati, OH (United States); Goodfriend, Martha A. [Cincinnati Children's Hospital Medical Center, Department of Quality Improvement, Cincinnati, OH (United States)

    2010-09-15

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program. (orig.)

  13. Safety coaches in radiology: decreasing human error and minimizing patient harm

    International Nuclear Information System (INIS)

    Dickerson, Julie M.; Adams, Janet M.; Koch, Bernadette L.; Donnelly, Lane F.; Goodfriend, Martha A.

    2010-01-01

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program. (orig.)

  14. Safety coaches in radiology: decreasing human error and minimizing patient harm.

    Science.gov (United States)

    Dickerson, Julie M; Koch, Bernadette L; Adams, Janet M; Goodfriend, Martha A; Donnelly, Lane F

    2010-09-01

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program.

  15. Human errors identification using the human factors analysis and classification system technique (HFACS)

    Directory of Open Access Journals (Sweden)

    G. A. Shirali

    2013-12-01

    Result: In this study, 158 accident reports from the Ahvaz steel industry were analyzed using the HFACS technique. The analysis showed that most of the human errors were related, at the first level, to skill-based errors; at the second level, to the physical environment; at the third level, to inadequate supervision; and at the fourth level, to the management of resources. Conclusion: Studying and analyzing past events using the HFACS technique can identify the major and root causes of accidents and can be effective in preventing the repetition of such mishaps. It can also be used as a basis for developing strategies to prevent future events in steel industries.

  16. Human Reliability Analysis for In-Tank Precipitation Alignment and Startup of Emergency Purge Ventilation Equipment. Revision 3

    International Nuclear Information System (INIS)

    Shapiro, B.J.; Britt, T.E.

    1994-10-01

    This report documents the methodology used for calculating the human error probability for establishing air based ventilation using emergency purge ventilation equipment on In-Tank Precipitation (ITP) processing tanks 48 and 49 after failure of the nitrogen purge system following a seismic event. The analyses were performed according to THERP (Technique for Human Error Rate Prediction) as described in NUREG/CR-1278-F, ''Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications.'' The calculated human error probabilities are provided as input to the Fault Tree Analysis for the ITP Nitrogen Purge System
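
The THERP handbook cited in this record quantifies dependence between successive human actions with a fixed set of equations (NUREG/CR-1278). As an illustrative sketch of that dependence model, with hypothetical HEP values not taken from the report:

```python
def conditional_hep(nominal_hep, dependence):
    """THERP dependence model: conditional HEP of a task given failure of the
    preceding task (equations from NUREG/CR-1278)."""
    equations = {
        "ZD": lambda n: n,                  # zero dependence
        "LD": lambda n: (1 + 19 * n) / 20,  # low
        "MD": lambda n: (1 + 6 * n) / 7,    # moderate
        "HD": lambda n: (1 + n) / 2,        # high
        "CD": lambda n: 1.0,                # complete
    }
    return equations[dependence](nominal_hep)

# Hypothetical example: two recovery actions with nominal HEP 0.003 each,
# judged to have high dependence
p_both_fail = 0.003 * conditional_hep(0.003, "HD")   # 0.003 * 0.5015
```

Note how even a small nominal HEP is dominated by the dependence level: under high dependence the conditional HEP is close to 0.5 regardless of the nominal value.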

  17. Thresholds for human detection of patient setup errors in digitally reconstructed portal images of prostate fields

    International Nuclear Information System (INIS)

    Phillips, Brooke L.; Jiroutek, Michael R.; Tracton, Gregg; Elfervig, Michelle; Muller, Keith E.; Chaney, Edward L.

    2002-01-01

    Purpose: Computer-assisted methods to analyze electronic portal images for the presence of treatment setup errors should be studied in controlled experiments before use in the clinical setting. Validation experiments using images that contain known errors usually report the smallest errors that can be detected by the image analysis algorithm. This paper offers human error-detection thresholds as one benchmark for evaluating the smallest errors detected by algorithms. Unfortunately, reliable data are lacking describing human performance. The most rigorous benchmarks for human performance are obtained under conditions that favor error detection. To establish such benchmarks, controlled observer studies were carried out to determine the thresholds of detectability for in-plane and out-of-plane translation and rotation setup errors introduced into digitally reconstructed portal radiographs (DRPRs) of prostate fields. Methods and Materials: Seventeen observers comprising radiation oncologists, radiation oncology residents, physicists, and therapy students participated in a two-alternative forced choice experiment involving 378 DRPRs computed using the National Library of Medicine Visible Human data sets. An observer viewed three images at a time displayed on adjacent computer monitors. Each image triplet included a reference digitally reconstructed radiograph displayed on the central monitor and two DRPRs displayed on the flanking monitors. One DRPR was error free. The other DRPR contained a known in-plane or out-of-plane error in the placement of the treatment field over a target region in the pelvis. The range for each type of error was determined from pilot observer studies based on a Probit model for error detection. The smallest errors approached the limit of human visual capability. The observer was told what kind of error was introduced, and was asked to choose the DRPR that contained the error. Observer decisions were recorded and analyzed using repeated

  18. Human error as the root cause of severe accidents at nuclear reactors

    International Nuclear Information System (INIS)

    Kovács Zoltán; Rýdzi, Stanislav

    2017-01-01

    A root cause is a factor inducing an undesirable event. It is feasible for root causes to be eliminated through technological process improvements. Human error was the root cause of all severe accidents at nuclear power plants. The TMI accident was caused by a series of human errors. The Chernobyl disaster occurred after a badly performed test of the turbogenerator at a reactor with design deficiencies, and in addition, the operators ignored the safety principles and disabled the safety systems. At Fukushima the tsunami risk was underestimated and the project failed to consider the specific issues of the site. The paper describes the severe accidents and points out the human errors that caused them. Also, provisions that might have eliminated those severe accidents are suggested. The fact that each severe accident occurred on a different type of reactor is relevant – no severe accident ever occurred twice at the same reactor type. The lessons learnt from the severe accidents and the safety measures implemented on reactor units all over the world seem to be effective. (orig.)

  19. Error rates in forensic DNA analysis: definition, numbers, impact and communication.

    Science.gov (United States)

    Kloosterman, Ate; Sjerps, Marjan; Quak, Astrid

    2014-09-01

    Forensic DNA casework is currently regarded as one of the most important types of forensic evidence, and important decisions in intelligence and justice are based on it. However, errors occasionally occur and may have very serious consequences. In other domains, error rates have been defined and published. The forensic domain is lagging behind concerning this transparency for various reasons. In this paper we provide definitions and observed frequencies for different types of errors at the Human Biological Traces Department of the Netherlands Forensic Institute (NFI) over the years 2008-2012. Furthermore, we assess their actual and potential impact and describe how the NFI deals with the communication of these numbers to the legal justice system. We conclude that the observed relative frequency of quality failures is comparable to studies from clinical laboratories and genetic testing centres. Furthermore, this frequency is constant over the five-year study period. The most common causes of failures related to the laboratory process were contamination and human error. Most human errors could be corrected, whereas gross contamination in crime samples often resulted in irreversible consequences. Hence this type of contamination is identified as the most significant source of error. Of the known contamination incidents, most were detected by the NFI quality control system before the report was issued to the authorities, and thus did not lead to flawed decisions like false convictions. However in a very limited number of cases crucial errors were detected after the report was issued, sometimes with severe consequences. The error rates reported in this paper are useful for quality improvement and benchmarking, and contribute to an open research culture that promotes public trust. However, they are irrelevant in the context of a particular case. Here case-specific probabilities of undetected errors are needed

  20. Human factors reliability benchmark exercise, report of the SRD participation

    International Nuclear Information System (INIS)

    Waters, Trevor

    1988-01-01

    Within the scope of the Human Factors Reliability Benchmark Exercise, organised by the Joint Research Centre, Ispra, Italy, the Safety and Reliability Directorate (SRD) team has performed analysis of human factors in two different activities - a routine test and a non-routine operational transient. For both activities, an 'FMEA-like' analysis was performed covering the task, potential errors, and the factors which affect performance. For analysis of the non-routine activity, which involved a significant amount of cognitive processing, such as diagnosis and decision making, a new approach for qualitative analysis has been developed. Modelling has been performed using both event trees and fault trees and examples are provided. Human error probabilities were estimated using the methods Absolute Probability Judgement (APJ), Human Cognitive Reliability Method (HCR), Human Error Assessment and Reduction Technique (HEART), Success-Likelihood Index Method (SLIM), Tecnica Empirica Stima Errori Operatori (TESEO), and Technique for Human Error Rate Prediction (THERP). A discussion is provided of the lessons learnt in the course of the exercise and unresolved difficulties in the assessment of human reliability. (author)
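
Of the quantification methods listed in this record, SLIM is the easiest to illustrate: it assumes log10(HEP) is linear in a success-likelihood index (SLI) derived from expert judgement, and calibrates the line with two anchor tasks of known HEP. A minimal sketch, with hypothetical anchor values:

```python
import math

def slim_hep(sli, hep_best, hep_worst):
    """SLIM calibration: log10(HEP) is assumed linear in the success-likelihood
    index SLI (0 = worst-case anchor task, 1 = best-case anchor task)."""
    log_worst = math.log10(hep_worst)   # HEP at SLI = 0
    log_best = math.log10(hep_best)     # HEP at SLI = 1
    return 10 ** (log_worst + sli * (log_best - log_worst))

# Hypothetical anchors: best task HEP 1e-4, worst task HEP 1e-1
hep = slim_hep(0.5, 1e-4, 1e-1)         # 10**-2.5, about 3.2e-3
```

The anchor HEPs here are illustrative; in practice they come from tasks whose error rates are already well established.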

  1. Complications: acknowledging, managing, and coping with human error.

    Science.gov (United States)

    Helo, Sevann; Moulton, Carol-Anne E

    2017-08-01

    Errors are inherent in medicine due to the imperfectness of human nature. Health care providers may have a difficult time accepting their fallibility, acknowledging mistakes, and disclosing errors. Fear of litigation, shame, blame, and concern about reputation are just some of the barriers preventing physicians from being more candid with their patients, despite the supporting body of evidence that patients cite poor communication and lack of transparency as primary drivers to file a lawsuit in the wake of a medical complication. Proper error disclosure includes a timely explanation of what happened, who was involved, why the error occurred, and how it will be prevented in the future. Medical mistakes afford the opportunity for individuals and institutions to be candid about their weaknesses while improving patient care processes. When a physician takes the Hippocratic Oath they take on a tremendous sense of responsibility for the care of their patients, and often bear the burden of their mistakes in isolation. Physicians may struggle with guilt, shame, and a crisis of confidence, which may thwart efforts to identify areas for improvement that can lead to meaningful change. Coping strategies for providers include discussing the event with others, seeking professional counseling, and implementing quality improvement projects. Physicians and health care organizations need to find adaptive ways to deal with complications that will benefit patients, providers, and their institutions.

  2. Average bit error probability of binary coherent signaling over generalized fading channels subject to additive generalized gaussian noise

    KAUST Repository

    Soury, Hamza

    2012-06-01

    This letter considers the average bit error probability of binary coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closed form expression in terms of the Fox's H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading and Nakagami-m fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer-based simulations for a variety of fading and additive noise parameters. © 2012 IEEE.
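
Closed-form expressions of this kind are typically checked against Monte Carlo simulation, as the abstract notes. A minimal sketch of such a check for one special case (Nakagami-m fading with m = 1, i.e. Rayleigh, and ordinary Gaussian noise), where the known analytic average BER of BPSK is 0.5 * (1 - sqrt(snr / (1 + snr))), about 0.064 at an average SNR of 5 dB:

```python
import math
import random

def ber_bpsk_nakagami(m=1.0, snr_db=5.0, trials=200000, seed=42):
    """Monte Carlo bit error rate of BPSK over Nakagami-m fading with Gaussian noise."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)          # average SNR (linear)
    errors = 0
    for _ in range(trials):
        # Nakagami-m fading: channel power is Gamma(m, 1/m) distributed (unit mean)
        h = math.sqrt(rng.gammavariate(m, 1.0 / m))
        # transmit +1; Gaussian noise with variance 1/(2*snr) per unit symbol energy
        received = h + rng.gauss(0.0, math.sqrt(1.0 / (2 * snr)))
        if received < 0:               # decided as -1: a bit error
            errors += 1
    return errors / trials
```

The generalized Gaussian and Laplacian noise cases treated in the letter would need a different noise sampler; this sketch only covers the Gaussian special case.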

  3. Nuclear power plant personnel errors in decision-making as an object of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Reer, B.

    1993-09-01

    The integration of human error analysis - also called man-machine system analysis (MMSA) - is an essential part of probabilistic risk assessment (PRA). A new method is presented which allows for a systematic and comprehensive PRA inclusion of decision-based errors due to conflicts or similarities. For the error identification procedure, new questioning techniques are developed. These errors are shown to be identifiable by looking at retroactions caused by subordinate goals as components of the overall safety-relevant goal. New quantification methods for estimating situation-specific probabilities are developed. The factors conflict and similarity are operationalized in a way that allows their quantification based on information which is usually available in PRA. The quantification procedure uses extrapolations and interpolations based on a sparse set of data related to decision-based errors. Moreover, for passive errors in decision-making a completely new approach is presented in which errors are quantified via a delay in initiating the required action rather than via error probabilities. The practicability of this dynamic approach is demonstrated by a probabilistic analysis of the actions required during the total loss-of-feedwater event at the Davis-Besse plant in 1985. The extensions of the ''classical'' PRA method developed in this work are applied to an MMSA of the decay heat removal (DHR) of the ''HTR-500''. Errors in decision-making - as potential roots of extraneous acts - are taken into account in a comprehensive and systematic manner. Five additional errors are identified. However, the probabilistic quantification results in a nonsignificant increase of the DHR failure probability. (orig.) [de

  4. When soft controls get slippery: User interfaces and human error

    International Nuclear Information System (INIS)

    Stubler, W.F.; O'Hara, J.M.

    1998-01-01

    Many types of products and systems that have traditionally featured physical control devices are now being designed with soft controls--input formats appearing on computer-based display devices and operated by a variety of input devices. A review of complex human-machine systems found that soft controls are particularly prone to some types of errors and may affect overall system performance and safety. This paper discusses the application of design approaches for reducing the likelihood of these errors and for enhancing usability, user satisfaction, and system performance and safety

  5. Development of an FAA-EUROCONTROL technique for the analysis of human error in ATM : final report.

    Science.gov (United States)

    2002-07-01

    Human error has been identified as a dominant risk factor in safety-oriented industries such as air traffic control (ATC). However, little is known about the factors leading to human errors in current air traffic management (ATM) systems. The first s...

  6. Personnel reliability impact on petrochemical facilities monitoring system's failure skipping probability

    Science.gov (United States)

    Kostyukov, V. N.; Naumenko, A. P.

    2017-08-01

    The paper dwells upon the urgent issue of evaluating the impact of actions by operators of complex technological systems on safe operation, considering the application of condition monitoring systems to elements and sub-systems of petrochemical production facilities. The main task of the research is to distinguish factors and criteria for describing monitoring system properties that would allow evaluation of the impact of personnel errors on the operation of real-time condition monitoring and diagnostic systems for petrochemical machinery, and to find objective criteria for the monitoring system class that take the human factor into account. On the basis of the real-time condition monitoring concepts of sudden failure skipping risk and static and dynamic error, one may solve the task of evaluating the impact that personnel qualification has on monitoring system operation, in terms of errors in personnel or operators' actions while receiving information from monitoring systems and operating a technological system. The operator is considered as a part of the technological system. Personnel behavior is usually a combination of the following parameters: input signal (information perceiving), reaction (decision making), and response (decision implementing). Based on several studies of the behavior of nuclear power station operators in the USA, Italy and other countries, as well as on research conducted by Russian scientists, data on operator reliability were selected for the analysis of operator behavior with diagnostics and monitoring systems for technological facilities. The calculations revealed that for the monitoring system selected as an example, the failure skipping risk for the set values of static (less than 0.01) and dynamic (less than 0.001) errors, considering all related factors of data on reliability of information perception, decision-making, and reaction, is 0.037, in case when all the facilities and error probability are under

  7. Seismic-load-induced human errors and countermeasures using computer graphics in plant-operator communication

    International Nuclear Information System (INIS)

    Hara, Fumio

    1988-01-01

    This paper highlights the importance of seismic load-induced human errors in plant operation by delineating the characteristics of human task performance under seismic loads. It focuses on man-machine communication via multidimensional data like that conventionally displayed on large panels in a plant control room, and demonstrates a countermeasure to human errors using a computer graphics technique that conveys the global state of the plant operation to operators through cartoon-like, colored graphs in the form of faces whose different facial expressions show the plant safety status. (orig.)

  8. Development of the Human Error Management Criteria and the Job Aptitude Evaluation Criteria for Rail Safety Personnel

    Energy Technology Data Exchange (ETDEWEB)

    Koo, In Soo; Seo, Sang Mun; Park, Geun Ok (and others)

    2008-08-15

    It has been estimated that up to 90% of all workplace accidents have human error as a cause. Human error has been widely recognized as a key factor in almost all highly publicized accidents, including the Daegu subway fire of February 18, 2003, which killed 198 people and injured 147. Because most human behavior is 'unintentional', carried out automatically, the root causes of human error should be carefully investigated and regulated by a legal authority. The final goal of this study is to set up regulatory guidance to be used by the Korean rail organizations related to safety management. The contents are: - to develop the regulatory guidance for managing human error, - to develop the regulatory guidance for managing the qualifications of rail drivers, - to develop the regulatory guidance for evaluating the aptitude of safety-related personnel.

  9. Development of the Human Error Management Criteria and the Job Aptitude Evaluation Criteria for Rail Safety Personnel

    International Nuclear Information System (INIS)

    Koo, In Soo; Seo, Sang Mun; Park, Geun Ok

    2008-08-01

    It has been estimated that up to 90% of all workplace accidents have human error as a cause. Human error has been widely recognized as a key factor in almost all highly publicized accidents, including the Daegu subway fire of February 18, 2003, which killed 198 people and injured 147. Because most human behavior is 'unintentional', carried out automatically, the root causes of human error should be carefully investigated and regulated by a legal authority. The final goal of this study is to set up regulatory guidance to be used by the Korean rail organizations related to safety management. The contents are: - to develop the regulatory guidance for managing human error, - to develop the regulatory guidance for managing the qualifications of rail drivers, - to develop the regulatory guidance for evaluating the aptitude of safety-related personnel

  10. Perancangan Fasilitas Kerja untuk Mereduksi Human Error

    Directory of Open Access Journals (Sweden)

    Harmein Nasution

    2012-01-01

    Work equipment and environments that are not designed ergonomically can cause physical exhaustion in workers. As a result of that physical exhaustion, many defects can occur in the production lines due to human error, and musculoskeletal complaints can also arise. To overcome those effects, we employed methods for analyzing worker posture based on the SNQ (Standard Nordic Questionnaire), PLIBEL, QEC (Quick Exposure Check) and biomechanics. We then applied those methods to design rolling machines and egrek grips ergonomically, so that the defects on those production lines can be minimized.

  11. Modelling the basic error tendencies of human operators

    Energy Technology Data Exchange (ETDEWEB)

    Reason, J.

    1988-01-01

    The paper outlines the primary structural features of human cognition: a limited, serial workspace interacting with a parallel distributed knowledge base. It is argued that the essential computational features of human cognition - to be captured by an adequate operator model - reside in the mechanisms by which stored knowledge structures are selected and brought into play. Two such computational 'primitives' are identified: similarity-matching and frequency-gambling. These two retrieval heuristics, it is argued, shape both the overall character of human performance (i.e. its heavy reliance on pattern-matching) and its basic error tendencies ('strong-but-wrong' responses, confirmation, similarity and frequency biases, and cognitive 'lock-up'). The various features of human cognition are integrated with a dynamic operator model capable of being represented in software form. This computer model, when run repeatedly with a variety of problem configurations, should produce a distribution of behaviours which, in toto, simulate the general character of operator performance.

  12. Modelling the basic error tendencies of human operators

    International Nuclear Information System (INIS)

    Reason, J.

    1988-01-01

    The paper outlines the primary structural features of human cognition: a limited, serial workspace interacting with a parallel distributed knowledge base. It is argued that the essential computational features of human cognition - to be captured by an adequate operator model - reside in the mechanisms by which stored knowledge structures are selected and brought into play. Two such computational 'primitives' are identified: similarity-matching and frequency-gambling. These two retrieval heuristics, it is argued, shape both the overall character of human performance (i.e. its heavy reliance on pattern-matching) and its basic error tendencies ('strong-but-wrong' responses, confirmation, similarity and frequency biases, and cognitive 'lock-up'). The various features of human cognition are integrated with a dynamic operator model capable of being represented in software form. This computer model, when run repeatedly with a variety of problem configurations, should produce a distribution of behaviours which, in total, simulate the general character of operator performance. (author)

  13. Modelling the basic error tendencies of human operators

    International Nuclear Information System (INIS)

    Reason, James

    1988-01-01

    The paper outlines the primary structural features of human cognition: a limited, serial workspace interacting with a parallel distributed knowledge base. It is argued that the essential computational features of human cognition - to be captured by an adequate operator model - reside in the mechanisms by which stored knowledge structures are selected and brought into play. Two such computational 'primitives' are identified: similarity-matching and frequency-gambling. These two retrieval heuristics, it is argued, shape both the overall character of human performance (i.e. its heavy reliance on pattern-matching) and its basic error tendencies ('strong-but-wrong' responses, confirmation, similarity and frequency biases, and cognitive 'lock-up'). The various features of human cognition are integrated with a dynamic operator model capable of being represented in software form. This computer model, when run repeatedly with a variety of problem configurations, should produce a distribution of behaviours which, in toto, simulate the general character of operator performance. (author)

  14. Human error in strabismus surgery: Quantification with a sensitivity analysis

    NARCIS (Netherlands)

    S. Schutte (Sander); J.R. Polling (Jan Roelof); F.C.T. van der Helm (Frans); H.J. Simonsz (Huib)

    2009-01-01

    textabstractBackground: Reoperations are frequently necessary in strabismus surgery. The goal of this study was to analyze human-error related factors that introduce variability in the results of strabismus surgery in a systematic fashion. Methods: We identified the primary factors that influence

  15. A Benefit/Cost/Deficit (BCD) model for learning from human errors

    International Nuclear Information System (INIS)

    Vanderhaegen, Frederic; Zieba, Stephane; Enjalbert, Simon; Polet, Philippe

    2011-01-01

    This paper proposes an original model for interpreting human errors, mainly violations, in terms of benefits, costs and potential deficits. This BCD model is then used as an input framework to learn from human errors, and two systems based on this model are developed: a case-based reasoning system and an artificial neural network system. These systems are used to predict a specific human car driving violation: not respecting the priority-to-the-right rule, which is a decision to remove a barrier. Both prediction systems learn from previous violation occurrences, using the BCD model and four criteria: safety, for identifying the deficit or the danger; and opportunity for action, driver comfort, and time spent, for identifying the benefits or the costs. The application of the learning systems to predict car driving violations gives a correct-prediction rate of over 80% after 10 iterations. These results are validated for the non-respect of the priority-to-the-right rule.
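
The case-based reasoning predictor described in this record can be pictured as nearest-neighbour retrieval over the four BCD criteria. A toy sketch (the case base and all feature values are invented for illustration, not data from the paper):

```python
import math

# Toy case base over the four BCD criteria
# (safety, opportunity for action, driver comfort, time spent) -> violated?
# Every case and feature value below is hypothetical.
CASES = [
    ((0.9, 0.2, 0.5, 0.8), False),
    ((0.2, 0.8, 0.7, 0.3), True),
    ((0.8, 0.3, 0.4, 0.7), False),
    ((0.3, 0.9, 0.8, 0.2), True),
]

def predict_violation(features):
    """1-nearest-neighbour retrieval: reuse the outcome of the closest past case."""
    _, label = min(CASES, key=lambda case: math.dist(case[0], features))
    return label
```

A real implementation would weight the criteria (safety identifying the deficit, the other three identifying benefits and costs) rather than treat them as an unweighted Euclidean space.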

  16. Probability for human intake of an atom randomly released into ground, rivers, oceans and air

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, B L

    1984-08-01

    Numerical estimates are developed for the probability that an atom randomly released in the top ground layers, in a river, or in the oceans will be ingested orally by a human, and for the probability that an atom emitted from an industrial source will be inhaled by a human. Estimates are obtained both for the probability per year and for the total eventual probability. Results vary considerably for different elements, but typical values for the total probabilities are: ground, 3 x 10^-3; oceans, 3 x 10^-4; rivers, 1.7 x 10^-4; and air, 5 x 10^-6. Probabilities per year are typically 1 x 10^-7 for releases into the ground and 5 x 10^-8 for releases into the oceans. These results indicate that for material with very long-lasting toxicity, it is important to include the pathways from the ground and from the oceans.
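
    As a quick sanity check on the figures quoted above: if the annual intake probability were roughly constant, dividing the total eventual probability by the probability per year would give an effective exposure time in years. That constant-rate assumption is our simplification, not necessarily Cohen's derivation.

```python
# Back-of-the-envelope check on the quoted estimates (our simplifying
# assumption of a constant annual intake probability, not the paper's
# method): total / per-year = effective exposure time in years.
totals = {"ground": 3e-3, "oceans": 3e-4}      # total eventual probability
per_year = {"ground": 1e-7, "oceans": 5e-8}    # probability per year

for medium in totals:
    years = totals[medium] / per_year[medium]
    print(f"{medium}: ~{years:,.0f} effective years")  # ground: 30,000; oceans: 6,000
```

    The tens-of-thousands-of-years scale is consistent with the paper's point that these pathways matter mainly for very long-lived toxicity.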

  17. How to Cope with the Rare Human Error Events Involved with Organizational Factors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Luo, Meiling; Lee, Yong Hee [Korea Atomic Research Institute, Daejeon (Korea, Republic of)]

    2014-10-15

    The current human error guidelines (e.g. US DOD handbooks, US NRC guidelines) are representative tools for preventing human errors. These tools, however, are limited in that they do not cover all operating situations and circumstances, such as design-base events; in other words, they apply only to foreseeable, standardized operating situations. In this study, our research team proposed an evidence-based approach, such as the UK's safety case, for coping with rare human error events such as the TMI, Chernobyl and Fukushima accidents, which are representative events involving rare human errors. Our research team defined 'rare human errors' as events with the following three characteristics: extremely low frequency; extremely complicated structure; and extremely serious damage to human life and property. A safety case is a structured argument, supported by evidence, intended to justify that a system is acceptably safe. The definition in UK defence standard 00-56 issue 4 states that such an evidence-based approach can be contrasted with a prescriptive approach to safety certification, which requires safety to be justified using a prescribed process. Safety management and safety regulatory activities based on the safety case are effective in controlling organizational factors in terms of integrated safety management. In particular, for safety issues relevant to public acceptance, the safety case is useful for providing practical evidence to the public in a reasoned way. The European Union, including the UK, has developed the concept of an engineered safety management system that uses the safety case to deal with public acceptance. In the Korean nuclear industry, the Korean Atomic Research Institute first performed basic research on adapting the safety case to the field of radioactive waste, according to IAEA SSG-23 (KAERI/TR-4497, 4531). Apart from radioactive waste, no attempt has yet been made to adopt the safety case. Most incidents and accidents involving humans during the operation of NPPs have a tendency

  18. How to Cope with the Rare Human Error Events Involved with Organizational Factors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Luo, Meiling; Lee, Yong Hee

    2014-01-01

    The current human error guidelines (e.g. US DOD handbooks, US NRC guidelines) are representative tools for preventing human errors. These tools, however, are limited in that they do not cover all operating situations and circumstances, such as design-base events; in other words, they apply only to foreseeable, standardized operating situations. In this study, our research team proposed an evidence-based approach, such as the UK's safety case, for coping with rare human error events such as the TMI, Chernobyl and Fukushima accidents, which are representative events involving rare human errors. Our research team defined 'rare human errors' as events with the following three characteristics: extremely low frequency; extremely complicated structure; and extremely serious damage to human life and property. A safety case is a structured argument, supported by evidence, intended to justify that a system is acceptably safe. The definition in UK defence standard 00-56 issue 4 states that such an evidence-based approach can be contrasted with a prescriptive approach to safety certification, which requires safety to be justified using a prescribed process. Safety management and safety regulatory activities based on the safety case are effective in controlling organizational factors in terms of integrated safety management. In particular, for safety issues relevant to public acceptance, the safety case is useful for providing practical evidence to the public in a reasoned way. The European Union, including the UK, has developed the concept of an engineered safety management system that uses the safety case to deal with public acceptance. In the Korean nuclear industry, the Korean Atomic Research Institute first performed basic research on adapting the safety case to the field of radioactive waste, according to IAEA SSG-23 (KAERI/TR-4497, 4531). Apart from radioactive waste, no attempt has yet been made to adopt the safety case. Most incidents and accidents involving humans during the operation of NPPs have a tendency

  19. Human error in strabismus surgery : Quantification with a sensitivity analysis

    NARCIS (Netherlands)

    Schutte, S.; Polling, J.R.; Van der Helm, F.C.T.; Simonsz, H.J.

    2008-01-01

    Background- Reoperations are frequently necessary in strabismus surgery. The goal of this study was to analyze human-error related factors that introduce variability in the results of strabismus surgery in a systematic fashion. Methods- We identified the primary factors that influence the outcome of

  20. New classification of operators' human errors at overseas nuclear power plants and preparation of easy-to-use case sheets

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2004-01-01

    At nuclear power plants, plant operators examine human error cases, including those that occurred at other plants, so that they can learn from such experiences and avoid making similar errors again. Although there is little data available on errors made at domestic plants, nuclear operators in foreign countries report even minor irregularities and signs of faults, so a large amount of data on human errors at overseas plants could be collected and examined. However, these overseas data have not been used effectively, because most of them are poorly organized or not properly classified and are often hard to understand. Accordingly, we carried out a study of human error cases at overseas power plants in order to help plant personnel clearly understand overseas experiences and avoid repeating similar errors. The study produced the following results, which were put to use at nuclear power plants and other facilities. (1) ''One-Point-Advice'' refers to a practice whereby a leader gives his team of operators pieces of advice before starting work in order to prevent human errors. Based on this practice and on those used in the aviation industry, we developed a new method of classifying human errors that consists of four basic actions and three applied actions. (2) We used this new classification method to classify human errors made by operators at overseas nuclear power plants. The results show that the most frequent errors were caused not by the operators themselves but by insufficient team monitoring, for which superiors and/or colleagues were responsible. We therefore analyzed and classified the possible factors contributing to insufficient team monitoring, and demonstrated that such frequent errors have also occurred at domestic power plants. (3) Using the new classification scheme, we prepared human error case sheets that are easy for plant personnel to understand. The sheets are designed to make the data more understandable and easier to remember

  1. Critical lengths of error events in convolutional codes

    DEFF Research Database (Denmark)

    Justesen, Jørn

    1994-01-01

    If the calculation of the critical length is based on the expurgated exponent, the length becomes nonzero for low error probabilities. This result applies to typical long codes, but it may also be useful for modeling error events in specific codes.

  2. Critical Lengths of Error Events in Convolutional Codes

    DEFF Research Database (Denmark)

    Justesen, Jørn; Andersen, Jakob Dahl

    1998-01-01

    If the calculation of the critical length is based on the expurgated exponent, the length becomes nonzero for low error probabilities. This result applies to typical long codes, but it may also be useful for modeling error events in specific codes.

  3. Analysis of the "naming game" with learning errors in communications.

    Science.gov (United States)

    Lou, Yang; Chen, Guanrong

    2015-07-16

    The naming game simulates the process of naming an objective by a population of agents organized in a certain communication network. Through pair-wise iterative interactions, the population reaches consensus asymptotically. We study the naming game with communication errors during pair-wise conversations, with error rates drawn from a uniform probability distribution. First, a model of the naming game with learning errors in communications (NGLE) is proposed. Then, a strategy for agents to prevent learning errors is suggested. To that end, three typical topologies of communication networks, namely random-graph, small-world and scale-free networks, are employed to investigate the effects of various learning errors. Simulation results on these models show that 1) learning errors slightly affect the convergence speed but distinctly increase the memory required of each agent during lexicon propagation; 2) the maximum number of different words held by the population increases linearly as the error rate increases; 3) without any strategy to eliminate learning errors, there is a threshold of learning errors beyond which convergence is impaired. The new findings may help to better understand the role of learning errors in the naming game, as well as in human language development, from a network science perspective.
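
    A minimal baseline naming game (without the learning-error extension the paper adds) can be sketched as follows; the network, seed and iteration count are illustrative choices, not the paper's.

```python
# Minimal naming-game sketch (error-free baseline; the paper's NGLE model
# adds learning errors on top of this). All parameters are illustrative.
import random

random.seed(1)
n = 50
agents = [set() for _ in range(n)]      # each agent's lexicon of candidate words
edges = [(i, j) for i in range(n)
         for j in range(i + 1, n) if random.random() < 0.2]  # random graph
next_word = 0

for _ in range(20000):
    i, j = random.choice(edges)         # speaker i talks to hearer j
    if not agents[i]:                   # empty lexicon: invent a new word
        agents[i].add(next_word)
        next_word += 1
    word = random.choice(sorted(agents[i]))
    if word in agents[j]:               # success: both collapse to this word
        agents[i] = {word}
        agents[j] = {word}
    else:                               # failure: hearer memorizes the word
        agents[j].add(word)

distinct = {frozenset(a) for a in agents}
print("distinct lexicons remaining:", len(distinct))
```

    In the NGLE model, the "failure" branch would additionally corrupt the transmitted word with some probability, which is what inflates the agents' memory requirements reported above.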

  4. Systematic Analysis of Video Data from Different Human-Robot Interaction Studies: A Categorisation of Social Signals During Error Situations

    OpenAIRE

    Manuel Giuliani; Nicole Mirnig; Gerald Stollnberger; Susanne Stadler; Roland Buchner; Manfred Tscheligi

    2015-01-01

    Human–robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human–robot interaction experiments. For that, we analyzed 201 videos of five human–robot interaction user studies with varying tasks from four independent projects. The analysis shows tha...

  5. DNA double-strand-break complexity levels and their possible contributions to the probability for error-prone processing and repair pathway choice.

    Science.gov (United States)

    Schipler, Agnes; Iliakis, George

    2013-09-01

    Although the DNA double-strand break (DSB) is defined as a rupture in the double-stranded DNA molecule that can occur without chemical modification in any of the constituent building blocks, it is recognized that this form is restricted to enzyme-induced DSBs. DSBs generated by physical or chemical agents can include at the break site a spectrum of base alterations (lesions). The nature and number of such chemical alterations define the complexity of the DSB and are considered putative determinants for repair pathway choice and the probability that errors will occur during this processing. As the pathways engaged in DSB processing show distinct and frequently inherent propensities for errors, pathway choice also defines the error-levels cells opt to accept. Here, we present a classification of DSBs on the basis of increasing complexity and discuss how complexity may affect processing, as well as how it may cause lethal or carcinogenic processing errors. By critically analyzing the characteristics of DSB repair pathways, we suggest that all repair pathways can in principle remove lesions clustering at the DSB but are likely to fail when they encounter clusters of DSBs that cause a local form of chromothripsis. In the same framework, we also analyze the rationale of DSB repair pathway choice.

  6. Trend analysis and comparison of operators' human error events occurred at overseas and domestic nuclear power plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2006-01-01

    Human errors by operators at overseas and domestic nuclear power plants during the period from 2002 to 2005 were compared and their trends analyzed. The most frequently cited cause of such errors was 'insufficient team monitoring' (inadequate instruction and supervision by superiors and other crew members) at both overseas and domestic plants, followed by 'insufficient self-checking' (lack of caution by the operator himself). A comparison of the effects of the errors on plant operations in Japan and the United States showed that drops in plant output and plant shutdowns at plants in Japan were approximately one-tenth of those in the United States. The ratio of automatic reactor trips to the total number of human errors reported is about 6% for both Japanese and American plants. Looking at changes in the incidence of human errors by year of occurrence, although a distinctive trend cannot be identified for domestic nuclear power plants owing to the small number of reported cases, 'inadequate self-checking' as a factor contributing to human errors at overseas nuclear power plants has decreased significantly over the past four years. Regarding changes in the effects of human errors on plant operations during the four-year period, events leading to an automatic reactor trip have tended to increase at American plants. Conceivable factors behind this increasing tendency include a team's lack of operating experience (e.g., with plant transients and reactor shutdowns and startups) and excessive dependence on training simulators. (author)

  7. Sleep quality, posttraumatic stress, depression, and human errors in train drivers: a population-based nationwide study in South Korea.

    Science.gov (United States)

    Jeon, Hong Jin; Kim, Ji-Hae; Kim, Bin-Na; Park, Seung Jin; Fava, Maurizio; Mischoulon, David; Kang, Eun-Ho; Roh, Sungwon; Lee, Dongsoo

    2014-12-01

    Human error is defined as an unintended error that is attributable to humans rather than machines, and that is important to avoid to prevent accidents. We aimed to investigate the association between sleep quality and human errors among train drivers. Cross-sectional. Population-based. A sample of 5,480 subjects who were actively working as train drivers were recruited in South Korea. The participants were 4,634 drivers who completed all questionnaires (response rate 84.6%). None. The Pittsburgh Sleep Quality Index (PSQI), the Center for Epidemiologic Studies Depression Scale (CES-D), the Impact of Event Scale-Revised (IES-R), the State-Trait Anxiety Inventory (STAI), and the Korean Occupational Stress Scale (KOSS). Of 4,634 train drivers, 349 (7.5%) showed more than one human error per 5 y. Human errors were associated with poor sleep quality, higher PSQI total scores, short sleep duration at night, and longer sleep latency. Among train drivers with poor sleep quality, those who experienced severe posttraumatic stress showed a significantly higher number of human errors than those without. Multiple logistic regression analysis showed that human errors were significantly associated with poor sleep quality and posttraumatic stress, whereas there were no significant associations with depression, trait and state anxiety, and work stress after adjusting for age, sex, education years, marital status, and career duration. Poor sleep quality was found to be associated with more human errors in train drivers, especially in those who experienced severe posttraumatic stress. © 2014 Associated Professional Sleep Societies, LLC.

  8. Human performance analysis in the frame of probabilistic safety assessment of research reactors

    International Nuclear Information System (INIS)

    Farcasiu, Mita; Nitoi, Mirela; Apostol, Minodora; Turcu, I.; Florescu, Gh.

    2005-01-01

    Full text: The analysis of operating experience has identified the importance of human performance for the reliability and safety of research reactors. In the Probabilistic Safety Assessment (PSA) of nuclear facilities, human performance analysis (HPA) is used to estimate the contribution of human error to the failure of system components or functions. HPA is a qualitative and quantitative analysis of human actions identified for error-likely or accident-prone situations. The qualitative analysis identifies all man-machine interfaces that can lead to an accident, the types of human interactions that may mitigate or exacerbate the accident, the types of human errors, and the performance shaping factors. The quantitative analysis develops estimates of human error probability as a measure of the effect of human performance on reliability and safety. The goal of this paper is to carry out an HPA within the PSA framework for research reactors. Human error probabilities estimated from the analysis of human actions could be included in system event trees and/or system fault trees. Sensitivity analyses were performed to determine how sensitive human performance is to systematic variations in both the level of dependence between human actions and the operator stress level. The necessary information was obtained from the operating experience of the TRIGA research reactor at INR Pitesti; the required data were obtained from generic databases. (authors)
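
    The way an estimated human error probability (HEP) enters a system fault tree can be sketched with a toy example; the gate structure and all numbers below are invented for illustration and are not from the TRIGA study.

```python
# Toy sketch of folding a human error probability (HEP) into a small fault
# tree: the top event occurs if the pump fails OR (the valve fails AND the
# operator fails to recover). All numbers are invented for illustration.
hep_recovery = 0.03                  # hypothetical operator non-recovery HEP
p_pump, p_valve = 1.0e-3, 5.0e-3     # hypothetical hardware failure probabilities

def top_event(hep):
    """P(pump OR (valve AND operator)), assuming independent events."""
    p_branch = p_valve * hep
    return p_pump + p_branch - p_pump * p_branch

print(f"nominal HEP:  {top_event(hep_recovery):.2e}")
print(f"doubled HEP:  {top_event(2 * hep_recovery):.2e}")  # crude stress sensitivity
```

    A sensitivity analysis of the kind described above amounts to recomputing the top-event probability while scaling the HEPs (e.g., for higher stress or stronger dependence between actions).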

  9. Good people who try their best can have problems: recognition of human factors and how to minimise error.

    Science.gov (United States)

    Brennan, Peter A; Mitchell, David A; Holmes, Simon; Plint, Simon; Parry, David

    2016-01-01

    Human error is as old as humanity itself and is an appreciable cause of mistakes by both organisations and people. Much of the work related to human factors in causing error has originated from aviation, where mistakes can be catastrophic not only for those who contribute to the error, but for passengers as well. The role of human error in medical and surgical incidents, which are often multifactorial, is becoming better understood, and includes both organisational issues (by the employer) and potential human factors (at a personal level). Mistakes as a result of individual human factors and surgical teams should be better recognised and emphasised. Attitudes to, and acceptance of, preoperative briefing have improved since the introduction of the World Health Organization (WHO) surgical checklist. However, this does not address limitations or other safety concerns that are related to performance, such as stress and fatigue, emotional state, hunger, awareness of what is going on (situational awareness), and other factors that could potentially lead to error. Here we attempt to raise awareness of these human factors, highlight how they can lead to error, and show how they can be minimised in our day-to-day practice. Can hospitals move from being "high risk industries" to "high reliability organisations"? Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  10. Identification and Assessment of Human Errors in Postgraduate Endodontic Students of Kerman University of Medical Sciences by Using the SHERPA Method

    Directory of Open Access Journals (Sweden)

    Saman Dastaran

    2016-03-01

    Introduction: Human errors are the cause of many accidents, both industrial and medical, so finding an approach for identifying and reducing them is very important. Since no study has been done on human errors in the dental field, this study aimed to identify and assess human errors in postgraduate endodontic students of Kerman University of Medical Sciences by using the SHERPA method. Methods: This cross-sectional study was performed during the year 2014. Data were collected through task observation and interviews with postgraduate endodontic students. Overall, 10 critical tasks, which were most likely to cause harm to patients, were determined. Next, a Hierarchical Task Analysis (HTA) was conducted, and the human errors in each task were identified using Systematic Human Error Reduction and Prediction Approach (SHERPA) worksheets. Results: After analyzing the SHERPA worksheets, 90 human errors were identified, comprising action errors (67.7%), checking errors (13.3%), selection errors (8.8%), retrieval errors (5.5%) and communication errors (4.4%); action errors were thus the most common and communication errors the least common. Conclusions: The results of the study showed that the highest percentage of errors and the highest level of risk were associated with action errors. Therefore, to reduce the occurrence of such errors and limit their consequences, control measures should be put in place, including periodic training in work procedures, provision of work checklists, development of guidelines, and establishment of a systematic and standardized reporting system. Based on these results, the control of recovery errors, which carry the highest percentage of undesirable risk, and of action errors, which occur most frequently, should be prioritized.

  11. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L. [Pacific Science and Engineering Group, San Diego, CA (United States)] [and others]

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error.

  12. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    International Nuclear Information System (INIS)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L.

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error

  13. Process error rates in general research applications to the Human ...

    African Journals Online (AJOL)

    Objective. To examine process error rates in applications for ethics clearance of health research. Methods. Minutes of 586 general research applications made to a human health research ethics committee (HREC) from April 2008 to March 2009 were examined. Rates of approval were calculated and reasons for requiring ...

  14. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    International Nuclear Information System (INIS)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves

    2009-01-01

    The THERP methodology is classified as a first-generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step in the development of HRA techniques in the industry. Because it is a first-generation technique, THERP's quantification tables of human errors are based on a taxonomy that does not take human error mechanisms into account. With respect to the three cognitive levels in Rasmussen's framework for human cognitive information processing, THERP deals in most cases with errors that occur at the perceptual-motor (stimulus-response) level. At the rule-based level, the technique can work better by using the time-dependent probability curves for diagnosis errors obtained in nuclear power plant simulators; nevertheless, this is done without modeling any error mechanisms. Another deficiency is that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probability) standard scenarios, along with the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates human error probabilities when HRA techniques are applied to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, a first-generation HRA technique, with CREAM and ATHEANA, which are second-generation HRA techniques. (author)
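
    Time-dependent diagnosis-error curves of the kind mentioned above are typically applied by interpolating between tabulated anchor points. A hedged sketch of log-log interpolation follows; the anchor values are made up for illustration and are not THERP's actual table (see NUREG/CR-1278 for the real nominal diagnosis model).

```python
# Log-log interpolation of a time-dependent diagnosis-error curve, in the
# spirit of THERP's nominal diagnosis model. The anchor points below are
# invented for illustration; the real values are in NUREG/CR-1278.
import math

anchors = [(1, 1.0), (10, 0.1), (30, 1e-3), (60, 1e-4)]  # (minutes, HEP)

def hep(t):
    """Interpolated human error probability of misdiagnosis at time t (minutes)."""
    if t <= anchors[0][0]:
        return anchors[0][1]
    if t >= anchors[-1][0]:
        return anchors[-1][1]
    for (t0, p0), (t1, p1) in zip(anchors, anchors[1:]):
        if t0 <= t <= t1:
            f = math.log(t / t0) / math.log(t1 / t0)   # position between anchors
            return math.exp(math.log(p0) + f * math.log(p1 / p0))

print(f"HEP at 20 min: {hep(20):.2e}")
```

    Interpolating in log space keeps the curve monotone and spans the several orders of magnitude such tables cover; a linear interpolation would badly overestimate the mid-range values.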

  15. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)], e-mail: bayout@cnen.gov.br, e-mail: rfonseca@cnen.gov.br

    2009-07-01

    The THERP methodology is classified as a first-generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step in the development of HRA techniques in the industry. Because it is a first-generation technique, THERP's quantification tables of human errors are based on a taxonomy that does not take human error mechanisms into account. With respect to the three cognitive levels in Rasmussen's framework for human cognitive information processing, THERP deals in most cases with errors that occur at the perceptual-motor (stimulus-response) level. At the rule-based level, the technique can work better by using the time-dependent probability curves for diagnosis errors obtained in nuclear power plant simulators; nevertheless, this is done without modeling any error mechanisms. Another deficiency is that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probability) standard scenarios, along with the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates human error probabilities when HRA techniques are applied to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, a first-generation HRA technique, with CREAM and ATHEANA, which are second-generation HRA techniques. (author)

  16. Inborn errors of human STAT1: allelic heterogeneity governs the diversity of immunological and infectious phenotypes

    Science.gov (United States)

    Boisson-Dupuis, Stephanie; Kong, Xiao-Fei; Okada, Satoshi; Cypowyj, Sophie; Puel, Anne; Abel, Laurent; Casanova, Jean-Laurent

    2012-01-01

    The genetic dissection of various human infectious diseases has led to the definition of inborn errors of human STAT1 immunity of four types, including (i) autosomal recessive (AR) complete STAT1 deficiency, (ii) AR partial STAT1 deficiency, (iii) autosomal dominant (AD) STAT1 deficiency, and (iv) AD gain of STAT1 activity. The two types of AR STAT1 defect give rise to a broad infectious phenotype with susceptibility to intramacrophagic bacteria (mostly mycobacteria) and viruses (herpes viruses at least), due principally to the impairment of IFN-γ-mediated and IFN-α/β-mediated immunity, respectively. Clinical outcome depends on the extent to which the STAT1 defect decreases responsiveness to these cytokines. AD STAT1 deficiency selectively predisposes individuals to mycobacterial disease, owing to the impairment of IFN-γ-mediated immunity, as IFN-α/β-mediated immunity is maintained. Finally, AD gain of STAT1 activity is associated with autoimmunity, probably owing to an enhancement of IFN-α/β-mediated immunity. More surprisingly, it is also associated with chronic mucocutaneous candidiasis, through as yet undetermined mechanisms involving an inhibition of the development of IL-17-producing T cells. Thus, germline mutations in human STAT1 define four distinct clinical disorders. Various combinations of viral, mycobacterial and fungal infections are therefore allelic at the human STAT1 locus. These experiments of Nature neatly highlight the clinical and immunological impact of the human genetic dissection of infectious phenotypes. PMID:22651901

  17. Modelling soft error probability in firmware: A case study

    African Journals Online (AJOL)

    The purpose is to estimate the probability that external disruptive events (such as ...

  18. Accounting for measurement error in human life history trade-offs using structural equation modeling.

    Science.gov (United States)

    Helle, Samuli

    2018-03-01

    Revealing causal effects from correlative data is very challenging and a contemporary problem in human life history research owing to the lack of experimental approach. Problems with causal inference arising from measurement error in independent variables, whether related either to inaccurate measurement technique or validity of measurements, seem not well-known in this field. The aim of this study is to show how structural equation modeling (SEM) with latent variables can be applied to account for measurement error in independent variables when the researcher has recorded several indicators of a hypothesized latent construct. As a simple example of this approach, measurement error in lifetime allocation of resources to reproduction in Finnish preindustrial women is modelled in the context of the survival cost of reproduction. In humans, lifetime energetic resources allocated in reproduction are almost impossible to quantify with precision and, thus, typically used measures of lifetime reproductive effort (e.g., lifetime reproductive success and parity) are likely to be plagued by measurement error. These results are contrasted with those obtained from a traditional regression approach where the single best proxy of lifetime reproductive effort available in the data is used for inference. As expected, the inability to account for measurement error in women's lifetime reproductive effort resulted in the underestimation of its underlying effect size on post-reproductive survival. This article emphasizes the advantages that the SEM framework can provide in handling measurement error via multiple-indicator latent variables in human life history studies. © 2017 Wiley Periodicals, Inc.
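The attenuation effect described in this abstract is easy to demonstrate. The following sketch (all numbers illustrative, not from the Finnish preindustrial data) simulates a latent predictor observed only through an error-prone proxy and shows the regression slope shrinking toward zero:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Latent predictor (e.g., true lifetime reproductive effort) and its effect.
effort = rng.normal(0.0, 1.0, n)
beta_true = 0.5
survival = beta_true * effort + rng.normal(0.0, 1.0, n)

# Observed proxy = latent value + measurement error.
sigma_err = 1.0
proxy = effort + rng.normal(0.0, sigma_err, n)

# OLS slope of survival on the error-prone proxy.
beta_hat = np.cov(proxy, survival)[0, 1] / np.var(proxy, ddof=1)

# Classical attenuation: E[beta_hat] = beta_true * var(x) / (var(x) + var(err)),
# i.e., about 0.25 here instead of the true 0.5.
print(round(beta_hat, 2))
```

Multiple-indicator SEM recovers the latent variance and thereby undoes exactly this shrinkage, which is the advantage the article emphasizes.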

  19. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  20. A model-based and computer-aided approach to analysis of human errors in nuclear power plants

    International Nuclear Information System (INIS)

    Yoon, Wan C.; Lee, Yong H.; Kim, Young S.

    1996-01-01

    Since the operator's mission in NPPs is increasingly defined by cognitive tasks such as monitoring, diagnosis and planning, the focus of human error analysis should also move from external actions to internal decision-making processes. While more elaborate analysis of cognitive aspects of human errors will help understand their causes and derive effective countermeasures, a lack of framework and an arbitrary resolution of description may hamper the effectiveness of such analysis. This paper presents new model-based schemes of event description and error classification as well as an interactive computerized support system. The schemes and the support system were produced in an effort to develop an improved version of HPES. The use of a decision-making model enables the analyst to document cognitive aspects of human performance explicitly and in a proper resolution. The stage-specific terms used in the proposed schemes make the task of characterizing human errors easier and confident for field analysts. The support system was designed to help the analyst achieve a contextually well-integrated analysis throughout the different parts of HPES

  1. The current approach to human error and blame in the NHS.

    Science.gov (United States)

    Ottewill, Melanie

    There is a large body of research to suggest that serious errors are widespread throughout medicine. The traditional response to these adverse events has been to adopt a 'person approach' - blaming the individual seen as 'responsible'. The culture of medicine is highly complicit in this response. Such an approach results in enormous personal costs to the individuals concerned and does little to address the root causes of errors and thus prevent their recurrence. Other industries, such as aviation, where safety is a paramount concern and which have similar structures to the medical profession, have, over the past decade or so, adopted a 'systems' approach to error, recognizing that human error is ubiquitous and inevitable and that systems need to be developed with this in mind. This approach has been highly successful, but has necessitated, first and foremost, a cultural shift. It is in the best interests of patients, and medical professionals alike, that such a shift is embraced in the NHS.

  2. Basic design of multimedia system for the representation of human error cases in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Park, Geun Ok [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

We have developed a multimedia system for the representation of human error cases so that education and training on human errors can be done effectively. The following are the major topics of the basic design: (1) establishment of a basic concept for representing human error cases using multimedia, (2) establishment of a design procedure for the multimedia system, (3) establishment of a hardware and software environment for operating the multimedia system, and (4) design of multimedia input and output interfaces. In order to verify the results of this basic design, we implemented the basic design with an incident triggered by an operator's misaction which occurred at Uljin NPP Unit 1. (Author) 12 refs., 30 figs.

  3. Development of Human Factor Management Requirements and Human Error Classification for the Prevention of Railway Accident

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Park, Chan Woo; Shin, Seung Ryoung

    2008-08-01

Railway accident analysis results show that accidents caused by human factors are not decreasing, whereas hardware-related accidents are steadily decreasing. Efficient management of human factors requires considerable expertise in design, working conditions, safety culture, and staffing. However, current safety management activities for safety-critical work focus on training, owing to limited resources and information. In order to improve railway safety, this report proposes human factors management requirements for safety-critical workers together with a human error classification. To this end, accident analyses, the status of safety measures on human factors, the safety management system for safety-critical workers, and current safety planning are examined.

  4. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
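The core of such a model can be sketched in a few lines. The partner-count distribution and per-partnership probability below are made-up placeholders, not the paper's inputs; the point is only the mechanics of averaging 1 − (1 − p)^n over the population:

```python
import numpy as np

# Hypothetical inputs (NOT the paper's): lifetime partner counts, their
# population shares, and a per-partnership acquisition probability.
partners = np.array([0, 1, 2, 4, 8, 15])
weights = np.array([0.05, 0.25, 0.20, 0.20, 0.20, 0.10])
p_per_partner = 0.4

# P(acquire | n partners) = 1 - (1 - p)^n; average over those with >= 1 partner.
p_acquire = 1.0 - (1.0 - p_per_partner) ** partners
mask = partners >= 1
lifetime_prob = np.average(p_acquire[mask], weights=weights[mask])
print(round(lifetime_prob, 3))  # ~0.735 under these toy inputs
```

Even modest per-partnership probabilities push the lifetime figure high, which is why the paper's base case exceeds 80% for both sexes.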

  5. Estimating Recovery Failure Probabilities in Off-normal Situations from Full-Scope Simulator Data

    Energy Technology Data Exchange (ETDEWEB)

Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

As part of this effort, KAERI developed the Human Reliability data EXtraction (HuREX) framework and is collecting full-scope simulator-based human reliability data into the OPERA (Operator PErformance and Reliability Analysis) database. In this study, as part of a series of estimation studies on HEPs and PSF effects, recovery failure probabilities (RFPs), which are significant information for a quantitative HRA, were produced from the OPERA database. Unsafe acts can occur at any time in safety-critical systems, and operators often manage the systems by discovering their errors and eliminating or mitigating them. To model recovery processes or recovery strategies, several studies have categorized recovery behaviors. Because recent human error trends need to be considered in a human reliability analysis, the data collection of Jang et al. can be seen as an essential effort. However, since those empirical results regarding soft controls were produced in a controlled laboratory environment with student participants, it is necessary to analyze a wider range of operator behaviors using full-scope simulators. This paper presents statistics related to human error recovery behaviors obtained from full-scope simulations in which on-site operators participated. In this study, recovery effects by shift changes or technical support centers were not considered owing to a lack of simulation data.

  6. Error detecting capabilities of the shortened Hamming codes adopted for error detection in IEEE Standard 802.3

    Science.gov (United States)

    Fujiwara, Toru; Kasami, Tadao; Lin, Shu

    1989-09-01

    The error-detecting capabilities of the shortened Hamming codes adopted for error detection in IEEE Standard 802.3 are investigated. These codes are also used for error detection in the data link layer of the Ethernet, a local area network. The weight distributions for various code lengths are calculated to obtain the probability of undetectable error and that of detectable error for a binary symmetric channel with bit-error rate between 0.00001 and 1/2.
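For a binary linear code on a binary symmetric channel, an error pattern goes undetected exactly when it equals a nonzero codeword, so the weight distribution yields the undetected-error probability directly. The sketch below uses the short (7,4) Hamming code as a stand-in, since the weight distributions of the long shortened codes of IEEE 802.3 are not reproduced here:

```python
def undetected_error_prob(weight_dist, n, p):
    """P_ud(p) = sum over nonzero weights i of A_i * p**i * (1 - p)**(n - i)."""
    return sum(a * p**i * (1 - p)**(n - i)
               for i, a in weight_dist.items() if i > 0)

# Weight distribution of the (7,4) Hamming code: A_0=1, A_3=7, A_4=7, A_7=1.
hamming74 = {0: 1, 3: 7, 4: 7, 7: 1}
for p in (1e-5, 1e-3, 0.5):
    print(p, undetected_error_prob(hamming74, 7, p))
# At p = 1/2 every nonzero codeword is equally likely: P_ud = 15/128.
```

Evaluating this sum across the bit-error-rate range 0.00001 to 1/2 is precisely the computation the abstract describes, once the shortened codes' weight distributions are known.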

  7. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  8. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  9. An Alternative Method to Compute the Bit Error Probability of Modulation Schemes Subject to Nakagami- Fading

    Directory of Open Access Journals (Sweden)

    Madeiro Francisco

    2010-01-01

This paper presents an alternative method for determining exact expressions for the bit error probability (BEP) of modulation schemes subject to Nakagami-m fading. In this method, the Nakagami-m fading channel is seen as an additive noise channel whose noise is modeled as the ratio between Gaussian and Nakagami-m random variables. The method consists of using the cumulative distribution function of the resulting noise to obtain closed-form expressions for the BEP of modulation schemes subject to Nakagami-m fading. In particular, the proposed method is used to obtain closed-form expressions for the BEP of M-ary quadrature amplitude modulation (M-QAM), M-ary pulse amplitude modulation (M-PAM), and rectangular quadrature amplitude modulation (R-QAM) under Nakagami-m fading. The main contribution of this paper is to show that this alternative method can be used to reduce the computational complexity for detecting signals in the presence of fading.

  10. A Generic Simulation Approach for the Fast and Accurate Estimation of the Outage Probability of Single Hop and Multihop FSO Links Subject to Generalized Pointing Errors

    KAUST Repository

    Ben Issaid, Chaouki; Park, Kihong; Alouini, Mohamed-Slim

    2017-01-01

When assessing the performance of the free space optical (FSO) communication systems, the outage probability encountered is generally very small, and thereby the use of naive Monte Carlo simulations becomes prohibitively expensive. To estimate these rare event probabilities, we propose in this work an importance sampling approach which is based on the exponential twisting technique to offer fast and accurate results. In fact, we consider a variety of turbulence regimes, and we investigate the outage probability of FSO communication systems, under a generalized pointing error model based on the Beckmann distribution, for both single and multihop scenarios. Selected numerical simulations are presented to show the accuracy and the efficiency of our approach compared to naive Monte Carlo.

  11. A Generic Simulation Approach for the Fast and Accurate Estimation of the Outage Probability of Single Hop and Multihop FSO Links Subject to Generalized Pointing Errors

    KAUST Repository

    Ben Issaid, Chaouki

    2017-07-28

When assessing the performance of the free space optical (FSO) communication systems, the outage probability encountered is generally very small, and thereby the use of naive Monte Carlo simulations becomes prohibitively expensive. To estimate these rare event probabilities, we propose in this work an importance sampling approach which is based on the exponential twisting technique to offer fast and accurate results. In fact, we consider a variety of turbulence regimes, and we investigate the outage probability of FSO communication systems, under a generalized pointing error model based on the Beckmann distribution, for both single and multihop scenarios. Selected numerical simulations are presented to show the accuracy and the efficiency of our approach compared to naive Monte Carlo.
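The exponential-twisting idea can be shown on a one-dimensional stand-in: estimating a Gaussian tail probability far smaller than naive Monte Carlo can resolve. For a standard normal, tilting the density by exp(t·x) simply shifts the mean to t; the choice t = a and the likelihood-ratio weight below are the textbook recipe, not the paper's Beckmann-model estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
a = 5.0            # estimate the rare event P(X > a) for X ~ N(0, 1)
n = 100_000

t = a                                   # twisting parameter: mean moved to the threshold
x = rng.normal(t, 1.0, n)               # draw from the twisted (shifted) density
weights = np.exp(-t * x + t**2 / 2.0)   # likelihood ratio f(x) / f_t(x)
estimate = np.mean((x > a) * weights)

print(estimate)  # close to the true tail probability Q(5) ~ 2.87e-7
```

Naive Monte Carlo would need on the order of 1/P ≈ 3.5 million samples per hit at this level, which is the cost blow-up the abstract refers to.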

  12. How we learn to make decisions: rapid propagation of reinforcement learning prediction errors in humans.

    Science.gov (United States)

    Krigolson, Olav E; Hassall, Cameron D; Handy, Todd C

    2014-03-01

    Our ability to make decisions is predicated upon our knowledge of the outcomes of the actions available to us. Reinforcement learning theory posits that actions followed by a reward or punishment acquire value through the computation of prediction errors-discrepancies between the predicted and the actual reward. A multitude of neuroimaging studies have demonstrated that rewards and punishments evoke neural responses that appear to reflect reinforcement learning prediction errors [e.g., Krigolson, O. E., Pierce, L. J., Holroyd, C. B., & Tanaka, J. W. Learning to become an expert: Reinforcement learning and the acquisition of perceptual expertise. Journal of Cognitive Neuroscience, 21, 1833-1840, 2009; Bayer, H. M., & Glimcher, P. W. Midbrain dopamine neurons encode a quantitative reward prediction error signal. Neuron, 47, 129-141, 2005; O'Doherty, J. P. Reward representations and reward-related learning in the human brain: Insights from neuroimaging. Current Opinion in Neurobiology, 14, 769-776, 2004; Holroyd, C. B., & Coles, M. G. H. The neural basis of human error processing: Reinforcement learning, dopamine, and the error-related negativity. Psychological Review, 109, 679-709, 2002]. Here, we used the brain ERP technique to demonstrate that not only do rewards elicit a neural response akin to a prediction error but also that this signal rapidly diminished and propagated to the time of choice presentation with learning. Specifically, in a simple, learnable gambling task, we show that novel rewards elicited a feedback error-related negativity that rapidly decreased in amplitude with learning. Furthermore, we demonstrate the existence of a reward positivity at choice presentation, a previously unreported ERP component that has a similar timing and topography as the feedback error-related negativity that increased in amplitude with learning. The pattern of results we observed mirrored the output of a computational model that we implemented to compute reward
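The diminishing feedback signal has the flavor of a simple delta-rule update, in which the prediction error shrinks as the value estimate converges. A minimal sketch (learning rate and reward schedule illustrative, not the study's gambling task):

```python
alpha = 0.1        # learning rate (illustrative)
reward = 1.0       # the chosen option reliably pays out
V = 0.0            # initial value estimate
deltas = []
for trial in range(30):
    delta = reward - V     # reward prediction error at feedback
    V += alpha * delta     # value update moves V toward the reward
    deltas.append(delta)

# delta decays geometrically, (1 - alpha)**trial: large early, near zero late.
print(round(deltas[0], 3), round(deltas[-1], 3))  # 1.0 0.047
```

The decreasing feedback error-related negativity in the study mirrors this decay, while the growing choice-locked reward positivity reflects the value signal migrating to the earlier, predictive event.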

  13. Probability of misclassifying biological elements in surface waters.

    Science.gov (United States)

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent to assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by M-C models, were used to identify cases in which values of any of the four biological indices lay outside of the "true" water body class, i.e., outside the class assigned from the actual physical measurements. Fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to WFD. The method can be readily applied also in risk assessment of water management decisions before adopting the status dependent corrective actions.
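The Monte-Carlo logic can be sketched as: perturb the "true" index value with simulated measurement error, reclassify each perturbed value, and count the fraction landing outside the true class. The boundaries and error magnitude below are hypothetical, not the study's WFD metric boundaries:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical class boundaries on a normalized 0-1 index scale; the actual
# WFD metrics (MIR, MMI, ...) have their own boundary values.
bounds = [0.2, 0.4, 0.6, 0.8]     # bad | poor | moderate | good | high

true_value = 0.62                 # "true" index value, close to a class boundary
sigma = 0.05                      # standard deviation of the measurement error
n = 200_000                      # length of the simulated M-C series

measured = true_value + rng.normal(0.0, sigma, n)   # error-prone measurements
true_class = np.digitize(true_value, bounds)
p_mis = float(np.mean(np.digitize(measured, bounds) != true_class))
print(round(p_mis, 3))  # roughly 0.34: proximity to a boundary drives the risk
```

The example makes the paper's headline result intuitive: misclassification probability is dominated by how close the true value sits to a class boundary relative to the metric's measurement error.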

  14. Human Inferences about Sequences: A Minimal Transition Probability Model.

    Directory of Open Access Journals (Sweden)

    Florent Meyniel

    2016-12-01

The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations include explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge.
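A drastically simplified cousin of such a model can be sketched with leaky transition counts, where the leak plays the role of the single free parameter governing how quickly old observations are forgotten. On fully random input, the inferred transition probabilities hover around 1/2 and the average surprise stays near one bit:

```python
import math
import random

random.seed(3)
leak = 0.95                        # forgetting factor (stand-in for the model's free parameter)
counts = [[1.0, 1.0], [1.0, 1.0]]  # Laplace-smoothed transition counts

seq = [random.randint(0, 1) for _ in range(500)]   # fully unpredictable stimuli
surprise = []
for prev, cur in zip(seq, seq[1:]):
    p = counts[prev][cur] / (counts[prev][0] + counts[prev][1])
    surprise.append(-math.log2(p))                 # surprise under current beliefs
    for a in (0, 1):                               # leaky memory: decay all counts...
        for b in (0, 1):
            counts[a][b] *= leak
    counts[prev][cur] += 1.0                       # ...then count the observed transition

avg_surprise = sum(surprise) / len(surprise)
print(round(avg_surprise, 2))  # near 1 bit, as expected for random input
```

Trial-by-trial fluctuations of this surprise signal around its mean are what the full Bayesian model uses to explain sequential effects and mismatch responses.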

  15. Impact of human error on lumber yield in rough mills

    Science.gov (United States)

    Urs Buehlmann; R. Edward Thomas; R. Edward Thomas

    2002-01-01

    Rough sawn, kiln-dried lumber contains characteristics such as knots and bark pockets that are considered by most people to be defects. When using boards to produce furniture components, these defects are removed to produce clear, defect-free parts. Currently, human operators identify and locate the unusable board areas containing defects. Errors in determining a...

  16. Human-simulation-based learning to prevent medication error: A systematic review.

    Science.gov (United States)

    Sarfati, Laura; Ranchon, Florence; Vantard, Nicolas; Schwiertz, Vérane; Larbre, Virginie; Parat, Stéphanie; Faudel, Amélie; Rioufol, Catherine

    2018-01-31

    In the past 2 decades, there has been an increasing interest in simulation-based learning programs to prevent medication error (ME). To improve knowledge, skills, and attitudes in prescribers, nurses, and pharmaceutical staff, these methods enable training without directly involving patients. However, best practices for simulation for healthcare providers are as yet undefined. By analysing the current state of experience in the field, the present review aims to assess whether human simulation in healthcare helps to reduce ME. A systematic review was conducted on Medline from 2000 to June 2015, associating the terms "Patient Simulation," "Medication Errors," and "Simulation Healthcare." Reports of technology-based simulation were excluded, to focus exclusively on human simulation in nontechnical skills learning. Twenty-one studies assessing simulation-based learning programs were selected, focusing on pharmacy, medicine or nursing students, or concerning programs aimed at reducing administration or preparation errors, managing crises, or learning communication skills for healthcare professionals. The studies varied in design, methodology, and assessment criteria. Few demonstrated that simulation was more effective than didactic learning in reducing ME. This review highlights a lack of long-term assessment and real-life extrapolation, with limited scenarios and participant samples. These various experiences, however, help in identifying the key elements required for an effective human simulation-based learning program for ME prevention: ie, scenario design, debriefing, and perception assessment. The performance of these programs depends on their ability to reflect reality and on professional guidance. Properly regulated simulation is a good way to train staff in events that happen only exceptionally, as well as in standard daily activities. By integrating human factors, simulation seems to be effective in preventing iatrogenic risk related to ME, if the program is

  17. Faces in places: humans and machines make similar face detection errors.

    Directory of Open Access Journals (Sweden)

    Bernard Marius 't Hart

The human visual system seems to be particularly efficient at detecting faces. This efficiency sometimes comes at the cost of wrongfully seeing faces in arbitrary patterns, including famous examples such as a rock configuration on Mars or a toast's roast patterns. In machine vision, face detection has made considerable progress and has become a standard feature of many digital cameras. The arguably most widespread algorithm for such applications (the "Viola-Jones" algorithm) achieves high detection rates at high computational efficiency. To what extent do the patterns that the algorithm mistakenly classifies as faces also fool humans? We selected three kinds of stimuli from real-life, first-person perspective movies based on the algorithm's output: correct detections ("real faces"), false positives ("illusory faces"), and correctly rejected locations ("non-faces"). Observers were shown pairs of these for 20 ms and had to direct their gaze to the location of the face. We found that illusory faces were mistaken for faces more frequently than non-faces. In addition, rotation of the real face yielded more errors, while rotation of the illusory face yielded fewer errors. Using colored stimuli increases overall performance, but does not change the pattern of results. When replacing the eye movement by a manual response, however, the preference for illusory faces over non-faces disappeared. Taken together, our data show that humans make similar face-detection errors as the Viola-Jones algorithm when directing their gaze to briefly presented stimuli. In particular, the relative spatial arrangement of oriented filters seems of relevance. This suggests that efficient face detection in humans is likely to be pre-attentive and based on rather simple features such as those encoded in the early visual system.

  18. A novel framework on exact average symbol error probabilities of multihop transmission over amplify-and-forward relay fading channels

    KAUST Repository

    Yilmaz, Ferkan; Kucur, Oǧuz; Alouini, Mohamed-Slim

    2010-01-01

    In this paper, we propose an analytical framework on the exact computation of the average symbol error probabilities (ASEP) of multihop transmission over generalized fading channels when an arbitrary number of amplify-and-forward relays is used. Our approach relies on moment generating function (MGF) framework to obtain exact single integral expressions which can be easily computed by Gauss-Chebyshev Quadrature (GCQ) rule. As such, the derived results are a convenient tool to analyze the ASEP performance of multihop transmission over amplify-and-forward relay fading channels. Numerical and simulation results, performed to verify the correctness of the proposed formulation, are in perfect agreement. © 2010 IEEE.

  19. A novel framework on exact average symbol error probabilities of multihop transmission over amplify-and-forward relay fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2010-09-01

    In this paper, we propose an analytical framework on the exact computation of the average symbol error probabilities (ASEP) of multihop transmission over generalized fading channels when an arbitrary number of amplify-and-forward relays is used. Our approach relies on moment generating function (MGF) framework to obtain exact single integral expressions which can be easily computed by Gauss-Chebyshev Quadrature (GCQ) rule. As such, the derived results are a convenient tool to analyze the ASEP performance of multihop transmission over amplify-and-forward relay fading channels. Numerical and simulation results, performed to verify the correctness of the proposed formulation, are in perfect agreement. © 2010 IEEE.
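The GCQ step can be illustrated on a single-hop special case. Below, the MGF-based BPSK integral over Rayleigh fading, Pe = (1/π)∫₀^{π/2} M(−1/sin²θ)dθ with M(s) = 1/(1 − s·γ̄), is mapped onto Chebyshev nodes; this is a toy stand-in for the paper's multihop amplify-and-forward expressions:

```python
import math

def asep_gcq(gbar, n=30):
    """Gauss-Chebyshev estimate of (1/pi) * int_0^{pi/2} M(-1/sin^2 t) dt."""
    total = 0.0
    for k in range(1, n + 1):
        x = math.cos((2 * k - 1) * math.pi / (2 * n))  # Chebyshev nodes on (-1, 1)
        t = (math.pi / 4) * (x + 1)                    # map nodes onto (0, pi/2)
        mgf = 1.0 / (1.0 + gbar / math.sin(t) ** 2)    # Rayleigh MGF at s = -1/sin^2 t
        total += mgf * (math.pi / 4) * math.sqrt(1.0 - x * x)
    return (1.0 / math.pi) * (math.pi / n) * total

gbar = 10.0                                            # average SNR
exact = 0.5 * (1.0 - math.sqrt(gbar / (1.0 + gbar)))   # closed-form Rayleigh BPSK BER
print(asep_gcq(gbar), exact)                           # the two agree closely
```

A handful of nodes already matches the closed form to several digits, which is why GCQ makes the paper's single-integral ASEP expressions cheap to evaluate.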

  20. Exact Symbol Error Probability of Square M-QAM Signaling over Generalized Fading Channels subject to Additive Generalized Gaussian Noise

    KAUST Repository

    Soury, Hamza

    2013-07-01

    This paper considers the average symbol error probability of square Quadrature Amplitude Modulation (QAM) coherent signaling over flat fading channels subject to additive generalized Gaussian noise. More specifically, a generic closedform expression in terms of the Fox H function and the bivariate Fox H function is offered for the extended generalized-K fading case. Simplifications for some special fading distributions such as generalized-K fading, Nakagami-m fading, and Rayleigh fading and special additive noise distributions such as Gaussian and Laplacian noise are then presented. Finally, the mathematical formalism is illustrated by some numerical examples verified by computer based simulations for a variety of fading and additive noise parameters.

  1. Action errors, error management, and learning in organizations.

    Science.gov (United States)

    Frese, Michael; Keith, Nina

    2015-01-03

    Every organization is confronted with errors. Most errors are corrected easily, but some may lead to negative consequences. Organizations often focus on error prevention as a single strategy for dealing with errors. Our review suggests that error prevention needs to be supplemented by error management--an approach directed at effectively dealing with errors after they have occurred, with the goal of minimizing negative and maximizing positive error consequences (examples of the latter are learning and innovations). After defining errors and related concepts, we review research on error-related processes affected by error management (error detection, damage control). Empirical evidence on positive effects of error management in individuals and organizations is then discussed, along with emotional, motivational, cognitive, and behavioral pathways of these effects. Learning from errors is central, but like other positive consequences, learning occurs under certain circumstances--one being the development of a mind-set of acceptance of human error.

  2. Error management process for power stations

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Takeda, Daisuke; Fujimoto, Junzo; Nagasaka, Akihiko

    2016-01-01

The purpose of this study is to establish an 'error management process for power stations' for systematizing human error prevention activities and fostering their continuous improvement. The following are proposed by deriving concepts for the error management process from existing knowledge and realizing them through application and evaluation of their effectiveness at a power station: an overall picture of the error management process that facilitates four functions requisite for managing human error prevention effectively (1. systematizing human error prevention tools, 2. identifying problems based on incident reports and taking corrective actions, 3. identifying good practices and potential problems for taking proactive measures, 4. prioritizing human error prevention tools based on identified problems); detailed steps for each activity (i.e. developing an annual plan for human error prevention, reporting and analyzing incidents and near misses) based on a model of human error causation; procedures and example items for identifying gaps between current and desired levels of execution and outputs of each activity; and stages for introducing and establishing the proposed error management process at a power station. By giving shape to the above proposals at a power station, systematization and continuous improvement of human error prevention activities in line with the actual situation of the power station can be expected. (author)

  3. Human Error Prediction and Countermeasures based on CREAM in Loading and Storage Phase of Spent Nuclear Fuel (SNF)

    International Nuclear Information System (INIS)

    Kim, Jae San; Kim, Min Su; Jo, Seong Youn

    2007-01-01

    With the steady demand for nuclear power energy in Korea, the amount of accumulated SNF has inevitably increased year by year. Thus far, SNF has been transported on-site from one unit to a nearby unit or to an on-site dry storage facility. In the near future, as the amount of SNF generated approaches the capacity of these facilities, a portion of it will be transported to another SNF storage facility. The process of transporting SNF involves human interactions such as inspecting and preparing the cask and spent fuel, loading the cask, transferring the cask, and storing or monitoring it. Human actions therefore play a significant role in SNF transportation. In analyzing incidents that have occurred during transport operations, several recent studies have indicated that 'human error' is a primary cause. The objectives of this study are therefore to predict and identify possible human errors during the loading and storage of SNF. Furthermore, after evaluating the human error potential of each process, countermeasures to minimize human error are deduced

  4. Reliability assessment for thickness measurements of pipe wall using probability of detection

    International Nuclear Information System (INIS)

    Nakamoto, Hiroyuki; Kojima, Fumio; Kato, Sho

    2013-01-01

    This paper proposes a reliability assessment method for thickness measurements of pipe walls using probability of detection (POD). Pipe-wall thicknesses are measured by qualified inspectors with ultrasonic thickness gauges. The inspection results are affected by the inspectors' human factors and include some errors, because inspectors differ in experience and inspection frequency. In order to ensure the reliability of inspection results, first, POD is used to evaluate experimental results of pipe-wall thickness inspection, and we verify that the results differ between inspectors, including qualified ones. Second, two human factors that affect POD are identified. Finally, it is confirmed that POD can identify these human factors and ensure the reliability of pipe-wall thickness inspections. (author)
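    Inspector-to-inspector differences of the kind described above are often summarized with a parametric POD curve. A minimal sketch using the log-logistic form common in NDT work, with invented parameters for two hypothetical inspectors (none of these numbers come from the paper):

    ```python
    import math

    def pod(a, mu, sigma):
        """Log-logistic probability of detection for a flaw of size `a`.
        mu is the log flaw size at 50% detection; sigma sets the curve's spread."""
        return 1.0 / (1.0 + math.exp(-(math.log(a) - mu) / sigma))

    # Hypothetical parameters for two inspectors with different experience levels.
    experienced = dict(mu=math.log(1.0), sigma=0.4)   # 50% detection at 1.0 mm
    novice      = dict(mu=math.log(2.0), sigma=0.6)   # 50% detection at 2.0 mm

    for a in (0.5, 1.0, 2.0, 4.0):
        print(f"a={a:3.1f} mm  POD(experienced)={pod(a, **experienced):.2f}  "
              f"POD(novice)={pod(a, **novice):.2f}")
    ```

    Fitting mu and sigma per inspector from hit/miss data would make the inspector-dependent reliability explicit, which is the kind of comparison the abstract describes.
    
    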

  5. Hierarchical learning induces two simultaneous, but separable, prediction errors in human basal ganglia.

    Science.gov (United States)

    Diuk, Carlos; Tsai, Karin; Wallis, Jonathan; Botvinick, Matthew; Niv, Yael

    2013-03-27

    Studies suggest that dopaminergic neurons report a unitary, global reward prediction error signal. However, learning in complex real-life tasks, in particular tasks that show hierarchical structure, requires multiple prediction errors that may coincide in time. We used functional neuroimaging to measure prediction error signals in humans performing such a hierarchical task involving simultaneous, uncorrelated prediction errors. Analysis of signals in a priori anatomical regions of interest in the ventral striatum and the ventral tegmental area indeed evidenced two simultaneous, but separable, prediction error signals corresponding to the two levels of hierarchy in the task. This result suggests that suitably designed tasks may reveal a more intricate pattern of firing in dopaminergic neurons. Moreover, the need for downstream separation of these signals implies possible limitations on the number of different task levels that we can learn about simultaneously.
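    The two-level situation described here can be caricatured with a simple delta-rule model in which two prediction errors, one per hierarchy level, coincide in time but are driven by separate (uncorrelated) outcomes. All parameters and outcome probabilities are invented for illustration:

    ```python
    import random

    random.seed(1)
    alpha = 0.05                        # learning rate (invented)
    V_task = V_subtask = 0.0            # value estimates at the two hierarchy levels

    for trial in range(5000):
        subtask_outcome = 1.0 if random.random() < 0.8 else 0.0  # pseudo-reward
        task_outcome    = 1.0 if random.random() < 0.3 else 0.0  # global reward
        # Two coincident but separable prediction errors, one per level:
        pe_subtask = subtask_outcome - V_subtask
        pe_task    = task_outcome - V_task
        V_subtask += alpha * pe_subtask
        V_task    += alpha * pe_task

    print(f"V_subtask ~ {V_subtask:.2f}, V_task ~ {V_task:.2f}")
    ```

    Because the two outcomes are uncorrelated, the two error signals carry distinct information, which is why downstream structures would need to keep them separable.
    
    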

  6. Review of the human reliability analysis performed for Empire State Electric Energy Research Corporation

    International Nuclear Information System (INIS)

    Swart, D.; Banz, I.

    1985-01-01

    The Empire State Electric Energy Research Corporation (ESEERCO) commissioned Westinghouse to conduct a human reliability analysis to identify and quantify human error probabilities associated with operator actions for four specific events which may occur in light water reactors: loss of coolant accident, steam generator tube rupture, steam/feed line break, and stuck open pressurizer spray valve. Human Error Probabilities (HEPs) derived from Swain's Technique for Human Error Rate Prediction (THERP) were compared to data obtained from simulator exercises. A correlation was found between the HEPs derived from Swain and the results of the simulator data. The results of this study provide a unique insight into human factors analysis. The HEPs obtained from such probabilistic studies can be used to prioritize scenarios for operator training situations, and thus improve the correlation between simulator exercises and real control room experiences

  7. Incorporation of human factors into ship collision risk models focusing on human centred design aspects

    International Nuclear Information System (INIS)

    Sotiralis, P.; Ventikos, N.P.; Hamann, R.; Golyshev, P.; Teixeira, A.P.

    2016-01-01

    This paper presents an approach that more adequately incorporates human factor considerations into quantitative risk analysis of ship operation. The focus is on the collision accident category, which is one of the main risk contributors in ship operation. The approach is based on the development of a Bayesian Network (BN) model that integrates elements from the Technique for Retrospective and Predictive Analysis of Cognitive Errors (TRACEr) and focuses on the calculation of the collision accident probability due to human error. The model takes into account the human performance in normal, abnormal and critical operational conditions and implements specific tasks derived from the analysis of the task errors leading to the collision accident category. A sensitivity analysis is performed to identify the most important contributors to human performance and ship collision. Finally, the model developed is applied to assess the collision risk of a feeder operating in Dover strait using the collision probability estimated by the developed BN model and an Event tree model for calculation of human, economic and environmental risks. - Highlights: • A collision risk model for the incorporation of human factors into quantitative risk analysis is proposed. • The model takes into account the human performance in different operational conditions leading to the collision. • The most important contributors to human performance and ship collision are identified. • The model developed is applied to assess the collision risk of a feeder operating in Dover strait.
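    As a toy illustration of how a model of this general shape turns condition-dependent human-performance estimates into a collision probability, one can marginalize over the operational conditions the abstract mentions. All probabilities below are invented placeholders, not values from the paper's BN:

    ```python
    # Hypothetical marginal probabilities of the three operational conditions,
    # the human error probability in each, and the chance an error leads to collision.
    p_condition = {"normal": 0.90, "abnormal": 0.08, "critical": 0.02}
    p_error_given_condition = {"normal": 1e-4, "abnormal": 5e-3, "critical": 5e-2}
    p_collision_given_error = 0.1

    # P(collision) = sum over conditions of P(cond) * P(error|cond) * P(collision|error)
    p_collision = sum(p_condition[c] * p_error_given_condition[c] * p_collision_given_error
                      for c in p_condition)
    print(f"P(collision per encounter) = {p_collision:.2e}")
    ```

    A real BN adds intermediate nodes (task type, performance shaping factors), but the inference it performs is this same marginalization over conditional tables.
    
    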

  8. A basic framework for the analysis of the human error potential due to the computerization in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Y. H.

    1999-01-01

    Computerization and the vivid benefits expected of it in nuclear power plant design cannot be realized without verifying the inherent safety problems, and the human error aspect is among the verification issues. The verification spans from the perception of changes in operating functions, such as automation, to operators' unfamiliarity with changed interfaces. A new framework for human error analysis should therefore capture both the positive and the negative effects of computerization. This paper suggests a basic framework for error identification through a review of existing human error studies and the experience of computerization in nuclear power plants

  9. Human errors during the simulations of an SGTR scenario: Application of the HERA system

    International Nuclear Information System (INIS)

    Jung, Won Dea; Whaley, April M.; Hallbert, Bruce P.

    2009-01-01

    Due to the need of data for a Human Reliability Analysis (HRA), a number of data collection efforts have been undertaken in several different organizations. As a part of this effort, a human error analysis that focused on a set of simulator records on a Steam Generator Tube Rupture (SGTR) scenario was performed by using the Human Event Repository and Analysis (HERA) system. This paper summarizes the process and results of the HERA analysis, including discussions about the usability of the HERA system for a human error analysis of simulator data. Five simulated records of an SGTR scenario were analyzed with the HERA analysis process in order to scrutinize the causes and mechanisms of the human related events. From this study, the authors confirmed that the HERA was a serviceable system that can analyze human performance qualitatively from simulator data. It was possible to identify the human related events in the simulator data that affected the system safety not only negatively but also positively. It was also possible to scrutinize the Performance Shaping Factors (PSFs) and the relevant contributory factors with regard to each identified human event

  10. Human factors in the operation of nuclear power plants

    International Nuclear Information System (INIS)

    Swaton, E.; Neboyan, V.; Lederman, L.

    1987-01-01

    In large and complex interactive systems, human error can contribute substantially to system failures. At nuclear power plants, operational experience demonstrates that human error accounts for a considerable proportion of safety-related incidents. However, experience also shows that human intervention can be very effective if there is a thorough understanding of the situation in the plant. Thus, an efficient interface of man and machine is important not only to prevent human errors but also to assist the operator in coping with unforeseen events. Human reliability can be understood as a qualitative as well as a quantitative term. Qualitatively it can be described as the aim for successful human performance of activities necessary for system reliability and availability. Quantitatively, it refers to data on failure rates or error probabilities that can be used, for example, for probabilistic safety assessments

  11. Large errors and severe conditions

    CERN Document Server

    Smith, D L; Van Wormer, L A

    2002-01-01

    Physical parameters that can assume real-number values over a continuous range are generally represented by inherently positive random variables. However, if the uncertainties in these parameters are significant (large errors), conventional means of representing and manipulating the associated variables can lead to erroneous results. Instead, all analyses involving them must be conducted in a probabilistic framework. Several issues must be considered: First, non-linear functional relations between primary and derived variables may lead to significant 'error amplification' (severe conditions). Second, the commonly used normal (Gaussian) probability distribution must be replaced by a more appropriate function that avoids the occurrence of negative sampling results. Third, both primary random variables and those derived through well-defined functions must be dealt with entirely in terms of their probability distributions. Parameter 'values' and 'errors' should be interpreted as specific moments of these probabil...
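    The first two issues can be illustrated with a small Monte Carlo sketch: sampling a positive parameter from a lognormal rather than a Gaussian distribution rules out negative values, and propagating it through a nonlinear function (here 1/x) shifts the mean of the derived quantity, i.e. 'error amplification'. Numbers are illustrative only:

    ```python
    import math
    import random

    random.seed(0)
    # A positive parameter with 50% relative uncertainty: Gaussian sampling would
    # occasionally yield negative values; a lognormal model cannot.
    mean, rel_err = 1.0, 0.5
    sigma = math.sqrt(math.log(1.0 + rel_err**2))   # lognormal shape parameter
    mu = math.log(mean) - 0.5 * sigma**2            # chosen to preserve the mean

    samples = [random.lognormvariate(mu, sigma) for _ in range(100_000)]
    derived = [1.0 / x for x in samples]            # nonlinear derived quantity

    print(min(samples) > 0)                          # no negative samples
    print(round(sum(samples) / len(samples), 2))     # close to the nominal mean 1.0
    print(round(sum(derived) / len(derived), 2))     # exceeds 1/mean: amplification
    ```

    The mean of the derived quantity exceeds 1/mean of the input (Jensen's inequality), so treating 'value' and 'error' as moments of the full distribution, as the abstract argues, matters precisely when uncertainties are large.
    
    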

  12. Is human failure a stochastic process?

    International Nuclear Information System (INIS)

    Dougherty, Ed M.

    1997-01-01

    Human performance results in failure events that occur with a risk-significant frequency. System analysts have taken for granted the random (stochastic) nature of these events in engineering assessments such as risk assessment. However, cognitive scientists and error technologists, at least those who have interest in human reliability, have, over the recent years, claimed that human error does not need this stochastic framework. Yet they still use the language appropriate to stochastic processes. This paper examines the potential for the stochastic nature of human failure production as the basis for human reliability analysis. It distinguishes and leaves to others, however, the epistemic uncertainties over the possible probability models for the real variability of human performance

  13. An assessment of the risk significance of human errors in selected PSAs and operating events

    International Nuclear Information System (INIS)

    Palla, R.L. Jr.; El-Bassioni, A.

    1991-01-01

    Sensitivity studies based on Probabilistic Safety Assessments (PSAs) for a pressurized water reactor and a boiling water reactor are described. In each case human errors modeled in the PSAs were categorized according to such factors as error type, location, timing, and plant personnel involved. Sensitivity studies were then conducted by varying the error rates in each category and evaluating the corresponding change in total core damage frequency and accident sequence frequency. Insights obtained are discussed and reasons for differences in risk sensitivity between plants are explored. A separate investigation into the role of human error in risk-important operating events is also described. This investigation involved the analysis of data from the USNRC Accident Sequence Precursor program to determine the effect of operator-initiated events on accident precursor trends, and to determine whether improved training can be correlated to current trends. The findings of this study are also presented. 5 refs., 15 figs., 1 tab
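    A sensitivity study of this kind can be sketched in miniature: group the modeled HEPs into categories, scale one category at a time, and recompute a toy core damage frequency. All frequencies and HEPs below are invented, not taken from the PSAs in the study:

    ```python
    # Toy core damage frequency (CDF) from three sequence groups, two of which
    # involve human error probabilities (HEPs) in different categories.
    def cdf(hep):
        return (1e-3 * hep["pre_accident"]      # sequences driven by latent maintenance errors
                + 2e-4 * hep["post_accident"]   # sequences driven by operator response errors
                + 1e-7)                         # hardware-only sequences

    base = {"pre_accident": 3e-2, "post_accident": 1e-1}
    print(f"baseline CDF: {cdf(base):.2e}")
    for cat in base:
        varied = dict(base)
        varied[cat] *= 10                       # degrade one category tenfold
        print(f"{cat} x10 -> CDF {cdf(varied):.2e}")
    ```

    Comparing the two varied cases shows which error category the toy plant's risk is more sensitive to; the paper's insight that sensitivity differs between plants corresponds to different coefficients in this sum.
    
    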

  14. Systematic analysis of dependent human errors from the maintenance history at finnish NPPs - A status report

    Energy Technology Data Exchange (ETDEWEB)

    Laakso, K. [VTT Industrial Systems (Finland)

    2002-12-01

    Operating experience has shown missed-detection events, in which faults have passed inspections and functional tests during outage maintenance and persisted into the subsequent operating period. The causes of these failures have often been complex event sequences involving human and organisational factors. Common cause and other dependent failures of safety systems in particular may contribute significantly to the reactor core damage risk. The topic has been addressed in Finnish studies of human common cause failures, in which experiences of latent human errors have been searched for and analysed in detail in the maintenance history. A review of the bulk of the analysis results from the Olkiluoto and Loviisa plant sites shows that instrumentation and control and electrical equipment are more prone to failure events caused by human error than other maintenance areas, and that plant modifications and also predetermined preventive maintenance are significant sources of common cause failures. Most errors stem from the refuelling and maintenance outage period at both sites, and less than half of the dependent errors were identified during the same outage. The dependent human errors originating from modifications could be reduced by more tailored specification and coverage of their start-up testing programmes. Improvements could also be achieved by more case-specific planning of the installation inspection and functional testing of complicated maintenance work, or of work objects of higher plant safety and availability importance. Better use and analysis of condition monitoring information for maintenance steering could also help. Feedback from discussions of the analysis results with plant experts and professionals remains crucial in developing the final conclusions and recommendations that meet the specific development needs of the plants. (au)

  15. The application of two recently developed human reliability techniques to cognitive error analysis

    International Nuclear Information System (INIS)

    Gall, W.

    1990-01-01

    Cognitive error can lead to catastrophic consequences for manned systems, including those whose design renders them immune to the effects of physical slips made by operators. Four such events, pressurized water and boiling water reactor accidents which occurred recently, were analysed. The analysis identifies the factors which contributed to the errors and suggests practical strategies for error recovery or prevention. Two types of analysis were conducted: an unstructured analysis based on the analyst's knowledge of psychological theory, and a structured analysis using two recently-developed human reliability analysis techniques. In general, the structured techniques required less effort to produce results and these were comparable to those of the unstructured analysis. (author)

  16. Taking human error into account in the design of nuclear reactor centres

    International Nuclear Information System (INIS)

    Prouillac; Lerat; Janoir.

    1982-05-01

    The role of the operator in the centralized management of pressurized water reactors is studied. Different types of human error likely to arise, the means of their prevention and methods of mitigating their consequences are presented. Some possible improvements are outlined

  17. The application of human error prevention tool in Tianwan nuclear power station

    International Nuclear Information System (INIS)

    Qiao Zhiguo

    2013-01-01

    This paper mainly discusses the application and popularization of human error prevention tool in Tianwan nuclear power station, including the study on project implementation background, main contents and innovation, performance management, innovation practice and development, and performance of innovation application. (authors)

  18. Inclusion of task dependence in human reliability analysis

    International Nuclear Information System (INIS)

    Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong

    2014-01-01

    Dependence assessment among human errors in human reliability analysis (HRA) is an important issue, which includes the evaluation of the dependence among human tasks and the effect of the dependence on the final human error probability (HEP). This paper represents a computational model to handle dependence in human reliability analysis. The aim of the study is to automatically provide conclusions on the overall degree of dependence and calculate the conditional human error probability (CHEP) once the judgments of the input factors are given. The dependence influencing factors are first identified by the experts and the priorities of these factors are also taken into consideration. Anchors and qualitative labels are provided as guidance for the HRA analyst's judgment of the input factors. The overall degree of dependence between human failure events is calculated based on the input values and the weights of the input factors. Finally, the CHEP is obtained according to a computing formula derived from the technique for human error rate prediction (THERP) method. The proposed method is able to quantify the subjective judgment from the experts and improve the transparency in the HEP evaluation process. Two examples are illustrated to show the effectiveness and the flexibility of the proposed method. - Highlights: • We propose a computational model to handle dependence in human reliability analysis. • The priorities of the dependence influencing factors are taken into consideration. • The overall dependence degree is determined by input judgments and the weights of factors. • The CHEP is obtained according to a computing formula derived from THERP
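    The THERP formulas referred to here map a nominal HEP and a judged dependence level to a conditional HEP. A direct transcription of the five standard dependence levels from the THERP handbook:

    ```python
    # THERP conditional HEP (NUREG/CR-1278) for the five dependence levels:
    # the CHEP is a weighted pull of the nominal HEP toward 1.0.
    def chep(hep, level):
        formulas = {
            "zero":     lambda p: p,
            "low":      lambda p: (1 + 19 * p) / 20,
            "moderate": lambda p: (1 + 6 * p) / 7,
            "high":     lambda p: (1 + p) / 2,
            "complete": lambda p: 1.0,
        }
        return formulas[level](hep)

    for level in ("zero", "low", "moderate", "high", "complete"):
        print(f"{level:9s} CHEP = {chep(1e-3, level):.4f}")
    ```

    The contribution of the paper's method is upstream of this step: it aggregates expert judgments on the dependence influencing factors into the choice of level (or an interpolated degree) before the formula is applied.
    
    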

  19. At least some errors are randomly generated (Freud was wrong)

    Science.gov (United States)

    Sellen, A. J.; Senders, J. W.

    1986-01-01

    An experiment was carried out to expose something about human error generating mechanisms. In the context of the experiment, an error was made when a subject pressed the wrong key on a computer keyboard or pressed no key at all in the time allotted. These might be considered, respectively, errors of substitution and errors of omission. Each of seven subjects saw a sequence of three digital numbers, made an easily learned binary judgement about each, and was to press the appropriate one of two keys. Each session consisted of 1,000 presentations of randomly permuted, fixed numbers broken into 10 blocks of 100. One of two keys should have been pressed within one second of the onset of each stimulus. These data were subjected to statistical analyses in order to probe the nature of the error generating mechanisms. Goodness of fit tests for a Poisson distribution for the number of errors per 50 trial interval and for an exponential distribution of the length of the intervals between errors were carried out. There is evidence for an endogenous mechanism that may best be described as a random error generator. Furthermore, an item analysis of the number of errors produced per stimulus suggests the existence of a second mechanism operating on task driven factors producing exogenous errors. Some errors, at least, are the result of constant probability generating mechanisms with error rate idiosyncratically determined for each subject.
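    A quick way to see what the study's goodness-of-fit tests probe is to simulate errors with a constant per-trial probability and inspect the counts per block: for such a process the variance-to-mean ratio of the block counts is close to 1, as a Poisson model predicts. Synthetic data, for illustration only:

    ```python
    import random

    random.seed(42)
    p_error = 0.05                     # constant per-trial error probability (invented)
    trials, block = 10_000, 50         # mirror the paper's 50-trial intervals

    errors = [1 if random.random() < p_error else 0 for _ in range(trials)]
    counts = [sum(errors[i:i + block]) for i in range(0, trials, block)]

    mean = sum(counts) / len(counts)
    var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
    print(f"mean errors per block: {mean:.2f}")
    print(f"dispersion index (var/mean): {var / mean:.2f}")  # near 1 for a Poisson-like process
    ```

    A dispersion index well above 1 (clustered errors) or below 1 (overly regular errors) would argue against the constant-probability, endogenous random-generator account.
    
    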

  20. The error in total error reduction.

    Science.gov (United States)

    Witnauer, James E; Urcelay, Gonzalo P; Miller, Ralph R

    2014-02-01

    Most models of human and animal learning assume that learning is proportional to the discrepancy between a delivered outcome and the outcome predicted by all cues present during that trial (i.e., total error across a stimulus compound). This total error reduction (TER) view has been implemented in connectionist and artificial neural network models to describe the conditions under which weights between units change. Electrophysiological work has revealed that the activity of dopamine neurons is correlated with the total error signal in models of reward learning. Similar neural mechanisms presumably support fear conditioning, human contingency learning, and other types of learning. Using a computational modeling approach, we compared several TER models of associative learning to an alternative model that rejects the TER assumption in favor of local error reduction (LER), which assumes that learning about each cue is proportional to the discrepancy between the delivered outcome and the outcome predicted by that specific cue on that trial. The LER model provided a better fit to the reviewed data than the TER models. Given the superiority of the LER model with the present data sets, acceptance of TER should be tempered. Copyright © 2013 Elsevier Inc. All rights reserved.
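    The TER/LER contrast is easiest to see in the classic overexpectation design: two cues are separately trained to predict an outcome, then presented in compound. A TER (Rescorla-Wagner-style) rule shares one error term across the compound and so predicts both weights shrink, while an LER rule updates each cue from its own discrepancy and predicts no change. Parameters are invented:

    ```python
    alpha, lam = 0.2, 1.0              # learning rate and outcome magnitude (invented)
    ter = {"A": 1.0, "B": 1.0}         # both cues already fully trained alone
    ler = {"A": 1.0, "B": 1.0}

    for _ in range(50):                # compound AB+ trials
        # TER (Rescorla-Wagner): one shared error term for the whole compound
        err = lam - (ter["A"] + ter["B"])
        ter["A"] += alpha * err
        ter["B"] += alpha * err
        # LER: each cue learns only from its own local discrepancy
        ler["A"] += alpha * (lam - ler["A"])
        ler["B"] += alpha * (lam - ler["B"])

    print(f"TER weights after AB+ training: {ter['A']:.2f}, {ter['B']:.2f}")
    print(f"LER weights after AB+ training: {ler['A']:.2f}, {ler['B']:.2f}")
    ```

    TER drives each weight toward 0.5 (the compound sums to the outcome), whereas LER leaves both at 1.0; which pattern behavior shows is exactly the kind of evidence the modeling comparison in the paper weighs.
    
    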

  1. Error calculations statistics in radioactive measurements

    International Nuclear Information System (INIS)

    Verdera, Silvia

    1994-01-01

    Basic approaches and procedures frequently used in the practice of radioactive measurements. The statistical principles applied are part of good radiopharmaceutical practices and quality assurance. Covers the concept of error and its classification into systematic and random errors; statistical fundamentals, probability theory, population distributions (Bernoulli, Poisson, Gauss), the t distribution, the χ² test, and error propagation based on analysis of variance. Includes a bibliography and z, t-test, Poisson index and χ² tables
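    Error propagation in counting statistics, one of the topics listed above, can be illustrated with the standard net-count-rate calculation: under Poisson statistics the variance of each measured count equals the count itself, and the variances of independent measurements add. Numbers are illustrative:

    ```python
    import math

    # Net count rate and its standard error from gross and background measurements,
    # assuming Poisson statistics (variance of a count = the count).
    def net_rate(gross_counts, t_gross, bkg_counts, t_bkg):
        rate = gross_counts / t_gross - bkg_counts / t_bkg
        # Propagate the Poisson variances of the two independent measurements:
        sigma = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)
        return rate, sigma

    rate, sigma = net_rate(gross_counts=4000, t_gross=100.0,
                           bkg_counts=900, t_bkg=100.0)
    print(f"net rate = {rate:.1f} +/- {sigma:.2f} counts/s")
    ```

    The same variance-addition rule generalizes to any function of independent counts via first-order error propagation.
    
    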

  2. Control of Human Error and comparison Level risk after correction action With the SHERPA Method in a control Room of petrochemical industry

    Directory of Open Access Journals (Sweden)

    A. Zakerian

    2011-12-01

    Background and aims: Today, in many fields such as the nuclear, military and chemical industries, human errors may result in disaster; accidents around the world underline this, for example the Chernobyl disaster (1986), the Three Mile Island accident (1979) and the Flixborough explosion (1974). Identifying human errors, especially in important and intricate systems, is therefore necessary and unavoidable for devising control methods.   Methods: This research is a case study performed at the Zagross Methanol Company in Asalouye (South Pars). Walk-through/talk-through sessions with process experts and control-room operators, together with inspection of technical documents, were used to collect the required information and complete the Systematic Human Error Reduction and Prediction Approach (SHERPA) worksheets.   Results: Analysis of the SHERPA worksheets indicated 71.25% unacceptable errors, 26.75% undesirable errors, 2% acceptable-with-revision errors and 0% acceptable errors; after corrective action, the forecast risk levels were 0% unacceptable, 4.35% undesirable, 58.55% acceptable-with-revision and 37.1% acceptable.   Conclusion: These results show that the method is applicable and useful in various industries, especially chemical industries, for identifying human errors that may lead to accidents.

  3. Detailed semantic analyses of human error incidents occurring at domestic nuclear power plants to fiscal year 2000

    International Nuclear Information System (INIS)

    Tsuge, Tadashi; Hirotsu, Yuko; Takano, Kenichi; Ebisu, Mitsuhiro; Tsumura, Joji

    2003-01-01

    Analysing and evaluating observed human error incidents, with emphasis on the human factors and behavior involved, is essential for preventing their recurrence. CRIEPI has been conducting detailed, structured analyses based on J-HPES of all incidents reported over the last 35 years, from the beginning of operation of the first Tokai nuclear power plant to fiscal year 2000, in which a total of 212 human error cases were identified. The results of these analyses have been stored in the J-HPES database. This report summarizes semantic analyses of all case studies stored in that database, undertaken to grasp the practical, concrete contents and trends of the more frequently observed human errors (called trigger actions here), their causal factors, and preventive measures. The semantic analyses were executed by classifying all items into categories of nearly identical meaning using the KJ method. Typical results were as follows: (1) Trigger actions: these fell into operation or maintenance categories. 'Operational timing errors' and 'operational quantitative errors' were the major trigger actions in operation, accounting for about 20% of all actions; in maintenance, 'maintenance quantitative errors' were the major actions, accounting for a quarter of all actions. (2) Causal factors: 'human internal status' factors were the major ones, concretely 'improper persistence' and 'lack of knowledge'. (3) Preventive measures: the most frequent measures were job-management changes, i.e. procedural, non-hardware improvements, which accounted for 70% to 80%. For operation, such improvements addressed 'organization and work practices' and 'individual consciousness'; for maintenance, they addressed 'organization and work practices'. (author)

  4. Human Error and General Aviation Accidents: A Comprehensive, Fine-Grained Analysis Using HFACS

    National Research Council Canada - National Science Library

    Wiegmann, Douglas; Faaborg, Troy; Boquet, Albert; Detwiler, Cristy; Holcomb, Kali; Shappell, Scott

    2005-01-01

    ... of both commercial and general aviation (GA) accidents. These analyses have helped to identify general trends in the types of human factors issues and aircrew errors that have contributed to civil aviation accidents...

  5. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.
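    The Wald test itself is compact enough to sketch. For a stream of Bernoulli observations it accumulates a log-likelihood ratio between two simple hypotheses and stops at thresholds set by the desired error rates; the conjunction-assessment version replaces the Bernoulli likelihoods with Gaussian ones built from the collision-probability estimates. Hypotheses and rates below are invented for illustration:

    ```python
    import math
    import random

    random.seed(3)
    # Wald SPRT deciding between H0: p = p0 and H1: p = p1 from Bernoulli trials.
    p0, p1 = 0.5, 0.7
    alpha, beta = 0.05, 0.05
    A = math.log((1 - beta) / alpha)     # accept H1 when the log-LR exceeds A
    B = math.log(beta / (1 - alpha))     # accept H0 when the log-LR falls below B

    def sprt(p_true):
        llr, n = 0.0, 0
        while B < llr < A:
            x = 1 if random.random() < p_true else 0
            llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
            n += 1
        return ("H1" if llr >= A else "H0"), n

    decision, n = sprt(p_true=0.7)
    print(f"decision={decision} after {n} samples")
    ```

    The appeal noted in the abstract carries over: the stopping rule needs only the running likelihood ratio, so each new conjunction prediction simply updates the sum until a threshold is crossed.
    
    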

  6. Determining The Factors Causing Human Error Deficiencies At A Public Utility Company

    Directory of Open Access Journals (Sweden)

    F. W. Badenhorst

    2004-11-01

    According to Neff (1977), as cited by Bergh (1995), westernised culture considers work important for industrial mental health. Most individuals experience work positively, which creates a positive attitude. Should this positive attitude be inhibited, workers could lose concentration and become bored, potentially resulting in some form of human error. The aim of this research was to determine the factors responsible for human error events which lead to power supply failures at Eskom power stations, and proposals were made for reducing these contributing factors so as to improve plant performance. The target population was 700 panel operators in Eskom's Power Generation Group. The results showed that the factors leading to human error can be reduced or even eliminated.

  7. A Preliminary Study on the Measures to Assess the Organizational Safety: The Cultural Impact on Human Error Potential

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Lee, Yong Hee

    2011-01-01

    The Fukushima I nuclear accident following the Tohoku earthquake and tsunami of 11 March 2011 occurred twelve years after the JCO accident, which was caused by an error made by JCO employees. These accidents, along with the Chernobyl accident, were associated with characteristic problems of various organizations, caused severe social and economic disruption, and have had significant environmental and health impacts. Cultural problems underlying human errors arise for various reasons, and different actions are needed to prevent different errors. Unfortunately, much of the research on organization and human error has shown widely varying results, which call for different approaches. In other words, we have to find more practical solutions from this varied research for nuclear safety and take a systematic approach to the organizational deficiencies that cause human error. This paper reviews Hofstede's criteria, the IAEA safety culture, the safety areas of the periodic safety review (PSR), teamwork and performance, and an evaluation of the HANARO safety culture, to verify the measures used to assess organizational safety

  9. Analysis of probability of defects in the disposal canisters

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Kuusela, P.

    2011-06-01

    This report presents a probability model for the reliability of the spent nuclear waste final disposal canister. Reliability means here that the welding of the canister lid has no critical defects from the long-term safety point of view. From this point of view, the reliability of the welding process (no critical defects are introduced) and of the non-destructive testing (NDT) process (all critical defects are detected) are equally important. In the probability model, critical defects in a weld were simplified into a few types, and the possibility of human error in the NDT process was taken into account in a simple manner. At present there are very little representative data with which to determine the reliability of welding, and the available NDT data are not well suited to the needs of this study. The calculations presented here are therefore based on expert judgement and on several assumptions that have not yet been verified. The Bayesian probability model shows the importance of uncertainty in the estimation of the reliability parameters. The effect of this uncertainty is that the probability distribution of the number of defective canisters becomes flatter at larger numbers of canisters than the binomial distribution obtained with known parameter values. To reduce the uncertainty, more information is needed on the reliability of both the welding and NDT processes. It would also be important to analyse the role of human factors in these processes, since it is not reflected in the typical test data used to estimate 'normal process variation'. The reported model should be seen as a tool to quantify the roles of different methods and procedures in the weld inspection process. (orig.)
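
    The flattening effect described above can be illustrated with a beta-binomial model: if the per-canister defect probability is itself uncertain, the distribution of the number of defective canisters is more dispersed than the binomial obtained with a known parameter. A minimal sketch (all numbers hypothetical, not taken from the report):

```python
import math

def binom_pmf(n, k, p):
    # Binomial pmf with a known defect probability p.
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def beta_binom_pmf(n, k, a, b):
    # Beta-binomial pmf: binomial with p ~ Beta(a, b), computed via log-gammas.
    return (math.comb(n, k)
            * math.exp(math.lgamma(k + a) + math.lgamma(n - k + b)
                       - math.lgamma(n + a + b)
                       + math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)))

n = 3000          # hypothetical number of canisters
p = 1e-3          # assumed mean probability of an undetected critical defect
a, b = 0.5, 499.5 # Beta prior with the same mean (0.5/500 = 1e-3) but high variance

binom = [binom_pmf(n, k, p) for k in range(20)]
bb = [beta_binom_pmf(n, k, a, b) for k in range(20)]
# Parameter uncertainty makes the tail heavier, i.e. the distribution flatter.
print(bb[10] > binom[10])
```

The same comparison with a tighter Beta prior (larger a and b, same mean) converges back toward the binomial case.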

  10. Probability Theory, Not the Very Guide of Life

    Science.gov (United States)

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  11. Judging the Probability of Hypotheses Versus the Impact of Evidence: Which Form of Inductive Inference Is More Accurate and Time-Consistent?

    Science.gov (United States)

    Tentori, Katya; Chater, Nick; Crupi, Vincenzo

    2016-04-01

    Inductive reasoning requires exploiting links between evidence and hypotheses. This can be done by focusing either on the posterior probability of the hypothesis when updated on the new evidence or on the impact of the new evidence on the credibility of the hypothesis. But are these two cognitive representations equally reliable? This study investigates this question by comparing probability and impact judgments on the same experimental materials. The results indicate that impact judgments are more consistent in time and more accurate than probability judgments. Impact judgments also predict the direction of errors in probability judgments. These findings suggest that human inductive reasoning relies more on estimating evidential impact than on posterior probability. Copyright © 2015 Cognitive Science Society, Inc.

  12. Errors and violations

    International Nuclear Information System (INIS)

    Reason, J.

    1988-01-01

    This paper is in three parts. The first part summarizes the human failures responsible for the Chernobyl disaster and argues that, in considering the human contribution to power plant emergencies, it is necessary to distinguish between errors and violations, and between active and latent failures. The second part presents empirical evidence, drawn from driver behavior, which suggests that errors and violations have different psychological origins. The concluding part outlines a resident pathogen view of accident causation and seeks to identify the various system pathways along which errors and violations may be propagated

  13. Double symbol error rates for differential detection of narrow-band FM

    Science.gov (United States)

    Simon, M. K.

    1985-01-01

    This paper evaluates the double symbol error rate (average probability of two consecutive symbol errors) in differentially detected narrow-band FM. Numerical results are presented for the special case of MSK with a Gaussian IF receive filter. It is shown that, not unlike similar results previously obtained for the single error probability of such systems, large inaccuracies in predicted performance can occur when intersymbol interference is ignored.

  14. Novel MGF-based expressions for the average bit error probability of binary signalling over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2014-04-01

    The main idea in the moment generating function (MGF) approach is to alternatively express the conditional bit error probability (BEP) in a desired exponential form so that possibly multi-fold performance averaging is readily converted into a computationally efficient single-fold averaging - sometimes into a closed-form - by means of using the MGF of the signal-to-noise ratio. However, as presented in [1] and specifically indicated in [2] and also to the best of our knowledge, there does not exist an MGF-based approach in the literature to represent Wojnar's generic BEP expression in a desired exponential form. This paper presents novel MGF-based expressions for calculating the average BEP of binary signalling over generalized fading channels, specifically by expressing Wojnar's generic BEP expression in a desirable exponential form. We also propose MGF-based expressions to explore the amount of dispersion in the BEP for binary signalling over generalized fading channels.
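
    The single-fold averaging idea can be illustrated for the classical special case of BPSK over Rayleigh fading, where Craig's exponential form of the Gaussian Q-function turns the average BEP into a finite-range integral of the SNR's MGF. This sketch is not Wojnar's generic expression, only the standard textbook instance:

```python
import math

def mgf_rayleigh(s, avg_snr):
    # MGF of an exponentially distributed SNR (Rayleigh fading): E[exp(s * gamma)].
    return 1.0 / (1.0 - s * avg_snr)

def abep_bpsk_mgf(avg_snr, n=10000):
    # Craig's form: Pb = (1/pi) * integral_0^{pi/2} M(-1/sin^2 t) dt,
    # evaluated here with the midpoint rule (which also avoids t = 0).
    h = (math.pi / 2) / n
    total = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        total += mgf_rayleigh(-1.0 / math.sin(t) ** 2, avg_snr)
    return total * h / math.pi

def abep_bpsk_exact(avg_snr):
    # Known closed form for BPSK over Rayleigh fading.
    return 0.5 * (1.0 - math.sqrt(avg_snr / (1.0 + avg_snr)))

g = 10.0  # average SNR (linear scale)
print(abs(abep_bpsk_mgf(g) - abep_bpsk_exact(g)) < 1e-6)
```

The same integral structure applies to other fading models simply by swapping in their MGF, which is exactly the appeal of the MGF approach.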

  15. Team errors: definition and taxonomy

    International Nuclear Information System (INIS)

    Sasou, Kunihide; Reason, James

    1999-01-01

    In error analysis and error management, the focus is usually on the individuals who have made errors. In large complex systems, however, most people work in teams or groups, and in this working environment insufficient emphasis has been given to 'team errors'. This paper discusses the definition of team errors and their taxonomy. These notions are also applied to events that have occurred in the nuclear power, aviation and shipping industries. The paper also discusses the relations between team errors and Performance Shaping Factors (PSFs). The proposed definition and taxonomy are found to be useful in categorizing team errors. The analysis also reveals that deficiencies in communication, deficiencies in resource/task management, an excessive authority gradient and excessive professional courtesy can all cause team errors. Handling human errors as team errors provides an opportunity to reduce human errors

  16. Multiple sequential failure model: A probabilistic approach to quantifying human error dependency

    International Nuclear Information System (INIS)

    Samanta

    1985-01-01

    This paper presents a probabilistic approach to quantifying human error dependency when multiple tasks are performed. Dependent human failures are dominant contributors to risks from nuclear power plants. An overview of the Multiple Sequential Failure (MSF) model is given, together with its use in probabilistic risk assessments (PRAs) depending on the available data. A small-scale psychological experiment was conducted on the nature of human dependency, and the interpretation of the experimental data by the MSF model shows that it accommodates the dependent failure data remarkably well. The model, which provides a unique method for the quantification of dependent failures in human reliability analysis, can be used in conjunction with any of the general methods currently used for the human reliability aspects of PRAs

  17. Systematic analysis of video data from different human-robot interaction studies: a categorization of social signals during error situations.

    Science.gov (United States)

    Giuliani, Manuel; Mirnig, Nicole; Stollnberger, Gerald; Stadler, Susanne; Buchner, Roland; Tscheligi, Manfred

    2015-01-01

    Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Robots would therefore profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. To that end, we analyzed 201 videos from five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction; technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but often smile, when in an error situation with the robot. Another result is that participants sometimes stop moving at the beginning of error situations. We also found that participants talked more in the case of social norm violations and less during technical failures. Finally, participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking) when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human-robot interaction systems: builders should consider adding modules for the recognition and classification of head movements to the robot's input channels, and evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.

  18. Statistical evaluation of major human errors during the development of new technological systems

    International Nuclear Information System (INIS)

    Campbell, G; Ott, K.O.

    1979-01-01

    Statistical procedures are presented to evaluate major human errors during the development of a new system, errors that have led or can lead to accidents or major failures. The first procedure aims at estimating the average residual occurrence rate of accidents or major failures after several have occurred. The procedure is based solely on the historical record. Certain idealizations are introduced that allow the application of a sound statistical evaluation procedure. These idealizations are realized in practice to a sufficient degree that the proposed estimation procedure yields meaningful results, even for situations with a sparse data base represented by very few accidents. Under the assumption that the possible human-error-related failure times have exponential distributions, the statistical technique of isotonic regression is proposed to estimate the failure rates due to human design error at the failure times of the system. The last value in the sequence of estimates gives the residual accident chance. In addition, the actual situation is tested against the hypothesis that the failure rate of the system remains constant over time. This test determines the chance that a decreasing failure rate is incidental, rather than an indication of an actual learning process. Both techniques can be applied not merely to a single system but to an entire series of similar systems that a technology would generate, enabling the assessment of technological improvement. For the purpose of illustration, the nuclear decay of isotopes was chosen as an example, since the assumptions of the model are rigorously satisfied in this case. This application shows satisfactory agreement between the estimated and actual failure rates (which are exactly known in this example), although the estimation was deliberately based on a sparse historical record
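
    The isotonic-regression step mentioned above is typically computed with the pool-adjacent-violators algorithm (PAVA). A minimal sketch with hypothetical per-interval rate estimates, fitting the best nonincreasing sequence:

```python
def pava_nonincreasing(y):
    # Pool-adjacent-violators algorithm: least-squares fit of a nonincreasing
    # sequence. Blocks are stored as (mean, count) and merged on violations.
    blocks = []
    for v in y:
        blocks.append((v, 1))
        while len(blocks) > 1 and blocks[-2][0] < blocks[-1][0]:
            (m1, n1), (m2, n2) = blocks[-2], blocks[-1]
            blocks[-2:] = [((m1 * n1 + m2 * n2) / (n1 + n2), n1 + n2)]
    fit = []
    for m, n in blocks:
        fit.extend([m] * n)
    return fit

# Naive per-interval failure-rate estimates at successive failure times
# (hypothetical data, not from the paper); the fit is the isotonic rate.
raw = [0.9, 0.5, 0.7, 0.2, 0.3, 0.1]
fit = pava_nonincreasing(raw)
print(all(a >= b for a, b in zip(fit, fit[1:])))
```

The last fitted value plays the role of the residual accident chance in the procedure described above.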

  19. Naming game with learning errors in communications

    OpenAIRE

    Lou, Yang; Chen, Guanrong

    2014-01-01

    The naming game simulates the process by which a population of agents, organized in a certain communication network topology, names an object. Through pair-wise iterative interactions, the population asymptotically reaches a consensus state. In this paper, we study the naming game with communication errors during pair-wise conversations, where errors are represented by error rates drawn from a uniform probability distribution. First, a model of the naming game with learning errors in communications (NGLE) is proposed....

  20. Human reliability analysis for venting a BWR Mark I during a severe accident

    International Nuclear Information System (INIS)

    Nelson, W.R.; Blackman, H.S.

    1986-01-01

    A Human Reliability Analysis (HRA) was performed for the operator actions necessary to achieve containment venting at the Peach Bottom Atomic Power Station. The study was funded by the United States Nuclear Regulatory Commission (USNRC) and performed by the Idaho National Engineering Laboratory (INEL). The goal of the analysis was to estimate Human Error Probabilities (HEPs) to determine the likelihood that operators would fail to complete the venting process. The analysis was performed for two generic accident sequences: anticipated transient without scram (ATWS) and station blackout. Two main methods were used to estimate the HEPs: the Technique for Human Error Rate Prediction (THERP) and the Success Likelihood Index Methodology (SLIM). For the ATWS scenarios analyzed, the calculated HEPs ranged from 0.23 to 0.35, depending on the number of vent paths required to reduce the containment pressure. It should be noted that the confidence bounds around these HEPs are large. However, even when considering the large confidence range, the failure probabilities are larger than is typical for normal operator actions. For station blackout, the HEP is 1.0, resulting from the dangerous environmental conditions present, on the assumption that plant management would not deliberately expose personnel to a potentially fatal environment. These results are based on the analysis of draft procedures for containment venting; it is probable that careful revision of the procedures could reduce the human error probabilities
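
    As a rough illustration of how per-step HEPs enter such an analysis, the sketch below combines hypothetical step HEPs into a task failure probability under independence and applies THERP's published dependence equations for a conditional HEP; the numbers are illustrative, not those of the Peach Bottom study:

```python
def therp_conditional(p, level):
    # THERP dependence equations: conditional HEP of a step given failure
    # of the preceding step, for the five standard dependence levels.
    return {"ZD": p,                    # zero dependence
            "LD": (1 + 19 * p) / 20,   # low
            "MD": (1 + 6 * p) / 7,     # moderate
            "HD": (1 + p) / 2,         # high
            "CD": 1.0}[level]          # complete

def task_failure_prob(heps):
    # Series task under assumed independence: the task fails if any step fails.
    ok = 1.0
    for p in heps:
        ok *= 1.0 - p
    return 1.0 - ok

steps = [0.03, 0.05, 0.1]   # hypothetical per-step HEPs
print(task_failure_prob(steps) < sum(steps))   # union bound holds
print(therp_conditional(0.1, "HD") > 0.1)      # dependence inflates the HEP
```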

  1. Outage Probability Analysis of FSO Links over Foggy Channel

    KAUST Repository

    Esmail, Maged Abdullah

    2017-02-22

    Outdoor free-space optic (FSO) communication systems are sensitive to atmospheric impairments such as turbulence and fog, in addition to being subject to pointing errors. Fog is particularly severe because it induces an attenuation that may vary from a few dB up to a few hundred dB per kilometer. Pointing errors also distort the link alignment and cause signal fading. In this paper, we investigate and analyze FSO system performance under fog conditions and pointing errors in terms of outage probability. We then study the impact of several effective mitigation techniques that can improve system performance, including multi-hop transmission, transmit laser selection (TLS) and hybrid RF/FSO transmission. Closed-form expressions for the outage probability are derived, and practical, comprehensive numerical examples are provided to assess the obtained results. We found that the FSO system's performance is limited in a way that prevents applying FSO in wireless microcells with a 500 m minimum cell radius. The performance degrades further when pointing errors are present. Increasing the transmitted power can improve performance under light to moderate fog; under thick and dense fog, however, the improvement is negligible. Mitigation techniques can play a major role in improving the range and outage probability.

  2. Outage Probability Analysis of FSO Links over Foggy Channel

    KAUST Repository

    Esmail, Maged Abdullah; Fathallah, Habib; Alouini, Mohamed-Slim

    2017-01-01

    Outdoor free-space optic (FSO) communication systems are sensitive to atmospheric impairments such as turbulence and fog, in addition to being subject to pointing errors. Fog is particularly severe because it induces an attenuation that may vary from a few dB up to a few hundred dB per kilometer. Pointing errors also distort the link alignment and cause signal fading. In this paper, we investigate and analyze FSO system performance under fog conditions and pointing errors in terms of outage probability. We then study the impact of several effective mitigation techniques that can improve system performance, including multi-hop transmission, transmit laser selection (TLS) and hybrid RF/FSO transmission. Closed-form expressions for the outage probability are derived, and practical, comprehensive numerical examples are provided to assess the obtained results. We found that the FSO system's performance is limited in a way that prevents applying FSO in wireless microcells with a 500 m minimum cell radius. The performance degrades further when pointing errors are present. Increasing the transmitted power can improve performance under light to moderate fog; under thick and dense fog, however, the improvement is negligible. Mitigation techniques can play a major role in improving the range and outage probability.
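
    As a rough numerical companion, the sketch below estimates an FSO outage probability by Monte Carlo, using the Kruse visibility model for fog attenuation and an assumed lognormal visibility distribution; the link parameters and the distribution are illustrative assumptions, not values from the paper:

```python
import math
import random

random.seed(7)

def fog_attenuation_db_per_km(visibility_km, wavelength_nm=1550.0):
    # Kruse model for fog-induced attenuation (dB/km).
    if visibility_km > 50:
        q = 1.6
    elif visibility_km > 6:
        q = 1.3
    else:
        q = 0.585 * visibility_km ** (1.0 / 3.0)
    return (3.91 / visibility_km) * (wavelength_nm / 550.0) ** (-q)

def outage_probability(link_km, margin_db, trials=100000):
    # Outage occurs when the fog loss over the link exceeds the power margin.
    # Visibility ~ lognormal (median 500 m) is an assumption for illustration.
    out = 0
    for _ in range(trials):
        vis = random.lognormvariate(math.log(0.5), 0.6)
        if fog_attenuation_db_per_km(vis) * link_km > margin_db:
            out += 1
    return out / trials

p = outage_probability(link_km=0.5, margin_db=10.0)
print(0.001 < p < 0.1)
```

Raising `margin_db` (more transmitted power) lowers the estimate under moderate fog, mirroring the paper's qualitative conclusion.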

  3. Distinguishing mixed quantum states: Minimum-error discrimination versus optimum unambiguous discrimination

    International Nuclear Information System (INIS)

    Herzog, Ulrike; Bergou, Janos A.

    2004-01-01

    We consider two different optimized measurement strategies for the discrimination of nonorthogonal quantum states. The first is ambiguous discrimination with a minimum probability of inferring an erroneous result, and the second is unambiguous, i.e., error-free, discrimination with a minimum probability of getting an inconclusive outcome, where the measurement fails to give a definite answer. For distinguishing between two mixed quantum states, we investigate the relation between the minimum-error probability achievable in ambiguous discrimination, and the minimum failure probability that can be reached in unambiguous discrimination of the same two states. The latter turns out to be at least twice as large as the former for any two given states. As an example, we treat the case where the state of the quantum system is known to be, with arbitrary prior probability, either a given pure state, or a uniform statistical mixture of any number of mutually orthogonal states. For this case we derive an analytical result for the minimum probability of error and perform a quantitative comparison with the minimum failure probability
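
    For two equiprobable pure states, the factor-of-two relation can be checked directly: the Helstrom minimum-error probability and the minimum inconclusive (failure) probability both depend only on the state overlap. A small numerical check (equal priors assumed):

```python
import math

def helstrom_error(overlap, prior=0.5):
    # Minimum-error probability for two pure states with |<psi1|psi2>| = overlap.
    return 0.5 * (1.0 - math.sqrt(1.0 - 4.0 * prior * (1.0 - prior) * overlap ** 2))

def unambiguous_failure(overlap, prior=0.5):
    # Minimum inconclusive-result probability (IDP limit) for pure states,
    # valid for priors that are not too unbalanced.
    return 2.0 * math.sqrt(prior * (1.0 - prior)) * overlap

# The failure probability is at least twice the minimum-error probability.
ok = all(unambiguous_failure(c) >= 2.0 * helstrom_error(c) - 1e-12
         for c in [i / 100 for i in range(101)])
print(ok)
```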

  4. Investigating the causes of human error-induced incidents in the maintenance operations of petrochemical industry by using HFACS

    Directory of Open Access Journals (Sweden)

    Mohammadreza Azhdari

    2017-03-01

    Full Text Available Background & Objectives: Maintenance is an important tool for the petrochemical industries to prevent accidents and to improve operational and process safety. The purpose of this study was to identify the possible causes of incidents caused by human error in petrochemical maintenance activities by using the Human Factors Analysis and Classification System (HFACS). Methods: This cross-sectional analysis was conducted in the Zagros Petrochemical Company, Asaluyeh, Iran. A checklist of human error-induced incidents was developed based on the four HFACS main levels and nineteen sub-groups. The hierarchical task analysis (HTA) technique was used to identify maintenance activities and tasks. The main causes of possible incidents were identified with the checklist and recorded. Corrective and preventive actions were defined according to priority. Results: The content analysis of worksheets for 444 activities attributed 37.6% of the causes to the level of unsafe acts, 27.5% to the level of unsafe supervision, 20.9% to the level of preconditions for unsafe acts, and 14% to the level of organizational effects. Among the HFACS sub-groups, errors (24.36%), inadequate supervision (14.89%) and violations (13.26%) were the most frequent. Conclusion: To prevent and reduce the occurrence of the identified errors, lowering the rate of the detected errors is crucial. The findings of this study showed that appropriate controlling measures, such as periodic training on work procedures and improved supervision, decrease human error-induced incidents in petrochemical industry maintenance.

  5. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.; Rosu, D.; Loewenstern, D.; Buco, M. J.; Guo, S.; Lavrado, Rafael Coelho; Gupta, M.; De, P.; Madduri, V.; Singh, J. K.

    2010-01-01

    In this paper, we address the problem of reducing the occurrence of Human Errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21

  6. A novel unified expression for the capacity and bit error probability of wireless communication systems over generalized fading channels

    KAUST Repository

    Yilmaz, Ferkan

    2012-07-01

    Analysis of the average binary error probabilities (ABEP) and average capacity (AC) of wireless communications systems over generalized fading channels have been considered separately in past years. This paper introduces a novel moment generating function (MGF)-based unified expression for the ABEP and AC of single and multiple link communications with maximal ratio combining. In addition, this paper proposes the hyper-Fox's H fading model as a unified fading distribution of a majority of the well-known generalized fading environments. As such, the authors offer a generic unified performance expression that can be easily calculated, and that is applicable to a wide variety of fading scenarios. The mathematical formulism is illustrated with some selected numerical examples that validate the correctness of the authors' newly derived results. © 1972-2012 IEEE.

  7. Quantification of a decision-making failure probability of the accident management using cognitive analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Yoshitaka; Ohtani, Masanori [Institute of Nuclear Safety System, Inc., Mihama, Fukui (Japan); Fujita, Yushi [TECNOVA Corp., Tokyo (Japan)

    2002-09-01

    In nuclear power plants, much knowledge is acquired through probabilistic safety assessment (PSA) of severe accidents, and accident management (AM) is prepared. It is necessary to evaluate the effectiveness of AM in PSA using the decision-making failure probability of the emergency organization, the operation failure probability of operators, the success criteria of AM, and the reliability of AM equipment. However, there has so far been no suitable quantification method for the decision-making failure probability in PSA, because the decision-making failure of an emergency organization involves knowledge-based error. In this work, we developed a new method for quantifying the decision-making failure probability of an emergency organization deciding on an AM strategy in a nuclear power plant during a severe accident, using a cognitive analysis model, and tried to apply it to a typical pressurized water reactor (PWR) plant. As a result: (1) The method can quantify a decision-making failure probability suitable for PSA by general analysts, who do not necessarily possess professional human factors knowledge, through choosing suitable values for a basic failure probability and an error factor. (2) In a trial evaluation based on severe accident analysis of a typical PWR plant, the decision-making failure probabilities of six AMs were in the range of 0.23 to 0.41 using the screening evaluation method and 0.10 to 0.19 using the detailed evaluation method; in a sensitivity analysis of the conservative assumptions, the failure probability decreased by about 50%. (3) Theoretically, the failure probability given by the screening evaluation method exceeds that given by the detailed evaluation method with 99% probability, and for the AMs in this study it did so in 100% of cases.
From this result, it was shown that the decision-making failure probability was more conservative than the detailed evaluation method, and the screening evaluation method satisfied

  8. Quantification of a decision-making failure probability of the accident management using cognitive analysis model

    International Nuclear Information System (INIS)

    Yoshida, Yoshitaka; Ohtani, Masanori; Fujita, Yushi

    2002-01-01

    In nuclear power plants, much knowledge is acquired through probabilistic safety assessment (PSA) of severe accidents, and accident management (AM) is prepared. It is necessary to evaluate the effectiveness of AM in PSA using the decision-making failure probability of the emergency organization, the operation failure probability of operators, the success criteria of AM, and the reliability of AM equipment. However, there has so far been no suitable quantification method for the decision-making failure probability in PSA, because the decision-making failure of an emergency organization involves knowledge-based error. In this work, we developed a new method for quantifying the decision-making failure probability of an emergency organization deciding on an AM strategy in a nuclear power plant during a severe accident, using a cognitive analysis model, and tried to apply it to a typical pressurized water reactor (PWR) plant. As a result: (1) The method can quantify a decision-making failure probability suitable for PSA by general analysts, who do not necessarily possess professional human factors knowledge, through choosing suitable values for a basic failure probability and an error factor. (2) In a trial evaluation based on severe accident analysis of a typical PWR plant, the decision-making failure probabilities of six AMs were in the range of 0.23 to 0.41 using the screening evaluation method and 0.10 to 0.19 using the detailed evaluation method; in a sensitivity analysis of the conservative assumptions, the failure probability decreased by about 50%. (3) Theoretically, the failure probability given by the screening evaluation method exceeds that given by the detailed evaluation method with 99% probability, and for the AMs in this study it did so in 100% of cases.
From this result, it was shown that the decision-making failure probability was more conservative than the detailed evaluation method, and the screening evaluation method satisfied

  9. Evaluating a medical error taxonomy.

    OpenAIRE

    Brixey, Juliana; Johnson, Todd R.; Zhang, Jiajie

    2002-01-01

    Healthcare has been slow in using human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need for reducing medication errors, the National Coordinating Council for Medication Errors Reporting and Prevention (NCC MERP) released the NCC MERP taxonomy that provides a stand...

  10. Application of expert elicitation techniques in human reliability, assessment

    International Nuclear Information System (INIS)

    Sanyasi Rao, V.V.S.; Saraf, R.K.; Ghosh, A.K.; Kushwaha, H.S.

    2006-01-01

    Expert elicitation techniques are used, in the area of technological forecasting, to estimate data needed for analysis when it is either difficult to arrive at the data by experimental means or quite involved to plan and conduct the experiment. In this study, expert elicitation techniques are applied to the evaluation of the frequencies of the various accident sequences that can result from the initiating event (IE) 'High Pressure Process Water (HPPW) system failure' in a typical older-generation Indian Pressurised Heavy Water Reactor (IPHWR). The Operating Procedure under Emergency Conditions (OPEC) for this IE involves human actions according to a pre-defined procedure. The human error probabilities for all these human actions are obtained using expert elicitation techniques, which elicit the opinions of experts in the area of interest with regard to the issue in question. The uncertainty is analysed by employing the measure of dissonance, and the most probable range of human error probabilities is arrived at by maximizing this measure. These values are combined using the same procedures mentioned above to yield a distribution representing the uncertainty associated with the predictions. (author)

  11. Extracting and Converting Quantitative Data into Human Error Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Tuan Q. Tran; Ronald L. Boring; Jeffrey C. Joe; Candice D. Griffith

    2007-08-01

    This paper discusses a proposed method using a combination of advanced statistical approaches (e.g., meta-analysis, regression, structural equation modeling) that will not only convert different empirical results into a common metric for scaling individual PSF effects, but will also examine the complex interrelationships among PSFs. Furthermore, the paper discusses how the derived statistical estimates (i.e., effect sizes) can be mapped onto an HRA method (e.g., SPAR-H) to generate HEPs that can then be used in probabilistic risk assessment (PRA). The paper concludes with a discussion of the benefits of using the academic literature to assist HRA analysts in generating sound HEPs and HRA developers in validating current HRA models and formulating new ones.
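
    For context, SPAR-H combines a nominal HEP with PSF multipliers; when several negative PSFs are present, a composite adjustment keeps the result below 1. A sketch with hypothetical multiplier values:

```python
def spar_h_hep(nominal, psf_multipliers):
    # SPAR-H composite-PSF adjustment: HEP = NHEP * C / (NHEP * (C - 1) + 1),
    # where C is the product of the PSF multipliers. Keeps the HEP in [0, 1].
    comp = 1.0
    for m in psf_multipliers:
        comp *= m
    return nominal * comp / (nominal * (comp - 1.0) + 1.0)

# Nominal diagnosis HEP of 1e-2 with three negative PSFs (hypothetical values).
hep = spar_h_hep(0.01, [10.0, 2.0, 5.0])
print(0.01 < hep < 1.0)
```

A naive product (0.01 x 100 = 1.0) would saturate; the adjustment yields a usable value instead.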

  12. A study on the critical factors of human error in civil aviation: An early warning management perspective in Bangladesh

    Directory of Open Access Journals (Sweden)

    Md. Salah Uddin Rajib

    2015-01-01

    Full Text Available The safety of civil aviation would be better secured if the errors in all of its facets could be reduced. As in other industrial sectors, human resources are among the most complex and sensitive resources for civil aviation, and human error can cause fatal disasters. In recent years, a good volume of research has been conducted on civil aviation disasters. Researchers have identified the causes of civil aviation disasters from various perspectives and have identified the areas where more attention is needed to reduce disastrous impacts. This paper aims to find the critical factors of human error in civil aviation in a developing country (Bangladesh), since it is accepted that human error is one of the main causes of civil aviation disasters. The paper reviews previous research to identify the critical factors conceptually, and the fuzzy analytic hierarchy process (FAHP) is used to identify them systematically. The analyses indicate that concentration on preconditions for unsafe acts (including their sub-factors) is required to ensure aviation safety.
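
    FAHP derives crisp factor weights from pairwise comparisons expressed as triangular fuzzy numbers. The sketch below uses Buckley's geometric-mean method with one common centroid defuzzification and a hypothetical 2x2 comparison matrix (not the study's data):

```python
def fahp_weights(matrix):
    # matrix[i][j] is a triangular fuzzy number (l, m, u) comparing factor i to j.
    n = len(matrix)
    means = []
    for row in matrix:
        l = m = u = 1.0
        for (a, b, c) in row:
            l *= a; m *= b; u *= c
        means.append((l ** (1 / n), m ** (1 / n), u ** (1 / n)))  # fuzzy geo-mean
    total_l = sum(t[0] for t in means)
    total_m = sum(t[1] for t in means)
    total_u = sum(t[2] for t in means)
    # Fuzzy weight = geo-mean times inverse of the total (bounds reversed);
    # defuzzify by the centroid, then normalize.
    crisp = [((l / total_u) + (m / total_m) + (u / total_l)) / 3
             for (l, m, u) in means]
    s = sum(crisp)
    return [w / s for w in crisp]

# Hypothetical comparison: factor A moderately more important than factor B.
matrix = [[(1, 1, 1), (2, 3, 4)],
          [(1 / 4, 1 / 3, 1 / 2), (1, 1, 1)]]
w = fahp_weights(matrix)
print(w[0] > w[1] and abs(sum(w) - 1.0) < 1e-9)
```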

  13. Development of a new model to evaluate the probability of automatic plant trips for pressurized water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Shimada, Yoshio [Institute of Nuclear Safety System Inc., Mihama, Fukui (Japan); Kawai, Katsunori; Suzuki, Hiroshi [Mitsubishi Heavy Industries Ltd., Tokyo (Japan)

    2001-09-01

    In order to improve the reliability of plant operations for pressurized water reactors, a new fault tree model was developed to evaluate the probability of automatic plant trips. The model consists of fault trees for sixteen systems and has the following features: (1) human errors and transmission line incidents are modeled using existing data, (2) the repair of failed components is considered in calculating component failure probabilities, (3) uncertainty analysis is performed by an exact method. From the present results, it is confirmed that the obtained upper and lower bounds on the automatic plant trip probability lie within the bounds of existing data in Japan. The model can thereby be applied to the prediction of plant performance and reliability. (author)
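
    The probability of a fault tree's top event (here, an automatic plant trip) is commonly bounded from its minimal cut sets. A sketch with hypothetical basic-event probabilities, comparing the rare-event sum with the min-cut upper bound:

```python
# Minimal cut sets given as lists of basic-event probabilities (hypothetical),
# e.g. a human error AND a transmission-line incident, or a single failure.
cut_sets = [[1e-3, 2e-2],
            [5e-4],
            [1e-2, 1e-2]]

def cut_set_prob(cs):
    # AND gate: product of independent basic-event probabilities.
    p = 1.0
    for q in cs:
        p *= q
    return p

# Rare-event approximation: sum of cut-set probabilities (an upper bound).
upper = sum(cut_set_prob(cs) for cs in cut_sets)

# Min-cut upper bound: 1 - prod(1 - P(cs)), slightly tighter.
prod = 1.0
for cs in cut_sets:
    prod *= 1.0 - cut_set_prob(cs)
bound = 1.0 - prod

print(bound <= upper)
```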

  14. Human reliability analysis as an evaluation tool of the emergency evacuation process on industrial installation

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L. dos; Grecco, Claudio H.S.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Oliveira, Mauro V.; Botelho, Felipe Mury [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mail: luquetti@ien.gov.br; grecco@ien.gov.br; mol@ien.gov.br; paulov@ien.gov.br; mvitor@ien.gov.br; felipemury@superig.com.br

    2007-07-01

    Human reliability is the probability that a person correctly performs an activity required by the system in a required time period and performs no extraneous activity that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. The concept of human error must carry no connotation of guilt or punishment; it has to be treated as a natural consequence that emerges from the mismatch between human capacity and system demand. The majority of human errors are a consequence of the work situation and not of a lack of responsibility on the part of the worker. The anticipation and control of potentially adverse impacts of human actions, or of interactions between humans and the system, are integral parts of process safety, in which the factors that influence human performance must be recognized and managed. The aim of this paper is to propose a methodology to evaluate the emergency evacuation process in industrial installations, combining SLIM-MAUD, a first-generation HRA method, with virtual reality and simulation software used to build and simulate the chosen emergency scenes. (author)

  15. Human reliability analysis as an evaluation tool of the emergency evacuation process on industrial installation

    International Nuclear Information System (INIS)

    Santos, Isaac J.A.L. dos; Grecco, Claudio H.S.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Oliveira, Mauro V.; Botelho, Felipe Mury

    2007-01-01

    Human reliability is the probability that a person correctly performs an activity required by the system in a required time period and performs no extraneous activity that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. The concept of human error must carry no connotation of guilt or punishment; it has to be treated as a natural consequence that emerges from the mismatch between human capacity and system demand. The majority of human errors are a consequence of the work situation and not of a lack of responsibility on the part of the worker. The anticipation and control of potentially adverse impacts of human actions, or of interactions between humans and the system, are integral parts of process safety, in which the factors that influence human performance must be recognized and managed. The aim of this paper is to propose a methodology to evaluate the emergency evacuation process in industrial installations, combining SLIM-MAUD, a first-generation HRA method, with virtual reality and simulation software used to build and simulate the chosen emergency scenes. (author)
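The SLIM method named in the records above can be sketched numerically: a Success Likelihood Index (SLI) is formed as a weighted sum of performance-shaping-factor ratings, and a log-linear calibration against two tasks with known error probabilities converts the SLI into a human error probability (HEP). All weights, ratings, and calibration values below are hypothetical illustrations, not the paper's data.

```python
import math

def sli(weights, ratings):
    """Success Likelihood Index: weighted sum of PSF ratings
    (weights must sum to 1)."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * r for w, r in zip(weights, ratings))

def calibrate(sli1, hep1, sli2, hep2):
    """Solve log10(HEP) = a * SLI + b from two reference tasks."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

# Hypothetical calibration tasks: an easy task (SLI 8, HEP 1e-4)
# and a difficult one (SLI 2, HEP 1e-1)
a, b = calibrate(sli1=8.0, hep1=1e-4, sli2=2.0, hep2=1e-1)

# Hypothetical task of interest: three PSFs with weights and ratings
task_sli = sli([0.4, 0.35, 0.25], [6.0, 4.0, 7.0])
hep = 10 ** (a * task_sli + b)
```

The interpolated HEP always lies between the two calibration anchors when the task's SLI lies between theirs.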

  16. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying elementary quantum processes - the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, so that using transition probabilities W instead of probability fluxes Π in calculating particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels, relating partly to real states and partly to virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this, the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on the notion of probability amplitude; as a result, the probability amplitudes lose their physical meaning. All the physical information on the quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to choose the correct theory of quantum transitions on the basis of experimental data. (author)

  17. Human Factors in Nuclear Reactor Accidents

    International Nuclear Information System (INIS)

    Mustafa, M.E.

    2016-01-01

    While many people would blame nature for the “Fukushima Daiichi” disaster, experts consider this accident to be also a human-induced disaster. This confirmed the importance of human errors, which have been attracting growing interest in the nuclear field since the Three Mile Island accident. Personnel play an important role in design, operation, maintenance, planning, and management; the interface between man and machine is the domain of human factors. In the present work, the human factors that have to be considered are discussed, including the effects of control room configuration and equipment design on human behavior. Careful review of personnel qualifications and experience is emphasized. Insufficient training has been a major cause of human error in the nuclear field, so effective training practices are introduced. Avoiding complicated operational processes and unresponsive management systems is stressed, as is distinguishing between procedures for normal and emergency operations. It is noted that human error during maintenance and testing activities can cause a serious accident, because safety systems do not cover risks arising during maintenance and testing to the extent they do in normal operation. In the nuclear industry, the need for a classification and identification of human errors has been well recognised, and as a result human reliability must be assessed. These errors are analyzed by probabilistic safety assessment, which deals with errors in reading, listening and implementing procedures but not with cognitive errors; much effort is still required to include cognitive errors in probabilistic safety assessment. The ways of collecting human factors data are surveyed, along with methods for identifying safe designs, helping decision makers predict how proposed or current policies will affect safety, and comprehensive understanding of the relationship

  18. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    Science.gov (United States)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-09-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a lightweight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard-promoting factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
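The epistemic estimation with minimal field data described here is commonly handled with a conjugate Bayesian update: a Beta prior on the failure probability is combined with a small number of observed trials. The prior parameters and observed counts below are hypothetical, and this is only the elementary building block of the full Bayesian-network treatment in the paper.

```python
def beta_update(alpha, beta, failures, trials):
    """Posterior Beta(alpha', beta') after observing `failures`
    in `trials` Bernoulli demands (conjugate update)."""
    return alpha + failures, beta + (trials - failures)

def beta_mean(alpha, beta):
    """Posterior mean of the failure probability."""
    return alpha / (alpha + beta)

# Hypothetical: vague Jeffreys prior Beta(0.5, 0.5),
# then 1 failure observed in 40 lifting operations
a_post, b_post = beta_update(0.5, 0.5, failures=1, trials=40)
p_hat = beta_mean(a_post, b_post)
```

Even a single observed failure shifts the vague prior toward a usable point estimate, which is exactly what sparse-data epistemic estimation relies on.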

  19. The human fallibility of scientists : Dealing with error and bias in academic research

    NARCIS (Netherlands)

    Veldkamp, Coosje

    2017-01-01

    THE HUMAN FALLIBILITY OF SCIENTISTS Dealing with error and bias in academic research Recent studies have highlighted that not all published findings in the scientific literature are trustworthy, suggesting that currently implemented control mechanisms such as high standards for the reporting of

  20. Predicting risk and human reliability: a new approach

    International Nuclear Information System (INIS)

    Duffey, R.; Ha, T.-S.

    2009-01-01

    Learning from experience describes human reliability and skill acquisition, and the resulting theory has been validated by comparison against millions of outcome data points from multiple industries and technologies worldwide. The resulting predictions were used to benchmark the classic first-generation human reliability methods adopted in probabilistic risk assessments. The learning rate, probabilities and response times are also consistent with existing psychological models of human learning and error correction. The new approach also implies a finite lower-bound probability that is not predicted by empirical statistical distributions which ignore the known and fundamental learning effects. (author)
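A learning-curve failure-rate model of the kind this record describes can be sketched as an error rate that decays exponentially with accumulated experience toward a finite minimum. The functional form and all constants below are illustrative assumptions, not the authors' fitted values.

```python
import math

def error_rate(experience, lam0=1e-2, lam_min=5e-5, k=3.0):
    """Illustrative learning-curve model: the error rate starts at lam0,
    decays exponentially with accumulated experience at rate k, and
    approaches a finite lower bound lam_min (never zero)."""
    return lam_min + (lam0 - lam_min) * math.exp(-k * experience)

# Error rate at increasing levels of accumulated experience
rates = [error_rate(e) for e in (0.0, 0.5, 1.0, 2.0, 5.0)]
```

The finite asymptote lam_min is the key qualitative feature: unlike purely empirical distributions, the rate never learns its way to zero.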

  1. Uncorrected refractive errors.

    Science.gov (United States)

    Naidoo, Kovin S; Jaggernath, Jyoti

    2012-01-01

    Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  2. Uncorrected refractive errors

    Directory of Open Access Journals (Sweden)

    Kovin S Naidoo

    2012-01-01

    Full Text Available Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  3. Error evaluation method for material accountancy measurement. Evaluation of random and systematic errors based on material accountancy data

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2008-01-01

    International Target Values (ITV) give random and systematic measurement uncertainty components as a reference for routinely achievable measurement quality in accountancy measurement. The measurement uncertainty, called error henceforth, needs to be periodically evaluated and checked against the ITV for consistency, as the error varies with measurement methods, instruments, operators, certified reference samples, frequency of calibration, and so on. In this paper an error evaluation method was developed, focusing on (1) specifying the error calculation model clearly, (2) always obtaining positive random and systematic error variances, (3) obtaining the probability density distribution of an error variance, and (4) confirming the evaluation method by simulation. In addition, the method was demonstrated by applying it to real data. (author)
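One standard way to separate random and systematic error components, consistent with the goals listed in this record, is a within/between variance decomposition over calibration periods: the pooled within-period variance estimates the random component, and the excess variance of period means estimates the systematic component. This is a sketch under that assumption, not the paper's actual model, and the measurement values are invented.

```python
def mean(xs):
    return sum(xs) / len(xs)

def sample_var(xs):
    """Unbiased sample variance (ddof = 1)."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def error_components(groups):
    """Estimate random (within-group) and systematic (between-group)
    error variances from repeated measurements of the same item,
    grouped by calibration period. Each group must have equal size."""
    n = len(groups[0])
    within = mean([sample_var(g) for g in groups])   # random component
    between = sample_var([mean(g) for g in groups])  # variance of period means
    systematic = max(between - within / n, 0.0)      # truncated at zero
    return within, systematic

# Hypothetical: the same item measured twice in each of three periods,
# with a different systematic shift in each period
groups = [[10.1, 10.3], [9.7, 9.9], [9.9, 10.1]]
rand_var, sys_var = error_components(groups)
```

The `max(..., 0.0)` truncation reflects requirement (2) above: variance estimates must never come out negative.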

  4. A method to deal with installation errors of wearable accelerometers for human activity recognition

    International Nuclear Information System (INIS)

    Jiang, Ming; Wang, Zhelong; Shang, Hong; Li, Hongyi; Wang, Yuechao

    2011-01-01

    Human activity recognition (HAR) by using wearable accelerometers has gained significant interest in recent years in a range of healthcare areas, including inferring metabolic energy expenditure, predicting falls, measuring gait parameters and monitoring daily activities. The implementation of HAR relies heavily on the correctness of sensor fixation. The installation errors of wearable accelerometers may dramatically decrease the accuracy of HAR. In this paper, a method is proposed to improve the robustness of HAR to the installation errors of accelerometers. The method first calculates a transformation matrix by using Gram–Schmidt orthonormalization in order to eliminate the sensor's orientation error and then employs a low-pass filter with a cut-off frequency of 10 Hz to eliminate the main effect of the sensor's misplacement. The experimental results showed that the proposed method obtained a satisfactory performance for HAR. The average accuracy rate from ten subjects was 95.1% when there were no installation errors, and was 91.9% when installation errors were involved in wearable accelerometers
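The orientation-correction step in this record can be illustrated with a plain Gram–Schmidt orthonormalization: a set of roughly known axis directions is turned into an exact orthonormal frame, which can then serve as a rotation (transformation) matrix. The input vectors below are hypothetical tilted sensor-axis estimates; the paper's actual construction of the transformation matrix may differ in detail.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors
    (classical Gram-Schmidt): subtract projections onto the
    already-built orthonormal basis, then normalize."""
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            c = dot(w, u)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        n = norm(w)
        basis.append([wi / n for wi in w])
    return basis

# Hypothetical tilted sensor axes estimated from gravity and motion data
axes = gram_schmidt([[0.9, 0.1, 0.1], [0.0, 1.0, 0.2], [0.1, 0.0, 1.0]])
```

The rows of `axes` form an orthonormal frame, so stacking them gives a rotation matrix that maps sensor readings back into a device-independent coordinate system.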

  5. Human factors reliability Benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-06-01

    The Joint Research Centre of the European Commission has organized a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE, which was organized around two study cases: (1) analysis of routine functional test and maintenance (T and M) procedures, with the aim of assessing the probability of test-induced failures, the probability of failures remaining unrevealed, and the potential to initiate transients because of errors made during the test; (2) analysis of human actions during an operational transient, with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report contains the final summary reports produced by the participants in the exercise

  6. Error-related anterior cingulate cortex activity and the prediction of conscious error awareness

    Directory of Open Access Journals (Sweden)

    Catherine eOrr

    2012-06-01

    Full Text Available Research examining the neural mechanisms associated with error awareness has consistently identified dorsal anterior cingulate cortex (ACC) activity as necessary but not predictive of conscious error detection. Two recent studies (Steinhauser and Yeung, 2010; Wessel et al., 2011) found a contrary pattern of greater dorsal ACC activity (in the form of the error-related negativity) during detected errors, but suggested that the greater activity may instead reflect task influences (e.g., response conflict, error probability) and/or individual variability (e.g., statistical power). We re-analyzed fMRI BOLD data from 56 healthy participants who had previously been administered the Error Awareness Task, a motor Go/No-go response inhibition task in which subjects make errors of commission of which they are aware (Aware errors) or unaware (Unaware errors). Consistent with previous data, activity in a number of cortical regions was predictive of error awareness, including bilateral inferior parietal and insula cortices; however, in contrast to previous studies, including our own smaller-sample studies using the same task, error-related dorsal ACC activity was significantly greater during aware errors than during unaware errors. While the significantly faster RT for aware errors (compared to unaware) was consistent with the hypothesis that higher response conflict increases ACC activity, we could find no relationship between dorsal ACC activity and the error RT difference. The data suggest that individual variability in error awareness is associated with error-related dorsal ACC activity, and therefore this region may be important to conscious error detection, but it remains unclear what task and individual factors influence error awareness.

  7. A human error taxonomy for analysing healthcare incident reports: assessing reporting culture and its effects on safety performance

    DEFF Research Database (Denmark)

    Itoh, Kenji; Omata, N.; Andersen, Henning Boje

    2009-01-01

    The present paper reports on a human error taxonomy system developed for healthcare risk management and on its application to evaluating safety performance and reporting culture. The taxonomy comprises dimensions for classifying errors, for performance-shaping factors, and for the maturity...

  8. Human factors information system

    International Nuclear Information System (INIS)

    Goodman, P.C.; DiPalo, C.A.

    1991-01-01

    Nuclear power plant safety is dependent upon human performance related to plant operations. To provide improvements in human performance, data collection and assessment play key roles. This paper reports on the Human Factors Information System (HFIS), which is designed to meet the needs of the human factors specialists of the United States Nuclear Regulatory Commission. These specialists identify personnel errors and provide guidance designed to prevent such errors. HFIS is a simple and modular system designed for use on a personal computer. It contains four separate modules that provide information indicative of program or function effectiveness as well as safety-related human performance based on programmatic and performance data: the Human Factors Status module, the Regulatory Programs module, the Licensee Event Report module, and the Operator Requalification Performance module. Information from these modules can be used separately or combined, owing to the integrated nature of the system. HFIS therefore has the capability to provide insights into those areas of human factors that can reduce the probability of events caused by personnel error at nuclear power plants and promote the health and safety of the public. This information system concept can be applied to other industries as well as the nuclear industry

  9. Human reliability analysis of control room operators

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L.; Carvalho, Paulo Victor R.; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2005-07-01

    Human reliability is the probability that a person correctly performs a system-required action in a required time period and performs no extraneous action that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. Significant progress has been made in the HRA field in recent years, mainly in the nuclear area. Several first-generation HRA methods were developed, such as THERP (Technique for Human Error Rate Prediction). Now an array of so-called second-generation methods is emerging as alternatives, for instance ATHEANA (A Technique for Human Event Analysis). The ergonomics approach uses ergonomic work analysis as its tool: it focuses on the study of operators' activities in their physical and mental forms, considering at the same time the observed characteristics of the operators and the elements of the work environment as they are presented to and perceived by the operators. The aim of this paper is to propose a methodology to analyze the human reliability of the operators of industrial plant control rooms, using a framework that includes the approaches used by ATHEANA and THERP and ergonomic work analysis. (author)

  10. The decline and fall of Type II error rates

    Science.gov (United States)

    Steve Verrill; Mark Durst

    2005-01-01

    For general linear models with normally distributed random errors, the probability of a Type II error decreases exponentially as a function of sample size. This potentially rapid decline reemphasizes the importance of performing power calculations.
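The exponential decline noted in this record can be seen directly in the simplest case, a one-sided z-test on a mean shift, where the Type II error is beta = Phi(z_alpha - delta * sqrt(n) / sigma). This is a standard textbook calculation; the effect size, sigma, and significance level below are illustrative choices.

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def type2_error(delta, sigma, n, z_alpha=1.6449):
    """Type II error (beta) of a one-sided z-test for a true mean
    shift `delta`, noise sd `sigma`, sample size `n`;
    z_alpha ~ 1.6449 is the critical value for alpha = 0.05."""
    return phi(z_alpha - delta * math.sqrt(n) / sigma)

# Beta shrinks rapidly as the sample size doubles
betas = [type2_error(delta=0.5, sigma=1.0, n=n) for n in (10, 20, 40, 80)]
```

Doubling n multiplies the test statistic's noncentrality by sqrt(2), so beta falls off roughly like a Gaussian tail in sqrt(n) — the exponential decline that motivates power calculations.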

  11. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    Science.gov (United States)

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which
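The required-effort logic in this record can be sketched with the standard cumulative-detection formula, assuming independent hauls with a constant per-haul detection probability p: the chance of at least one detection in k hauls is 1 - (1 - p)^k. The per-haul probabilities below are illustrative, not the study's estimates.

```python
import math

def hauls_needed(p_detect, target=0.95):
    """Smallest number of independent seine hauls k such that the
    cumulative detection probability 1 - (1 - p)^k reaches `target`."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_detect))

# Hypothetical per-haul detection probabilities
common = hauls_needed(0.50)  # an easily detected species
rare = hauls_needed(0.25)    # a species with low per-haul detectability
```

With these illustrative values, a species detected half the time needs about five hauls to confidently assess absence, while a poorly detected species needs more than ten — the same order of effort the abstract reports for average conditions versus the Arkansas River Shiner.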

  12. Working group of experts on rare events in human error analysis and quantification

    International Nuclear Information System (INIS)

    Goodstein, L.P.

    1977-01-01

    In dealing with the reference problem of rare events in nuclear power plants, the group has concerned itself with the man-machine system and, in particular, with human error analysis and quantification. The Group was requested to review methods of human reliability prediction, to evaluate the extent to which such analyses can be formalized and to establish criteria to be met by task conditions and system design which would permit a systematic, formal analysis. Recommendations are given on the Fessenheim safety system

  13. Errors in radiographic recognition in the emergency room

    International Nuclear Information System (INIS)

    Britton, C.A.; Cooperstein, L.A.

    1986-01-01

    For 6 months we monitored the frequency and type of errors in radiographic recognition made by radiology residents on call in our emergency room. A relatively low error rate was observed, probably because we evaluated cognitive errors only, rather than including errors of interpretation. The most common missed finding was a small fracture, particularly on the hands or feet. First-year residents were most likely to make an error, but, interestingly, our survey revealed a small subset of upper-level residents who made a disproportionate number of errors

  14. ATHEANA: A Technique for Human Error Analysis: An Overview of Its Methodological Basis

    International Nuclear Information System (INIS)

    Wreathall, John; Ramey-Smith, Ann

    1998-01-01

    The U.S. NRC has developed a new human reliability analysis (HRA) method, called A Technique for Human Event Analysis (ATHEANA), to provide a way of modeling the so-called 'errors of commission' - that is, situations in which operators terminate or disable engineered safety features (ESFs) or similar equipment during accident conditions, thereby putting the plant at an increased risk of core damage. In its reviews of operational events, NRC has found that these errors of commission occur with a relatively high frequency (as high as 2 or 3 per year), but are noticeably missing from the scope of most current probabilistic risk assessments (PRAs). This new method was developed through a formalized approach that describes what can occur when operators behave rationally but have inadequate knowledge or poor judgement. In particular, the method is based on models of decision-making and response planning that have been used extensively in the aviation field, and on the analysis of major accidents in both the nuclear and non-nuclear fields. Other papers at this conference present summaries of these event analyses in both the nuclear and non-nuclear fields. This paper presents an overview of ATHEANA and summarizes how the method structures the analysis of operationally significant events, and helps HRA analysts identify and model potentially risk-significant errors of commission in plant PRAs. (authors)

  15. Cytoarchitecture, probability maps and functions of the human frontal pole.

    Science.gov (United States)

    Bludau, S; Eickhoff, S B; Mohlberg, H; Caspers, S; Laird, A R; Fox, P T; Schleicher, A; Zilles, K; Amunts, K

    2014-06-01

    The frontal pole has expanded more than any other part of the human brain compared to our ancestors. It plays an important role in specifically human behavior and cognitive abilities, e.g. action selection (Kovach et al., 2012). Evidence of divergent functions of its medial and lateral parts has been provided, both in the healthy brain and in psychiatric disorders. The anatomical correlates of such functional segregation, however, are still unknown due to a lack of stereotaxic, microstructural maps obtained in a representative sample of brains. Here we show that the human frontopolar cortex consists of two cytoarchitectonically and functionally distinct areas: lateral frontopolar area 1 (Fp1) and medial frontopolar area 2 (Fp2). Based on observer-independent mapping in serial, cell-body-stained sections of 10 brains, three-dimensional probabilistic maps of areas Fp1 and Fp2 were created. They show, for each position of the reference space, the probability with which each area was found in a particular voxel. Applying these maps as seed regions for a meta-analysis revealed that Fp1 and Fp2 differentially contribute to functional networks: Fp1 was involved in cognition, working memory and perception, whereas Fp2 was part of brain networks underlying affective processing and social cognition. The present study thus disclosed cortical correlates of a functional segregation of the human frontopolar cortex. The probabilistic maps provide a sound anatomical basis for interpreting neuroimaging data in the living human brain, and open new perspectives for analyzing structure-function relationships in the prefrontal cortex. The new data will also serve as a starting point for further comparative studies between human and non-human primate brains. This allows finding similarities and differences in the organizational principles of the frontal lobe during evolution as the neurobiological basis for our behavior and cognitive abilities.
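The probabilistic-map construction described in this record — for each voxel, the fraction of the mapped brains in which the area was found there — reduces to averaging binary masks across subjects. The tiny one-dimensional masks below are invented purely to illustrate the computation; real maps are 3-D volumes in a stereotaxic reference space.

```python
def probability_map(masks):
    """Voxel-wise probability map: the fraction of subject masks
    that label each voxel as belonging to the area."""
    n = len(masks)
    length = len(masks[0])
    return [sum(m[i] for m in masks) / n for i in range(length)]

# Hypothetical binary masks from five brains over four voxels
# (1 = the area was mapped at that voxel in that brain)
masks = [
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 0],
]
pmap = probability_map(masks)
```

A voxel with value 1.0 lies inside the area in every brain, while intermediate values quantify inter-subject anatomical variability — the information a probabilistic atlas adds over a single-subject parcellation.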

  16. Efficient decoding of random errors for quantum expander codes

    OpenAIRE

    Fawzi , Omar; Grospellier , Antoine; Leverrier , Anthony

    2017-01-01

    We show that quantum expander codes, a constant-rate family of quantum LDPC codes, with the quasi-linear time decoding algorithm of Leverrier, Tillich and Zémor, can correct a constant fraction of random errors with very high probability. This is the first construction of a constant-rate quantum LDPC code with an efficient decoding algorithm that can correct a linear number of random errors with a negligible failure probability. Finding codes with these properties is also motivated by Gottes...

  17. Management and Evaluation System on Human Error, Licence Requirements, and Job-aptitude in Rail and the Other Industries

    Energy Technology Data Exchange (ETDEWEB)

    Koo, In Soo; Suh, S. M.; Park, G. O. (and others)

    2006-07-15

    The rail system is very closely related to public life: when an accident happens, members of the public using the system may be injured or even killed. The accident that recently took place in the Taegu subway system, caused by inappropriate human task performance, demonstrated how tragic the results can be. Many studies have shown that most accidents occur because tasks are performed in an inappropriate way. It is generally recognised that a rail system without the human element will not appear for quite a long time, so the human element will remain a major factor in the next tragic accident. This state-of-the-art report studied cases of management and evaluation systems related to human error, licence requirements, and job aptitude in rail and other industries, with the aim of improving the task performance of personnel and ultimately enhancing rail safety. The management of human errors, licence requirements, and systems for evaluating the job aptitude of people engaged in agencies closely related to rail do much to develop and preserve their abilities, but due to various internal and external factors they may be limited in how promptly they can reflect overall trends in society, technology, and values. Removal and control of the factors behind human errors will play an epochal role in the safety of the rail system through the case studies of this report. The analytical results of these case studies will be used in the project 'Development of Management Criteria on Human Error and Evaluation Criteria on Job-aptitude of Rail Safe-operation Personnel', which has been carried out as part of the 'Integrated R and D Program for Railway Safety'.

  18. Management and Evaluation System on Human Error, Licence Requirements, and Job-aptitude in Rail and the Other Industries

    International Nuclear Information System (INIS)

    Koo, In Soo; Suh, S. M.; Park, G. O.

    2006-07-01

    The rail system is closely bound up with public life: when an accident occurs, members of the public using the system may be injured or even killed. The recent accident in the Taegu subway system, caused by inappropriate human task performance, demonstrated how tragic the consequences can be. Many studies have shown that most accidents occur because personnel perform their tasks inappropriately. It is generally recognised that a rail system without a human element will not appear for quite a long time, so the human element will remain a major factor in the next tragic accident. This state-of-the-art report studied cases of management and evaluation systems for human error, licence requirements, and job aptitude in rail and other industries, with the aim of improving the task performance of the personnel who constitute that human element and, ultimately, of enhancing rail safety. The systems for managing human error, licence requirements, and evaluating the job aptitude of people in agencies closely related to rail do much to develop and preserve their abilities; owing to various internal and external factors, however, they may be limited in how promptly they reflect overall trends in society, technology, and values. Removing and controlling the factors behind human error will play a decisive role in the safety of the rail system. The analytical results of the case studies in this report will be used in the project 'Development of Management Criteria on Human Error and Evaluation Criteria on Job-aptitude of Rail Safe-operation Personnel', carried out as part of the 'Integrated R and D Program for Railway Safety'

  19. Human reliability assessment in context

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    2005-01-01

    Human Reliability Assessment (HRA) is conducted on the unspoken premise that 'human error' is a meaningful concept and that it can be associated with individual actions. The basis for this assumption is found in the origin of HRA as a necessary extension of PSA to account for the impact of failures emanating from human actions. Although it was natural to model HRA on PSA, a large number of studies have shown that the premises are wrong, specifically that human and technological functions cannot be decomposed in the same manner. The general experience from accident studies also indicates that action failures are a function of the context, and that it is the variability of the context, rather than the 'human error probability', that is the much-sought signal. Accepting this has significant consequences for the way in which HRA, and ultimately also PSA, should be pursued

  20. Dependence assessment in human reliability analysis based on D numbers and AHP

    International Nuclear Information System (INIS)

    Zhou, Xinyi; Deng, Xinyang; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    Highlights: • D numbers and AHP are combined to implement dependence assessment in HRA. • A new tool, called D numbers, is used to deal with the uncertainty in HRA. • The proposed method addresses the fuzziness and subjectivity in linguistic assessment. • The proposed method is well suited to dependence assessment, which inherently involves a linguistic assessment process. - Abstract: Since human errors can cause heavy losses, especially in nuclear engineering, human reliability analysis (HRA) has attracted more and more attention. Dependence assessment plays a vital role in HRA, measuring the degree of dependence between human errors. Much research has been done, but room for improvement remains. In this paper, a dependence assessment model based on D numbers and the analytic hierarchy process (AHP) is proposed. First, the factors used to measure the dependence level of two human operations are identified, and, in terms of the suggested dependence levels, anchor points for each factor are determined and quantified. Second, D numbers and AHP are adopted in the model: experts evaluate the dependence level of the human operations for each factor, and the evaluation results are represented as D numbers and fused by the D numbers combination rule to obtain the dependence probability of the human operations for each factor, while the weights of the factors are determined by AHP. Third, from the dependence probability for each factor and its corresponding weight, the dependence probability of the two human operations and its confidence are obtained. The proposed method addresses the fuzziness and subjectivity of linguistic assessment and is well suited to assessing the dependence degree of human errors in HRA, which inherently involves a linguistic assessment process.
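
    The final aggregation step described in the abstract can be sketched in a few lines. The pairwise-comparison matrix, the three factors, and the per-factor dependence probabilities below are illustrative assumptions, not values from the paper, and a column-averaging approximation of the AHP weights stands in for the full eigenvector method:

```python
# Hypothetical sketch of the aggregation step: AHP weights from a
# pairwise-comparison matrix (approximated by normalized column averaging)
# combined with per-factor dependence probabilities.

def ahp_weights(pairwise):
    """Approximate AHP priority weights by averaging normalized columns."""
    n = len(pairwise)
    col_sums = [sum(pairwise[i][j] for i in range(n)) for j in range(n)]
    normalized = [[pairwise[i][j] / col_sums[j] for j in range(n)]
                  for i in range(n)]
    return [sum(row) / n for row in normalized]

def dependence_probability(factor_probs, weights):
    """Weighted combination of per-factor dependence probabilities."""
    return sum(p * w for p, w in zip(factor_probs, weights))

# Three illustrative factors, e.g. time interval, task similarity, same crew.
pairwise = [[1, 3, 5],
            [1 / 3, 1, 2],
            [1 / 5, 1 / 2, 1]]
w = ahp_weights(pairwise)
p = dependence_probability([0.8, 0.5, 0.6], w)
```

    The weighted result necessarily lies between the smallest and largest per-factor probabilities, which makes the sensitivity to the AHP weights easy to inspect.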

  1. Dependence assessment in human reliability analysis based on D numbers and AHP

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Xinyi; Deng, Xinyang [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); Deng, Yong, E-mail: ydeng@swu.edu.cn [School of Computer and Information Science, Southwest University, Chongqing 400715 (China); Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu, Sichuan 610054 (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville, TN 37235 (United States)

    2017-03-15

    Highlights: • D numbers and AHP are combined to implement dependence assessment in HRA. • A new tool, called D numbers, is used to deal with the uncertainty in HRA. • The proposed method addresses the fuzziness and subjectivity in linguistic assessment. • The proposed method is well suited to dependence assessment, which inherently involves a linguistic assessment process. - Abstract: Since human errors can cause heavy losses, especially in nuclear engineering, human reliability analysis (HRA) has attracted more and more attention. Dependence assessment plays a vital role in HRA, measuring the degree of dependence between human errors. Much research has been done, but room for improvement remains. In this paper, a dependence assessment model based on D numbers and the analytic hierarchy process (AHP) is proposed. First, the factors used to measure the dependence level of two human operations are identified, and, in terms of the suggested dependence levels, anchor points for each factor are determined and quantified. Second, D numbers and AHP are adopted in the model: experts evaluate the dependence level of the human operations for each factor, and the evaluation results are represented as D numbers and fused by the D numbers combination rule to obtain the dependence probability of the human operations for each factor, while the weights of the factors are determined by AHP. Third, from the dependence probability for each factor and its corresponding weight, the dependence probability of the two human operations and its confidence are obtained. The proposed method addresses the fuzziness and subjectivity of linguistic assessment and is well suited to assessing the dependence degree of human errors in HRA, which inherently involves a linguistic assessment process.
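
    Dependence levels in HRA assessments of this kind conventionally follow the five THERP levels from NUREG/CR-1278, whose conditional HEPs have simple closed forms. Assuming that convention (the nominal HEP value below is invented for illustration):

```python
# Conditional HEP of a second task given failure of the first, per THERP's
# five dependence levels (NUREG/CR-1278). p is the nominal (independent) HEP.

def conditional_hep(p, level):
    formulas = {
        "zero":     p,                   # no dependence
        "low":      (1 + 19 * p) / 20,
        "moderate": (1 + 6 * p) / 7,
        "high":     (1 + p) / 2,
        "complete": 1.0,                 # second task fails whenever first does
    }
    return formulas[level]

nominal = 0.01
for level in ("zero", "low", "moderate", "high", "complete"):
    print(level, round(conditional_hep(nominal, level), 4))
```

    Even "low" dependence raises a 0.01 HEP to roughly 0.06, which is why the dependence level chosen by the experts dominates the quantified result.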

  2. Human Factor Modelling in the Risk Assessment of Port Manoeuvers

    Directory of Open Access Journals (Sweden)

    Teresa Abramowicz-Gerigk

    2015-09-01

    Full Text Available The documentation of human factor influence on scenario development in maritime accidents, compared with expert methods, is commonly used as a basis for setting up safety regulations and instructions. New accidents and near misses show the need for further study of the human factor's influence on both risk acceptance criteria and the development of risk control options for manoeuvres in restricted waters. The paper presents a model of human error probability proposed for assessing decision errors by ship masters and marine pilots and their influence on the risk of port manoeuvres.

  3. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets
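
    One of the items listed above, the error on the ratio of two variables, has a compact first-order form. The sketch below is illustrative (the input values are invented, not from the chapter) and assumes uncorrelated uncertainties:

```python
import math

def ratio_error(x, sx, y, sy):
    """First-order (small-error) propagation for R = x / y,
    assuming uncorrelated uncertainties:
    (sR/R)^2 = (sx/x)^2 + (sy/y)^2."""
    r = x / y
    sr = abs(r) * math.sqrt((sx / x) ** 2 + (sy / y) ** 2)
    return r, sr

# 10% relative errors on both numerator and denominator combine to ~14.1%.
r, sr = ratio_error(100.0, 10.0, 50.0, 5.0)
```

    The quadrature combination of relative errors is exactly the "propagation of small errors" case discussed in the chapter; for large errors the linearization breaks down and the later sections apply.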

  4. Approximation errors during variance propagation

    International Nuclear Information System (INIS)

    Dinsmore, Stephen

    1986-01-01

    Risk and reliability analyses are often performed by constructing and quantifying large fault trees. The inputs to these models are component failure events whose probabilities of occurring are best represented as random variables. This paper examines the errors inherent in two approximation techniques used to calculate the top event's variance from the inputs' variances. Two sample fault trees are evaluated, and several three-dimensional plots illustrating the magnitude of the error over a wide range of input means and variances are given
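
    The kind of approximation error discussed above can be seen in a toy case: for the product of two independent Gaussian inputs (e.g. an AND gate with random failure probabilities), first-order propagation drops the var(A)·var(B) cross term that the exact variance retains. All the numbers below are illustrative:

```python
import random

# For Y = A * B with A, B independent:
#   exact Var(Y)      = mu_a^2 var_b + mu_b^2 var_a + var_a * var_b
#   first-order Var(Y) = mu_a^2 var_b + mu_b^2 var_a   (cross term dropped)
mu_a, var_a = 1e-3, (5e-4) ** 2
mu_b, var_b = 2e-3, (1e-3) ** 2

first_order = mu_a ** 2 * var_b + mu_b ** 2 * var_a
exact = first_order + var_a * var_b

# Monte Carlo check of the exact formula.
random.seed(0)
samples = [random.gauss(mu_a, var_a ** 0.5) * random.gauss(mu_b, var_b ** 0.5)
           for _ in range(200_000)]
mean = sum(samples) / len(samples)
mc_var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
```

    With these inputs the dropped cross term is about 11% of the total variance, illustrating how the approximation error grows when the inputs' relative uncertainties are large.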

  5. Handbook of human-reliability analysis with emphasis on nuclear power plant applications. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Swain, A D; Guttmann, H E

    1983-08-01

    The primary purpose of the Handbook is to present methods, models, and estimated human error probabilities (HEPs) to enable qualified analysts to make quantitative or qualitative assessments of occurrences of human errors in nuclear power plants (NPPs) that affect the availability or operational reliability of engineered safety features and components. The Handbook is intended to provide much of the modeling and information necessary for the performance of human reliability analysis (HRA) as a part of probabilistic risk assessment (PRA) of NPPs. Although not a design guide, a second purpose of the Handbook is to enable the user to recognize error-likely equipment design, plant policies and practices, written procedures, and other human factors problems so that improvements can be considered. The Handbook provides the methodology to identify and quantify the potential for human error in NPP tasks.

  6. Handbook of human-reliability analysis with emphasis on nuclear power plant applications. Final report

    International Nuclear Information System (INIS)

    Swain, A.D.; Guttmann, H.E.

    1983-08-01

    The primary purpose of the Handbook is to present methods, models, and estimated human error probabilities (HEPs) to enable qualified analysts to make quantitative or qualitative assessments of occurrences of human errors in nuclear power plants (NPPs) that affect the availability or operational reliability of engineered safety features and components. The Handbook is intended to provide much of the modeling and information necessary for the performance of human reliability analysis (HRA) as a part of probabilistic risk assessment (PRA) of NPPs. Although not a design guide, a second purpose of the Handbook is to enable the user to recognize error-likely equipment design, plant policies and practices, written procedures, and other human factors problems so that improvements can be considered. The Handbook provides the methodology to identify and quantify the potential for human error in NPP tasks
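
    As a minimal illustration of how tabulated HEPs feed a quantitative assessment (a sketch only, not the Handbook's full HRA event-tree method; the step HEPs and the performance shaping factor value are invented):

```python
# Overall failure probability of a task composed of independent steps,
# each with a nominal HEP scaled by a performance shaping factor (PSF).

def task_failure_probability(heps, psf=1.0):
    p_success = 1.0
    for hep in heps:
        p_success *= 1.0 - min(1.0, hep * psf)   # clamp scaled HEP at 1
    return 1.0 - p_success

steps = [0.003, 0.001, 0.01]      # nominal HEPs for three task steps
nominal = task_failure_probability(steps)             # nominal conditions
stressed = task_failure_probability(steps, psf=5.0)   # e.g. high stress
```

    The Handbook's actual method also models recovery factors and dependence between steps; this sketch shows only the basic series combination.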

  7. Error exponents for entanglement concentration

    International Nuclear Information System (INIS)

    Hayashi, Masahito; Koashi, Masato; Matsumoto, Keiji; Morikoshi, Fumiaki; Winter, Andreas

    2003-01-01

    Consider entanglement concentration schemes that convert n identical copies of a pure state into a maximally entangled state of a desired size with success probability being close to one in the asymptotic limit. We give the distillable entanglement, the number of Bell pairs distilled per copy, as a function of an error exponent, which represents the rate of decrease in failure probability as n tends to infinity. The formula fills the gap between the least upper bound of distillable entanglement in probabilistic concentration, which is the well-known entropy of entanglement, and the maximum attained in deterministic concentration. The method of types in information theory enables the detailed analysis of the distillable entanglement in terms of the error rate. In addition to the probabilistic argument, we consider another type of entanglement concentration scheme, where the initial state is deterministically transformed into a (possibly mixed) final state whose fidelity to a maximally entangled state of a desired size converges to one in the asymptotic limit. We show that the same formula as in the probabilistic argument is valid for the argument on fidelity by replacing the success probability with the fidelity. Furthermore, we also discuss entanglement yield when optimal success probability or optimal fidelity converges to zero in the asymptotic limit (strong converse), and give the explicit formulae for those cases

  8. Frequency formats, probability formats, or problem structure? A test of the nested-sets hypothesis in an extensional reasoning task

    Directory of Open Access Journals (Sweden)

    William P. Neace

    2008-02-01

    Full Text Available Five experiments addressed a controversy in the probability judgment literature that centers on the efficacy of framing probabilities as frequencies. The natural frequency view predicts that frequency formats attenuate errors, while the nested-sets view predicts that highlighting the set-subset structure of the problem reduces error, regardless of problem format. This study tested these predictions using a conjunction task. Previous studies reporting that frequency formats reduced conjunction errors confounded reference class with problem format. After controlling this confound, the present study's findings show that conjunction errors can be reduced using either a probability or a frequency format, that frequency effects depend upon the presence of a reference class, and that frequency formats do not promote better statistical reasoning than probability formats.

  9. Human action contribution to ET-RR-II reactor systems unavailability

    International Nuclear Information System (INIS)

    Sabek, M.G.

    2001-01-01

    This paper studies the contribution of human actions to the unavailability of ET-RR-II reactor systems as a result of test and maintenance procedures. The human contribution is expressed in terms of Fussell-Vesely importance, defined as the probability that the event contributes to system failure (unavailability). The contribution of human-error basic events was analyzed for all initiating events and system fault trees. The calculation results show a high contribution (61%) of human error to system unavailability. This means that operators and maintenance staff should be highly qualified and trained, with programs for continuous training; in addition, test and maintenance procedures should be simple and clear in order to minimize the contribution of human errors. The calculations were done using the IRRAS code
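
    The Fussell-Vesely measure used above can be sketched on a toy fault tree; the tree structure and basic-event probabilities below are invented for illustration, and FV for an event is computed as 1 minus the ratio of the top-event probability with that event removed to the full top-event probability:

```python
# Fussell-Vesely importance for a toy fault tree: TOP = (A AND B) OR C,
# with independent basic events, evaluated exactly (no rare-event cutoff).

def top_probability(qa, qb, qc):
    q_ab = qa * qb
    return q_ab + qc - q_ab * qc          # OR of two independent events

def fussell_vesely(qa, qb, qc):
    q_top = top_probability(qa, qb, qc)
    importances = {}
    for name in ("A", "B", "C"):
        q = dict(qa=qa, qb=qb, qc=qc)
        q["q" + name.lower()] = 0.0       # remove this event's contribution
        importances[name] = 1.0 - top_probability(**q) / q_top
    return importances

importances = fussell_vesely(0.1, 0.2, 0.05)
```

    A high FV value, like the 61% reported in the paper for human error, means that most of the system unavailability flows through cut sets containing that event.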

  10. Survey of methods used to assess human reliability in the human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1988-01-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim to assess the state-of-the-art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participate in the HF-RBE, which is organised around two study cases: (1) analysis of routine functional test and maintenance procedures, with the aim to assess the probability of test-induced failures, the probability of failures to remain unrevealed, and the potential to initiate transients because of errors performed in the test; and (2) analysis of human actions during an operational transient, with the aim to assess the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. The paper briefly reports how the HF-RBE was structured and gives an overview of the methods that have been used for predicting human reliability in both study cases. The experience in applying these methods is discussed and the results obtained are compared. (author)

  11. Errors in Neonatology

    OpenAIRE

    Antonio Boldrini; Rosa T. Scaramuzzo; Armando Cuttano

    2013-01-01

    Introduction: Danger and errors are inherent in human activities. In medical practice, errors can lead to adverse events for patients. Mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience in the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main err...

  12. Position Error Covariance Matrix Validation and Correction

    Science.gov (United States)

    Frisbee, Joe, Jr.

    2016-01-01

    In order to calculate operationally accurate collision probabilities, the position error covariance matrices predicted at times of closest approach must be sufficiently accurate representations of the position uncertainties. This presentation will discuss why the Gaussian distribution is a reasonable expectation for the position uncertainty and how this assumed distribution type is used in the validation and correction of position error covariance matrices.
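
    One standard realism check consistent with the Gaussian assumption above can be sketched: if the errors really are Gaussian with the claimed covariance, squared Mahalanobis distances of the residuals follow a chi-square distribution. The sketch below uses a diagonal 2-D covariance and invented sigmas for simplicity (for 2 degrees of freedom the chi-square CDF is 1 - exp(-d²/2)):

```python
import math
import random

# Validate a claimed 2-D covariance diag(sx^2, sy^2): squared Mahalanobis
# distances of Gaussian residuals should follow chi-square with 2 dof.
random.seed(1)
sx, sy = 10.0, 4.0
d2 = [(random.gauss(0, sx) / sx) ** 2 + (random.gauss(0, sy) / sy) ** 2
      for _ in range(100_000)]

threshold = 2.0                               # illustrative threshold
expected = 1 - math.exp(-threshold / 2)       # chi-square(2) CDF, ~0.632
observed = sum(v <= threshold for v in d2) / len(d2)
```

    A systematic mismatch between the observed and expected coverage fractions signals an over- or under-sized covariance, which is exactly what the correction step described above is meant to fix.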

  13. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    random sampling allows data to be modelled with the help of probability ... g based on different trials to get an estimate of the experimental error. ... research interests lie in the .... if e is indeed the true value of the proportion of defectives in the.

  14. Numerical determination of transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Queiroz Bogado Leite, S. de.

    1989-11-01

    Efficient methods for the numerical calculation of transmission probabilities in cylindrical geometry are presented. Relative errors of the order of 10⁻⁵ or smaller are obtained using analytical solutions and low-order quadrature integration schemes. (author) [pt

  15. Preliminary Human Reliability Issues in Reviewing SMART PSA

    International Nuclear Information System (INIS)

    Lee, Chang Ju; Sheen, Cheol

    2010-01-01

    Human reliability analysis (HRA) identifies the human failure events (HFEs) that can negatively impact normal or emergency plant operations, and systematically estimates the probabilities of HFEs using data (when available), models, or expert judgment. For newly conceptualized reactors like SMART (System-integrated Modular Advanced Reactor), HRA results must be provided by first evaluating the applicability of the set of human errors that has typically been applied in PSAs for existing PWRs; additional human errors should also be identified to reflect the design's unique features and operational characteristics. The objective of this paper is twofold: to discuss a direction for the HRA used in confirming the risk level of SMART-type reactors, and to extract preliminary points or issues for regulatory verification, referring to available safety guides

  16. Trend analysis of nuclear reactor automatic trip events subjected to operator's human error at United States nuclear power plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2009-01-01

    Trends in nuclear reactor automatic trip events due to human errors during plant operating modes have been analyzed by extracting 20 events that took place in the United States during the seven years from 2002 to 2008, cited in the Licensee Event Reports (LERs) submitted to the US Nuclear Regulatory Commission (NRC). The yearly number of events was relatively large before 2005 and decreased thereafter; a period of stable operation, in which the yearly number remained very small, continued for about three years, and then the yearly number turned upward again. Before 2005, automatic trip events occurred more frequently during periodic inspections or start-up/shut-down operations; recent trends, however, indicate that trip events have become more frequent due to human errors during daily operations. Throughout the whole period, human errors were mostly caused by the overconfidence and carelessness of operators. The trends in the yearly number of events might be explained as follows. The decrease in automatic trip events is attributed to the sharing of trouble information, leading to improvement of the manuals and training for the operations that carry a higher potential risk of automatic trip. Then, as the period of stable operation continued, some operators came to pay less attention to preventing human errors and lost interest in the training, eventually leading to automatic trip events due to mis-operation. From these analyses of trouble experiences in the US, we learned the following lessons for preventing similar troubles in Japan: operators should be thoroughly skilled in the basic actions that prevent human errors, and it should be further emphasized that they should prepare by imagining actual plant operations, even when simulator training gives them successful experiences. (author)

  17. Asteroid orbital error analysis: Theory and application

    Science.gov (United States)

    Muinonen, K.; Bowell, Edward

    1992-01-01

    We present a rigorous Bayesian theory for asteroid orbital error estimation in which the probability density of the orbital elements is derived from the noise statistics of the observations. For Gaussian noise in a linearized approximation the probability density is also Gaussian, and the errors of the orbital elements at a given epoch are fully described by the covariance matrix. The law of error propagation can then be applied to calculate past and future positional uncertainty ellipsoids (Cappellari et al. 1976, Yeomans et al. 1987, Whipple et al. 1991). To our knowledge, this is the first time a Bayesian approach has been formulated for orbital element estimation. In contrast to the classical Fisherian school of statistics, the Bayesian school allows a priori information to be formally present in the final estimation. However, Bayesian estimation gives the same results as Fisherian estimation when no a priori information is assumed (Lehtinen 1988, and references therein).
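
    The linearized propagation law referred to above, P' = J P Jᵀ, can be sketched directly. The Jacobian and covariance below are small illustrative 2×2 examples, not orbital values:

```python
# Linearized (Gaussian) covariance propagation P' = J P J^T,
# implemented with plain nested lists for a 2x2 case.

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(m):
    return [list(row) for row in zip(*m)]

def propagate(J, P):
    """Propagate covariance P through the linearized mapping J."""
    return mat_mul(mat_mul(J, P), transpose(J))

P = [[4.0, 1.0],
     [1.0, 9.0]]          # covariance at the initial epoch
J = [[1.0, 0.5],
     [0.0, 1.0]]          # Jacobian of the (linearized) state mapping
P_new = propagate(J, P)
```

    The propagated matrix stays symmetric by construction, and its ellipse at the new epoch is the positional uncertainty ellipsoid described in the abstract (here in two dimensions).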

  18. Subsecond dopamine fluctuations in human striatum encode superposed error signals about actual and counterfactual reward

    Science.gov (United States)

    Kishida, Kenneth T.; Saez, Ignacio; Lohrenz, Terry; Witcher, Mark R.; Laxton, Adrian W.; Tatter, Stephen B.; White, Jason P.; Ellis, Thomas L.; Phillips, Paul E. M.; Montague, P. Read

    2016-01-01

    In the mammalian brain, dopamine is a critical neuromodulator whose actions underlie learning, decision-making, and behavioral control. Degeneration of dopamine neurons causes Parkinson’s disease, whereas dysregulation of dopamine signaling is believed to contribute to psychiatric conditions such as schizophrenia, addiction, and depression. Experiments in animal models suggest the hypothesis that dopamine release in human striatum encodes reward prediction errors (RPEs) (the difference between actual and expected outcomes) during ongoing decision-making. Blood oxygen level-dependent (BOLD) imaging experiments in humans support the idea that RPEs are tracked in the striatum; however, BOLD measurements cannot be used to infer the action of any one specific neurotransmitter. We monitored dopamine levels with subsecond temporal resolution in humans (n = 17) with Parkinson’s disease while they executed a sequential decision-making task. Participants placed bets and experienced monetary gains or losses. Dopamine fluctuations in the striatum fail to encode RPEs, as anticipated by a large body of work in model organisms. Instead, subsecond dopamine fluctuations encode an integration of RPEs with counterfactual prediction errors, the latter defined by how much better or worse the experienced outcome could have been. How dopamine fluctuations combine the actual and counterfactual is unknown. One possibility is that this process is the normal behavior of reward processing dopamine neurons, which previously had not been tested by experiments in animal models. Alternatively, this superposition of error terms may result from an additional yet-to-be-identified subclass of dopamine neurons. PMID:26598677
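
    The two error terms can be written out explicitly. The additive superposition and the numbers below are a toy illustration only; the paper states that how the actual and counterfactual terms combine is unknown:

```python
# Toy illustration of the two error signals described above: a reward
# prediction error (RPE) and a counterfactual prediction error (CPE).
# Their simple sum here is an assumption for illustration, not the
# paper's model of the dopamine signal.

def rpe(actual, expected):
    """Actual minus expected outcome of the chosen bet."""
    return actual - expected

def cpe(actual, best_alternative):
    """How much better (or worse) the outcome was than the best unchosen bet."""
    return actual - best_alternative

bet_outcome = 20.0       # money actually won
expected_value = 5.0     # expected outcome of the chosen bet
best_unchosen = 50.0     # what the best unchosen bet would have paid

signal = rpe(bet_outcome, expected_value) + cpe(bet_outcome, best_unchosen)
```

    A win that beats expectations (positive RPE) can still produce a net negative signal when a better unchosen option dominates, which is the qualitative pattern the measured dopamine fluctuations showed.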

  19. Effective use of pre-job briefing as tool for the prevention of human error

    International Nuclear Information System (INIS)

    Schlump, Ansgar

    2015-01-01

    There is a fundamental demand to minimise the risks for workers and facilities while executing maintenance work. To ensure that facilities are secure and reliable, any deviation from normal operating behaviour has to be avoided. Accurate planning is the basis for minimising mistakes and making work more secure: all workers involved should understand how the work is to be done and what is expected, so as to avoid human errors. In nuclear power plants especially, human performance tools (HPT) have proved to be an effective instrument for minimising human errors. These tools comprise numerous different instruments that complement each other (e.g. pre-job briefing), and the safety culture of the plants is also characterised by them. Choosing the right HP tool is often a difficult task for the work planner: on the one hand, he wants to avoid mistakes during the execution of work, but on the other hand he does not want to burden the workers with unnecessary requirements. The proposed concept uses a simple risk analysis that takes into account the complexity of the task, past experience, and the consequences of failure. One main result of this risk analysis is a recommendation on the level of detail of the pre-job briefing, to reduce the risks for the staff involved to a minimum.

  20. Impact of spectral smoothing on gamma radiation portal alarm probabilities

    International Nuclear Information System (INIS)

    Burr, T.; Hamada, M.; Hengartner, N.

    2011-01-01

    Gamma detector counts are included in radiation portal monitors (RPM) to screen for illicit nuclear material. Gamma counts are sometimes smoothed to reduce variance in the estimated underlying true mean count rate, which is the 'signal' in our context. Smoothing reduces total error variance in the estimated signal if the bias that smoothing introduces is more than offset by the variance reduction. An empirical RPM study for vehicle screening applications is presented for unsmoothed and smoothed gamma counts in low-resolution plastic scintillator detectors and in medium-resolution NaI detectors. - Highlights: → We evaluate options for smoothing counts from gamma detectors deployed for portal monitoring. → A new multiplicative bias correction (MBC) is shown to reduce bias in peak and valley regions. → Performance is measured using mean squared error and detection probabilities for sources. → Smoothing with the MBC improves detection probabilities and the mean squared error.
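
    The bias-variance trade-off behind these results can be sketched with a simple moving average over a synthetic spectrum. The spectrum shape, the window width, and a Gaussian stand-in for Poisson counting noise are all illustrative assumptions:

```python
import random

# Moving-average smoothing of noisy counts: variance drops everywhere, but
# bias appears where the true rate changes quickly (the peak), so the
# decomposition MSE = bias^2 + variance decides whether smoothing helps.

def smooth(counts, half_width=2):
    out = []
    for i in range(len(counts)):
        lo, hi = max(0, i - half_width), min(len(counts), i + half_width + 1)
        out.append(sum(counts[lo:hi]) / (hi - lo))
    return out

random.seed(2)
true_rate = [10.0] * 20 + [60.0] * 3 + [10.0] * 20   # flat background + peak
trials = 500
mse_raw = [0.0] * len(true_rate)
mse_smooth = [0.0] * len(true_rate)
for _ in range(trials):
    counts = [random.gauss(m, m ** 0.5) for m in true_rate]  # ~Poisson noise
    sm = smooth(counts)
    for i, m in enumerate(true_rate):
        mse_raw[i] += (counts[i] - m) ** 2 / trials
        mse_smooth[i] += (sm[i] - m) ** 2 / trials
```

    In the flat background smoothing wins (variance shrinks with no bias), while at the peak center the smoothing bias dominates, which is the effect the multiplicative bias correction in the highlights is designed to counter.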

  1. A system dynamic simulation model for managing the human error in power tools industries

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2017-10-01

    In the modern, competitive world of today, every organization faces situations in which work does not proceed as planned because problems occur and cause delays, and human error is often cited as the culprit. Errors made by employees force them to spend additional time identifying and checking for the error, which in turn can affect the normal operations of the company as well as its reputation. Employees are a key element of the organization in running all of its activities; hence, their work performance is a crucial factor in organizational success. The purpose of this study is to identify the factors that cause the increase in errors made by employees in an organization, using a system dynamics approach. The broadly defined target group in this study is the employees of the Regional Material Field team in the purchasing department of a power tools company. Questionnaires were distributed to the respondents to obtain their perceptions of the root causes of errors made by employees in the company. A system dynamics model was developed to simulate the factors behind the increase in employee errors and their impact. The findings of this study showed that the increase in errors made by employees was generally caused by workload, work capacity, job stress, motivation, and employee performance, and that the problem could be mitigated by increasing the number of employees in the organization.
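
    A minimal stock-and-flow sketch in this spirit, with workload-driven stress raising the error rate and staffing raising capacity, can be integrated Euler-style. The functional forms and coefficients below are invented, not the authors' model:

```python
# Toy system-dynamics model: cumulative errors accumulate at a rate driven
# by job stress (workload over capacity) and damped by motivation.

def simulate(weeks=52, staff=10, workload=120.0, dt=1.0):
    errors = 0.0
    history = []
    for _ in range(weeks):
        capacity = staff * 10.0                   # tasks/week the team handles
        stress = max(0.0, workload / capacity - 1.0)
        motivation = 1.0 / (1.0 + stress)         # stress erodes motivation
        error_rate = 0.5 * (1.0 + stress) / motivation
        errors += error_rate * dt                 # Euler integration of the stock
        history.append(errors)
    return history

understaffed = simulate(staff=10)   # workload exceeds capacity
reinforced = simulate(staff=14)     # added staff remove the stress term
```

    The reinforced run accumulates fewer errors over the year, mirroring the study's conclusion that increasing headcount relieves the workload-stress loop.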

  2. Human Reliability Analysis for In-Tank Precipitation Alignment and Startup of Emergency Purge Ventilation Equipment. Revision 4

    International Nuclear Information System (INIS)

    Shapiro, B.J.; Britt, T.E.

    1995-06-01

    This report documents the methodology used for calculating the human error probability of establishing air-based ventilation using emergency purge ventilation equipment on In-Tank Precipitation (ITP) processing tanks 48 and 49 after a failure of the nitrogen purge system following a seismic event. The analyses were performed according to THERP (Technique for Human Error Rate Prediction) as described in NUREG/CR-1278-F

  3. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  4. A New Human-Machine Interfaces of Computer-based Procedure System to Reduce the Team Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul

    2016-01-01

    In this study, we identify emerging types of team errors, particularly in the digitalized control rooms of nuclear power plants such as the APR-1400 main control room in Korea. Most work in the nuclear industry is performed by teams of two or more persons. Although individual errors can be detected and recovered by qualified colleagues and/or a well-trained team, errors made by the team itself are seldom easily detected and properly recovered by that team. Note that a team is defined as two or more people who interact appropriately with each other; it is a dependent aggregate that accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active-failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We adopt crew resource management as a representative approach to the team factors in human error, and we suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces against team error for use with a computer-based procedure system in a digitalized main control room; the computer-based procedure system is a representative feature of the digitalized control room. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors, and we are conducting an effectiveness test to validate whether the new interfaces can reduce team errors during operation with a computer-based procedure system in a digitalized control room

  5. A New Human-Machine Interfaces of Computer-based Procedure System to Reduce the Team Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    In this study, we identify emerging types of team errors, particularly in the digitalized control rooms of nuclear power plants such as the APR-1400 main control room in Korea. Most work in the nuclear industry is performed by teams of two or more persons. Although individual errors can be detected and recovered by qualified colleagues and/or a well-trained team, errors made by the team itself are seldom easily detected and properly recovered by that team. Note that a team is defined as two or more people who interact appropriately with each other; it is a dependent aggregate that accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active-failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We adopt crew resource management as a representative approach to the team factors in human error, and we suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces against team error for use with a computer-based procedure system in a digitalized main control room; the computer-based procedure system is a representative feature of the digitalized control room. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors, and we are conducting an effectiveness test to validate whether the new interfaces can reduce team errors during operation with a computer-based procedure system in a digitalized control room.

  6. A methodology for analysing human errors of commission in accident scenarios for risk assessment

    International Nuclear Information System (INIS)

    Kim, J. H.; Jung, W. D.; Park, J. K

    2003-01-01

    As concern over the impact of operators' inappropriate interventions, so-called errors of commission (EOCs), on plant safety has grown, interest in the identification and analysis of EOC events from the risk assessment perspective has increased accordingly. To this end, we propose a new methodology for identifying and analysing human errors of commission that might be caused by failures in situation assessment and decision making during accident progression following an initiating event. The proposed methodology was applied to accident scenarios of the YGN 3 and 4 NPPs, which yielded about 10 EOC situations that need careful attention

  7. Using human error theory to explore the supply of non-prescription medicines from community pharmacies.

    Science.gov (United States)

    Watson, M C; Bond, C M; Johnston, M; Mearns, K

    2006-08-01

    The importance of theory in underpinning interventions to promote effective professional practice is gaining recognition. The Medical Research Council framework for complex interventions has assisted in promoting awareness and adoption of theory into study design. Human error theory has previously been used by high risk industries but its relevance to healthcare settings and patient safety requires further investigation. This study used this theory as a framework to explore non-prescription medicine supply from community pharmacies. The relevance to other healthcare settings and behaviours is discussed. A 25% random sample was made of 364 observed consultations for non-prescription medicines. Each of the 91 consultations was assessed by two groups: a consensus group (stage 1) to identify common problems with the consultation process, and an expert group (stages 2 and 3) to apply human error theory to these consultations. Paired assessors (most of whom were pharmacists) categorised the perceived problems occurring in each consultation (stage 1). During stage 2 paired assessors from an expert group (comprising patient safety experts, community pharmacists and psychologists) considered whether each consultation was compliant with professional guidelines for the supply of pharmacy medicines. Each non-compliant consultation identified during stage 2 was then categorised as a slip/lapse, mistake, or violation using human error theory (stage 3). During stage 1 most consultations (n = 75, 83%) were deemed deficient in information exchange. At stage 2, paired assessors varied in attributing non-compliance to specific error types. Where agreement was achieved, the error type most often selected was "violation" (n = 27, 51.9%, stage 3). Consultations involving product requests were less likely to be guideline compliant than symptom presentations (OR 0.30, 95% CI 0.10 to 0.95, p = 0.05). The large proportion of consultations classified as violations suggests that either

  8. Minimizing human error in radiopharmaceutical preparation and administration via a bar code-enhanced nuclear pharmacy management system.

    Science.gov (United States)

    Hakala, John L; Hung, Joseph C; Mosman, Elton A

    2012-09-01

    The objective of this project was to ensure correct radiopharmaceutical administration through the use of a bar code system that links patient and drug profiles with on-site information management systems. This new combined system would minimize the amount of manual human manipulation, which has proven to be a primary source of error. The most common reason for dosing errors is improper patient identification when a dose is obtained from the nuclear pharmacy or when a dose is administered. A standardized electronic transfer of information from radiopharmaceutical preparation to injection will further reduce the risk of misadministration. Value stream maps showing the flow of the patient dose information, as well as potential points of human error, were developed. Next, a future-state map was created that included proposed corrections for the most common critical sites of error. Transitioning the current process to the future state will require solutions that address these sites. To optimize the future-state process, a bar code system that links the on-site radiology management system with the nuclear pharmacy management system was proposed. A bar-coded wristband connects the patient directly to the electronic information systems. The bar code-enhanced process linking the patient dose with the electronic information reduces the number of crucial points for human error and provides a framework to ensure that the prepared dose reaches the correct patient. Although the proposed flowchart is designed for a site with an in-house central nuclear pharmacy, much of the framework could be applied by nuclear medicine facilities using unit doses. An electronic connection between information management systems to allow the tracking of a radiopharmaceutical from preparation to administration can be a useful tool in preventing the mistakes that are an unfortunate reality for any facility.
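
The core safeguard described above, matching the patient identity scanned from the wristband against the identity encoded on the dose label before administration, can be sketched as a simple check. The field names are hypothetical illustrations, not the actual system's schema:

```python
def verify_administration(patient_band, dose_label):
    """Sketch of the bar-code check: administration is allowed only when
    the wristband patient ID matches the dose label's patient ID and the
    dose has been verified by the pharmacy. Field names are illustrative."""
    return (patient_band["patient_id"] == dose_label["patient_id"]
            and dose_label["status"] == "verified")

band = {"patient_id": "P-1234"}
label = {"patient_id": "P-1234",
         "radiopharmaceutical": "Tc-99m MDP",
         "status": "verified"}

assert verify_administration(band, label)
assert not verify_administration({"patient_id": "P-9999"}, label)
```

The point of the design is that this comparison is performed by the information system at scan time, removing the manual identity check that the value stream maps flagged as a critical site of human error.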

  9. Selection of the important performance influencing factors for the assessment of human error under accident management situations in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, J. H.; Jung, W. J.

    1999-01-01

    This paper introduces the process and final results of selection of the important Performance Influencing Factors (PIFs) under emergency operation and accident management situations in nuclear power plants for use in the assessment of human errors. We collected two types of PIF taxonomies, one is the full set PIF list mainly developed for human error analysis, and the other is the PIFs for human reliability analysis (HRA) in probabilistic safety assessment (PSA). 5 PIF taxonomies among the full set PIF list and 10 PIF taxonomies among HRA methodologies (CREAM, SLIM, INTENT, were collected in this research. By reviewing and analyzing PIFs selected for HRA methodologies, the criterion could be established for the selection of appropriate PIFs under emergency operation and accident management situations. Based on this selection criteria, a new PIF taxonomy was proposed for the assessment of human error under emergency operation and accident management situations in nuclear power plants

  10. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
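
The abstract's central claim, that estimating the threshold from data raises the expected failure frequency above its nominal level, can be illustrated with a small Monte Carlo experiment on a normal (log-loss) model. The sample size, seed, and parameter values below are arbitrary choices for illustration:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
nominal = 0.01            # target (nominal) failure probability
n, reps = 20, 20000       # sample size per estimation, number of repetitions
mu, sigma = 0.0, 1.0      # true parameters, unknown to the decision maker
z = 2.3263478740408408    # standard normal 99% quantile

expected_failure = 0.0
for _ in range(reps):
    sample = rng.normal(mu, sigma, n)          # observed log-losses
    mu_hat, sigma_hat = sample.mean(), sample.std(ddof=1)
    threshold = mu_hat + z * sigma_hat         # estimated 99% quantile
    # true probability that a fresh loss exceeds the estimated threshold
    tail = 0.5 * (1 - erf((threshold - mu) / (sigma * sqrt(2))))
    expected_failure += tail / reps

# parameter uncertainty pushes the expected failure frequency above nominal
assert expected_failure > nominal
```

With n = 20 the expected frequency comes out noticeably above 1%, consistent with the article's point that the nominal level must be tightened (approach 1) or the fitted distribution modified (approach 2) to compensate.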

  11. On the Determinants of the Conjunction Fallacy: Probability versus Inductive Confirmation

    Science.gov (United States)

    Tentori, Katya; Crupi, Vincenzo; Russo, Selena

    2013-01-01

    Major recent interpretations of the conjunction fallacy postulate that people assess the probability of a conjunction according to (non-normative) averaging rules as applied to the constituents' probabilities or represent the conjunction fallacy as an effect of random error in the judgment process. In the present contribution, we contrast such…

  12. Design for Error Tolerance

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1983-01-01

    An important aspect of the optimal design of computer-based operator support systems is the sensitivity of such systems to operator errors. The author discusses how a system might allow for human variability with the use of reversibility and observability.

  13. The Importance of HRA in Human Space Flight: Understanding the Risks

    Science.gov (United States)

    Hamlin, Teri

    2010-01-01

    Human performance is critical to crew safety during space missions. Humans interact with hardware and software during ground processing, normal flight, and in response to events. Human interactions with hardware and software can cause Loss of Crew and/or Vehicle (LOCV) through improper actions, or may prevent LOCV through recovery and control actions. Humans have the ability to deal with complex situations and system interactions beyond the capability of machines. Human Reliability Analysis (HRA) is a method used to qualitatively and quantitatively assess the occurrence of human failures that affect availability and reliability of complex systems. Modeling human actions with their corresponding failure probabilities in a Probabilistic Risk Assessment (PRA) provides a more complete picture of system risks and risk contributions. A high-quality HRA can provide valuable information on potential areas for improvement, including training, procedures, human interfaces design, and the need for automation. Modeling human error has always been a challenge in part because performance data is not always readily available. For spaceflight, the challenge is amplified not only because of the small number of participants and limited amount of performance data available, but also due to the lack of definition of the unique factors influencing human performance in space. These factors, called performance shaping factors in HRA terminology, are used in HRA techniques to modify basic human error probabilities in order to capture the context of an analyzed task. Many of the human error modeling techniques were developed within the context of nuclear power plants and therefore the methodologies do not address spaceflight factors such as the effects of microgravity and longer duration missions. This presentation will describe the types of human error risks which have shown up as risk drivers in the Shuttle PRA which may be applicable to commercial space flight. 
As with other large PRAs
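
One common way performance shaping factors enter a quantification, loosely following the SPAR-H convention of multiplying a nominal HEP by PSF multipliers with a bounding adjustment, can be sketched as follows. The multiplier values are hypothetical, and the SPAR-H adjustment (which the method prescribes when several negative PSFs apply) is shown unconditionally here for simplicity:

```python
def adjusted_hep(nominal_hep, psf_multipliers):
    """SPAR-H-style sketch: a nominal human error probability scaled by
    performance shaping factor multipliers, with the adjustment formula
    that keeps the result below 1.0. Multiplier values are analyst inputs."""
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    return (nominal_hep * composite) / (nominal_hep * (composite - 1) + 1)

# Hypothetical example: a diagnosis task with nominal HEP 1e-2 under extreme
# time pressure (x10) and high stress (x2).
hep = adjusted_hep(1e-2, [10, 2])   # ~0.168 rather than the naive 0.2
```

For spaceflight, as the abstract notes, the open question is which multipliers and anchor values are appropriate, since factors like microgravity and mission duration have no analogue in the nuclear data behind the existing tables.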

  14. The SACADA database for human reliability and human performance

    International Nuclear Information System (INIS)

    James Chang, Y.; Bley, Dennis; Criscione, Lawrence; Kirwan, Barry; Mosleh, Ali; Madary, Todd; Nowell, Rodney; Richards, Robert; Roth, Emilie M.; Sieben, Scott; Zoulis, Antonios

    2014-01-01

    Lack of appropriate and sufficient human performance data has been identified as a key factor affecting human reliability analysis (HRA) quality especially in the estimation of human error probability (HEP). The Scenario Authoring, Characterization, and Debriefing Application (SACADA) database was developed by the U.S. Nuclear Regulatory Commission (NRC) to address this data need. An agreement between NRC and the South Texas Project Nuclear Operating Company (STPNOC) was established to support the SACADA development with aims to make the SACADA tool suitable for implementation in the nuclear power plants' operator training program to collect operator performance information. The collected data would support the STPNOC's operator training program and be shared with the NRC for improving HRA quality. This paper discusses the SACADA data taxonomy, the theoretical foundation, the prospective data to be generated from the SACADA raw data to inform human reliability and human performance, and the considerations on the use of simulator data for HRA. Each SACADA data point consists of two information segments: context and performance results. Context is a characterization of the performance challenges to task success. The performance results are the results of performing the task. The data taxonomy uses a macrocognitive functions model for the framework. At a high level, information is classified according to the macrocognitive functions of detecting the plant abnormality, understanding the abnormality, deciding the response plan, executing the response plan, and team related aspects (i.e., communication, teamwork, and supervision). The data are expected to be useful for analyzing the relations between context, error modes and error causes in human performance
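
A single SACADA-style data point, split into the context and performance-results segments organized by macrocognitive function as described above, might be represented like this. The field names and values are illustrative only, not the actual SACADA schema:

```python
# Illustrative structure of one simulator-derived data point: a context
# segment (the performance challenges) and a performance-results segment
# keyed by macrocognitive function, plus team-related aspects.
data_point = {
    "context": {
        "scenario": "loss of feedwater",
        "task_challenges": ["masked alarm", "time pressure"],
    },
    "performance": {
        "detecting": "late",
        "understanding": "correct",
        "deciding": "correct",
        "executing": "correct",
        "team": {"communication": "adequate", "supervision": "adequate"},
    },
}
```

Aggregating many such points lets analysts relate context factors to error modes, which is the stated purpose of the taxonomy.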

  15. On Bit Error Probability and Power Optimization in Multihop Millimeter Wave Relay Systems

    KAUST Repository

    Chelli, Ali

    2018-01-15

    5G networks are expected to provide gigabit data rates to users via millimeter-wave (mmWave) communication technology. One of the major problems faced by mmWaves is that they cannot penetrate buildings. In this paper, we utilize multihop relaying to overcome the signal blockage problem in the mmWave band. The multihop relay network comprises a source device, several relay devices, and a destination device, and uses device-to-device communication. Relay devices redirect the source signal to avoid the obstacles existing in the propagation environment. Each device amplifies and forwards the signal to the next device, such that a multihop link ensures the connectivity between the source device and the destination device. We consider that the relay devices and the destination device are affected by external interference, and we investigate the bit error probability (BEP) of this multihop mmWave system. Note that the study of the BEP allows quantifying the quality of communication and identifying the impact of different parameters on the system reliability. In this way, the system parameters, such as the powers allocated to different devices, can be tuned to maximize the link reliability. We derive exact expressions for the BEP of M-ary quadrature amplitude modulation (M-QAM) and M-ary phase-shift keying (M-PSK) in terms of the multivariate Meijer G-function. Since the exact BEP expression is complicated, a tight lower-bound expression for the BEP is derived using a novel Mellin approach. Moreover, an asymptotic expression for the BEP in the high-SIR regime is derived and used to determine the diversity and the coding gain of the system. Additionally, we optimize the power allocation at the different devices subject to a sum power constraint such that the BEP is minimized. Our analysis reveals that optimal power allocation achieves more than a 3 dB gain compared to equal power allocation. This research work can serve as a framework for designing and optimizing mmWave multihop
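
The exact analysis relies on multivariate Meijer G-functions, but the qualitative behavior of a multihop amplify-and-forward link is easy to check with a Monte Carlo estimate. The sketch below uses BPSK over two Rayleigh-fading hops with variable-gain relaying and no interference, which is a deliberate simplification of the system treated in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def af_ber(snr_db, n_sym=200_000):
    """Monte Carlo BEP of BPSK over a two-hop amplify-and-forward
    Rayleigh link. Simplified sketch: equal per-hop SNR, no interference."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_sym)
    x = 2.0 * bits - 1.0
    # unit-power Rayleigh channels and complex Gaussian noise per hop
    h1 = (rng.normal(size=n_sym) + 1j * rng.normal(size=n_sym)) / np.sqrt(2)
    h2 = (rng.normal(size=n_sym) + 1j * rng.normal(size=n_sym)) / np.sqrt(2)
    n1 = (rng.normal(size=n_sym) + 1j * rng.normal(size=n_sym)) / np.sqrt(2 * snr)
    n2 = (rng.normal(size=n_sym) + 1j * rng.normal(size=n_sym)) / np.sqrt(2 * snr)
    r1 = h1 * x + n1                 # received at the relay
    g = 1.0 / np.abs(h1)             # variable-gain amplification
    r2 = h2 * g * r1 + n2            # received at the destination
    # coherent detection with known channel phases
    decision = np.real(r2 * np.conj(h2 * g * h1)) > 0
    return np.mean(decision != (bits == 1))

# the BEP falls as the per-hop SNR grows
assert af_ber(20) < af_ber(5)
```

Such a simulator is also a cheap way to sanity-check closed-form BEP bounds like the Mellin-based one before using them for power allocation.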

  16. Optimizer convergence and local minima errors and their clinical importance

    International Nuclear Information System (INIS)

    Jeraj, Robert; Wu, Chuan; Mackie, Thomas R

    2003-01-01

    Two of the errors common in the inverse treatment planning optimization have been investigated. The first error is the optimizer convergence error, which appears because of non-perfect convergence to the global or local solution, usually caused by a non-zero stopping criterion. The second error is the local minima error, which occurs when the objective function is not convex and/or the feasible solution space is not convex. The magnitude of the errors, their relative importance in comparison to other errors as well as their clinical significance in terms of tumour control probability (TCP) and normal tissue complication probability (NTCP) were investigated. Two inherently different optimizers, a stochastic simulated annealing and deterministic gradient method were compared on a clinical example. It was found that for typical optimization the optimizer convergence errors are rather small, especially compared to other convergence errors, e.g., convergence errors due to inaccuracy of the current dose calculation algorithms. This indicates that stopping criteria could often be relaxed leading into optimization speed-ups. The local minima errors were also found to be relatively small and typically in the range of the dose calculation convergence errors. Even for the cases where significantly higher objective function scores were obtained the local minima errors were not significantly higher. Clinical evaluation of the optimizer convergence error showed good correlation between the convergence of the clinical TCP or NTCP measures and convergence of the physical dose distribution. On the other hand, the local minima errors resulted in significantly different TCP or NTCP values (up to a factor of 2) indicating clinical importance of the local minima produced by physical optimization
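
Both error types are easy to reproduce on a toy one-dimensional objective: a loose stopping criterion yields an optimizer convergence error, and a non-convex objective yields a local minima error. The function and tolerances below are purely illustrative, not a dose-optimization objective:

```python
def gradient_descent(grad, x0, tol, lr=0.01, max_iter=200_000):
    """Plain 1-D gradient descent with a gradient-norm stopping criterion."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:        # non-zero stopping criterion
            break
        x -= lr * g
    return x

# Non-convex toy objective: a tilted double well. The 0.3x term makes the
# well near x = -1 the global minimum and the well near x = +1 a local one.
f = lambda x: (x**2 - 1) ** 2 + 0.3 * x
grad = lambda x: 4 * x * (x**2 - 1) + 0.3

x_global = gradient_descent(grad, x0=-1.5, tol=1e-8)
x_local = gradient_descent(grad, x0=+1.5, tol=1e-8)
# local minima error: both runs "converge", but to different objective values
assert f(x_local) > f(x_global)

# optimizer convergence error: a loose tolerance stops short of the optimum
x_loose = gradient_descent(grad, x0=-1.5, tol=1e-1)
assert f(x_loose) > f(x_global)
```

The paper's observation corresponds to the second assertion being a small effect (tolerances can often be relaxed) while the first can translate into clinically significant TCP/NTCP differences.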

  17. Real-time minimal-bit-error probability decoding of convolutional codes

    Science.gov (United States)

    Lee, L.-N.

    1974-01-01

    A recursive procedure is derived for decoding of rate R = 1/n binary convolutional codes which minimizes the probability of the individual decoding decisions for each information bit, subject to the constraint that the decoding delay be limited to Delta branches. This new decoding algorithm is similar to, but somewhat more complex than, the Viterbi decoding algorithm. A real-time, i.e., fixed decoding delay, version of the Viterbi algorithm is also developed and used for comparison to the new algorithm on simulated channels. It is shown that the new algorithm offers advantages over Viterbi decoding in soft-decision applications, such as in the inner coding system for concatenated coding.
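
For contrast with the symbol-wise minimal-error-probability rule, a hard-decision Viterbi decoder for the classic rate-1/2, constraint-length-3 code (generators 7 and 5 octal, chosen here for illustration; the paper treats general rate-1/n codes) can be sketched as:

```python
G = [0b111, 0b101]   # generator polynomials (7, 5 octal), constraint length 3

def encode(bits):
    """Rate-1/2 convolutional encoder; state = two most recent input bits."""
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state
        out += [bin(reg & g).count("1") & 1 for g in G]
        state = reg >> 1
    return out

def viterbi(received):
    """Hard-decision Viterbi decoding: the maximum-likelihood *sequence*,
    as opposed to the per-bit minimal-error-probability decisions."""
    INF = float("inf")
    metric = [0] + [INF] * 3               # start in state 0
    paths = [[] for _ in range(4)]
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metric, new_paths = [INF] * 4, [[]] * 4
        for s in range(4):
            if metric[s] == INF:
                continue
            for b in (0, 1):
                reg = (b << 2) | s
                out = [bin(reg & g).count("1") & 1 for g in G]
                m = metric[s] + sum(x != y for x, y in zip(out, r))
                if m < new_metric[reg >> 1]:    # keep the survivor path
                    new_metric[reg >> 1] = m
                    new_paths[reg >> 1] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    return paths[min(range(4), key=lambda s: metric[s])]

msg = [1, 0, 1, 1]
code = encode(msg)                      # -> [1, 1, 1, 0, 0, 0, 0, 1]
assert viterbi(code) == msg
corrupted = [1 - code[0]] + code[1:]    # flip one channel bit
assert viterbi(corrupted) == msg        # the single error is corrected
```

The minimal-BEP algorithm replaces the survivor-path maximization with per-bit posterior sums over all paths within the decoding delay, which is why it is somewhat more complex and pays off mainly with soft decisions.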

  18. Real-time minimal bit error probability decoding of convolutional codes

    Science.gov (United States)

    Lee, L. N.

    1973-01-01

    A recursive procedure is derived for decoding of rate R=1/n binary convolutional codes which minimizes the probability of the individual decoding decisions for each information bit subject to the constraint that the decoding delay be limited to Delta branches. This new decoding algorithm is similar to, but somewhat more complex than, the Viterbi decoding algorithm. A real-time, i.e. fixed decoding delay, version of the Viterbi algorithm is also developed and used for comparison to the new algorithm on simulated channels. It is shown that the new algorithm offers advantages over Viterbi decoding in soft-decision applications such as in the inner coding system for concatenated coding.

  19. ATHEANA: "A Technique for Human Error Analysis" entering the implementation phase

    International Nuclear Information System (INIS)

    Taylor, J.; O'Hara, J.; Luckas, W.

    1997-01-01

    Probabilistic Risk Assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has long-recognized limitations in the analysis of human actions that constrain the use of PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification. The purpose of the Brookhaven National Laboratory (BNL) project, entitled 'Improved HRA Method Based on Operating Experience', is to develop a new method for HRA which is supported by the analysis of risk-significant operating experience. This approach will allow a more realistic assessment and representation of the human contribution to plant risk, and thereby increase the utility of PRA. The project's completed, ongoing, and future efforts fall into four phases: (1) Assessment phase (FY 92/93); (2) Analysis and Characterization phase (FY 93/94); (3) Development phase (FY 95/96); and (4) Implementation phase (FY 96/97 ongoing)

  20. Validation of Metrics as Error Predictors

    Science.gov (United States)

    Mendling, Jan

    In this chapter, we test the validity of metrics that were defined in the previous chapter for predicting errors in EPC business process models. In Section 5.1, we provide an overview of how the analysis data is generated. Section 5.2 describes the sample of EPCs from practice that we use for the analysis. Here we discuss a disaggregation by the EPC model group and by error as well as a correlation analysis between metrics and error. Based on this sample, we calculate a logistic regression model for predicting error probability with the metrics as input variables in Section 5.3. In Section 5.4, we then test the regression function for an independent sample of EPC models from textbooks as a cross-validation. Section 5.5 summarizes the findings.
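
The regression step in Section 5.3 can be sketched on synthetic data: fit a logistic model that maps a single complexity metric to error probability. The data-generating slope and the metric are invented for illustration; the chapter's actual metrics and EPC sample are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 300

# Hypothetical stand-in for the EPC sample: one complexity metric per model,
# where larger models are assumed more likely to contain an error.
size = rng.uniform(10, 100, n)
p_true = 1 / (1 + np.exp(-(0.08 * size - 4)))       # assumed true relation
has_error = (rng.random(n) < p_true).astype(float)

# Logistic regression fitted by plain gradient descent (no libraries)
z = (size - size.mean()) / size.std()               # standardize the metric
X = np.column_stack([np.ones(n), z])
w = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.5 * X.T @ (p - has_error) / n

# positive fitted coefficient: larger metric, higher error probability
assert w[1] > 0
```

Cross-validating such a model on an independent sample, as Section 5.4 does with textbook EPCs, guards against the coefficients merely fitting noise in the original sample.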

  1. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  2. Fast Outage Probability Simulation for FSO Links with a Generalized Pointing Error Model

    KAUST Repository

    Ben Issaid, Chaouki; Park, Kihong; Alouini, Mohamed-Slim; Tempone, Raul

    2017-01-01

    Over the past few years, free-space optical (FSO) communication has gained significant attention. In fact, FSO can provide cost-effective and unlicensed links, with high-bandwidth capacity and low error rate, making it an exciting alternative

  3. Analysis of Task Types and Error Types of the Human Actions Involved in the Human-related Unplanned Reactor Trip Events

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun; Jung, Won Dea

    2008-02-01

    This report provides the task types and error types involved in the unplanned reactor trip events that have occurred during 1986 - 2006. The events that were caused by the secondary system of the nuclear power plants amount to 67 %, and the remaining 33 % was by the primary system. The contribution of the activities of the plant personnel was identified as the following order: corrective maintenance (25.7 %), planned maintenance (22.8 %), planned operation (19.8 %), periodic preventive maintenance (14.9 %), response to a transient (9.9 %), and design/manufacturing/installation (9.9%). According to the analysis of error modes, the error modes such as control failure (22.2 %), wrong object (18.5 %), omission (14.8 %), wrong action (11.1 %), and inadequate (8.3 %) take up about 75 % of all the unplanned trip events. The analysis of the cognitive functions involved showed that the planning function makes the highest contribution to the human actions leading to unplanned reactor trips, and it is followed by the observation function (23.4%), the execution function (17.8 %), and the interpretation function (10.3 %). The results of this report are to be used as important bases for development of the error reduction measures or development of the error mode prediction system for the test and maintenance tasks in nuclear power plants

  4. Analysis of Task Types and Error Types of the Human Actions Involved in the Human-related Unplanned Reactor Trip Events

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Park, Jin Kyun; Jung, Won Dea

    2008-02-15

    This report provides the task types and error types involved in the unplanned reactor trip events that have occurred during 1986 - 2006. The events that were caused by the secondary system of the nuclear power plants amount to 67 %, and the remaining 33 % was by the primary system. The contribution of the activities of the plant personnel was identified as the following order: corrective maintenance (25.7 %), planned maintenance (22.8 %), planned operation (19.8 %), periodic preventive maintenance (14.9 %), response to a transient (9.9 %), and design/manufacturing/installation (9.9%). According to the analysis of error modes, the error modes such as control failure (22.2 %), wrong object (18.5 %), omission (14.8 %), wrong action (11.1 %), and inadequate (8.3 %) take up about 75 % of all the unplanned trip events. The analysis of the cognitive functions involved showed that the planning function makes the highest contribution to the human actions leading to unplanned reactor trips, and it is followed by the observation function (23.4%), the execution function (17.8 %), and the interpretation function (10.3 %). The results of this report are to be used as important bases for development of the error reduction measures or development of the error mode prediction system for the test and maintenance tasks in nuclear power plants.

  5. A strategy to the development of a human error analysis method for accident management in nuclear power plants using industrial accident dynamics

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kim, Jae Whan; Jung, Won Dae; Ha, Jae Ju

    1998-06-01

    This technical report describes early progress in establishing a human error analysis method as part of a human reliability analysis (HRA) method for assessing the human error potential in a given accident management strategy. First, we review the shortcomings and limitations of the existing HRA methods through an example application. To counter the bias toward the quantitative aspect of the HRA method, we focused on the qualitative aspect, i.e., human error analysis (HEA), when proposing a strategy for the new method. For the establishment of a new HEA method, we discuss the basic theories and approaches to human error in industry, and propose three basic requirements that should be maintained as prerequisites for an HEA method in practice. Finally, we test IAD (Industrial Accident Dynamics), which has been widely utilized in industrial fields, to determine whether IAD can be readily modified and extended to nuclear power plant applications. We apply IAD to the same example case and develop a new taxonomy of the performance shaping factors in accident management and their influence matrix, which could enhance the IAD method as an HEA method. (author). 33 refs., 17 tabs., 20 figs

  6. On misclassification probabilities of linear and quadratic classifiers ...

    African Journals Online (AJOL)

    We study the theoretical misclassification probability of linear and quadratic classifiers and examine the performance of these classifiers under distributional variations, in theory and using simulation. We derive expressions for the Bayes errors for some competing distributions from the same family under location shift. Keywords: ...
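
    For the equal-covariance Gaussian case under a pure location shift, the Bayes error has a simple closed form, Phi(-delta/(2*sigma)) for equal priors. The sketch below illustrates that classical univariate result, not the paper's own derivations:

```python
import math

def bayes_error_location_shift(delta: float, sigma: float = 1.0) -> float:
    """Bayes error for two equal-prior classes N(0, sigma^2) and N(delta, sigma^2).
    The optimal rule (linear in this case) thresholds at delta/2, so the
    misclassification probability is Phi(-|delta| / (2 * sigma))."""
    z = -abs(delta) / (2.0 * sigma)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Larger location shifts drive the misclassification probability toward zero.
print(bayes_error_location_shift(0.0))            # 0.5: indistinguishable classes
print(round(bayes_error_location_shift(2.0), 4))  # 0.1587, i.e. Phi(-1)
```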

  7. Phonological errors predominate in Arabic spelling across grades 1-9.

    Science.gov (United States)

    Abu-Rabia, Salim; Taha, Haitham

    2006-03-01

    Most spelling error analysis has been conducted in Latin orthographies and rarely in other orthographies such as Arabic. Two hundred and eighty-eight students in grades 1-9 participated in the study. They were presented with nine lists of words to test their spelling skills, and their spelling errors were analyzed by error category. The most frequent errors were phonological. The results did not indicate any significant differences in the percentages of phonological errors across grades one to nine. Thus, phonology probably presents the greatest challenge to students developing spelling skills in Arabic.

  8. Results of a nuclear power plant Application of a new technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Forester, J.A.; Whitehead, D.W.; Kolaczkowski, A.M.; Thompson, C.M.

    1997-01-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method, referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing contexts (EFCs) that can make unsafe actions (UAs) more likely, and identify ways to improve the method and its documentation. A set of criteria to evaluate the "success" of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed, consisting of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant: two individuals from the plant's PRA staff and two from its training staff. Both individuals from training are currently licensed operators, and one of them was a senior reactor operator "on shift" until a few months before the demonstration. The demonstration was conducted over a 5-month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs, are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project

  9. Effects of preparation time and trial type probability on performance of anti- and pro-saccades.

    Science.gov (United States)

    Pierce, Jordan E; McDowell, Jennifer E

    2016-02-01

    Cognitive control optimizes responses to relevant task conditions by balancing bottom-up stimulus processing with top-down goal pursuit. It can be investigated using the ocular motor system by contrasting basic prosaccades (look toward a stimulus) with complex antisaccades (look away from a stimulus). Furthermore, the amount of time allotted between trials, the need to switch task sets, and the time allowed to prepare for an upcoming saccade all impact performance. In this study the relative probabilities of anti- and pro-saccades were manipulated across five blocks of interleaved trials, while the inter-trial interval and trial type cue duration were varied across subjects. Results indicated that inter-trial interval had no significant effect on error rates or reaction times (RTs), while a shorter trial type cue led to more antisaccade errors and faster overall RTs. Responses following a shorter cue duration also showed a stronger effect of trial type probability, with more antisaccade errors in blocks with a low antisaccade probability and slower RTs for each saccade task when its trial type was unlikely. A longer cue duration yielded fewer errors and slower RTs, with a larger switch cost for errors compared to a short cue duration. Findings demonstrated that when the trial type cue duration was shorter, visual motor responsiveness was faster and subjects relied upon the implicit trial probability context to improve performance. When the cue duration was longer, increased fixation-related activity may have delayed saccade motor preparation and slowed responses, guiding subjects to respond in a controlled manner regardless of trial type probability. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
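
    The Monte Carlo principle the book describes can be sketched for a toy measurement model. The model (a resistance inferred from voltage and current readings), the values, and the uncertainties below are invented for illustration:

```python
import random
import statistics

def monte_carlo_uncertainty(n: int = 100_000, seed: int = 1):
    """Propagate input uncertainties through a measurement model by simulation.
    Hypothetical model: resistance R = V / I, where V and I are each measured
    with independent Gaussian standard uncertainties."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        v = rng.gauss(10.0, 0.05)  # volts, standard uncertainty 0.05 V
        i = rng.gauss(2.0, 0.02)   # amperes, standard uncertainty 0.02 A
        samples.append(v / i)
    samples.sort()
    mean = statistics.fmean(samples)
    # A 95 % coverage interval from the 2.5th and 97.5th percentiles.
    lo, hi = samples[int(0.025 * n)], samples[int(0.975 * n)]
    return mean, (lo, hi)

mean, (lo, hi) = monte_carlo_uncertainty()
print(f"R = {mean:.3f} ohm, 95% interval [{lo:.3f}, {hi:.3f}]")
```

    The percentile interval is one way to realize a "95 percent interval of measurement uncertainty" without assuming the output distribution is Gaussian.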

  11. An Extended Quadratic Frobenius Primality Test with Average Case Error Estimates

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Frandsen, Gudmund Skovbjerg

    2001-01-01

    We present an Extended Quadratic Frobenius Primality Test (EQFT), which is related to and extends the Miller-Rabin test and the Quadratic Frobenius test (QFT) by Grantham. EQFT takes time about equivalent to 2 Miller-Rabin tests, but has a much smaller error probability, namely 256/331776^t for t... for the error probability of this algorithm as well as a general closed expression bounding the error. For instance, it is at most 2^-143 for k = 500, t = 2. Compared to earlier similar results for the Miller-Rabin test, the results indicate that our test in the average case has the effect of 9 Miller-Rabin tests, while only taking time equivalent to about 2 such tests. We also give bounds for the error in case a prime is sought by incremental search from a random starting point.
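
    The Miller-Rabin test that EQFT is benchmarked against can be sketched as follows. This is a generic reference implementation, not the EQFT algorithm itself; the well-known worst-case bound is that a composite survives t random rounds with probability at most 4^-t:

```python
import random

def miller_rabin(n: int, t: int = 20, seed: int = 0) -> bool:
    """Probabilistic primality test: a composite n passes all t rounds with
    probability at most 4**-t (worst case; the average-case error over
    random candidates is far smaller, which is what the abstract compares)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):  # trial division by small primes
        if n % p == 0:
            return n == p
    # Write n - 1 = 2^s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    rng = random.Random(seed)
    for _ in range(t):
        a = rng.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness: n is definitely composite
    return True  # probably prime

print(miller_rabin(2**61 - 1))  # True (a Mersenne prime)
print(miller_rabin(561))        # False (a Carmichael number)
```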

  12. Human Reliability and the Current Dilemma in Human-Machine Interface Design Strategies

    International Nuclear Information System (INIS)

    Passalacqua, Roberto; Yamada, Fumiaki

    2002-01-01

    Since human error dominates the probability of failure of still-existing human-requiring systems (such as the Monju reactor), the human-machine interface needs to be improved. Several rationales may lead to the conclusion that 'humans' should limit themselves to monitoring the 'machine'. For example, this is the trend in the aviation industry: the newest aircraft are designed to be able to return to a safe state through control systems that do not need human intervention. Thus the dilemma of whether we really need operators (for example, in the nuclear industry) might arise. However, socio-technical approaches in recent human error analyses are pointing out so-called 'organizational errors' and the importance of human-machine interface harmonization. Typically, a plant's operators are a 'redundant' safety system with a much lower reliability than the machine: organizational factors and harmonization requirements suggest designing the human-machine interface in a way that allows improvement of operator reliability. In addition, taxonomy studies of accident databases have also shown that operator training should promote processes of decision-making. This is accomplished in the latest trends of PSA technology by introducing the concept of a 'Safety Monitor', a computer-based tool that uses a level 1 PSA model of the plant. Operators and maintenance schedulers of the Monju FBR will be able to perform real-time estimations of the plant risk level. The main benefits are risk awareness and improvements in decision-making by operators. Scheduled maintenance can also be approached in a more rational (safe and economic) way. (authors)

  13. Human reliability analysis for In-Tank Precipitation alignment and startup of emergency purge ventilation equipment

    International Nuclear Information System (INIS)

    Olsen, L.M.

    1993-08-01

    This report documents the methodology used for calculating the human error probability for establishing air-based ventilation using emergency purge ventilation equipment on In-Tank Precipitation (ITP) processing tanks 48 and 49 after a failure of the nitrogen purge system following a seismic event. The analyses were performed according to THERP (Technique for Human Error Rate Prediction). The calculated human error probabilities are provided as input to the fault tree analysis for the ITP nitrogen purge system. The analysis assumes a seismic-event initiator leading to establishing air-based ventilation on ITP processing tanks 48 and 49. At the time of this analysis only the tanks and the emergency purge ventilation equipment were seismically qualified. Consequently, onsite and offsite power is assumed to be unavailable, and all operator control actions are performed locally on the tank top. Assumptions regarding procedures, staffing, equipment locations, equipment tagging, equipment availability, and training were made and are documented in this report. The human error probability for establishing air-based ventilation using the emergency purge ventilation equipment on ITP processing tanks 48 and 49 after a failure of the nitrogen purge system following a seismic event is 4.2E-6 (median value on the lognormal scale). It is important to note that this result is predicated on the implementation of all of the assumptions listed in the 'Assumptions' section of this report. This analysis was not based on the current conditions in ITP; it is to be used as a tool to aid ITP operations personnel in achieving the training, procedural, and operational goals outlined in this document
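
    The report's actual THERP event tree is not reproduced in the abstract. As a loose illustration of the style of arithmetic involved, per-subtask human error probabilities can be combined with recovery credits; all numbers below are hypothetical, and real THERP trees also model dependence between successive tasks:

```python
def combine_heps(subtask_heps, recovery_probs):
    """Illustrative THERP-style combination (not the report's actual tree):
    each subtask error may be caught by a recovery action, so the unrecovered
    error probability per subtask is hep * (1 - p_recovery).  Overall failure
    is one minus the probability of no unrecovered error in any subtask,
    treating subtasks as independent."""
    p_success = 1.0
    for hep, p_rec in zip(subtask_heps, recovery_probs):
        p_success *= 1.0 - hep * (1.0 - p_rec)
    return 1.0 - p_success

# Hypothetical numbers: three local valve/breaker actions, with a checker
# providing recovery credit on each.
heps = [1e-3, 3e-3, 1e-3]
recoveries = [0.9, 0.5, 0.9]
print(f"{combine_heps(heps, recoveries):.2e}")  # 1.70e-03
```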

  14. Human dorsal striatum encodes prediction errors during observational learning of instrumental actions.

    Science.gov (United States)

    Cooper, Jeffrey C; Dunne, Simon; Furey, Teresa; O'Doherty, John P

    2012-01-01

    The dorsal striatum plays a key role in the learning and expression of instrumental reward associations that are acquired through direct experience. However, not all learning about instrumental actions requires direct experience: humans and other animals are also capable of acquiring instrumental actions by observing the experiences of others. In this study, we investigated the extent to which the human dorsal striatum is involved in observational as well as experiential instrumental reward learning. Human participants were scanned with fMRI while they observed a confederate over live video performing an instrumental conditioning task to obtain liquid juice rewards. Participants also performed a similar instrumental task for their own rewards. Using a computational model-based analysis, we found reward prediction errors in the dorsal striatum not only during the experiential learning condition but also during observational learning. These results suggest a key role for the dorsal striatum in learning instrumental associations, even when those associations are acquired purely by observing others.
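
    The model-based analysis rests on a reinforcement-learning prediction error. A minimal Rescorla-Wagner-style sketch (the learning rate and reward sequence below are invented for illustration; the study's actual model and fitted parameters may differ):

```python
def rescorla_wagner(rewards, alpha=0.1):
    """Minimal sketch of the prediction-error signal used in model-based fMRI
    analyses: the value estimate V is updated by delta = r - V after each
    trial, and the per-trial deltas are what get regressed against the
    striatal BOLD signal."""
    v = 0.0
    deltas = []
    for r in rewards:
        delta = r - v       # reward prediction error
        v += alpha * delta  # value update
        deltas.append(delta)
    return v, deltas

# The same update applies whether the outcome is experienced or observed.
v, deltas = rescorla_wagner([1, 1, 0, 1, 1, 1, 0, 1])
print(round(v, 4), [round(d, 2) for d in deltas[:3]])
```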

  15. On the International Agency for Research on Cancer classification of glyphosate as a probable human carcinogen.

    Science.gov (United States)

    Tarone, Robert E

    2018-01-01

    The recent classification by the International Agency for Research on Cancer (IARC) of the herbicide glyphosate as a probable human carcinogen has generated considerable discussion. The classification is at variance with evaluations of the carcinogenic potential of glyphosate by several national and international regulatory bodies. The basis for the IARC classification is examined under the assumptions that the IARC criteria are reasonable and that the body of scientific studies determined by IARC staff to be relevant to the evaluation of glyphosate by the Monograph Working Group is sufficiently complete. It is shown that the classification of glyphosate as a probable human carcinogen was the result of a flawed and incomplete summary of the experimental evidence evaluated by the Working Group. Rational and effective cancer prevention activities depend on scientifically sound and unbiased assessments of the carcinogenic potential of suspected agents. Implications of the erroneous classification of glyphosate with respect to the IARC Monograph Working Group deliberative process are discussed.

  16. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function yields the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments are also obtained analytically.
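
    A one-parameter generalized logarithm of this kind, and its inverse, can be sketched as follows. The q-parameterization below is an assumption for illustration (it is the common Tsallis-style form); the paper uses its own notation:

```python
import math

def gen_log(x: float, q: float) -> float:
    """One-parameter generalized logarithm, obtained by integrating x**(-q):
    ln_q(x) = (x**(1-q) - 1) / (1-q), which recovers ln(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def gen_exp(y: float, q: float) -> float:
    """Inverse of gen_log: exp_q(y) = (1 + (1-q)*y) ** (1/(1-q)),
    recovering exp(y) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(y)
    return (1.0 + (1.0 - q) * y) ** (1.0 / (1.0 - q))

# The pair inverts exactly, and q -> 1 recovers the ordinary log/exp.
assert abs(gen_exp(gen_log(2.5, 0.7), 0.7) - 2.5) < 1e-9
assert abs(gen_log(2.5, 1.0) - math.log(2.5)) < 1e-12
```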

  17. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  18. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.
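
    For absolutely continuous sources the flavor of these quantization-error asymptotics is easy to see numerically: a uniform midpoint quantizer with n levels on [0, 1) has mean squared error 1/(12 n^2). The sketch below illustrates that classical uniform case, not the book's results for singular measures:

```python
def quantize(x: float, n: int) -> float:
    """Midpoint uniform quantizer with n levels on [0, 1)."""
    cell = min(int(x * n), n - 1)
    return (cell + 0.5) / n

def mse_uniform(n: int, steps: int = 200_000) -> float:
    """Numerically integrate the squared quantization error for a uniform
    source on [0, 1) via midpoint-rule quadrature; the classical value for
    this case is 1 / (12 * n**2)."""
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) / steps
        total += (x - quantize(x, n)) ** 2
    return total / steps

# The error falls off like n**-2 as the number of levels grows.
for n in (4, 16, 64):
    print(n, mse_uniform(n), 1.0 / (12 * n * n))
```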

  19. The action characterization matrix: A link between HERA (Human Events Reference for ATHEANA) and ATHEANA (a technique for human error analysis)

    International Nuclear Information System (INIS)

    Hahn, H.A.

    1997-01-01

    The Technique for Human Error Analysis (ATHEANA) is a newly developed human reliability analysis (HRA) methodology that aims to facilitate better representation and integration of human performance into probabilistic risk assessment (PRA) modeling and quantification by analyzing risk-significant operating experience in the context of existing behavior science models. The fundamental premise of ATHEANA is that error-forcing contexts (EFCs), which refer to combinations of equipment/material conditions and performance shaping factors (PSFs), set up or create the conditions under which unsafe actions (UAs) can occur. ATHEANA is being developed in the context of nuclear power plant (NPP) PRAs, and much of the language used to describe the method and provide examples of its application are specific to that industry. Because ATHEANA relies heavily on the analysis of operational events that have already occurred as a mechanism for generating creative thinking about possible EFCs, a database, called the Human Events Reference for ATHEANA (HERA), has been developed to support the methodology. Los Alamos National Laboratory's (LANL) Human Factors Group has recently joined the ATHEANA project team; LANL is responsible for further developing the database structure and for analyzing additional exemplar operational events for entry into the database. The Action Characterization Matrix (ACM) is conceived as a bridge between the HERA database structure and ATHEANA. Specifically, the ACM allows each unsafe action or human failure event to be characterized according to its representation along each of six different dimensions: system status, initiator status, unsafe action mechanism, information processing stage, equipment/material conditions, and performance shaping factors. This report describes the development of the ACM and provides details on the structure and content of its dimensions

  20. A New Error Analysis and Accuracy Synthesis Method for Shoe Last Machine

    Directory of Open Access Journals (Sweden)

    Bian Xiangjuan

    2014-05-01

    In order to improve the manufacturing precision of the shoe last machine, a new error-computing model is put forward. First, based on the special topological structure of the shoe last machine and multi-rigid-body system theory, a spatial error-calculating model of the system was built. Then, the law of error distribution over the whole workspace was discussed, and the maximum-error position of the system was found. Finally, the sensitivities of the error parameters were analyzed at the maximum-error position, and the accuracy synthesis was conducted using the Monte Carlo method. Taking the error sensitivity analysis into account, the accuracy of the main parts was allocated. Results show that the probability of the maximal volume error being less than 0.05 mm improved from 0.6592 under the old scheme to 0.7021 under the new scheme, so the precision of the system was improved markedly. The model can be used for the error analysis and accuracy synthesis of complex multi-embranchment motion chain systems, and to improve the manufacturing precision of such systems.
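
    The Monte Carlo accuracy-synthesis step can be sketched as follows. The sensitivities and tolerance distributions below are invented stand-ins for the paper's multi-rigid-body error model, which is not given in the abstract:

```python
import random

def prob_within_tolerance(n: int = 200_000, tol: float = 0.05,
                          seed: int = 42) -> float:
    """Monte Carlo accuracy-synthesis sketch: sample each error source from
    its tolerance distribution, push the samples through a (hypothetical,
    linearized) error model, and estimate P(|total error| < tol)."""
    rng = random.Random(seed)
    # Hypothetical sensitivities: mm of tool-point error per unit source error.
    sensitivities = [0.8, 0.5, 0.3]
    # Hypothetical standard deviations of the three error sources (mm).
    sigmas = [0.02, 0.03, 0.04]
    hits = 0
    for _ in range(n):
        total = sum(s * rng.gauss(0.0, sig)
                    for s, sig in zip(sensitivities, sigmas))
        if abs(total) < tol:
            hits += 1
    return hits / n

print(f"P(|error| < 0.05 mm) ~= {prob_within_tolerance():.3f}")
```

    Tightening the tolerance on the source with the largest sensitivity-weighted variance raises this probability the most, which is the logic behind the sensitivity-driven accuracy allocation the abstract describes.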