WorldWideScience

Sample records for reduce human error

  1. Applying lessons learned to enhance human performance and reduce human error for ISS operations

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W.R.

    1998-09-01

    A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) has developed a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper describes previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS.

  2. Human factors interventions to reduce human errors and improve productivity in maintenance tasks

    International Nuclear Information System (INIS)

    Isoda, Hachiro; Yasutake, J.Y.

    1992-01-01

    This paper describes work in progress to develop interventions to reduce human errors and increase maintenance productivity in nuclear power plants. The effort is part of a two-phased Human Factors research program being conducted jointly by the Central Research Institute of Electric Power Industry (CRIEPI) in Japan and the Electric Power Research Institute (EPRI) in the United States. The overall objective of this joint research program is to identify critical maintenance tasks and to develop, implement and evaluate interventions which have high potential for reducing human errors or increasing maintenance productivity. As a result of the Phase 1 effort, ten critical maintenance tasks were identified. For these tasks, over 25 candidate interventions were identified for potential development. After careful analysis, seven interventions were selected for development during Phase 2. This paper describes the methodology used to analyze and identify the most critical tasks, the process of identifying and developing selected interventions and some of the initial results. (author)

  3. Automation of Commanding at NASA: Reducing Human Error in Space Flight

    Science.gov (United States)

    Dorn, Sarah J.

    2010-01-01

    Automation has been implemented in many different industries to improve efficiency and reduce human error. Reducing or eliminating the human interaction in tasks has been proven to increase productivity in manufacturing and lessen the risk of mistakes by humans in the airline industry. Human space flight requires the flight controllers to monitor multiple systems and react quickly when failures occur so NASA is interested in implementing techniques that can assist in these tasks. Using automation to control some of these responsibilities could reduce the number of errors the flight controllers encounter due to standard human error characteristics. This paper will investigate the possibility of reducing human error in the critical area of manned space flight at NASA.

  4. Human errors and mistakes

    International Nuclear Information System (INIS)

    Wahlstroem, B.

    1993-01-01

    Human errors make a major contribution to the risk of industrial accidents. Accidents have provided important lessons, making it possible to build safer systems. To avoid human errors it is necessary to adapt systems to their operators. The complexity of modern industrial systems is, however, increasing the danger of system accidents. Models of the human operator have been proposed, but they are not able to give accurate predictions of human performance. Human errors can never be eliminated, but their frequency can be decreased by systematic efforts. The paper gives a brief summary of research on human error and concludes with suggestions for further work. (orig.)

  5. Hepatic glucose output in humans measured with labeled glucose to reduce negative errors

    International Nuclear Information System (INIS)

    Levy, J.C.; Brown, G.; Matthews, D.R.; Turner, R.C.

    1989-01-01

    Steele and others have suggested that minimizing changes in glucose specific activity when estimating hepatic glucose output (HGO) during glucose infusions could reduce non-steady-state errors. This approach was assessed in nondiabetic and type II diabetic subjects during constant low dose [27 μmol·kg ideal body wt (IBW)⁻¹·min⁻¹] glucose infusion followed by a 12 mmol/l hyperglycemic clamp. Eight subjects had paired tests with and without labeled infusions. Labeled infusion was used to compare HGO in 11 nondiabetic and 15 diabetic subjects. Whereas unlabeled infusions produced negative values for endogenous glucose output, labeled infusions largely eliminated this error and reduced the dependence of the Steele model on the pool fraction in the paired tests. By use of labeled infusions, 11 nondiabetic subjects suppressed HGO from 10.2 ± 0.6 (SE) fasting to 0.8 ± 0.9 μmol·kg IBW⁻¹·min⁻¹ after 90 min of glucose infusion and to -1.9 ± 0.5 μmol·kg IBW⁻¹·min⁻¹ after 90 min of a 12 mmol/l glucose clamp, but 15 diabetic subjects suppressed only partially, from 13.0 ± 0.9 fasting to 5.7 ± 1.2 at the end of the glucose infusion and 5.6 ± 1.0 μmol·kg IBW⁻¹·min⁻¹ in the clamp (P = 0.02, 0.002, and less than 0.001, respectively).
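
    The non-steady-state calculation referred to above is commonly done with Steele's single-pool equation, in which endogenous (hepatic) glucose output is the tracer-derived rate of appearance minus the exogenous infusion rate. The Python sketch below is illustrative only: the equation form is the standard Steele relation, and the pool fraction and distribution volume defaults are assumed values, not parameters reported in this record.

```python
def steele_hgo(F_tracer, G_exo, c1, c2, sa1, sa2, dt, p=0.65, V=220.0):
    """Endogenous (hepatic) glucose output via Steele's single-pool,
    non-steady-state equation.  Illustrative sketch; p and V are assumed.

    F_tracer : tracer infusion rate, dpm.kg-1.min-1
    G_exo    : unlabeled (exogenous) glucose infusion rate, umol.kg-1.min-1
    c1, c2   : plasma glucose at start/end of the interval, umol.ml-1
    sa1, sa2 : glucose specific activity at start/end, dpm.umol-1
    dt       : interval length, min
    p        : pool fraction (dimensionless)
    V        : glucose distribution volume, ml.kg-1
    Returns HGO in umol.kg-1.min-1; non-steady-state artifacts can drive
    this negative, which is the error the labeled infusion aims to reduce.
    """
    c_mean = (c1 + c2) / 2.0
    sa_mean = (sa1 + sa2) / 2.0
    d_sa_dt = (sa2 - sa1) / dt
    ra_total = (F_tracer - p * V * c_mean * d_sa_dt) / sa_mean
    return ra_total - G_exo
```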

  6. Interruption Practice Reduces Errors

    Science.gov (United States)

    2014-01-01

    Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 58, pp. 265-269, 2014. Only fragments of the abstract are available in this record; they refer to interruption-related errors such as medication miscalculations (Koppel et al., 2005), cases where medical staff forget to complete a post-completion step such as logging off, and prior work on the effects of interruptions on work activity (Applied Ergonomics, 31(5), 537-543; González & Mark).

  7. The assessment system based on virtual decommissioning environments to reduce abnormal hazards from human errors for decommissioning of nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Moon, Jei Kwon; Choi, Byung Seon; Hyun, Dong jun; Lee, Jong Hwan; Kim, Ik June; Kang, Shin Young [KAERI, Daejeon (Korea, Republic of)]

    2016-05-15

    Decommissioning of nuclear facilities has to be accomplished while assuring the safety of workers. It is therefore necessary that, before decommissioning, the exposure dose to workers be analyzed and assessed under the ALARA (as low as reasonably achievable) principle. Furthermore, methods and systems are needed to improve workers' proficiency in the decommissioning environment. To plan worker exposure doses before decommissioning activities begin, an assessment system is required. Such a system has been successfully developed, so that worker exposure doses can be measured and assessed in real time in virtual decommissioning environments. The system helps protect against accidents and enables workers to become familiar with their working environments. It is expected that this system can reduce human errors, because workers can build proficiency in hazardous working environments through virtual training that resembles real decommissioning situations.

  8. A New Human-Machine Interfaces of Computer-based Procedure System to Reduce the Team Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    In this study, we identify the emerging types of team errors, especially in digitalized control rooms of nuclear power plants such as the APR-1400 main control room in Korea. Most work in the nuclear industry is performed by teams of two or more persons. Although individual errors can be detected and recovered by qualified colleagues and/or a well-trained team, errors committed by the team as a whole are seldom easily detected and properly recovered by the team itself. Here a team is defined as two or more people who interact appropriately with each other; the team is a dependent aggregate that accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We adopt crew resource management as a representative approach to the team factors of human errors, and suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces against team errors for use with a computer-based procedure system in a digitalized main control room; the computer-based procedure system is a representative feature of the digitalized control room. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors. An effectiveness test is under way to validate whether the new interfaces can reduce team errors during operation with a computer-based procedure system in a digitalized control room.

  9. A New Human-Machine Interfaces of Computer-based Procedure System to Reduce the Team Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul

    2016-01-01

    In this study, we identify the emerging types of team errors, especially in digitalized control rooms of nuclear power plants such as the APR-1400 main control room in Korea. Most work in the nuclear industry is performed by teams of two or more persons. Although individual errors can be detected and recovered by qualified colleagues and/or a well-trained team, errors committed by the team as a whole are seldom easily detected and properly recovered by the team itself. Here a team is defined as two or more people who interact appropriately with each other; the team is a dependent aggregate that accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We adopt crew resource management as a representative approach to the team factors of human errors, and suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces against team errors for use with a computer-based procedure system in a digitalized main control room; the computer-based procedure system is a representative feature of the digitalized control room. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors. An effectiveness test is under way to validate whether the new interfaces can reduce team errors during operation with a computer-based procedure system in a digitalized control room.

  10. Human Errors in Decision Making

    OpenAIRE

    Mohamad, Shahriari; Aliandrina, Dessy; Feng, Yan

    2005-01-01

    The aim of this paper was to identify human errors in the decision-making process. The study focused on the research question: what human errors can act as potential sources of decision failure when evaluating alternatives in the decision-making process? Two case studies were selected from the literature and analyzed to find the human errors that contribute to decision failure. The analysis of human errors was then linked with mental models in the alternative-evaluation step. The results o...

  11. A theory of human error

    Science.gov (United States)

    Mcruer, D. T.; Clement, W. F.; Allen, R. W.

    1981-01-01

    Human errors tend to be treated in terms of clinical and anecdotal descriptions, from which remedial measures are difficult to derive. Correction of the sources of human error requires an attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A comprehensive analytical theory of the cause-effect relationships governing propagation of human error is indispensable to a reconstruction of the underlying and contributing causes. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation, maritime, automotive, and process control operations is highlighted. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.

  12. Research trend on human error reduction

    International Nuclear Information System (INIS)

    Miyaoka, Sadaoki

    1990-01-01

    Human error has been a problem in all industries. In 1988, the Bureau of Mines, Department of the Interior, USA, carried out a worldwide survey of human error in all industries in relation to fatal accidents in mines. The results differed according to the methods of collecting data, but the proportion of total accidents attributable to human error ranged widely from 20∼85%, with an average of 35%. The rate of occurrence of accidents and troubles in Japanese nuclear power stations is shown; the rate of occurrence of human error is 0∼0.5 cases/reactor-year and did not vary much. The proportion of the total attributable to human error has therefore tended to increase, and reducing human error has become important for lowering the rate of occurrence of accidents and troubles hereafter. After the TMI accident in 1979 in the USA, research on the man-machine interface became active, and after the Chernobyl accident in 1986 in the USSR, the problem of organization and management has been studied. In Japan, 'Safety 21' was drawn up by the Advisory Committee for Energy, and the annual reports on nuclear safety also pointed out the importance of human factors. The state of research on human factors in Japan and abroad and three targets for reducing human error are reported. (K.I.)

  13. Human errors in NPP operations

    International Nuclear Information System (INIS)

    Sheng Jufang

    1993-01-01

    Based on the operational experiences of nuclear power plants (NPPs), the importance of studying human performance problems is described. Statistical analysis of the significance and frequency of various root causes and error modes from a large number of human-error-related events demonstrates that defects in operation/maintenance procedures, working place factors, communication and training practices are the primary root causes, while omission, transposition and quantitative mistakes are the most frequent error modes. Recommendations for domestic research on human performance problems in NPPs are suggested.

  14. Human Error and Organizational Management

    Directory of Open Access Journals (Sweden)

    Alecxandrina DEACONU

    2009-01-01

    Full Text Available The concern for performance is a topic that raises interest in the business environment but also in other areas that – even if they seem distant from this world – are aware of, interested in or conditioned by the economy development. As individual performance is very much influenced by the human resource, we chose to analyze in this paper the mechanisms that generate – consciously or not – human error nowadays. Moreover, the extremely tense Romanian context, where failure is rather a rule than an exception, made us investigate the phenomenon of generating a human error and the ways to diminish its effects.

  15. Human errors related to maintenance and modifications

    International Nuclear Information System (INIS)

    Laakso, K.; Pyy, P.; Reiman, L.

    1998-01-01

    about weakness in audits made by the operating organisation and in tests relating to plant operation. The number of plant-specific maintenance records used as input material was high and the findings were discussed thoroughly with the plant maintenance personnel. The results indicated that instrumentation is more prone to human error than the rest of maintenance. Most errors stem from refuelling outage periods and about half of them were identified during the same outage in which they were committed. Plant modifications are a significant source of common cause failures. The number of dependent errors could be reduced by improved co-ordination and auditing, post-installation checking, training and start-up testing programmes. (orig.)

  16. Reducing diagnostic errors in medicine: what's the goal?

    Science.gov (United States)

    Graber, Mark; Gordon, Ruthanna; Franklin, Nancy

    2002-10-01

    This review considers the feasibility of reducing or eliminating the three major categories of diagnostic errors in medicine: "No-fault errors" occur when the disease is silent, presents atypically, or mimics something more common. These errors will inevitably decline as medical science advances, new syndromes are identified, and diseases can be detected more accurately or at earlier stages. These errors can never be eradicated, unfortunately, because new diseases emerge, tests are never perfect, patients are sometimes noncompliant, and physicians will inevitably, at times, choose the most likely diagnosis over the correct one, illustrating the concept of necessary fallibility and the probabilistic nature of choosing a diagnosis. "System errors" play a role when diagnosis is delayed or missed because of latent imperfections in the health care system. These errors can be reduced by system improvements, but can never be eliminated because these improvements lag behind and degrade over time, and each new fix creates the opportunity for novel errors. Tradeoffs also guarantee system errors will persist, when resources are just shifted. "Cognitive errors" reflect misdiagnosis from faulty data collection or interpretation, flawed reasoning, or incomplete knowledge. The limitations of human processing and the inherent biases in using heuristics guarantee that these errors will persist. Opportunities exist, however, for improving the cognitive aspect of diagnosis by adopting system-level changes (e.g., second opinions, decision-support systems, enhanced access to specialists) and by training designed to improve cognition or cognitive awareness. Diagnostic error can be substantially reduced, but never eradicated.

  17. Human decision error (HUMDEE) trees

    International Nuclear Information System (INIS)

    Ostrom, L.T.

    1993-01-01

    Graphical presentations of human actions in incident and accident sequences have been used for many years. However, for the most part, human decision making has been underrepresented in these trees. This paper presents a method of incorporating the human decision process into graphical presentations of incident/accident sequences. This presentation is in the form of logic trees, called Human Decision Error Trees, or HUMDEE for short. The primary benefit of HUMDEE trees is that they graphically illustrate what else the individuals involved in the event could have done to prevent either the initiation or continuation of the event. HUMDEE trees also present the alternate paths available at the operator decision points in the incident/accident sequence. This is different from the Technique for Human Error Rate Prediction (THERP) event trees. These trees have many uses. They can be used in incident/accident investigations to show what other courses of action were available and for training operators. The trees also have a consequence component, so that not only the decision but also its consequences can be explored.

  18. Human Error Assessment in Minefield Cleaning Operation Using Human Event Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Hajiakbari

    2015-12-01

    Full Text Available Background & objective: Human error is one of the main causes of accidents. Due to the unreliability of the human element and the high-risk nature of demining operations, this study aimed to assess and manage human errors likely to occur in such operations. Methods: This study was performed at a demining site in war zones located in the west of Iran. After acquiring an initial familiarity with the operations, methods, and tools of clearing minefields, job tasks related to clearing landmines were specified. Next, these tasks were studied using HTA and the related possible errors were assessed using ATHEANA. Results: The de-mining task was composed of four main operations, including primary detection, technical identification, investigation, and neutralization. Four main causes of accidents in such operations were found: walking on mines, leaving mines with no action, errors in the neutralizing operation, and environmental explosion. The probability of human error in mine clearance operations was calculated as 0.010. Conclusion: The main causes of human error in de-mining operations can be attributed to factors such as poor weather and operating conditions (e.g., outdoor work), inappropriate personal protective equipment, personality characteristics, insufficient accuracy in the work, and insufficient available time. To reduce the probability of human error in de-mining operations, the aforementioned factors should be managed properly.

  19. Human Errors and Bridge Management Systems

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, A. S.

    on basis of reliability profiles for bridges without human errors are extended to include bridges with human errors. The first rehabilitation distributions for bridges without and with human errors are combined into a joint first rehabilitation distribution. The methodology presented is illustrated...... for reinforced concrete bridges....

  20. Game Design Principles based on Human Error

    Directory of Open Access Journals (Sweden)

    Guilherme Zaffari

    2016-03-01

    Full Text Available This paper presents the results of the authors' research on incorporating Human Error, through design principles, into video game design. In general, designers must consider Human Error factors throughout video game interface development; however, when it comes to a game's core design, adaptations are needed, since challenge is an important factor for fun and, from the perspective of Human Error, a challenge can be considered a flaw in the system. The research used Human Error classifications, data triangulation via predictive human error analysis, and expanded flow theory to allow the design of a set of principles that match the design of playful challenges with the principles of Human Error. From the results, it was possible to conclude that the application of Human Error in game design has a positive effect on player experience, allowing the player to interact only with errors associated with the intended aesthetics of the game.

  1. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  2. A qualitative description of human error

    International Nuclear Information System (INIS)

    Li Zhaohuan

    1992-11-01

    Human error makes an important contribution to the risk of reactor operation. Insights and analytical models are the main parts of human reliability analysis, which covers the concept of human error, its nature, its mechanism of generation, its classification, and human performance influence factors. For an operating reactor, human error is defined as a task-human-machine mismatch. A human error event is characterized by the erroneous action and its unfavorable result. From the time limitation on performing a task, operations are divided into time-limited and time-open; the HCR (human cognitive reliability) model is suited only to time-limited tasks. The basic cognitive process consists of information gathering, cognition/thinking, decision making and action, and an erroneous human action may be generated at any stage of this process. More natural ways to classify human errors are presented. The human performance influence factors, including personal, organizational and environmental factors, are also listed.

  3. A qualitative description of human error

    Energy Technology Data Exchange (ETDEWEB)

    Zhaohuan, Li [Academia Sinica, Beijing, BJ (China). Inst. of Atomic Energy

    1992-11-01

    Human error makes an important contribution to the risk of reactor operation. Insights and analytical models are the main parts of human reliability analysis, which covers the concept of human error, its nature, its mechanism of generation, its classification, and human performance influence factors. For an operating reactor, human error is defined as a task-human-machine mismatch. A human error event is characterized by the erroneous action and its unfavorable result. From the time limitation on performing a task, operations are divided into time-limited and time-open; the HCR (human cognitive reliability) model is suited only to time-limited tasks. The basic cognitive process consists of information gathering, cognition/thinking, decision making and action, and an erroneous human action may be generated at any stage of this process. More natural ways to classify human errors are presented. The human performance influence factors, including personal, organizational and environmental factors, are also listed.
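
    The HCR (human cognitive reliability) correlation mentioned in both records expresses crew non-response probability as a function of the time available relative to the median response time, with coefficients that depend on whether the behaviour is skill-, rule- or knowledge-based. A minimal sketch, assuming the usual three-parameter Weibull form with illustrative coefficient values (not taken from these abstracts):

```python
import math

# Illustrative HCR coefficients (gamma, eta, beta) per cognitive behaviour
# type; published values differ slightly between sources.
HCR_COEFFS = {
    "skill":     (0.7, 0.4, 1.2),
    "rule":      (0.6, 0.6, 0.9),
    "knowledge": (0.5, 0.8, 0.8),
}

def hcr_nonresponse_prob(t_available, t_median, behaviour="rule"):
    """Probability that the crew has NOT responded within t_available,
    given the median response time t_median (same time units)."""
    gamma, eta, beta = HCR_COEFFS[behaviour]
    x = t_available / t_median
    if x <= gamma:          # below the normalized 'dead time' the model saturates at 1
        return 1.0
    return math.exp(-(((x - gamma) / eta) ** beta))

# Example: 10 min available, 5 min median response time, rule-based task
print(hcr_nonresponse_prob(10.0, 5.0, "rule"))
```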

  4. Probabilistic error bounds for reduced order modeling

    Energy Technology Data Exchange (ETDEWEB)

    Abdo, M.G.; Wang, C.; Abdel-Khalik, H.S., E-mail: abdo@purdue.edu, E-mail: wang1730@purdue.edu, E-mail: abdelkhalik@purdue.edu [Purdue Univ., School of Nuclear Engineering, West Lafayette, IN (United States)

    2015-07-01

    Reduced order modeling has proven to be an effective tool when repeated execution of reactor analysis codes is required. ROM operates on the assumption that the intrinsic dimensionality of the associated reactor physics models is sufficiently small when compared to the nominal dimensionality of the input and output data streams. By employing a truncation technique with roots in linear algebra matrix decomposition theory, ROM effectively discards all components of the input and output data that have negligible impact on reactor attributes of interest. This manuscript introduces a mathematical approach to quantify the errors resulting from the discarded ROM components. As supported by numerical experiments, the introduced analysis proves that the contribution of the discarded components could be upper-bounded with an overwhelmingly high probability. The reverse of this statement implies that the ROM algorithm can self-adapt to determine the level of the reduction needed such that the maximum resulting reduction error is below a given tolerance limit that is set by the user. (author)
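
    One standard way to upper-bound the contribution of the discarded components with high probability is to probe the residual with random Gaussian vectors, as in randomized low-rank approximation theory. The sketch below illustrates that general idea for a rank-r truncation of a snapshot matrix; it is an assumption about the approach, not the algorithm introduced in this manuscript.

```python
import numpy as np

def rom_truncation_error_bound(A, r, n_probes=10, seed=0):
    """Probabilistic upper bound on || A - Q Q^T A ||_2 for a rank-r SVD/POD
    basis Q.  With n_probes Gaussian probe vectors the bound holds with
    probability at least 1 - 10**(-n_probes) (standard randomized
    range-finder a posteriori estimate).  Sketch only."""
    rng = np.random.default_rng(seed)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Q = U[:, :r]                                   # retained ROM subspace
    residual = A - Q @ (Q.T @ A)                   # discarded components
    omegas = rng.standard_normal((A.shape[1], n_probes))
    probe_norms = np.linalg.norm(residual @ omegas, axis=0)
    bound = 10.0 * np.sqrt(2.0 / np.pi) * probe_norms.max()
    return bound, np.linalg.norm(residual, 2)      # bound vs. true error

# Example: synthetic snapshot matrix with a rapidly decaying spectrum
rng = np.random.default_rng(1)
A = np.linalg.qr(rng.standard_normal((200, 50)))[0] \
    @ np.diag(2.0 ** -np.arange(50)) \
    @ rng.standard_normal((50, 80))
print(rom_truncation_error_bound(A, r=10))
```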

  5. Collection of offshore human error probability data

    International Nuclear Information System (INIS)

    Basra, Gurpreet; Kirwan, Barry

    1998-01-01

    Accidents such as Piper Alpha have increased concern about the effects of human errors in complex systems. Such accidents can in theory be predicted and prevented by risk assessment, and in particular human reliability assessment (HRA), but HRA ideally requires qualitative and quantitative human error data. A research initiative at the University of Birmingham led to the development of CORE-DATA, a Computerised Human Error Data Base. This system currently contains a reasonably large number of human error data points, collected from a variety of mainly nuclear-power related sources. This article outlines a recent offshore data collection study, concerned with collecting lifeboat evacuation data. Data collection methods are outlined and a selection of human error probabilities generated as a result of the study are provided. These data give insights into the type of errors and human failure rates that could be utilised to support offshore risk analyses

  6. Human Error Mechanisms in Complex Work Environments

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1988-01-01

    will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations...

  7. Analysis of Employee's Survey for Preventing Human-Errors

    International Nuclear Information System (INIS)

    Sung, Chanho; Kim, Younggab; Joung, Sanghoun

    2013-01-01

    Human errors in nuclear power plants can cause large and small events or incidents. These events or incidents are among the main contributors to reactor trips and might threaten the safety of nuclear plants. To prevent human errors, KHNP (nuclear power plants) introduced 'Human-error prevention techniques' and has applied them to main areas such as plant operation, operation support, and maintenance and engineering. This paper proposes methods to prevent and reduce human errors in nuclear power plants by analyzing survey results covering the utilization of the human-error prevention techniques and the employees' awareness of preventing human errors. The survey analysis showed the status of the human-error prevention techniques and of employee awareness. Employees' understanding and utilization of the techniques were generally high, and the level of employee training and its effect on actual work were in good condition. Employees also answered that the root causes of human error lie in the working environment, including tight processes, manpower shortages, and excessive workload, rather than personal negligence or lack of personal knowledge; consideration of the working environment is therefore certainly needed. At present, based on this survey analysis, the best methods for preventing human error are personal equipment, substantial training/education, private mental health checks before starting work, prohibition of performing multiple tasks at once, compliance with procedures, and enhancement of job site reviews. However, the most important and basic elements for preventing human error are the engagement of workers and an organizational atmosphere that supports communication between managers and workers and between employees and their supervisors.

  8. Learning time-dependent noise to reduce logical errors: real time error rate estimation in quantum error correction

    Science.gov (United States)

    Huo, Ming-Xia; Li, Ying

    2017-12-01

    Quantum error correction is important to quantum information processing, which allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from the knowledge of error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. Any adaptation of the quantum error correction code or its implementation circuit is not required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. surface code. A Gaussian processes algorithm is used to estimate and predict error rates based on error correction data in the past. We find that using these estimated error rates, the probability of error correction failures can be significantly reduced by a factor increasing with the code distance.
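
    A minimal sketch of the underlying idea: fit a Gaussian process to windowed syndrome statistics so that the current and near-future error rate can be estimated without pausing error correction. The simulated data, kernel choice and parameter values are illustrative assumptions, not the authors' protocol.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Simulated syndrome record: each window of error-correction rounds yields
# an observed fraction of flipped syndrome bits (a noisy proxy for the
# underlying physical error rate, which drifts slowly in time).
rng = np.random.default_rng(0)
t = np.linspace(0, 100, 60)[:, None]                 # time (arbitrary units)
true_rate = 0.01 + 0.004 * np.sin(t / 15.0).ravel()  # slowly drifting rate
rounds_per_window = 2000
observed = rng.binomial(rounds_per_window, true_rate) / rounds_per_window

# GP regression smooths past observations and extrapolates the near future,
# so the decoder can be fed an up-to-date error-rate estimate without
# interrupting error correction.
kernel = 1.0 * RBF(length_scale=20.0) + WhiteKernel(noise_level=1e-6)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, observed)
t_future = np.array([[105.0], [110.0]])
mean, std = gp.predict(t_future, return_std=True)
print(mean, std)
```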

  9. Notes on human error analysis and prediction

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1978-11-01

    The notes comprise an introductory discussion of the role of human error analysis and prediction in industrial risk analysis. Following this introduction, different classes of human errors and their role in industrial systems are described. Problems related to the prediction of human behaviour in reliability and safety analysis are formulated, and 'criteria for analyzability' which must be met by industrial systems so that a systematic analysis can be performed are suggested. The appendices contain illustrative case stories and a review of human error reports for the task of equipment calibration and testing as found in the US Licensee Event Reports. (author)

  10. Human error mechanisms in complex work environments

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1988-01-01

    Human error taxonomies have been developed from analysis of industrial incident reports as well as from psychological experiments. In this paper the results of the two approaches are reviewed and compared. It is found, in both cases, that a fairly small number of basic psychological mechanisms will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations. The implications for system safety are briefly mentioned, together with the implications for system design. (author)

  11. Human error mechanisms in complex work environments

    International Nuclear Information System (INIS)

    Rasmussen, Jens (Danmarks Tekniske Hoejskole, Copenhagen)

    1988-01-01

    Human error taxonomies have been developed from analysis of industrial incident reports as well as from psychological experiments. In this paper the results of the two approaches are reviewed and compared. It is found, in both cases, that a fairly small number of basic psychological mechanisms will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations. The implications for system safety are briefly mentioned, together with the implications for system design. (author)

  12. Human error in remote Afterloading Brachytherapy

    International Nuclear Information System (INIS)

    Quinn, M.L.; Callan, J.; Schoenfeld, I.; Serig, D.

    1994-01-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error

  13. The probability and the management of human error

    International Nuclear Information System (INIS)

    Duffey, R.B.; Saull, J.W.

    2004-01-01

    Embedded within modern technological systems, human error is the largest, and indeed dominant, contributor to accident causes. The consequences dominate the risk profiles for nuclear power and for many other technologies. We need to quantify the probability of human error for the system as an integral contribution within the overall system failure, as it is generally not separable or predictable for actual events. We also need to provide a means to manage and effectively reduce the failure (error) rate. The fact that humans learn from their mistakes allows a new determination of the dynamic probability and human failure (error) rate in technological systems. The result is consistent with and derived from the available world data for modern technological systems. Comparisons are made to actual data from large technological systems and recent catastrophes. Best estimate values and relationships can be derived for both the human error rate and the probability. We describe the potential for new approaches to the management of human error and safety indicators, based on the principles of error state exclusion and of the systematic effect of learning. A new equation is given for the probability of human error (λ) that combines the influences of early inexperience, learning from experience (ε) and stochastic occurrences, with a finite minimum rate: λ = 5·10⁻⁵ + ((1/ε) - 5·10⁻⁵) exp(-3ε). The future failure rate is entirely determined by the experience: thus the past defines the future.
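
    Reconstructed from the abstract, the quoted learning relationship can be evaluated directly. The sketch below simply encodes that equation; ε is taken to be the accumulated experience, as in the abstract.

```python
import math

def human_error_rate(eps):
    """Learning relationship as quoted in the abstract:
    lambda = 5e-5 + (1/eps - 5e-5) * exp(-3 * eps), for eps > 0.
    The rate starts near 1/eps for little experience and decays toward
    the finite minimum rate of 5e-5 as experience accumulates."""
    lam_min = 5e-5
    return lam_min + (1.0 / eps - lam_min) * math.exp(-3.0 * eps)

for eps in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(f"experience = {eps:4.1f}  ->  error rate = {human_error_rate(eps):.3e}")
```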

  14. Nursing Errors in Intensive Care Unit by Human Error Identification in Systems Tool: A Case Study

    Directory of Open Access Journals (Sweden)

    Nezamodini

    2016-03-01

    Full Text Available Background: Although health services are designed and implemented to improve human health, errors in health services are a very common phenomenon and are sometimes even fatal. Medical errors and their costs are global issues with serious consequences for the patient community; they are preventable and require serious attention. Objectives: The current study aimed to identify possible nursing errors by applying the human error identification in systems tool (HEIST) in the intensive care units (ICUs) of hospitals. Patients and Methods: This descriptive research was conducted in the intensive care unit of a hospital in Khuzestan province in 2013. Data were collected through observation and interviews with nine nurses in this section over a period of four months. Human error classification was based on the Rose and Rose and the Swain and Guttmann models. Following the HEIST worksheets, the guide questions were answered and error causes were identified after the type of each error was determined. Results: In total, 527 errors were detected. Performing an operation on the wrong path had the highest frequency (150), and the second highest, with a frequency of 136, was doing tasks later than the deadline. Management causes, with a frequency of 451, ranked first among the identified error causes. Errors occurred mostly in the system observation stage, and among the performance shaping factors (PSFs), time was the most influential factor in the occurrence of human errors. Conclusions: In order to prevent the occurrence and reduce the consequences of the identified errors, the following suggestions were proposed: appropriate training courses, applying work guidelines and monitoring their implementation, increasing the number of work shifts, hiring professional workforce, and equipping the workspace with appropriate facilities and equipment.

  15. Human Error Analysis by Fuzzy-Set

    International Nuclear Information System (INIS)

    Situmorang, Johnny

    1996-01-01

    In conventional HRA the probability of error is treated as a single, exact value obtained by constructing an event tree, but here Fuzzy-Set Theory is used instead. Fuzzy set theory treats the probability of error as a plausibility described by a linguistic variable. Most parameters or variables in human engineering are defined verbally (good, fairly good, worst, etc.), each of which describes a range of probability values. As an example, this analysis quantifies the human error in a calibration task, and the probability of miscalibration is very low.
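
    A minimal sketch of the fuzzy-set treatment described here: a linguistic probability such as 'very low' is represented by a membership function over the error-probability axis rather than by a point value, and a crisp estimate is recovered by defuzzification. The membership parameters below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Probability axis, log-spaced because HEPs span several orders of magnitude
p = np.logspace(-5, -1, 500)
logp = np.log10(p)

# Illustrative fuzzy number for a 'very low' miscalibration probability
mu = triangular(logp, a=-5.0, b=-4.0, c=-3.0)

# Centroid defuzzification on the uniformly spaced log axis gives a single
# representative HEP for the linguistic value
hep_crisp = 10 ** (np.sum(logp * mu) / np.sum(mu))
print(f"crisp HEP for 'very low': {hep_crisp:.2e}")
```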

  16. FMEA: a model for reducing medical errors.

    Science.gov (United States)

    Chiozza, Maria Laura; Ponzetti, Clemente

    2009-06-01

    Patient safety is a management issue, in view of the fact that clinical risk management has become an important part of hospital management. Failure Mode and Effect Analysis (FMEA) is a proactive technique for error detection and reduction, firstly introduced within the aerospace industry in the 1960s. Early applications in the health care industry dating back to the 1990s included critical systems in the development and manufacture of drugs and in the prevention of medication errors in hospitals. In 2008, the Technical Committee of the International Organization for Standardization (ISO), licensed a technical specification for medical laboratories suggesting FMEA as a method for prospective risk analysis of high-risk processes. Here we describe the main steps of the FMEA process and review data available on the application of this technique to laboratory medicine. A significant reduction of the risk priority number (RPN) was obtained when applying FMEA to blood cross-matching, to clinical chemistry analytes, as well as to point-of-care testing (POCT).
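
    The RPN bookkeeping behind FMEA is straightforward: each failure mode is scored for severity, occurrence and detectability (typically on 1-10 scales), the product gives the risk priority number, and the highest-RPN modes are addressed first. The failure modes and scores below are hypothetical, for illustration only; they are not data from the review.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    step: str
    mode: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (remote)     .. 10 (frequent)
    detection: int   # 1 (certain detection) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detection

# Hypothetical blood cross-matching process, for illustration only
modes = [
    FailureMode("sample collection", "patient mis-identification", 9, 3, 4),
    FailureMode("labelling", "tube labelled away from the bedside", 8, 4, 5),
    FailureMode("analysis", "clerical transcription error", 6, 5, 3),
]

# Prioritize corrective action by descending RPN
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN={fm.rpn:4d}  {fm.step}: {fm.mode}")
```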

  17. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on the dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
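
    The NUREG/CR-1278 dependence model adjusts the conditional probability of a subsequent error given that the preceding action failed. A minimal sketch using the handbook's widely cited conditional-probability formulas for the five dependence levels (the example base HEP is illustrative):

```python
def conditional_hep(base_hep: float, dependence: str) -> float:
    """Conditional HEP for a subsequent action given failure of the
    preceding one, per the THERP (NUREG/CR-1278) dependence equations."""
    formulas = {
        "ZD": lambda p: p,                  # zero dependence
        "LD": lambda p: (1 + 19 * p) / 20,  # low dependence
        "MD": lambda p: (1 + 6 * p) / 7,    # moderate dependence
        "HD": lambda p: (1 + p) / 2,        # high dependence
        "CD": lambda p: 1.0,                # complete dependence
    }
    return formulas[dependence](base_hep)

# Example: two sequential operator actions, each with a nominal HEP of 1e-3
p1 = 1e-3
for level in ("ZD", "LD", "MD", "HD", "CD"):
    joint = p1 * conditional_hep(1e-3, level)
    print(f"{level}: joint failure probability = {joint:.2e}")
```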

  18. Generalizing human error rates: A taxonomic approach

    International Nuclear Information System (INIS)

    Buffardi, L.; Fleishman, E.; Allen, J.

    1989-01-01

    It is well established that human error plays a major role in malfunctioning of complex, technological systems and in accidents associated with their operation. Estimates of the rate of human error in the nuclear industry range from 20-65% of all system failures. In response to this, the Nuclear Regulatory Commission has developed a variety of techniques for estimating human error probabilities for nuclear power plant personnel. Most of these techniques require the specification of the range of human error probabilities for various tasks. Unfortunately, very little objective performance data on error probabilities exist for nuclear environments. Thus, when human reliability estimates are required, for example in computer simulation modeling of system reliability, only subjective estimates (usually based on experts' best guesses) can be provided. The objective of the current research is to provide guidelines for the selection of human error probabilities based on actual performance data taken in other complex environments and applying them to nuclear settings. A key feature of this research is the application of a comprehensive taxonomic approach to nuclear and non-nuclear tasks to evaluate their similarities and differences, thus providing a basis for generalizing human error estimates across tasks. In recent years significant developments have occurred in classifying and describing tasks. Initial goals of the current research are to: (1) identify alternative taxonomic schemes that can be applied to tasks, and (2) describe nuclear tasks in terms of these schemes. Three standardized taxonomic schemes (Ability Requirements Approach, Generalized Information-Processing Approach, Task Characteristics Approach) are identified, modified, and evaluated for their suitability in comparing nuclear and non-nuclear power plant tasks. An agenda for future research and its relevance to nuclear power plant safety is also discussed

  19. Human error theory: relevance to nurse management.

    Science.gov (United States)

    Armitage, Gerry

    2009-03-01

    Describe, discuss and critically appraise human error theory and consider its relevance for nurse managers. Healthcare errors are a persistent threat to patient safety. Effective risk management and clinical governance depends on understanding the nature of error. This paper draws upon a wide literature from published works, largely from the field of cognitive psychology and human factors. Although the content of this paper is pertinent to any healthcare professional; it is written primarily for nurse managers. Error is inevitable. Causation is often attributed to individuals, yet causation in complex environments such as healthcare is predominantly multi-factorial. Individual performance is affected by the tendency to develop prepacked solutions and attention deficits, which can in turn be related to local conditions and systems or latent failures. Blame is often inappropriate. Defences should be constructed in the light of these considerations and to promote error wisdom and organizational resilience. Managing and learning from error is seen as a priority in the British National Health Service (NHS), this can be better achieved with an understanding of the roots, nature and consequences of error. Such an understanding can provide a helpful framework for a range of risk management activities.

  20. A chance to avoid mistakes human error

    International Nuclear Information System (INIS)

    Amaro, Pablo; Obeso, Eduardo; Gomez, Ruben

    2010-01-01

    Trying to respond to the lack of public information in the industry about the different tools used in the nuclear industry to minimize human error, a group of workers from different sections of the St. Maria de Garona NPP (Quality Assurance / Organization and Human Factors) decided to embark on a challenging and exciting project: 'Write a book collecting all the knowledge accumulated during their daily activities', much of it gathered while reading external information received from different organizations within the nuclear industry (INPO, WANO...), but also while visiting different NPPs, holding meetings and participating in training courses related to Human and Organizational Factors. The main objective of the book is to present, in a practical way, the different tools that are used and fostered in the nuclear industry, so that their assimilation and implementation in other industries becomes possible and achievable in an efficient manner. After one year of work, the project is a reality. An abstract was presented during the last Spanish Nuclear Society meeting in Sevilla, last October, and the book is on the market for everybody on the web-site www.bubok.com. The book is structured in the following areas: 'Errare humanum est': presents to the reader what human error is, its origin and the different barriers. The message is that the reader should see error as something continuously present in our lives, even more frequently than we think. Studying its origin allows barriers to be established to avoid or at least minimize it. 'Error's bitter face': shows the possible consequences of human errors, and what better way than presenting real experiences that have occurred in the industry. In the book, accidents in the nuclear industry, like Three Mile Island NPP and Chernobyl NPP, and incidents like Davis Besse NPP in the past, help the reader to make a reflection about the

  1. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1982-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEP) for task elements defined in NUREG/CR 1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes collected, using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determines HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date

  2. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1981-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEP) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes collected, using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determined HEP's for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date

  3. Chernobyl - system accident or human error?

    International Nuclear Information System (INIS)

    Stang, E.

    1996-01-01

    Did human error cause the Chernobyl disaster? The standard point of view is that operator error was the root cause of the disaster. This was also the view of the Soviet Accident Commission. The paper analyses the operator errors at Chernobyl in a system context. The reactor operators committed errors that depended upon a lot of other failures that made up a complex accident scenario. The analysis is based on Charles Perrow's analysis of technological disasters. Failure possibility is an inherent property of high-risk industrial installations. The Chernobyl accident consisted of a chain of events that were both extremely improbable and difficult to predict. It is not reasonable to put the blame for the disaster on the operators. (author)

  4. Perceptual learning eases crowding by reducing recognition errors but not position errors.

    Science.gov (United States)

    Xiong, Ying-Zi; Yu, Cong; Zhang, Jun-Yun

    2015-08-01

    When an observer reports a letter flanked by additional letters in the visual periphery, the response errors (the crowding effect) may result from failure to recognize the target letter (recognition errors), from mislocating a correctly recognized target letter at a flanker location (target misplacement errors), or from reporting a flanker as the target letter (flanker substitution errors). Crowding can be reduced through perceptual learning. However, it is not known how perceptual learning operates to reduce crowding. In this study we trained observers with a partial-report task (Experiment 1), in which they reported the central target letter of a three-letter string presented in the visual periphery, or a whole-report task (Experiment 2), in which they reported all three letters in order. We then assessed the impact of training on recognition of both unflanked and flanked targets, with particular attention to how perceptual learning affected the types of errors. Our results show that training improved target recognition but not single-letter recognition, indicating that training indeed affected crowding. However, training did not reduce target misplacement errors or flanker substitution errors. This dissociation between target recognition and flanker substitution errors supports the view that flanker substitution may be more likely a by-product (due to response bias), rather than a cause, of crowding. Moreover, the dissociation is not consistent with hypothesized mechanisms of crowding that would predict reduced positional errors.

  5. Electronic prescribing reduces prescribing error in public hospitals.

    Science.gov (United States)

    Shawahna, Ramzi; Rahman, Nisar-Ur; Ahmad, Mahmood; Debray, Marcel; Yliperttula, Marjo; Declèves, Xavier

    2011-11-01

    To examine the incidence of prescribing errors in a main public hospital in Pakistan and to assess the impact of introducing an electronic prescribing system on the reduction of their incidence. Medication errors are persistent in today's healthcare system. The impact of electronic prescribing on reducing errors has not been tested in the developing world. Prospective review of medication and discharge medication charts before and after the introduction of an electronic inpatient record and prescribing system. Inpatient records (n = 3300) and 1100 discharge medication sheets were reviewed for prescribing errors before and after the installation of the electronic prescribing system in 11 wards. Medications (13,328 and 14,064) were prescribed for inpatients, among which 3008 and 1147 prescribing errors were identified, giving an overall error rate of 22·6% and 8·2% throughout paper-based and electronic prescribing, respectively. Medications (2480 and 2790) were prescribed for discharge patients, among which 418 and 123 errors were detected, giving an overall error rate of 16·9% and 4·4% during paper-based and electronic prescribing, respectively. Electronic prescribing has a significant effect on the reduction of prescribing errors. Prescribing errors are commonplace in Pakistan public hospitals. The study evaluated the impact of introducing electronic inpatient records and electronic prescribing in the reduction of prescribing errors in a public hospital in Pakistan. © 2011 Blackwell Publishing Ltd.

  6. Reduced phase error through optimized control of a superconducting qubit

    International Nuclear Information System (INIS)

    Lucero, Erik; Kelly, Julian; Bialczak, Radoslaw C.; Lenander, Mike; Mariantoni, Matteo; Neeley, Matthew; O'Connell, A. D.; Sank, Daniel; Wang, H.; Weides, Martin; Wenner, James; Cleland, A. N.; Martinis, John M.; Yamamoto, Tsuyoshi

    2010-01-01

    Minimizing phase and other errors in experimental quantum gates allows higher fidelity quantum processing. To quantify and correct for phase errors, in particular, we have developed an experimental metrology - amplified phase error (APE) pulses - that amplifies and helps identify phase errors in general multilevel qubit architectures. In order to correct for both phase and amplitude errors specific to virtual transitions and leakage outside of the qubit manifold, we implement 'half derivative', an experimental simplification of derivative reduction by adiabatic gate (DRAG) control theory. The phase errors are lowered by about a factor of five using this method to ∼1.6 deg. per gate, and can be tuned to zero. Leakage outside the qubit manifold, to the qubit |2> state, is also reduced to ∼10⁻⁴ for 20% faster gates.

  7. The cost of human error intervention

    International Nuclear Information System (INIS)

    Bennett, C.T.; Banks, W.W.; Jones, E.D.

    1994-03-01

    DOE has directed that cost-benefit analyses be conducted as part of the review process for all new DOE orders. This new policy will have the effect of ensuring that DOE analysts can justify the implementation costs of the orders that they develop. We would like to argue that a cost-benefit analysis is merely one phase of a complete risk management program -- one that would more than likely start with a probabilistic risk assessment. The safety community defines risk as the probability of failure times the severity of consequence. An engineering definition of failure can be considered in terms of physical performance, as in mean-time-between-failure; or, it can be thought of in terms of human performance, as in probability of human error. The severity of consequence of a failure can be measured along any one of a number of dimensions -- economic, political, or social. Clearly, an analysis along one dimension cannot be directly compared to another, but a set of cost-benefit analyses, based on a series of cost dimensions, can be extremely useful to managers who must prioritize their resources. Over the last two years, DOE has been developing a series of human factors orders, directed at lowering the probability of human error -- or at least changing the distribution of those errors. The following discussion presents a series of cost-benefit analyses using historical events in the nuclear industry. However, we would first like to discuss some of the analytic cautions that must be considered when we deal with human error.
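
    As a rough illustration of the risk definition used above (risk = probability of failure times severity of consequence), the sketch below ranks hypothetical human-error events by expected annual cost; the event names, probabilities, and dollar figures are invented for illustration and do not come from the DOE analyses.

        # Illustrative only: expected-cost ranking of human-error events,
        # using risk = probability * severity (severity measured here in dollars).
        events = {
            # name: (probability of the error per year, consequence cost in dollars)
            "mis-set valve lineup": (1e-2, 5e5),
            "wrong breaker racked out": (5e-3, 2e6),
            "procedure step skipped": (2e-2, 1e5),
        }

        expected_cost = {name: p * cost for name, (p, cost) in events.items()}

        for name, cost in sorted(expected_cost.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{name}: expected annual cost ${cost:,.0f}")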

  8. Operator error and emotions. Operator error and emotions - a major cause of human failure

    Energy Technology Data Exchange (ETDEWEB)

    Patterson, B.K. [Human Factors Practical Incorporated (Canada); Bradley, M. [Univ. of New Brunswick, Saint John, New Brunswick (Canada); Artiss, W.G. [Human Factors Practical (Canada)

    2000-07-01

    This paper proposes the idea that a large proportion of the incidents attributed to operator and maintenance error in a nuclear or industrial plant are actually founded in our human emotions. Basic psychological theory of emotions is briefly presented and then the authors present situations and instances that can cause emotions to swell and lead to operator and maintenance error. Since emotional information is not recorded in industrial incident reports, the challenge is extended to industry, to review incident source documents for cases of emotional involvement and to develop means to collect emotion related information in future root cause analysis investigations. Training must then be provided to operators and maintainers to enable them to know one's emotions, manage emotions, motivate one's self, recognize emotions in others and handle relationships. Effective training will reduce the instances of human error based in emotions and enable a cooperative, productive environment in which to work. (author)

  9. Operator error and emotions. Operator error and emotions - a major cause of human failure

    International Nuclear Information System (INIS)

    Patterson, B.K.; Bradley, M.; Artiss, W.G.

    2000-01-01

    This paper proposes the idea that a large proportion of the incidents attributed to operator and maintenance error in a nuclear or industrial plant are actually founded in our human emotions. Basic psychological theory of emotions is briefly presented and then the authors present situations and instances that can cause emotions to swell and lead to operator and maintenance error. Since emotional information is not recorded in industrial incident reports, the challenge is extended to industry, to review incident source documents for cases of emotional involvement and to develop means to collect emotion related information in future root cause analysis investigations. Training must then be provided to operators and maintainers to enable them to know one's emotions, manage emotions, motivate one's self, recognize emotions in others and handle relationships. Effective training will reduce the instances of human error based in emotions and enable a cooperative, productive environment in which to work. (author)

  10. Human error in maintenance: An investigative study for the factories of the future

    International Nuclear Information System (INIS)

    Dhillon, B S

    2014-01-01

    This paper presents a study of human error in maintenance. Many different aspects of human error in maintenance considered useful for the factories of the future are studied, including facts, figures, and examples; occurrence of maintenance error in the equipment life cycle; elements of a maintenance person's time; the maintenance environment and the causes for the occurrence of maintenance error; types and typical maintenance errors; common maintainability design errors and useful design guidelines to reduce equipment maintenance errors; maintenance work instructions; and maintenance error analysis methods

  11. Human errors in operation - what to do with them?

    International Nuclear Information System (INIS)

    Michalek, J.

    2009-01-01

    'It is human to make errors!' This saying of our predecessors is still current and will continue to be valid in the future, for as long as humans are human. Errors cannot be completely eliminated from human activities. On average, a person makes two simple errors per hour. For example, how many typing errors do we make while typing on the computer keyboard? How many times do we make mistakes in writing the date in the first days of a new year? These errors have no major consequences; however, in certain situations human errors are very unpleasant and may also be very costly, and they may even endanger human lives. (author)

  12. Reduced error signalling in medication-naive children with ADHD

    DEFF Research Database (Denmark)

    Plessen, Kerstin J; Allen, Elena A; Eichele, Heike

    2016-01-01

    BACKGROUND: We examined the blood-oxygen level-dependent (BOLD) activation in brain regions that signal errors and their association with intraindividual behavioural variability and adaptation to errors in children with attention-deficit/hyperactivity disorder (ADHD). METHODS: We acquired ... [adaptation to errors was] reduced in children with ADHD. This adaptation was inversely related to activation of the right-lateralized ventral attention network (VAN) on error trials and to task-driven connectivity between the cingulo-opercular system and the VAN. LIMITATIONS: Our study was limited by the modest sample size ...

  13. SHERPA: A systematic human error reduction and prediction approach

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1986-01-01

    This paper describes a Systematic Human Error Reduction and Prediction Approach (SHERPA) which is intended to provide guidelines for human error reduction and quantification in a wide range of human-machine systems. The approach utilizes current cognitive models of human performance as its basis. The first module in SHERPA performs task and human error analyses, which identify likely error modes, together with guidelines for the reduction of these errors by training, procedures and equipment redesign. The second module uses a SARAH approach to quantify the probability of occurrence of the errors identified earlier, and provides cost-benefit analyses to assist in choosing the appropriate error reduction approaches in the third module.

  14. Errors as a Means of Reducing Impulsive Food Choice.

    Science.gov (United States)

    Sellitto, Manuela; di Pellegrino, Giuseppe

    2016-06-05

    Nowadays, the increasing incidence of eating disorders due to poor self-control has given rise to increased obesity and other chronic weight problems, and ultimately, to reduced life expectancy. The capacity to refrain from automatic responses is usually high in situations in which making errors is highly likely. The protocol described here aims at reducing imprudent preference in women during hypothetical intertemporal choices about appetitive food by associating it with errors. First, participants undergo an error task where two different edible stimuli are associated with two different error likelihoods (high and low). Second, they make intertemporal choices about the two edible stimuli, separately. As a result, this method decreases the discount rate for future amounts of the edible reward that cued higher error likelihood, selectively. This effect is under the influence of the self-reported hunger level. The present protocol demonstrates that errors, well known as motivationally salient events, can induce the recruitment of cognitive control, thus being ultimately useful in reducing impatient choices for edible commodities.

  15. Human Error Analysis in a Permit to Work System: A Case Study in a Chemical Plant

    Directory of Open Access Journals (Sweden)

    Mehdi Jahangiri

    2016-03-01

    Conclusion: The SPAR-H method applied in this study could analyze and quantify the potential human errors and extract the required measures for reducing the error probabilities in PTW system. Some suggestions to reduce the likelihood of errors, especially in the field of modifying the performance shaping factors and dependencies among tasks are provided.
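
    For context, SPAR-H quantifies a human error probability (HEP) by multiplying a nominal HEP by the product of performance shaping factor (PSF) multipliers, with an adjustment that keeps the result below 1 when several negative PSFs combine. The sketch below shows that commonly cited adjustment formula; the nominal HEP and PSF values are invented for illustration and are not taken from this case study.

        def spar_h_hep(nominal_hep: float, psf_multipliers: list) -> float:
            """Adjusted human error probability using the SPAR-H composite-PSF
            formula as commonly cited; the result stays bounded below 1.0."""
            composite = 1.0
            for m in psf_multipliers:
                composite *= m
            return (nominal_hep * composite) / (nominal_hep * (composite - 1.0) + 1.0)

        # Illustrative values only: nominal HEP of 1e-2 with three negative PSFs.
        print(spar_h_hep(1e-2, [10, 2, 5]))  # ~0.50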

  16. Reduced error signalling in medication-naive children with ADHD

    DEFF Research Database (Denmark)

    Plessen, Kerstin J; Allen, Elena A; Eichele, Heike

    2016-01-01

    BACKGROUND: We examined the blood-oxygen level-dependent (BOLD) activation in brain regions that signal errors and their association with intraindividual behavioural variability and adaptation to errors in children with attention-deficit/hyperactivity disorder (ADHD). METHODS: We acquired functional MRI data during a Flanker task in medication-naive children with ADHD and healthy controls aged 8-12 years and analyzed the data using independent component analysis. For components corresponding to performance monitoring networks, we compared activations across groups and conditions and correlated them with reaction times (RT). Additionally, we analyzed post-error adaptations in behaviour and motor component activations. RESULTS: We included 25 children with ADHD and 29 controls in our analysis. Children with ADHD displayed reduced activation to errors in cingulo-opercular regions ...

  17. Perancangan Fasilitas Kerja untuk Mereduksi Human Error [Design of Work Facilities to Reduce Human Error]

    Directory of Open Access Journals (Sweden)

    Harmein Nasution

    2012-01-01

    Full Text Available Work equipment and environments that are not designed ergonomically can cause physical exhaustion in workers. As a result of that physical exhaustion, many defects can occur on the production lines due to human error, along with musculoskeletal complaints. To overcome those effects, we applied methods for analyzing workers' posture based on the SNQ (Standard Nordic Questionnaire), PLIBEL, QEC (Quick Exposure Check) and biomechanics. Moreover, we applied those methods to design the rolling machines and the egrek grip ergonomically, so that the defects on those production lines can be minimized.

  18. When soft controls get slippery: User interfaces and human error

    International Nuclear Information System (INIS)

    Stubler, W.F.; O'Hara, J.M.

    1998-01-01

    Many types of products and systems that have traditionally featured physical control devices are now being designed with soft controls--input formats appearing on computer-based display devices and operated by a variety of input devices. A review of complex human-machine systems found that soft controls are particularly prone to some types of errors and may affect overall system performance and safety. This paper discusses the application of design approaches for reducing the likelihood of these errors and for enhancing usability, user satisfaction, and system performance and safety

  19. Human errors, countermeasures for their prevention and evaluation

    International Nuclear Information System (INIS)

    Kohda, Takehisa; Inoue, Koichi

    1992-01-01

    Accidents originating in human errors have continued to occur, as in recent large accidents such as the TMI accident and the Chernobyl accident. The proportion of accidents originating in human errors is unexpectedly high; hardware reliability and safety will keep improving, but a comparable improvement in human reliability cannot be expected. Human errors arise from the difference between the function required of people and the function they actually accomplish, and the results exert an adverse effect on systems. Human errors are classified into design errors, manufacturing errors, operation errors, maintenance errors, checkup errors and general handling errors. In terms of behavior, human errors are classified into forgetting to act, failing to act, doing what must not be done, acting in the wrong order, and acting at the wrong time. The factors in human error occurrence are circumstantial factors, personal factors and stress factors. As methods of analyzing and evaluating human errors, systems engineering methods such as probabilistic risk assessment are used. The technique for human error rate prediction, the method for human cognitive reliability, the confusion matrix and SLIM-MAUD are also used. (K.I.)

  20. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  1. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  2. Reducing Error, Fraud and Corruption (EFC) in Social Protection Programs

    OpenAIRE

    Tesliuc, Emil Daniel; Milazzo, Annamaria

    2007-01-01

    Social Protection (SP) and Social Safety Net (SSN) programs channel a large amount of public resources, so it is important to make sure that these resources reach the intended beneficiaries. Error, fraud, or corruption (EFC) reduces the economic efficiency of these interventions by decreasing the amount of money that goes to the intended beneficiaries, and erodes the political support for the program. ...

  3. Reducing Approximation Error in the Fourier Flexible Functional Form

    Directory of Open Access Journals (Sweden)

    Tristan D. Skolrud

    2017-12-01

    Full Text Available The Fourier Flexible form provides a global approximation to an unknown data generating process. In terms of limiting function specification error, this form is preferable to functional forms based on second-order Taylor series expansions. The Fourier Flexible form is a truncated Fourier series expansion appended to a second-order expansion in logarithms. By replacing the logarithmic expansion with a Box-Cox transformation, we show that the Fourier Flexible form can reduce approximation error by 25% on average in the tails of the data distribution. The new functional form allows for nested testing of a larger set of commonly implemented functional forms.
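
    The Box-Cox transformation mentioned above nests the logarithm as a limiting case, which is what lets the modified form reduce to the standard Fourier Flexible form; a minimal sketch (the lambda values are arbitrary illustrations):

        import math

        def box_cox(y: float, lam: float) -> float:
            """Box-Cox transform: (y**lam - 1)/lam for lam != 0, log(y) in the limit lam -> 0."""
            if abs(lam) < 1e-12:
                return math.log(y)
            return (y ** lam - 1.0) / lam

        # As lam -> 0 the transform approaches the natural logarithm used in the
        # second-order expansion of the standard Fourier Flexible form.
        print(box_cox(2.0, 0.0))  # ln 2, about 0.693
        print(box_cox(2.0, 0.5))  # about 0.828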

  4. Stereotype threat can reduce older adults' memory errors.

    Science.gov (United States)

    Barber, Sarah J; Mather, Mara

    2013-01-01

    Stereotype threat often incurs the cost of reducing the amount of information that older adults accurately recall. In the current research, we tested whether stereotype threat can also benefit memory. According to the regulatory focus account of stereotype threat, threat induces a prevention focus in which people become concerned with avoiding errors of commission and are sensitive to the presence or absence of losses within their environment. Because of this, we predicted that stereotype threat might reduce older adults' memory errors. Results were consistent with this prediction. Older adults under stereotype threat had lower intrusion rates during free-recall tests (Experiments 1 and 2). They also reduced their false alarms and adopted more conservative response criteria during a recognition test (Experiment 2). Thus, stereotype threat can decrease older adults' false memories, albeit at the cost of fewer veridical memories, as well.

  5. Reducing systematic errors in measurements made by a SQUID magnetometer

    International Nuclear Information System (INIS)

    Kiss, L.F.; Kaptás, D.; Balogh, J.

    2014-01-01

    A simple method is described which reduces those systematic errors of a superconducting quantum interference device (SQUID) magnetometer that arise from possible radial displacements of the sample in the second-order gradiometer superconducting pickup coil. By rotating the sample rod (and hence the sample) around its axis into a position where the best fit is obtained to the output voltage of the SQUID as the sample is moved through the pickup coil, the accuracy of measuring magnetic moments can be increased significantly. In the cases of an examined Co1.9Fe1.1Si Heusler alloy, pure iron and nickel samples, the accuracy could be increased over the value given in the specification of the device. The suggested method is only meaningful if the measurement uncertainty is dominated by systematic errors – radial displacement in particular – and not by instrumental or environmental noise. - Highlights: • A simple method is described which reduces systematic errors of a SQUID. • The errors arise from a radial displacement of the sample in the gradiometer coil. • The procedure is to rotate the sample rod (with the sample) around its axis. • The best fit to the SQUID voltage has to be attained moving the sample through the coil. • The accuracy of measuring magnetic moment can be increased significantly

  6. Reducing errors in the GRACE gravity solutions using regularization

    Science.gov (United States)

    Save, Himanshu; Bettadpur, Srinivas; Tapley, Byron D.

    2012-09-01

    The nature of the gravity field inverse problem amplifies the noise in the GRACE data, which creeps into the mid and high degree and order harmonic coefficients of the Earth's monthly gravity fields provided by GRACE. Due to the use of imperfect background models and data noise, these errors are manifested as north-south striping in the monthly global maps of equivalent water heights. In order to reduce these errors, this study investigates the use of the L-curve method with Tikhonov regularization. L-curve is a popular aid for determining a suitable value of the regularization parameter when solving linear discrete ill-posed problems using Tikhonov regularization. However, the computational effort required to determine the L-curve is prohibitively high for a large-scale problem like GRACE. This study implements a parameter-choice method, using Lanczos bidiagonalization which is a computationally inexpensive approximation to L-curve. Lanczos bidiagonalization is implemented with orthogonal transformation in a parallel computing environment and projects a large estimation problem onto a problem about 2 orders of magnitude smaller for computing the regularization parameter. Errors in the GRACE solution time series have certain characteristics that vary depending on the ground track coverage of the solutions. These errors increase with increasing degree and order. In addition, certain resonant and near-resonant harmonic coefficients have higher errors as compared with the other coefficients. Using the knowledge of these characteristics, this study designs a regularization matrix that provides a constraint on the geopotential coefficients as a function of their degree and order. This regularization matrix is then used to compute the appropriate regularization parameter for each monthly solution. A 7-year time-series of the candidate regularized solutions (Mar 2003-Feb 2010) shows markedly reduced error stripes compared with the unconstrained GRACE release 4
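
    The regularization step described above can be shown in miniature as a standard Tikhonov (ridge) solution of an ill-posed linear system, with a degree-dependent penalty matrix standing in for the constraint on high degree and order coefficients; the matrices and parameter value below are invented toy data, and the real GRACE problem is far larger and uses Lanczos bidiagonalization to pick the parameter.

        import numpy as np

        # Toy ill-posed problem: A x = b with rapidly decaying singular values.
        rng = np.random.default_rng(0)
        A = rng.normal(size=(50, 20)) @ np.diag(np.logspace(0, -6, 20))
        x_true = rng.normal(size=20)
        b = A @ x_true + 1e-4 * rng.normal(size=50)

        # Degree-dependent regularization: penalize "higher degree" coefficients more,
        # analogous to constraining high degree/order geopotential coefficients.
        L = np.diag(np.linspace(1.0, 10.0, 20))
        alpha = 1e-3  # regularization parameter (chosen via the L-curve in practice)

        # Tikhonov solution: minimize ||A x - b||^2 + alpha^2 * ||L x||^2
        x_reg = np.linalg.solve(A.T @ A + alpha**2 * (L.T @ L), A.T @ b)
        x_ls = np.linalg.lstsq(A, b, rcond=None)[0]

        print("plain least-squares error:", np.linalg.norm(x_ls - x_true))
        print("regularized error:        ", np.linalg.norm(x_reg - x_true))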

  7. Basic considerations in predicting error probabilities in human task performance

    International Nuclear Information System (INIS)

    Fleishman, E.A.; Buffardi, L.C.; Allen, J.A.; Gaskins, R.C. III

    1990-04-01

    It is well established that human error plays a major role in the malfunctioning of complex systems. This report takes a broad look at the study of human error and addresses the conceptual, methodological, and measurement issues involved in defining and describing errors in complex systems. In addition, a review of existing sources of human reliability data and approaches to human performance data base development is presented. Alternative task taxonomies, which are promising for establishing the comparability of nuclear and non-nuclear tasks, are also identified. Based on such taxonomic schemes, various data base prototypes for generalizing human error rates across settings are proposed. 60 refs., 3 figs., 7 tabs

  8. Understanding Human Error in Naval Aviation Mishaps.

    Science.gov (United States)

    Miranda, Andrew T

    2018-04-01

    To better understand the external factors that influence the performance and decisions of aviators involved in Naval aviation mishaps. Mishaps in complex activities, ranging from aviation to nuclear power operations, are often the result of interactions between multiple components within an organization. The Naval aviation mishap database contains relevant information, both in quantitative statistics and qualitative reports, that permits analysis of such interactions to identify how the working atmosphere influences aviator performance and judgment. Results from 95 severe Naval aviation mishaps that occurred from 2011 through 2016 were analyzed using the Bayes' theorem probability formula. Then a content analysis was performed on a subset of relevant mishap reports. Out of the 14 latent factors analyzed, the Bayesian analysis identified 6 that impacted specific aspects of aviator behavior during mishaps. Technological environment, misperceptions, and mental awareness impacted basic aviation skills. The remaining 3 factors were used to inform a content analysis of the contextual information within mishap reports. Teamwork failures were the result of plan continuation aggravated by diffused responsibility. Resource limitations and risk management deficiencies impacted judgments made by squadron commanders. The application of Bayes' theorem to historical mishap data revealed the role of latent factors within Naval aviation mishaps. Teamwork failures were seen to be considerably damaging to both aviator skill and judgment. Both the methods and findings have direct application for organizations interested in understanding the relationships between external factors and human error. The study presents real-world evidence to promote effective safety decisions.
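
    Bayes' theorem as used above can be sketched directly; the probabilities below are invented placeholders rather than the study's data, and only show how the posterior probability of a latent factor given an observed mishap aspect would be computed.

        def bayes_posterior(p_evidence_given_factor: float,
                            p_factor: float,
                            p_evidence: float) -> float:
            """P(factor | evidence) = P(evidence | factor) * P(factor) / P(evidence)."""
            return p_evidence_given_factor * p_factor / p_evidence

        # Illustrative numbers only.
        p_skill_error_given_teamwork = 0.60  # P(skill error | teamwork failure cited)
        p_teamwork = 0.30                    # P(teamwork failure cited in a mishap)
        p_skill_error = 0.45                 # P(skill error cited in a mishap)

        print(bayes_posterior(p_skill_error_given_teamwork, p_teamwork, p_skill_error))  # 0.4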

  9. Approaches to reducing photon dose calculation errors near metal implants

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Jessie Y.; Followill, David S.; Howell, Rebecca M.; Mirkovic, Dragan; Kry, Stephen F., E-mail: sfkry@mdanderson.org [Department of Radiation Physics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Graduate School of Biomedical Sciences, The University of Texas Health Science Center Houston, Houston, Texas 77030 (United States); Liu, Xinming [Department of Imaging Physics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Graduate School of Biomedical Sciences, The University of Texas Health Science Center Houston, Houston, Texas 77030 (United States); Stingo, Francesco C. [Department of Biostatistics, The University of Texas MD Anderson Cancer Center, 1515 Holcombe Boulevard, Houston, Texas 77030 and Graduate School of Biomedical Sciences, The University of Texas Health Science Center Houston, Houston, Texas 77030 (United States)

    2016-09-15

    Purpose: Dose calculation errors near metal implants are caused by limitations of the dose calculation algorithm in modeling tissue/metal interface effects as well as density assignment errors caused by imaging artifacts. The purpose of this study was to investigate two strategies for reducing dose calculation errors near metal implants: implementation of metal-based energy deposition kernels in the convolution/superposition (C/S) dose calculation method and use of metal artifact reduction methods for computed tomography (CT) imaging. Methods: Both error reduction strategies were investigated using a simple geometric slab phantom with a rectangular metal insert (composed of titanium or Cerrobend), as well as two anthropomorphic phantoms (one with spinal hardware and one with dental fillings), designed to mimic relevant clinical scenarios. To assess the dosimetric impact of metal kernels, the authors implemented titanium and silver kernels in a commercial collapsed cone C/S algorithm. To assess the impact of CT metal artifact reduction methods, the authors performed dose calculations using baseline imaging techniques (uncorrected 120 kVp imaging) and three commercial metal artifact reduction methods: Philips Healthcare’s O-MAR, GE Healthcare’s monochromatic gemstone spectral imaging (GSI) using dual-energy CT, and GSI with metal artifact reduction software (MARS) applied. For the simple geometric phantom, radiochromic film was used to measure dose upstream and downstream of metal inserts. For the anthropomorphic phantoms, ion chambers and radiochromic film were used to quantify the benefit of the error reduction strategies. Results: Metal kernels did not universally improve accuracy but rather resulted in better accuracy upstream of metal implants and decreased accuracy directly downstream. For the clinical cases (spinal hardware and dental fillings), metal kernels had very little impact on the dose calculation accuracy (<1.0%). Of the commercial CT artifact

  10. Approaches to reducing photon dose calculation errors near metal implants

    International Nuclear Information System (INIS)

    Huang, Jessie Y.; Followill, David S.; Howell, Rebecca M.; Mirkovic, Dragan; Kry, Stephen F.; Liu, Xinming; Stingo, Francesco C.

    2016-01-01

    Purpose: Dose calculation errors near metal implants are caused by limitations of the dose calculation algorithm in modeling tissue/metal interface effects as well as density assignment errors caused by imaging artifacts. The purpose of this study was to investigate two strategies for reducing dose calculation errors near metal implants: implementation of metal-based energy deposition kernels in the convolution/superposition (C/S) dose calculation method and use of metal artifact reduction methods for computed tomography (CT) imaging. Methods: Both error reduction strategies were investigated using a simple geometric slab phantom with a rectangular metal insert (composed of titanium or Cerrobend), as well as two anthropomorphic phantoms (one with spinal hardware and one with dental fillings), designed to mimic relevant clinical scenarios. To assess the dosimetric impact of metal kernels, the authors implemented titanium and silver kernels in a commercial collapsed cone C/S algorithm. To assess the impact of CT metal artifact reduction methods, the authors performed dose calculations using baseline imaging techniques (uncorrected 120 kVp imaging) and three commercial metal artifact reduction methods: Philips Healthcare’s O-MAR, GE Healthcare’s monochromatic gemstone spectral imaging (GSI) using dual-energy CT, and GSI with metal artifact reduction software (MARS) applied. For the simple geometric phantom, radiochromic film was used to measure dose upstream and downstream of metal inserts. For the anthropomorphic phantoms, ion chambers and radiochromic film were used to quantify the benefit of the error reduction strategies. Results: Metal kernels did not universally improve accuracy but rather resulted in better accuracy upstream of metal implants and decreased accuracy directly downstream. For the clinical cases (spinal hardware and dental fillings), metal kernels had very little impact on the dose calculation accuracy (<1.0%). Of the commercial CT artifact

  11. Experimental Evaluation of a Mixed Controller That Amplifies Spatial Errors and Reduces Timing Errors

    Directory of Open Access Journals (Sweden)

    Laura Marchal-Crespo

    2017-06-01

    Full Text Available Research on motor learning suggests that training with haptic guidance enhances learning of the timing components of motor tasks, whereas error amplification is better for learning the spatial components. We present a novel mixed guidance controller that combines haptic guidance and error amplification to simultaneously promote learning of the timing and spatial components of complex motor tasks. The controller is realized using a force field around the desired position. This force field has a stable manifold tangential to the trajectory that guides subjects in velocity-related aspects. The force field has an unstable manifold perpendicular to the trajectory, which amplifies the perpendicular (spatial) error. We also designed a controller that applies randomly varying, unpredictable disturbing forces to enhance the subjects’ active participation by pushing them away from their “comfort zone.” We conducted an experiment with thirty-two healthy subjects to evaluate the impact of four different training strategies on motor skill learning and self-reported motivation: (i) No haptics, (ii) mixed guidance, (iii) perpendicular error amplification and tangential haptic guidance provided in sequential order, and (iv) randomly varying disturbing forces. Subjects trained two motor tasks using ARMin IV, a robotic exoskeleton for upper limb rehabilitation: follow circles with an ellipsoidal speed profile, and move along a 3D line following a complex speed profile. Mixed guidance showed no detectable learning advantages over the other groups. Results suggest that the effectiveness of the training strategies depends on the subjects’ initial skill level. Mixed guidance seemed to benefit subjects who performed the circle task with smaller errors during baseline (i.e., initially more skilled subjects), while training with no haptics was more beneficial for subjects who created larger errors (i.e., less skilled subjects). Therefore, perhaps the high functional
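
    The mixed controller described above can be approximated as a force field that splits the position error into components tangential and perpendicular to the reference trajectory, attracting along the tangential direction (haptic guidance of the timing aspects) and repelling along the perpendicular one (amplification of the spatial error); the gains and vectors below are invented for illustration and are not the ARMin IV controller.

        import numpy as np

        def mixed_guidance_force(pos, desired_pos, tangent, k_guide=50.0, k_amplify=30.0):
            """Force that guides along the trajectory tangent and amplifies the
            error perpendicular to it (illustrative gains, 2-D case)."""
            tangent = tangent / np.linalg.norm(tangent)
            error = pos - desired_pos
            e_tan = np.dot(error, tangent) * tangent      # timing-related component
            e_perp = error - e_tan                        # spatial component
            return -k_guide * e_tan + k_amplify * e_perp  # attract tangentially, repel perpendicularly

        # Example: the subject is slightly behind and slightly off the desired path.
        pos = np.array([0.02, 0.05])
        desired = np.array([0.0, 0.0])
        tangent = np.array([1.0, 0.0])
        print(mixed_guidance_force(pos, desired, tangent))  # [-1.0  1.5]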

  12. Influence of organizational culture on human error

    International Nuclear Information System (INIS)

    Friedlander, M.A.; Evans, S.A.

    1996-01-01

    Much has been written in contemporary business literature during the last decade describing the role that corporate culture plays in virtually every aspect of a firm's success. In 1990 Kotter and Heskett wrote, "We found that firms with cultures that emphasized all of the key managerial constituencies (customers, stockholders, and employees) and leadership from managers at all levels out-performed firms that did not have those cultural traits by a huge margin. Over an eleven year period, the former increased revenues by an average of 682 percent versus 166 percent for the latter, expanded their workforce by 282 percent versus 36 percent, grew their stock prices by 901 percent versus 74 percent, and improved their net incomes by 756 percent versus 1 percent." Since the mid-1980s, several electric utilities have documented their efforts to undertake strategic culture change. In almost every case, these efforts have yielded dramatic improvements in the "bottom-line" operational and financial results (e.g., Western Resources, Arizona Public Service, San Diego Gas & Electric, and Electricity Trust of South Australia). Given the body of evidence that indicates a relationship between high-performing organizational culture and the financial and business success of a firm, Pennsylvania Power & Light Company undertook a study to identify the relationship between organizational culture and the frequency, severity, and nature of human error at the Susquehanna Steam Electric Station. The underlying proposition for this assessment is that organizational culture is an independent variable that transforms external events into organizational performance.

  13. Reducing WCET Overestimations by Correcting Errors in Loop Bound Constraints

    Directory of Open Access Journals (Sweden)

    Fanqi Meng

    2017-12-01

    Full Text Available In order to reduce overestimations of worst-case execution time (WCET), in this article we first report a kind of specific WCET overestimation caused by non-orthogonal nested loops. Then, we propose a novel correction approach which has three basic steps. The first step is to locate the worst-case execution path (WCEP) in the control flow graph and then map it onto source code. The second step is to identify non-orthogonal nested loops from the WCEP by means of an abstract syntax tree. The last step is to recursively calculate the WCET errors caused by the loose loop bound constraints, and then subtract the total errors from the overestimations. The novelty lies in the fact that the WCET correction is only conducted on the non-branching part of the WCEP, thus avoiding potential safety risks caused by possible WCEP switches. Experimental results show that our approach reduces the specific WCET overestimation by an average of more than 82%, and 100% of the corrected WCET is no less than the actual WCET. Thus, our approach is not only effective but also safe. It will help developers to design energy-efficient and safe real-time systems.
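
    The second step above, identifying non-orthogonal nested loops from an abstract syntax tree, can be illustrated in Python even though the original work targets C code: the helper below simply flags inner loops whose iteration bound references the outer loop variable. The function and example are illustrative assumptions, not the authors' tool.

        import ast

        def find_non_orthogonal_loops(source: str):
            """Return (outer_line, inner_line) pairs where the inner for-loop's
            iterable references the outer loop variable (a non-orthogonal nest)."""
            findings = []
            for outer in ast.walk(ast.parse(source)):
                if not isinstance(outer, ast.For) or not isinstance(outer.target, ast.Name):
                    continue
                outer_var = outer.target.id
                for inner in ast.walk(outer):
                    if inner is outer or not isinstance(inner, ast.For):
                        continue
                    names = {n.id for n in ast.walk(inner.iter) if isinstance(n, ast.Name)}
                    if outer_var in names:
                        findings.append((outer.lineno, inner.lineno))
            return findings

        # Inner bound depends on i, so the nest is non-orthogonal.
        example = "for i in range(10):\n    for j in range(i, 10):\n        work(i, j)\n"
        print(find_non_orthogonal_loops(example))  # [(1, 2)]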

  14. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.

    2010-12-01

    In this paper, we address the problem of reducing the occurrence of Human Errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21% of interruptions were caused by human error. We focus on Change Management, the process with the largest risk of human error, and identify the main instances of human errors as the 4 Wrongs: request, time, configuration item, and command. Analysis of change records revealed that human-error prevention by partial automation is highly relevant. We propose the HEP Framework, a framework for execution of IT Service Delivery operations that reduces human error by addressing the 4 Wrongs using content integration, contextualization of operation patterns, partial automation of command execution, and controlled access to resources.
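
    One way to read the "4 Wrongs" is as a pre-execution gate on a change: verify the request, the time window, the target configuration item, and the command against the approved change record before anything runs. The sketch below is an illustrative assumption about what such a guard could look like, not the HEP Framework's actual implementation.

        from datetime import datetime

        def check_four_wrongs(change_record: dict, request_id: str, now: datetime,
                              target_ci: str, command: str) -> list:
            """Return the list of '4 Wrongs' detected before executing a change (illustrative)."""
            wrongs = []
            if request_id != change_record["request_id"]:
                wrongs.append("wrong request")
            if not (change_record["window_start"] <= now <= change_record["window_end"]):
                wrongs.append("wrong time")
            if target_ci not in change_record["approved_cis"]:
                wrongs.append("wrong configuration item")
            if command not in change_record["approved_commands"]:
                wrongs.append("wrong command")
            return wrongs

        record = {
            "request_id": "CHG-1234",
            "window_start": datetime(2024, 1, 6, 1, 0),
            "window_end": datetime(2024, 1, 6, 3, 0),
            "approved_cis": {"db-server-07"},
            "approved_commands": {"systemctl restart postgresql"},
        }
        print(check_four_wrongs(record, "CHG-1234", datetime(2024, 1, 6, 2, 0),
                                "db-server-07", "reboot"))  # ['wrong command']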

  15. Can human error theory explain non-adherence?

    Science.gov (United States)

    Barber, Nick; Safdar, A; Franklin, Bryoney D

    2005-08-01

    To apply human error theory to explain non-adherence and examine how well it fits. Patients who were taking chronic medication were telephoned and asked whether they had been adhering to their medicine; if not, the reasons were explored and analysed according to a human error theory. Of 105 patients, 87 were contacted by telephone and took part in the study. Forty-two recalled being non-adherent, 17 of them in the last 7 days; 11 of the 42 were intentionally non-adherent. The errors could be described by human error theory, and it explained unintentional non-adherence well; however, the application of 'rules' was difficult when considering mistakes. The consideration of error-producing conditions and latent failures also revealed useful contributing factors. Human error theory offers a new and valuable way of understanding non-adherence, and could inform interventions. However, the theory needs further development to explain intentional non-adherence.

  16. Reducing image interpretation errors – Do communication strategies undermine this?

    International Nuclear Information System (INIS)

    Snaith, B.; Hardy, M.; Lewis, E.F.

    2014-01-01

    Introduction: Errors in the interpretation of diagnostic images in the emergency department are a persistent problem internationally. To address this issue, a number of risk reduction strategies have been suggested but only radiographer abnormality detection schemes (RADS) have been widely implemented in the UK. This study considers the variation in RADS operation and communication in light of technological advances and changes in service operation. Methods: A postal survey of all NHS hospitals operating either an Emergency Department or Minor Injury Unit and a diagnostic imaging (radiology) department (n = 510) was undertaken between July and August 2011. The questionnaire was designed to elicit information on emergency service provision and details of RADS. Results: 325 questionnaires were returned (n = 325/510; 63.7%). The majority of sites (n = 288/325; 88.6%) operated a RADS with the majority (n = 227/288; 78.8%) employing a visual ‘flagging’ system as the only method of communication although symbols used were inconsistent and contradictory across sites. 61 sites communicated radiographer findings through a written proforma (paper or electronic) but this was run in conjunction with a flagging system at 50 sites. The majority of sites did not have guidance on the scope or operation of the ‘flagging’ or written communication system in use. Conclusions: RADS is an established clinical intervention to reduce errors in diagnostic image interpretation within the emergency setting. The lack of standardisation in communication processes and practices alongside the rapid adoption of technology has increased the potential for error and miscommunication

  17. Analysis of Employee's Survey for Preventing Human-Errors

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Chanho; Kim, Younggab; Joung, Sanghoun [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    Human errors in nuclear power plants can cause both large and small events or incidents. These events or incidents are one of the main contributors to reactor trips and might threaten the safety of nuclear plants. To prevent human errors, KHNP (Korea Hydro and Nuclear Power) introduced 'human-error prevention techniques' and has applied them to main areas such as plant operation, operation support, and maintenance and engineering. This paper proposes methods to prevent and reduce human errors in nuclear power plants by analyzing survey results covering the utilization of the human-error prevention techniques and the employees' awareness of preventing human errors. With regard to human-error prevention, this survey analysis presented the status of the human-error prevention techniques and the employees' awareness of preventing human errors. Employees' understanding and utilization of the techniques were generally high, and the training level of employees and the effect of training on actual work were in good condition. Also, employees answered that the root causes of human error lay in the working environment, including tight processes, manpower shortages, and excessive workload, rather than in personal negligence or lack of personal knowledge. Consideration of the working environment is certainly needed. At present, based on this survey analysis, the best methods of preventing human error are personal equipment, substantial training and education, private mental health checks before starting work, prohibition of performing multiple tasks at once, compliance with procedures, and enhancement of job site review. However, the most important and basic things for preventing human error are the interest of workers and an organizational atmosphere with good communication between managers and workers and between employees and their bosses.

  18. A system engineer's Perspective on Human Errors For a more Effective Management of Human Factors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Yong-Hee; Jang, Tong-Il; Lee, Soo-Kil

    2007-01-01

    The management of human factors in nuclear power plants (NPPs) has become one of the burdensome factors during the operating period that follows design and construction. Almost every study of major accidents emphasizes the prominent importance of human errors. Beyond regulatory requirements such as the Periodic Safety Review, the management of human factors is a main issue for reducing human errors and enhancing plant performance. However, it is not easy to find a more effective perspective on human errors from which to establish an engineering implementation plan for preventing them. This paper describes a system engineer's perspective on human errors and discusses its application to a recent study of human error events in Korean NPPs.

  19. Risk Management and the Concept of Human Error

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1995-01-01

    by a stochastic coincidence of faults and human errors, but by a systemic erosion of the defenses due to decision making under competitive pressure in a dynamic environment. The presentation will discuss the nature of human error and the risk management problems found in a dynamic, competitive society facing...

  20. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.; Rosu, D.; Loewenstern, D.; Buco, M. J.; Guo, S.; Lavrado, Rafael Coelho; Gupta, M.; De, P.; Madduri, V.; Singh, J. K.

    2010-01-01

    In this paper, we address the problem of reducing the occurrence of Human Errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21

  1. Investigations on human error hazards in recent unintended trip events of Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Jang, Tong Il; Lee, Yong Hee; Shin, Kwang Hyeon [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    According to the Operational Performance Information System (OPIS), which is operated by KINS (Korea Institute of Nuclear Safety) to improve public understanding, unintended trip events caused mainly by human errors amounted to 38 cases (18.7%) from 2000 to 2011. Although the Nuclear Power Plant (NPP) industry in Korea has been making efforts to reduce the human errors which have largely contributed to trip events, the human error rate might keep increasing. Interestingly, digital-based I and C systems are one of the factors reducing unintended reactor trips. Human errors, however, have also occurred because of the digital-based I and C systems, since those systems require new or changed behaviors from the NPP operators. Therefore, it is necessary that investigations of human errors consider a new methodology to find not only tangible behavior but also intangible behavior such as organizational behavior. In this study we investigated human errors to find latent factors, such as decisions and conditions, in all of the unintended reactor trip events during the last dozen years. To find them, we applied HFACS (the Human Factors Analysis and Classification System), a commonly utilized tool for investigating human contributions to aviation accidents under a widespread evaluation scheme. The objective of this study is to find latent factors behind human errors in nuclear reactor trip events. Therefore, a method to investigate unintended trip events caused by human errors, and the results, will be discussed in more detail.

  2. Investigations on human error hazards in recent unintended trip events of Korean nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Jang, Tong Il; Lee, Yong Hee; Shin, Kwang Hyeon

    2012-01-01

    According to the Operational Performance Information System (OPIS), which is operated by KINS (Korea Institute of Nuclear Safety) to improve public understanding, unintended trip events caused mainly by human errors amounted to 38 cases (18.7%) from 2000 to 2011. Although the Nuclear Power Plant (NPP) industry in Korea has been making efforts to reduce the human errors which have largely contributed to trip events, the human error rate might keep increasing. Interestingly, digital-based I and C systems are one of the factors reducing unintended reactor trips. Human errors, however, have also occurred because of the digital-based I and C systems, since those systems require new or changed behaviors from the NPP operators. Therefore, it is necessary that investigations of human errors consider a new methodology to find not only tangible behavior but also intangible behavior such as organizational behavior. In this study we investigated human errors to find latent factors, such as decisions and conditions, in all of the unintended reactor trip events during the last dozen years. To find them, we applied HFACS (the Human Factors Analysis and Classification System), a commonly utilized tool for investigating human contributions to aviation accidents under a widespread evaluation scheme. The objective of this study is to find latent factors behind human errors in nuclear reactor trip events. Therefore, a method to investigate unintended trip events caused by human errors, and the results, will be discussed in more detail.

  3. Application of grey incidence analysis to connection between human errors and root cause

    International Nuclear Information System (INIS)

    Ren Yinxiang; Yu Ren; Zhou Gang; Chen Dengke

    2008-01-01

    By introducing grey incidence analysis, the paper investigates the relative importance of root causes' impact on human errors. On the basis of WANO statistical data and grey incidence analysis, lack of alternate examination, poor basic operation, shortage of theoretical knowledge, relaxation of organization and management, and deficiency of regulations are the root causes with the most important influence on human errors. Finally, the question of how to reduce human errors is discussed. (authors)
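
    Grey incidence (grey relational) analysis ranks candidate factors by how closely their data series track a reference series; a minimal sketch of the standard grey relational degree computation follows, with invented series values standing in for the WANO statistics used in the paper.

        def grey_relational_degrees(reference, candidates, rho=0.5):
            """Grey relational degree of each candidate sequence against the reference
            (Deng's formulation; sequences are assumed to be pre-normalized)."""
            all_deltas = [abs(r - s) for seq in candidates.values()
                          for r, s in zip(reference, seq)]
            d_min, d_max = min(all_deltas), max(all_deltas)
            degrees = {}
            for name, seq in candidates.items():
                coeffs = [(d_min + rho * d_max) / (abs(r - s) + rho * d_max)
                          for r, s in zip(reference, seq)]
                degrees[name] = sum(coeffs) / len(coeffs)
            return degrees

        # Illustrative, normalized yearly series: human-error events (reference) vs. two root causes.
        human_errors = [0.9, 1.0, 0.7, 0.8, 1.0]
        candidates = {
            "lack of alternate examination": [0.8, 0.9, 0.6, 0.8, 0.9],
            "deficiency of regulations":     [0.3, 0.9, 0.2, 0.7, 0.4],
        }
        print(grey_relational_degrees(human_errors, candidates))
        # The more closely a candidate tracks the reference, the higher its degree.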

  4. Reducing waste and errors: piloting lean principles at Intermountain Healthcare.

    Science.gov (United States)

    Jimmerson, Cindy; Weber, Dorothy; Sobek, Durward K

    2005-05-01

    The Toyota Production System (TPS), based on industrial engineering principles and operational innovations, is used to achieve waste reduction and efficiency while increasing product quality. Several key tools and principles, adapted to health care, have proved effective in improving hospital operations. Value Stream Maps (VSMs), which represent the key people, material, and information flows required to deliver a product or service, distinguish between value-adding and non-value-adding steps. The one-page Problem-Solving A3 Report guides staff through a rigorous and systematic problem-solving process. PILOT PROJECT at INTERMOUNTAIN HEALTHCARE: In a pilot project, participants made many improvements, ranging from simple changes implemented immediately (for example, heart monitor paper not available when a patient presented with a dysrhythmia) to larger projects involving patient or information flow issues across multiple departments. Most of the improvements required little or no investment and reduced significant amounts of wasted time for front-line workers. In one unit, turnaround time for pathologist reports from an anatomical pathology lab was reduced from five to two days. TPS principles and tools are applicable to an endless variety of processes and work settings in health care and can be used to address critical challenges such as medical errors, escalating costs, and staffing shortages.

  5. An Approach to Human Error Hazard Detection of Unexpected Situations in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sangjun; Oh, Yeonju; Shin, Youmin; Lee, Yong-Hee [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The Fukushima accident is a typical complex event, including extreme situations induced by the succeeding earthquake, tsunami, explosion, and human errors. From a human engineering point of view, its causes are judged to include incomplete system build-up, procedural deficiencies in the response manuals, education and training, team capability, and the discharge of operator duties. In particular, the guidelines of currently operating NPPs do not sufficiently include countermeasures to human errors in extreme situations. Therefore, this paper describes a trial to detect the hazards of human errors in extreme situations, and to define countermeasures that can properly respond to those human error hazards when individuals, teams, organizations, and other working entities encounter an extreme situation in NPPs. In this paper we try to propose an approach to analyzing and extracting human error hazards in order to suggest additional countermeasures to human errors in unexpected situations. These might be utilized to develop contingency guidelines, especially for reducing human error accidents in NPPs. However, the trial application in this study is currently limited, since it is not easy to find accident cases described in enough detail to enumerate the proposed steps. Therefore, we will try to analyze as many cases as possible, and consider other environmental factors and human error conditions.

  6. An Approach to Human Error Hazard Detection of Unexpected Situations in NPPs

    International Nuclear Information System (INIS)

    Park, Sangjun; Oh, Yeonju; Shin, Youmin; Lee, Yong-Hee

    2015-01-01

    The Fukushima accident is a typical complex event, including extreme situations induced by the succeeding earthquake, tsunami, explosion, and human errors. From a human engineering point of view, its causes are judged to include incomplete system build-up, procedural deficiencies in the response manuals, education and training, team capability, and the discharge of operator duties. In particular, the guidelines of currently operating NPPs do not sufficiently include countermeasures to human errors in extreme situations. Therefore, this paper describes a trial to detect the hazards of human errors in extreme situations, and to define countermeasures that can properly respond to those human error hazards when individuals, teams, organizations, and other working entities encounter an extreme situation in NPPs. In this paper we try to propose an approach to analyzing and extracting human error hazards in order to suggest additional countermeasures to human errors in unexpected situations. These might be utilized to develop contingency guidelines, especially for reducing human error accidents in NPPs. However, the trial application in this study is currently limited, since it is not easy to find accident cases described in enough detail to enumerate the proposed steps. Therefore, we will try to analyze as many cases as possible, and consider other environmental factors and human error conditions.

  7. Reducing number entry errors: solving a widespread, serious problem.

    Science.gov (United States)

    Thimbleby, Harold; Cairns, Paul

    2010-10-06

    Number entry is ubiquitous: it is required in many fields including science, healthcare, education, government, mathematics and finance. People entering numbers can be expected to make errors, but shockingly few systems make any effort to detect, block or otherwise manage errors. Worse, errors may be ignored but processed in arbitrary ways, with unintended results. A standard class of error (defined in the paper) is an 'out by 10 error', which is easily made by miskeying a decimal point or a zero. In safety-critical domains, such as drug delivery, out by 10 errors generally have adverse consequences. Here, we expose the extent of the problem of numeric errors in a very wide range of systems. An analysis of better error management is presented: under reasonable assumptions, we show that the probability of out by 10 errors can be halved by better user interface design. We provide a demonstration user interface to show that the approach is practical. To kill an error is as good a service as, and sometimes even better than, the establishing of a new truth or fact. (Charles Darwin 1879 [2008], p. 229).
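
    One concrete reading of "better error management" above is to parse and validate the keyed string before accepting it, blocking malformed entries (double decimal points, stray keys) and flagging values outside the expected range, which is where out by 10 slips show up; the checker below is an illustrative sketch, not the authors' demonstration interface.

        import re

        def validate_number_entry(keyed: str, low: float, high: float):
            """Strictly parse a keyed numeric string and check it against an expected
            range; returns (value, None) if accepted, else (None, reason). Illustrative only."""
            keyed = keyed.strip()
            if not re.fullmatch(r"\d+(\.\d+)?", keyed):
                return None, "malformed entry (blocked rather than silently reinterpreted)"
            value = float(keyed)
            if not (low <= value <= high):
                return None, f"{value} outside expected range {low}-{high}: possible out by 10 error"
            return value, None

        # Example: an infusion rate expected to lie between 0.5 and 5 mL/h.
        print(validate_number_entry("5..0", 0.5, 5))  # blocked: malformed
        print(validate_number_entry("50", 0.5, 5))    # flagged: likely out by 10 slip
        print(validate_number_entry("5.0", 0.5, 5))   # accepted: (5.0, None)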

  8. The recovery factors analysis of the human errors for research reactors

    International Nuclear Information System (INIS)

    Farcasiu, M.; Nitoi, M.; Apostol, M.; Turcu, I.; Florescu, Ghe.

    2006-01-01

    The results of many Probabilistic Safety Assessment (PSA) studies show a very significant contribution of human errors to the unavailability of systems in nuclear installations. The treatment of human interactions is considered one of the major limitations in the context of PSA. To identify those human actions that can have an effect on system reliability or availability, applying Human Reliability Analysis (HRA) is necessary. The analysis of recovery factors for human actions is an important step in HRA. This paper presents how human error probabilities (HEP) can be reduced using those elements that have the capacity to recover a human error. Recovery factor modeling is aimed at identifying error-likelihood situations or situations that lead to the development of the accident. This analysis is realized by the THERP method. The necessary information was obtained from the operating experience of the TRIGA research reactor of INR Pitesti. The required data were obtained from generic databases. (authors)

  9. Sensitivity of risk parameters to human errors for a PWR

    International Nuclear Information System (INIS)

    Samanta, P.; Hall, R.E.; Kerr, W.

    1980-01-01

    Sensitivities of the risk parameters (emergency safety system unavailabilities, accident sequence probabilities, release category probabilities and core melt probability) to changes in the human error rates were investigated within the general methodological framework of the Reactor Safety Study for a Pressurized Water Reactor (PWR). The impact of individual human errors was assessed both in terms of their structural importance to core melt and their reliability importance on core melt probability. The Human Error Sensitivity Assessment of a PWR (HESAP) computer code was written for the purpose of this study.
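
    HESAP itself is not described in detail in this record, so the sketch below only illustrates the general sensitivity-scan idea under a rare-event (sum of minimal cut sets) approximation: human error probabilities are scaled by a factor and the top-event probability is recomputed. All event names and probabilities are made up for illustration.

```python
# Toy minimal cut sets for a top event (e.g. core melt): each cut set is a
# tuple of basic events whose joint failure causes the top event.
CUT_SETS = [("pump_fails", "operator_misdiagnosis"),
            ("valve_fails", "operator_omits_step")]

BASE_PROBS = {"pump_fails": 1e-3, "valve_fails": 5e-4,
              "operator_misdiagnosis": 3e-2, "operator_omits_step": 1e-2}

HUMAN_EVENTS = {"operator_misdiagnosis", "operator_omits_step"}

def top_event_probability(probs):
    """Rare-event approximation: sum of cut-set probabilities."""
    total = 0.0
    for cut_set in CUT_SETS:
        p = 1.0
        for event in cut_set:
            p *= probs[event]
        total += p
    return total

baseline = top_event_probability(BASE_PROBS)
for factor in (0.1, 1.0, 10.0):
    scaled = {e: p * factor if e in HUMAN_EVENTS else p
              for e, p in BASE_PROBS.items()}
    ratio = top_event_probability(scaled) / baseline
    print(f"human error rates x{factor}: top-event probability x{ratio:.2f}")
```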

  10. Trial application of a technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Bley, D.C.; Cooper, S.E.; Parry, G.W.

    1996-01-01

    The new method for HRA, ATHEANA, has been developed based on a study of the operating history of serious accidents and an understanding of the reasons why people make errors. Previous publications associated with the project have dealt with the theoretical framework under which errors occur and the retrospective analysis of operational events. This is the first attempt to use ATHEANA in a prospective way, to select and evaluate human errors within the PSA context

  11. Interventions for reducing medication errors in children in hospital

    NARCIS (Netherlands)

    Maaskant, Jolanda M; Vermeulen, Hester; Apampa, Bugewa; Fernando, Bernard; Ghaleb, Maisoon A; Neubert, Antje; Thayyil, Sudhin; Soe, Aung

    2015-01-01

    BACKGROUND: Many hospitalised patients are affected by medication errors (MEs) that may cause discomfort, harm and even death. Children are at especially high risk of harm as the result of MEs because such errors are potentially more hazardous to them than to adults. Until now, interventions to

  12. Interventions for reducing medication errors in children in hospital

    NARCIS (Netherlands)

    Maaskant, Jolanda M.; Vermeulen, Hester; Apampa, Bugewa; Fernando, Bernard; Ghaleb, Maisoon A.; Neubert, Antje; Thayyil, Sudhin; Soe, Aung

    2015-01-01

    Background Many hospitalised patients are affected by medication errors (MEs) that may cause discomfort, harm and even death. Children are at especially high risk of harm as the result of MEs because such errors are potentially more hazardous to them than to adults. Until now, interventions to

  13. Reducing wrong patient selection errors: exploring the design space of user interface techniques.

    Science.gov (United States)

    Sopan, Awalin; Plaisant, Catherine; Powsner, Seth; Shneiderman, Ben

    2014-01-01

    Wrong patient selection errors are a major issue for patient safety; from ordering medication to performing surgery, the stakes are high. Widespread adoption of Electronic Health Record (EHR) and Computerized Provider Order Entry (CPOE) systems makes patient selection using a computer screen a frequent task for clinicians. Careful design of the user interface can help mitigate the problem by helping providers recall their patients' identities, accurately select their names, and spot errors before orders are submitted. We propose a catalog of twenty-seven distinct user interface techniques, organized according to a task analysis. An associated video demonstrates eighteen of those techniques. EHR designers who consider a wider range of human-computer interaction techniques could reduce selection errors, but verification of efficacy is still needed.
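
    The paper's twenty-seven techniques are not listed in this record; as a purely illustrative sketch of one technique of that general kind, the code below annotates a patient pick-list with a warning when two listed names are confusably similar, so the interface can prompt the clinician to verify a second identifier. The data structure and similarity threshold are assumptions, not taken from the paper.

```python
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.8) -> bool:
    """Crude string-similarity test for flagging confusable names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def annotate_patient_list(patients):
    """Return (patient, warning) pairs; warn when another listed patient
    has a confusably similar name, so the picker can render an alert."""
    annotated = []
    for i, p in enumerate(patients):
        clash = any(similar(p["name"], q["name"])
                    for j, q in enumerate(patients) if j != i)
        warning = "similar name on this list -- verify date of birth" if clash else ""
        annotated.append((p, warning))
    return annotated

worklist = [{"name": "John Smith", "dob": "1950-02-01"},
            {"name": "Jon Smith", "dob": "1948-11-23"},
            {"name": "Maria Lopez", "dob": "1972-07-14"}]
for patient, warning in annotate_patient_list(worklist):
    print(patient["name"], "-", warning or "ok")
```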

  14. The psychological background about human error and safety in NPP

    International Nuclear Information System (INIS)

    Zhang Li

    1992-01-01

    Human error is one of the factors that cause accidents in NPPs. The in-situ psychological background plays an important role in inducing it. The author analyzes the structure of a worker's psychological background on the job and gives a few examples of typical psychological backgrounds that result in human errors. Finally, it is pointed out that the fundamental way to eliminate the unfavourable psychological background of safe production is to establish a safety culture in the NPP along with its characteristics.

  15. Intervention strategies for the management of human error

    Science.gov (United States)

    Wiener, Earl L.

    1993-01-01

    This report examines the management of human error in the cockpit. The principles probably apply to other applications in the aviation realm (e.g. air traffic control, dispatch, weather) as well as to other high-risk systems outside of aviation (e.g. shipping, high-technology medical procedures, military operations, nuclear power production). Management of human error is distinguished from error prevention. It is a more encompassing term, which includes not only the prevention of error, but also a means of preventing an error, once made, from adversely affecting system output. Such techniques include: traditional human factors engineering, improvement of feedback and feedforward of information from system to crew, 'error-evident' displays which make erroneous input more obvious to the crew, trapping of errors within a system, goal-sharing between humans and machines (also called 'intent-driven' systems), paperwork management, and behaviorally based approaches, including procedures, standardization, checklist design, training, cockpit resource management, etc. Fifteen guidelines for the design and implementation of intervention strategies are included.

  16. Human medial frontal cortex activity predicts learning from errors.

    Science.gov (United States)

    Hester, Robert; Barre, Natalie; Murphy, Kevin; Silk, Tim J; Mattingley, Jason B

    2008-08-01

    Learning from errors is a critical feature of human cognition. It underlies our ability to adapt to changing environmental demands and to tune behavior for optimal performance. The posterior medial frontal cortex (pMFC) has been implicated in the evaluation of errors to control behavior, although it has not previously been shown that activity in this region predicts learning from errors. Using functional magnetic resonance imaging, we examined activity in the pMFC during an associative learning task in which participants had to recall the spatial locations of 2-digit targets and were provided with immediate feedback regarding accuracy. Activity within the pMFC was significantly greater for errors that were subsequently corrected than for errors that were repeated. Moreover, pMFC activity during recall errors predicted future responses (correct vs. incorrect), despite a sizeable interval (on average 70 s) between an error and the next presentation of the same recall probe. Activity within the hippocampus also predicted future performance and correlated with error-feedback-related pMFC activity. A relationship between performance expectations and pMFC activity, in the absence of differing reinforcement value for errors, is consistent with the idea that error-related pMFC activity reflects the extent to which an outcome is "worse than expected."

  17. Modeling Human Error Mechanism for Soft Control in Advanced Control Rooms (ACRs)

    International Nuclear Information System (INIS)

    Aljneibi, Hanan Salah Ali; Ha, Jun Su; Kang, Seongkeun; Seong, Poong Hyun

    2015-01-01

    To achieve the switch from conventional analog-based design to digital design in ACRs, a large number of manual operating controls and switches have to be replaced by a few common multi-function devices, which are called the soft control system. The soft controls in APR-1400 ACRs are classified into safety-grade and non-safety-grade soft controls; each was designed using different and independent input devices in the ACRs. Operations using soft controls require operators to perform new tasks that were not necessary with conventional controls, such as navigating computerized displays to monitor plant information and control devices. These kinds of computerized displays and soft controls may make operations more convenient, but they might also cause new types of human error. In this study, the human error mechanism during soft control is studied and modeled so that it can be used for the analysis and enhancement of human performance (or reduction of human errors) during NPP operation. The developed model could support many applications for improving human performance (or reducing human errors), HMI designs, and operators' training programs in ACRs. The developed model of the human error mechanism for soft control is based on the following assumptions: a human operator has a certain amount of capacity in cognitive resources, and if the resources required by operating tasks are greater than the resources invested by the operator, human error (or poor human performance) is likely to occur (especially as a 'slip'); good HMI (human-machine interface) design decreases the required resources; the operator's skillfulness decreases the required resources; and high vigilance increases the invested resources.
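
    The record states the model's assumptions only qualitatively, so the sketch below simply encodes them with illustrative coefficients: required resources fall with HMI quality and skillfulness, invested resources rise with vigilance, and an error-prone condition is declared when demand exceeds investment. None of the weights come from the paper.

```python
from dataclasses import dataclass

@dataclass
class SoftControlTask:
    base_demand: float      # nominal cognitive resources the task requires
    hmi_quality: float      # 0..1, higher = better interface design
    skillfulness: float     # 0..1, higher = more practiced operator
    vigilance: float        # 0..1, higher = more attention invested

    def required_resources(self) -> float:
        # Good HMI design and operator skill both reduce the demand.
        return self.base_demand * (1.0 - 0.4 * self.hmi_quality) * (1.0 - 0.4 * self.skillfulness)

    def invested_resources(self, capacity: float = 1.0) -> float:
        # High vigilance lets the operator invest more of a fixed capacity.
        return capacity * (0.5 + 0.5 * self.vigilance)

    def error_prone(self) -> bool:
        # Error (especially a slip) is likely when demand exceeds investment.
        return self.required_resources() > self.invested_resources()

task = SoftControlTask(base_demand=1.1, hmi_quality=0.3, skillfulness=0.5, vigilance=0.4)
print(task.required_resources(), task.invested_resources(), task.error_prone())
```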

  18. Modeling Human Error Mechanism for Soft Control in Advanced Control Rooms (ACRs)

    Energy Technology Data Exchange (ETDEWEB)

    Aljneibi, Hanan Salah Ali [Khalifa Univ., Abu Dhabi (United Arab Emirates); Ha, Jun Su; Kang, Seongkeun; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2015-10-15

    To achieve the switch from conventional analog-based design to digital design in ACRs, a large number of manual operating controls and switches have to be replaced by a few common multi-function devices, which are called the soft control system. The soft controls in APR-1400 ACRs are classified into safety-grade and non-safety-grade soft controls; each was designed using different and independent input devices in the ACRs. Operations using soft controls require operators to perform new tasks that were not necessary with conventional controls, such as navigating computerized displays to monitor plant information and control devices. These kinds of computerized displays and soft controls may make operations more convenient, but they might also cause new types of human error. In this study, the human error mechanism during soft control is studied and modeled so that it can be used for the analysis and enhancement of human performance (or reduction of human errors) during NPP operation. The developed model could support many applications for improving human performance (or reducing human errors), HMI designs, and operators' training programs in ACRs. The developed model of the human error mechanism for soft control is based on the following assumptions: a human operator has a certain amount of capacity in cognitive resources, and if the resources required by operating tasks are greater than the resources invested by the operator, human error (or poor human performance) is likely to occur (especially as a 'slip'); good HMI (human-machine interface) design decreases the required resources; the operator's skillfulness decreases the required resources; and high vigilance increases the invested resources.

  19. SHEAN (Simplified Human Error Analysis code) and automated THERP

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1993-01-01

    One of the most widely used human error analysis tools is THERP (Technique for Human Error Rate Prediction). Unfortunately, this tool has disadvantages. The Nuclear Regulatory Commission, realizing these drawbacks, commissioned Dr. Swain, the author of THERP, to create a simpler, more consistent tool for deriving human error rates. That effort produced the Accident Sequence Evaluation Program Human Reliability Analysis Procedure (ASEP), which is more conservative than THERP but a valuable screening tool. ASEP involves answering simple questions about the scenario in question and then looking up the appropriate human error rate in the indicated table (THERP also uses look-up tables, but four times as many). The advantages of ASEP are that human factors expertise is not required, and the training needed to use the method is minimal. Although not originally envisioned by Dr. Swain, the ASEP approach practically begs to be computerized. WINCO did just that, calling the code SHEAN, for Simplified Human Error ANalysis. The code was written in TURBO Basic for IBM or IBM-compatible MS-DOS, for fast execution. WINCO is now in the process of comparing this code against THERP for various scenarios. This report provides a discussion of SHEAN.
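
    ASEP's actual questions and tables are not reproduced in this record, so the following is only a sketch of the table-driven screening style described: a few scenario answers index a table of conservative screening HEPs. The categories and values are invented for illustration.

```python
# Screening-style lookup: answers to a few scenario questions select a row
# in a table of conservative human error probabilities (HEPs).
SCREENING_TABLE = {
    # (time available, stress level) -> screening HEP (illustrative values)
    ("ample",   "low"):  0.001,
    ("ample",   "high"): 0.01,
    ("limited", "low"):  0.05,
    ("limited", "high"): 0.5,
}

def screening_hep(time_available: str, stress: str) -> float:
    """Return a conservative screening HEP for the described scenario."""
    return SCREENING_TABLE[(time_available, stress)]

print(screening_hep("limited", "high"))  # 0.5
```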

  20. Inherent safety, ethics and human error.

    Science.gov (United States)

    Papadaki, Maria

    2008-02-11

    stated. The reason this article is presented here is that I believe that complex accidents, similarly to insignificant ones, often demonstrate an attitude which can be characterized as "inherently unsafe". I take the view that the enormous human potential and the human ability to minimize accidents need to become a focal point towards inherent safety. Restricting ourselves to human limitations and to how we could "treat" humans or prevent them from causing accidents needs to be re-addressed. The purpose of this presentation is to highlight observations and provoke a discussion on how we could possibly improve the understanding of safety-related issues. I do not intend to reject or criticize existing methodologies. (The entire presentation is strongly influenced by Trevor Kletz's work, although our views are often different.)

  1. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    Directory of Open Access Journals (Sweden)

    Rabiul Islam

    2018-03-01

    Full Text Available Background: Maintenance operations on-board ships are highly demanding. Maintenance operations are intensive activities requiring high man–machine interactions in challenging and evolving conditions. The evolving conditions include weather conditions, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error and, consequently, can cause injuries or fatalities to personnel. An effective human error probability model is required to better manage maintenance on-board ships. The developed model would assist in developing and maintaining effective risk management protocols. Thus, the objective of this study is to develop a human error probability model considering various internal and external factors affecting seafarers' performance. Methods: The human error probability model is developed using probability theory applied to a Bayesian network. The model is tested using data received through a questionnaire survey of >200 experienced seafarers with >5 years of experience. The model developed in this study is used to find the reliability of human performance on particular maintenance activities. Results: The developed methodology is tested on the maintenance of a marine engine's cooling water pump for the engine department and an anchor windlass for the deck department. In the considered case studies, human error probabilities are estimated in various scenarios and the results are compared between the scenarios and the different seafarer categories. The results of the case studies for both departments are also compared. Conclusion: The developed model is effective in assessing human error probabilities. These probabilities would get dynamically updated as and when new information is available on changes in either internal (i.e., training, experience, and fatigue) or external (i.e., environmental and operational) conditions.
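
    The paper's network structure, factors and conditional probabilities are not given in this record; the sketch below is a minimal hand-rolled two-factor example of the general approach, showing how a Bayesian-network-style human error probability updates when evidence about one performance-shaping factor becomes available. All numbers are illustrative.

```python
from itertools import product

# Priors for two performance-shaping factors (states: "good" / "poor").
PRIORS = {"fatigue": {"good": 0.7, "poor": 0.3},
          "weather": {"good": 0.8, "poor": 0.2}}

# Conditional probability of human error given the factor states.
CPT_ERROR = {("good", "good"): 0.01,
             ("good", "poor"): 0.05,
             ("poor", "good"): 0.08,
             ("poor", "poor"): 0.30}

def error_probability(evidence=None):
    """Marginal P(error), optionally fixing observed factor states."""
    evidence = evidence or {}
    total = 0.0
    for f_state, w_state in product(PRIORS["fatigue"], PRIORS["weather"]):
        if evidence.get("fatigue", f_state) != f_state:
            continue
        if evidence.get("weather", w_state) != w_state:
            continue
        weight = 1.0
        if "fatigue" not in evidence:
            weight *= PRIORS["fatigue"][f_state]
        if "weather" not in evidence:
            weight *= PRIORS["weather"][w_state]
        total += weight * CPT_ERROR[(f_state, w_state)]
    return total

print(error_probability())                     # prior human error probability
print(error_probability({"weather": "poor"}))  # updated when bad weather is observed
```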

  2. Technique for human-error sequence identification and signification

    International Nuclear Information System (INIS)

    Heslinga, G.

    1988-01-01

    The aim of the present study was to investigate whether the event-tree technique can be used for the analysis of sequences of human errors that could cause initiating events. The scope of the study was limited to a consideration of the performance of procedural actions. The event-tree technique was modified to adapt it for this study and will be referred to as the 'Technique for Human-Error-Sequence Identification and Signification' (THESIS). The event trees used in this manner, i.e. THESIS event trees, appear to present additional problems if they are applied to human performance instead of technical systems. These problems, referred to as the 'Man-Related Features' of THESIS, are: the human capability to choose among several procedures, the ergonomics of the panel layout, human actions of a continuous nature, dependence between human errors, the human capability to recover possible errors, the influence of memory during the recovery attempt, variability in human performance, and correlations between human error probabilities. The influence of these problems on the applicability of THESIS was assessed by means of mathematical analysis, field studies and laboratory experiments (author). 130 refs.; 51 figs.; 24 tabs

  3. Selection of anchor values for human error probability estimation

    International Nuclear Information System (INIS)

    Buffardi, L.C.; Fleishman, E.A.; Allen, J.A.

    1989-01-01

    There is a need for more dependable information to assist in the prediction of human errors in nuclear power environments. The major objective of the current project is to establish guidelines for using error probabilities from other task settings to estimate errors in the nuclear environment. This involves: (1) identifying critical nuclear tasks, (2) discovering similar tasks in non-nuclear environments, (3) finding error data for non-nuclear tasks, and (4) establishing error-rate values for the nuclear tasks based on the non-nuclear data. A key feature is the application of a classification system to nuclear and non-nuclear tasks to evaluate their similarities and differences in order to provide a basis for generalizing human error estimates across tasks. During the first eight months of the project, several classification systems have been applied to a sample of nuclear tasks. They are discussed in terms of their potential for establishing task equivalence and transferability of human error rates across situations

  4. A stochastic dynamic model for human error analysis in nuclear power plants

    Science.gov (United States)

    Delgado-Loperena, Dharma

    Nuclear disasters like Three Mile Island and Chernobyl indicate that human performance is a critical safety issue, sending a clear message about the need to include environmental press and competence aspects in research. This investigation was undertaken to serve as a roadmap for studying human behavior through the formulation of a general solution equation. The theoretical model integrates models from two heretofore-disassociated disciplines (behavior specialists and technical specialists) that historically have independently studied the nature of error and human behavior, includes concepts derived from fractal and chaos theory, and suggests re-evaluation of base theory regarding human error. The results of this research were based on comprehensive analysis of patterns of error, with the omnipresent underlying structure of chaotic systems. The study of patterns led to a dynamic formulation, serving as a basis for other formulas used to study the consequences of human error. The literature search regarding error yielded insight into the need to include concepts rooted in chaos theory and strange attractors, heretofore unconsidered by mainstream researchers who investigated human error in nuclear power plants or those who employed the ecological model in their work. The study of patterns obtained from a steam generator tube rupture (SGTR) event simulation provided a direct application to aspects of control room operations in nuclear power plants. In doing so, a conceptual foundation based in the understanding of the patterns of human error can be gleaned, helping to reduce and prevent undesirable events.

  5. Reducing Technology-Induced Errors: Organizational and Health Systems Approaches.

    Science.gov (United States)

    Borycki, Elizabeth M; Senthriajah, Yalini; Kushniruk, Andre W; Palojoki, Sari; Saranto, Kaija; Takeda, Hiroshi

    2016-01-01

    Technology-induced errors are a growing concern for health care organizations. Such errors arise from the interaction between healthcare and information technology deployed in complex settings and contexts. As the number of health information technologies used to provide patient care rises, so will the need to develop ways to improve the quality and safety of the technology that we use. The objective of the panel is to describe varying approaches to improving software safety from an organizational and health systems perspective. We define what a technology-induced error is. Then, we discuss how software design and testing can be used to improve health information technologies. This discussion is followed by work in the area of monitoring and reporting at a health district and national level. Lastly, we draw on the quality, safety and resilience literature. The target audience for this work is nursing and health informatics researchers, practitioners, administrators, policy makers and students.

  6. Evaluation of human error estimation for nuclear power plants

    International Nuclear Information System (INIS)

    Haney, L.N.; Blackman, H.S.

    1987-01-01

    The dominant risk for severe accident occurrence in nuclear power plants (NPPs) is human error. The US Nuclear Regulatory Commission (NRC) sponsored an evaluation of Human Reliability Analysis (HRA) techniques for estimation of human error in NPPs. Twenty HRA techniques identified by a literature search were evaluated with criteria sets designed for that purpose and categorized. Data were collected at a commercial NPP with operators responding in walkthroughs of four severe accident scenarios and full scope simulator runs. Results suggest a need for refinement and validation of the techniques. 19 refs

  7. Bringing organizational factors to the fore of human error management

    International Nuclear Information System (INIS)

    Embrey, D.

    1991-01-01

    Human performance problems account for more than half of all significant events at nuclear power plants, even when these did not necessarily lead to severe accidents. In dealing with the management of human error, both technical and organizational factors need to be taken into account. Most important, a long-term commitment from senior management is needed. (author)

  8. Error detection in spoken human-machine interaction

    NARCIS (Netherlands)

    Krahmer, E.J.; Swerts, M.G.J.; Theune, M.; Weegels, M.F.

    2001-01-01

    Given the state of the art of current language and speech technology, errors are unavoidable in present-day spoken dialogue systems. Therefore, one of the main concerns in dialogue design is how to decide whether or not the system has understood the user correctly. In human-human communication,

  9. Error detection in spoken human-machine interaction

    NARCIS (Netherlands)

    Krahmer, E.; Swerts, M.; Theune, Mariet; Weegels, M.

    Given the state of the art of current language and speech technology, errors are unavoidable in present-day spoken dialogue systems. Therefore, one of the main concerns in dialogue design is how to decide whether or not the system has understood the user correctly. In human-human communication,

  10. Normalization of Deviation: Quotation Error in Human Factors.

    Science.gov (United States)

    Lock, Jordan; Bearman, Chris

    2018-05-01

    Objective The objective of this paper is to examine quotation error in human factors. Background Science progresses through building on the work of previous research. This requires accurate quotation. Quotation error has a number of adverse consequences: loss of credibility, loss of confidence in the journal, and a flawed basis for academic debate and scientific progress. Quotation error has been observed in a number of domains, including marine biology and medicine, but there has been little or no previous study of this form of error in human factors, a domain that specializes in the causes and management of error. Methods A study was conducted examining quotation accuracy of 187 extracts from 118 published articles that cited a control article (Vaughan's 1996 book: The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA). Results Of extracts studied, 12.8% (n = 24) were classed as inaccurate, with 87.2% (n = 163) being classed as accurate. A second dimension of agreement was examined, with 96.3% (n = 180) agreeing with the control article and only 3.7% (n = 7) disagreeing. The categories of accuracy and agreement form a two-by-two matrix. Conclusion Rather than simply blaming individuals for quotation error, systemic factors should also be considered. Vaughan's theory, normalization of deviance, is one systemic theory that can account for quotation error. Application Quotation error is occurring in human factors and should receive more attention. According to Vaughan's theory, the normal everyday systems that promote scholarship may also allow mistakes, mishaps, and quotation error to occur.

  11. Using HET taxonomy to help stop human error

    OpenAIRE

    Li, Wen-Chin; Harris, Don; Stanton, Neville A.; Hsu, Yueh-Ling; Chang, Danny; Wang, Thomas; Young, Hong-Tsu

    2010-01-01

    Flight crews make positive contributions to the safety of aviation operations. Pilots have to assess continuously changing situations, evaluate potential risks, and make quick decisions. However, even well-trained and experienced pilots make errors. Accident investigations have identified that pilots’ performance is influenced significantly by the design of the flightdeck interface. This research applies hierarchical task analysis (HTA) and utilizes the Human Error Template (HET) taxonomy to ...

  12. Applications of human error analysis to aviation and space operations

    International Nuclear Information System (INIS)

    Nelson, W.R.

    1998-01-01

    For the past several years at the Idaho National Engineering and Environmental Laboratory (INEEL) we have been working to apply methods of human error analysis to the design of complex systems. We have focused on adapting human reliability analysis (HRA) methods that were developed for Probabilistic Safety Assessment (PSA) for application to system design. We are developing methods so that human errors can be systematically identified during system design, the potential consequences of each error can be assessed, and potential corrective actions (e.g. changes to system design or procedures) can be identified. These applications lead to different requirements when compared with HRAs performed as part of a PSA. For example, because the analysis will begin early during the design stage, the methods must be usable when only partial design information is available. In addition, the ability to perform numerous "what if" analyses to identify and compare multiple design alternatives is essential. Finally, since the goals of such human error analyses focus on proactive design changes rather than the estimation of failure probabilities for PRA, there is more emphasis on qualitative evaluations of error relationships and causal factors than on quantitative estimates of error frequency. The primary vehicle we have used to develop and apply these methods has been a series of projects sponsored by the National Aeronautics and Space Administration (NASA) to apply human error analysis to aviation operations. The first NASA-sponsored project had the goal of evaluating human errors caused by advanced cockpit automation. Our next aviation project focused on the development of methods and tools to apply human error analysis to the design of commercial aircraft. This project was performed by a consortium comprised of INEEL, NASA, and Boeing Commercial Airplane Group. The focus of the project was aircraft design and procedures that could lead to human errors during airplane maintenance.

  13. Some aspects of statistical modeling of human-error probability

    International Nuclear Information System (INIS)

    Prairie, R.R.

    1982-01-01

    Human reliability analyses (HRA) are often performed as part of risk assessment and reliability projects. Recent events in nuclear power have shown the potential importance of the human element. There are several on-going efforts in the US and elsewhere with the purpose of modeling human error such that the human contribution can be incorporated into an overall risk assessment associated with one or more aspects of nuclear power. An effort described here uses the HRA (event tree) to quantify and model the human contribution to risk. As an example, risk analyses are being prepared on several nuclear power plants as part of the Interim Reliability Assessment Program (IREP). In this process the risk analyst selects the elements of his fault tree that could be contributed to by human error. He then asks the HF analyst to perform an HRA on those elements.

  14. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Seong, Poong Hyun; Kang, Hyun Gook [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Na, Man Gyun [Chosun Univ., Gwangju (Korea, Republic of); Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of); Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Jung, Yoensub [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of)

    2013-04-15

    This paper aims to give an overview of methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower the public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve this situation through advanced information and communication technologies, on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, the man-machine interface, operator support systems, and procedures. Based on this investigation, we outline the concept and technical factors needed to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. With regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to guide the future direction of related research and ultimately supplement the safety of NPPs.

  15. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    International Nuclear Information System (INIS)

    Seong, Poong Hyun; Kang, Hyun Gook; Na, Man Gyun; Kim, Jong Hyun; Heo, Gyunyoung; Jung, Yoensub

    2013-01-01

    This paper aims to give an overview of methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower the public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve this situation through advanced information and communication technologies, on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, the man-machine interface, operator support systems, and procedures. Based on this investigation, we outline the concept and technical factors needed to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. With regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to guide the future direction of related research and ultimately supplement the safety of NPPs.

  16. ADVANCED MMIS TOWARD SUBSTANTIAL REDUCTION IN HUMAN ERRORS IN NPPS

    Directory of Open Access Journals (Sweden)

    POONG HYUN SEONG

    2013-04-01

    Full Text Available This paper aims to give an overview of methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower the public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve this situation through advanced information and communication technologies, on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, the man-machine interface, operator support systems, and procedures. Based on this investigation, we outline the concept and technical factors needed to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. With regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to guide the future direction of related research and ultimately supplement the safety of NPPs.

  17. Safety coaches in radiology: decreasing human error and minimizing patient harm

    Energy Technology Data Exchange (ETDEWEB)

    Dickerson, Julie M.; Adams, Janet M. [Cincinnati Children' s Hospital Medical Center, Department of Radiology, MLC 5031, Cincinnati, OH (United States); Koch, Bernadette L.; Donnelly, Lane F. [Cincinnati Children' s Hospital Medical Center, Department of Radiology, MLC 5031, Cincinnati, OH (United States); Cincinnati Children' s Hospital Medical Center, Department of Pediatrics, Cincinnati, OH (United States); Goodfriend, Martha A. [Cincinnati Children' s Hospital Medical Center, Department of Quality Improvement, Cincinnati, OH (United States)

    2010-09-15

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program. (orig.)

  18. Safety coaches in radiology: decreasing human error and minimizing patient harm

    International Nuclear Information System (INIS)

    Dickerson, Julie M.; Adams, Janet M.; Koch, Bernadette L.; Donnelly, Lane F.; Goodfriend, Martha A.

    2010-01-01

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program. (orig.)

  19. Safety coaches in radiology: decreasing human error and minimizing patient harm.

    Science.gov (United States)

    Dickerson, Julie M; Koch, Bernadette L; Adams, Janet M; Goodfriend, Martha A; Donnelly, Lane F

    2010-09-01

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program.

  20. Development of an integrated system for estimating human error probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, J.L.; Hahn, H.A.; Morzinski, J.A.

    1998-12-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project had as its main objective the development of a Human Reliability Analysis (HRA), knowledge-based expert system that would provide probabilistic estimates for potential human errors within various risk assessments, safety analysis reports, and hazard assessments. HRA identifies where human errors are most likely, estimates the error rate for individual tasks, and highlights the most beneficial areas for system improvements. This project accomplished three major tasks. First, several prominent HRA techniques and associated databases were collected and translated into an electronic format. Next, the project started a knowledge engineering phase where the expertise, i.e., the procedural rules and data, were extracted from those techniques and compiled into various modules. Finally, these modules, rules, and data were combined into a nearly complete HRA expert system.

  1. Customization of user interfaces to reduce errors and enhance user acceptance.

    Science.gov (United States)

    Burkolter, Dina; Weyers, Benjamin; Kluge, Annette; Luther, Wolfram

    2014-03-01

    Customization is assumed to reduce error and increase user acceptance in the human-machine relation. Reconfiguration gives the operator the option to customize a user interface according to his or her own preferences. An experimental study with 72 computer science students using a simulated process control task was conducted. The reconfiguration group (RG) interactively reconfigured their user interfaces and used the reconfigured user interface in the subsequent test whereas the control group (CG) used a default user interface. Results showed significantly lower error rates and higher acceptance of the RG compared to the CG while there were no significant differences between the groups regarding situation awareness and mental workload. Reconfiguration seems to be promising and therefore warrants further exploration. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  2. The Countermeasures against the Human Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Kwon, Ki Chun; Lee, Jung Woon; Lee, Hyun; Jang, Tong Il

    2009-10-15

    Human error contributes to failures of nuclear power facilities that are essential for the prevention of accidents, so related research in ergonomics and human factors, including long-term, comprehensive countermeasure technology, is urgently required. In the past, continuing attention to the hardware of nuclear facilities has brought definite improvements; attention must now also be paid to the people-related human factors of nuclear facilities, to ensure their safety as well as their economic and industrial viability. Improvement on this point is urgently required. The purpose of this research is to establish comprehensive medium- and long-term preventive measures, from a human engineering perspective, that minimize the possibility of human error in nuclear power plants and other nuclear facilities.

  3. The Countermeasures against the Human Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kwon, Ki Chun; Lee, Jung Woon; Lee, Hyun; Jang, Tong Il

    2009-10-01

    Human error contributes to failures of nuclear power facilities that are essential for the prevention of accidents, so related research in ergonomics and human factors, including long-term, comprehensive countermeasure technology, is urgently required. In the past, continuing attention to the hardware of nuclear facilities has brought definite improvements; attention must now also be paid to the people-related human factors of nuclear facilities, to ensure their safety as well as their economic and industrial viability. Improvement on this point is urgently required. The purpose of this research is to establish comprehensive medium- and long-term preventive measures, from a human engineering perspective, that minimize the possibility of human error in nuclear power plants and other nuclear facilities.

  4. Quality assurance and human error effects on the structural safety

    International Nuclear Information System (INIS)

    Bertero, R.; Lopez, R.; Sarrate, M.

    1991-01-01

    Statistical surveys show that the frequency of failure of structures is much larger than that expected by the codes. Evidence exists that human errors (especially during the design process) are the main cause of the difference between the failure probability admitted by codes and the reality. In this paper, the attenuation of human error effects using the tools of quality assurance is analyzed. In particular, the importance of the independent design review is highlighted, and different approaches are discussed. The experience from the Atucha II project, as well as US and German practice on independent design review, is summarized. (Author)

  5. Physical predictions from lattice QCD. Reducing systematic errors

    International Nuclear Information System (INIS)

    Pittori, C.

    1994-01-01

    Some recent developments in the theoretical understanding of lattice quantum chromodynamics and of its possible sources of systematic errors are reported, and a review of some of the latest Monte Carlo results for light quark phenomenology is presented. A very general introduction to quantum field theory on a discrete spacetime lattice is given, and the Monte Carlo methods that allow one to compute many interesting physical quantities in the non-perturbative domain of strong interactions are illustrated. (author). 17 refs., 3 figs., 3 tabs

  6. A critique of recent models for human error rate assessment

    International Nuclear Information System (INIS)

    Apostolakis, G.E.

    1988-01-01

    This paper critically reviews two groups of models for assessing human error rates under accident conditions. The first group, which includes the US Nuclear Regulatory Commission (NRC) handbook model and the human cognitive reliability (HCR) model, considers as fundamental the time that is available to the operators to act. The second group, which is represented by the success likelihood index methodology multiattribute utility decomposition (SLIM-MAUD) model, relies on ratings of the human actions with respect to certain qualitative factors and the subsequent derivation of error rates. These models are evaluated with respect to two criteria: the treatment of uncertainties and the internal coherence of the models. In other words, this evaluation focuses primarily on normative aspects of these models. The principal findings are as follows: (1) Both of the time-related models provide human error rates as a function of the available time for action and the prevailing conditions. However, the HCR model ignores the important issue of state-of-knowledge uncertainties, dealing exclusively with stochastic uncertainty, whereas the model presented in the NRC handbook handles both types of uncertainty. (2) SLIM-MAUD provides a highly structured approach for the derivation of human error rates under given conditions. However, the treatment of the weights and ratings in this model is internally inconsistent. (author)
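
    To make the second group of models concrete, the sketch below shows a SLIM-style calculation: weighted ratings of performance-shaping factors give a success likelihood index, which is mapped to a HEP through a log-linear relation calibrated on two anchor tasks. The weights, ratings and anchor values are illustrative assumptions, not taken from the reviewed models.

```python
import math

def sli(weights, ratings):
    """Success likelihood index: weights sum to 1, ratings on a 0..1 scale."""
    return sum(w * r for w, r in zip(weights, ratings))

def calibrate(sli_low, hep_low, sli_high, hep_high):
    """Solve log10(HEP) = a*SLI + b from two anchor tasks with known HEPs."""
    a = (math.log10(hep_high) - math.log10(hep_low)) / (sli_high - sli_low)
    b = math.log10(hep_low) - a * sli_low
    return a, b

a, b = calibrate(sli_low=0.2, hep_low=0.1, sli_high=0.9, hep_high=0.001)
weights = [0.5, 0.3, 0.2]   # e.g. time pressure, training, interface quality
ratings = [0.6, 0.8, 0.4]   # ratings for the task being assessed
hep = 10 ** (a * sli(weights, ratings) + b)
print(round(hep, 4))
```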

  7. Maintenance strategies to reduce downtime due to machine positional errors

    OpenAIRE

    Shagluf, Abubaker; Longstaff, A.P.; Fletcher, S.

    2014-01-01

    Proceedings of Maintenance Performance Measurement and Management (MPMM) Conference 2014 Manufacturing strives to reduce waste and increase Overall Equipment Effectiveness (OEE). When managing machine tool maintenance a manufacturer must apply an appropriate decision technique in order to reveal hidden costs associated with production losses, reduce equipment downtime competently and similarly identify the machine's performance. Total productive maintenance (TPM) is a maintenance progr...

  8. Interactive analysis of human error factors in NPP operation events

    International Nuclear Information System (INIS)

    Zhang Li; Zou Yanhua; Huang Weigang

    2010-01-01

    Interactions of human error factors in NPP operation events are introduced, and 645 WANO operation event reports from 1999 to 2008 were analyzed, among which 432 were found to be related to human errors. After classifying these errors by their root causes or causal factors and applying SPSS for correlation analysis, we concluded: (1) Personnel work practices are restricted by many factors; forming good personnel work practices is systematic work which needs support in many aspects. (2) Verbal communications, personnel work practices, man-machine interface, and written procedures and documents play great roles. They are four interacting factors which often come in a bundle; if improvements need to be made to one of them, synchronous measures are also necessary for the others. (3) Management direction and decision process, which are related to management, have a significant interaction with personnel factors. (authors)

  9. Human error in strabismus surgery: Quantification with a sensitivity analysis

    NARCIS (Netherlands)

    S. Schutte (Sander); J.R. Polling (Jan Roelof); F.C.T. van der Helm (Frans); H.J. Simonsz (Huib)

    2009-01-01

    textabstractBackground: Reoperations are frequently necessary in strabismus surgery. The goal of this study was to analyze human-error related factors that introduce variability in the results of strabismus surgery in a systematic fashion. Methods: We identified the primary factors that influence

  10. Human error in strabismus surgery : Quantification with a sensitivity analysis

    NARCIS (Netherlands)

    Schutte, S.; Polling, J.R.; Van der Helm, F.C.T.; Simonsz, H.J.

    2008-01-01

    Background- Reoperations are frequently necessary in strabismus surgery. The goal of this study was to analyze human-error related factors that introduce variability in the results of strabismus surgery in a systematic fashion. Methods- We identified the primary factors that influence the outcome of

  11. Assessing human error during collecting a hydrocarbon sample of ...

    African Journals Online (AJOL)

    This paper reports the assessment method of the hydrocarbon sample collection standard operating procedure (SOP) using THERP. The Performance Shaping Factors (PSFs) from THERP were used to analyze and assess the human errors during collection of a hydrocarbon sample at a petrochemical refinery plant. Twenty-two ...

  12. Impact of human error on lumber yield in rough mills

    Science.gov (United States)

    Urs Buehlmann; R. Edward Thomas; R. Edward Thomas

    2002-01-01

    Rough sawn, kiln-dried lumber contains characteristics such as knots and bark pockets that are considered by most people to be defects. When using boards to produce furniture components, these defects are removed to produce clear, defect-free parts. Currently, human operators identify and locate the unusable board areas containing defects. Errors in determining a...

  13. Process error rates in general research applications to the Human ...

    African Journals Online (AJOL)

    Objective. To examine process error rates in applications for ethics clearance of health research. Methods. Minutes of 586 general research applications made to a human health research ethics committee (HREC) from April 2008 to March 2009 were examined. Rates of approval were calculated and reasons for requiring ...

  14. Preventing marine accidents caused by technology-induced human error

    OpenAIRE

    Bielić, Toni; Hasanspahić, Nermin; Čulin, Jelena

    2017-01-01

    The objective of embedding technology on board ships, to improve safety, is not fully accomplished. The paper studies marine accidents caused by human error resulting from improper human-technology interaction. The aim of the paper is to propose measures to prevent reoccurrence of such accidents. This study analyses the marine accident reports issued by Marine Accidents Investigation Branch covering the period from 2012 to 2014. The factors that caused these accidents are examined and categor...

  15. Fault tree model of human error based on error-forcing contexts

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Jang, Seung Cheol; Ha, Jae Joo

    2004-01-01

    In safety-critical systems such as nuclear power plants, safety-feature actuation is fully automated. In an emergency, the human operator can also play the role of a backup for the automated systems. That is, the failure of safety-feature-actuation signal generation implies the concurrent failure of the automated systems and of manual actuation. The human operator's manual actuation failure is largely affected by error-forcing contexts (EFC), of which the failures of sensors and automated systems are the most important. The sensors, the automated actuation system and the human operators are correlated in a complex manner, and it is hard to develop a proper model of them. In this paper, we explain the condition-based human reliability assessment (CBHRA) method, which treats these complicated conditions in a practical way. In this study, we apply the CBHRA method to the manual actuation of safety features such as reactor trip and safety injection in Korean Standard Nuclear Power Plants.
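
    As a rough illustration of the condition-based idea (not the CBHRA method itself), the sketch below conditions the operator's manual-actuation HEP on the error-forcing context created by sensor failures and combines it with the automated-system failure in a small fault-tree-style calculation. All probabilities and the two-context split are assumptions made for the example.

```python
# Failure of safety-feature actuation requires both the automated system and
# the backup manual actuation to fail; the operator HEP is conditioned on the
# error-forcing context (here, whether the sensors have already failed).
P_SENSOR_FAIL = 1e-3
P_AUTO_FAIL_GIVEN_SENSOR_OK = 1e-4
HEP_BY_CONTEXT = {
    "sensors_ok":     0.01,  # operator sees valid indications
    "sensors_failed": 0.5,   # degraded indications: strong error-forcing context
}

def p_no_actuation_signal() -> float:
    # Sensors failed: automation cannot actuate, operator works in a harsh context.
    p_branch_sensor_fail = P_SENSOR_FAIL * 1.0 * HEP_BY_CONTEXT["sensors_failed"]
    # Sensors fine: automation may still fail, operator backs it up in a benign context.
    p_branch_sensor_ok = (1 - P_SENSOR_FAIL) * P_AUTO_FAIL_GIVEN_SENSOR_OK \
                         * HEP_BY_CONTEXT["sensors_ok"]
    return p_branch_sensor_fail + p_branch_sensor_ok

print(p_no_actuation_signal())
```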

  16. Cognitive strategies: a method to reduce diagnostic errors in ER

    Directory of Open Access Journals (Sweden)

    Carolina Prevaldi

    2009-02-01

    Full Text Available I wonder why sometimes we are able to rapidly recognize patterns of disease presentation, formulate a speedy diagnostic closure, and go on with a treatment plan. On the other hand, sometimes we proceed by studying our patient in depth, in an analytic, slow and rational way of decision making. Why can decisions sometimes be intuitive, while sometimes we have to proceed in a rigorous way? What is the "background noise" and the "signal-to-noise ratio" of presenting symptoms? What is the risk in premature labeling or "closure" of a patient? When is the "cook-book" approach useful in clinical decision making? "The Emergency Department is a natural laboratory for the study of error", stated one author. Many studies have focused on the occurrence of errors in medicine, and in hospital practice, but the ED, with its unique operating characteristics, seems to be a uniquely error-prone environment. That is why it is useful to understand the underlying patterns of thinking that can lead us to misdiagnosis. A general knowledge of thought processes gives the physician awareness and the ability to apply different techniques in clinical decision making and to recognize and avoid pitfalls.

  17. THERP and HEART integrated methodology for human error assessment

    Science.gov (United States)

    Castiglia, Francesco; Giardina, Mariarosa; Tomarchio, Elio

    2015-11-01

    A THERP and HEART integrated methodology is proposed to investigate accident scenarios that involve operator errors during high-dose-rate (HDR) treatments. The approach has been modified on the basis of the fuzzy set concept with the aim of prioritizing an exhaustive list of erroneous tasks that can lead to patient radiological overexposures. The results allow for the identification of human errors that is necessary to achieve a better understanding of health hazards in the radiotherapy treatment process, so that it can be properly monitored and appropriately managed.
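
    The paper's fuzzy extension is not detailed in this record; the sketch below only shows the standard HEART quantification step that the integrated methodology builds on: a generic task error probability multiplied, for each applicable error-producing condition, by a factor scaled by its assessed proportion of affect. The numbers are illustrative.

```python
# HEART-style quantification: a generic task error probability is multiplied
# by a factor for each applicable error-producing condition (EPC), scaled by
# the assessed proportion of affect (APOA) for that condition.
def heart_hep(generic_hep: float, epcs: list[tuple[float, float]]) -> float:
    """epcs: list of (max_multiplier, assessed_proportion_of_affect) pairs."""
    hep = generic_hep
    for max_effect, apoa in epcs:
        hep *= (max_effect - 1.0) * apoa + 1.0
    return min(hep, 1.0)   # probabilities are capped at 1

# Illustrative numbers only: a routine task with two applicable EPCs
# (e.g. time shortage and unfamiliarity), each partially in effect.
print(heart_hep(0.003, [(11.0, 0.4), (17.0, 0.2)]))
```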

  18. Human errors evaluation for muster in emergency situations applying human error probability index (HEPI, in the oil company warehouse in Hamadan City

    Directory of Open Access Journals (Sweden)

    2012-12-01

    Full Text Available Introduction: An emergency situation is one of the factors influencing human error. The aim of this research was to evaluate human error in an emergency situation of fire and explosion at the oil company warehouse in Hamadan city, applying the human error probability index (HEPI). Material and Method: First, the scenario of the fire-and-explosion emergency at the oil company warehouse was designed, and a muster maneuver against it was performed. The scaled muster questionnaire for the maneuver was completed in the next stage. The collected data were analyzed to calculate the probability of success for the 18 actions required in the emergency situation, from the starting point of the muster until the last action of reaching a temporary safe shelter. Result: The results showed that the highest probability of error occurrence was related to making the workplace safe (evaluation phase, 32.4%) and the lowest probability of error occurrence was in detecting the alarm (awareness phase, 1.8%). The highest severity of error was in the evaluation phase and the lowest severity was in the awareness and recovery phases. The maximum risk level was related to evaluating the exit routes, selecting one route, and choosing an alternative exit route, and the minimum risk level was related to the four evaluation phases. Conclusion: To reduce the risk in the exit phases of an emergency situation, the following actions are recommended, based on the findings of this study: periodic evaluation of the exit phases and modification where necessary, conducting more maneuvers, and analyzing the results along with sufficient feedback to the employees.
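
    A simple way to reproduce the kind of probability-times-severity ranking described is sketched below. The 1.8% and 32.4% error probabilities are taken from the abstract; the remaining actions, probabilities and the severity scale are illustrative assumptions.

```python
# Risk screening for muster actions: each action gets an error probability
# and a severity rating; their product ranks the actions for intervention.
ACTIONS = [
    ("detect alarm (awareness phase)",       0.018, 1),  # 1.8% from the abstract
    ("make the workplace safe (evaluation)", 0.324, 3),  # 32.4% from the abstract
    ("evaluate and select an exit route",    0.150, 4),  # assumed values
    ("reach a temporary safe shelter",       0.050, 2),  # assumed values
]

def ranked_by_risk(actions):
    """Sort actions by descending probability x severity score."""
    return sorted(((p * sev, name) for name, p, sev in actions), reverse=True)

for risk, name in ranked_by_risk(ACTIONS):
    print(f"{name}: risk score {risk:.3f}")
```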

  19. Human error: An essential problem of nuclear power plants

    International Nuclear Information System (INIS)

    Smidt, D.

    1981-01-01

    The author first defines the part played by man in the nuclear power plant and then deals in more detail with the structure of his false behavior in tactical and strategic respects. The discussion of tactical errors and their avoidance is followed by a report on the actual state of plant technology and possible improvements. Subsequently, a study of the strategic errors is made, including the conclusions to be drawn so far (the interface between plant and man, personnel selection and education). If the interfaces between man and machine are designed accordingly, and the physiological strengths and weaknesses of man are fully realized and taken into account, human errors need not be an essential problem in nuclear power plants. (GL) [de

  20. Effects of digital human-machine interface characteristics on human error in nuclear power plants

    International Nuclear Information System (INIS)

    Li Pengcheng; Zhang Li; Dai Licao; Huang Weigang

    2011-01-01

    In order to identify the effects of digital human-machine interface characteristics on human error in nuclear power plants, the new characteristics of the digital human-machine interface are identified by comparison with traditional analog control systems in the aspects of information display, user interface interaction and management, control systems, alarm systems and the procedure system. The negative effects of digital human-machine interface characteristics on human error, such as increased cognitive load and workload, mode confusion, and loss of situation awareness, are identified through field research and interviews with operators. For the adverse effects noted above, corresponding prevention and control measures are provided to support the prevention and minimization of human errors and the optimization of human-machine interface design. (authors)

  1. Self-assessment of human performance errors in nuclear operations

    International Nuclear Information System (INIS)

    Chambliss, K.V.

    1996-01-01

    One of the most important approaches to improving nuclear safety is to have an effective self-assessment process in place, whose cornerstone is the identification and improvement of human performance errors. Experience has shown that significant events usually have had precursors of human performance errors. If these precursors are left uncorrected or not understood, the symptoms recur and result in unanticipated events of greater safety significance. The Institute of Nuclear Power Operations (INPO) has been championing the cause of promoting excellence in human performance in the nuclear industry. INPO's report, "Excellence in Human Performance," emphasizes the importance of several factors that play a role in human performance. They include individual, supervisory, and organizational behaviors; real-time feedback that results in specific behavior to produce safe and reliable performance; and proactive measures that remove obstacles from excellent human performance. Zack Pate, chief executive officer and president of INPO, in his report, "The Control Room," provides an excellent discussion of serious events in the nuclear industry since 1994 and compares them with the results from a recent study by the National Transportation Safety Board of airline accidents in the 12-yr period from 1978 to 1990 to draw some common themes that relate to human performance issues in the control room.

  2. Twice cutting method reduces tibial cutting error in unicompartmental knee arthroplasty.

    Science.gov (United States)

    Inui, Hiroshi; Taketomi, Shuji; Yamagami, Ryota; Sanada, Takaki; Tanaka, Sakae

    2016-01-01

    Bone cutting error can be one of the causes of malalignment in unicompartmental knee arthroplasty (UKA). The amount of cutting error in total knee arthroplasty has been reported; however, none have investigated cutting error in UKA. The purpose of this study was to reveal the amount of cutting error in UKA when an open cutting guide was used and to clarify whether cutting the tibia horizontally twice using the same cutting guide reduced the cutting errors in UKA. We measured the alignment of the tibial cutting guides, the first-cut cutting surfaces and the second-cut cutting surfaces using the navigation system in 50 UKAs. Cutting error was defined as the angular difference between the cutting guide and the cutting surface. The mean absolute first-cut cutting error was 1.9° (1.1° varus) in the coronal plane and 1.1° (0.6° anterior slope) in the sagittal plane, whereas the mean absolute second-cut cutting error was 1.1° (0.6° varus) in the coronal plane and 1.1° (0.4° anterior slope) in the sagittal plane. Cutting the tibia horizontally twice reduced the cutting errors in the coronal plane significantly. In conclusion, cutting the tibia horizontally twice using the same cutting guide reduced cutting error in the coronal plane. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Human error probability estimation using licensee event reports

    International Nuclear Information System (INIS)

    Voska, K.J.; O'Brien, J.N.

    1984-07-01

    The objective of this report is to present a method for using field data from nuclear power plants to estimate human error probabilities (HEPs). These HEPs are then used in probabilistic risk activities. This method of estimating HEPs is one of four being pursued in NRC-sponsored research; the other three are structured expert judgment, analysis of training simulator data, and performance modeling. The type of field data analyzed in this report is from Licensee Event Reports (LERs), which are analyzed using a method specifically developed for that purpose. However, any type of field data on human errors could be analyzed using this method with minor adjustments. This report assesses the practicality, acceptability, and usefulness of estimating HEPs from LERs and comprehensively presents the method for use.
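
    The report's specific procedure is not reproduced in this record, but the core idea of field-data HEP estimation can be illustrated simply: count the errors of a given type found in the event reports and divide by an estimate of the number of opportunities for that error. The sketch below is a minimal illustration of that counting approach with made-up numbers; it is not the documented method itself.

```python
from math import sqrt

def hep_point_estimate(errors_observed, opportunities):
    """Point estimate of a human error probability from field data:
    observed errors divided by the estimated opportunities for the error."""
    p = errors_observed / opportunities
    # Rough binomial standard error, to convey the uncertainty that
    # sparse event-report data carries.
    se = sqrt(p * (1 - p) / opportunities)
    return p, se

# Hypothetical example: 4 miscalibration errors reported over an
# estimated 2,000 calibration opportunities in the reporting period.
p, se = hep_point_estimate(4, 2000)
print(f"HEP ~ {p:.1e} (binomial s.e. ~ {se:.1e})")
```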

  4. The using of the control room automation against human errors

    International Nuclear Information System (INIS)

    Kautto, A.

    1993-01-01

    Control room automation developed very strongly during the 1980s at IVO (Imatran Voima Oy). This work expanded greatly with the building of the full-scope training simulator for the Loviisa plant. Important milestones have been, for example, the testing of the Critical Function Monitoring System, a concept developed by Combustion Engineering Inc., in the Loviisa training simulator in 1982; the replacement of the process and simulator computers at Loviisa in 1989 and 1990; and the introduction of computer-based procedures in operator training in 1993. With the development of automation and procedures it is possible to minimize the probability of human error. However, it is not possible to totally eliminate the risks caused by human errors. (orig.)

  5. Utilizing AFIS searching tools to reduce errors in fingerprint casework.

    Science.gov (United States)

    Langenburg, Glenn; Hall, Carey; Rosemarie, Quincy

    2015-12-01

    Fifty-six (56) adjudicated, property crime cases involving fingerprint evidence were reviewed using a case-specific AFIS database tool. This tool allowed fingerprint experts to search latent prints in the cases against a database of friction ridge exemplars limited to only the individuals specific to that particular case. We utilized three different methods to encode and search the latent prints: automatic feature extraction, manual encoding performed by a student intern, and manual encoding performed by a fingerprint expert. Performance in the study was strongest when the encoding was conducted by the fingerprint expert. The results of the study showed that while the AFIS tools failed to locate all of the identifications originally reported by the initial fingerprint expert that worked the case, the AFIS tools helped to identify 7 additional latent prints that were not reported by the initial fingerprint expert. We conclude that this technology, when combined with fingerprint expertise, will reduce the number of instances where an erroneous exclusion could occur, increase the efficiency of a fingerprint unit, and be a useful tool for reviewing active or cold cases for missed opportunities to report identifications. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Estimation of the human error probabilities in the human reliability analysis

    International Nuclear Information System (INIS)

    Liu Haibin; He Xuhong; Tong Jiejuan; Shen Shifei

    2006-01-01

    Human error data is an important issue in human reliability analysis (HRA). Bayesian parameter estimation, which can combine multiple sources of information, such as historical NPP data and expert judgment, to modify generic human error data, can yield human error data that more truly reflects the real situation of the NPP. This paper, using a numerical computation program developed by the authors, presents some typical examples to illustrate the process of Bayesian parameter estimation in HRA and discusses the effect of different modification data on the Bayesian parameter estimation. (authors)
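
    A common way to carry out the kind of Bayesian modification the abstract describes is a conjugate beta-binomial update: a generic (prior) HEP distribution is combined with plant-specific evidence to give a posterior HEP. The snippet below sketches that standard update; the prior parameters and evidence counts are assumptions for illustration and do not come from the paper.

```python
def beta_binomial_update(alpha_prior, beta_prior, errors, demands):
    """Conjugate Bayesian update of a human error probability.

    The HEP is modelled as Beta(alpha, beta); observing `errors` failures
    in `demands` opportunities gives Beta(alpha + errors,
    beta + demands - errors)."""
    alpha_post = alpha_prior + errors
    beta_post = beta_prior + (demands - errors)
    mean_post = alpha_post / (alpha_post + beta_post)
    return alpha_post, beta_post, mean_post

# Hypothetical example: generic data suggests a HEP around 1e-2
# (prior Beta(1, 99)); plant records show 1 error in 500 demands.
a, b, mean = beta_binomial_update(1.0, 99.0, errors=1, demands=500)
print(f"posterior Beta({a:.0f}, {b:.0f}), mean HEP ~ {mean:.1e}")
```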

  7. Human Errors - A Taxonomy for Describing Human Malfunction in Industrial Installations

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1982-01-01

    This paper describes the definition and the characteristics of human errors. Different types of human behavior are classified, and their relation to different error mechanisms are analyzed. The effect of conditioning factors related to affective, motivating aspects of the work situation as well...... as physiological factors are also taken into consideration. The taxonomy for event analysis, including human malfunction, is presented. Possibilities for the prediction of human error are discussed. The need for careful studies in actual work situations is expressed. Such studies could provide a better...

  8. Trend analysis of human error events and assessment of their proactive prevention measure at Rokkasho reprocessing plant

    International Nuclear Information System (INIS)

    Yamazaki, Satoru; Tanaka, Izumi; Wakabayashi, Toshio

    2012-01-01

    A trend analysis of human error events is important for preventing the recurrence of human error events. We propose a new method for identifying common characteristics from the results of trend analysis, such as the latent weaknesses of the organization, and a management process for strategic error prevention. In this paper, we describe a trend analysis method for human error events that have accumulated in the organization and the utilization of the results of trend analysis to prevent accidents proactively. Although the systematic analysis of human error events, the monitoring of their overall trend, and the utilization of the analyzed results have been examined for plant operation, such information has never been utilized completely. Sharing information on human error events and analyzing their causes leads to the clarification of problems in management and human factors. This new method was applied to the human error events that occurred in the Rokkasho reprocessing plant from October 2010. Results revealed that the output of this method is effective in judging the error prevention plan and that the number of human error events was reduced to about 50% of those observed in 2009 and 2010. (author)

  9. Identification of failure sequences sensitive to human error

    International Nuclear Information System (INIS)

    1987-06-01

    This report prepared by the participants of the technical committee meeting on ''Identification of Failure Sequences Sensitive to Human Error'' addresses the subjects discussed during the meeting and the conclusions reached by the committee. Chapter 1 reviews the INSAG recommendations and the main elements of the IAEA Programme in the area of human element. In Chapter 2 the role of human actions in nuclear power plants safety from insights of operational experience is reviewed. Chapter 3 is concerned with the relationship between probabilistic safety assessment and human performance associated with severe accident sequences. Chapter 4 addresses the role of simulators in view of training for accident conditions. Chapter 5 presents the conclusions and future trends. The seven papers presented by members of this technical committee are also included in this technical document. A separate abstract was prepared for each of these papers

  10. Modelling the basic error tendencies of human operators

    International Nuclear Information System (INIS)

    Reason, J.

    1988-01-01

    The paper outlines the primary structural features of human cognition: a limited, serial workspace interacting with a parallel distributed knowledge base. It is argued that the essential computational features of human cognition - to be captured by an adequate operator model - reside in the mechanisms by which stored knowledge structures are selected and brought into play. Two such computational 'primitives' are identified: similarity-matching and frequency-gambling. These two retrieval heuristics, it is argued, shape both the overall character of human performance (i.e. its heavy reliance on pattern-matching) and its basic error tendencies ('strong-but-wrong' responses, confirmation, similarity and frequency biases, and cognitive 'lock-up'). The various features of human cognition are integrated with a dynamic operator model capable of being represented in software form. This computer model, when run repeatedly with a variety of problem configurations, should produce a distribution of behaviours which, in total, simulate the general character of operator performance. (author)
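
    The paper notes that the model can be represented in software form. A minimal sketch of the two retrieval "primitives" it names, similarity-matching and frequency-gambling, is given below: candidate knowledge structures are scored by how well their cues match the current situation, and near-ties are resolved in favour of the more frequently used structure. All names and data in the example are invented for illustration; this is not the author's implementation.

```python
def retrieve(situation_cues, knowledge_base):
    """Select a stored schema by similarity-matching, breaking ties by
    frequency-gambling (preferring the more familiar schema)."""
    def similarity(schema_cues):
        # Fraction of the situation's cues that the schema also contains.
        return len(situation_cues & schema_cues) / len(situation_cues)

    scored = [(similarity(cues), freq, name)
              for name, (cues, freq) in knowledge_base.items()]
    # Sort by similarity first, then by usage frequency: a familiar but
    # only partly matching schema can win, giving "strong-but-wrong" errors.
    scored.sort(reverse=True)
    return scored[0][2]

# Hypothetical knowledge base: schema -> (cue set, usage frequency).
kb = {
    "loss_of_feedwater": ({"low_sg_level", "feed_pump_trip"}, 120),
    "steam_generator_tube_rupture": ({"low_sg_level", "high_radiation"}, 3),
}
# An ambiguous single cue matches both schemas equally well, so the
# frequency gamble picks the far more familiar "loss_of_feedwater".
print(retrieve({"low_sg_level"}, kb))
```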

  11. Modelling the basic error tendencies of human operators

    International Nuclear Information System (INIS)

    Reason, James

    1988-01-01

    The paper outlines the primary structural features of human cognition: a limited, serial workspace interacting with a parallel distributed knowledge base. It is argued that the essential computational features of human cognition - to be captured by an adequate operator model - reside in the mechanisms by which stored knowledge structures are selected and brought into play. Two such computational 'primitives' are identified: similarity-matching and frequency-gambling. These two retrieval heuristics, it is argued, shape both the overall character of human performance (i.e. its heavy reliance on pattern-matching) and its basic error tendencies ('strong-but-wrong' responses, confirmation, similarity and frequency biases, and cognitive 'lock-up'). The various features of human cognition are integrated with a dynamic operator model capable of being represented in software form. This computer model, when run repeatedly with a variety of problem configurations, should produce a distribution of behaviours which, in toto, simulate the general character of operator performance. (author)

  12. Near field communications technology and the potential to reduce medication errors through multidisciplinary application.

    Science.gov (United States)

    O'Connell, Emer; Pegler, Joe; Lehane, Elaine; Livingstone, Vicki; McCarthy, Nora; Sahm, Laura J; Tabirca, Sabin; O'Driscoll, Aoife; Corrigan, Mark

    2016-01-01

    Patient safety requires optimal management of medications. Electronic systems are encouraged to reduce medication errors. Near field communications (NFC) is an emerging technology that may be used to develop novel medication management systems. An NFC-based system was designed to facilitate prescribing, administration and review of medications commonly used on surgical wards. Final year medical, nursing, and pharmacy students were recruited to test the electronic system in a cross-over observational setting on a simulated ward. Medication errors were compared against errors recorded using a paper-based system. A significant difference in the commission of medication errors was seen when NFC and paper-based medication systems were compared. Paper use resulted in a mean of 4.09 errors per prescribing round while NFC prescribing resulted in a mean of 0.22 errors per simulated prescribing round (P=0.000). Likewise, medication administration errors were reduced from a mean of 2.30 per drug round with the paper system to a mean of 0.80 errors per round using NFC. An NFC-based medication system may therefore be used to effectively reduce medication errors in a simulated ward environment.

  13. Complications: acknowledging, managing, and coping with human error.

    Science.gov (United States)

    Helo, Sevann; Moulton, Carol-Anne E

    2017-08-01

    Errors are inherent in medicine due to the imperfectness of human nature. Health care providers may have a difficult time accepting their fallibility, acknowledging mistakes, and disclosing errors. Fear of litigation, shame, blame, and concern about reputation are just some of the barriers preventing physicians from being more candid with their patients, despite the supporting body of evidence that patients cite poor communication and lack of transparency as primary drivers to file a lawsuit in the wake of a medical complication. Proper error disclosure includes a timely explanation of what happened, who was involved, why the error occurred, and how it will be prevented in the future. Medical mistakes afford the opportunity for individuals and institutions to be candid about their weaknesses while improving patient care processes. When a physician takes the Hippocratic Oath they take on a tremendous sense of responsibility for the care of their patients, and often bear the burden of their mistakes in isolation. Physicians may struggle with guilt, shame, and a crisis of confidence, which may thwart efforts to identify areas for improvement that can lead to meaningful change. Coping strategies for providers include discussing the event with others, seeking professional counseling, and implementing quality improvement projects. Physicians and health care organizations need to find adaptive ways to deal with complications that will benefit patients, providers, and their institutions.

  14. The contributions of human factors on human error in Malaysia aviation maintenance industries

    Science.gov (United States)

    Padil, H.; Said, M. N.; Azizan, A.

    2018-05-01

    Aviation maintenance is a multitasking activity in which individuals perform varied tasks under constant pressure to meet deadlines as well as challenging work conditions. These situational characteristics combined with human factors can lead to various types of human-related errors. The primary objective of this research is to develop a structural relationship model that incorporates human factors, organizational factors, and their impact on human errors in aviation maintenance. Towards that end, a questionnaire was developed and administered to Malaysian aviation maintenance professionals. A Structural Equation Modelling (SEM) approach was used in this study, utilizing AMOS software. Results showed a significant relationship between human factors and human errors in the tested model. Human factors had a partial effect on organizational factors, while organizational factors had a direct and positive impact on human errors. It was also revealed that organizational factors contributed to human errors when coupled with the human factors construct. This study has contributed to the advancement of knowledge on human factors affecting safety and has provided guidelines for improving human factors performance relating to aviation maintenance activities; it could be used as a reference for improving safety performance in Malaysian aviation maintenance companies.

  15. Deadline pressure and human error: a study of human failures on a particle accelerator at Brookhaven National Laboratory

    International Nuclear Information System (INIS)

    Tiagha, E.A.

    1982-01-01

    The decline in industrial efficiency may be linked to decreased reliability of complex automatic systems. This decline threatens the viability of complex organizations in industrialized economies. Industrial engineering techniques that minimize system failure by increasing the reliability of systems hardware are well developed in comparison with those available to reduce human operator errors. The problem of system reliability and the associated costs of breakdown can be reduced if we understand how highly skilled technical personnel function in complex operations and systems. The purpose of this research is to investigate how human errors are affected by deadline pressures, technical communication and other socio-dynamic factors. Through the analysis of a technologically complex particle accelerator prototype at Brookhaven National Laboratory, two failure mechanisms: (1) physical defects in the production process and (2) human operator errors were identified. Two instruments were used to collect information on human failures: objective laboratory data and a human failure questionnaire. The results of human failures from the objective data were used to test for the deadline hypothesis and also to validate the human failure questionnaire. To explain why the human failures occurred, data were collected from a four-part, closed choice questionnaire administered to two groups of scientists, engineers, and technicians, working together against a deadline to produce an engineering prototype of a particle accelerator

  16. Automated drug dispensing system reduces medication errors in an intensive care setting.

    Science.gov (United States)

    Chapuis, Claire; Roustit, Matthieu; Bal, Gaëlle; Schwebel, Carole; Pansu, Pascal; David-Tchouda, Sandra; Foroni, Luc; Calop, Jean; Timsit, Jean-François; Allenet, Benoît; Bosson, Jean-Luc; Bedouch, Pierrick

    2010-12-01

    We aimed to assess the impact of an automated dispensing system on the incidence of medication errors related to picking, preparation, and administration of drugs in a medical intensive care unit. We also evaluated the clinical significance of such errors and user satisfaction. Preintervention and postintervention study involving a control and an intervention medical intensive care unit. Two medical intensive care units in the same department of a 2,000-bed university hospital. Adult medical intensive care patients. After a 2-month observation period, we implemented an automated dispensing system in one of the units (study unit) chosen randomly, with the other unit being the control. The overall error rate was expressed as a percentage of total opportunities for error. The severity of errors was classified according to National Coordinating Council for Medication Error Reporting and Prevention categories by an expert committee. User satisfaction was assessed through self-administered questionnaires completed by nurses. A total of 1,476 medications for 115 patients were observed. After automated dispensing system implementation, we observed a reduced percentage of total opportunities for error in the study unit compared to the control unit (13.5% and 18.6%, respectively) and in the study unit before versus after implementation (20.4% and 13.5%). Subgroup analysis of error type showed a significant impact of the automated dispensing system in reducing preparation errors. Most errors caused no harm (National Coordinating Council for Medication Error Reporting and Prevention category C). The automated dispensing system did not reduce errors causing harm. Finally, the mean score for working conditions improved from 1.0±0.8 to 2.5±0.8 on the four-point Likert scale. The implementation of an automated dispensing system reduced overall medication errors related to picking, preparation, and administration of drugs in the intensive care unit. Furthermore, most nurses favored the new drug dispensation organization.

  17. Reducing errors benefits the field-based learning of a fundamental movement skill in children.

    Science.gov (United States)

    Capio, C M; Poolton, J M; Sit, C H P; Holmstrom, M; Masters, R S W

    2013-03-01

    Proficient fundamental movement skills (FMS) are believed to form the basis of more complex movement patterns in sports. This study examined the development of the FMS of overhand throwing in children through either an error-reduced (ER) or error-strewn (ES) training program. Students (n = 216), aged 8-12 years (M = 9.16, SD = 0.96), practiced overhand throwing in either a program that reduced errors during practice (ER) or one that was ES. ER program reduced errors by incrementally raising the task difficulty, while the ES program had an incremental lowering of task difficulty. Process-oriented assessment of throwing movement form (Test of Gross Motor Development-2) and product-oriented assessment of throwing accuracy (absolute error) were performed. Changes in performance were examined among children in the upper and lower quartiles of the pretest throwing accuracy scores. ER training participants showed greater gains in movement form and accuracy, and performed throwing more effectively with a concurrent secondary cognitive task. Movement form improved among girls, while throwing accuracy improved among children with low ability. Reduced performance errors in FMS training resulted in greater learning than a program that did not restrict errors. Reduced cognitive processing costs (effective dual-task performance) associated with such approach suggest its potential benefits for children with developmental conditions. © 2011 John Wiley & Sons A/S.

  18. Near field communications technology and the potential to reduce medication errors through multidisciplinary application

    LENUS (Irish Health Repository)

    O’Connell, Emer

    2016-07-01

    Patient safety requires optimal management of medications. Electronic systems are encouraged to reduce medication errors. Near field communications (NFC) is an emerging technology that may be used to develop novel medication management systems.

  19. Quantification of human errors in level-1 PSA studies in NUPEC/JINS

    International Nuclear Information System (INIS)

    Hirano, M.; Hirose, M.; Sugawara, M.; Hashiba, T.

    1991-01-01

    The THERP (Technique for Human Error Rate Prediction) method is mainly adopted to evaluate the pre-accident and post-accident human error rates. Performance shaping factors are derived by taking Japanese operational practice into account. Several examples of human error rates with calculational procedures are presented. The important human interventions of typical Japanese NPPs are also presented. (orig./HP)
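
    The abstract mentions performance shaping factors (PSFs) reflecting Japanese operational practice applied to THERP base rates. In THERP-style quantification, this typically amounts to multiplying a nominal HEP by the relevant PSF multipliers and capping the result at 1.0. The snippet below is a generic illustration of that adjustment; the nominal value and multipliers are assumptions, not values from the study.

```python
def adjusted_hep(nominal_hep, psf_multipliers):
    """Scale a nominal THERP HEP by performance shaping factor multipliers,
    capping the result at 1.0."""
    hep = nominal_hep
    for factor in psf_multipliers:
        hep *= factor
    return min(hep, 1.0)

# Hypothetical example: nominal HEP of 1e-3 for a post-accident action,
# doubled for moderate stress and halved for a well-designed procedure.
print(adjusted_hep(1e-3, [2.0, 0.5]))  # 1e-3
```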

  20. Task types and error types involved in the human-related unplanned reactor trip events

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun

    2008-01-01

    In this paper, the contributions of task types and error types involved in the human-related unplanned reactor trip events that occurred between 1986 and 2006 in Korean nuclear power plants are analysed in order to establish a strategy for reducing human-related unplanned reactor trips. Classification systems for the task types, error modes, and cognitive functions are developed or adopted from currently available taxonomies, and the relevant information is extracted from the event reports or judged on the basis of the event description. According to the analyses from this study, the contributions of the task types are as follows: corrective maintenance (25.7%), planned maintenance (22.8%), planned operation (19.8%), periodic preventive maintenance (14.9%), response to a transient (9.9%), and design/manufacturing/installation (6.9%). According to the analysis of the error modes, error modes such as control failure (22.2%), wrong object (18.5%), omission (14.8%), wrong action (11.1%), and inadequate (8.3%) take up about 75% of the total unplanned trip events. The analysis of the cognitive functions involved in the events indicated that the planning function had the highest contribution (46.7%) to the human actions leading to unplanned reactor trips. This analysis concludes that in order to significantly reduce human-induced or human-related unplanned reactor trips, an aid system (in support of maintenance personnel) for evaluating possible (negative) impacts of planned actions or erroneous actions, as well as an appropriate human error prediction technique, should be developed

  1. Task types and error types involved in the human-related unplanned reactor trip events

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Park, Jin Kyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-12-15

    In this paper, the contributions of task types and error types involved in the human-related unplanned reactor trip events that occurred between 1986 and 2006 in Korean nuclear power plants are analysed in order to establish a strategy for reducing human-related unplanned reactor trips. Classification systems for the task types, error modes, and cognitive functions are developed or adopted from currently available taxonomies, and the relevant information is extracted from the event reports or judged on the basis of the event description. According to the analyses from this study, the contributions of the task types are as follows: corrective maintenance (25.7%), planned maintenance (22.8%), planned operation (19.8%), periodic preventive maintenance (14.9%), response to a transient (9.9%), and design/manufacturing/installation (6.9%). According to the analysis of the error modes, error modes such as control failure (22.2%), wrong object (18.5%), omission (14.8%), wrong action (11.1%), and inadequate (8.3%) take up about 75% of the total unplanned trip events. The analysis of the cognitive functions involved in the events indicated that the planning function had the highest contribution (46.7%) to the human actions leading to unplanned reactor trips. This analysis concludes that in order to significantly reduce human-induced or human-related unplanned reactor trips, an aid system (in support of maintenance personnel) for evaluating possible (negative) impacts of planned actions or erroneous actions, as well as an appropriate human error prediction technique, should be developed.

  2. Reducing patient identification errors related to glucose point-of-care testing

    Directory of Open Access Journals (Sweden)

    Gaurav Alreja

    2011-01-01

    Full Text Available Background: Patient identification (ID) errors in point-of-care testing (POCT) can cause test results to be transferred to the wrong patient's chart or prevent results from being transmitted and reported. Despite the implementation of patient barcoding and ongoing operator training at our institution, patient ID errors still occur with glucose POCT. The aim of this study was to develop a solution to reduce identification errors with POCT. Materials and Methods: Glucose POCT was performed by approximately 2,400 clinical operators throughout our health system. Patients are identified by scanning in wristband barcodes or by manual data entry using portable glucose meters. Meters are docked to upload data to a database server which then transmits data to any medical record matching the financial number of the test result. With a new model, meters connect to an interface manager where the patient ID (a nine-digit account number) is checked against patient registration data from admission, discharge, and transfer (ADT) feeds and only matched results are transferred to the patient's electronic medical record. With the new process, the patient ID is checked prior to testing, and testing is prevented until ID errors are resolved. Results: When averaged over a period of a month, ID errors were reduced to 3 errors/month (0.015%) in comparison with 61.5 errors/month (0.319%) before implementing the new meters. Conclusion: Patient ID errors may occur with glucose POCT despite patient barcoding. The verification of patient identification should ideally take place at the bedside before testing occurs so that errors can be addressed in real time. The introduction of an ADT feed directly to glucose meters reduced patient ID errors in POCT.
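
    The abstract describes checking the scanned patient ID against admission, discharge, and transfer (ADT) registration data before a result is released to the chart. A toy version of that gate is sketched below; the data structures and function names are hypothetical and are not part of any meter vendor's interface.

```python
# Hypothetical ADT feed: the set of account numbers currently registered.
adt_registered_accounts = {"000123456", "000987654"}

def release_result(scanned_account_id, glucose_value):
    """Only transmit a POCT result when the scanned patient ID matches an
    account in the ADT registration data; otherwise hold it for review."""
    if scanned_account_id in adt_registered_accounts:
        return {"account": scanned_account_id, "glucose": glucose_value,
                "status": "posted to electronic medical record"}
    return {"account": scanned_account_id, "glucose": glucose_value,
            "status": "held: unresolved patient ID"}

print(release_result("000123456", 5.8))   # matched, posted
print(release_result("000111111", 5.8))   # mismatch, held for resolution
```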

  3. Human error probability quantification using fuzzy methodology in nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, Claudio Souza do

    2010-01-01

    This work obtains Human Error Probability (HEP) estimates for operator actions in response to hypothesized emergency situations at the IEA-R1 Research Reactor of IPEN. A Performance Shaping Factors (PSF) evaluation was also carried out in order to classify the PSFs according to their level of influence on the operator's actions and to determine their actual states at the plant. Both the HEP estimation and the PSF evaluation were based on specialist evaluation using interviews and questionnaires. The specialist group was composed of selected IEA-R1 operators. The specialists' knowledge was represented as linguistic variables, and group evaluation values were obtained through Fuzzy Logic and Fuzzy Set Theory. The HEP values obtained show good agreement with published literature data, corroborating the proposed methodology as a good alternative for use in Human Reliability Analysis (HRA). (author)
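
    The abstract does not reproduce the fuzzy arithmetic, but the basic mechanics of this kind of expert-judgement aggregation are: map each specialist's linguistic rating to a fuzzy number, combine the specialists' fuzzy numbers, and defuzzify to a crisp HEP. The sketch below shows those mechanics with triangular fuzzy numbers, invented anchor probabilities and ratings; it is not the authors' calibration.

```python
# Triangular fuzzy numbers (low, mode, high) for linguistic HEP ratings.
# The anchor probabilities are illustrative assumptions.
LINGUISTIC_HEP = {
    "very unlikely": (1e-4, 1e-3, 5e-3),
    "unlikely":      (1e-3, 5e-3, 1e-2),
    "likely":        (5e-3, 1e-2, 5e-2),
}

def aggregate_and_defuzzify(ratings):
    """Average the specialists' triangular fuzzy numbers component-wise,
    then defuzzify by the centroid of the resulting triangle."""
    n = len(ratings)
    lo = sum(LINGUISTIC_HEP[r][0] for r in ratings) / n
    md = sum(LINGUISTIC_HEP[r][1] for r in ratings) / n
    hi = sum(LINGUISTIC_HEP[r][2] for r in ratings) / n
    return (lo + md + hi) / 3.0  # centroid of a triangular fuzzy number

# Hypothetical panel of three operators rating one emergency action.
print(f"{aggregate_and_defuzzify(['unlikely', 'unlikely', 'likely']):.2e}")
```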

  4. A continuous quality improvement project to reduce medication error in the emergency department.

    Science.gov (United States)

    Lee, Sara Bc; Lee, Larry Ly; Yeung, Richard Sd; Chan, Jimmy Ts

    2013-01-01

    Medication errors are a common source of adverse healthcare incidents, particularly in the emergency department (ED), which has a number of factors that make it prone to medication errors. This project aims to reduce medication errors and improve the health and economic outcomes of clinical care in a Hong Kong ED. In 2009, a task group was formed to identify problems that potentially endanger medication safety and to develop strategies to eliminate these problems. Responsible officers were assigned to look after seven error-prone areas. Strategies were proposed, discussed, endorsed and promulgated to eliminate the problems identified. A reduction of medication incidents (MI) from 16 before to 6 after the improvement work was achieved. This project successfully established a concrete organizational structure to safeguard error-prone areas of medication safety in a sustainable manner.

  5. A strategy for minimizing common mode human error in executing critical functions and tasks

    International Nuclear Information System (INIS)

    Beltracchi, L.; Lindsay, R.W.

    1992-01-01

    Human error in execution of critical functions and tasks can be costly. The Three Mile Island and the Chernobyl Accidents are examples of results from human error in the nuclear industry. There are similar errors that could no doubt be cited from other industries. This paper discusses a strategy to minimize common mode human error in the execution of critical functions and tasks. The strategy consists of the use of human redundancy, and also diversity in human cognitive behavior: skill-, rule-, and knowledge-based behavior. The authors contend that the use of diversity in human cognitive behavior is possible, and it minimizes common mode error

  6. Effects of human errors on the determination of surveillance test interval

    International Nuclear Information System (INIS)

    Chung, Dae Wook; Koo, Bon Hyun

    1990-01-01

    This paper incorporates the effects of human error relevant to the periodic test on the unavailability of the safety system as well as the component unavailability. Two types of possible human error during the test are considered. One is the possibility that a good safety system is inadvertently left in a bad state after the test (Type A human error) and the other is the possibility that bad safety system is undetected upon the test (Type B human error). An event tree model is developed for the steady-state unavailability of safety system to determine the effects of human errors on the component unavailability and the test interval. We perform the reliability analysis of safety injection system (SIS) by applying aforementioned two types of human error to safety injection pumps. Results of various sensitivity analyses show that; 1) the appropriate test interval decreases and steady-state unavailability increases as the probabilities of both types of human errors increase, and they are far more sensitive to Type A human error than Type B and 2) the SIS unavailability increases slightly as the probability of Type B human error increases, and significantly as the probability of Type A human error increases. Therefore, to avoid underestimation, the effects of human error should be incorporated in the system reliability analysis which aims at the relaxations of the surveillance test intervals, and Type A human error has more important effect on the unavailability and surveillance test interval
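
    The paper's event tree model is not reproduced in the abstract, but a simplified expression conveys why both error types matter: with test interval T, random failures contribute roughly λT/2 to average unavailability, a Type A error (component left failed after a test) contributes its probability directly, and a Type B error (failed component not detected) lets the random-failure contribution persist beyond the test. The snippet below evaluates such a simplified model over a range of intervals; both the formula and the numbers are illustrative assumptions, not the paper's model.

```python
def mean_unavailability(test_interval_h, failure_rate_per_h, p_type_a, p_type_b):
    """Very simplified standby-system unavailability with test-related
    human errors (illustrative only):
      - lambda*T/2 : average contribution of undetected random failures
      - p_type_a   : component left in a failed state after the test
      - p_type_b   : detection missed, so the failed state persists
                     roughly one more interval on average
    """
    random_part = failure_rate_per_h * test_interval_h / 2.0
    return random_part + p_type_a + p_type_b * failure_rate_per_h * test_interval_h

# Hypothetical pump: lambda = 1e-5 /h, Type A error 1e-3, Type B error 1e-2.
for T in (360, 720, 1440, 2160):
    u = mean_unavailability(T, 1e-5, 1e-3, 1e-2)
    print(f"T = {T:5d} h  ->  mean unavailability ~ {u:.2e}")
```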

  7. Modelling the basic error tendencies of human operators

    Energy Technology Data Exchange (ETDEWEB)

    Reason, J.

    1988-01-01

    The paper outlines the primary structural features of human cognition: a limited, serial workspace interacting with a parallel distributed knowledge base. It is argued that the essential computational features of human cognition - to be captured by an adequate operator model - reside in the mechanisms by which stored knowledge structures are selected and brought into play. Two such computational 'primitives' are identified: similarity-matching and frequency-gambling. These two retrieval heuristics, it is argued, shape both the overall character of human performance (i.e. its heavy reliance on pattern-matching) and its basic error tendencies ('strong-but-wrong' responses, confirmation, similarity and frequency biases, and cognitive 'lock-up'). The various features of human cognition are integrated with a dynamic operator model capable of being represented in software form. This computer model, when run repeatedly with a variety of problem configurations, should produce a distribution of behaviours which, in toto, simulate the general character of operator performance.

  8. The possible benefits of reduced errors in the motor skills acquisition of children

    Directory of Open Access Journals (Sweden)

    Capio Catherine M

    2012-01-01

    Full Text Available Abstract An implicit approach to motor learning suggests that relatively complex movement skills may be better acquired in environments that constrain errors during the initial stages of practice. This current concept paper proposes that reducing the number of errors committed during motor learning leads to stable performance when attention demands are increased by concurrent cognitive tasks. While it appears that this approach to practice may be beneficial for motor learning, further studies are needed to both confirm this advantage and better understand the underlying mechanisms. An approach involving error minimization during early learning may have important applications in paediatric rehabilitation.

  9. The Relationship between Human Operators' Psycho-physiological Condition and Human Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Arryum; Jang, Inseok; Kang, Hyungook; Seong, Poonghyun

    2013-01-01

    The safe operation of nuclear power plants (NPPs) is substantially dependent on the performance of the human operators who operate the systems. In this environment, human errors caused by inappropriate operator performance have been considered critical, since they may lead to serious problems in safety-critical plants. In order to provide meaningful insights to prevent human errors and enhance human performance, operators' physiological conditions such as stress and workload have been investigated. Physiological measurements are considered reliable tools for assessing stress and workload. T. Q. Tran et al. and J. B. Brooking et al. pointed out that operators' workload can be assessed using eye tracking, galvanic skin response, electroencephalograms (EEGs), heart rate, respiration and other measurements. The purpose of this study is to investigate the effect of the human operators' tension level and knowledge level on the number of human errors. For this study, experiments were conducted in a mimic of the main control room (MCR) of an NPP. They utilized the compact nuclear simulator (CNS), which is modeled on the three-loop pressurized water reactor, 993 MWe, Kori units 3 and 4 in Korea, and the subjects were asked to follow the tasks described in the emergency operating procedures (EOP). During the simulation, three kinds of physiological measurement were utilized: electrocardiogram (ECG), EEG and nose temperature. Also, subjects were divided into three groups based on their knowledge of plant operation. The results show that subjects who are tense make fewer errors. In addition, subjects with a higher knowledge level tend to be tense and make fewer errors. For the ECG data, subjects who make fewer human errors tend to be located in the higher-tension area of high SNS activity and low PSNS activity. The results of the EEG data are similar to the ECG results: the beta power ratio of subjects who made fewer errors was higher. Since beta power ratio is

  10. Development of a framework to estimate human error for diagnosis tasks in advanced control room

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, In Seok; Seong, Proong Hyun

    2014-01-01

    In the emergency situation of nuclear power plants (NPPs), a diagnosis of the occurring events is crucial for managing or controlling the plant to a safe and stable condition. If the operators fail to diagnose the occurring events or the relevant situations, their responses can eventually be inappropriate or inadequate. Accordingly, considerable research has been performed to identify the causes of diagnosis error and to estimate the probability of diagnosis error. D. I. Gertman et al. asserted that 'the cognitive failures stem from erroneous decision-making, poor understanding of rules and procedures, and inadequate problem solving and this failures may be due to quality of data and people's capacity for processing information'. Many researchers have also asserted that the human-system interface (HSI), procedures, training and available time are critical factors causing diagnosis error. In nuclear power plants, a diagnosis of the event is critical for the safe condition of the system. As advanced main control rooms are being adopted in nuclear power plants, the operators may obtain plant data via a computer-based HSI and procedures. In this regard, using simulation data, diagnosis errors and their causes were identified. From this study, some useful insights to reduce diagnosis errors of operators in advanced main control rooms were provided

  11. Quantification of the effects of dependence on human error probabilities

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1980-01-01

    In estimating the probabilities of human error in the performance of a series of tasks in a nuclear power plant, the situation-specific characteristics of the series must be considered. A critical factor not to be overlooked in this estimation is the dependence or independence that pertains to any of the several pairs of task performances. In discussing the quantification of the effects of dependence, the event tree symbology described will be used. In any series of tasks, the only dependence considered for quantification in this document will be that existing between the task of interest and the immediately preceding task. Tasks performed earlier in the series may have some effect on the end task, but this effect is considered negligible
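
    The abstract refers to the standard THERP treatment of dependence between a task and the immediately preceding one. The well-known THERP conditional-probability equations for the five dependence levels are easy to state in code, as in the sketch below; the basic HEP value used in the example is made up.

```python
def conditional_hep(basic_hep, dependence):
    """THERP conditional human error probability given failure of the
    immediately preceding task, for the five dependence levels."""
    p = basic_hep
    return {
        "zero":     p,
        "low":      (1 + 19 * p) / 20,
        "moderate": (1 + 6 * p) / 7,
        "high":     (1 + p) / 2,
        "complete": 1.0,
    }[dependence]

# Illustrative basic HEP of 0.003 for the second task of a pair.
for level in ("zero", "low", "moderate", "high", "complete"):
    print(f"{level:9s}: {conditional_hep(0.003, level):.3f}")
```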

  12. Systematic analysis of dependent human errors from the maintenance history at finnish NPPs - A status report

    Energy Technology Data Exchange (ETDEWEB)

    Laakso, K. [VTT Industrial Systems (Finland)

    2002-12-01

    Operating experience has shown missed-detection events, where faults have passed inspections and functional tests into operating periods after the maintenance activities during the outage. The causes of these failures have often been complex event sequences involving human and organisational factors. Especially common cause and other dependent failures of safety systems may contribute significantly to the reactor core damage risk. The topic has been addressed in the Finnish studies of human common cause failures, where experiences of latent human errors have been searched for and analysed in detail from the maintenance history. The review of the bulk of the analysis results of the Olkiluoto and Loviisa plant sites shows that instrumentation and control and electrical equipment are more prone to human-error-caused failure events than other maintenance areas, and that plant modifications and also predetermined preventive maintenance are significant sources of common cause failures. Most errors stem from the refuelling and maintenance outage period at both sites, and less than half of the dependent errors were identified during the same outage. The dependent human errors originating from modifications could be reduced by a more tailored specification and coverage of their start-up testing programs. Improvements could also be achieved by more case-specific planning of the installation inspection and functional testing of complicated maintenance works or work objects of higher plant safety and availability importance. Better use and analysis of condition monitoring information for maintenance steering could also help. The feedback from discussions of the analysis results with plant experts and professionals is still crucial in developing the final conclusions and recommendations that meet the specific development needs at the plants. (au)

  13. Identification and Assessment of Human Errors in Postgraduate Endodontic Students of Kerman University of Medical Sciences by Using the SHERPA Method

    Directory of Open Access Journals (Sweden)

    Saman Dastaran

    2016-03-01

    Full Text Available Introduction: Human errors are the cause of many accidents, including industrial and medical ones; therefore, finding an approach for identifying and reducing them is very important. Since no study had been done on human errors in the dental field, this study aimed to identify and assess human errors among postgraduate endodontic students of Kerman University of Medical Sciences by using the SHERPA method. Methods: This cross-sectional study was performed during 2014. Data were collected through task observation and interviews with postgraduate endodontic students. Overall, 10 critical tasks, which were most likely to cause harm to patients, were determined. Next, Hierarchical Task Analysis (HTA) was conducted and human errors in each task were identified with the Systematic Human Error Reduction and Prediction Approach (SHERPA) technique worksheets. Results: After analyzing the SHERPA worksheets, 90 human errors were identified, including action errors (67.7%), checking errors (13.3%), selection errors (8.8%), retrieval errors (5.5%) and communication errors (4.4%). Most were thus action errors and the fewest were communication errors. Conclusions: The results of the study showed that the highest percentage of errors and the highest level of risk were associated with action errors; therefore, to reduce the occurrence of such errors and limit their consequences, control measures including periodic training on work procedures, provision of work checklists, development of guidelines and establishment of a systematic and standardized reporting system should be put in place. Regarding the results of this study, the control of retrieval errors, with the highest percentage of undesirable risk, and of action errors, with the highest frequency, should be given priority

  14. An Analysis and Quantification Method of Human Errors of Soft Controls in Advanced MCRs

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jae Whan; Jang, Seung Cheol

    2011-01-01

    In this work, a method was proposed for quantifying human errors that may occur during operation executions using soft control. Soft controls of advanced main control rooms (MCRs) have totally different features from conventional controls, and thus they may have different human error modes and occurrence probabilities. It is important to define the human error modes and to quantify the error probability for evaluating the reliability of the system and preventing errors. This work suggests a modified K-HRA method for quantifying error probability

  15. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process complements shortages in existing methodologies by incorporating improvement efficiency, and it enhances the depth and broadness of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
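
    The evaluation step the abstract mentions can be illustrated with an ordinary (crisp) TOPSIS ranking, which is the backbone of the fuzzy variant: normalise the decision matrix, weight it, and rank alternatives by closeness to the ideal solution. The factor names, scores, and weights below are invented purely to show the computation; the study's fuzzy version additionally carries linguistic ratings as fuzzy numbers.

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by the TOPSIS closeness coefficient.
    matrix[i][j]: score of alternative i on criterion j,
    benefit[j]: True if larger is better on criterion j."""
    n_alt, n_crit = len(matrix), len(weights)
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    v = [[matrix[i][j] / norms[j] * weights[j] for j in range(n_crit)]
         for i in range(n_alt)]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)   # distance to the ideal solution
        d_neg = math.dist(row, anti)    # distance to the anti-ideal
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Hypothetical error factors scored on two criteria (e.g. ease of fix and
# expected benefit), both treated as benefit criteria with equal weight.
factors = ["fatigue", "communication", "time pressure"]
scores = topsis([[7, 5], [9, 8], [4, 6]], [0.5, 0.5], [True, True])
for f, s in sorted(zip(factors, scores), key=lambda x: -x[1]):
    print(f"{f:15s} closeness = {s:.2f}")
```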

  16. The Concept of Human Error and the Design of Reliable Human-Machine Systems

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1995-01-01

    The concept of human error is unreliable as a basis for design of reliable human-machine systems. Humans are basically highly adaptive and 'errors' are closely related to the process of adaptation and learning. Therefore, reliability of system operation depends on an interface that is not designed...... so as to support a pre-conceived operating procedure, but, instead, makes visible the deep, functional structure of the system together with the boundaries of acceptable operation in away that allows operators to 'touch' the boundaries and to learn to cope with the effects of errors in a reversible...... way. The concepts behind such 'ecological' interfaces are discussed, an it is argued that a 'typology' of visualization concepts is a pressing research need....

  17. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    International Nuclear Information System (INIS)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Won Dea

    2014-01-01

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces that are based on computer-based technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are a particularly important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, and touch screens, operators can select a specific screen, then choose the controller, and finally manipulate the given devices. Due to the different interfaces between soft control and hardwired conventional type control, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for advanced MCRs. In other words, new human error modes should be considered for interface management tasks such as navigation tasks, and icon (device) selection tasks in monitors and a new framework of HRA method taking these newly generated human error modes into account should be considered. In this paper, a conceptual framework for a HRA method for the evaluation of soft control execution human error in advanced MCRs is suggested by analyzing soft control tasks

  18. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    Energy Technology Data Exchange (ETDEWEB)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-08-15

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces that are based on computer-based technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are a particularly important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, and touch screens, operators can select a specific screen, then choose the controller, and finally manipulate the given devices. Due to the different interfaces between soft control and hardwired conventional type control, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for advanced MCRs. In other words, new human error modes should be considered for interface management tasks such as navigation tasks, and icon (device) selection tasks in monitors and a new framework of HRA method taking these newly generated human error modes into account should be considered. In this paper, a conceptual framework for a HRA method for the evaluation of soft control execution human error in advanced MCRs is suggested by analyzing soft control tasks.

  19. Findings from analysing and quantifying human error using current methods

    International Nuclear Information System (INIS)

    Dang, V.N.; Reer, B.

    1999-01-01

    In human reliability analysis (HRA), the scarcity of data means that, at best, judgement must be applied to transfer to the domain of the analysis what data are available for similar tasks. In particular for the quantification of tasks involving decisions, the analyst has to choose among quantification approaches that all depend to a significant degree on expert judgement. The use of expert judgement can be made more reliable by eliciting relative judgements rather than absolute judgements. These approaches, which are based on multiple criterion decision theory, focus on ranking the tasks to be analysed by difficulty. While these approaches remedy at least partially the poor performance of experts in the estimation of probabilities, they nevertheless require the calibration of the relative scale on which the actions are ranked in order to obtain the probabilities of interest. This paper presents some results from a comparison of some current HRA methods performed in the frame of a study of SLIM calibration options. The HRA quantification methods THERP, HEART, and INTENT were applied to derive calibration human error probabilities for two groups of operator actions. (author)
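
    The calibration issue described here concerns SLIM's logarithmic relation between the success likelihood index (SLI) and the HEP, log10(HEP) = a·SLI + b, where the constants a and b are fixed by tasks with known (anchor) HEPs. The snippet below shows that calibration step with invented anchor values; the actual calibration HEPs derived in the paper from THERP, HEART, and INTENT are not reproduced.

```python
import math

def calibrate_slim(sli1, hep1, sli2, hep2):
    """Solve log10(HEP) = a*SLI + b from two anchor tasks with known HEPs."""
    a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
    b = math.log10(hep1) - a * sli1
    return a, b

def slim_hep(sli, a, b):
    """HEP implied by the calibrated SLIM relation for a ranked task."""
    return 10 ** (a * sli + b)

# Hypothetical anchors: an easy task (SLI 9, HEP 1e-4) and a hard task
# (SLI 2, HEP 1e-1); the task of interest was ranked at SLI 6.
a, b = calibrate_slim(9, 1e-4, 2, 1e-1)
print(f"HEP at SLI 6 ~ {slim_hep(6, a, b):.1e}")
```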

  20. Development of Human Factor Management Requirements and Human Error Classification for the Prevention of Railway Accident

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Park, Chan Woo; Shin, Seung Ryoung

    2008-08-01

    Railway accident analysis results show that accidents caused by human factors are not decreasing, whereas H/W-related accidents are steadily decreasing. For the efficient management of human factors, much expertise on design, conditions, safety culture and staffing is required. However, current safety management activities for safety-critical work are focused on training, due to limited resources and information. In order to improve railway safety, human factors management requirements for safety-critical workers and a human error classification are proposed in this report. For this purpose, accident analyses, the status of safety measures on human factors, the safety management system for safety-critical workers, and current safety planning are analysed

  1. Current pulse: can a production system reduce medical errors in health care?

    Science.gov (United States)

    Printezis, Antonios; Gopalakrishnan, Mohan

    2007-01-01

    One of the reasons for rising health care costs is medical errors, a majority of which result from faulty systems and processes. Health care has previously used process-based initiatives such as Total Quality Management, Continuous Quality Improvement, and Six Sigma to reduce errors. These initiatives to redesign health care, reduce errors, and improve overall efficiency and customer satisfaction have had moderate success. The current trend is to apply the successful Toyota Production System (TPS) to health care, since its organizing principles have led to tremendous improvements in productivity and quality for Toyota and other businesses that have adapted them. This article presents insights on the effectiveness of TPS principles in health care and the challenges that lie ahead in successfully integrating this approach with other quality initiatives.

  2. 75 FR 18514 - Developing Guidance on Naming, Labeling, and Packaging Practices to Reduce Medication Errors...

    Science.gov (United States)

    2010-04-12

    ... packaging designs. Among these measures, FDA agreed that by the end of FY 2010, after public consultation... product names and designing product labels and packaging to reduce medication errors. Four panel... of product packaging design, and costs associated with designing product packaging. Panel 3 will...

  3. Scaling prediction errors to reward variability benefits error-driven learning in humans.

    Science.gov (United States)

    Diederen, Kelly M J; Schultz, Wolfram

    2015-09-01

    Effective error-driven learning requires individuals to adapt learning to environmental reward variability. The adaptive mechanism may involve decays in learning rate across subsequent trials, as shown previously, and rescaling of reward prediction errors. The present study investigated the influence of prediction error scaling and, in particular, the consequences for learning performance. Participants explicitly predicted reward magnitudes that were drawn from different probability distributions with specific standard deviations. By fitting the data with reinforcement learning models, we found scaling of prediction errors, in addition to the learning rate decay shown previously. Importantly, the prediction error scaling was closely related to learning performance, defined as accuracy in predicting the mean of reward distributions, across individual participants. In addition, participants who scaled prediction errors relative to standard deviation also presented with more similar performance for different standard deviations, indicating that increases in standard deviation did not substantially decrease "adapters'" accuracy in predicting the means of reward distributions. However, exaggerated scaling beyond the standard deviation resulted in impaired performance. Thus efficient adaptation makes learning more robust to changing variability. Copyright © 2015 the American Physiological Society.
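
    The adaptive mechanism described above can be sketched as a simple delta-rule learner in which the reward prediction error is divided by a running estimate of the reward standard deviation and the learning rate decays over trials. This is an illustrative implementation with made-up parameters, not the reinforcement learning model fitted to the participants' data.

```python
import random

def simulate_learner(rewards, scale_by_sd=True, alpha0=0.8, decay=0.1):
    """Delta-rule learner; optionally scales the prediction error by a running
    estimate of reward variability before updating the value estimate."""
    v, var = 0.0, 1.0                             # value and running variance estimates
    for t, r in enumerate(rewards):
        delta = r - v                             # reward prediction error
        var += 0.1 * (delta ** 2 - var)           # track outcome variability
        sd = max(var ** 0.5, 1e-6)
        alpha = alpha0 / (1.0 + decay * t)        # learning-rate decay across trials
        v += alpha * (delta / sd if scale_by_sd else delta)
    return v

random.seed(0)
for sd in (5.0, 20.0):                            # low- vs high-variability context
    rewards = [random.gauss(50.0, sd) for _ in range(200)]
    print(f"reward SD {sd:>4}: final estimate of the mean = {simulate_learner(rewards):.1f}")
```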

  4. A Human Error Analysis Procedure for Identifying Potential Error Modes and Influencing Factors for Test and Maintenance Activities

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun

    2010-01-01

    Periodic or non-periodic test and maintenance (T and M) activities in large, complex systems such as nuclear power plants (NPPs) are essential for sustaining stable and safe operation. On the other hand, it has also been noted that erroneous human actions during T and M activities can cause unplanned reactor trips (RTs) or power derating, make safety-related systems unavailable, or degrade the reliability of components. The contribution of human errors during normal and abnormal NPP activities to unplanned RTs is known to be about 20% of the total events. This paper introduces a procedure for predictively analyzing human error potential when maintenance personnel perform T and M tasks based on a work procedure or their work plan. The procedure helps the plant maintenance team prepare for plausible human errors. It focuses on recurrent error forms (or modes) in execution-based errors, such as wrong object, omission, too little, and wrong action.

  5. Cognitive emotion regulation enhances aversive prediction error activity while reducing emotional responses.

    Science.gov (United States)

    Mulej Bratec, Satja; Xie, Xiyao; Schmid, Gabriele; Doll, Anselm; Schilbach, Leonhard; Zimmer, Claus; Wohlschläger, Afra; Riedl, Valentin; Sorg, Christian

    2015-12-01

    Cognitive emotion regulation is a powerful way of modulating emotional responses. However, despite the vital role of emotions in learning, it is unknown whether the effect of cognitive emotion regulation also extends to the modulation of learning. Computational models indicate prediction error activity, typically observed in the striatum and ventral tegmental area, as a critical neural mechanism involved in associative learning. We used model-based fMRI during aversive conditioning with and without cognitive emotion regulation to test the hypothesis that emotion regulation would affect prediction error-related neural activity in the striatum and ventral tegmental area, reflecting an emotion regulation-related modulation of learning. Our results show that cognitive emotion regulation reduced emotion-related brain activity, but increased prediction error-related activity in a network involving ventral tegmental area, hippocampus, insula and ventral striatum. While the reduction of response activity was related to behavioral measures of emotion regulation success, the enhancement of prediction error-related neural activity was related to learning performance. Furthermore, functional connectivity between the ventral tegmental area and ventrolateral prefrontal cortex, an area involved in regulation, was specifically increased during emotion regulation and likewise related to learning performance. Our data, therefore, provide first-time evidence that beyond reducing emotional responses, cognitive emotion regulation affects learning by enhancing prediction error-related activity, potentially via tegmental dopaminergic pathways. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. The effect on dose accumulation accuracy of inverse-consistency and transitivity error reduced deformation maps

    International Nuclear Information System (INIS)

    Hardcastle, Nicholas; Bender, Edward T.; Tomé, Wolfgang A.

    2014-01-01

    It has previously been shown that deformable image registrations (DIRs) often result in deformation maps that are neither inverse-consistent nor transitive, and that the dose accumulation based on these deformation maps can be inconsistent if different image pathways are used for dose accumulation. A method presented to reduce inverse consistency and transitivity errors has been shown to result in more consistent dose accumulation, regardless of the image pathway selected for dose accumulation. The present study investigates the effect on the dose accumulation accuracy of deformation maps processed to reduce inverse consistency and transitivity errors. A set of lung 4DCT phases were analysed, consisting of four images on which a dose grid was created. Dose to 75 corresponding anatomical locations was manually tracked. Dose accumulation was performed between all image sets with Demons derived deformation maps as well as deformation maps processed to reduce inverse consistency and transitivity errors. The ground truth accumulated dose was then compared with the accumulated dose derived from DIR. Two dose accumulation image pathways were considered. The post-processing method to reduce inverse consistency and transitivity errors had minimal effect on the dose accumulation accuracy. There was a statistically significant improvement in dose accumulation accuracy for one pathway, but for the other pathway there was no statistically significant difference. A post-processing technique to reduce inverse consistency and transitivity errors has a positive, yet minimal effect on the dose accumulation accuracy. Thus the post-processing technique improves consistency of dose accumulation with minimal effect on dose accumulation accuracy.
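
    A basic quantity behind this kind of post-processing is the inverse-consistency error: composing the forward deformation with the backward deformation should return every voxel to its starting position, and the residual displacement measures the inconsistency. The sketch below computes that residual for a pair of hypothetical 2-D displacement fields; it is not the correction algorithm evaluated in the study.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def inverse_consistency_error(fwd, bwd):
    """Residual displacement after composing forward and backward deformation
    fields (shape: 2 x H x W, in voxel units). A perfectly inverse-consistent
    pair would give zero everywhere."""
    grid = np.indices(fwd.shape[1:]).astype(float)      # identity coordinates
    warped = grid + fwd                                  # x -> T_fwd(x)
    # sample the backward field at the forward-deformed positions
    bwd_at_warped = np.stack([
        map_coordinates(bwd[c], warped, order=1, mode='nearest')
        for c in range(2)
    ])
    residual = fwd + bwd_at_warped                       # T_bwd(T_fwd(x)) - x
    return np.linalg.norm(residual, axis=0)

# Hypothetical smooth fields that are only approximately inverses of each other
y, x = np.indices((64, 64))
fwd = np.stack([2.0 * np.sin(x / 10.0), np.zeros_like(x, float)])
bwd = np.stack([-2.0 * np.sin(x / 10.0) * 0.95, np.zeros_like(x, float)])
err = inverse_consistency_error(fwd, bwd)
print(f"mean inverse-consistency error: {err.mean():.3f} voxels")
```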

  7. Review of U.S. Army Unmanned Aerial Systems Accident Reports: Analysis of Human Error Contributions

    Science.gov (United States)

    2018-03-20

    within report documents. The information presented was obtained through a request to use the U.S. Army Combat Readiness Center’s Risk Management ...controlled flight into terrain (13 accidents), fueling errors by improper techniques (7 accidents), and a variety of maintenance errors (10 accidents). The...and 9 of the 10 maintenance accidents. Table 4. Frequencies Based on Source of Human Error Human error source Presence Poor Planning

  8. Savannah River Site human error data base development for nonreactor nuclear facilities

    International Nuclear Information System (INIS)

    Benhardt, H.C.; Held, J.E.; Olsen, L.M.; Vail, R.E.; Eide, S.A.

    1994-01-01

    As part of an overall effort to upgrade and streamline methodologies for safety analyses of nonreactor nuclear facilities at the Savannah River Site (SRS), a human error data base has been developed and is presented in this report. The data base fulfills several needs of risk analysts supporting safety analysis report (SAR) development. First, it provides a single source for probabilities or rates for a wide variety of human errors associated with the SRS nonreactor nuclear facilities. Second, it provides a documented basis for human error probabilities or rates. And finally, it provides actual SRS-specific human error data to support many of the error probabilities or rates. Use of a single, documented reference source for human errors, supported by SRS-specific human error data, will improve the consistency and accuracy of human error modeling by SRS risk analysts. It is envisioned that SRS risk analysts will use this report both as a guide to identifying the types of human errors that may need to be included in risk models such as fault and event trees, and as a source for human error probabilities or rates. For each human error in this report, several different mean probabilities or rates are presented to cover a wide range of conditions and influencing factors. The risk analysts must decide which mean value is most appropriate for each particular application. If other types of human errors are needed for the risk models, the analyst must use other sources. Finally, if human errors are dominant in the quantified risk models (based on the values obtained from this report), then it may be appropriate to perform detailed human reliability analyses (HRAs) for the dominant events. This document does not provide guidance for such refined HRAs; in such cases experienced human reliability analysts should be involved.

  9. Identification and Evaluation of Human Errors in the Medication Process Using the Extended CREAM Technique

    Directory of Open Access Journals (Sweden)

    Iraj Mohammadfam

    2017-10-01

    Full Text Available Background: The medication process is a powerful instrument for curing patients, and following it correctly plays an important role in the treatment and provision of care. Medication error, as a complicated process, can occur at any stage, and avoiding it requires appropriate decision-making, cognition, and performance by hospital staff. Objectives: The present study aimed at identifying and evaluating the nature and causes of human errors in the medication process in a hospital using the extended CREAM method. Methods: This was a qualitative, cross-sectional study conducted in a hospital in Hamadan. First, the medication process was selected as a critical issue based on the opinions of experts, specialists, and experienced individuals in the nursing and medical departments. The process was then broken down into steps and substeps using hierarchical task analysis (HTA) and evaluated with the extended CREAM technique to estimate the probability of human errors. Results: Based on the findings of the basic CREAM method, the highest CFPt was in the step of administering medicine to patients (0.056). The results also revealed that the highest CFPt values were in the substeps of calculating the medicine dose and determining the method of prescription, and identifying the patient (0.0796 and 0.0785, respectively), while the lowest CFPt was related to transcribing the prescribed medicine from the file to the medicine worksheet (0.0106). Conclusions: Considering the critical consequences of human errors in the medication process, holding pharmacological retraining classes, applying the principles of executing pharmaceutical orders, increasing medical personnel, reducing overtime work, organizing work shifts, and using error reporting systems are of paramount importance.
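
    In the extended CREAM quantification used here, a cognitive failure probability is typically obtained by adjusting a nominal value with weighting factors derived from the assessment of the common performance conditions (CPCs). The sketch below shows that multiplication step; the nominal values and weights are illustrative, not the values from the CREAM tables or from this study's assessment.

```python
# Illustrative nominal cognitive failure probabilities (CFP0); the real values
# come from the CREAM tables and the hospital-specific assessment.
NOMINAL_CFP = {
    "wrong identification": 7.0e-2,    # observation-type failure (illustrative)
    "wrong dose calculation": 1.0e-2,  # interpretation-type failure (illustrative)
}

def extended_cream_cfp(nominal_cfp, cpc_weights):
    """Extended CREAM: adjusted CFP = nominal CFP x product of CPC weighting factors."""
    cfp = nominal_cfp
    for w in cpc_weights.values():
        cfp *= w
    return min(cfp, 1.0)

weights = {                       # hypothetical assessment of the medication task
    "adequacy of organisation": 1.0,
    "working conditions": 2.0,    # distracting ward environment
    "available time": 5.0,        # continuously inadequate time
    "adequacy of training": 0.8,
}
cfp = extended_cream_cfp(NOMINAL_CFP["wrong dose calculation"], weights)
print(f"adjusted CFP for dose calculation: {cfp:.3f}")
```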

  10. PRA (probabilistic risk analysis) in the nuclear sector. Quantifying human error and human malice

    International Nuclear Information System (INIS)

    Heyes, A.G.

    1995-01-01

    Regardless of the regulatory style chosen ('command and control' or 'functional') a vital prerequisite for coherent safety regulations in the nuclear power industry is the ability to assess accident risk. In this paper we present a critical analysis of current techniques of probabilistic risk analysis applied in the industry, with particular regard to the problems of quantifying risks arising from, or exacerbated by, human risk and/or human error. (Author)

  11. Human-simulation-based learning to prevent medication error: A systematic review.

    Science.gov (United States)

    Sarfati, Laura; Ranchon, Florence; Vantard, Nicolas; Schwiertz, Vérane; Larbre, Virginie; Parat, Stéphanie; Faudel, Amélie; Rioufol, Catherine

    2018-01-31

    In the past 2 decades, there has been an increasing interest in simulation-based learning programs to prevent medication error (ME). To improve knowledge, skills, and attitudes in prescribers, nurses, and pharmaceutical staff, these methods enable training without directly involving patients. However, best practices for simulation for healthcare providers are as yet undefined. By analysing the current state of experience in the field, the present review aims to assess whether human simulation in healthcare helps to reduce ME. A systematic review was conducted on Medline from 2000 to June 2015, associating the terms "Patient Simulation," "Medication Errors," and "Simulation Healthcare." Reports of technology-based simulation were excluded, to focus exclusively on human simulation in nontechnical skills learning. Twenty-one studies assessing simulation-based learning programs were selected, focusing on pharmacy, medicine or nursing students, or concerning programs aimed at reducing administration or preparation errors, managing crises, or learning communication skills for healthcare professionals. The studies varied in design, methodology, and assessment criteria. Few demonstrated that simulation was more effective than didactic learning in reducing ME. This review highlights a lack of long-term assessment and real-life extrapolation, with limited scenarios and participant samples. These various experiences, however, help in identifying the key elements required for an effective human simulation-based learning program for ME prevention: ie, scenario design, debriefing, and perception assessment. The performance of these programs depends on their ability to reflect reality and on professional guidance. Properly regulated simulation is a good way to train staff in events that happen only exceptionally, as well as in standard daily activities. By integrating human factors, simulation seems to be effective in preventing iatrogenic risk related to ME, if the program is

  12. The use of adaptive radiation therapy to reduce setup error: a prospective clinical study

    International Nuclear Information System (INIS)

    Yan Di; Wong, John; Vicini, Frank; Robertson, John; Horwitz, Eric; Brabbins, Donald; Cook, Carla; Gustafson, Gary; Stromberg, Jannifer; Martinez, Alvaro

    1996-01-01

    , eight patients had completed the study. Their mean systematic setup error was 4 mm with a range of 2 mm to 6 mm before adjustment; and was reduced to 0.8 mm with a range of 0.2 mm to 1.8 mm after adjustments. There was no significant difference in their random setup errors before and after adjustment. Analysis of the block overlap distributions shows that the fractions of the prescribed field areas covered by the daily treatment increased after setup adjustment. The block overlap distributions also show that the magnitude of random setup errors at different field edges were different; 50% of which were small enough to allow the treatment margin to be reduced to 4 mm or less. Results from the on-going treatments of the remaining 12 patients show similar trends and magnitudes, and are not expected to be different. Conclusion: Our prospective study demonstrates that the ART process provides an effective and reliable approach to compensate for the systematic setup error of the individual patient. Adjusting the MLC field allows accurate setup adjustment as small as 2 mm, minimizes the possibility of 'unsettling' the patient and reduces the work load of the therapists. The ART process can be extended to correct for random setup errors by further modification of the MLC field shape and prescribed dose. Most importantly, ART integrates the use of advanced technologies to maximize treatment benefits, and can be important in the implementation of dose escalated conformal therapy

  13. An Integrated Signaling-Encryption Mechanism to Reduce Error Propagation in Wireless Communications: Performance Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Olama, Mohammed M [ORNL; Matalgah, Mustafa M [ORNL; Bobrek, Miljko [ORNL

    2015-01-01

    Traditional encryption techniques require packet overhead, produce processing time delay, and suffer from severe quality of service deterioration due to fades and interference in wireless channels. These issues reduce the effective transmission data rate (throughput) considerably in wireless communications, where data rate with limited bandwidth is the main constraint. In this paper, performance evaluation analyses are conducted for an integrated signaling-encryption mechanism that is secure and enables improved throughput and probability of bit-error in wireless channels. This mechanism eliminates the drawbacks stated herein by encrypting only a small portion of an entire transmitted frame, while the rest is not subject to traditional encryption but goes through a signaling process (designed transformation) with the plaintext of the portion selected for encryption. We also propose to incorporate error correction coding solely on the small encrypted portion of the data to drastically improve the overall bit-error rate performance while not noticeably increasing the required bit-rate. We focus on validating the signaling-encryption mechanism utilizing Hamming and convolutional error correction coding by conducting an end-to-end system-level simulation-based study. The average probability of bit-error and throughput of the encryption mechanism are evaluated over standard Gaussian and Rayleigh fading-type channels and compared to the ones of the conventional advanced encryption standard (AES).

  14. An improved approach to reduce partial volume errors in brain SPET

    International Nuclear Information System (INIS)

    Hatton, R.L.; Hatton, B.F.; Michael, G.; Barnden, L.; QUT, Brisbane, QLD; The Queen Elizabeth Hospital, Adelaide, SA

    1999-01-01

    Full text: Limitations in SPET resolution give rise to significant partial volume error (PVE) in small brain structures. We have investigated a previously published method (Muller-Gartner et al., J Cereb Blood Flow Metab 1992;16: 650-658) to correct PVE in grey matter using MRI. An MRI is registered and segmented to obtain a grey matter tissue volume, which is then smoothed to a resolution matched to the corresponding SPET. By dividing the original SPET by this correction map, structures can be corrected for PVE on a pixel-by-pixel basis. Since this approach is limited by space-invariant filtering, a modification was made by estimating projections for the segmented MRI and reconstructing these using parameters identical to those of the SPET. The methods were tested on simulated brain scans reconstructed with the ordered subsets EM algorithm (8, 16, 32, 64 equivalent EM iterations). The new method provided better recovery visually. For 32 EM iterations, recovery coefficients were calculated for grey matter regions. The effects of potential errors in the method were examined. Mean recovery was unchanged with a one-pixel registration error, the maximum error found in most registration programs. Segmentation errors greater than 2 pixels resulted in a loss of accuracy for small structures. The method promises to be useful for reducing PVE in brain SPET.
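
    The core of the published correction is a voxel-wise division of the emission image by a grey-matter map smoothed to the scanner resolution. A minimal sketch of that step on a synthetic image is given below; the image, mask, and resolution are made up, and the projection-based modification described above is not included.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def pve_correct(spect, grey_mask, fwhm_voxels, threshold=0.25):
    """Divide the SPET image by a grey-matter map smoothed to the scanner
    resolution (Muller-Gartner-style correction); voxels where the smoothed
    map is very small are left uncorrected to avoid noise amplification."""
    sigma = fwhm_voxels / 2.355                        # FWHM -> Gaussian sigma
    correction_map = gaussian_filter(grey_mask.astype(float), sigma)
    corrected = spect.astype(float).copy()
    valid = correction_map > threshold
    corrected[valid] = spect[valid] / correction_map[valid]
    return corrected

# Hypothetical thin cortical ribbon in a synthetic 2-D image
grey = np.zeros((64, 64)); grey[30:33, 20:44] = 1.0
true_activity = 100.0 * grey
spect = gaussian_filter(true_activity, 8.0 / 2.355)   # simulate ~8-voxel FWHM blur
recovered = pve_correct(spect, grey, fwhm_voxels=8.0)
print(f"uncorrected peak: {spect.max():.1f}, corrected peak: {recovered.max():.1f}")
```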

  15. AGAPE-ET for human error analysis of emergency tasks and its application

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2002-01-01

    The paper presents a proceduralised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), covering both qualitative error analysis and quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. The AGAPE-ET method is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified by considering the characteristics of the performance of that cognitive function and the mechanisms by which performance influencing factors (PIFs) affect it. Error analysis items were then determined from the identified error causes or error-likely situations, and a human error analysis procedure based on these items was organised to cue and guide the analysts through the overall human error analysis. The basic scheme for the quantification of the HEP consists of multiplying the BHEP assigned to the error analysis item by the weight obtained from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by a structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts need only work through the relevant cognitive functions. The paper also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results.
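
    The quantification scheme described above reduces to a single multiplication, which can be sketched as follows; the BHEP and IFDT weight used here are hypothetical numbers rather than values taken from the method's tables.

```python
def agape_et_hep(bhep, ifdt_weight, hep_max=1.0):
    """HEP = BHEP assigned to the error analysis item x weight read off the
    influencing factors decision tree (IFDT) for the relevant cognitive
    function, capped at 1.0."""
    return min(bhep * ifdt_weight, hep_max)

# Hypothetical numbers: a diagnosis-related error analysis item with a BHEP of
# 1e-3 and an IFDT branch giving a weight of 10 (e.g. poor procedures, time pressure)
print(agape_et_hep(bhep=1.0e-3, ifdt_weight=10.0))   # -> 0.01
```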

  16. Comparison of risk sensitivity to human errors in the Oconee and LaSalle PRAs

    International Nuclear Information System (INIS)

    Wong, S.; Higgins, J.

    1991-01-01

    This paper describes comparative analyses of plant risk sensitivity to human errors in the Oconee and LaSalle Probabilistic Risk Assessments (PRAs). These analyses were performed to determine the reasons for the observed differences in the sensitivity of core melt frequency (CMF) to changes in human error probabilities (HEPs). Plant-specific design features, PRA methods, and the level of detail and assumptions in the human error modeling were evaluated to assess their influence on risk estimates and sensitivities.

  17. Electronic laboratory system reduces errors in National Tuberculosis Program: a cluster randomized controlled trial.

    Science.gov (United States)

    Blaya, J A; Shin, S S; Yale, G; Suarez, C; Asencios, L; Contreras, C; Rodriguez, P; Kim, J; Cegielski, P; Fraser, H S F

    2010-08-01

    To evaluate the impact of the e-Chasqui laboratory information system in reducing reporting errors compared to the existing paper system. Cluster randomized controlled trial in 76 health centers (HCs) between 2004 and 2008. Baseline data were collected every 4 months for 12 months. HCs were then randomly assigned to intervention (e-Chasqui) or control (paper). Further data were collected for the same months the following year. Comparisons were made between intervention and control HCs, and before and after the intervention. Intervention HCs had respectively 82% and 87% fewer errors in reporting results for drug susceptibility tests (2.1% vs. 11.9%, P = 0.001, OR 0.17, 95%CI 0.09-0.31) and cultures (2.0% vs. 15.1%). e-Chasqui users sent on average three electronic error reports per week to the laboratories. e-Chasqui reduced the number of missing laboratory results at point-of-care health centers. Clinical users confirmed viewing electronic results not available on paper. Reporting errors to the laboratory using e-Chasqui promoted continuous quality improvement. The e-Chasqui laboratory information system is an important part of laboratory infrastructure improvements to support multidrug-resistant tuberculosis care in Peru.

  18. Human errors identification using the human factors analysis and classification system technique (HFACS)

    Directory of Open Access Journals (Sweden)

    G. A. Shirali

    2013-12-01

    Results: In this study, 158 accident reports from the Ahvaz steel industry were analyzed with the HFACS technique. The analysis showed that most of the human errors were related, at the first level, to skill-based errors; at the second level, to the physical environment; at the third level, to inadequate supervision; and at the fourth level, to resource management. Conclusion: Studying and analyzing past events using the HFACS technique can identify the major and root causes of accidents and can help prevent the repetition of such mishaps. It can also be used as a basis for developing strategies to prevent future events in steel industries.

  19. Benefits and risks of using smart pumps to reduce medication error rates: a systematic review.

    Science.gov (United States)

    Ohashi, Kumiko; Dalleur, Olivia; Dykes, Patricia C; Bates, David W

    2014-12-01

    Smart infusion pumps have been introduced to prevent medication errors and have been widely adopted nationally in the USA, though they are not always used in Europe or other regions. Despite widespread usage of smart pumps, intravenous medication errors have not been fully eliminated. Through a systematic review of recent studies and reports regarding smart pump implementation and use, we aimed to identify the impact of smart pumps on error reduction and on the complex process of medication administration, and strategies to maximize the benefits of smart pumps. The medical literature related to the effects of smart pumps for improving patient safety was searched in PUBMED, EMBASE, and the Cochrane Central Register of Controlled Trials (CENTRAL) (2000-2014) and relevant papers were selected by two researchers. After the literature search, 231 papers were identified and the full texts of 138 articles were assessed for eligibility. Of these, 22 were included after removal of papers that did not meet the inclusion criteria. We assessed both the benefits and negative effects of smart pumps from these studies. One of the benefits of using smart pumps was intercepting errors such as the wrong rate, wrong dose, and pump setting errors. Other benefits include reduction of adverse drug event rates, practice improvements, and cost effectiveness. Meanwhile, the current issues or negative effects related to using smart pumps were lower compliance rates of using smart pumps, the overriding of soft alerts, non-intercepted errors, or the possibility of using the wrong drug library. The literature suggests that smart pumps reduce but do not eliminate programming errors. Although the hard limits of a drug library play a main role in intercepting medication errors, soft limits were still not as effective as hard limits because of high override rates. Compliance in using smart pumps is key towards effectively preventing errors. Opportunities for improvement include upgrading drug

  20. Reducing hazards for animals from humans

    Directory of Open Access Journals (Sweden)

    Paul-Pierre Pastoret

    2012-06-01

    Full Text Available If animals may be a source of hazards for humans, the reverse is equally true. The main sources of hazards from humans to animals are the introduction of transboundary animal diseases, climate change, globalisation, the introduction of invasive species, and the reduction of biodiversity. There is also a trend toward reduced genetic diversity in domestic animals such as cattle; there are presently around 700 different breeds of cattle, many of which are on the verge of extinction (fewer than 100 reproductive females). The impact of humans is also indirect, through detrimental effects on the environment. It is therefore urgent to implement the new concept of "one health".

  1. Sensitivity of risk parameters to human errors in reactor safety study for a PWR

    International Nuclear Information System (INIS)

    Samanta, P.K.; Hall, R.E.; Swoboda, A.L.

    1981-01-01

    Sensitivities of the risk parameters (emergency safety system unavailabilities, accident sequence probabilities, release category probabilities, and core melt probability) were investigated for changes in the human error rates within the general methodological framework of the Reactor Safety Study (RSS) for a Pressurized Water Reactor (PWR). The impact of individual human errors was assessed both in terms of their structural importance to core melt and their reliability importance on core melt probability. The Human Error Sensitivity Assessment of a PWR (HESAP) computer code was written for the purpose of this study. The code employed a point-estimate approach and ignored the smoothing technique applied in the RSS. It computed point estimates for the system unavailabilities from the median values of the component failure rates and proceeded in terms of point values to obtain point estimates for the accident sequence probabilities, core melt probability, and release category probabilities. The sensitivity measure used was the ratio of the top event probability before and after the perturbation of the constituent events. Core melt probability per reactor year showed a significant increase as the human error rates increased, but did not show a corresponding decrease when the human error rates were decreased, owing to the dominance of the hardware failures. When the minimum human error rate (M.H.E.R.) used is increased to 10^-3, the base case results begin to show sensitivity to human errors. This effort now allows the evaluation of new error rate data along with proposed changes in the man-machine interface.
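
    The sensitivity measure described here, the ratio of the top event probability after and before perturbing the human error rates, can be illustrated on a toy two-train system in which each train fails through either a hardware fault or a human error. The structure and numbers below are invented for illustration and are not the RSS PWR model; they simply reproduce the asymmetry noted above, where hardware failures bound the benefit of lowering HEPs.

```python
def top_event_prob(hep, hw=1e-3):
    """Toy top event: both trains fail, each through human error OR hardware fault.
    Purely illustrative structure, not the RSS PWR model."""
    train = 1 - (1 - hep) * (1 - hw)       # single-train failure probability
    return train * train                  # both redundant trains fail

base_hep = 3e-4
base = top_event_prob(base_hep)
for factor in (0.1, 1.0, 10.0):            # perturb the human error rate
    perturbed = top_event_prob(base_hep * factor)
    print(f"HEP x{factor:>4}: sensitivity ratio = {perturbed / base:.2f}")
```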

  2. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M., Drouin, M.; Bley, D.C.

    1996-01-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, 'A Technique for Human Error Analysis' (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  3. Design of a Computer-Aided System for Analyzing Human Error in Indonesian Railways

    Directory of Open Access Journals (Sweden)

    Wiwik Budiawan

    2013-06-01

    Full Text Available Train accidents (KA) in Indonesia have occurred in rapid succession and are already at a critical level. According to data from the Directorate General of Railways, there were a total of 611 railway accidents in the last five years (2005-2009). Many factors contribute to accidents, including rolling stock, infrastructure, the human operator (human error), external factors, and natural causes. Human failure (human error) is one of the factors that can potentially cause a railway accident and has been identified as the main factor causing railway accidents in Indonesia. However, it is not clear how this analysis is carried out. Human error studies conducted by the National Transportation Safety Committee (KNKT) are still relatively limited and are not supported by a systematic method. Several methods have been developed to date, but few have been developed for the railway transport mode. The Human Factors Analysis and Classification System (HFACS) is a human error analysis method that was developed and adapted to the Indonesian railway system. To improve reliability in human error analysis, HFACS was then implemented as a web-based application that can be accessed on a computer or smartphone. The results of this research can be used by KNKT as a method for analyzing railway accidents, particularly those related to human error. Keywords: human error, HFACS, CAS, railway

  4. Strategies for reducing basis set superposition error (BSSE) in O/Au and O/Ni

    KAUST Repository

    Shuttleworth, I.G.

    2015-01-01

    © 2015 Elsevier Ltd. All rights reserved. The effect of basis set superposition error (BSSE) and effective strategies for its minimisation have been investigated using the SIESTA-LCAO DFT package. Variation of the energy shift parameter ΔEPAO has been shown to reduce BSSE for bulk Au and Ni and across their oxygenated surfaces. Alternative strategies based on either the expansion or contraction of the basis set have been shown to be ineffective in reducing BSSE. The binding energies for the surface systems obtained using LCAO were compared with BSSE-free plane-wave energies.

  6. Human error prediction and countermeasures based on CREAM in spent nuclear fuel (SNF) transportation

    International Nuclear Information System (INIS)

    Kim, Jae San

    2007-02-01

    Since the 1980s, in order to secure storage capacity for spent nuclear fuel (SNF) at NPPs, SNF assemblies have been transported on-site from one unit to a nearby unit. In the future, however, the amount of spent fuel will approach the capacity of the areas used, and some of these SNFs will have to be transported to an off-site spent fuel repository. Most SNF material used at NPPs will be transported by general cargo ship from abroad, and these SNFs will be stored in an interim storage facility. In the process of transporting SNF, human interactions involve inspecting and preparing the cask and spent fuel, loading the cask onto the vehicle or ship, transferring the cask, and storing or monitoring the cask. The transportation of SNF thus involves a number of activities that depend on reliable human performance. In the case of cask transport, human errors may include spent fuel bundle misidentification or cask transport accidents, among others. Reviews of accident events during transport of Radioactive Material (RAM) throughout the world indicate that human error is the major cause of more than 65% of significant events. For the safety of SNF transportation, it is therefore very important to predict human error and to derive methods that minimize it. This study examines the effects of human factors on the safety of transporting spent nuclear fuel (SNF). It predicts and identifies the possible human errors in the SNF transport process (loading, transfer, and storage of the SNF). After evaluating the human error modes in each transport process, countermeasures to minimize human error are derived. The human errors in SNF transportation were analyzed using Hollnagel's Cognitive Reliability and Error Analysis Method (CREAM). After determining the important factors for each process, countermeasures to minimize human error are provided in three areas: system design, operational environment, and human ability.

  7. A method for optical ground station reduce alignment error in satellite-ground quantum experiments

    Science.gov (United States)

    He, Dong; Wang, Qiang; Zhou, Jian-Wei; Song, Zhi-Jun; Zhong, Dai-Jun; Jiang, Yu; Liu, Wan-Sheng; Huang, Yong-Mei

    2018-03-01

    A satellite dedicated to quantum science experiments was developed and successfully launched from Jiuquan, China, on August 16, 2016. Two new optical ground stations (OGSs) were built to cooperate with the satellite in satellite-ground quantum experiments. An OGS corrects its pointing direction using the satellite trajectory error fed to the coarse tracking system and the uplink beacon sight; the alignment accuracy between the fine tracking CCD and the uplink beacon optical axis therefore determines whether the beacon can cover the quantum satellite throughout its pass over the OGS. Unfortunately, when the specifications of the OGSs were tested, because the coarse tracking optical system was a commercial telescope, the position of the target in the coarse CCD changed by up to 600 μrad as the elevation angle changed. In this paper, a method to reduce the alignment error between the beacon beam and the fine tracking CCD is proposed. First, the OGS fits the curve of target positions in the coarse CCD as a function of elevation angle. Second, the OGS fits the curve of hexapod secondary mirror positions as a function of elevation angle. Third, when tracking the satellite, the fine tracking error is unloaded onto the real-time zero-point position of the coarse CCD computed from the first calibration curve, while the positions of the hexapod secondary mirror are adjusted according to the second calibration curve. Finally, experimental results are presented; they show that the alignment error is less than 50 μrad.

  8. Novel error propagation approach for reducing H2S/O2 reaction mechanism

    International Nuclear Information System (INIS)

    Selim, H.; Gupta, A.K.; Sassi, M.

    2012-01-01

    A reduction strategy for the hydrogen sulfide/oxygen reaction mechanism is presented to simplify the detailed mechanism. The directed relation graph with error propagation (DRGEP) methodology has been used. A novel approach, the direct elementary reaction error (DERE), has been developed in this study and allowed further reduction of the reaction mechanism. The reduced mechanism has been compared with the detailed mechanism under different conditions to establish its validity. The results obtained from the reduced mechanism show good agreement with those from the detailed mechanism, although some discrepancies were found for some species; hydrogen and oxygen mole fractions showed the largest discrepancy of all combustion products. The reduced mechanism was also found to be capable of tracking the changes in chemical kinetics that occur as the reaction conditions change. A comparison of ignition delay times obtained from the reduced mechanism with previous experimental data showed good agreement. The reduced mechanism was used to track changes in the mechanistic pathways of the Claus reactions as the reaction progresses.
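
    The error-propagation step of DRGEP assigns each species an overall importance equal to the maximum, over all graph paths from a target species, of the product of the direct interaction coefficients along the path; species below a threshold are dropped. The sketch below implements that propagation on a tiny made-up graph; the species, coefficients, and threshold are purely illustrative.

```python
def drgep_importance(direct, targets):
    """Overall interaction coefficients R[s] = max over paths from a target to s
    of the product of direct interaction coefficients along the path (the error
    propagation step of DRGEP). 'direct' maps species -> {neighbour: r in (0, 1]}."""
    R = {s: 0.0 for s in direct}
    for t in targets:
        R[t] = 1.0
    changed = True
    while changed:                          # max-product relaxation until convergence
        changed = False
        for a, nbrs in direct.items():
            for b, r in nbrs.items():
                if R[a] * r > R[b] + 1e-12:
                    R[b] = R[a] * r
                    changed = True
    return R

# Hypothetical coefficients for a tiny H2S/O2-like skeleton (values are made up)
direct = {
    "H2S": {"SH": 0.9, "SO2": 0.7},
    "SH":  {"S2": 0.3, "SO": 0.8},
    "SO":  {"SO2": 0.9},
    "SO2": {}, "S2": {},
}
R = drgep_importance(direct, targets=["H2S"])
keep = {s for s, v in R.items() if v >= 0.4}   # illustrative cut-off threshold
print(sorted(keep))                             # species retained in the reduced mechanism
```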

  9. Development of an Experimental Measurement System for Human Error Characteristics and a Pilot Test

    International Nuclear Information System (INIS)

    Jang, Tong-Il; Lee, Hyun-Chul; Moon, Kwangsu

    2017-01-01

    Selected items from the individual and team characteristics were measured and evaluated in a pilot test using the experimental measurement system for human error characteristics. This is one of the processes used to produce input data for the Eco-DBMS. The pilot test was also used to establish methods for measuring and acquiring physiological data and to develop data formats and quantification methods for the database. In this study, a pilot test measuring the stress and tension level and team cognitive characteristics, among the human error characteristics, was performed using the human error characteristics measurement and experimental evaluation system. In an experiment measuring the stress level, physiological characteristics were measured using EEG in a simulated unexpected situation. Although this was a pilot experiment, the results showed that relevant data can be obtained for evaluating how well workers' FFD management guidelines and guidelines for unexpected situations help cope with human error. In subsequent research, additional experiments covering other human error characteristics will be conducted. Furthermore, the human error characteristics measurement and experimental evaluation system will be used to validate various human error coping solutions, such as human factors criteria, designs, and guidelines, as well as to supplement the human error characteristics database.

  10. BAYES-HEP: Bayesian belief networks for estimation of human error probability

    International Nuclear Information System (INIS)

    Karthick, M.; Senthil Kumar, C.; Paul, Robert T.

    2017-01-01

    Human errors contribute a significant portion of risk in safety critical applications and methods for estimation of human error probability have been a topic of research for over a decade. The scarce data available on human errors and large uncertainty involved in the prediction of human error probabilities make the task difficult. This paper presents a Bayesian belief network (BBN) model for human error probability estimation in safety critical functions of a nuclear power plant. The developed model using BBN would help to estimate HEP with limited human intervention. A step-by-step illustration of the application of the method and subsequent evaluation is provided with a relevant case study and the model is expected to provide useful insights into risk assessment studies
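
    A BBN-based HEP estimate of the kind described reduces, in its simplest form, to marginalising a conditional probability table over the states of the influencing factors, optionally conditioning on observed evidence. The two-factor network, priors, and probabilities below are invented for illustration; a real application would use plant-specific factors and expert-elicited or data-driven tables.

```python
from itertools import product

# Hypothetical two-parent network: Stress and Training quality -> Operator error.
# Priors and the conditional probability table are illustrative, not plant data.
p_stress = {"high": 0.3, "low": 0.7}
p_training = {"poor": 0.2, "good": 0.8}
cpt_error = {                        # P(error | stress, training)
    ("high", "poor"): 0.10,
    ("high", "good"): 0.02,
    ("low",  "poor"): 0.03,
    ("low",  "good"): 0.005,
}

def marginal_hep(evidence=None):
    """Marginalise the BBN over unobserved parents, optionally fixing evidence."""
    evidence = evidence or {}
    hep = 0.0
    for s, t in product(p_stress, p_training):
        if evidence.get("stress", s) != s or evidence.get("training", t) != t:
            continue                                   # inconsistent with evidence
        weight = (1.0 if "stress" in evidence else p_stress[s]) * \
                 (1.0 if "training" in evidence else p_training[t])
        hep += weight * cpt_error[(s, t)]
    return hep

print(f"prior HEP: {marginal_hep():.4f}")
print(f"HEP given high stress: {marginal_hep({'stress': 'high'}):.4f}")
```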

  11. Research on Human-Error Factors of Civil Aircraft Pilots Based On Grey Relational Analysis

    Directory of Open Access Journals (Sweden)

    Guo Yundong

    2018-01-01

    Full Text Available Because civil aviation accidents involve many human-error factors and show the features of typical grey systems, an index system of civil aviation accident human-error factors is built using the Human Factors Analysis and Classification System (HFACS) model. Using data on accidents that occurred worldwide between 2008 and 2011, the correlations between human-error factors are analyzed quantitatively with grey relational analysis. The results show that the order of the main factors affecting pilot human-error factors is: preconditions for unsafe acts, unsafe supervision, organization, and unsafe acts. The factor related most closely to the second-level indexes and to pilot human-error factors is the physical/mental limitations of pilots, followed by supervisory violations. The relevancy between the first-level indexes and the corresponding second-level indexes, and the relevancy between second-level indexes, can also be analyzed quantitatively.
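
    Grey relational analysis ranks factor series by how closely they track a reference series: absolute differences are converted into grey relational coefficients and averaged into a grade per factor. The sketch below shows the standard computation with a distinguishing coefficient of 0.5 on made-up, normalised yearly values; the numbers are not the paper's accident data.

```python
def grey_relational_grades(reference, factors, zeta=0.5):
    """Grey relational grade of each factor series against the reference series.
    Series are assumed to be already normalised to comparable scales."""
    deltas = {name: [abs(r - x) for r, x in zip(reference, series)]
              for name, series in factors.items()}
    all_d = [d for ds in deltas.values() for d in ds]
    d_min, d_max = min(all_d), max(all_d)
    grades = {}
    for name, ds in deltas.items():
        coeffs = [(d_min + zeta * d_max) / (d + zeta * d_max) for d in ds]
        grades[name] = sum(coeffs) / len(coeffs)
    return grades

# Hypothetical normalised yearly values (2008-2011); not the paper's data.
reference = [1.0, 0.8, 0.9, 0.7]                        # pilot human-error factor
factors = {
    "preconditions for unsafe acts": [0.9, 0.8, 0.8, 0.7],
    "unsafe supervision":            [0.7, 0.6, 0.9, 0.5],
    "organisational influences":     [0.5, 0.4, 0.6, 0.3],
}
for name, g in sorted(grey_relational_grades(reference, factors).items(),
                      key=lambda kv: -kv[1]):
    print(f"{name}: grade = {g:.3f}")
```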

  12. Effective use of pre-job briefing as tool for the prevention of human error

    International Nuclear Information System (INIS)

    Schlump, Ansgar

    2015-01-01

    There is a fundamental demand to minimise the risks for workers and facilities while executing maintenance work. To keep facilities safe and reliable, any deviation from normal operating behaviour has to be avoided. Accurate planning is the basis for minimising mistakes and making work safer. All workers involved should understand how the work is to be done and what is expected, in order to avoid human errors. Especially in nuclear power plants, human performance tools (HPT) have proved to be an effective instrument for minimising human errors. These human performance tools consist of numerous different tools that complement each other (e.g. pre-job briefing). The safety culture of the plants is also characterised by these tools. Choosing the right HP tool is often a difficult task for the work planner: on the one hand, he wants to avoid mistakes during the execution of work, but on the other hand he does not want to burden the workers with unnecessary requirements. The proposed concept uses a simple risk analysis that takes into account the complexity of the task, past experience, and the consequences of failure. One main result of this risk analysis is a recommendation on the level of detail of the pre-job briefing, so as to reduce the risks for the staff involved to a minimum.

  13. The treatment of commission errors in first generation human reliability analysis methods

    Energy Technology Data Exchange (ETDEWEB)

    Alvarengga, Marco Antonio Bayout; Fonseca, Renato Alves da, E-mail: bayout@cnen.gov.b, E-mail: rfonseca@cnen.gov.b [Comissao Nacional de Energia Nuclear (CNEN) Rio de Janeiro, RJ (Brazil); Melo, Paulo Fernando Frutuoso e, E-mail: frutuoso@nuclear.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

    Human errors in human reliability analysis can be classified generically as errors of omission and errors of commission. Omission errors relate to the omission of a human action that should have been performed but did not occur. Errors of commission relate to human actions that should not be performed but are in fact performed. Both involve specific types of cognitive error mechanisms; however, errors of commission are more difficult to model because they are characterized by unanticipated actions that are performed instead of actions that are omitted (omission errors), or that enter an operational task without being part of its normal sequence. The identification of actions that are not supposed to occur depends on the operational context, which can induce or facilitate certain unsafe operator actions depending on the behaviour of its parameters and variables. The survey of operational contexts and associated unsafe actions is a characteristic of second-generation models, unlike first-generation models. This paper discusses how first-generation models can treat errors of commission in the detection, diagnosis, decision-making, and implementation steps of human information processing, particularly through the use of the THERP error quantification tables. (author)

  14. Human error as a source of disturbances in Swedish nuclear power plants

    International Nuclear Information System (INIS)

    Sokolowski, E.

    1985-01-01

    Events involving human errors at the Swedish nuclear power plants are registered and periodically analyzed. The philosophy behind the scheme for data collection and analysis is discussed. Human errors cause about 10% of the disturbances registered. Only a small part of these errors are committed by operators in the control room. These and other findings differ from those in other countries. Possible reasons are put forward

  15. The common mode failures analysis of the redundant system with dependent human error

    International Nuclear Information System (INIS)

    Kim, M.K.; Chang, S.H.

    1983-01-01

    Common mode failures (CMFs) have been a serious concern in nuclear power plants. There is a broad category of failure mechanisms that can cause common mode failures. This paper is a theoretical investigation of the effect of CMFs on the unavailability of a redundant system. It is assumed that the total CMFs consist of the potential CMFs and the dependent human error CMFs. The higher the human error dependency, the more the total CMFs are affected by the dependent human error. If the human error dependence is low, the system unavailability depends strongly on the potential CMFs rather than on mechanical failures or the dependent human error. It is also shown that the total CMFs are the dominant contributor to the unavailability of the redundant system. (Author)
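
    One common way to express this kind of human error dependence between redundant trains is the THERP dependence model, in which the conditional probability of the second error, given the first, increases with the assessed dependence level. The sketch below combines those standard conditional-probability formulas with an illustrative decomposition of system unavailability into independent hardware failures, a dependent pair of human errors, and a potential-CMF term; the numerical values and the simple additive combination are assumptions for illustration, not the paper's model.

```python
# Standard THERP conditional-probability formulas for a second, dependent human
# action given that the first has failed (p = nominal HEP of the second action).
DEPENDENCE = {
    "zero":     lambda p: p,
    "low":      lambda p: (1 + 19 * p) / 20,
    "moderate": lambda p: (1 + 6 * p) / 7,
    "high":     lambda p: (1 + p) / 2,
    "complete": lambda p: 1.0,
}

def redundant_unavailability(q_hw, hep, level, q_cmf_potential=0.0):
    """Unavailability of a two-train redundant system: independent hardware
    failures, a dependent pair of human errors, plus a 'potential CMF' term.
    Illustrative decomposition following the abstract, not the paper's model."""
    q_human = hep * DEPENDENCE[level](hep)     # both trains failed by human error
    return q_hw ** 2 + q_human + q_cmf_potential

for level in ("zero", "low", "moderate", "high", "complete"):
    q = redundant_unavailability(q_hw=1e-3, hep=3e-3, level=level,
                                 q_cmf_potential=1e-5)
    print(f"{level:>9} dependence: system unavailability = {q:.2e}")
```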

  16. Analysis of human error in occupational accidents in the power plant industries using combining innovative FTA and meta-heuristic algorithms

    Directory of Open Access Journals (Sweden)

    M. Omidvari

    2015-09-01

    Conclusion: According to the obtained results, human errors can be greatly reduced by training, by the right choice of workers for each type of occupation, and by the provision of appropriate safety conditions in the workplace.

  17. The Human Bathtub: Safety and Risk Predictions Including the Dynamic Probability of Operator Errors

    International Nuclear Information System (INIS)

    Duffey, Romney B.; Saull, John W.

    2006-01-01

    Reactor safety and risk are dominated by the potential for, and major contribution of, human error in the design, operation, control, management, regulation and maintenance of the plant, and hence to all accidents. Given the possibility of accidents and errors, we need to determine the outcome (error) probability, or the chance of failure. Conventionally, reliability engineering is associated with the failure rate of components, systems, or mechanisms, not of human beings in and interacting with a technological system. The probability of failure requires prior knowledge of the total number of outcomes, which for any predictive purpose we do not know or have. Analysis of failure rates due to human error and of the rate of learning allows a new determination of the dynamic human error rate in technological systems, consistent with and derived from the available world data. The basis for the analysis is the 'learning hypothesis': humans learn from experience, and consequently the accumulated experience defines the failure rate. A new 'best' equation has been derived for the human error, outcome or failure rate, which allows calculation and prediction of the probability of human error. We also provide comparisons to the empirical Weibull parameter fitting used in and by conventional reliability engineering and probabilistic safety analysis methods. These new analyses show that arbitrary Weibull fitting parameters and typical empirical hazard function techniques cannot be used to predict the dynamics of human errors and outcomes in the presence of learning. Comparisons of these new insights show agreement with human error data from the world's commercial airlines, the two shuttle failures, and from nuclear plant operator actions and control behavior observed during transients in both plants and simulators. The results demonstrate that the human error probability (HEP) is dynamic, and that it may be predicted using the learning hypothesis and the minimum
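
    The learning hypothesis is commonly expressed as an error rate that decays exponentially with accumulated experience toward an irreducible minimum. The sketch below evaluates a curve of that general form; the parameter values and experience units are illustrative assumptions, not the constants fitted to the world data in the paper.

```python
import math

def failure_rate(experience, lam0=1e-2, lam_min=5e-6, k=1.0 / 20000):
    """Exponential learning-curve of the general form discussed by Duffey and
    Saull: the rate falls from an initial value toward an irreducible minimum
    as accumulated experience grows. Parameter values are illustrative only."""
    return lam_min + (lam0 - lam_min) * math.exp(-k * experience)

for exp_units in (0, 10_000, 50_000, 200_000):   # e.g. accumulated operating hours
    print(f"experience {exp_units:>7}: error rate = {failure_rate(exp_units):.2e}")
```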

  18. Development and evaluation of a computer-aided system for analyzing human error in railway operations

    International Nuclear Information System (INIS)

    Kim, Dong San; Baek, Dong Hyun; Yoon, Wan Chul

    2010-01-01

    As human error has been recognized as one of the major contributors to accidents in safety-critical systems, there has been a strong need for techniques that can analyze human error effectively. Although many techniques have been developed so far, much room for improvement remains. As human error analysis is a cognitively demanding and time-consuming task, it is particularly necessary to develop a computerized system supporting this task. This paper presents a computer-aided system for analyzing human error in railway operations, called the Computer-Aided System for Human Error Analysis and Reduction (CAS-HEAR). It helps analysts find multiple levels of error causes and their causal relations by using predefined links between contextual factors and causal factors, as well as links between causal factors. In addition, it is based on a complete accident model; hence, it helps analysts conduct a thorough analysis without missing any important part of the human error analysis. A prototype of CAS-HEAR was evaluated by nine field investigators from six railway organizations in Korea. Its overall usefulness in human error analysis was confirmed, although development of a simplified version and some modification of the contextual and causal factors are required in order to ensure its practical use.

  19. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    Shen, S.-H.; Smidts, C.; Mosleh, A.

    1997-01-01

    This paper presents a model-based human error taxonomy and data collection. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of nuclear power plant operating crews during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics that extract generic characteristics of error-prone behaviors and error-prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures, with associated training, to minimize or preclude certain human errors. It also helps in the design of control rooms and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  20. Analysis of Human Error Types and Performance Shaping Factors in the Next Generation Main Control Room

    International Nuclear Information System (INIS)

    Sin, Y. C.; Jung, Y. S.; Kim, K. H.; Kim, J. H.

    2008-04-01

    The main control rooms of nuclear power plants have been computerized and digitalized in new and modernized plants, as information and digital technologies make great progress and become mature. Human factors engineering issues in advanced MCRs were surveyed using a model-based approach and a literature-survey-based approach. Human error types and performance shaping factors were then analyzed for three human errors. The results of the project can be used for task analysis, evaluation of human error probabilities, and analysis of performance shaping factors in HRA.

  1. Checklist Usage as a Guidance on Read-Back Reducing the Potential Risk of Medication Error

    Directory of Open Access Journals (Sweden)

    Ida Bagus N. Maharjana

    2014-06-01

    Full Text Available Hospitals, as the last line of health services, must provide quality care oriented toward patient safety, including responsibility for preventing medication errors. Effective collaboration and communication among the professions are needed to achieve patient safety. Read-back is one way of performing effective communication. This was a before-after study with a PDCA TQM approach. The samples were the medication charts in patient medical records in the 3rd week of May (before) and the 3rd week of July (after) 2013. The intervention used the checklist and asked doctors and nurses for 2 minutes of read-back after their joint visit. A total of 57 samples (before) and 64 samples (after) were obtained. Before the intervention, 45.54% of medication charts in patient medical records were incomplete and carried a potential risk of medication error; this fell to 10.17% after 10 weeks of using the read-back checklist, a 77.78% achievement based on the PDCA TQM approach. Using a checklist as guidance for read-back, as a form of effective communication, can reduce the incompleteness of drug records in medical records that carry a potential risk of medication error, from 45.54% to 10.17%.

  2. Accounting for response misclassification and covariate measurement error improves power and reduces bias in epidemiologic studies.

    Science.gov (United States)

    Cheng, Dunlei; Branscum, Adam J; Stamey, James D

    2010-07-01

    To quantify the impact of ignoring misclassification of a response variable and measurement error in a covariate on statistical power, and to develop software for sample size and power analysis that accounts for these flaws in epidemiologic data. A Monte Carlo simulation-based procedure is developed to illustrate the differences in design requirements and inferences between analytic methods that properly account for misclassification and measurement error and those that do not, in regression models for cross-sectional and cohort data. We found that failure to account for these flaws in epidemiologic data can lead to a substantial reduction in statistical power, over 25% in some cases. The proposed method substantially reduced bias, by up to a ten-fold margin, compared to naive estimates obtained by ignoring misclassification and mismeasurement. We recommend as routine practice that researchers account for errors in measurement of both response and covariate data when determining sample size, performing power calculations, or analyzing data from epidemiological studies. 2010 Elsevier Inc. All rights reserved.
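    The simulation-based idea described above can be sketched in a few lines: simulate a logistic model, degrade the data with response misclassification and covariate measurement error, and compare the empirical power of a naive analysis against error-free data. The sensitivity, specificity, effect size, and error variance below are illustrative assumptions, not values from the paper.

```python
# A minimal Monte Carlo sketch (not the authors' software) of how ignoring
# response misclassification and covariate measurement error erodes power.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

def simulate_power(n=500, beta1=0.5, sens=0.85, spec=0.90,
                   error_sd=0.5, n_sims=500, alpha=0.05):
    rejections = 0
    for _ in range(n_sims):
        x_true = rng.normal(size=n)                       # true covariate
        p = 1 / (1 + np.exp(-(-0.5 + beta1 * x_true)))    # true logistic model
        y_true = rng.binomial(1, p)
        # imperfect response: apply assumed sensitivity/specificity
        y_obs = np.where(y_true == 1,
                         rng.binomial(1, sens, n),
                         rng.binomial(1, 1 - spec, n))
        x_obs = x_true + rng.normal(scale=error_sd, size=n)  # mismeasured covariate
        X = sm.add_constant(x_obs)
        fit = sm.Logit(y_obs, X).fit(disp=0)
        if fit.pvalues[1] < alpha:
            rejections += 1
    return rejections / n_sims

print("naive power with flawed data:", simulate_power())
print("power with error-free data:  ", simulate_power(sens=1.0, spec=1.0, error_sd=0.0))
```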

  3. Impact of Educational Activities in Reducing Pre-Analytical Laboratory Errors: A quality initiative.

    Science.gov (United States)

    Al-Ghaithi, Hamed; Pathare, Anil; Al-Mamari, Sahimah; Villacrucis, Rodrigo; Fawaz, Naglaa; Alkindi, Salam

    2017-08-01

    Pre-analytic errors during diagnostic laboratory investigations can lead to increased patient morbidity and mortality. This study aimed to ascertain the effect of educational nursing activities on the incidence of pre-analytical errors resulting in non-conforming blood samples. This study was conducted between January 2008 and December 2015. All specimens received at the Haematology Laboratory of the Sultan Qaboos University Hospital, Muscat, Oman, during this period were prospectively collected and analysed. Similar data from 2007 were collected retrospectively and used as a baseline for comparison. Non-conforming samples were defined as either clotted samples, haemolysed samples, use of the wrong anticoagulant, insufficient quantities of blood collected, incorrect/lack of labelling on a sample or lack of delivery of a sample in spite of a sample request. From 2008 onwards, multiple educational training activities directed at the hospital nursing staff and nursing students primarily responsible for blood collection were implemented on a regular basis. After initiating corrective measures in 2008, a progressive reduction in the percentage of non-conforming samples was observed from 2009 onwards. Despite a 127.84% increase in the total number of specimens received, there was a significant reduction in non-conforming samples, from 0.29% in 2007 to 0.07% in 2015, an improvement of 75.86%. Educational activities directed primarily towards hospital nursing staff thus had a positive impact on the quality of laboratory specimens by significantly reducing pre-analytical errors.

  4. MMI design of K-CPS for preventing human errors and enhancing convenient operation

    International Nuclear Information System (INIS)

    Sung, Chan Ho; Jung, Yeon Sub; Oh, Eoung Se; Shin, Young Chul; Lee, Yong Kwan

    2001-01-01

    In order to compensate for the shortcomings of paper procedures, reduce human errors and enhance convenient operation, computer-based procedure systems are being developed. A CPS (Computerized Procedure System) incorporating human-factors engineering design concepts for the KNGR (Korean Next Generation Reactor) has also been developed with the same objective. The K-CPS (KNGR CPS) has a higher level of automation than paper procedures and is fully integrated with the control and monitoring systems. Combining statements with the relevant components, which change dynamically according to plant status, enhances the readability of the procedure. This paper presents general design criteria for computer-based procedure systems, the MMI design characteristics of the K-CPS, and the results of a suitability evaluation of the K-CPS by operators.

  5. Role of data and judgment in modeling human errors

    International Nuclear Information System (INIS)

    Carnino, A.

    1986-01-01

    Human beings are not a simple component. This is why prediction of human behaviour in a quantitative way is so difficult. For human reliability analysis, the data sources that can be used are the following: operating experience and incident reports, data banks, data from the literature, data collected from simulators, and data established by expert judgement. The factors important for conducting a good human reliability analysis are then discussed, including the uncertainties to be associated with it. (orig.)

  6. Demonstration Integrated Knowledge-Based System for Estimating Human Error Probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, Jack L.

    1999-04-21

    Human Reliability Analysis (HRA) currently comprises at least 40 different methods that are used to analyze, predict, and evaluate human performance in probabilistic terms. Systematic HRAs allow analysts to examine human-machine relationships, identify error-likely situations, and provide estimates of relative frequencies for human errors on critical tasks, highlighting the most beneficial areas for system improvements. Unfortunately, each of HRA's methods has a different philosophical approach, thereby producing estimates of human error probabilities (HEPs) that are a better or worse match to the error-likely situation of interest. Poor selection of methodology, or the improper application of techniques, can produce invalid HEP estimates, and such erroneous estimation of potential human failure could have severe consequences in terms of the estimated occurrence of injury, death, and/or property damage.

  7. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    OpenAIRE

    Rabiul Islam; Faisal Khan; Rouzbeh Abbassi; Vikram Garaniya

    2018-01-01

    Background: Maintenance operations on-board ships are highly demanding. Maintenance operations are intensive activities requiring high man–machine interactions in challenging and evolving conditions. The evolving conditions are weather conditions, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error, and, consequently, can cause injuries or fatalities to personnel...

  8. Errors in data interpretation from genetic variation of human analytes

    OpenAIRE

    Howie, Heather L.; Delaney, Meghan; Wang, Xiaohong; Er, Lay See; Kapp, Linda; Lebedev, Jenna N.; Zimring, James C.

    2017-01-01

    In recent years, the extent of our vulnerability to misinterpretation due to poorly characterized reagents has become an issue of great concern. Antibody reagents have been identified as a major source of error, contributing to the "reproducibility crisis." In the current report, we define an additional dimension of the crisis; in particular, we define variation of the targets being analyzed. We report that natural variation in the immunoglobulin "constant" region alters the reactivity with c...

  9. Determining The Factors Causing Human Error Deficiencies At A Public Utility Company

    Directory of Open Access Journals (Sweden)

    F. W. Badenhorst

    2004-11-01

    Full Text Available According to Neff (1977), as cited by Bergh (1995), westernised culture considers work important for industrial mental health. Most individuals experience work positively, which creates a positive attitude. Should this positive attitude be inhibited, workers could lose concentration and become bored, potentially resulting in some form of human error. The aim of this research was to determine the factors responsible for human error events that lead to power supply failures at Eskom power stations. Proposals were made for the reduction of these contributing factors towards improving plant performance. The target population was 700 panel operators in Eskom's Power Generation Group. The results showed that factors leading to human error can be reduced or even eliminated.

  10. Exploring human error in military aviation flight safety events using post-incident classification systems.

    Science.gov (United States)

    Hooper, Brionny J; O'Hare, David P A

    2013-08-01

    Human error classification systems theoretically allow researchers to analyze postaccident data in an objective and consistent manner. The Human Factors Analysis and Classification System (HFACS) framework is one such practical analysis tool that has been widely used to classify human error in aviation. The Cognitive Error Taxonomy (CET) is another. It has been postulated that the focus on interrelationships within HFACS can facilitate the identification of the underlying causes of pilot error. The CET provides increased granularity at the level of unsafe acts. The aim was to analyze the influence of factors at higher organizational levels on the unsafe acts of front-line operators and to compare the errors of fixed-wing and rotary-wing operations. This study analyzed 288 aircraft incidents involving human error from an Australasian military organization occurring between 2001 and 2008. Action errors accounted for almost twice (44%) the proportion of rotary wing compared to fixed wing (23%) incidents. Both classificatory systems showed significant relationships between precursor factors such as the physical environment, mental and physiological states, crew resource management, training and personal readiness, and skill-based, but not decision-based, acts. The CET analysis showed different predisposing factors for different aspects of skill-based behaviors. Skill-based errors in military operations are more prevalent in rotary wing incidents and are related to higher level supervisory processes in the organization. The Cognitive Error Taxonomy provides increased granularity to HFACS analyses of unsafe acts.

  11. Probabilistic linkage to enhance deterministic algorithms and reduce data linkage errors in hospital administrative data.

    Science.gov (United States)

    Hagger-Johnson, Gareth; Harron, Katie; Goldstein, Harvey; Aldridge, Robert; Gilbert, Ruth

    2017-06-30

     BACKGROUND: The pseudonymisation algorithm used to link together episodes of care belonging to the same patients in England (HESID) has never undergone any formal evaluation, to determine the extent of data linkage error. To quantify improvements in linkage accuracy from adding probabilistic linkage to existing deterministic HESID algorithms. Inpatient admissions to NHS hospitals in England (Hospital Episode Statistics, HES) over 17 years (1998 to 2015) for a sample of patients (born 13/28th of months in 1992/1998/2005/2012). We compared the existing deterministic algorithm with one that included an additional probabilistic step, in relation to a reference standard created using enhanced probabilistic matching with additional clinical and demographic information. Missed and false matches were quantified and the impact on estimates of hospital readmission within one year were determined. HESID produced a high missed match rate, improving over time (8.6% in 1998 to 0.4% in 2015). Missed matches were more common for ethnic minorities, those living in areas of high socio-economic deprivation, foreign patients and those with 'no fixed abode'. Estimates of the readmission rate were biased for several patient groups owing to missed matches, which was reduced for nearly all groups. CONCLUSION: Probabilistic linkage of HES reduced missed matches and bias in estimated readmission rates, with clear implications for commissioning, service evaluation and performance monitoring of hospitals. The existing algorithm should be modified to address data linkage error, and a retrospective update of the existing data would address existing linkage errors and their implications.
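    A probabilistic step of the kind described is commonly implemented with Fellegi–Sunter-style match weights. The sketch below illustrates the idea with hypothetical fields and m/u probabilities; it is not the HESID algorithm itself.

```python
# A hedged, minimal sketch of adding a probabilistic step to deterministic linkage,
# in the spirit of Fellegi-Sunter match weights. Field names and the m- and
# u-probabilities are illustrative assumptions, not values from the study.
import math

# m = P(field agrees | records truly match), u = P(field agrees | non-match)
FIELD_PARAMS = {
    "nhs_number": (0.99, 0.0001),
    "date_of_birth": (0.97, 0.003),
    "sex": (0.99, 0.5),
    "postcode": (0.90, 0.01),
}

def match_weight(rec_a, rec_b):
    """Sum log2 likelihood ratios over fields; higher = more likely the same person."""
    weight = 0.0
    for field, (m, u) in FIELD_PARAMS.items():
        # missing values are treated as disagreement here for brevity
        if rec_a.get(field) and rec_a.get(field) == rec_b.get(field):
            weight += math.log2(m / u)               # agreement evidence
        else:
            weight += math.log2((1 - m) / (1 - u))   # disagreement evidence
    return weight

a = {"nhs_number": None, "date_of_birth": "1992-03-13", "sex": "F", "postcode": "N1 9GU"}
b = {"nhs_number": None, "date_of_birth": "1992-03-13", "sex": "F", "postcode": "N1 9GU"}
print(match_weight(a, b))   # compare against an acceptance threshold chosen from the data
```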

  12. The application of SHERPA (Systematic Human Error Reduction and Prediction Approach) in the development of compensatory cognitive rehabilitation strategies for stroke patients with left and right brain damage.

    Science.gov (United States)

    Hughes, Charmayne M L; Baber, Chris; Bienkiewicz, Marta; Worthington, Andrew; Hazell, Alexa; Hermsdörfer, Joachim

    2015-01-01

    Approximately 33% of stroke patients have difficulty performing activities of daily living, often committing errors during the planning and execution of such activities. The objective of this study was to evaluate the ability of the human error identification (HEI) technique SHERPA (Systematic Human Error Reduction and Prediction Approach) to predict errors during the performance of daily activities in stroke patients with left and right hemisphere lesions. Using SHERPA we successfully predicted 36 of the 38 observed errors, with analysis indicating that the proportion of predicted and observed errors was similar for all sub-tasks and severity levels. HEI results were used to develop compensatory cognitive strategies that clinicians could employ to reduce or prevent errors from occurring. This study provides evidence for the reliability and validity of SHERPA in the design of cognitive rehabilitation strategies in stroke populations.

  13. Guidelines for system modeling: pre-accident human errors, rev.0

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, W. D.; Lee, Y. H.; Hwang, M. J.; Yang, J. E.

    2004-01-01

    The evaluation results of the Human Reliability Analysis (HRA) of pre-accident human errors in the probabilistic safety assessment (PSA) for the Korea Standard Nuclear Power Plant (KSNP), using the ASME PRA standard, show that more than 50% of the 10 items to be improved are related to their identification and screening analysis. Thus, we developed a guideline for modeling pre-accident human errors to help the system analyst resolve some of these items. The developed guideline consists of modeling criteria for pre-accident human errors (identification, qualitative screening, and common restoration errors) and detailed guidelines for pre-accident human errors relating to testing, maintenance, and calibration work at nuclear power plants (NPPs). The system analyst uses the developed guideline and applies it to the system he or she is responsible for, and the HRA analyst reviews the system analyst's application results. We applied the developed guideline to the auxiliary feed water system of the KSNP to demonstrate its usefulness. The application results show that more than 50% of the items to be improved for pre-accident human errors of the auxiliary feed water system are resolved. The guideline for modeling pre-accident human errors developed in this study can be used for other NPPs as well as the KSNP. It is expected that both the use of the detailed procedure, to be developed in the future, for the quantification of pre-accident human errors and the guideline developed in this study will greatly enhance the PSA quality in the HRA of pre-accident human errors.

  14. Guidelines for system modeling: pre-accident human errors, rev.0

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, W. D.; Lee, Y. H.; Hwang, M. J.; Yang, J. E

    2004-01-01

    The evaluation results of the Human Reliability Analysis (HRA) of pre-accident human errors in the probabilistic safety assessment (PSA) for the Korea Standard Nuclear Power Plant (KSNP), using the ASME PRA standard, show that more than 50% of the 10 items to be improved are related to their identification and screening analysis. Thus, we developed a guideline for modeling pre-accident human errors to help the system analyst resolve some of these items. The developed guideline consists of modeling criteria for pre-accident human errors (identification, qualitative screening, and common restoration errors) and detailed guidelines for pre-accident human errors relating to testing, maintenance, and calibration work at nuclear power plants (NPPs). The system analyst uses the developed guideline and applies it to the system he or she is responsible for, and the HRA analyst reviews the system analyst's application results. We applied the developed guideline to the auxiliary feed water system of the KSNP to demonstrate its usefulness. The application results show that more than 50% of the items to be improved for pre-accident human errors of the auxiliary feed water system are resolved. The guideline for modeling pre-accident human errors developed in this study can be used for other NPPs as well as the KSNP. It is expected that both the use of the detailed procedure, to be developed in the future, for the quantification of pre-accident human errors and the guideline developed in this study will greatly enhance the PSA quality in the HRA of pre-accident human errors.

  15. New method of classifying human errors at nuclear power plants and the analysis results of applying this method to maintenance errors at domestic plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi; Miyazaki, Takamasa; Gofuku, Akio; Iida, Hiroyasu

    2007-01-01

    Since many of the adverse events that have occurred in nuclear power plants in Japan and abroad have been related to maintenance or operation, it is necessary to plan preventive measures based on detailed analyses of human errors made by maintenance workers or operators. Therefore, before planning preventive measures, we developed a new method of analyzing human errors. Since each human error is an unsafe action caused by some misjudgement made by a person, we decided to classify them into six categories according to the stage in the judgment process in which the error was made. By further classifying each error into either an omission-type or commission-type, we produced 12 categories of errors. Then, we divided them into the two categories of basic error tendencies and individual error tendencies, and categorized background factors into four categories: imperfect planning; imperfect facilities or tools; imperfect environment; and imperfect instructions or communication. We thus defined the factors in each category to make it easy to identify the factors that caused the error. Then, using this method, we studied the characteristics of human errors involving maintenance workers and planners, since many maintenance errors have occurred. Among the human errors made by workers (worker errors) during the implementation stage, the following three types accounted for approximately 80%: commission-type 'projection errors', omission-type 'comprehension errors' and commission-type 'action errors'. The most common of the individual factors of worker errors was 'repetition or habit' (schema), based on the assumption of a typical situation, and about half of the 'repetition or habit' (schema) cases were not influenced by any background factors. The most common background factor that contributed to the individual factor was 'imperfect work environment', followed by 'insufficient knowledge'. Approximately 80% of the individual factors were 'repetition or habit' or

  16. Derivation of main drivers affecting the possibility of human errors during low power and shutdown operation

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Seong, Poong Hyun; Park, Jin Kyun; Kim, Jae Whan

    2016-01-01

    In order to estimate the possibility of human error and identify its nature, human reliability analysis (HRA) methods have been implemented. For this, various HRA methods have been developed so far: techniques for human error rate prediction (THERP), cause based decision tree (CBDT), the cognitive reliability and error analysis method (CREAM), and so on. Most HRA methods have been developed with a focus on full power operation of NPPs, even though human performance may affect the safety of the system more during low power and shutdown (LPSD) operation than it does when the system is in full power operation. In this regard, it is necessary to conduct research towards developing an HRA method to be used in LPSD operation. As the first step of the study, main drivers which affect the possibility of human error have been derived. Drivers, commonly called performance shaping factors (PSFs), are aspects of the human's individual characteristics, environment, organization, or task that specifically degrade or improve human performance, thus respectively increasing or decreasing the likelihood of human errors.

  17. Derivation of main drivers affecting the possibility of human errors during low power and shutdown operation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Park, Jin Kyun; Kim, Jae Whan [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In order to estimate the possibility of human error and identify its nature, human reliability analysis (HRA) methods have been implemented. For this, various HRA methods have been developed so far: techniques for human error rate prediction (THERP), cause based decision tree (CBDT), the cognitive reliability and error analysis method (CREAM), and so on. Most HRA methods have been developed with a focus on full power operation of NPPs, even though human performance may affect the safety of the system more during low power and shutdown (LPSD) operation than it does when the system is in full power operation. In this regard, it is necessary to conduct research towards developing an HRA method to be used in LPSD operation. As the first step of the study, main drivers which affect the possibility of human error have been derived. Drivers, commonly called performance shaping factors (PSFs), are aspects of the human's individual characteristics, environment, organization, or task that specifically degrade or improve human performance, thus respectively increasing or decreasing the likelihood of human errors.

  18. Phytosterol glycosides reduce cholesterol absorption in humans

    OpenAIRE

    Lin, Xiaobo; Ma, Lina; Racette, Susan B.; Anderson Spearie, Catherine L.; Ostlund, Richard E.

    2009-01-01

    Dietary phytosterols inhibit intestinal cholesterol absorption and regulate whole body cholesterol excretion and balance. However, they are biochemically heterogeneous and a portion is glycosylated in some foods with unknown effects on biological activity. We tested the hypothesis that phytosterol glycosides reduce cholesterol absorption in humans. Phytosterol glycosides were extracted and purified from soy lecithin in a novel two-step process. Cholesterol absorption was measured in a series ...

  19. Human error mode identification for NPP main control room operations using soft controls

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jaewhan; Jang, Seung-Cheol

    2011-01-01

    The operation environment of main control rooms (MCRs) in modern nuclear power plants (NPPs) has considerably changed over the years. Advanced MCRs, which have been designed by adapting digital and computer technologies, have simpler interfaces using large display panels, computerized displays, soft controls, computerized procedure systems, and so on. The actions for the NPP operations are performed using soft controls in advanced MCRs. Soft controls have different features from conventional controls. Operators need to navigate the screens to find indicators and controls and manipulate controls using a mouse, touch screens, and so on. Due to these different interfaces, different human errors should be considered in the human reliability analysis (HRA) for advanced MCRs. In this work, human errors that could occur during operation executions using soft controls were analyzed. This work classified the human errors in soft controls into six types, and the reasons that affect the occurrence of the human errors were also analyzed. (author)

  20. A method for analysing incidents due to human errors on nuclear installations

    International Nuclear Information System (INIS)

    Griffon, M.

    1980-01-01

    This paper deals with the development of a methodology adapted to a detailed analysis of incidents considered to be due to human errors. An identification of human errors and a search for their possible multiple causes is then needed. The causes are categorized into eight classes: education and training of personnel, installation design, work organization, time and work duration, physical environment, social environment, history of the plant, and performance of the operator. The method is illustrated by the analysis of a handling incident generated by multiple human errors. (author)

  1. Calculating method on human error probabilities considering influence of management and organization

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui; Shen Zupei

    1996-01-01

    This paper is concerned with how management and organizational influences can be factored into the quantification of human error probabilities in risk assessments, using a three-level Influence Diagram (ID), originally a tool for the construction and representation of decision trees or event trees. An analytical model of human error causation has been set up with three influence levels, and a quantification method for the ID is introduced which can be applied to quantifying the probabilities of human errors in risk assessments, especially in the quantification of complex event trees (systems) for engineering decision-making analysis. A numerical case study is provided to illustrate the approach.
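    The quantification idea, propagating organizational influences through intermediate factors down to a human error probability, can be illustrated with a toy three-level diagram. The states and numbers below are illustrative assumptions, not the paper's model.

```python
# A minimal numerical sketch of propagating management/organizational influences
# through a three-level influence diagram to a human error probability (HEP).
# Level 1: organizational quality; Level 2: a performance shaping factor (PSF);
# Level 3: HEP conditioned on the PSF state. All values are illustrative.

p_org = {"good": 0.7, "poor": 0.3}                      # P(organizational state)
p_psf_given_org = {                                      # P(PSF state | org state)
    "good": {"favourable": 0.8, "adverse": 0.2},
    "poor": {"favourable": 0.3, "adverse": 0.7},
}
hep_given_psf = {"favourable": 1e-3, "adverse": 1e-2}    # P(error | PSF state)

expected_hep = sum(
    p_org[org] * p_psf_given_org[org][psf] * hep_given_psf[psf]
    for org in p_org
    for psf in hep_given_psf
)
print(f"expected HEP = {expected_hep:.2e}")   # about 4.2e-3 with these illustrative numbers
```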

  2. A study on the critical factors of human error in civil aviation: An early warning management perspective in Bangladesh

    Directory of Open Access Journals (Sweden)

    Md. Salah Uddin Rajib

    2015-01-01

    Full Text Available The safety of civil aviation will be better secured if errors in all its facets can be reduced. As in other industrial sectors, human resources are among the most complex and sensitive resources for civil aviation, and human errors can cause fatal disasters. In recent years, a good volume of research has been conducted on civil aviation disasters; researchers have identified the causes from various perspectives and the areas where more attention is needed to reduce disastrous impacts. This paper aims to find the critical factors of human error in civil aviation in a developing country (Bangladesh), since human error is accepted as one of the main causes of civil aviation disasters. The paper reviews previous research to identify the critical factors conceptually, and the fuzzy analytic hierarchy process (FAHP) is used to rank them systematically. The analyses indicate that concentration on preconditions for unsafe acts (including their sub-factors) is required to ensure aviation safety.
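    The prioritization step behind FAHP can be illustrated, in simplified crisp form, by computing priority weights from a pairwise comparison matrix via normalized row geometric means. The factor names follow the HFACS-style levels mentioned above, while the judgement values are illustrative assumptions rather than the study's data.

```python
# A simplified sketch of AHP-style prioritization (crisp, not fuzzy): priority
# weights from a pairwise comparison matrix using normalized row geometric means.
# Factor names and judgements are illustrative, not the study's elicited data.
import numpy as np

factors = ["unsafe acts", "preconditions for unsafe acts",
           "unsafe supervision", "organizational influences"]

# A[i][j]: how much more important factor i is judged to be than factor j
A = np.array([
    [1.0, 1/2, 2.0, 3.0],
    [2.0, 1.0, 3.0, 4.0],
    [1/2, 1/3, 1.0, 2.0],
    [1/3, 1/4, 1/2, 1.0],
])

geo_means = A.prod(axis=1) ** (1 / A.shape[1])   # row geometric means
weights = geo_means / geo_means.sum()            # normalized priority weights

for f, w in sorted(zip(factors, weights), key=lambda t: -t[1]):
    print(f"{f:32s} {w:.3f}")
```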

  3. Analysis of measured data of human body based on error correcting frequency

    Science.gov (United States)

    Jin, Aiyan; Peipei, Gao; Shang, Xiaomei

    2014-04-01

    Anthropometry measures all parts of the human body surface, and the measured data are the basis for analysis and study of the human body, for the establishment and modification of garment sizes, and for the formulation and implementation of online clothing stores. In this paper, several groups of measured data are obtained, and the data errors are analysed by means of error frequency and the analysis of variance method of mathematical statistics. The accuracy of the measured data and the difficulty of measuring various parts of the human body are determined, the causes of data errors are further studied, and the key points for minimizing errors are summarized. This paper analyses the measured data based on error frequency and, in a way, provides certain reference elements to promote the development of the garment industry.
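    An analysis-of-variance comparison of measurement errors across body parts, as described above, might look like the following sketch; the body parts and the synthetic error distributions are illustrative assumptions.

```python
# Sketch: one-way ANOVA comparing repeated measurement errors across body parts.
# The data are synthetic illustrations, not the paper's measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# simulated measurement errors (measured minus reference, in cm) for three body parts
errors = {
    "chest girth": rng.normal(0.4, 0.6, 30),   # harder to measure -> larger spread
    "waist girth": rng.normal(0.2, 0.4, 30),
    "arm length":  rng.normal(0.0, 0.2, 30),
}

f_stat, p_value = stats.f_oneway(*errors.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
for part, e in errors.items():
    print(f"{part:12s} mean error {e.mean():+.2f} cm, SD {e.std(ddof=1):.2f} cm")
```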

  4. Using a Delphi Method to Identify Human Factors Contributing to Nursing Errors.

    Science.gov (United States)

    Roth, Cheryl; Brewer, Melanie; Wieck, K Lynn

    2017-07-01

    The purpose of this study was to identify human factors associated with nursing errors. Using a Delphi technique, this study used feedback from a panel of nurse experts (n = 25) on an initial qualitative survey questionnaire followed by summarizing the results with feedback and confirmation. Synthesized factors regarding causes of errors were incorporated into a quantitative Likert-type scale, and the original expert panel participants were queried a second time to validate responses. The list identified 24 items as most common causes of nursing errors, including swamping and errors made by others that nurses are expected to recognize and fix. The responses provided a consensus top 10 errors list based on means with heavy workload and fatigue at the top of the list. The use of the Delphi survey established consensus and developed a platform upon which future study of nursing errors can evolve as a link to future solutions. This list of human factors in nursing errors should serve to stimulate dialogue among nurses about how to prevent errors and improve outcomes. Human and system failures have been the subject of an abundance of research, yet nursing errors continue to occur. © 2016 Wiley Periodicals, Inc.

  5. Prediction of human errors by maladaptive changes in event-related brain networks

    NARCIS (Netherlands)

    Eichele, T.; Debener, S.; Calhoun, V.D.; Specht, K.; Engel, A.K.; Hugdahl, K.; Cramon, D.Y. von; Ullsperger, M.

    2008-01-01

    Humans engaged in monotonous tasks are susceptible to occasional errors that may lead to serious consequences, but little is known about brain activity patterns preceding errors. Using functional MRI and applying independent component analysis followed by deconvolution of hemodynamic responses, we

  6. Investigating the causes of human error-induced incidents in the maintenance operations of petrochemical industry by using HFACS

    Directory of Open Access Journals (Sweden)

    Mohammadreza Azhdari

    2017-03-01

    Full Text Available Background & Objectives: Maintenance is an important tool for the petrochemical industries to prevent accidents and increase operational and process safety. The purpose of this study was to identify the possible causes of incidents caused by human error in petrochemical maintenance activities by using the Human Factors Analysis and Classification System (HFACS). Methods: This cross-sectional analysis was conducted in Zagros Petrochemical Company, Asaluyeh, Iran. A checklist of human error-induced incidents was developed based on the four HFACS main levels and nineteen sub-groups. The hierarchical task analysis (HTA) technique was used to identify maintenance activities and tasks. The main causes of possible incidents were identified by checklist and recorded. Corrective and preventive actions were defined depending on priority. Results: The content analysis of worksheets for 444 activities showed 37.6% of the causes at the level of unsafe acts, 27.5% at the level of unsafe supervision, 20.9% at the level of preconditions for unsafe acts, and 14% at the level of organizational effects. Among the HFACS sub-groups, errors (24.36%), inadequate supervision (14.89%) and violations (13.26%) were the most frequent. Conclusion: Reducing the rate of the detected errors is crucial to preventing their recurrence. The findings of this study showed that appropriate controlling measures, such as periodic training on work procedures and improved supervision, decrease human error-induced incidents in petrochemical industry maintenance.

  7. Human Error and the International Space Station: Challenges and Triumphs in Science Operations

    Science.gov (United States)

    Harris, Samantha S.; Simpson, Beau C.

    2016-01-01

    Any system with a human component is inherently risky. Studies in human factors and psychology have repeatedly shown that human operators will inevitably make errors, regardless of how well they are trained. Onboard the International Space Station (ISS) where crew time is arguably the most valuable resource, errors by the crew or ground operators can be costly to critical science objectives. Operations experts at the ISS Payload Operations Integration Center (POIC), located at NASA's Marshall Space Flight Center in Huntsville, Alabama, have learned that from payload concept development through execution, there are countless opportunities to introduce errors that can potentially result in costly losses of crew time and science. To effectively address this challenge, we must approach the design, testing, and operation processes with two specific goals in mind. First, a systematic approach to error and human centered design methodology should be implemented to minimize opportunities for user error. Second, we must assume that human errors will be made and enable rapid identification and recoverability when they occur. While a systematic approach and human centered development process can go a long way toward eliminating error, the complete exclusion of operator error is not a reasonable expectation. The ISS environment in particular poses challenging conditions, especially for flight controllers and astronauts. Operating a scientific laboratory 250 miles above the Earth is a complicated and dangerous task with high stakes and a steep learning curve. While human error is a reality that may never be fully eliminated, smart implementation of carefully chosen tools and techniques can go a long way toward minimizing risk and increasing the efficiency of NASA's space science operations.

  8. Human Error and General Aviation Accidents: A Comprehensive, Fine-Grained Analysis Using HFACS

    National Research Council Canada - National Science Library

    Wiegmann, Douglas; Faaborg, Troy; Boquet, Albert; Detwiler, Cristy; Holcomb, Kali; Shappell, Scott

    2005-01-01

    ... of both commercial and general aviation (GA) accidents. These analyses have helped to identify general trends in the types of human factors issues and aircrew errors that have contributed to civil aviation accidents...

  9. Cause analysis and preventives for human error events in Daya Bay NPP

    International Nuclear Information System (INIS)

    Huang Weigang; Zhang Li

    1998-01-01

    Daya Bay Nuclear Power Plant was put into commercial operation in 1994. Up to 1996, there were 368 human error events in the operating and maintenance areas, accounting for 39% of total events. These events occurred mainly in the processes of maintenance, testing, equipment isolation and returning systems on-line, in particular during refuelling and maintenance. The author analyses the root causes of human error events, which are mainly operator omission or error; procedure deficiency; procedure not followed; lack of training; communication failures; and inadequate work management. The protective measures and treatment principles for human error events are also discussed, and several examples applying them are given. Finally, it is pointed out that the key to preventing human error events lies in coordination and management, the person in charge of the work, and good work habits of staff.

  10. A channel-by-channel method of reducing the errors associated with peak area integration

    International Nuclear Information System (INIS)

    Luedeke, T.P.; Tripard, G.E.

    1996-01-01

    A new method of reducing the errors associated with peak area integration has been developed. This method utilizes the signal content of each channel as an estimate of the overall peak area. These individual estimates can then be weighted according to the precision with which each estimate is known, producing an overall area estimate. Experimental measurements were performed on a small peak sitting on a large background, and the results compared to those obtained from a commercial software program. Results showed a marked decrease in the spread of results around the true value (obtained by counting for a long period of time), and a reduction in the statistical uncertainty associated with the peak area. (orig.)
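    A minimal sketch of the channel-by-channel idea, assuming a known peak shape and background: each channel's background-subtracted content gives an independent estimate of the total area, and the estimates are combined with inverse-variance weights. The peak shape, background level and counts below are synthetic illustrations, not the paper's data or exact algorithm.

```python
# Sketch: combine per-channel estimates of a peak area, weighting each by the
# precision with which it is known (inverse-variance weighting). Synthetic data.
import numpy as np

rng = np.random.default_rng(1)

channels = np.arange(-5, 6)                       # channels across the peak
shape = np.exp(-0.5 * (channels / 2.0) ** 2)      # assumed (known) normalized peak shape
shape /= shape.sum()

true_area, background = 200.0, 1000.0             # small peak on a large background
counts = rng.poisson(true_area * shape + background)

net = counts - background                          # background-subtracted counts
area_estimates = net / shape                       # each channel's estimate of total area
variances = counts / shape**2                      # Poisson variance propagated per channel

weights = 1.0 / variances
area = np.sum(weights * area_estimates) / np.sum(weights)
sigma = np.sqrt(1.0 / np.sum(weights))
print(f"weighted area = {area:.1f} +/- {sigma:.1f}  (true area = {true_area})")
```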

  11. Human errors in test and maintenance of nuclear power plants. Nordic project work

    International Nuclear Information System (INIS)

    Andersson, H.; Liwaang, B.

    1985-08-01

    The present report is a summary of the NKA/LIT-1 project performed during the period 1981-1985. The report summarizes work on the influence of human error in test and calibration activities in nuclear power plants, reviews problems regarding optimization of test intervals and the organization of test and maintenance activities, and analyses the human error contribution to the overall risk in test and maintenance tasks. (author)

  12. SCHEME (Soft Control Human error Evaluation MEthod) for advanced MCR HRA

    International Nuclear Information System (INIS)

    Jang, Inseok; Jung, Wondea; Seong, Poong Hyun

    2015-01-01

    Human reliability analysis (HRA) methods such as the Technique for Human Error Rate Prediction (THERP), Korean Human Reliability Analysis (K-HRA), Human Error Assessment and Reduction Technique (HEART), A Technique for Human Event Analysis (ATHEANA), Cognitive Reliability and Error Analysis Method (CREAM), and Simplified Plant Analysis Risk Human Reliability Assessment (SPAR-H) have been applied in relation to NPP maintenance and operation. Most of these methods were developed considering the conventional type of main control rooms (MCRs). They are still used for HRA in advanced MCRs even though the operating environment of advanced MCRs in NPPs has been considerably changed by the adoption of new human-system interfaces such as computer-based soft controls. Among the many features of advanced MCRs, soft controls are an important feature because operation actions in NPP advanced MCRs are performed through them. Consequently, those conventional methods may not sufficiently consider the features of soft control execution human errors. To this end, a new framework of an HRA method for evaluating soft control execution human errors is suggested, based on a soft control task analysis and literature reviews of widely accepted human error taxonomies. In this study, the framework of an HRA method for evaluating soft control execution human error in advanced MCRs is developed. First, the factors which an HRA method in advanced MCRs should encompass are derived based on the literature review and the soft control task analysis. Based on the derived factors, an execution HRA framework for advanced MCRs is developed, mainly focusing on the features of soft control. Moreover, since most current HRA databases deal with operation in the conventional type of MCRs and are not explicitly designed to deal with digital HSIs, an HRA database is developed under lab-scale simulation

  13. Evaluating a medical error taxonomy.

    OpenAIRE

    Brixey, Juliana; Johnson, Todd R.; Zhang, Jiajie

    2002-01-01

    Healthcare has been slow in using human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need for reducing medication errors, the National Coordinating Council for Medication Errors Reporting and Prevention (NCC MERP) released the NCC MERP taxonomy that provides a stand...

  14. Reducing Individual Variation for fMRI Studies in Children by Minimizing Template Related Errors.

    Directory of Open Access Journals (Sweden)

    Jian Weng

    Full Text Available Spatial normalization is an essential process for group comparisons in functional MRI studies. In practice, there is a risk of normalization errors particularly in studies involving children, seniors or diseased populations and in regions with high individual variation. One way to minimize normalization errors is to create a study-specific template based on a large sample size. However, studies with a large sample size are not always feasible, particularly for children studies. The performance of templates with a small sample size has not been evaluated in fMRI studies in children. In the current study, this issue was encountered in a working memory task with 29 children in two groups. We compared the performance of different templates: a study-specific template created by the experimental population, a Chinese children template and the widely used adult MNI template. We observed distinct differences in the right orbitofrontal region among the three templates in between-group comparisons. The study-specific template and the Chinese children template were more sensitive for the detection of between-group differences in the orbitofrontal cortex than the MNI template. Proper templates could effectively reduce individual variation. Further analysis revealed a correlation between the BOLD contrast size and the norm index of the affine transformation matrix, i.e., the SFN, which characterizes the difference between a template and a native image and differs significantly across subjects. Thereby, we proposed and tested another method to reduce individual variation that included the SFN as a covariate in group-wise statistics. This correction exhibits outstanding performance in enhancing detection power in group-level tests. A training effect of abacus-based mental calculation was also demonstrated, with significantly elevated activation in the right orbitofrontal region that correlated with behavioral response time across subjects in the trained group.
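    The correction described, entering a per-subject template-mismatch index as a covariate in group-level statistics, can be sketched as an ordinary regression with the group label and the covariate as predictors. The variable names and data below are illustrative assumptions, not the study's data or pipeline.

```python
# Sketch: group-level regression of BOLD contrast size on group membership with a
# per-subject template-mismatch index ("sfn") as a covariate. Synthetic data only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 29
group = rng.integers(0, 2, n)                    # 0 = control, 1 = trained
sfn = rng.normal(10.0, 2.0, n)                   # norm of the affine transform (template mismatch)
contrast = 0.5 * group - 0.08 * sfn + rng.normal(0, 0.3, n)  # synthetic BOLD contrasts

X = sm.add_constant(np.column_stack([group, sfn]))
fit = sm.OLS(contrast, X).fit()
print(fit.params)      # [intercept, group effect, SFN effect]
print(fit.pvalues)     # group effect assessed while controlling for SFN
```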

  15. Gunning for accuracy: Company targets human error with 'foolproof' perforating gun

    Energy Technology Data Exchange (ETDEWEB)

    Ross, E.

    2002-12-01

    An innovative new design for a scalloped expandable perforating gun is described. The gun is manufactured in Edmonton by LRI Perforating Systems and is aimed at reducing the chances of human error that can result in sub-optimal performance. A perforating gun is used to pierce the well's casing to allow hydrocarbons to flow into the wellbore. It consists of a carrier, a pipe with an eighth-inch thick wall that protects the explosives downhole, and an inner tube containing the explosives. The LRI system has a unique interlocking mechanism for the end plates, which are attached to either end of the thin-walled charge holder tube, in addition to other advantageous features, including a price advantage of 10 to 15 per cent. The marketing strategy is designed to interest wireline companies that focus on vertical well applications. 1 photo.

  16. Human error data collection analysis program undertaken since 1982 by Electricite de France with INPO

    International Nuclear Information System (INIS)

    Ghertman, F.; Dietz, P.

    1985-01-01

    The concern to reduce the frequency and importance of events which harm, to various degrees, the availability, safety and security of nuclear power plants led Electricite de France, in cooperation with INPO (Institute of Nuclear Power Operations), to launch a Human Error Collection and Analysis Program. On account of the difficulties met in developing such a program, it was decided to begin with a pilot data collection limited to a six-month period (October 1982 to April 1983) and three nuclear power plants (three US units and two French units). This pilot data collection followed four steps: (1) elaboration of the collection methodology; (2) sensitization and related training of the power plant personnel; (3) data collection in the power plant; and (4) analysis of the data and results. Each of these steps is discussed in the paper

  17. Coping with human errors through system design: Implications for ecological interface design

    DEFF Research Database (Denmark)

    Rasmussen, Jens; Vicente, Kim J.

    1989-01-01

    Research during recent years has revealed that human errors are not stochastic events which can be removed through improved training programs or optimal interface design. Rather, errors tend to reflect either systematic interference between various models, rules, and schemata, or the effects of the adaptive mechanisms involved in learning. In terms of design implications, these findings suggest that reliable human-system interaction will be achieved by designing interfaces which tend to minimize the potential for control interference and support recovery from errors. In other words, the focus should be on control of the effects of errors rather than on the elimination of errors per se. In this paper, we propose a theoretical framework for interface design that attempts to satisfy these objectives. The goal of our framework, called ecological interface design, is to develop a meaningful representation

  18. A Human Reliability Analysis of Post- Accident Human Errors in the Low Power and Shutdown PSA of KSNP

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Daeil; Kim, J. H.; Jang, S. C

    2007-03-15

    Korea Atomic Energy Research Institute, using the ANS low power and shutdown (LPSD) probabilistic risk assessment (PRA) Standard, evaluated the LPSD PSA model of the KSNP, Yonggwang Units 5 and 6, and identified the items to be improved. The evaluation results of the human reliability analysis (HRA) of post-accident human errors in the LPSD PSA model for the KSNP showed that 10 of the 19 items of supporting requirements for them in the ANS PRA Standard were identified as items to be improved. Thus, we newly carried out an HRA for post-accident human errors in the LPSD PSA model for the KSNP. The following tasks are the improvements in this HRA compared with the previous one: interviews with operators and a site visit for the interpretation of the procedures, the modeling of operator actions, and the quantification of human errors; application of a limiting value to the combined post-accident human errors; and documentation of all the input and bases for the detailed quantifications and the dependency analysis using quantification sheets. The assessment of the new HRA results for post-accident human errors using the ANS LPSD PRA Standard shows that above 80% of the items of its supporting requirements for post-accident human errors were graded as Category II. The number of re-estimated human errors using the LPSD Korea Standard HRA method is 385; among them, the number of individual post-accident human errors is 253 and the number of dependent post-accident human errors is 135. The quantification results of the LPSD PSA model for the KSNP with the new HEPs show that the core damage frequency (CDF) is increased by 5.1% compared with the previous baseline CDF. It is expected that these study results will be greatly helpful in improving the PSA quality for domestic nuclear power plants because they have sufficient PSA quality to meet the Category II of Supporting Requirements for the post

  19. A Human Reliability Analysis of Post- Accident Human Errors in the Low Power and Shutdown PSA of KSNP

    International Nuclear Information System (INIS)

    Kang, Daeil; Kim, J. H.; Jang, S. C.

    2007-03-01

    Korea Atomic Energy Research Institute, using the ANS low power and shutdown (LPSD) probabilistic risk assessment (PRA) Standard, evaluated the LPSD PSA model of the KSNP, Yonggwang Units 5 and 6, and identified the items to be improved. The evaluation results of the human reliability analysis (HRA) of post-accident human errors in the LPSD PSA model for the KSNP showed that 10 of the 19 items of supporting requirements for them in the ANS PRA Standard were identified as items to be improved. Thus, we newly carried out an HRA for post-accident human errors in the LPSD PSA model for the KSNP. The following tasks are the improvements in this HRA compared with the previous one: interviews with operators and a site visit for the interpretation of the procedures, the modeling of operator actions, and the quantification of human errors; application of a limiting value to the combined post-accident human errors; and documentation of all the input and bases for the detailed quantifications and the dependency analysis using quantification sheets. The assessment of the new HRA results for post-accident human errors using the ANS LPSD PRA Standard shows that above 80% of the items of its supporting requirements for post-accident human errors were graded as Category II. The number of re-estimated human errors using the LPSD Korea Standard HRA method is 385; among them, the number of individual post-accident human errors is 253 and the number of dependent post-accident human errors is 135. The quantification results of the LPSD PSA model for the KSNP with the new HEPs show that the core damage frequency (CDF) is increased by 5.1% compared with the previous baseline CDF. It is expected that these study results will be greatly helpful in improving the PSA quality for domestic nuclear power plants because they have sufficient PSA quality to meet the Category II of Supporting Requirements for the post

  20. Quantitative estimation of the human error probability during soft control operations

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jaewhan; Jung, Wondea

    2013-01-01

    Highlights: An HRA method to evaluate execution HEP for soft control operations was proposed. The soft control tasks were analyzed and design-related influencing factors were identified. An application to evaluate the effects of soft controls was performed. Abstract: In this work, a method was proposed for quantifying human errors that can occur during operation executions using soft controls. Soft controls of advanced main control rooms have totally different features from conventional controls, and thus they may have different human error modes and occurrence probabilities. It is important to identify the human error modes and quantify the error probability for evaluating the reliability of the system and preventing errors. This work suggests an evaluation framework for quantifying the execution error probability using soft controls. In the application result, it was observed that the human error probabilities of soft controls showed both positive and negative results compared to the conventional controls according to the design quality of advanced main control rooms

  1. From human error to organizational failure: a historical perspective

    International Nuclear Information System (INIS)

    Guarnieri, F.; Cambon, J.; Boissieres, I.

    2008-01-01

    This article hinges around three main parts. The first part goes back over the foundations of the human factor approach, introducing its basic assumptions as well as some of the methods which have been developed. The second part accounts for the reasons why organizational factors have drawn attention in the first place, underlining two major points: the limits of the human factor approach, but also the original contribution and innovative aspect of sociology. Finally, the third part describes the keystone principles and hypotheses on which the organizational factor approach is founded and gives a brief overview of the methods which have lately been implemented within the industrial world. (authors)

  2. Professional, structural and organisational interventions in primary care for reducing medication errors.

    Science.gov (United States)

    Khalil, Hanan; Bell, Brian; Chambers, Helen; Sheikh, Aziz; Avery, Anthony J

    2017-10-04

    Medication-related adverse events in primary care represent an important cause of hospital admissions and mortality. Adverse events could result from people experiencing adverse drug reactions (not usually preventable) or could be due to medication errors (usually preventable). To determine the effectiveness of professional, organisational and structural interventions compared to standard care to reduce preventable medication errors by primary healthcare professionals that lead to hospital admissions, emergency department visits, and mortality in adults. We searched CENTRAL, MEDLINE, Embase, three other databases, and two trial registries on 4 October 2016, together with reference checking, citation searching and contact with study authors to identify additional studies. We also searched several sources of grey literature. We included randomised trials in which healthcare professionals provided community-based medical services. We also included interventions in outpatient clinics attached to a hospital where people are seen by healthcare professionals but are not admitted to hospital. We only included interventions that aimed to reduce medication errors leading to hospital admissions, emergency department visits, or mortality. We included all participants, irrespective of age, who were prescribed medication by a primary healthcare professional. Three review authors independently extracted data. Each of the outcomes (hospital admissions, emergency department visits, and mortality), are reported in natural units (i.e. number of participants with an event per total number of participants at follow-up). We presented all outcomes as risk ratios (RRs) with 95% confidence intervals (CIs). We used the GRADE tool to assess the certainty of evidence. We included 30 studies (169,969 participants) in the review addressing various interventions to prevent medication errors; four studies addressed professional interventions (8266 participants) and 26 studies described

  3. An experimental approach to validating a theory of human error in complex systems

    Science.gov (United States)

    Morris, N. M.; Rouse, W. B.

    1985-01-01

    The problem of 'human error' is pervasive in engineering systems in which the human is involved. In contrast to the common engineering approach of dealing with error probabilistically, the present research seeks to alleviate problems associated with error by gaining a greater understanding of causes and contributing factors from a human information processing perspective. The general approach involves identifying conditions which are hypothesized to contribute to errors, and experimentally creating the conditions in order to verify the hypotheses. The conceptual framework which serves as the basis for this research is discussed briefly, followed by a description of upcoming research. Finally, the potential relevance of this research to design, training, and aiding issues is discussed.

  4. Phytosterol glycosides reduce cholesterol absorption in humans

    Science.gov (United States)

    Lin, Xiaobo; Ma, Lina; Racette, Susan B.; Anderson Spearie, Catherine L.; Ostlund, Richard E.

    2009-01-01

    Dietary phytosterols inhibit intestinal cholesterol absorption and regulate whole body cholesterol excretion and balance. However, they are biochemically heterogeneous and a portion is glycosylated in some foods with unknown effects on biological activity. We tested the hypothesis that phytosterol glycosides reduce cholesterol absorption in humans. Phytosterol glycosides were extracted and purified from soy lecithin in a novel two-step process. Cholesterol absorption was measured in a series of three single-meal tests given at intervals of 2 wk to each of 11 healthy subjects. In a randomized crossover design, participants received ∼300 mg of added phytosterols in the form of phytosterol glycosides or phytosterol esters, or placebo in a test breakfast also containing 30 mg cholesterol-d7. Cholesterol absorption was estimated by mass spectrometry of plasma cholesterol-d7 enrichment 4–5 days after each test. Compared with the placebo test, phytosterol glycosides reduced cholesterol absorption by 37.6 ± 4.8% and phytosterol esters by 30.6 ± 3.9% (P = 0.0001). These results suggest that natural phytosterol glycosides purified from lecithin are bioactive in humans and should be included in methods of phytosterol analysis and tables of food phytosterol content. PMID:19246636

  5. Phytosterol glycosides reduce cholesterol absorption in humans.

    Science.gov (United States)

    Lin, Xiaobo; Ma, Lina; Racette, Susan B; Anderson Spearie, Catherine L; Ostlund, Richard E

    2009-04-01

    Dietary phytosterols inhibit intestinal cholesterol absorption and regulate whole body cholesterol excretion and balance. However, they are biochemically heterogeneous and a portion is glycosylated in some foods with unknown effects on biological activity. We tested the hypothesis that phytosterol glycosides reduce cholesterol absorption in humans. Phytosterol glycosides were extracted and purified from soy lecithin in a novel two-step process. Cholesterol absorption was measured in a series of three single-meal tests given at intervals of 2 wk to each of 11 healthy subjects. In a randomized crossover design, participants received approximately 300 mg of added phytosterols in the form of phytosterol glycosides or phytosterol esters, or placebo in a test breakfast also containing 30 mg cholesterol-d7. Cholesterol absorption was estimated by mass spectrometry of plasma cholesterol-d7 enrichment 4-5 days after each test. Compared with the placebo test, phytosterol glycosides reduced cholesterol absorption by 37.6+/-4.8% and phytosterol esters by 30.6+/-3.9% (P=0.0001). These results suggest that natural phytosterol glycosides purified from lecithin are bioactive in humans and should be included in methods of phytosterol analysis and tables of food phytosterol content.

  6. A Method and Support Tool for the Analysis of Human Error Hazards in Digital Devices

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kim, Seon Soo; Lee, Yong Hee

    2012-01-01

    In recent years, many nuclear power plants have adopted modern digital I and C technologies, since they are expected to significantly improve their performance and safety. Modern digital technologies were also expected to significantly improve both the economic efficiency and the safety of nuclear power plants. However, the introduction of an advanced main control room (MCR) is accompanied by many changes in forms and features and by differences arising from the new digital devices. Many user-friendly displays and new features in digital devices are not enough to prevent human errors in nuclear power plants (NPPs). It may be an urgent matter to find the human error potentials due to digital devices, and their detailed mechanisms, so that we can consider them during the design of digital devices and their interfaces. The characteristics of digital technologies and devices may give many opportunities to the interface management, and can be integrated into a compact single workstation in an advanced MCR, such that workers can operate the plant with minimum burden under any operating condition. However, these devices may introduce new types of human errors, and thus we need a means to evaluate and prevent such errors, especially within digital devices for NPPs. This research suggests a new method named HEA-BIS (Human Error Analysis based on Interaction Segment) to confirm and detect human errors associated with digital devices. This method can be facilitated by support tools when used to ensure safety when applying digital devices in NPPs.

  7. Making Residents Part of the Safety Culture: Improving Error Reporting and Reducing Harms.

    Science.gov (United States)

    Fox, Michael D; Bump, Gregory M; Butler, Gabriella A; Chen, Ling-Wan; Buchert, Andrew R

    2017-01-30

    Reporting medical errors is a focus of the patient safety movement. As frontline physicians, residents are optimally positioned to recognize errors and flaws in systems of care. Previous work highlights the difficulty of engaging residents in identification and/or reduction of medical errors and in integrating these trainees into their institutions' cultures of safety. The authors describe the implementation of a longitudinal, discipline-based, multifaceted curriculum to enhance the reporting of errors by pediatric residents at Children's Hospital of Pittsburgh of University of Pittsburgh Medical Center. The key elements of this curriculum included providing the necessary education to identify medical errors with an emphasis on systems-based causes, modeling of error reporting by faculty, and integrating error reporting and discussion into the residents' daily activities. The authors tracked monthly error reporting rates by residents and other health care professionals, in addition to serious harm event rates at the institution. The interventions resulted in significant increases in error reports filed by residents, from 3.6 to 37.8 per month over 4 years. The increased error reporting correlated with a decline in serious harm events, from 15.0 to 8.1 per month over 4 years (P = 0.01). Integrating patient safety into the everyday resident responsibilities encourages frequent reporting and discussion of medical errors and leads to improvements in patient care. Multiple simultaneous interventions are essential to making residents part of the safety culture of their training hospitals.

  8. Reducing Diagnostic Errors through Effective Communication: Harnessing the Power of Information Technology

    Science.gov (United States)

    Naik, Aanand Dinkar; Rao, Raghuram; Petersen, Laura Ann

    2008-01-01

    Diagnostic errors are poorly understood despite being a frequent cause of medical errors. Recent efforts have aimed to advance the "basic science" of diagnostic error prevention by tracing errors to their most basic origins. Although a refined theory of diagnostic error prevention will take years to formulate, we focus on communication breakdown, a major contributor to diagnostic errors and an increasingly recognized preventable factor in medical mishaps. We describe a comprehensive framework that integrates the potential sources of communication breakdowns within the diagnostic process and identifies vulnerable steps in the diagnostic process where various types of communication breakdowns can precipitate error. We then discuss potential information technology-based interventions that may have efficacy in preventing one or more forms of these breakdowns. These possible intervention strategies include using new technologies to enhance communication between health providers and health systems, improve patient involvement, and facilitate management of information in the medical record. PMID:18373151

  9. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    The human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of Probabilistic Safety Assessment (PSA). As methods for analyzing human error, several techniques, such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H), are used, and new methods for human reliability analysis (HRA) are under development at this time. This paper presents a dynamic HRA method for assessing human failure events, and an estimation of the human error probability for the filtered containment venting system (FCVS) is performed. The action associated with implementation of the containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze the FCVS-related operator action. The distributions of the required time and the available time were developed by the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for estimating human error probabilities, and it can be applied to any kind of operator action, including the severe accident management strategy.
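
    As an illustration of the time-reliability idea behind this kind of dynamic HRA, the sketch below estimates a non-response probability by sampling a required operator action time and an available time window and counting how often the action does not fit in the window. The lognormal parameters, sample size and plain Monte Carlo sampling are assumptions for illustration; in the study itself the time distributions come from MAAP severe-accident calculations combined with Latin hypercube sampling.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000  # plain Monte Carlo here; the paper uses LHS sampling

# Hypothetical lognormal distributions (minutes) for the time the crew needs
# to implement containment venting and the time window actually available.
required_time = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=n_samples)
available_time = rng.lognormal(mean=np.log(60.0), sigma=0.3, size=n_samples)

# Non-response probability: fraction of samples where the crew runs out of time.
hep = np.mean(required_time > available_time)
print(f"estimated non-response probability: {hep:.4f}")
```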

  10. Human Performance in Simulated Reduced Gravity Environments

    Science.gov (United States)

    Cowley, Matthew; Harvill, Lauren; Rajulu, Sudhakar

    2014-01-01

    NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. Our current understanding of human performance in reduced gravity in a planetary environment (the moon or Mars) is limited to lunar observations, studies from the Apollo program, and recent suit tests conducted at JSC using reduced gravity simulators. This study will look at our most recent reduced gravity simulations performed on the new Active Response Gravity Offload System (ARGOS) compared to the C-9 reduced gravity plane. Methods: Subjects ambulated in reduced gravity analogs to obtain a baseline for human performance. Subjects were tested in lunar gravity (1.6 m/sq s) and Earth gravity (9.8 m/sq s) in shirt-sleeves. Subjects ambulated over ground at prescribed speeds on the ARGOS, but ambulated at a self-selected speed on the C-9 due to time limitations. Subjects on the ARGOS were given over 3 minutes to acclimate to the different conditions before data was collected. Nine healthy subjects were tested in the ARGOS (6 males, 3 females, 79.5 +/- 15.7 kg), while six subjects were tested on the C-9 (6 males, 78.8 +/- 11.2 kg). Data was collected with an optical motion capture system (Vicon, Oxford, UK) and was analyzed using customized analysis scripts in BodyBuilder (Vicon, Oxford, UK) and MATLAB (MathWorks, Natick, MA, USA). Results: In all offloaded conditions, variation between subjects increased compared to 1-g. Kinematics in the ARGOS at lunar gravity resembled earth gravity ambulation more closely than the C-9 ambulation. Toe-off occurred 10% earlier in both reduced gravity environments compared to earth gravity, shortening the stance phase. Likewise, ankle, knee, and hip angles remained consistently flexed and had reduced peaks compared to earth gravity. Ground reaction forces in lunar gravity (normalized to Earth body weight) were 0.4 +/- 0.2 on

  11. A Human Error Analysis with Physiological Signals during Utilizing Digital Devices

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Oh, Yeon Ju; Shin, Kwang Hyeon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

    The introduction of an advanced MCR is accompanied by many changes and different forms and features by virtue of new digital technologies. There are various kinds of digital devices such as flat panel displays, touch screens, and so on. The characteristics of these digital devices give many chances to the interface management, and they can be integrated into a compact single workstation in an advanced MCR so that workers can operate the plant with minimum burden during any operating condition. However, these devices may introduce new types of human errors, and thus we need a means to evaluate and prevent such errors, especially those related to the digital devices. Human errors have been retrospectively assessed for accident reviews and quantitatively evaluated through HRA for PSA. However, ergonomic verification and validation is an important process to defend against all human error potentials in the NPP design. HRA is a crucial part of a PSA, and helps in preparing countermeasures for design by drawing out potential human error items that affect the overall safety of NPPs. Various HRA techniques are available; however, they reveal shortcomings of the HMI design in the digital era. - HRA techniques depend on PSFs: this means that the scope dealing with human factors is limited in advance, and thus all attributes of new digital devices may not be considered in HRA. - The data used in HRA are not close to the evaluation items, so human error analysis is not easy to apply to design through several individual experiments and cases. - The results of HRA are not statistically meaningful because accidents involving human errors in NPPs are rare and have been estimated as having an extremely low probability.

  12. An Empirical Study on Human Performance according to the Physical Environment (Potential Human Error Hazard) in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, In Seok; Seong, Poong Hyun

    2014-01-01

    Management of the physical environment is an effective means of supporting safety in the nuclear industry. Even when physical environmental factors such as lighting and noise satisfy management standards, they can still be background factors that cause human error and affect human performance. Because the consequences of human error and degraded human performance attributable to the physical environment can be severe, the requirement standards should be covered by specific criteria. In particular, in order to avoid human errors caused by extremely low or rapidly changing illumination intensity and by masking effects such as a power disconnection, plans for a better visual environment and better task performance should be made, while a careful study on efficient ways to manage and maintain these better conditions is conducted.

  13. Trauma Quality Improvement: Reducing Triage Errors by Automating the Level Assignment Process.

    Science.gov (United States)

    Stonko, David P; O Neill, Dillon C; Dennis, Bradley M; Smith, Melissa; Gray, Jeffrey; Guillamondegui, Oscar D

    2018-04-12

    Trauma patients are triaged by the severity of their injury or need for intervention while en route to the trauma center according to trauma activation protocols that are institution specific. Significant research has been aimed at improving these protocols in order to optimize patient outcomes while striving for efficiency in care. However, it is known that patients are often undertriaged or overtriaged because protocol adherence remains imperfect. The goal of this quality improvement (QI) project was to improve this adherence, and thereby reduce the triage error. It was conducted as part of the formal undergraduate medical education curriculum at this institution. A QI team was assembled and baseline data were collected, then 2 Plan-Do-Study-Act (PDSA) cycles were implemented sequentially. During the first cycle, a novel web tool was developed and implemented in order to automate the level assignment process (it takes EMS-provided data and automatically determines the level); the tool was based on the existing trauma activation protocol. The second PDSA cycle focused on improving triage accuracy in isolated, less than 10% total body surface area burns, which we identified to be a point of common error. Traumas were reviewed and tabulated at the end of each PDSA cycle, and triage accuracy was followed with a run chart. This study was performed at Vanderbilt University Medical Center and Medical School, which has a large level 1 trauma center covering over 75,000 square miles, and which sees urban, suburban, and rural trauma. The baseline assessment period and each PDSA cycle lasted 2 weeks. During this time, all activated, adult, direct traumas were reviewed. There were 180 patients during the baseline period, 189 after the first test of change, and 150 after the second test of change. All were included in analysis. Of 180 patients, 30 were inappropriately triaged during baseline analysis (3 undertriaged and 27 overtriaged) versus 16 of 189 (3 undertriaged and 13
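
    The automation described here amounts to encoding an activation protocol as deterministic rules over the data EMS provides. The sketch below is a hypothetical, much-simplified illustration of that idea; the field names and thresholds are invented placeholders, not the Vanderbilt protocol or web tool.

```python
def assign_trauma_level(ems_report: dict) -> int:
    """Toy rule-based level assignment (Level 1 = highest acuity).

    Field names and cut-offs are hypothetical; a real tool would encode the
    institution's own activation criteria and be clinically validated.
    """
    if (ems_report.get("systolic_bp", 120) < 90
            or ems_report.get("gcs", 15) <= 8
            or ems_report.get("intubated", False)):
        return 1
    if (ems_report.get("penetrating_torso_injury", False)
            or ems_report.get("long_bone_fractures", 0) >= 2):
        return 2
    return 3

print(assign_trauma_level({"systolic_bp": 82, "gcs": 14}))  # -> 1
```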

  14. Human error and the problem of causality in analysis of accidents

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1990-01-01

    Present technology is characterized by complexity, rapid change and growing size of technical systems. This has caused increasing concern with the human involvement in system safety. Analyses of the major accidents during recent decades have concluded that human errors on part of operators, designers or managers have played a major role. There are, however, several basic problems in analysis of accidents and identification of human error. This paper addresses the nature of causal explanations and the ambiguity of the rules applied for identification of the events to include in analysis...

  15. Seismic-load-induced human errors and countermeasures using computer graphics in plant-operator communication

    International Nuclear Information System (INIS)

    Hara, Fumio

    1988-01-01

    This paper highlights the importance of seismic load-induced human errors in plant operation by delineating the characteristics of the task performance of human beings under seismic loads. It focuses on man-machine communication via multidimensional data like that conventionally displayed on large panels in a plant control room. It demonstrates a countermeasure to human errors using a computer graphics technique that conveys the global state of the plant operation to operators through cartoon-like, colored graphs in the form of faces that, with different facial expressions, show the plant safety status. (orig.)

  16. Psychological scaling of expert estimates of human error probabilities: application to nuclear power plant operation

    International Nuclear Information System (INIS)

    Comer, K.; Gaddy, C.D.; Seaver, D.A.; Stillwell, W.G.

    1985-01-01

    The US Nuclear Regulatory Commission and Sandia National Laboratories sponsored a project to evaluate psychological scaling techniques for use in generating estimates of human error probabilities. The project evaluated two techniques: direct numerical estimation and paired comparisons. Expert estimates were found to be consistent across and within judges. Convergent validity was good, in comparison to estimates in a handbook of human reliability. Predictive validity could not be established because of the lack of actual relative frequencies of error (which will be a difficulty inherent in validation of any procedure used to estimate HEPs). Application of expert estimates in probabilistic risk assessment and in human factors is discussed

  17. A human error taxonomy and its application to an automatic method accident analysis

    International Nuclear Information System (INIS)

    Matthews, R.H.; Winter, P.W.

    1983-01-01

    Commentary is provided on the quantification aspects of human factors analysis in risk assessment. Methods for quantifying human error in a plant environment are discussed and their application to system quantification explored. Such a programme entails consideration of the data base and a taxonomy of factors contributing to human error. A multi-levelled approach to system quantification is proposed, each level being treated differently drawing on the advantages of different techniques within the fault/event tree framework. Management, as controller of organization, planning and procedure, is assigned a dominant role. (author)

  18. HUMAN ERROR QUANTIFICATION USING PERFORMANCE SHAPING FACTORS IN THE SPAR-H METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Harold S. Blackman; David I. Gertman; Ronald L. Boring

    2008-09-01

    This paper describes a cognitively based human reliability analysis (HRA) quantification technique for estimating the human error probabilities (HEPs) associated with operator and crew actions at nuclear power plants. The method described here, Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method, was developed to aid in characterizing and quantifying human performance at nuclear power plants. The intent was to develop a defensible method that would consider all factors that may influence performance. In the SPAR-H approach, calculation of HEP rates is especially straightforward, starting with pre-defined nominal error rates for cognitive vs. action-oriented tasks, and incorporating performance shaping factor multipliers upon those nominal error rates.
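
    The quantification step described above can be written down compactly. The sketch below follows the commonly published SPAR-H scheme (nominal HEPs of 1E-2 for diagnosis and 1E-3 for action, multiplied by the performance shaping factor multipliers, with an adjustment factor when three or more negative PSFs are present); the example multiplier values are the analyst's own assumptions, not values from this paper.

```python
def spar_h_hep(task_type: str, psf_multipliers: dict) -> float:
    """Simplified SPAR-H style HEP calculation (illustrative only)."""
    nominal = {"diagnosis": 1e-2, "action": 1e-3}[task_type]

    composite = 1.0
    for multiplier in psf_multipliers.values():
        composite *= multiplier

    negative_psfs = sum(1 for m in psf_multipliers.values() if m > 1.0)
    if negative_psfs >= 3:
        # Adjustment factor that keeps the result below 1.0
        return nominal * composite / (nominal * (composite - 1.0) + 1.0)
    return min(nominal * composite, 1.0)

# Example: action task under extreme stress (x5) with poor ergonomics (x10)
print(spar_h_hep("action", {"stress": 5, "ergonomics": 10, "procedures": 1}))
```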

  19. Analysis of human error and organizational deficiency in events considering risk significance

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Kim, Yoonik; Kim, Say Hyung; Kim, Chansoo; Chung, Chang Hyun; Jung, Won Dea

    2004-01-01

    In this study, we analyzed human and organizational deficiencies in the trip events of Korean nuclear power plants. K-HPES items were used in the human error analysis, and the organizational factors by Jacobs and Haber were used for the organizational deficiency analysis. We proposed the use of the conditional core damage probability (CCDP) as a risk measure to incorporate risk information when prioritizing K-HPES items and organizational factors. Until now, the risk significance of events has not been considered in human error and organizational deficiency analysis. Considering the risk significance of events in the process of analysis is necessary for effective enhancement of nuclear power plant safety by focusing on the causes of human error and organizational deficiencies that are associated with significant risk.

  20. Quantification of human error and common-mode failures in man-machine systems

    International Nuclear Information System (INIS)

    Lisboa, J.J.

    1988-01-01

    Quantification of human performance, particularly the determination of human error, is essential for realistic assessment of overall system performance of man-machine systems. This paper presents an analysis of human errors in nuclear power plant systems when measured against common-mode failures (CMF). Human errors evaluated are improper testing, inadequate maintenance strategy, and miscalibration. The methodology presented in the paper represents a positive contribution to power plant systems availability by identifying sources of common-mode failure when operational functions are involved. It is also applicable to other complex systems such as chemical plants, aircraft and motor industries; in fact, any large man-created, man-machine system could be included

  1. Systematic Analysis of Video Data from Different Human-Robot Interaction Studies: A Categorisation of Social Signals During Error Situations

    OpenAIRE

    Manuel Giuliani; Nicole Mirnig; Gerald Stollnberger; Susanne Stadler; Roland Buchner; Manfred Tscheligi

    2015-01-01

    Human–robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human–robot interaction experiments. For that, we analyzed 201 videos of five human–robot interaction user studies with varying tasks from four independent projects. The analysis shows tha...

  2. Hierarchical learning induces two simultaneous, but separable, prediction errors in human basal ganglia.

    Science.gov (United States)

    Diuk, Carlos; Tsai, Karin; Wallis, Jonathan; Botvinick, Matthew; Niv, Yael

    2013-03-27

    Studies suggest that dopaminergic neurons report a unitary, global reward prediction error signal. However, learning in complex real-life tasks, in particular tasks that show hierarchical structure, requires multiple prediction errors that may coincide in time. We used functional neuroimaging to measure prediction error signals in humans performing such a hierarchical task involving simultaneous, uncorrelated prediction errors. Analysis of signals in a priori anatomical regions of interest in the ventral striatum and the ventral tegmental area indeed evidenced two simultaneous, but separable, prediction error signals corresponding to the two levels of hierarchy in the task. This result suggests that suitably designed tasks may reveal a more intricate pattern of firing in dopaminergic neurons. Moreover, the need for downstream separation of these signals implies possible limitations on the number of different task levels that we can learn about simultaneously.
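
    In hierarchical reinforcement-learning terms, the two signals correspond to a prediction error computed at the subtask (option) level and one computed at the level of the overall task, and both can be non-zero on the same time step. The toy update below is only a schematic of that computation, not the paper's imaging analysis; all parameter values are arbitrary.

```python
gamma, alpha = 0.95, 0.1

def td_error(reward: float, value_next: float, value_current: float) -> float:
    """Standard temporal-difference prediction error."""
    return reward + gamma * value_next - value_current

v_subtask, v_task = 0.2, 0.1   # current value estimates at the two levels

# The subtask just completed (pseudo-reward 1.0) while the overall task is
# still unrewarded: two simultaneous but separable prediction errors.
delta_subtask = td_error(reward=1.0, value_next=0.0, value_current=v_subtask)
delta_task = td_error(reward=0.0, value_next=0.4, value_current=v_task)

v_subtask += alpha * delta_subtask
v_task += alpha * delta_task
print(delta_subtask, delta_task)
```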

  3. Detailed semantic analyses of human error incidents occurring at nuclear power plants. Extraction of periodical transition of error occurrence patterns by applying multivariate analysis

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Suzuki, Kunihiko; Takano, Kenichi; Kojima, Mitsuhiro

    2000-01-01

    It is essential for preventing the recurrence of human error incidents to analyze and evaluate them with an emphasis on human factors. Detailed and structured analyses of all incidents at domestic nuclear power plants (NPPs) reported during the last 31 years have been conducted based on J-HPES, in which a total of 193 human error cases were identified. Results obtained by the analyses have been stored in the J-HPES database. In a previous study, by applying multivariate analysis to the above case studies, it was suggested that there were several identifiable patterns of how errors occur at NPPs. It was also clarified that the causes related to each human error differ depending on the period of their occurrence. This paper describes the results obtained with respect to the periodical transition of human error occurrence patterns. By applying multivariate analysis to the above data, it was suggested that there were two types of error occurrence patterns for each human error type. The first type consists of common occurrence patterns that do not depend on the period, and the second type is influenced by periodical characteristics. (author)

  4. Thresholds for human detection of patient setup errors in digitally reconstructed portal images of prostate fields

    International Nuclear Information System (INIS)

    Phillips, Brooke L.; Jiroutek, Michael R.; Tracton, Gregg; Elfervig, Michelle; Muller, Keith E.; Chaney, Edward L.

    2002-01-01

    Purpose: Computer-assisted methods to analyze electronic portal images for the presence of treatment setup errors should be studied in controlled experiments before use in the clinical setting. Validation experiments using images that contain known errors usually report the smallest errors that can be detected by the image analysis algorithm. This paper offers human error-detection thresholds as one benchmark for evaluating the smallest errors detected by algorithms. Unfortunately, reliable data are lacking describing human performance. The most rigorous benchmarks for human performance are obtained under conditions that favor error detection. To establish such benchmarks, controlled observer studies were carried out to determine the thresholds of detectability for in-plane and out-of-plane translation and rotation setup errors introduced into digitally reconstructed portal radiographs (DRPRs) of prostate fields. Methods and Materials: Seventeen observers comprising radiation oncologists, radiation oncology residents, physicists, and therapy students participated in a two-alternative forced choice experiment involving 378 DRPRs computed using the National Library of Medicine Visible Human data sets. An observer viewed three images at a time displayed on adjacent computer monitors. Each image triplet included a reference digitally reconstructed radiograph displayed on the central monitor and two DRPRs displayed on the flanking monitors. One DRPR was error free. The other DRPR contained a known in-plane or out-of-plane error in the placement of the treatment field over a target region in the pelvis. The range for each type of error was determined from pilot observer studies based on a Probit model for error detection. The smallest errors approached the limit of human visual capability. The observer was told what kind of error was introduced, and was asked to choose the DRPR that contained the error. Observer decisions were recorded and analyzed using repeated

  5. An assessment of the risk significance of human errors in selected PSAs and operating events

    International Nuclear Information System (INIS)

    Palla, R.L. Jr.; El-Bassioni, A.

    1991-01-01

    Sensitivity studies based on Probabilistic Safety Assessments (PSAs) for a pressurized water reactor and a boiling water reactor are described. In each case human errors modeled in the PSAs were categorized according to such factors as error type, location, timing, and plant personnel involved. Sensitivity studies were then conducted by varying the error rates in each category and evaluating the corresponding change in total core damage frequency and accident sequence frequency. Insights obtained are discussed and reasons for differences in risk sensitivity between plants are explored. A separate investigation into the role of human error in risk-important operating events is also described. This investigation involved the analysis of data from the USNRC Accident Sequence Precursor program to determine the effect of operator-initiated events on accident precursor trends, and to determine whether improved training can be correlated to current trends. The findings of this study are also presented. 5 refs., 15 figs., 1 tab
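
    The kind of sensitivity study described here can be pictured as scaling every human error probability in one category and propagating the change through the cut-set equation for core damage frequency. The minimal sketch below uses made-up cut sets and probabilities purely to show the mechanics; it is not the structure of either plant PSA.

```python
# Toy minimal-cut-set model: CDF is the sum over cut sets of the product of
# an initiator frequency and basic-event probabilities (all values invented).
cut_sets = [
    {"IE_LOSP": 1e-2, "HEP_calibration": 3e-3, "PUMP_FAILS": 1e-3},
    {"IE_SLOCA": 1e-3, "HEP_diagnosis": 1e-2},
    {"IE_LOSP": 1e-2, "VALVE_FAILS": 1e-4},
]

def core_damage_frequency(scale=None):
    """Sum of cut-set products, optionally scaling selected basic events."""
    scale = scale or {}
    total = 0.0
    for cut_set in cut_sets:
        product = 1.0
        for event, probability in cut_set.items():
            product *= probability * scale.get(event, 1.0)
        total += product
    return total

base = core_damage_frequency()
# Sensitivity case: all human error probabilities increased by a factor of 10
raised = core_damage_frequency({"HEP_calibration": 10.0, "HEP_diagnosis": 10.0})
print(base, raised, raised / base)
```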

  6. The Relationship between Human Operators' Psycho-physiological Condition and Human Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Arryum; Jang, Inseok; Kang, Hyungook; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2013-05-15

    The safe operation of nuclear power plants (NPPs) is substantially dependent on the performance of the human operators who operate the systems. In this environment, human errors caused by inappropriate operator performance have been considered to be critical, since they may lead to serious problems in safety-critical plants. In order to provide meaningful insights to prevent human errors and enhance human performance, operators' physiological conditions such as stress and workload have been investigated. Physiological measurements are considered reliable tools for assessing stress and workload. T. Q. Tran et al. and J. B. Brooking et al. pointed out that operators' workload can be assessed using eye tracking, galvanic skin response, electroencephalograms (EEGs), heart rate, respiration and other measurements. The purpose of this study is to investigate the effect of the human operators' tension level and knowledge level on the number of human errors. For this study, experiments were conducted in a mock-up of the main control room (MCR) of an NPP. It utilized the compact nuclear simulator (CNS), which is modeled on the three-loop pressurized water reactor, 993 MWe, Kori units 3 and 4 in Korea, and the subjects were asked to follow the tasks described in the emergency operating procedures (EOP). During the simulation, three kinds of physiological measurements were utilized: electrocardiogram (ECG), EEG and nose temperature. Also, subjects were divided into three groups based on their knowledge of plant operation. The results show that subjects who are tense make fewer errors. In addition, subjects at a higher knowledge level tend to be tense and make fewer errors. For the ECG data, subjects who make fewer human errors tend to be located in the higher-tension region of high SNS activity and low PSNS activity. The results for the EEG data are similar to the ECG results: the beta power ratio of subjects who made fewer errors was higher. Since beta
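
    A common way to obtain the kind of beta power ratio mentioned at the end of the abstract is to estimate the power spectral density with Welch's method and compare band powers. The sketch below does this on synthetic data; the sampling rate, band limits and the use of a total-power denominator are assumptions, since the abstract does not specify how the ratio was computed.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, f_lo, f_hi):
    """Summed spectral power in [f_lo, f_hi] Hz from a Welch PSD estimate.

    With a fixed frequency resolution the constant bin width cancels when
    forming a ratio of band powers, so a plain sum is sufficient here.
    """
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return np.sum(psd[band])

fs = 256                                  # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)              # one minute of synthetic "EEG"
eeg = (np.random.default_rng(0).normal(size=t.size)
       + 0.5 * np.sin(2 * np.pi * 20 * t))  # add a 20 Hz (beta band) component

beta = band_power(eeg, fs, 13, 30)
total = band_power(eeg, fs, 1, 45)
print("beta power ratio:", beta / total)
```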

  7. Improved compliance with the World Health Organization Surgical Safety Checklist is associated with reduced surgical specimen labelling errors.

    Science.gov (United States)

    Martis, Walston R; Hannam, Jacqueline A; Lee, Tracey; Merry, Alan F; Mitchell, Simon J

    2016-09-09

    A new approach to administering the surgical safety checklist (SSC) at our institution, using wall-mounted charts for each SSC domain coupled with migrated leadership among operating room (OR) sub-teams, led to improved compliance with the Sign Out domain. Since surgical specimens are reviewed at Sign Out, we aimed to quantify any related change in surgical specimen labelling errors. Prospectively maintained error logs for surgical specimens sent to pathology were examined for the six months before and after introduction of the new SSC administration paradigm. We recorded errors made in the labelling or completion of the specimen pot and on the specimen laboratory request form. Total error rates were calculated from the number of errors divided by the total number of specimens. Rates from the two periods were compared using a chi-square test. There were 19 errors in 4,760 specimens (rate 3.99/1,000) and eight errors in 5,065 specimens (rate 1.58/1,000) before and after the change in SSC administration paradigm (P=0.0225). Improved compliance with administering the Sign Out domain of the SSC can reduce surgical specimen errors. This finding provides further evidence that OR teams should optimise compliance with the SSC.
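
    The statistical comparison described in the abstract is a simple two-by-two test on error counts versus error-free specimens in the two periods. The snippet below redoes that calculation from the reported counts; with the continuity correction disabled it should give a P value close to the reported 0.0225.

```python
from scipy.stats import chi2_contingency

# Rows: before / after the new SSC administration paradigm
# Columns: specimens with a labelling error, specimens without one
table = [
    [19, 4760 - 19],
    [8, 5065 - 8],
]
chi2, p_value, dof, expected = chi2_contingency(table, correction=False)
print(f"before: {19 / 4760 * 1000:.2f}/1000, after: {8 / 5065 * 1000:.2f}/1000")
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
```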

  8. Human error identification for laparoscopic surgery: Development of a motion economy perspective.

    Science.gov (United States)

    Al-Hakim, Latif; Sevdalis, Nick; Maiping, Tanaphon; Watanachote, Damrongpan; Sengupta, Shomik; Dissaranan, Charuspong

    2015-09-01

    This study postulates that traditional human error identification techniques fail to consider motion economy principles and, accordingly, their applicability in operating theatres may be limited. This study addresses this gap in the literature with a dual aim. First, it identifies the principles of motion economy that suit the operative environment and second, it develops a new error mode taxonomy for human error identification techniques which recognises motion economy deficiencies affecting the performance of surgeons and predisposing them to errors. A total of 30 principles of motion economy were developed and categorised into five areas. A hierarchical task analysis was used to break down the main tasks of a urological laparoscopic surgery (hand-assisted laparoscopic nephrectomy) into their elements, and the new taxonomy was used to identify errors and their root causes resulting from violation of motion economy principles. The approach was prospectively tested in 12 observed laparoscopic surgeries performed by 5 experienced surgeons. A total of 86 errors were identified and linked to the motion economy deficiencies. Results indicate the developed methodology is promising. Our methodology allows error prevention in surgery and the developed set of motion economy principles could be useful for training surgeons on motion economy principles. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  9. Minimizing human error in radiopharmaceutical preparation and administration via a bar code-enhanced nuclear pharmacy management system.

    Science.gov (United States)

    Hakala, John L; Hung, Joseph C; Mosman, Elton A

    2012-09-01

    The objective of this project was to ensure correct radiopharmaceutical administration through the use of a bar code system that links patient and drug profiles with on-site information management systems. This new combined system would minimize the amount of manual human manipulation, which has proven to be a primary source of error. The most common reason for dosing errors is improper patient identification when a dose is obtained from the nuclear pharmacy or when a dose is administered. A standardized electronic transfer of information from radiopharmaceutical preparation to injection will further reduce the risk of misadministration. Value stream maps showing the flow of the patient dose information, as well as potential points of human error, were developed. Next, a future-state map was created that included proposed corrections for the most common critical sites of error. Transitioning the current process to the future state will require solutions that address these sites. To optimize the future-state process, a bar code system that links the on-site radiology management system with the nuclear pharmacy management system was proposed. A bar-coded wristband connects the patient directly to the electronic information systems. The bar code-enhanced process linking the patient dose with the electronic information reduces the number of crucial points for human error and provides a framework to ensure that the prepared dose reaches the correct patient. Although the proposed flowchart is designed for a site with an in-house central nuclear pharmacy, much of the framework could be applied by nuclear medicine facilities using unit doses. An electronic connection between information management systems to allow the tracking of a radiopharmaceutical from preparation to administration can be a useful tool in preventing the mistakes that are an unfortunate reality for any facility.
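
    The core of such a system is a final automated cross-check between what the wristband scan says, what the dose label says, and what the electronic order says, so that a mismatch stops the administration instead of relying on a manual check. The function below is a hypothetical sketch of that check only; identifiers and field names are invented, and a real implementation would sit inside the pharmacy and radiology information systems.

```python
def verify_administration(wristband_patient_id: str,
                          dose_label_patient_id: str,
                          dose_label_drug: str,
                          ordered_drug: str) -> None:
    """Hypothetical point-of-administration check driven by bar code scans."""
    if wristband_patient_id != dose_label_patient_id:
        raise ValueError("Patient ID mismatch - do not administer")
    if dose_label_drug != ordered_drug:
        raise ValueError("Radiopharmaceutical does not match the order")
    # No exception raised: identity and drug both match the electronic order.

# Example: the wristband and syringe-shield label scans agree with the order
verify_administration("PT-001234", "PT-001234", "Tc-99m MDP", "Tc-99m MDP")
```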

  10. The application of two recently developed human reliability techniques to cognitive error analysis

    International Nuclear Information System (INIS)

    Gall, W.

    1990-01-01

    Cognitive error can lead to catastrophic consequences for manned systems, including those whose design renders them immune to the effects of physical slips made by operators. Four such events, pressurized water and boiling water reactor accidents which occurred recently, were analysed. The analysis identifies the factors which contributed to the errors and suggests practical strategies for error recovery or prevention. Two types of analysis were conducted: an unstructured analysis based on the analyst's knowledge of psychological theory, and a structured analysis using two recently-developed human reliability analysis techniques. In general, the structured techniques required less effort to produce results and these were comparable to those of the unstructured analysis. (author)

  11. Nurses' Behaviors and Visual Scanning Patterns May Reduce Patient Identification Errors

    Science.gov (United States)

    Marquard, Jenna L.; Henneman, Philip L.; He, Ze; Jo, Junghee; Fisher, Donald L.; Henneman, Elizabeth A.

    2011-01-01

    Patient identification (ID) errors occurring during the medication administration process can be fatal. The aim of this study is to determine whether differences in nurses' behaviors and visual scanning patterns during the medication administration process influence their capacities to identify patient ID errors. Nurse participants (n = 20)…

  12. Accounting for measurement error in human life history trade-offs using structural equation modeling.

    Science.gov (United States)

    Helle, Samuli

    2018-03-01

    Revealing causal effects from correlative data is very challenging and a contemporary problem in human life history research owing to the lack of an experimental approach. Problems with causal inference arising from measurement error in independent variables, whether related to inaccurate measurement technique or to the validity of measurements, seem not to be well known in this field. The aim of this study is to show how structural equation modeling (SEM) with latent variables can be applied to account for measurement error in independent variables when the researcher has recorded several indicators of a hypothesized latent construct. As a simple example of this approach, measurement error in lifetime allocation of resources to reproduction in Finnish preindustrial women is modelled in the context of the survival cost of reproduction. In humans, lifetime energetic resources allocated to reproduction are almost impossible to quantify with precision and, thus, typically used measures of lifetime reproductive effort (e.g., lifetime reproductive success and parity) are likely to be plagued by measurement error. These results are contrasted with those obtained from a traditional regression approach where the single best proxy of lifetime reproductive effort available in the data is used for inference. As expected, the inability to account for measurement error in women's lifetime reproductive effort resulted in the underestimation of its underlying effect size on post-reproductive survival. This article emphasizes the advantages that the SEM framework can provide in handling measurement error via multiple-indicator latent variables in human life history studies. © 2017 Wiley Periodicals, Inc.
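
    The attenuation problem the article addresses can be demonstrated with a few lines of simulation: regressing an outcome on a single noisy proxy biases the slope toward zero, while using the correlation between two indicators of the same latent construct to estimate reliability allows the effect to be recovered (the classical disattenuation correction that multiple-indicator SEM generalizes). The numbers below are simulated for illustration only and have nothing to do with the Finnish data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
latent_effort = rng.normal(size=n)                       # true reproductive effort
survival = -0.5 * latent_effort + rng.normal(size=n)     # true effect = -0.5

# Two noisy indicators of the latent variable (e.g. two recorded proxies)
x1 = latent_effort + rng.normal(scale=1.0, size=n)
x2 = latent_effort + rng.normal(scale=1.0, size=n)

naive_slope = np.polyfit(x1, survival, 1)[0]     # attenuated toward zero (~ -0.25)
reliability = np.corrcoef(x1, x2)[0, 1]          # estimates var(true)/var(observed)
corrected_slope = naive_slope / reliability      # recovers roughly -0.5

print(naive_slope, corrected_slope)
```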

  13. HUMAN RELIABILITY ANALYSIS USING THE COGNITIVE RELIABILITY AND ERROR ANALYSIS METHOD (CREAM) APPROACH

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    Full Text Available Occupational accidents in the grinding and welding areas have ranked highest over the last five years at PT. X. These accidents were caused by human error, which arises from the influence of the physical and non-physical work environment. This study uses scenarios to predict and reduce the likelihood of human error with the CREAM (Cognitive Reliability and Error Analysis Method) approach. CREAM is a human reliability analysis method used to obtain the Cognitive Failure Probability (CFP), which can be determined in two ways: the basic method and the extended method. The basic method yields only a general failure probability, whereas the extended method yields a CFP for each task. The results show that the factors influencing the occurrence of errors in grinding and welding work are the adequacy of the organisation, the adequacy of the Man Machine Interface (MMI) and operational support, the availability of procedures/plans, and the adequacy of training and experience. The cognitive aspect with the highest error value in grinding work is planning, with a CFP of 0.3, and in welding work it is the cognitive aspect of execution, with a CFP of 0.18. To reduce the cognitive error values in grinding and welding work, the recommendations given are regular training, more detailed work instructions, and equipment familiarisation. Keywords: CREAM (cognitive reliability and error analysis method), HRA (human reliability analysis), cognitive error. Abstract: The accidents in grinding and welding sectors were the highest cases over the last five years in PT. X, and they were caused by human error. Human error occurs due to the influence of the working environment, both physical and non-physical. This study will implement an approaching scenario called CREAM (Cognitive Reliability and Error Analysis Method). CREAM is one of human
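
    The basic method summarized above can be sketched as: score the common performance conditions (CPCs), let the balance of degrading versus improving conditions pick a contextual control mode, and read off an interval of general action failure probability for that mode. The probability intervals below are the commonly cited CREAM values, but the mode-selection rule is a crude simplification of Hollnagel's published diagram and the example scores are invented.

```python
FAILURE_PROBABILITY_INTERVALS = {
    "strategic":     (0.5e-5, 1e-2),
    "tactical":      (1e-3, 1e-1),
    "opportunistic": (1e-2, 0.5),
    "scrambled":     (1e-1, 1.0),
}

def control_mode(cpc_scores):
    """cpc_scores: one value per CPC; +1 improves, 0 neutral, -1 reduces reliability."""
    reduced = sum(1 for s in cpc_scores if s < 0)
    improved = sum(1 for s in cpc_scores if s > 0)
    if reduced >= 7:
        return "scrambled"
    if reduced >= 4 and improved <= 1:
        return "opportunistic"
    if reduced <= 1 and improved >= 4:
        return "strategic"
    return "tactical"

# Example scoring of the nine CPCs (e.g. poor procedures, limited training)
scores = [-1, -1, 0, 0, -1, 0, 1, 0, -1]
mode = control_mode(scores)
print(mode, FAILURE_PROBABILITY_INTERVALS[mode])
```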

  14. Review of human error analysis methodologies and case study for accident management

    International Nuclear Information System (INIS)

    Jung, Won Dae; Kim, Jae Whan; Lee, Yong Hee; Ha, Jae Joo

    1998-03-01

    In this research, we tried to establish the requirements for the development of a new human error analysis method. To achieve this goal, we performed a case study with the following steps: 1. review of the existing HEA methods; 2. selection of those methods which are considered to be appropriate for the analysis of operator's tasks in NPPs; 3. choice of tasks for the application. The methods selected for the case study were HRMS (Human Reliability Management System), PHECA (Potential Human Error Cause Analysis), and CREAM (Cognitive Reliability and Error Analysis Method). As the tasks for the application, the 'bleed and feed operation' and 'decision-making for the reactor cavity flooding' tasks were chosen. We measured the applicability of the selected methods to the NPP tasks, and evaluated the advantages and disadvantages of each method. The three methods turned out to be applicable for the prediction of human error. We concluded that both CREAM and HRMS are sufficiently applicable to the NPP tasks; however, comparing the two methods, CREAM is thought to be more appropriate than HRMS from the viewpoint of the overall requirements. The requirements for the new HEA method obtained from the study can be summarized as follows: firstly, it should deal with cognitive error analysis; secondly, it should have an adequate classification system for the NPP tasks; thirdly, the description of the error causes and error mechanisms should be explicit; fourthly, it should maintain the consistency of the result by minimizing the ambiguity in each step of the analysis procedure; fifthly, it should be feasible with acceptable human resources. (author). 25 refs., 30 tabs., 4 figs

  15. Team errors: definition and taxonomy

    International Nuclear Information System (INIS)

    Sasou, Kunihide; Reason, James

    1999-01-01

    In error analysis or error management, the focus is usually upon individuals who have made errors. In large complex systems, however, most people work in teams or groups. Considering this working environment, insufficient emphasis has been given to 'team errors'. This paper discusses the definition of team errors and their taxonomy. These notions are also applied to events that have occurred in the nuclear power industry, the aviation industry and the shipping industry. The paper also discusses the relations between team errors and Performance Shaping Factors (PSFs). As a result, the proposed definition and taxonomy are found to be useful in categorizing team errors. The analysis also reveals that deficiencies in communication and resource/task management, an excessive authority gradient, and excessive professional courtesy will cause team errors. Handling human errors as team errors provides an opportunity to reduce human errors.

  16. Basic design of multimedia system for the representation of human error cases in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Park, Geun Ok [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have developed a multimedia system for the representation of human error cases with which education and training on human errors can be done effectively. The following were the major topics during the basic design: 1. establishment of a basic concept for representing human error cases using multimedia; 2. establishment of a design procedure for the multimedia system; 3. establishment of a hardware and software environment for operating the multimedia system; 4. design of multimedia input and output interfaces. In order to verify the results of this basic design, we implemented the basic design with an incident triggered by an operator's misaction which occurred at Uljin NPP Unit 1. (Author) 12 refs., 30 figs.

  17. Using a structured morbidity and mortality meeting to understand the contribution of human error to adverse surgical events in a South African regional hospital.

    Science.gov (United States)

    Clarke, Damian L; Furlong, Heidi; Laing, Grant L; Aldous, Colleen; Thomson, Sandie Rutherford

    2013-10-22

    Several authors have suggested that the traditional surgical morbidity and mortality meeting be developed as a tool to identify surgical errors and turn them into learning opportunities for staff. We report our experience with these meetings. A structured template was developed for each morbidity and mortality meeting. We used a grid to analyse mortality and classify the death as: (i) death expected/death unexpected; and (ii) death unpreventable/death preventable. Individual cases were then analysed using a combination of error taxonomies. During the period June - December 2011, a total of 400 acute admissions (195 trauma and 205 non-trauma) were managed at Edendale Hospital, Pietermaritzburg, South Africa. During this period, 20 morbidity and mortality meetings were held, at which 30 patients were discussed. There were 10 deaths, of which 5 were unexpected and potentially avoidable. A total of 43 errors were recognised, all in the domain of the acute admissions ward. There were 33 assessment failures, 5 logistical failures, 5 resuscitation failures, 16 errors of execution and 27 errors of planning. Seven patients experienced a number of errors, of whom 5 died. Error theory successfully dissected out the contribution of error to adverse events in our institution. Translating this insight into effective strategies to reduce the incidence of error remains a challenge. Using the examples of error identified at the meetings as educational cases may help with initiatives that directly target human error in trauma care.

  18. A Conceptual Framework for Predicting Error in Complex Human-Machine Environments

    Science.gov (United States)

    Freed, Michael; Remington, Roger; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    We present a Goals, Operators, Methods, and Selection Rules-Model Human Processor (GOMS-MHP) style model-based approach to the problem of predicting human habit capture errors. Habit captures occur when the model fails to allocate limited cognitive resources to retrieve task-relevant information from memory. Lacking the unretrieved information, decision mechanisms act in accordance with implicit default assumptions, resulting in error when relied upon assumptions prove incorrect. The model helps interface designers identify situations in which such failures are especially likely.

  19. Toward reduced transport errors in a high resolution urban CO2 inversion system

    Directory of Open Access Journals (Sweden)

    Aijun Deng

    2017-05-01

    Full Text Available We present a high-resolution atmospheric inversion system combining a Lagrangian Particle Dispersion Model (LPDM) and the Weather Research and Forecasting model (WRF), and test the impact of assimilating meteorological observations on transport accuracy. A Four Dimensional Data Assimilation (FDDA) technique continuously assimilates meteorological observations from various observing systems into the transport modeling system, and is coupled to the high resolution CO2 emission product Hestia to simulate the atmospheric mole fractions of CO2. For the Indianapolis Flux Experiment (INFLUX) project, we evaluated the impact of assimilating different meteorological observation systems on the linearized adjoint solutions and the CO2 inverse fluxes estimated using observed CO2 mole fractions from 11 out of 12 communication towers over Indianapolis for the Sep.-Nov. 2013 period. While assimilating WMO surface measurements improved the simulated wind speed and direction, their impact on the planetary boundary layer (PBL) was limited. Simulated PBL wind statistics improved significantly when assimilating upper-air observations from the commercial airline program Aircraft Communications Addressing and Reporting System (ACARS) and continuous ground-based Doppler lidar wind observations. Wind direction mean absolute error (MAE) decreased from 26 to 14 degrees and the wind speed MAE decreased from 2.0 to 1.2 m s–1, while the bias remains small in all configurations (< 6 degrees and 0.2 m s–1). Wind speed MAE and ME are larger in daytime than in nighttime. PBL depth MAE is reduced by ~10%, with little bias reduction. The inverse results indicate that the spatial distribution of CO2 inverse fluxes was affected by the model performance, while the overall flux estimates changed little across WRF simulations when aggregated over the entire domain. Our results show that PBL wind observations are a potent tool for increasing the precision of urban meteorological reanalyses.
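
    When reproducing MAE and bias statistics like those quoted here, the only subtle point is that wind direction errors must be computed on a circle, so that, for example, 355 and 5 degrees differ by 10 degrees rather than 350. A minimal sketch with made-up observation and model values:

```python
import numpy as np

def direction_error(obs_deg, model_deg):
    """Smallest signed angular difference in degrees (handles the 0/360 wrap)."""
    return (np.asarray(model_deg) - np.asarray(obs_deg) + 180.0) % 360.0 - 180.0

obs_dir, mod_dir = np.array([350.0, 10.0, 200.0]), np.array([5.0, 355.0, 230.0])
obs_spd, mod_spd = np.array([3.2, 5.1, 7.4]), np.array([4.0, 4.3, 9.0])

dir_err = direction_error(obs_dir, mod_dir)
spd_err = mod_spd - obs_spd
print("direction MAE:", np.mean(np.abs(dir_err)), "bias:", np.mean(dir_err))
print("speed MAE:", np.mean(np.abs(spd_err)), "bias:", np.mean(spd_err))
```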

  20. Effectiveness of Toyota process redesign in reducing thyroid gland fine-needle aspiration error.

    Science.gov (United States)

    Raab, Stephen S; Grzybicki, Dana Marie; Sudilovsky, Daniel; Balassanian, Ronald; Janosky, Janine E; Vrbin, Colleen M

    2006-10-01

    Our objective was to determine whether the Toyota Production System process redesign resulted in diagnostic error reduction for patients who underwent cytologic evaluation of thyroid nodules. In this longitudinal, nonconcurrent cohort study, we compared the diagnostic error frequency of a thyroid aspiration service before and after implementation of error reduction initiatives consisting of adoption of a standardized diagnostic terminology scheme and an immediate interpretation service. A total of 2,424 patients underwent aspiration. Following terminology standardization, the false-negative rate decreased from 41.8% to 19.1% (P = .006) and the specimen nondiagnostic rate increased from 5.8% to 19.8%. The Toyota process change led to significantly fewer diagnostic errors for patients who underwent thyroid fine-needle aspiration.

  1. A system dynamic simulation model for managing the human error in power tools industries

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2017-10-01

    In the era of modern and competitive life today, every organization will face situations in which work does not proceed as planned and is delayed when problems occur. Human error is often cited as the culprit. Errors made by employees cause them to spend additional time identifying and checking for the error, which in turn can affect the normal operations of the company as well as the company's reputation. Employees are a key element of the organization in running all of its activities. Hence, the work performance of employees is a crucial factor in organizational success. The purpose of this study is to identify the factors that cause the increase in errors made by employees in the organization, using a system dynamics approach. The broadly defined targets in this study are employees in the Regional Material Field team from the purchasing department in the power tools industry. Questionnaires were distributed to the respondents to obtain their perceptions of the root causes of errors made by employees in the company. A system dynamics model was developed to simulate the factors behind the increasing errors made by employees and their impact. The findings of this study show that the increase in errors made by employees was generally caused by the factors of workload, work capacity, job stress, motivation and performance of employees. However, this problem could be solved by increasing the number of employees in the organization.
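
    The feedback structure described in the abstract (workload raises stress, stress degrades performance, degraded performance produces errors, and errors feed work back into the backlog) can be prototyped as a small stock-and-flow simulation. The coefficients and initial values below are invented for illustration and are not calibrated to the questionnaire data.

```python
import numpy as np

dt, horizon_weeks = 0.25, 52
backlog = 40.0          # open purchasing tasks (the stock)
capacity = 50.0         # tasks the team can close per week at full performance
incoming = 48.0         # new tasks arriving per week

for _ in np.arange(0.0, horizon_weeks, dt):
    stress = min(backlog / capacity, 2.0)         # workload drives job stress
    performance = max(1.0 - 0.3 * stress, 0.4)    # stress degrades performance
    error_rate = 0.02 + 0.08 * stress             # errors per completed task
    completed = capacity * performance * dt
    rework = completed * error_rate               # erroneous tasks come back
    backlog += incoming * dt - completed + rework

print(f"backlog after one year: {backlog:.1f}, final error rate: {error_rate:.3f}")
```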

  2. Human reliability analysis during PSA at Trillo NPP: main characteristics and analysis of diagnostic errors

    International Nuclear Information System (INIS)

    Barquin, M.A.; Gomez, F.

    1998-01-01

    The design differences between Trillo NPP and other Spanish nuclear power plants (basic Westinghouse and General Electric designs) were made clear in the Human Reliability Analysis of the Probabilistic Safety Analysis (PSA) for Trillo NPP. The object of this paper is to describe the most significant characteristics of the Human Reliability Analysis carried out in the PSA, with special emphasis on the possible diagnostic errors and their consequences, based on the characteristics of the Emergency Operations Manual for Trillo NPP. - In the case of human errors before the initiating event (type 1), the existence of four redundancies in most of the plant safety systems means that the impact of this type of error on the final results of the PSA is insignificant. However, in the case of common cause errors, especially certain calibration errors, some actions are significant in the final equation for core damage. - The number of human actions that the operator has to carry out during the accidents (type 3) modelled is relatively small in comparison with this value in other PSAs. This is basically due to the high level of automation at Trillo NPP. - The Plant Operations Manual cannot be strictly considered to be a symptoms-based procedure. The Operations Group must select the chapter from the Operations Manual to be followed, after having diagnosed the perturbing event, using for this purpose an Emergency and Anomaly Decision Tree (M.O.3.0.1) based on the different indications, alarms and symptoms present in the plant after the perturbing event. For this reason, it was decided to analyse the possible diagnosis errors. In the bibliography on diagnosis and commission errors available at the present time, there is no precise methodology for the analysis of this type of error and its incorporation into PSAs. The method used in the PSA for Trillo NPP to evaluate this type of interaction is to develop a Diagnosis Error Table, the object of which is to identify the situations in

  3. The current approach to human error and blame in the NHS.

    Science.gov (United States)

    Ottewill, Melanie

    There is a large body of research to suggest that serious errors are widespread throughout medicine. The traditional response to these adverse events has been to adopt a 'person approach' - blaming the individual seen as 'responsible'. The culture of medicine is highly complicit in this response. Such an approach results in enormous personal costs to the individuals concerned and does little to address the root causes of errors and thus prevent their recurrence. Other industries, such as aviation, where safety is a paramount concern and which have similar structures to the medical profession, have, over the past decade or so, adopted a 'systems' approach to error, recognizing that human error is ubiquitous and inevitable and that systems need to be developed with this in mind. This approach has been highly successful, but has necessitated, first and foremost, a cultural shift. It is in the best interests of patients, and medical professionals alike, that such a shift is embraced in the NHS.

  4. Detailed semantic analyses of human error incidents occurring at nuclear power plant in USA (interim report). Characteristics of human error incidents occurring in the period from 1992 to 1996

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Tsuge, Tadashi; Sano, Toshiaki; Takano, Kenichi; Gouda, Hidenori

    2001-01-01

    CRIEPI has been conducting detailed analyses of all human error incidents at domestic nuclear power plants (NPPs) collected from Japanese Licensee Event Reports (LERs) using J-HPES (Japanese version of HPES) as an analysis method. Results obtained by the analyses have been stored in J-HPES database. Since 1999, human error incidents have been selected from U.S. LERs, and they are analyzed using J-HPES. In this report, the results, which classified error action, cause, and preventive measure, are summarized for U.S. human error cases occurring in the period from 1992 to 1996. It was suggested as a result of classification that the categories of error action were almost the same as those of Japanese human error cases. Therefore, problems in the process of error action and checkpoints for preventing errors will be extracted by analyzing both U.S. and domestic human error cases. It was also suggested that the interrelations between error actions, causes, and organizational factors could be identified. While taking these suggestions into consideration, we will continue to analyze U.S. human error cases. (author)

  5. An advanced human reliability analysis methodology: analysis of cognitive errors focused on

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2001-01-01

    The conventional Human Reliability Analysis (HRA) methods such as THERP/ASEP, HCR and SLIM have been criticised for their deficiency in analysing the cognitive errors that occur during the operator's decision-making process. In order to overcome the limitations of the conventional methods, an advanced HRA method, the so-called 2nd-generation HRA method, covering both qualitative analysis and quantitative assessment of cognitive errors, is being developed based on the state-of-the-art theory of cognitive systems engineering and error psychology. The method was developed on the basis of a human decision-making model and the relation between cognitive functions and performance influencing factors. The application of the proposed method to two emergency operation tasks is presented

  6. Analysis of Human Errors in Japanese Nuclear Power Plants using JHPES/JAESS

    International Nuclear Information System (INIS)

    Kojima, Mitsuhiro; Mimura, Masahiro; Yamaguchi, Osamu

    1998-01-01

    CRIEPI (Central Research Institute of Electric Power Industry) / HFC (Human Factors research Center) developed J-HPES (Japanese version of the Human Performance Enhancement System) based on HPES, which was originally developed by INPO to analyze events resulting from human errors. J-HPES was systematized into a computer program named JAESS (J-HPES Analysis and Evaluation Support System), and both systems were distributed to all Japanese electric power companies so that they could analyze events by themselves. CRIEPI / HFC also used J-HPES / JAESS to analyze the officially reported incidents in Japanese nuclear power plants (NPPs) that were identified as human error related. These incidents number up to 188 cases over the last 30 years. An outline of this analysis is given, and some preliminary findings are shown. (authors)

  7. Collection and classification of human error and human reliability data from Indian nuclear power plants for use in PSA

    International Nuclear Information System (INIS)

    Subramaniam, K.; Saraf, R.K.; Sanyasi Rao, V.V.S.; Venkat Raj, V.; Venkatraman, R.

    2000-01-01

    Complex systems such as NPPs involve a large number of Human Interactions (HIs) in every phase of plant operations. Human Reliability Analysis (HRA), in the context of a PSA, attempts to model the HIs and evaluate/predict their impact on safety and reliability using human error/human reliability data. A large number of HRA techniques have been developed for modelling and integrating HIs into PSA, but there is a significant lack of HRA data. In the face of insufficient data, human reliability analysts have had to resort to expert judgement methods in order to extend the insufficient data sets. In this situation, the generation of data from plant operating experience assumes importance. The development of an HRA data bank for Indian nuclear power plants was therefore initiated as part of the programme of work on HRA. Later, with the establishment of the coordinated research programme (CRP) on collection of human reliability data and use in PSA by IAEA in 1994-95, the development was carried out under the aegis of the IAEA research contract No. 8239/RB. The work described in this report covers the activities of development of a data taxonomy and a human error reporting form (HERF) based on it, data structuring, review and analysis of plant event reports, collection of data on human errors, analysis of the data and calculation of human error probabilities (HEPs). Analysis of plant operating experience does yield a good amount of qualitative data, but obtaining quantitative data on human reliability in the form of HEPs is seen to be more difficult. The difficulties have been highlighted and some ways to bring about improvements in the data situation have been discussed. The implementation of a data system for HRA is described and useful features that can be incorporated in future systems are also discussed. (author)

  8. How we learn to make decisions: rapid propagation of reinforcement learning prediction errors in humans.

    Science.gov (United States)

    Krigolson, Olav E; Hassall, Cameron D; Handy, Todd C

    2014-03-01

    Our ability to make decisions is predicated upon our knowledge of the outcomes of the actions available to us. Reinforcement learning theory posits that actions followed by a reward or punishment acquire value through the computation of prediction errors-discrepancies between the predicted and the actual reward. A multitude of neuroimaging studies have demonstrated that rewards and punishments evoke neural responses that appear to reflect reinforcement learning prediction errors [e.g., Krigolson, O. E., Pierce, L. J., Holroyd, C. B., & Tanaka, J. W. Learning to become an expert: Reinforcement learning and the acquisition of perceptual expertise. Journal of Cognitive Neuroscience, 21, 1833-1840, 2009; Bayer, H. M., & Glimcher, P. W. Midbrain dopamine neurons encode a quantitative reward prediction error signal. Neuron, 47, 129-141, 2005; O'Doherty, J. P. Reward representations and reward-related learning in the human brain: Insights from neuroimaging. Current Opinion in Neurobiology, 14, 769-776, 2004; Holroyd, C. B., & Coles, M. G. H. The neural basis of human error processing: Reinforcement learning, dopamine, and the error-related negativity. Psychological Review, 109, 679-709, 2002]. Here, we used the brain ERP technique to demonstrate that not only do rewards elicit a neural response akin to a prediction error but also that this signal rapidly diminished and propagated to the time of choice presentation with learning. Specifically, in a simple, learnable gambling task, we show that novel rewards elicited a feedback error-related negativity that rapidly decreased in amplitude with learning. Furthermore, we demonstrate the existence of a reward positivity at choice presentation, a previously unreported ERP component that has a similar timing and topography as the feedback error-related negativity that increased in amplitude with learning. The pattern of results we observed mirrored the output of a computational model that we implemented to compute reward
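
    The prediction-error computation at the heart of this account can be made concrete with a minimal sketch of a delta-rule (Rescorla-Wagner/temporal-difference style) update; the learning rate and reward probability below are arbitrary assumptions, and the sketch is not the authors' ERP analysis.

        import random

        def simulate_learning(n_trials=100, alpha=0.2, p_reward=0.8):
            """Track how reward prediction errors shrink as an action's value is learned."""
            value = 0.0                      # predicted reward for the chosen action
            prediction_errors = []
            for _ in range(n_trials):
                reward = 1.0 if random.random() < p_reward else 0.0
                delta = reward - value       # prediction error: actual minus predicted reward
                value += alpha * delta       # delta-rule update
                prediction_errors.append(delta)
            return prediction_errors

        errors = simulate_learning()
        print("mean |PE|, first 10 trials:", sum(abs(e) for e in errors[:10]) / 10)
        print("mean |PE|, last 10 trials :", sum(abs(e) for e in errors[-10:]) / 10)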

  9. The application of human error prevention tool in Tianwan nuclear power station

    International Nuclear Information System (INIS)

    Qiao Zhiguo

    2013-01-01

    This paper mainly discusses the application and popularization of human error prevention tool in Tianwan nuclear power station, including the study on project implementation background, main contents and innovation, performance management, innovation practice and development, and performance of innovation application. (authors)

  10. Taking human error into account in the design of nuclear reactor centres

    International Nuclear Information System (INIS)

    Prouillac; Lerat; Janoir.

    1982-05-01

    The role of the operator in the centralized management of pressurized water reactors is studied. Different types of human error likely to arise, the means of their prevention and methods of mitigating their consequences are presented. Some possible improvements are outlined

  11. Human error views : a framework for benchmarking organizations and measuring the distance between academia and industry

    NARCIS (Netherlands)

    Karanikas, Nektarios

    2015-01-01

    The paper presents a framework that, through structured analysis of accident reports, explores the differences between practice and academic literature, as well as amongst organizations, regarding their views on human error. The framework is based on the hypothesis that the wording of accident reports

  12. The human fallibility of scientists : Dealing with error and bias in academic research

    NARCIS (Netherlands)

    Veldkamp, Coosje

    2017-01-01

    THE HUMAN FALLIBILITY OF SCIENTISTS Dealing with error and bias in academic research Recent studies have highlighted that not all published findings in the scientific literature are trustworthy, suggesting that currently implemented control mechanisms such as high standards for the reporting of

  13. Working group of experts on rare events in human error analysis and quantification

    International Nuclear Information System (INIS)

    Goodstein, L.P.

    1977-01-01

    In dealing with the reference problem of rare events in nuclear power plants, the group has concerned itself with the man-machine system and, in particular, with human error analysis and quantification. The Group was requested to review methods of human reliability prediction, to evaluate the extent to which such analyses can be formalized and to establish criteria to be met by task conditions and system design which would permit a systematic, formal analysis. Recommendations are given on the Fessenheim safety system

  14. Human error data collection as a precursor to the development of a human reliability assessment capability in air traffic management

    International Nuclear Information System (INIS)

    Kirwan, Barry; Gibson, W. Huw; Hickling, Brian

    2008-01-01

    Quantified risk and safety assessments are now required for safety cases for European air traffic management (ATM) services. Since ATM is highly human-dependent for its safety, this suggests a need for formal human reliability assessment (HRA), as carried out in other industries such as nuclear power. Since the fundamental aspect of HRA is human error data, in the form of human error probabilities (HEPs), it was decided to take a first step towards development of an ATM HRA approach by deriving some HEPs in an ATM context. This paper reports a study, which collected HEPs via analysing the results of a real-time simulation involving air traffic controllers (ATCOs) and pilots, with a focus on communication errors. This study did indeed derive HEPs that were found to be concordant with other known communication human error data. This is a first step, and shows promise for HRA in ATM, since HEPs have been derived which could be used in safety assessments, although these HEPs are for only one (albeit critical) aspect of ATCOs' tasks (communications). The paper discusses options and potential ways forward for the development of a full HRA capability in ATM

  15. A human error analysis methodology, AGAPE-ET, for emergency tasks in nuclear power plants and its application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    This report presents a procedurised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), for both qualitative error analysis and quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. AGAPE-ET is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified considering the characteristics of the performance of that cognitive function and the influencing mechanism of PIFs on it. Error analysis items have then been determined from the identified error causes or error-likely situations to cue or guide the analysts through the overall human error analysis, and a human error analysis procedure based on these items is organised. The basic scheme for the quantification of HEP consists in multiplying the BHEP assigned to the error analysis item by the weight obtained from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by the structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts only have to work through the necessary cognitive functions. The report also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results. 42 refs., 7 figs., 36 tabs. (Author)
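
    The quantification scheme described above (a basic HEP multiplied by a weight read off the influencing factors decision tree) can be sketched as follows; the error items, BHEP values, and IFDT branches are hypothetical placeholders, not figures from the AGAPE-ET report.

        # Hypothetical numbers for illustration only; AGAPE-ET defines its own BHEPs and IFDT weights.
        BHEP = {
            "failure_to_notice_alarm": 3e-3,   # assumed basic HEP for an information-acquisition item
            "misdiagnosis_of_event":   1e-2,   # assumed basic HEP for a situation-assessment item
        }

        def ifdt_weight(high_stress: bool, poor_procedure: bool) -> float:
            """Toy influencing-factors decision tree: each adverse PIF branch multiplies the weight."""
            weight = 1.0
            if high_stress:
                weight *= 2.0
            if poor_procedure:
                weight *= 5.0
            return weight

        def task_hep(error_item: str, high_stress: bool, poor_procedure: bool) -> float:
            # AGAPE-ET-style quantification: HEP = BHEP(item) x weight(IFDT branch), capped at 1.0
            return min(1.0, BHEP[error_item] * ifdt_weight(high_stress, poor_procedure))

        print(task_hep("misdiagnosis_of_event", high_stress=True, poor_procedure=False))  # 0.02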

  16. Basic human error probabilities in advanced MCRs when using soft control

    International Nuclear Information System (INIS)

    Jang, In Seok; Seong, Poong Hyun; Kang, Hyun Gook; Lee, Seung Jun

    2012-01-01

    In a report on one of the renowned HRA methods, Technique for Human Error Rate Prediction (THERP), it is pointed out that 'The paucity of actual data on human performance continues to be a major problem for estimating HEPs and performance times in nuclear power plant (NPP) task'. However, another critical difficulty is that most current HRA databases deal with operation in conventional type of MCRs. With the adoption of new human system interfaces that are based on computer based technologies, the operation environment of MCRs in NPPs has changed. The MCRs including these digital and computer technologies, such as large display panels, computerized procedures, soft controls, and so on, are called advanced MCRs. Because of the different interfaces, different Basic Human Error Probabilities (BHEPs) should be considered in human reliability analyses (HRAs) for advanced MCRs. This study carries out an empirical analysis of human error considering soft controls. The aim of this work is not only to compile a database using the simulator for advanced MCRs but also to compare BHEPs with those of a conventional MCR database

  17. Support of protective work of human error in a nuclear power plant

    International Nuclear Information System (INIS)

    Yoshizawa, Yuriko

    1999-01-01

    The human factors group for nuclear power plants at the Tokyo Electric Power Co., Ltd. supports various human error prevention activities conducted at its nuclear power plants. Its main research themes are studies on human factors in plant operation, on error recovery, and on common fundamentals of human factors. On the basis of the information obtained, the group also assists the human error prevention work conducted at the plants and develops the results for practical use. In particular, to support the sharing of hazard-related information, several forms of assistance were promoted: a case-analysis method that helps staff understand hazard information thoroughly rather than superficially, a database for conveniently sharing such information, and a survey of routine (non-accident) work to find hints for promoting prevention activities more effectively. This paper introduces the assistance and investigations carried out for the effective sharing of hazard information in human error prevention activities at nuclear power plants. (G.K.)

  18. About errors, inaccuracies and sterotypes: Mistakes in media coverage - and how to reduce them

    Science.gov (United States)

    Scherzler, D.

    2010-12-01

    The main complaint made by scientists about the work of journalists is that there are mistakes and inaccuracies in TV programmes, radio or the print media. This seems to be an important reason why too few researchers want to deal with journalists. Such scientists regularly discover omissions, errors, exaggerations, distortions, stereotypes and sensationalism in the media. Surveys carried out in so-called accuracy research seem to support this point as well. Errors frequently occur in journalism, and it is the task of the editorial offices to work very hard to keep the number of errors as low as possible. On closer inspection, however, some errors turn out to be simplifications and omissions. Both are necessary in journalism and do not automatically cause factual errors. This paper examines the different kinds of mistakes and misleading information that scientists observe in the mass media. By giving a view from inside the mass media, it tries to explain how errors come to exist in journalists' working routines. It outlines that the criteria of journalistic quality applied by scientists and by science journalists differ substantially. The expectation of many scientists is that good science journalism passes on their results to the public in as “unadulterated” a form as possible. The author suggests, however, that quality criteria for journalism cannot be derived from how true to detail and how comprehensively it reports on science, nor from the extent to which the journalistic presentation is “correct” in the eyes of the researcher. In its main part, the paper suggests that scientists who are contacted or interviewed by the mass media should not accept that errors just happen. On the contrary, they can do a lot to help prevent mistakes in the journalistic product. The author proposes several strategies by which scientists and press information officers can identify possible errors, stereotypes and exaggeration by journalists in advance and

  19. An empirical study on the human error recovery failure probability when using soft controls in NPP advanced MCRs

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Jung, Wondea; Seong, Poong Hyun

    2014-01-01

    Highlights: • Many researchers have tried to understand the human recovery process or its steps. • Modeling the human recovery process alone is not sufficient for application to HRA. • The operating environment of MCRs in NPPs has changed with the adoption of new HSIs. • Recovery failure probability in a soft control operation environment is investigated. • The recovery failure probabilities obtained here would be important evidence for expert judgment. - Abstract: It is well known that probabilistic safety assessments (PSAs) today consider not just hardware failures and environmental events that can impact upon risk, but also human error contributions. Consequently, the focus of reliability and performance management has been on the prevention of human errors and failures rather than the recovery of human errors. However, the recovery of human errors is as important as the prevention of human errors and failures for the safe operation of nuclear power plants (NPPs). For this reason, many researchers have tried to find a human recovery process or step. However, modeling the human recovery process is not sufficient to be applied to human reliability analysis (HRA), which requires human error and recovery probabilities. In this study, therefore, human error recovery failure probabilities based on predefined human error modes were investigated by conducting experiments in the operation mockup of advanced/digital main control rooms (MCRs) in NPPs. To this end, 48 subjects majoring in nuclear engineering participated in the experiments. In the experiments, using the developed accident scenario based on tasks from the standard post trip action (SPTA), the steam generator tube rupture (SGTR), and predominant soft control tasks, which are derived from the loss of coolant accident (LOCA) and the excess steam demand event (ESDE), all error detection and recovery data based on human error modes were checked with the performance sheet and the statistical analysis of error recovery/detection was then

  20. A trend analysis of human error events for proactive prevention of accidents. Methodology development and effective utilization

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Ebisu, Mitsuhiro; Aikawa, Takeshi; Matsubara, Katsuyuki

    2006-01-01

    This paper describes methods for analyzing the human error events that have been accumulated at an individual plant and for utilizing the results to prevent accidents proactively. Firstly, a categorization framework for the trigger actions and causal factors of human error events was reexamined, and the procedure to analyze human error events was reviewed based on the framework. Secondly, a method for identifying the common characteristics of the trigger action data and of the causal factor data accumulated by analyzing human error events was clarified. In addition, to utilize the results of the trend analysis effectively, methods to develop teaching material for safety education, to develop checkpoints for error prevention and to introduce an error management process for strategic error prevention were proposed. (author)

  1. Human reliability data, human error and accident models--illustration through the Three Mile Island accident analysis

    International Nuclear Information System (INIS)

    Le Bot, Pierre

    2004-01-01

    Our first objective is to provide a panorama of Human Reliability data used in EDF's Safety Probabilistic Studies, and then, since these concepts are at the heart of Human Reliability and its methods, to go over the notion of human error and the understanding of accidents. We are not sure today that it is actually possible to provide in this field a foolproof and productive theoretical framework. Consequently, the aim of this article is to suggest potential paths of action and to provide information on EDF's progress along those paths which enables us to produce the most potentially useful Human Reliability analyses while taking into account current knowledge in Human Sciences. The second part of this article illustrates our point of view as EDF researchers through the analysis of the most famous civil nuclear accident, the Three Mile Island unit accident in 1979. Analysis of this accident allowed us to validate our positions regarding the need to move, in the case of an accident, from the concept of human error to that of systemic failure in the operation of systems such as a nuclear power plant. These concepts rely heavily on the notion of distributed cognition and we will explain how we applied it. These concepts were implemented in the MERMOS Human Reliability Probabilistic Assessment methods used in the latest EDF Probabilistic Human Reliability Assessment. Besides the fact that it is not very productive to focus exclusively on individual psychological error, the design of the MERMOS method and its implementation have confirmed two things: the significance of qualitative data collection for Human Reliability, and the central role held by Human Reliability experts in building knowledge about emergency operation, which in effect consists of Human Reliability data collection. The latest conclusion derived from the implementation of MERMOS is that, considering the difficulty in building 'generic' Human Reliability data in the field we are involved in, the best

  2. SIMULATED HUMAN ERROR PROBABILITY AND ITS APPLICATION TO DYNAMIC HUMAN FAILURE EVENTS

    Energy Technology Data Exchange (ETDEWEB)

    Herberger, Sarah M.; Boring, Ronald L.

    2016-10-01

    Objectives: Human reliability analysis (HRA) methods typically analyze human failure events (HFEs) at the overall task level. For dynamic HRA, it is important to model human activities at the subtask level. There exists a disconnect between the dynamic subtask level and the static task level that presents issues when modeling dynamic scenarios. For example, the SPAR-H method is typically used to calculate the human error probability (HEP) at the task level. As demonstrated in this paper, quantification in SPAR-H does not translate to the subtask level. Methods: Two different discrete distributions were generated for each SPAR-H Performance Shaping Factor (PSF) to define the frequency of PSF levels. The first distribution was a uniform, or uninformed, distribution that assumed the frequency of each PSF level was equally likely. The second, non-continuous distribution took the frequency of each PSF level as identified from an assessment of the HERA database. These two different approaches were created to identify the resulting distribution of the HEP. The resulting HEP that appears closer to the known distribution, a log-normal centered on 1E-3, is the more desirable. Each approach then has median, average and maximum HFE calculations applied. To calculate these three values, three events, A, B and C, are generated from the PSF level frequencies of their constituent subtasks. The median HFE selects the median PSF level from each PSF and calculates the HEP. The average HFE takes the mean PSF level, and the maximum takes the maximum PSF level. The same data set of subtask HEPs yields starkly different HEPs when aggregated to the HFE level in SPAR-H. Results: Assuming that each PSF level in each HFE is equally likely creates an unrealistic distribution of the HEP that is centered at 1. Next, the observed frequency of PSF levels was applied, with the resulting HEP behaving log-normally and a majority of the values under a 2.5% HEP. The median, average and maximum HFE calculations did yield
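
    The subtask-to-HFE aggregation problem described here can be illustrated with a small Monte Carlo sketch; the PSF names, multipliers, and level frequencies below are invented for illustration and are not the SPAR-H tables or the HERA-derived frequencies.

        import random
        import statistics

        NOMINAL_HEP = 1e-3   # SPAR-H nominal HEP for action tasks

        # Hypothetical PSF levels as (multiplier, assumed frequency) pairs.
        PSFS = {
            "available_time": [(0.1, 0.2), (1.0, 0.7), (10.0, 0.1)],
            "stress":         [(1.0, 0.8), (2.0, 0.15), (5.0, 0.05)],
            "complexity":     [(1.0, 0.7), (2.0, 0.25), (5.0, 0.05)],
        }

        def sample_level(levels):
            r, cum = random.random(), 0.0
            for multiplier, freq in levels:
                cum += freq
                if r <= cum:
                    return multiplier
            return levels[-1][0]

        def subtask_hep():
            hep = NOMINAL_HEP
            for levels in PSFS.values():
                hep *= sample_level(levels)   # SPAR-H style: nominal HEP x product of PSF multipliers
            return min(hep, 1.0)

        subtasks = [subtask_hep() for _ in range(10_000)]
        print("median subtask HEP:", statistics.median(subtasks))
        print("mean subtask HEP  :", statistics.fmean(subtasks))
        print("max subtask HEP   :", max(subtasks))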

  3. A human error probability estimate methodology based on fuzzy inference and expert judgment on nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Mesquita, R.N. de

    2009-01-01

    Recent studies point to human error as an important factor in many industrial and nuclear accidents: Three Mile Island (1979), Bhopal (1984), Chernobyl and Challenger (1986) are classical examples. The human contribution to these accidents may be better understood and analyzed by using Human Reliability Analysis (HRA), which has been taken as an essential part of the Probabilistic Safety Analysis (PSA) of nuclear plants. Both HRA and PSA depend on Human Error Probabilities (HEPs) for a quantitative analysis. These probabilities are strongly affected by the Performance Shaping Factors (PSFs), which have a direct effect on human behavior and thus shape the HEP according to the specific environmental conditions and the personal characteristics of the individuals responsible for the actions. This PSF dependence poses a serious data-availability problem, as it makes the few existing databases either too generic or too specific. Besides this, most nuclear plants do not keep historical records of human error occurrences. Therefore, in order to overcome this data shortage, a methodology based on fuzzy inference and expert judgment was employed in this paper to determine human error occurrence probabilities and to evaluate the PSFs of actions performed by operators in a nuclear power plant (the IEA-R1 nuclear reactor). The obtained HEP values were compared with reference tabled data used in the current literature in order to show the coherence and validity of the approach. This comparison leads to the conclusion that the results of this work can be employed in both HRA and PSA, enabling efficient assessment of plant safety conditions, operational procedures and potential improvements of local working conditions (author)
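
    The fuzzy-inference step can be sketched with triangular membership functions and a small rule base mapping PSF ratings to an HEP class; the membership functions, rules, and representative HEP values below are assumptions made for illustration and are not those of the cited study.

        def tri(x, a, b, c):
            """Triangular membership function with support [a, c] and peak at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def hep_from_psfs(stress: float, experience: float) -> float:
            """Mamdani-style sketch: PSF ratings on a 0-10 scale mapped to a crisp HEP."""
            # Fuzzify the inputs (assumed membership functions).
            stress_high = tri(stress, 4, 10, 16)
            exp_low     = tri(experience, -6, 0, 6)
            stress_low  = tri(stress, -6, 0, 6)
            exp_high    = tri(experience, 4, 10, 16)

            # Assumed rule base: high stress AND low experience -> high HEP, and so on.
            w_high = min(stress_high, exp_low)
            w_low  = min(stress_low, exp_high)
            w_med  = max(0.0, 1.0 - w_high - w_low)

            # Defuzzify as a weighted average of representative HEPs per class (assumed values).
            reps = {"low": 1e-4, "medium": 1e-3, "high": 1e-2}
            total = w_low + w_med + w_high
            return (w_low * reps["low"] + w_med * reps["medium"] + w_high * reps["high"]) / total

        print(hep_from_psfs(stress=8, experience=2))   # roughly 7e-3 for a stressed, inexperienced operator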

  4. In-plant reliability data base for nuclear plant components: a feasibility study on human error information

    International Nuclear Information System (INIS)

    Borkowski, R.J.; Fragola, J.R.; Schurman, D.L.; Johnson, J.W.

    1984-03-01

    This report documents the procedure and final results of a feasibility study which examined the usefulness of nuclear plant maintenance work requests in the IPRDS as tools for understanding human error and its influence on component failure and repair. Developed in this study were (1) a set of criteria for judging the quality of a plant maintenance record set for studying human error; (2) a scheme for identifying human errors in the maintenance records; and (3) two taxonomies (engineering-based and psychology-based) for categorizing and coding human error-related events

  5. Information Management System Development for the Characterization and Analysis of Human Error in Naval Aviation Maintenance Related Mishaps

    National Research Council Canada - National Science Library

    Wood, Brian

    2000-01-01

    .... The Human Factors Analysis and Classification System-Maintenance Extension taxonomy, an effective framework for classifying and analyzing the presence of maintenance errors that lead to mishaps...

  6. Human error recovery failure probability when using soft controls in computerized control rooms

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Wondea

    2014-01-01

    Many studies have categorized the recovery process into three phases: detection of the problem situation, explanation of the problem causes or countermeasures against the problem, and end of recovery. Although the focus of recovery promotion has been on categorizing recovery phases and modeling the recovery process, research related to human recovery failure probabilities has not been actively performed. On the other hand, a few studies regarding recovery failure probabilities have been carried out empirically. In summary, the research performed so far has several problems in terms of its use in human reliability analysis (HRA). By adopting new human-system interfaces that are based on computer-based technologies, the operating environment of MCRs in NPPs has changed from conventional MCRs to advanced MCRs. Because of the different interfaces between conventional and advanced MCRs, different recovery failure probabilities should be considered in the HRA for advanced MCRs. Therefore, this study carries out an empirical analysis of human error recovery probabilities under an advanced MCR mockup called the compact nuclear simulator (CNS). The aim of this work is not only to compile a recovery failure probability database using the simulator for advanced MCRs but also to collect recovery failure probabilities according to defined human error modes in order to compare which human error mode has the highest recovery failure probability. The results show that the recovery failure probability for wrong screen selection was the lowest among the human error modes, which means that most human errors related to wrong screen selection can be recovered. On the other hand, the recovery failure probabilities of operation selection omission and delayed operation were 1.0. These results imply that once subjects omitted a task in the procedure, they had difficulties finding and recovering their errors without a supervisor's assistance. Also, wrong screen selection had an effect on delayed operation. That is, wrong screen

  7. Training situational awareness to reduce surgical errors in the operating room

    NARCIS (Netherlands)

    Graafland, M.; Schraagen, J.M.C.; Boermeester, M.A.; Bemelman, W.A.; Schijven, M.P.

    2015-01-01

    Background: Surgical errors result from faulty decision-making, misperceptions and the application of suboptimal problem-solving strategies, just as often as they result from technical failure. To date, surgical training curricula have focused mainly on the acquisition of technical skills. The aim

  8. On failure of the pruning technique in "error repair in shift-reduce parsers"

    NARCIS (Netherlands)

    Bertsch, E; Nederhof, MJ

    A previous article presented a technique to compute the least-cost error repair by incrementally generating configurations that result from inserting and deleting tokens in a syntactically incorrect input. An additional mechanism to improve the run-time efficiency of this algorithm by pruning some

  9. Reducing Check-in Errors at Brigham Young University through Statistical Process Control

    Science.gov (United States)

    Spackman, N. Andrew

    2005-01-01

    The relationship between the library and its patrons is damaged and the library's reputation suffers when returned items are not checked in. An informal survey reveals librarians' concern for this problem and their efforts to combat it, although few libraries collect objective measurements of errors or the effects of improvement efforts. Brigham…

  10. SNP discovery in nonmodel organisms: strand bias and base-substitution errors reduce conversion rates.

    Science.gov (United States)

    Gonçalves da Silva, Anders; Barendse, William; Kijas, James W; Barris, Wes C; McWilliam, Sean; Bunch, Rowan J; McCullough, Russell; Harrison, Blair; Hoelzel, A Rus; England, Phillip R

    2015-07-01

    Single nucleotide polymorphisms (SNPs) have become the marker of choice for genetic studies in organisms of conservation, commercial or biological interest. Most SNP discovery projects in nonmodel organisms apply a strategy for identifying putative SNPs based on filtering rules that account for random sequencing errors. Here, we analyse data used to develop 4723 novel SNPs for the commercially important deep-sea fish, orange roughy (Hoplostethus atlanticus), to assess the impact of not accounting for systematic sequencing errors when filtering identified polymorphisms during SNP discovery. We used SAMtools to identify polymorphisms in a velvet assembly of genomic DNA sequence data from seven individuals. The resulting set of polymorphisms was filtered to minimize 'bycatch', that is, polymorphisms caused by sequencing or assembly error. An Illumina Infinium SNP chip was used to genotype a final set of 7714 polymorphisms across 1734 individuals. Five predictors were examined for their effect on the probability of obtaining an assayable SNP: depth of coverage, number of reads that support a variant, polymorphism type (e.g. A/C), strand bias and Illumina SNP probe design score. Our results indicate that filtering out systematic sequencing errors could substantially improve the efficiency of SNP discovery. We show that BLASTX can be used as an efficient tool to identify single-copy genomic regions in the absence of a reference genome. The results have implications for research aiming to identify assayable SNPs and build SNP genotyping assays for nonmodel organisms. © 2014 John Wiley & Sons Ltd.

  11. Optimal control strategy to reduce the temporal wavefront error in AO systems

    NARCIS (Netherlands)

    Doelman, N.J.; Hinnen, K.J.G.; Stoffelen, F.J.G.; Verhaegen, M.H.

    2004-01-01

    An Adaptive Optics (AO) system for astronomy is analysed from a control point of view. The focus is put on the temporal error. The AO controller is identified as a feedback regulator system, operating in closed-loop with the aim of rejecting wavefront disturbances. Limitations on the performance of

  12. Reducing Bias and Error in the Correlation Coefficient Due to Nonnormality

    Science.gov (United States)

    Bishara, Anthony J.; Hittner, James B.

    2015-01-01

    It is more common for educational and psychological data to be nonnormal than to be approximately normal. This tendency may lead to bias and error in point estimates of the Pearson correlation coefficient. In a series of Monte Carlo simulations, the Pearson correlation was examined under conditions of normal and nonnormal data, and it was compared…
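
    A small Monte Carlo sketch in the spirit of such simulations: draw bivariate normal data with a known correlation, apply a skewing transform to one variable, and compare the average sample Pearson r with the target value. The sample size, target correlation, and lognormal transform are arbitrary choices, not the conditions of the article.

        import numpy as np

        rng = np.random.default_rng(42)

        def mean_pearson(n=30, rho=0.5, reps=5_000, skew=False):
            """Average sample Pearson correlation under normal vs. skewed (lognormal) marginals."""
            cov = np.array([[1.0, rho], [rho, 1.0]])
            rs = []
            for _ in range(reps):
                x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
                if skew:
                    x = np.exp(x)              # nonnormal marginal via a lognormal transform
                rs.append(np.corrcoef(x, y)[0, 1])
            return float(np.mean(rs))

        print("normal data :", round(mean_pearson(skew=False), 3))   # close to 0.5
        print("skewed x    :", round(mean_pearson(skew=True), 3))    # attenuated on average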

  13. A methodology for analysing human errors of commission in accident scenarios for risk assessment

    International Nuclear Information System (INIS)

    Kim, J. H.; Jung, W. D.; Park, J. K

    2003-01-01

    As concern over the impact of operators' inappropriate interventions, so-called Errors Of Commission (EOCs), on plant safety has been raised, interest in the identification and analysis of EOC events from the risk assessment perspective is increasing accordingly. To this end, we propose a new methodology for identifying and analysing human errors of commission that might be caused by failures in situation assessment and decision making during accident progressions given an initiating event. The proposed methodology was applied to the accident scenarios of the YGN 3 and 4 NPPs, which resulted in about 10 EOC situations that need careful attention

  14. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA), and proposes a standard procedure and rules for the evaluation of the diagnosis error probability. The quality of the KSNP HRA was evaluated using the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far for analyzing the diagnosis error probability are suggested as part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules

  15. Application of human error theory in case analysis of wrong procedures.

    Science.gov (United States)

    Duthie, Elizabeth A

    2010-06-01

    The goal of this study was to contribute to the emerging body of literature about the role of human behaviors and cognitive processes in the commission of wrong procedures. Case analysis of 5 wrong procedures in operative and nonoperative settings using James Reason's human error theory was performed. The case analysis showed that cognitive underspecification, cognitive flips, automode processing, and skill-based errors were contributory to wrong procedures. Wrong-site procedures accounted for the preponderance of the cases. Front-line supervisory staff used corrective actions that focused on the performance of the individual without taking into account cognitive factors. System fixes using human cognition concepts have a greater chance of achieving sustainable safety outcomes than those that are based on the traditional approach of counseling, education, and disciplinary action for staff.

  16. A Benefit/Cost/Deficit (BCD) model for learning from human errors

    International Nuclear Information System (INIS)

    Vanderhaegen, Frederic; Zieba, Stephane; Enjalbert, Simon; Polet, Philippe

    2011-01-01

    This paper proposes an original model for interpreting human errors, mainly violations, in terms of benefits, costs and potential deficits. This BCD model is then used as an input framework to learn from human errors, and two systems based on this model are developed: a case-based reasoning system and an artificial neural network system. These systems are used to predict a specific human car driving violation: not respecting the priority-to-the-right rule, which is a decision to remove a barrier. Both prediction systems learn from previous violation occurrences, using the BCD model and four criteria: safety, for identifying the deficit or the danger; and opportunity for action, driver comfort, and time spent, for identifying the benefits or the costs. The application of the learning systems to predict car driving violations gives a correct prediction rate of over 80% after 10 iterations. These results are validated for non-respect of the priority-to-the-right rule.

  17. Human errors during the simulations of an SGTR scenario: Application of the HERA system

    International Nuclear Information System (INIS)

    Jung, Won Dea; Whaley, April M.; Hallbert, Bruce P.

    2009-01-01

    Due to the need for data for Human Reliability Analysis (HRA), a number of data collection efforts have been undertaken by several different organizations. As part of this effort, a human error analysis focused on a set of simulator records of a Steam Generator Tube Rupture (SGTR) scenario was performed using the Human Event Repository and Analysis (HERA) system. This paper summarizes the process and results of the HERA analysis, including discussions about the usability of the HERA system for a human error analysis of simulator data. Five simulated records of an SGTR scenario were analyzed with the HERA analysis process in order to scrutinize the causes and mechanisms of the human-related events. From this study, the authors confirmed that HERA is a serviceable system that can analyze human performance qualitatively from simulator data. It was possible to identify the human-related events in the simulator data that affected system safety not only negatively but also positively. It was also possible to scrutinize the Performance Shaping Factors (PSFs) and the relevant contributory factors with regard to each identified human event

  18. Good ergonomics and team diversity reduce absenteeism and errors in car manufacturing.

    Science.gov (United States)

    Fritzsche, Lars; Wegge, Jürgen; Schmauder, Martin; Kliegel, Matthias; Schmidt, Klaus-Helmut

    2014-01-01

    Prior research suggests that ergonomic work design and mixed teams (in age and gender) may compensate for declines in certain abilities of ageing employees. This study investigates the simultaneous effects of both team-level factors on absenteeism and performance (error rates) over one year in a sample of 56 car assembly teams (N = 623). Results show that age was related to prolonged absenteeism and more mistakes in work planning, but not to overall performance. In comparison, high physical workload was strongly associated with longer absenteeism and increased error rates. Furthermore, controlling for physical workload, age diversity was related to shorter absenteeism, and the presence of females in the team was associated with shorter absenteeism and better performance. In summary, this study suggests that both ergonomic work design and mixed team composition may compensate for age-related productivity risks in manufacturing by maintaining the work ability of older employees and improving job quality.

  19. High Reliability Organization and Applicability to the Battlefield to Reduce Errors Associated with Combat Casualty Care

    Science.gov (United States)

    2016-06-10

    ... use different terminology depending on which sister service they are from. Every service has various medical capabilities for each role of medical ... Subject terms: Medical Errors, Combat Casualty Care, Culture of Safety.

  20. An Investigation of Methods for Reducing Sampling Error in Certain IRT (Item Response Theory) Procedures.

    Science.gov (United States)

    1983-08-01

    [Abstract not available; the retrieved text contains only an OCR fragment of a table of standard errors for item parameter estimates under bell-shaped and rectangular ability distributions (n = 45 or 90 items; N = 1500 or 6000 examinees), followed by report distribution-list residue.]

  1. In-hospital fellow coverage reduces communication errors in the surgical intensive care unit.

    Science.gov (United States)

    Williams, Mallory; Alban, Rodrigo F; Hardy, James P; Oxman, David A; Garcia, Edward R; Hevelone, Nathanael; Frendl, Gyorgy; Rogers, Selwyn O

    2014-06-01

    Staff coverage strategies of intensive care units (ICUs) impact clinical outcomes. High-intensity staff coverage strategies are associated with lower morbidity and mortality. Accessible clinical expertise, team work, and effective communication have all been attributed to the success of this coverage strategy. We evaluate the impact of in-hospital fellow coverage (IHFC) on improving communication of cardiorespiratory events. A prospective observational study was performed in an academic tertiary care center with high-intensity staff coverage. The main outcome measure was resident-to-fellow communication of cardiorespiratory events during IHFC vs home coverage (HC) periods. Three hundred twelve cardiorespiratory events were collected in 114 surgical ICU patients in 134 study days. Complete data were available for 306 events. One hundred three communication errors occurred. IHFC was associated with significantly better communication of events compared to HC: 89% of events were communicated during IHFC vs 51% of events during HC. Communication patterns of junior and midlevel residents were similar. Midlevel residents communicated 68% of all on-call events (87% IHFC vs 50% HC), and junior residents communicated 66% of events (94% IHFC vs 52% HC). Communication errors were lower in all ICUs during IHFC. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Plant specification of a generic human-error data through a two-stage Bayesian approach

    International Nuclear Information System (INIS)

    Heising, C.D.; Patterson, E.I.

    1984-01-01

    Expert judgement concerning human performance in nuclear power plants is quantitatively coupled with actuarial data on such performance in order to derive plant-specific human-error rate probability distributions. The coupling procedure consists of a two-stage application of Bayes' theorem to information which is grouped by type. The first information type contains expert judgement concerning human performance at nuclear power plants in general. Data collected on human performance at a group of similar plants forms the second information type. The third information type consists of data on human performance in a specific plant which has the same characteristics as the group members. The first and second information types are coupled in the first application of Bayes' theorem to derive a probability distribution for population performance. This distribution is then combined with the third information type in a second application of Bayes' theorem to determine a plant-specific human-error rate probability distribution. The two stage Bayesian procedure thus provides a means to quantitatively couple sparse data with expert judgement in order to obtain a human performance probability distribution based upon available information. Example calculations for a group of like reactors are also given. (author)
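
    A minimal sketch of the two-stage idea, simplified to sequential conjugate (beta-binomial) updating rather than the full population-variability treatment described in the paper: stage one updates an expert-judgement prior with pooled data from the group of similar plants, and stage two updates the result with the specific plant's own record. All counts and prior parameters are invented for illustration.

        from scipy import stats

        # Stage 0: expert judgement encoded as a Beta prior on the human error rate (assumed values).
        a0, b0 = 0.5, 500.0                      # prior mean of roughly 1e-3

        # Stage 1: pooled operating experience from the group of similar plants (assumed counts).
        group_errors, group_demands = 12, 9_000
        a1, b1 = a0 + group_errors, b0 + group_demands - group_errors

        # Stage 2: plant-specific operating experience (assumed counts).
        plant_errors, plant_demands = 1, 400
        a2, b2 = a1 + plant_errors, b1 + plant_demands - plant_errors

        posterior = stats.beta(a2, b2)
        print("plant-specific mean HEP:", posterior.mean())
        print("90% credible interval  :", posterior.ppf([0.05, 0.95]))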

  3. A method to deal with installation errors of wearable accelerometers for human activity recognition

    International Nuclear Information System (INIS)

    Jiang, Ming; Wang, Zhelong; Shang, Hong; Li, Hongyi; Wang, Yuechao

    2011-01-01

    Human activity recognition (HAR) by using wearable accelerometers has gained significant interest in recent years in a range of healthcare areas, including inferring metabolic energy expenditure, predicting falls, measuring gait parameters and monitoring daily activities. The implementation of HAR relies heavily on the correctness of sensor fixation. The installation errors of wearable accelerometers may dramatically decrease the accuracy of HAR. In this paper, a method is proposed to improve the robustness of HAR to the installation errors of accelerometers. The method first calculates a transformation matrix by using Gram–Schmidt orthonormalization in order to eliminate the sensor's orientation error and then employs a low-pass filter with a cut-off frequency of 10 Hz to eliminate the main effect of the sensor's misplacement. The experimental results showed that the proposed method obtained a satisfactory performance for HAR. The average accuracy rate from ten subjects was 95.1% when there were no installation errors, and was 91.9% when installation errors were involved in wearable accelerometers
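
    A sketch of the two correction steps described (a re-orientation matrix built by Gram-Schmidt orthonormalization, followed by a 10 Hz low-pass filter) is shown below. It assumes that a gravity reading from a static period and a rough forward-direction reading are available to define the body axes; that convention and all sample values are assumptions, not details from the paper.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def orientation_matrix(gravity_sample, forward_sample):
            """Rotation from sensor frame to body frame via Gram-Schmidt orthonormalization."""
            v1 = gravity_sample / np.linalg.norm(gravity_sample)        # body vertical axis
            v2 = forward_sample - np.dot(forward_sample, v1) * v1       # remove the vertical component
            v2 /= np.linalg.norm(v2)                                    # body forward axis
            v3 = np.cross(v1, v2)                                       # body lateral axis
            return np.vstack([v1, v2, v3])                              # rows form an orthonormal basis

        def correct_signal(acc, gravity_sample, forward_sample, fs=100.0, cutoff=10.0):
            """Re-orient a 3-axis acceleration signal and low-pass filter it at 10 Hz."""
            R = orientation_matrix(np.asarray(gravity_sample, float), np.asarray(forward_sample, float))
            acc_body = np.asarray(acc, float) @ R.T                     # rotate each sample into the body frame
            b, a = butter(4, cutoff / (fs / 2.0), btype="low")          # 4th-order Butterworth low-pass
            return filtfilt(b, a, acc_body, axis=0)                     # zero-phase filtering

        # Synthetic example: 10 s of noisy accelerations sampled at 100 Hz.
        acc = np.random.default_rng(0).normal(size=(1000, 3)) + np.array([0.0, 0.0, 9.81])
        print(correct_signal(acc, gravity_sample=[0.1, 0.0, 9.8], forward_sample=[9.5, 0.2, 1.0]).shape)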

  4. Reduced penetrance in human inherited disease

    African Journals Online (AJOL)

    Rabah M. Shawky

    2014-01-31

    ...tant role in cellular senescence, tumorigenesis and in several diseases including type ... between epigenetic DNA modifications and human life span has also been shown ... penetrance mutation that is age dependent especially when compared with the ... on healthy aging and longevity. Immunity Aging ...

  5. A basic framework for the analysis of the human error potential due to the computerization in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Y. H.

    1999-01-01

    Computerization and the vivid benefits expected from it in nuclear power plant design cannot be realized without verifying the inherent safety problems, and the human error aspect is among the issues to be verified. The verification spans from the perception of changes in operational functions, such as automation, to the unfamiliar experience of operators due to the interface changes. Therefore, a new framework for human error analysis might capture both the positive and the negative effects of computerization. This paper suggests a basic framework for error identification based on a review of existing human error studies and the experience of computerization in nuclear power plants

  6. An investigation on unintended reactor trip events in terms of human error hazards of Korean nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Lee, Yong Hee; Jang, Tong Il; Oh, Yeon Ju; Shin, Kwang Hyeon

    2014-01-01

    Highlights: • A methodology to identify human error hazards has been established. • The proposed methodology is a preventive approach that identifies not only the causes of human errors but also their hazards. • Using the HFACS framework, we tried to find not the causations but all of the hazards and the relationships among them. • We determined countermeasures against human errors by dealing with latent factors such as organizational influences. - Abstract: A new approach for finding the hazards of human errors, and not just their causes, is currently required in the nuclear industry, because finding the causes of human errors is practically impossible owing to the multiplicity of causes in each case. Thus, this study aims at identifying the relationships among human error hazards and determining strategies for preventing human error events by means of a reanalysis of the reactor trip events in Korean NPPs. We investigated human errors to find latent factors, such as decisions and conditions, in all of the unintended reactor trip events during the last dozen years. In this study, we applied the HFACS (Human Factors Analysis and Classification System), which is a commonly utilized tool for investigating human contributions to aviation accidents under a widespread evaluation scheme. Using the HFACS framework, we tried to find not the causations but all of the hazards and their relationships in terms of organizational factors. Through this analysis, we propose not only meaningful frequencies of each hazard but also the correlations among them. Also, considering the correlations of the hazards, we suggest useful strategies to prevent human error events. A method to investigate unintended nuclear reactor trips caused by human errors and the results will be discussed in more detail

  7. Strategies to reduce the systematic error due to tumor and rectum motion in radiotherapy of prostate cancer

    International Nuclear Information System (INIS)

    Hoogeman, Mischa S.; Herk, Marcel van; Bois, Josien de; Lebesque, Joos V.

    2005-01-01

    Background and purpose: The goal of this work is to develop and evaluate strategies to reduce the uncertainty in the prostate position and rectum shape that arises in the preparation stage of the radiation treatment of prostate cancer. Patients and methods: Nineteen prostate cancer patients, who were treated with 3-dimensional conformal radiotherapy, received each a planning CT scan and 8-13 repeat CT scans during the treatment period. We quantified prostate motion relative to the pelvic bone by first matching the repeat CT scans on the planning CT scan using the bony anatomy. Subsequently, each contoured prostate, including seminal vesicles, was matched on the prostate in the planning CT scan to obtain the translations and rotations. The variation in prostate position was determined in terms of the systematic, random and group mean error. We tested the performance of two correction strategies to reduce the systematic error due to prostate motion. The first strategy, the pre-treatment strategy, used only the initial rectum volume in the planning CT scan to adjust the angle of the prostate with respect to the left-right (LR) axis and the shape and position of the rectum. The second strategy, the adaptive strategy, used the data of repeat CT scans to improve the estimate of the prostate position and rectum shape during the treatment. Results: The largest component of prostate motion was a rotation around the LR axis. The systematic error (1 SD) was 5.1 deg and the random error was 3.6 deg (1 SD). The average LR-axis rotation between the planning and the repeat CT scans correlated significantly with the rectum volume in the planning CT scan (r=0.86, P<0.0001). Correction of the rotational position on the basis of the planning rectum volume alone reduced the systematic error by 28%. A correction, based on the data of the planning CT scan and 4 repeat CT scans reduced the systematic error over the complete treatment period by a factor of 2. When the correction was
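
    The systematic and random error components quoted here follow from the standard population statistics for repeat measurements: the group mean is the mean of the per-patient means, the systematic error is the standard deviation of the per-patient means, and the random error is the root-mean-square of the per-patient standard deviations. A sketch with invented example data:

        import numpy as np

        def motion_statistics(rotations_per_patient):
            """rotations_per_patient: one array of repeat-CT LR-axis rotations (deg) per patient."""
            means = np.array([np.mean(r) for r in rotations_per_patient])
            sds   = np.array([np.std(r, ddof=1) for r in rotations_per_patient])
            group_mean = means.mean()                    # overall mean rotation
            systematic = means.std(ddof=1)               # Sigma: SD of the per-patient means
            random_err = np.sqrt(np.mean(sds ** 2))      # sigma: RMS of the per-patient SDs
            return group_mean, systematic, random_err

        # Invented example: 4 patients with 5 repeat scans each.
        example = [np.array([2.0, 4.5, 1.0, 3.2, 2.8]),
                   np.array([-3.1, -1.0, -4.2, -2.5, -0.8]),
                   np.array([6.0, 8.2, 4.9, 7.1, 5.5]),
                   np.array([0.5, -1.2, 1.8, 0.0, 2.2])]
        print(motion_statistics(example))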

  8. Multiple sequential failure model: A probabilistic approach to quantifying human error dependency

    International Nuclear Information System (INIS)

    Samanta

    1985-01-01

    This paper presents a probabilistic approach to quantifying human error dependency when multiple tasks are performed. Dependent human failures are dominant contributors to risks from nuclear power plants. An overview of the Multiple Sequential Failure (MSF) model developed and its use in probabilistic risk assessments (PRAs), depending on the available data, is discussed. A small-scale psychological experiment was conducted on the nature of human dependency, and the interpretation of the experimental data by the MSF model shows remarkable accommodation of the dependent failure data. The model, which provides a unique method for the quantification of dependent failures in human reliability analysis, can be used in conjunction with any of the general methods currently used for performing the human reliability aspects of PRAs
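
    The MSF model itself is not reproduced here, but the general idea of conditioning a human error probability on failures of preceding tasks can be illustrated with the well-known THERP dependence equations, which serve a similar purpose; this is a swapped-in illustration, not the MSF formulation from the paper.

        # THERP conditional HEP given failure of the immediately preceding task, by dependence level.
        def conditional_hep(basic_hep: float, dependence: str) -> float:
            formulas = {
                "zero":     lambda p: p,
                "low":      lambda p: (1 + 19 * p) / 20,
                "moderate": lambda p: (1 + 6 * p) / 7,
                "high":     lambda p: (1 + p) / 2,
                "complete": lambda p: 1.0,
            }
            return formulas[dependence](basic_hep)

        for level in ("zero", "low", "moderate", "high", "complete"):
            print(level, round(conditional_hep(1e-3, level), 4))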

  9. Development of an FAA-EUROCONTROL technique for the analysis of human error in ATM : final report.

    Science.gov (United States)

    2002-07-01

    Human error has been identified as a dominant risk factor in safety-oriented industries such as air traffic control (ATC). However, little is known about the factors leading to human errors in current air traffic management (ATM) systems. The first s...

  10. Human error considerations and annunciator effects in determining optimal test intervals for periodically inspected standby systems

    International Nuclear Information System (INIS)

    McWilliams, T.P.; Martz, H.F.

    1981-01-01

    This paper incorporates the effects of four types of human error into a model for determining the optimal time between periodic inspections that maximizes the steady-state availability of standby safety systems. Such safety systems are characteristic of nuclear power plant operations. The system is modeled by means of an infinite state-space Markov chain. The purpose of the paper is to demonstrate techniques for computing the steady-state availability A and the optimal periodic inspection interval tau* for the system. The model can be used to investigate the effects of human error probabilities on optimal availability, to study the benefits of annunciating the standby system, and to determine optimal inspection intervals. Several examples representative of nuclear power plant applications are presented.
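
    The infinite state-space Markov chain of the paper is not reproduced here. The sketch below illustrates the underlying trade-off with a much simpler, commonly used approximation of standby-system unavailability (undetected failures, test downtime and a post-test human error term) and a grid search for the optimal interval; all parameter values are assumed for illustration.

```python
import numpy as np

# Illustrative parameters (not taken from the paper):
lam = 1e-4     # standby failure rate per hour
t_test = 2.0   # hours the system is unavailable during each test
q_he = 5e-3    # probability that a human error leaves the system disabled after a test

def unavailability(tau):
    """Simple steady-state unavailability approximation for a periodically tested
    standby system: undetected random failures + test downtime + human error."""
    return lam * tau / 2.0 + t_test / tau + q_he

taus = np.linspace(100, 20000, 2000)   # candidate test intervals (hours)
u = unavailability(taus)
tau_star = taus[np.argmin(u)]

print(f"optimal interval tau* ~ {tau_star:.0f} h, availability ~ {1 - u.min():.5f}")
```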

  11. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2004-03-01

    The paper describes a technique denoted as Sub-Sample-Size Bounding (SSSB), which is usable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)

  12. Sample Size Bounding and Context Ranking as Approaches to the Human Error Quantification Problem

    International Nuclear Information System (INIS)

    Reer, B.

    2004-01-01

    The paper describes a technique denoted as Sub-Sample-Size Bounding (SSSB), which is usable for the statistical derivation of context-specific probabilities from data available in existing reports on operating experience. Applications to human reliability analysis (HRA) are emphasised in the presentation of this technique. Exemplified by a sample of 180 abnormal event sequences, the manner in which SSSB can provide viable input for the quantification of errors of commission (EOCs) is outlined. (author)

  13. Improving Papanicolaou test quality and reducing medical errors by using Toyota production system methods.

    Science.gov (United States)

    Raab, Stephen S; Andrew-Jaja, Carey; Condel, Jennifer L; Dabbs, David J

    2006-01-01

    The objective of the study was to determine whether the Toyota production system process improves Papanicolaou test quality and patient safety. An 8-month nonconcurrent cohort study that included 464 case and 639 control women who had a Papanicolaou test was performed. Office workflow was redesigned using Toyota production system methods by introducing a 1-by-1 continuous flow process. We measured the frequency of Papanicolaou tests without a transformation zone component, follow-up and Bethesda System diagnostic frequency of atypical squamous cells of undetermined significance, and diagnostic error frequency. After the intervention, the percentage of Papanicolaou tests lacking a transformation zone component decreased from 9.9% to 4.7% (P = .001). The percentage of Papanicolaou tests with a diagnosis of atypical squamous cells of undetermined significance decreased from 7.8% to 3.9% (P = .007). The frequency of error per correlating cytologic-histologic specimen pair decreased from 9.52% to 7.84%. The introduction of the Toyota production system process resulted in improved Papanicolaou test quality.

  14. Assessing the Effects of Climate Variability on Orange Yield in Florida to Reduce Production Forecast Errors

    Science.gov (United States)

    Concha Larrauri, P.

    2015-12-01

    Orange production in Florida has experienced a decline over the past decade. Hurricanes in 2004 and 2005 greatly affected production, almost to the same degree as the strong freezes that occurred in the 1980s. The spread of the citrus greening disease after the hurricanes has also contributed to a reduction in orange production in Florida. The occurrence of hurricanes and diseases cannot easily be predicted, but the additional effects of climate on orange yield can be studied and incorporated into existing production forecasts that are based on physical surveys, such as the October Citrus forecast issued every year by the USDA. Specific climate variables occurring before and after the October forecast is issued can have impacts on flowering, orange drop rates, growth, and maturation, and can contribute to the forecast error. Here we present a methodology to incorporate local climate variables to predict the error in the USDA's orange production forecast, and we study the local effects of climate on yield in different counties in Florida. This information can help farmers anticipate what to expect during the orange production cycle, and can help supply chain managers better plan their strategy.
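
    The record does not give the forecast-error model itself. The sketch below shows, on synthetic data, one plausible way to regress the October forecast error on local climate covariates; the variables, their names and the coefficients are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Hypothetical county-season records: climate covariates around the October forecast
# and the resulting forecast error (forecast minus final production); illustrative only.
rng = np.random.default_rng(1)
n = 120
X = np.column_stack([
    rng.normal(25, 2, n),     # mean temperature during flowering (deg C)
    rng.normal(150, 40, n),   # cumulative rainfall before the forecast (mm)
    rng.normal(60, 15, n),    # post-forecast rainfall (mm)
])
forecast_error = 0.4 * (X[:, 0] - 25) - 0.01 * (X[:, 2] - 60) + rng.normal(0, 0.5, n)

model = LinearRegression().fit(X, forecast_error)
cv_r2 = cross_val_score(model, X, forecast_error, cv=5, scoring="r2").mean()
print("coefficients:", np.round(model.coef_, 3), " cross-validated R^2:", round(cv_r2, 2))
```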

  15. Methods to reduce medication errors in a clinical trial of an investigational parenteral medication

    Directory of Open Access Journals (Sweden)

    Gillian L. Fell

    2016-12-01

    Full Text Available There are few evidence-based guidelines to inform optimal design of complex clinical trials, such as those assessing the safety and efficacy of intravenous drugs administered daily with infusion times over many hours per day and treatment durations that may span years. This study is a retrospective review of inpatient administration deviation reports for an investigational drug that is administered daily with infusion times of 8–24 h, and variable treatment durations for each patient. We report study design modifications made in 2007–2008 aimed at minimizing deviations from an investigational drug infusion protocol approved by an institutional review board and the United States Food and Drug Administration. Modifications were specifically aimed at minimizing errors of infusion rate, incorrect dose, incorrect patient, or wrong drug administered. We found that the rate of these types of administration errors of the study drug was significantly decreased following adoption of the specific study design changes. This report provides guidance in the design of clinical trials testing the safety and efficacy of study drugs administered via intravenous infusion in an inpatient setting so as to minimize drug administration protocol deviations and optimize patient safety.

  16. Human error as the root cause of severe accidents at nuclear reactors

    International Nuclear Information System (INIS)

    Kovács Zoltán; Rýdzi, Stanislav

    2017-01-01

    A root cause is a factor inducing an undesirable event. It is feasible for root causes to be eliminated through technological process improvements. Human error was the root cause of all severe accidents at nuclear power plants. The TMI accident was caused by a series of human errors. The Chernobyl disaster occurred after a badly performed test of the turbogenerator at a reactor with design deficiencies, and in addition, the operators ignored the safety principles and disabled the safety systems. At Fukushima the tsunami risk was underestimated and the project failed to consider the specific issues of the site. The paper describes the severe accidents and points out the human errors that caused them. Also, provisions that might have eliminated those severe accidents are suggested. The fact that each severe accident occurred on a different type of reactor is relevant – no severe accident ever occurred twice at the same reactor type. The lessons learnt from the severe accidents and the safety measures implemented on reactor units all over the world seem to be effective. (orig.)

  17. Subsecond dopamine fluctuations in human striatum encode superposed error signals about actual and counterfactual reward

    Science.gov (United States)

    Kishida, Kenneth T.; Saez, Ignacio; Lohrenz, Terry; Witcher, Mark R.; Laxton, Adrian W.; Tatter, Stephen B.; White, Jason P.; Ellis, Thomas L.; Phillips, Paul E. M.; Montague, P. Read

    2016-01-01

    In the mammalian brain, dopamine is a critical neuromodulator whose actions underlie learning, decision-making, and behavioral control. Degeneration of dopamine neurons causes Parkinson’s disease, whereas dysregulation of dopamine signaling is believed to contribute to psychiatric conditions such as schizophrenia, addiction, and depression. Experiments in animal models suggest the hypothesis that dopamine release in human striatum encodes reward prediction errors (RPEs) (the difference between actual and expected outcomes) during ongoing decision-making. Blood oxygen level-dependent (BOLD) imaging experiments in humans support the idea that RPEs are tracked in the striatum; however, BOLD measurements cannot be used to infer the action of any one specific neurotransmitter. We monitored dopamine levels with subsecond temporal resolution in humans (n = 17) with Parkinson’s disease while they executed a sequential decision-making task. Participants placed bets and experienced monetary gains or losses. Dopamine fluctuations in the striatum fail to encode RPEs, as anticipated by a large body of work in model organisms. Instead, subsecond dopamine fluctuations encode an integration of RPEs with counterfactual prediction errors, the latter defined by how much better or worse the experienced outcome could have been. How dopamine fluctuations combine the actual and counterfactual is unknown. One possibility is that this process is the normal behavior of reward processing dopamine neurons, which previously had not been tested by experiments in animal models. Alternatively, this superposition of error terms may result from an additional yet-to-be-identified subclass of dopamine neurons. PMID:26598677
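
    A minimal sketch of how actual and counterfactual prediction errors can be computed for a single betting trial is given below; the numbers are hypothetical and the equally weighted superposition is used only for illustration, not as the paper's fitted model.

```python
# Illustrative single trial of a sequential betting task (all numbers hypothetical).
expected_outcome = 2.0   # subject's expectation before the outcome (arbitrary units)
actual_outcome = 5.0     # outcome actually experienced with the chosen bet size
best_foregone = 9.0      # outcome that the best available bet would have produced

# Classical reward prediction error: actual versus expected outcome.
rpe = actual_outcome - expected_outcome

# Counterfactual prediction error: how much better (or worse) the outcome
# could have been, given the alternatives that were available.
cpe = actual_outcome - best_foregone

# The paper reports dopamine fluctuations tracking a superposition of the two terms;
# an equally weighted sum is used here purely for illustration.
combined_signal = rpe + cpe
print(f"RPE = {rpe:+.1f}, counterfactual PE = {cpe:+.1f}, combined = {combined_signal:+.1f}")
```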

  18. Does the A-not-B error in adult pet dogs indicate sensitivity to human communication?

    Science.gov (United States)

    Kis, Anna; Topál, József; Gácsi, Márta; Range, Friederike; Huber, Ludwig; Miklósi, Adám; Virányi, Zsófia

    2012-07-01

    Recent dog-infant comparisons have indicated that the experimenter's communicative signals in object hide-and-search tasks increase the probability of perseverative (A-not-B) errors in both species (Topál et al. 2009). These behaviourally similar results, however, might reflect different mechanisms in dogs and in children. Similar errors may occur if the motor response of retrieving the object during the A trials cannot be inhibited in the B trials or if the experimenter's movements and signals toward the A hiding place in the B trials ('sham-baiting') distract the dogs' attention. In order to test these hypotheses, we tested dogs similarly to Topál et al. (2009) but eliminated the motor search in the A trials and 'sham-baiting' in the B trials. We found that neither an inability to inhibit previously rewarded motor response nor insufficiencies in their working memory and/or attention skills can explain dogs' erroneous choices. Further, we replicated the finding that dogs have a strong tendency to commit the A-not-B error after ostensive-communicative hiding and demonstrated the crucial effect of socio-communicative cues as the A-not-B error diminishes when location B is ostensively enhanced. These findings further support the hypothesis that the dogs' A-not-B error may reflect a special sensitivity to human communicative cues. Such object-hiding and search tasks provide a typical case for how susceptibility to human social signals could (mis)lead domestic dogs.

  19. Slow brushing reduces heat pain in humans.

    Science.gov (United States)

    Liljencrantz, J; Strigo, I; Ellingsen, D M; Krämer, H H; Lundblad, L C; Nagi, S S; Leknes, S; Olausson, H

    2017-08-01

    C-tactile (CT) afferents are unmyelinated low-threshold mechanoreceptors optimized for signalling affective, gentle touch. In three separate psychophysical experiments, we examined the contribution of CT afferents to pain modulation. In total, 44 healthy volunteers experienced heat pain and CT optimal (slow brushing) and CT sub-optimal (fast brushing or vibration) stimuli. Three different experimental paradigms were used: Concurrent application of heat pain and tactile (slow brushing or vibration) stimulation; Slow brushing, applied for variable duration and intervals, preceding heat pain; Slow versus fast brushing preceding heat pain. Slow brushing was effective in reducing pain, whereas fast brushing or vibration was not. The reduction in pain was significant not only when the CT optimal touch was applied simultaneously with the painful stimulus but also when the two stimuli were separated in time. For subsequent stimulation, the pain reduction was more pronounced for a shorter time interval between brushing and pain. Likewise, the effect was more robust when pain was preceded by a longer duration of brush stimulation. Strong CT-related pain reduction was associated with low anxiety and high calmness scores obtained by a state anxiety questionnaire. Slow brushing - optimal for CT activation - is effective in reducing pain from cutaneous heating. The precise mechanisms for the pain relief are as yet unknown but possible mechanisms include inhibition of nociceptive projection neurons at the level of the dorsal horn as well as analgesia through cortical mechanisms. Slow brushing stimuli - optimal for activation of C-tactile fibres - can reduce pain from cutaneous heating. No such effect was seen with fast brushing or vibration. These observations indicate the role of C-tactile fibres in pain modulation. © 2017 European Pain Federation - EFIC®.

  20. System care improves trauma outcome: patient care errors dominate reduced preventable death rate.

    Science.gov (United States)

    Thoburn, E; Norris, P; Flores, R; Goode, S; Rodriguez, E; Adams, V; Campbell, S; Albrink, M; Rosemurgy, A

    1993-01-01

    A review of 452 trauma deaths in Hillsborough County, Florida, in 1984 documented that 23% of non-CNS trauma deaths were preventable and occurred because of inadequate resuscitation or delay in proper surgical care. In late 1988 Hillsborough County organized a County Trauma Agency (HCTA) to coordinate trauma care among prehospital providers and state-designated trauma centers. The purpose of this study was to review county trauma deaths after the inception of the HCTA to determine the frequency of preventable deaths. A total of 504 trauma deaths occurring between October 1989 and April 1991 were reviewed. Through committee review, 10 deaths were deemed preventable; 2 occurred outside the trauma system. Of the 10 deaths, 5 preventable deaths occurred late in severely injured patients. The preventable death rate has decreased to 7.0% with system care. The causes of preventable deaths have changed from delayed or inadequate intervention to postoperative care errors.

  1. Multidisciplinary strategy to reduce errors with the use of medical gases.

    Science.gov (United States)

    Amor-García, Miguel Ángel; Ibáñez-García, Sara; Díaz-Redondo, Alicia; Herranz Alonso, Ana; Sanjurjo Sáez, María

    2018-05-01

    Lack of awareness of the risks associated with the use of medical gases amongst health professionals and health organizations is concerning. The objective of this study is to redefine the medical gas use process in a hospital setting. A sentinel event took place in a clinical unit: the incorrect administration of a medical gas to an inpatient. A multidisciplinary root-cause analysis of the sentinel event was carried out. Different improvement points were identified for each error detected, and a strategy was defined to ensure the safe use of these drugs. Nine errors were identified and the following improvement actions were defined: storage (gases for clinical use were separated from those for industrial use and proper identification signs were placed), prescription (6 protocols were included in the hospital's Computerized Physician Order Entry software), validation (pharmacist validation of the prescription to ensure appropriate use), dispensation (a new protocol for medical gas dispensation and transportation was designed and implemented) and administration (information on the pressure gauges used for each type of gas was collected and reviewed). 72 signs with recommendations for medical gas identification and administration were placed in all the clinical units. Specific training on the safe use of medical gases and general safety training was provided. The implementation of a process that integrates all phases of the use of medical gases and applies to all professionals involved is presented here as a strategy to increase safety in the use of these medicines. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  2. Reducing human nitrogen use for food production.

    Science.gov (United States)

    Liu, Junguo; Ma, Kun; Ciais, Philippe; Polasky, Stephen

    2016-07-22

    Reactive nitrogen (N) is created in order to sustain food production, but only a small fraction of this N ends up being consumed as food, the rest being lost to the environment. We calculated that the total N input (TN) of global food production was 171 Tg N yr(-1) in 2000. The production of animal products accounted for over 50% of the TN, against 17% for global calories production. Under current TN per unit of food production and assuming no change in agricultural practices and waste-to-food ratios, we estimate that an additional TN of 100 Tg N yr(-1) will be needed by 2030 for a baseline scenario that would meet hunger alleviation targets for over 9 billion people. Increased animal production will have the largest impact on increasing TN, which calls for new food production systems with better N-recycling, such as cooperation between crop and livestock producing farms. Increased N-use efficiency, healthier diet and decreased food waste could mitigate this increase and even reduce TN in 2030 by 8% relative to the 2000 level. Achieving a worldwide reduction of TN is a major challenge that requires sustained actions to improve nitrogen management practices and reduce nitrogen losses into the environment.

  3. Statistical evaluation of major human errors during the development of new technological systems

    International Nuclear Information System (INIS)

    Campbell, G; Ott, K.O.

    1979-01-01

    Statistical procedures are presented to evaluate major human errors during the development of a new system, errors that have led or can lead to accidents or major failures. The first procedure aims at estimating the average residual occurrence rate for accidents or major failures after several have occurred. The procedure is based solely on the historical record. Certain idealizations are introduced that allow the application of a sound statistical evaluation procedure. These idealizations are realized in practice to a sufficient degree such that the proposed estimation procedure yields meaningful results, even for situations with a sparse data base represented by very few accidents. Under the assumption that the possible human-error-related failure times have exponential distributions, the statistical technique of isotonic regression is proposed to estimate the failure rates due to human design error at the failure times of the system. The last value in the sequence of estimates gives the residual accident chance. In addition, the actual situation is tested against the hypothesis that the failure rate of the system remains constant over time. This test determines the chance that a decreasing failure rate is incidental rather than an indication of an actual learning process. Both techniques can be applied not merely to a single system but to an entire series of similar systems that a technology would generate, enabling the assessment of technological improvement. For the purpose of illustration, the nuclear decay of isotopes was chosen as an example, since the assumptions of the model are rigorously satisfied in this case. This application shows satisfactory agreement of the estimated and actual failure rates (which are exactly known in this example), although the estimation was deliberately based on a sparse historical record.
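
    A minimal sketch of the isotonic-regression idea described above, using scikit-learn's implementation with a non-increasing constraint applied to naive failure-rate estimates; the failure times are hypothetical and the details differ from the paper's exact formulation.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Hypothetical failure times (e.g., years into a technology's development) at which
# human-design-error-related failures occurred; values are illustrative only.
failure_times = np.array([0.8, 1.5, 2.1, 3.4, 5.9, 9.7])

# Naive rate estimate at each failure: reciprocal of the preceding inter-failure time.
intervals = np.diff(np.concatenate(([0.0], failure_times)))
naive_rates = 1.0 / intervals

# Isotonic regression with a non-increasing constraint encodes the assumption that
# the failure rate can only decrease as design errors are found and fixed.
iso = IsotonicRegression(increasing=False)
smoothed_rates = iso.fit_transform(failure_times, naive_rates)

print("naive    :", np.round(naive_rates, 2))
print("isotonic :", np.round(smoothed_rates, 2))
print("residual rate estimate (last value):", round(smoothed_rates[-1], 2))
```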

  4. Maintenance Strategies to Reduce Downtime Due to Machine Positional Errors

    OpenAIRE

    Shagluf, Abubaker; Longstaff, Andrew P.; Fletcher, Simon

    2014-01-01

    Manufacturing strives to reduce waste and increase Overall Equipment Effectiveness (OEE). When managing machine tool maintenance a manufacturer must apply an appropriate decision technique in order to reveal hidden costs associated with production losses, reduce equipment downtime competently and similarly identify the machines' performance. Total productive maintenance (TPM) is a maintenance program that involves concepts for maintaining plant and equipment effectively. OEE is a pow...

  5. Development of a Simple Radioactive marker System to Reduce Positioning Errors in Radiation Treatment

    International Nuclear Information System (INIS)

    William H. Miller; Dr. Jatinder Palta

    2007-01-01

    The objective of this research is to implement an inexpensive, quick and simple monitor that provides an accurate indication of proper patient position during the treatment of cancer by external beam X-ray radiation and also checks for any significant changes in patient anatomy. It is believed that this system will significantly reduce the treatment margin, provide an additional, independent quality assurance check of positioning accuracy prior to all treatments and reduce the probability of misadministration of therapeutic dose

  6. Operator errors

    International Nuclear Information System (INIS)

    Knuefer; Lindauer

    1980-01-01

    At spectacular events, a combination of component failure and human error is often found. In particular, the Rasmussen Report and the German Risk Assessment Study show for pressurised water reactors that human error must not be underestimated. Although operator errors, as a form of human error, can never be eliminated entirely, they can be minimized and their effects kept within acceptable limits if thorough training of personnel is combined with an adequate design of the plant against accidents. Contrary to the investigation of engineering errors, the investigation of human errors has so far been carried out with relatively small budgets. Intensified investigations in this field appear to be a worthwhile effort. (orig.)

  7. 75 FR 33629 - Developing Guidance on Naming, Labeling, and Packaging Practices to Reduce Medication Errors...

    Science.gov (United States)

    2010-06-14

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2010-N-0168... workshop will be held at the Holiday Inn College Park, 10000 Baltimore Ave., College Park, MD 20740. FOR... intended to assist the agency in developing draft guidance for industry on describing practices for naming...

  8. ATHEANA: A Technique for Human Error Analysis: An Overview of Its Methodological Basis

    International Nuclear Information System (INIS)

    Wreathall, John; Ramey-Smith, Ann

    1998-01-01

    The U.S. NRC has developed a new human reliability analysis (HRA) method, called A Technique for Human Event Analysis (ATHEANA), to provide a way of modeling the so-called 'errors of commission' - that is, situations in which operators terminate or disable engineered safety features (ESFs) or similar equipment during accident conditions, thereby putting the plant at an increased risk of core damage. In its reviews of operational events, NRC has found that these errors of commission occur with a relatively high frequency (as high as 2 or 3 per year), but are noticeably missing from the scope of most current probabilistic risk assessments (PRAs). This new method was developed through a formalized approach that describes what can occur when operators behave rationally but have inadequate knowledge or poor judgement. In particular, the method is based on models of decision-making and response planning that have been used extensively in the aviation field, and on the analysis of major accidents in both the nuclear and non-nuclear fields. Other papers at this conference present summaries of these event analyses in both the nuclear and non-nuclear fields. This paper presents an overview of ATHEANA and summarizes how the method structures the analysis of operationally significant events, and helps HRA analysts identify and model potentially risk-significant errors of commission in plant PRAs. (authors)

  9. ATHEANA: 'A Technique for Human Error Analysis' entering the implementation phase

    International Nuclear Information System (INIS)

    Taylor, J.; O'Hara, J.; Luckas, W.

    1997-01-01

    Probabilistic Risk Assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors that are associated with inappropriate interventions by operators with operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification. The purpose of the Brookhaven National Laboratory (BNL) project, entitled 'Improved HRA Method Based on Operating Experience', is to develop a new method for HRA which is supported by the analysis of risk-significant operating experience. This approach will allow a more realistic assessment and representation of the human contribution to plant risk, and thereby increase the utility of PRA. The project's completed, ongoing, and future efforts fall into four phases: (1) Assessment phase (FY 92/93); (2) Analysis and Characterization phase (FY 93/94); (3) Development phase (FY 95/96); and (4) Implementation phase (FY 96/97 ongoing)

  10. Reducing NIR prediction errors with nonlinear methods and large populations of intact compound feedstuffs

    International Nuclear Information System (INIS)

    Fernández-Ahumada, E; Gómez, A; Vallesquino, P; Guerrero, J E; Pérez-Marín, D; Garrido-Varo, A; Fearn, T

    2008-01-01

    According to the current demands of the authorities, the manufacturers and the consumers, controls and assessments of the feed compound manufacturing process have become a key concern. Among others, it must be assured that a given compound feed is well manufactured and labelled in terms of the ingredient composition. When near-infrared spectroscopy (NIRS) together with linear models were used for the prediction of the ingredient composition, the results were not always acceptable. Therefore, the performance of nonlinear methods has been investigated. Artificial neural networks and least squares support vector machines (LS-SVM) have been applied to a large (N = 20 320) and heterogeneous population of non-milled feed compounds for the NIR prediction of the inclusion percentage of wheat and sunflower meal, as representative of two different classes of ingredients. Compared to partial least squares regression, results showed considerable reductions of standard error of prediction values for both methods and ingredients: reductions of 45% with ANN and 49% with LS-SVM for wheat and reductions of 44% with ANN and 46% with LS-SVM for sunflower meal. These improvements together with the facility of NIRS technology to be implemented in the process make it ideal for meeting the requirements of the animal feed industry
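
    A minimal sketch of the comparison described above, using synthetic data and scikit-learn's PLS and multilayer-perceptron regressors as stand-ins for the chemometric models; the standard error of prediction (SEP) is computed as the standard deviation of the validation residuals. This illustrates the evaluation procedure only and does not reproduce the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for NIR spectra (rows = samples, columns = wavelengths) and a
# nonlinearly related ingredient inclusion percentage; purely illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 100))
y = 20 + 5 * X[:, 10] + 3 * X[:, 40] ** 2 + rng.normal(scale=1.0, size=2000)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def sep(y_true, y_pred):
    """Standard error of prediction: SD of the (bias-corrected) validation residuals."""
    residuals = y_true - y_pred
    return np.std(residuals - residuals.mean(), ddof=1)

pls = PLSRegression(n_components=10).fit(X_cal, y_cal)
ann = MLPRegressor(hidden_layer_sizes=(30,), max_iter=2000,
                   random_state=0).fit(X_cal, y_cal)

print("SEP (PLS):", round(sep(y_val, pls.predict(X_val).ravel()), 2))
print("SEP (ANN):", round(sep(y_val, ann.predict(X_val)), 2))
```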

  11. Using Healthcare Failure Mode and Effect Analysis to reduce medication errors in the process of drug prescription, validation and dispensing in hospitalised patients.

    Science.gov (United States)

    Vélez-Díaz-Pallarés, Manuel; Delgado-Silveira, Eva; Carretero-Accame, María Emilia; Bermejo-Vicedo, Teresa

    2013-01-01

    To identify actions to reduce medication errors in the process of drug prescription, validation and dispensing, and to evaluate the impact of their implementation. A Health Care Failure Mode and Effect Analysis (HFMEA) was supported by a before-and-after medication error study to measure the actual impact on error rate after the implementation of corrective actions in the process of drug prescription, validation and dispensing in wards equipped with computerised physician order entry (CPOE) and unit-dose distribution system (788 beds out of 1080) in a Spanish university hospital. The error study was carried out by two observers who reviewed medication orders on a daily basis to register prescription errors by physicians and validation errors by pharmacists. Drugs dispensed in the unit-dose trolleys were reviewed for dispensing errors. Error rates were expressed as the number of errors for each process divided by the total opportunities for error in that process times 100. A reduction in prescription errors was achieved by providing training for prescribers on CPOE, updating prescription procedures, improving clinical decision support and automating the software connection to the hospital census (relative risk reduction (RRR), 22.0%; 95% CI 12.1% to 31.8%). Validation errors were reduced after optimising time spent in educating pharmacy residents on patient safety, developing standardised validation procedures and improving aspects of the software's database (RRR, 19.4%; 95% CI 2.3% to 36.5%). Two actions reduced dispensing errors: reorganising the process of filling trolleys and drawing up a protocol for drug pharmacy checking before delivery (RRR, 38.5%; 95% CI 14.1% to 62.9%). HFMEA facilitated the identification of actions aimed at reducing medication errors in a healthcare setting, as the implementation of several of these led to a reduction in errors in the process of drug prescription, validation and dispensing.
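
    The error-rate and relative-risk-reduction arithmetic used in the study can be summarised in a few lines; the counts below are hypothetical.

```python
def error_rate(errors, opportunities):
    """Errors per 100 opportunities for error, as defined in the study."""
    return 100.0 * errors / opportunities

def relative_risk_reduction(rate_before, rate_after):
    """Relative risk reduction (%) between the before and after phases."""
    return 100.0 * (rate_before - rate_after) / rate_before

# Hypothetical counts for the prescription stage (illustrative only):
before = error_rate(errors=230, opportunities=5000)   # 4.6 errors per 100 orders
after = error_rate(errors=180, opportunities=5000)    # 3.6 errors per 100 orders
print(f"RRR = {relative_risk_reduction(before, after):.1f}%")
```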

  12. Results of a nuclear power plant application of a new technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Forester, J.A.; Whitehead, D.W.; Kolaczkowski, A.M.; Thompson, C.M.

    1997-01-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing contexts (EFCs) that can make unsafe actions (UAs) more likely, and to identify ways to improve the method and its documentation. A set of criteria to evaluate the 'success' of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed that consisted of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant. Personnel from the plant included two individuals from their PRA staff and two individuals from their training staff. Both individuals from training are currently licensed operators and one of them was a senior reactor operator 'on shift' until a few months before the demonstration. The demonstration was conducted over a 5 month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project

  13. Faces in places: humans and machines make similar face detection errors.

    Directory of Open Access Journals (Sweden)

    Bernard Marius 't Hart

    Full Text Available The human visual system seems to be particularly efficient at detecting faces. This efficiency sometimes comes at the cost of wrongfully seeing faces in arbitrary patterns, including famous examples such as a rock configuration on Mars or a toast's roast patterns. In machine vision, face detection has made considerable progress and has become a standard feature of many digital cameras. The arguably most wide-spread algorithm for such applications (the "Viola-Jones" algorithm) achieves high detection rates at high computational efficiency. To what extent do the patterns that the algorithm mistakenly classifies as faces also fool humans? We selected three kinds of stimuli from real-life, first-person perspective movies based on the algorithm's output: correct detections ("real faces"), false positives ("illusory faces") and correctly rejected locations ("non faces"). Observers were shown pairs of these for 20 ms and had to direct their gaze to the location of the face. We found that illusory faces were mistaken for faces more frequently than non faces. In addition, rotation of the real face yielded more errors, while rotation of the illusory face yielded fewer errors. Using colored stimuli increases overall performance, but does not change the pattern of results. When replacing the eye movement by a manual response, however, the preference for illusory faces over non faces disappeared. Taken together, our data show that humans make similar face-detection errors as the Viola-Jones algorithm, when directing their gaze to briefly presented stimuli. In particular, the relative spatial arrangement of oriented filters seems of relevance. This suggests that efficient face detection in humans is likely to be pre-attentive and based on rather simple features as those encoded in the early visual system.
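
    Stimuli of the kind used in the study can be generated with the standard pre-trained Viola-Jones detector shipped with OpenCV, as sketched below; the input file name is a placeholder and the exact parameters used by the authors are not reported in this record.

```python
import cv2

# Load OpenCV's pre-trained Viola-Jones (Haar cascade) frontal-face detector.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

# 'frame.jpg' is a placeholder for a single first-person-perspective movie frame.
frame = cv2.imread("frame.jpg")
assert frame is not None, "replace 'frame.jpg' with a real image file"
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# detectMultiScale returns candidate face boxes; some of these are false positives
# ("illusory faces") of the kind used as stimuli in the study.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                  minSize=(30, 30))
for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("frame_faces.jpg", frame)
print(f"{len(faces)} candidate face regions detected")
```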

  14. Multidisciplinary framework for human reliability analysis with an application to errors of commission and dependencies

    International Nuclear Information System (INIS)

    Barriere, M.T.; Luckas, W.J.; Wreathall, J.; Cooper, S.E.; Bley, D.C.; Ramey-Smith, A.

    1995-08-01

    Since the early 1970s, human reliability analysis (HRA) has been considered to be an integral part of probabilistic risk assessments (PRAs). Nuclear power plant (NPP) events, from Three Mile Island through the mid-1980s, showed the importance of human performance to NPP risk. Recent events demonstrate that human performance continues to be a dominant source of risk. In light of these observations, the current limitations of existing HRA approaches become apparent when the role of humans is examined explicitly in the context of real NPP events. The development of new or improved HRA methodologies to more realistically represent human performance is recognized by the Nuclear Regulatory Commission (NRC) as a necessary means to increase the utility of PRAs. To accomplish this objective, an Improved HRA Project, sponsored by the NRC's Office of Nuclear Regulatory Research (RES), was initiated in late February 1992 at Brookhaven National Laboratory (BNL) to develop an improved method for HRA that more realistically assesses the human contribution to plant risk and can be fully integrated with PRA. This report describes the research efforts, including the development of a multidisciplinary HRA framework, the characterization and representation of errors of commission, and an approach for addressing human dependencies. The implications of the research and necessary requirements for further development are also discussed.

  15. Multidisciplinary framework for human reliability analysis with an application to errors of commission and dependencies

    Energy Technology Data Exchange (ETDEWEB)

    Barriere, M.T.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States); Cooper, S.E. [Science Applications International Corp., Reston, VA (United States); Bley, D.C. [PLG, Inc., Newport Beach, CA (United States); Ramey-Smith, A. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-08-01

    Since the early 1970s, human reliability analysis (HRA) has been considered to be an integral part of probabilistic risk assessments (PRAs). Nuclear power plant (NPP) events, from Three Mile Island through the mid-1980s, showed the importance of human performance to NPP risk. Recent events demonstrate that human performance continues to be a dominant source of risk. In light of these observations, the current limitations of existing HRA approaches become apparent when the role of humans is examined explicitly in the context of real NPP events. The development of new or improved HRA methodologies to more realistically represent human performance is recognized by the Nuclear Regulatory Commission (NRC) as a necessary means to increase the utility of PRAs. To accomplish this objective, an Improved HRA Project, sponsored by the NRC's Office of Nuclear Regulatory Research (RES), was initiated in late February 1992 at Brookhaven National Laboratory (BNL) to develop an improved method for HRA that more realistically assesses the human contribution to plant risk and can be fully integrated with PRA. This report describes the research efforts, including the development of a multidisciplinary HRA framework, the characterization and representation of errors of commission, and an approach for addressing human dependencies. The implications of the research and necessary requirements for further development are also discussed.

  16. Post-event human decision errors: operator action tree/time reliability correlation

    Energy Technology Data Exchange (ETDEWEB)

    Hall, R E; Fragola, J; Wreathall, J

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations.

  17. Post-event human decision errors: operator action tree/time reliability correlation

    International Nuclear Information System (INIS)

    Hall, R.E.; Fragola, J.; Wreathall, J.

    1982-11-01

    This report documents an interim framework for the quantification of the probability of errors of decision on the part of nuclear power plant operators after the initiation of an accident. The framework can easily be incorporated into an event tree/fault tree analysis. The method presented consists of a structure called the operator action tree and a time reliability correlation which assumes the time available for making a decision to be the dominating factor in situations requiring cognitive human response. This limited approach decreases the magnitude and complexity of the decision modeling task. Specifically, in the past, some human performance models have attempted prediction by trying to emulate sequences of human actions, or by identifying and modeling the information processing approach applicable to the task. The model developed here is directed at describing the statistical performance of a representative group of hypothetical individuals responding to generalized situations
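
    The operator action tree is paired with a time reliability correlation that maps the time available for a decision to a probability of non-response. A minimal sketch of such a correlation, using an assumed lognormal response-time distribution with illustrative parameters (not the report's actual curve), is given below.

```python
from scipy.stats import lognorm

# Illustrative time-reliability correlation: probability that the crew has NOT yet
# made the correct decision within t minutes after the cue. The lognormal form and
# the parameters below are assumptions for illustration only.
median_response_time = 5.0   # minutes
sigma = 0.8                  # log-space spread

def p_non_response(t_minutes):
    rv = lognorm(s=sigma, scale=median_response_time)
    return rv.sf(t_minutes)   # survival function = P(response time > t)

for t in (1, 5, 15, 30, 60):
    print(f"t = {t:3d} min  ->  P(non-response) = {p_non_response(t):.3g}")
```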

  18. Human dorsal striatum encodes prediction errors during observational learning of instrumental actions.

    Science.gov (United States)

    Cooper, Jeffrey C; Dunne, Simon; Furey, Teresa; O'Doherty, John P

    2012-01-01

    The dorsal striatum plays a key role in the learning and expression of instrumental reward associations that are acquired through direct experience. However, not all learning about instrumental actions require direct experience. Instead, humans and other animals are also capable of acquiring instrumental actions by observing the experiences of others. In this study, we investigated the extent to which human dorsal striatum is involved in observational as well as experiential instrumental reward learning. Human participants were scanned with fMRI while they observed a confederate over a live video performing an instrumental conditioning task to obtain liquid juice rewards. Participants also performed a similar instrumental task for their own rewards. Using a computational model-based analysis, we found reward prediction errors in the dorsal striatum not only during the experiential learning condition but also during observational learning. These results suggest a key role for the dorsal striatum in learning instrumental associations, even when those associations are acquired purely by observing others.

  19. Using human error theory to explore the supply of non-prescription medicines from community pharmacies.

    Science.gov (United States)

    Watson, M C; Bond, C M; Johnston, M; Mearns, K

    2006-08-01

    The importance of theory in underpinning interventions to promote effective professional practice is gaining recognition. The Medical Research Council framework for complex interventions has assisted in promoting awareness and adoption of theory into study design. Human error theory has previously been used by high risk industries but its relevance to healthcare settings and patient safety requires further investigation. This study used this theory as a framework to explore non-prescription medicine supply from community pharmacies. The relevance to other healthcare settings and behaviours is discussed. A 25% random sample was made of 364 observed consultations for non-prescription medicines. Each of the 91 consultations was assessed by two groups: a consensus group (stage 1) to identify common problems with the consultation process, and an expert group (stages 2 and 3) to apply human error theory to these consultations. Paired assessors (most of whom were pharmacists) categorised the perceived problems occurring in each consultation (stage 1). During stage 2 paired assessors from an expert group (comprising patient safety experts, community pharmacists and psychologists) considered whether each consultation was compliant with professional guidelines for the supply of pharmacy medicines. Each non-compliant consultation identified during stage 2 was then categorised as a slip/lapse, mistake, or violation using human error theory (stage 3). During stage 1 most consultations (n = 75, 83%) were deemed deficient in information exchange. At stage 2, paired assessors varied in attributing non-compliance to specific error types. Where agreement was achieved, the error type most often selected was "violation" (n = 27, 51.9%, stage 3). Consultations involving product requests were less likely to be guideline compliant than symptom presentations (OR 0.30, 95% CI 0.10 to 0.95, p = 0.05). The large proportion of consultations classified as violations suggests that either

  20. Development of the Human Error Management Criteria and the Job Aptitude Evaluation Criteria for Rail Safety Personnel

    Energy Technology Data Exchange (ETDEWEB)

    Koo, In Soo; Seo, Sang Mun; Park, Geun Ok (and others)

    2008-08-15

    It has been estimated that up to 90% of all workplace accidents have human error as a cause. Human error has been widely recognized as a key factor in almost all highly publicized accidents, including the Daegu subway fire of February 18, 2003, which killed 198 people and injured 147. Because most human behavior is 'unintentional', carried out automatically, the root causes of human error should be carefully investigated and regulated by a legal authority. The final goal of this study is to set up regulatory guidance to be used by the Korean rail organizations involved in safety management. The contents are: - to develop regulatory guidance for managing human error, - to develop regulatory guidance for managing the qualifications of rail drivers, - to develop regulatory guidance for evaluating the aptitude of safety-related personnel.

  1. Development of the Human Error Management Criteria and the Job Aptitude Evaluation Criteria for Rail Safety Personnel

    International Nuclear Information System (INIS)

    Koo, In Soo; Seo, Sang Mun; Park, Geun Ok

    2008-08-01

    It has been estimated that up to 90% of all workplace accidents have human error as a cause. Human error has been widely recognized as a key factor in almost all highly publicized accidents, including the Daegu subway fire of February 18, 2003, which killed 198 people and injured 147. Because most human behavior is 'unintentional', carried out automatically, the root causes of human error should be carefully investigated and regulated by a legal authority. The final goal of this study is to set up regulatory guidance to be used by the Korean rail organizations involved in safety management. The contents are: - to develop regulatory guidance for managing human error, - to develop regulatory guidance for managing the qualifications of rail drivers, - to develop regulatory guidance for evaluating the aptitude of safety-related personnel.

  2. Neural prediction errors reveal a risk-sensitive reinforcement-learning process in the human brain.

    Science.gov (United States)

    Niv, Yael; Edlund, Jeffrey A; Dayan, Peter; O'Doherty, John P

    2012-01-11

    Humans and animals are exquisitely, though idiosyncratically, sensitive to risk or variance in the outcomes of their actions. Economic, psychological, and neural aspects of this are well studied when information about risk is provided explicitly. However, we must normally learn about outcomes from experience, through trial and error. Traditional models of such reinforcement learning focus on learning about the mean reward value of cues and ignore higher order moments such as variance. We used fMRI to test whether the neural correlates of human reinforcement learning are sensitive to experienced risk. Our analysis focused on anatomically delineated regions of a priori interest in the nucleus accumbens, where blood oxygenation level-dependent (BOLD) signals have been suggested as correlating with quantities derived from reinforcement learning. We first provide unbiased evidence that the raw BOLD signal in these regions corresponds closely to a reward prediction error. We then derive from this signal the learned values of cues that predict rewards of equal mean but different variance and show that these values are indeed modulated by experienced risk. Moreover, a close neurometric-psychometric coupling exists between the fluctuations of the experience-based evaluations of risky options that we measured neurally and the fluctuations in behavioral risk aversion. This suggests that risk sensitivity is integral to human learning, illuminating economic models of choice, neuroscientific models of affective learning, and the workings of the underlying neural mechanisms.
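
    One common computational formulation of risk-sensitive reinforcement learning, consistent with the kind of model discussed above, weights positive and negative prediction errors asymmetrically so that cues of equal mean but higher variance acquire lower values. The sketch below is illustrative only; the learning rates and reward statistics are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two cues with equal mean reward but different variance (illustrative values).
cues = {"low_var": (1.0, 0.1), "high_var": (1.0, 1.0)}   # (mean, SD) of reward

alpha_pos, alpha_neg = 0.10, 0.20   # asymmetric learning rates => risk sensitivity
values = {cue: 0.0 for cue in cues}

for _ in range(5000):
    cue = rng.choice(list(cues))
    mean, sd = cues[cue]
    reward = rng.normal(mean, sd)
    delta = reward - values[cue]                    # prediction error
    alpha = alpha_pos if delta > 0 else alpha_neg   # weight losses more heavily
    values[cue] += alpha * delta

# With alpha_neg > alpha_pos, the high-variance cue ends up with the lower value,
# i.e. the learner behaves risk-aversely despite equal mean rewards.
print({k: round(v, 3) for k, v in values.items()})
```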

  3. New classification of operators' human errors at overseas nuclear power plants and preparation of easy-to-use case sheets

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2004-01-01

    At nuclear power plants, plant operators examine other human error cases, including those that occurred at other plants, so that they can learn from such experiences and avoid making similar errors again. Although there is little data available on errors made at domestic plants, nuclear operators in foreign countries are reporting even minor irregularities and signs of faults, and a large amount of data on human errors at overseas plants could be collected and examined. However, these overseas data have not been used effectively because most of them are poorly organized or not properly classified and are often hard to understand. Accordingly, we carried out a study on the cases of human errors at overseas power plants in order to help plant personnel clearly understand overseas experiences and avoid repeating similar errors, The study produced the following results, which were put to use at nuclear power plants and other facilities. (1) ''One-Point-Advice'' refers to a practice where a leader gives pieces of advice to his team of operators in order to prevent human errors before starting work. Based on this practice and those used in the aviation industry, we have developed a new method of classifying human errors that consists of four basic actions and three applied actions. (2) We used this new classification method to classify human errors made by operators at overseas nuclear power plants. The results show that the most frequent errors caused not by operators themselves but due to insufficient team monitoring, for which superiors and/or their colleagues were responsible. We therefore analyzed and classified possible factors contributing to insufficient team monitoring, and demonstrated that the frequent errors have also occurred at domestic power plants. (3) Using the new classification formula, we prepared a human error case sheets that is easy for plant personnel to understand. The sheets are designed to make data more understandable and easier to remember

  4. Incident reporting: Its role in aviation safety and the acquisition of human error data

    Science.gov (United States)

    Reynard, W. D.

    1983-01-01

    The rationale for aviation incident reporting systems is presented and contrasted to some of the shortcomings of accident investigation procedures. The history of the United State's Aviation Safety Reporting System (ASRS) is outlined and the program's character explained. The planning elements that resulted in the ASRS program's voluntary, confidential, and non-punitive design are discussed. Immunity, from enforcement action and misuse of the volunteered data, is explained and evaluated. Report generation techniques and the ASRS data analysis process are described; in addition, examples of the ASRS program's output and accomplishments are detailed. Finally, the value of incident reporting for the acquisition of safety information, particularly human error data, is explored.

  5. Intrinsic interactive reinforcement learning - Using error-related potentials for real world human-robot interaction.

    Science.gov (United States)

    Kim, Su Kyoung; Kirchner, Elsa Andrea; Stefes, Arne; Kirchner, Frank

    2017-12-14

    Reinforcement learning (RL) enables robots to learn an optimal behavioral strategy in dynamic environments based on feedback. Explicit human feedback during robot RL is advantageous, since an explicit reward function can be easily adapted. However, it is very demanding and tiresome for a human to continuously and explicitly generate feedback. Therefore, the development of implicit approaches is of high relevance. In this paper, we used an error-related potential (ErrP), an event-related activity in the human electroencephalogram (EEG), as an intrinsically generated implicit feedback (reward) for RL. Initially we validated our approach with seven subjects in a simulated robot learning scenario. ErrPs were detected online in single trials with a balanced accuracy (bACC) of 91%, which was sufficient to learn to recognize gestures and the correct mapping between human gestures and robot actions in parallel. Finally, we validated our approach in a real robot scenario, in which seven subjects freely chose gestures and the real robot correctly learned the mapping between gestures and actions (ErrP detection: 90% bACC). In this paper, we demonstrated that intrinsically generated EEG-based human feedback in RL can successfully be used to implicitly improve gesture-based robot control during human-robot interaction. We call our approach intrinsic interactive RL.
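
    Balanced accuracy, the metric reported for single-trial ErrP detection, is simply the mean of sensitivity and specificity; a minimal sketch with hypothetical confusion counts follows.

```python
def balanced_accuracy(tp, fn, tn, fp):
    """Balanced accuracy = mean of sensitivity and specificity, the metric
    reported for single-trial ErrP detection."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return 0.5 * (sensitivity + specificity)

# Hypothetical single-trial confusion counts (illustrative only):
print(f"bACC = {balanced_accuracy(tp=88, fn=12, tn=94, fp=6):.2%}")
```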

  6. A human error taxonomy for analysing healthcare incident reports: assessing reporting culture and its effects on safety perfomance

    DEFF Research Database (Denmark)

    Itoh, Kenji; Omata, N.; Andersen, Henning Boje

    2009-01-01

    The present paper reports on a human error taxonomy system developed for healthcare risk management and on its application to evaluating safety performance and reporting culture. The taxonomy comprises dimensions for classifying errors, for performance-shaping factors, and for the maturity...

  7. Multimodal system designed to reduce errors in recording and administration of drugs in anaesthesia: prospective randomised clinical evaluation.

    Science.gov (United States)

    Merry, Alan F; Webster, Craig S; Hannam, Jacqueline; Mitchell, Simon J; Henderson, Robert; Reid, Papaarangi; Edwards, Kylie-Ellen; Jardim, Anisoara; Pak, Nick; Cooper, Jeremy; Hopley, Lara; Frampton, Chris; Short, Timothy G

    2011-09-22

    To clinically evaluate a new patented multimodal system (SAFERSleep) designed to reduce errors in the recording and administration of drugs in anaesthesia. Prospective randomised open label clinical trial. Five designated operating theatres in a major tertiary referral hospital. Eighty nine consenting anaesthetists managing 1075 cases in which there were 10,764 drug administrations. Use of the new system (which includes customised drug trays and purpose designed drug trolley drawers to promote a well organised anaesthetic workspace and aseptic technique; pre-filled syringes for commonly used anaesthetic drugs; large legible colour coded drug labels; a barcode reader linked to a computer, speakers, and touch screen to provide automatic auditory and visual verification of selected drugs immediately before each administration; automatic compilation of an anaesthetic record; an on-screen and audible warning if an antibiotic has not been administered within 15 minutes of the start of anaesthesia; and certain procedural rules-notably, scanning the label before each drug administration) versus conventional practice in drug administration with a manually compiled anaesthetic record. Primary: composite of errors in the recording and administration of intravenous drugs detected by direct observation and by detailed reconciliation of the contents of used drug vials against recorded administrations; and lapses in responding to an intermittent visual stimulus (vigilance latency task). Secondary: outcomes in patients; analyses of anaesthetists' tasks and assessments of workload; evaluation of the legibility of anaesthetic records; evaluation of compliance with the procedural rules of the new system; and questionnaire based ratings of the respective systems by participants. The overall mean rate of drug errors per 100 administrations was 9.1 (95% confidence interval 6.9 to 11.4) with the new system (one in 11 administrations) and 11.6 (9.3 to 13.9) with conventional methods (one

  8. Review of advances in human reliability analysis of errors of commission-Part 2: EOC quantification

    International Nuclear Information System (INIS)

    Reer, Bernhard

    2008-01-01

    In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 2 is presented in this article. Emerging HRA methods in this field are: ATHEANA, MERMOS, the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the MDTA method and CREAM. The essential advanced features are on the conceptual side, especially to envisage the modeling of multiple contexts for an EOC to be quantified (ATHEANA, MERMOS and MDTA), in order to explicitly address adverse conditions. There is promising progress in providing systematic guidance to better account for cognitive demands and tendencies (GRS, CREAM), and EOC recovery (MDTA). Problematic issues are associated with the implementation of multiple context modeling and the assessment of context-specific error probabilities. Approaches for task or error opportunity scaling (CREAM, GRS) and the concept of reference cases (ATHEANA outlook) provide promising orientations for achieving progress towards data-based quantification. Further development work is needed and should be carried out in close connection with large-scale applications of existing approaches

  9. Physician Preferences to Communicate Neuropsychological Results: Comparison of Qualitative Descriptors and a Proposal to Reduce Communication Errors.

    Science.gov (United States)

    Schoenberg, Mike R; Osborn, Katie E; Mahone, E Mark; Feigon, Maia; Roth, Robert M; Pliskin, Neil H

    2017-11-08

    Errors in communication are a leading cause of medical errors. A potential source of error in communicating neuropsychological results is confusion in the qualitative descriptors used to describe standardized neuropsychological data. This study sought to evaluate the extent to which medical consumers of neuropsychological assessments believed that results/findings were not clearly communicated. In addition, preference data for a variety of qualitative descriptors commonly used to communicate normative neuropsychological test scores were obtained. Preference data were obtained for five qualitative descriptor systems as part of a larger 36-item internet-based survey of physician satisfaction with neuropsychological services. A new qualitative descriptor system termed the Simplified Qualitative Classification System (Q-Simple) was proposed to reduce the potential for communication errors using seven terms: very superior, superior, high average, average, low average, borderline, and abnormal/impaired. A non-random convenience sample of 605 clinicians identified from four United States academic medical centers from January 1, 2015 through January 7, 2016 were invited to participate. A total of 182 surveys were completed. A minority of clinicians (12.5%) indicated that neuropsychological study results were not clearly communicated. When communicating neuropsychological standardized scores, the two most preferred qualitative descriptor systems were that of Heaton and colleagues (Comprehensive norms for an extended Halstead-Reitan battery: Demographic corrections, research findings, and clinical applications. Odessa, TX: Psychological Assessment Resources) (26%) and the newly proposed Q-Simple system (22%). Initial findings highlight the need to improve and standardize communication of neuropsychological results. These data offer initial guidance for preferred terms to communicate test results and form a foundation for more

  10. Error identification and recovery by student nurses using human patient simulation: opportunity to improve patient safety.

    Science.gov (United States)

    Henneman, Elizabeth A; Roche, Joan P; Fisher, Donald L; Cunningham, Helene; Reilly, Cheryl A; Nathanson, Brian H; Henneman, Philip L

    2010-02-01

    This study examined types of errors that occurred or were recovered in a simulated environment by student nurses. Errors occurred in all four rule-based error categories, and all students committed at least one error. The most frequent errors occurred in the verification category. Another common error was related to physician interactions. The least common errors were related to coordinating information with the patient and family. Our finding that 100% of student subjects committed rule-based errors is cause for concern. To decrease errors and improve safe clinical practice, nurse educators must identify effective strategies that students can use to improve patient surveillance. Copyright 2010 Elsevier Inc. All rights reserved.

  11. Reducing Error Rates for Iris Image using higher Contrast in Normalization process

    Science.gov (United States)

    Aminu Ghali, Abdulrahman; Jamel, Sapiee; Abubakar Pindar, Zahraddeen; Hasssan Disina, Abdulkadir; Mat Daris, Mustafa

    2017-08-01

    Iris recognition is among the most secure and fastest means of identification and authentication. However, iris recognition systems suffer from blurring, low contrast, and poor illumination caused by low quality images, which compromise the accuracy of the system. The acceptance or rejection rate for a verified user depends solely on the quality of the image. In many cases, an iris recognition system with low image contrast could falsely accept or reject a user. This paper therefore adopts the histogram equalization technique to address the problems of False Rejection Rate (FRR) and False Acceptance Rate (FAR) by enhancing the contrast of the iris image. The histogram equalization technique enhances image quality and neutralizes the low contrast of the image at the normalization stage. The experimental results show that histogram equalization reduces FRR and FAR compared with existing techniques.
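
    As an illustration of the contrast-enhancement step described in this record, the following minimal sketch applies global histogram equalization to an 8-bit greyscale iris strip using NumPy. The array named iris is an invented placeholder, and the routine is a generic equalization sketch rather than the authors' implementation.

      import numpy as np

      def equalize_histogram(image: np.ndarray) -> np.ndarray:
          """Globally equalize an 8-bit greyscale image to spread its intensity histogram."""
          hist, _ = np.histogram(image.ravel(), bins=256, range=(0, 256))
          cdf = hist.cumsum()                      # cumulative count of pixels per grey level
          cdf_masked = np.ma.masked_equal(cdf, 0)  # ignore empty bins
          # Rescale so the darkest occupied level maps to 0 and the brightest to 255.
          scaled = (cdf_masked - cdf_masked.min()) * 255 / (cdf_masked.max() - cdf_masked.min())
          lookup = np.ma.filled(scaled, 0).astype(np.uint8)
          return lookup[image]                     # remap every pixel through the equalized CDF

      # Invented low-contrast iris strip (grey values clustered between 60 and 120).
      iris = np.random.randint(60, 120, size=(64, 512), dtype=np.uint8)
      enhanced = equalize_histogram(iris)
      print(iris.min(), iris.max(), enhanced.min(), enhanced.max())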

  12. Good people who try their best can have problems: recognition of human factors and how to minimise error.

    Science.gov (United States)

    Brennan, Peter A; Mitchell, David A; Holmes, Simon; Plint, Simon; Parry, David

    2016-01-01

    Human error is as old as humanity itself and is an appreciable cause of mistakes by both organisations and people. Much of the work related to human factors in causing error has originated from aviation, where mistakes can be catastrophic not only for those who contribute to the error, but for passengers as well. The role of human error in medical and surgical incidents, which are often multifactorial, is becoming better understood, and includes both organisational issues (by the employer) and potential human factors (at a personal level). Mistakes that result from individual human factors and those of surgical teams should be better recognised and emphasised. Attitudes towards, and acceptance of, preoperative briefing have improved since the introduction of the World Health Organization (WHO) surgical checklist. However, this does not address limitations or other safety concerns that are related to performance, such as stress and fatigue, emotional state, hunger, awareness of what is going on (situational awareness), and other factors that could potentially lead to error. Here we attempt to raise awareness of these human factors, highlight how they can lead to error, and show how they can be minimised in our day-to-day practice. Can hospitals move from being "high risk industries" to "high reliability organisations"? Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. Trend analysis and comparison of operators' human error events occurred at overseas and domestic nuclear power plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2006-01-01

    Human errors by operators at overseas and domestic nuclear power plants during the period from 2002 to 2005 were compared and their trends analyzed. The most frequently cited cause of such errors was 'insufficient team monitoring' (inadequate superiors' and other crews' instructions and supervision) both at overseas and domestic plants, followed by 'insufficient self-checking' (lack of caution by the operator himself). A comparison of the effects of the errors on the operation of plants in Japan and the United States showed that drops in plant output and plant shutdowns at Japanese plants were approximately one-tenth of those in the United States. The ratio of automatic reactor trips to the total number of human errors reported is about 6% for both Japanese and American plants. Looking at changes in the incidence of human errors by year of occurrence, although a distinctive trend cannot be identified for domestic nuclear power plants due to insufficient reported cases, 'inadequate self-checking' as a factor contributing to human errors at overseas nuclear power plants has decreased significantly over the past four years. Regarding changes in the effects of human errors on the operation of plants during the four-year period, events leading to an automatic reactor trip have tended to increase at American plants. Conceivable factors behind this increasing tendency include lack of operating experience by a team (e.g., plant transients and reactor shutdowns and startups) and excessive dependence on training simulators. (author)

  14. APPLICATION OF SIX SIGMA METHODOLOGY TO REDUCE MEDICATION ERRORS IN THE OUTPATIENT PHARMACY UNIT: A CASE STUDY FROM THE KING FAHD UNIVERSITY HOSPITAL, SAUDI ARABIA

    Directory of Open Access Journals (Sweden)

    Ahmed Al Kuwaiti

    2016-06-01

    Medication errors affect patient safety and the quality of healthcare. The aim of this study is to analyze the effect of the Six Sigma (DMAIC) methodology in reducing medication errors in the outpatient pharmacy of King Fahd Hospital of the University, Saudi Arabia. It was conducted through the five phases of the Define, Measure, Analyze, Improve, Control (DMAIC) model using various quality tools. The goal was set to reduce medication errors in the outpatient pharmacy by 20%. After implementation of the improvement strategies, there was a marked reduction of defects and an improvement in their sigma ratings. In particular, prescription/data entry errors fell from 56,000 to 5,000 parts per million (PPM) and their sigma rating improved from 3.09 to 4.08. This study concluded that the Six Sigma (DMAIC) methodology is effective in reducing medication errors and ensuring patient safety.
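
    The sigma ratings quoted in this record are consistent with the conventional conversion from a defect rate in parts per million to a long-term sigma level with the customary 1.5 sigma shift. The sketch below reproduces them under that assumption; it illustrates the conversion and is not code from the study.

      from scipy.stats import norm

      def sigma_level(ppm_defects: float, shift: float = 1.5) -> float:
          """Convert a defect rate in parts per million to a process sigma level.

          Yield is the fraction of defect-free opportunities; the 1.5 sigma shift is
          the usual long-term correction quoted alongside DMAIC results.
          """
          process_yield = 1.0 - ppm_defects / 1_000_000
          return norm.ppf(process_yield) + shift

      print(round(sigma_level(56_000), 2))  # ~3.09, the baseline rating reported above
      print(round(sigma_level(5_000), 2))   # ~4.08, the post-improvement rating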

  15. Web-Based Information Management System for the Investigation, Reporting, and Analysis of Human Error in Naval Aviation Maintenance

    National Research Council Canada - National Science Library

    Boex, Anthony

    2001-01-01

    .... The Human Factors Analysis and Classification System-Maintenance Extension (HFACS-ME) taxonomy, a framework for classifying and analyzing the presence of maintenance errors that lead to mishaps, is the foundation of this tool...

  16. Applying the Toyota Production System: using a patient safety alert system to reduce error.

    Science.gov (United States)

    Furman, Cathie; Caplan, Robert

    2007-07-01

    In 2002, Virginia Mason Medical Center (VMMC) adapted the Toyota Production System, also known as lean manufacturing. To translate the techniques of zero defects and stopping the line into health care, the Patient Safety Alert (PSA) system requires any employee who encounters a situation that is likely to harm a patient to make an immediate report and to cease any activity that could cause further harm (stopping the line). IMPLEMENTING THE PSA SYSTEM--STOPPING THE LINE: If any VMMC employee's practice or conduct is deemed capable of causing harm to a patient, a PSA can cause that person to be stopped from working until the problem is resolved. A policy statement, senior executive commitment, dedicated resources, a 24-hour hotline, and communication were all key features of implementation. As of December 2006, 6,112 PSA reports were received: 20% from managers, 8% from physicians, 44% from nurses, and 23% from nonclinical support personnel, for example. The number of reports received per month increased from an average of 3 in 2002 to 285 in 2006. Most reports were processed within 24 hours and were resolved within 2 to 3 weeks. Implementing the PSA system has drastically increased the number of safety concerns that are resolved at VMMC, while drastically reducing the time it takes to resolve them. Transparent discussion and feedback have helped promote staff acceptance and participation.

  17. Classifying and quantifying human error in routine tasks in nuclear power plants

    International Nuclear Information System (INIS)

    Pederson, O.M.; Rasmussen, J.; Carnino, A.; Gagnolet, P.; Griffon, M.; Mancini, G.

    1982-01-01

    This paper results from the work of the OECD/NEA-CSNI Group of Experts on Human Error Data and Assessment. It proposes a classification system (or taxonomy) for use in reporting events involving human malfunction, especially those occurring during the execution of routine tasks. A set of data collection sheets based on this taxonomy has been designed. They include the information needed in order to ensure adequate quality and coherence of the raw data. The sources from which the various data should be obtainable are identified, as are the persons who should analyze them. Improving data collection systems is an iterative process. Therefore Group members are currently making trial applications of the taxonomy to previously analysed real incidents. Results from the initial round of trials are presented and discussed

  18. A classification scheme of erroneous behaviors for human error probability estimations based on simulator data

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea

    2017-01-01

    Because it has been indicated that empirical data supporting the estimates used in human reliability analysis (HRA) is insufficient, several databases have been constructed recently. To generate quantitative estimates from human reliability data, it is important to appropriately sort the erroneous behaviors found in the reliability data. Therefore, this paper proposes a scheme to classify the erroneous behaviors identified by the HuREX (Human Reliability data Extraction) framework through a review of the relevant literature. A case study of the human error probability (HEP) calculations is conducted to verify that the proposed scheme can be successfully implemented for the categorization of the erroneous behaviors and to assess whether the scheme is useful for the HEP quantification purposes. Although continuously accumulating and analyzing simulator data is desirable to secure more reliable HEPs, the resulting HEPs were insightful in several important ways with regard to human reliability in off-normal conditions. From the findings of the literature review and the case study, the potential and limitations of the proposed method are discussed. - Highlights: • A taxonomy of erroneous behaviors is proposed to estimate HEPs from a database. • The cognitive models, procedures, HRA methods, and HRA databases were reviewed. • HEPs for several types of erroneous behaviors are calculated as a case study.
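
    As a rough illustration of how a human error probability might be derived from counts of erroneous behaviours in a simulator database of this kind, the sketch below uses a Bayesian point estimate with a Jeffreys prior; the counts are invented, and the HuREX framework itself may use a different estimator.

      from scipy.stats import beta

      def hep_estimate(errors: int, opportunities: int, ci: float = 0.90):
          """Point estimate and credible interval for a human error probability.

          A Jeffreys Beta(0.5, 0.5) prior is a common choice when the observed
          error count is small relative to the number of task opportunities.
          """
          a, b = errors + 0.5, opportunities - errors + 0.5
          point = a / (a + b)
          lower, upper = beta.ppf([(1 - ci) / 2, (1 + ci) / 2], a, b)
          return point, (lower, upper)

      # Invented example: 3 wrong-object selections observed in 420 task opportunities.
      print(hep_estimate(3, 420))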

  19. Inborn errors of human STAT1: allelic heterogeneity governs the diversity of immunological and infectious phenotypes

    Science.gov (United States)

    Boisson-Dupuis, Stephanie; Kong, Xiao-Fei; Okada, Satoshi; Cypowyj, Sophie; Puel, Anne; Abel, Laurent; Casanova, Jean-Laurent

    2012-01-01

    The genetic dissection of various human infectious diseases has led to the definition of inborn errors of human STAT1 immunity of four types, including (i) autosomal recessive (AR) complete STAT1 deficiency, (ii) AR partial STAT1 deficiency, (iii) autosomal dominant (AD) STAT1 deficiency, and (iv) AD gain of STAT1 activity. The two types of AR STAT1 defect give rise to a broad infectious phenotype with susceptibility to intramacrophagic bacteria (mostly mycobacteria) and viruses (herpes viruses at least), due principally to the impairment of IFN-γ-mediated and IFN-α/β-mediated immunity, respectively. Clinical outcome depends on the extent to which the STAT1 defect decreases responsiveness to these cytokines. AD STAT1 deficiency selectively predisposes individuals to mycobacterial disease, owing to the impairment of IFN-γ-mediated immunity, as IFN-α/β-mediated immunity is maintained. Finally, AD gain of STAT1 activity is associated with autoimmunity, probably owing to an enhancement of IFN-α/β-mediated immunity. More surprisingly, it is also associated with chronic mucocutaneous candidiasis, through as yet undetermined mechanisms involving an inhibition of the development of IL-17-producing T cells. Thus, germline mutations in human STAT1 define four distinct clinical disorders. Various combinations of viral, mycobacterial and fungal infections are therefore allelic at the human STAT1 locus. These experiments of Nature neatly highlight the clinical and immunological impact of the human genetic dissection of infectious phenotypes. PMID:22651901

  20. A Technological Innovation to Reduce Prescribing Errors Based on Implementation Intentions: The Acceptability and Feasibility of MyPrescribe.

    Science.gov (United States)

    Keyworth, Chris; Hart, Jo; Thoong, Hong; Ferguson, Jane; Tully, Mary

    2017-08-01

    Although prescribing of medication in hospitals is rarely an error-free process, prescribers receive little feedback on their mistakes and ways to change future practices. Audit and feedback interventions may be an effective approach to modifying the clinical practice of health professionals, but these may pose logistical challenges when used in hospitals. Moreover, such interventions are often labor intensive. Consequently, there is a need to develop effective and innovative interventions to overcome these challenges and to improve the delivery of feedback on prescribing. Implementation intentions, which have been shown to be effective in changing behavior, link critical situations with an appropriate response; however, these have rarely been used in the context of improving prescribing practices. Semistructured qualitative interviews were conducted to evaluate the acceptability and feasibility of providing feedback on prescribing errors via MyPrescribe, a mobile-compatible website informed by implementation intentions. Data relating to 200 prescribing errors made by 52 junior doctors were collected by 11 hospital pharmacists. These errors were populated into MyPrescribe, where prescribers were able to construct their own personalized action plans. Qualitative interviews with a subsample of 15 junior doctors were used to explore issues regarding feasibility and acceptability of MyPrescribe and their experiences of using implementation intentions to construct prescribing action plans. Framework analysis was used to identify prominent themes, with findings mapped to the behavioral components of the COM-B model (capability, opportunity, motivation, and behavior) to inform the development of future interventions. MyPrescribe was perceived to be effective in providing opportunities for critical reflection on prescribing errors and to complement existing training (such as junior doctors' e-portfolio). The participants were able to provide examples of how they would use

  1. Sleep quality, posttraumatic stress, depression, and human errors in train drivers: a population-based nationwide study in South Korea.

    Science.gov (United States)

    Jeon, Hong Jin; Kim, Ji-Hae; Kim, Bin-Na; Park, Seung Jin; Fava, Maurizio; Mischoulon, David; Kang, Eun-Ho; Roh, Sungwon; Lee, Dongsoo

    2014-12-01

    Human error is defined as an unintended error that is attributable to humans rather than machines, and that is important to avoid to prevent accidents. We aimed to investigate the association between sleep quality and human errors among train drivers. Cross-sectional. Population-based. A sample of 5,480 subjects who were actively working as train drivers were recruited in South Korea. The participants were 4,634 drivers who completed all questionnaires (response rate 84.6%). None. The Pittsburgh Sleep Quality Index (PSQI), the Center for Epidemiologic Studies Depression Scale (CES-D), the Impact of Event Scale-Revised (IES-R), the State-Trait Anxiety Inventory (STAI), and the Korean Occupational Stress Scale (KOSS). Of 4,634 train drivers, 349 (7.5%) showed more than one human error per 5 y. Human errors were associated with poor sleep quality, higher PSQI total scores, short sleep duration at night, and longer sleep latency. Among train drivers with poor sleep quality, those who experienced severe posttraumatic stress showed a significantly higher number of human errors than those without. Multiple logistic regression analysis showed that human errors were significantly associated with poor sleep quality and posttraumatic stress, whereas there were no significant associations with depression, trait and state anxiety, and work stress after adjusting for age, sex, education years, marital status, and career duration. Poor sleep quality was found to be associated with more human errors in train drivers, especially in those who experienced severe posttraumatic stress. © 2014 Associated Professional Sleep Societies, LLC.
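
    The adjusted associations reported in this record come from multiple logistic regression; the sketch below shows a generic version of that kind of model on simulated placeholder data (poor sleep, posttraumatic stress, and age as predictors of committing a human error). It is illustrative only and uses none of the study's actual variables or coefficients.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 1000
      poor_sleep = rng.integers(0, 2, n)   # placeholder: PSQI above the poor-sleep cut-off
      ptsd = rng.integers(0, 2, n)         # placeholder: IES-R above the severe-stress cut-off
      age = rng.normal(40, 8, n)
      logit = -3.0 + 0.8 * poor_sleep + 0.7 * ptsd + 0.01 * (age - 40)
      error = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # simulated human-error outcome

      X = sm.add_constant(np.column_stack([poor_sleep, ptsd, age]))
      fit = sm.Logit(error, X).fit(disp=False)
      print(np.exp(fit.params[1:3]))       # adjusted odds ratios for poor sleep and PTSD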

  2. Review of advances in human reliability analysis of errors of commission, Part 1: EOC identification

    International Nuclear Information System (INIS)

    Reer, Bernhard

    2008-01-01

    In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 1 is presented in this article. Emerging HRA methods addressing the problem of EOC identification are: A Technique for Human Event Analysis (ATHEANA), the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the Misdiagnosis Tree Analysis (MDTA) method, and the Commission Errors Search and Assessment (CESA) method. Most of the EOCs referred to in predictive studies comprise the stop of running or the inhibition of anticipated functions; a few comprise the start of a function. The CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios and uses procedures and importance measures as key sources of input information, provides a formalized way of identifying relatively important scenarios with EOC opportunities. In the implementation, however, attention should be paid to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions

  3. [A project to reduce the incidence of intubation care errors among foreign health aides].

    Science.gov (United States)

    Chen, Mei-Ju; Lu, Yu-Hua; Chen, Chiu-Chun; Li, Ai-Cheng

    2014-08-01

    Foreign health aides are the main providers of care for the elderly and the physically disabled in Taiwan. Correct care skills improve patient safety. In 2010, the incidence of mistakes among foreign health aides in our hospital unit was 58% for nasogastric tube care and 57% for tracheostomy tube care. A survey of foreign health aides and nurses in the unit identified the main causes of these mistakes as: communication difficulties, inaccurate instructions given to patients, and a lack of standard operating procedures given to the foreign health aides. This project was designed to reduce the rates of improper nasogastric tube care and improper tracheostomy tube care to 20% each. The project implemented several appropriate measures. We produced patient instruction hand-outs in Bahasa Indonesia, established a dedicated file holder for Bahasa Indonesia tube care reference information, produced Bahasa Indonesia tube-care-related posters, produced a short film about tube care in Bahasa Indonesia, and established a standardized operating procedure for tube care in our unit. Between December 15th and 31st, 2011, we audited the performance of a total of 32 foreign health aides: 21 for proper execution of nasogastric tube care and 11 for proper execution of tracheostomy tube care. Patients with concurrent nasogastric and tracheostomy tubes were inspected separately for each care group. The incidence of improper care decreased from 58% to 18% for nasogastric tube care and from 57% to 18% for tracheostomy tube care. This project significantly decreased the incidence of improper tube care by the foreign health aides in our unit. Furthermore, the foreign health aides improved their tube nursing care skills. Therefore, this project improved the quality of patient care.

  4. How to Cope with the Rare Human Error Events Involved with Organizational Factors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Luo, Meiling; Lee, Yong Hee [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    The current human error guidelines (e.g. US DOD handbooks, US NRC guidelines) are representative tools for preventing human errors. These tools, however, have limits in that they do not cover all operating situations and circumstances, such as design basis events. In other words, these tools only address foreseeable, standardized operating situations and circumstances. In this study, our research team proposed an evidence-based approach, such as the UK's safety case, to cope with rare human error events such as the TMI, Chernobyl, and Fukushima accidents. These accidents are representative events involving rare human errors. Our research team defined 'rare human errors' as events with the following three characteristics: extremely low frequency; an extremely complicated structure; and extremely serious damage to human life and property. A safety case is a structured argument, supported by evidence, intended to justify that a system is acceptably safe. UK Defence Standard 00-56 Issue 4 states that such an evidence-based approach can be contrasted with a prescriptive approach to safety certification, which requires safety to be justified using a prescribed process. Safety management and safety regulatory activities based on the safety case are effective for controlling organizational factors in terms of integrated safety management. In particular, for safety issues relevant to public acceptance, the safety case is useful for providing practical evidence to the public in a reasonable way. The European Union, including the UK, has developed the concept of an engineered safety management system to deal with public acceptance using the safety case. In the Korean nuclear industry, the Korea Atomic Energy Research Institute (KAERI) first performed basic research to adapt the safety case in the field of radioactive waste according to IAEA SSG-23 (KAERI/TR-4497, 4531). Apart from radioactive waste, there has been no attempt to adapt the safety case yet. Most incidents and accidents involving humans during the operation of NPPs have a tendency

  5. How to Cope with the Rare Human Error Events Involved with Organizational Factors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Luo, Meiling; Lee, Yong Hee

    2014-01-01

    The current human error guidelines (e.g. US DOD handbooks, US NRC guidelines) are representative tools for preventing human errors. These tools, however, have limits in that they do not cover all operating situations and circumstances, such as design basis events. In other words, these tools only address foreseeable, standardized operating situations and circumstances. In this study, our research team proposed an evidence-based approach, such as the UK's safety case, to cope with rare human error events such as the TMI, Chernobyl, and Fukushima accidents. These accidents are representative events involving rare human errors. Our research team defined 'rare human errors' as events with the following three characteristics: extremely low frequency; an extremely complicated structure; and extremely serious damage to human life and property. A safety case is a structured argument, supported by evidence, intended to justify that a system is acceptably safe. UK Defence Standard 00-56 Issue 4 states that such an evidence-based approach can be contrasted with a prescriptive approach to safety certification, which requires safety to be justified using a prescribed process. Safety management and safety regulatory activities based on the safety case are effective for controlling organizational factors in terms of integrated safety management. In particular, for safety issues relevant to public acceptance, the safety case is useful for providing practical evidence to the public in a reasonable way. The European Union, including the UK, has developed the concept of an engineered safety management system to deal with public acceptance using the safety case. In the Korean nuclear industry, the Korea Atomic Energy Research Institute (KAERI) first performed basic research to adapt the safety case in the field of radioactive waste according to IAEA SSG-23 (KAERI/TR-4497, 4531). Apart from radioactive waste, there has been no attempt to adapt the safety case yet. Most incidents and accidents involving humans during the operation of NPPs have a tendency

  6. Human errors and violations in computer and information security: the viewpoint of network administrators and security specialists.

    Science.gov (United States)

    Kraemer, Sara; Carayon, Pascale

    2007-03-01

    This paper describes human errors and violations of end users and network administration in computer and information security. This information is summarized in a conceptual framework for examining the human and organizational factors contributing to computer and information security. This framework includes human error taxonomies to describe the work conditions that contribute adversely to computer and information security, i.e. to security vulnerabilities and breaches. The issue of human error and violation in computer and information security was explored through a series of 16 interviews with network administrators and security specialists. The interviews were audio taped, transcribed, and analyzed by coding specific themes in a node structure. The result is an expanded framework that classifies types of human error and identifies specific human and organizational factors that contribute to computer and information security. Network administrators tended to view errors created by end users as more intentional than unintentional, while errors created by network administrators as more unintentional than intentional. Organizational factors, such as communication, security culture, policy, and organizational structure, were the most frequently cited factors associated with computer and information security.

  7. Human reliability analysis of errors of commission: a review of methods and applications

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2007-06-15

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addressed both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). As a result of a thorough investigation of various PSA/HRA applications, this paper furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because this scheme provides a formalized way of identifying relatively important scenarios with EOC opportunities; (2) EOC identification guidance like CESA, which is strongly based on procedural guidance and importance measures of systems or components affected by inappropriate actions, should nevertheless pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; (3) orientations of advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  8. Human reliability analysis of errors of commission: a review of methods and applications

    International Nuclear Information System (INIS)

    Reer, B.

    2007-06-01

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addressed both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). As a result of a thorough investigation of various PSA/HRA applications, this paper furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because this scheme provides a formalized way of identifying relatively important scenarios with EOC opportunities; (2) EOC identification guidance like CESA, which is strongly based on procedural guidance and importance measures of systems or components affected by inappropriate actions, should nevertheless pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; (3) orientations of advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  9. Joint Estimation of Contamination, Error and Demography for Nuclear DNA from Ancient Humans

    Science.gov (United States)

    Slatkin, Montgomery

    2016-01-01

    When sequencing an ancient DNA sample from a hominin fossil, DNA from present-day humans involved in excavation and extraction will be sequenced along with the endogenous material. This type of contamination is problematic for downstream analyses as it will introduce a bias towards the population of the contaminating individual(s). Quantifying the extent of contamination is a crucial step as it allows researchers to account for possible biases that may arise in downstream genetic analyses. Here, we present an MCMC algorithm to co-estimate the contamination rate, sequencing error rate and demographic parameters—including drift times and admixture rates—for an ancient nuclear genome obtained from human remains, when the putative contaminating DNA comes from present-day humans. We assume we have a large panel representing the putative contaminant population (e.g. European, East Asian or African). The method is implemented in a C++ program called ‘Demographic Inference with Contamination and Error’ (DICE). We applied it to simulations and genome data from ancient Neanderthals and modern humans. With reasonable levels of genome sequence coverage (>3X), we find we can recover accurate estimates of all these parameters, even when the contamination rate is as high as 50%. PMID:27049965

  10. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L. [Pacific Science and Engineering Group, San Diego, CA (United States)] [and others]

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error.

  11. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    International Nuclear Information System (INIS)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L.

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error.

  12. An empirical study on the basic human error probabilities for NPP advanced main control room operation using soft control

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Harbi, Mohamed Ali Salem Al; Lee, Seung Jun; Kang, Hyun Gook; Seong, Poong Hyun

    2013-01-01

    Highlights: ► The operation environment of MCRs in NPPs has changed by adopting new HSIs. ► The operation action in NPP Advanced MCRs is performed by soft control. ► Different basic human error probabilities (BHEPs) should be considered. ► BHEPs in a soft control operation environment are investigated empirically. ► This work will be helpful to verify if soft control has positive or negative effects. -- Abstract: By adopting new human–system interfaces that are based on computer-based technologies, the operation environment of main control rooms (MCRs) in nuclear power plants (NPPs) has changed. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, soft controls, and so on, are called Advanced MCRs. Among the many features in Advanced MCRs, soft controls are an important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, touch screens, and so on, operators can select a specific screen, then choose the controller, and finally manipulate the devices. However, because of the different interfaces between soft control and hardwired conventional type control, different basic human error probabilities (BHEPs) should be considered in the Human Reliability Analysis (HRA) for advanced MCRs. Although there are many HRA methods to assess human reliabilities, such as Technique for Human Error Rate Prediction (THERP), Accident Sequence Evaluation Program (ASEP), Human Error Assessment and Reduction Technique (HEART), Human Event Repository and Analysis (HERA), Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR), Cognitive Reliability and Error Analysis Method (CREAM), and so on, these methods have been applied to conventional MCRs, and they do not consider the new features of advanced MCRs such as soft controls. As a result, there is an insufficient database for assessing human reliabilities in advanced

  13. Reducing the occurrence of plant events through improved human performance

    International Nuclear Information System (INIS)

    Ross, T.; Burkhart, A.D.

    1993-01-01

    During a routine control room surveillance, the reactor operator is distracted by an alarming secondary annunciator and a telephone call. When the reactor operator resumes the surveillance, he inadvertently performs the procedural steps out of order. This causes a reportable nuclear event. How can procedure-related human performance problems such as this be prevented? The question is vitally important for the nuclear industry. The U.S. Nuclear Regulatory Commission's Office for Analysis and Evaluation of Operational Data observed, "With the perceived reduction in the number of events caused by equipment failures, INPO and other industry groups and human performance experts agree that a key to continued improvement in plant performance and safety is improved human performance." In fact, "more than 50% of the reportable events occurring at nuclear power plants involve human error." Prevention (or correction) of a human performance problem is normally based on properly balancing the following three factors: (1) supervisory involvement; (2) personnel training; and (3) procedures. The nuclear industry is implementing a formula known as ACME, which better balances supervisory involvement, personnel training, and procedures. Webster's New World Dictionary defines acme as the highest point, the peak. ACME human performance is the goal: ACME: Adherence to and use of procedures; Self-Checking; Management Involvement; and Event Investigations

  14. Reducing Monte Carlo error in the Bayesian estimation of risk ratios using log-binomial regression models.

    Science.gov (United States)

    Salmerón, Diego; Cano, Juan A; Chirlaque, María D

    2015-08-30

    In cohort studies, binary outcomes are very often analyzed by logistic regression. However, it is well known that when the goal is to estimate a risk ratio, the logistic regression is inappropriate if the outcome is common. In these cases, a log-binomial regression model is preferable. On the other hand, the estimation of the regression coefficients of the log-binomial model is difficult owing to the constraints that must be imposed on these coefficients. Bayesian methods allow a straightforward approach for log-binomial regression models and produce smaller mean squared errors in the estimation of risk ratios than the frequentist methods, and the posterior inferences can be obtained using the software WinBUGS. However, Markov chain Monte Carlo methods implemented in WinBUGS can lead to large Monte Carlo errors in the approximations to the posterior inferences because they produce correlated simulations, and the accuracy of the approximations are inversely related to this correlation. To reduce correlation and to improve accuracy, we propose a reparameterization based on a Poisson model and a sampling algorithm coded in R. Copyright © 2015 John Wiley & Sons, Ltd.
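
    The point that correlated MCMC draws inflate Monte Carlo error can be made concrete with a batch-means estimator of the Monte Carlo standard error of a posterior mean: correlated chains show larger batch-to-batch variation than independent draws of the same length. The sketch below is a generic illustration and is not the authors' WinBUGS or R code.

      import numpy as np

      def mc_standard_error(chain: np.ndarray, n_batches: int = 20) -> float:
          """Batch-means estimate of the Monte Carlo standard error of a chain's mean.

          Correlated draws give larger batch-to-batch variation than independent
          draws, so the estimate grows with autocorrelation.
          """
          batch_size = len(chain) // n_batches
          trimmed = chain[: batch_size * n_batches]
          batch_means = trimmed.reshape(n_batches, batch_size).mean(axis=1)
          return batch_means.std(ddof=1) / np.sqrt(n_batches)

      # Toy comparison: an AR(1) chain with high autocorrelation versus independent draws.
      rng = np.random.default_rng(0)
      iid = rng.normal(size=10_000)
      ar1 = np.empty(10_000)
      ar1[0] = 0.0
      for t in range(1, 10_000):
          ar1[t] = 0.95 * ar1[t - 1] + rng.normal() * np.sqrt(1 - 0.95**2)
      print(mc_standard_error(iid), mc_standard_error(ar1))  # the correlated chain is larger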

  15. Human error analysis project (HEAP) - The fourth pilot study: verbal data for analysis of operator performance

    International Nuclear Information System (INIS)

    Braarud, Per Oeyvind; Droeyvoldsmo, Asgeir; Hollnagel, Erik

    1997-06-01

    This report is the second report from Pilot Study No. 4 within the Human Error Analysis Project (HEAP). The overall objective of HEAP is to provide a better understanding and explicit modelling of how and why 'cognitive errors' occur. This study investigated the contribution of different verbal data sources to the analysis of control room operators' performance. Operators' concurrent verbal reports, retrospective verbal reports, and process experts' comments were compared for their contribution to an operator performance measure. The study looked into verbal protocols for a single operator and for a team. The main findings were that all three verbal data sources could be used to study performance. There was a relatively high overlap between the data sources, but also a unique contribution from each source. There was a common pattern in the types of operator activities the data sources gave information about. The operators' concurrent protocols overall contained slightly more information on the operators' activities than the other two verbal sources. The study also showed that a concurrent verbal protocol is feasible and useful for the analysis of a team's activities during a scenario. (author)

  16. Evaluation of a breath-motion-correction technique in reducing measurement error in hepatic CT perfusion imaging

    International Nuclear Information System (INIS)

    He Wei; Liu Jianyu; Li Xuan; Li Jianying; Liao Jingmin

    2009-01-01

    Objective: To evaluate the effect of a breath-motion-correction (BMC) technique in reducing measurement error in the time-density curve (TDC) in hepatic CT perfusion imaging. Methods: Twenty-five patients with suspected liver diseases underwent hepatic CT perfusion scans. The right branch of the portal vein was selected as the anatomy of interest, and BMC was performed to realign image slices for the TDC according to the rule of minimizing the temporal changes of overall structures. Ten ROIs were selected on the right branch of the portal vein to generate 10 TDCs each with and without BMC. The values of peak enhancement and the time-to-peak enhancement for each TDC were measured. The coefficients of variation (CV) of peak enhancement and time-to-peak enhancement were calculated for each patient with and without BMC. The Wilcoxon signed ranks test was used to evaluate the difference between the CVs of the two parameters obtained with and without BMC. The independent-samples t test was used to evaluate the difference between the values of peak enhancement obtained with and without BMC. Results: The median (quartiles) of the CV of peak enhancement with BMC [2.84% (2.10%, 4.57%)] was significantly lower than that without BMC [5.19% (3.90%, 7.27%)] (Z=-3.108, P<0.01). The median (quartiles) of the CV of time-to-peak enhancement with BMC [2.64% (0.76%, 4.41%)] was significantly lower than that without BMC [5.23% (3.81%, 7.43%)] (Z=-3.924, P<0.01). In 8 cases, the TDC demonstrated significantly higher peak enhancement with BMC (P<0.05). Conclusion: By applying the BMC technique we can effectively reduce measurement error for parameters of the TDC in hepatic CT perfusion imaging. (authors)
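
    A minimal sketch of the coefficient-of-variation comparison described in this record is given below; the per-patient CV values are invented placeholders, and scipy's paired Wilcoxon signed-rank test stands in for the comparison reported above.

      import numpy as np
      from scipy.stats import wilcoxon

      def coefficient_of_variation(values) -> float:
          """Percent CV: spread of repeated ROI measurements relative to their mean."""
          values = np.asarray(values, dtype=float)
          return 100.0 * values.std(ddof=1) / values.mean()

      # Invented per-patient CVs of peak enhancement (10 ROIs each), with and without BMC.
      cv_with_bmc = np.array([2.8, 3.1, 2.2, 4.6, 2.0, 3.5, 2.9, 4.1, 2.6, 3.3])
      cv_without_bmc = np.array([5.2, 6.8, 4.0, 7.3, 3.9, 6.1, 5.5, 7.0, 4.8, 5.9])
      print(wilcoxon(cv_with_bmc, cv_without_bmc))  # paired comparison across patients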

  17. A Recent Revisit Study on the Human Error Events of Nuclear Facilities in Korea

    International Nuclear Information System (INIS)

    Lee, Y.-H.

    2016-01-01

    After the Fukushima accident, we launched two new projects in Korea. One is for the development of countermeasures for human errors in nuclear facilities, and the other is for the safety culture of the nuclear power plant itself. Several subsequent events turned out to be typical flags of the human and organizational factor issues affecting the safety of other socio-technical systems as well as nuclear power plants in Korea. The second, safety culture, project was an ambitious development to establish an infrastructure system utilising system dynamics, business process modeling and big-data techniques to provide an effective and efficient information basis to the various interested parties related to nuclear power plants. However, the project was abruptly cancelled last year without any further discussion of the original issues raised earlier in Korea. This may stem not only from the conflicting perspectives among the different approaches to nuclear safety culture but also from misunderstandings about the human factors relevant to nuclear safety.

  18. Procedures for using expert judgment to estimate human-error probabilities in nuclear power plant operations

    International Nuclear Information System (INIS)

    Seaver, D.A.; Stillwell, W.G.

    1983-03-01

    This report describes and evaluates several procedures for using expert judgment to estimate human-error probabilities (HEPs) in nuclear power plant operations. These HEPs are currently needed for several purposes, particularly for probabilistic risk assessments. Data do not exist for estimating these HEPs, so expert judgment can provide these estimates in a timely manner. Five judgmental procedures are described here: paired comparisons, ranking and rating, direct numerical estimation, indirect numerical estimation and multiattribute utility measurement. These procedures are evaluated in terms of several criteria: quality of judgments, difficulty of data collection, empirical support, acceptability, theoretical justification, and data processing. Situational constraints such as the number of experts available, the number of HEPs to be estimated, the time available, the location of the experts, and the resources available are discussed in regard to their implications for selecting a procedure for use
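
    As a simple illustration of one of the procedures listed in this record (direct numerical estimation followed by aggregation), the sketch below pools several hypothetical expert estimates by geometric mean, a common convention for probabilities spanning orders of magnitude. The report evaluates several procedures and aggregation schemes, so this is not a reproduction of its specific method.

      import numpy as np

      def pooled_hep(expert_estimates) -> float:
          """Aggregate direct numerical HEP estimates from several experts.

          Averaging in log space (geometric mean) keeps a single very low or very
          high estimate from dominating the pooled value.
          """
          logs = np.log10(np.asarray(expert_estimates, dtype=float))
          return float(10 ** logs.mean())

      # Invented example: four experts judge the HEP for a miscalibration task.
      print(pooled_hep([3e-3, 1e-2, 5e-3, 2e-3]))  # roughly 4e-3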

  19. Organizational change and human expertise in nuclear power plants: some implications for training and error prevention

    International Nuclear Information System (INIS)

    Masson, M.; Malaise, N.; Housiaux, A.; Keyser, V. de

    1993-01-01

    Reliability and safety are two very important goals, which depend on technical and organizational factors, but also on human expertise. How can the safe functioning of a nuclear power plant be ensured in a changing context, and what might be the role and aspects of training and the transfer of knowledge? These are the questions we deal with in this paper, on the basis of two field studies. The two field studies stress the need for setting up case-based training, which best ensures the acquisition of know-how. Furthermore, as shown by the second one, gaining expertise involves developing large repertoires of highly skilled, semi-routinized activities. Supporting expert operators should therefore not only tackle problem-solving activities but should also include the prevention of routine errors, which go along with skill acquisition. (orig.)

  20. Evolutionary enhancement of the SLIM-MAUD method of estimating human error rates

    International Nuclear Information System (INIS)

    Zamanali, J.H.; Hubbard, F.R.; Mosleh, A.; Waller, M.A.

    1992-01-01

    The methodology described in this paper assigns plant-specific dynamic human error rates (HERs) for individual plant examinations based on procedural difficulty, on configuration features, and on the time available to perform the action. This methodology is an evolutionary improvement of the success likelihood index methodology (SLIM-MAUD) for use in systemic scenarios. It is based on the assumption that the HER in a particular situation depends on the combined effects of a comprehensive set of performance-shaping factors (PSFs) that influence the operator's ability to perform the action successfully. The PSFs relate the details of the systemic scenario in which the action must be performed to the operator's psychological and cognitive condition
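
    A minimal sketch of the success likelihood index calculation that underlies SLIM-style methods is shown below. The PSF weights, ratings, and calibration anchors are invented for illustration, and the dynamic, plant-specific enhancements described in this record are not reproduced.

      import numpy as np

      def success_likelihood_index(weights, ratings) -> float:
          """Weighted sum of performance-shaping-factor ratings (weights normalized to 1)."""
          weights = np.asarray(weights, dtype=float)
          return float(np.dot(weights / weights.sum(), ratings))

      def hep_from_sli(sli, anchors) -> float:
          """Log-linear interpolation between two calibration tasks with known HEPs.

          anchors: ((sli_low, hep_low), (sli_high, hep_high)) - tasks whose error
          rates are known from data or prior analysis.
          """
          (s1, p1), (s2, p2) = anchors
          a = (np.log10(p2) - np.log10(p1)) / (s2 - s1)
          b = np.log10(p1) - a * s1
          return float(10 ** (a * sli + b))

      # Invented example: three PSFs (procedural difficulty, time available, interface quality).
      sli = success_likelihood_index(weights=[0.5, 0.3, 0.2], ratings=[4, 7, 6])
      print(hep_from_sli(sli, anchors=((1.0, 0.3), (9.0, 0.001))))  # roughly 1e-2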

  1. The role of human error in risk analysis: Application to pre- and post-maintenance procedures of process facilities

    International Nuclear Information System (INIS)

    Noroozi, Alireza; Khakzad, Nima; Khan, Faisal; MacKinnon, Scott; Abbassi, Rouzbeh

    2013-01-01

    Human factors play an important role in the safe operation of a facility. Human factors include the systematic application of information about human characteristics and behavior to increase the safety of a process system. A significant proportion of human errors occur during the maintenance phase. However, the quantification of human error probabilities in the maintenance phase has not been given the attention it deserves. This paper focuses on a human factors analysis of pre- and post-maintenance operations for pumps. The procedures for removing process equipment from service (pre-maintenance) and returning the equipment to service (post-maintenance) are considered for possible failure scenarios. For each scenario, the human error probability is calculated for each activity using the Success Likelihood Index Method (SLIM). Consequences are also assessed in this methodology. The risk assessment is conducted for each component, and the overall risk is estimated by adding the individual risks. The present study is aimed at highlighting the importance of considering human error in quantitative risk analyses. The developed methodology has been applied to a case study of an offshore process facility

  2. PS-022 Complex automated medication systems reduce medication administration error rates in an acute medical ward

    DEFF Research Database (Denmark)

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    2017-01-01

    Background Medication errors have received extensive attention in recent decades and are of significant concern to healthcare organisations globally. Medication errors occur frequently, and adverse events associated with medications are one of the largest causes of harm to hospitalised patients...... cabinet, automated dispensing and barcode medication administration; (2) non-patient specific automated dispensing and barcode medication administration. The occurrence of administration errors was observed in three 3 week periods. The error rates were calculated by dividing the number of doses with one...

  3. A study on fatigue measurement of operators for human error prevention in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Ju, Oh Yeon; Il, Jang Tong; Meiling, Luo; Hee, Lee Young [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    The identification and analysis of the individual factors of operators, which are among the various causes of adverse effects on human performance, is not easy in NPPs. Individual factors for operators include work type (including shift work), environment, personality, qualification, training, education, cognition, fatigue, job stress, workload, etc. Research at the Finnish Institute of Occupational Health (FIOH) reported that 'burnout' (extreme fatigue) is related to alcohol-dependent habits and must be dealt with using a stress management program. The USNRC (U.S. Nuclear Regulatory Commission) developed FFD (Fitness for Duty) to improve task efficiency and prevent human errors. 'Managing Fatigue' in 10 CFR 26 is presented as a requirement to control operator fatigue in NPPs. The committee explained that excessive fatigue is due to stressful work environments, working hours, shifts, sleep disorders, and unstable circadian rhythms. In addition, the International Labour Organization (ILO) developed and suggested a checklist to manage fatigue and job stress. Domestically, a systematic evaluation is presented in the Final Safety Analysis Report (FSAR) Chapter 18, Human Factors, in the licensing process. However, it focuses mostly on interface design, such as the HMI (Human Machine Interface), not on individual factors. In particular, because Korea is in the process of exporting NPPs to the UAE, the development and establishment of a fatigue management technique is important and urgent in order to present technical standards and FFD criteria to the UAE. It is also anticipated that the domestic regulatory body will apply the FFD program as a regulatory requirement, so preparation for that situation is required. In this paper, prior research is investigated to find fatigue measurement and evaluation methods for operators in a high-reliability industry. Also, this study tries to review the NRC report and discuss the causal factors and

  4. A study on fatigue measurement of operators for human error prevention in NPPs

    International Nuclear Information System (INIS)

    Ju, Oh Yeon; Il, Jang Tong; Meiling, Luo; Hee, Lee Young

    2012-01-01

    The identification and analysis of the individual factors of operators, which are among the various causes of adverse effects on human performance, is not easy in NPPs. Individual factors for operators include work type (including shift work), environment, personality, qualification, training, education, cognition, fatigue, job stress, workload, etc. Research at the Finnish Institute of Occupational Health (FIOH) reported that 'burnout' (extreme fatigue) is related to alcohol-dependent habits and must be dealt with using a stress management program. The USNRC (U.S. Nuclear Regulatory Commission) developed FFD (Fitness for Duty) to improve task efficiency and prevent human errors. 'Managing Fatigue' in 10 CFR 26 is presented as a requirement to control operator fatigue in NPPs. The committee explained that excessive fatigue is due to stressful work environments, working hours, shifts, sleep disorders, and unstable circadian rhythms. In addition, the International Labour Organization (ILO) developed and suggested a checklist to manage fatigue and job stress. Domestically, a systematic evaluation is presented in the Final Safety Analysis Report (FSAR) Chapter 18, Human Factors, in the licensing process. However, it focuses mostly on interface design, such as the HMI (Human Machine Interface), not on individual factors. In particular, because Korea is in the process of exporting NPPs to the UAE, the development and establishment of a fatigue management technique is important and urgent in order to present technical standards and FFD criteria to the UAE. It is also anticipated that the domestic regulatory body will apply the FFD program as a regulatory requirement, so preparation for that situation is required. In this paper, prior research is investigated to find fatigue measurement and evaluation methods for operators in a high-reliability industry. Also, this study tries to review the NRC report and discuss the causal factors and management

  5. Using the area under the curve to reduce measurement error in predicting young adult blood pressure from childhood measures.

    Science.gov (United States)

    Cook, Nancy R; Rosner, Bernard A; Chen, Wei; Srinivasan, Sathanur R; Berenson, Gerald S

    2004-11-30

    Tracking correlations of blood pressure, particularly childhood measures, may be attenuated by within-person variability. Combining multiple measurements can reduce this error substantially. The area under the curve (AUC) computed from longitudinal growth curve models can be used to improve the prediction of young adult blood pressure from childhood measures. Quadratic random-effects models over unequally spaced repeated measures were used to compute the area under the curve separately within the age periods 5-14 and 20-34 years in the Bogalusa Heart Study. This method adjusts for the uneven age distribution and captures the underlying or average blood pressure, leading to improved estimates of correlation and risk prediction. Tracking correlations were computed by race and gender, and were approximately 0.6 for systolic, 0.5-0.6 for K4 diastolic, and 0.4-0.6 for K5 diastolic blood pressure. The AUC can also be used to regress young adult blood pressure on childhood blood pressure and childhood and young adult body mass index (BMI). In these data, while childhood blood pressure and young adult BMI were generally directly predictive of young adult blood pressure, childhood BMI was negatively correlated with young adult blood pressure when childhood blood pressure was in the model. In addition, racial differences in young adult blood pressure were reduced, but not eliminated, after controlling for childhood blood pressure, childhood BMI, and young adult BMI, suggesting that other genetic or lifestyle factors contribute to this difference. 2004 John Wiley & Sons, Ltd.
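
    As a rough illustration of the AUC idea described above, one can fit a quadratic curve to a child's unequally spaced blood pressure measurements and average it over the age window; the study itself used quadratic random-effects models that pool information across subjects, so the per-subject fit below is only a simplified sketch with hypothetical data.

        import numpy as np

        def bp_auc_mean(ages, sbp, age_lo=5.0, age_hi=14.0):
            # Fit sbp ~ a*age^2 + b*age + c for one child, then average the
            # fitted curve over the age window (AUC divided by the span).
            coeffs = np.polyfit(ages, sbp, deg=2)
            antiderivative = np.polyint(coeffs)
            auc = np.polyval(antiderivative, age_hi) - np.polyval(antiderivative, age_lo)
            return auc / (age_hi - age_lo)

        ages = np.array([5.2, 7.1, 9.0, 11.8, 13.9])        # hypothetical visit ages
        sbp = np.array([98.0, 101.0, 103.0, 108.0, 112.0])  # hypothetical systolic readings
        print(round(bp_auc_mean(ages, sbp), 1))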

  6. Study on a new framework of Human Reliability Analysis to evaluate soft control execution error in advanced MCRs of NPPs

    International Nuclear Information System (INIS)

    Jang, Inseok; Kim, Ar Ryum; Jung, Wondea; Seong, Poong Hyun

    2016-01-01

    Highlights: • The operating environment of MCRs in NPPs has changed with the adoption of new HSIs. • Operating actions in NPP advanced MCRs are performed by soft control. • A new HRA framework should be considered in the HRA for advanced MCRs. • An HRA framework for evaluating soft control execution human errors is suggested. • The suggested method will be helpful for analyzing human reliability in advanced MCRs. - Abstract: Since the Three Mile Island (TMI)-2 accident, human error has been recognized as one of the main causes of Nuclear Power Plant (NPP) accidents, and numerous studies related to Human Reliability Analysis (HRA) have been carried out. Most of these methods were developed considering the conventional type of Main Control Rooms (MCRs). However, the operating environment of MCRs in NPPs has changed with the adoption of new Human-System Interfaces (HSIs) that are based on computer-based technologies. The MCRs that include these digital technologies, such as large display panels, computerized procedures, and soft controls, are called advanced MCRs. Among the many features of advanced MCRs, soft controls are a particularly important feature because operating actions in NPP advanced MCRs are performed by soft control. Due to the differences in interfaces between soft control and hardwired conventional control, different Human Error Probabilities (HEPs) and a new HRA framework should be considered in the HRA for advanced MCRs. To this end, a new framework of an HRA method for evaluating soft control execution human error is suggested by performing a soft control task analysis and reviewing the literature regarding widely accepted human error taxonomies. Moreover, since most current HRA databases deal with operation in conventional MCRs and are not explicitly designed to deal with digital HSIs, an empirical analysis of human error and error recovery considering soft controls under an advanced MCR mockup is carried out to collect human error data, which is

  7. Systematic analysis of video data from different human-robot interaction studies: a categorization of social signals during error situations.

    Science.gov (United States)

    Giuliani, Manuel; Mirnig, Nicole; Stollnberger, Gerald; Stadler, Susanne; Buchner, Roland; Tscheligi, Manfred

    2015-01-01

    Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. For that, we analyzed 201 videos of five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction. Technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but they often smile, when in an error situation with the robot. Another result is that the participants sometimes stop moving at the beginning of error situations. We also found that the participants talked more in the case of social norm violations and less during technical failures. Finally, the participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking), when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human-robot interaction systems. The builders need to consider including modules for recognition and classification of head movements to the robot input channels. The evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.

  8. Spatiotemporal neural characterization of prediction error valence and surprise during reward learning in humans.

    Science.gov (United States)

    Fouragnan, Elsa; Queirazza, Filippo; Retzler, Chris; Mullinger, Karen J; Philiastides, Marios G

    2017-07-06

    Reward learning depends on accurate reward associations with potential choices. These associations can be attained with reinforcement learning mechanisms using a reward prediction error (RPE) signal (the difference between actual and expected rewards) for updating future reward expectations. Despite an extensive body of literature on the influence of RPE on learning, little has been done to investigate the potentially separate contributions of RPE valence (positive or negative) and surprise (absolute degree of deviation from expectations). Here, we coupled single-trial electroencephalography with simultaneously acquired fMRI, during a probabilistic reversal-learning task, to offer evidence of temporally overlapping but largely distinct spatial representations of RPE valence and surprise. Electrophysiological variability in RPE valence correlated with activity in regions of the human reward network promoting approach or avoidance learning. Electrophysiological variability in RPE surprise correlated primarily with activity in regions of the human attentional network controlling the speed of learning. Crucially, despite the largely separate spatial extent of these representations, our EEG-informed fMRI approach uniquely revealed a linear superposition of the two RPE components in a smaller network encompassing visuo-mnemonic and reward areas. Activity in this network was further predictive of stimulus value updating, indicating a comparable contribution of both signals to reward learning.
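
    The split of the prediction error into a signed valence component and an unsigned surprise component can be illustrated with a simple delta-rule update; this is only a generic sketch of the quantities named in the abstract, not the authors' EEG-informed fMRI model.

        import numpy as np

        def reward_prediction_errors(rewards, alpha=0.2, value=0.5):
            # Delta-rule learner: RPE = reward - expectation, expectation updated each trial.
            valence, surprise = [], []
            for r in rewards:
                rpe = r - value
                valence.append(np.sign(rpe))   # positive vs. negative prediction error
                surprise.append(abs(rpe))      # absolute deviation from expectation
                value += alpha * rpe
            return valence, surprise

        outcomes = [1, 1, 0, 1, 0, 0, 1]       # hypothetical rewards in a reversal-learning task
        val, sur = reward_prediction_errors(outcomes)
        print(val)
        print([round(s, 2) for s in sur])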

  9. Reducing classification error of grassland overgrowth by combining low-density lidar acquisitions and optical remote sensing data

    Science.gov (United States)

    Pitkänen, T. P.; Käyhkö, N.

    2017-08-01

    Mapping structural changes in vegetation dynamics has, for a long time, been carried out using satellite images, orthophotos and, more recently, airborne lidar acquisitions. Lidar has established its position as providing accurate material for structure-based analyses, but its limited availability, relatively short history, and lack of spectral information are generally impeding the use of lidar data for change detection purposes. A potential solution for detecting both contemporary vegetation structures and their previous trajectories is to combine lidar acquisitions with optical remote sensing data, which can substantially extend the coverage, span and spectral range needed for vegetation mapping. In this study, we tested the simultaneous use of a single low-density lidar data set, a series of Landsat satellite frames and two high-resolution orthophotos to detect vegetation succession related to grassland overgrowth, i.e. encroachment of woody plants into semi-natural grasslands. We built several alternative Random Forest models with different sets of variables and tested the applicability of the respective data sources for change detection purposes, aiming at distinguishing unchanged grassland and woodland areas from overgrown grasslands. Our results show that while lidar alone provides a solid basis for indicating structural differences between grassland and woodland vegetation, and orthophoto-generated variables alone are better at detecting successional changes, their combination works considerably better than its respective parts. More specifically, a model combining all the used data sets reduces the total error from 17.0% to 11.0% and the omission error of detecting overgrown grasslands from 56.9% to 31.2%, compared to a model constructed solely from lidar data. This pinpoints the efficiency of the approach where lidar-generated structural metrics are combined with optical and multitemporal observations, providing a workable framework to
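
    A minimal sketch of the model-comparison workflow described above, using scikit-learn's RandomForestClassifier on synthetic stand-ins for lidar and optical features (the variable names and data are hypothetical, not the study's):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 600
        lidar = rng.normal(size=(n, 4))     # stand-in for height/structure metrics
        optical = rng.normal(size=(n, 6))   # stand-in for Landsat / orthophoto variables
        # Synthetic labels that depend on both sources: 0 grassland, 1 overgrown, 2 woodland
        score = lidar[:, 0] + optical[:, 0] + 0.5 * rng.normal(size=n)
        y = np.digitize(score, [-0.8, 0.8])

        for name, X in [("lidar only", lidar),
                        ("optical only", optical),
                        ("combined", np.hstack([lidar, optical]))]:
            acc = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                                  X, y, cv=5).mean()
            print(f"{name:12s} total error = {1 - acc:.3f}")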

  10. Risk assessment of component failure modes and human errors using a new FMECA approach: application in the safety analysis of HDR brachytherapy

    International Nuclear Information System (INIS)

    Giardina, M; Castiglia, F; Tomarchio, E

    2014-01-01

    Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events. (paper)
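
    For reference, the traditional RPN that the paper sets out to improve is simply the product of severity, occurrence and detectability scores; the sketch below ranks a few hypothetical HDR brachytherapy failure modes (including human errors) with that product, whereas the paper replaces the product with fuzzy IF-THEN rules.

        # Hypothetical items with (severity, occurrence, detectability) on 1-10 scales.
        failure_modes = {
            "source cable drive fault": (8, 3, 4),
            "wrong dwell-time entry (human error)": (9, 4, 5),
            "applicator mispositioning (human error)": (7, 5, 6),
        }
        rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
        for name, value in sorted(rpn.items(), key=lambda kv: kv[1], reverse=True):
            print(f"RPN {value:4d}  {name}")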

  11. The spontaneous replication error and the mismatch discrimination mechanisms of human DNA polymerase β

    Science.gov (United States)

    Koag, Myong-Chul; Nam, Kwangho; Lee, Seongmin

    2014-01-01

    To provide molecular-level insights into the spontaneous replication error and the mismatch discrimination mechanisms of human DNA polymerase β (polβ), we report four crystal structures of polβ complexed with dG•dTTP and dA•dCTP mismatches in the presence of Mg2+ or Mn2+. The Mg2+-bound ground-state structures show that the dA•dCTP-Mg2+ complex adopts an ‘intermediate’ protein conformation while the dG•dTTP-Mg2+ complex adopts an open protein conformation. The Mn2+-bound ‘pre-chemistry-state’ structures show that the dA•dCTP-Mn2+ complex is structurally very similar to the dA•dCTP-Mg2+ complex, whereas the dG•dTTP-Mn2+ complex undergoes a large-scale conformational change to adopt a Watson–Crick-like dG•dTTP base pair and a closed protein conformation. These structural differences, together with our molecular dynamics simulation studies, suggest that polβ increases replication fidelity via a two-stage mismatch discrimination mechanism, where one is in the ground state and the other in the closed conformation state. In the closed conformation state, polβ appears to allow only a Watson–Crick-like conformation for purine•pyrimidine base pairs, thereby discriminating the mismatched base pairs based on their ability to form the Watson–Crick-like conformation. Overall, the present studies provide new insights into the spontaneous replication error and the replication fidelity mechanisms of polβ. PMID:25200079

  12. Practical Insights from Initial Studies Related to Human Error Analysis Project (HEAP)

    International Nuclear Information System (INIS)

    Follesoe, Knut; Kaarstad, Magnhild; Droeivoldsmo, Asgeir; Hollnagel, Erik; Kirwan, Barry

    1996-01-01

    This report presents practical insights made from an analysis of the three initial studies in the Human Error Analysis Project (HEAP), and the first study in the US NRC Staffing Project. These practical insights relate to our understanding of diagnosis in Nuclear Power Plant (NPP) emergency scenarios and, in particular, the factors that influence whether a diagnosis will succeed or fail. The insights reported here focus on three inter-related areas: (1) the diagnostic strategies and styles that have been observed in single operator and team-based studies; (2) the qualitative aspects of the key operator support systems, namely VDU interfaces, alarms, training and procedures, that have affected the outcome of diagnosis; and (3) the overall success rates of diagnosis and the error types that have been observed in the various studies. With respect to diagnosis, certain patterns have emerged from the various studies, depending on whether operators were alone or in teams, and on their familiarity with the process. Some aspects of the interface and alarm systems were found to contribute to diagnostic failures while others supported performance and recovery. Similar results were found for training and experience. Furthermore, the availability of procedures did not preclude the need for some diagnosis. With respect to HRA and PSA, it was possible to record the failure types seen in the studies, and in some cases to give crude estimates of the failure likelihood for certain scenarios. Although these insights are interim in nature, they do show the type of information that can be derived from these studies. More importantly, they clarify aspects of our understanding of diagnosis in NPP emergencies, including implications for risk assessment, operator support systems development, and for research into diagnosis in a broader range of fields than the nuclear power industry. (author)

  13. Compound Stimulus Extinction Reduces Spontaneous Recovery in Humans

    Science.gov (United States)

    Coelho, Cesar A. O.; Dunsmoor, Joseph E.; Phelps, Elizabeth A.

    2015-01-01

    Fear-related behaviors are prone to relapse following extinction. We tested in humans a compound extinction design ("deepened extinction") shown in animal studies to reduce post-extinction fear recovery. Adult subjects underwent fear conditioning to a visual and an auditory conditioned stimulus (CSA and CSB, respectively) separately…

  14. Human Error Prediction and Countermeasures based on CREAM in Loading and Storage Phase of Spent Nuclear Fuel (SNF)

    International Nuclear Information System (INIS)

    Kim, Jae San; Kim, Min Su; Jo, Seong Youn

    2007-01-01

    With the steady demand for nuclear power in Korea, the amount of accumulated SNF has inevitably increased year by year. Thus far, SNF has been transported on site from one unit to a nearby unit or to an on-site dry storage facility. In the near future, as the amount of SNF generated approaches the capacity of these facilities, a portion of it will be transported to another SNF storage facility. Transporting SNF involves human interactions such as inspecting and preparing the cask and spent fuel, loading the cask, transferring the cask, and storing or monitoring the cask. Human actions therefore play a significant role in SNF transportation. In analyses of incidents that have occurred during transport operations, several recent studies have indicated that 'human error' is a primary cause. Therefore, the objectives of this study are to predict and identify possible human errors during the loading and storage of SNF. Furthermore, after evaluating the human error for each process, countermeasures to minimize human error are deduced

  15. Exploiting sparsity and low-rank structure for the recovery of multi-slice breast MRIs with reduced sampling error.

    Science.gov (United States)

    Yin, X X; Ng, B W-H; Ramamohanarao, K; Baghai-Wadji, A; Abbott, D

    2012-09-01

    It has been shown that magnetic resonance images (MRIs) with a sparse representation in a transformed domain, e.g. spatial finite differences (FD) or the discrete cosine transform (DCT), can be restored from undersampled k-space by applying current compressive sampling theory. The paper presents a model-based method for the restoration of MRIs. A reduced-order model, in which the full system response is projected onto a subspace of lower dimensionality, has been used to accelerate image reconstruction by reducing the size of the involved linear system. In this paper, the singular value threshold (SVT) technique is applied as a denoising scheme to reduce and select the model order of the inverse Fourier transform image, and to restore multi-slice breast MRIs that have been compressively sampled in k-space. The restored MRIs with SVT denoising show reduced sampling errors compared to direct MRI restoration methods via spatial FD or DCT. Compressive sampling is a technique for finding sparse solutions to underdetermined linear systems; the sparsity implicit in MRIs is exploited to solve the MRI reconstruction problem from significantly undersampled k-space after transformation. The challenge, however, is that incoherent artifacts resulting from the random undersampling add noise-like interference to the sparsely represented image, and the recovery algorithms in the literature are not capable of fully removing these artifacts. It is therefore necessary to introduce a denoising procedure to improve the quality of image recovery. This paper applies a singular value threshold algorithm to reduce the model order of the image basis functions, which allows further improvement of the quality of image reconstruction with removal of noise artifacts. The principle of the denoising scheme is to reconstruct the sparse MRI matrices optimally with a lower rank by selecting a smaller number of dominant singular values. The singular value threshold algorithm is performed
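
    The denoising step described above can be sketched with a plain singular value thresholding routine: compute the SVD, discard or shrink the small singular values that mostly carry the noise-like aliasing, and rebuild a lower-rank estimate. The data and threshold below are synthetic placeholders, not the paper's reconstruction pipeline.

        import numpy as np

        def svt_denoise(matrix, tau):
            # Keep singular values above tau (shrunk by tau) and rebuild a low-rank estimate.
            u, s, vt = np.linalg.svd(matrix, full_matrices=False)
            s_shrunk = np.maximum(s - tau, 0.0)
            k = int(np.count_nonzero(s_shrunk))
            return (u[:, :k] * s_shrunk[:k]) @ vt[:k, :], k

        rng = np.random.default_rng(1)
        clean = rng.normal(size=(128, 8)) @ rng.normal(size=(8, 128))   # low-rank "slice"
        noisy = clean + 0.5 * rng.normal(size=clean.shape)              # noise-like artifacts
        denoised, rank = svt_denoise(noisy, tau=10.0)
        print(rank, np.linalg.norm(denoised - clean) < np.linalg.norm(noisy - clean))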

  16. Human error probability evaluation as part of reliability analysis of digital protection system of advanced pressurized water reactor - APR 1400

    International Nuclear Information System (INIS)

    Varde, P. V.; Lee, D. Y.; Han, J. B.

    2003-03-01

    A case study on human reliability analysis has been performed as part of the reliability analysis of the digital protection system of the reactor, which automatically actuates the shutdown system of the reactor when demanded. However, the safety analysis takes credit for operator action as a diverse means of tripping the reactor in the ATWS scenario, which has a low probability. Based on the available information, two cases, viz., human error in tripping the reactor and calibration error for instrumentation in the protection system, have been analyzed. Wherever applicable, a parametric study has also been performed

  17. Detailed semantic analyses of human error incidents occurring at domestic nuclear power plants to fiscal year 2000

    International Nuclear Information System (INIS)

    Tsuge, Tadashi; Hirotsu, Yuko; Takano, Kenichi; Ebisu, Mitsuhiro; Tsumura, Joji

    2003-01-01

    Analysing and evaluating observed cases of human error incidents, with an emphasis on the human factors and behaviour involved, is essential for preventing their recurrence. CRIEPI has been conducting detailed and structured analyses, based on J-HPES, of all incidents reported over the last 35 years, from the beginning of operation of the first Tokai nuclear power plant until fiscal year 2000, in which a total of 212 human error cases were identified. Results obtained by the analyses have been stored in the J-HPES database. This report summarizes the semantic analyses of all case studies stored in the above database in order to grasp the practical, concrete contents and trends of the more frequently observed human errors (called trigger actions here), causal factors and preventive measures. These semantic analyses were executed by classifying all items into categories that could be considered as having almost the same meaning, using the KJ method. Typical results obtained by the analyses are as follows: (1) Trigger actions - these could be classified into operation or maintenance categories. 'Operational timing errors' and 'operational quantitative errors' were the major trigger actions in operation, occupying about 20% of all actions. Among trigger actions in maintenance, 'maintenance quantitative errors' were the major actions, occupying a quarter of all actions. (2) Causal factors - 'human internal status' was the major factor; in concrete terms, this comprised 'improper persistence' and 'lack of knowledge'. (3) Preventive measures - the most frequent measures were job management changes through procedural software improvements, which accounted for 70% to 80%. As for preventive measures in operation, software improvements have been implemented on 'organization and work practices' and 'individual consciousness'. Concerning preventive measures in maintenance, improvements have been implemented on 'organization and work practices'. (author)

  18. An Enhancement of Campaign Posters for Human Error Prevention in NPPs

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Lee, Yong Hee; Kwon, Soon Il

    2010-01-01

    Accidents in high reliability systems such as nuclear power plants (NPPs) give rise not only to a loss of property and life, but also to social problems. One of the most frequently used techniques to grasp the current situation of hazard factors in NPPs is an event investigation analysis based on the INPO's Human Performance Enhancement System (HPES) and, in Korea, the Korean Human Performance Enhancement System (K-HPES). There are many methods and approaches for HE assessment that are valuable for investigating the causes of undesirable events and establishing counter-plans to prevent their recurrence in NPPs. They differ from each other according to the objectives of the analysis: the explanation of the event, the investigation of the causes, the allocation of responsibility, and the establishment of the counter-plan. Event databases include their own events and information from various sources such as the IAEA and regulatory bodies, and also from the INPO and WANO. As many as 111 reactor trips occurred in the five years from 2001 to 2005, and 26 of them occurred due to HE. Since the human error rate did not decrease in 2004, the KHNP started making efforts to reduce HEs. The KHNP created as many as 40 posters for human performance improvement in 2006. The INPO has been using a traditional form of poster; additionally, the Central Research Institute of Electric Power Industry (CRIEPI) developed a type of caution report. The caution report is comprised of a poster name, a serial number, a figure, work situations, the point at issue, and a countermeasure. The posters which KHNP developed in 2006 convey specific information related to HE events. However, this is not enough to arouse interest and ensure the effectiveness of the posters, because most people are favorably disposed toward a simple poster with many illustrations. Therefore, we stressed the need for workers' receptiveness rather than mere notification of information

  19. Monte Carlo simulation of expert judgments on human errors in chemical analysis--a case study of ICP-MS.

    Science.gov (United States)

    Kuselman, Ilya; Pennecchi, Francesca; Epstein, Malka; Fajgelj, Ales; Ellison, Stephen L R

    2014-12-01

    Monte Carlo simulation of expert judgments on human errors in a chemical analysis was used for determination of distributions of the error quantification scores (scores of likelihood and severity, and scores of effectiveness of a laboratory quality system in prevention of the errors). The simulation was based on modeling of an expert behavior: confident, reasonably doubting and irresolute expert judgments were taken into account by means of different probability mass functions (pmfs). As a case study, 36 scenarios of human errors which may occur in elemental analysis of geological samples by ICP-MS were examined. Characteristics of the score distributions for three pmfs of an expert behavior were compared. Variability of the scores, as standard deviation of the simulated score values from the distribution mean, was used for assessment of the score robustness. A range of the score values, calculated directly from elicited data and simulated by a Monte Carlo method for different pmfs, was also discussed from the robustness point of view. It was shown that robustness of the scores, obtained in the case study, can be assessed as satisfactory for the quality risk management and improvement of a laboratory quality system against human errors. Copyright © 2014 Elsevier B.V. All rights reserved.
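
    A minimal sketch of the simulation idea: each expert-behaviour model is a probability mass function over the ordinal quantification scale, scores are sampled from it, and the spread of the simulated scores indicates their robustness. The 1-5 scale and the pmfs below are illustrative assumptions, not the elicited values from the paper.

        import numpy as np

        rng = np.random.default_rng(42)
        scale = np.arange(1, 6)                       # assumed ordinal scoring scale
        pmfs = {
            "confident":  [0.05, 0.05, 0.10, 0.30, 0.50],
            "doubting":   [0.10, 0.20, 0.40, 0.20, 0.10],
            "irresolute": [0.20, 0.20, 0.20, 0.20, 0.20],
        }
        for behaviour, p in pmfs.items():
            draws = rng.choice(scale, size=100_000, p=p)   # Monte Carlo scores for one scenario
            print(f"{behaviour:10s} mean={draws.mean():.2f} sd={draws.std():.2f}")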

  20. A Preliminary Study on the Measures to Assess the Organizational Safety: The Cultural Impact on Human Error Potential

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Lee, Yong Hee

    2011-01-01

    The Fukushima I nuclear accident, which followed the Tohoku earthquake and tsunami on 11 March 2011, occurred twelve years after the JCO accident, which was caused by an error made by JCO employees. These accidents, along with the Chernobyl accident, are associated with characteristic problems of various organizations; they caused severe social and economic disruptions and have had significant environmental and health impacts. The cultural problems behind human errors occur for various reasons, and different actions are needed to prevent different errors. Unfortunately, much of the research on organization and human error has produced widely varying results, which call for different approaches. In other words, we have to find more practical solutions from various lines of research for nuclear safety and take a systematic approach to the organizational deficiencies that cause human error. This paper reviews Hofstede's criteria, IAEA safety culture, the safety areas of the periodic safety review (PSR), teamwork and performance, and an evaluation of HANARO safety culture to verify the measures used to assess organizational safety

  1. A Preliminary Study on the Measures to Assess the Organizational Safety: The Cultural Impact on Human Error Potential

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Lee, Yong Hee [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

    The Fukushima I nuclear accident, which followed the Tohoku earthquake and tsunami on 11 March 2011, occurred twelve years after the JCO accident, which was caused by an error made by JCO employees. These accidents, along with the Chernobyl accident, are associated with characteristic problems of various organizations; they caused severe social and economic disruptions and have had significant environmental and health impacts. The cultural problems behind human errors occur for various reasons, and different actions are needed to prevent different errors. Unfortunately, much of the research on organization and human error has produced widely varying results, which call for different approaches. In other words, we have to find more practical solutions from various lines of research for nuclear safety and take a systematic approach to the organizational deficiencies that cause human error. This paper reviews Hofstede's criteria, IAEA safety culture, the safety areas of the periodic safety review (PSR), teamwork and performance, and an evaluation of HANARO safety culture to verify the measures used to assess organizational safety

  2. A bottom-up model of spatial attention predicts human error patterns in rapid scene recognition.

    Science.gov (United States)

    Einhäuser, Wolfgang; Mundhenk, T Nathan; Baldi, Pierre; Koch, Christof; Itti, Laurent

    2007-07-20

    Humans demonstrate a peculiar ability to detect complex targets in rapidly presented natural scenes. Recent studies suggest that (nearly) no focal attention is required for overall performance in such tasks. Little is known, however, of how detection performance varies from trial to trial and which stages in the processing hierarchy limit performance: bottom-up visual processing (attentional selection and/or recognition) or top-down factors (e.g., decision-making, memory, or alertness fluctuations)? To investigate the relative contribution of these factors, eight human observers performed an animal detection task in natural scenes presented at 20 Hz. Trial-by-trial performance was highly consistent across observers, far exceeding the prediction of independent errors. This consistency demonstrates that performance is not primarily limited by idiosyncratic factors but by visual processing. Two statistical stimulus properties, contrast variation in the target image and the information-theoretical measure of "surprise" in adjacent images, predict performance on a trial-by-trial basis. These measures are tightly related to spatial attention, demonstrating that spatial attention and rapid target detection share common mechanisms. To isolate the causal contribution of the surprise measure, eight additional observers performed the animal detection task in sequences that were reordered versions of those all subjects had correctly recognized in the first experiment. Reordering increased surprise before and/or after the target while keeping the target and distractors themselves unchanged. Surprise enhancement impaired target detection in all observers. Consequently, and contrary to several previously published findings, our results demonstrate that attentional limitations, rather than target recognition alone, affect the detection of targets in rapidly presented visual sequences.
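
    The "surprise" of an image relative to its predecessors is, in the cited framework, a Bayesian quantity; a crude but self-contained proxy is the KL divergence between intensity histograms of adjacent frames, which is what the sketch below computes on synthetic frames (this stand-in is an assumption for illustration, not the authors' model).

        import numpy as np

        def histogram_surprise(prev_frame, next_frame, bins=32, eps=1e-9):
            # KL divergence D(next || prev) between normalised intensity histograms.
            p, _ = np.histogram(prev_frame, bins=bins, range=(0.0, 1.0))
            q, _ = np.histogram(next_frame, bins=bins, range=(0.0, 1.0))
            p = (p + eps) / (p + eps).sum()
            q = (q + eps) / (q + eps).sum()
            return float(np.sum(q * np.log(q / p)))

        rng = np.random.default_rng(0)
        a = rng.random((64, 64))
        b = np.clip(a + 0.05 * rng.random((64, 64)), 0.0, 1.0)     # similar next frame
        c = rng.beta(0.5, 0.5, size=(64, 64))                      # very different next frame
        print(histogram_surprise(a, b) < histogram_surprise(a, c)) # more change, more surprise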

  3. Experiences with Lean Six Sigma as improvement strategy to reduce parenteral medication administration errors and associated potential risk of harm

    NARCIS (Netherlands)

    van de Plas, Afke; Slikkerveer, Mariëlle; Hoen, Saskia; Schrijnemakers, Rick; Driessen, Johanna; de Vries, Frank; van den Bemt, Patricia

    2017-01-01

    In this controlled before-after study the effect of improvements, derived from Lean Six Sigma strategy, on parenteral medication administration errors and the potential risk of harm was determined. During baseline measurement, on control versus intervention ward, at least one administration error

  4. Compound stimulus extinction reduces spontaneous recovery in humans

    OpenAIRE

    Coelho, Cesar A.O.; Dunsmoor, Joseph E.; Phelps, Elizabeth A.

    2015-01-01

    Fear-related behaviors are prone to relapse following extinction. We tested in humans a compound extinction design (“deepened extinction”) shown in animal studies to reduce post-extinction fear recovery. Adult subjects underwent fear conditioning to a visual and an auditory conditioned stimulus (CSA and CSB, respectively) separately paired with an electric shock. The target CS (CSA) was extinguished alone followed by compound presentations of the extinguished CSA and nonextinguished CSB. Reco...

  5. Analysis of human error in occupational accidents in the power plant industries using combining innovative FTA and meta-heuristic algorithms

    Directory of Open Access Journals (Sweden)

    M. Omidvari

    2015-09-01

    Introduction: Occupational accidents are among the main issues in industry, and it is necessary to identify their main root causes in order to control them. Several models have been proposed for determining the root causes of accidents. FTA is one of the most widely used models, as it can graphically establish the root causes of accidents. Non-linear functions are one of the main challenges in applying FTA, and meta-heuristic algorithms can be used to obtain exact values. Material and Method: The present research was carried out in the power plant industry during the construction phase. In this study, a pattern for the analysis of human error in work-related accidents was provided by combining neural network algorithms with the FTA analytical model. Finally, using this pattern, the potential rate of all causes was determined. Result: The results showed that training, age, and non-compliance with safety principles in the workplace were the most important factors influencing human error in occupational accidents. Conclusion: According to the obtained results, it can be concluded that human errors can be greatly reduced by training, the right choice of workers with regard to the type of occupation, and the provision of appropriate safety conditions in the workplace.
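
    As a small illustration of the fault-tree side of the approach, the probability of a top event can be composed from basic-event probabilities through AND/OR gates (assuming independent events); the mini tree and numbers below are hypothetical, and the paper goes further by tuning such inputs with neural-network/meta-heuristic algorithms.

        def or_gate(probs):
            # P(at least one of the independent basic events occurs)
            p_none = 1.0
            for q in probs:
                p_none *= (1.0 - q)
            return 1.0 - p_none

        def and_gate(probs):
            # P(all independent basic events occur)
            p_all = 1.0
            for q in probs:
                p_all *= q
            return p_all

        # top event = (insufficient training OR poor workplace conditions) AND rule non-compliance
        p_top = and_gate([or_gate([0.05, 0.02]), 0.10])
        print(f"top-event probability = {p_top:.4f}")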

  6. Evaluating the Performance Diagnostic Checklist-Human Services to Assess Incorrect Error-Correction Procedures by Preschool Paraprofessionals

    Science.gov (United States)

    Bowe, Melissa; Sellers, Tyra P.

    2018-01-01

    The Performance Diagnostic Checklist-Human Services (PDC-HS) has been used to assess variables contributing to undesirable staff performance. In this study, three preschool teachers completed the PDC-HS to identify the factors contributing to four paraprofessionals' inaccurate implementation of error-correction procedures during discrete trial…

  7. A model-based and computer-aided approach to analysis of human errors in nuclear power plants

    International Nuclear Information System (INIS)

    Yoon, Wan C.; Lee, Yong H.; Kim, Young S.

    1996-01-01

    Since the operator's mission in NPPs is increasingly defined by cognitive tasks such as monitoring, diagnosis and planning, the focus of human error analysis should also move from external actions to internal decision-making processes. While more elaborate analysis of cognitive aspects of human errors will help understand their causes and derive effective countermeasures, a lack of framework and an arbitrary resolution of description may hamper the effectiveness of such analysis. This paper presents new model-based schemes of event description and error classification as well as an interactive computerized support system. The schemes and the support system were produced in an effort to develop an improved version of HPES. The use of a decision-making model enables the analyst to document cognitive aspects of human performance explicitly and in a proper resolution. The stage-specific terms used in the proposed schemes make the task of characterizing human errors easier and confident for field analysts. The support system was designed to help the analyst achieve a contextually well-integrated analysis throughout the different parts of HPES

  8. Inhibition of fatty acid metabolism reduces human myeloma cells proliferation.

    Directory of Open Access Journals (Sweden)

    José Manuel Tirado-Vélez

    Multiple myeloma is a haematological malignancy characterized by the clonal proliferation of plasma cells. It has been proposed that targeting cancer cell metabolism would provide a new selective anticancer therapeutic strategy. In this work, we tested the hypothesis that inhibition of β-oxidation and de novo fatty acid synthesis would reduce cell proliferation in human myeloma cells. We evaluated the effect of etomoxir and orlistat on fatty acid metabolism, glucose metabolism, cell cycle distribution, proliferation, cell death, and expression of G1/S phase regulatory proteins in myeloma cells. Etomoxir and orlistat inhibited β-oxidation and de novo fatty acid synthesis, respectively, in myeloma cells, without significantly altering glucose metabolism. These effects were associated with reduced cell viability and cell cycle arrest in G0/G1. Specifically, etomoxir and orlistat reduced myeloma cell proliferation by 40-70%. The combination of etomoxir and orlistat resulted in an additive inhibitory effect on cell proliferation. Orlistat induced apoptosis and sensitized RPMI-8226 cells to apoptosis induction by bortezomib, whereas apoptosis was not altered by etomoxir. Finally, the inhibitory effect of both drugs on cell proliferation was associated with reduced p21 protein levels and reduced phosphorylation of the retinoblastoma protein. In conclusion, inhibition of fatty acid metabolism represents a potential therapeutic approach to treat human multiple myeloma.

  9. Trend analysis of nuclear reactor automatic trip events subjected to operator's human error at United States nuclear power plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2009-01-01

    Trends in nuclear reactor automatic trip events due to human errors during plant operation have been analyzed by extracting 20 events which took place in the United States during the seven years from 2002 to 2008, as cited in the LERs (Licensee Event Reports) submitted to the US Nuclear Regulatory Commission (NRC). The yearly number of events was relatively large before 2005 and decreased thereafter. A period of stable operation, in which the yearly number was kept very small, continued for about three years, and then the yearly number began to increase again. Before 2005, automatic trip events occurred more frequently during periodic inspections or start-up/shut-down operations. Recent trends, however, indicate that trip events have become more frequent due to human errors during daily operations. Throughout the whole period, human errors were mostly caused by the overconfidence and carelessness of operators. The trends in the yearly number of events might be explained as follows. The decrease in automatic trip events is attributed to the sharing of trouble information, which led to improvements in the manuals and training for operations with a higher potential risk of automatic trip. Then, while the period of stable operation continued, some operators came to pay less attention to preventing human errors and lost interest in the training, eventually leading to automatic trip events due to mis-operation. From these analyses of trouble experiences in the US, the following lessons were learnt for preventing similar troubles in Japan: operators should be thoroughly skilled in the basic actions that prevent human errors, and it should be further emphasized that they should visualize actual plant operations even when simulator training gives them successful experiences. (author)

  10. The Second Victim Phenomenon After a Clinical Error: The Design and Evaluation of a Website to Reduce Caregivers' Emotional Responses After a Clinical Error.

    Science.gov (United States)

    Mira, José Joaquín; Carrillo, Irene; Guilabert, Mercedes; Lorenzo, Susana; Pérez-Pérez, Pastora; Silvestre, Carmen; Ferrús, Lena

    2017-06-08

    Adverse events (incidents that harm a patient) can also produce emotional hardship for the professionals involved (second victims). Although a few international pioneering programs exist that aim to facilitate the recovery of the second victim, there are no known initiatives that aim to raise awareness in the professional community about this issue and prevent the situation from worsening. The aim of this study was to design and evaluate an online program directed at frontline hospital and primary care health professionals that raises awareness and provides information about the second victim phenomenon. The design of the Mitigating Impact in Second Victims (MISE) online program was based on a literature review, and its contents were selected by a group of 15 experts on patient safety with experience in both clinical and academic settings. The website hosting MISE was subjected to an accreditation process by an external quality agency that specializes in evaluating health websites. The MISE structure and content were evaluated by 26 patient safety managers at hospitals and within primary care in addition to 266 frontline health care professionals who followed the program, taking into account its comprehension, usefulness of the information, and general adequacy. Finally, the amount of knowledge gained from the program was assessed with three objective measures (pre- and posttest design). The website earned Advanced Accreditation for health websites after fulfilling required standards. The comprehension and practical value of the MISE content were positively assessed by 88% (23/26) and 92% (24/26) of patient safety managers, respectively. MISE was positively evaluated by health care professionals, who awarded it 8.8 points out of a maximum 10. Users who finished MISE improved their knowledge on patient safety terminology, prevalence and impact of adverse events and clinical errors, second victim support models, and recommended actions following a severe adverse

  11. Management and Evaluation System on Human Error, Licence Requirements, and Job-aptitude in Rail and the Other Industries

    Energy Technology Data Exchange (ETDEWEB)

    Koo, In Soo; Suh, S. M.; Park, G. O. (and others)

    2006-07-15

    The rail system is very closely related to public life. When an accident happens, members of the public using the system may be injured or even killed. The accident that recently took place in the Taegu subway system because of inappropriate human-side task performance demonstrated how tragic its consequences can be. Many studies have shown that most accidents occur because personnel perform their tasks in an inappropriate way. It is generally recognised that a rail system without the human element will not come about for quite a long time, so the human element will remain a major factor in the next tragic accident. This state-of-the-art report studied cases of management and evaluation systems related to human errors, licence requirements, and job aptitude in rail and other industries, for the purpose of improving the task performance of the personnel who constitute one element of the system and ultimately enhancing rail safety. Management of human errors, licence requirements, and evaluation of job aptitude for people engaged in agencies closely related to rail do much to develop and preserve their abilities. However, due to various internal and external factors, these systems may to some extent be limited in how promptly they reflect overall trends in society, technology, and values. Removal and control of the factors behind human errors, informed by the case studies of this report, will play an epochal role in the safety of the rail system. The analytical results of the case studies in this report will be used in the project 'Development of Management Criteria on Human Error and Evaluation Criteria on Job-aptitude of Rail Safe-operation Personnel', which has been carried out as a part of the 'Integrated R and D Program for Railway Safety'.

  12. Management and Evaluation System on Human Error, Licence Requirements, and Job-aptitude in Rail and the Other Industries

    International Nuclear Information System (INIS)

    Koo, In Soo; Suh, S. M.; Park, G. O.

    2006-07-01

    The rail system is very closely related to public life. When an accident happens, members of the public using the system may be injured or even killed. The accident that recently took place in the Taegu subway system because of inappropriate human-side task performance demonstrated how tragic its consequences can be. Many studies have shown that most accidents occur because personnel perform their tasks in an inappropriate way. It is generally recognised that a rail system without the human element will not come about for quite a long time, so the human element will remain a major factor in the next tragic accident. This state-of-the-art report studied cases of management and evaluation systems related to human errors, licence requirements, and job aptitude in rail and other industries, for the purpose of improving the task performance of the personnel who constitute one element of the system and ultimately enhancing rail safety. Management of human errors, licence requirements, and evaluation of job aptitude for people engaged in agencies closely related to rail do much to develop and preserve their abilities. However, due to various internal and external factors, these systems may to some extent be limited in how promptly they reflect overall trends in society, technology, and values. Removal and control of the factors behind human errors, informed by the case studies of this report, will play an epochal role in the safety of the rail system. The analytical results of the case studies in this report will be used in the project 'Development of Management Criteria on Human Error and Evaluation Criteria on Job-aptitude of Rail Safe-operation Personnel', which has been carried out as a part of the 'Integrated R and D Program for Railway Safety'.

  13. Reducing the sensitivity of IMPT treatment plans to setup errors and range uncertainties via probabilistic treatment planning

    International Nuclear Information System (INIS)

    Unkelbach, Jan; Bortfeld, Thomas; Martin, Benjamin C.; Soukup, Martin

    2009-01-01

    Treatment plans optimized for intensity modulated proton therapy (IMPT) may be very sensitive to setup errors and range uncertainties. If these errors are not accounted for during treatment planning, the dose distribution realized in the patient may by strongly degraded compared to the planned dose distribution. The authors implemented the probabilistic approach to incorporate uncertainties directly into the optimization of an intensity modulated treatment plan. Following this approach, the dose distribution depends on a set of random variables which parameterize the uncertainty, as does the objective function used to optimize the treatment plan. The authors optimize the expected value of the objective function. They investigate IMPT treatment planning regarding range uncertainties and setup errors. They demonstrate that incorporating these uncertainties into the optimization yields qualitatively different treatment plans compared to conventional plans which do not account for uncertainty. The sensitivity of an IMPT plan depends on the dose contributions of individual beam directions. Roughly speaking, steep dose gradients in beam direction make treatment plans sensitive to range errors. Steep lateral dose gradients make plans sensitive to setup errors. More robust treatment plans are obtained by redistributing dose among different beam directions. This can be achieved by the probabilistic approach. In contrast, the safety margin approach as widely applied in photon therapy fails in IMPT and is neither suitable for handling range variations nor setup errors.
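
    The probabilistic approach can be sketched in one dimension: the objective is evaluated for each discrete error scenario, weighted by its probability, and the spot weights are optimised against that expectation. Everything below (geometry, dose model, scenarios) is a toy assumption used only to show the structure of the optimisation, not the authors' treatment-planning system.

        import numpy as np
        from scipy.optimize import minimize

        x = np.arange(40.0)                                   # 1-D "patient" voxels
        prescribed = ((x >= 15) & (x < 25)).astype(float)     # unit dose in the target
        spots = np.arange(12.0, 28.0, 2.0)                    # pencil-beam spot positions
        scenarios = [(-2.0, 0.25), (0.0, 0.50), (2.0, 0.25)]  # (setup shift, probability)

        def dose(weights, shift):
            # Gaussian dose deposition per spot, rigidly shifted by the setup error.
            return sum(w * np.exp(-0.5 * ((x - (s + shift)) / 2.0) ** 2)
                       for w, s in zip(weights, spots))

        def expected_objective(weights):
            return sum(p * np.mean((dose(weights, shift) - prescribed) ** 2)
                       for shift, p in scenarios)

        result = minimize(expected_objective, x0=np.full(len(spots), 0.3),
                          bounds=[(0.0, None)] * len(spots))
        print(result.success, round(expected_objective(result.x), 4))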

  14. Humans running in place on water at simulated reduced gravity.

    Directory of Open Access Journals (Sweden)

    Alberto E Minetti

    BACKGROUND: On Earth only a few legged species, such as water strider insects, some aquatic birds and lizards, can run on water. For most other species, including humans, this is precluded by body size and proportions, lack of appropriate appendages, and limited muscle power. However, if gravity is reduced to less than Earth's gravity, running on water should require less muscle power. Here we use a hydrodynamic model to predict the gravity levels at which humans should be able to run on water. We test these predictions in the laboratory using a reduced gravity simulator. METHODOLOGY/PRINCIPAL FINDINGS: We adapted a model equation, previously used by Glasheen and McMahon to explain the dynamics of the Basilisk lizard, to predict the body mass, stride frequency and gravity necessary for a person to run on water. Progressive body-weight unloading of a person running in place in a wading pool confirmed the theoretical predictions that a person could run on water, at lunar (or lower) gravity levels, using relatively small rigid fins. Three-dimensional motion capture of reflective markers on major joint centers showed that humans, similarly to the Basilisk lizard and to the Western Grebe, keep the head-trunk segment at a nearly constant height, despite the high stride frequency and the intensive locomotor effort. Trunk stabilization at a nearly constant height differentiates running on water from other, more usual human gaits. CONCLUSIONS/SIGNIFICANCE: The results showed that a hydrodynamic model of lizards running on water can also be applied to humans, despite the enormous difference in body size and morphology.

  15. Asymmetric generalization in adaptation to target displacement errors in humans and in a neural network model.

    Science.gov (United States)

    Westendorff, Stephanie; Kuang, Shenbing; Taghizadeh, Bahareh; Donchin, Opher; Gail, Alexander

    2015-04-01

    Different error signals can induce sensorimotor adaptation during visually guided reaching, possibly evoking different neural adaptation mechanisms. Here we investigate reach adaptation induced by visual target errors without perturbing the actual or sensed hand position. We analyzed the spatial generalization of adaptation to target error to compare it with other known generalization patterns and simulated our results with a neural network model trained to minimize target error independent of prediction errors. Subjects reached to different peripheral visual targets and had to adapt to a sudden fixed-amplitude displacement ("jump") consistently occurring for only one of the reach targets. Subjects simultaneously had to perform contralateral unperturbed saccades, which rendered the reach target jump unnoticeable. As a result, subjects adapted by gradually decreasing reach errors and showed negative aftereffects for the perturbed reach target. Reach errors generalized to unperturbed targets according to a translational rather than rotational generalization pattern, but locally, not globally. More importantly, reach errors generalized asymmetrically with a skewed generalization function in the direction of the target jump. Our neural network model reproduced the skewed generalization after adaptation to target jump without having been explicitly trained to produce a specific generalization pattern. Our combined psychophysical and simulation results suggest that target jump adaptation in reaching can be explained by gradual updating of spatial motor goal representations in sensorimotor association networks, independent of learning induced by a prediction-error about the hand position. The simulations make testable predictions about the underlying changes in the tuning of sensorimotor neurons during target jump adaptation. Copyright © 2015 the American Physiological Society.

  16. Science, practice, and human errors in controlling Clostridium botulinum in heat-preserved food in hermetic containers.

    Science.gov (United States)

    Pflug, Irving J

    2010-05-01

    The incidence of botulism in canned food in the last century is reviewed along with the background science; a few conclusions are reached based on analysis of published data. There are two primary aspects to botulism control: the design of an adequate process and the delivery of the adequate process to containers of food. The probability that the designed process will not be adequate to control Clostridium botulinum is very small, probably less than 1.0 x 10^-6, based on containers of food, whereas the failure of the operator of the processing equipment to deliver the specified process to containers of food may be of the order of 1 in 40 to 1 in 100, based on processing units (retort loads). In the commercial food canning industry, failure to deliver the process will probably be of the order of 1.0 x 10^-4 to 1.0 x 10^-6 when U.S. Food and Drug Administration (FDA) regulations are followed. Botulism incidents have occurred in food canning plants that have not followed the FDA regulations. It is possible but very rare to have botulism result from postprocessing contamination. It may thus be concluded that botulism incidents in canned food are primarily the result of human failure in the delivery of the designed or specified process to containers of food that, in turn, result in the survival, outgrowth, and toxin production of C. botulinum spores. Therefore, efforts in C. botulinum control should be concentrated on reducing human errors in the delivery of the specified process to containers of food.
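
    A back-of-the-envelope combination of the two failure routes named above shows why the human, process-delivery route dominates; the per-container figures below simply reuse the orders of magnitude quoted in the abstract and assume the two routes are independent.

        p_design = 1.0e-6     # designed process inadequate (per container, as quoted)
        p_delivery = 1.0e-5   # specified process not delivered (within the quoted 1e-4 to 1e-6 range)
        p_either = 1.0 - (1.0 - p_design) * (1.0 - p_delivery)
        print(f"combined failure probability per container ~ {p_either:.1e}")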

  17. Control of Human Error and comparison Level risk after correction action With the SHERPA Method in a control Room of petrochemical industry

    Directory of Open Access Journals (Sweden)

    A. Zakerian

    2011-12-01

    Background and aims: Today, in many jobs, such as in the nuclear, military and chemical industries, human errors may result in a disaster. Accidents in different places around the world emphasize this subject; examples include the Chernobyl disaster (1986), the Three Mile Island accident (1979) and the Flixborough explosion (1974). Therefore, the identification of human errors, especially in important and intricate systems, is necessary and unavoidable for devising control methods. Methods: The present research is a case study performed at the Zagross Methanol Company in Asalouyeh (South Pars). A walking-talking through method with process experts and control room operators, together with inspection of technical documents, was used to collect the required information and complete the Systematic Human Error Reduction and Prediction Approach (SHERPA) worksheets. Results: Analysis of the SHERPA worksheets indicated that 71.25% of the errors were unacceptable, 26.75% undesirable, 2% acceptable with revision, and 0% acceptable; after corrective actions, the forecast risk levels were 0% unacceptable, 4.35% undesirable, 58.55% acceptable with revision, and 37.1% acceptable. Conclusion: Finally, these results show that this method is applicable and useful in different industries, especially chemical industries, for identifying human errors that may lead to accidents.

  18. Natural Selection Reduced Diversity on Human Y Chromosomes

    Science.gov (United States)

    Wilson Sayres, Melissa A.; Lohmueller, Kirk E.; Nielsen, Rasmus

    2014-01-01

    The human Y chromosome exhibits surprisingly low levels of genetic diversity. This could result from neutral processes if the effective population size of males is reduced relative to females due to a higher variance in the number of offspring from males than from females. Alternatively, selection acting on new mutations, and affecting linked neutral sites, could reduce variability on the Y chromosome. Here, using genome-wide analyses of X, Y, autosomal and mitochondrial DNA, in combination with extensive population genetic simulations, we show that low observed Y chromosome variability is not consistent with a purely neutral model. Instead, we show that models of purifying selection are consistent with observed Y diversity. Further, the number of sites estimated to be under purifying selection greatly exceeds the number of Y-linked coding sites, suggesting the importance of the highly repetitive ampliconic regions. While we show that purifying selection removing deleterious mutations can explain the low diversity on the Y chromosome, we cannot exclude the possibility that positive selection acting on beneficial mutations could have also reduced diversity in linked neutral regions, and may have contributed to lowering human Y chromosome diversity. Because the functional significance of the ampliconic regions is poorly understood, our findings should motivate future research in this area. PMID:24415951

  19. Human error risk management for engineering systems: a methodology for design, safety assessment, accident investigation and training

    International Nuclear Information System (INIS)

    Cacciabue, P.C.

    2004-01-01

    The objective of this paper is to tackle methodological issues associated with the inclusion of cognitive and dynamic considerations into Human Reliability methods. A methodology called Human Error Risk Management for Engineering Systems is presented that offers a 'roadmap' for selecting and consistently applying Human Factors approaches in different areas of application and contains also a 'body' of possible methods and techniques of its own. Two types of possible application are discussed to demonstrate practical applications of the methodology. Specific attention is dedicated to the issue of data collection and definition from specific field assessment

  20. Quality assurance of human error modelling in a major probabilistic risk assessment programme

    International Nuclear Information System (INIS)

    Rycraft, H.S.

    1990-01-01

    A method of incorporating the consideration of operator error within a major PRA exercise is described along with the quality assurance procedures employed to ensure a quality product. The exercise was undertaken at the Sellafield Reprocessing Plant. (author)

  1. Experiences with Lean Six Sigma as improvement strategy to reduce parenteral medication administration errors and associated potential risk of harm.

    Science.gov (United States)

    van de Plas, Afke; Slikkerveer, Mariëlle; Hoen, Saskia; Schrijnemakers, Rick; Driessen, Johanna; de Vries, Frank; van den Bemt, Patricia

    2017-01-01

    In this controlled before-after study the effect of improvements, derived from Lean Six Sigma strategy, on parenteral medication administration errors and the potential risk of harm was determined. During baseline measurement, on control versus intervention ward, at least one administration error occurred in 14 (74%) and 6 (46%) administrations with potential risk of harm in 6 (32%) and 1 (8%) administrations. Most administration errors with high potential risk of harm occurred in bolus injections: 8 (57%) versus 2 (67%) bolus injections were injected too fast with a potential risk of harm in 6 (43%) and 1 (33%) bolus injections on control and intervention ward. Implemented improvement strategies, based on major causes of too fast administration of bolus injections, were: Substitution of bolus injections by infusions, education, availability of administration information and drug round tabards. Post intervention, on the control ward in 76 (76%) administrations at least one error was made (RR 1.03; CI95:0.77-1.38), with a potential risk of harm in 14 (14%) administrations (RR 0.45; CI95:0.20-1.02). In 40 (68%) administrations on the intervention ward at least one error occurred (RR 1.47; CI95:0.80-2.71) but no administrations were associated with a potential risk of harm. A shift in wrong duration administration errors from bolus injections to infusions, with a reduction of potential risk of harm, seems to have occurred on the intervention ward. Although data are insufficient to prove an effect, Lean Six Sigma was experienced as a suitable strategy to select tailored improvements. Further studies are required to prove the effect of the strategy on parenteral medication administration errors.
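
    For readers unfamiliar with the reported statistics, the sketch below (Python; counts taken from the control-ward figures in the abstract) shows how a relative risk and its 95% confidence interval are computed with the usual log-normal approximation; it reproduces the RR 1.03 (CI95 0.77-1.38) quoted above.

        import math

        def relative_risk(events_after, n_after, events_before, n_before, z=1.96):
            """Relative risk with a 95% CI (log-normal approximation)."""
            rr = (events_after / n_after) / (events_before / n_before)
            se = math.sqrt(1 / events_after - 1 / n_after
                           + 1 / events_before - 1 / n_before)
            lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
            return rr, lo, hi

        # Control ward: 14/19 administrations with an error at baseline, 76/100 after.
        print(relative_risk(76, 100, 14, 19))   # ~ (1.03, 0.77, 1.38)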

  2. The use of ionospheric tomography and elevation masks to reduce the overall error in single-frequency GPS timing applications

    Science.gov (United States)

    Rose, Julian A. R.; Tong, Jenna R.; Allain, Damien J.; Mitchell, Cathryn N.

    2011-01-01

    Signals from Global Positioning System (GPS) satellites at the horizon or at low elevations are often excluded from a GPS solution because they experience considerable ionospheric delays and multipath effects. Their exclusion can degrade the overall satellite geometry for the calculations, resulting in greater errors; an effect known as the Dilution of Precision (DOP). In contrast, signals from high elevation satellites experience less ionospheric delays and multipath effects. The aim is to find a balance in the choice of elevation mask, to reduce the propagation delays and multipath whilst maintaining good satellite geometry, and to use tomography to correct for the ionosphere and thus improve single-frequency GPS timing accuracy. GPS data, collected from a global network of dual-frequency GPS receivers, have been used to produce four GPS timing solutions, each with a different ionospheric compensation technique. One solution uses a 4D tomographic algorithm, Multi-Instrument Data Analysis System (MIDAS), to compensate for the ionospheric delay. Maps of ionospheric electron density are produced and used to correct the single-frequency pseudorange observations. This method is compared to a dual-frequency solution and two other single-frequency solutions: one does not include any ionospheric compensation and the other uses the broadcast Klobuchar model. Data from the solar maximum year 2002 and October 2003 have been investigated to display results when the ionospheric delays are large and variable. The study focuses on Europe and results are produced for the chosen test site, VILL (Villafranca, Spain). The effects of excluding all of the GPS satellites below various elevation masks, ranging from 5° to 40°, on timing solutions for fixed (static) and mobile (moving) situations are presented. The greatest timing accuracies when using the fixed GPS receiver technique are obtained by using a 40° mask, rather than a 5° mask. The mobile GPS timing solutions are most
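
    The trade-off between masking low-elevation satellites and degrading the geometry can be illustrated numerically; the sketch below (NumPy, with a purely hypothetical sky view) computes the geometric dilution of precision for different elevation masks.

        import numpy as np

        def gdop(sats_deg):
            """Geometric dilution of precision from (elevation, azimuth) pairs in degrees."""
            el, az = np.radians(np.asarray(sats_deg, dtype=float)).T
            # ENU line-of-sight unit vectors plus the receiver-clock column.
            h = np.column_stack([np.cos(el) * np.sin(az),
                                 np.cos(el) * np.cos(az),
                                 np.sin(el),
                                 np.ones_like(el)])
            return float(np.sqrt(np.trace(np.linalg.inv(h.T @ h))))

        # Hypothetical sky view: (elevation, azimuth) in degrees; at least four
        # satellites must remain above the mask for a solution to exist.
        sky = [(8, 40), (15, 120), (25, 200), (45, 310), (55, 80), (62, 250), (75, 10)]
        for mask in (5, 20, 40):
            visible = [s for s in sky if s[0] >= mask]
            print(f"mask {mask:2d} deg: {len(visible)} satellites, GDOP = {gdop(visible):.1f}")
        # Raising the mask removes the satellites most affected by the ionosphere and
        # multipath, but the remaining high-elevation geometry gives a larger DOP.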

  3. Porphyromonas endodontalis binds, reduces and grows on human hemoglobin.

    Science.gov (United States)

    Zerr, M; Drake, D; Johnson, W; Cox, C D

    2001-08-01

    Porphyromonas endodontalis is a black-pigmented, obligate anaerobic rod-shaped bacterium implicated as playing a major role in endodontic infections. We have previously shown that P. endodontalis requires the porphyrin nucleus, preferably supplied as hemoglobin, as a growth supplement. The bacteria also actively transport free iron, although this activity does not support growth in the absence of a porphyrin source. The purpose of this study was to further investigate the binding and subsequent utilization of human hemoglobin by P. endodontalis. P. endodontalis binds hemoglobin and reduces the Fe(III) porphyrin, resulting in a steady accumulation of ferrous hemoglobin. Reduction of methemoglobin was similar to the extracellular reduction of nitrobluetetrazolium in the presence of oxidizable substrate. Turbidimetric and viable cell determinations showed that P. endodontalis grew when supplied only hemoglobin. Therefore, we conclude that hemoglobin appears to serve as a sole carbon and nitrogen source, and that these bacteria reduce extracellular compounds at the expense of oxidized substrates.

  4. Effective use of pre-job briefing as tool for the prevention of human error; Effektive Nutzung der Arbeitsvorbesprechung als Werkzeug zur Vermeidung von Fehlhandlungen

    Energy Technology Data Exchange (ETDEWEB)

    Schlump, Ansgar [KLE GmbH, Lingen (Germany). Kernkraftwerk Emsland

    2015-06-15

    There is a fundamental demand to minimise the risks for workers and facilities while executing maintenance work. To ensure that facilities are secure and reliable, any deviation from normal operation behaviour has to be avoided. Accurate planning is the basis for minimising mistakes and making work more secure. All workers involved should understand how the work should be done and what is expected of them, in order to avoid human errors. Especially in nuclear power plants, the human performance tools (HPT) have proved to be an effective instrument to minimise human errors. These human performance tools consist of numerous different tools that complement each other (e.g. pre-job briefing). The safety culture of the plants is also characterised by these tools. The choice of the right HP tool is often a difficult task for the work planner. On the one hand, he wants to avoid mistakes during the execution of the work; on the other hand, he does not want to irritate the workers with unnecessary requirements. The proposed concept uses a simple risk analysis that takes into account the complexity of the task, past operating experience, and the consequences of failure. One main result of this risk analysis is a recommendation on the level of detail of the pre-job briefing, to reduce the risks for the staff involved to a minimum.
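
    A minimal sketch of the kind of simple risk analysis described above is given below; the rating scales, thresholds, and briefing levels are invented for illustration and are not the plant's actual scheme.

        # Hypothetical scoring scheme illustrating the kind of simple risk analysis
        # described in the abstract; scales and thresholds are invented here.
        def briefing_level(complexity, experience, consequence):
            """Each input rated 1 (low) .. 3 (high); operating experience counts inversely."""
            score = complexity + (4 - experience) + consequence   # range 3 .. 9
            if score <= 4:
                return "short verbal briefing"
            if score <= 6:
                return "standard pre-job briefing"
            return "detailed pre-job briefing with walk-down and HP tools"

        print(briefing_level(complexity=3, experience=1, consequence=3))
        # -> detailed pre-job briefing with walk-down and HP tools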

  5. Conciliatory gestures promote forgiveness and reduce anger in humans.

    Science.gov (United States)

    McCullough, Michael E; Pedersen, Eric J; Tabak, Benjamin A; Carter, Evan C

    2014-07-29

    Conflict is an inevitable component of social life, and natural selection has exerted strong effects on many organisms to facilitate victory in conflict and to deter conspecifics from imposing harms upon them. Like many species, humans likely possess cognitive systems whose function is to motivate revenge as a means of deterring individuals who have harmed them from harming them again in the future. However, many social relationships often retain value even after conflicts have occurred between interactants, so natural selection has very likely also endowed humans with cognitive systems whose function is to motivate reconciliation with transgressors whom they perceive as valuable and nonthreatening, notwithstanding their harmful prior actions. In a longitudinal study with 337 participants who had recently been harmed by a relationship partner, we found that conciliatory gestures (e.g., apologies, offers of compensation) were associated with increases in victims' perceptions of their transgressors' relationship value and reductions in perceptions of their transgressors' exploitation risk. In addition, conciliatory gestures appeared to accelerate forgiveness and reduce reactive anger via their intermediate effects on relationship value and exploitation risk. These results strongly suggest that conciliatory gestures facilitate forgiveness and reduce anger by modifying victims' perceptions of their transgressors' value as relationship partners and likelihood of recidivism.

  6. Error analysis for reducing noisy wide-gap concentric cylinder rheometric data for nonlinear fluids - Theory and applications

    Science.gov (United States)

    Borgia, Andrea; Spera, Frank J.

    1990-01-01

    This work discusses the propagation of errors for the recovery of the shear rate from wide-gap concentric cylinder viscometric measurements of non-Newtonian fluids. A least-square regression of stress on angular velocity data to a system of arbitrary functions is used to propagate the errors for the series solution to the viscometric flow developed by Krieger and Elrod (1953) and Pawlowski (1953) ('power-law' approximation) and for the first term of the series developed by Krieger (1968). A numerical experiment shows that, for measurements affected by significant errors, the first term of the Krieger-Elrod-Pawlowski series ('infinite radius' approximation) and the power-law approximation may recover the shear rate with equal accuracy as the full Krieger-Elrod-Pawlowski solution. An experiment on a clay slurry indicates that the clay has a larger yield stress at rest than during shearing, and that, for the range of shear rates investigated, a four-parameter constitutive equation approximates reasonably well its rheology. The error analysis presented is useful for studying the rheology of fluids such as particle suspensions, slurries, foams, and magma.
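
    The regression-plus-propagation idea can be sketched as follows (Python/NumPy, with synthetic noisy data; the power-law shear-rate expression is the first-order Krieger-Elrod-Pawlowski form for a concentric-cylinder gap of radius ratio kappa).

        import numpy as np

        rng = np.random.default_rng(0)
        omega = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])   # angular velocity, rad/s
        stress = 12.0 * omega**0.4 * (1 + 0.03 * rng.standard_normal(omega.size))  # Pa, noisy

        # Least-squares fit in log-log space gives the power-law index
        # n = d ln(stress)/d ln(omega) and its standard error from the covariance matrix.
        coef, cov = np.polyfit(np.log(omega), np.log(stress), deg=1, cov=True)
        n, n_err = coef[0], np.sqrt(cov[0, 0])
        print(f"power-law index n = {n:.3f} +/- {n_err:.3f}")

        # 'Power-law' approximation for the shear rate at the inner cylinder,
        # gamma_dot = 2*omega / (n * (1 - kappa**(2/n))), with kappa = Ri/Ro;
        # the uncertainty in n is propagated numerically.
        kappa = 0.5
        gamma = lambda nn: 2 * omega / (nn * (1 - kappa ** (2 / nn)))
        d_gamma = np.abs(gamma(n + n_err) - gamma(n - n_err)) / 2
        print(gamma(n).round(2), d_gamma.round(2))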

  7. Reducing Cognitive Skill Decay and Diagnostic Error: Theory-Based Practices for Continuing Education in Health Care

    Science.gov (United States)

    Weaver, Sallie J.; Newman-Toker, David E.; Rosen, Michael A.

    2012-01-01

    Missed, delayed, or wrong diagnoses can have a severe impact on patients, providers, and the entire health care system. One mechanism implicated in such diagnostic errors is the deterioration of cognitive diagnostic skills that are used rarely or not at all over a prolonged period of time. Existing evidence regarding maintenance of effective…

  8. Reducing user error in dipstick urinalysis with a low-cost slipping manifold and mobile phone platform (Conference Presentation)

    Science.gov (United States)

    Smith, Gennifer T.; Dwork, Nicholas; Khan, Saara A.; Millet, Matthew; Magar, Kiran; Javanmard, Mehdi; Bowden, Audrey K.

    2017-03-01

    Urinalysis dipsticks were designed to revolutionize urine-based medical diagnosis. They are cheap, extremely portable, and have multiple assays patterned on a single platform. They were also meant to be incredibly easy to use. Unfortunately, there are many aspects in both the preparation and the analysis of the dipsticks that are plagued by user error. This high error is one reason that dipsticks have failed to flourish in both the at-home market and in low-resource settings. Sources of error include: inaccurate volume deposition, varying lighting conditions, inconsistent timing measurements, and misinterpreted color comparisons. We introduce a novel manifold and companion software for dipstick urinalysis that eliminates the aforementioned error sources. A micro-volume slipping manifold ensures precise sample delivery, an opaque acrylic box guarantees consistent lighting conditions, a simple sticker-based timing mechanism maintains accurate timing, and custom software that processes video data captured by a mobile phone ensures proper color comparisons. We show that the results obtained with the proposed device are as accurate and consistent as a properly executed dip-and-wipe method, the industry gold-standard, suggesting the potential for this strategy to enable confident urinalysis testing. Furthermore, the proposed all-acrylic slipping manifold is reusable and low in cost, making it a potential solution for at-home users and low-resource settings.
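
    The colour-comparison step can be illustrated with a simple nearest-reference-swatch match; the reference RGB values and the measured pad colour below are invented for illustration and are not taken from the paper or its software.

        import math

        # Hypothetical reference swatches for a single dipstick pad (e.g. glucose);
        # the RGB values are invented for illustration only.
        REFERENCE = {
            "negative": (245, 240, 180),
            "trace":    (200, 220, 150),
            "moderate": (130, 190, 140),
            "large":    ( 60, 140, 130),
        }

        def classify_pad(rgb):
            """Return the reference label whose colour is closest in RGB space."""
            return min(REFERENCE, key=lambda label: math.dist(rgb, REFERENCE[label]))

        # Mean pad colour measured from a video frame (hypothetical value).
        print(classify_pad((140, 195, 145)))   # -> moderate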

  9. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action
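
    As background for the quantification being extended, the sketch below shows a simplified, crisp (non-fuzzy) CREAM-style adjustment in which a nominal failure probability is multiplied by one factor per common performance condition; the numerical values are invented for illustration, and the paper's contribution is precisely to replace such crisp ratings with fuzzy, unequally weighted ones.

        # Simplified, non-fuzzy illustration of CREAM-style quantification; the
        # nominal probability and CPC multipliers below are invented values.
        nominal_hep = 0.01                      # e.g. an execution-type cognitive function

        cpc_multipliers = {
            "adequacy of organisation":   1.0,
            "working conditions":         2.0,   # degrading
            "adequacy of procedures":     0.8,   # supportive
            "available time":             5.0,   # strongly degrading
            "crew collaboration quality": 1.0,
        }

        hep = nominal_hep
        for multiplier in cpc_multipliers.values():
            hep *= multiplier
        hep = min(hep, 1.0)                     # a probability cannot exceed 1

        print(f"adjusted HEP = {hep:.3g}")      # 0.01 * 2.0 * 0.8 * 5.0 = 0.08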

  10. The analysis of human error as causes in the maintenance of machines: a case study in mining companies

    Directory of Open Access Journals (Sweden)

    Kovacevic, Srdja

    2016-12-01

    Full Text Available This paper describes the two-step method used to analyse the factors and aspects influencing human error during the maintenance of mining machines. The first step is the cause-effect analysis, supported by brainstorming, where five factors and 21 aspects are identified. During the second step, the group fuzzy analytic hierarchy process is used to rank the identified factors and aspects. A case study is done on mining companies in Serbia. The key aspects are ranked according to an analysis that included experts who assess risks in mining companies (a maintenance engineer, a technologist, an ergonomist, a psychologist, and an organisational scientist). Failure to follow technical maintenance instructions, poor organisation of the training process, inadequate diagnostic equipment, and a lack of understanding of the work process are identified as the most important causes of human error.
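
    The weighting step underlying the (group fuzzy) analytic hierarchy process can be illustrated in its crisp form: weights are the normalised principal eigenvector of a pairwise comparison matrix. The 3x3 matrix and generic factor labels below are invented for illustration.

        import numpy as np

        # Invented 3x3 pairwise comparison matrix (Saaty 1-9 scale); the study itself
        # ranks 5 factors and 21 aspects with a group *fuzzy* AHP.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()
        print(dict(zip(["factor A", "factor B", "factor C"], weights.round(3))))

        # Saaty consistency check: CI = (lambda_max - n)/(n - 1), CR = CI/RI (RI = 0.58 for n = 3).
        n = A.shape[0]
        cr = (eigvals.real[k] - n) / (n - 1) / 0.58
        print(f"consistency ratio CR = {cr:.3f}")   # < 0.10 is conventionally acceptable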

  11. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis in determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using a logistic regression and stepwise variable selection was proposed, and the effects of the PSF on HEPs related with the soft controls were estimated through the methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, the procedure quality, practice level, and the operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitation of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
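
    A minimal sketch of the statistical idea, using synthetic data and scikit-learn rather than the authors' own data and stepwise selection procedure, is given below; exponentiated coefficients give the multiplicative effect of each PSF on the odds of error, mirroring the multiplicative form mentioned above.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)

        # Synthetic error-opportunity data (invented, purely illustrative): each row is
        # one opportunity, columns are PSF surrogates, y = 1 if an error occurred.
        n = 600
        procedure_poor = rng.integers(0, 2, n)   # 1 = poor procedure quality
        practice_low = rng.integers(0, 2, n)     # 1 = low practice level
        logit = -3.0 + 1.2 * procedure_poor + 0.8 * practice_low
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([procedure_poor, practice_low])
        model = LogisticRegression().fit(X, y)

        # exp(coefficient) ~ multiplicative effect of the PSF on the odds of error.
        for name, b in zip(["procedure quality", "practice level"], model.coef_[0]):
            print(f"{name}: odds multiplier ~ {np.exp(b):.2f}")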

  12. Human Factors Risk Analyses of a Doffing Protocol for Ebola-Level Personal Protective Equipment: Mapping Errors to Contamination.

    Science.gov (United States)

    Mumma, Joel M; Durso, Francis T; Ferguson, Ashley N; Gipson, Christina L; Casanova, Lisa; Erukunuakpor, Kimberly; Kraft, Colleen S; Walsh, Victoria L; Zimring, Craig; DuBose, Jennifer; Jacob, Jesse T

    2018-03-05

    Doffing protocols for personal protective equipment (PPE) are critical for keeping healthcare workers (HCWs) safe during care of patients with Ebola virus disease. We assessed the relationship between errors and self-contamination during doffing. Eleven HCWs experienced with doffing Ebola-level PPE participated in simulations in which HCWs donned PPE marked with surrogate viruses (ɸ6 and MS2), completed a clinical task, and were assessed for contamination after doffing. Simulations were video recorded, and a failure modes and effects analysis and fault tree analyses were performed to identify errors during doffing, quantify their risk (risk index), and predict contamination data. Fifty-one types of errors were identified, many having the potential to spread contamination. Hand hygiene and removing the powered air purifying respirator (PAPR) hood had the highest total risk indexes (111 and 70, respectively) and number of types of errors (9 and 13, respectively). ɸ6 was detected on 10% of scrubs and the fault tree predicted a 10.4% contamination rate, likely occurring when the PAPR hood inadvertently contacted scrubs during removal. MS2 was detected on 10% of hands, 20% of scrubs, and 70% of inner gloves and the predicted rates were 7.3%, 19.4%, 73.4%, respectively. Fault trees for MS2 and ɸ6 contamination suggested similar pathways. Ebola-level PPE can both protect and put HCWs at risk for self-contamination throughout the doffing process, even among experienced HCWs doffing with a trained observer. Human factors methodologies can identify error-prone steps, delineate the relationship between errors and self-contamination, and suggest remediation strategies.
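
    The two human-factors tools named above can be sketched numerically; the severity/occurrence scores and basic-event probabilities below are invented for illustration, and the study's own risk-index definition may differ in detail.

        # Illustrative only: the scoring scales and probabilities are invented.
        def risk_index(severity, occurrence):
            """A common FMEA-style risk index: severity x occurrence (1-10 scales)."""
            return severity * occurrence

        steps = {
            "hand hygiene":        risk_index(severity=8, occurrence=7),
            "remove PAPR hood":    risk_index(severity=9, occurrence=6),
            "remove inner gloves": risk_index(severity=6, occurrence=4),
        }
        print(max(steps, key=steps.get), steps)

        # Fault-tree OR gate: contamination occurs if any contributing error occurs
        # (basic events assumed independent).
        def or_gate(probabilities):
            p = 1.0
            for q in probabilities:
                p *= (1 - q)
            return 1 - p

        print(f"predicted contamination rate ~ {or_gate([0.05, 0.03, 0.03]):.3f}")  # ~0.106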

  13. Reducing visual deficits caused by refractive errors in school and preschool children: results of a pilot school program in the Andean region of Apurimac, Peru

    Science.gov (United States)

    Latorre-Arteaga, Sergio; Gil-González, Diana; Enciso, Olga; Phelan, Aoife; García-Muñoz, Ángel; Kohler, Johannes

    2014-01-01

    Background Refractive error is defined as the inability of the eye to bring parallel rays of light into focus on the retina, resulting in nearsightedness (myopia), farsightedness (hyperopia) or astigmatism. Uncorrected refractive error in children is associated with increased morbidity and reduced educational opportunities. Vision screening (VS) is a method for identifying children with visual impairment or eye conditions likely to lead to visual impairment. Objective To analyze the utility of vision screening conducted by teachers and to contribute to a better estimation of the prevalence of childhood refractive errors in Apurimac, Peru. Design A pilot vision screening program in preschool (Group I) and elementary school children (Group II) was conducted with the participation of 26 trained teachers. Children whose visual acuity was < 6/9 [20/30] (Group I) and ≤ 6/9 (Group II) in one or both eyes, measured with the Snellen Tumbling E chart at 6 m, were referred for a comprehensive eye exam. Specificity and positive predictive value to detect refractive error were calculated against clinical examination. Program assessment with participants was conducted to evaluate outcomes and procedures. Results A total sample of 364 children aged 3–11 were screened; 45 children were examined at Centro Oftalmológico Monseñor Enrique Pelach (COMEP) Eye Hospital. Prevalence of refractive error was 6.2% (Group I) and 6.9% (Group II); specificity of teacher vision screening was 95.8% and 93.0%, while positive predictive value was 59.1% and 47.8% for each group, respectively. Aspects highlighted to improve the program included extending training, increasing parental involvement, and helping referred children to attend the hospital. Conclusion Prevalence of refractive error in children is significant in the region. Vision screening performed by trained teachers is a valid intervention for early detection of refractive error, including screening of preschool children. Program
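
    The screening metrics reported above follow from a 2x2 comparison against the clinical exam; the sketch below uses hypothetical counts chosen so that they roughly reproduce the Group I figures (95.8% specificity, 59.1% positive predictive value).

        # Hypothetical 2x2 counts (the abstract reports only the derived percentages);
        # chosen so they roughly reproduce the Group I figures.
        def screening_metrics(tp, fp, tn, fn):
            specificity = tn / (tn + fp)   # non-cases correctly passed by teachers
            ppv = tp / (tp + fp)           # referred children who truly had refractive error
            return specificity, ppv

        spec, ppv = screening_metrics(tp=13, fp=9, tn=205, fn=2)
        print(f"specificity = {spec:.1%}, PPV = {ppv:.1%}")   # ~95.8%, ~59.1%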

  14. Distinguishing science from pseudoscience in school psychology: science and scientific thinking as safeguards against human error.

    Science.gov (United States)

    Lilienfeld, Scott O; Ammirati, Rachel; David, Michal

    2012-02-01

    Like many domains of professional psychology, school psychology continues to struggle with the problem of distinguishing scientific from pseudoscientific and otherwise questionable clinical practices. We review evidence for the scientist-practitioner gap in school psychology and provide a user-friendly primer on science and scientific thinking for school psychologists. Specifically, we (a) outline basic principles of scientific thinking, (b) delineate widespread cognitive errors that can contribute to belief in pseudoscientific practices within school psychology and allied professions, (c) provide a list of 10 key warning signs of pseudoscience, illustrated by contemporary examples from school psychology and allied disciplines, and (d) offer 10 user-friendly prescriptions designed to encourage scientific thinking among school psychology practitioners and researchers. We argue that scientific thinking, although fallible, is ultimately school psychologists' best safeguard against a host of errors in thinking. Copyright © 2011 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  15. HiRITER - An evaluation tool to reduce the adverse effect of inappropriate human actions in nuclear power plants

    International Nuclear Information System (INIS)

    Park, J.; Jung, W.; Kim, J.; Kim, S.; Heo, G.

    2012-01-01

    From end-users to regulatory bodies, it is widely recognized that human-induced events, including inappropriate human actions, are one of the most crucial sources degrading the overall safety of nuclear power plants (NPPs). This means that a systematic framework through which inappropriate human actions can be effectively identified is necessary to enhance the safety of NPPs. For this reason, HiRITER (High Risk Inducible Task Evaluator) has been developed, which is able to evaluate the effect of inappropriate human actions on risk as well as performance. To this end, HiRITER integrates three modules with distinct roles: a human error prediction module, which determines the types of failure modes resulting from inappropriate human actions in the associated daily task; a performance evaluation module, which computes the loss of electric power due to changes in component configurations caused by human error; and a risk evaluation module, which clarifies whether the propagation of a human error can trigger an unexpected shutdown of the NPP. In addition, a couple of real events that occurred in domestic NPPs are simulated in order to validate the feasibility of HiRITER. As a result, it is observed that the results of HiRITER are largely congruent with those of the real events. Therefore, although substantial additional effort is needed to enhance the overall accuracy of the estimated results, it is expected that HiRITER could be a good starting point to reduce the adverse effect of inappropriate human actions in NPPs
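
    The three-module structure can be pictured as a small pipeline; the module interfaces, names, and numbers below are invented to illustrate the data flow described in the abstract and are not HiRITER's actual implementation.

        # Invented illustration of the data flow only; not HiRITER's actual code.
        def predict_error_modes(daily_task):
            """Human error prediction module: failure modes for a routine task."""
            return ["wrong component manipulated"] if "valve line-up" in daily_task else []

        def evaluate_performance(error_modes):
            """Performance evaluation module: loss of electric power (MWe, notional)."""
            return 50.0 * len(error_modes)

        def evaluate_risk(error_modes):
            """Risk evaluation module: can the error propagate to an unexpected trip?"""
            return any("component" in mode for mode in error_modes)

        task = "valve line-up during periodic preventive maintenance"
        modes = predict_error_modes(task)
        print(modes, evaluate_performance(modes), "trip possible:", evaluate_risk(modes))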

  16. Human error and the associated recovery probabilities for soft control being used in the advanced MCRs of NPPs

    International Nuclear Information System (INIS)

    Jang, Inseok; Jung, Wondea; Seong, Poong Hyun

    2016-01-01

    Highlights: • The operating environment of MCRs in NPPs has changed with the adoption of digital HSIs. • Most current HRA databases are not explicitly designed to deal with digital HSI. • Empirical analysis for a new HRA DB under an advanced MCR mockup is carried out. • It is expected that the results can be used for advanced MCR HRA. - Abstract: Since the Three Mile Island (TMI)-2 accident, human error has been recognized as one of the main causes of Nuclear Power Plant (NPP) accidents, and numerous studies related to Human Reliability Analysis (HRA) have been carried out. Most of these studies focused on the conventional Main Control Room (MCR) environment. However, the operating environment of MCRs in NPPs has changed with the adoption of new human-system interfaces (HSI) largely based on up-to-date digital technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called advanced MCRs. Among the many features of advanced MCRs, soft controls are particularly important because operating actions in advanced MCRs are performed by soft control. Due to the difference in interfaces between soft controls and hardwired conventional controls, different HEPs should be used in the HRA for advanced MCRs. Unfortunately, most current HRA databases deal with operations in conventional MCRs and are not explicitly designed to deal with digital Human System Interface (HSI). For this reason, empirical human error and the associated error recovery probabilities were collected from the mockup of an advanced MCR equipped with soft controls. To this end, small-scale experiments were conducted in which 48 graduate students from the Department of Nuclear Engineering at the Korea Advanced Institute of Science and Technology (KAIST) participated, and accident scenarios were designed with respect to the typical Design Basis Accidents (DBAs) in NPPs, such as Steam Generator Tube Rupture
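
    A minimal sketch of how such empirical HEPs and recovery probabilities are typically derived from error counts is given below; the counts are invented for illustration and are not the study's data.

        # Invented counts for illustration; not the study's data.
        def probability_with_ci(k, n, z=1.96):
            """Point estimate k/n with a normal-approximation 95% confidence interval."""
            p = k / n
            half = z * (p * (1 - p) / n) ** 0.5
            return p, max(p - half, 0.0), min(p + half, 1.0)

        # e.g. 12 mode-conversion errors in 600 soft-control opportunities,
        # of which 9 were recovered by the operators before any plant effect.
        print("HEP:", probability_with_ci(12, 600))
        print("probability of non-recovery given an error:", probability_with_ci(3, 12))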

  17. Development of a new cause classification method considering plant ageing and human errors for adverse events which occurred in nuclear power plants and some results of its application

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa

    2007-01-01

    The adverse events which occurred in nuclear power plants are analyzed to prevent similar events, and in the analysis of each event, the cause of the event is classified by a cause classification method. This paper shows a new cause classification method which is improved in several points as follows: (1) the whole causes are systematically classified into three major categories such as machine system, operation system and plant outside causes, (2) the causes of the operation system are classified into several management errors normally performed in a nuclear power plant, (3) the content of ageing is defined in detail for their further analysis, (4) human errors are divided and defined by the error stage, (5) human errors can be related to background factors, and so on. This new method is applied to the adverse events which occurred in domestic and overseas nuclear power plants in 2005. From these results, it is clarified that operation system errors account for about 60% of the whole causes, of which approximately 60% are maintenance errors, about 40% are worker's human errors, and that the prevention of maintenance errors, especially worker's human errors is crucial. (author)

  18. Analysis of Task Types and Error Types of the Human Actions Involved in the Human-related Unplanned Reactor Trip Events

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun; Jung, Won Dea

    2008-02-01

    This report provides the task types and error types involved in the unplanned reactor trip events that have occurred during 1986 - 2006. The events that were caused by the secondary system of the nuclear power plants amount to 67 %, and the remaining 33 % was by the primary system. The contribution of the activities of the plant personnel was identified as the following order: corrective maintenance (25.7 %), planned maintenance (22.8 %), planned operation (19.8 %), periodic preventive maintenance (14.9 %), response to a transient (9.9 %), and design/manufacturing/installation (9.9%). According to the analysis of error modes, the error modes such as control failure (22.2 %), wrong object (18.5 %), omission (14.8 %), wrong action (11.1 %), and inadequate (8.3 %) take up about 75 % of all the unplanned trip events. The analysis of the cognitive functions involved showed that the planning function makes the highest contribution to the human actions leading to unplanned reactor trips, and it is followed by the observation function (23.4%), the execution function (17.8 %), and the interpretation function (10.3 %). The results of this report are to be used as important bases for development of the error reduction measures or development of the error mode prediction system for the test and maintenance tasks in nuclear power plants

  19. Analysis of Task Types and Error Types of the Human Actions Involved in the Human-related Unplanned Reactor Trip Events

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Park, Jin Kyun; Jung, Won Dea

    2008-02-15

    This report provides the task types and error types involved in the unplanned reactor trip events that have occurred during 1986 - 2006. The events that were caused by the secondary system of the nuclear power plants amount to 67 %, and the remaining 33 % was by the primary system. The contribution of the activities of the plant personnel was identified as the following order: corrective maintenance (25.7 %), planned maintenance (22.8 %), planned opera