WorldWideScience

Sample records for greatest errors occurred

  1. Responsibility for reporting patient death due to hospital error in Japan when an error occurred at a referring institution.

    Science.gov (United States)

    Maeda, Shoichi; Starkey, Jay; Kamishiraki, Etsuko; Ikeda, Noriaki

    2013-12-01

    In Japan, physicians are required to report unexpected health care-associated patient deaths to the police. Patients needing to be transferred to another institution often have complex medical problems. If a medical error occurs, it may occur either at the final or the referring institution. Some fear that liability will fall on the final institution regardless of where the error occurred or that the referring facility may oppose such reporting, leading to a failure to report to police or to recommend an autopsy. Little is known about the actual opinions of physicians and risk managers in this regard. The authors sent standardised, self-administered questionnaires to all hospitals in Japan that participate in the national general residency program. Most physicians and risk managers in Japan indicated that they would report a patient's death to the police in cases where the patient had been transferred. Of those who indicated they would not report to the police, the majority still indicated they would recommend an autopsy.

  2. The Greatest Mathematical Discovery?

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David H.; Borwein, Jonathan M.

    2010-05-12

    What mathematical discovery more than 1500 years ago: (1) Is one of the greatest, if not the greatest, single discovery in the field of mathematics? (2) Involved three subtle ideas that eluded the greatest minds of antiquity, even geniuses such as Archimedes? (3) Was fiercely resisted in Europe for hundreds of years after its discovery? (4) Even today, in historical treatments of mathematics, is often dismissed with scant mention, or else is ascribed to the wrong source? Answer: Our modern system of positional decimal notation with zero, together with the basic arithmetic computational schemes, which were discovered in India about 500 CE.

  3. Detailed semantic analyses of human error incidents occurring at nuclear power plant in USA (interim report). Characteristics of human error incidents occurring in the period from 1992 to 1996

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Tsuge, Tadashi; Sano, Toshiaki; Takano, Kenichi; Gouda, Hidenori

    2001-01-01

    CRIEPI has been conducting detailed analyses of all human error incidents at domestic nuclear power plants (NPPs) collected from Japanese Licensee Event Reports (LERs), using J-HPES (the Japanese version of HPES) as the analysis method. Results obtained by the analyses have been stored in the J-HPES database. Since 1999, human error incidents have also been selected from U.S. LERs and analyzed using J-HPES. In this report, the results, which classify error actions, causes, and preventive measures, are summarized for U.S. human error cases occurring in the period from 1992 to 1996. The classification suggested that the categories of error action were almost the same as those of Japanese human error cases. Therefore, problems in the process of error action and checkpoints for preventing errors will be extracted by analyzing both U.S. and domestic human error cases. It was also suggested that the interrelations between error actions, causes, and organizational factors could be identified. While taking these suggestions into consideration, we will continue to analyze U.S. human error cases. (author)

  4. Nature's Greatest Puzzles

    International Nuclear Information System (INIS)

    Quigg, Chris

    2005-01-01

    It is a pleasure to be part of the SLAC Summer Institute again, not simply because it is one of the great traditions in our field, but because this is a moment of great promise for particle physics. I look forward to exploring many opportunities with you over the course of our two weeks together. My first task in talking about Nature's Greatest Puzzles, the title of this year's Summer Institute, is to deconstruct the premise a little bit.

  5. Similarities between the target and the intruder in naturally-occurring repeated person naming errors

    Directory of Open Access Journals (Sweden)

    Serge Brédart

    2015-09-01

    The present study investigated an intriguing phenomenon that has received little attention so far: repeatedly calling a familiar person by someone else's name. From participants' responses to a questionnaire, these repeated naming errors were characterized with respect to a number of properties (e.g., type of names being substituted, error frequency, error longevity) and different features of similarity (e.g., age, gender, type of relationship with the participant, face resemblance and similarity of the contexts of encounter) between the bearer of the target name and the bearer of the wrong name. Moreover, it was evaluated whether the phonological similarity between names, the participants' age, the difference in age between the two persons whose names were substituted, and the face resemblance between the two persons predicted the frequency of error. Regression analyses indicated that phonological similarity between the target name and the wrong name predicted the frequency of repeated person naming errors. The age of the participant was also a significant predictor of error frequency: the older the participant, the higher the frequency of errors. Consistent with previous research stressing the importance of the age of acquisition of words on lexical access in speech production, results indicated that the bearer of the wrong name was on average known for longer than the bearer of the target name.

  6. Computational Physics' Greatest Hits

    Science.gov (United States)

    Bug, Amy

    2011-03-01

    The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session ``What is computational physics?'' speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session ``Great Advances...Past, Present and Future'' in which five dramatic areas of discovery (five of our ``greatest hits'') are chronicled: The physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo Methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify ``greatest hits'' in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.

  7. A model for the statistical description of analytical errors occurring in clinical chemical laboratories with time.

    Science.gov (United States)

    Hyvärinen, A

    1985-01-01

    The main purpose of the present study was to describe the statistical behaviour of daily analytical errors in the dimensions of place and time, providing a statistical basis for realistic estimates of the analytical error, and hence allowing the importance of the error and the relative contributions of its different sources to be re-evaluated. The observation material consists of creatinine and glucose results for control sera measured in daily routine quality control in five laboratories for a period of one year. The observation data were processed and computed by means of an automated data processing system. Graphic representations of time series of daily observations, as well as their means and dispersion limits when grouped over various time intervals, were investigated. For partition of the total variation several two-way analyses of variance were done with laboratory and various time classifications as factors. Pooled sets of observations were tested for normality of distribution and for consistency of variances, and the distribution characteristics of error variation in different categories of place and time were compared. Errors were found from the time series to vary typically between days. Due to irregular fluctuations in general and particular seasonal effects in creatinine, stable estimates of means or of dispersions for errors in individual laboratories could not be easily obtained over short periods of time but only from data sets pooled over long intervals (preferably at least one year). Pooled estimates of proportions of intralaboratory variation were relatively low (less than 33%) when the variation was pooled within days. However, when the variation was pooled over longer intervals this proportion increased considerably, even to a maximum of 89-98% (95-98% in each method category) when an outlying laboratory in glucose was omitted, with a concomitant decrease in the interaction component (representing laboratory-dependent variation with time
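
    The variance-partitioning step described above can be illustrated with a small sketch. This is not the study's code or data; it uses synthetic numbers only to show how the total variation in a laboratory-by-time table splits into laboratory, time, and residual components in a two-way layout.

```python
import numpy as np

# Synthetic illustration (not the study's data): partition the total variation of
# a laboratory-by-time table of control results into laboratory, time, and
# residual components, as in a two-way analysis of variance without replication.
rng = np.random.default_rng(0)
x = 80 + rng.normal(0.0, 3.0, size=(5, 12))     # 5 laboratories x 12 time periods

grand = x.mean()
lab_means = x.mean(axis=1, keepdims=True)
time_means = x.mean(axis=0, keepdims=True)

ss_lab = x.shape[1] * ((lab_means - grand) ** 2).sum()
ss_time = x.shape[0] * ((time_means - grand) ** 2).sum()
ss_resid = ((x - lab_means - time_means + grand) ** 2).sum()
ss_total = ((x - grand) ** 2).sum()

assert np.isclose(ss_lab + ss_time + ss_resid, ss_total)   # the partition is exact
print("interlaboratory share of total variation:", ss_lab / ss_total)
```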

  8. Detailed semantic analyses of human error incidents occurring at nuclear power plants. Extraction of periodical transition of error occurrence patterns by applying multivariate analysis

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Suzuki, Kunihiko; Takano, Kenichi; Kojima, Mitsuhiro

    2000-01-01

    Analyzing and evaluating human error incidents with an emphasis on human factors is essential for preventing their recurrence. Detailed and structured analyses of all incidents at domestic nuclear power plants (NPPs) reported during the last 31 years have been conducted based on J-HPES, and a total of 193 human error cases were identified. Results obtained by the analyses have been stored in the J-HPES database. In a previous study, by applying multivariate analysis to the above case studies, several occurrence patterns describing how errors occur at NPPs were identified. It was also clarified that the causes related to each human error differ depending on the period of occurrence. This paper describes the results obtained with respect to the periodic transition of human error occurrence patterns. By applying multivariate analysis to the above data, it was suggested that there are two types of occurrence pattern for each human error type: the first consists of common occurrence patterns that do not depend on the period, and the second is influenced by period-specific characteristics. (author)

  9. Dosage uniformity problems which occur due to technological errors in extemporaneously prepared suppositories in hospitals and pharmacies

    Science.gov (United States)

    Kalmár, Éva; Lasher, Jason Richard; Tarry, Thomas Dean; Myers, Andrea; Szakonyi, Gerda; Dombi, György; Baki, Gabriella; Alexander, Kenneth S.

    2013-01-01

    The availability of suppositories in Hungary, especially in clinical pharmacy practice, is usually provided by extemporaneous preparations. Due to the known advantages of rectal drug administration, its benefits are frequently utilized in pediatrics. However, errors during the extemporaneous manufacturing process can lead to non-homogeneous drug distribution within the dosage units. To determine the root cause of these errors and provide corrective actions, we studied suppository samples prepared with exactly known errors using both cerimetric titration and an HPLC technique. Our results show that the most frequent technological error occurs when the pharmacist fails to use the correct displacement factor in the calculations, which could lead to a 4.6% increase/decrease in the assay in individual dosage units. The second most important source of error can occur when the molding excess is calculated solely for the suppository base. This can further dilute the final suppository drug concentration, causing the assay to be as low as 80%. In conclusion, we emphasize that the application of predetermined displacement factors in calculations for the formulation of suppositories is highly important; it enables the pharmacist to produce a final product containing exactly the determined dose of an active substance despite the different densities of the components. PMID:25161378
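
    As an illustration of the displacement-factor arithmetic discussed above, the following minimal sketch (not from the paper; the numbers and function name are hypothetical) shows how ignoring the displacement factor shifts the amount of base weighed out for a batch, and hence the drug-to-base ratio of every finished unit.

```python
# Minimal sketch (hypothetical values, not from the study): the standard
# displacement-factor calculation for an extemporaneous suppository batch.
# dv = grams of drug that displace 1 g of suppository base.
def base_to_weigh(n_units: int, mold_capacity_g: float, dose_g: float, dv: float) -> float:
    """Grams of base needed for a batch of n_units suppositories."""
    return n_units * (mold_capacity_g - dose_g / dv)

n, mold, dose, dv = 10, 2.0, 0.10, 1.25      # hypothetical batch parameters

correct = base_to_weigh(n, mold, dose, dv)   # 19.2 g of base
naive = n * (mold - dose)                    # 19.0 g: displacement factor ignored (treated as 1)

# The mis-weighed base changes the drug-to-base ratio of the melt poured into the
# molds, and therefore the content of every finished dosage unit.
print(correct, naive, f"relative base error = {(correct - naive) / correct:.1%}")
```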

  10. Development of a new cause classification method considering plant ageing and human errors for adverse events which occurred in nuclear power plants and some results of its application

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa

    2007-01-01

    Adverse events which occur in nuclear power plants are analyzed to prevent similar events, and in the analysis of each event the cause is classified by a cause classification method. This paper presents a new cause classification method which is improved in several points: (1) all causes are systematically classified into three major categories, namely machine system, operation system and plant-outside causes; (2) the causes of the operation system are classified into errors in the several types of management normally performed in a nuclear power plant; (3) the content of ageing is defined in detail for further analysis; (4) human errors are divided and defined by error stage; (5) human errors can be related to background factors; and so on. The new method is applied to adverse events which occurred in domestic and overseas nuclear power plants in 2005. The results clarify that operation system errors account for about 60% of all causes, of which approximately 60% are maintenance errors and about 40% are workers' human errors, and that the prevention of maintenance errors, especially workers' human errors, is crucial. (author)

  11. Trend analysis and comparison of operators' human error events occurred at overseas and domestic nuclear power plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2006-01-01

    Human errors by operators at overseas and domestic nuclear power plants during the period from 2002 to 2005 were compared and their trends analyzed. The most frequently cited cause of such errors was 'insufficient team monitoring' (inadequate superiors' and other crews' instructions and supervision) both at overseas and domestic plants, followed by 'insufficient self-checking' (lack of caution by the operator himself). A comparison of the effects of the errors on the operations of plants in Japan and the United States showed that the drop in plant output and plant shutdowns at plants in Japan were approximately one-tenth of those in the United States. The ratio of automatic reactor trips to the total number of human errors reported is about 6% for both Japanese and American plants. Looking at changes in the incidence of human errors by year of occurrence, although a distinctive trend cannot be identified for domestic nuclear power plants due to insufficient reported cases, 'inadequate self-checking' as a factor contributing to human errors at overseas nuclear power plants has decreased significantly over the past four years. Regarding changes in the effects of human errors on the operations of plants during the four-year period, events leading to an automatic reactor trip have tended to increase at American plants. Conceivable factors behind this increasing tendency include lack of operating experience by a team (e.g., plant transients and reactor shutdowns and startups) and excessive dependence on training simulators. (author)

  12. Detailed semantic analyses of human error incidents occurring at domestic nuclear power plants to fiscal year 2000

    International Nuclear Information System (INIS)

    Tsuge, Tadashi; Hirotsu, Yuko; Takano, Kenichi; Ebisu, Mitsuhiro; Tsumura, Joji

    2003-01-01

    Analysing and evaluating observed human error incidents, with emphasis on the human factors and behaviour involved, is essential for preventing their recurrence. CRIEPI has been conducting detailed and structured analyses, based on J-HPES, of all incidents reported during the last 35 years, from the start of operation of the first Tokai nuclear power plant until fiscal year 2000; a total of 212 human error cases were identified. Results obtained by the analyses have been stored in the J-HPES database. This report summarizes the semantic analyses of all case studies stored in that database, in order to grasp the practical, concrete contents and trends of the more frequently observed human errors (called trigger actions here), causal factors and preventive measures. The semantic analyses were executed by classifying these items, using the KJ method, into categories that could be considered as having almost the same meaning. Typical results are as follows: (1) Trigger actions could be classified into operation and maintenance categories. 'Operational timing errors' and 'operational quantitative errors' were the major trigger actions in operation, accounting for about 20% of all actions; 'maintenance quantitative errors' were the major trigger actions in maintenance, accounting for about a quarter of all actions. (2) Causal factors: 'human internal status' was the major factor category; in concrete terms, 'improper persistence' and 'lack of knowledge' were prominent. (3) Preventive measures: the most frequent measures were job management changes involving procedural (software) improvements, accounting for 70% to 80%. For operation, software improvements have been implemented on 'organization and work practices' and 'individual consciousness'; for maintenance, improvements have been implemented on 'organization and work practices'. (author)

  13. Nature's Greatest Puzzles

    Energy Technology Data Exchange (ETDEWEB)

    Quigg, Chris; /Fermilab

    2005-02-01

    It is a pleasure to be part of the SLAC Summer Institute again, not simply because it is one of the great traditions in our field, but because this is a moment of great promise for particle physics. I look forward to exploring many opportunities with you over the course of our two weeks together. My first task in talking about Nature's Greatest Puzzles, the title of this year's Summer Institute, is to deconstruct the premise a little bit.

  14. Comparison of maintenance worker's human error events occurred at United States and domestic nuclear power plants. The proposal of the classification method with insufficient knowledge and experience and the classification result of its application

    International Nuclear Information System (INIS)

    Takagawa, Kenichi

    2008-01-01

    Human errors by maintenance workers in U.S. nuclear power plants were compared with those in Japanese nuclear power plants for the same period in order to identify the characteristics of such errors. For U.S. events, cases that occurred during 2006 were selected from the Nuclear Information Database of the Institute of Nuclear Safety System, while Japanese cases from the same period were extracted from the Nuclear Information Archives (NUCIA) owned by JANTI. The most common cause of human errors was 'insufficient knowledge or experience', accounting for about 40% of U.S. cases and 50% or more of cases in Japan. To break down 'insufficient knowledge', we classified the content of knowledge into five categories: 'method', 'nature', 'reason', 'scope' and 'goal', and classified the level of knowledge into four categories: 'known', 'comprehended', 'applied' and 'analytic'. Using this classification, the patterns of combination of knowledge content and knowledge level were compared. In the U.S. cases, errors due to insufficient knowledge of 'nature' and of 'method' were prevalent, while the three other items, 'reason', 'scope' and 'goal', which involve work conditions, rarely occurred. In Japan, errors arising from 'nature' not being comprehended were rather prevalent, while other cases were distributed evenly across all categories, including those involving work conditions. To address 'insufficient knowledge or experience', we consider the following approaches valid: according to the knowledge level required for the work, the knowledge should be reflected in procedures or education materials, and training with confirmation of the level of understanding, virtual practice and instruction based on experience should be implemented. As for knowledge of work conditions, it is necessary to enter the work conditions in procedures and education materials while conducting training or education. (author)

  15. Masses of galaxies and the greatest redshifts of quasars

    Energy Technology Data Exchange (ETDEWEB)

    Hills, J G [Illinois Univ., Urbana (USA)

    1977-04-01

    The outer parts of a typical galaxy follow an R^-2 density distribution, which results in the collapse time of its protogalaxy being proportional to its mass. Since quasars probably occur in the nuclei of galaxies, which can only form after the collapse of their parent galaxies, their greatest observed redshift, Z_max, is largely determined by the mass, M_t, of a typical protogalaxy. The observed Z_max of quasars indicates that M_t = 1 x 10^12 solar masses. This mass is consistent with the masses of galaxies found in recent dynamical studies. It indicates that most of the mass in a typical galaxy is in the halo lying beyond the familiar optically-bright core, but the mass of a standard galaxy is still only 0.3 of that required for galaxies alone to close the universe.

  16. Drought occurrence

    Science.gov (United States)

    John W. Coulston

    2007-01-01

    Why Is Drought Important? Drought is an important forest disturbance that occurs regularly in the Western United States and irregularly in the Eastern United States (Dale and others 2001). Moderate drought stress tends to slow plant growth while severe drought stress can also reduce photosynthesis (Kareiva and others 1993). Drought can also interact with...

  17. Learning from prescribing errors

    OpenAIRE

    Dean, B

    2002-01-01

    

 The importance of learning from medical error has recently received increasing emphasis. This paper focuses on prescribing errors and argues that, while learning from prescribing errors is a laudable goal, there are currently barriers that can prevent this occurring. Learning from errors can take place on an individual level, at a team level, and across an organisation. Barriers to learning from prescribing errors include the non-discovery of many prescribing errors, lack of feedback to th...

  18. Was ocean acidification responsible for history's greatest extinction?

    Science.gov (United States)

    Schultz, Colin

    2011-11-01

    Two hundred fifty million years ago, the world suffered the greatest recorded extinction of all time. More than 90% of marine animals and a majority of terrestrial species disappeared, yet the cause of the Permian-Triassic boundary (PTB) die-off remains unknown. Various theories abound, with most focusing on rampant Siberian volcanism and its potential consequences: global warming, carbon dioxide poisoning, ocean acidification, or the severe drawdown of oceanic dissolved oxygen levels, also known as anoxia. To narrow the range of possible causes, Montenegro et al. ran climate simulations for the PTB using the University of Victoria Earth System Climate Model, a carbon cycle-climate coupled general circulation model.

  19. Massive the Higgs boson and the greatest hunt in science

    CERN Document Server

    Sample, Ian

    2013-01-01

    Now fully updated -- this is the dramatic and gripping account of the greatest scientific discovery of our time. In the early 1960s, three groups of physicists, working independently in different countries, stumbled upon an idea that would change physics and fuel the imagination of scientists for decades. That idea was the Higgs boson -- to find it would be to finally understand the origins of mass -- the last building block of life itself. Now, almost 50 years later, that particle has finally been discovered.

  20. Action errors, error management, and learning in organizations.

    Science.gov (United States)

    Frese, Michael; Keith, Nina

    2015-01-03

    Every organization is confronted with errors. Most errors are corrected easily, but some may lead to negative consequences. Organizations often focus on error prevention as a single strategy for dealing with errors. Our review suggests that error prevention needs to be supplemented by error management--an approach directed at effectively dealing with errors after they have occurred, with the goal of minimizing negative and maximizing positive error consequences (examples of the latter are learning and innovations). After defining errors and related concepts, we review research on error-related processes affected by error management (error detection, damage control). Empirical evidence on positive effects of error management in individuals and organizations is then discussed, along with emotional, motivational, cognitive, and behavioral pathways of these effects. Learning from errors is central, but like other positive consequences, learning occurs under certain circumstances--one being the development of a mind-set of acceptance of human error.

  1. Penicillin: the medicine with the greatest impact on therapeutic outcomes.

    Science.gov (United States)

    Kardos, Nelson; Demain, Arnold L

    2011-11-01

    The principal point of this paper is that the discovery of penicillin, and the development of the supporting technologies in microbiology and chemical engineering leading to its commercial-scale production, make it the medicine with the greatest impact on therapeutic outcomes. Our nomination of penicillin for the top therapeutic molecule rests on two lines of evidence concerning the impact of this event: (1) the magnitude of the therapeutic outcomes resulting from the clinical application of penicillin and the subsequent widespread use of antibiotics and (2) the technologies developed for production of penicillin, including both microbial strain selection and improvement plus chemical engineering methods responsible for successful submerged fermentation production. These became the basis for production of all subsequent antibiotics in use today. These same technologies became the model for the development and production of new types of bioproducts (i.e., anticancer agents, monoclonal antibodies, and industrial enzymes). The clinical impact of penicillin was large and immediate. By ushering in the widespread clinical use of antibiotics, penicillin was responsible for enabling the control of many infectious diseases that had previously burdened mankind, with subsequent impact on global population demographics. Moreover, the large cumulative public effect of the many new antibiotics and new bioproducts that were developed and commercialized on the basis of the science and technology after penicillin demonstrates that penicillin represents the therapeutic event with the greatest impact of all time. © Springer-Verlag 2011

  2. Greatest Happiness Principle in a Complex System Approach

    Directory of Open Access Journals (Sweden)

    Katalin Martinás

    2012-06-01

    The principle of greatest happiness was the basis of ethics in Plato's and Aristotle's work; it served as the basis of the utility principle in economics, and happiness research has recently become a hot topic in the social sciences in Western countries, particularly in economics. Nevertheless, there is considerable scientific pessimism over whether it is even possible to effect sustainable increases in happiness. In this paper we outline an economic theory of decision based on the greatest happiness principle (GHP). Modern equilibrium economics is a simple-system simplification of the GHP, whereas the complex approach outlines a non-equilibrium economic theory. The comparison of the approaches reveals that part of the results - laws of modern economics - follow from the simplifications and run counter to economic nature. The most important consequence is that, within the free market economy, one cannot be sure that the path it finds leads to a beneficial economic system.

  3. Field error lottery

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, C.J.; McVey, B. (Los Alamos National Lab., NM (USA)); Quimby, D.C. (Spectra Technology, Inc., Bellevue, WA (USA))

    1990-01-01

    The level of field errors in an FEL is an important determinant of its performance. We have computed 3D performance of a large laser subsystem subjected to field errors of various types. These calculations have been guided by simple models such as SWOOP. The technique of choice is utilization of the FELEX free electron laser code that now possesses extensive engineering capabilities. Modeling includes the ability to establish tolerances of various types: fast and slow scale field bowing, field error level, beam position monitor error level, gap errors, defocusing errors, energy slew, displacement and pointing errors. Many effects of these errors on relative gain and relative power extraction are displayed and are the essential elements of determining an error budget. The random errors also depend on the particular random number seed used in the calculation. The simultaneous display of the performance versus error level of cases with multiple seeds illustrates the variations attributable to stochasticity of this model. All these errors are evaluated numerically for comprehensive engineering of the system. In particular, gap errors are found to place requirements beyond mechanical tolerances of ±25 µm, and amelioration of these may occur by a procedure utilizing direct measurement of the magnetic fields at assembly time. 4 refs., 12 figs.
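
    To make the multi-seed scan concrete, here is a minimal sketch under stated assumptions: the degradation model is a toy stand-in, not FELEX, and the function name and numbers are hypothetical. It only illustrates how a performance metric is evaluated at several error levels over several random seeds so that the stochastic spread can feed an error budget.

```python
import numpy as np

# Toy error-budget scan (illustrative only; the physics model is a placeholder).
def relative_gain(rms_error: float, rng: np.random.Generator, n_periods: int = 100) -> float:
    # Hypothetical degradation model: random per-period field errors produce a
    # trajectory wander that reduces gain roughly quadratically.
    kicks = rng.normal(0.0, rms_error, n_periods)
    wander = np.cumsum(kicks)
    return float(np.exp(-np.mean(wander ** 2)))

for level in (0.01, 0.03, 0.1):                    # rms field-error levels to scan
    gains = [relative_gain(level, np.random.default_rng(seed)) for seed in range(10)]
    print(f"rms error {level}: gain {np.mean(gains):.3f} +/- {np.std(gains):.3f}")
```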

  4. Error Patterns

    NARCIS (Netherlands)

    Hoede, C.; Li, Z.

    2001-01-01

    In coding theory the problem of decoding focuses on error vectors. In the simplest situation code words are $(0,1)$-vectors, as are the received messages and the error vectors. Comparison of a received word with the code words yields a set of error vectors. In deciding on the original code word,

  5. Team errors: definition and taxonomy

    International Nuclear Information System (INIS)

    Sasou, Kunihide; Reason, James

    1999-01-01

    In error analysis or error management, the focus is usually upon individuals who have made errors. In large complex systems, however, most people work in teams or groups. Considering this working environment, insufficient emphasis has been given to 'team errors'. This paper discusses the definition and taxonomy of team errors. These notions are also applied to events that have occurred in the nuclear power, aviation and shipping industries. The paper also discusses the relations between team errors and Performance Shaping Factors (PSFs). As a result, the proposed definition and taxonomy are found to be useful in categorizing team errors. The analysis also reveals that deficiencies in communication and resource/task management, an excessive authority gradient, and excessive professional courtesy can cause team errors. Handling human errors as team errors provides an opportunity to reduce human errors.

  6. Operator errors

    International Nuclear Information System (INIS)

    Knuefer; Lindauer

    1980-01-01

    In addition, spectacular events often involve a combination of component failure and human error. In particular, the Rasmussen Report and the German Risk Assessment Study show for pressurised water reactors that human error must not be underestimated. Although operator errors, as a form of human error, can never be eliminated entirely, they can be minimized and their effects kept within acceptable limits if thorough training of personnel is combined with an adequate design of the plant against accidents. In contrast to the investigation of engineering errors, the investigation of human errors has so far been carried out with relatively small budgets. Intensified investigations in this field appear to be a worthwhile effort. (orig.)

  7. A description of medication errors reported by pharmacists in a neonatal intensive care unit.

    Science.gov (United States)

    Pawluk, Shane; Jaam, Myriam; Hazi, Fatima; Al Hail, Moza Sulaiman; El Kassem, Wessam; Khalifa, Hanan; Thomas, Binny; Abdul Rouf, Pallivalappila

    2017-02-01

    Background Patients in the Neonatal Intensive Care Unit (NICU) are at an increased risk for medication errors. Objective The objective of this study is to describe the nature and setting of medication errors occurring in patients admitted to an NICU in Qatar, as reported by pharmacists through a standard electronic system. Setting Neonatal intensive care unit, Doha, Qatar. Method This was a retrospective cross-sectional study on medication errors reported electronically by pharmacists in the NICU between January 1, 2014 and April 30, 2015. Main outcome measure Data collected included patient information and incident details, including error category, medications involved, and follow-up completed. Results A total of 201 pharmacist-reported medication errors in the NICU were submitted during the study period. None of the reported errors reached the patient or caused harm. Of the errors reported, 98.5% occurred in the prescribing phase of the medication process, with 58.7% being due to calculation errors. Overall, 53 different medications were documented in error reports, with the anti-infective agents being the most frequently cited. The majority of incidents indicated that the primary prescriber was contacted and the error was resolved before reaching the next phase of the medication process. Conclusion Medication errors reported by pharmacists occur most frequently in the prescribing phase of the medication process. Our data suggest that error reporting systems need to be specific to the population involved. Special attention should be paid to frequently used medications in the NICU as these were responsible for the greatest numbers of medication errors.

  8. Co-Occurring Disorders

    Science.gov (United States)


  9. Einstein's error

    International Nuclear Information System (INIS)

    Winterflood, A.H.

    1980-01-01

    In discussing Einstein's Special Relativity theory it is claimed that it violates the principle of relativity itself and that an anomalous sign in the mathematics is found in the factor which transforms one inertial observer's measurements into those of another inertial observer. The apparent source of this error is discussed. Having corrected the error a new theory, called Observational Kinematics, is introduced to replace Einstein's Special Relativity. (U.K.)

  10. Error forecasting schemes of error correction at receiver

    International Nuclear Information System (INIS)

    Bhunia, C.T.

    2007-08-01

    To combat errors in computer communication networks, ARQ (Automatic Repeat Request) techniques are used. Recently, Chakraborty proposed a simple technique called the packet combining scheme, in which errors are corrected at the receiver from the erroneous copies. The Packet Combining (PC) scheme fails (i) when the bit error locations in the erroneous copies are the same and (ii) when multiple bit errors occur. Both of these have recently been addressed by two schemes known as the Packet Reversed Packet Combining (PRPC) scheme and the Modified Packet Combining (MPC) scheme, respectively. In this letter, two error forecasting correction schemes are reported which, in combination with PRPC, offer higher throughput. (author)
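
    For illustration only, the sketch below shows the basic packet combining idea in a generic form (it is not the letter's PRPC or MPC scheme, and the integrity check is a placeholder): two erroneous copies are compared, positions where they differ are treated as candidate error locations, and the receiver searches over those positions for a candidate that passes the integrity check. It also shows why plain PC fails when both copies are corrupted at the same position.

```python
from itertools import product
from typing import Optional

def passes_check(packet: bytes) -> bool:
    # Placeholder integrity check; a real receiver would verify the packet's CRC.
    return packet == b"HELLO"

def packet_combine(copy1: bytes, copy2: bytes) -> Optional[bytes]:
    """Try to recover the packet from two erroneous received copies."""
    # Positions where the copies disagree are the candidate error locations.
    diff = [(i, b2) for i, (b1, b2) in enumerate(zip(copy1, copy2)) if b1 != b2]
    # Search over "take this position from copy1 or copy2" until the check passes.
    for choice in product((False, True), repeat=len(diff)):
        candidate = bytearray(copy1)
        for take_second, (i, b2) in zip(choice, diff):
            if take_second:
                candidate[i] = b2
        if passes_check(bytes(candidate)):
            return bytes(candidate)
    return None   # e.g. both copies corrupted at the same position: plain PC fails

print(packet_combine(b"HELLX", b"HXLLO"))   # b'HELLO'
print(packet_combine(b"HXLLO", b"HXLLX"))   # None -- same-position corruption
```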

  11. The greatest hydroelectric power plant in the world. Itaipu Hydroelectric Power Plant

    International Nuclear Information System (INIS)

    Andonov - Chento, Ilija

    2004-01-01

    Details demonstrating the size and engineering achievements of one of the world's greatest hydroelectric power plants are given. The principal technical features of the construction and operation of the Itaipu Dam are tabulated and discussed.

  12. Medication Errors - A Review

    OpenAIRE

    Vinay BC; Nikhitha MK; Patel Sunil B

    2015-01-01

    In this review article, the definition of medication errors, the medication error problem, types of medication errors, their common causes, monitoring of medication errors, their consequences, and the prevention and management of medication errors are explained clearly and legibly, with tables that are easy to understand.

  13. School Issues Under [Section] 504 and the ADA: The Latest and Greatest.

    Science.gov (United States)

    Aleman, Steven R.

    This paper highlights recent guidance and rulings from the Office of Civil Rights (OCR) of interest to administrators, advocates, and attorneys. It is a companion piece to Student Issues on Section 504/ADA: The Latest and Greatest. Compliance with Section 504 and the Americans with Disabilities Act (ADA) continues to involve debate and dialog on…

  14. Stigma and Discrimination in HIV/AIDS; The greatest Challenge to ...

    African Journals Online (AJOL)

    The greatest challenge to the efforts of the various agencies and governments in the care, support and treatment of people living with HIV/AIDS appears to be stigma and discrimination. Stigma and discrimination have to be addressed through public education, legislation to protect people living with HIV/AIDS and also by ...

  15. FedWeb Greatest Hits: Presenting the New Test Collection for Federated Web Search

    NARCIS (Netherlands)

    Demeester, Thomas; Trieschnigg, Rudolf Berend; Zhou, Ke; Nguyen, Dong-Phuong; Hiemstra, Djoerd

    This paper presents 'FedWeb Greatest Hits', a large new test collection for research in web information retrieval. As a combination and extension of the datasets used in the TREC Federated Web Search Track, this collection opens up new research possibilities on federated web search challenges, as

  16. Error Budgeting

    Energy Technology Data Exchange (ETDEWEB)

    Vinyard, Natalia Sergeevna [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Perry, Theodore Sonne [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Usov, Igor Olegovich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-04

    We calculate opacity from k(hν) = −ln[T(hν)]/(ρL), where T(hν) is the transmission for photon energy hν, ρ is the sample density, and L is the path length through the sample. The density and path length are measured together by Rutherford backscatter. The error is Δk = (∂k/∂T)ΔT + (∂k/∂(ρL))Δ(ρL). We can re-write this in terms of fractional errors as Δk/k = Δln(T)/ln(T) + Δ(ρL)/(ρL). Transmission itself is calculated from T = (U−E)/(V−E) = B/B0, where B is the transmitted backlighter (BL) signal and B0 is the unattenuated backlighter signal. Then ΔT/T = Δln(T) = ΔB/B + ΔB0/B0, and consequently Δk/k = (1/ln T)(ΔB/B + ΔB0/B0) + Δ(ρL)/(ρL). Transmission is measured in the range of 0.2
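
    The fractional-error budget above reduces to a few lines of arithmetic. The following is a hedged sketch with made-up numbers, not the report's analysis:

```python
import math

# Minimal sketch (hypothetical values) of the error budget described above:
# opacity k = -ln(T) / (rho*L), with T = B / B0, and fractional errors that add.
def opacity_error_budget(B, dB, B0, dB0, rhoL, drhoL):
    T = B / B0
    k = -math.log(T) / rhoL
    # Fractional error on T (and hence the absolute error on ln T): dB/B + dB0/B0
    dlnT = dB / B + dB0 / B0
    # Fractional error on k: (1/|ln T|)*(dB/B + dB0/B0) + d(rhoL)/(rhoL)
    dk_over_k = dlnT / abs(math.log(T)) + drhoL / rhoL
    return k, dk_over_k

k, frac_err = opacity_error_budget(B=0.3, dB=0.006, B0=1.0, dB0=0.01,
                                   rhoL=2.0e-3, drhoL=4.0e-5)
print(f"k = {k:.1f}, fractional error = {frac_err:.1%}")
```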

  17. Error begat error: design error analysis and prevention in social infrastructure projects.

    Science.gov (United States)

    Love, Peter E D; Lopez, Robert; Edwards, David J; Goh, Yang M

    2012-09-01

    Design errors contribute significantly to cost and schedule growth in social infrastructure projects and to engineering failures, which can result in accidents and loss of life. Despite considerable research that has addressed error causation in construction projects, design errors remain prevalent. This paper identifies the underlying conditions that contribute to design errors in social infrastructure projects (e.g. hospitals, education, law and order type buildings). A systemic model of error causation is propagated and subsequently used to develop a learning framework for design error prevention. The research suggests that a multitude of strategies should be adopted in congruence to prevent design errors from occurring and so ensure that safety and project performance are ameliorated. Copyright © 2011. Published by Elsevier Ltd.

  18. Greatest Happiness Principle in a Complex System: Maximisation versus Driving Force

    Directory of Open Access Journals (Sweden)

    Katalin Martinás

    2012-06-01

    From a philosophical point of view, micro-founded economic theories depart from the principle of the pursuit of the greatest happiness. From a mathematical point of view, they depart from the utility maximisation program. Though economists are aware of the serious limitations of equilibrium analysis, they remain in that framework. We show that the maximisation principle, which implies the equilibrium hypothesis, is responsible for this impasse. We formalise the pursuit of the greatest happiness principle with the help of the driving force postulate: the volumes of activities depend on the expected wealth increase. In that case we can get rid of the equilibrium hypothesis and gain new insights into economic theory. For example, to what extent do standard economic results depend on the equilibrium hypothesis?

  19. Social Media - DoD’s Greatest Information Sharing Tool or Weakest Security Link?

    Science.gov (United States)

    2010-04-15

    Civilian Research Paper, August 2009 - April 2010. The paper examines the appropriateness and effectiveness of DoD social media policies in securing the information network.

  20. The conditions for attaining the greatest degree of system stability with strict generator excitation control

    Energy Technology Data Exchange (ETDEWEB)

    Gruzdev, I.A.; Ekimova, M.M.; Truspekova, G.A.

    1982-01-01

    Expressions are derived for an idealized model of a complex electric power system; these expressions define the greatest level of stability of an electric power system and the optimum combination of stabilization factors with automatic excitation control in a single power system. The possibility of increasing the level of stability of an electric power system by simultaneous strict automatic excitation control of the synchronous generators in several power systems is analyzed.

  1. Asymptotics for the greatest zeros of solutions of a particular O.D.E.

    Directory of Open Access Journals (Sweden)

    Silvia Noschese

    1994-05-01

    This paper deals with the Liouville-Stekeloff method for approximating solutions of homogeneous linear ODEs and with a general result due to Tricomi which provides estimates for the zeros of functions by means of the knowledge of an asymptotic representation. From these classical tools we deduce information about the asymptotics of the greatest zeros of a class of solutions of a particular ODE, including the classical Hermite polynomials.

  2. Errors in clinical laboratories or errors in laboratory medicine?

    Science.gov (United States)

    Plebani, Mario

    2006-01-01

    Laboratory testing is a highly complex process and, although laboratory services are relatively safe, they are not as safe as they could or should be. Clinical laboratories have long focused their attention on quality control methods and quality assessment programs dealing with analytical aspects of testing. However, a growing body of evidence accumulated in recent decades demonstrates that quality in clinical laboratories cannot be assured by merely focusing on purely analytical aspects. The more recent surveys on errors in laboratory medicine conclude that in the delivery of laboratory testing, mistakes occur more frequently before (pre-analytical) and after (post-analytical) the test has been performed. Most errors are due to pre-analytical factors (46-68.2% of total errors), while a high error rate (18.5-47% of total errors) has also been found in the post-analytical phase. Errors due to analytical problems have been significantly reduced over time, but there is evidence that, particularly for immunoassays, interference may have a serious impact on patients. A description of the most frequent and risky pre-, intra- and post-analytical errors and advice on practical steps for measuring and reducing the risk of errors is therefore given in the present paper. Many mistakes in the Total Testing Process are called "laboratory errors", although these may be due to poor communication, action taken by others involved in the testing process (e.g., physicians, nurses and phlebotomists), or poorly designed processes, all of which are beyond the laboratory's control. Likewise, there is evidence that laboratory information is only partially utilized. A recent document from the International Organization for Standardization (ISO) recommends a new, broader definition of the term "laboratory error" and a classification of errors according to different criteria. In a modern approach to total quality, centered on patients' needs and satisfaction, the risk of errors and mistakes

  3. How Do Simulated Error Experiences Impact Attitudes Related to Error Prevention?

    Science.gov (United States)

    Breitkreuz, Karen R; Dougal, Renae L; Wright, Melanie C

    2016-10-01

    The objective of this project was to determine whether simulated exposure to error situations changes attitudes in a way that may have a positive impact on error prevention behaviors. Using a stratified quasi-randomized experiment design, we compared risk perception attitudes of a control group of nursing students who received standard error education (reviewed medication error content and watched movies about error experiences) to an experimental group of students who reviewed medication error content and participated in simulated error experiences. Dependent measures included perceived memorability of the educational experience, perceived frequency of errors, and perceived caution with respect to preventing errors. Experienced nursing students perceived the simulated error experiences to be more memorable than movies. Less experienced students perceived both simulated error experiences and movies to be highly memorable. After the intervention, compared with movie participants, simulation participants believed errors occurred more frequently. Both types of education increased the participants' intentions to be more cautious and reported caution remained higher than baseline for medication errors 6 months after the intervention. This study provides limited evidence of an advantage of simulation over watching movies describing actual errors with respect to manipulating attitudes related to error prevention. Both interventions resulted in long-term impacts on perceived caution in medication administration. Simulated error experiences made participants more aware of how easily errors can occur, and the movie education made participants more aware of the devastating consequences of errors.

  4. Error Analysis in Mathematics. Technical Report #1012

    Science.gov (United States)

    Lai, Cheng-Fei

    2012-01-01

    Error analysis is a method commonly used to identify the cause of student errors when they make consistent mistakes. It is a process of reviewing a student's work and then looking for patterns of misunderstanding. Errors in mathematics can be factual, procedural, or conceptual, and may occur for a number of reasons. Reasons why students make…

  5. Approximation errors during variance propagation

    International Nuclear Information System (INIS)

    Dinsmore, Stephen

    1986-01-01

    Risk and reliability analyses are often performed by constructing and quantifying large fault trees. The inputs to these models are component failure events whose probabilities of occurring are best represented as random variables. This paper examines the errors inherent in two approximation techniques used to calculate the top event's variance from the inputs' variances. Two sample fault trees are evaluated, and several three-dimensional plots illustrating the magnitude of the error over a wide range of input means and variances are given.
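
    As a toy illustration of the kind of approximation error examined here (not the paper's fault trees or its specific techniques), the sketch below compares the exact variance of a two-input AND gate's top event, treating the inputs as independent random probabilities, with a first-order propagation that drops the product-of-variances term.

```python
# Toy comparison (hypothetical numbers): exact vs. first-order variance of the
# top event of a two-input AND gate with independent input probabilities X1, X2.
def var_product_exact(mu1, var1, mu2, var2):
    # Var(X1*X2) = E[X1^2]E[X2^2] - (E[X1]E[X2])^2 for independent X1, X2
    return (mu1**2 + var1) * (mu2**2 + var2) - (mu1 * mu2)**2

def var_product_first_order(mu1, var1, mu2, var2):
    # First-order (Taylor) propagation: neglects the var1*var2 cross term
    return mu2**2 * var1 + mu1**2 * var2

mu1, var1 = 1e-3, 4e-7   # hypothetical basic-event mean and variance
mu2, var2 = 2e-3, 1e-6

exact = var_product_exact(mu1, var1, mu2, var2)
approx = var_product_first_order(mu1, var1, mu2, var2)
print(exact, approx, (exact - approx) / exact)   # ~13% underestimate in this case
```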

  6. Northeast and Midwest regional species and habitats at greatest risk and most vulnerable to climate impacts

    Science.gov (United States)

    Staudinger, Michelle D.; Hilberg, Laura; Janowiak, Maria; Swanton, C.O.

    2016-01-01

    The objectives of this chapter are to describe climate change vulnerability, its components, the range of assessment methods being implemented regionally, and examples of training resources and tools. Climate Change Vulnerability Assessments (CCVAs) have already been conducted for numerous Regional Species of Greatest Conservation Need and their dependent habitats across the Northeast and Midwest. This chapter provides a synthesis of different assessment frameworks, information on the locations (e.g., States) where vulnerability assessments were conducted, lists of individual species and habitats with their respective vulnerability rankings, and a comparison of how vulnerability rankings were determined among studies.

  7. Errors and untimely radiodiagnosis of occupational diseases

    International Nuclear Information System (INIS)

    Sokolik, L.I.; Shkondin, A.N.; Sergienko, N.S.; Doroshenko, A.N.; Shumakov, A.V.

    1987-01-01

    Most errors in the diagnosis of occupational diseases are due to hyperdiagnosis (37%) or to data from dynamic clinico-roentgenological examination not being considered (23%). Defects in the organization of prophylactic fluorography result in untimely diagnosis of dust-induced occupational diseases. Errors also occurred because working conditions were not always considered, and atypical development and course were not always analyzed.

  8. Modeling coherent errors in quantum error correction

    Science.gov (United States)

    Greenbaum, Daniel; Dutton, Zachary

    2018-01-01

    Analysis of quantum error correcting codes is typically done using a stochastic, Pauli channel error model for describing the noise on physical qubits. However, it was recently found that coherent errors (systematic rotations) on physical data qubits result in both physical and logical error rates that differ significantly from those predicted by a Pauli model. Here we examine the accuracy of the Pauli approximation for noise containing coherent errors (characterized by a rotation angle ɛ) under the repetition code. We derive an analytic expression for the logical error channel as a function of arbitrary code distance d and concatenation level n, in the small error limit. We find that coherent physical errors result in logical errors that are partially coherent and therefore non-Pauli. However, the coherent part of the logical error is negligible at fewer than ɛ^(-(dn-1)) error correction cycles when the decoder is optimized for independent Pauli errors, thus providing a regime of validity for the Pauli approximation. Above this number of correction cycles, the persistent coherent logical error will cause logical failure more quickly than the Pauli model would predict, and this may need to be combated with coherent suppression methods at the physical level or larger codes.
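
    A minimal numerical sketch (not the paper's repetition-code calculation; the angle and step count are arbitrary) of why coherent errors break the Pauli picture: a small systematic rotation applied repeatedly builds up error probability quadratically in the number of steps, whereas the Pauli-twirled (stochastic) version of the same channel predicts only linear growth.

```python
import numpy as np

eps = 0.01       # systematic rotation angle per step (arbitrary)
n_steps = 50

X = np.array([[0, 1], [1, 0]], dtype=complex)
R = np.cos(eps) * np.eye(2) - 1j * np.sin(eps) * X     # exp(-i*eps*X)

psi = np.array([1, 0], dtype=complex)                  # start in |0>
for _ in range(n_steps):
    psi = R @ psi
p_coherent = abs(psi[1]) ** 2                          # ~ sin^2(n_steps * eps)

p_flip = np.sin(eps) ** 2                              # per-step flip probability in the Pauli model
p_pauli = (1 - (1 - 2 * p_flip) ** n_steps) / 2        # probability of an odd number of flips

print(f"coherent: {p_coherent:.4f}  Pauli model: {p_pauli:.4f}")
```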

  9. Coping and acceptance: the greatest challenge for veterans with intestinal stomas.

    Science.gov (United States)

    Krouse, Robert S; Grant, Marcia; Rawl, Susan M; Mohler, M Jane; Baldwin, Carol M; Coons, Stephen Joel; McCorkle, Ruth; Schmidt, C Max; Ko, Clifford Y

    2009-03-01

    Intestinal stomas (ostomies) create challenges for veterans. The goal of this qualitative analysis was to understand better patients' perspectives regarding their greatest challenge. Ostomates at three Veterans Affairs locations were surveyed using the modified City of Hope Quality of Life-Ostomy questionnaire that contained an open-ended request for respondents to describe their greatest challenge. The response rate was 51% (239 of 467); 68% (163 of 239) completed the open-ended item. Content analysis was performed by an experienced qualitative research team. Coping and acceptance were the most commonly addressed themes. The most frequently expressed issues and advice were related to a need for positive thinking and insight regarding adjustment over time. Coping strategies included the use of humor, recognition of positive changes resulting from the stoma, and normalization of life with an ostomy. Coping and acceptance are common themes described by veterans with an intestinal stoma. Health-care providers can assist veterans by utilizing ostomate self-management strategies, experience, and advice.

  10. The greatest challenges reported by long-term colorectal cancer survivors with stomas.

    Science.gov (United States)

    McMullen, Carmit K; Hornbrook, Mark C; Grant, Marcia; Baldwin, Carol M; Wendel, Christopher S; Mohler, M Jane; Altschuler, Andrea; Ramirez, Michelle; Krouse, Robert S

    2008-04-01

    This paper presents a qualitative analysis of the greatest challenges reported by long-term colorectal cancer survivors with ostomies. Surveys that included an open-ended question about challenges of living with an ostomy were administered at three Kaiser Permanente regions: Northern California, Northwest, and Hawaii. The study was coordinated at the Southern Arizona Veterans Affairs Health Care System in Tucson. The City of Hope Quality of Life Model for Ostomy Patients provided a framework for the study's design, measures, data collection, and data analysis. The study's findings may be generalized broadly to community settings across the United States. Results replicate those of previous research among veterans, California members of the United Ostomy Association, Koreans with ostomies, and colorectal cancer survivors with ostomies residing in the United Kingdom. The greatest challenges reported by 178 colorectal cancer survivors with ostomies confirmed the Institute of Medicine's findings that survivorship is a distinct, chronic phase of cancer care and that cancer's effects are broad and pervasive. The challenges reported by study participants should inform the design, testing and integration of targeted education, early interventions, and ongoing support services for colorectal cancer patients with ostomies.

  11. Medical Error and Moral Luck.

    Science.gov (United States)

    Hubbeling, Dieneke

    2016-09-01

    This paper addresses the concept of moral luck. Moral luck is discussed in the context of medical error, especially an error of omission that occurs frequently, but only rarely has adverse consequences. As an example, a failure to compare the label on a syringe with the drug chart results in the wrong medication being administered and the patient dies. However, this error may have previously occurred many times with no tragic consequences. Discussions on moral luck can highlight conflicting intuitions. Should perpetrators receive a harsher punishment because of an adverse outcome, or should they be dealt with in the same way as colleagues who have acted similarly, but with no adverse effects? An additional element to the discussion, specifically with medical errors, is that according to the evidence currently available, punishing individual practitioners does not seem to be effective in preventing future errors. The following discussion, using relevant philosophical and empirical evidence, posits a possible solution for the moral luck conundrum in the context of medical error: namely, making a distinction between the duty to make amends and assigning blame. Blame should be assigned on the basis of actual behavior, while the duty to make amends is dependent on the outcome.

  12. Reducing mortality from childhood pneumonia: The leading priority is also the greatest opportunity

    Directory of Open Access Journals (Sweden)

    Igor Rudan

    2013-06-01

    Pneumonia and diarrhoea have been the leading causes of global child mortality for many decades. The work of the Child Health Epidemiology Reference Group (CHERG) has been pivotal in raising awareness that the UN's Millennium Development Goal 4 cannot be achieved without increased focus on preventing and treating the two diseases in low– and middle–income countries. The Global Action Plan for Pneumonia (GAPP) and Diarrhoea Global Action Plan (DGAP) groups recently concluded that addressing childhood pneumonia and diarrhoea is not only the leading priority but also the greatest opportunity in global health today: scaling up existing, highly cost–effective interventions could prevent 95% of diarrhoea deaths and 67% of pneumonia deaths in children younger than 5 years by the year 2025. The cost of such an effort was estimated at about US$ 6.7 billion.

  13. Error field considerations for BPX

    International Nuclear Information System (INIS)

    LaHaye, R.J.

    1992-01-01

    Irregularities in the position of poloidal and/or toroidal field coils in tokamaks produce resonant toroidal asymmetries in the vacuum magnetic fields. Otherwise stable tokamak discharges become non-linearly unstable to disruptive locked modes when subjected to low level error fields. Because of the field errors, magnetic islands are produced which would not otherwise occur in tearing-mode-stable configurations; a concomitant reduction of the total confinement can result. Poloidal and toroidal asymmetries arise in the heat flux to the divertor target. In this paper, the field errors from perturbed BPX coils are used in a field line tracing code of the BPX equilibrium to study these deleterious effects. Limits on coil irregularities for device design and fabrication are computed, along with possible correcting coils for reducing such field errors.

  14. Error-information in tutorial documentation: Supporting users' errors to facilitate initial skill learning

    NARCIS (Netherlands)

    Lazonder, Adrianus W.; van der Meij, Hans

    1995-01-01

    Novice users make many errors when they first try to learn how to work with a computer program like a spreadsheet or wordprocessor. No matter how user-friendly the software or the training manual, errors can and will occur. The current view on errors is that they can be helpful or disruptive,

  15. Characteristics of medication errors with parenteral cytotoxic drugs

    OpenAIRE

    Fyhr, A; Akselsson, R

    2012-01-01

    Errors involving cytotoxic drugs have the potential of being fatal and should therefore be prevented. The objective of this article is to identify the characteristics of medication errors involving parenteral cytotoxic drugs in Sweden. A total of 60 cases reported to the national error reporting systems from 1996 to 2008 were reviewed. Classification was made to identify cytotoxic drugs involved, type of error, where the error occurred, error detection mechanism, and consequences for the pati...

  16. Medication errors in anesthesia: unacceptable or unavoidable?

    Directory of Open Access Journals (Sweden)

    Ira Dhawan

    Medication errors are common causes of patient morbidity and mortality, and they add a financial burden to the institution as well. Though the impact varies from no harm to serious adverse effects including death, the problem needs attention on a priority basis, since medication errors are preventable. In today's world, where people are aware and medical claims are on the rise, it is of utmost priority that we curb this issue. Individual effort to decrease medication errors alone might not be successful until a change in the existing protocols and systems is incorporated. Often drug errors that occur cannot be reversed; the best way to 'treat' drug errors is to prevent them. Wrong medication (due to syringe swap), overdose (due to misunderstanding or preconception of the dose, pump misuse and dilution error), incorrect administration route, underdosing and omission are common causes of medication error that occur perioperatively. Drug omission and calculation mistakes occur commonly in the ICU. Medication errors can occur perioperatively during preparation, administration or record keeping. Numerous human and system errors can be blamed for the occurrence of medication errors. The need of the hour is to stop the blame game, accept mistakes and develop a safe and 'just' culture in order to prevent medication errors. Newly devised systems such as VEINROM, a fluid delivery system, are a novel approach to preventing drug errors involving the medications most commonly used in anesthesia. Similar developments, along with vigilant doctors, a safe workplace culture and organizational support, can together help prevent these errors.

  17. Current Global Pricing For Human Papillomavirus Vaccines Brings The Greatest Economic Benefits To Rich Countries.

    Science.gov (United States)

    Herlihy, Niamh; Hutubessy, Raymond; Jit, Mark

    2016-02-01

    Vaccinating females against human papillomavirus (HPV) prior to the debut of sexual activity is an effective way to prevent cervical cancer, yet vaccine uptake in low- and middle-income countries has been hindered by high vaccine prices. We created an economic model to estimate the distribution of the economic surplus (the sum of all health and economic benefits of a vaccine, minus the costs of development, production, and distribution) among different country income groups and manufacturers for a cohort of twelve-year-old females in 2012. We found that manufacturers may have received economic returns worth five times their original investment in HPV vaccine development. High-income countries gained the greatest economic surplus of any income category, realizing over five times more economic value per vaccinated female than low-income countries did. Subsidizing vaccine prices in low- and middle-income countries could both reduce financial barriers to vaccine adoption and still allow high-income countries to retain their economic surpluses and manufacturers to retain their profits. Project HOPE—The People-to-People Health Foundation, Inc.

  18. MreB filaments align along greatest principal membrane curvature to orient cell wall synthesis

    Science.gov (United States)

    Szwedziak, Piotr; Wong, Felix; Schaefer, Kaitlin; Izoré, Thierry; Renner, Lars D; Holmes, Matthew J; Sun, Yingjie; Bisson-Filho, Alexandre W; Walker, Suzanne; Amir, Ariel; Löwe, Jan

    2018-01-01

    MreB is essential for rod shape in many bacteria. Membrane-associated MreB filaments move around the rod circumference, helping to insert cell wall in the radial direction to reinforce rod shape. To understand how oriented MreB motion arises, we altered the shape of Bacillus subtilis. MreB motion is isotropic in round cells, and orientation is restored when rod shape is externally imposed. Stationary filaments orient within protoplasts, and purified MreB tubulates liposomes in vitro, orienting within tubes. Together, this demonstrates MreB orients along the greatest principal membrane curvature, a conclusion supported with biophysical modeling. We observed that spherical cells regenerate into rods in a local, self-reinforcing manner: rapidly propagating rods emerge from small bulges, exhibiting oriented MreB motion. We propose that the coupling of MreB filament alignment to shape-reinforcing peptidoglycan synthesis creates a locally-acting, self-organizing mechanism allowing the rapid establishment and stable maintenance of emergent rod shape. PMID:29469806

  19. Covering women's greatest health fear: breast cancer information in consumer magazines.

    Science.gov (United States)

    Walsh-Childers, Kim; Edwards, Heather; Grobmyer, Stephen

    2011-04-01

    Women identify consumer magazines as a key source of information on many health topics, including breast cancer, which continues to rank as women's greatest personal health fear. This study examined the comprehensiveness and accuracy of breast cancer information provided in 555 articles published in 17 consumer magazines from 2002 through 2007. Accuracy of information was determined for 33 key breast cancer facts identified by an expert panel as important information for women to know. The results show that only 7 of 33 key facts were mentioned in at least 5% of the articles. These facts all dealt with breast cancer risk factors, screening, and detection; none of the key facts related to treatment or outcomes appeared in at least 5% of the articles. Other topics (not key facts) mentioned centered around controllable risk factors, support for breast cancer patients, and chemotherapy treatment. The majority of mentions of key facts were coded as fully accurate, although as much as 44% of mentions of some topics (the link between hormone replacement therapy and breast cancer) were coded as inaccurate or only partially accurate. The magazines were most likely to emphasize family history of breast cancer or genetic characteristics as risk factors for breast cancers; family history was twice as likely to be discussed as increasing age, which is in fact the most important risk factor for breast cancer other than being female. Magazine coverage may contribute to women's inaccurate perceptions of their breast cancer risk.

  20. Two-dimensional errors

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    This chapter addresses the extension of previous work in one-dimensional (linear) error theory to two-dimensional error analysis. The topics of the chapter include the definition of two-dimensional error, the probability ellipse, the probability circle, elliptical (circular) error evaluation, the application to position accuracy, and the use of control systems (points) in measurements.
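    As a concrete illustration of the probability-ellipse and probability-circle ideas listed above, the following sketch (not taken from the chapter; the covariance values are invented) computes the axes of a two-dimensional error ellipse and a common circular-error-probable approximation from a 2x2 position covariance matrix.

    ```python
    import numpy as np

    def error_ellipse(cov, p=0.95):
        """Semi-axes and orientation of the probability ellipse for a
        two-dimensional Gaussian position error with covariance `cov`."""
        eigvals, eigvecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
        k2 = -2.0 * np.log(1.0 - p)                   # chi-square quantile, 2 degrees of freedom
        semi_minor, semi_major = np.sqrt(k2 * eigvals)
        angle = np.degrees(np.arctan2(eigvecs[1, 1], eigvecs[0, 1]))   # major-axis direction
        return semi_major, semi_minor, angle

    cov = np.array([[4.0, 1.5],                       # invented east/north covariance, m^2
                    [1.5, 2.0]])
    a, b, theta = error_ellipse(cov)
    print(f"95% ellipse: semi-major {a:.2f} m, semi-minor {b:.2f} m, rotated {theta:.1f} deg")

    # Probability-circle approximation (50% circular error probable):
    sx, sy = np.sqrt(np.diag(cov))
    print(f"approximate CEP: {0.59 * (sx + sy):.2f} m")
    ```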

  1. Part two: Error propagation

    International Nuclear Information System (INIS)

    Picard, R.R.

    1989-01-01

    Topics covered in this chapter include a discussion of exact results as related to nuclear materials management and accounting in nuclear facilities; propagation of error for a single measured value; propagation of error for several measured values; error propagation for materials balances; and an application of error propagation to an example of a uranium hexafluoride conversion process.
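    For orientation, the sketch below illustrates the standard first-order propagation-of-error calculation of the kind this chapter treats; it is a generic example (the materials-balance figures are invented), not code from the chapter.

    ```python
    import numpy as np

    def propagate(grad, cov):
        """First-order (Taylor-series) variance propagation: grad^T * cov * grad."""
        grad = np.asarray(grad, dtype=float)
        return float(grad @ np.asarray(cov, dtype=float) @ grad)

    # Invented example: a simple balance m = m_in - m_out with independent errors.
    m_in, m_out = 1000.0, 985.0          # measured masses, kg
    s_in, s_out = 2.0, 1.5               # their standard deviations, kg
    cov = np.diag([s_in**2, s_out**2])   # independence => diagonal covariance matrix
    grad = [1.0, -1.0]                   # partial derivatives of (m_in - m_out)

    sigma = np.sqrt(propagate(grad, cov))
    print(f"balance = {m_in - m_out:.1f} kg, propagated sigma = {sigma:.2f} kg")
    ```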

  2. Learning from Errors

    OpenAIRE

    Martínez-Legaz, Juan Enrique; Soubeyran, Antoine

    2003-01-01

    We present a model of learning in which agents learn from errors. If an action turns out to be an error, the agent rejects not only that action but also neighboring actions. We find that, keeping memory of his errors, under mild assumptions an acceptable solution is asymptotically reached. Moreover, one can take advantage of big errors for a faster learning.
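    A toy sketch of the process the abstract describes, where an action found to be an error is rejected together with its neighbours and never retried; the grid, neighbourhood radius and acceptance rule below are invented for illustration.

    ```python
    import random

    def learn_from_errors(actions, is_error, radius, seed=0):
        """Try actions at random; when one is an error, reject it and every
        action within `radius`, keeping memory so none of them is retried."""
        rng = random.Random(seed)
        candidates = set(actions)
        while candidates:
            a = rng.choice(sorted(candidates))
            if not is_error(a):
                return a                                  # acceptable solution reached
            candidates -= {x for x in candidates if abs(x - a) <= radius}
        return None                                       # nothing acceptable on this grid

    # Invented setup: acceptable actions lie in [0.6, 0.7] on a grid over [0, 1].
    grid = [i / 1000 for i in range(1001)]
    print(learn_from_errors(grid, lambda a: not (0.6 <= a <= 0.7), radius=0.02))
    ```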

  3. Generalized Gaussian Error Calculus

    CERN Document Server

    Grabe, Michael

    2010-01-01

    For the first time in 200 years Generalized Gaussian Error Calculus addresses a rigorous, complete and self-consistent revision of the Gaussian error calculus. Since experimentalists realized that measurements in general are burdened by unknown systematic errors, the classical, widely used evaluation procedures that scrutinize the consequences of random errors alone have turned out to be obsolete. As a matter of course, the error calculus to-be, treating random and unknown systematic errors side by side, should ensure the consistency and traceability of physical units, physical constants and physical quantities at large. The generalized Gaussian error calculus considers unknown systematic errors to spawn biased estimators. Beyond that, random errors are asked to conform to the idea of what the author calls well-defined measuring conditions. The approach features the properties of a building kit: any overall uncertainty turns out to be the sum of a contribution due to random errors, to be taken from a confidence inter...
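    The decomposition summarized above can be illustrated with a short sketch: the random part of the uncertainty is taken from a Student-t confidence interval of the mean, and a worst-case bound on the unknown systematic error is added arithmetically. This is an illustration of that idea under stated assumptions, not Grabe's own formulation; the readings and bias bound are invented.

    ```python
    import numpy as np
    from scipy import stats

    def overall_uncertainty(readings, bias_bound, confidence=0.95):
        """Random part from a Student-t confidence interval of the mean, plus a
        worst-case allowance for the unknown systematic error (arithmetic sum)."""
        x = np.asarray(readings, dtype=float)
        n = x.size
        t = stats.t.ppf(0.5 + confidence / 2.0, df=n - 1)
        random_part = t * x.std(ddof=1) / np.sqrt(n)
        return random_part + abs(bias_bound)

    readings = [9.98, 10.02, 10.01, 9.97, 10.03]    # invented repeat measurements
    u = overall_uncertainty(readings, bias_bound=0.05)
    print(f"mean = {np.mean(readings):.3f}, overall uncertainty = +/- {u:.3f}")
    ```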

  4. Medication errors: an overview for clinicians.

    Science.gov (United States)

    Wittich, Christopher M; Burkle, Christopher M; Lanier, William L

    2014-08-01

    Medication error is an important cause of patient morbidity and mortality, yet it can be a confusing and underappreciated concept. This article provides a review for practicing physicians that focuses on medication error (1) terminology and definitions, (2) incidence, (3) risk factors, (4) avoidance strategies, and (5) disclosure and legal consequences. A medication error is any error that occurs at any point in the medication use process. It has been estimated by the Institute of Medicine that medication errors cause 1 of 131 outpatient and 1 of 854 inpatient deaths. Medication factors (eg, similar sounding names, low therapeutic index), patient factors (eg, poor renal or hepatic function, impaired cognition, polypharmacy), and health care professional factors (eg, use of abbreviations in prescriptions and other communications, cognitive biases) can precipitate medication errors. Consequences faced by physicians after medication errors can include loss of patient trust, civil actions, criminal charges, and medical board discipline. Methods to prevent medication errors from occurring (eg, use of information technology, better drug labeling, and medication reconciliation) have been used with varying success. When an error is discovered, patients expect disclosure that is timely, given in person, and accompanied with an apology and communication of efforts to prevent future errors. Learning more about medication errors may enhance health care professionals' ability to provide safe care to their patients. Copyright © 2014 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  5. Medication errors: prescribing faults and prescription errors.

    Science.gov (United States)

    Velo, Giampaolo P; Minuz, Pietro

    2009-06-01

    1. Medication errors are common in general practice and in hospitals. Both errors in the act of writing (prescription errors) and prescribing faults due to erroneous medical decisions can result in harm to patients. 2. Any step in the prescribing process can generate errors. Slips, lapses, or mistakes are sources of errors, as in unintended omissions in the transcription of drugs. Faults in dose selection, omitted transcription, and poor handwriting are common. 3. Inadequate knowledge or competence and incomplete information about clinical characteristics and previous treatment of individual patients can result in prescribing faults, including the use of potentially inappropriate medications. 4. An unsafe working environment, complex or undefined procedures, and inadequate communication among health-care personnel, particularly between doctors and nurses, have been identified as important underlying factors that contribute to prescription errors and prescribing faults. 5. Active interventions aimed at reducing prescription errors and prescribing faults are strongly recommended. These should be focused on the education and training of prescribers and the use of on-line aids. The complexity of the prescribing procedure should be reduced by introducing automated systems or uniform prescribing charts, in order to avoid transcription and omission errors. Feedback control systems and immediate review of prescriptions, which can be performed with the assistance of a hospital pharmacist, are also helpful. Audits should be performed periodically.

  6. Eliminating US hospital medical errors.

    Science.gov (United States)

    Kumar, Sameer; Steinebach, Marc

    2008-01-01

    Healthcare costs in the USA have continued to rise steadily since the 1980s. Medical errors are one of the major causes of deaths and injuries of thousands of patients every year, contributing to soaring healthcare costs. The purpose of this study is to examine what has been done to deal with the medical-error problem in the last two decades and present a closed-loop mistake-proof operation system for surgery processes that would likely eliminate preventable medical errors. The design method used is a combination of creating a service blueprint, implementing the six sigma DMAIC cycle, developing cause-and-effect diagrams as well as devising poka-yokes in order to develop a robust surgery operation process for a typical US hospital. In the improve phase of the six sigma DMAIC cycle, a number of poka-yoke techniques are introduced to prevent typical medical errors (identified through cause-and-effect diagrams) that may occur in surgery operation processes in US hospitals. It is the authors' assertion that implementing the new service blueprint, along with the poka-yokes, will likely improve the current medical error rate to the six-sigma level. Additionally, designing as many redundancies as possible in the delivery of care will help reduce medical errors. Primary healthcare providers should strongly consider investing in adequate doctor and nurse staffing, and improving their education related to the quality of service delivery to minimize clinical errors. This will lead to higher fixed costs, especially in the shorter time frame. This paper focuses additional attention on making a sound technical and business case for implementing six sigma tools to eliminate medical errors, which will enable hospital managers to increase their hospital's profitability in the long run and also ensure patient safety.

  7. Oxidation mechanisms occurring in wines

    OpenAIRE

    Oliveira, Carla Maria; Ferreira, António César Silva; Freitas, Victor De; Silva, Artur M. S.

    2011-01-01

    The present review aims to show the state of the art on the oxidation mechanisms occurring in wines, as well as the methods to monitor, classify and diagnose wine oxidation. Wine oxidation can be divided in enzymatic oxidation and non-enzymatic oxidation. Enzymatic oxidation almost entirely occurs in grape must and is largely correlated with the content of hydroxycinnamates, such as caffeoyltartaric acid and paracoumaroyltartaric acid, and flavan-3-ols. Non-enzymatic oxidation, al...

  8. Prescription Errors in Psychiatry

    African Journals Online (AJOL)

    Arun Kumar Agnihotri

    The role of clinical pharmacists in detecting errors before they have a (sometimes serious) clinical impact should not be underestimated. Research on medication error in mental health care is limited. ... participation in ward rounds and adverse drug ...

  9. Common patterns in 558 diagnostic radiology errors.

    Science.gov (United States)

    Donald, Jennifer J; Barnard, Stuart A

    2012-04-01

    As a Quality Improvement initiative our department has held regular discrepancy meetings since 2003. We performed a retrospective analysis of the cases presented and identified the most common pattern of error. A total of 558 cases were referred for discussion over 92 months, and errors were classified as perceptual or interpretative. The most common patterns of error for each imaging modality were analysed, and the misses were scored by consensus as subtle or non-subtle. Of 558 diagnostic errors, 447 (80%) were perceptual and 111 (20%) were interpretative errors. Plain radiography and computed tomography (CT) scans were the most frequent imaging modalities accounting for 246 (44%) and 241 (43%) of the total number of errors, respectively. In the plain radiography group 120 (49%) of the errors occurred in chest X-ray reports with perceptual miss of a lung nodule occurring in 40% of this subgroup. In the axial and appendicular skeleton missed fractures occurred most frequently, and metastatic bone disease was overlooked in 12 of 50 plain X-rays of the pelvis or spine. The majority of errors within the CT group were in reports of body scans with the commonest perceptual errors identified including 16 missed significant bone lesions, 14 cases of thromboembolic disease and 14 gastrointestinal tumours. Of the 558 errors, 312 (56%) were considered subtle and 246 (44%) non-subtle. Diagnostic errors are not uncommon and are most frequently perceptual in nature. Identification of the most common patterns of error has the potential to improve the quality of reporting by improving the search behaviour of radiologists. © 2012 The Authors. Journal of Medical Imaging and Radiation Oncology © 2012 The Royal Australian and New Zealand College of Radiologists.

  10. Symmetry and the Monster: One of the Greatest Quests of Mathematics

    Energy Technology Data Exchange (ETDEWEB)

    Szabo, R J [Colin Maclaurin Building, Heriot-Watt University, Edinburgh EH14 4AS (United Kingdom)

    2007-04-13

    The book Symmetry and the Monster: One of the Greatest Quests of Mathematics describes historical events leading up to the discovery of the Monster sporadic group, the largest simple sporadic group. It also expounds the significance and deep relationships between this group and other areas of mathematics and theoretical physics. It begins, in the prologue, with a nice overview of some of the mathematical drama surrounding the discovery of the Monster and its subsequent relationship to number theory (the so-called Moonshine conjectures). From a historical perspective, the book traces back to the roots of group theory, Galois theory, and steadily runs through time through the many famous mathematicians who contributed to group theory, including Lie, Killing and Cartan. Throughout, the author has provided a very nice and deep insight into the sociological and scientific problems at the time, and gives the reader a very prominent inside view of the real people behind the mathematics. The book should be an enjoyable read to anyone with an interest in the history of mathematics. For the non-mathematician the book makes a good, and mostly successful, attempt at being non-technical. Technical mathematical jargon is replaced with more heuristic, intuitive terminology, making the mathematical descriptions in the book fairly easy going. A glossary of terminology for the more scientifically inclined is included in various footnotes throughout the book and in a comprehensive listing at the end of the book. Some more technical material is also included in the form of appendices at the end of the book. Some aspects of physics are also explained in a simple, intuitive way. The author further attempts at various places to give the non-specialist a glimpse into what mathematical proof is all about, and explains the difficulties and technicalities involved in this very nicely (for instance, he mentions the various

  11. Symmetry and the Monster: One of the Greatest Quests of Mathematics

    International Nuclear Information System (INIS)

    Szabo, R J

    2007-01-01

    The book Symmetry and the Monster: One of the Greatest Quests of Mathematics describes historical events leading up to the discovery of the Monster sporadic group, the largest simple sporadic group. It also expounds the significance and deep relationships between this group and other areas of mathematics and theoretical physics. It begins, in the prologue, with a nice overview of some of the mathematical drama surrounding the discovery of the Monster and its subsequent relationship to number theory (the so-called Moonshine conjectures). From a historical perspective, the book traces back to the roots of group theory, Galois theory, and steadily runs through time through the many famous mathematicians who contributed to group theory, including Lie, Killing and Cartan. Throughout, the author has provided a very nice and deep insight into the sociological and scientific problems at the time, and gives the reader a very prominent inside view of the real people behind the mathematics. The book should be an enjoyable read to anyone with an interest in the history of mathematics. For the non-mathematician the book makes a good, and mostly successful, attempt at being non-technical. Technical mathematical jargon is replaced with more heuristic, intuitive terminology, making the mathematical descriptions in the book fairly easy going. A glossary of terminology for the more scientifically inclined is included in various footnotes throughout the book and in a comprehensive listing at the end of the book. Some more technical material is also included in the form of appendices at the end of the book. Some aspects of physics are also explained in a simple, intuitive way. The author further attempts at various places to give the non-specialist a glimpse into what mathematical proof is all about, and explains the difficulties and technicalities involved in this very nicely (for instance, he mentions the various 100+ page articles that

  12. BOOK REVIEW: Symmetry and the Monster: One of the Greatest Quests of Mathematics

    Science.gov (United States)

    Szabo, R. J.

    2007-04-01

    The book Symmetry and the Monster: One of the Greatest Quests of Mathematics describes historical events leading up to the discovery of the Monster sporadic group, the largest simple sporadic group. It also expounds the significance and deep relationships between this group and other areas of mathematics and theoretical physics. It begins, in the prologue, with a nice overview of some of the mathematical drama surrounding the discovery of the Monster and its subsequent relationship to number theory (the so-called Moonshine conjectures). From a historical perspective, the book traces back to the roots of group theory, Galois theory, and steadily runs through time through the many famous mathematicians who contributed to group theory, including Lie, Killing and Cartan. Throughout, the author has provided a very nice and deep insight into the sociological and scientific problems at the time, and gives the reader a very prominent inside view of the real people behind the mathematics. The book should be an enjoyable read to anyone with an interest in the history of mathematics. For the non-mathematician the book makes a good, and mostly successful, attempt at being non-technical. Technical mathematical jargon is replaced with more heuristic, intuitive terminology, making the mathematical descriptions in the book fairly easy going. A glossary of terminology for the more scientifically inclined is included in various footnotes throughout the book and in a comprehensive listing at the end of the book. Some more technical material is also included in the form of appendices at the end of the book. Some aspects of physics are also explained in a simple, intuitive way. The author further attempts at various places to give the non-specialist a glimpse into what mathematical proof is all about, and explains the difficulties and technicalities involved in this very nicely (for instance, he mentions the various 100+ page articles that

  13. Errors in otology.

    Science.gov (United States)

    Kartush, J M

    1996-11-01

    Practicing medicine successfully requires that errors in diagnosis and treatment be minimized. Malpractice laws encourage litigators to ascribe all medical errors to incompetence and negligence. There are, however, many other causes of unintended outcomes. This article describes common causes of errors and suggests ways to minimize mistakes in otologic practice. Widespread dissemination of knowledge about common errors and their precursors can reduce the incidence of their occurrence. Consequently, laws should be passed to allow for a system of non-punitive, confidential reporting of errors and "near misses" that can be shared by physicians nationwide.

  14. Error identification and recovery by student nurses using human patient simulation: opportunity to improve patient safety.

    Science.gov (United States)

    Henneman, Elizabeth A; Roche, Joan P; Fisher, Donald L; Cunningham, Helene; Reilly, Cheryl A; Nathanson, Brian H; Henneman, Philip L

    2010-02-01

    This study examined types of errors that occurred or were recovered in a simulated environment by student nurses. Errors occurred in all four rule-based error categories, and all students committed at least one error. The most frequent errors occurred in the verification category. Another common error was related to physician interactions. The least common errors were related to coordinating information with the patient and family. Our finding that 100% of student subjects committed rule-based errors is cause for concern. To decrease errors and improve safe clinical practice, nurse educators must identify effective strategies that students can use to improve patient surveillance. Copyright 2010 Elsevier Inc. All rights reserved.

  15. The greatest step in vertebrate history: a paleobiological review of the fish-tetrapod transition.

    Science.gov (United States)

    Long, John A; Gordon, Malcolm S

    2004-01-01

    Recent discoveries of previously unknown fossil forms have dramatically transformed understanding of many aspects of the fish-tetrapod transition. Newer paleobiological approaches have also contributed to changed views of which animals were involved and when, where, and how the transition occurred. This review summarizes major advances made and reevaluates alternative interpretations of important parts of the evidence. We begin with general issues and concepts, including limitations of the Paleozoic fossil record. We summarize important features of paleoclimates, paleoenvironments, paleobiogeography, and taphonomy. We then review the history of Devonian tetrapods and their closest stem group ancestors within the sarcopterygian fishes. It is now widely accepted that the first tetrapods arose from advanced tetrapodomorph stock (the elpistostegalids) in the Late Devonian, probably in Euramerica. However, truly terrestrial forms did not emerge until much later, in geographically far-flung regions, in the Lower Carboniferous. The complete transition occurred over about 25 million years; definitive emergences onto land took place during the most recent 5 million years. The sequence of character acquisition during the transition can be seen as a five-step process involving: (1) higher osteichthyan (tetrapodomorph) diversification in the Middle Devonian (beginning about 380 million years ago [mya]), (2) the emergence of "prototetrapods" (e.g., Elginerpeton) in the Frasnian stage (about 372 mya), (3) the appearance of aquatic tetrapods (e.g., Acanthostega) sometime in the early to mid-Famennian (about 360 mya), (4) the appearance of "eutetrapods" (e.g., Tulerpeton) at the very end of the Devonian period (about 358 mya), and (5) the first truly terrestrial tetrapods (e.g., Pederpes) in the Lower Carboniferous (about 340 mya). We discuss each of these steps with respect to inferred functional utility of acquired character sets. Dissociated heterochrony is seen as the most

  16. Naturally Occurring Radioactive Materials (NORM)

    International Nuclear Information System (INIS)

    Gray, P.

    1997-01-01

    This paper discusses the broad problems presented by Naturally Occurring Radioactive Materials (NORM). Technologically Enhanced naturally occurring radioactive material includes any radionuclides whose physical, chemical, radiological properties or radionuclide concentration have been altered from their natural state. With regard to NORM in particular, radioactive contamination is radioactive material in an undesired location. This is a concern in a range of industries: petroleum; uranium mining; phosphorus and phosphates; fertilizers; fossil fuels; forestry products; water treatment; metal mining and processing; geothermal energy. The author discusses in more detail the problem in the petroleum industry, including the isotopes of concern, the hazards they present, the contamination which they cause, ways to dispose of contaminated materials, and regulatory issues. He points out there are three key programs to reduce legal exposure and problems due to these contaminants: waste minimization; NORM assessment (surveys); NORM compliance (training).

  17. Naturally Occurring Radioactive Materials (NORM)

    Energy Technology Data Exchange (ETDEWEB)

    Gray, P. [ed.

    1997-02-01

    This paper discusses the broad problems presented by Naturally Occurring Radioactive Materials (NORM). Technologically Enhanced naturally occurring radioactive material includes any radionuclides whose physical, chemical, radiological properties or radionuclide concentration have been altered from their natural state. With regard to NORM in particular, radioactive contamination is radioactive material in an undesired location. This is a concern in a range of industries: petroleum; uranium mining; phosphorus and phosphates; fertilizers; fossil fuels; forestry products; water treatment; metal mining and processing; geothermal energy. The author discusses in more detail the problem in the petroleum industry, including the isotopes of concern, the hazards they present, the contamination which they cause, ways to dispose of contaminated materials, and regulatory issues. He points out there are three key programs to reduce legal exposure and problems due to these contaminants: waste minimization; NORM assessment (surveys); NORM compliance (training).

  18. Maximising the effect of combination HIV prevention through prioritisation of the people and places in greatest need: a modelling study.

    Science.gov (United States)

    Anderson, Sarah-Jane; Cherutich, Peter; Kilonzo, Nduku; Cremin, Ide; Fecht, Daniela; Kimanga, Davies; Harper, Malayah; Masha, Ruth Laibon; Ngongo, Prince Bahati; Maina, William; Dybul, Mark; Hallett, Timothy B

    2014-07-19

    Epidemiological data show substantial variation in the risk of HIV infection between communities within African countries. We hypothesised that focusing appropriate interventions on geographies and key populations at high risk of HIV infection could improve the effect of investments in the HIV response. With use of Kenya as a case study, we developed a mathematical model that described the spatiotemporal evolution of the HIV epidemic and that incorporated the demographic, behavioural, and programmatic differences across subnational units. Modelled interventions (male circumcision, behaviour change communication, early antiretroviral therapy, and pre-exposure prophylaxis) could be provided to different population groups according to their risk behaviours or their location. For a given national budget, we compared the effect of a uniform intervention strategy, in which the same complement of interventions is provided across the country, with a focused strategy that tailors the set of interventions and amount of resources allocated to the local epidemiological conditions. A uniformly distributed combination of HIV prevention interventions could reduce the total number of new HIV infections by 40% during a 15-year period. With no additional spending, this effect could be increased by 14% during the 15 years (almost 100,000 extra infections averted), resulting in 33% fewer new HIV infections occurring every year by the end of the period, if the focused approach is used to tailor resource allocation to reflect patterns in local epidemiology. The cumulative difference in new infections during the 15-year projection period depends on total budget and costs of interventions, and could be as great as 150,000 (a cumulative difference as great as 22%) under different assumptions about the unit costs of intervention. The focused approach achieves greater effect than the uniform approach despite exactly the same investment. Through prioritisation of the people and locations at greatest
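    The study's spatiotemporal model cannot be reproduced here, but the core comparison (the same national budget allocated uniformly versus focused on the highest-burden areas) can be illustrated with a toy calculation in which infections averted in each region saturate with local spending; the regions, response curves and budget below are all invented.

    ```python
    import numpy as np

    # Invented regions: annual new infections and a spending scale at which extra
    # money stops averting many additional infections (diminishing returns).
    infections = np.array([50_000, 20_000, 5_000, 1_000])
    saturation = np.array([10.0, 8.0, 6.0, 4.0])          # $ millions

    def averted(budget):
        """Toy response curve: averted_i = I_i * (1 - exp(-budget_i / k_i))."""
        return infections * (1.0 - np.exp(-budget / saturation))

    total_budget = 20.0                                   # $ millions
    uniform = np.full(4, total_budget / 4)                # same spend everywhere
    focused = total_budget * infections / infections.sum()  # spend follows local burden

    print("uniform allocation averts ~", int(averted(uniform).sum()), "infections")
    print("focused allocation averts ~", int(averted(focused).sum()), "infections")
    ```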

  19. Naturally occurring radionuclides in food

    International Nuclear Information System (INIS)

    Djujic, I.

    1995-01-01

    The naturally occurring radionuclides are the major source of radiation exposure to humans. The principal route of natural radiation exposure is the inhalation of 222Rn decay products (about 85% of the total). The remainder is equally divided between internally deposited radionuclides and cosmic and terrestrial sources. In the present study, the content of 40K, 210Pb, 226Ra, 230Th, 232Th and 238U in representative food samples (milk, pork, beef, potatoes, wheat and corn flour), and in samples of different food items that do not represent the entire national production but provide interesting additional data for an approximate calculation of naturally occurring radionuclide intake, is presented. The daily weight of food eaten and the contribution of food groups, as well as the daily dietary intake of the mentioned naturally occurring radionuclides in the Serbian diet, were obtained on the basis of household budget surveys. The daily intake estimates obtained, in mBq, for the Serbian population are 78.1 (40K), 38.2 (210Pb), 52.3 (226Ra), 2.0 (230Th) and 12.5 (238U). (author)

  20. Performance trade-offs and ageing in the 'world's greatest athletes'.

    Science.gov (United States)

    Careau, Vincent; Wilson, Robbie S

    2017-08-16

    The mechanistic foundations of performance trade-offs are clear: because body size and shape constrains movement, and muscles vary in strength and fibre type, certain physical traits should act in opposition with others (e.g. sprint versus endurance). Yet performance trade-offs are rarely detected, and traits are often positively correlated. A potential resolution to this conundrum is that within-individual performance trade-offs can be masked by among-individual variation in 'quality'. Although there is a current debate on how to unambiguously define and account for quality, no previous studies have partitioned trait correlations at the within- and among-individual levels. Here, we evaluate performance trade-offs among and within 1369 elite athletes that performed in a total of 6418 combined-events competitions (decathlon and heptathlon). Controlling for age, experience and wind conditions, we detected strong trade-offs between groups of functionally similar events (throwing versus jumping versus running) occurring at the among-individual level. We further modelled individual (co)variation in age-related plasticity of performance and found previously unseen trade-offs in throwing versus running performance that manifest through ageing. Our results verify that human performance is limited by fundamental genetic, environmental and ageing constraints that preclude the simultaneous improvement of performance in multiple dimensions. Identifying these constraints is fundamental to understanding performance trade-offs and predicting the ageing of motor function. © 2017 The Author(s).

  1. The Greatest Challenge Ever for Mankind, Requiring Policies of Accelerating Hardship and Implementation Difficulty

    Science.gov (United States)

    Wilson, John

    2015-04-01

    Providing energy for the contemporary world has resulted in a multi-variable problem in which a confluence of historical anomalies and economic, psychological, political, and demographic factors thwart efforts to prevent significant harm from increasing atmospheric CO2. This unlikely combination has created the perfect storm in which the warnings by scientists are ineffective. Global warming is occurring simultaneously with increased population, some dysfunctional political institutions, ascendency of oversimplified economic theory, campaigns to discredit scientists, misinterpretation of the meaning of noise in the Milankovitch climate cycles, and substantially improved hydrocarbon extraction methods. These factors are compounded by traits of human nature, such as greed and resistance to changing the familiar and discontinuing profitable endeavors. The idea that future people are equal with us may not be widely supported, yet this value is the foundation of climate change action. History shows that most people and nations will not take appropriate measures until forced, yet the cost increases as action is delayed. This makes appropriate policies even more extreme and difficult to accomplish as more wealth is consumed in treating global warming symptoms.

  2. Errors and mistakes in breast ultrasound diagnostics

    Directory of Open Access Journals (Sweden)

    Wiesław Jakubowski

    2012-09-01

    Sonomammography is often the first additional examination performed in the diagnostics of breast diseases. The development of ultrasound imaging techniques, particularly the introduction of high frequency transducers, matrix transducers, harmonic imaging and, finally, elastography, has improved breast disease diagnostics. Nevertheless, as in every imaging method, there are errors and mistakes resulting from the technical limitations of the method, breast anatomy (fibrous remodeling), insufficient sensitivity and, in particular, specificity. Errors in breast ultrasound diagnostics can be divided into those that are impossible to avoid and those that can potentially be reduced. In this article the most frequently made errors in ultrasound are presented, including the ones caused by the presence of artifacts resulting from volumetric averaging in the near and far field, artifacts in cysts or in dilated lactiferous ducts (reverberations, comet tail artifacts, lateral beam artifacts), and improper setting of general enhancement or of the time gain curve or range. Errors dependent on the examiner, resulting in a wrong BIRADS-usg classification, are divided into negative and positive errors, and the sources of these errors are listed. Methods of minimizing the number of errors are discussed, including those related to appropriate examination technique, taking into account data from the case history, and the use of the greatest possible number of additional options such as harmonic imaging, color and power Doppler, and elastography. Examples are given of errors resulting from the technical conditions of the method and of examiner-dependent errors related to the great diversity and variation of ultrasound images of pathological breast lesions.

  3. Characteristics of pediatric chemotherapy medication errors in a national error reporting database.

    Science.gov (United States)

    Rinke, Michael L; Shore, Andrew D; Morlock, Laura; Hicks, Rodney W; Miller, Marlene R

    2007-07-01

    Little is known regarding chemotherapy medication errors in pediatrics despite studies suggesting high rates of overall pediatric medication errors. In this study, the authors examined patterns in pediatric chemotherapy errors. The authors queried the United States Pharmacopeia MEDMARX database, a national, voluntary, Internet-accessible error reporting system, for all error reports from 1999 through 2004 that involved chemotherapy medications and pediatric patients. Of these error reports, 85% reached the patient, and 15.6% required additional patient monitoring or therapeutic intervention. Forty-eight percent of errors originated in the administering phase of medication delivery, and 30% originated in the drug-dispensing phase. Of the 387 medications cited, 39.5% were antimetabolites, 14.0% were alkylating agents, 9.3% were anthracyclines, and 9.3% were topoisomerase inhibitors. The most commonly involved chemotherapeutic agents were methotrexate (15.3%), cytarabine (12.1%), and etoposide (8.3%). The most common error types were improper dose/quantity (22.9% of 327 cited error types), wrong time (22.6%), omission error (14.1%), and wrong administration technique/wrong route (12.2%). The most common error causes were performance deficit (41.3% of 547 cited error causes), equipment and medication delivery devices (12.4%), communication (8.8%), knowledge deficit (6.8%), and written order errors (5.5%). Four of the 5 most serious errors occurred at community hospitals. Pediatric chemotherapy errors often reached the patient, potentially were harmful, and differed in quality between outpatient and inpatient areas. This study indicated which chemotherapeutic agents most often were involved in errors and that administering errors were common. Investigation is needed regarding targeted medication administration safeguards for these high-risk medications. Copyright (c) 2007 American Cancer Society.

  4. Applying Intelligent Algorithms to Automate the Identification of Error Factors.

    Science.gov (United States)

    Jin, Haizhe; Qu, Qingxing; Munechika, Masahiko; Sano, Masataka; Kajihara, Chisato; Duffy, Vincent G; Chen, Han

    2018-05-03

    Medical errors are the manifestation of defects occurring in medical processes. Extracting and identifying defects as medical error factors from these processes is an effective approach to preventing medical errors. However, it is a difficult and time-consuming task and requires an analyst with a professional medical background. A method is therefore needed to extract medical error factors and to reduce the difficulty of extraction. In this research, a systematic methodology to extract and identify error factors in the medical administration process was proposed. The design of the error report, extraction of the error factors, and identification of the error factors were analyzed. Based on 624 medical error cases across four medical institutes in both Japan and China, 19 error-related items and their levels were extracted; these were then closely related to 12 error factors. The relational model between the error-related items and error factors was established based on a genetic algorithm (GA)-back-propagation neural network (BPNN) model. Additionally, compared with BPNN, partial least squares regression and support vector regression, GA-BPNN exhibited a higher overall prediction accuracy, being able to promptly identify the error factors from the error-related items. The combination of "error-related items, their different levels, and the GA-BPNN model" was proposed as an error-factor identification technology, which could automatically identify medical error factors.
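    As a rough illustration of the GA-BPNN idea described above (a genetic algorithm selects initial weights for a small back-propagation neural network, which is then fine-tuned by gradient descent), the sketch below uses synthetic data and assumed network and GA settings; it is not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def unpack(w, n_in, n_hid):
        """Split a flat weight vector into (W1, b1, W2, b2) of a 1-hidden-layer net."""
        i = 0
        W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
        b1 = w[i:i + n_hid]; i += n_hid
        W2 = w[i:i + n_hid]; i += n_hid
        return W1, b1, W2, w[i]

    def forward(w, X, n_in, n_hid):
        W1, b1, W2, b2 = unpack(w, n_in, n_hid)
        h = np.tanh(X @ W1 + b1)
        return h @ W2 + b2, h

    def mse(w, X, y, n_in, n_hid):
        return float(np.mean((forward(w, X, n_in, n_hid)[0] - y) ** 2))

    def ga_bpnn(X, y, n_hid=6, pop=40, gens=30, bp_steps=500, lr=0.05):
        n_in = X.shape[1]
        dim = n_in * n_hid + 2 * n_hid + 1
        popu = rng.normal(scale=0.5, size=(pop, dim))
        for _ in range(gens):                              # GA over candidate initial weights
            fit = np.array([mse(w, X, y, n_in, n_hid) for w in popu])
            parents = popu[np.argsort(fit)[:pop // 2]]     # keep the fitter half (elitism)
            pa = parents[rng.integers(len(parents), size=pop // 2)]
            pb = parents[rng.integers(len(parents), size=pop // 2)]
            children = np.where(rng.random(pa.shape) < 0.5, pa, pb)   # uniform crossover
            children += rng.normal(scale=0.1, size=pa.shape)          # Gaussian mutation
            popu = np.vstack([parents, children])
        best = popu[np.argmin([mse(w, X, y, n_in, n_hid) for w in popu])].copy()
        for _ in range(bp_steps):                          # back-propagation fine-tuning
            W1, b1, W2, b2 = unpack(best, n_in, n_hid)
            pred, h = forward(best, X, n_in, n_hid)
            err = (pred - y) / len(y)                      # gradient of MSE up to a factor of 2
            dh = np.outer(err, W2) * (1 - h ** 2)
            grad = np.concatenate([(X.T @ dh).ravel(), dh.sum(axis=0), h.T @ err, [err.sum()]])
            best -= lr * grad
        return best

    # Synthetic stand-in for "error-related items -> error factor" training data.
    X = rng.normal(size=(200, 5))
    y = np.tanh(X[:, 0] - 0.5 * X[:, 1]) + 0.1 * rng.normal(size=200)
    w = ga_bpnn(X, y)
    print("training MSE after GA + BP:", round(mse(w, X, y, 5, 6), 4))
    ```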

  5. The error in total error reduction.

    Science.gov (United States)

    Witnauer, James E; Urcelay, Gonzalo P; Miller, Ralph R

    2014-02-01

    Most models of human and animal learning assume that learning is proportional to the discrepancy between a delivered outcome and the outcome predicted by all cues present during that trial (i.e., total error across a stimulus compound). This total error reduction (TER) view has been implemented in connectionist and artificial neural network models to describe the conditions under which weights between units change. Electrophysiological work has revealed that the activity of dopamine neurons is correlated with the total error signal in models of reward learning. Similar neural mechanisms presumably support fear conditioning, human contingency learning, and other types of learning. Using a computational modeling approach, we compared several TER models of associative learning to an alternative model that rejects the TER assumption in favor of local error reduction (LER), which assumes that learning about each cue is proportional to the discrepancy between the delivered outcome and the outcome predicted by that specific cue on that trial. The LER model provided a better fit to the reviewed data than the TER models. Given the superiority of the LER model with the present data sets, acceptance of TER should be tempered. Copyright © 2013 Elsevier Inc. All rights reserved.
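    The contrast between the two learning rules can be made concrete. Under total error reduction (as in the Rescorla-Wagner family of models), every cue present on a trial is updated from the discrepancy between the outcome and the summed prediction of all present cues; under local error reduction, each cue is updated from its own prediction error only. The sketch below is a generic illustration of that difference, not the authors' simulation code.

    ```python
    import numpy as np

    def train(trials, rule, alpha=0.2, n_cues=2):
        """trials: list of (cues_present, outcome). Returns associative strengths V."""
        V = np.zeros(n_cues)
        for cues, outcome in trials:
            idx = list(cues)
            if rule == "TER":                 # total error: outcome minus summed prediction
                delta = outcome - V[idx].sum()
                V[idx] += alpha * delta
            else:                             # LER: each cue uses only its own prediction error
                for c in idx:
                    V[c] += alpha * (outcome - V[c])
        return V

    # Compound conditioning: cues 0 and 1 always appear together with the outcome.
    trials = [((0, 1), 1.0)] * 100
    print("TER weights:", train(trials, "TER").round(2))   # cues share the outcome (~0.5 each)
    print("LER weights:", train(trials, "LER").round(2))   # each cue approaches 1.0
    ```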

  6. The human endogenous circadian system causes greatest platelet activation during the biological morning independent of behaviors.

    Directory of Open Access Journals (Sweden)

    Frank A J L Scheer

    Platelets are involved in the thromboses that are central to myocardial infarctions and ischemic strokes. Such adverse cardiovascular events have day/night patterns with peaks in the morning (~9 AM), potentially related to endogenous circadian clock control of platelet activation. The objective was to test if the human endogenous circadian system influences (1) platelet function and (2) platelet response to standardized behavioral stressors. We also aimed to compare the magnitude of any effects on platelet function caused by the circadian system with that caused by varied standardized behavioral stressors, including mental arithmetic, passive postural tilt and mild cycling exercise. We studied 12 healthy adults (6 female) who lived in individual laboratory suites in dim light for 240 h, with all behaviors scheduled on a 20-h recurring cycle to permit assessment of endogenous circadian function independent from environmental and behavioral effects including the sleep/wake cycle. Circadian phase was assessed from core body temperature. There were highly significant endogenous circadian rhythms in platelet surface activated glycoprotein (GP) IIb-IIIa, GPIb and P-selectin (6-17% peak-trough amplitudes; p ≤ 0.01). These circadian peaks occurred at a circadian phase corresponding to 8-9 AM. Platelet count, ATP release, aggregability, and plasma epinephrine also had significant circadian rhythms but with later peaks (corresponding to 3-8 PM). The circadian effects on the platelet activation markers were always larger than those of any of the three behavioral stressors. These data demonstrate robust effects of the endogenous circadian system on platelet activation in humans, independent of the sleep/wake cycle, other behavioral influences and the environment. The 9 AM timing of the circadian peaks of the three platelet surface markers, including platelet surface activated GPIIb-IIIa, the final common pathway of platelet aggregation, suggests that endogenous
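    Rhythm parameters of the kind reported here (peak-trough amplitude and time of peak) are commonly estimated by fitting a 24-h sinusoid to measurements binned by circadian phase (cosinor regression). The sketch below shows that generic fit on made-up data and is not the study's analysis code.

    ```python
    import numpy as np

    def cosinor(hours, values, period=24.0):
        """Least-squares fit of y = M + A*cos(2*pi*(t - phi)/period)."""
        t = 2 * np.pi * np.asarray(hours, dtype=float) / period
        X = np.column_stack([np.ones_like(t), np.cos(t), np.sin(t)])
        m, b, c = np.linalg.lstsq(X, np.asarray(values, dtype=float), rcond=None)[0]
        amplitude = np.hypot(b, c)                                       # A
        peak_time = (np.arctan2(c, b) * period / (2 * np.pi)) % period   # phi, in hours
        return m, amplitude, peak_time

    # Made-up marker values peaking near circadian hour 9 (roughly 9 AM).
    rng = np.random.default_rng(1)
    hours = np.arange(0, 48, 2) % 24
    values = 100 + 8 * np.cos(2 * np.pi * (hours - 9) / 24) + rng.normal(0, 1, hours.size)
    mesor, amp, peak = cosinor(hours, values)
    print(f"mesor {mesor:.1f}, peak-trough amplitude {2 * amp:.1f}, peak near hour {peak:.1f}")
    ```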

  7. Errors in Neonatology

    OpenAIRE

    Antonio Boldrini; Rosa T. Scaramuzzo; Armando Cuttano

    2013-01-01

    Introduction: Danger and errors are inherent in human activities. In medical practice errors can lead to adverse events for patients. Mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience in Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main err...

  8. Systematic Procedural Error

    National Research Council Canada - National Science Library

    Byrne, Michael D

    2006-01-01

    .... This problem has received surprisingly little attention from cognitive psychologists. The research summarized here examines such errors in some detail both empirically and through computational cognitive modeling...

  9. Human errors and mistakes

    International Nuclear Information System (INIS)

    Wahlstroem, B.

    1993-01-01

    Human errors make a major contribution to the risks of industrial accidents. Accidents have provided important lessons, making it possible to build safer systems. In avoiding human errors it is necessary to adapt the systems to their operators. The complexity of modern industrial systems is, however, increasing the danger of system accidents. Models of the human operator have been proposed, but the models are not able to give accurate predictions of human performance. Human errors can never be eliminated, but their frequency can be decreased by systematic efforts. The paper gives a brief summary of research in human error and it concludes with suggestions for further work. (orig.)

  10. Early occurring and continuing effects

    International Nuclear Information System (INIS)

    Scott, B.R.; Hahn, F.F.

    1985-01-01

    This chapter deals with health-risk estimates for early and continuing effects of exposure to ionizing radiations that could be associated with light water nuclear power plant accidents. Early and continuing effects considered are nonneoplastic diseases and symptoms that normally occur soon after radiation exposure, but may also occur after years have passed. They are generally associated with relatively high (greater than 1 Gy) doses. For most of the effects considered, there is a practical dose threshold. Organs of primary interest, because of their high sensitivity or the likelihood of receiving a large radiation dose, are bone marrow, gastrointestinal tract, thyroid glands, lungs, skin, gonads, and eyes. In utero exposure of the fetus is also considered. New data and modeling techniques available since publication of the Reactor Safety Study (WASH 1400, 1975) were used along with data cited in the Study to develop improved health-risk models for morbidity and mortality. The new models are applicable to a broader range of accident scenarios, provide a more detailed treatment of dose protraction effects, and include morbidity effects not considered in the Reactor Safety Study. 115 references, 20 figures, 19 tables

  11. Naturally-occurring alpha activity

    Energy Technology Data Exchange (ETDEWEB)

    Mayneord, W V

    1960-12-01

    In view of the difficulties of assessing the significance of man-made radioactivity it is important to study for comparison the background of natural radioactivity against which the human race has evolved and lives. It is also important to define the present levels of activity so that it will be possible to detect and study as quickly as possible any changes which may occur owing to the release into the environment of new radioactive materials. Moreover, by the study of the behaviour of natural radioactivity light may be shed upon that of the artificially produced isotopes and a number of analogies traced between the two groups. These concepts have led to studies of naturally-occurring radioactive materials alongside a programme of research into fission products in food, water and air, as well as studies of the metabolism of both sets of materials in the human body. Since the last report there has been a useful increase in our knowledge of natural radioactivity in the biosphere, and its levels relative to the new man-made activities. These studies have necessitated technical developments, particularly in the methods of measuring and identifying alpha-ray emitters, to which group many of the more important natural radioactive materials belong.

  12. Inequities in the Global Health Workforce: The Greatest Impediment to Health in Sub-Saharan Africa

    Directory of Open Access Journals (Sweden)

    Chipayeni Mtonga

    2007-06-01

    Health systems played a key role in the dramatic rise in global life expectancy that occurred during the 20th century, and have continued to contribute enormously to the improvement of the health of most of the world's population. The health workforce is the backbone of each health system, the lubricant that facilitates the smooth implementation of health action for sustainable socio-economic development. It has been proved beyond reasonable doubt that the density of the health workforce is directly correlated with positive health outcomes. In other words, health workers save lives and improve health. About 59 million people make up the health workforce of paid full-time health workers worldwide. However, enormous gaps remain between the potential of health systems and their actual performance, and there are far too many inequities in the distribution of health workers between countries and within countries. The Americas (mainly USA and Canada) are home to 14% of the world's population, bear only 10% of the world's disease burden, have 37% of the global health workforce and spend about 50% of the world's financial resources for health. Conversely, sub-Saharan Africa, with about 11% of the world's population, bears over 24% of the global disease burden, is home to only 3% of the global health workforce, and spends less than 1% of the world's financial resources on health. In most developing countries, the health workforce is concentrated in the major towns and cities, while rural areas can only boast of about 23% and 38% of the country's doctors and nurses respectively. The imbalances exist not only in the total numbers and geographical distribution of health workers, but also in the skills mix of available health workers. WHO estimates that 57 countries worldwide have a critical shortage of health workers, equivalent to a global deficit of about 2

  13. Naturally occurring methyl salicylate glycosides.

    Science.gov (United States)

    Mao, Ping; Liu, Zizhen; Xie, Meng; Jiang, Rui; Liu, Weirui; Wang, Xiaohong; Meng, Shen; She, Gaimei

    2014-01-01

    As an important part of the non-steroidal anti-inflammatory drug (NSAID) family, salicylates have developed from the natural substance salicylic acid, to sodium salicylate, to aspirin. Methyl salicylate glycoside, a new derivative of salicylic acid, is modified so that the -COOH group is esterified with a methyl group and the -OH is linked to a monosaccharide, a disaccharide or a trisaccharide unit by a glycosidic linkage. It has pharmacological activities similar to those of the earlier salicylates (anti-inflammatory, analgesic, antipyretic and antithrombotic) without causing serious side effects, particularly gastrointestinal toxicity. Owing to these significant bioactivities, methyl salicylate glycosides have become an active research area within NSAIDs for several years. This paper compiles all 9 naturally occurring methyl salicylate glycosides, together with their natural sources and pharmacological mechanisms, which could contribute to new drug discovery.

  14. Learning from Errors

    Science.gov (United States)

    Metcalfe, Janet

    2017-01-01

    Although error avoidance during learning appears to be the rule in American classrooms, laboratory studies suggest that it may be a counterproductive strategy, at least for neurologically typical students. Experimental investigations indicate that errorful learning followed by corrective feedback is beneficial to learning. Interestingly, the…

  15. What occurred in the reactors

    International Nuclear Information System (INIS)

    Kudo, Kazuhiko

    2013-01-01

    Described here is what occurred in the reactors of the Fukushima Daiichi Nuclear Power Plant during the Tohoku earthquake and tsunami (Mar. 11, 2011), from an engineering point of view. The tsunami struck the plant about 1 hr after the quake. The reactor buildings no.1-4 stood 10 m above normal sea level, and the site was flooded to a depth of 1.5-5.5 m. All reactors no.1-6 at the plant were of the boiling water type, and their core nuclear reactions were stopped within 3 sec of the first quake by automatically inserted control rods. Reactors no.1-5 lost their external AC power sources through the breakdown and subsequent submergence (no.1-4) of various equipment, and in no.1, 2 and 4 the backup DC power was then lost when the batteries were exhausted. Although the isolation condenser started to cool reactor no.1 after the DC power was lost, its valve was subsequently left closed, so the reactor heated up, leading to the reaction of the heated Zr in the fuel cladding with water to yield H2, which accumulated in the building: the cause of the hydrogen explosion on the 12th. Reactor no.2 had a reactor core isolation cooling system (RCIC) which operated normally for a few hours and then probably stopped, so the reactor heated up and the core melted down, but no explosion occurred because the door of the blowout panel on the wall had been opened by the blast of the no.1 explosion. Reactor no.3 had an RCIC and a high pressure coolant injection system, but they stopped working, resulting in core damage and H2 accumulation leading to the explosion on the 14th. Reactor no.4 had not been operating because of its periodic annual inspection, but it exploded on the 15th; the cause was thought to be backflow of H2 from no.3. Finally, the author discusses the accident from the industrial viewpoint of safety design (defense in depth) in an international context, and the problems and tasks it poses. (T.T.)

  16. Error Tendencies in Processing Student Feedback for Instructional Decision Making.

    Science.gov (United States)

    Schermerhorn, John R., Jr.; And Others

    1985-01-01

    Seeks to assist instructors in recognizing two basic errors that can occur in processing student evaluation data on instructional development efforts; offers a research framework for future investigations of the error tendencies and related issues; and suggests ways in which instructors can confront and manage error tendencies in practice. (MBR)

  17. Audit of medication errors by anesthetists in North Western Nigeria ...

    African Journals Online (AJOL)

    ... errors do occur in the everyday practice of anesthetists in Nigeria as in other countries and can lead to morbidity and mortality in our patients. Routine audit and reporting of critical incidents including errors in drug administration should be encouraged. Reduction of medication errors is an important aspect of patient safety, ...

  18. Prescribing Errors in Cardiovascular Diseases in a Tertiary Health ...

    African Journals Online (AJOL)

    Prescription errors are now known to be contributing to a large number of deaths during the treatment of cardiovascular diseases. However, there is paucity of information about these errors occurring in health facilities in Nigeria. The objective of this study was to investigate the prevalence of prescribing errors in ...

  19. Iatrogenic medication errors in a paediatric intensive care unit in ...

    African Journals Online (AJOL)

    Errors most frequently encountered included failure to calculate rates of infusion and the conversion of mL to mEq or mL to mg for potassium, phenobarbitone and digoxin. Of the 117 children admitted, 111 (94.9%) were exposed to at least one medication error. Two or more medication errors occurred in 34.1% of cases.

  20. Early occurring and continuing effects

    International Nuclear Information System (INIS)

    Scott, B.R.; Hahn, F.F.

    1989-01-01

    This chapter develops health-risk models for early and continuing effects of exposure to beta or gamma radiation that could be associated with light water nuclear power plant accidents. The main purpose of the chapter is to provide details on each health-risk model and on the data used. Early and continuing effects considered are prodromal symptoms and nonneoplastic diseases that usually occur soon after a brief radiation exposure. These effects are generally associated with relatively high (greater than 1 Gy) absorbed organ doses. For most of the effects considered, there is an absorbed organ dose threshold below which no effects are seen. Some information is provided on health effects observed in victims of the Chernobyl power plant accident. Organs of primary interest, because of their high sensitivity or their potential for receiving large doses, are bone marrow, gastrointestinal tract, thyroid glands, lungs, skin, gonads, and eyes. Exposure of the fetus is also considered. Additional data and modeling techniques available since publication of the Reactor Safety Study were used to obtain models for morbidity and mortality

  1. Does overtraining occur in triathletes?

    Directory of Open Access Journals (Sweden)

    I Margaritis

    2003-06-01

    Full Text Available 1. Objective: Long-distance triathlon training is characterized by considerably high training volumes, which can provoke an overtraining state. The aim of the study was to determine whether overtraining occurs in well-trained male triathletes in relation to their training volume loads. 2. Experimental design: A questionnaire investigation was completed two days before the Nice long-distance triathlon (October 1995: 4-km swim, 120-km bike ride and 30-km run). 3. Participants: Ninety-three well-trained male triathletes who took part in the race. 4. Measures: A questionnaire recording clinical symptoms known to appear in cases of overtraining was collected. 5. Results: 39.8% of the questioned triathletes reported a decrease in triathlon performance within the month preceding the race. Moreover, these triathletes exhibited significantly more overtraining-related symptoms than the others (5.9±3.8 vs 3.4±2.6, P<0.05). Surprisingly, the occurrence of overtraining in triathletes did not appear to depend on training volume. 6. Conclusions: These results suggest that overtraining has to be considered in triathletes. This preliminary study highlights the need for further investigation in order to monitor triathletes' training responses and prevent overtraining.

  2. The District Nursing Clinical Error Reduction Programme.

    Science.gov (United States)

    McGraw, Caroline; Topping, Claire

    2011-01-01

    The District Nursing Clinical Error Reduction (DANCER) Programme was initiated in NHS Islington following an increase in the number of reported medication errors. The objectives were to reduce the actual degree of harm and the potential risk of harm associated with medication errors and to maintain the existing positive reporting culture, while robustly addressing performance issues. One hundred medication errors reported in 2007/08 were analysed using a framework that specifies the factors that predispose to adverse medication events in domiciliary care. Various contributory factors were identified and interventions were subsequently developed to address poor drug calculation and medication problem-solving skills and incorrectly transcribed medication administration record charts. Follow up data were obtained at 12 months and two years. The evaluation has shown that although medication errors do still occur, the programme has resulted in a marked shift towards a reduction in the associated actual degree of harm and the potential risk of harm.

  3. Medication Errors: New EU Good Practice Guide on Risk Minimisation and Error Prevention.

    Science.gov (United States)

    Goedecke, Thomas; Ord, Kathryn; Newbould, Victoria; Brosch, Sabine; Arlett, Peter

    2016-06-01

    A medication error is an unintended failure in the drug treatment process that leads to, or has the potential to lead to, harm to the patient. Reducing the risk of medication errors is a shared responsibility between patients, healthcare professionals, regulators and the pharmaceutical industry at all levels of healthcare delivery. In 2015, the EU regulatory network released a two-part good practice guide on medication errors to support both the pharmaceutical industry and regulators in the implementation of the changes introduced with the EU pharmacovigilance legislation. These changes included a modification of the 'adverse reaction' definition to include events associated with medication errors, and the requirement for national competent authorities responsible for pharmacovigilance in EU Member States to collaborate and exchange information on medication errors resulting in harm with national patient safety organisations. To facilitate reporting and learning from medication errors, a clear distinction has been made in the guidance between medication errors resulting in adverse reactions, medication errors without harm, intercepted medication errors and potential errors. This distinction is supported by an enhanced MedDRA(®) terminology that allows for coding all stages of the medication use process where the error occurred in addition to any clinical consequences. To better understand the causes and contributing factors, individual case safety reports involving an error should be followed-up with the primary reporter to gather information relevant for the conduct of root cause analysis where this may be appropriate. Such reports should also be summarised in periodic safety update reports and addressed in risk management plans. Any risk minimisation and prevention strategy for medication errors should consider all stages of a medicinal product's life-cycle, particularly the main sources and types of medication errors during product development. This article

  4. Uncorrected refractive errors.

    Science.gov (United States)

    Naidoo, Kovin S; Jaggernath, Jyoti

    2012-01-01

    Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  5. Uncorrected refractive errors

    Directory of Open Access Journals (Sweden)

    Kovin S Naidoo

    2012-01-01

    Full Text Available Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  6. Common Errors in Ecological Data Sharing

    Directory of Open Access Journals (Sweden)

    Robert B. Cook

    2013-04-01

    Full Text Available Objectives: (1) to identify common errors in data organization and metadata completeness that would preclude a “reader” from being able to interpret and re-use the data for a new purpose; and (2) to develop a set of best practices derived from these common errors that would guide researchers in creating more usable data products that could be readily shared, interpreted, and used. Methods: We used directed qualitative content analysis to assess and categorize data and metadata errors identified by peer reviewers of data papers published in the Ecological Society of America’s (ESA) Ecological Archives. Descriptive statistics provided the relative frequency of the errors identified during the peer review process. Results: There were seven overarching error categories: Collection & Organization, Assure, Description, Preserve, Discover, Integrate, and Analyze/Visualize. These categories represent errors researchers regularly make at each stage of the Data Life Cycle. Collection & Organization and Description errors were some of the most common errors, both of which occurred in over 90% of the papers. Conclusions: Publishing data for sharing and reuse is error prone, and each stage of the Data Life Cycle presents opportunities for mistakes. The most common errors occurred when the researcher did not provide adequate metadata to enable others to interpret and potentially re-use the data. Fortunately, there are ways to minimize these mistakes through carefully recording all details about study context, data collection, QA/QC, and analytical procedures from the beginning of a research project and then including this descriptive information in the metadata.

  7. Error estimation in plant growth analysis

    Directory of Open Access Journals (Sweden)

    Andrzej Gregorczyk

    2014-01-01

    Full Text Available A scheme is presented for calculating the errors in dry matter values that arise when data are approximated with growth curves, fitted by an analytical method (logistic function) and by a numerical method (Richards function). Formulae are then given for the absolute errors of the growth characteristics: growth rate (GR), relative growth rate (RGR), unit leaf rate (ULR) and leaf area ratio (LAR). Worked examples concerning the growth of oat and maize plants are given, and the resulting estimates are critically analysed. The usefulness of combining statistical methods with error calculus in plant growth analysis is demonstrated.
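
    For orientation, the growth characteristics listed above are conventionally defined as follows; these are the standard quantities of classical plant growth analysis, given here only as background, and are not the error formulae derived in the paper itself:

    \[
      \mathrm{GR} = \frac{dW}{dt}, \qquad
      \mathrm{RGR} = \frac{1}{W}\,\frac{dW}{dt}, \qquad
      \mathrm{ULR} = \frac{1}{L_A}\,\frac{dW}{dt}, \qquad
      \mathrm{LAR} = \frac{L_A}{W},
    \]

    where W is total plant dry mass, L_A is leaf area and t is time; the identity RGR = ULR × LAR links the three ratio characteristics.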

  8. Preventing Errors in Laterality

    OpenAIRE

    Landau, Elliot; Hirschorn, David; Koutras, Iakovos; Malek, Alexander; Demissie, Seleshie

    2014-01-01

    An error in laterality is the reporting of a finding that is present on the right side as on the left or vice versa. While different medical and surgical specialties have implemented protocols to help prevent such errors, very few studies have been published that describe these errors in radiology reports and ways to prevent them. We devised a system that allows the radiologist to view reports in a separate window, displayed in a simple font and with all terms of laterality highlighted in sep...

  9. Errors and violations

    International Nuclear Information System (INIS)

    Reason, J.

    1988-01-01

    This paper is in three parts. The first part summarizes the human failures responsible for the Chernobyl disaster and argues that, in considering the human contribution to power plant emergencies, it is necessary to distinguish between errors and violations, and between active and latent failures. The second part presents empirical evidence, drawn from driver behavior, which suggests that errors and violations have different psychological origins. The concluding part outlines a resident pathogen view of accident causation, and seeks to identify the various system pathways along which errors and violations may be propagated

  10. Visual correlation analytics of event-based error reports for advanced manufacturing

    OpenAIRE

    Nazir, Iqbal

    2017-01-01

    With the growing digitalization and automation in the manufacturing domain, an increasing amount of process data and error reports becomes available. To minimize the number of errors and maximize the efficiency of the production line, it is important to analyze the generated error reports and find solutions that can reduce future errors. However, not all errors are of equal importance, as some errors may be the result of errors that occurred earlier. Therefore, it is important for domain exper...

  11. Managing organizational errors: Three theoretical lenses on a bank collapse

    OpenAIRE

    Giolito, Vincent

    2015-01-01

    Errors have been shown to be a major source of organizational disasters, yet scant research has paid attention to the management of errors, that is, what managers do once errors have occurred and how their actions may determine outcomes. In an early attempt to build a theory of the management of organizational errors, this paper examines how extant theory applies to the collapse of a bank. The financial industry was chosen because of the systemic risks it entails, as demonstrated by the financial cr...

  12. Help prevent hospital errors

    Science.gov (United States)

    Patient instructions: //medlineplus.gov/ency/patientinstructions/000618.htm ... If You Are Having Surgery, Help Keep Yourself Safe: Go to a hospital you ...

  13. Pedal Application Errors

    Science.gov (United States)

    2012-03-01

    This project examined the prevalence of pedal application errors and the driver, vehicle, roadway and/or environmental characteristics associated with pedal misapplication crashes based on a literature review, analysis of news media reports, a panel ...

  14. Rounding errors in weighing

    International Nuclear Information System (INIS)

    Jeach, J.L.

    1976-01-01

    When rounding error is large relative to weighing error, it cannot be ignored when estimating scale precision and bias from calibration data. Further, if the data grouping is coarse, rounding error is correlated with weighing error and may also have a mean quite different from zero. These facts are taken into account in a moment estimation method. A copy of the program listing for the MERDA program that provides moment estimates is available from the author. Experience suggests that if the data fall into four or more cells or groups, it is not necessary to apply the moment estimation method. Rather, the estimate given by equation (3) is valid in this instance. 5 tables

  15. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  16. Errors in energy bills

    International Nuclear Information System (INIS)

    Kop, L.

    2001-01-01

    On request, the Dutch Association for Energy, Environment and Water (VEMW) checks energy bills for its customers. In the year 2000, many errors, both small and large, were discovered in the bills of 42 businesses

  17. Medical Errors Reduction Initiative

    National Research Council Canada - National Science Library

    Mutter, Michael L

    2005-01-01

    The Valley Hospital of Ridgewood, New Jersey, is proposing to extend a limited but highly successful specimen management and medication administration medical errors reduction initiative on a hospital-wide basis...

  18. Study of Errors among Nursing Students

    Directory of Open Access Journals (Sweden)

    Ella Koren

    2007-09-01

    Full Text Available The study of errors in the health system today is a topic of considerable interest aimed at reducing errors through analysis of the phenomenon and the conclusions reached. Errors that occur frequently among health professionals have also been observed among nursing students. True, in most cases they are actually “near errors,” but these could be a future indicator of therapeutic reality and the effect of nurses' work environment on their personal performance. There are two different approaches to such errors: (a) The EPP (error-prone person) approach lays full responsibility at the door of the individual involved in the error, whether a student, nurse, doctor, or pharmacist. According to this approach, handling consists purely in identifying and penalizing the guilty party. (b) The EPE (error-prone environment) approach emphasizes the environment as a primary contributory factor to errors. The environment as an abstract concept includes components and processes of interpersonal communications, work relations, human engineering, workload, pressures, technical apparatus, and new technologies. The objective of the present study was to examine the role played by factors in and components of personal performance as compared to elements and features of the environment. The study was based on both of the aforementioned approaches, which, when combined, enable a comprehensive understanding of the phenomenon of errors among the student population as well as a comparison of factors contributing to human error and to error deriving from the environment. The theoretical basis of the study was a model that combined both approaches: one focusing on the individual and his or her personal performance and the other focusing on the work environment. The findings emphasize the work environment of health professionals as an EPE. However, errors could have been avoided by means of strict adherence to practical procedures. The authors examined error events in the

  19. The surveillance error grid.

    Science.gov (United States)

    Klonoff, David C; Lias, Courtney; Vigersky, Robert; Clarke, William; Parkes, Joan Lee; Sacks, David B; Kirkman, M Sue; Kovatchev, Boris

    2014-07-01

    Currently used error grids for assessing clinical accuracy of blood glucose monitors are based on out-of-date medical practices. Error grids have not been widely embraced by regulatory agencies for clearance of monitors, but this type of tool could be useful for surveillance of the performance of cleared products. Diabetes Technology Society together with representatives from the Food and Drug Administration, the American Diabetes Association, the Endocrine Society, and the Association for the Advancement of Medical Instrumentation, and representatives of academia, industry, and government, have developed a new error grid, called the surveillance error grid (SEG) as a tool to assess the degree of clinical risk from inaccurate blood glucose (BG) monitors. A total of 206 diabetes clinicians were surveyed about the clinical risk of errors of measured BG levels by a monitor. The impact of such errors on 4 patient scenarios was surveyed. Each monitor/reference data pair was scored and color-coded on a graph per its average risk rating. Using modeled data representative of the accuracy of contemporary meters, the relationships between clinical risk and monitor error were calculated for the Clarke error grid (CEG), Parkes error grid (PEG), and SEG. SEG action boundaries were consistent across scenarios, regardless of whether the patient was type 1 or type 2 or using insulin or not. No significant differences were noted between responses of adult/pediatric or 4 types of clinicians. Although small specific differences in risk boundaries between US and non-US clinicians were noted, the panel felt they did not justify separate grids for these 2 types of clinicians. The data points of the SEG were classified in 15 zones according to their assigned level of risk, which allowed for comparisons with the classic CEG and PEG. Modeled glucose monitor data with realistic self-monitoring of blood glucose errors derived from meter testing experiments plotted on the SEG when compared to
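
    As a rough illustration of how an error-grid style assessment is applied, the sketch below scores a reference/monitor pair by its relative error and bins it into a risk zone. The zone names and thresholds are invented for illustration only; the actual SEG risk values come from the clinician survey described above, not from fixed relative-error bands.

    # Hypothetical, simplified error-grid style classification of a blood glucose
    # reading. The bands below are illustrative only; they are NOT the
    # surveillance error grid's actual zones, which are survey-derived.
    def relative_error(reference_mgdl: float, monitor_mgdl: float) -> float:
        """Signed relative error of the monitor reading versus the reference."""
        return (monitor_mgdl - reference_mgdl) / reference_mgdl

    def risk_zone(reference_mgdl: float, monitor_mgdl: float) -> str:
        """Assign a coarse, made-up risk zone from the absolute relative error."""
        err = abs(relative_error(reference_mgdl, monitor_mgdl))
        if err <= 0.05:
            return "negligible"
        if err <= 0.15:
            return "slight"
        if err <= 0.40:
            return "moderate"
        return "great"

    # Example: a reference of 60 mg/dL (hypoglycaemia) read as 90 mg/dL.
    print(risk_zone(60.0, 90.0))  # -> "great" under these illustrative bands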

  20. Design for Error Tolerance

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1983-01-01

    An important aspect of the optimal design of computer-based operator support systems is the sensitivity of such systems to operator errors. The author discusses how a system might allow for human variability with the use of reversibility and observability.

  1. Apologies and Medical Error

    Science.gov (United States)

    2008-01-01

    One way in which physicians can respond to a medical error is to apologize. Apologies—statements that acknowledge an error and its consequences, take responsibility, and communicate regret for having caused harm—can decrease blame, decrease anger, increase trust, and improve relationships. Importantly, apologies also have the potential to decrease the risk of a medical malpractice lawsuit and can help settle claims by patients. Patients indicate they want and expect explanations and apologies after medical errors and physicians indicate they want to apologize. However, in practice, physicians tend to provide minimal information to patients after medical errors and infrequently offer complete apologies. Although fears about potential litigation are the most commonly cited barrier to apologizing after medical error, the link between litigation risk and the practice of disclosure and apology is tenuous. Other barriers might include the culture of medicine and the inherent psychological difficulties in facing one’s mistakes and apologizing for them. Despite these barriers, incorporating apology into conversations between physicians and patients can address the needs of both parties and can play a role in the effective resolution of disputes related to medical error. PMID:18972177

  2. Thermodynamics of Error Correction

    Directory of Open Access Journals (Sweden)

    Pablo Sartori

    2015-12-01

    Full Text Available Information processing at the molecular scale is limited by thermal fluctuations. This can cause undesired consequences in copying information since thermal noise can lead to errors that can compromise the functionality of the copy. For example, a high error rate during DNA duplication can lead to cell death. Given the importance of accurate copying at the molecular scale, it is fundamental to understand its thermodynamic features. In this paper, we derive a universal expression for the copy error as a function of entropy production and work dissipated by the system during wrong incorporations. Its derivation is based on the second law of thermodynamics; hence, its validity is independent of the details of the molecular machinery, be it any polymerase or artificial copying device. Using this expression, we find that information can be copied in three different regimes. In two of them, work is dissipated to either increase or decrease the error. In the third regime, the protocol extracts work while correcting errors, reminiscent of a Maxwell demon. As a case study, we apply our framework to study a copy protocol assisted by kinetic proofreading, and show that it can operate in any of these three regimes. We finally show that, for any effective proofreading scheme, error reduction is limited by the chemical driving of the proofreading reaction.

  3. Phonological errors predominate in Arabic spelling across grades 1-9.

    Science.gov (United States)

    Abu-Rabia, Salim; Taha, Haitham

    2006-03-01

    Most spelling error analysis has been conducted in Latin orthographies and rarely in other orthographies such as Arabic. Two hundred and eighty-eight students in grades 1-9 participated in the study. They were presented with nine lists of words to test their spelling skills, and their spelling errors were analyzed by error category. The most frequent errors were phonological, and the results did not indicate any significant differences in the percentage of phonological errors across grades one to nine. Thus, phonology probably presents the greatest challenge to students developing spelling skills in Arabic.

  4. Learning from Errors

    Directory of Open Access Journals (Sweden)

    MA. Lendita Kryeziu

    2015-06-01

    Full Text Available “Errare humanum est” is a well-known and widespread Latin proverb stating that to err is human and that people make mistakes all the time. However, what counts is that people learn from their mistakes. On these grounds Steve Jobs stated: “Sometimes when you innovate, you make mistakes. It is best to admit them quickly, and get on with improving your other innovations.” Similarly, in learning a new language, learners make mistakes, so it is important to accept them, learn from them, discover why they are made, improve and move on. The significance of studying errors is described by Corder: “There have always been two justifications proposed for the study of learners' errors: the pedagogical justification, namely that a good understanding of the nature of error is necessary before a systematic means of eradicating them could be found, and the theoretical justification, which claims that a study of learners' errors is part of the systematic study of the learners' language which is itself necessary to an understanding of the process of second language acquisition” (Corder, 1982: 1). Thus the aim of this paper is to analyse errors in the process of second language acquisition and the ways in which we teachers can benefit from mistakes to help students improve, while giving proper feedback.

  5. Compact disk error measurements

    Science.gov (United States)

    Howe, D.; Harriman, K.; Tehranchi, B.

    1993-01-01

    The objectives of this project are as follows: provide hardware and software that will perform simple, real-time, high-resolution (single-byte) measurement of the error burst and good data gap statistics seen by a photoCD player read channel when recorded CD write-once discs of variable quality (i.e., condition) are being read; extend the above system to enable measurement of the hard decision (i.e., 1-bit error flags) and soft decision (i.e., 2-bit error flags) decoding information that is produced/used by the Cross-Interleaved Reed-Solomon Code (CIRC) block decoder employed in the photoCD player read channel; construct a model that uses data obtained via the systems described above to produce meaningful estimates of output error rates (due to both uncorrected ECC words and misdecoded ECC words) when a CD disc having specific (measured) error statistics is read (completion date to be determined); and check the hypothesis that current adaptive CIRC block decoders are optimized for pressed (DAD/ROM) CD discs. If warranted, do a conceptual design of an adaptive CIRC decoder that is optimized for write-once CD discs.

  6. Medication administration errors in Eastern Saudi Arabia

    International Nuclear Information System (INIS)

    Mir Sadat-Ali

    2010-01-01

    To assess the prevalence and characteristics of medication errors (ME) in patients admitted to King Fahd University Hospital, Alkhobar, Kingdom of Saudi Arabia. Medication errors were documented by nurses and physicians on standard reporting forms (Hospital Based Incident Report). The study was carried out in King Fahd University Hospital, Alkhobar, and all incident reports were collected during the period from January 2008 to December 2009. The incident reports were analyzed for age, gender, nationality, nursing unit, and the time at which the ME was reported. Statistically significant differences between groups were determined by Student's t-test, and p-values of <0.05 at a 95% confidence interval were considered significant. There were 38 ME reported during the study period. The youngest patient was 5 days old and the oldest 70 years. There were 31 Saudi and 7 non-Saudi patients involved. The most common error was missed medication, seen in 15 (39.5%) patients. Over 15 (39.5%) of the errors occurred in 2 units (pediatric medicine, and obstetrics and gynecology). Nineteen (50%) of the errors occurred during the 3-11 pm shift. Our study shows that the prevalence of ME in our institution is low in comparison with the world literature. This could be due to under-reporting of errors, and we believe that ME reporting should be made less punitive so that ME can be studied and preventive measures implemented (Author).

  7. Analyzing temozolomide medication errors: potentially fatal.

    Science.gov (United States)

    Letarte, Nathalie; Gabay, Michael P; Bressler, Linda R; Long, Katie E; Stachnik, Joan M; Villano, J Lee

    2014-10-01

    The EORTC-NCIC regimen for glioblastoma requires different dosing of temozolomide (TMZ) during radiation and maintenance therapy. This complexity is exacerbated by the availability of multiple TMZ capsule strengths. TMZ is an alkylating agent and the major toxicity of this class is dose-related myelosuppression. Inadvertent overdose can be fatal. The websites of the Institute for Safe Medication Practices (ISMP), and the Food and Drug Administration (FDA) MedWatch database were reviewed. We searched the MedWatch database for adverse events associated with TMZ and obtained all reports including hematologic toxicity submitted from 1st November 1997 to 30th May 2012. The ISMP describes errors with TMZ resulting from the positioning of information on the label of the commercial product. The strength and quantity of capsules on the label were in close proximity to each other, and this has been changed by the manufacturer. MedWatch identified 45 medication errors. Patient errors were the most common, accounting for 21 or 47% of errors, followed by dispensing errors, which accounted for 13 or 29%. Seven reports or 16% were errors in the prescribing of TMZ. Reported outcomes ranged from reversible hematological adverse events (13%), to hospitalization for other adverse events (13%) or death (18%). Four error reports lacked detail and could not be categorized. Although the FDA issued a warning in 2003 regarding fatal medication errors and the product label warns of overdosing, errors in TMZ dosing occur for various reasons and involve both healthcare professionals and patients. Overdosing errors can be fatal.

  8. Errors in Neonatology

    Directory of Open Access Journals (Sweden)

    Antonio Boldrini

    2013-06-01

    Full Text Available Introduction: Danger and errors are inherent in human activities. In medical practice errors can lead to adverse events for patients, and the mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience in the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main error domains are: medication and total parenteral nutrition, resuscitation and respiratory care, invasive procedures, nosocomial infections, patient identification, and diagnostics. Risk factors include patients' size, prematurity, vulnerability and underlying disease conditions, but also multidisciplinary teams, working conditions conducive to fatigue, and the large variety of treatment and investigative modalities needed. Discussion and Conclusions: In our opinion, it is hardly possible to change human beings, but it is possible to change the conditions under which they work. Voluntary error reporting systems can help in preventing adverse events. Education and re-training by means of simulation can be an effective strategy too. In Pisa (Italy), Nina (ceNtro di FormazIone e SimulazioNe NeonAtale) is a simulation centre that offers the possibility of continuous retraining in technical and non-technical skills to optimize neonatological care strategies. Furthermore, we have been working on a novel skill trainer for mechanical ventilation (MEchatronic REspiratory System SImulator for Neonatal Applications, MERESSINA). Finally, in our opinion national health policy indirectly influences the risk of errors. Proceedings of the 9th International Workshop on Neonatology · Cagliari (Italy) · October 23rd-26th, 2013 · Learned lessons, changing practice and cutting-edge research

  9. Spent fuel bundle counter sequence error manual - BRUCE NGS

    International Nuclear Information System (INIS)

    Nicholson, L.E.

    1992-01-01

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message typically contains adequate information to determine the cause of the message. This manual provides a guide to interpret the various sequence error messages that can occur and suggests probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults. Therefore the card file contains sequences with only one fault at a time. Some exceptions have been included however where experience has indicated that several faults can occur simultaneously

  10. Spent fuel bundle counter sequence error manual - DARLINGTON NGS

    International Nuclear Information System (INIS)

    Nicholson, L.E.

    1992-01-01

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message typically contains adequate information to determine the cause of the message. This manual provides a guide to interpret the various sequence error messages that can occur and suggests probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults. Therefore the card file contains sequences with only one fault at a time. Some exceptions have been included however where experience has indicated that several faults can occur simultaneously

  11. LIBERTARISMO & ERROR CATEGORIAL

    Directory of Open Access Journals (Sweden)

    Carlos G. Patarroyo G.

    2009-01-01

    Full Text Available This article offers a defence of libertarianism against two accusations according to which it commits a category error. To this end, Gilbert Ryle's philosophy is used as a tool to explain the reasons behind these accusations and to show why, even though certain versions of libertarianism that appeal to agent causation or to Cartesian dualism do commit these errors, a libertarianism that looks to physicalist indeterminism for the basis of the possibility of human freedom cannot necessarily be accused of incurring them.

  12. Libertarismo & Error Categorial

    OpenAIRE

    PATARROYO G, CARLOS G

    2009-01-01

    This article offers a defence of libertarianism against two accusations according to which it commits a category error. To this end, Gilbert Ryle's philosophy is used as a tool to explain the reasons behind these accusations and to show why, even though certain versions of libertarianism that appeal to agent causation or to Cartesian dualism do commit these errors, a libertarianism that looks to physicalist indeterminism for the basis of the possibili...

  13. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  14. Human errors, countermeasures for their prevention and evaluation

    International Nuclear Information System (INIS)

    Kohda, Takehisa; Inoue, Koichi

    1992-01-01

    Accidents originating in human error continue to occur, as in recent large accidents such as the TMI and Chernobyl accidents. The proportion of accidents originating in human error is unexpectedly high; even as the reliability and safety of hardware improve, a comparable improvement in human reliability cannot be expected. Human errors arise from the difference between the function required of people and the function they actually accomplish, and the results exert adverse effects on systems. Human errors are classified into design errors, manufacturing errors, operation errors, maintenance errors, checkup errors and general handling errors. In terms of behavior, human errors are classified as forgetting to act, failing to act, doing what must not be done, acting in the wrong order, and acting at an improper time. The factors in human error occurrence are circumstantial factors, personal factors and stress factors. For analyzing and evaluating human errors, systems engineering methods such as probabilistic risk assessment are used. The technique for human error rate prediction (THERP), the human cognitive reliability method, the confusion matrix and SLIM-MAUD are also used. (K.I.)

  15. Overestimation of reliability by Guttman’s λ4, λ5, and λ6, and the greatest lower bound

    NARCIS (Netherlands)

    Oosterwijk, P.R.; van der Ark, L.A.; Sijtsma, K.; van der Ark, L.A.; Wiberg, M.; Culpepper, S.A.; Douglas, J.A.; Wang, W.-C.

    2017-01-01

    For methods using statistical optimization to estimate lower bounds to test-score reliability, we investigated the degree to which they overestimate true reliability. Optimization methods do not only exploit real relationships between items but also tend to capitalize on sampling error and do this

  16. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  17. Error Correcting Codes

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 3. Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article Volume 2 Issue 3 March ... Author Affiliations. Priti Shankar1. Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  18. Quantum error-correcting code for ternary logic

    Science.gov (United States)

    Majumdar, Ritajit; Basu, Saikat; Ghosh, Shibashis; Sur-Kolay, Susmita

    2018-05-01

    Ternary quantum systems are being studied because they provide more computational state space per unit of information, known as qutrit. A qutrit has three basis states, thus a qubit may be considered as a special case of a qutrit where the coefficient of one of the basis states is zero. Hence both (2 × 2)-dimensional and (3 × 3)-dimensional Pauli errors can occur on qutrits. In this paper, we (i) explore the possible (2 × 2)-dimensional as well as (3 × 3)-dimensional Pauli errors in qutrits and show that any pairwise bit swap error can be expressed as a linear combination of shift errors and phase errors, (ii) propose a special type of error called a quantum superposition error and show its equivalence to arbitrary rotation, (iii) formulate a nine-qutrit code which can correct a single error in a qutrit, and (iv) provide its stabilizer and circuit realization.
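
    For reference, the shift and phase errors referred to in this abstract are conventionally written in terms of the generalized qutrit Pauli operators; the definitions below are the standard ones and are given for orientation rather than taken from the paper:

    \[
      X\lvert j\rangle = \lvert j \oplus 1\rangle, \qquad
      Z\lvert j\rangle = \omega^{j}\lvert j\rangle, \qquad
      \omega = e^{2\pi i/3}, \quad j \in \{0,1,2\},
    \]

    where \oplus denotes addition modulo 3. The nine products X^a Z^b (a, b \in \{0,1,2\}) form an operator basis for a single qutrit, which is why a code that corrects arbitrary shift and phase errors also corrects arbitrary single-qutrit errors.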

  19. Nursing Errors in Intensive Care Unit by Human Error Identification in Systems Tool: A Case Study

    Directory of Open Access Journals (Sweden)

    Nezamodini

    2016-03-01

    Full Text Available Background Although health services are designed and implemented to improve human health, errors in health services are a very common and sometimes fatal phenomenon. Medical errors and their costs are global issues with serious consequences for the patient community; they are preventable and require serious attention. Objectives The current study aimed to identify possible nursing errors by applying the human error identification in systems tool (HEIST) in the intensive care units (ICUs) of hospitals. Patients and Methods This descriptive research was conducted in the intensive care unit of a hospital in Khuzestan province in 2013. Data were collected through observation of and interviews with nine nurses in this unit over a period of four months. Human error classification was based on the Rose and Rose and the Swain and Guttmann models. Following the HEIST worksheets, the guide questions were answered and error causes were identified after the type of error had been determined. Results In total 527 errors were detected. Performing an operation on the wrong path had the highest frequency (150), and the second-ranked error, with a frequency of 136, was carrying out tasks later than the deadline. Management causes, with a frequency of 451, ranked first among the identified causes. Errors mostly occurred in the system observation stage and, among the performance shaping factors (PSFs), time was the factor with the greatest influence on the occurrence of human errors. Conclusions In order to prevent the occurrence and reduce the consequences of the identified errors the following suggestions were proposed: appropriate training courses, applying work guidelines and monitoring their implementation, increasing the number of work shifts, hiring a professional workforce, and equipping the workspace with appropriate facilities and equipment.

  20. Challenge and Error: Critical Events and Attention-Related Errors

    Science.gov (United States)

    Cheyne, James Allan; Carriere, Jonathan S. A.; Solman, Grayden J. F.; Smilek, Daniel

    2011-01-01

    Attention lapses resulting from reactivity to task challenges and their consequences constitute a pervasive factor affecting everyday performance errors and accidents. A bidirectional model of attention lapses (error ↔ attention-lapse: Cheyne, Solman, Carriere, & Smilek, 2009) argues that errors beget errors by generating attention…

  1. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...
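
    To make the pessimism argument concrete, the sketch below contrasts a naive estimate that sums the raw soft error rates of all memories with one that weights each memory by the fraction of upsets that would actually corrupt data the program still uses. It is a minimal illustration with invented numbers, not the authors' methodology or tool.

    # Minimal sketch: why summing raw soft error rates (SERs) is pessimistic.
    # The memories, SER values and "live fractions" are invented for
    # illustration; the paper derives such factors from system simulation.
    MEMORIES = {
        # name: (raw SER in FIT, fraction of upsets that lead to a system failure)
        "l1_cache": (200.0, 0.15),
        "l2_cache": (800.0, 0.05),
        "main_mem": (1500.0, 0.02),
    }

    def pessimistic_ser(memories) -> float:
        """Upper bound: assume every soft error makes the system faulty."""
        return sum(raw for raw, _ in memories.values())

    def timing_aware_ser(memories) -> float:
        """Weight each memory's SER by the fraction of upsets that matter."""
        return sum(raw * live for raw, live in memories.values())

    print(pessimistic_ser(MEMORIES))   # 2500.0 FIT (pessimistic accumulation)
    print(timing_aware_ser(MEMORIES))  # 100.0 FIT (location/timing aware)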

  2. Evaluation of drug administration errors in a teaching hospital

    Directory of Open Access Journals (Sweden)

    Berdot Sarah

    2012-03-01

    Full Text Available Abstract Background Medication errors can occur at any of the three steps of the medication use process: prescribing, dispensing and administration. We aimed to determine the incidence, type and clinical importance of drug administration errors and to identify risk factors. Methods Prospective study based on a disguised observation technique in four wards in a teaching hospital in Paris, France (800 beds). A pharmacist accompanied nurses and witnessed the preparation and administration of drugs to all patients during the three drug rounds on each of six days per ward. Main outcomes were the number, type and clinical importance of errors and the associated risk factors. The drug administration error rate was calculated with and without wrong time errors. Relationships between the occurrence of errors and potential risk factors were investigated using logistic regression models with random effects. Results Twenty-eight nurses caring for 108 patients were observed. Among 1501 opportunities for error, 415 administrations (430 errors) with one or more errors were detected (27.6%). There were 312 wrong time errors, ten of them occurring simultaneously with another type of error, resulting in an error rate without wrong time errors of 7.5% (113/1501). The most frequently administered drugs were cardiovascular drugs (425/1501, 28.3%). The highest risk of error in a drug administration was for dermatological drugs. No potentially life-threatening errors were witnessed and 6% of errors were classified as having a serious or significant impact on patients (mainly omission). In multivariate analysis, the occurrence of errors was associated with the drug administration route, the drug classification (ATC) and the number of patients under the nurse's care. Conclusion Medication administration errors are frequent. The identification of their determinants helps in undertaking targeted interventions.
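
    The two reported rates can be reconciled from the counts quoted in the abstract; the quick check below assumes that the 312 wrong time errors occurred in 312 distinct administrations, 10 of which also had another type of error.

    # Re-deriving the reported administration error rates from the abstract's counts.
    opportunities = 1501            # opportunities for error
    admins_with_error = 415         # administrations with at least one error
    wrong_time_errors = 312         # wrong time errors
    wrong_time_with_other = 10      # wrong time errors coinciding with another error type

    overall_rate = admins_with_error / opportunities                       # 415/1501 ~ 27.6%
    wrong_time_only = wrong_time_errors - wrong_time_with_other            # 302 administrations
    strict_rate = (admins_with_error - wrong_time_only) / opportunities    # 113/1501 ~ 7.5%

    print(round(overall_rate * 100, 1), round(strict_rate * 100, 1))  # 27.6 7.5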

  3. The development of a concise questionnaire designed to measure perceived outcomes on the issues of greatest importance to patients.

    Science.gov (United States)

    Busby, M; Burke, F J T; Matthews, R; Cyrta, J; Mullins, A

    2012-04-01

    To develop a concise patient feedback audit instrument designed to inform practice development on those issues of greatest importance to patients. A literature review was used to establish the issues which were of greatest importance to patients. Ten core questions were then designed with the help of an experienced survey and polling organisation. A challenging grading of patient responses was utilised in an attempt to differentiate perceived performance within a practice on the different aspects and between practices. A feasibility study was conducted using the interactive voice response mode with seven volunteer practices in 2009. The instrument was then used in the later part of 2010 by 61 practices mostly in paper-based format. Practices received feedback which is primarily based on a bar chart plotting their percentage of top grades received against a national reference sample (NRS) compiled from the results of other participating practices. A statistical analysis was conducted to establish the level at which an individual practice result becomes statistically significant against the NRS. The 61 participating practices each received an average of 121 responses (total 7,381 responses). Seventy-four percent of responses across all ten questions received the top grade, 'ideal'. Statistical analysis indicated that at the level of 121 responses, a score of around 4-9% difference to the National Reference Sample, depending on the specific question, was statistically significant. In keeping with international experience with dental patient feedback surveys this audit suggests high levels of patient satisfaction with their dental service. Nevertheless, by focusing results on the proportion of highest grades received, this instrument is capable of indicating when perceived performance falls significantly below the average. It can therefore inform practice development.

  4. Correlated Errors in the Surface Code

    Science.gov (United States)

    Lopez, Daniel; Mucciolo, E. R.; Novais, E.

    2012-02-01

    A milestone step in the development of quantum information technology would be the ability to design and operate a reliable quantum memory. The greatest obstacle to creating such a device has been decoherence due to the unavoidable interaction between the quantum system and its environment. Quantum error correction is therefore an essential ingredient of any quantum information processing device. A great deal of attention has been given to surface codes, since they have very good scaling properties. In this seminar, we discuss the time evolution of a qubit encoded in the logical basis of a surface code. The system is interacting with a bosonic environment at zero temperature. Our results show how much spatial and time correlations can be detrimental to the efficiency of the code.

  5. Imagery of Errors in Typing

    Science.gov (United States)

    Rieger, Martina; Martinez, Fanny; Wenke, Dorit

    2011-01-01

    Using a typing task we investigated whether insufficient imagination of errors and error corrections is related to duration differences between execution and imagination. In Experiment 1 spontaneous error imagination was investigated, whereas in Experiment 2 participants were specifically instructed to imagine errors. Further, in Experiment 2 we…

  6. NDE errors and their propagation in sizing and growth estimates

    International Nuclear Information System (INIS)

    Horn, D.; Obrutsky, L.; Lakhan, R.

    2009-01-01

    this work, additional calculations can be performed as needed. Changes in the identification of correlated effects, the magnitude of errors, and the analytical form of voltage response can be made easily. The calculated errors on growth may be used to reduce conservative margins on plugging limits and the sensitivity analysis can be used to identify the technique improvements that would provide the greatest benefits. (author)
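
    The propagation step itself follows standard uncertainty rules. If growth is estimated as the difference between two successive sizing measurements, then, as a generic propagation-of-uncertainty relation (not the specific voltage-response error model analysed in this work),

    \[
      g = s_2 - s_1, \qquad
      \sigma_g^2 = \sigma_1^2 + \sigma_2^2 - 2\rho\,\sigma_1\sigma_2,
    \]

    where \sigma_1 and \sigma_2 are the sizing uncertainties of the two inspections and \rho is their correlation; error components common to both inspections partially cancel in the growth estimate, while independent components add in quadrature, which is why identifying correlated effects matters when setting margins on plugging limits.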

  7. Correction of refractive errors

    Directory of Open Access Journals (Sweden)

    Vladimir Pfeifer

    2005-10-01

    Full Text Available Background: Spectacles and contact lenses are the most frequently used, safest and cheapest ways to correct refractive errors. The development of keratorefractive surgery has brought new opportunities for the correction of refractive errors in patients who wish to be less dependent on spectacles or contact lenses. Until recently, RK was the most commonly performed refractive procedure for nearsighted patients. Conclusions: The introduction of the excimer laser in refractive surgery has opened new opportunities for remodelling the cornea. The laser energy can be delivered to the stromal surface, as in PRK, or deeper into the corneal stroma by means of lamellar surgery. In LASIK the flap is created with a microkeratome, in LASEK with ethanol, and in epi-LASIK the ultra-thin flap is created mechanically.

  8. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suited for automatically developing ultra reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  9. Minimum Tracking Error Volatility

    OpenAIRE

    Luca RICCETTI

    2010-01-01

    Investors assign part of their funds to asset managers that are given the task of beating a benchmark. The risk management department usually imposes a maximum value of the tracking error volatility (TEV) in order to keep the risk of the portfolio near to that of the selected benchmark. However, risk management does not establish a rule on TEV which enables us to understand whether the asset manager is really active or not and, in practice, asset managers sometimes follow passively the corres...

  10. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
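
    For readers unfamiliar with the outer code named in the record, the sketch below encodes and decodes one RS(255,223) block: 223 data symbols plus 32 parity symbols, correcting up to 16 symbol errors. It is an illustrative, assumption-laden example, not the study's software; it relies on the third-party reedsolo package, and any Reed-Solomon library with the same parameters would serve.

      # Minimal RS(255,223) demonstration using the third-party `reedsolo` package
      # (an assumption; not the coding system analyzed in the report).
      from reedsolo import RSCodec

      rsc = RSCodec(32)                # 32 parity symbols -> RS(255,223), corrects 16 symbol errors

      message = bytes(range(223))      # one full block of data symbols
      codeword = rsc.encode(message)   # 255-byte codeword

      corrupted = bytearray(codeword)
      for i in range(100, 110):        # 10-symbol burst, within the correction capability
          corrupted[i] ^= 0xFF

      decoded = rsc.decode(corrupted)
      # Recent reedsolo versions return (message, message+ecc, errata positions);
      # older versions return just the decoded message bytearray.
      recovered = decoded[0] if isinstance(decoded, tuple) else decoded
      assert bytes(recovered) == message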

  11. Satellite Photometric Error Determination

    Science.gov (United States)

    2015-10-18

    Satellite Photometric Error Determination Tamara E. Payne, Philip J. Castro, Stephen A. Gregory Applied Optimization 714 East Monument Ave, Suite...advocate the adoption of new techniques based on in-frame photometric calibrations enabled by newly available all-sky star catalogs that contain highly...filter systems will likely be supplanted by the Sloan based filter systems. The Johnson photometric system is a set of filters in the optical

  12. Video Error Correction Using Steganography

    Science.gov (United States)

    Robie, David L.; Mersereau, Russell M.

    2002-12-01

    The transmission of any data is always subject to corruption due to errors, but video transmission, because of its real-time nature, must deal with these errors without retransmission of the corrupted data. The errors can be handled using forward error correction in the encoder or error concealment techniques in the decoder. This MPEG-2 compliant codec uses data hiding to transmit error correction information and several error concealment techniques in the decoder. The decoder resynchronizes more quickly with fewer errors than traditional resynchronization techniques. It also allows for perfect recovery of differentially encoded DCT-DC components and motion vectors. This provides for a much higher quality picture in an error-prone environment while creating an almost imperceptible degradation of the picture in an error-free environment.

  13. Video Error Correction Using Steganography

    Directory of Open Access Journals (Sweden)

    Robie David L

    2002-01-01

    Full Text Available The transmission of any data is always subject to corruption due to errors, but video transmission, because of its real-time nature, must deal with these errors without retransmission of the corrupted data. The errors can be handled using forward error correction in the encoder or error concealment techniques in the decoder. This MPEG-2 compliant codec uses data hiding to transmit error correction information and several error concealment techniques in the decoder. The decoder resynchronizes more quickly with fewer errors than traditional resynchronization techniques. It also allows for perfect recovery of differentially encoded DCT-DC components and motion vectors. This provides for a much higher quality picture in an error-prone environment while creating an almost imperceptible degradation of the picture in an error-free environment.

  14. Cultural differences in categorical memory errors persist with age.

    Science.gov (United States)

    Gutchess, Angela; Boduroglu, Aysecan

    2018-01-02

    This cross-sectional experiment examined the influence of aging on cross-cultural differences in memory errors. Previous research revealed that Americans committed more categorical memory errors than Turks; we tested whether the cognitive constraints associated with aging impacted the pattern of memory errors across cultures. Furthermore, older adults are vulnerable to memory errors for semantically-related information, and we assessed whether this tendency occurs across cultures. Younger and older adults from the US and Turkey studied word pairs, with some pairs sharing a categorical relationship and some unrelated. Participants then completed a cued recall test, generating the second word of each pair when cued with the first. These responses were scored for correct responses or different types of errors, including categorical and semantic. The tendency for Americans to commit more categorical memory errors emerged for both younger and older adults. In addition, older adults across cultures committed more memory errors, and these were for semantically-related information (including both categorical and other types of semantic errors). Heightened vulnerability to memory errors with age extends across cultural groups, and Americans' proneness to commit categorical memory errors occurs across ages. The findings indicate some robustness in the ways that age and culture influence memory errors.

  15. [Medication errors in Spanish intensive care units].

    Science.gov (United States)

    Merino, P; Martín, M C; Alonso, A; Gutiérrez, I; Alvarez, J; Becerril, F

    2013-01-01

    To estimate the incidence of medication errors in Spanish intensive care units. Post hoc study of the SYREC trial. A longitudinal observational study carried out during 24 hours in patients admitted to the ICU. Spanish intensive care units. Patients admitted to the intensive care unit participating in the SYREC during the period of study. Risk, individual risk, and rate of medication errors. The final study sample consisted of 1017 patients from 79 intensive care units; 591 (58%) were affected by one or more incidents. Of these, 253 (43%) had at least one medication-related incident. The total number of incidents reported was 1424, of which 350 (25%) were medication errors. The risk of suffering at least one incident was 22% (IQR: 8-50%) while the individual risk was 21% (IQR: 8-42%). The medication error rate was 1.13 medication errors per 100 patient-days of stay. Most incidents occurred in the prescription (34%) and administration (28%) phases, 16% resulted in patient harm, and 82% were considered "totally avoidable". Medication errors are among the most frequent types of incidents in critically ill patients, and are more common in the prescription and administration stages. Although most such incidents have no clinical consequences, a significant percentage prove harmful for the patient, and a large proportion are avoidable. Copyright © 2012 Elsevier España, S.L. and SEMICYUC. All rights reserved.

  16. Error-related brain activity and error awareness in an error classification paradigm.

    Science.gov (United States)

    Di Gregorio, Francesco; Steinhauser, Marco; Maier, Martin E

    2016-10-01

    Error-related brain activity has been linked to error detection enabling adaptive behavioral adjustments. However, it is still unclear which role error awareness plays in this process. Here, we show that the error-related negativity (Ne/ERN), an event-related potential reflecting early error monitoring, is dissociable from the degree of error awareness. Participants responded to a target while ignoring two different incongruent distractors. After responding, they indicated whether they had committed an error, and if so, whether they had responded to one or to the other distractor. This error classification paradigm allowed distinguishing partially aware errors, (i.e., errors that were noticed but misclassified) and fully aware errors (i.e., errors that were correctly classified). The Ne/ERN was larger for partially aware errors than for fully aware errors. Whereas this speaks against the idea that the Ne/ERN foreshadows the degree of error awareness, it confirms the prediction of a computational model, which relates the Ne/ERN to post-response conflict. This model predicts that stronger distractor processing - a prerequisite of error classification in our paradigm - leads to lower post-response conflict and thus a smaller Ne/ERN. This implies that the relationship between Ne/ERN and error awareness depends on how error awareness is related to response conflict in a specific task. Our results further indicate that the Ne/ERN but not the degree of error awareness determines adaptive performance adjustments. Taken together, we conclude that the Ne/ERN is dissociable from error awareness and foreshadows adaptive performance adjustments. Our results suggest that the relationship between the Ne/ERN and error awareness is correlative and mediated by response conflict. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Medication errors in anaesthetic practice: a report of two cases and ...

    African Journals Online (AJOL)

    EB

    2013-09-03

    Sep 3, 2013 ... Key words: Medication errors, anaesthetic practice, vigilance, safety .... reports in the Australian Incident Monitoring Study. (AIMS). ... contribute to systems failure and prescription errors were most ... being due to equipment error.17 Previous studies have ... errors reported occurred during day shifts and they.

  18. Defining near misses : towards a sharpened definition based on empirical data about error handling processes

    NARCIS (Netherlands)

    Kessels-Habraken, M.M.P.; Schaaf, van der T.W.; Jonge, de J.; Rutte, C.G.

    2010-01-01

    Medical errors in health care still occur frequently. Unfortunately, errors cannot be completely prevented and 100% safety can never be achieved. Therefore, in addition to error reduction strategies, health care organisations could also implement strategies that promote timely error detection and

  19. Trial application of a technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Bley, D.C.; Cooper, S.E.; Parry, G.W.

    1996-01-01

    The new method for HRA, ATHEANA, has been developed based on a study of the operating history of serious accidents and an understanding of the reasons why people make errors. Previous publications associated with the project have dealt with the theoretical framework under which errors occur and the retrospective analysis of operational events. This is the first attempt to use ATHEANA in a prospective way, to select and evaluate human errors within the PSA context

  20. Diagnostic errors in pediatric radiology

    International Nuclear Information System (INIS)

    Taylor, George A.; Voss, Stephan D.; Melvin, Patrice R.; Graham, Dionne A.

    2011-01-01

    Little information is known about the frequency, types and causes of diagnostic errors in imaging children. Our goals were to describe the patterns and potential etiologies of diagnostic error in our subspecialty. We reviewed 265 cases with clinically significant diagnostic errors identified during a 10-year period. Errors were defined as a diagnosis that was delayed, wrong or missed; they were classified as perceptual, cognitive, system-related or unavoidable; and they were evaluated by imaging modality and level of training of the physician involved. We identified 484 specific errors in the 265 cases reviewed (mean:1.8 errors/case). Most discrepancies involved staff (45.5%). Two hundred fifty-eight individual cognitive errors were identified in 151 cases (mean = 1.7 errors/case). Of these, 83 cases (55%) had additional perceptual or system-related errors. One hundred sixty-five perceptual errors were identified in 165 cases. Of these, 68 cases (41%) also had cognitive or system-related errors. Fifty-four system-related errors were identified in 46 cases (mean = 1.2 errors/case) of which all were multi-factorial. Seven cases were unavoidable. Our study defines a taxonomy of diagnostic errors in a large academic pediatric radiology practice and suggests that most are multi-factorial in etiology. Further study is needed to define effective strategies for improvement. (orig.)

  1. Investigating Medication Errors in Educational Health Centers of Kermanshah

    Directory of Open Access Journals (Sweden)

    Mohsen Mohammadi

    2015-08-01

    Full Text Available Background and objectives: Medication errors can be a threat to the safety of patients. Preventing medication errors requires reporting and investigating such errors. The present study was conducted with the purpose of investigating medication errors in educational health centers of Kermanshah. Material and Methods: The present research was an applied, descriptive-analytical study conducted as a survey. The Error Report of the Ministry of Health and Medical Education was used for data collection. The population of the study included all the personnel (nurses, doctors, paramedics) of educational health centers of Kermanshah. Among them, those who reported the committed errors were selected as the sample of the study. The data analysis was done using descriptive statistics and the chi-square test in SPSS version 18. Results: The findings of the study showed that most errors were related to improper use of medication, the fewest errors were related to improper dose, and the majority of errors occurred in the morning. The most frequent reason for errors was staff negligence and the least frequent was lack of knowledge. Conclusion: The health care system should create an environment that enables personnel to detect and report errors, recognize the factors that cause errors, train the personnel, and provide a good working environment and a standard workload.

  2. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

    This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners also find in the book a detailed presentation of practical data classifiers using MEE. These include multi-layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using an MEE-like concept is also presented. Examples, tests, evaluation experiments and comparisons with similar machines using classic approaches complement the descriptions.
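
    As a rough, hypothetical sketch of the MEE idea (not taken from the book), the code below estimates Renyi's quadratic entropy of the classification errors with a Gaussian Parzen window and turns it into a loss: the "information potential" V(e) is the mean pairwise Gaussian kernel over error differences, and MEE minimizes -log V(e). All function names and the toy data are illustrative.

      import numpy as np

      def information_potential(errors, sigma=0.5):
          # V(e) = (1/N^2) * sum_ij G_{sigma*sqrt(2)}(e_i - e_j), a Parzen-window
          # estimate whose negative log is Renyi's quadratic entropy of the errors.
          e = np.asarray(errors, dtype=float)
          diffs = e[:, None] - e[None, :]
          kernel = np.exp(-diffs**2 / (4 * sigma**2)) / (2 * sigma * np.sqrt(np.pi))
          return kernel.mean()

      def mee_loss(y_true, y_pred, sigma=0.5):
          # Training criterion: minimize the error entropy -log V(e).
          return -np.log(information_potential(np.asarray(y_true) - np.asarray(y_pred), sigma))

      # Toy check: errors tightly concentrated around zero yield a lower error
      # entropy than widely spread errors.
      rng = np.random.default_rng(0)
      target = np.zeros(200)
      concentrated = rng.normal(0.0, 0.1, size=200)
      spread = rng.normal(0.0, 0.7, size=200)
      print(mee_loss(target, concentrated))  # smaller (lower entropy)
      print(mee_loss(target, spread))        # larger (higher entropy)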

  3. Nocturia is the Lower Urinary Tract Symptom With Greatest Impact on Quality of Life of Men From a Community Setting

    Directory of Open Access Journals (Sweden)

    Eduardo de Paula Miranda

    2014-06-01

    Full Text Available Purpose: Lower urinary tract symptoms are numerous, but the specific impact of each of these symptoms on the quality of life (QoL) has not been evaluated in community-dwelling men. An assessment of these symptoms and their effects on QoL was the focus of this study. Methods: We performed a cross-sectional study with 373 men aged >50 years from a community setting. Patients completed the International Prostate Symptom Score questionnaire, which includes questions on each of the specific urinary symptoms and a question addressing health-related QoL that are graded from 0 to 5. We used the Pearson correlation test to assess the impact of each symptom on QoL. Results: Nocturia (58.9%) was the most prevalent urinary symptom. The mean score was 0.9±1.4 for incomplete emptying, 1.0±1.5 for frequency, 0.9±1.3 for intermittency, 0.8±1.3 for urgency, 1.0±1.5 for weak stream, 0.5±1.0 for straining, and 2.0±1.6 for nocturia. Nocturia and frequency were the only symptoms associated with poorer QoL, with nocturia showing a stronger association. Conclusions: Nocturia affects 50% of community-dwelling men aged >50 years, and is the lower urinary tract symptom with the greatest negative impact on QoL.

  4. Eigen's Error Threshold and Mutational Meltdown in a Quasispecies Model

    OpenAIRE

    Bagnoli, F.; Bezzi, M.

    1998-01-01

    We introduce a toy model for interacting populations connected by mutations and limited by a shared resource. We study the presence of Eigen's error threshold and mutational meltdown. The phase diagram of the system shows that the extinction of the whole population due to mutational meltdown can occur well before an eventual error threshold transition.
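
    To make the error-threshold notion concrete, here is a hedged toy sketch (not the authors' model, which also includes a shared resource and mutational meltdown) of the classic single-peak quasispecies approximation: a master sequence of length L with fitness advantage A is copied with per-base fidelity q, and its equilibrium frequency collapses once A*q**L drops below 1. All parameter values are illustrative.

      def master_frequency(q, L=50, A=10.0, steps=2000):
          # Two-class single-peak quasispecies dynamics (back-mutation neglected):
          # the master sequence of fitness A is copied exactly with probability Q = q**L,
          # everything else forms an error tail of fitness 1.
          Q = q ** L
          x = 0.5  # initial master frequency
          for _ in range(steps):
              mean_fitness = A * x + (1.0 - x)
              x = A * Q * x / mean_fitness
          return x

      # The error threshold sits roughly where A * q**L falls below 1 (here near
      # q ~ 0.955): above it the master persists, below it the master is lost.
      for q in (0.999, 0.99, 0.97, 0.95):
          print(f"q = {q:5.3f}  equilibrium master frequency = {master_frequency(q):.3f}")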

  5. The Frame Constraint on Experimentally Elicited Speech Errors in Japanese

    Science.gov (United States)

    Saito, Akie; Inoue, Tomoyoshi

    2017-01-01

    The so-called syllable position effect in speech errors has been interpreted as reflecting constraints posed by the frame structure of a given language, which is separately operating from linguistic content during speech production. The effect refers to the phenomenon that when a speech error occurs, replaced and replacing sounds tend to be in the…

  6. Analysis of the interface tracking errors

    International Nuclear Information System (INIS)

    Cerne, G.; Tiselj, I.; Petelin, S.

    2001-01-01

    An important limitation of the interface-tracking algorithm is the grid density, which determines the space scale of the surface tracking. In this paper the analysis of the interface tracking errors, which occur in a dispersed flow, is performed for the VOF interface tracking method. A few simple two-fluid tests are proposed for the investigation of the interface tracking errors and their grid dependence. When the grid density becomes too coarse to follow the interface changes, the errors can be reduced either by using denser nodalization or by switching to the two-fluid model during the simulation. Both solutions are analyzed and compared on a simple vortex-flow test.(author)

  7. Standard Errors for Matrix Correlations.

    Science.gov (United States)

    Ogasawara, Haruhiko

    1999-01-01

    Derives the asymptotic standard errors and intercorrelations for several matrix correlations assuming multivariate normality for manifest variables and derives the asymptotic standard errors of the matrix correlations for two factor-loading matrices. (SLD)

  8. Fish species of greatest conservation need in wadeable Iowa streams: current status and effectiveness of Aquatic Gap Program distribution models

    Science.gov (United States)

    Sindt, Anthony R.; Pierce, Clay; Quist, Michael C.

    2012-01-01

    Effective conservation of fish species of greatest conservation need (SGCN) requires an understanding of species–habitat relationships and distributional trends. Thus, modeling the distribution of fish species across large spatial scales may be a valuable tool for conservation planning. Our goals were to evaluate the status of 10 fish SGCN in wadeable Iowa streams and to test the effectiveness of Iowa Aquatic Gap Analysis Project (IAGAP) species distribution models. We sampled fish assemblages from 86 wadeable stream segments in the Mississippi River drainage of Iowa during 2009 and 2010 to provide contemporary, independent fish species presence–absence data. The frequencies of occurrence in stream segments where species were historically documented varied from 0.0% for redfin shiner Lythrurus umbratilis to 100.0% for American brook lamprey Lampetra appendix, with a mean of 53.0%, suggesting that the status of Iowa fish SGCN is highly variable. Cohen's kappa values and other model performance measures were calculated by comparing field-collected presence–absence data with IAGAP model–predicted presences and absences for 12 fish SGCN. Kappa values varied from 0.00 to 0.50, with a mean of 0.15. The models only predicted the occurrences of banded darter Etheostoma zonale, southern redbelly dace Phoxinus erythrogaster, and longnose dace Rhinichthys cataractae more accurately than would be expected by chance. Overall, the accuracy of the twelve models was low, with a mean correct classification rate of 58.3%. Poor model performance probably reflects the difficulties associated with modeling the distribution of rare species and the inability of the large-scale habitat variables used in IAGAP models to explain the variation in fish species occurrences. Our results highlight the importance of quantifying the confidence in species distribution model predictions with an independent data set and the need for long-term monitoring to better understand the
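
    As a small, hypothetical illustration of the model check described above (made-up data; scikit-learn is an assumption, not necessarily the authors' tool), Cohen's kappa and the correct classification rate can be computed from paired model-predicted and field-observed presence/absence records like this:

      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      # 1 = present, 0 = absent, for 20 hypothetical wadeable stream segments.
      predicted = np.array([1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0])
      observed  = np.array([1, 0, 0, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 0])

      kappa = cohen_kappa_score(observed, predicted)  # chance-corrected agreement
      ccr = (observed == predicted).mean()            # raw correct classification rate
      print(f"kappa = {kappa:.2f}, correct classification rate = {ccr:.1%}")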

  9. When the most potent combination of antibiotics selects for the greatest bacterial load: the smile-frown transition.

    Directory of Open Access Journals (Sweden)

    Rafael Pena-Miller

    Full Text Available Conventional wisdom holds that the best way to treat infection with antibiotics is to 'hit early and hit hard'. A favoured strategy is to deploy two antibiotics that produce a stronger effect in combination than if either drug were used alone. But are such synergistic combinations necessarily optimal? We combine mathematical modelling, evolution experiments, whole genome sequencing and genetic manipulation of a resistance mechanism to demonstrate that deploying synergistic antibiotics can, in practice, be the worst strategy if bacterial clearance is not achieved after the first treatment phase. As treatment proceeds, it is only to be expected that the strength of antibiotic synergy will diminish as the frequency of drug-resistant bacteria increases. Indeed, antibiotic efficacy decays exponentially in our five-day evolution experiments. However, as the theory of competitive release predicts, drug-resistant bacteria replicate fastest when their drug-susceptible competitors are eliminated by overly-aggressive treatment. Here, synergy exerts such strong selection for resistance that an antagonism consistently emerges by day 1 and the initially most aggressive treatment produces the greatest bacterial load, a fortiori greater than if just one drug were given. Whole genome sequencing reveals that such rapid evolution is the result of the amplification of a genomic region containing four drug-resistance mechanisms, including the acrAB efflux operon. When this operon is deleted in genetically manipulated mutants and the evolution experiment repeated, antagonism fails to emerge in five days and antibiotic synergy is maintained for longer. We therefore conclude that unless super-inhibitory doses are achieved and maintained until the pathogen is successfully cleared, synergistic antibiotics can have the opposite effect to that intended by helping to increase pathogen load where, and when, the drugs are found at sub-inhibitory concentrations.

  10. When the most potent combination of antibiotics selects for the greatest bacterial load: the smile-frown transition.

    Science.gov (United States)

    Pena-Miller, Rafael; Laehnemann, David; Jansen, Gunther; Fuentes-Hernandez, Ayari; Rosenstiel, Philip; Schulenburg, Hinrich; Beardmore, Robert

    2013-01-01

    Conventional wisdom holds that the best way to treat infection with antibiotics is to 'hit early and hit hard'. A favoured strategy is to deploy two antibiotics that produce a stronger effect in combination than if either drug were used alone. But are such synergistic combinations necessarily optimal? We combine mathematical modelling, evolution experiments, whole genome sequencing and genetic manipulation of a resistance mechanism to demonstrate that deploying synergistic antibiotics can, in practice, be the worst strategy if bacterial clearance is not achieved after the first treatment phase. As treatment proceeds, it is only to be expected that the strength of antibiotic synergy will diminish as the frequency of drug-resistant bacteria increases. Indeed, antibiotic efficacy decays exponentially in our five-day evolution experiments. However, as the theory of competitive release predicts, drug-resistant bacteria replicate fastest when their drug-susceptible competitors are eliminated by overly-aggressive treatment. Here, synergy exerts such strong selection for resistance that an antagonism consistently emerges by day 1 and the initially most aggressive treatment produces the greatest bacterial load, a fortiori greater than if just one drug were given. Whole genome sequencing reveals that such rapid evolution is the result of the amplification of a genomic region containing four drug-resistance mechanisms, including the acrAB efflux operon. When this operon is deleted in genetically manipulated mutants and the evolution experiment repeated, antagonism fails to emerge in five days and antibiotic synergy is maintained for longer. We therefore conclude that unless super-inhibitory doses are achieved and maintained until the pathogen is successfully cleared, synergistic antibiotics can have the opposite effect to that intended by helping to increase pathogen load where, and when, the drugs are found at sub-inhibitory concentrations.

  11. Pain is the Greatest Preoperative Concern for Patients and Parents Before Posterior Spinal Fusion for Adolescent Idiopathic Scoliosis.

    Science.gov (United States)

    Chan, Priscella; Skaggs, David L; Sanders, Austin E; Villamor, Gabriela A; Choi, Paul D; Tolo, Vernon T; Andras, Lindsay M

    2017-11-01

    Prospective cross-sectional study. To evaluate patients' and parents' concerns so they can be addressed with appropriate preoperative counseling. Despite much research on outcomes for posterior spinal fusion (PSF) in adolescent idiopathic scoliosis (AIS), little is available about preoperative fears or concerns. Patients with AIS undergoing PSF, their parents, and surgeons were prospectively enrolled and asked to complete a survey on their fears and concerns about surgery at their preoperative appointment. Forty-eight patients and parents completed surveys. Four attending pediatric spine surgeons participated and submitted 48 responses. Mean age of patients was 14.2 years. On a scale of 0 to 10, mean level of concern reported by parents (6.9) was higher than that reported by patients (4.6). Surgeons rated the procedure's complexity on a scale of 0 to 10 and reported a mean of 5.2. Neither patients' nor parents' level of concern correlated with the surgeons' assessment of the procedure's complexity level (R = 0.19 and 0.12, P = 0.20 and P = 0.42, respectively). Top three concerns for patients were pain (25%), ability to return to activities (21%), and neurologic injury (17%). Top three concerns for parents were pain (35%), neurologic injury (21%), and amount of correction (17%). Top three concerns for surgeons were postoperative shoulder balance (44%), neurologic injury (27%), and lowest instrumented vertebrae selection (27%). Patients reported the same concerns 23% of the time as parents, and 17% of the time as surgeons. Parents and surgeons reported the same concerns 21% of the time. Pain was the greatest concern for both patients and parents but was rarely listed as a concern by surgeons. Parent and patient level of concern did not correlate to the surgeon's assessment of the procedure's complexity. Neurologic injury was a top concern for all groups, but otherwise there was little overlap between physician, patient, and parent concerns. 3.

  12. Evaluating a medical error taxonomy.

    OpenAIRE

    Brixey, Juliana; Johnson, Todd R.; Zhang, Jiajie

    2002-01-01

    Healthcare has been slow in using human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need for reducing medication errors, the National Coordinating Council for Medication Errors Reporting and Prevention (NCC MERP) released the NCC MERP taxonomy that provides a stand...

  13. Generalizing human error rates: A taxonomic approach

    International Nuclear Information System (INIS)

    Buffardi, L.; Fleishman, E.; Allen, J.

    1989-01-01

    It is well established that human error plays a major role in malfunctioning of complex, technological systems and in accidents associated with their operation. Estimates of the rate of human error in the nuclear industry range from 20-65% of all system failures. In response to this, the Nuclear Regulatory Commission has developed a variety of techniques for estimating human error probabilities for nuclear power plant personnel. Most of these techniques require the specification of the range of human error probabilities for various tasks. Unfortunately, very little objective performance data on error probabilities exist for nuclear environments. Thus, when human reliability estimates are required, for example in computer simulation modeling of system reliability, only subjective estimates (usually based on experts' best guesses) can be provided. The objective of the current research is to provide guidelines for the selection of human error probabilities based on actual performance data taken in other complex environments and applying them to nuclear settings. A key feature of this research is the application of a comprehensive taxonomic approach to nuclear and non-nuclear tasks to evaluate their similarities and differences, thus providing a basis for generalizing human error estimates across tasks. In recent years significant developments have occurred in classifying and describing tasks. Initial goals of the current research are to: (1) identify alternative taxonomic schemes that can be applied to tasks, and (2) describe nuclear tasks in terms of these schemes. Three standardized taxonomic schemes (Ability Requirements Approach, Generalized Information-Processing Approach, Task Characteristics Approach) are identified, modified, and evaluated for their suitability in comparing nuclear and non-nuclear power plant tasks. An agenda for future research and its relevance to nuclear power plant safety is also discussed

  14. Forecast Combination under Heavy-Tailed Errors

    Directory of Open Access Journals (Sweden)

    Gang Cheng

    2015-11-01

    Full Text Available Forecast combination has been proven to be a very important technique to obtain accurate predictions for various applications in economics, finance, marketing and many other areas. In many applications, forecast errors exhibit heavy-tailed behaviors for various reasons. Unfortunately, to our knowledge, little has been done to obtain reliable forecast combinations for such situations. The familiar forecast combination methods, such as simple average, least squares regression or those based on the variance-covariance of the forecasts, may perform very poorly due to the fact that outliers tend to occur, and they make these methods have unstable weights, leading to un-robust forecasts. To address this problem, in this paper, we propose two nonparametric forecast combination methods. One is specially proposed for the situations in which the forecast errors are strongly believed to have heavy tails that can be modeled by a scaled Student’s t-distribution; the other is designed for relatively more general situations when there is a lack of strong or consistent evidence on the tail behaviors of the forecast errors due to a shortage of data and/or an evolving data-generating process. Adaptive risk bounds of both methods are developed. They show that the resulting combined forecasts yield near optimal mean forecast errors relative to the candidate forecasts. Simulations and a real example demonstrate their superior performance in that they indeed tend to have significantly smaller prediction errors than the previous combination methods in the presence of forecast outliers.
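
    The following hedged sketch is not the paper's estimator; it only illustrates the problem the authors address and one simple robust alternative. Candidate forecasts are combined with inverse-squared-scale weights, estimated either from the standard deviation of past errors (sensitive to heavy-tailed outliers) or from their median absolute deviation (more robust). The data, weights, and function names are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(42)
      T, K = 200, 3                         # time periods, candidate forecasters
      truth = rng.normal(size=T).cumsum()   # series being forecast

      # Candidate forecast errors: two well-behaved forecasters and one
      # contaminated with heavy-tailed (Student-t, 2 df) outliers.
      errors = rng.normal(0.0, 1.0, size=(T, K))
      errors[:, 2] += rng.standard_t(df=2, size=T)
      forecasts = truth[:, None] + errors

      def combine(fcasts, past_errors, scale_fn):
          # Weights proportional to 1/scale^2, with scale estimated on a training window.
          scale = scale_fn(past_errors) + 1e-9
          w = 1.0 / scale**2
          return fcasts @ (w / w.sum())

      std_scale = lambda e: e.std(axis=0)
      mad_scale = lambda e: np.median(np.abs(e - np.median(e, axis=0)), axis=0)

      var_combo = combine(forecasts[100:], errors[:100], std_scale)  # variance-based weights
      mad_combo = combine(forecasts[100:], errors[:100], mad_scale)  # robust MAD-based weights
      for name, combo in [("std-based weights", var_combo), ("MAD-based weights", mad_combo)]:
          print(name, "mean abs error:", round(float(np.mean(np.abs(combo - truth[100:]))), 3))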

  15. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  16. Error Patterns in Problem Solving.

    Science.gov (United States)

    Babbitt, Beatrice C.

    Although many common problem-solving errors within the realm of school mathematics have been previously identified, a compilation of such errors is not readily available within learning disabilities textbooks, mathematics education texts, or teacher's manuals for school mathematics texts. Using data on error frequencies drawn from both the Fourth…

  17. Performance, postmodernity and errors

    DEFF Research Database (Denmark)

    Harder, Peter

    2013-01-01

    speaker’s competency (note the –y ending!) reflects adaptation to the community langue, including variations. This reversal of perspective also reverses our understanding of the relationship between structure and deviation. In the heyday of structuralism, it was tempting to confuse the invariant system...... with the prestige variety, and conflate non-standard variation with parole/performance and class both as erroneous. Nowadays the anti-structural sentiment of present-day linguistics makes it tempting to confuse the rejection of ideal abstract structure with a rejection of any distinction between grammatical...... as deviant from the perspective of function-based structure and discuss to what extent the recognition of a community langue as a source of adaptive pressure may throw light on different types of deviation, including language handicaps and learner errors....

  18. Post-error action control is neurobehaviorally modulated under conditions of constant speeded response

    Directory of Open Access Journals (Sweden)

    Takahiro eSoshi

    2015-01-01

    Full Text Available Post-error slowing is an error recovery strategy that contributes to action control, and occurs after errors in order to prevent future behavioral flaws. Error recovery often malfunctions in clinical populations, but the relationship between behavioral traits and recovery from error is unclear in healthy populations. The present study investigated the relationship between impulsivity and error recovery by simulating a speeded response situation using a Go/No-go paradigm that forced the participants to constantly make accelerated responses prior to stimuli disappearance (stimulus duration: 250 ms). Neural correlates of post-error processing were examined using event-related potentials (ERPs). Impulsivity traits were measured with self-report questionnaires (BIS-11, BIS/BAS). Behavioral results demonstrated that the commission error for No-go trials was 15%, but post-error slowing did not take place immediately. Delayed post-error slowing was negatively correlated with error rates and impulsivity traits, showing that response slowing was associated with reduced error rates and changed with impulsivity. Response-locked error ERPs were clearly observed for the error trials. Contrary to previous studies, error ERPs were not significantly related to post-error slowing. Stimulus-locked N2 was negatively correlated with post-error slowing and positively correlated with impulsivity traits at the second post-error Go trial: larger N2 activity was associated with greater post-error slowing and less impulsivity. In summary, under constant speeded conditions, error monitoring was dissociated from post-error action control, and post-error slowing did not occur quickly. Furthermore, post-error slowing and its neural correlate (N2) were modulated by impulsivity traits. These findings suggest that there may be clinical and practical efficacy of maintaining cognitive control of actions during error recovery under common daily environments that frequently evoke

  19. Errors in causal inference: an organizational schema for systematic error and random error.

    Science.gov (United States)

    Suzuki, Etsuji; Tsuda, Toshihide; Mitsuhashi, Toshiharu; Mansournia, Mohammad Ali; Yamamoto, Eiji

    2016-11-01

    To provide an organizational schema for systematic error and random error in estimating causal measures, aimed at clarifying the concept of errors from the perspective of causal inference. We propose to divide systematic error into structural error and analytic error. With regard to random error, our schema shows its four major sources: nondeterministic counterfactuals, sampling variability, a mechanism that generates exposure events and measurement variability. Structural error is defined from the perspective of counterfactual reasoning and divided into nonexchangeability bias (which comprises confounding bias and selection bias) and measurement bias. Directed acyclic graphs are useful to illustrate this kind of error. Nonexchangeability bias implies a lack of "exchangeability" between the selected exposed and unexposed groups. A lack of exchangeability is not a primary concern of measurement bias, justifying its separation from confounding bias and selection bias. Many forms of analytic errors result from the small-sample properties of the estimator used and vanish asymptotically. Analytic error also results from wrong (misspecified) statistical models and inappropriate statistical methods. Our organizational schema is helpful for understanding the relationship between systematic error and random error from a previously less investigated aspect, enabling us to better understand the relationship between accuracy, validity, and precision. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Common errors of drug administration in infants: causes and avoidance.

    Science.gov (United States)

    Anderson, B J; Ellis, J F

    1999-01-01

    Drug administration errors are common in infants. Although the infant population has a high exposure to drugs, there are few data concerning pharmacokinetics or pharmacodynamics, or the influence of paediatric diseases on these processes. Children remain therapeutic orphans. Formulations are often suitable only for adults; in addition, the lack of maturation of drug elimination processes, alteration of body composition and influence of size render the calculation of drug doses complex in infants. The commonest drug administration error in infants is one of dose, and the commonest hospital site for this error is the intensive care unit. Drug errors are a consequence of system error, and preventive strategies are possible through system analysis. The goal of a zero drug error rate should be aggressively sought, with systems in place that aim to eliminate the effects of inevitable human error. This involves review of the entire system from drug manufacture to drug administration. The nuclear industry, telecommunications and air traffic control services all practise error reduction policies with zero error as a clear goal, not by finding fault in the individual, but by identifying faults in the system and building into that system mechanisms for picking up faults before they occur. Such policies could be adapted to medicine using interventions both specific (the production of formulations which are for children only and clearly labelled, regular audit by pharmacists, legible prescriptions, standardised dose tables) and general (paediatric drug trials, education programmes, nonpunitive error reporting) to reduce the number of errors made in giving medication to infants.

  1. Reducing diagnostic errors in medicine: what's the goal?

    Science.gov (United States)

    Graber, Mark; Gordon, Ruthanna; Franklin, Nancy

    2002-10-01

    This review considers the feasibility of reducing or eliminating the three major categories of diagnostic errors in medicine: "No-fault errors" occur when the disease is silent, presents atypically, or mimics something more common. These errors will inevitably decline as medical science advances, new syndromes are identified, and diseases can be detected more accurately or at earlier stages. These errors can never be eradicated, unfortunately, because new diseases emerge, tests are never perfect, patients are sometimes noncompliant, and physicians will inevitably, at times, choose the most likely diagnosis over the correct one, illustrating the concept of necessary fallibility and the probabilistic nature of choosing a diagnosis. "System errors" play a role when diagnosis is delayed or missed because of latent imperfections in the health care system. These errors can be reduced by system improvements, but can never be eliminated because these improvements lag behind and degrade over time, and each new fix creates the opportunity for novel errors. Tradeoffs also guarantee system errors will persist, when resources are just shifted. "Cognitive errors" reflect misdiagnosis from faulty data collection or interpretation, flawed reasoning, or incomplete knowledge. The limitations of human processing and the inherent biases in using heuristics guarantee that these errors will persist. Opportunities exist, however, for improving the cognitive aspect of diagnosis by adopting system-level changes (e.g., second opinions, decision-support systems, enhanced access to specialists) and by training designed to improve cognition or cognitive awareness. Diagnostic error can be substantially reduced, but never eradicated.

  2. The error performance analysis over cyclic redundancy check codes

    Science.gov (United States)

    Yoon, Hee B.

    1991-06-01

    Burst errors are generated in digital communication networks by various unpredictable conditions; they occur at high error rates, for short durations, and can impact services. To completely describe a burst error one has to know the bit pattern, which is impossible in practice on working systems. Therefore, under the memoryless binary symmetric channel (MBSC) assumptions, performance evaluation or estimation for digital signal 1 (DS1) transmission systems carrying live traffic is an interesting and important problem. This study presents some analytical methods leading to efficient burst-error detection algorithms using cyclic redundancy check (CRC) codes. The definition of burst error is introduced using three different models; among the three, the mathematical model is used in this study. A probability density function, f(b), of burst errors of length b is proposed. The performance of CRC-n codes is evaluated and analyzed using f(b) through a computer simulation model of burst errors within a CRC block. The simulation results show that the mean block burst error tends to approach the pattern of burst errors generated by random bit errors.
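
    As a hedged illustration (not the thesis's scheme or code), the sketch below computes a CRC-16-CCITT checksum over a frame, injects a short burst, and shows that the checksum no longer matches. Any CRC-n behaves analogously: every burst of length n bits or less is guaranteed to be detected. The frame contents and burst position are arbitrary.

      def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
          # Bitwise CRC-16-CCITT, generator polynomial x^16 + x^12 + x^5 + 1 (0x1021).
          for byte in data:
              crc ^= byte << 8
              for _ in range(8):
                  crc = ((crc << 1) ^ 0x1021) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
          return crc

      frame = bytes(range(64))          # 512-bit example frame
      checksum = crc16_ccitt(frame)

      # Inject a 12-bit burst error at an arbitrary bit offset.
      as_int = int.from_bytes(frame, "big") ^ (0xFFF << 100)
      corrupted = as_int.to_bytes(len(frame), "big")

      print("burst detected:", crc16_ccitt(corrupted) != checksum)  # True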

  3. Optimizer convergence and local minima errors and their clinical importance

    International Nuclear Information System (INIS)

    Jeraj, Robert; Wu, Chuan; Mackie, Thomas R

    2003-01-01

    Two of the errors common in the inverse treatment planning optimization have been investigated. The first error is the optimizer convergence error, which appears because of non-perfect convergence to the global or local solution, usually caused by a non-zero stopping criterion. The second error is the local minima error, which occurs when the objective function is not convex and/or the feasible solution space is not convex. The magnitude of the errors, their relative importance in comparison to other errors as well as their clinical significance in terms of tumour control probability (TCP) and normal tissue complication probability (NTCP) were investigated. Two inherently different optimizers, a stochastic simulated annealing and deterministic gradient method were compared on a clinical example. It was found that for typical optimization the optimizer convergence errors are rather small, especially compared to other convergence errors, e.g., convergence errors due to inaccuracy of the current dose calculation algorithms. This indicates that stopping criteria could often be relaxed leading into optimization speed-ups. The local minima errors were also found to be relatively small and typically in the range of the dose calculation convergence errors. Even for the cases where significantly higher objective function scores were obtained the local minima errors were not significantly higher. Clinical evaluation of the optimizer convergence error showed good correlation between the convergence of the clinical TCP or NTCP measures and convergence of the physical dose distribution. On the other hand, the local minima errors resulted in significantly different TCP or NTCP values (up to a factor of 2) indicating clinical importance of the local minima produced by physical optimization
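
    The two error types can be reproduced on a toy problem. The hedged sketch below (unrelated to the paper's planning system) minimizes a non-convex one-dimensional objective with a deterministic gradient method, which stops in whatever local minimum its starting point leads to (and stops early if the tolerance is loose), and with simulated annealing, which can escape that basin. All functions and parameter values are illustrative.

      import math, random

      def objective(x):
          # Non-convex 1-D surrogate: global minimum near x = -1.1,
          # local minima near x = 2.8 and x = -5.1.
          return 0.1 * (x + 2) ** 2 + 1.5 * math.sin(1.5 * x)

      def gradient_descent(x, lr=0.01, tol=1e-3, max_iter=10000):
          for _ in range(max_iter):
              g = (objective(x + 1e-6) - objective(x - 1e-6)) / 2e-6  # numerical gradient
              if abs(g) < tol:   # stopping criterion: a loose tol adds a convergence error
                  break
              x -= lr * g
          return x

      def simulated_annealing(x, T0=2.0, steps=20000, seed=1):
          rng = random.Random(seed)
          best = x
          for k in range(steps):
              T = T0 * (1 - k / steps) + 1e-6
              cand = x + rng.gauss(0, 0.5)
              if objective(cand) < objective(x) or rng.random() < math.exp((objective(x) - objective(cand)) / T):
                  x = cand
              best = min(best, x, key=objective)
          return best

      x_gd = gradient_descent(3.0)     # stays in the local minimum its start leads to
      x_sa = simulated_annealing(3.0)  # stochastic search can escape that basin
      print(f"gradient descent:    x = {x_gd:6.2f}, f = {objective(x_gd):.3f}")
      print(f"simulated annealing: x = {x_sa:6.2f}, f = {objective(x_sa):.3f}")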

  4. Posts, pics, or polls? Which post type generates the greatest engagement in a Facebook physical activity intervention?

    Science.gov (United States)

    Edney, Sarah; Looyestyn, Jemma; Ryan, Jillian; Kernot, Jocelyn; Maher, Carol

    2018-04-05

    Social networking websites have attracted considerable attention as a delivery platform for physical activity interventions. Current evidence highlights a need to enhance user engagement with these interventions to actualize their potential. The purpose of this study was to determine which post type generates the most engagement from participants and whether engagement was related to change in physical activity in an intervention delivered via Facebook. Subgroup analysis of the intervention condition of a randomized controlled trial was conducted. The group moderator posted a new message to the private Facebook group each day of the program. The Facebook posts (n = 118) were categorized into the following types: moderator-initiated running program, multimedia, motivational, opinion polls, or discussion question and participant-initiated experience shares, or questions. Four metrics were used to measure volume of engagement with each post type, "likes," "comments," "poll votes," and "photo uploads." One-way ANOVA was used to determine whether engagement differed by post type and an independent samples t-test to determine differences in engagement between moderator and participant-initiated posts. Pearson correlation was used to examine associations between total engagement and change in physical activity. Engagement varied by post type. Polls elicited the greatest engagement (p ≤ .01). The most common form of engagement was "likes," and engagement was higher for moderator-initiated rather than participant-initiated posts (mean = 8.0 [SD 6.8] vs. 5.3 [SD 3.2]; p ≤ .01). Total engagement with the Facebook group was not directly associated with change in physical activity (r = -.13, p = .47). However, engagement was associated with compliance with the running program (r = .37, p = .04) and there was a nonsignificant positive association between compliance and change in physical activity (r = .32, p = .08). Posts requiring a simple response generated the most

  5. Controlling errors in unidosis carts

    Directory of Open Access Journals (Sweden)

    Inmaculada Díaz Fernández

    2010-01-01

    Full Text Available Objective: To identify errors in the unidosis system carts. Method: For two months, the Pharmacy Service monitored medication either returned or missing from the unidosis carts, both in the pharmacy and in the wards. Results: Uncorrected unidosis carts showed 0.9% medication errors (264) versus 0.6% (154) in unidosis carts that had been previously revised. In carts not revised, 70.83% of the errors were caused when setting up the unidosis carts. The rest were due to a lack of stock or unavailability (21.6%), errors in the transcription of medical orders (6.81%), or boxes that had not been emptied previously (0.76%). The errors found in the units corresponded to errors in the transcription of the treatment (3.46%), non-receipt of the unidosis copy (23.14%), the patient not taking the medication (14.36%) or being discharged without medication (12.77%), medication not provided by nurses (14.09%), medication withdrawn from the stocks of the unit (14.62%), and errors of the pharmacy service (17.56%). Conclusions: Unidosis carts need to be revised, and a computerized prescription system is needed to avoid transcription errors. Discussion: A high percentage of medication errors is caused by human error. If unidosis carts are revised before being sent to hospitalization units, the error rate diminishes to 0.3%.

  6. Prioritising interventions against medication errors

    DEFF Research Database (Denmark)

    Lisby, Marianne; Pape-Larsen, Louise; Sørensen, Ann Lykkegaard

    Abstract Authors: Lisby M, Larsen LP, Soerensen AL, Nielsen LP, Mainz J Title: Prioritising interventions against medication errors – the importance of a definition Objective: To develop and test a restricted definition of medication errors across health care settings in Denmark Methods: Medication errors constitute a major quality and safety problem in modern healthcare. However, far from all are clinically important. The prevalence of medication errors ranges from 2-75%, indicating a global problem in defining and measuring these [1]. New cut-off levels focusing on the clinical impact of medication errors are therefore needed. Development of definition: A definition of medication errors including an index of error types for each stage in the medication process was developed from existing terminology and through a modified Delphi-process in 2008. The Delphi panel consisted of 25 interdisciplinary…

  7. Social aspects of clinical errors.

    Science.gov (United States)

    Richman, Joel; Mason, Tom; Mason-Whitehead, Elizabeth; McIntosh, Annette; Mercer, Dave

    2009-08-01

    Clinical errors, whether committed by doctors, nurses or other professions allied to healthcare, remain a sensitive issue requiring open debate and policy formulation in order to reduce them. The literature suggests that the issues underpinning errors made by healthcare professionals involve concerns about patient safety, professional disclosure, apology, litigation, compensation, processes of recording and policy development to enhance quality service. Anecdotally, we are aware of narratives of minor errors, which may well have been covered up and remain officially undisclosed whilst the major errors resulting in damage and death to patients alarm both professionals and public with resultant litigation and compensation. This paper attempts to unravel some of these issues by highlighting the historical nature of clinical errors and drawing parallels to contemporary times by outlining the 'compensation culture'. We then provide an overview of what constitutes a clinical error and review the healthcare professional strategies for managing such errors.

  8. Barriers to medication error reporting among hospital nurses.

    Science.gov (United States)

    Rutledge, Dana N; Retrosi, Tina; Ostrowski, Gary

    2018-03-01

    The study purpose was to report medication error reporting barriers among hospital nurses, and to determine validity and reliability of an existing medication error reporting barriers questionnaire. Hospital medication errors typically occur between ordering of a medication to its receipt by the patient with subsequent staff monitoring. To decrease medication errors, factors surrounding medication errors must be understood; this requires reporting by employees. Under-reporting can compromise patient safety by disabling improvement efforts. This 2017 descriptive study was part of a larger workforce engagement study at a faith-based Magnet ® -accredited community hospital in California (United States). Registered nurses (~1,000) were invited to participate in the online survey via email. Reported here are sample demographics (n = 357) and responses to the 20-item medication error reporting barriers questionnaire. Using factor analysis, four factors that accounted for 67.5% of the variance were extracted. These factors (subscales) were labelled Fear, Cultural Barriers, Lack of Knowledge/Feedback and Practical/Utility Barriers; each demonstrated excellent internal consistency. The medication error reporting barriers questionnaire, originally developed in long-term care, demonstrated good validity and excellent reliability among hospital nurses. Substantial proportions of American hospital nurses (11%-48%) considered specific factors as likely reporting barriers. Average scores on most barrier items were categorised "somewhat unlikely." The highest six included two barriers concerning the time-consuming nature of medication error reporting and four related to nurses' fear of repercussions. Hospitals need to determine the presence of perceived barriers among nurses using questionnaires such as the medication error reporting barriers and work to encourage better reporting. Barriers to medication error reporting make it less likely that nurses will report medication

  9. Errors in abdominal computed tomography

    International Nuclear Information System (INIS)

    Stephens, S.; Marting, I.; Dixon, A.K.

    1989-01-01

    Sixty-nine patients are presented in whom a substantial error was made on the initial abdominal computed tomography report. Certain features of these errors have been analysed. In 30 (43.5%) a lesion was simply not recognised (error of observation); in 39 (56.5%) the wrong conclusions were drawn about the nature of normal or abnormal structures (error of interpretation). The 39 errors of interpretation were more complex; in 7 patients an abnormal structure was noted but interpreted as normal, whereas in 4 a normal structure was thought to represent a lesion. Other interpretive errors included those where the wrong cause for a lesion had been ascribed (24 patients), and those where the abnormality was substantially under-reported (4 patients). Various features of these errors are presented and discussed. Errors were made just as often in relation to small and large lesions. Consultants made as many errors as senior registrar radiologists. It is likely that dual reporting is the best method of avoiding such errors and, indeed, this is widely practised in our unit. (Author). 9 refs.; 5 figs.; 1 tab

  10. Laboratory errors and patient safety.

    Science.gov (United States)

    Miligy, Dawlat A

    2015-01-01

    Laboratory data are extensively used in medical practice; consequently, laboratory errors have a tremendous impact on patient safety. Therefore, programs designed to identify and reduce laboratory errors, as well as specific strategies, are required to minimize these errors and improve patient safety. The purpose of this paper is to identify some of the laboratory errors commonly encountered in our laboratory practice, their hazards to patient health care, and some measures and recommendations to minimize or eliminate these errors. The laboratory errors encountered during May 2008 were recorded and statistically evaluated (using simple percent distribution) in the laboratory department of one of the private hospitals in Egypt. Errors were classified according to the laboratory phase and according to their implication for patient health. Data obtained from 1,600 testing procedures revealed a total of 14 errors (0.87 percent of total testing procedures). Most of the encountered errors lay in the pre- and post-analytic phases of the testing cycle (representing 35.7 and 50 percent of total errors, respectively), while the errors encountered in the analytic phase represented only 14.3 percent of total errors. About 85.7 percent of total errors had no significant implication for patient health, being detected before test reports were submitted to the patients. On the other hand, test errors in reports already submitted to patients and reaching the physician represented 14.3 percent of total errors. Only 7.1 percent of the errors could have had an impact on patient diagnosis. The findings of this study were concomitant with those published from the USA and other countries. This proves that laboratory problems are universal and need general standardization and benchmarking measures. Originality: being the first data published from Arabic countries that

  11. Learning from mistakes: errors in approaches to melanoma and the urgent need for updated national guidelines.

    Science.gov (United States)

    Simionescu, Olga; Blum, Andreas; Grigore, Mariana; Costache, Mariana; Avram, Alina; Testori, Alessandro

    2016-09-01

    The tracking and identification of errors in the detection and follow-up of melanoma are important because there is huge potential to increase awareness about the most vulnerable aspects of diagnosis and treatment, and to improve both from the perspective of healthcare economics. The present study was designed to identify where errors occur and to propose a minimum set of rules for the routine guidance of any specialist in melanoma management. This report describes the evaluation of a unique series of 33 cases in which errors applying to many steps in the diagnosis and treatment of melanoma were detected. Cases were collected at two centers in Romania, one public and one private, as part of a process of obtaining patient-requested second opinions. A total of 166 errors were identified across the 33 patients, most of which were treatment errors. The errors fell into six categories: clinical diagnostic errors (36 errors among 30 patients); primary surgical errors (31 errors among 16 patients); pathology errors (24 errors among 17 patients); sentinel lymph node biopsy errors (13 errors among 13 patients); staging errors (17 errors among 13 patients); and treatment or management errors (45 errors among 33 patients). Based on the present results, we propose that in countries lacking national guidelines, clinicians should adhere to international evidence-based guidelines for the diagnosis and treatment of melanoma. © 2015 The International Society of Dermatology.

  12. Dopamine reward prediction error coding.

    Science.gov (United States)

    Schultz, Wolfram

    2016-03-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
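
    The core quantity described here, the difference between received and predicted reward, can be illustrated with a minimal Rescorla-Wagner-style update. The sketch below is an illustrative assumption (a single stimulus and a fixed learning rate), not a model of the neuronal recordings discussed in this record; it simply shows positive prediction errors during initial learning, near-zero errors once the reward is fully predicted, and negative errors when an expected reward is omitted.

      # Minimal Rescorla-Wagner-style sketch of reward prediction errors.
      # All parameter values are assumed for illustration only.
      alpha = 0.2                        # learning rate
      prediction = 0.0                   # current predicted reward value
      rewards = [1.0] * 10 + [0.0] * 5   # reward delivered, then unexpectedly omitted

      for trial, reward in enumerate(rewards, start=1):
          prediction_error = reward - prediction   # positive, ~zero, then negative
          prediction += alpha * prediction_error
          print(f"trial {trial:2d}: reward={reward:.1f}  "
                f"prediction error={prediction_error:+.2f}  "
                f"new prediction={prediction:.2f}")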

  13. Statistical errors in Monte Carlo estimates of systematic errors

    Science.gov (United States)

    Roe, Byron P.

    2007-01-01

    For estimating the effects of a number of systematic errors on a data sample, one can generate Monte Carlo (MC) runs with the systematic parameters varied and examine the change in the desired observed result. Two methods are often used. In the unisim method, the systematic parameters are varied one at a time by one standard deviation, each parameter corresponding to a MC run. In the multisim method, each MC run has all of the parameters varied; the amount of variation is chosen from the expected distribution of each systematic parameter, usually assumed to be a normal distribution. The variance of the overall systematic error determination is derived for each of the two methods and comparisons are made between them. If one focuses not on the error in the prediction of an individual systematic error, but on the overall error due to all systematic errors in the error matrix element in data bin m, the number of events needed is strongly reduced because of the averaging effect over all of the errors. For the simple models presented here, the multisim method was far better if the statistical error in the MC samples was larger than an individual systematic error, while for the reverse case the unisim method was better. Exact formulas, and formulas for the simple toy models, are presented so that realistic calculations can be made. The calculations in the present note are valid if the errors are in a linear region. If that region extends sufficiently far, one can have the unisims or multisims correspond to k standard deviations instead of one. This reduces the number of events required by a factor of k². The specific terms unisim and multisim were coined by Peter Meyers and Steve Brice, respectively, for the MiniBooNE experiment. However, the concepts have been developed over time and have been in general use for some time.

  14. Statistical errors in Monte Carlo estimates of systematic errors

    Energy Technology Data Exchange (ETDEWEB)

    Roe, Byron P. [Department of Physics, University of Michigan, Ann Arbor, MI 48109 (United States)]. E-mail: byronroe@umich.edu

    2007-01-01

    For estimating the effects of a number of systematic errors on a data sample, one can generate Monte Carlo (MC) runs with the systematic parameters varied and examine the change in the desired observed result. Two methods are often used. In the unisim method, the systematic parameters are varied one at a time by one standard deviation, each parameter corresponding to a MC run. In the multisim method, each MC run has all of the parameters varied; the amount of variation is chosen from the expected distribution of each systematic parameter, usually assumed to be a normal distribution. The variance of the overall systematic error determination is derived for each of the two methods and comparisons are made between them. If one focuses not on the error in the prediction of an individual systematic error, but on the overall error due to all systematic errors in the error matrix element in data bin m, the number of events needed is strongly reduced because of the averaging effect over all of the errors. For the simple models presented here, the multisim method was far better if the statistical error in the MC samples was larger than an individual systematic error, while for the reverse case the unisim method was better. Exact formulas, and formulas for the simple toy models, are presented so that realistic calculations can be made. The calculations in the present note are valid if the errors are in a linear region. If that region extends sufficiently far, one can have the unisims or multisims correspond to k standard deviations instead of one. This reduces the number of events required by a factor of k².

  15. Statistical errors in Monte Carlo estimates of systematic errors

    International Nuclear Information System (INIS)

    Roe, Byron P.

    2007-01-01

    For estimating the effects of a number of systematic errors on a data sample, one can generate Monte Carlo (MC) runs with the systematic parameters varied and examine the change in the desired observed result. Two methods are often used. In the unisim method, the systematic parameters are varied one at a time by one standard deviation, each parameter corresponding to a MC run. In the multisim method, each MC run has all of the parameters varied; the amount of variation is chosen from the expected distribution of each systematic parameter, usually assumed to be a normal distribution. The variance of the overall systematic error determination is derived for each of the two methods and comparisons are made between them. If one focuses not on the error in the prediction of an individual systematic error, but on the overall error due to all systematic errors in the error matrix element in data bin m, the number of events needed is strongly reduced because of the averaging effect over all of the errors. For the simple models presented here, the multisim method was far better if the statistical error in the MC samples was larger than an individual systematic error, while for the reverse case the unisim method was better. Exact formulas, and formulas for the simple toy models, are presented so that realistic calculations can be made. The calculations in the present note are valid if the errors are in a linear region. If that region extends sufficiently far, one can have the unisims or multisims correspond to k standard deviations instead of one. This reduces the number of events required by a factor of k².
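
    A minimal numerical sketch of the two strategies described in the three records above may help fix ideas. The toy model below is an assumption made for illustration (a single data bin whose MC prediction responds linearly to k normally distributed systematic parameters, with a fixed MC statistical error per run); it is not the paper's derivation, but it shows how the unisim and multisim estimates are formed and how MC statistical noise enters each.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy model (assumed): the observable in one data bin is a nominal value plus
      # a linear response to k systematic parameters; each MC run also carries its
      # own statistical fluctuation of size mc_stat.
      k = 5                                   # number of systematic parameters
      alpha = rng.normal(0.0, 1.0, size=k)    # 1-sigma effect of each parameter
      nominal = 100.0
      mc_stat = 0.5                           # statistical error of a single MC run

      def mc_run(shifts):
          """One MC run with the systematic parameters shifted by `shifts` (in sigmas)."""
          return nominal + alpha @ shifts + rng.normal(0.0, mc_stat)

      # unisim: one MC run per parameter, shifted by +1 sigma; each shift is noisy
      # because both the central and the shifted run carry MC statistical error.
      central = mc_run(np.zeros(k))
      shifts_1sigma = np.array([mc_run(np.eye(k)[i]) - central for i in range(k)])
      unisim_error = np.sqrt(np.sum(shifts_1sigma ** 2))

      # multisim: every run draws all parameters from their (normal) distributions;
      # the spread of the results estimates the combined systematic error.
      n_multisim = 200
      multisim_results = np.array([mc_run(rng.normal(0.0, 1.0, size=k))
                                   for _ in range(n_multisim)])
      multisim_error = multisim_results.std(ddof=1)

      true_error = np.sqrt(np.sum(alpha ** 2))    # exact combined systematic error
      print(f"true {true_error:.2f}  unisim {unisim_error:.2f}  "
            f"multisim {multisim_error:.2f}")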

  16. Architecture design for soft errors

    CERN Document Server

    Mukherjee, Shubu

    2008-01-01

    This book provides a comprehensive description of the architectural techniques to tackle the soft error problem. It covers the new methodologies for quantitative analysis of soft errors as well as novel, cost-effective architectural techniques to mitigate them. To provide readers with a better grasp of the broader problem definition and solution space, this book also delves into the physics of soft errors and reviews current circuit and software mitigation techniques.

  17. Dopamine reward prediction error coding

    OpenAIRE

    Schultz, Wolfram

    2016-01-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less...

  18. Medication errors: the role of the patient.

    Science.gov (United States)

    Britten, Nicky

    2009-06-01

    1. Patients and their carers will usually be the first to notice any observable problems resulting from medication errors. They will probably be unable to distinguish between medication errors, adverse drug reactions, or 'side effects'. 2. Little is known about how patients understand drug related problems or how they make attributions of adverse effects. Some research suggests that patients' cognitive models of adverse drug reactions bear a close relationship to models of illness perception. 3. Attributions of adverse drug reactions are related to people's previous experiences and to their level of education. The evidence suggests that on the whole patients' reports of adverse drug reactions are accurate. However, patients do not report all the problems they perceive and are more likely to report those that they do perceive as severe. Patients may not report problems attributed to their medications if they are fearful of doctors' reactions. Doctors may respond inappropriately to patients' concerns, for example by ignoring them. Some authors have proposed the use of a symptom checklist to elicit patients' reports of suspected adverse drug reactions. 4. Many patients want information about adverse drug effects, and the challenge for the professional is to judge how much information to provide and the best way of doing so. Professionals' inappropriate emphasis on adherence may be dangerous when a medication error has occurred. 5. Recent NICE guidelines recommend that professionals should ask patients if they have any concerns about their medicines, and this approach is likely to yield information conducive to the identification of medication errors.

  19. Meniscal tear. Diagnostic errors in MR imaging

    International Nuclear Information System (INIS)

    Barrera, M. C.; Recondo, J. A.; Gervas, C.; Fernandez, E.; Villanua, J. A.M.; Salvador, E.

    2003-01-01

    To analyze diagnostic discrepancies found between magnetic resonance (MR) and arthroscopy, and to determine the reasons that they occur. Two-hundred and forty-eight MR knee explorations were retrospectively checked. Forty of these showed diagnostic discrepancies between MR and arthroscopy. Two radiologists independently re-analyzed the images from 29 of the 40 studies without knowing which diagnosis had resulted from which of the two techniques. Their interpretations were correlated with the initial MR diagnosis, MR images and arthroscopic results. Initial errors in MR imaging were classified as either unavoidable, interpretive, or secondary to equivocal findings. Eleven MR examinations could not be checked since their corresponding imaging results could not be located. Of 34 errors found in the original diagnoses, 12 (35.5%) were classified as unavoidable, 14 (41.2%) as interpretive and 8 (23.5%) as secondary to equivocal findings. 41.2% of the errors were avoided in the retrospective study, probably due to our department having greater experience in interpreting MR images; 25.5% were unavoidable even in the retrospective study. A small percentage of diagnostic errors were due to the presence of subtle equivocal findings. (Author) 15 refs

  20. An Endogenous Circadian Rhythm in Sleep Inertia Results in Greatest Cognitive Impairment upon Awakening during the Biological Night

    Science.gov (United States)

    Scheer, Frank A. J. L.; Shea, Thomas J.; Hilton, Michael F.; Shea, Steven A.

    2011-01-01

    Sleep inertia is the impaired cognitive performance immediately upon awakening, which decays over tens of minutes. This phenomenon has relevance to people who need to make important decisions soon after awakening, such as on-call emergency workers. Such awakenings can occur at varied times of day or night, so the objective of the study was to determine whether or not the magnitude of sleep inertia varies according to the phase of the endogenous circadian cycle. Twelve adults (mean, 24 years; 7 men) with no medical disorders other than mild asthma were studied. Following 2 baseline days and nights, subjects underwent a forced desynchrony protocol composed of seven 28-h sleep/wake cycles, while maintaining a sleep/wakefulness ratio of 1:2 throughout. Subjects were awakened by a standardized auditory stimulus 3 times each sleep period for sleep inertia assessments. The magnitude of sleep inertia was quantified as the change in cognitive performance (number of correct additions in a 2-min serial addition test) across the first 20 min of wakefulness. Circadian phase was estimated from core body temperature (fitted temperature minimum assigned 0°). Data were segregated according to: (1) circadian phase (60° bins); (2) sleep stage; and (3) third of the night after which awakenings occurred (i.e., tertiary 1, 2, or 3). To control for any effect of sleep stage, the circadian rhythm of sleep inertia was initially assessed following awakenings from Stage 2 (62% of awakenings occurred from this stage; n = 110). This revealed a significant circadian rhythm in the sleep inertia of cognitive performance (p = 0.007), which was 3.6 times larger during the biological night (circadian bin 300°, ~2300–0300 h in these subjects) than during the biological day (bin 180°, ~1500–1900 h). The circadian rhythm in sleep inertia was still present when awakenings from all sleep stages were included (p = 0.004), and this rhythm could not be explained by changes in underlying sleep drive

  1. Drill machine guidance using natural occurring radiation

    International Nuclear Information System (INIS)

    Dahl, H.D.; Schroeder, R.L.; Williams, B.J.

    1980-01-01

    A drilling machine guidance system is described which uses only the naturally occurring radiation within the seam or stratum of interest. The apparatus can be used for guiding horizontal drilling machines through coal seams and the like. (U.K.)

  2. Multiple Primary Cancers: Simultaneously Occurring Prostate ...

    African Journals Online (AJOL)

    2016-05-20

    Simultaneously occurring prostate cancer and other primary tumors: our experience and literature review (abstract snippet only; mentions thyroid cancers, pancreatic tumors, renal cancers, and melanoma).

  3. Identifying Error in AUV Communication

    National Research Council Canada - National Science Library

    Coleman, Joseph; Merrill, Kaylani; O'Rourke, Michael; Rajala, Andrew G; Edwards, Dean B

    2006-01-01

    Mine Countermeasures (MCM) involving Autonomous Underwater Vehicles (AUVs) are especially susceptible to error, given the constraints on underwater acoustic communication and the inconstancy of the underwater communication channel...

  4. Human Errors in Decision Making

    OpenAIRE

    Mohamad, Shahriari; Aliandrina, Dessy; Feng, Yan

    2005-01-01

    The aim of this paper was to identify human errors in the decision-making process. The study focused on the research question: what could be the human error, as a potential source of decision failure, in the evaluation of the alternatives in the process of decision making? Two case studies were selected from the literature and analyzed to find the human errors that contribute to decision failure. The analysis of human errors was then linked with mental models in the evaluation-of-alternatives step. The results o...

  5. Finding beam focus errors automatically

    International Nuclear Information System (INIS)

    Lee, M.J.; Clearwater, S.H.; Kleban, S.D.

    1987-01-01

    An automated method is described for finding beam focus errors using an optimization program called COMFORT-PLUS. The procedure for finding the correction factors with COMFORT-PLUS has been used to find the beam focus errors for two damping rings at the SLAC Linear Collider. The program is to be used as an off-line program to analyze actual measured data for any SLC system. One limitation on the application of this procedure is that it depends on the magnitude of the machine errors. Another is that the program is not totally automated, since the user must decide a priori where to look for errors

  6. Heuristic errors in clinical reasoning.

    Science.gov (United States)

    Rylander, Melanie; Guerrasio, Jeannette

    2016-08-01

    Errors in clinical reasoning contribute to patient morbidity and mortality. The purpose of this study was to determine the types of heuristic errors made by third-year medical students and first-year residents. This study surveyed approximately 150 clinical educators, inquiring about the types of heuristic errors they observed in third-year medical students and first-year residents. Anchoring and premature closure were the two most common errors observed amongst third-year medical students and first-year residents. There was no difference in the types of errors observed in the two groups. Errors in clinical reasoning contribute to patient morbidity and mortality. Clinical educators perceived that both third-year medical students and first-year residents committed similar heuristic errors, implying that additional medical knowledge and clinical experience do not affect the types of heuristic errors made. Further work is needed to help identify methods that can be used to reduce heuristic errors early in a clinician's education. © 2015 John Wiley & Sons Ltd.

  7. A Hybrid Unequal Error Protection / Unequal Error Resilience ...

    African Journals Online (AJOL)

    The quality layers are then assigned an Unequal Error Resilience to synchronization loss by unequally allocating the number of headers available for synchronization to them. Following that, Unequal Error Protection against channel noise is provided to the layers by the use of Rate Compatible Punctured Convolutional ...

  8. Physician assistants and the disclosure of medical error.

    Science.gov (United States)

    Brock, Douglas M; Quella, Alicia; Lipira, Lauren; Lu, Dave W; Gallagher, Thomas H

    2014-06-01

    Evolving state law, professional societies, and national guidelines, including those of the American Medical Association and Joint Commission, recommend that patients receive transparent communication when a medical error occurs. Recommendations for error disclosure typically consist of an explanation that an error has occurred, delivery of an explicit apology, an explanation of the facts around the event, its medical ramifications and how care will be managed, and a description of how similar errors will be prevented in the future. Although error disclosure is widely endorsed in the medical and nursing literature, there is little discussion of the unique role that the physician assistant (PA) might play in these interactions. PAs are trained in the medical model and technically practice under the supervision of a physician. They are also commonly integrated into interprofessional health care teams in surgical and urgent care settings. PA practice is characterized by widely varying degrees of provider autonomy. How PAs should collaborate with physicians in sensitive error disclosure conversations with patients is unclear. With the number of practicing PAs growing rapidly in nearly all domains of medicine, their role in the error disclosure process warrants exploration. The authors call for educational societies and accrediting agencies to support policy to establish guidelines for PA disclosure of error. They encourage medical and PA researchers to explore and report best-practice disclosure roles for PAs. Finally, they recommend that PA educational programs implement trainings in disclosure skills, and hospitals and supervising physicians provide and support training for practicing PAs.

  9. Errors in the administration of intravenous medication in Brazilian hospitals.

    Science.gov (United States)

    Anselmi, Maria Luiza; Peduzzi, Marina; Dos Santos, Claudia Benedita

    2007-10-01

    To verify the frequency of errors in the preparation and administration of intravenous medication in three Brazilian hospitals in the State of Bahia. The administration of intravenous medications constitutes a central activity in Brazilian nursing. Errors in performing this activity may result in irreparable damage to patients and may compromise the quality of care. Cross-sectional study, conducted in three hospitals in the State of Bahia, Brazil. Direct observation of the nursing staff (nurse technicians, auxiliary nurses and nurse attendants), preparing and administering intravenous medication. When preparing medication, wrong patient error did not occur in any of the three hospitals, whereas omission dose was the most frequent error in all study sites. When administering medication, the most frequent errors in the three hospitals were wrong dose and omission dose. The rates of error found are considered low compared with similar studies. The most frequent types of errors were wrong dose and omission dose. The hospitals studied showed different results with the smallest rates of errors occurring in hospital 1 that presented the best working conditions. Relevance to clinical practice. Studies such as this one have the potential to improve the quality of care.

  10. Error-Transparent Quantum Gates for Small Logical Qubit Architectures

    Science.gov (United States)

    Kapit, Eliot

    2018-02-01

    One of the largest obstacles to building a quantum computer is gate error, where the physical evolution of the state of a qubit or group of qubits during a gate operation does not match the intended unitary transformation. Gate error stems from a combination of control errors and random single qubit errors from interaction with the environment. While great strides have been made in mitigating control errors, intrinsic qubit error remains a serious problem that limits gate fidelity in modern qubit architectures. Simultaneously, recent developments of small error-corrected logical qubit devices promise significant increases in logical state lifetime, but translating those improvements into increases in gate fidelity is a complex challenge. In this Letter, we construct protocols for gates on and between small logical qubit devices which inherit the parent device's tolerance to single qubit errors which occur at any time before or during the gate. We consider two such devices, a passive implementation of the three-qubit bit flip code, and the author's own [E. Kapit, Phys. Rev. Lett. 116, 150501 (2016), 10.1103/PhysRevLett.116.150501] very small logical qubit (VSLQ) design, and propose error-tolerant gate sets for both. The effective logical gate error rate in these models displays superlinear error reduction with linear increases in single qubit lifetime, proving that passive error correction is capable of increasing gate fidelity. Using a standard phenomenological noise model for superconducting qubits, we demonstrate a realistic, universal one- and two-qubit gate set for the VSLQ, with error rates an order of magnitude lower than those for same-duration operations on single qubits or pairs of qubits. These developments further suggest that incorporating small logical qubits into a measurement based code could substantially improve code performance.

  11. A preliminary taxonomy of medical errors in family practice.

    Science.gov (United States)

    Dovey, S M; Meyers, D S; Phillips, R L; Green, L A; Fryer, G E; Galliher, J M; Kappus, J; Grob, P

    2002-09-01

    To develop a preliminary taxonomy of primary care medical errors. Qualitative analysis to identify categories of error reported during a randomized controlled trial of computer and paper reporting methods. The National Network for Family Practice and Primary Care Research. Family physicians. Medical error category, context, and consequence. Forty two physicians made 344 reports: 284 (82.6%) arose from healthcare systems dysfunction; 46 (13.4%) were errors due to gaps in knowledge or skills; and 14 (4.1%) were reports of adverse events, not errors. The main subcategories were: administrative failure (102; 30.9% of errors), investigation failures (82; 24.8%), treatment delivery lapses (76; 23.0%), miscommunication (19; 5.8%), payment systems problems (4; 1.2%), error in the execution of a clinical task (19; 5.8%), wrong treatment decision (14; 4.2%), and wrong diagnosis (13; 3.9%). Most reports were of errors that were recognized and occurred in reporters' practices. Affected patients ranged in age from 8 months to 100 years, were of both sexes, and represented all major US ethnic groups. Almost half the reports were of events which had adverse consequences. Ten errors resulted in patients being admitted to hospital and one patient died. This medical error taxonomy, developed from self-reports of errors observed by family physicians during their routine clinical practice, emphasizes problems in healthcare processes and acknowledges medical errors arising from shortfalls in clinical knowledge and skills. Patient safety strategies with most effect in primary care settings need to be broader than the current focus on medication errors.

  12. Error studies for SNS Linac. Part 1: Transverse errors

    International Nuclear Information System (INIS)

    Crandall, K.R.

    1998-01-01

    The SNS linac consists of a radio-frequency quadrupole (RFQ), a drift-tube linac (DTL), a coupled-cavity drift-tube linac (CCDTL) and a coupled-cavity linac (CCL). The RFQ and DTL are operated at 402.5 MHz; the CCDTL and CCL are operated at 805 MHz. Between the RFQ and DTL is a medium-energy beam-transport system (MEBT). This error study is concerned with the DTL, CCDTL and CCL, and each will be analyzed separately. In fact, the CCL is divided into two sections, and each of these will be analyzed separately. The types of errors considered here are those that affect the transverse characteristics of the beam. The errors that cause the beam center to be displaced from the linac axis are quad displacements and quad tilts. The errors that cause mismatches are quad gradient errors and quad rotations (roll)

  13. A chance to avoid mistakes: human error

    International Nuclear Information System (INIS)

    Amaro, Pablo; Obeso, Eduardo; Gomez, Ruben

    2010-01-01

    In response to the lack of public information about the different tools used in the nuclear industry to minimize human error, a group of workers from different sections of the St. Maria de Garona NPP (Quality Assurance / Organization and Human Factors) decided to embark on a challenging and exciting project: to write a book collecting all the knowledge accumulated during their daily activities, often while reviewing external information received from different organizations within the nuclear industry (INPO, WANO...), but also while visiting other NPPs, holding meetings and participating in training courses related to Human and Organizational Factors. The main objective of the book is to present to industry in general, in a practical way, the different tools that are used and fostered in the nuclear industry, so that their assimilation and implementation in other industries becomes possible and achievable in an efficient manner. After one year of work, the project is a reality. An abstract was presented at the last Spanish Nuclear Society meeting in Sevilla last October and, best of all, the book is now on the market for everybody at the website www.bubok.com. The book is structured around the following areas. 'Errare humanum est': presents to the reader what human error is, its origin and the different barriers against it. The message is that the reader should see error as something continuously present in our lives, even more frequently than we think; by studying its origin, barriers can be established to avoid or at least minimize it. 'Error's bitter face': shows the possible consequences of human errors, and what better way than presenting real experiences that have occurred in the industry. In the book, accidents in the nuclear industry, such as Three Mile Island NPP and Chernobyl NPP, and past incidents such as Davis Besse NPP, help the reader to reflect on the

  14. Medication errors with the use of allopurinol and colchicine: a retrospective study of a national, anonymous Internet-accessible error reporting system.

    Science.gov (United States)

    Mikuls, Ted R; Curtis, Jeffrey R; Allison, Jeroan J; Hicks, Rodney W; Saag, Kenneth G

    2006-03-01

    To more closely assess medication errors in gout care, we examined data from a national, Internet-accessible error reporting program over a 5-year reporting period. We examined data from the MEDMARX database, covering the period from January 1, 1999 through December 31, 2003. For allopurinol and colchicine, we examined error severity, source, type, contributing factors, and healthcare personnel involved in errors, and we detailed errors resulting in patient harm. Causes of error and the frequency of other error characteristics were compared for gout medications versus other musculoskeletal treatments using the chi-square statistic. Gout medication errors occurred in 39% (n = 273) of facilities participating in the MEDMARX program. Reported errors were predominantly from the inpatient hospital setting and related to the use of allopurinol (n = 524), followed by colchicine (n = 315), probenecid (n = 50), and sulfinpyrazone (n = 2). Compared to errors involving other musculoskeletal treatments, allopurinol and colchicine errors were more often ascribed to problems with physician prescribing (7% for other therapies versus 23-39% for allopurinol and colchicine, p < 0.0001) and less often due to problems with drug administration or nursing error (50% vs 23-27%, p < 0.0001). Our results suggest that inappropriate prescribing practices are characteristic of errors occurring with the use of allopurinol and colchicine. Physician prescribing practices are a potential target for quality improvement interventions in gout care.

  15. Updating expected action outcome in the medial frontal cortex involves an evaluation of error type.

    Science.gov (United States)

    Maier, Martin E; Steinhauser, Marco

    2013-10-02

    Forming expectations about the outcome of an action is an important prerequisite for action control and reinforcement learning in the human brain. The medial frontal cortex (MFC) has been shown to play an important role in the representation of outcome expectations, particularly when an update of expected outcome becomes necessary because an error is detected. However, error detection alone is not always sufficient to compute expected outcome because errors can occur in various ways and different types of errors may be associated with different outcomes. In the present study, we therefore investigate whether updating expected outcome in the human MFC is based on an evaluation of error type. Our approach was to consider an electrophysiological correlate of MFC activity on errors, the error-related negativity (Ne/ERN), in a task in which two types of errors could occur. Because the two error types were associated with different amounts of monetary loss, updating expected outcomes on error trials required an evaluation of error type. Our data revealed a pattern of Ne/ERN amplitudes that closely mirrored the amount of monetary loss associated with each error type, suggesting that outcome expectations are updated based on an evaluation of error type. We propose that this is achieved by a proactive evaluation process that anticipates error types by continuously monitoring error sources or by dynamically representing possible response-outcome relations.

  16. Dual Processing and Diagnostic Errors

    Science.gov (United States)

    Norman, Geoff

    2009-01-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical,…

  17. Barriers to medical error reporting

    Directory of Open Access Journals (Sweden)

    Jalal Poorolajal

    2015-01-01

    Background: This study was conducted to explore the prevalence of medical error underreporting and associated barriers. Methods: This cross-sectional study was performed from September to December 2012. Five hospitals affiliated with Hamadan University of Medical Sciences in Hamedan, Iran were investigated. A self-administered questionnaire was used for data collection. Participants consisted of physicians, nurses, midwives, residents, interns, and staff of the radiology and laboratory departments. Results: Overall, 50.26% of subjects had committed but not reported medical errors. The main reasons mentioned for underreporting were lack of an effective medical error reporting system (60.0%), lack of a proper reporting form (51.8%), lack of peer support for a person who has committed an error (56.0%), and lack of personal attention to the importance of medical errors (62.9%). The rate of committing medical errors was higher in men (71.4%), in those aged 40-50 years (67.6%), in less-experienced personnel (58.7%), in those with an educational level of MSc (87.5%), and in staff of the radiology department (88.9%). Conclusions: This study outlined the main barriers to reporting medical errors and associated factors that may be helpful for healthcare organizations in improving medical error reporting as an essential component of patient safety enhancement.

  18. Error and discrepancy in radiology: inevitable or avoidable?

    Science.gov (United States)

    Brady, Adrian P

    2017-02-01

    Errors and discrepancies in radiology practice are uncomfortably common, with an estimated day-to-day rate of 3-5% of studies reported, and much higher rates reported in many targeted studies. Nonetheless, the meaning of the terms "error" and "discrepancy" and the relationship to medical negligence are frequently misunderstood. This review outlines the incidence of such events, the ways they can be categorized to aid understanding, and potential contributing factors, both human- and system-based. Possible strategies to minimise error are considered, along with the means of dealing with perceived underperformance when it is identified. The inevitability of imperfection is explained, while the importance of striving to minimise such imperfection is emphasised. • Discrepancies between radiology reports and subsequent patient outcomes are not inevitably errors. • Radiologist reporting performance cannot be perfect, and some errors are inevitable. • Error or discrepancy in radiology reporting does not equate to negligence. • Radiologist errors occur for many reasons, both human- and system-derived. • Strategies exist to minimise error causes and to learn from errors made.

  19. An error taxonomy system for analysis of haemodialysis incidents.

    Science.gov (United States)

    Gu, Xiuzhu; Itoh, Kenji; Suzuki, Satoshi

    2014-12-01

    This paper describes the development of a haemodialysis error taxonomy system for analysing incidents and predicting the safety status of a dialysis organisation. The error taxonomy system was developed by adapting an error taxonomy system which assumed no specific specialty to haemodialysis situations. Its application was conducted with 1,909 incident reports collected from two dialysis facilities in Japan. Over 70% of haemodialysis incidents were reported as problems or complications related to dialyser, circuit, medication and setting of dialysis condition. Approximately 70% of errors took place immediately before and after the four hours of haemodialysis therapy. Error types most frequently made in the dialysis unit were omission and qualitative errors. Failures or complications classified to staff human factors, communication, task and organisational factors were found in most dialysis incidents. Device/equipment/materials, medicine and clinical documents were most likely to be involved in errors. Haemodialysis nurses were involved in more incidents related to medicine and documents, whereas dialysis technologists made more errors with device/equipment/materials. This error taxonomy system is able to investigate incidents and adverse events occurring in the dialysis setting but is also able to estimate safety-related status of an organisation, such as reporting culture. © 2014 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  20. Mismeasurement and the resonance of strong confounders: correlated errors.

    Science.gov (United States)

    Marshall, J R; Hastrup, J L; Ross, J S

    1999-07-01

    Confounding in epidemiology, and the limits of standard methods of control for an imperfectly measured confounder, have been understood for some time. However, most treatments of this problem are based on the assumption that errors of measurement in confounding and confounded variables are independent. This paper considers the situation in which a strong risk factor (confounder) and an inconsequential but suspected risk factor (confounded) are each measured with errors that are correlated; the situation appears especially likely to occur in the field of nutritional epidemiology. Error correlation appears to add little to measurement error as a source of bias in estimating the impact of a strong risk factor: it can add to, diminish, or reverse the bias induced by measurement error in estimating the impact of the inconsequential risk factor. Correlation of measurement errors can add to the difficulty involved in evaluating structures in which confounding and measurement error are present. In its presence, observed correlations among risk factors can be greater than, less than, or even opposite to the true correlations. Interpretation of multivariate epidemiologic structures in which confounding is likely requires evaluation of measurement error structures, including correlations among measurement errors.
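
    The qualitative claim above, that correlated measurement errors can add to, diminish, or reverse the bias in the coefficient of an inconsequential but correlated risk factor, can be checked with a small simulation. The scenario and parameter values below are assumptions for illustration, not the authors' analysis: a strong risk factor X drives the outcome, a correlated factor Z has no true effect, and both are measured with errors whose correlation is varied.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 200_000

      # True exposures: X is a strong risk factor, Z is inconsequential but
      # correlated with X (e.g. two nutrients assessed by the same instrument).
      x = rng.normal(size=n)
      z = 0.5 * x + np.sqrt(1 - 0.25) * rng.normal(size=n)
      y = 1.0 * x + 0.0 * z + rng.normal(size=n)          # outcome depends on X only

      def fit(x_obs, z_obs):
          """OLS of y on the two mismeasured exposures; returns their coefficients."""
          design = np.column_stack([np.ones(n), x_obs, z_obs])
          coef, *_ = np.linalg.lstsq(design, y, rcond=None)
          return coef[1], coef[2]

      for rho in (0.0, 0.5, -0.5):                        # correlation of the two errors
          cov = np.array([[1.0, rho], [rho, 1.0]])
          ex, ez = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
          bx, bz = fit(x + ex, z + ez)
          print(f"error corr {rho:+.1f}: beta_X = {bx:.3f}, beta_Z = {bz:.3f}")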

  1. Sensitivity analysis of periodic errors in heterodyne interferometry

    International Nuclear Information System (INIS)

    Ganguly, Vasishta; Kim, Nam Ho; Kim, Hyo Soo; Schmitz, Tony

    2011-01-01

    Periodic errors in heterodyne displacement measuring interferometry occur due to frequency mixing in the interferometer. These nonlinearities are typically characterized as first- and second-order periodic errors which cause a cyclical (non-cumulative) variation in the reported displacement about the true value. This study implements an existing analytical periodic error model in order to identify sensitivities of the first- and second-order periodic errors to the input parameters, including rotational misalignments of the polarizing beam splitter and mixing polarizer, non-orthogonality of the two laser frequencies, ellipticity in the polarizations of the two laser beams, and different transmission coefficients in the polarizing beam splitter. A local sensitivity analysis is first conducted to examine the sensitivities of the periodic errors with respect to each input parameter about the nominal input values. Next, a variance-based approach is used to study the global sensitivities of the periodic errors by calculating the Sobol' sensitivity indices using Monte Carlo simulation. The effect of variation in the input uncertainty on the computed sensitivity indices is examined. It is seen that the first-order periodic error is highly sensitive to non-orthogonality of the two linearly polarized laser frequencies, while the second-order error is most sensitive to the rotational misalignment between the laser beams and the polarizing beam splitter. A particle swarm optimization technique is finally used to predict the possible setup imperfections based on experimentally generated values for periodic errors

  2. Sensitivity analysis of periodic errors in heterodyne interferometry

    Science.gov (United States)

    Ganguly, Vasishta; Kim, Nam Ho; Kim, Hyo Soo; Schmitz, Tony

    2011-03-01

    Periodic errors in heterodyne displacement measuring interferometry occur due to frequency mixing in the interferometer. These nonlinearities are typically characterized as first- and second-order periodic errors which cause a cyclical (non-cumulative) variation in the reported displacement about the true value. This study implements an existing analytical periodic error model in order to identify sensitivities of the first- and second-order periodic errors to the input parameters, including rotational misalignments of the polarizing beam splitter and mixing polarizer, non-orthogonality of the two laser frequencies, ellipticity in the polarizations of the two laser beams, and different transmission coefficients in the polarizing beam splitter. A local sensitivity analysis is first conducted to examine the sensitivities of the periodic errors with respect to each input parameter about the nominal input values. Next, a variance-based approach is used to study the global sensitivities of the periodic errors by calculating the Sobol' sensitivity indices using Monte Carlo simulation. The effect of variation in the input uncertainty on the computed sensitivity indices is examined. It is seen that the first-order periodic error is highly sensitive to non-orthogonality of the two linearly polarized laser frequencies, while the second-order error is most sensitive to the rotational misalignment between the laser beams and the polarizing beam splitter. A particle swarm optimization technique is finally used to predict the possible setup imperfections based on experimentally generated values for periodic errors.
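
    As a rough illustration of what first- and second-order periodic errors look like in the reported displacement, the sketch below adds cyclic terms with one and two cycles per interference fringe to the true displacement. The wavelength, amplitudes, phases, and the single-pass assumption (one fringe per half wavelength of motion) are all assumptions for illustration; the two records above derive these error amplitudes from the optical misalignment parameters, which this sketch does not attempt.

      import numpy as np

      # Phenomenological sketch: reported displacement = true displacement plus
      # first- and second-order periodic errors (one and two cycles per fringe).
      wavelength = 632.8e-9           # He-Ne wavelength (m); assumed
      fringe = wavelength / 2.0       # displacement per fringe, single-pass assumption
      A1, A2 = 2.0e-9, 0.5e-9         # assumed periodic-error amplitudes (m)
      phi1, phi2 = 0.3, 1.1           # assumed phases (rad)

      true_disp = np.linspace(0.0, 5 * fringe, 2000)
      phase = 2.0 * np.pi * true_disp / fringe
      reported = true_disp + A1 * np.sin(phase + phi1) + A2 * np.sin(2.0 * phase + phi2)

      error = reported - true_disp    # cyclical, non-cumulative deviation
      print(f"peak-to-peak periodic error: {np.ptp(error) * 1e9:.2f} nm")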

  3. Error Correction for Non-Abelian Topological Quantum Computation

    Directory of Open Access Journals (Sweden)

    James R. Wootton

    2014-03-01

    The possibility of quantum computation using non-Abelian anyons has been considered for over a decade. However, the question of how to obtain and process information about what errors have occurred in order to negate their effects has not yet been considered. This is in stark contrast with quantum computation proposals for Abelian anyons, for which decoding algorithms have been tailor-made for many topological error-correcting codes and error models. Here, we address this issue by considering the properties of non-Abelian error correction, in general. We also choose a specific anyon model and error model to probe the problem in more detail. The anyon model is the charge submodel of D(S_{3}). This shares many properties with important models such as the Fibonacci anyons, making our method more generally applicable. The error model is a straightforward generalization of those used in the case of Abelian anyons for initial benchmarking of error correction methods. It is found that error correction is possible under a threshold value of 7% for the total probability of an error on each physical spin. This is remarkably comparable with the thresholds for Abelian models.

  4. A theory of human error

    Science.gov (United States)

    Mcruer, D. T.; Clement, W. F.; Allen, R. W.

    1981-01-01

    Human errors tend to be treated in terms of clinical and anecdotal descriptions, from which remedial measures are difficult to derive. Correction of the sources of human error requires an attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A comprehensive analytical theory of the cause-effect relationships governing propagation of human error is indispensable to a reconstruction of the underlying and contributing causes. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation, maritime, automotive, and process control operations is highlighted. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.

  5. Correcting AUC for Measurement Error.

    Science.gov (United States)

    Rosner, Bernard; Tworoger, Shelley; Qiu, Weiliang

    2015-12-01

    Diagnostic biomarkers are used frequently in epidemiologic and clinical work. The ability of a diagnostic biomarker to discriminate between subjects who develop disease (cases) and subjects who do not (controls) is often measured by the area under the receiver operating characteristic curve (AUC). The diagnostic biomarkers are usually measured with error. Ignoring measurement error can cause biased estimation of AUC, which results in misleading interpretation of the efficacy of a diagnostic biomarker. Several methods have been proposed to correct AUC for measurement error, most of which required the normality assumption for the distributions of diagnostic biomarkers. In this article, we propose a new method to correct AUC for measurement error and derive approximate confidence limits for the corrected AUC. The proposed method does not require the normality assumption. Both real data analyses and simulation studies show good performance of the proposed measurement error correction method.
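
    The motivation for such a correction can be seen in a short simulation: classical measurement error attenuates the empirical AUC toward 0.5. The sketch below is an assumption-driven illustration (normally distributed biomarker, additive independent error), not the correction method proposed in this record; the AUC is computed as the Mann-Whitney probability that a case value exceeds a control value.

      import numpy as np

      rng = np.random.default_rng(2)
      n_cases = n_controls = 2000

      # Assumed true biomarker: cases shifted up by 1 unit (true AUC ~= 0.76).
      cases = rng.normal(1.0, 1.0, n_cases)
      controls = rng.normal(0.0, 1.0, n_controls)

      def empirical_auc(pos, neg):
          """AUC as the Mann-Whitney probability P(pos > neg), ties counted as 1/2."""
          diff = pos[:, None] - neg[None, :]
          return (diff > 0).mean() + 0.5 * (diff == 0).mean()

      print(f"AUC without measurement error: {empirical_auc(cases, controls):.3f}")

      for sigma_e in (0.5, 1.0, 2.0):   # assumed measurement-error standard deviations
          noisy_cases = cases + rng.normal(0.0, sigma_e, n_cases)
          noisy_controls = controls + rng.normal(0.0, sigma_e, n_controls)
          print(f"AUC with error sd {sigma_e}: "
                f"{empirical_auc(noisy_cases, noisy_controls):.3f}")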

  6. Cognitive aspect of diagnostic errors.

    Science.gov (United States)

    Phua, Dong Haur; Tan, Nigel C K

    2013-01-01

    Diagnostic errors can result in tangible harm to patients. Despite our advances in medicine, the mental processes required to make a diagnosis exhibit shortcomings, causing diagnostic errors. Cognitive factors are found to be an important cause of diagnostic errors. With new understanding from psychology and the social sciences, clinical medicine is now beginning to appreciate that our clinical reasoning can take the form of analytical reasoning or heuristics. Different factors such as cognitive biases and affective influences can also impel unwary clinicians to make diagnostic errors. Various strategies have been proposed to reduce the effect of cognitive biases and affective influences when clinicians make diagnoses; however, evidence for the efficacy of these methods is still sparse. This paper aims to introduce the reader to the cognitive aspect of diagnostic errors, in the hope that clinicians can use this knowledge to improve diagnostic accuracy and patient outcomes.

  7. Determination of natural occurring radionuclides concentrations

    International Nuclear Information System (INIS)

    Stajic, J.; Markovic, V.; Krstic, D.; Nikezic, D.

    2011-01-01

    Tobacco smoke contains certain concentrations of naturally occurring radionuclides from the radioactive chains of uranium and thorium - ²¹⁴Pb, ²¹⁴Bi, ²²⁸Ac, ²⁰⁸Tl, ²²⁶Ra, ²³²Th and ⁴⁰K. Inhaling tobacco smoke leads to internal exposure of man. In order to estimate the absorbed dose of irradiation it is necessary to determine the concentrations of radionuclides present in the tobacco leaves. In this paper specific activities of naturally occurring radionuclides were measured in tobacco samples from cigarettes which are used in Serbia. [sr]

  8. The epidemiology and type of medication errors reported to the National Poisons Information Centre of Ireland.

    Science.gov (United States)

    Cassidy, Nicola; Duggan, Edel; Williams, David J P; Tracey, Joseph A

    2011-07-01

    Medication errors are widely reported for hospitalised patients, but limited data are available for medication errors that occur in community-based and clinical settings. Epidemiological data from poisons information centres enable characterisation of trends in medication errors occurring across the healthcare spectrum. The objective of this study was to characterise the epidemiology and type of medication errors reported to the National Poisons Information Centre (NPIC) of Ireland. A 3-year prospective study on medication errors reported to the NPIC was conducted from 1 January 2007 to 31 December 2009 inclusive. Data on patient demographics, enquiry source, location, pharmaceutical agent(s), type of medication error, and treatment advice were collated from standardised call report forms. Medication errors were categorised as (i) prescribing error (i.e. physician error), (ii) dispensing error (i.e. pharmacy error), and (iii) administration error involving the wrong medication, the wrong dose, wrong route, or the wrong time. Medication errors were reported for 2348 individuals, representing 9.56% of total enquiries to the NPIC over 3 years. In total, 1220 children and adolescents under 18 years of age and 1128 adults (≥ 18 years old) experienced a medication error. The majority of enquiries were received from healthcare professionals, but members of the public accounted for 31.3% (n = 736) of enquiries. Most medication errors occurred in a domestic setting (n = 2135), but a small number occurred in healthcare facilities: nursing homes (n = 110, 4.68%), hospitals (n = 53, 2.26%), and general practitioner surgeries (n = 32, 1.36%). In children, medication errors with non-prescription pharmaceuticals predominated (n = 722) and anti-pyretics and non-opioid analgesics, anti-bacterials, and cough and cold preparations were the main pharmaceutical classes involved. Medication errors with prescription medication predominated for adults (n = 866) and the major medication

  9. The epidemiology and type of medication errors reported to the National Poisons Information Centre of Ireland.

    LENUS (Irish Health Repository)

    Cassidy, Nicola

    2012-02-01

    INTRODUCTION: Medication errors are widely reported for hospitalised patients, but limited data are available for medication errors that occur in community-based and clinical settings. Epidemiological data from poisons information centres enable characterisation of trends in medication errors occurring across the healthcare spectrum. AIM: The objective of this study was to characterise the epidemiology and type of medication errors reported to the National Poisons Information Centre (NPIC) of Ireland. METHODS: A 3-year prospective study on medication errors reported to the NPIC was conducted from 1 January 2007 to 31 December 2009 inclusive. Data on patient demographics, enquiry source, location, pharmaceutical agent(s), type of medication error, and treatment advice were collated from standardised call report forms. Medication errors were categorised as (i) prescribing error (i.e. physician error), (ii) dispensing error (i.e. pharmacy error), and (iii) administration error involving the wrong medication, the wrong dose, wrong route, or the wrong time. RESULTS: Medication errors were reported for 2348 individuals, representing 9.56% of total enquiries to the NPIC over 3 years. In total, 1220 children and adolescents under 18 years of age and 1128 adults (≥ 18 years old) experienced a medication error. The majority of enquiries were received from healthcare professionals, but members of the public accounted for 31.3% (n = 736) of enquiries. Most medication errors occurred in a domestic setting (n = 2135), but a small number occurred in healthcare facilities: nursing homes (n = 110, 4.68%), hospitals (n = 53, 2.26%), and general practitioner surgeries (n = 32, 1.36%). In children, medication errors with non-prescription pharmaceuticals predominated (n = 722) and anti-pyretics and non-opioid analgesics, anti-bacterials, and cough and cold preparations were the main pharmaceutical classes involved. Medication errors with prescription medication predominated for

  10. Error Analysis of Brailled Instructional Materials Produced by Public School Personnel in Texas

    Science.gov (United States)

    Herzberg, Tina

    2010-01-01

    In this study, a detailed error analysis was performed to determine if patterns of errors existed in braille transcriptions. The most frequently occurring errors were the insertion of letters or words that were not contained in the original print material; the incorrect usage of the emphasis indicator; and the incorrect formatting of titles,…

  11. The Effect of In-Game Errors on Learning Outcomes. CRESST Report 835

    Science.gov (United States)

    Kerr, Deirdre; Chung, Gregory K. W. K.

    2013-01-01

    Student mathematical errors are rarely random and often occur because students are applying procedures that they believe to be accurate. Traditional approaches often view such errors as indicators of students' failure to understand the construct in question, but some theorists view errors as opportunities for students to expand their mental model…

  12. Scaffolding--How Can Contingency Lead to Successful Learning When Dealing with Errors?

    Science.gov (United States)

    Wischgoll, Anke; Pauli, Christine; Reusser, Kurt

    2015-01-01

    Errors indicate learners' misunderstanding and can provide learning opportunities. Providing learning support which is contingent on learners' needs when errors occur is considered effective for developing learners' understanding. The current investigation examines how tutors and tutees interact productively with errors when working on a…

  13. Digital Particle Image Velocimetry: Partial Image Error (PIE)

    International Nuclear Information System (INIS)

    Anandarajah, K; Hargrave, G K; Halliwell, N A

    2006-01-01

    This paper quantifies the errors due to partial imaging of seeding particles which occur at the edges of interrogation regions in Digital Particle Image Velocimetry (DPIV). Hitherto, in the scientific literature the effect of these partial images has been assumed to be negligible. The results show that the error is significant even at a commonly used interrogation region size of 32 x 32 pixels. If correlation of interrogation region sizes of 16 x 16 pixels and smaller is attempted, the error which occurs can preclude meaningful results being obtained. In order to reduce the error, normalisation of the correlation peak values is necessary. The paper introduces Normalisation by Signal Strength (NSS) as the preferred means of normalisation for optimum accuracy. In addition, it is shown that NSS increases the dynamic range of DPIV

  14. Association of medication errors with drug classifications, clinical units, and consequence of errors: Are they related?

    Science.gov (United States)

    Muroi, Maki; Shen, Jay J; Angosta, Alona

    2017-02-01

    Registered nurses (RNs) play an important role in safe medication administration and patient safety. This study examined a total of 1276 medication error (ME) incident reports made by RNs in hospital inpatient settings in the southwestern region of the United States. The most common drug class associated with MEs was cardiovascular drugs (24.7%). Among this class, anticoagulants had the most errors (11.3%). The antimicrobials was the second most common drug class associated with errors (19.1%) and vancomycin was the most common antimicrobial that caused errors in this category (6.1%). MEs occurred more frequently in the medical-surgical and intensive care units than any other hospital units. Ten percent of MEs reached the patients with harm and 11% reached the patients with increased monitoring. Understanding the contributing factors related to MEs, addressing and eliminating risk of errors across hospital units, and providing education and resources for nurses may help reduce MEs. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Error threshold ghosts in a simple hypercycle with error prone self-replication

    International Nuclear Information System (INIS)

    Sardanyes, Josep

    2008-01-01

    A delayed transition because of mutation processes is shown to happen in a simple hypercycle composed of two indistinguishable molecular species with error-prone self-replication. The appearance of a ghost near the hypercycle error threshold causes a delay in the extinction and thus in the loss of information of the mutually catalytic replicators, in a kind of information memory. The extinction time, τ, scales near the bifurcation threshold according to the universal square-root scaling law, i.e. τ ∼ (Q_hc − Q)^(−1/2), typical of dynamical systems close to a saddle-node bifurcation. Here, Q_hc represents the bifurcation point, named the hypercycle error threshold, involved in the change between the asymptotic stability phase and the so-called Random Replication State (RRS) of the hypercycle; the parameter Q is the replication quality factor. The ghost involves a longer transient towards extinction once the saddle-node bifurcation has occurred, being extremely long near the bifurcation threshold. The role of this dynamical effect is expected to be relevant in fluctuating environments. Such a phenomenon should also be found in larger hypercycles when considering the hypercycle species in competition with their error tail. The implications of the ghost in the survival and evolution of error-prone self-replicating molecules with hypercyclic organization are discussed
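
    The scaling law quoted above is the main quantitative statement. A minimal sketch evaluating τ ∼ (Q_hc − Q)^(−1/2) for an assumed threshold value follows; the threshold Q_hc = 0.9 and the unit proportionality constant are illustrative assumptions, so only the divergence as Q approaches Q_hc is meaningful.

    ```python
    # Square-root scaling of the extinction time near the hypercycle error
    # threshold: tau ~ (Q_hc - Q)^(-1/2). Q_hc and the prefactor are assumed.
    import numpy as np

    Q_hc = 0.9                        # assumed hypercycle error threshold
    Q = np.linspace(0.80, 0.899, 5)   # replication quality factors below Q_hc
    tau = (Q_hc - Q) ** -0.5          # extinction time (arbitrary units)

    for q, t in zip(Q, tau):
        print(f"Q = {q:.3f}  ->  tau ~ {t:.1f} (arbitrary units)")
    ```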

  16. Residents' numeric inputting error in computerized physician order entry prescription.

    Science.gov (United States)

    Wu, Xue; Wu, Changxu; Zhang, Kan; Wei, Dong

    2016-04-01

    A computerized physician order entry (CPOE) system with embedded clinical decision support (CDS) can significantly reduce certain types of prescription error. However, prescription errors still occur. Various factors, such as the numeric inputting methods used in human-computer interaction (HCI), produce different error rates and types, but have received relatively little attention. This study aimed to examine the effects of numeric inputting methods and urgency levels on numeric inputting errors of prescription, as well as categorize the types of errors. Thirty residents participated in four prescribing tasks in which two factors were manipulated: numeric inputting methods (numeric row in the main keyboard vs. numeric keypad) and urgency levels (urgent situation vs. non-urgent situation). Multiple aspects of participants' prescribing behavior were measured in sober prescribing situations. The results revealed that in urgent situations, participants were prone to make mistakes when using the numeric row in the main keyboard. With control of performance in the sober prescribing situation, the effects of the input methods disappeared, and urgency was found to play a significant role in the generalized linear model. Most errors were either omission or substitution types, but the proportion of transposition and intrusion error types was significantly higher than in previous research. Among the numbers 3, 8, and 9, which are less common digits in prescriptions, the error rate was higher, posing a great risk to patient safety. Urgency played a more important role in CPOE numeric typing error-making than typing skills and typing habits. Inputting with the numeric keypad was recommended because it had lower error rates in urgent situations. An alternative design could consider increasing the sensitivity of the keys with lower frequency of occurrence and decimals. To improve the usability of CPOE, numeric keyboard design and error detection could benefit from spatial

  17. Medication errors: classification of seriousness, type, and of medications involved in the reports from a university teaching hospital

    Directory of Open Access Journals (Sweden)

    Gabriella Rejane dos Santos Dalmolin

    2013-12-01

    Full Text Available Medication errors can be frequent in hospitals; these errors are multidisciplinary and occur at various stages of the drug therapy. The present study evaluated the seriousness, the type and the drugs involved in medication errors reported at the Hospital de Clínicas de Porto Alegre. We analyzed written error reports for 2010-2011. The sample consisted of 165 reports. The errors identified were classified according to seriousness, type and pharmacological class. A total of 114 reports were categorized as actual errors (medication errors) and 51 reports were categorized as potential errors. There were more medication error reports in 2011 compared to 2010, but there was no significant change in the seriousness of the reports. The most common type of error was prescribing error (48.25%). Errors that occurred during the process of drug therapy sometimes generated additional medication errors. In the 114 reports of medication errors identified, 122 drugs were cited. The reflection on medication errors, the possibility of harm resulting from these errors, and the methods for error identification and evaluation should include a broad perspective of the aspects involved in the occurrence of errors. Patient safety depends on the process of communication involving errors, on the proper recording of information, and on the monitoring itself.

  18. Detection of Harmonic Occurring using Kalman Filtering

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar; Shoro, Ghulam Mustafa; Imran, Raja Muhammed

    2014-01-01

    /current characteristic. These harmonics are not to be allowed to grow beyond a certain limit to avoid any grave consequence to the customer’s main supply. Filters can be implemented at the power source or utility location to eliminate these harmonics. In this paper we detect the instance at which these harmonics occur...
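
    The record above is truncated, but the underlying idea of Kalman-filter-based harmonic detection can be sketched: track the in-phase and quadrature amplitudes of the fundamental and a harmonic as a slowly varying state, and flag the instant the estimated harmonic amplitude jumps. The state model, noise levels, frequencies and detection threshold below are illustrative assumptions, not the authors' implementation.

    ```python
    # Minimal linear Kalman filter tracking the amplitudes of a 50 Hz fundamental
    # and its 3rd harmonic; a jump in the 3rd-harmonic estimate marks its onset.
    import numpy as np

    fs, f0 = 5000.0, 50.0
    t = np.arange(0, 0.2, 1 / fs)
    signal = np.sin(2 * np.pi * f0 * t)
    signal[t > 0.1] += 0.3 * np.sin(2 * np.pi * 3 * f0 * t[t > 0.1])  # harmonic appears
    signal += 0.01 * np.random.default_rng(1).standard_normal(t.size)

    x = np.zeros(4)                  # state: [cos1, sin1, cos3, sin3] amplitudes
    P = np.eye(4)
    Qn = 1e-4 * np.eye(4)            # process noise: lets amplitudes drift
    R = 1e-2                         # measurement noise variance
    amp3 = []

    for k, z in enumerate(signal):
        w = 2 * np.pi * f0 * t[k]
        H = np.array([np.cos(w), np.sin(w), np.cos(3 * w), np.sin(3 * w)])
        P = P + Qn                                # predict (random-walk amplitudes)
        S = H @ P @ H + R
        K = P @ H / S                             # Kalman gain
        x = x + K * (z - H @ x)                   # measurement update
        P = (np.eye(4) - np.outer(K, H)) @ P
        amp3.append(np.hypot(x[2], x[3]))         # 3rd-harmonic amplitude estimate

    onset = t[np.argmax(np.array(amp3) > 0.15)]   # first crossing of the threshold
    print(f"3rd harmonic detected at about t = {onset:.3f} s")
    ```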

  19. Formal synthesis of naturally occurring norephedrine

    Indian Academy of Sciences (India)

    A concise and simple synthesis of 1-hydroxy-phenethylamine derivatives has been achieved following classical organic transformations using commercially available chiral pools. The said derivatives were explored for the synthesis of naturally occurring bio-active small molecules. Formal synthesis of norephedrine, virolin ...

  20. Perceived functions of naturally occurring autobiographical memories

    DEFF Research Database (Denmark)

    Treebak, L. S.; Henriksen, J. R.; Lundhus, S.

    2005-01-01

    The main empirical reference on functions of autobiographical memories is still Hyman & Faries (1992) who used the cue-word-method and retrospective judgements. We used diaries to sample naturally occurring autobiographical memories and participants' perceived use of these. Results partly replicate...

  1. A naturally occurring trap for antiprotons

    International Nuclear Information System (INIS)

    Eades, J.; Morita, N.; Ito, T.M.

    1993-05-01

    The phenomenon of delayed annihilation of antiprotons in helium is the first instance of a naturally occurring trap for antimatter in ordinary matter. Recent studies of this effect at CERN are summarized, and plans are described for laser excitation experiments to test its interpretation in terms of metastable exotic helium atom formation. (author)

  2. Jerky periods: myoclonus occurring solely during menses

    NARCIS (Netherlands)

    Buijink, Arthur W. G.; Gelauff, Jeannette M.; van der Salm, Sandra M. A.; Tijssen, Marina A. J.; van Rootselaar, Anne-Fleur

    2013-01-01

    In this case report, we describe an unusual case of a patient with myoclonus only occurring during menses. A 41-year-old female, known to have neurological sequelae after a car accident 1 year earlier, presented with myoclonic movements of the right arm and hand only during menses. Brain magnetic

  3. Nonresponse Error in Mail Surveys: Top Ten Problems

    Directory of Open Access Journals (Sweden)

    Jeanette M. Daly

    2011-01-01

    Full Text Available Conducting mail surveys can result in nonresponse error, which occurs when the potential participant is unwilling to participate or impossible to contact. Nonresponse can result in a reduction in precision of the study and may bias results. The purpose of this paper is to describe and make readers aware of a top ten list of mailed survey problems affecting the response rate encountered over time with different research projects, while utilizing the Dillman Total Design Method. Ten nonresponse error problems were identified, such as inserter machine gets sequence out of order, capitalization in databases, and mailing discarded by postal service. These ten mishaps can potentiate nonresponse errors, but there are ways to minimize their frequency. Suggestions offered stem from our own experiences during research projects. Our goal is to increase researchers' knowledge of nonresponse error problems and to offer solutions which can decrease nonresponse error in future projects.

  4. Preferential flow occurs in unsaturated conditions

    Science.gov (United States)

    Nimmo, John R.

    2012-01-01

    Because it commonly generates high-speed, high-volume flow with minimal exposure to solid earth materials, preferential flow in the unsaturated zone is a dominant influence in many problems of infiltration, recharge, contaminant transport, and ecohydrology. By definition, preferential flow occurs in a portion of a medium – that is, a preferred part, whether a pathway, pore, or macroscopic subvolume. There are many possible classification schemes, but usual consideration of preferential flow includes macropore or fracture flow, funneled flow determined by macroscale heterogeneities, and fingered flow determined by hydraulic instability rather than intrinsic heterogeneity. That preferential flow is spatially concentrated associates it with other characteristics that are typical, although not defining: it tends to be unusually fast, to transport high fluxes, and to occur with hydraulic disequilibrium within the medium. It also has a tendency to occur in association with large conduits and high water content, although these are less universal than is commonly assumed. Predictive unsaturated-zone flow models in common use employ several different criteria for when and where preferential flow occurs, almost always requiring a nearly saturated medium. A threshold to be exceeded may be specified in terms of the following (i) water content; (ii) matric potential, typically a value high enough to cause capillary filling in a macropore of minimum size; (iii) infiltration capacity or other indication of incipient surface ponding; or (iv) other conditions related to total filling of certain pores. Yet preferential flow does occur without meeting these criteria. My purpose in this commentary is to point out important exceptions and implications of ignoring them. Some of these pertain mainly to macropore flow, others to fingered or funneled flow, and others to combined or undifferentiated flow modes.

  5. Operating personnel error analysis during operation failures in the Kozloduj NPP

    International Nuclear Information System (INIS)

    Jonkova, A.

    1990-01-01

    The failures due to personnel errors are analyzed for a 10-year period (1977-1986). Most of the results are presented in absolute values and are considered in dynamics. The indices for relative shares are compared by alternative analysis. One of the most important causes is the fluctuation of manpower. The distribution of failures by month within the year and by hour of the day is given. The largest number of failures occurred in the period April-October (excluding August, the month of annual leave), when refueling and repair took place, and in January-February, due to heavy meteorological conditions and some fatigue and loss of concentration because of multiple holidays. The failures during the day shifts had the greatest relative share (42%), followed by the night shifts (32%) and the afternoon shifts (26%). The most 'dangerous' time periods happened to be 11-12 h and 13-14 h (deteriorated attention after lunch), 20-22 h (physiological drop in psychological activity), 0-3 h (the lowest level of physiological and psychological activity), and the first and last hours of every shift. Three groups of causes are pointed out as the most frequent: improper actions connected with orders; improper independent actions; uncoordinated teamwork. The following measures are proposed for reducing the effect of the human factor: setting up the training centre; preliminary evaluation of the professional qualification of the operators; current dynamic control of their neuro-psychological fitness and occupational reliability. 1 fig, 2 tabs, 5 refs

  6. Evaluation of drug administration errors in a teaching hospital

    OpenAIRE

    Berdot, Sarah; Sabatier, Brigitte; Gillaizeau, Florence; Caruba, Thibaut; Prognon, Patrice; Durieux, Pierre

    2012-01-01

    Background: Medication errors can occur at any of the three steps of the medication use process: prescribing, dispensing and administration. We aimed to determine the incidence, type and clinical importance of drug administration errors and to identify risk factors. Methods: Prospective study based on disguised observation technique in four wards in a teaching hospital in Paris, France (800 beds). A pharmacist accompanied nurses and witnessed the preparation and administration of drugs...

  7. Errors in dentistry: a call for apology.

    Science.gov (United States)

    Schwartz, Barry

    2005-01-01

    Bad outcomes occur in dentistry and sometimes these are the results of dental errors. In both cases, this essay will argue that apologies are very important in maintaining a relationship with the patient that is based on trust and mutual respect. Nevertheless, apologies are often not forthcoming in dentistry for a number of reasons that deserve careful examination. In particular, the dentist's fear that an apology will increase the risk of legal harm will be critiqued. Ethical and psychological reasons for making an apology will be discussed, and strategies to assist clinicians in making an apology will be offered.

  8. Errors, error detection, error correction and hippocampal-region damage: data and theories.

    Science.gov (United States)

    MacKay, Donald G; Johnson, Laura W

    2013-11-01

    This review and perspective article outlines 15 observational constraints on theories of errors, error detection, and error correction, and their relation to hippocampal-region (HR) damage. The core observations come from 10 studies with H.M., an amnesic with cerebellar and HR damage but virtually no neocortical damage. Three studies examined the detection of errors planted in visual scenes (e.g., a bird flying in a fish bowl in a school classroom) and sentences (e.g., I helped themselves to the birthday cake). In all three experiments, H.M. detected reliably fewer errors than carefully matched memory-normal controls. Other studies examined the detection and correction of self-produced errors, with controls for comprehension of the instructions, impaired visual acuity, temporal factors, motoric slowing, forgetting, excessive memory load, lack of motivation, and deficits in visual scanning or attention. In these studies, H.M. corrected reliably fewer errors than memory-normal and cerebellar controls, and his uncorrected errors in speech, object naming, and reading aloud exhibited two consistent features: omission and anomaly. For example, in sentence production tasks, H.M. omitted one or more words in uncorrected encoding errors that rendered his sentences anomalous (incoherent, incomplete, or ungrammatical) reliably more often than controls. Besides explaining these core findings, the theoretical principles discussed here explain H.M.'s retrograde amnesia for once familiar episodic and semantic information; his anterograde amnesia for novel information; his deficits in visual cognition, sentence comprehension, sentence production, sentence reading, and object naming; and effects of aging on his ability to read isolated low frequency words aloud. These theoretical principles also explain a wide range of other data on error detection and correction and generate new predictions for future test. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Wind Power Forecasting Error Distributions: An International Comparison

    DEFF Research Database (Denmark)

    Hodge, Bri-Mathias; Lew, Debra; Milligan, Michael

    2012-01-01

    Wind power forecasting is essential for greater penetration of wind power into electricity systems. Because no wind forecasting system is perfect, a thorough understanding of the errors that may occur is a critical factor for system operation functions, such as the setting of operating reserve levels. This paper provides an international comparison of the distribution of wind power forecasting errors from operational systems, based on real forecast data. The paper concludes with an assessment of similarities and differences between the errors observed in different locations.
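
    The comparison rests on summarising the empirical distribution of forecast errors per system. A small sketch of that kind of summary (moments of the error distribution per region) is shown below with synthetic data; a real comparison would use archived operational forecasts and measurements.

    ```python
    # Summary statistics of forecast-error distributions for two synthetic regions.
    # Real studies would replace the synthetic errors with forecast minus actual data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    regions = {
        "region_A": rng.laplace(0.0, 0.05, 10_000),   # heavier-tailed errors
        "region_B": rng.normal(0.0, 0.07, 10_000),    # near-Gaussian errors
    }

    for name, err in regions.items():
        print(f"{name}: mean={err.mean():+.4f}  sd={err.std():.4f}  "
              f"skew={stats.skew(err):+.2f}  excess kurtosis={stats.kurtosis(err):+.2f}")
    ```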

  10. Automated Classification of Phonological Errors in Aphasic Language

    Science.gov (United States)

    Ahuja, Sanjeev B.; Reggia, James A.; Berndt, Rita S.

    1984-01-01

    Using heuristically-guided state space search, a prototype program has been developed to simulate and classify phonemic errors occurring in the speech of neurologically-impaired patients. Simulations are based on an interchangeable rule/operator set of elementary errors which represent a theory of phonemic processing faults. This work introduces and evaluates a novel approach to error simulation and classification, provides a prototype simulation tool for neurolinguistic research, and forms the initial phase of a larger research effort involving computer modelling of neurolinguistic processes.

  11. Human errors in NPP operations

    International Nuclear Information System (INIS)

    Sheng Jufang

    1993-01-01

    Based on the operational experiences of nuclear power plants (NPPs), the importance of studying human performance problems is described. Statistical analysis of the significance and frequency of various root causes and error modes from a large number of human-error-related events demonstrates that defects in operation/maintenance procedures, workplace factors, and communication and training practices are the primary root causes, while omission, transposition, and quantitative mistakes are the most frequent error modes. Recommendations for domestic research on human performance problems in NPPs are suggested

  12. Linear network error correction coding

    CERN Document Server

    Guang, Xuan

    2014-01-01

    There are two main approaches in the theory of network error correction coding. In this SpringerBrief, the authors summarize some of the most important contributions following the classic approach, which represents messages by sequences, similar to algebraic coding, and also briefly discuss the main results following the other approach, which uses the theory of rank metric codes for network error correction, representing messages by subspaces. This book starts by establishing the basic linear network error correction (LNEC) model and then characterizes two equivalent descriptions. Distances an

  13. Competition increases binding errors in visual working memory.

    Science.gov (United States)

    Emrich, Stephen M; Ferber, Susanne

    2012-04-20

    When faced with maintaining multiple objects in visual working memory, item information must be bound to the correct object in order to be correctly recalled. Sometimes, however, binding errors occur, and participants report the feature (e.g., color) of an unprobed, non-target item. In the present study, we examine whether the configuration of sample stimuli affects the proportion of these binding errors. The results demonstrate that participants mistakenly report the identity of the unprobed item (i.e., they make a non-target response) when sample items are presented close together in space, suggesting that binding errors can increase independent of increases in memory load. Moreover, the proportion of these non-target responses is linearly related to the distance between sample items, suggesting that these errors are spatially specific. Finally, presenting sample items sequentially decreases non-target responses, suggesting that reducing competition between sample stimuli reduces the number of binding errors. Importantly, these effects all occurred without increases in the amount of error in the memory representation. These results suggest that competition during encoding can account for some of the binding errors made during VWM recall.

  14. Human Error Assessment in Minefield Cleaning Operation Using Human Event Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Hajiakbari

    2015-12-01

    Full Text Available Background & objective: Human error is one of the main causes of accidents. Due to the unreliability of the human element and the high-risk nature of demining operations, this study aimed to assess and manage human errors likely to occur in such operations. Methods: This study was performed at a demining site in war zones located in the West of Iran. After acquiring an initial familiarity with the operations, methods, and tools of clearing minefields, job tasks related to clearing landmines were specified. Next, these tasks were studied using HTA and related possible errors were assessed using ATHEANA. Results: The de-mining task was composed of four main operations: primary detection, technical identification, investigation, and neutralization. Four main causes of accidents in such operations were identified: walking on mines, leaving mines with no action taken, errors in the neutralization operation, and environmental explosion. The probability of human error in mine clearance operations was calculated as 0.010. Conclusion: The main causes of human error in de-mining operations can be attributed to various factors such as poor weather and operating conditions like outdoor work, inappropriate personal protective equipment, personality characteristics, insufficient accuracy in the work, and insufficient time available. To reduce the probability of human error in de-mining operations, the aforementioned factors should be managed properly.

  15. Natural occurring radioactive substances. Vol. 1

    Energy Technology Data Exchange (ETDEWEB)

    Emara, A E [National Center for radiation Research and Technology Atomic Energy Authority, Cairo (Egypt)

    1996-03-01

    Naturally occurring radioactive substances produced by cosmic rays or those of terrestrial origin are surveyed. The different radioactive decay series are discussed. Special emphasis is given to the element radium as regards its properties and distribution in different environmental samples. The properties of naturally occurring K-40 and its distribution in different natural media are also outlined. Induced radionuclides which are formed as a result of the interaction of cosmic rays with the constituents of the atmosphere are mentioned. In this respect the intensity of natural background radiation and the dose at different locations and levels is surveyed. Some regions of exceptionally high radioactivity which result in high exposure rates are mentioned. Monazite deposits and water springs are mentioned in some detail. The Oklo phenomenon as a natural reactor is also discussed. 8 tabs.

  16. Natural occurring radioactive substances. Vol. 1

    International Nuclear Information System (INIS)

    Emara, A.E.

    1996-01-01

    Naturally occurring radioactive substances produced by cosmic rays or those of terrestrial origin are surveyed. The different radioactive decay series are discussed. Special emphasis is given to the element radium as regards its properties and distribution in different environmental samples. The properties of naturally occurring K-40 and its distribution in different natural media are also outlined. Induced radionuclides which are formed as a result of the interaction of cosmic rays with the constituents of the atmosphere are mentioned. In this respect the intensity of natural background radiation and the dose at different locations and levels is surveyed. Some regions of exceptionally high radioactivity which result in high exposure rates are mentioned. Monazite deposits and water springs are mentioned in some detail. The Oklo phenomenon as a natural reactor is also discussed. 8 tabs

  17. An Analysis and Quantification Method of Human Errors of Soft Controls in Advanced MCRs

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jae Whan; Jang, Seung Cheol

    2011-01-01

    In this work, a method was proposed for quantifying human errors that may occur during operation executions using soft control. Soft controls of advanced main control rooms (MCRs) have totally different features from conventional controls, and thus they may have different human error modes and occurrence probabilities. It is important to define the human error modes and to quantify the error probability for evaluating the reliability of the system and preventing errors. This work suggests a modified K-HRA method for quantifying error probability

  18. Jerky Periods - Myoclonus Occurring Solely During Menses

    OpenAIRE

    Arthur W. Buijink; Jeannette M. Gelauff; Sandra M. van der Salm; Marina A. Tijssen; Anne-Fleur van Rootselaar

    2013-01-01

    Background: In this case report, we describe an unusual case of a patient with myoclonus only occurring during menses. Case Report: A 41-year-old female, known to have neurological sequelae after a car accident 1 year earlier, presented with myoclonic movements of the right arm and hand only during menses. Brain magnetic resonance imaging is compatible with head trauma. Electromyography shows brief irregular bursts with a duration of about 20 ms. Discussion: This appears to be the first descr...

  19. The uncorrected refractive error challenge

    Directory of Open Access Journals (Sweden)

    Kovin Naidoo

    2016-11-01

    Full Text Available Refractive error affects people of all ages, socio-economic status and ethnic groups. The most recent statistics estimate that, worldwide, 32.4 million people are blind and 191 million people have vision impairment. Vision impairment has been defined based on distance visual acuity only, and uncorrected distance refractive error (mainly myopia) is the single biggest cause of worldwide vision impairment. However, when we also consider near visual impairment, it is clear that even more people are affected. From research it was estimated that the number of people with vision impairment due to uncorrected distance refractive error was 107.8 million,1 and the number of people affected by uncorrected near refractive error was 517 million, giving a total of 624.8 million people.

  20. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying

    2009-08-27

    Regression quantiles can be substantially biased when the covariates are measured with error. In this paper we propose a new method that produces consistent linear quantile estimation in the presence of covariate measurement error. The method corrects the measurement error induced bias by constructing joint estimating equations that simultaneously hold for all the quantile levels. An iterative EM-type estimation algorithm to obtain the solutions to such joint estimation equations is provided. The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a longitudinal study with an unusual measurement error structure. © 2009 American Statistical Association.
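
    The corrected estimator itself (joint estimating equations solved with an EM-type algorithm) is not reproduced here. The sketch below only demonstrates the problem the method addresses: a naive median regression on an error-contaminated covariate attenuates the slope relative to regression on the true covariate. The simulation settings are arbitrary.

    ```python
    # Attenuation of a quantile-regression slope under covariate measurement error.
    # This demonstrates the bias, not the paper's corrected estimator.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.regression.quantile_regression import QuantReg

    rng = np.random.default_rng(0)
    n, true_slope = 5000, 2.0
    x = rng.normal(size=n)                         # true covariate
    y = 1.0 + true_slope * x + rng.normal(size=n)  # outcome
    w = x + rng.normal(scale=0.8, size=n)          # covariate observed with error

    for label, cov in [("true x", x), ("error-prone w", w)]:
        fit = QuantReg(y, sm.add_constant(cov)).fit(q=0.5)
        print(f"median-regression slope using {label}: {fit.params[1]:.2f}")
    ```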

  1. Comprehensive Error Rate Testing (CERT)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services (CMS) implemented the Comprehensive Error Rate Testing (CERT) program to measure improper payments in the Medicare...

  2. Numerical optimization with computational errors

    CERN Document Server

    Zaslavski, Alexander J

    2016-01-01

    This book studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking into account computational errors. The author illustrates that algorithms generate a good approximate solution, if computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative. This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods and Newton's meth...

  3. Dual processing and diagnostic errors.

    Science.gov (United States)

    Norman, Geoff

    2009-09-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical, conscious, and conceptual process, called System 2. Exemplar theories of categorization propose that many category decisions in everyday life are made by unconscious matching to a particular example in memory, and these remain available and retrievable individually. I then review studies of clinical reasoning based on these theories, and show that the two processes are equally effective; System 1, despite its reliance on idiosyncratic, individual experience, is no more prone to cognitive bias or diagnostic error than System 2. Further, I review evidence that instructions directed at encouraging the clinician to explicitly use both strategies can lead to consistent reduction in error rates.

  4. Error correcting coding for OTN

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Pedersen, Lars A.

    2010-01-01

    Forward error correction codes for 100 Gb/s optical transmission are currently receiving much attention from transport network operators and technology providers. We discuss the performance of hard decision decoding using product-type codes that cover a single OTN frame or a small number of such frames. In particular, we argue that a three-error-correcting BCH code is the best choice for the component code in such systems.

  5. Negligence, genuine error, and litigation

    OpenAIRE

    Sohn DH

    2013-01-01

    David H Sohn, Department of Orthopedic Surgery, University of Toledo Medical Center, Toledo, OH, USA. Abstract: Not all medical injuries are the result of negligence. In fact, most medical injuries result either from the inherent risk in the practice of medicine or from system errors, which cannot be prevented simply through fear of disciplinary action. This paper will discuss the differences between adverse events, negligence, and system errors; the current medical malpractice tort syst...

  6. Errors in Aviation Decision Making: Bad Decisions or Bad Luck?

    Science.gov (United States)

    Orasanu, Judith; Martin, Lynne; Davison, Jeannie; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    Despite efforts to design systems and procedures to support 'correct' and safe operations in aviation, errors in human judgment still occur and contribute to accidents. In this paper we examine how an NDM (naturalistic decision making) approach might help us to understand the role of decision processes in negative outcomes. Our strategy was to examine a collection of identified decision errors through the lens of an aviation decision process model and to search for common patterns. The second, and more difficult, task was to determine what might account for those patterns. The corpus we analyzed consisted of tactical decision errors identified by the NTSB (National Transportation Safety Board) from a set of accidents in which crew behavior contributed to the accident. A common pattern emerged: about three quarters of the errors represented plan-continuation errors, that is, a decision to continue with the original plan despite cues that suggested changing the course of action. Features in the context that might contribute to these errors were identified: (a) ambiguous dynamic conditions and (b) organizational and socially-induced goal conflicts. We hypothesize that 'errors' are mediated by underestimation of risk and failure to analyze the potential consequences of continuing with the initial plan. Stressors may further contribute to these effects. Suggestions for improving performance in these error-inducing contexts are discussed.

  7. Hospital medication errors in a pharmacovigilance system in Colombia

    Directory of Open Access Journals (Sweden)

    Jorge Enrique Machado-Alba

    2015-11-01

    Full Text Available Objective: this study analyzes the medication errors reported to a pharmacovigilance system by 26 hospitals for patients in the healthcare system of Colombia. Methods: this retrospective study analyzed the medication errors reported to a systematized database between 1 January 2008 and 12 September 2013. The medication is dispensed by the company Audifarma S.A. to hospitals and clinics around Colombia. Data were classified according to the taxonomy of the National Coordinating Council for Medication Error Reporting and Prevention (NCC MERP). The data analysis was performed using SPSS 22.0 for Windows, considering p-values < 0.05 significant. Results: there were 9 062 medication errors in 45 hospital pharmacies. Real errors accounted for 51.9% (n = 4 707), of which 12.0% (n = 567) reached the patient (Categories C to I) and caused harm (Categories E to I) to 17 subjects (0.36%). The main process involved in errors that occurred (categories B to I) was prescription (n = 1 758, 37.3%), followed by dispensation (n = 1 737, 36.9%), transcription (n = 970, 20.6%) and administration (n = 242, 5.1%). The errors in the administration process were 45.2 times more likely to reach the patient (95% CI: 20.2–100.9). Conclusions: medication error reporting systems and prevention strategies should be widespread in hospital settings, prioritizing efforts to address the administration process.
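
    The 45.2-fold figure quoted above is a relative risk with a log-scale confidence interval. The sketch below shows the standard calculation from a 2x2 table; the counts used are hypothetical placeholders, since the report does not provide the full table, so the output will not reproduce the published value.

    ```python
    # Relative risk with a 95% CI on the log scale, from a hypothetical 2x2 table.
    import math

    a, n1 = 60, 242     # hypothetical: errors reaching the patient / administration errors
    b, n2 = 30, 8820    # hypothetical: errors reaching the patient / all other errors

    rr = (a / n1) / (b / n2)
    se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    print(f"RR = {rr:.1f}, 95% CI ({lo:.1f}, {hi:.1f})")
    ```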

  8. Error in the delivery of radiation therapy: Results of a quality assurance review

    International Nuclear Information System (INIS)

    Huang, Grace; Medlam, Gaylene; Lee, Justin; Billingsley, Susan; Bissonnette, Jean-Pierre; Ringash, Jolie; Kane, Gabrielle; Hodgson, David C.

    2005-01-01

    Purpose: To examine error rates in the delivery of radiation therapy (RT), technical factors associated with RT errors, and the influence of a quality improvement intervention on the RT error rate. Methods and materials: We undertook a review of all RT errors that occurred at the Princess Margaret Hospital (Toronto) from January 1, 1997, to December 31, 2002. Errors were identified according to incident report forms that were completed at the time the error occurred. Error rates were calculated per patient, per treated volume (≥1 volume per patient), and per fraction delivered. The association between tumor site and error was analyzed. Logistic regression was used to examine the association between technical factors and the risk of error. Results: Over the study interval, there were 555 errors among 28,136 patient treatments delivered (error rate per patient = 1.97%, 95% confidence interval [CI], 1.81-2.14%) and among 43,302 treated volumes (error rate per volume = 1.28%, 95% CI, 1.18-1.39%). The proportion of fractions with errors from July 1, 2000, to December 31, 2002, was 0.29% (95% CI, 0.27-0.32%). Patients with sarcoma or head-and-neck tumors experienced error rates significantly higher than average (5.54% and 4.58%, respectively); however, when the number of treated volumes was taken into account, the head-and-neck error rate was no longer higher than average (1.43%). The use of accessories was associated with an increased risk of error, and internal wedges were more likely to be associated with an error than external wedges (relative risk = 2.04; 95% CI, 1.11-3.77%). Eighty-seven errors (15.6%) were directly attributed to incorrect programming of the 'record and verify' system. Changes to planning and treatment processes aimed at reducing errors within the head-and-neck site group produced a substantial reduction in the error rate. Conclusions: Errors in the delivery of RT are uncommon and usually of little clinical significance. Patient subgroups and

  9. [Medical errors: inevitable but preventable].

    Science.gov (United States)

    Giard, R W

    2001-10-27

    Medical errors are increasingly reported in the lay press. Studies have shown dramatic error rates of 10 percent or even higher. From a methodological point of view, studying the frequency and causes of medical errors is far from simple. Clinical decisions on diagnostic or therapeutic interventions are always taken within a clinical context. Reviewing outcomes of interventions without taking into account both the intentions and the arguments for a particular action will limit the conclusions from a study on the rate and preventability of errors. The interpretation of the preventability of medical errors is fraught with difficulties and probably highly subjective. Blaming the doctor personally does not do justice to the actual situation and especially the organisational framework. Attention to and improvement of the organisational aspects of error are far more important than litigating against the person. To err is, and will remain, human, and if we want to reduce the incidence of faults we must be able to learn from our mistakes. That requires an open attitude towards medical mistakes, a continuous effort in their detection, a sound analysis and, where feasible, the institution of preventive measures.

  10. Quantum error correction for beginners

    International Nuclear Information System (INIS)

    Devitt, Simon J; Nemoto, Kae; Munro, William J

    2013-01-01

    Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large-scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation is now a much larger field and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault-tolerance, not as a detailed guide, but rather as a basic introduction. The development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault-tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future. (review article)

  11. Issues with data and analyses: Errors, underlying themes, and potential solutions.

    Science.gov (United States)

    Brown, Andrew W; Kaiser, Kathryn A; Allison, David B

    2018-03-13

    Some aspects of science, taken at the broadest level, are universal in empirical research. These include collecting, analyzing, and reporting data. In each of these aspects, errors can and do occur. In this work, we first discuss the importance of focusing on statistical and data errors to continually improve the practice of science. We then describe underlying themes of the types of errors and postulate contributing factors. To do so, we describe a case series of relatively severe data and statistical errors coupled with surveys of some types of errors to better characterize the magnitude, frequency, and trends. Having examined these errors, we then discuss the consequences of specific errors or classes of errors. Finally, given the extracted themes, we discuss methodological, cultural, and system-level approaches to reducing the frequency of commonly observed errors. These approaches will plausibly contribute to the self-critical, self-correcting, ever-evolving practice of science, and ultimately to furthering knowledge.

  12. Jerky periods: myoclonus occurring solely during menses.

    Science.gov (United States)

    Buijink, Arthur W G; Gelauff, Jeannette M; van der Salm, Sandra M A; Tijssen, Marina A J; van Rootselaar, Anne-Fleur

    2013-01-01

    In this case report, we describe an unusual case of a patient with myoclonus only occurring during menses. A 41-year-old female, known to have neurological sequelae after a car accident 1 year earlier, presented with myoclonic movements of the right arm and hand only during menses. Brain magnetic resonance imaging is compatible with head trauma. Electromyography shows brief irregular bursts with a duration of about 20 ms. This appears to be the first description of myoclonus appearing only during menses. We suggest a cortical origin for myoclonus.

  13. Jerky Periods - Myoclonus Occurring Solely During Menses

    Directory of Open Access Journals (Sweden)

    Arthur W. Buijink

    2013-05-01

    Full Text Available Background: In this case report, we describe an unusual case of a patient with myoclonus only occurring during menses. Case Report: A 41-year-old female, known to have neurological sequelae after a car accident 1 year earlier, presented with myoclonic movements of the right arm and hand only during menses. Brain magnetic resonance imaging is compatible with head trauma. Electromyography shows brief irregular bursts with a duration of about 20 ms. Discussion: This appears to be the first description of myoclonus appearing only during menses. We suggest a cortical origin for myoclonus.

  14. Nipah virus entry can occur by macropinocytosis

    International Nuclear Information System (INIS)

    Pernet, Olivier; Pohl, Christine; Ainouze, Michelle; Kweder, Hasan; Buckland, Robin

    2009-01-01

    Nipah virus (NiV) is a zoonotic biosafety level 4 paramyxovirus that emerged recently in Asia with high mortality in man. NiV is a member, with Hendra virus (HeV), of the Henipavirus genus in the Paramyxoviridae family. Although NiV entry, like that of other paramyxoviruses, is believed to occur via pH-independent fusion with the host cell's plasma membrane we present evidence that entry can occur by an endocytic pathway. The NiV receptor ephrinB2 has receptor kinase activity and we find that ephrinB2's cytoplasmic domain is required for entry but is dispensable for post-entry viral spread. The mutation of a single tyrosine residue (Y304F) in ephrinB2's cytoplasmic tail abrogates NiV entry. Moreover, our results show that NiV entry is inhibited by constructions and drugs specific for the endocytic pathway of macropinocytosis. Our findings could potentially permit the rapid development of novel low-cost antiviral treatments not only for NiV but also HeV.

  15. Leachability of naturally occurring radioactive materials

    International Nuclear Information System (INIS)

    Desideri, D.; Feduzi, L.; Meli, M.A.; Roselli, C.

    2006-01-01

    Naturally occurring radioactive materials (NORM) are present in the environment and can be concentrated by technical activities, particularly those involving natural resources. These NORM deposits are highly stable and very insoluble under environmental conditions at the earth's surface. However, reducing or oxidising conditions or pH changes may enable a fraction of naturally occurring radionuclides to eventually be released to the environment. Leachability of 210Pb and 210Po was determined in three samples coming from a refractories production plant (dust, sludge, finished product), in one dust sample from a steelwork and in one ash sample coming from an electric power station. A sequential extraction method consisting of five operationally-defined fractions was used. The average leaching potential observed in the samples from the refractory industry is very low (mean values: 5.8% for 210Pb and 1.7% for 210Po). The 210Pb and 210Po leachability increases for the ash sample coming from an electric power plant using coal (17.8% for 210Pb and 10.0% for 210Po); for the dust sample coming from a steelwork, the percent soluble fraction is 41.1% for 210Pb and 8.5% for 210Po. For all samples the results obtained show that 210Pb is slightly more soluble than 210Po. (author)

  16. Normalization of Deviation: Quotation Error in Human Factors.

    Science.gov (United States)

    Lock, Jordan; Bearman, Chris

    2018-05-01

    Objective The objective of this paper is to examine quotation error in human factors. Background Science progresses through building on the work of previous research. This requires accurate quotation. Quotation error has a number of adverse consequences: loss of credibility, loss of confidence in the journal, and a flawed basis for academic debate and scientific progress. Quotation error has been observed in a number of domains, including marine biology and medicine, but there has been little or no previous study of this form of error in human factors, a domain that specializes in the causes and management of error. Methods A study was conducted examining quotation accuracy of 187 extracts from 118 published articles that cited a control article (Vaughan's 1996 book: The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA). Results Of extracts studied, 12.8% ( n = 24) were classed as inaccurate, with 87.2% ( n = 163) being classed as accurate. A second dimension of agreement was examined with 96.3% ( n = 180) agreeing with the control article and only 3.7% ( n = 7) disagreeing. The categories of accuracy and agreement form a two by two matrix. Conclusion Rather than simply blaming individuals for quotation error, systemic factors should also be considered. Vaughan's theory, normalization of deviance, is one systemic theory that can account for quotation error. Application Quotation error is occurring in human factors and should receive more attention. According to Vaughan's theory, the normal everyday systems that promote scholarship may also allow mistakes, mishaps, and quotation error to occur.

  17. Medication errors in home care: a qualitative focus group study.

    Science.gov (United States)

    Berland, Astrid; Bentsen, Signe Berit

    2017-11-01

    To explore registered nurses' experiences of medication errors and patient safety in home care. The focus of care for older patients has shifted from institutional care towards a model of home care. Medication errors are common in this situation and can result in patient morbidity and mortality. An exploratory qualitative design with focus group interviews was used. Four focus group interviews were conducted with 20 registered nurses in home care. The data were analysed using content analysis. Five categories were identified as follows: lack of information, lack of competence, reporting medication errors, trade name products vs. generic name products, and improving routines. Medication errors occur frequently in home care and can threaten the safety of patients. Insufficient exchange of information and poor communication between the specialist and home-care health services, and between general practitioners and healthcare workers can lead to medication errors. A lack of competence in healthcare workers can also lead to medication errors. To prevent these, it is important that there should be up-to-date information and communication between healthcare workers during the transfer of patients from specialist to home care. Ensuring competence among healthcare workers with regard to medication is also important. In addition, there should be openness and accurate reporting of medication errors, as well as in setting routines for the preparation, alteration and administration of medicines. To prevent medication errors in home care, up-to-date information and communication between healthcare workers is important when patients are transferred from specialist to home care. It is also important to ensure adequate competence with regard to medication, and that there should be openness when medication errors occur, as well as in setting routines for the preparation, alteration and administration of medications. © 2017 John Wiley & Sons Ltd.

  18. Radiology errors: are we learning from our mistakes?

    International Nuclear Information System (INIS)

    Mankad, K.; Hoey, E.T.D.; Jones, J.B.; Tirukonda, P.; Smith, J.T.

    2009-01-01

    Aim: To question practising radiologists and radiology trainees at a large international meeting in an attempt to survey individuals about error reporting. Materials and methods: Radiologists attending the 2007 Radiological Society of North America (RSNA) annual meeting were approached to fill in a written questionnaire. Participants were questioned as to their grade, country in which they practised, and subspecialty interest. They were asked whether they kept a personal log of their errors (with an error defined as 'a mistake that has management implications for the patient'), how many errors they had made in the preceding 12 months, and the types of errors that had occurred. They were also asked whether their local department held regular discrepancy/errors meetings, how many they had attended in the preceding 12 months, and the perceived atmosphere at these meetings (on a qualitative scale). Results: A total of 301 radiologists with a wide range of specialty interests from 32 countries agreed to take part. One hundred and sixty-six of 301 (55%) of responders were consultant/attending grade. One hundred and thirty-five of 301 (45%) were residents/fellows. Fifty-nine of 301 (20%) of responders kept a personal record of their errors. The number of errors made per person per year ranged from none (2%) to 16 or more (7%). The majority (91%) reported making between one and 15 errors/year. Overcalls (40%), under-calls (25%), and interpretation error (15%) were the predominant error types. One hundred and seventy-eight of 301 (59%) of participants stated that their department held regular errors meeting. One hundred and twenty-seven of 301 (42%) had attended three or more meetings in the preceding year. The majority (55%) who had attended errors meetings described the atmosphere as 'educational.' Only a small minority (2%) described the atmosphere as 'poor' meaning non-educational and/or blameful. Conclusion: Despite the undeniable importance of learning from errors

  19. Nature and frequency of medication errors in a geriatric ward: an Indonesian experience

    Directory of Open Access Journals (Sweden)

    Ernawati DK

    2014-06-01

    Full Text Available Desak Ketut Ernawati,1,2 Ya Ping Lee,2 Jeffery David Hughes2; 1Faculty of Medicine, Udayana University, Denpasar, Bali, Indonesia; 2School of Pharmacy and Curtin Health Innovation and Research Institute, Curtin University, Perth, WA, Australia. Purpose: To determine the nature and frequency of medication errors during medication delivery processes in a public teaching hospital geriatric ward in Bali, Indonesia. Methods: A 20-week prospective study on medication errors occurring during the medication delivery process was conducted in a geriatric ward in a public teaching hospital in Bali, Indonesia. Participants selected were inpatients aged more than 60 years. Patients were excluded if they had a malignancy, were undergoing surgery, or receiving chemotherapy treatment. The occurrence of medication errors in prescribing, transcribing, dispensing, and administration was detected by the investigator providing in-hospital clinical pharmacy services. Results: Seven hundred and seventy drug orders and 7,662 drug doses were reviewed as part of the study. There were 1,563 medication errors detected among the 7,662 drug doses reviewed, representing an error rate of 20.4%. Administration errors were the most frequent medication errors identified (59%), followed by transcription errors (15%), dispensing errors (14%), and prescribing errors (7%). Errors in documentation were the most common form of administration errors. Of these errors, 2.4% were classified as potentially serious and 10.3% as potentially significant. Conclusion: Medication errors occurred in every stage of the medication delivery process, with administration errors being the most frequent. The majority of errors identified in the administration stage were related to documentation. Provision of in-hospital clinical pharmacy services could potentially play a significant role in detecting and preventing medication errors. Keywords: geriatric, medication errors, inpatients, medication delivery process

  20. The last Fermat theorem. The story of the riddle that has defied the greatest minds in the world during 358 years

    International Nuclear Information System (INIS)

    Singh, S.

    1998-01-01

    Pierre de Fermat, one of the greatest French mathematicians of the seventeenth century, noted in the margin of his exercise book: 'X^n + Y^n = Z^n is impossible if n is greater than 2; I have found a wonderful proof, but I am short of space to develop it here'. Only in 1993 did a young British mathematician, Andrew Wiles, a professor at Princeton, settle this riddle after seven years of work. That is the story told here. (N.C.)

  1. Stochastic and sensitivity analysis of shape error of inflatable antenna reflectors

    Science.gov (United States)

    San, Bingbing; Yang, Qingshan; Yin, Liwei

    2017-03-01

    Inflatable antennas are promising candidates to realize future satellite communications and space observations since they are lightweight, low-cost and small-packaged-volume. However, due to their high flexibility, inflatable reflectors are difficult to manufacture accurately, which may result in undesirable shape errors, and thus affect their performance negatively. In this paper, the stochastic characteristics of shape errors induced during manufacturing process are investigated using Latin hypercube sampling coupled with manufacture simulations. Four main random error sources are involved, including errors in membrane thickness, errors in elastic modulus of membrane, boundary deviations and pressure variations. Using regression and correlation analysis, a global sensitivity study is conducted to rank the importance of these error sources. This global sensitivity analysis is novel in that it can take into account the random variation and the interaction between error sources. Analyses are parametrically carried out with various focal-length-to-diameter ratios (F/D) and aperture sizes (D) of reflectors to investigate their effects on significance ranking of error sources. The research reveals that RMS (Root Mean Square) of shape error is a random quantity with an exponent probability distribution and features great dispersion; with the increase of F/D and D, both mean value and standard deviation of shape errors are increased; in the proposed range, the significance ranking of error sources is independent of F/D and D; boundary deviation imposes the greatest effect with a much higher weight than the others; pressure variation ranks the second; error in thickness and elastic modulus of membrane ranks the last with very close sensitivities to pressure variation. Finally, suggestions are given for the control of the shape accuracy of reflectors and allowable values of error sources are proposed from the perspective of reliability.
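
    The workflow described above (Latin hypercube sampling over the error sources, response evaluation, then correlation-based sensitivity ranking) can be sketched as follows. The response function is a stand-in chosen so that boundary deviation dominates and pressure ranks second, as reported; the paper's manufacture simulations and its actual error ranges are not reproduced.

    ```python
    # Latin hypercube sampling over four error sources and a rank-correlation
    # sensitivity ranking. Ranges and the response function are assumptions.
    import numpy as np
    from scipy.stats import qmc, spearmanr

    names = ["membrane thickness", "elastic modulus", "boundary deviation", "pressure variation"]
    lower = np.array([-0.05, -0.05, -0.002, -0.02])   # assumed error ranges
    upper = np.array([ 0.05,  0.05,  0.002,  0.02])

    sampler = qmc.LatinHypercube(d=4, seed=0)
    X = qmc.scale(sampler.random(n=500), lower, upper)

    rng = np.random.default_rng(1)

    def rms_shape_error(x):
        """Stand-in response: boundary deviation dominates, pressure ranks second."""
        t, e, b, p = x
        return 0.2 * t + 0.2 * e + 50.0 * b + 3.0 * p + rng.normal(scale=0.005)

    y = np.array([rms_shape_error(x) for x in X])

    sens = [(name, abs(spearmanr(X[:, i], y)[0])) for i, name in enumerate(names)]
    for name, s in sorted(sens, key=lambda kv: -kv[1]):
        print(f"{name:20s} |Spearman rho| = {s:.2f}")
    ```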

  2. Predictors of Errors of Novice Java Programmers

    Science.gov (United States)

    Bringula, Rex P.; Manabat, Geecee Maybelline A.; Tolentino, Miguel Angelo A.; Torres, Edmon L.

    2012-01-01

    This descriptive study determined which of the sources of errors would predict the errors committed by novice Java programmers. Descriptive statistics revealed that the respondents perceived that they committed the identified eighteen errors infrequently. Thought error was perceived to be the main source of error during the laboratory programming…

  3. Learning time-dependent noise to reduce logical errors: real time error rate estimation in quantum error correction

    Science.gov (United States)

    Huo, Ming-Xia; Li, Ying

    2017-12-01

    Quantum error correction is important to quantum information processing, as it allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. No adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. the surface code. A Gaussian process algorithm is used to estimate and predict error rates based on error correction data from the past. We find that, using these estimated error rates, the probability of error correction failures can be significantly reduced, by a factor increasing with the code distance.
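
    A minimal sketch of the estimation step only (not of the error correction protocol itself or the paper's implementation): fit a Gaussian process regressor to synthetic, noisy error-rate observations over time and predict the current rate with an uncertainty band. The library choice (scikit-learn), the kernel, and the synthetic drift are assumptions for illustration.

        # Sketch: Gaussian process regression of a drifting error rate from noisy observations.
        # Synthetic data stands in for statistics extracted from error-correction records.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)
        t = np.linspace(0, 10, 60)[:, None]                    # time (arbitrary units)
        true_rate = 0.01 + 0.004 * np.sin(0.6 * t).ravel()     # slowly drifting error rate
        obs = true_rate + 0.001 * rng.standard_normal(len(t))  # noisy estimates

        kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=1e-6)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, obs)

        t_now = np.array([[10.5]])                             # predict slightly ahead of the data
        mean, std = gp.predict(t_now, return_std=True)
        print(f"predicted error rate: {mean[0]:.4f} +/- {std[0]:.4f}")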

  4. Wegener's granulomatosis occurring de novo during pregnancy.

    Science.gov (United States)

    Alfhaily, F; Watts, R; Leather, A

    2009-01-01

    Wegener's granulomatosis (WG) is rarely diagnosed during the reproductive years and uncommonly manifests for the first time during pregnancy. We report a case of de novo WG presenting at 30 weeks gestation with classical symptoms of WG (ENT, pulmonary). The diagnosis was confirmed by radiological, laboratory, and histological investigations. With a multidisciplinary approach, she had a successful vaginal delivery of a healthy baby. She was treated successfully by a combination of steroids, azathioprine and intravenous immunoglobulin in the active phase of disease for induction of remission and by azathioprine and steroids for maintenance of remission. The significant improvement in her symptoms allowed us to continue her pregnancy to 37 weeks when delivery was electively induced. Transplacental transmission of PR3-ANCA occurred but the neonate remained well. This case of de novo WG during pregnancy highlights the seriousness of this disease and the challenge in management of such patients.

  5. New method of classifying human errors at nuclear power plants and the analysis results of applying this method to maintenance errors at domestic plants

    International Nuclear Information System (INIS)

    Takagawa, Kenichi; Miyazaki, Takamasa; Gofuku, Akio; Iida, Hiroyasu

    2007-01-01

    Since many of the adverse events that have occurred in nuclear power plants in Japan and abroad have been related to maintenance or operation, it is necessary to plan preventive measures based on detailed analyses of human errors made by maintenance workers or operators. Therefore, before planning preventive measures, we developed a new method of analyzing human errors. Since each human error is an unsafe action caused by some misjudgement made by a person, we decided to classify them into six categories according to the stage in the judgment process at which the error was made. By further classifying each error as either omission-type or commission-type, we produced 12 categories of errors. Then, we divided them into the two categories of basic error tendencies and individual error tendencies, and categorized background factors into four categories: imperfect planning; imperfect facilities or tools; imperfect environment; and imperfect instructions or communication. We thus defined the factors in each category to make it easy to identify the factors that caused an error. Then, using this method, we studied the characteristics of human errors involving maintenance workers and planners, since many maintenance errors have occurred. Among the human errors made by workers (worker errors) during the implementation stage, the following three types accounted for approximately 80%: commission-type 'projection errors', omission-type 'comprehension errors', and commission-type 'action errors'. The most common individual factor in worker errors was 'repetition or habit' (schema), based on the assumption of a typical situation, and half of the 'repetition or habit' (schema) cases were not influenced by any background factors. The most common background factor contributing to the individual factor was 'imperfect work environment', followed by 'insufficient knowledge'. Approximately 80% of the individual factors were 'repetition or habit' or

  6. The Greatest Show on Earth

    Indian Academy of Sciences (India)

    Darwin and Alfred Russel Wallace: life on earth had evolved ... over long epochs, the pace of change was infinitesimal. ... Thanks to the work of the Japanese theoretician Motoo ... pleasure-minus-expenditure balance is positive. This way of ...

  7. Freedom's Greatest Threat, The Metaterrorist

    National Research Council Canada - National Science Library

    Jones, Gary

    1997-01-01

    The end of the Cold War ushered on to the world scene a new hybrid of terrorist. This new breed of criminal is called the metaterrorist, because his art of instilling terror goes beyond anything we have ever seen in the past...

  8. Climate change: Wilderness's greatest challenge

    Science.gov (United States)

    Nathan L. Stephenson; Connie Millar

    2014-01-01

    Anthropogenic climatic change can no longer be considered an abstract possibility. It is here, its effects are already evident, and changes are expected to accelerate in coming decades, profoundly altering wilderness ecosystems. At the most fundamental level, wilderness stewards will increasingly be confronted with a trade-off between untrammeled wilderness character...

  9. Ironic Effects of Drawing Attention to Story Errors

    Science.gov (United States)

    Eslick, Andrea N.; Fazio, Lisa K.; Marsh, Elizabeth J.

    2014-01-01

    Readers learn errors embedded in fictional stories and use them to answer later general knowledge questions (Marsh, Meade, & Roediger, 2003). Suggestibility is robust and occurs even when story errors contradict well-known facts. The current study evaluated whether suggestibility is linked to participants’ inability to judge story content as correct versus incorrect. Specifically, participants read stories containing correct and misleading information about the world; some information was familiar (making error discovery possible), while some was more obscure. To improve participants’ monitoring ability, we highlighted (in red font) a subset of story phrases requiring evaluation; readers no longer needed to find factual information. Rather, they simply needed to evaluate its correctness. Readers were more likely to answer questions with story errors if they were highlighted in red font, even if they contradicted well-known facts. Though highlighting to-be-evaluated information freed cognitive resources for monitoring, an ironic effect occurred: Drawing attention to specific errors increased rather than decreased later suggestibility. Failure to monitor for errors, not failure to identify the information requiring evaluation, leads to suggestibility. PMID:21294039

  10. Medication administration errors in an intensive care unit in Ethiopia

    Directory of Open Access Journals (Sweden)

    Agalu Asrat

    2012-05-01

    Full Text Available Abstract. Background: Medication administration errors in patient care have been shown to be frequent and serious. Such errors are particularly prevalent in highly technical specialties such as the intensive care unit (ICU). In Ethiopia, the prevalence of medication administration errors in the ICU has not been studied. Objective: To assess medication administration errors in the intensive care unit of Jimma University Specialized Hospital (JUSH), Southwest Ethiopia. Methods: A prospective, observation-based cross-sectional study was conducted in the ICU of JUSH from February 7 to March 24, 2011. All medication interventions administered by the nurses to all patients admitted to the ICU during the study period were included in the study. Data were collected by directly observing drug administration by the nurses, supplemented with review of medication charts. Data were edited, coded and entered into SPSS for Windows version 16.0. Descriptive statistics were used to measure the magnitude and type of the problem under study. Results: The prevalence of medication administration errors in the ICU of JUSH was 621 (51.8%). Common administration errors were attributed to wrong timing (30.3%), omission due to unavailability (29.0%) and missed doses (18.3%), among others. Errors associated with antibiotics took the lion's share of medication administration errors (36.7%). Conclusion: Medication errors at the administration phase were highly prevalent in the ICU of Jimma University Specialized Hospital. Supervision of the nurses administering medications by more experienced ICU nurses or other relevant professionals at regular intervals would help ensure that medication errors do not occur as frequently as observed in this study.

  11. Redundant measurements for controlling errors

    International Nuclear Information System (INIS)

    Ehinger, M.H.; Crawford, J.M.; Madeen, M.L.

    1979-07-01

    Current federal regulations for nuclear materials control require consideration of operating data as part of the quality control program and limits of error propagation. Recent work at the BNFP has revealed that operating data are subject to a number of measurement problems which are very difficult to detect and even more difficult to correct in a timely manner. Thus error estimates based on operational data reflect those problems. During the FY 1978 and FY 1979 R and D demonstration runs at the BNFP, redundant measurement techniques were shown to be effective in detecting these problems to allow corrective action. The net effect is a reduction in measurement errors and a significant increase in measurement sensitivity. Results show that normal operation process control measurements, in conjunction with routine accountability measurements, are sensitive problem indicators when incorporated in a redundant measurement program
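
    To illustrate the redundancy idea in the abstract above (a sketch, not the BNFP procedure), one can flag a potential measurement problem whenever two independent determinations of the same quantity disagree by more than their combined propagated uncertainty. The function name, the 3-sigma threshold, and the numbers below are assumptions.

        # Sketch: flag inconsistency between redundant measurements of the same quantity.
        import math

        def consistent(x1, s1, x2, s2, k=3.0):
            """Return True if |x1 - x2| is within k times the propagated standard error."""
            return abs(x1 - x2) <= k * math.sqrt(s1**2 + s2**2)

        # Process-control vs. accountability measurement of the same inventory (made-up numbers)
        print(consistent(102.4, 0.8, 101.9, 1.1))   # True  -> no indication of a problem
        print(consistent(102.4, 0.8, 97.0, 1.1))    # False -> investigate a measurement problem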

  12. Large errors and severe conditions

    CERN Document Server

    Smith, D L; Van Wormer, L A

    2002-01-01

    Physical parameters that can assume real-number values over a continuous range are generally represented by inherently positive random variables. However, if the uncertainties in these parameters are significant (large errors), conventional means of representing and manipulating the associated variables can lead to erroneous results. Instead, all analyses involving them must be conducted in a probabilistic framework. Several issues must be considered: First, non-linear functional relations between primary and derived variables may lead to significant 'error amplification' (severe conditions). Second, the commonly used normal (Gaussian) probability distribution must be replaced by a more appropriate function that avoids the occurrence of negative sampling results. Third, both primary random variables and those derived through well-defined functions must be dealt with entirely in terms of their probability distributions. Parameter 'values' and 'errors' should be interpreted as specific moments of these probabil...
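
    The two points above can be illustrated with a small Monte Carlo sketch (assumed distributions and numbers, not taken from the paper): sampling a strictly positive parameter with 30 % relative uncertainty from a normal distribution occasionally produces negative values, a lognormal with the same median does not, and a nonlinear function such as y = x³ amplifies the relative spread.

        # Sketch: large relative errors, positive-only distributions, and error amplification.
        import numpy as np

        rng = np.random.default_rng(2)
        n, mu, rel = 100_000, 5.0, 0.30                  # mean 5, 30 % relative uncertainty

        normal = rng.normal(mu, rel * mu, n)
        lognormal = rng.lognormal(np.log(mu), rel, n)    # same median, strictly positive

        print("negative samples (normal):   ", (normal <= 0).mean())
        print("negative samples (lognormal):", (lognormal <= 0).mean())

        # Nonlinear propagation through y = x**3 amplifies the relative spread
        y = lognormal**3
        print("relative spread of x:", lognormal.std() / lognormal.mean())
        print("relative spread of y:", y.std() / y.mean())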

  13. Negligence, genuine error, and litigation

    Directory of Open Access Journals (Sweden)

    Sohn DH

    2013-02-01

    Full Text Available David H Sohn, Department of Orthopedic Surgery, University of Toledo Medical Center, Toledo, OH, USA. Abstract: Not all medical injuries are the result of negligence. In fact, most medical injuries are the result either of the inherent risk in the practice of medicine or of system errors, which cannot be prevented simply through fear of disciplinary action. This paper will discuss the differences between adverse events, negligence, and system errors; the current medical malpractice tort system in the United States; and review current and future solutions, including medical malpractice reform, alternative dispute resolution, health courts, and no-fault compensation systems. The current political environment favors investigation of non-cap tort reform remedies; investment in more rational oversight systems, such as health courts or no-fault systems, may reap both quantitative and qualitative benefits for a less costly and safer health system. Keywords: medical malpractice, tort reform, no fault compensation, alternative dispute resolution, system errors

  14. Spacecraft and propulsion technician error

    Science.gov (United States)

    Schultz, Daniel Clyde

    Commercial aviation and commercial space similarly launch, fly, and land passenger vehicles. Unlike aviation, the U.S. government has not established maintenance policies for commercial space. This study conducted a mixed methods review of 610 U.S. space launches from 1984 through 2011, which included 31 failures. An analysis of the failure causal factors showed that human error accounted for 76% of those failures, which included workmanship error accounting for 29% of the failures. With the imminent future of commercial space travel, the increased potential for the loss of human life demands that changes be made to the standardized procedures, training, and certification to reduce human error and failure rates. Several recommendations were made by this study to the FAA's Office of Commercial Space Transportation, space launch vehicle operators, and maintenance technician schools in an effort to increase the safety of the space transportation passengers.

  15. Sensation seeking and error processing.

    Science.gov (United States)

    Zheng, Ya; Sheng, Wenbin; Xu, Jing; Zhang, Yuanyuan

    2014-09-01

    Sensation seeking is defined by a strong need for varied, novel, complex, and intense stimulation, and a willingness to take risks for such experience. Several theories propose that the insensitivity to negative consequences incurred by risks is one of the hallmarks of sensation-seeking behaviors. In this study, we investigated the time course of error processing in sensation seeking by recording event-related potentials (ERPs) while high and low sensation seekers performed an Eriksen flanker task. Whereas there were no group differences in ERPs to correct trials, sensation seeking was associated with a blunted error-related negativity (ERN), which was female-specific. Further, different subdimensions of sensation seeking were related to ERN amplitude differently. These findings indicate that the relationship between sensation seeking and error processing is sex-specific. Copyright © 2014 Society for Psychophysiological Research.

  16. Errors of Inference Due to Errors of Measurement.

    Science.gov (United States)

    Linn, Robert L.; Werts, Charles E.

    Failure to consider errors of measurement when using partial correlation or analysis of covariance techniques can result in erroneous conclusions. Certain aspects of this problem are discussed and particular attention is given to issues raised in a recent article by Brewer, Campbell, and Crano. (Author)

  17. Measurement error models with uncertainty about the error variance

    NARCIS (Netherlands)

    Oberski, D.L.; Satorra, A.

    2013-01-01

    It is well known that measurement error in observable variables induces bias in estimates in standard regression analysis and that structural equation models are a typical solution to this problem. Often, multiple indicator equations are subsumed as part of the structural equation model, allowing

  18. Reward positivity: Reward prediction error or salience prediction error?

    Science.gov (United States)

    Heydari, Sepideh; Holroyd, Clay B

    2016-08-01

    The reward positivity is a component of the human ERP elicited by feedback stimuli in trial-and-error learning and guessing tasks. A prominent theory holds that the reward positivity reflects a reward prediction error signal that is sensitive to outcome valence, being larger for unexpected positive events relative to unexpected negative events (Holroyd & Coles, 2002). Although the theory has found substantial empirical support, most of these studies have utilized either monetary or performance feedback to test the hypothesis. However, in apparent contradiction to the theory, a recent study found that unexpected physical punishments also elicit the reward positivity (Talmi, Atkinson, & El-Deredy, 2013). The authors of this report argued that the reward positivity reflects a salience prediction error rather than a reward prediction error. To investigate this finding further, in the present study participants navigated a virtual T maze and received feedback on each trial under two conditions. In a reward condition, the feedback indicated that they would either receive a monetary reward or not and in a punishment condition the feedback indicated that they would receive a small shock or not. We found that the feedback stimuli elicited a typical reward positivity in the reward condition and an apparently delayed reward positivity in the punishment condition. Importantly, this signal was more positive to the stimuli that predicted the omission of a possible punishment relative to stimuli that predicted a forthcoming punishment, which is inconsistent with the salience hypothesis. © 2016 Society for Psychophysiological Research.

  19. ERROR HANDLING IN INTEGRATION WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Alexey M. Nazarenko

    2017-01-01

    Full Text Available Simulation experiments performed while solving multidisciplinary engineering and scientific problems require the joint usage of multiple software tools. Further, when following a preset plan of experiment or searching for optimum solutions, the same sequence of calculations is run multiple times with various simulation parameters, input data, or conditions, while the overall workflow does not change. Automation of simulations like these requires implementing a workflow where tool execution and data exchange are usually controlled by a special type of software, an integration environment or platform. The result is an integration workflow (a platform-dependent implementation of some computing workflow) which, in the context of automation, is a composition of weakly coupled (in terms of communication intensity) typical subtasks. These compositions can then be decomposed back into a few workflow patterns (types of subtask interaction). The patterns, in their turn, can be interpreted as higher-level subtasks. This paper considers execution control and data exchange rules that should be imposed by the integration environment in the case of an error encountered by some integrated software tool. An error is defined as any abnormal behavior of a tool that invalidates its result data, thus disrupting the data flow within the integration workflow. The main requirement for the error handling mechanism implemented by the integration environment is to prevent abnormal termination of the entire workflow in case of missing intermediate results data. Error handling rules are formulated at the basic pattern level and at the level of a composite task that can combine several basic patterns as next-level subtasks. The work also notes cases where workflow behavior may differ, depending on the user's purposes, when an error takes place, and the error handling options that can be specified by the user.
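
    A minimal sketch of one error-handling option of the kind discussed above (the interface, policy, and names are assumptions, not taken from the paper): a wrapper that retries a failing subtask a bounded number of times and otherwise aborts the composite workflow rather than letting missing intermediate data propagate.

        # Sketch: bounded-retry error handling for a single integration-workflow subtask.
        import random
        import time

        class SubtaskError(RuntimeError):
            """Raised when an integrated tool produces invalid or missing result data."""

        def run_with_retry(task, max_retries=3, delay_s=1.0):
            """Run a subtask; retry on SubtaskError, abort the workflow after max_retries."""
            for attempt in range(1, max_retries + 1):
                try:
                    return task()
                except SubtaskError as err:
                    print(f"attempt {attempt} failed: {err}")
                    time.sleep(delay_s)
            raise RuntimeError("subtask failed after retries; aborting composite workflow")

        def flaky_solver():
            # Stand-in for an external simulation tool that occasionally fails.
            if random.random() < 0.3:
                raise SubtaskError("solver returned no output file")
            return {"status": "ok", "result": 42}

        print(run_with_retry(flaky_solver))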

  20. Analysis of Medication Error Reports

    Energy Technology Data Exchange (ETDEWEB)

    Whitney, Paul D.; Young, Jonathan; Santell, John; Hicks, Rodney; Posse, Christian; Fecht, Barbara A.

    2004-11-15

    In medicine, as in many areas of research, technological innovation and the shift from paper based information to electronic records has created a climate of ever increasing availability of raw data. There has been, however, a corresponding lag in our abilities to analyze this overwhelming mass of data, and classic forms of statistical analysis may not allow researchers to interact with data in the most productive way. This is true in the emerging area of patient safety improvement. Traditionally, a majority of the analysis of error and incident reports has been carried out based on an approach of data comparison, and starts with a specific question which needs to be answered. Newer data analysis tools have been developed which allow the researcher to not only ask specific questions but also to “mine” data: approach an area of interest without preconceived questions, and explore the information dynamically, allowing questions to be formulated based on patterns brought up by the data itself. Since 1991, United States Pharmacopeia (USP) has been collecting data on medication errors through voluntary reporting programs. USP’s MEDMARXsm reporting program is the largest national medication error database and currently contains well over 600,000 records. Traditionally, USP has conducted an annual quantitative analysis of data derived from “pick-lists” (i.e., items selected from a list of items) without an in-depth analysis of free-text fields. In this paper, the application of text analysis and data analysis tools used by Battelle to analyze the medication error reports already analyzed in the traditional way by USP is described. New insights and findings were revealed including the value of language normalization and the distribution of error incidents by day of the week. The motivation for this effort is to gain additional insight into the nature of medication errors to support improvements in medication safety.
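
    To illustrate the kind of analysis mentioned above (a hypothetical sketch with invented records and column names, not USP's or Battelle's pipeline), one can apply a simple language normalization to the free-text descriptions and tabulate incident counts by day of the week with pandas.

        # Sketch: counting reported medication-error incidents by day of the week.
        # Hypothetical records; real MEDMARX fields and text normalization are far richer.
        import pandas as pd

        reports = pd.DataFrame({
            "report_date": ["2004-03-01", "2004-03-02", "2004-03-02", "2004-03-06"],
            "description": ["Wrong DOSE given", "wrong dose  given", "omitted med", "Wrong drug"],
        })

        # Simple language normalization: lowercase and collapse whitespace
        reports["description"] = (reports["description"]
                                  .str.lower()
                                  .str.replace(r"\s+", " ", regex=True))

        # Distribution of incidents by weekday
        reports["weekday"] = pd.to_datetime(reports["report_date"]).dt.day_name()
        print(reports.groupby("weekday").size())
        print(reports["description"].value_counts())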

  1. Medication errors: definitions and classification

    Science.gov (United States)

    Aronson, Jeffrey K

    2009-01-01

    To understand medication errors and to identify preventive strategies, we need to classify them and define the terms that describe them. The four main approaches to defining technical terms consider etymology, usage, previous definitions, and the Ramsey–Lewis method (based on an understanding of theory and practice). A medication error is ‘a failure in the treatment process that leads to, or has the potential to lead to, harm to the patient’. Prescribing faults, a subset of medication errors, should be distinguished from prescription errors. A prescribing fault is ‘a failure in the prescribing [decision-making] process that leads to, or has the potential to lead to, harm to the patient’. The converse of this, ‘balanced prescribing’ is ‘the use of a medicine that is appropriate to the patient's condition and, within the limits created by the uncertainty that attends therapeutic decisions, in a dosage regimen that optimizes the balance of benefit to harm’. This excludes all forms of prescribing faults, such as irrational, inappropriate, and ineffective prescribing, underprescribing and overprescribing. A prescription error is ‘a failure in the prescription writing process that results in a wrong instruction about one or more of the normal features of a prescription’. The ‘normal features’ include the identity of the recipient, the identity of the drug, the formulation, dose, route, timing, frequency, and duration of administration. Medication errors can be classified, invoking psychological theory, as knowledge-based mistakes, rule-based mistakes, action-based slips, and memory-based lapses. This classification informs preventive strategies. PMID:19594526

  2. Spent fuel bundle counter sequence error manual - RAPPS (200 MW) NGS

    International Nuclear Information System (INIS)

    Nicholson, L.E.

    1992-01-01

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message typically contains adequate information to determine the cause of the message. This manual provides a guide to interpret the various sequence error messages that can occur and suggests probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults. Therefore the card file contains sequences with only one fault at a time. Some exceptions have been included however where experience has indicated that several faults can occur simultaneously

  3. Spent fuel bundle counter sequence error manual - KANUPP (125 MW) NGS

    International Nuclear Information System (INIS)

    Nicholson, L.E.

    1992-01-01

    The Spent Fuel Bundle Counter (SFBC) is used to count the number and type of spent fuel transfers that occur into or out of controlled areas at CANDU reactor sites. However if the transfers are executed in a non-standard manner or the SFBC is malfunctioning, the transfers are recorded as sequence errors. Each sequence error message may contain adequate information to determine the cause of the message. This manual provides a guide to interpret the various sequence error messages that can occur and suggests probable cause or causes of the sequence errors. Each likely sequence error is presented on a 'card' in Appendix A. Note that it would be impractical to generate a sequence error card file with entries for all possible combinations of faults. Therefore the card file contains sequences with only one fault at a time. Some exceptions have been included however where experience has indicated that several faults can occur simultaneously

  4. Introduction to naturally occurring radioactive material

    International Nuclear Information System (INIS)

    Egidi, P.

    1997-01-01

    Naturally occurring radioactive material (NORM) is everywhere; we are exposed to it every day. It is found in our bodies, the food we eat, the places where we live and work, and in products we use. Some industrial practices involving natural resources concentrate these radionuclides to a degree that they may pose risk to humans and the environment if they are not controlled. This session will concentrate on diffuse sources of technologically-enhanced (TE) NORM, which are generally large-volume, low-activity waste streams produced by industries such as mineral mining, ore beneficiation, production of phosphate fertilizers, water treatment and purification, and oil and gas production. The majority of radionuclides in TENORM are found in the uranium and thorium decay chains. Radium and its subsequent decay products (radon) are the principal radionuclides used in characterizing the redistribution of TENORM in the environment by human activity. We will briefly review other radionuclides occurring in nature (potassium and rubidium) that contribute primarily to background doses. TENORM is found in many waste streams, for example, scrap metal, sludges, slags, and fluids, and is being discovered in industries traditionally not thought of as affected by radionuclide contamination. Not only the forms and volumes, but also the levels of radioactivity in TENORM vary. Current discussions about the validity of the linear no-threshold dose theory are central to the TENORM issue. TENORM is not regulated by the Atomic Energy Act or other Federal regulations. Control and regulation of TENORM is not consistent from industry to industry nor from state to state. Proposed regulations are moving from concentration-based standards to dose-based standards. So when is TENORM a problem? Where is it a problem? That depends on when, where, and whom you talk to! We will start by reviewing background radioactivity, then we will proceed to the geology, mobility, and variability of these

  5. Introduction to naturally occurring radioactive material

    Energy Technology Data Exchange (ETDEWEB)

    Egidi, P.

    1997-08-01

    Naturally occurring radioactive material (NORM) is everywhere; we are exposed to it every day. It is found in our bodies, the food we eat, the places where we live and work, and in products we use. We are also bathed in a sea of natural radiation coming from the sun and deep space. Living systems have adapted to these levels of radiation and radioactivity. But some industrial practices involving natural resources concentrate these radionuclides to a degree that they may pose risk to humans and the environment if they are not controlled. Other activities, such as flying at high altitudes, expose us to elevated levels of NORM. This session will concentrate on diffuse sources of technologically-enhanced (TE) NORM, which are generally large-volume, low-activity waste streams produced by industries such as mineral mining, ore beneficiation, production of phosphate fertilizers, water treatment and purification, and oil and gas production. The majority of radionuclides in TENORM are found in the uranium and thorium decay chains. Radium and its subsequent decay products (radon) are the principal radionuclides used in characterizing the redistribution of TENORM in the environment by human activity. We will briefly review other radionuclides occurring in nature (potassium and rubidium) that contribute primarily to background doses. TENORM is found in many waste streams, for example, scrap metal, sludges, slags, and fluids, and is being discovered in industries traditionally not thought of as affected by radionuclide contamination. Not only the forms and volumes, but also the levels of radioactivity in TENORM vary. Current discussions about the validity of the linear no-threshold dose theory are central to the TENORM issue. TENORM is not regulated by the Atomic Energy Act or other Federal regulations. Control and regulation of TENORM is not consistent from industry to industry nor from state to state. Proposed regulations are moving from concentration-based standards to dose

  6. Correcting quantum errors with entanglement.

    Science.gov (United States)

    Brun, Todd; Devetak, Igor; Hsieh, Min-Hsiu

    2006-10-20

    We show how entanglement shared between encoder and decoder can simplify the theory of quantum error correction. The entanglement-assisted quantum codes we describe do not require the dual-containing constraint necessary for standard quantum error-correcting codes, thus allowing us to "quantize" all of classical linear coding theory. In particular, efficient modern classical codes that attain the Shannon capacity can be made into entanglement-assisted quantum codes attaining the hashing bound (closely related to the quantum capacity). For systems without large amounts of shared entanglement, these codes can also be used as catalytic codes, in which a small amount of initial entanglement enables quantum communication.

  7. Human Error and Organizational Management

    Directory of Open Access Journals (Sweden)

    Alecxandrina DEACONU

    2009-01-01

    Full Text Available The concern for performance is a topic that raises interest in the business environment but also in other areas that – even if they seem distant from this world – are aware of, interested in, or conditioned by the development of the economy. As individual performance is very much influenced by the human resource, we chose to analyze in this paper the mechanisms that generate – consciously or not – human error nowadays. Moreover, the extremely tense Romanian context, where failure is rather a rule than an exception, made us investigate the phenomenon of generating a human error and the ways to diminish its effects.

  8. Preventing statistical errors in scientific journals.

    NARCIS (Netherlands)

    Nuijten, M.B.

    2016-01-01

    There is evidence for a high prevalence of statistical reporting errors in psychology and other scientific fields. These errors display a systematic preference for statistically significant results, distorting the scientific literature. There are several possible causes for this systematic error
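
    One common way such reporting inconsistencies can be detected (a sketch of the general idea under assumed reporting conventions, not necessarily the author's implementation) is to recompute the p-value from the reported test statistic and degrees of freedom and compare it with the reported p-value.

        # Sketch: check whether a reported p-value matches its reported t statistic and df.
        from scipy import stats

        def consistent_t_report(t, df, reported_p, two_tailed=True, tol=0.01):
            """Recompute p from t and df and compare it with the reported value."""
            p = stats.t.sf(abs(t), df)
            if two_tailed:
                p *= 2
            return abs(p - reported_p) <= tol, p

        # e.g. "t(28) = 2.20, p = .04"  -> recomputed p is about .036, consistent
        print(consistent_t_report(2.20, 28, 0.04))
        # e.g. "t(28) = 1.20, p = .04"  -> recomputed p is about .24, inconsistent
        print(consistent_t_report(1.20, 28, 0.04))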

  9. Learning from mistakes. Factors that influence how students and residents learn from medical errors.

    Science.gov (United States)

    Fischer, Melissa A; Mazor, Kathleen M; Baril, Joann; Alper, Eric; DeMarco, Deborah; Pugnaire, Michele

    2006-05-01

    Trainees are exposed to medical errors throughout medical school and residency. Little is known about what facilitates and limits learning from these experiences. To identify major factors and areas of tension in trainees' learning from medical errors. Structured telephone interviews with 59 trainees (medical students and residents) from 1 academic medical center. Five authors reviewed transcripts of audiotaped interviews using content analysis. Trainees were aware that medical errors occur from early in medical school. Many had an intense emotional response to the idea of committing errors in patient care. Students and residents noted variation and conflict in institutional recommendations and individual actions. Many expressed role confusion regarding whether and how to initiate discussion after errors occurred. Some noted the conflict between reporting errors to seniors who were responsible for their evaluation. Learners requested more open discussion of actual errors and faculty disclosure. No students or residents felt that they learned better from near misses than from actual errors, and many believed that they learned the most when harm was caused. Trainees are aware of medical errors, but remaining tensions may limit learning. Institutions can immediately address variability in faculty response and local culture by disseminating clear, accessible algorithms to guide behavior when errors occur. Educators should develop longitudinal curricula that integrate actual cases and faculty disclosure. Future multi-institutional work should focus on identified themes such as teaching and learning in emotionally charged situations, learning from errors and near misses and balance between individual and systems responsibility.

  10. Errors in laboratory medicine: practical lessons to improve patient safety.

    Science.gov (United States)

    Howanitz, Peter J

    2005-10-01

    Patient safety is influenced by the frequency and seriousness of errors that occur in the health care system. Error rates in laboratory practices are collected routinely for a variety of performance measures in all clinical pathology laboratories in the United States, but a list of critical performance measures has not yet been recommended. The most extensive databases describing error rates in pathology were developed and are maintained by the College of American Pathologists (CAP). These databases include the CAP's Q-Probes and Q-Tracks programs, which provide information on error rates from more than 130 interlaboratory studies. To define critical performance measures in laboratory medicine, describe error rates of these measures, and provide suggestions to decrease these errors, thereby ultimately improving patient safety. A review of experiences from Q-Probes and Q-Tracks studies supplemented with other studies cited in the literature. Q-Probes studies are carried out as time-limited studies lasting 1 to 4 months and have been conducted since 1989. In contrast, Q-Tracks investigations are ongoing studies performed on a yearly basis and have been conducted only since 1998. Participants from institutions throughout the world simultaneously conducted these studies according to specified scientific designs. The CAP has collected and summarized data for participants about these performance measures, including the significance of errors, the magnitude of error rates, tactics for error reduction, and willingness to implement each of these performance measures. A list of recommended performance measures, the frequency of errors when these performance measures were studied, and suggestions to improve patient safety by reducing these errors. Error rates for preanalytic and postanalytic performance measures were higher than for analytic measures. Eight performance measures were identified, including customer satisfaction, test turnaround times, patient identification

  11. Post-error action control is neurobehaviorally modulated under conditions of constant speeded response.

    Science.gov (United States)

    Soshi, Takahiro; Ando, Kumiko; Noda, Takamasa; Nakazawa, Kanako; Tsumura, Hideki; Okada, Takayuki

    2014-01-01

    Post-error slowing (PES) is an error recovery strategy that contributes to action control, and occurs after errors in order to prevent future behavioral flaws. Error recovery often malfunctions in clinical populations, but the relationship between behavioral traits and recovery from error is unclear in healthy populations. The present study investigated the relationship between impulsivity and error recovery by simulating a speeded response situation using a Go/No-go paradigm that forced the participants to constantly make accelerated responses prior to stimuli disappearance (stimulus duration: 250 ms). Neural correlates of post-error processing were examined using event-related potentials (ERPs). Impulsivity traits were measured with self-report questionnaires (BIS-11, BIS/BAS). Behavioral results demonstrated that the commission error for No-go trials was 15%, but PES did not take place immediately. Delayed PES was negatively correlated with error rates and impulsivity traits, showing that response slowing was associated with reduced error rates and changed with impulsivity. Response-locked error ERPs were clearly observed for the error trials. Contrary to previous studies, error ERPs were not significantly related to PES. Stimulus-locked N2 was negatively correlated with PES and positively correlated with impulsivity traits at the second post-error Go trial: larger N2 activity was associated with greater PES and less impulsivity. In summary, under constant speeded conditions, error monitoring was dissociated from post-error action control, and PES did not occur quickly. Furthermore, PES and its neural correlate (N2) were modulated by impulsivity traits. These findings suggest that there may be clinical and practical efficacy of maintaining cognitive control of actions during error recovery under common daily environments that frequently evoke impulsive behaviors.

  12. Computational error and complexity in science and engineering computational error and complexity

    CERN Document Server

    Lakshmikantham, Vangipuram; Chui, Charles K; Chui, Charles K

    2005-01-01

    The book "Computational Error and Complexity in Science and Engineering” pervades all the science and engineering disciplines where computation occurs. Scientific and engineering computation happens to be the interface between the mathematical model/problem and the real world application. One needs to obtain good quality numerical values for any real-world implementation. Just mathematical quantities symbols are of no use to engineers/technologists. Computational complexity of the numerical method to solve the mathematical model, also computed along with the solution, on the other hand, will tell us how much computation/computational effort has been spent to achieve that quality of result. Anyone who wants the specified physical problem to be solved has every right to know the quality of the solution as well as the resources spent for the solution. The computed error as well as the complexity provide the scientific convincing answer to these questions. Specifically some of the disciplines in which the book w...

  13. Threat and error management for anesthesiologists: a predictive risk taxonomy

    Science.gov (United States)

    Ruskin, Keith J.; Stiegler, Marjorie P.; Park, Kellie; Guffey, Patrick; Kurup, Viji; Chidester, Thomas

    2015-01-01

    Purpose of review Patient care in the operating room is a dynamic interaction that requires cooperation among team members and reliance upon sophisticated technology. Most human factors research in medicine has been focused on analyzing errors and implementing system-wide changes to prevent them from recurring. We describe a set of techniques that has been used successfully by the aviation industry to analyze errors and adverse events and explain how these techniques can be applied to patient care. Recent findings Threat and error management (TEM) describes adverse events in terms of risks or challenges that are present in an operational environment (threats) and the actions of specific personnel that potentiate or exacerbate those threats (errors). TEM is a technique widely used in aviation, and can be adapted for the use in a medical setting to predict high-risk situations and prevent errors in the perioperative period. A threat taxonomy is a novel way of classifying and predicting the hazards that can occur in the operating room. TEM can be used to identify error-producing situations, analyze adverse events, and design training scenarios. Summary TEM offers a multifaceted strategy for identifying hazards, reducing errors, and training physicians. A threat taxonomy may improve analysis of critical events with subsequent development of specific interventions, and may also serve as a framework for training programs in risk mitigation. PMID:24113268

  14. Medication errors in outpatient care in Colombia, 2005-2013.

    Science.gov (United States)

    Machado-Alba, Jorge E; Moncada, Juan Carlos; Moreno-Gutiérrez, Paula Andrea

    2016-06-03

    Medication errors outside the hospital have been poorly studied despite representing an important threat to patient safety. To describe the characteristics of medication errors reported by outpatient dispensing pharmacists to a pharmacosurveillance system between 2005 and 2013 in Colombia, we conducted a descriptive study by reviewing and categorizing medication error reports from outpatient pharmacy services to a national medication dispensing company between January 2005 and September 2013. Variables considered included: process involved (administration, dispensing, prescription and transcription), wrong drug, time delay for the report, error type, cause and severity. The analysis was conducted in the SPSS® software, version 22.0. A total of 14,873 medication errors were reviewed, of which 67.2% in fact occurred, 15.5% reached the patient and 0.7% caused harm. Administration errors (OR=93.61; 95% CI: 48.510-180.655, p<0.001) and dispensing errors (OR=5.64; 95% CI: 3.488-9.142, p<0.001) increased the risk of the error reaching the patient. It is necessary to develop surveillance systems for medication errors in ambulatory care, focusing on the prescription, transcription and dispensation processes. Special strategies are needed for the prevention of medication errors related to anti-infective drugs.

  15. Critical evidence for the prediction error theory in associative learning.

    Science.gov (United States)

    Terao, Kanta; Matsumoto, Yukihisa; Mizunami, Makoto

    2015-03-10

    In associative learning in mammals, it is widely accepted that the discrepancy, or error, between actual and predicted reward determines whether learning occurs. Complete evidence for the prediction error theory, however, has not been obtained in any learning systems: Prediction error theory stems from the finding of a blocking phenomenon, but blocking can also be accounted for by other theories, such as the attentional theory. We demonstrated blocking in classical conditioning in crickets and obtained evidence to reject the attentional theory. To obtain further evidence supporting the prediction error theory and rejecting alternative theories, we constructed a neural model to match the prediction error theory, by modifying our previous model of learning in crickets, and we tested a prediction from the model: the model predicts that pharmacological intervention of octopaminergic transmission during appetitive conditioning impairs learning but not formation of reward prediction itself, and it thus predicts no learning in subsequent training. We observed such an "auto-blocking", which could be accounted for by the prediction error theory but not by other competitive theories to account for blocking. This study unambiguously demonstrates validity of the prediction error theory in associative learning.
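
    A minimal sketch of the standard Rescorla–Wagner delta rule (a textbook model, not the authors' cricket circuit or octopamine manipulation) shows how blocking falls out of prediction-error learning: once cue A alone fully predicts the reward, the compound A+B generates almost no prediction error, so cue B acquires almost no associative strength. The learning rate and trial counts are arbitrary.

        # Sketch: blocking in the Rescorla-Wagner model (learning driven by prediction error).
        alpha, reward = 0.3, 1.0
        V = {"A": 0.0, "B": 0.0}           # associative strengths

        def trial(cues):
            prediction = sum(V[c] for c in cues)
            error = reward - prediction    # prediction error drives all learning
            for c in cues:
                V[c] += alpha * error

        for _ in range(50):                # phase 1: A alone -> reward
            trial(["A"])
        for _ in range(50):                # phase 2: A+B compound -> reward
            trial(["A", "B"])

        print(V)   # V["A"] ends near 1.0 while V["B"] stays near 0.0: B is "blocked"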

  16. Fungi of genus Alternaria occurring on tomato

    Directory of Open Access Journals (Sweden)

    Joanna Marcinkowska

    2013-12-01

    Full Text Available Tomato early blight in central Poland was caused by Alternaria solani (A. porri f. sp. solani) and A. alternata (A. tenuis). A. alternata was isolated more often than A. solani. All isolates of A. solani killed tomato seedlings under controlled conditions, while pathogenic isolates of A. alternata caused only slight seedling blight. In greenhouse tests A. solani proved to be strongly pathogenic for leaves and stems of tomato, but A. alternata was only weakly pathogenic. The latter species attacked only injured fruits, while A. solani could penetrate through the undamaged peel of fruits. Both of these species caused the same type of symptoms; the differences consisted only in the intensity of disease symptoms. During 1974 and 1975 field tomatoes were moderately attacked by early blight. The greatest development of this disease occurred at the turn of August and September. The determinate variety 'New Yorker' was distinguished by more severe infection of the stem parts of tomato, whereas the fruits of the stock variety 'Apollo' were more strongly attacked.

  17. Uranium occurrence in California near Bucaramanga (Colombia)

    International Nuclear Information System (INIS)

    Heider Polania, J.

    1980-01-01

    The mining district of California, near Bucaramanga, lies on the west side of the Cordillera Oriental in the Santander massif region. The oldest rocks of the area form a complex of metamorphites and migmatites of pre-Devonian age. Amphibolite and various types of paragneiss and orthogneiss are represented. Several stages of metamorphism can be documented in some rocks, as well as double anatexis. Triassic to Jurassic quartz diorites and leucogranites show wide distribution. Porphyric rocks of granodioritic to granitic composition, to which the uranium mineralization is mainly bound, intruded into the sediments of the Lower Cretaceous. Atomic absorption spectral analyses were carried out for the elements Cu, Zn and Li, and the uranium contents of some samples were determined using fluorimetry. Uranium is primarily bound to pitchblende and coffinite. The latter mostly occur finely distributed and intergrown with quartz and belong to the most recent mineralization phase. Autunite, meta-autunite, torbernite, meta-torbernite, zeunerite, meta-zeunerite and meta-uranocircite were detected as secondary uranium minerals. (orig./HP) [de]

  18. Bioassay of naturally occurring allelochemicals for phytotoxicity.

    Science.gov (United States)

    Leather, G R; Einhellig, F A

    1988-10-01

    The bioassay has been one of the most widely used tests to demonstrate allelopathic activity. Often, claims that a particular plant species inhibits the growth of another are based entirely on the seed germination response to solvent extracts of the suspected allelopathic plant; few of these tests are of value in demonstrating allelopathy under natural conditions. The veracity of the bioassay for evaluating naturally occurring compounds for phytotoxicity depends upon the physiological and biochemical response capacity of the bioassay organism and the mechanism(s) of action of the allelochemicals. The possibility that more than one allelochemical, acting in concert at very low concentrations, may be responsible for an observed allelopathic effect makes it imperative that bioassays be extremely sensitive to chemical growth perturbation agents. Among the many measures of phytotoxicity of allelochemicals, the inhibition (or stimulation) of seed germination, radicle elongation, and/or seedling growth have been the parameters of choice for most investigations. Few of these assays have been selected with the view towards the possible mechanism of the allelopathic effect.

  19. A general model of cognitive errors applicable to the behaviour of NPP operators

    International Nuclear Information System (INIS)

    Senders, J.W.; Moray, N.P.

    1986-01-01

    Cognitive behaviour is, most generally put, that behaviour which is mental. In the context of Nuclear Power Plant (NPP) operations, cognitive behaviour is that which analyses, judges and transforms information received from the environment; makes use of memory; imposes meaning; predicts, interpolates and extrapolates; decides on and chooses goals, plans and courses of action. Behaviour of these kinds arises in the course of responding to emergencies and it is precisely there that the greatest importance must be attached to errors by operators

  20. THE SELF-CORRECTION OF ENGLISH SPEECH ERRORS IN SECOND LANGUANGE LEARNING

    Directory of Open Access Journals (Sweden)

    Ketut Santi Indriani

    2015-05-01

    Full Text Available The process of second language (L2) learning is strongly influenced by the factors of error reconstruction that occur when the language is learned. Errors will definitely appear in the learning process. However, errors can be used as a step to accelerate the process of understanding the language. Doing self-correction (with or without giving cues) is one example. In the aspect of speaking, self-correction is done immediately after the error appears. This study is aimed at finding (i) what speech errors the L2 speakers are able to identify, (ii) of the errors identified, what speech errors the L2 speakers are able to self-correct, and (iii) whether the self-correction of speech errors is able to immediately improve the L2 learning. Based on the data analysis, it was found that the majority of identified errors are related to nouns (plurality), subject-verb agreement, grammatical structure and pronunciation. L2 speakers tend to correct errors properly. Of the 78% of identified speech errors, as much as 66% could be self-corrected accurately by the L2 speakers. Based on the analysis, it was also found that self-correction is able to improve L2 learning ability directly. This is evidenced by the absence of repetition of the same error after the error had been corrected.

  1. Prediction and error of baldcypress stem volume from stump diameter

    Science.gov (United States)

    Bernard R. Parresol

    1998-01-01

    The need to estimate the volume of removals occurs for many reasons, such as in trespass cases, severance tax reports, and post-harvest assessments. A logarithmic model is presented for prediction of baldcypress total stem cubic foot volume using stump diameter as the independent variable. Because the error of prediction is as important as the volume estimate, the...
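
    A hedged sketch of fitting a logarithmic model of the general form ln(volume) = b₀ + b₁·ln(stump diameter) and reporting an approximate error of prediction; the data values and resulting coefficients below are invented and are not Parresol's published baldcypress equations.

        # Sketch: logarithmic volume-from-stump-diameter model with a rough prediction band.
        # Invented data; the published baldcypress coefficients are not reproduced here.
        import numpy as np

        stump_d = np.array([12., 18., 24., 30., 36., 42.])    # stump diameter, inches
        volume = np.array([6., 18., 40., 75., 120., 185.])    # total stem volume, cubic feet

        X = np.column_stack([np.ones_like(stump_d), np.log(stump_d)])
        y = np.log(volume)
        beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        se = np.sqrt(resid @ resid / (len(y) - 2))            # residual SE in log units

        d_new = 27.0
        ln_pred = beta[0] + beta[1] * np.log(d_new)
        # Rough +/- 1 residual-SE band in log units (ignores coefficient uncertainty)
        print(f"predicted volume at {d_new} in stump: {np.exp(ln_pred):.1f} cu ft "
              f"(approx. band {np.exp(ln_pred - se):.1f}-{np.exp(ln_pred + se):.1f})")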

  2. RAMs: the problem of transient errors due to alpha radiation

    International Nuclear Information System (INIS)

    Goujon, Pierre.

    1980-01-01

    Errors that remained unexplained for a long time have occurred with dynamic random access memories. It has been known since 1978 that they are due to stray alpha radiation. A good understanding of this phenomenon enables its effects to be neutralized and the reliability of the products to be guaranteed. [fr]

  3. Study of Periodic Fabrication Error of Optical Splitter Device Performance

    OpenAIRE

    Ab-Rahman, Mohammad Syuhaimi; Ater, Foze Saleh; Jumari, Kasmiran; Mohammad, Rahmah

    2012-01-01

    In this paper, the effect of fabrication errors (FEs) on the performance of a 1×4 optical power splitter is investigated in detail. The FE, which is assumed to take a regular shape, is considered in each section of the device. Simulation results show that the FE has a significant effect on the output power, especially when it occurs in the coupling regions.

  4. Naturally occurring flavonoids against human norovirus surrogates.

    Science.gov (United States)

    Su, Xiaowei; D'Souza, Doris H

    2013-06-01

    Naturally occurring plant-derived flavonoids are reported to have antibacterial, antiviral, and pharmacological activities. The objectives of this study were to determine the antiviral effects of four flavonoids (myricetin, L-epicatechin, tangeretin, and naringenin) on the infectivity of foodborne norovirus surrogates after 2 h at 37 °C. The lab-culturable surrogates, feline calicivirus (FCV-F9) at titers of ~7 log₁₀ PFU/ml (high titer) or ~5 log₁₀ PFU/ml (low titer) and murine norovirus (MNV-1) at ~5 log₁₀ PFU/ml, were mixed with equal volumes of myricetin, L-epicatechin, tangeretin, or naringenin at concentrations of 0.5 or 1 mM, and incubated for 2 h at 37 °C. Treatments of viruses were neutralized in cell culture medium containing 10 % heat-inactivated fetal bovine serum, serially diluted, and plaque assayed. Each treatment was replicated thrice and assayed in duplicate. FCV-F9 (low titer) was not found to be reduced by tangeretin or naringenin, but was reduced to undetectable levels by myricetin at both concentrations. Low-titer FCV-F9 was also decreased by 1.40 log₁₀ PFU/ml with L-epicatechin at 0.5 mM. FCV-F9 at high titers was decreased by 3.17 and 0.72 log₁₀ PFU/ml with myricetin and L-epicatechin at 0.5 mM, and 1.73 log₁₀ PFU/ml with myricetin at 0.25 mM, respectively. However, MNV-1 showed no significant inactivation by the four tested treatments. The antiviral effects of the tested flavonoids are dependent on the virus type, titer, and dose. Further research will focus on understanding the antiviral mechanism of myricetin and L-epicatechin.

  5. Differential dormancy of co-occurring copepods

    Science.gov (United States)

    Ohman, Mark D.; Drits, Aleksandr V.; Elizabeth Clarke, M.; Plourde, Stéphane

    1998-08-01

    Four species of planktonic calanoid copepods that co-occur in the California Current System ( Eucalanus californicus Johnson, Rhincalanus nasutus Giesbrecht, Calanus pacificus californicus Brodsky, and Metridia pacifica Brodsky) were investigated for evidence of seasonal dormancy in the San Diego Trough. Indices used to differentiate actively growing from dormant animals included developmental stage structure and vertical distribution; activity of aerobic metabolic enzymes (Citrate Synthase and the Electron Transfer System complex); investment in depot lipids (wax esters and triacylglycerols); in situ grazing activity from gut fluorescence; and egg production rates in simulated in situ conditions. None of the 4 species exhibited a canonical calanoid pattern of winter dormancy - i.e., synchronous developmental arrest as copepodid stage V, descent into deep waters, reduced metabolism, and lack of winter reproduction. Instead, Calanus pacificus californicus has a biphasic life history in this region, with an actively reproducing segment of the population in surface waters overlying a deep dormant segment in winter. Eucalanus californicus is dormant as both adult females and copepodid V's, although winter females respond relatively rapidly to elevated food and temperature conditions; they begin feeding and producing eggs within 2-3 days. Rhincalanus nasutus appears to enter dormancy as adult females, although the evidence is equivocal. Metridia pacifica shows no evidence of dormancy, with sustained active feeding, diel vertical migration behavior, and elevated activity of metabolic enzymes in December as well as in June. The four species also differ markedly in water content, classes of storage lipids, and specific activity of Citrate Synthase. These results suggest that copepod dormancy traits and structural composition reflect diverse adaptations to regional environmental conditions rather than a uniform, canonical series of traits that remain invariant among taxa

  6. Medication errors in pediatric inpatients

    DEFF Research Database (Denmark)

    Rishoej, Rikke Mie; Almarsdóttir, Anna Birna; Christesen, Henrik Thybo

    2017-01-01

    The aim was to describe medication errors (MEs) in hospitalized children reported to the national mandatory reporting and learning system, the Danish Patient Safety Database (DPSD). MEs were extracted from DPSD from the 5-year period of 2010–2014. We included reports from public hospitals on pati... ... safety in pediatric inpatients.

  7. Learner Corpora without Error Tagging

    Directory of Open Access Journals (Sweden)

    Rastelli, Stefano

    2009-01-01

    Full Text Available The article explores the possibility of adopting a form-to-function perspective when annotating learner corpora in order to get deeper insights about systematic features of interlanguage. A split between forms and functions (or categories) is desirable in order to avoid the "comparative fallacy" and because – especially in basic varieties – forms may precede functions (e.g., what resembles a "noun" might have a different function, or a function may show up in unexpected forms). In the computer-aided error analysis tradition, all items produced by learners are traced to a grid of error tags which is based on the categories of the target language. By contrast, we believe it is possible to record and make retrievable both words and sequences of characters independently of their functional-grammatical label in the target language. For this purpose, at the University of Pavia we adapted a probabilistic POS tagger designed for L1 to L2 data. Despite the criticism that this operation can raise, we found that it is better to work with "virtual categories" rather than with errors. The article outlines the theoretical background of the project and shows some examples in which some potential of SLA-oriented (non error-based) tagging will hopefully be made clearer.

  8. Theory of Test Translation Error

    Science.gov (United States)

    Solano-Flores, Guillermo; Backhoff, Eduardo; Contreras-Nino, Luis Angel

    2009-01-01

    In this article, we present a theory of test translation whose intent is to provide the conceptual foundation for effective, systematic work in the process of test translation and test translation review. According to the theory, translation error is multidimensional; it is not simply the consequence of defective translation but an inevitable fact…

  9. and Correlated Error-Regressor

    African Journals Online (AJOL)

    Nekky Umera

    in queuing theory and econometrics, where the usual assumption of independent error terms may not be plausible in most cases. Also, when using time-series data on a number of micro-economic units, such as households and service oriented channels, where the stochastic disturbance terms in part reflect variables which ...

  10. Rank error-correcting pairs

    DEFF Research Database (Denmark)

    Martinez Peñas, Umberto; Pellikaan, Ruud

    2017-01-01

    Error-correcting pairs were introduced as a general method of decoding linear codes with respect to the Hamming metric using coordinatewise products of vectors, and are used for many well-known families of codes. In this paper, we define new types of vector products, extending the coordinatewise ...

  11. Clinical errors and medical negligence.

    Science.gov (United States)

    Oyebode, Femi

    2013-01-01

    This paper discusses the definition, nature and origins of clinical errors including their prevention. The relationship between clinical errors and medical negligence is examined as are the characteristics of litigants and events that are the source of litigation. The pattern of malpractice claims in different specialties and settings is examined. Among hospitalized patients worldwide, 3-16% suffer injury as a result of medical intervention, the most common being the adverse effects of drugs. The frequency of adverse drug effects appears superficially to be higher in intensive care units and emergency departments but once rates have been corrected for volume of patients, comorbidity of conditions and number of drugs prescribed, the difference is not significant. It is concluded that probably no more than 1 in 7 adverse events in medicine result in a malpractice claim and the factors that predict that a patient will resort to litigation include a prior poor relationship with the clinician and the feeling that the patient is not being kept informed. Methods for preventing clinical errors are still in their infancy. The most promising include new technologies such as electronic prescribing systems, diagnostic and clinical decision-making aids and error-resistant systems. Copyright © 2013 S. Karger AG, Basel.

  12. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  13. The Errors of Our Ways

    Science.gov (United States)

    Kane, Michael

    2011-01-01

    Errors don't exist in our data, but they serve a vital function. Reality is complicated, but our models need to be simple in order to be manageable. We assume that attributes are invariant over some conditions of observation, and once we do that we need some way of accounting for the variability in observed scores over these conditions of…

  14. Cascade Error Projection Learning Algorithm

    Science.gov (United States)

    Duong, T. A.; Stubberud, A. R.; Daud, T.

    1995-01-01

    A detailed mathematical analysis is presented for a new learning algorithm termed cascade error projection (CEP) and a general learning framework. This framework can be used to obtain the cascade correlation learning algorithm by choosing a particular set of parameters.

  15. Numerical study of the systematic error in Monte Carlo schemes for semiconductors

    Energy Technology Data Exchange (ETDEWEB)

    Muscato, Orazio [Univ. degli Studi di Catania (Italy). Dipt. di Matematica e Informatica; Di Stefano, Vincenza [Univ. degli Studi di Messina (Italy). Dipt. di Matematica; Wagner, Wolfgang [Weierstrass-Institut fuer Angewandte Analysis und Stochastik (WIAS) im Forschungsverbund Berlin e.V. (Germany)

    2008-07-01

    The paper studies the convergence behavior of Monte Carlo schemes for semiconductors. A detailed analysis of the systematic error with respect to numerical parameters is performed. Different sources of systematic error are pointed out and illustrated in a spatially one-dimensional test case. The error with respect to the number of simulation particles occurs during the calculation of the internal electric field. The time step error, which is related to the splitting of transport and electric field calculations, vanishes sufficiently fast. The error due to the approximation of the trajectories of particles depends on the ODE solver used in the algorithm. It is negligible compared to the other sources of time step error, when a second order Runge-Kutta solver is used. The error related to the approximate scattering mechanism is the most significant source of error with respect to the time step. (orig.)

  16. Information Needs While A Disaster Is Occurring

    Science.gov (United States)

    Perry, S. C.

    2010-12-01

    that rainfall intensity at their homes might be less than the intensity up in the mountains where the debris flows would start. Nor did they know that debris flows travel too quickly to be outrun. These and many other examples indicate need for social and natural scientists to increase awareness of what to expect when the disaster strikes. This information must be solidly understood before the event occurs - while a disaster is unfolding there are no teachable moments. Case studies indicate that even those who come into a disaster well educated about the phenomenon can struggle to apply what they know when the real situation is at hand. In addition, psychological studies confirm diminished ability to comprehend information at times of stress.

  17. Error and its meaning in forensic science.

    Science.gov (United States)

    Christensen, Angi M; Crowder, Christian M; Ousley, Stephen D; Houck, Max M

    2014-01-01

    The discussion of "error" has gained momentum in forensic science in the wake of the Daubert guidelines and has intensified with the National Academy of Sciences' Report. Error has many different meanings, and too often, forensic practitioners themselves as well as the courts misunderstand scientific error and statistical error rates, often confusing them with practitioner error (or mistakes). Here, we present an overview of these concepts as they pertain to forensic science applications, discussing the difference between practitioner error (including mistakes), instrument error, statistical error, and method error. We urge forensic practitioners to ensure that potential sources of error and method limitations are understood and clearly communicated and advocate that the legal community be informed regarding the differences between interobserver errors, uncertainty, variation, and mistakes. © 2013 American Academy of Forensic Sciences.

  18. North error estimation based on solar elevation errors in the third step of sky-polarimetric Viking navigation.

    Science.gov (United States)

    Száz, Dénes; Farkas, Alexandra; Barta, András; Kretzer, Balázs; Egri, Ádám; Horváth, Gábor

    2016-07-01

    The theory of sky-polarimetric Viking navigation has been widely accepted for decades without any information about the accuracy of this method. Previously, we have measured the accuracy of the first and second steps of this navigation method in psychophysical laboratory and planetarium experiments. Now, we have tested the accuracy of the third step in a planetarium experiment, assuming that the first and second steps are errorless. Using the fists of their outstretched arms, 10 test persons had to estimate the elevation angles (measured in numbers of fists and fingers) of black dots (representing the position of the occluded Sun) projected onto the planetarium dome. The test persons performed 2400 elevation estimations, 48% of which were more accurate than ±1°. We selected three test persons with the (i) largest and (ii) smallest elevation errors and (iii) highest standard deviation of the elevation error. From the errors of these three persons, we calculated their error function, from which the North errors (the angles with which they deviated from the geographical North) were determined for summer solstice and spring equinox, two specific dates of the Viking sailing period. The range of possible North errors Δω_N was the lowest and highest at low and high solar elevations, respectively. At high elevations, the maximal Δω_N was 35.6° and 73.7° at summer solstice and 23.8° and 43.9° at spring equinox for the best and worst test person (navigator), respectively. Thus, the best navigator was twice as good as the worst one. At solstice and equinox, high elevations occur the most frequently during the day, thus high North errors could occur more frequently than expected before. According to our findings, the ideal periods for sky-polarimetric Viking navigation are immediately after sunrise and before sunset, because the North errors are the lowest at low solar elevations.

  19. A methodology for translating positional error into measures of attribute error, and combining the two error sources

    Science.gov (United States)

    Yohay Carmel; Curtis Flather; Denis Dean

    2006-01-01

    This paper summarizes our efforts to investigate the nature, behavior, and implications of positional error and attribute error in spatiotemporal datasets. Estimating the combined influence of these errors on map analysis has been hindered by the fact that these two error types are traditionally expressed in different units (distance units, and categorical units,...

  20. The Need For ``Pleasure in Finding Things Out:'' The Use of History and Our Greatest Scientists for Human Survival and Scientific Integrity

    Science.gov (United States)

    Borchardt, Joshua

    2011-03-01

    Why Homo sapiens search for interesting things, and the methods by which we do so. The use of philosophical, theoretical, and demonstrated processes for exploration of the natural, and not so natural, world is presented based on the ideas and wishes of some of history's greatest scientists, with concentration on Richard P. Feynman's lens on scientific discovery and pursuit, from which the abstract takes its title. This talk is presented for the layman as well as the physicist, and gives insight into the nature of discovery and what it means to have pleasure in finding things out for the betterment of all mankind.

  1. Analysis of liquid medication dose errors made by patients and caregivers using alternative measuring devices.

    Science.gov (United States)

    Ryu, Gyeong Suk; Lee, Yu Jeung

    2012-01-01

    Patients use several types of devices to measure liquid medication. Using a criterion ranging from a 10% to 40% variation from a target 5 mL for a teaspoon dose, previous studies have found that a considerable proportion of patients or caregivers make errors when dosing liquid medication with measuring devices. To determine the rate and magnitude of liquid medication dose errors that occur with patient/caregiver use of various measuring devices in a community pharmacy. Liquid medication measurements by patients or caregivers were observed in a convenience sample of community pharmacy patrons in Korea during a 2-week period in March 2011. Participants included all patients or caregivers (N = 300) who came to the pharmacy to buy over-the-counter liquid medication or to have a liquid medication prescription filled during the study period. The participants were instructed by an investigator who was also a pharmacist to select their preferred measuring devices from 6 alternatives (etched-calibration dosing cup, printed-calibration dosing cup, dosing spoon, syringe, dispensing bottle, or spoon with a bottle adapter) and measure a 5 mL dose of Coben (chlorpheniramine maleate/phenylephrine HCl, Daewoo Pharm. Co., Ltd) syrup using the device of their choice. The investigator used an ISOLAB graduated cylinder (Germany, blue grad, 10 mL) to measure the amount of syrup dispensed by the study participants. Participant characteristics were recorded including gender, age, education level, and relationship to the person for whom the medication was intended. Of the 300 participants, 257 (85.7%) were female; 286 (95.3%) had at least a high school education; and 282 (94.0%) were caregivers (parent or grandparent) for the patient. The mean (SD) measured dose was 4.949 (0.378) mL for the 300 participants. In analysis of variance of the 6 measuring devices, the greatest difference from the 5 mL target was a mean 5.552 mL for 17 subjects who used the regular (etched) dosing cup and 4

  2. Using snowball sampling method with nurses to understand medication administration errors.

    Science.gov (United States)

    Sheu, Shuh-Jen; Wei, Ien-Lan; Chen, Ching-Huey; Yu, Shu; Tang, Fu-In

    2009-02-01

    We aimed to encourage nurses to release information about drug administration errors to increase understanding of error-related circumstances and to identify high-alert situations. Drug administration errors represent the majority of medication errors, but errors are underreported. Effective ways are lacking to encourage nurses to actively report errors. Snowball sampling was conducted to recruit participants. A semi-structured questionnaire was used to record types of error, hospital and nurse backgrounds, patient consequences, error discovery mechanisms and reporting rates. Eighty-five nurses participated, reporting 328 administration errors (259 actual, 69 near misses). Most errors occurred in medical surgical wards of teaching hospitals, during day shifts, committed by nurses working fewer than two years. Leading errors were wrong drugs and doses, each accounting for about one-third of total errors. Among 259 actual errors, 83.8% resulted in no adverse effects; among remaining 16.2%, 6.6% had mild consequences and 9.6% had serious consequences (severe reaction, coma, death). Actual errors and near misses were discovered mainly through double-check procedures by colleagues and nurses responsible for errors; reporting rates were 62.5% (162/259) vs. 50.7% (35/69) and only 3.5% (9/259) vs. 0% (0/69) were disclosed to patients and families. High-alert situations included administration of 15% KCl, insulin and Pitocin; using intravenous pumps; and implementation of cardiopulmonary resuscitation (CPR). Snowball sampling proved to be an effective way to encourage nurses to release details concerning medication errors. Using empirical data, we identified high-alert situations. Strategies for reducing drug administration errors by nurses are suggested. Survey results suggest that nurses should double check medication administration in known high-alert situations. Nursing management can use snowball sampling to gather error details from nurses in a non

  3. Pathways to extinction: beyond the error threshold.

    Science.gov (United States)

    Manrubia, Susanna C; Domingo, Esteban; Lázaro, Ester

    2010-06-27

    Since the introduction of the quasispecies and the error catastrophe concepts for molecular evolution by Eigen and their subsequent application to viral populations, increased mutagenesis has become a common strategy to cause the extinction of viral infectivity. Nevertheless, the high complexity of virus populations has shown that viral extinction can occur through several other pathways apart from crossing an error threshold. Increases in the mutation rate enhance the appearance of defective forms and promote the selection of mechanisms that are able to counteract the accelerated appearance of mutations. Current models of viral evolution take into account more realistic scenarios that consider compensatory and lethal mutations, a highly redundant genotype-to-phenotype map, rough fitness landscapes relating phenotype and fitness, and where phenotype is described as a set of interdependent traits. Further, viral populations cannot be understood without specifying the characteristics of the environment where they evolve and adapt. Altogether, it turns out that the pathways through which viral quasispecies go extinct are multiple and diverse.

  4. Errors and complications in laparoscopic surgery

    Directory of Open Access Journals (Sweden)

    Liviu Drăghici

    2017-05-01

    Full Text Available Background. In laparoscopic surgery errors are unavoidable and require proper acknowledgment to reduce the risk of intraoperative complications and to accurately assess the appropriate therapeutic approach. Fortunately, their frequency is low and cannot overshadow the benefits of laparoscopic surgery. Materials and Methods. We carried out an epidemiological investigation in the General Surgery Department of the Emergency Clinical Hospital "St. John" Bucharest, analyzing 20 years of experience in laparoscopic surgery during 1994-2014. We wanted to identify evolving trends in complications of laparoscopic surgery by analyzing the dynamics of errors occurring in all patients undergoing laparoscopic procedures. Results. We recorded 26847 laparoscopic interventions with a total of 427 intra- or postoperative complications that required 160 conversions and 267 reinterventions to resolve them. The average frequency of complications was 15.9‰ (15.9 per 1,000 cases). The period under review saw strong growth of laparoscopic procedures in our department: the number of minimally invasive interventions increased almost 10-fold, from 266 cases operated laparoscopically in 1995 to 2638 cases in 2008. The annual growth in the number of laparoscopic procedures outpaced the growth in complications. Conclusions. Extensive experience in laparoscopic surgery and a specialized centre with a well-trained team of surgeons provide the premises for good performance, even in the assimilation of new and difficult procedures.

  5. Voluntary Medication Error Reporting by ED Nurses: Examining the Association With Work Environment and Social Capital.

    Science.gov (United States)

    Farag, Amany; Blegen, Mary; Gedney-Lose, Amalia; Lose, Daniel; Perkhounkova, Yelena

    2017-05-01

    Medication errors are one of the most frequently occurring errors in health care settings. The complexity of the ED work environment places patients at risk for medication errors. Most hospitals rely on nurses' voluntary medication error reporting, but these errors are under-reported. The purpose of this study was to examine the relationship among work environment (nurse manager leadership style and safety climate), social capital (warmth and belonging relationships and organizational trust), and nurses' willingness to report medication errors. A cross-sectional descriptive design using a questionnaire with a convenience sample of emergency nurses was used. Data were analyzed using descriptive, correlation, Mann-Whitney U, and Kruskal-Wallis statistics. A total of 71 emergency nurses were included in the study. Emergency nurses' willingness to report errors decreased as the nurses' years of experience increased (r = -0.25, P = .03). Their willingness to report errors increased when they received more feedback about errors (r = 0.25, P = .03) and when their managers used a transactional leadership style (r = 0.28, P = .01). ED nurse managers can modify their leadership style to encourage error reporting. Timely feedback after an error report is particularly important. Engaging experienced nurses to understand error root causes could increase voluntary error reporting. Published by Elsevier Inc.

  6. Sleep Disturbances in Adults With Arthritis: Prevalence, Mediators, and Subgroups at Greatest Risk. Data From the 2007 National Health Interview Survey

    Science.gov (United States)

    LOUIE, GRANT H.; TEKTONIDOU, MARIA G.; CABAN-MARTINEZ, ALBERTO J.; WARD, MICHAEL M.

    2012-01-01

    Objective To examine the prevalence of sleep disturbances in adults with arthritis in a nationally representative sample, mediators of sleep difficulties, and subgroups of individuals with arthritis at greatest risk. Methods Using data on US adults ages ≥18 years participating in the 2007 National Health Interview Survey, we computed the prevalence of 3 measures of sleep disturbance (insomnia, excessive daytime sleepiness, and short sleep duration) in adults with and without arthritis. We used logistic regression analysis to examine if the association of arthritis and sleep disturbances was independent of sociodemographic characteristics and comorbidities, and to identify potential mediators. We used classification trees to identify subgroups at higher risk. Results The adjusted prevalence of insomnia was higher among adults with arthritis than those without arthritis (23.1% versus 16.4%). Adults with arthritis were more likely than those without arthritis to report insomnia (unadjusted odds ratio 2.92, 95% confidence interval 2.68–3.17), but adjustment for sociodemographic characteristics and comorbidities attenuated this association. Joint pain and limitation due to pain mediated the association between arthritis and insomnia. Among adults with arthritis, those with depression and anxiety were at highest risk for sleep disturbance. Results were similar for excessive daytime sleepiness and short sleep duration. Insomnia is more prevalent among adults with arthritis and is mediated by joint pain and limitation due to pain. Among individuals with arthritis, those with depression and anxiety are at greatest risk. PMID:20890980

  7. Will Climate Change, Genetic and Demographic Variation or Rat Predation Pose the Greatest Risk for Persistence of an Altitudinally Distributed Island Endemic?

    Directory of Open Access Journals (Sweden)

    Alison Shapcott

    2012-11-01

    Full Text Available Species endemic to mountains on oceanic islands are subject to a number of existing threats (in particular, invasive species) along with the impacts of a rapidly changing climate. The Lord Howe Island endemic palm Hedyscepe canterburyana is restricted to two mountains above 300 m altitude. Predation by the introduced Black Rat (Rattus rattus) is known to significantly reduce seedling recruitment. We examined the variation in Hedyscepe in terms of genetic variation, morphology, reproductive output and demographic structure, across an altitudinal gradient. We used demographic data to model population persistence under climate change predictions of upward range contraction, incorporating long-term climatic records for Lord Howe Island. We also incorporated alternative levels of rat predation into the model to reflect management options for control. We found that Lord Howe Island is getting warmer and drier and quantified the degree of temperature change with altitude (0.9 °C per 100 m). For H. canterburyana, differences in development rates, population structure, reproductive output and population growth rate were identified between altitudes. In contrast, genetic variation was high and did not vary with altitude. There is no evidence of an upward range contraction as was predicted, and recruitment was greatest at lower altitudes. Our models predicted slow population decline in the species and that the highest altitude populations are under the greatest threat of extinction. Removal of rat predation would significantly enhance future persistence of this species.

  8. Discretization vs. Rounding Error in Euler's Method

    Science.gov (United States)

    Borges, Carlos F.

    2011-01-01

    Euler's method for solving initial value problems is an excellent vehicle for observing the relationship between discretization error and rounding error in numerical computation. Reductions in stepsize, in order to decrease discretization error, necessarily increase the number of steps and so introduce additional rounding error. The problem is…
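
    The trade-off described in this record can be reproduced numerically. The sketch below is only an illustration, not code from the paper: it assumes the scalar test problem y' = -y with y(0) = 1 and carries out the Euler steps in single precision so that rounding error becomes visible. As the step size h shrinks, the global error at t = 1 typically falls at first (discretization error dominates) and then levels off or grows again as the number of steps, and hence the accumulated rounding error, increases.

        import numpy as np

        def euler_error(h, t_end=1.0):
            """Global error of Euler's method for y' = -y, y(0) = 1, at t = t_end.

            Arithmetic is done in single precision so that rounding error
            becomes visible at very small step sizes.
            """
            n = int(round(t_end / h))
            y = np.float32(1.0)
            hf = np.float32(h)
            for _ in range(n):
                y = y + hf * (-y)          # one Euler step in float32
            exact = np.exp(-t_end)          # reference value in double precision
            return abs(float(y) - exact)

        if __name__ == "__main__":
            for h in [1e-1, 1e-2, 1e-3, 1e-4, 1e-5, 1e-6]:
                print(f"h = {h:8.1e}   error = {euler_error(h):.3e}")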

  9. Total Survey Error for Longitudinal Surveys

    NARCIS (Netherlands)

    Lynn, Peter; Lugtig, P.J.

    2016-01-01

    This article describes the application of the total survey error paradigm to longitudinal surveys. Several aspects of survey error, and of the interactions between different types of error, are distinct in the longitudinal survey context. Furthermore, error trade-off decisions in survey design and

  10. The influence of selected socio-demographic variables on symptoms occurring during the menopause

    Directory of Open Access Journals (Sweden)

    Marta Makara-Studzińska

    2015-02-01

    Full Text Available Introduction: It is considered that lifestyle, conditioned by socio-demographic or socio-economic factors, determines people's health status to the greatest extent. The aim of this study is to evaluate the influence of selected socio-demographic factors on the kinds of symptoms occurring during menopause. Material and methods: The study group consisted of 210 women aged 45 to 65, not using hormone replacement therapy, staying at healthcare centers for rehabilitation treatment. The study was carried out in 2013-2014 in the Silesian, Podlaskie and Lesser Poland voivodeships. The set of tools consisted of the authors' own survey questionnaire and the Menopause Rating Scale (MRS). Results: The most commonly occurring symptom in the group of studied women was depressive mood, from the group of psychological symptoms, followed by physical and mental fatigue and discomfort connected with muscle and joint pain. The greatest intensity of symptoms was observed in the group of women with the lowest level of education, those reporting an average or bad material situation, and unemployed women. Conclusions: An alarmingly high number of reported psychological symptoms was observed in the group of menopausal women, in particular among those of low socio-economic status. Career seems to be a factor reducing the risk of occurrence of psychological symptoms. There is an urgent need for health promotion and prophylaxis in the group of menopausal women, and in many cases for implementation of specialist psychological assistance.

  11. Negligence, genuine error, and litigation

    Science.gov (United States)

    Sohn, David H

    2013-01-01

    Not all medical injuries are the result of negligence. In fact, most medical injuries are the result either of the inherent risk in the practice of medicine, or due to system errors, which cannot be prevented simply through fear of disciplinary action. This paper will discuss the differences between adverse events, negligence, and system errors; the current medical malpractice tort system in the United States; and review current and future solutions, including medical malpractice reform, alternative dispute resolution, health courts, and no-fault compensation systems. The current political environment favors investigation of non-cap tort reform remedies; investment into more rational oversight systems, such as health courts or no-fault systems may reap both quantitative and qualitative benefits for a less costly and safer health system. PMID:23426783

  12. Robot learning and error correction

    Science.gov (United States)

    Friedman, L.

    1977-01-01

    A model of robot learning is described that associates previously unknown perceptions with the sensed known consequences of robot actions. For these actions, both the categories of outcomes and the corresponding sensory patterns are incorporated in a knowledge base by the system designer. Thus the robot is able to predict the outcome of an action and compare the expectation with the experience. New knowledge about what to expect in the world may then be incorporated by the robot in a pre-existing structure whether it detects accordance or discrepancy between a predicted consequence and experience. Errors committed during plan execution are detected by the same type of comparison process and learning may be applied to avoiding the errors.

  13. Error studies of Halbach Magnets

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, S. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2017-03-02

    These error studies were done on the Halbach magnets for the CBETA “First Girder” as described in note [CBETA001]. The CBETA magnets have since changed slightly to the lattice in [CBETA009]; however, this is not a large enough change to significantly affect the results here. The QF and BD arc FFAG magnets are considered. For each assumed set of error distributions and each ideal magnet, 100 random magnets with errors are generated. These are then run through an automated version of the iron wire multipole cancellation algorithm. The maximum wire diameter allowed is 0.063” as in the proof-of-principle magnets. Initially, 32 wires (2 per Halbach wedge) are tried; if this does not achieve 1e-4 level accuracy in the simulation, 48 and then 64 wires are tried. By “1e-4 accuracy” it is meant that the FOM defined by √(Σ_{n≥sextupole} (a_n² + b_n²)) is less than 1 unit, where the multipoles are taken at the maximum nominal beam radius, R = 23 mm for these magnets. The algorithm initially uses 20 convergence iterations. If 64 wires does not achieve 1e-4 accuracy, this is increased to 50 iterations to check for slowly converging cases. There are also classifications for magnets that do not achieve 1e-4 but do achieve 1e-3 (FOM ≤ 10 units). This is technically within the spec discussed in the Jan 30, 2017 review; however, there will be errors in practical shimming not dealt with in the simulation, so it is preferable to do much better than the spec in the simulation.
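
    The figure of merit quoted above is straightforward to compute once the error multipoles of a randomised magnet are known. The sketch below is a hedged illustration only: the function name, the dictionary interface and the example coefficients are invented for this note, and the normalisation to “units” (1e-4 of the main field at the reference radius R = 23 mm) is assumed to have been applied already.

        import math

        def halbach_fom(bn, an, first_index=3):
            """Figure of merit sqrt(sum over n >= first_index of (a_n^2 + b_n^2)).

            bn, an: dicts mapping multipole order n (1 = dipole, 2 = quadrupole,
            3 = sextupole, ...) to normal/skew coefficients, assumed to be
            expressed in 'units' at the reference radius. Only orders from the
            sextupole upward contribute by default.
            """
            total = 0.0
            for n in set(bn) | set(an):
                if n >= first_index:
                    total += bn.get(n, 0.0) ** 2 + an.get(n, 0.0) ** 2
            return math.sqrt(total)

        # Hypothetical error multipoles of one randomised magnet, in units.
        bn = {3: 0.4, 4: -0.2, 5: 0.05}
        an = {3: 0.1, 4: 0.3}
        fom = halbach_fom(bn, an)
        print(f"FOM = {fom:.2f} units:", "1e-4 level reached" if fom <= 1.0 else "more wires needed")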

  14. [Errors in laboratory daily practice].

    Science.gov (United States)

    Larrose, C; Le Carrer, D

    2007-01-01

    Legislation set by GBEA (Guide de bonne exécution des analyses) requires that, before performing analyses, laboratory directors check both the nature of the samples and the patients' identity. The data processing of requisition forms, which identifies key errors, was established in 2000 and in 2002 by the specialized biochemistry laboratory, also with the contribution of the reception centre for biological samples. The laboratories follow strict acceptability criteria as a starting point for the reception, and then check requisition forms and biological samples. All errors are logged into the laboratory database, and analysis reports are sent to the care unit specifying the problems and the consequences they have on the analysis. The data are then assessed by the laboratory directors to produce monthly or annual statistical reports. These indicate the number of errors, which are then indexed to patient files to reveal the specific problem areas, therefore allowing the laboratory directors to teach the nurses and enable corrective action.

  15. Technical errors in MR arthrography

    International Nuclear Information System (INIS)

    Hodler, Juerg

    2008-01-01

    This article discusses potential technical problems of MR arthrography. It starts with contraindications, followed by problems relating to injection technique, contrast material and MR imaging technique. For some of the aspects discussed, there is only little published evidence. Therefore, the article is based on the personal experience of the author and on local standards of procedures. Such standards, as well as medico-legal considerations, may vary from country to country. Contraindications for MR arthrography include pre-existing infection, reflex sympathetic dystrophy and possibly bleeding disorders, avascular necrosis and known allergy to contrast media. Errors in injection technique may lead to extra-articular collection of contrast agent or to contrast agent leaking from the joint space, which may cause diagnostic difficulties. Incorrect concentrations of contrast material influence image quality and may also lead to non-diagnostic examinations. Errors relating to MR imaging include delays between injection and imaging and inadequate choice of sequences. Potential solutions to the various possible errors are presented. (orig.)

  16. Technical errors in MR arthrography

    Energy Technology Data Exchange (ETDEWEB)

    Hodler, Juerg [Orthopaedic University Hospital of Balgrist, Radiology, Zurich (Switzerland)

    2008-01-15

    This article discusses potential technical problems of MR arthrography. It starts with contraindications, followed by problems relating to injection technique, contrast material and MR imaging technique. For some of the aspects discussed, there is only little published evidence. Therefore, the article is based on the personal experience of the author and on local standards of procedures. Such standards, as well as medico-legal considerations, may vary from country to country. Contraindications for MR arthrography include pre-existing infection, reflex sympathetic dystrophy and possibly bleeding disorders, avascular necrosis and known allergy to contrast media. Errors in injection technique may lead to extra-articular collection of contrast agent or to contrast agent leaking from the joint space, which may cause diagnostic difficulties. Incorrect concentrations of contrast material influence image quality and may also lead to non-diagnostic examinations. Errors relating to MR imaging include delays between injection and imaging and inadequate choice of sequences. Potential solutions to the various possible errors are presented. (orig.)

  17. Uranium occurrence in nature: Geophysical prospecting, and its occurrence in Syria

    International Nuclear Information System (INIS)

    Al-Haj Rasheed, Zaki

    1985-01-01

    A general idea is given of naturally occurring uranium minerals such as uraninite, pitchblende, carnotite, coffinite, and brannerite. At the same time, different geophysical methods and detection devices applied in uranium exploration are demonstrated. Investigations and studies carried out in Syria point to a uranium content of 100 ppm in the exploited Syrian phosphorite. 1 fig., 1 tab

  18. Frequency of medical errors in hospitalized children in Khorramabad Madani hospital during six months in 2008

    Directory of Open Access Journals (Sweden)

    azam Mohsenzadeh

    2010-02-01

    Full Text Available Many hospitalized children suffer from medical errors that may cause serious injuries. The aim of this study was to evaluate medical errors in hospitalized children in Khorramabad Madani hospital in the first half of 2008. Materials and Methods: This was a cross-sectional study of all medical errors in hospitalized children in Khorramabad Madani hospital from 21/3/2008 to 21/9/2008. The sampling method was census. Studied variables included age, sex, weight, kinds of errors, education of parents, and job of parents. Data were collected by questionnaire and analyzed with SPSS software. Results: In this study, out of 2250 records, 151 (6.3%) had medical errors. 53% were girls and 47% were boys, and there was a significant relation between sex and medical errors. 46.4% were related to the age group younger than 2 years. Most of the errors occurred in the 6 kg weight group. Types of medical errors included drug ordering 46.3% (incorrect drug dosage 37%, frequency 28%, route 19%, and others 16%), transcribing 10%, administering 32.4%, and dispensing 11.3%. Most errors were related to liquid therapy (76.2%) and the intravenous route (85.4%). Most errors occurred during the night (47%) and during weekends (56.6%). Conclusion: Medical errors are common in hospitalized patients, and in our study the rate of medical errors was 6.3%. Further efforts are therefore needed to reduce them.

  19. Persistent and late occurring lesions in irradiated feet of rats: their clinical relevance

    International Nuclear Information System (INIS)

    Hopewell, J.W.

    1982-01-01

    Radiation-induced deformity, as characterized by tissue loss, has been investigated in rat feet. The acute epithelial response and the loss of deeper tissues occur concomitantly after irradiation. The greatest loss of tissue (severe deformity) was produced in feet where the healing of the epithelial reaction was greatly delayed. While deformity will clearly continue to ''persist'' after the acute reaction has healed it is misleading to refer to this lesion as ''late'' damage. A late-occurring lesion, not previously described in the literature, can be produced in the rat foot by high doses of radiation delivered in such a way that moist desquamation is avoided, i.e. by extending the total treatment time. Parallels are drawn between reactions in rodents and those in the skin of pig and man. (author)

  20. Clock error models for simulation and estimation

    International Nuclear Information System (INIS)

    Meditch, J.S.

    1981-10-01

    Mathematical models for the simulation and estimation of errors in precision oscillators used as time references in satellite navigation systems are developed. The results, based on all currently known oscillator error sources, are directly implementable on a digital computer. The simulation formulation is sufficiently flexible to allow for the inclusion or exclusion of individual error sources as desired. The estimation algorithms, following from Kalman filter theory, provide directly for the error analysis of clock errors in both filtering and prediction
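
    The record does not reproduce the oscillator model itself, but the flavour of such simulations can be conveyed with the standard two-state clock model (phase offset plus fractional frequency offset, each driven by white noise). Everything below is an assumption made for illustration: the state choice, the noise intensities and the sample interval are generic values, not those of the report; a Kalman filter built on the same state-transition matrix would provide the corresponding estimation step.

        import numpy as np

        def simulate_clock(n_steps, dt, q_phase=1e-22, q_freq=1e-26, rng=None):
            """Simulate a generic two-state clock error model (assumed, illustrative).

            State: [phase error (s), fractional frequency error]; both states
            receive independent white noise with (illustrative) intensities
            q_phase and q_freq per unit time.
            """
            rng = np.random.default_rng() if rng is None else rng
            F = np.array([[1.0, dt],
                          [0.0, 1.0]])          # deterministic propagation over dt
            x = np.zeros(2)
            history = np.empty((n_steps, 2))
            for k in range(n_steps):
                w = np.array([rng.normal(0.0, np.sqrt(q_phase * dt)),
                              rng.normal(0.0, np.sqrt(q_freq * dt))])
                x = F @ x + w                    # propagate and add process noise
                history[k] = x
            return history

        if __name__ == "__main__":
            hist = simulate_clock(n_steps=86400, dt=1.0)   # one simulated day
            print("final phase error (s):", hist[-1, 0])
            print("final fractional frequency error:", hist[-1, 1])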

  1. Decodoku: Quantum error correction as a simple puzzle game

    Science.gov (United States)

    Wootton, James

    To build quantum computers, we need to detect and manage any noise that occurs. This will be done using quantum error correction (QEC). At the hardware level, QEC is a multipartite system that stores information non-locally. Certain measurements are made which do not disturb the stored information, but which do allow signatures of errors to be detected. Then there is a software problem: how to take these measurement outcomes and determine (a) the errors that caused them, and (b) how to remove their effects. For qubit error correction, the algorithms required to do this are well known. For qudits, however, current methods are far from optimal. We consider the error correction problem of qudit surface codes. At the most basic level, this is a problem that can be expressed in terms of a grid of numbers. Using this fact, we take the inherent problem at the heart of quantum error correction, remove it from its quantum context, and present it in terms of simple grid-based puzzle games. We have developed three versions of these puzzle games, focussing on different aspects of the required algorithms. These have been presented as iOS and Android apps, allowing the public to try their hand at developing good algorithms to solve the puzzles. For more information, see www.decodoku.com. Funding from the NCCR QSIT.
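
    The "grid of numbers" framing can be illustrated with a code far simpler than the surface codes discussed in the record. The sketch below is a deliberately minimal analogue, not the Decodoku puzzle or its decoder: a classical bit-flip repetition code in which parity measurements between neighbouring bits (the syndrome) mark the end points of error chains without revealing the stored bit, and a minimum-weight rule recovers the logical value whenever fewer than half the bits are flipped.

        import random

        def measure_syndrome(bits):
            """Parities of neighbouring bits; a 1 marks the boundary of an error chain."""
            return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

        def decode(syndrome, n):
            """Minimum-weight correction consistent with the syndrome.

            The two consistent error patterns are complements of each other;
            choosing the lighter one succeeds whenever fewer than n/2 bits flipped.
            """
            cand = [0] * n
            for i, s in enumerate(syndrome):
                cand[i + 1] = cand[i] ^ s
            if sum(cand) > n // 2:
                cand = [1 - c for c in cand]
            return cand

        if __name__ == "__main__":
            n, p = 15, 0.1                     # code length, bit-flip probability
            logical = 1
            noisy = [logical ^ (1 if random.random() < p else 0) for _ in range(n)]
            correction = decode(measure_syndrome(noisy), n)
            recovered = [b ^ c for b, c in zip(noisy, correction)]
            print("syndrome:", measure_syndrome(noisy))
            print("recovered logical bit:", recovered[0],
                  "(success)" if recovered[0] == logical else "(logical error)")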

  2. Potential Errors and Test Assessment in Software Product Line Engineering

    Directory of Open Access Journals (Sweden)

    Hartmut Lackner

    2015-04-01

    Full Text Available Software product lines (SPL) are a method for the development of variant-rich software systems. Compared to non-variable systems, testing SPLs is extensive due to the increasingly large number of possible products. Different approaches exist for testing SPLs, but there is little research on assessing the quality of these tests by means of error detection capability. Such test assessment is based on error injection into correct versions of the system under test. However, to our knowledge, potential errors in SPL engineering have never been systematically identified before. This article presents an overview of existing paradigms for specifying software product lines and the errors that can occur during the respective specification processes. For assessment of test quality, we apply mutation testing techniques to SPL engineering and implement the identified errors as mutation operators. This allows us to run existing tests against defective products for the purpose of test assessment. From the results, we draw conclusions about the error-proneness of the surveyed SPL design paradigms and how the quality of SPL tests can be improved.

  3. Double checking medicines: defence against error or contributory factor?

    Science.gov (United States)

    Armitage, Gerry

    2008-08-01

    The double checking of medicines in health care is a contestable procedure. It occupies an obvious position in health care practice and is understood to be an effective defence against medication error but the process is variable and the outcomes have not been exposed to testing. This paper presents an appraisal of the process using data from part of a larger study on the contributory factors in medication errors and their reporting. Previous research studies are reviewed; data are analysed from a review of 991 drug error reports and a subsequent series of 40 in-depth interviews with health professionals in an acute hospital in northern England. The incident reports showed that errors occurred despite double checking but that action taken did not appear to investigate the checking process. Most interview participants (34) talked extensively about double checking but believed the process to be inconsistent. Four key categories were apparent: deference to authority, reduction of responsibility, automatic processing and lack of time. Solutions to the problems were also offered, which are discussed with several recommendations. Double checking medicines should be a selective and systematic procedure informed by key principles and encompassing certain behaviours. Psychological research may be instructive in reducing checking errors but the aviation industry may also have a part to play in increasing error wisdom and reducing risk.

  4. [Monitoring medication errors in an internal medicine service].

    Science.gov (United States)

    Smith, Ann-Loren M; Ruiz, Inés A; Jirón, Marcela A

    2014-01-01

    Patients admitted to internal medicine services receive multiple drugs and thus are at risk of medication errors. To determine the frequency of medication errors (ME) among patients admitted to an internal medicine service of a high complexity hospital. A prospective observational study conducted in 225 patients admitted to an internal medicine service. Each stage of the drug utilization system (prescription, transcription, dispensing, preparation and administration) was directly observed during three months by trained pharmacists not related to hospital staff. ME were described and categorized according to the National Coordinating Council for Medication Error Reporting and Prevention. In each stage of medication use, the frequency of ME and their characteristics were determined. A total of 454 drugs were prescribed to the studied patients. In 138 (30.4%) indications, at least one ME occurred, involving 67 (29.8%) patients. Twenty-four percent of detected ME occurred during administration, mainly due to wrong time schedules. Anticoagulants were the therapeutic group with the highest occurrence of ME. At least one ME occurred in approximately one third of the patients studied, especially during the administration stage. These errors could compromise medication safety and prevent therapeutic goals from being achieved. Strategies to improve the quality and safe use of medications can be implemented using this information.

  5. Prevalence and cost of hospital medical errors in the general and elderly United States populations.

    Science.gov (United States)

    Mallow, Peter J; Pandya, Bhavik; Horblyuk, Ruslan; Kaplan, Harold S

    2013-12-01

    The primary objective of this study was to quantify the differences in the prevalence rate and costs of hospital medical errors between the general population and an elderly population aged ≥65 years. Methods from an actuarial study of medical errors were modified to identify medical errors in the Premier Hospital Database using data from 2009. Visits with more than four medical errors were removed from the population to avoid over-estimation of cost. Prevalence rates were calculated based on the total number of inpatient visits. There were 3,466,596 total inpatient visits in 2009. Of these, 1,230,836 (36%) occurred in people aged ≥ 65. The prevalence rate was 49 medical errors per 1000 inpatient visits in the general cohort and 79 medical errors per 1000 inpatient visits for the elderly cohort. The top 10 medical errors accounted for more than 80% of the total in the general cohort and the 65+ cohort. The most costly medical error for the general population was postoperative infection ($569,287,000). Pressure ulcers were most costly ($347,166,257) in the elderly population. This study was conducted with a hospital administrative database, and assumptions were necessary to identify medical errors in the database. Further, there was no method to identify errors of omission or misdiagnoses within the database. This study indicates that prevalence of hospital medical errors for the elderly is greater than the general population and the associated cost of medical errors in the elderly population is quite substantial. Hospitals which further focus their attention on medical errors in the elderly population may see a significant reduction in costs due to medical errors as a disproportionate percentage of medical errors occur in this age group.

  6. ERROR DETECTION BY ANTICIPATION FOR VISION-BASED CONTROL

    Directory of Open Access Journals (Sweden)

    A ZAATRI

    2001-06-01

    Full Text Available A vision-based control system has been developed.  It enables a human operator to remotely direct a robot, equipped with a camera, towards targets in 3D space by simply pointing on their images with a pointing device. This paper presents an anticipatory system, which has been designed for improving the safety and the effectiveness of the vision-based commands. It simulates these commands in a virtual environment. It attempts to detect hard contacts that may occur between the robot and its environment, which can be caused by machine errors or operator errors as well.

  7. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections.

    Science.gov (United States)

    Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D

    2018-01-01

    Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Standard error-checking techniques may not
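
    The comparison step used in this study, checking parallel implementations of the same model against each other, is easy to automate outside a spreadsheet. The sketch below is a hypothetical illustration: the output names and values are invented, and only the ±5% materiality threshold is taken from the record.

        def compare_versions(reference, candidate, threshold=0.05):
            """Flag outputs of a candidate model version that diverge from a
            reference version by more than the materiality threshold (default 5%).

            reference, candidate: dicts mapping output names to numeric projections.
            Returns {output name: relative difference} for material divergences.
            """
            material = {}
            for name, ref_val in reference.items():
                cand_val = candidate.get(name)
                if cand_val is None:
                    material[name] = float("nan")      # output missing entirely
                    continue
                rel = (cand_val - ref_val) / ref_val if ref_val != 0 else float("inf")
                if abs(rel) > threshold:
                    material[name] = rel
            return material

        # Hypothetical projections from two parallel implementations of one model.
        named_matrices  = {"in_care": 1200.0, "on_treatment": 950.0, "suppressed": 710.0}
        column_row_refs = {"in_care": 1310.0, "on_treatment": 948.0, "suppressed": 905.0}

        for name, rel in compare_versions(named_matrices, column_row_refs).items():
            print(f"{name}: {rel:+.1%} difference, investigate for unintentional errors")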

  8. Symmetric and Asymmetric Patterns of Attraction Errors in Producing Subject-Predicate Agreement in Hebrew: An Issue of Morphological Structure

    Science.gov (United States)

    Deutsch, Avital; Dank, Maya

    2011-01-01

    A common characteristic of subject-predicate agreement errors (usually termed attraction errors) in complex noun phrases is an asymmetrical pattern of error distribution, depending on the inflectional state of the nouns comprising the complex noun phrase. That is, attraction is most likely to occur when the head noun is the morphologically…

  9. Post-error expression of speed and force while performing a simple, monotonous task with a haptic pen

    NARCIS (Netherlands)

    Bruns, M.; Keyson, D.V.; Jabon, M.E.; Hummels, C.C.M.; Hekkert, P.P.M.; Bailenson, J.N.

    2013-01-01

    Control errors often occur in repetitive and monotonous tasks, such as manual assembly tasks. Much research has been done in the area of human error identification; however, most existing systems focus solely on the prediction of errors, not on increasing worker accuracy. The current study examines

  10. Heuristic thinking: interdisciplinary perspectives on medical error

    Directory of Open Access Journals (Sweden)

    Annegret F. Hannawa

    2013-12-01

    Full Text Available Approximately 43 million adverse events occur across the globe each year at a cost of at least 23 million disability-adjusted life years and $132 billion in excess health care spending, ranking this safety burden among the top 10 medical causes of disability in the world. These findings are likely to be an understatement of the actual severity of the problem, given that the numbers merely reflect seven types of adverse events and completely neglect ambulatory care, and of course they only cover reported incidents. Furthermore, they do not include statistics on children and incidents from India and China, which host more than a third of the world's population. Best estimates imply that about two thirds of these incidents are preventable. Thus, from a public health perspective, medical errors are a serious global health burden, in fact ahead of high-profile health problems like AIDS and cancer. Interventions to date have not reduced medical errors to satisfactory rates. Even today, far too often, hand hygiene is not practiced properly (even in developed countries), surgical procedures take place in underequipped operating theaters, and checklists are missing or remain uncompleted. The healthcare system seems to be failing in managing its errors: it is costing too much, and the complexity of care causes severe safety hazards that too often harm rather than help patients. In response to this evolving discussion, the International Society for Quality in Healthcare recently nominated an Innovations Team that is now developing new strategies. One of the emerging themes is that the medical field cannot resolve this problem on its own. Instead, interdisciplinary collaborations are needed to advance effective, evidence-based interventions that will eventually result in competent changes. In March 2013, the Institute of Communication and Health at the University of Lugano organized a conference on Communicating Medical Error (COME 2013 in

  11. Measuring Error Identification and Recovery Skills in Surgical Residents.

    Science.gov (United States)

    Sternbach, Joel M; Wang, Kevin; El Khoury, Rym; Teitelbaum, Ezra N; Meyerson, Shari L

    2017-02-01

    Although error identification and recovery skills are essential for the safe practice of surgery, they have not traditionally been taught or evaluated in residency training. This study validates a method for assessing error identification and recovery skills in surgical residents using a thoracoscopic lobectomy simulator. We developed a 5-station, simulator-based examination containing the most commonly encountered cognitive and technical errors occurring during division of the superior pulmonary vein for left upper lobectomy. Successful completion of each station requires identification and correction of these errors. Examinations were video recorded and scored in a blinded fashion using an examination-specific rating instrument evaluating task performance as well as error identification and recovery skills. Evidence of validity was collected in the categories of content, response process, internal structure, and relationship to other variables. Fifteen general surgical residents (9 interns and 6 third-year residents) completed the examination. Interrater reliability was high, with an intraclass correlation coefficient of 0.78 between 4 trained raters. Station scores ranged from 64% to 84% correct. All stations adequately discriminated between high- and low-performing residents, with discrimination ranging from 0.35 to 0.65. The overall examination score was significantly higher for intermediate residents than for interns (mean, 74 versus 64 of 90 possible; p = 0.03). The described simulator-based examination with embedded errors and its accompanying assessment tool can be used to measure error identification and recovery skills in surgical residents. This examination provides a valid method for comparing teaching strategies designed to improve error recognition and recovery to enhance patient safety. Copyright © 2017 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  12. Righting errors in writing errors: the Wing and Baddeley (1980) spelling error corpus revisited.

    Science.gov (United States)

    Wing, Alan M; Baddeley, Alan D

    2009-03-01

    We present a new analysis of our previously published corpus of handwriting errors (slips) using the proportional allocation algorithm of Machtynger and Shallice (2009). As previously, the proportion of slips is greater in the middle of the word than at the ends; however, in contrast to before, the proportion is greater at the end than at the beginning of the word. The findings are consistent with the hypothesis of memory effects in a graphemic output buffer.

  13. Digital halftoning methods for selectively partitioning error into achromatic and chromatic channels

    Science.gov (United States)

    Mulligan, Jeffrey B.

    1990-01-01

    A method is described for reducing the visibility of artifacts arising in the display of quantized color images on CRT displays. The method is based on the differential spatial sensitivity of the human visual system to chromatic and achromatic modulations. Because the visual system has the highest spatial and temporal acuity for the luminance component of an image, a technique which will reduce luminance artifacts at the expense of introducing high-frequency chromatic errors is sought. A method based on controlling the correlations between the quantization errors in the individual phosphor images is explored. The luminance component is greatest when the phosphor errors are positively correlated, and is minimized when the phosphor errors are negatively correlated. The greatest effect of the correlation is obtained when the intensity quantization step sizes of the individual phosphors have equal luminances. For the ordered dither algorithm, a version of the method can be implemented by simply inverting the matrix of thresholds for one of the color components.
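
    The ordered-dither variant mentioned at the end of the record, inverting the threshold matrix for one colour component so that its quantization error is negatively correlated with the others, can be sketched in a few lines. The choice of a 4x4 Bayer matrix, the decision to invert the green channel, and the flat mid-grey test patch are illustrative assumptions, not details taken from the paper.

        import numpy as np

        # 4x4 Bayer matrix, normalised to thresholds in (0, 1).
        BAYER4 = (np.array([[ 0,  8,  2, 10],
                            [12,  4, 14,  6],
                            [ 3, 11,  1,  9],
                            [15,  7, 13,  5]]) + 0.5) / 16.0

        def ordered_dither(channel, thresholds):
            """Binarise one channel (values in [0, 1]) against a tiled threshold matrix."""
            h, w = channel.shape
            ty, tx = thresholds.shape
            tiled = np.tile(thresholds, (h // ty + 1, w // tx + 1))[:h, :w]
            return (channel > tiled).astype(float)

        def dither_rgb(img):
            """Dither R and B with the Bayer matrix and G with the inverted matrix,
            so the error in the luminance-dominant channel is negatively correlated
            with the errors in the other two channels."""
            out = np.empty_like(img)
            out[..., 0] = ordered_dither(img[..., 0], BAYER4)
            out[..., 1] = ordered_dither(img[..., 1], 1.0 - BAYER4)  # inverted thresholds
            out[..., 2] = ordered_dither(img[..., 2], BAYER4)
            return out

        if __name__ == "__main__":
            flat_grey = np.full((64, 64, 3), 0.5)    # mid-grey test patch
            halftoned = dither_rgb(flat_grey)
            # The per-pixel channel sum (a crude luminance proxy) fluctuates less than
            # it would if all three channels shared the same threshold matrix.
            print("luminance-proxy standard deviation:", halftoned.sum(axis=2).std())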

  14. Nurses' Perceived Skills and Attitudes About Updated Safety Concepts: Impact on Medication Administration Errors and Practices.

    Science.gov (United States)

    Armstrong, Gail E; Dietrich, Mary; Norman, Linda; Barnsteiner, Jane; Mion, Lorraine

    Approximately a quarter of medication errors in the hospital occur at the administration phase, which is solely under the purview of the bedside nurse. The purpose of this study was to assess bedside nurses' perceived skills and attitudes about updated safety concepts and examine their impact on medication administration errors and adherence to safe medication administration practices. Findings support the premise that medication administration errors result from an interplay among system-, unit-, and nurse-level factors.

  15. Double symbol error rates for differential detection of narrow-band FM

    Science.gov (United States)

    Simon, M. K.

    1985-01-01

    This paper evaluates the double symbol error rate (average probability of two consecutive symbol errors) in differentially detected narrow-band FM. Numerical results are presented for the special case of MSK with a Gaussian IF receive filter. It is shown that, not unlike similar results previously obtained for the single error probability of such systems, large inaccuracies in predicted performance can occur when intersymbol interference is ignored.

  16. Theoretical explanations and practices regarding the distinction between the concepts: judicial error, error of law and fundamental vice in the legislation of the Republic of Moldova

    Directory of Open Access Journals (Sweden)

    Vasilisa Muntean

    2017-10-01

    Full Text Available This research carries out a doctrinal and legal analysis of the concept of legal error. The author offers an original definition of the concept addressed and highlights the main causes and conditions for the occurrence of judicial errors. At present, the specialized legal doctrine of the Republic of Moldova has given little attention to the problem of defining judicial error. This article is therefore an attempt to elucidate the theoretical and normative deficiencies and errors that arise in the area of reparation of the prejudice caused by judicial errors. To achieve this goal, we aim to create a core of ideas and referral mechanisms that ensure a certain interpretative and decisional homogeneity in the doctrinal and legal characterization of the phrase "judicial error".

  17. The Large Hadron Collider the greatest adventure in town and ten reasons why it matters, as illustrated by the ATLAS experiment

    CERN Document Server

    Millington, Andrew J; MacPherson, Rob; Nordberg, Markus

    2016-01-01

    When the discovery of the Higgs Boson at CERN hit the headlines in 2012, the world was stunned by this achievement of modern science. Less well appreciated, however, were the many ways in which this benefited wider society. The Large Hadron Collider — The Greatest Adventure in Town charts a path through the cultural, economic and medical gains of modern particle physics. It illustrates these messages through the ATLAS experiment at CERN, one of the two big experiments which found the Higgs particle. Moving clear of in-depth physics analysis, it draws on the unparalleled curiosity about particle physics aroused by the Higgs discovery, and relates it to developments familiar in the modern world, including the Internet, its successor "The Grid", and the latest cancer treatments. In this book, advances made from developing the 27 kilometre particle accelerator and its detectors are presented with the benefit of first hand interviews and are extensively illustrated throughout. Interviewees are leading physicis...

  18. Medication Administration Errors in an Adult Emergency Department of a Tertiary Health Care Facility in Ghana.

    Science.gov (United States)

    Acheampong, Franklin; Tetteh, Ashalley Raymond; Anto, Berko Panyin

    2016-12-01

    This study determined the incidence, types, clinical significance, and potential causes of medication administration errors (MAEs) at the emergency department (ED) of a tertiary health care facility in Ghana. This study used a cross-sectional nonparticipant observational technique. Study participants (nurses) were observed preparing and administering medication at the ED of a 2000-bed tertiary care hospital in Accra, Ghana. The observations were then compared with patients' medication charts, and identified errors were clarified with staff for possible causes. Of the 1332 observations made, involving 338 patients and 49 nurses, 362 had errors, representing 27.2%. However, the error rate excluding "lack of drug availability" fell to 12.8%. Without wrong time error, the error rate was 22.8%. The 2 most frequent error types were omission (n = 281, 77.6%) and wrong time (n = 58, 16%) errors. Omission error was mainly due to unavailability of medicine, 48.9% (n = 177). Although only one of the errors was potentially fatal, 26.7% were definitely clinically severe. The common themes that dominated the probable causes of MAEs were unavailability, staff factors, patient factors, prescription, and communication problems. This study gives credence to similar studies in different settings that MAEs occur frequently in the ED of hospitals. Most of the errors identified were not potentially fatal; however, preventive strategies need to be used to make life-saving processes such as drug administration in such specialized units error-free.

  19. Assessing explicit error reporting in the narrative electronic medical record using keyword searching.

    Science.gov (United States)

    Cao, Hui; Stetson, Peter; Hripcsak, George

    2003-01-01

    Many types of medical errors occur in and outside of hospitals, some of which have very serious consequences and increase cost. Identifying errors is a critical step for managing and preventing them. In this study, we assessed the explicit reporting of medical errors in the electronic record. We used five search terms "mistake," "error," "incorrect," "inadvertent," and "iatrogenic" to survey several sets of narrative reports including discharge summaries, sign-out notes, and outpatient notes from 1991 to 2000. We manually reviewed all the positive cases and identified them based on the reporting of physicians. We identified 222 explicitly reported medical errors. The positive predictive value varied with different keywords. In general, the positive predictive value for each keyword was low, ranging from 3.4 to 24.4%. Therapeutic-related errors were the most common reported errors and these reported therapeutic-related errors were mainly medication errors. Keyword searches combined with manual review indicated some medical errors that were reported in medical records. It had a low sensitivity and a moderate positive predictive value, which varied by search term. Physicians were most likely to record errors in the Hospital Course and History of Present Illness sections of discharge summaries. The reported errors in medical records covered a broad range and were related to several types of care providers as well as non-health care professionals.
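
    As a rough, editorial illustration of the keyword-screening step described above, the sketch below scans a set of note texts for the five search terms and computes a per-keyword positive predictive value from manually assigned review labels. The toy notes and labels are hypothetical and are not the study's data.

```python
import re

KEYWORDS = ["mistake", "error", "incorrect", "inadvertent", "iatrogenic"]

# Hypothetical narrative notes with a manual-review label for each
# (True = the note explicitly reports a medical error).
notes = [
    ("An inadvertent overdose of heparin was given and corrected.", True),
    ("No errors noted; laboratory error codes reviewed.", False),
    ("Medication was incorrect for renal dosing; pharmacy notified.", True),
]

def hits(term, text):
    """Case-insensitive whole-word-prefix match for a search term."""
    return re.search(r"\b" + re.escape(term), text, flags=re.IGNORECASE) is not None

for term in KEYWORDS:
    flagged = [label for text, label in notes if hits(term, text)]
    if flagged:
        ppv = sum(flagged) / len(flagged)
        print(f"{term:12s} flagged={len(flagged)}  PPV={ppv:.2f}")
    else:
        print(f"{term:12s} flagged=0")
```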

  20. Learning from your mistakes: The functional value of spontaneous error monitoring in aphasia

    Directory of Open Access Journals (Sweden)

    Erica L. Middleton

    2014-04-01

    Ex. 4. (T = umbrella) "umbelella, umbrella": Phonological error; DetCorr. We used mixed effects logistic regression to assess whether the log odds of changing from error to correct was predicted by monitoring status of the error (DetCorr vs. NoDet; DetNoCorr vs. NoDet); whether the monitoring benefit interacted with direction of change (forward, backward); and whether effects varied by error type. Figure 1 (top) shows that the proportion of accuracy change was higher for DetCorr, relative to NoDet, consistent with a monitoring benefit. The difference in log odds was significant for semantic errors in both directions (forward: coeff. = -1.73; z = -7.78; p < .001; backward: coeff. = -0.92; z = -3.60; p < .001), and for phonological errors in both directions (forward: coeff. = -0.74; z = -2.73; p = .006; backward: coeff. = -0.76; z = -2.73; p = .006). The difference between DetNoCorr and NoDet was not significant in any condition. Figure 1 (bottom) shows that for semantic errors there was a directional asymmetry favoring the forward condition (interaction: coeff. = 0.79; z = 2.32; p = .02). Phonological errors, in contrast, produced comparable effects in the forward and backward directions. The results demonstrated a benefit for errors that were detected and corrected. This monitoring benefit was present in both the forward and backward direction, supporting the Strength hypothesis. Of greatest interest, the monitoring benefit for semantic errors was greater in the forward than the backward direction, indicating a role for learning.

  1. Main visual symptoms associated to refractive errors and spectacle need in a Brazilian population

    Directory of Open Access Journals (Sweden)

    Silvana Schellini

    2016-12-01

    Full Text Available AIM: To determine the main visual symptoms in a Brazilian population sample, associated to refractive errors (REs and spectacle need to suggest priorities in preventive programs. METHODS: A cross-sectional study was conducted in nine counties of the southeast region of Brazil, using a systematic sampling of households, between March 2004 and July 2005. The population was defined as individuals aged between 1 and 96y, inhabitants of 3600 residences to be evaluated and 3012 households were included, corresponding to 8010 subjects considered for participation in the survey, of whom 7654 underwent ophthalmic examinations. The individuals were evaluated according their demographic data, eye complaints and eye examination including the RE and the need to prescribe spectacles according to age. Statistical analysis was performed using SPSS software package and descriptive analysis using 95% confidence intervals (P<0.05. RESULTS: The main symptom detected was asthenopia, most frequent in the 2nd and 3rd decades of life, with a significant decline after the 4th decade. Astigmatism was the RE most associated with asthenopia. Reduced near vision sight was more frequent in those ≥40y with a progressive decline thereafter. Spectacles were most frequently required in subjects of ≥40 years of age. CONCLUSION: The main symptom related to the vision was asthenopia and was associated to astigmatism. The greatest need for spectacles prescription occurred after 40’s, mainly to correct near vision. Subjects of ≥40 years old were determined to be at high risk of uncorrected REs. These observations can guide intervention programs for the Brazilian population.

  2. An adaptive orienting theory of error processing.

    Science.gov (United States)

    Wessel, Jan R

    2018-03-01

    The ability to detect and correct action errors is paramount to safe and efficient goal-directed behaviors. Existing work on the neural underpinnings of error processing and post-error behavioral adaptations has led to the development of several mechanistic theories of error processing. These theories can be roughly grouped into adaptive and maladaptive theories. While adaptive theories propose that errors trigger a cascade of processes that will result in improved behavior after error commission, maladaptive theories hold that error commission momentarily impairs behavior. Neither group of theories can account for all available data, as different empirical studies find both impaired and improved post-error behavior. This article attempts a synthesis between the predictions made by prominent adaptive and maladaptive theories. Specifically, it is proposed that errors invoke a nonspecific cascade of processing that will rapidly interrupt and inhibit ongoing behavior and cognition, as well as orient attention toward the source of the error. It is proposed that this cascade follows all unexpected action outcomes, not just errors. In the case of errors, this cascade is followed by error-specific, controlled processing, which is specifically aimed at (re)tuning the existing task set. This theory combines existing predictions from maladaptive orienting and bottleneck theories with specific neural mechanisms from the wider field of cognitive control, including from error-specific theories of adaptive post-error processing. The article aims to describe the proposed framework and its implications for post-error slowing and post-error accuracy, propose mechanistic neural circuitry for post-error processing, and derive specific hypotheses for future empirical investigations. © 2017 Society for Psychophysiological Research.

  3. WACC: Definition, misconceptions and errors

    OpenAIRE

    Fernandez, Pablo

    2011-01-01

    The WACC is just the rate at which the Free Cash Flows must be discounted to obtain the same result as in the valuation using Equity Cash Flows discounted at the required return to equity (Ke). The WACC is neither a cost nor a required return: it is a weighted average of a cost and a required return. To refer to the WACC as the "cost of capital" may be misleading because it is not a cost. The paper includes 7 errors due to not remembering the definition of WACC and shows the relationship betwe...
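
    To make the "weighted average" point concrete, the standard textbook expression is written out below; E and D denote the market values of equity and debt, Kd the cost of debt and T the corporate tax rate. The notation is an editorial addition, not taken from the paper itself.

```latex
\[
  \mathrm{WACC} \;=\; \frac{E\,K_e \;+\; D\,K_d\,(1-T)}{E + D},
  \qquad
  E_0 + D_0 \;=\; \sum_{t \ge 1} \frac{\mathrm{FCF}_t}{(1+\mathrm{WACC})^{t}} .
\]
```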

  4. Wavefront error sensing for LDR

    Science.gov (United States)

    Tubbs, Eldred F.; Glavich, T. A.

    1988-01-01

    Wavefront sensing is a significant aspect of the LDR control problem and requires attention at an early stage of the control system definition and design. A combination of a Hartmann test for wavefront slope measurement and an interference test for piston errors of the segments was examined and is presented as a point of departure for further discussion. The assumption is made that the wavefront sensor will be used for initial alignment and periodic alignment checks but that it will not be used during scientific observations. The Hartmann test and the interferometric test are briefly examined.

  5. Adverse Drug Events caused by Serious Medication Administration Errors

    Science.gov (United States)

    Sawarkar, Abhivyakti; Keohane, Carol A.; Maviglia, Saverio; Gandhi, Tejal K; Poon, Eric G

    2013-01-01

    OBJECTIVE To determine how often serious or life-threatening medication administration errors with the potential to cause patient harm (potential adverse drug events) result in actual patient harm (adverse drug events (ADEs)) in the hospital setting. DESIGN Retrospective chart review of clinical events that transpired following observed medication administration errors. BACKGROUND Medication errors are common at the medication administration stage for hospitalized patients. While many of these errors are considered capable of causing patient harm, it is not clear how often patients are actually harmed by these errors. METHODS In a previous study in which 14,041 medication administrations in an acute-care hospital were directly observed, investigators discovered 1271 medication administration errors, of which 133 had the potential to cause serious or life-threatening harm to patients and were considered serious or life-threatening potential ADEs. In the current study, clinical reviewers conducted detailed chart reviews of cases where a serious or life-threatening potential ADE occurred to determine if an actual ADE developed following the potential ADE. Reviewers further assessed the severity of the ADE and its attribution to the administration error. RESULTS Ten (7.5% [95% C.I. 6.98, 8.01]) actual adverse drug events (ADEs) resulted from the 133 serious and life-threatening potential ADEs, of which six resulted in significant, three in serious, and one in life-threatening injury. Therefore, 4 (3% [95% C.I. 2.12, 3.6]) serious and life-threatening potential ADEs led to serious or life-threatening ADEs. Half of the ten actual ADEs were caused by dosage or monitoring errors for anti-hypertensives. The life-threatening ADE was caused by an error that was both a transcription and a timing error. CONCLUSION Potential ADEs at the medication administration stage can cause serious patient harm. Given previous estimates of serious or life-threatening potential ADE of 1.33 per 100

  6. Nurses' Behaviors and Visual Scanning Patterns May Reduce Patient Identification Errors

    Science.gov (United States)

    Marquard, Jenna L.; Henneman, Philip L.; He, Ze; Jo, Junghee; Fisher, Donald L.; Henneman, Elizabeth A.

    2011-01-01

    Patient identification (ID) errors occurring during the medication administration process can be fatal. The aim of this study is to determine whether differences in nurses' behaviors and visual scanning patterns during the medication administration process influence their capacities to identify patient ID errors. Nurse participants (n = 20)…

  7. The Impact of Bar Code Medication Administration Technology on Reported Medication Errors

    Science.gov (United States)

    Holecek, Andrea

    2011-01-01

    The use of bar-code medication administration technology is on the rise in acute care facilities in the United States. The technology is purported to decrease medication errors that occur at the point of administration. How significantly this technology affects actual rate and severity of error is unknown. This descriptive, longitudinal research…

  8. Drug administration errors in an institution for individuals with intellectual disability : an observational study

    NARCIS (Netherlands)

    van den Bemt, P M L A; Robertz, R; de Jong, A L; van Roon, E N; Leufkens, H G M

    BACKGROUND: Medication errors can result in harm, unless barriers to prevent them are present. Drug administration errors are less likely to be prevented, because they occur in the last stage of the drug distribution process. This is especially the case in non-alert patients, as patients often form

  9. Minimizing driver errors: examining factors leading to failed target tracking and detection.

    Science.gov (United States)

    2013-06-01

    Driving a motor vehicle is a common practice for many individuals. Although driving becomes : repetitive and a very habitual task, errors can occur that lead to accidents. One factor that can be a : cause for such errors is a lapse in attention or a ...

  10. Error rates in forensic DNA analysis: Definition, numbers, impact and communication

    NARCIS (Netherlands)

    Kloosterman, A.; Sjerps, M.; Quak, A.

    2014-01-01

    Forensic DNA casework is currently regarded as one of the most important types of forensic evidence, and important decisions in intelligence and justice are based on it. However, errors occasionally occur and may have very serious consequences. In other domains, error rates have been defined and

  11. Antiretroviral medication prescribing errors are common with hospitalization of HIV-infected patients.

    Science.gov (United States)

    Commers, Tessa; Swindells, Susan; Sayles, Harlan; Gross, Alan E; Devetten, Marcel; Sandkovsky, Uriel

    2014-01-01

    Errors in prescribing antiretroviral therapy (ART) often occur with the hospitalization of HIV-infected patients. The rapid identification and prevention of errors may reduce patient harm and healthcare-associated costs. A retrospective review of hospitalized HIV-infected patients was carried out between 1 January 2009 and 31 December 2011. Errors were documented as omission, underdose, overdose, duplicate therapy, incorrect scheduling and/or incorrect therapy. The time to error correction was recorded. Relative risks (RRs) were computed to evaluate patient characteristics and error rates. A total of 289 medication errors were identified in 146/416 admissions (35%). The most common was drug omission (69%). At an error rate of 31%, nucleoside reverse transcriptase inhibitors were associated with an increased risk of error when compared with protease inhibitors (RR 1.32; 95% CI 1.04-1.69) and co-formulated drugs (RR 1.59; 95% CI 1.19-2.09). Of the errors, 31% were corrected within the first 24 h, but over half (55%) were never remedied. Admissions with an omission error were 7.4 times more likely to have all errors corrected within 24 h than were admissions without an omission. Drug interactions with ART were detected on 51 occasions. For the study population (n = 177), an increased risk of admission error was observed for black (43%) compared with white (28%) individuals (RR 1.53; 95% CI 1.16-2.03) but no significant differences were observed between white patients and other minorities or between men and women. Errors in inpatient ART were common, and the majority were never detected. The most common errors involved omission of medication, and nucleoside reverse transcriptase inhibitors had the highest rate of prescribing error. Interventions to prevent and correct errors are urgently needed.
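
    The relative risks quoted above (for example, RR 1.32; 95% CI 1.04-1.69) follow from a standard 2x2 calculation with a Wald interval on the log scale; the sketch below shows that calculation on hypothetical counts, not on the study's raw data.

```python
import math

def relative_risk(a, n1, c, n2, z=1.96):
    """Relative risk of an event in group 1 vs group 2, with a Wald 95% CI.

    a of n1 subjects in group 1 have the event; c of n2 in group 2.
    """
    rr = (a / n1) / (c / n2)
    se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lo, hi

# Hypothetical counts: 30 of 100 admissions with a prescribing error in one
# drug-class group versus 40 of 200 in the comparison group.
rr, lo, hi = relative_risk(30, 100, 40, 200)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```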

  12. Cumulative error models for the tank calibration problem

    International Nuclear Information System (INIS)

    Goldman, A.; Anderson, L.G.; Weber, J.

    1983-01-01

    The purpose of a tank calibration equation is to obtain an estimate of the liquid volume that corresponds to a liquid level measurement. Calibration experimental errors occur in both liquid level and liquid volume measurements. If one of the errors is relatively small, the calibration equation can be determined from well-known regression and calibration methods. If both variables are assumed to be in error, then for linear cases a prototype model should be considered. Many investigators are not familiar with this model or do not have computing facilities capable of obtaining numerical solutions. This paper discusses and compares three linear models that approximate the prototype model and have the advantage of much simpler computations. Comparisons among the four models and recommendations on suitability are made from simulations and from analyses of six sets of experimental data.
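
    For the linear errors-in-both-variables case mentioned above, one simple and widely used fit is Deming regression, which assumes the ratio of the two error variances is known. The sketch below is a generic illustration of that idea under simulated data; it is not necessarily one of the three approximate models compared in the paper.

```python
import numpy as np

def deming_fit(x, y, delta=1.0):
    """Deming regression: linear fit when both x and y carry measurement error.

    delta = var(error in y) / var(error in x); delta = 1 assumes equal variances.
    Returns (slope, intercept).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxx = np.var(x, ddof=1)
    syy = np.var(y, ddof=1)
    sxy = np.cov(x, y, ddof=1)[0, 1]
    slope = (syy - delta * sxx
             + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    intercept = y.mean() - slope * x.mean()
    return slope, intercept

# Hypothetical calibration run: level readings (cm) and volumes (L),
# both perturbed by measurement noise.
rng = np.random.default_rng(1)
level_true = np.linspace(10, 200, 25)
level = level_true + rng.normal(0, 1.5, level_true.size)
volume = 50 + 12.3 * level_true + rng.normal(0, 18.0, level_true.size)

slope, intercept = deming_fit(level, volume, delta=(18.0 / 1.5) ** 2)
print(f"volume ~ {intercept:.1f} + {slope:.2f} * level")
```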

  13. The treatment of commission errors in first generation human reliability analysis methods

    Energy Technology Data Exchange (ETDEWEB)

    Alvarengga, Marco Antonio Bayout; Fonseca, Renato Alves da, E-mail: bayout@cnen.gov.b, E-mail: rfonseca@cnen.gov.b [Comissao Nacional de Energia Nuclear (CNEN) Rio de Janeiro, RJ (Brazil); Melo, Paulo Fernando Frutuoso e, E-mail: frutuoso@nuclear.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

    Human errors in human reliability analysis can be classified generically as errors of omission and errors of commission. Omission errors are related to the omission of any human action that should have been performed but does not occur. Errors of commission are those related to human actions that should not be performed but which in fact are performed. Both involve specific types of cognitive error mechanisms; however, errors of commission are more difficult to model because they are characterized by non-anticipated actions that are performed instead of others that are omitted (omission errors), or that are entered into an operational task without being part of the normal sequence of this task. The identification of actions that are not supposed to occur depends on the operational context, which will influence or facilitate certain unsafe actions of the operator depending on the operational performance of its parameters and variables. The survey of operational contexts and associated unsafe actions is a characteristic of second-generation models, unlike first-generation models. This paper discusses how first-generation models can treat errors of commission in the steps of detection, diagnosis, decision-making and implementation in human information processing, particularly with the use of THERP error quantification tables. (author)

  14. Human decision error (HUMDEE) trees

    International Nuclear Information System (INIS)

    Ostrom, L.T.

    1993-01-01

    Graphical presentations of human actions in incident and accident sequences have been used for many years. However, for the most part, human decision making has been underrepresented in these trees. This paper presents a method of incorporating the human decision process into graphical presentations of incident/accident sequences. This presentation is in the form of logic trees. These trees are called Human Decision Error Trees, or HUMDEE for short. The primary benefit of HUMDEE trees is that they graphically illustrate what else the individuals involved in the event could have done to prevent either the initiation or continuation of the event. HUMDEE trees also present the alternate paths available at the operator decision points in the incident/accident sequence. This is different from the Technique for Human Error Rate Prediction (THERP) event trees. There are many uses of these trees. They can be used for incident/accident investigations to show what other courses of action were available and for training operators. The trees also have a consequence component, so that not only can the decision be explored, but also the consequence of that decision.

  15. Apology for errors: whose responsibility?

    Science.gov (United States)

    Leape, Lucian L

    2012-01-01

    When things go wrong during a medical procedure, patients' expectations are fairly straightforward: They expect an explanation of what happened, an apology if an error was made, and assurance that something will be done to prevent it from happening to another patient. Patients have a right to full disclosure; it is also therapeutic in relieving their anxiety. But if they have been harmed by our mistake, they also need an apology to maintain trust. Apology conveys respect, mutual suffering, and responsibility. Meaningful apology requires that the patient's physician and the institution both take responsibility, show remorse, and make amends. As the patient's advocate, the physician must play the lead role. However, as custodian of the systems, the hospital has primary responsibility for the mishap, for preventing that error in the future, and for compensation. The responsibility for making all this happen rests with the CEO. The hospital must have policies and practices that ensure that every injured patient is treated the way we would want to be treated ourselves--openly, honestly, with compassion, and, when indicated, with an apology and compensation. To make that happen, hospitals need to greatly expand training of physicians and others, and develop support programs for patients and caregivers.

  16. Error exponents for entanglement concentration

    International Nuclear Information System (INIS)

    Hayashi, Masahito; Koashi, Masato; Matsumoto, Keiji; Morikoshi, Fumiaki; Winter, Andreas

    2003-01-01

    Consider entanglement concentration schemes that convert n identical copies of a pure state into a maximally entangled state of a desired size with success probability being close to one in the asymptotic limit. We give the distillable entanglement, the number of Bell pairs distilled per copy, as a function of an error exponent, which represents the rate of decrease in failure probability as n tends to infinity. The formula fills the gap between the least upper bound of distillable entanglement in probabilistic concentration, which is the well-known entropy of entanglement, and the maximum attained in deterministic concentration. The method of types in information theory enables the detailed analysis of the distillable entanglement in terms of the error rate. In addition to the probabilistic argument, we consider another type of entanglement concentration scheme, where the initial state is deterministically transformed into a (possibly mixed) final state whose fidelity to a maximally entangled state of a desired size converges to one in the asymptotic limit. We show that the same formula as in the probabilistic argument is valid for the argument on fidelity by replacing the success probability with the fidelity. Furthermore, we also discuss entanglement yield when optimal success probability or optimal fidelity converges to zero in the asymptotic limit (strong converse), and give the explicit formulae for those cases

  17. PS-022 Complex automated medication systems reduce medication administration error rates in an acute medical ward

    DEFF Research Database (Denmark)

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    2017-01-01

    Background Medication errors have received extensive attention in recent decades and are of significant concern to healthcare organisations globally. Medication errors occur frequently, and adverse events associated with medications are one of the largest causes of harm to hospitalised patients...... cabinet, automated dispensing and barcode medication administration; (2) non-patient specific automated dispensing and barcode medication administration. The occurrence of administration errors was observed in three 3 week periods. The error rates were calculated by dividing the number of doses with one...

  18. Measurement error models with interactions

    Science.gov (United States)

    Midthune, Douglas; Carroll, Raymond J.; Freedman, Laurence S.; Kipnis, Victor

    2016-01-01

    An important use of measurement error models is to correct regression models for bias due to covariate measurement error. Most measurement error models assume that the observed error-prone covariate (W) is a linear function of the unobserved true covariate (X) plus other covariates (Z) in the regression model. In this paper, we consider models for W that include interactions between X and Z. We derive the conditional distribution of
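
    The truncated record does not show the model itself. Purely as a hedged illustration of what a linear measurement error model for W with an X-by-Z interaction could look like (treating Z as a single covariate for simplicity), one common form is:

```latex
\[
  W \;=\; \alpha_0 + \alpha_1 X + \alpha_2 Z + \alpha_3\, X Z + \varepsilon,
  \qquad
  \varepsilon \sim N\!\left(0, \sigma_{\varepsilon}^{2}\right), \quad \varepsilon \perp (X, Z).
\]
```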

  19. Association between Refractive Errors and Ocular Biometry in Iranian Adults

    Science.gov (United States)

    Hashemi, Hassan; Khabazkhoob, Mehdi; Emamian, Mohammad Hassan; Shariati, Mohammad; Miraftab, Mohammad; Yekta, Abbasali; Ostadimoghaddam, Hadi; Fotouhi, Akbar

    2015-01-01

    Purpose: To investigate the association between ocular biometrics such as axial length (AL), anterior chamber depth (ACD), lens thickness (LT), vitreous chamber depth (VCD) and corneal power (CP) with different refractive errors. Methods: In a cross-sectional study on the 40 to 64-year-old population of Shahroud, random cluster sampling was performed. Ocular biometrics were measured using the Allegro Biograph (WaveLight AG, Erlangen, Germany) for all participants. Refractive errors were determined using cycloplegic refraction. Results: In the first model, the strongest correlations were found between spherical equivalent with axial length and corneal power. Spherical equivalent was strongly correlated with axial length in high myopic and high hyperopic cases, and with corneal power in high hyperopic cases; 69.5% of variability in spherical equivalent was attributed to changes in these variables. In the second model, the correlations between vitreous chamber depth and corneal power with spherical equivalent were stronger in myopes than hyperopes, while the correlations between lens thickness and anterior chamber depth with spherical equivalent were stronger in hyperopic cases than myopic ones. In the third model, anterior chamber depth + lens thickness correlated with spherical equivalent only in moderate and severe cases of hyperopia, and this index was not correlated with spherical equivalent in moderate to severe myopia. Conclusion: In individuals aged 40-64 years, corneal power and axial length make the greatest contribution to spherical equivalent in high hyperopia and high myopia. Anterior segment biometric components have a more important role in hyperopia than myopia. PMID:26730304

  20. Identification of Special Patterns of Numerical Typographic Errors to Increases the Likelihood of Finding a Misplaced Patient File

    OpenAIRE

    Sun, Ying-Chou; Tang, Dah-Dian; Zeng, Qing; Greenes, Robert

    2001-01-01

    When a typographic error of a patient identification number occurs on a patient document such as an envelope for radiology films or the cover of a patient record, it will result in misplacement of the document. Once misplaced, such documents are often extremely difficult to recover. After analyzing 290 numerical typos, we found that errors do not occur randomly. Instead, many of the typos share certain specific patterns. Six major types of non-random numeral typographic error patterns have be...

  1. Identification of Special Patterns of Numerical Typographic Errors Increases the Likelihood of Finding a Misplaced Patient File

    OpenAIRE

    Sun, Ying-Chou; Tang, Dah-Dian; Zeng, Qing; Greenes, Robert

    2002-01-01

    When a typographic error of a patient identification number occurs on a patient document such as an envelope for radiology films or the cover of a patient record, it will result in misplacement of the document. Once misplaced, such documents are often extremely difficult to recover. After analyzing 290 numerical typos, we found that errors do not occur randomly. Instead, many of the typos share certain specific patterns. Six major types of non-random numeral typographic error patterns have be...
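
    The six reported patterns are not listed in the truncated abstract. As a hedged illustration of how non-random patterns can be exploited when matching a mistyped identification number against a roster, the sketch below checks two common patterns, adjacent-digit transposition and single-digit substitution; the IDs are invented.

```python
def is_adjacent_transposition(typo: str, candidate: str) -> bool:
    """True if `typo` equals `candidate` with one pair of adjacent digits swapped."""
    if len(typo) != len(candidate) or typo == candidate:
        return False
    diffs = [i for i, (a, b) in enumerate(zip(typo, candidate)) if a != b]
    return (len(diffs) == 2 and diffs[1] == diffs[0] + 1
            and typo[diffs[0]] == candidate[diffs[1]]
            and typo[diffs[1]] == candidate[diffs[0]])

def is_single_substitution(typo: str, candidate: str) -> bool:
    """True if `typo` differs from `candidate` in exactly one digit."""
    return (len(typo) == len(candidate)
            and sum(a != b for a, b in zip(typo, candidate)) == 1)

roster = ["10382947", "10382974", "70382947"]   # hypothetical patient IDs
mistyped = "10832947"                           # "10382947" with '3' and '8' swapped

for patient_id in roster:
    if is_adjacent_transposition(mistyped, patient_id):
        print(patient_id, "matches via adjacent-digit transposition")
    elif is_single_substitution(mistyped, patient_id):
        print(patient_id, "matches via single-digit substitution")
```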

  2. Systematic Analysis of Video Data from Different Human-Robot Interaction Studies: A Categorisation of Social Signals During Error Situations

    OpenAIRE

    Manuel Giuliani; Nicole Mirnig; Gerald Stollnberger; Susanne Stadler; Roland Buchner; Manfred Tscheligi

    2015-01-01

    Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. For that, we analyzed 201 videos of five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows tha...

  3. Game Design Principles based on Human Error

    Directory of Open Access Journals (Sweden)

    Guilherme Zaffari

    2016-03-01

    Full Text Available This paper presents the result of the authors’ research on incorporating Human Error, through design principles, into video game design. In general, designers must consider Human Error factors throughout video game interface development; however, when it comes to core game design, adaptations are needed, since challenge is an important factor for fun, and from the perspective of Human Error, challenge can be considered a flaw in the system. The research used Human Error classifications, data triangulation via predictive human error analysis, and expanded flow theory to design a set of principles that match the design of playful challenges with the principles of Human Error. From the results, it was possible to conclude that applying Human Error to game design has a positive effect on player experience, allowing the player to interact only with errors associated with the intended aesthetics of the game.

  4. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  5. An Error Analysis on TFL Learners’ Writings

    Directory of Open Access Journals (Sweden)

    Arif ÇERÇİ

    2016-12-01

    Full Text Available The main purpose of the present study is to identify and represent TFL learners’ writing errors through error analysis. All the learners started learning Turkish as a foreign language at the A1 (beginner) level and completed the process by taking the C1 (advanced) certificate at TÖMER, Gaziantep University. The data of the present study were collected from 14 students’ writings in proficiency exams for each level. The data were grouped as grammatical, syntactic, spelling, punctuation, and word choice errors. The ratio and categorical distributions of the identified errors were analyzed through error analysis. The data were analyzed through statistical procedures in an effort to determine whether error types differ according to the levels of the students. The errors in this study are limited to linguistic and intralingual developmental errors.

  6. Field errors in hybrid insertion devices

    International Nuclear Information System (INIS)

    Schlueter, R.D.

    1995-02-01

    Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed

  7. Field errors in hybrid insertion devices

    Energy Technology Data Exchange (ETDEWEB)

    Schlueter, R.D. [Lawrence Berkeley Lab., CA (United States)

    1995-02-01

    Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed.

  8. Error Covariance Estimation of Mesoscale Data Assimilation

    National Research Council Canada - National Science Library

    Xu, Qin

    2005-01-01

    The goal of this project is to explore and develop new methods of error covariance estimation that will provide necessary statistical descriptions of prediction and observation errors for mesoscale data assimilation...

  9. Recommendations to avoid gross errors of dose in radiotherapeutic treatments

    International Nuclear Information System (INIS)

    Souza, Cleber Nogueira de; Monti, Carlos Roberto; Sibata, Claudio Hissao

    2001-01-01

    Human mistakes are an important source of errors in radiotherapy and may occur at every step of radiotherapy planning and treatment. To reduce this level of uncertainty, several specialized organizations have recommended a comprehensive quality assurance program. In Brazil, the requirement for these programs has been strongly stressed, and most radiotherapy services have pursued this goal regarding radiation units and dosimetry equipment, as well as the verification of the calculations of the patient's dose and the revision of the plan charts. As a contribution to the improvement of quality control, we present some recommendations to avoid failure of treatment due to error in the delivered dose, such as redundant checks of manual or computer calculations, weekly checks of the total dose for each patient, and prevention of inadvertent access to any safety system of the equipment by any staff member who is only supposed to operate the machine. Moreover, the use of a computerized treatment record and verification system should be considered in order to eliminate errors due to incorrect selection of the treatment parameters on a daily basis. We report four radiation incidents with patient injuries that occurred throughout the world, as well as some gross errors of dose. (author)

  10. Spectrum of diagnostic errors in radiology

    OpenAIRE

    Pinto, Antonio; Brunese, Luca

    2010-01-01

    Diagnostic errors are important in all branches of medicine because they are an indication of poor patient care. Since the early 1970s, physicians have been subjected to an increasing number of medical malpractice claims. Radiology is one of the specialties most liable to claims of medical negligence. Most often, a plaintiff’s complaint against a radiologist will focus on a failure to diagnose. The etiology of radiological error is multi-factorial. Errors fall into recurrent patterns. Errors ...

  11. Improving Type Error Messages in OCaml

    OpenAIRE

    Charguéraud , Arthur

    2015-01-01

    Cryptic type error messages are a major obstacle to learning OCaml or other ML-based languages. In many cases, error messages cannot be interpreted without a sufficiently-precise model of the type inference algorithm. The problem of improving type error messages in ML has received quite a bit of attention over the past two decades, and many different strategies have been considered. The challenge is not only to produce error messages that are both sufficiently concise ...

  12. Different grades MEMS accelerometers error characteristics

    Science.gov (United States)

    Pachwicewicz, M.; Weremczuk, J.

    2017-08-01

    The paper presents calibration results for two MEMS accelerometers of different price and quality grades and discusses the different types of accelerometer errors. Calibration for error determination is performed against reference centrifugal measurements. The design and measurement errors of the centrifuge are discussed as well. It is shown that the error characteristics of the two sensors are very different and that the simple calibration methods presented in the literature cannot be used in both cases.
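
    As a rough illustration of calibration against reference centrifugal measurements (the procedure and numbers below are editorial assumptions, not the paper's), a scale factor and bias can be estimated by a least-squares fit of raw sensor output against the reference acceleration, with the residuals indicating nonlinearity:

```python
import numpy as np

# Hypothetical centrifuge run: reference accelerations (g) and raw sensor output.
a_ref = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
raw   = np.array([0.03, 0.52, 1.01, 2.05, 4.12, 8.21])   # includes bias and scale error

# Fit raw = scale * a_ref + bias by least squares.
A = np.vstack([a_ref, np.ones_like(a_ref)]).T
(scale, bias), *_ = np.linalg.lstsq(A, raw, rcond=None)
nonlin = raw - (scale * a_ref + bias)                     # residual nonlinearity

print(f"scale factor = {scale:.4f}, bias = {bias:+.4f} g")
print("residual nonlinearity (g):", np.round(nonlin, 4))
```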

  13. Naming game with learning errors in communications

    OpenAIRE

    Lou, Yang; Chen, Guanrong

    2014-01-01

    The naming game simulates the process of naming an object by a population of agents organized in a certain communication network topology. Through pairwise iterative interactions, the population reaches a consensus state asymptotically. In this paper, we study the naming game with communication errors during pairwise conversations, where errors are represented by error rates in a uniform probability distribution. First, a model of naming game with learning errors in communications (NGLE) is proposed....
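
    As a rough sketch of the dynamics described, and not the specific NGLE model of the paper, the following minimal naming game lets every transmitted name be mis-learned with probability p. The fully connected population and the way a corrupted transmission is treated as a brand-new wrong name are simplifying assumptions.

```python
import random

def naming_game(n_agents=50, p_error=0.05, steps=20000, seed=0):
    """Minimal naming game on a fully connected population with learning errors."""
    rng = random.Random(seed)
    vocab = [set() for _ in range(n_agents)]   # each agent's set of candidate names
    next_name = 0                              # counter used to mint new names

    for _ in range(steps):
        speaker, hearer = rng.sample(range(n_agents), 2)
        if not vocab[speaker]:                 # speaker invents a name if needed
            vocab[speaker].add(next_name)
            next_name += 1
        name = rng.choice(sorted(vocab[speaker]))

        # Learning error: with probability p the hearer stores a corrupted,
        # brand-new name instead of the name that was actually uttered.
        if rng.random() < p_error:
            heard = next_name
            next_name += 1
        else:
            heard = name

        if heard in vocab[hearer]:             # success: both collapse to that name
            vocab[speaker] = {heard}
            vocab[hearer] = {heard}
        else:                                  # failure: the hearer just adds it
            vocab[hearer].add(heard)

    return len({name for v in vocab for name in v})

print("distinct names remaining, p = 0.00:", naming_game(p_error=0.0))
print("distinct names remaining, p = 0.05:", naming_game(p_error=0.05))
```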

  14. A second study of the prediction of cognitive errors using the 'CREAM' technique

    International Nuclear Information System (INIS)

    Collier, Steve; Andresen, Gisle

    2000-03-01

    Some human errors, such as errors of commission and knowledge-based errors, are not adequately modelled in probabilistic safety assessments. Even qualitative methods for handling these sorts of errors are comparatively underdeveloped. The 'Cognitive Reliability and Error Analysis Method' (CREAM) was recently developed for prediction of cognitive error modes. It has not yet been comprehensively established how reliable, valid and generally useful it could be to researchers and practitioners. A previous study of CREAM at Halden was promising, showing a relationship between errors predicted in advance and those that actually occurred in simulated fault scenarios. The present study continues this work. CREAM was used to make predictions of cognitive error modes throughout two rather difficult fault scenarios. Predictions were made of the most likely cognitive error mode, were one to occur at all, at several points throughout the expected scenarios, based upon the scenario design and description. Each scenario was then run 15 times with different operators. Error modes occurring during simulations were later scored using the task description for the scenario, videotapes of operator actions, eye-track recording, operators' verbal protocols and an expert's concurrent commentary. The scoring team had no previous substantive knowledge of the experiment or the techniques used, so as to provide a more stringent test of the data and knowledge needed for scoring. The scored error modes were then compared with the CREAM predictions to assess the degree of agreement. Some cognitive error modes were predicted successfully, but the results were generally not so encouraging as the previous study. Several problems were found with both the CREAM technique and the data needed to complete the analysis. It was felt that further development was needed before this kind of analysis can be reliable and valid, either in a research setting or as a practitioner's tool in a safety assessment

  15. Structural damage detection robust against time synchronization errors

    International Nuclear Information System (INIS)

    Yan, Guirong; Dyke, Shirley J

    2010-01-01

    Structural damage detection based on wireless sensor networks can be affected significantly by time synchronization errors among sensors. Precise time synchronization of sensor nodes has been viewed as crucial for addressing this issue. However, precise time synchronization over a long period of time is often impractical in large wireless sensor networks due to two inherent challenges. First, time synchronization needs to be performed periodically, requiring frequent wireless communication among sensors at significant energy cost. Second, significant time synchronization errors may result from node failures which are likely to occur during long-term deployment over civil infrastructures. In this paper, a damage detection approach is proposed that is robust against time synchronization errors in wireless sensor networks. The paper first examines the ways in which time synchronization errors distort identified mode shapes, and then proposes a strategy for reducing distortion in the identified mode shapes. Modified values for these identified mode shapes are then used in conjunction with flexibility-based damage detection methods to localize damage. This alternative approach relaxes the need for frequent sensor synchronization and can tolerate significant time synchronization errors caused by node failures. The proposed approach is successfully demonstrated through numerical simulations and experimental tests in a lab

  16. Interpreting the change detection error matrix

    NARCIS (Netherlands)

    Oort, van P.A.J.

    2007-01-01

    Two different matrices are commonly reported in assessment of change detection accuracy: (1) single-date error matrices and (2) binary change/no-change error matrices. The third, less common form of reporting is the transition error matrix. This paper discusses the relation between these matrices.
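
    To make the two matrix types concrete, the sketch below derives both a single-date error matrix and a binary change/no-change error matrix from hypothetical two-date classified maps and reference labels; all data are simulated for illustration only.

```python
import numpy as np

# Hypothetical per-pixel class labels at two dates (0 = class A, 1 = class B),
# for the classified maps and for the reference data.
rng = np.random.default_rng(2)
ref_t1 = rng.integers(0, 2, 1000)
ref_t2 = np.where(rng.random(1000) < 0.15, 1 - ref_t1, ref_t1)   # 15% true change
map_t1 = np.where(rng.random(1000) < 0.10, 1 - ref_t1, ref_t1)   # 10% label error
map_t2 = np.where(rng.random(1000) < 0.10, 1 - ref_t2, ref_t2)

def confusion(pred, truth):
    """2x2 error matrix with predicted labels as rows, reference labels as columns."""
    m = np.zeros((2, 2), int)
    for p, t in zip(pred, truth):
        m[p, t] += 1
    return m

# (1) single-date error matrix for date 1
print("single-date (t1):\n", confusion(map_t1, ref_t1))

# (2) binary change/no-change error matrix
pred_change = (map_t1 != map_t2).astype(int)
true_change = (ref_t1 != ref_t2).astype(int)
print("change/no-change:\n", confusion(pred_change, true_change))
```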

  17. Human Errors and Bridge Management Systems

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, A. S.

    on basis of reliability profiles for bridges without human errors are extended to include bridges with human errors. The first rehabilitation distributions for bridges without and with human errors are combined into a joint first rehabilitation distribution. The methodology presented is illustrated...... for reinforced concrete bridges....

  18. On-Error Training (Book Excerpt).

    Science.gov (United States)

    Fukuda, Ryuji

    1985-01-01

    This excerpt from "Managerial Engineering: Techniques for Improving Quality and Productivity in the Workplace" describes the development, objectives, and use of On-Error Training (OET), a method which trains workers to learn from their errors. Also described is New Joharry's Window, a performance-error data analysis technique used in…

  19. Human Error Mechanisms in Complex Work Environments

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1988-01-01

    will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations...

  20. Measurement error in a single regressor

    NARCIS (Netherlands)

    Meijer, H.J.; Wansbeek, T.J.

    2000-01-01

    For the setting of multiple regression with measurement error in a single regressor, we present some very simple formulas to assess the result that one may expect when correcting for measurement error. It is shown where the corrected estimated regression coefficients and the error variance may lie,

  1. Valuing Errors for Learning: Espouse or Enact?

    Science.gov (United States)

    Grohnert, Therese; Meuwissen, Roger H. G.; Gijselaers, Wim H.

    2017-01-01

    Purpose: This study aims to investigate how organisations can discourage covering up and instead encourage learning from errors through a supportive learning from error climate. In explaining professionals' learning from error behaviour, this study distinguishes between espoused (verbally expressed) and enacted (behaviourally expressed) values…

  2. Improved Landau gauge fixing and discretisation errors

    International Nuclear Information System (INIS)

    Bonnet, F.D.R.; Bowman, P.O.; Leinweber, D.B.; Richards, D.G.; Williams, A.G.

    2000-01-01

    Lattice discretisation errors in the Landau gauge condition are examined. An improved gauge fixing algorithm in which O(a²) errors are removed is presented. O(a²) improvement of the gauge fixing condition displays the secondary benefit of reducing the size of higher-order errors. These results emphasise the importance of implementing an improved gauge fixing condition

  3. Acoustic Evidence for Phonologically Mismatched Speech Errors

    Science.gov (United States)

    Gormley, Andrea

    2015-01-01

    Speech errors are generally said to accommodate to their new phonological context. This accommodation has been validated by several transcription studies. The transcription methodology is not the best choice for detecting errors at this level, however, as this type of error can be difficult to perceive. This paper presents an acoustic analysis of…

  4. Average beta-beating from random errors

    CERN Document Server

    Tomas Garcia, Rogelio; Langner, Andy Sven; Malina, Lukas; Franchi, Andrea; CERN. Geneva. ATS Department

    2018-01-01

    The impact of random errors on average β-beating is studied via analytical derivations and simulations. A systematic positive β-beating is expected from random errors quadratic with the sources or, equivalently, with the rms β-beating. However, random errors do not have a systematic effect on the tune.

  5. Jonas Olson's Evidence for Moral Error Theory

    NARCIS (Netherlands)

    Evers, Daan

    2016-01-01

    Jonas Olson defends a moral error theory in (2014). I first argue that Olson is not justified in believing the error theory as opposed to moral nonnaturalism in his own opinion. I then argue that Olson is not justified in believing the error theory as opposed to moral contextualism either (although

  6. Trends in Health Information Technology Safety: From Technology-Induced Errors to Current Approaches for Ensuring Technology Safety

    Science.gov (United States)

    2013-01-01

    Objectives Health information technology (HIT) research findings have suggested that new healthcare technologies could reduce some types of medical errors while at the same time introducing new classes of medical errors (i.e., technology-induced errors). Technology-induced errors have their origins in HIT, and/or HIT contributes to their occurrence. The objective of this paper is to review current trends in the published literature on HIT safety. Methods A review and synthesis of the medical and life sciences literature focusing on the area of technology-induced error was conducted. Results There were four main trends in the literature on technology-induced error. The following areas were addressed in the literature: definitions of technology-induced errors; models, frameworks and evidence for understanding how technology-induced errors occur; a discussion of monitoring; and methods for preventing and learning about technology-induced errors. Conclusions The literature focusing on technology-induced errors continues to grow. Research has focused on defining what an error is, models and frameworks used to understand these new types of errors, monitoring of such errors and methods that can be used to prevent these errors. More research will be needed to better understand and mitigate these types of errors. PMID:23882411

  7. Prevalence and reporting of recruitment, randomisation and treatment errors in clinical trials: A systematic review.

    Science.gov (United States)

    Yelland, Lisa N; Kahan, Brennan C; Dent, Elsa; Lee, Katherine J; Voysey, Merryn; Forbes, Andrew B; Cook, Jonathan A

    2018-06-01

    Background/aims In clinical trials, it is not unusual for errors to occur during the process of recruiting, randomising and providing treatment to participants. For example, an ineligible participant may inadvertently be randomised, a participant may be randomised in the incorrect stratum, a participant may be randomised multiple times when only a single randomisation is permitted, or the incorrect treatment may inadvertently be issued to a participant at randomisation. Such errors have the potential to introduce bias into treatment effect estimates and affect the validity of the trial, yet there is little motivation for researchers to report these errors and it is unclear how often they occur. The aim of this study is to assess the prevalence of recruitment, randomisation and treatment errors and review current approaches for reporting these errors in trials published in leading medical journals. Methods We conducted a systematic review of individually randomised, phase III, randomised controlled trials published in New England Journal of Medicine, Lancet, Journal of the American Medical Association, Annals of Internal Medicine and British Medical Journal from January to March 2015. The number and type of recruitment, randomisation and treatment errors that were reported and how they were handled were recorded. The corresponding authors were contacted for a random sample of trials included in the review and asked to provide details on unreported errors that occurred during their trial. Results We identified 241 potentially eligible articles, of which 82 met the inclusion criteria and were included in the review. These trials involved a median of 24 centres and 650 participants, and 87% involved two treatment arms. Recruitment, randomisation or treatment errors were reported in 32 of 82 trials (39%), which reported a median of eight errors. The most commonly reported error was ineligible participants inadvertently being randomised. No mention of recruitment, randomisation

  8. Slow Learner Errors Analysis in Solving Fractions Problems in Inclusive Junior High School Class

    Science.gov (United States)

    Novitasari, N.; Lukito, A.; Ekawati, R.

    2018-01-01

    A slow learner, whose IQ is between 71 and 89, will have difficulties in solving mathematics problems that often lead to errors. These errors can be analyzed for where they occur and for their type. This research is a descriptive qualitative study that aims to describe the locations, types, and causes of a slow learner's errors in solving fraction problems in an inclusive junior high school class. The subject of this research is one slow learner, a seventh-grade student, who was selected through direct observation by the researcher and through discussion with the mathematics teacher and the special tutor who handles slow learner students. Data collection methods used in this study are written tasks and semi-structured interviews. The collected data were analyzed using Newman’s Error Analysis (NEA). Results show that there are four locations of errors, namely comprehension, transformation, process skills, and encoding errors. There are four types of errors, namely concept, principle, algorithm, and counting errors. The results of this error analysis will help teachers to identify the causes of the errors made by slow learners.

  9. Medication prescribing errors in a public teaching hospital in India: A prospective study.

    Directory of Open Access Journals (Sweden)

    Pote S

    2007-03-01

    Full Text Available Background: To prevent medication errors in prescribing, one needs to know their types and relative occurrence. Such errors are a great cause of concern as they have the potential to cause patient harm. The aim of this study was to determine the nature and types of medication prescribing errors in an Indian setting. Methods: The medication errors were analyzed in a prospective observational study conducted in 3 medical wards of a public teaching hospital in India. The medication errors were analyzed by means of the Micromedex Drug-Reax database. Results: Out of 312 patients, only 304 were included in the study. Of the 304 cases, 103 (34%) had at least one error. The total number of errors found was 157. Drug-drug interactions were the most frequently occurring type of error (68.2%), followed by incorrect dosing interval (12%) and dosing errors (9.5%). The medication classes most often involved were antimicrobial agents (29.4%), cardiovascular agents (15.4%), GI agents (8.6%) and CNS agents (8.2%). Moderate errors contributed the most (61.8%) to the total errors, compared with major (25.5%) and minor (12.7%) errors. The results showed that the number of errors increases with age and the number of medicines prescribed. Conclusion: The results point to the need to establish medication error reporting at each hospital and to share the data with other hospitals. The role of the clinical pharmacist in this situation appears to be a strong intervention; the clinical pharmacist could initially focus on the identification of medication errors.

  10. Remote one-qubit information concentration and decoding of operator quantum error-correction codes

    International Nuclear Information System (INIS)

    Hsu Liyi

    2007-01-01

    We propose the general scheme of remote one-qubit information concentration. To achieve the task, the Bell-correlated mixed states are exploited. In addition, the non-remote one-qubit information concentration is equivalent to the decoding of a quantum error-correction code. Here we propose how to decode the stabilizer codes. In particular, the proposed scheme can be used for the operator quantum error-correction codes. The encoded state can be recreated on the errorless qubit, regardless of how many bit-flip errors and phase-flip errors have occurred
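    As a much simplified illustration of the decoding step mentioned here, the sketch below performs classical syndrome decoding for the smallest stabilizer code, the 3-qubit bit-flip repetition code; it models only single bit-flip errors and does not capture phase errors or the operator quantum error-correction generalisation discussed in the record.

    ```python
    # Toy illustration of stabilizer-style syndrome decoding using the
    # 3-qubit bit-flip repetition code (|0> -> |000>, |1> -> |111>).
    # Purely classical simulation: only bit-flip (X) errors are modelled.

    import random

    # Syndrome = outcomes of the parity checks Z1Z2 and Z2Z3,
    # computed classically here as XORs of neighbouring bits.
    def syndrome(bits):
        return (bits[0] ^ bits[1], bits[1] ^ bits[2])

    # Standard lookup table: syndrome -> index of the flipped qubit (or None).
    CORRECTION = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

    def encode(bit):
        return [bit, bit, bit]

    def decode(bits):
        flip = CORRECTION[syndrome(bits)]
        corrected = bits[:]
        if flip is not None:
            corrected[flip] ^= 1          # apply the recovery operation
        return corrected[0]               # logical value after correction

    if __name__ == "__main__":
        for logical in (0, 1):
            word = encode(logical)
            word[random.randrange(3)] ^= 1   # inject a single bit-flip error
            assert decode(word) == logical   # single errors are always recovered
        print("single bit-flip errors corrected")
    ```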

  11. Error Distributions on Large Entangled States with Non-Markovian Dynamics

    DEFF Research Database (Denmark)

    McCutcheon, Dara; Lindner, Netanel H.; Rudolph, Terry

    2014-01-01

    We investigate the distribution of errors on a computationally useful entangled state generated via the repeated emission from an emitter undergoing strongly non-Markovian evolution. For emitter-environment coupling of pure-dephasing form, we show that the probability that a particular pattern of errors occurs has a bound of Markovian form, and thus accuracy threshold theorems based on Markovian models should be just as effective. Beyond the pure-dephasing assumption, though complicated error structures can arise, they can still be qualitatively bounded by a Markovian error model.

  12. Profile of drug administration errors in anesthesia among anesthesiologists from Santa Catarina

    Directory of Open Access Journals (Sweden)

    Thomas Rolf Erdmann

    2016-02-01

    Full Text Available INTRODUCTION: Anesthesiology is the only medical specialty that prescribes, dilutes, and administers drugs without checking by another professional. Added to the high frequency of drug administration, this creates a scenario conducive to errors. OBJECTIVE: To assess the prevalence of drug administration errors during anesthesia among anesthesiologists from Santa Catarina, the circumstances in which they occurred, and possible associated factors. MATERIALS AND METHODS: An electronic questionnaire was sent to all anesthesiologists of the Sociedade de Anestesiologia do Estado de Santa Catarina, with direct or multiple choice questions on responder demographics and anesthesia practice profile; prevalence, type and consequence of errors; and factors that may have contributed to the errors. RESULTS: Of the respondents, 91.8% reported having committed administration errors, for a total of 274 errors and a mean of 4.7 (6.9) errors per respondent. The most common error was replacement (68.4%), followed by dose error (49.1%) and omission (35%). Only 7% of respondents reported a neuraxial administration error. Regarding the circumstances of the errors, they occurred mainly in the morning (32.7%) and during anesthesia maintenance (49%), with 47.8% causing no harm to the patient, 1.75% causing the highest morbidity and irreversible damage, and 87.3% of cases identified immediately. As for possible contributing factors, the most frequent were distraction and fatigue (64.9%) and misreading of labels, ampoules, or syringes (54.4%). CONCLUSION: Most respondents committed more than one drug administration error in anesthesia, mainly attributed to distraction or fatigue, and most errors were of low severity.

  13. The Implementation of APIQ Creative Mathematics Game Method in the Subject Matter of Greatest Common Factor and Least Common Multiple in Elementary School

    Science.gov (United States)

    Rahman, Abdul; Saleh Ahmar, Ansari; Arifin, A. Nurani M.; Upu, Hamzah; Mulbar, Usman; Alimuddin; Arsyad, Nurdin; Ruslan; Rusli; Djadir; Sutamrin; Hamda; Minggi, Ilham; Awi; Zaki, Ahmad; Ahmad, Asdar; Ihsan, Hisyam

    2018-01-01

    One of the causal factors behind students' lack of interest in learning mathematics is a monotonous learning method, as in traditional learning methods. One way to motivate students to learn mathematics is by implementing the APIQ (Aritmetika Plus Intelegensi Quantum) creative mathematics game method. The purposes of this research are (1) to describe students’ responses toward the implementation of the APIQ creative mathematics game method on the subject matter of Greatest Common Factor (GCF) and Least Common Multiple (LCM) and (2) to find out whether implementing this method improves students' learning completeness. Based on the results of this research, (1) the responses of the students toward the implementation of the APIQ creative mathematics game method in the subject matters of GCF and LCM were good, with response percentages between 76% and 100%, and (2) the implementation of the APIQ creative mathematics game method on the subject matters of GCF and LCM improved the students' learning completeness.
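    For reference, the two quantities targeted by the game method are computed by standard textbook algorithms; the minimal sketch below (plain Euclid's algorithm, not part of APIQ itself) shows one way to obtain them.

    ```python
    # Greatest Common Factor (GCF/GCD) via Euclid's algorithm, and
    # Least Common Multiple (LCM) derived from it: lcm(a, b) = |a*b| / gcd(a, b).

    def gcf(a: int, b: int) -> int:
        while b:
            a, b = b, a % b
        return abs(a)

    def lcm(a: int, b: int) -> int:
        return abs(a * b) // gcf(a, b) if a and b else 0

    if __name__ == "__main__":
        print(gcf(12, 18))   # 6
        print(lcm(12, 18))   # 36
    ```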

  14. List of Error-Prone Abbreviations, Symbols, and Dose Designations

    Science.gov (United States)

    A reference list of abbreviations, symbols, and dose designations which have been reported through the ISMP National Medication Errors Reporting Program (ISMP MERP) as being frequently misinterpreted.

  15. A Human Error Analysis Procedure for Identifying Potential Error Modes and Influencing Factors for Test and Maintenance Activities

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Park, Jin Kyun

    2010-01-01

    Periodic or non-periodic test and maintenance (T and M) activities in large, complex systems such as nuclear power plants (NPPs) are essential for sustaining stable and safe operation of the systems. On the other hand, it has also been pointed out that erroneous human actions occurring during T and M activities can incur unplanned reactor trips (RTs) or power derating, make safety-related systems unavailable, or degrade the reliability of components. The contribution of human errors during normal and abnormal activities of NPPs to unplanned RTs is known to be about 20% of the total events. This paper introduces a procedure for predictively analyzing human error potentials when maintenance personnel perform T and M tasks based on a work procedure or their work plan. This procedure helps the plant maintenance team prepare for plausible human errors. The procedure focuses on the recurrent error forms (or modes) in execution-based errors, such as wrong object, omission, too little, and wrong action

  16. Analysis of error patterns in clinical radiotherapy

    International Nuclear Information System (INIS)

    Macklis, Roger; Meier, Tim; Barrett, Patricia; Weinhous, Martin

    1996-01-01

    Purpose: Until very recently, prescription errors and adverse treatment events have rarely been studied or reported systematically in oncology. We wished to understand the spectrum and severity of radiotherapy errors that take place on a day-to-day basis in a high-volume academic practice and to understand the resource needs and quality assurance challenges placed on a department by rapid upswings in contract-based clinical volumes requiring additional operating hours, procedures, and personnel. The goal was to define clinical benchmarks for operating safety and to detect error-prone treatment processes that might function as 'early warning' signs. Methods: A multi-tiered prospective and retrospective system for clinical error detection and classification was developed, with formal analysis of the antecedents and consequences of all deviations from prescribed treatment delivery, no matter how trivial. A department-wide record-and-verify system was operational during this period and was used as one method of treatment verification and error detection. Brachytherapy discrepancies were analyzed separately. Results: During the analysis year, over 2000 patients were treated with over 93,000 individual fields. A total of 59 errors affecting a total of 170 individual treated fields were reported or detected during this period. After review, all of these errors were classified as Level 1 (minor discrepancy with essentially no potential for negative clinical implications). This total treatment delivery error rate (170/93,332, or 0.18%) is significantly better than corresponding error rates reported for other hospital and oncology treatment services, perhaps reflecting the relatively sophisticated error avoidance and detection procedures used in modern clinical radiation oncology. Error rates were independent of linac model and manufacturer, time of day (normal operating hours versus late evening or early morning) or clinical machine volumes. There was some relationship to

  17. Comparison between calorimeter and HLNC errors

    International Nuclear Information System (INIS)

    Goldman, A.S.; De Ridder, P.; Laszlo, G.

    1991-01-01

    This paper summarizes an error analysis that compares systematic and random errors of total plutonium mass estimated for high-level neutron coincidence counter (HLNC) and calorimeter measurements. This task was part of an International Atomic Energy Agency (IAEA) study on the comparison of the two instruments to determine if HLNC measurement errors met IAEA standards and if the calorimeter gave "significantly" better precision. Our analysis was based on propagation of error models that contained all known sources of errors including uncertainties associated with plutonium isotopic measurements. 5 refs., 2 tabs
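    The propagation-of-error comparison can be sketched in simplified form: assuming independent error components, their variances add in quadrature. The component names and numbers below are illustrative placeholders, not values from the IAEA study.

    ```python
    # Minimal propagation-of-error sketch: combine independent random and
    # systematic uncertainty components of a plutonium mass estimate in
    # quadrature. Illustrative values only; not the IAEA study data.

    import math

    def combined_sigma(components):
        """Quadrature sum of independent 1-sigma uncertainties (same units)."""
        return math.sqrt(sum(s * s for s in components))

    # Hypothetical 1-sigma uncertainties (grams Pu) for one measurement method:
    counting_stat = 1.2      # random counting statistics
    calibration   = 0.8      # systematic calibration uncertainty
    isotopics     = 0.5      # uncertainty from Pu isotopic composition

    total = combined_sigma([counting_stat, calibration, isotopics])
    print(f"combined 1-sigma uncertainty: {total:.2f} g Pu")
    ```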

  18. Naturally occurring dominant drug resistance mutations occur infrequently in the setting of recently acquired hepatitis C.

    Science.gov (United States)

    Applegate, Tanya L; Gaudieri, Silvana; Plauzolles, Anne; Chopra, Abha; Grebely, Jason; Lucas, Michaela; Hellard, Margaret; Luciani, Fabio; Dore, Gregory J; Matthews, Gail V

    2015-01-01

    Direct-acting antivirals (DAAs) are predicted to transform hepatitis C therapy, yet little is known about the prevalence of naturally occurring resistance mutations in recently acquired HCV. This study aimed to determine the prevalence and frequency of drug resistance mutations in the viral quasispecies among HIV-positive and -negative individuals with recent HCV. The NS3 protease, NS5A and NS5B polymerase genes were amplified from 50 genotype 1a participants of the Australian Trial in Acute Hepatitis C. Amino acid variations at sites known to be associated with possible drug resistance were analysed by ultra-deep pyrosequencing. A total of 12% of individuals harboured dominant resistance mutations, while 36% demonstrated non-dominant resistant variants below the level detectable by bulk sequencing. Resistance variants were observed for all drug classes, with the exception of sofosbuvir. Dominant resistant mutations were uncommonly observed in the setting of recent HCV. However, low-level mutations to all DAA classes were observed by deep sequencing at the majority of sites and in most individuals. The significance of these variants and their impact on future treatment options remain to be determined. Clinicaltrials.gov NCT00192569.

  19. Analysis of errors in forensic science

    Directory of Open Access Journals (Sweden)

    Mingxiao Du

    2017-01-01

    Full Text Available Reliability of expert testimony is one of the foundations of judicial justice. Both expert bias and scientific errors affect the reliability of expert opinion, which in turn affects the trustworthiness of the findings of fact in legal proceedings. Expert bias can be eliminated by replacing experts; however, it may be more difficult to eliminate scientific errors. From the perspective of statistics, errors in the operation of forensic science include systematic errors, random errors, and gross errors. In general, process repetition and abiding by the standard ISO/IEC 17025:2005, General requirements for the competence of testing and calibration laboratories, during operation are common measures used to reduce errors that originate from experts and equipment, respectively. For example, to reduce gross errors, the laboratory can ensure that a test is repeated several times by different experts. In applying forensic principles and methods, Federal Rule of Evidence 702 mandates that judges consider factors such as peer review, to ensure the reliability of the expert testimony. As the scientific principles and methods may not undergo professional review by specialists in a certain field, peer review serves as an exclusive standard. This study also examines two types of statistical errors. As false-positive errors involve a higher possibility of unfair decision-making, they should receive more attention than false-negative errors.

  20. Error management process for power stations

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Takeda, Daisuke; Fujimoto, Junzo; Nagasaka, Akihiko

    2016-01-01

    The purpose of this study is to establish an 'error management process for power stations' for systematizing activities for human error prevention and for fostering continuous improvement of these activities. The following are proposed by deriving concepts concerning the error management process from existing knowledge and realizing them through application and evaluation of their effectiveness at a power station: an overall picture of the error management process that facilitates four functions requisite for managing human error prevention effectively (1. systematizing human error prevention tools, 2. identifying problems based on incident reports and taking corrective actions, 3. identifying good practices and potential problems for taking proactive measures, 4. prioritizing human error prevention tools based on identified problems); detailed steps for each activity (i.e. developing an annual plan for human error prevention, reporting and analyzing incidents and near misses) based on a model of human error causation; procedures and examples of items for identifying gaps between current and desired levels of execution and outputs of each activity; and stages for introducing and establishing the above proposed error management process at a power station. By giving shape to the above proposals at a power station, systematization and continuous improvement of activities for human error prevention in line with the actual situation of the power station can be expected. (author)

  1. Impact of Measurement Error on Synchrophasor Applications

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yilu [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gracia, Jose R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ewing, Paul D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Zhao, Jiecheng [Univ. of Tennessee, Knoxville, TN (United States); Tan, Jin [Univ. of Tennessee, Knoxville, TN (United States); Wu, Ling [Univ. of Tennessee, Knoxville, TN (United States); Zhan, Lingwei [Univ. of Tennessee, Knoxville, TN (United States)

    2015-07-01

    Phasor measurement units (PMUs), which provide synchrophasor measurements, are powerful diagnostic tools that can help avert catastrophic failures in the power grid. Because of this, PMU measurement errors are particularly worrisome. This report examines the internal and external factors contributing to PMU phase angle and frequency measurement errors and gives a reasonable explanation for them. It also analyzes the impact of those measurement errors on several synchrophasor applications: event location detection, oscillation detection, islanding detection, and dynamic line rating. The primary finding is that dynamic line rating is the application most likely to be influenced by measurement error. Other findings include the possibility of reporting nonoscillatory activity as an oscillation as the result of error, failing to detect oscillations submerged by error, and the unlikely impact of error on event location and islanding detection.
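    As a toy illustration of how a phase-angle measurement error can propagate into a downstream application (the quantities below are hypothetical and not taken from the report), the sketch feeds a small angle error into the classic lossless two-bus power-transfer relation.

    ```python
    # Toy sketch: effect of a PMU phase-angle measurement error on an
    # estimated line power flow, using the lossless two-bus relation
    # P = (V1 * V2 / X) * sin(theta1 - theta2). Values are illustrative.

    import math

    def power_flow(v1_pu, v2_pu, x_pu, angle_deg):
        return (v1_pu * v2_pu / x_pu) * math.sin(math.radians(angle_deg))

    true_angle = 10.0    # degrees, true angle difference across the line
    angle_error = 0.5    # degrees, hypothetical PMU angle measurement error

    p_true = power_flow(1.0, 1.0, 0.1, true_angle)
    p_meas = power_flow(1.0, 1.0, 0.1, true_angle + angle_error)

    print(f"true P = {p_true:.3f} pu")
    print(f"meas P = {p_meas:.3f} pu")
    print(f"error  = {100 * (p_meas - p_true) / p_true:.1f} %")
    ```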

  2. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include evolution of error correction techniques, industrial user needs, architectures, and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and the next generation standards; • Provides coverage of industrial user needs for advanced error-correcting techniques.

  3. Approximate error conjugation gradient minimization methods

    Science.gov (United States)

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
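    The embodiment described above can be sketched roughly as follows, assuming a least-squares, tomography-style objective in which each 'ray' contributes one squared residual; the subsampling scheme, Fletcher-Reeves update, and line search below are illustrative choices rather than the patented method itself, and the constraint handling is omitted.

    ```python
    # Sketch: minimise a least-squares objective sum_i (a_i . x - b_i)^2,
    # where each row a_i is one "ray", but estimate the error and gradient
    # from a random subset of rays at each iteration before taking a
    # conjugate-gradient step. Illustrative only.

    import numpy as np

    rng = np.random.default_rng(0)
    n_rays, n_vox = 2000, 50
    A = rng.normal(size=(n_rays, n_vox))       # ray projection matrix
    x_true = rng.normal(size=n_vox)
    b = A @ x_true                              # noiseless ray measurements

    def approx_grad_and_error(x, subset):
        """Gradient and error of the objective estimated from a ray subset."""
        As, bs = A[subset], b[subset]
        r = As @ x - bs
        scale = n_rays / len(subset)            # rescale to full-problem size
        return scale * 2.0 * (As.T @ r), scale * float(r @ r)

    x = np.zeros(n_vox)
    d = g_prev = None
    for _ in range(200):
        subset = rng.choice(n_rays, size=200, replace=False)
        g, err = approx_grad_and_error(x, subset)
        if d is None:
            d = -g
        else:
            beta = (g @ g) / (g_prev @ g_prev)  # Fletcher-Reeves update
            d = -g + beta * d
        Ad = A[subset] @ d                      # exact line search on the
        scale = n_rays / len(subset)            # quadratic subset objective
        alpha = -(g @ d) / (2.0 * scale * float(Ad @ Ad) + 1e-12)
        x = x + alpha * d
        g_prev = g

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```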

  4. Verification of mid-ocean ballast water exchange using naturally occurring coastal tracers

    International Nuclear Information System (INIS)

    Murphy, Kathleen; Boehme, Jennifer; Coble, Paula; Cullen, Jay; Field, Paul; Moore, Willard; Perry, Elgin; Sherrell, Robert; Ruiz, Gregory

    2004-01-01

    We examined methods for verifying whether or not ships have performed mid-ocean ballast water exchange (BWE) on four commercial vessels operating in the Pacific and Atlantic Oceans. During BWE, a ship replaces the coastal water in its ballast tanks with water drawn from the open ocean, which is considered to harbor fewer organisms capable of establishing in coastal environments. We measured concentrations of several naturally occurring chemical tracers (salinity, six trace elements, colored dissolved organic matter fluorescence and radium isotopes) along ocean transects and in ballast tanks subjected to varying degrees of BWE (0-99%). Many coastal tracers showed significant concentration changes due to BWE, and our ability to detect differences between exchanged and unexchanged ballast tanks was greatest under multivariate analysis. An expanded dataset, which includes additional geographic regions, is now needed to test the generality of our results

  5. Verification of mid-ocean ballast water exchange using naturally occurring coastal tracers

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, Kathleen; Boehme, Jennifer; Coble, Paula; Cullen, Jay; Field, Paul; Moore, Willard; Perry, Elgin; Sherrell, Robert; Ruiz, Gregory

    2004-04-01

    We examined methods for verifying whether or not ships have performed mid-ocean ballast water exchange (BWE) on four commercial vessels operating in the Pacific and Atlantic Oceans. During BWE, a ship replaces the coastal water in its ballast tanks with water drawn from the open ocean, which is considered to harbor fewer organisms capable of establishing in coastal environments. We measured concentrations of several naturally occurring chemical tracers (salinity, six trace elements, colored dissolved organic matter fluorescence and radium isotopes) along ocean transects and in ballast tanks subjected to varying degrees of BWE (0-99%). Many coastal tracers showed significant concentration changes due to BWE, and our ability to detect differences between exchanged and unexchanged ballast tanks was greatest under multivariate analysis. An expanded dataset, which includes additional geographic regions, is now needed to test the generality of our results.

  6. Speech abilities in preschool children with speech sound disorder with and without co-occurring language impairment.

    Science.gov (United States)

    Macrae, Toby; Tyler, Ann A

    2014-10-01

    The authors compared preschool children with co-occurring speech sound disorder (SSD) and language impairment (LI) to children with SSD only in their numbers and types of speech sound errors. In this post hoc quasi-experimental study, independent samples t tests were used to compare the groups in the standard score from different tests of articulation/phonology, percent consonants correct, and the number of omission, substitution, distortion, typical, and atypical error patterns used in the production of different wordlists that had similar levels of phonetic and structural complexity. In comparison with children with SSD only, children with SSD and LI used similar numbers but different types of errors, including more omission patterns ( p < .001, d = 1.55) and fewer distortion patterns ( p = .022, d = 1.03). There were no significant differences in substitution, typical, and atypical error pattern use. Frequent omission error pattern use may reflect a more compromised linguistic system characterized by absent phonological representations for target sounds (see Shriberg et al., 2005). Research is required to examine the diagnostic potential of early frequent omission error pattern use in predicting later diagnoses of co-occurring SSD and LI and/or reading problems.

  7. ERROR ANALYSIS IN THE TRAVEL WRITING MADE BY THE STUDENTS OF ENGLISH STUDY PROGRAM

    Directory of Open Access Journals (Sweden)

    Vika Agustina

    2015-05-01

    Full Text Available This study was conducted to identify the kinds of errors in the surface strategy taxonomy and to determine the dominant type of errors made by fifth semester students of the English Department of a state university in Malang, Indonesia, in producing their travel writing. This study is a document analysis, since it analyses written materials, in this case travel writing texts. The analysis finds that the grammatical errors made by the students based on surface strategy taxonomy theory consist of four types: (1) omission, (2) addition, (3) misformation and (4) misordering. The most frequent misformation errors occur in the use of tense forms. The second most frequent errors are omissions of noun/verb inflections, followed by additions, where many clauses contain unnecessary added phrases.

  8. Understanding Human Error in Naval Aviation Mishaps.

    Science.gov (United States)

    Miranda, Andrew T

    2018-04-01

    To better understand the external factors that influence the performance and decisions of aviators involved in Naval aviation mishaps. Mishaps in complex activities, ranging from aviation to nuclear power operations, are often the result of interactions between multiple components within an organization. The Naval aviation mishap database contains relevant information, both in quantitative statistics and qualitative reports, that permits analysis of such interactions to identify how the working atmosphere influences aviator performance and judgment. Results from 95 severe Naval aviation mishaps that occurred from 2011 through 2016 were analyzed using Bayes' theorem probability formula. Then a content analysis was performed on a subset of relevant mishap reports. Out of the 14 latent factors analyzed, the Bayes' application identified 6 that impacted specific aspects of aviator behavior during mishaps. Technological environment, misperceptions, and mental awareness impacted basic aviation skills. The remaining 3 factors were used to inform a content analysis of the contextual information within mishap reports. Teamwork failures were the result of plan continuation aggravated by diffused responsibility. Resource limitations and risk management deficiencies impacted judgments made by squadron commanders. The application of Bayes' theorem to historical mishap data revealed the role of latent factors within Naval aviation mishaps. Teamwork failures were seen to be considerably damaging to both aviator skill and judgment. Both the methods and findings have direct application for organizations interested in understanding the relationships between external factors and human error. It presents real-world evidence to promote effective safety decisions.
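    The Bayes' theorem step can be illustrated with a small numerical sketch; the probabilities below are hypothetical placeholders rather than values from the Naval mishap database.

    ```python
    # Hypothetical sketch of the Bayes' theorem step: P(factor | error) =
    # P(error | factor) * P(factor) / P(error), with P(error) expanded by
    # the law of total probability. Numbers are placeholders, not mishap data.

    def posterior(p_factor, p_error_given_factor, p_error_given_no_factor):
        p_error = (p_error_given_factor * p_factor
                   + p_error_given_no_factor * (1.0 - p_factor))
        return p_error_given_factor * p_factor / p_error

    # e.g. a latent factor assumed present in 30% of mishaps, with skill-based
    # errors occurring in 80% of mishaps with it and 40% of mishaps without it.
    print(round(posterior(0.30, 0.80, 0.40), 3))   # -> 0.462
    ```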

  9. Frequency of medication errors in an emergency department of a large teaching hospital in southern Iran

    Directory of Open Access Journals (Sweden)

    Vazin A

    2014-12-01

    Full Text Available This study was conducted with the purpose of determining the frequency of medication errors (MEs) occurring in the tertiary care emergency department (ED) of a large academic hospital in Iran. The incidence of MEs was determined through the disguised direct observation method conducted by a trained observer. A total of 1,031 medication doses administered to 202 patients admitted to the tertiary care ED were observed over a course of 54 6-hour shifts. Following collection of the data and analysis of the errors with the assistance of a clinical pharmacist, the frequency of errors in the different stages was reported and analyzed in SPSS-21 software. For the 202 patients and the 1,031 medication doses evaluated in the present study, 707 MEs (68.5%) were recorded in total. In other words, 3.5 errors per patient and almost 0.69 errors per medication dose are reported to have occurred, with the highest frequency of errors pertaining to cardiovascular (27.2%) and antimicrobial (23.6%) medications. The highest rate of errors occurred during the administration phase of the medication use process with a share of 37.6%, followed by errors of prescription and transcription with shares of 21.1% and 10% of errors, respectively. Omission (7.6%) and wrong time errors (4.4%) were the most frequent administration errors. Less-experienced nurses (P=0.04), a higher patient-to-nurse ratio (P=0.017), and the morning shifts (P=0.035) were positively related to administration errors. Administration errors marked the highest share of MEs occurring in the different medication use processes. Increasing the number of nurses and employing the more experienced of them in EDs can help reduce nursing errors. Addressing the shortcomings with further research should result in reduction

  10. The Error Reporting in the ATLAS TDAQ System

    Science.gov (United States)

    Kolos, Serguei; Kazarov, Andrei; Papaevgeniou, Lykourgos

    2015-05-01

    The ATLAS Error Reporting provides a service that allows experts and shift crew to track and address errors relating to the data taking components and applications. This service, called the Error Reporting Service (ERS), gives software applications the opportunity to collect and send comprehensive data about run-time errors to a place where it can be intercepted in real-time by any other system component. Other ATLAS online control and monitoring tools use the ERS as one of their main inputs to address system problems in a timely manner and to improve the quality of acquired data. The actual destination of the error messages depends solely on the run-time environment in which the online applications are operating. When an application sends information to ERS, depending on the configuration, it may end up in a local file, a database, or distributed middleware which can transport it to an expert system or display it to users. Thanks to the open framework design of ERS, new information destinations can be added at any moment without touching the reporting and receiving applications. The ERS Application Program Interface (API) is provided in three programming languages used in the ATLAS online environment: C++, Java and Python. All APIs use exceptions for error reporting but each of them exploits advanced features of a given language to simplify the end-user program writing. For example, since C++ provides no built-in support for generating exception class hierarchies, a number of macros have been designed to generate hierarchies of C++ exception classes at compile time. Using this approach a software developer can write a single line of code to generate boilerplate code for a fully qualified C++ exception class declaration with an arbitrary number of parameters and multiple constructors, which encapsulates all relevant static information about the given type of issue. When a corresponding error occurs at run time, the program just needs to create an instance of that class passing relevant values to one

  11. Factors within the family environment such as parents' dietary habits and fruit and vegetable availability have the greatest influence on fruit and vegetable consumption by Polish children.

    Science.gov (United States)

    Wolnicka, Katarzyna; Taraszewska, Anna Małgorzata; Jaczewska-Schuetz, Joanna; Jarosz, Mirosław

    2015-10-01

    To identify determinants of fruit and vegetable (F&V) consumption among school-aged children. A survey study was conducted in October 2010. The questionnaire contained questions concerning social and demographic data, lifestyle and dietary habits, particularly the frequency of F&V consumption, availability of F&V and knowledge about recommended amounts of F&V intake. Polish primary schools. Children (n 1255) aged 9 years from randomly selected primary schools and their parents. The children's consumption of fruit and of vegetables was influenced by the fruit consumption and vegetable consumption of their parents (r=0·333 and r=0·273, respectively; P=0·001), parents encouraging their children to eat F&V (r=0·259 and r=0·271, respectively; P=0·001), giving children F&V to take to school (r=0·338 and r=0·321, respectively; P=0·001) and the availability of F&V at home (r=0·200 and r=0·296, respectively; P=0·001). Parental education influenced only the frequency of fruit consumption (r=0·074; P=0·01). A correlation between parents' knowledge of the recommended intakes and the frequency of vegetable and fruit consumption by children was noticed (r=0·258 and r=0·192, respectively, P=0·001). Factors within the family environment such as parents' dietary habits and F&V availability had the greatest influence on the F&V consumption by children. Educational activities aimed at parents are crucial to increase the consumption of F&V among children.

  12. Frequency of Burnout, Sleepiness and Depression in Emergency Medicine Residents with Medical Errors in the Emergency Department

    Directory of Open Access Journals (Sweden)

    Alireza Aala

    2014-07-01

    Full Text Available Aims: Medical error is a great concern of patients and physicians. It usually occurs due to physicians' exhaustion, distress and fatigue. In this study, we aimed to evaluate the frequency of distress and fatigue among emergency medicine residents reporting a medical error. Materials and Methods: The study population consisted of emergency medicine residents who completed an emailed questionnaire including self-assessment of medical errors, the Epworth Sleepiness Scale (ESS) score, the Maslach Burnout Inventory, and the PRIME-MD validated depression screening tool. Results: In this survey, 100 medical errors were reported, including diagnostic errors in 53, therapeutic errors in 24 and follow-up errors in 23 subjects. Most errors were reported by males and third year residents. Residents had no signs of depression, but all had some degree of sleepiness and burnout. There were significant differences between error subtypes and age, residency year, depression, sleepiness and burnout scores (p<0.0001). Conclusion: In conclusion, residents committing a medical error usually experience burnout and have some degree of sleepiness that makes them less motivated, increasing the probability of medical errors. However, as none of the residents had depression, it could be concluded that depression has no significant role in medical error occurrence and perhaps it is a possible consequence of medical error. Keywords: Residents; Medical error; Burnout; Sleepiness; Depression

  13. A systems perspective of managing error recovery and tactical re-planning of operating teams in safety critical domains.

    Science.gov (United States)

    Kontogiannis, Tom

    2011-04-01

    Research in human error has provided useful tools for designing procedures, training, and intelligent interfaces that trap errors at an early stage. However, this "error prevention" policy may not be entirely successful because human errors will inevitably occur. This requires that the error management process (e.g., detection, diagnosis and correction) must also be supported. Research has focused almost exclusively on error detection; little is known about error recovery, especially in the context of safety critical systems. The aim of this paper is to develop a research framework that integrates error recovery strategies employed by experienced practitioners in handling their own errors. A control theoretic model of human performance was used to integrate error recovery strategies assembled from reviews of the literature, analyses of near misses from aviation and command & control domains, and observations of abnormal situations training at air traffic control facilities. The method of system dynamics has been used to analyze and compare error recovery strategies in terms of patterns of interaction, system affordances, and types of recovery plans. System dynamics offer a promising basis for studying the nature of error recovery management in the context of team interactions and system characteristics. The proposed taxonomy of error recovery strategies can help human factors and safety experts to develop resilient system designs and training solutions for managing human errors in unforeseen situations; it may also help incident investigators to explore why people's actions and assessments were not corrected at the time. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. How common are cognitive errors in cases presented at emergency medicine resident morbidity and mortality conferences?

    Science.gov (United States)

    Chu, David; Xiao, Jane; Shah, Payal; Todd, Brett

    2018-06-20

    Cognitive errors are a major contributor to medical error. Traditionally, medical errors at teaching hospitals are analyzed in morbidity and mortality (M&M) conferences. We aimed to describe the frequency of cognitive errors in relation to the occurrence of diagnostic and other error types, in cases presented at an emergency medicine (EM) resident M&M conference. We conducted a retrospective study of all cases presented at a suburban US EM residency monthly M&M conference from September 2011 to August 2016. Each case was reviewed using the electronic medical record (EMR) and notes from the M&M case by two EM physicians. Each case was categorized by type of primary medical error that occurred as described by Okafor et al. When a diagnostic error occurred, the case was reviewed for contributing cognitive and non-cognitive factors. Finally, when a cognitive error occurred, the case was classified into faulty knowledge, faulty data gathering or faulty synthesis, as described by Graber et al. Disagreements in error type were mediated by a third EM physician. A total of 87 M&M cases were reviewed; the two reviewers agreed on 73 cases, and 14 cases required mediation by a third reviewer. Forty-eight cases involved diagnostic errors, 47 of which were cognitive errors. Of these 47 cases, 38 involved faulty synthesis, 22 involved faulty data gathering and only 11 involved faulty knowledge. Twenty cases contained more than one type of cognitive error. Twenty-nine cases involved both a resident and an attending physician, while 17 cases involved only an attending physician. Twenty-one percent of the resident cases involved all three cognitive errors, while none of the attending cases involved all three. Forty-one percent of the resident cases and only 6% of the attending cases involved faulty knowledge. One hundred percent of the resident cases and 94% of the attending cases involved faulty synthesis. Our review of 87 EM M&M cases revealed that cognitive errors are commonly

  15. Modeling the probability distribution of positional errors incurred by residential address geocoding

    Directory of Open Access Journals (Sweden)

    Mazumdar Soumya

    2007-01-01

    Full Text Available Abstract Background The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Results Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (>15 km) outliers occurred among the 60%-matched geocoding errors; outliers occurred for the other two types of geocoding errors also but were much smaller. E911 geocoding was more accurate (median error length = 44 m) than 100%-matched automated geocoding (median error length = 168 m). The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that were not capable of being fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Conclusion Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.
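    A rough sketch of the modelling step on synthetic data is shown below. It fits a two-component Gaussian mixture as a simpler stand-in for the bivariate t mixtures the study describes (heavier-tailed components would require a custom EM routine or a dedicated package), and the simulated error clusters are purely illustrative.

    ```python
    # Sketch: fit a 2-component mixture to 2-D positional-error vectors.
    # Gaussian components are used as a simpler stand-in for the bivariate
    # t mixtures described in the study; the error data are simulated.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)

    # Simulated positional errors (metres): a tight cluster of small errors
    # plus a broader cluster of larger errors.
    small = rng.normal(0.0, 40.0, size=(1000, 2))
    large = rng.normal(0.0, 250.0, size=(400, 2))
    errors = np.vstack([small, large])

    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
    gmm.fit(errors)

    for w, cov in zip(gmm.weights_, gmm.covariances_):
        print(f"weight={w:.2f}, per-axis sigma ~ {np.sqrt(np.diag(cov)).round(0)}")
    ```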

  16. A prospective, multicenter study of pharmacist activities resulting in medication error interception in the emergency department.

    Science.gov (United States)

    Patanwala, Asad E; Sanders, Arthur B; Thomas, Michael C; Acquisto, Nicole M; Weant, Kyle A; Baker, Stephanie N; Merritt, Erica M; Erstad, Brian L

    2012-05-01

    The primary objective of this study is to determine the activities of pharmacists that lead to medication error interception in the emergency department (ED). This was a prospective, multicenter cohort study conducted in 4 geographically diverse academic and community EDs in the United States. Each site had clinical pharmacy services. Pharmacists at each site recorded their medication error interceptions for 250 hours of cumulative time when present in the ED (1,000 hours total for all 4 sites). Items recorded included the activities of the pharmacist that led to medication error interception, type of orders, phase of medication use process, and type of error. Independent evaluators reviewed all medication errors. Descriptive analyses were performed for all variables. A total of 16,446 patients presented to the EDs during the study, resulting in 364 confirmed medication error interceptions by pharmacists. The pharmacists' activities that led to medication error interception were as follows: involvement in consultative activities (n=187; 51.4%), review of medication orders (n=127; 34.9%), and other (n=50; 13.7%). The types of orders resulting in medication error interceptions were written or computerized orders (n=198; 54.4%), verbal orders (n=119; 32.7%), and other (n=47; 12.9%). Most medication error interceptions occurred during the prescribing phase of the medication use process (n=300; 82.4%) and the most common type of error was wrong dose (n=161; 44.2%). Pharmacists' review of written or computerized medication orders accounts for only a third of medication error interceptions. Most medication error interceptions occur during consultative activities. Copyright © 2011. Published by Mosby, Inc.

  17. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Seong, Poong Hyun; Kang, Hyun Gook [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Na, Man Gyun [Chosun Univ., Gwangju (Korea, Republic of); Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of); Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Jung, Yoensub [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of)

    2013-04-15

    This paper aims to give an overview of the methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not inherently perfect, particularly under stressful conditions. However, we have the opportunity to improve such a situation through advanced information and communication technologies on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, man-machine interface, operator support systems, and procedures. Upon this investigation, we outline the concept and technical factors to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to lead the future direction of related research and ultimately enhance the safety of NPPs.

  18. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    International Nuclear Information System (INIS)

    Seong, Poong Hyun; Kang, Hyun Gook; Na, Man Gyun; Kim, Jong Hyun; Heo, Gyunyoung; Jung, Yoensub

    2013-01-01

    This paper aims to give an overview of the methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not inherently perfect, particularly under stressful conditions. However, we have the opportunity to improve such a situation through advanced information and communication technologies on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, man-machine interface, operator support systems, and procedures. Upon this investigation, we outline the concept and technical factors to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to lead the future direction of related research and ultimately enhance the safety of NPPs

  19. Evaluation of measurement precision errors at different bone density values

    International Nuclear Information System (INIS)

    Wilson, M.; Wong, J.; Bartlett, M.; Lee, N.

    2002-01-01

    Full text: The precision error commonly used in serial monitoring of BMD values using Dual Energy X-ray Absorptiometry (DEXA) is 0.01-0.015 g/cm² for both the L2-L4 lumbar spine and the total femur. However, this limit is based on normal individuals with bone densities similar to the population mean. The purpose of this study was to systematically evaluate precision errors over the range of bone density values encountered in clinical practice. In 96 patients a BMD scan of the spine and femur was immediately repeated by the same technologist, with the patient taken off the bed and repositioned between scans. Nine technologists participated. Values were obtained for the total femur and spine. Each value was classified as low range (0.75-1.05 g/cm²) or medium range (1.05-1.35 g/cm²) for the spine, and low range (0.55-0.85 g/cm²) or medium range (0.85-1.15 g/cm²) for the total femur. Results show that the precision error was significantly lower in the medium range for total femur results, with the medium range value at 0.015 g/cm² and the low range at 0.025 g/cm² (p<0.01). No significant difference was found for the spine results. We also analysed precision errors between three technologists and found a significant difference (p=0.05) occurred between only two technologists, and this was seen in the spine data only. We conclude that there is some evidence that the precision error increases at the outer limits of the normal bone density range. Also, the results show that having multiple trained operators does not greatly increase the BMD precision error. Copyright (2002) The Australian and New Zealand Society of Nuclear Medicine Inc
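    The precision error compared in this study is, in its usual short-term form, the root-mean-square standard deviation across patients scanned twice; the sketch below shows that calculation on made-up duplicate BMD values, not the study's data.

    ```python
    # Sketch: short-term precision error (RMS standard deviation) from
    # duplicate BMD scans. For paired scans, SD_i = |x1_i - x2_i| / sqrt(2),
    # so the precision error is sqrt(sum(d_i^2) / (2 * n_patients)).
    # Illustrative values in g/cm^2; not the study data.

    import math

    duplicate_scans = [                 # (scan 1, scan 2) per patient
        (0.912, 0.905),
        (1.104, 1.121),
        (0.787, 0.800),
        (1.250, 1.238),
    ]

    n = len(duplicate_scans)
    rms_sd = math.sqrt(sum((a - b) ** 2 for a, b in duplicate_scans) / (2 * n))
    print(f"precision error (RMS SD): {rms_sd:.3f} g/cm^2")
    ```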

  20. Diagnostic Error in Stroke-Reasons and Proposed Solutions.

    Science.gov (United States)

    Bakradze, Ekaterina; Liberman, Ava L

    2018-02-13

    We discuss the frequency of stroke misdiagnosis and identify subgroups of stroke at high risk for specific diagnostic errors. In addition, we review common reasons for misdiagnosis and propose solutions to decrease error. According to a recent report by the National Academy of Medicine, most people in the USA are likely to experience a diagnostic error during their lifetimes. Nearly half of such errors result in serious disability and death. Stroke misdiagnosis is a major health care concern, with initial misdiagnosis estimated to occur in 9% of all stroke patients in the emergency setting. Under- or missed diagnosis (false negative) of stroke can result in adverse patient outcomes due to the preclusion of acute treatments and failure to initiate secondary prevention strategies. On the other hand, the overdiagnosis of stroke can result in inappropriate treatment, delayed identification of actual underlying disease, and increased health care costs. Young patients, women, minorities, and patients presenting with non-specific, transient, or posterior circulation stroke symptoms are at increased risk of misdiagnosis. Strategies to decrease diagnostic error in stroke have largely focused on early stroke detection via bedside examination strategies and clinical decision rules. Targeted interventions to improve the diagnostic accuracy of stroke diagnosis among high-risk groups as well as symptom-specific clinical decision supports are needed. There are a number of open questions in the study of stroke misdiagnosis. To improve patient outcomes, existing strategies to improve stroke diagnostic accuracy should be more broadly adopted and novel interventions devised and tested to reduce diagnostic errors.

  1. ADVANCED MMIS TOWARD SUBSTANTIAL REDUCTION IN HUMAN ERRORS IN NPPS

    Directory of Open Access Journals (Sweden)

    POONG HYUN SEONG

    2013-04-01

    Full Text Available This paper aims to give an overview of the methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not inherently perfect, particularly under stressful conditions. However, we have the opportunity to improve such a situation through advanced information and communication technologies on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, man-machine interface, operator support systems, and procedures. Upon this investigation, we outline the concept and technical factors to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to lead the future direction of related research and ultimately enhance the safety of NPPs.

  2. The two errors of using the within-subject standard deviation (WSD) as the standard error of a reliable change index.

    Science.gov (United States)

    Maassen, Gerard H

    2010-08-01

    In this Journal, Lewis and colleagues introduced a new Reliable Change Index (RCI(WSD)), which incorporated the within-subject standard deviation (WSD) of a repeated measurement design as the standard error. In this note, two opposite errors in using WSD this way are demonstrated. First, because WSD is the standard error of measurement of only a single assessment, it is too small when practice effects are absent; too many individuals will then be designated as reliably changed. Second, WSD can grow without limit to the extent that differential practice effects occur, which can even make RCI(WSD) unable to detect any reliable change.
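    For context, the sketch below shows the classical Jacobson-Truax reliable change index alongside a WSD-based variant of the kind the note critiques; the example numbers are invented, and the exact formula proposed by Lewis and colleagues may differ in detail.

    ```python
    # Sketch: classical Jacobson-Truax Reliable Change Index versus a
    # WSD-based variant. Example numbers are illustrative only.

    import math

    def rci_jacobson_truax(x1, x2, sd_baseline, reliability):
        sem = sd_baseline * math.sqrt(1.0 - reliability)   # SE of measurement
        s_diff = math.sqrt(2.0) * sem                       # SE of a difference
        return (x2 - x1) / s_diff

    def rci_wsd(x1, x2, wsd):
        # Variant criticised in the note: WSD used directly as the standard
        # error of the change score (no sqrt(2) inflation, no practice effects).
        return (x2 - x1) / wsd

    x1, x2 = 24.0, 31.0
    print(round(rci_jacobson_truax(x1, x2, sd_baseline=6.0, reliability=0.85), 2))
    print(round(rci_wsd(x1, x2, wsd=2.3), 2))
    # |RCI| > 1.96 is the usual cut-off for "reliable change" at the 5% level.
    ```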

  3. At the cross-roads: an on-road examination of driving errors at intersections.

    Science.gov (United States)

    Young, Kristie L; Salmon, Paul M; Lenné, Michael G

    2013-09-01

    A significant proportion of road trauma occurs at intersections. Understanding the nature of driving errors at intersections therefore has the potential to lead to significant injury reductions. To further understand how the complexity of modern intersections shapes driver behaviour, errors made at intersections are compared to errors made mid-block, and the role of wider systems failures in intersection error causation is investigated in an on-road study. Twenty-five participants drove a pre-determined urban route incorporating 25 intersections. Two in-vehicle observers recorded the errors made while a range of other data was collected, including driver verbal protocols, video, driver eye glance behaviour and vehicle data (e.g., speed, braking and lane position). Participants also completed a post-trial cognitive task analysis interview. Participants were found to make 39 specific error types, with speeding violations the most common. Participants made significantly more errors at intersections compared to mid-block, with misjudgement, action and perceptual/observation errors more commonly observed at intersections. Traffic signal configuration was found to play a key role in intersection error causation, with drivers making more errors at partially signalised compared to fully signalised intersections. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment

    Science.gov (United States)

    Prive, N. C.; Errico, Ronald M.

    2015-01-01

    The spectra of analysis and forecast error are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA GMAO). A global numerical weather prediction model, the Goddard Earth Observing System Model, Version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self-analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low wave number errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.

  5. Human errors related to maintenance and modifications

    International Nuclear Information System (INIS)

    Laakso, K.; Pyy, P.; Reiman, L.

    1998-01-01

    The focus in human reliability analysis (HRA) relating to nuclear power plants has traditionally been on human performance in disturbance conditions. On the other hand, some studies and incidents have shown that also maintenance errors, which have taken place earlier in plant history, may have an impact on the severity of a disturbance, e.g. if they disable safety related equipment. Especially common cause and other dependent failures of safety systems may significantly contribute to the core damage risk. The first aim of the study was to identify and give examples of multiple human errors which have penetrated the various error detection and inspection processes of plant safety barriers. Another objective was to generate numerical safety indicators to describe and forecast the effectiveness of maintenance. A more general objective was to identify needs for further development of maintenance quality and planning. In the first phase of this operational experience feedback analysis, human errors recognisable in connection with maintenance were looked for by reviewing about 4400 failure and repair reports and some special reports which cover two nuclear power plant units on the same site during 1992-94. A special effort was made to study dependent human errors, since they are generally the most serious ones. An in-depth root cause analysis was made for 14 dependent errors by interviewing plant maintenance foremen and by thoroughly analysing the errors. A simpler treatment was given to maintenance-related single errors. The results were presented as a distribution of errors across operating states, covering, among other things: the operational state in which the errors were committed and detected; the operational and working conditions in which the errors were detected; and the component and error type to which they were related. These results were presented separately for single and dependent maintenance-related errors. As regards dependent errors, observations were also made

  6. [Medication errors in a hospital emergency department: study of the current situation and critical points for improving patient safety].

    Science.gov (United States)

    Pérez-Díez, Cristina; Real-Campaña, José Manuel; Noya-Castro, María Carmen; Andrés-Paricio, Felicidad; Reyes Abad-Sazatornil, María; Bienvenido Povar-Marco, Javier

    2017-01-01

    To determine the frequency of medication errors and incident types in a tertiary-care hospital emergency department. To quantify and classify medication errors and identify critical points where measures should be implemented to improve patient safety. Prospective direct-observation study to detect errors made in June and July 2016. The overall error rate was 23.7%. The most common errors were made while medications were administered (10.9%). We detected 1532 incidents: 53.6% on workdays (P=.001), 43.1% during the afternoon/evening shift (P=.004), and 43.1% in observation areas (P=.004). The medication error rate was significant. Most errors and incidents occurred during the afternoon/evening shift and in the observation area. Most errors were related to administration of medications.

  7. Speech Abilities in Preschool Children with Speech Sound Disorder with and without Co-Occurring Language Impairment

    Science.gov (United States)

    Macrae, Toby; Tyler, Ann A.

    2014-01-01

    Purpose: The authors compared preschool children with co-occurring speech sound disorder (SSD) and language impairment (LI) to children with SSD only in their numbers and types of speech sound errors. Method: In this post hoc quasi-experimental study, independent samples t tests were used to compare the groups in the standard score from different…

  8. Angular truncation errors in integrating nephelometry

    International Nuclear Information System (INIS)

    Moosmueller, Hans; Arnott, W. Patrick

    2003-01-01

    Ideal integrating nephelometers integrate light scattered by particles over all directions. However, real nephelometers truncate light scattered in near-forward and near-backward directions below a certain truncation angle (typically 7 deg.). This results in truncation errors, with the forward truncation error becoming important for large particles. Truncation errors are commonly calculated using Mie theory, which offers little physical insight and no generalization to nonspherical particles. We show that large-particle forward truncation errors can be calculated and understood using geometric optics and diffraction theory. For small truncation angles (i.e., <10 deg.), as is typical for modern nephelometers, diffraction theory by itself is sufficient. Forward truncation errors are, by nearly a factor of 2, larger for absorbing particles than for nonabsorbing particles because for large absorbing particles most of the scattered light is due to diffraction, as transmission is suppressed. Nephelometer calibration procedures are also discussed, as they influence the effective truncation error
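
    As a rough numerical illustration of the diffraction argument, the sketch below treats the near-forward scattering of a large sphere as the Airy pattern of an equivalent circular aperture and uses the standard encircled-energy formula 1 - J0²(v) - J1²(v) to estimate the fraction of diffracted light that falls inside the truncation cone and is therefore missed. The particle size, wavelength and truncation angle are arbitrary example values; this is a back-of-the-envelope sketch, not the Mie or geometric-optics calculation of the paper.

```python
# Back-of-the-envelope sketch: fraction of forward-diffracted light missed
# below the truncation angle, modelling the diffraction lobe of a large
# sphere as the Airy pattern of an equivalent circular aperture.
import math
from scipy.special import j0, j1   # Bessel functions of the first kind

wavelength_um = 0.55     # illustrative wavelength (green light)
diameter_um = 5.0        # illustrative large-particle diameter
truncation_deg = 7.0     # typical nephelometer truncation angle

v = (math.pi * diameter_um / wavelength_um) * math.sin(math.radians(truncation_deg))

# Airy encircled energy inside the truncation cone = fraction of the
# diffracted light that the nephelometer fails to collect.
missed = 1.0 - j0(v) ** 2 - j1(v) ** 2
print(f"~{100 * missed:.0f}% of the diffracted light lies below {truncation_deg} deg")

# For a large non-absorbing sphere roughly half of the scattered light is
# diffraction, so the truncation error on total scattering is about half of
# this; for strongly absorbing particles diffraction dominates and the
# error is correspondingly larger, as noted above.
```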

  9. Collection of offshore human error probability data

    International Nuclear Information System (INIS)

    Basra, Gurpreet; Kirwan, Barry

    1998-01-01

    Accidents such as Piper Alpha have increased concern about the effects of human errors in complex systems. Such accidents can in theory be predicted and prevented by risk assessment, and in particular human reliability assessment (HRA), but HRA ideally requires qualitative and quantitative human error data. A research initiative at the University of Birmingham led to the development of CORE-DATA, a Computerised Human Error Data Base. This system currently contains a reasonably large number of human error data points, collected from a variety of mainly nuclear-power related sources. This article outlines a recent offshore data collection study, concerned with collecting lifeboat evacuation data. Data collection methods are outlined and a selection of human error probabilities generated as a result of the study are provided. These data give insights into the type of errors and human failure rates that could be utilised to support offshore risk analyses

  10. Applications of human error analysis to aviation and space operations

    International Nuclear Information System (INIS)

    Nelson, W.R.

    1998-01-01

    We are currently adapting our methods and tools of human error analysis to the domain of air traffic management (ATM) systems. Under the NASA-sponsored Advanced Air Traffic Technologies (AATT) program we are working to address issues of human reliability in the design of ATM systems to support the development of a ''free flight'' environment for commercial air traffic in the United States. We are also currently testing the application of our human error analysis approach for space flight operations. We have developed a simplified model of the critical habitability functions for the space station Mir, and have used this model to assess the effects of system failures and human errors that have occurred in the wake of the collision incident last year. We are developing an approach so that lessons learned from Mir operations can be systematically applied to design and operation of long-term space missions such as the International Space Station (ISS) and the manned Mars mission. (author)

  11. Error-related anterior cingulate cortex activity and the prediction of conscious error awareness

    Directory of Open Access Journals (Sweden)

    Catherine eOrr

    2012-06-01

    Full Text Available Research examining the neural mechanisms associated with error awareness has consistently identified dorsal anterior cingulate cortex (ACC) activity as necessary, but not predictive, of conscious error detection. Two recent studies (Steinhauser and Yeung, 2010; Wessel et al., 2011) found a contrary pattern of greater dorsal ACC activity (in the form of the error-related negativity) during detected errors, but suggested that the greater activity may instead reflect task influences (e.g., response conflict, error probability) and/or individual variability (e.g., statistical power). We re-analyzed fMRI BOLD data from 56 healthy participants who had previously been administered the Error Awareness Task, a motor Go/No-go response inhibition task in which subjects make errors of commission of which they are aware (Aware errors) or unaware (Unaware errors). Consistent with previous data, activity in a number of cortical regions was predictive of error awareness, including bilateral inferior parietal and insula cortices; however, in contrast to previous studies, including our own smaller-sample studies using the same task, error-related dorsal ACC activity was significantly greater during aware errors than during unaware errors. While the significantly faster RT for aware errors (compared to unaware errors) was consistent with the hypothesis that higher response conflict increases ACC activity, we could find no relationship between dorsal ACC activity and the error RT difference. The data suggest that individual variability in error awareness is associated with error-related dorsal ACC activity, and therefore this region may be important to conscious error detection, but it remains unclear what task and individual factors influence error awareness.

  12. Group representations, error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E

    1996-01-01

    This report continues the discussion of unitary error bases and quantum codes. Nice error bases are characterized in terms of the existence of certain characters in a group. A general construction for error bases which are non-abelian over the center is given. The method for obtaining codes due to Calderbank et al. is generalized and expressed purely in representation theoretic terms. The significance of the inertia subgroup both for constructing codes and obtaining the set of transversally implementable operations is demonstrated.
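
    As a small, generic illustration (not taken from the report), the snippet below checks numerically that the single-qubit Pauli operators form a unitary error basis: each element is unitary, and the elements are pairwise orthogonal under the normalized Hilbert-Schmidt inner product.

```python
# Minimal numeric check (not from the report): the single-qubit Pauli
# operators form a unitary error basis -- unitary, and pairwise orthogonal
# under the Hilbert-Schmidt inner product <A, B> = Tr(A^dagger B) / d.
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
basis = {"I": I, "X": X, "Y": Y, "Z": Z}

for name, E in basis.items():
    assert np.allclose(E @ E.conj().T, I), f"{name} is not unitary"

for a, A in basis.items():
    for b, B in basis.items():
        hs = np.trace(A.conj().T @ B) / 2.0
        expected = 1.0 if a == b else 0.0
        assert np.isclose(hs, expected)

print("Pauli operators form an orthonormal unitary error basis (d = 2)")
```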

  13. Practical, Reliable Error Bars in Quantum Tomography

    OpenAIRE

    Faist, Philippe; Renner, Renato

    2015-01-01

    Precise characterization of quantum devices is usually achieved with quantum tomography. However, most methods which are currently widely used in experiments, such as maximum likelihood estimation, lack a well-justified error analysis. Promising recent methods based on confidence regions are difficult to apply in practice or yield error bars which are unnecessarily large. Here, we propose a practical yet robust method for obtaining error bars. We do so by introducing a novel representation of...

  14. Soft errors in modern electronic systems

    CERN Document Server

    Nicolaidis, Michael

    2010-01-01

    This book provides a comprehensive presentation of the most advanced research results and technological developments enabling understanding, qualifying and mitigating the soft errors effect in advanced electronics, including the fundamental physical mechanisms of radiation induced soft errors, the various steps that lead to a system failure, the modelling and simulation of soft error at various levels (including physical, electrical, netlist, event driven, RTL, and system level modelling and simulation), hardware fault injection, accelerated radiation testing and natural environment testing, s

  15. Error calculations statistics in radioactive measurements

    International Nuclear Information System (INIS)

    Verdera, Silvia

    1994-01-01

    Basic approaches and procedures frequently used in the practice of radioactive measurements. The statistical principles applied are part of Good Radiopharmaceutical Practices and quality assurance. Concept of error; classification into systematic and random errors. Statistical fundamentals: probability theory, population distributions (Bernoulli, Poisson, Gauss), the t-test distribution, the χ² test, and error propagation based on analysis of variance. Bibliography. z table, t-test table, Poisson index, χ² test
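
    A generic example of the Poisson error propagation covered by such material is the uncertainty of a net count rate, where the gross and background counting errors add in quadrature; the counts and counting times below are invented.

```python
# Standard Poisson counting statistics for a net count rate -- a small,
# generic example of the error propagation discussed above (numbers invented).
import math

gross_counts, t_gross = 12500, 300.0   # sample + background counts, seconds
bkg_counts, t_bkg = 1800, 600.0        # background alone, seconds

rate_gross = gross_counts / t_gross
rate_bkg = bkg_counts / t_bkg
rate_net = rate_gross - rate_bkg

# Poisson: sigma(N) = sqrt(N); rates scale by 1/t; independent errors add
# in quadrature when the rates are subtracted.
sigma_net = math.sqrt(gross_counts / t_gross**2 + bkg_counts / t_bkg**2)

print(f"net rate = {rate_net:.2f} +/- {sigma_net:.2f} counts/s")
```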

  16. Ultrahigh Error Threshold for Surface Codes with Biased Noise

    Science.gov (United States)

    Tuckett, David K.; Bartlett, Stephen D.; Flammia, Steven T.

    2018-02-01

    We show that a simple modification of the surface code can exhibit an enormous gain in the error correction threshold for a noise model in which Pauli Z errors occur more frequently than X or Y errors. Such biased noise, where dephasing dominates, is ubiquitous in many quantum architectures. In the limit of pure dephasing noise we find a threshold of 43.7(1)% using a tensor network decoder proposed by Bravyi, Suchara, and Vargo. The threshold remains surprisingly large in the regime of realistic noise bias ratios, for example 28.2(2)% at a bias of 10. The performance is, in fact, at or near the hashing bound for all values of the bias. The modified surface code still uses only weight-4 stabilizers on a square lattice, but merely requires measuring products of Y instead of Z around the faces, as this doubles the number of useful syndrome bits associated with the dominant Z errors. Our results demonstrate that large efficiency gains can be found by appropriately tailoring codes and decoders to realistic noise models, even under the locality constraints of topological codes.
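
    The sketch below illustrates only the biased Pauli noise model discussed above (not the tailored surface code or the tensor-network decoder), taking the bias as eta = p_Z / (p_X + p_Y) with p_X = p_Y, which is how such a bias ratio is commonly parameterized.

```python
# Sketch of the biased Pauli noise model only (not the tailored code or the
# decoder): bias eta is taken as p_Z / (p_X + p_Y) with p_X = p_Y.
import random

def sample_biased_pauli(p: float, eta: float) -> str:
    """Return 'I', 'X', 'Y' or 'Z' for one qubit with total error rate p."""
    p_x = p_y = p / (2.0 * (1.0 + eta))
    p_z = p * eta / (1.0 + eta)
    r = random.random()
    if r < p_z:
        return "Z"
    if r < p_z + p_x:
        return "X"
    if r < p_z + p_x + p_y:
        return "Y"
    return "I"

# e.g. bias of 10: Z errors are 10x more likely than X and Y combined
counts = {"I": 0, "X": 0, "Y": 0, "Z": 0}
for _ in range(100_000):
    counts[sample_biased_pauli(p=0.1, eta=10.0)] += 1
print(counts)
```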

  17. Complications: acknowledging, managing, and coping with human error.

    Science.gov (United States)

    Helo, Sevann; Moulton, Carol-Anne E

    2017-08-01

    Errors are inherent in medicine due to the imperfectness of human nature. Health care providers may have a difficult time accepting their fallibility, acknowledging mistakes, and disclosing errors. Fear of litigation, shame, blame, and concern about reputation are just some of the barriers preventing physicians from being more candid with their patients, despite the supporting body of evidence that patients cite poor communication and lack of transparency as primary drivers to file a lawsuit in the wake of a medical complication. Proper error disclosure includes a timely explanation of what happened, who was involved, why the error occurred, and how it will be prevented in the future. Medical mistakes afford the opportunity for individuals and institutions to be candid about their weaknesses while improving patient care processes. When a physician takes the Hippocratic Oath they take on a tremendous sense of responsibility for the care of their patients, and often bear the burden of their mistakes in isolation. Physicians may struggle with guilt, shame, and a crisis of confidence, which may thwart efforts to identify areas for improvement that can lead to meaningful change. Coping strategies for providers include discussing the event with others, seeking professional counseling, and implementing quality improvement projects. Physicians and health care organizations need to find adaptive ways to deal with complications that will benefit patients, providers, and their institutions.

  18. Neutron-induced soft errors in CMOS circuits

    International Nuclear Information System (INIS)

    Hazucha, P.

    1999-01-01

    The subject of this thesis is a systematic study of soft errors occurring in CMOS integrated circuits when exposed to radiation. The vast majority of commercial circuits operate in the natural environment ranging from sea level to aircraft flight altitudes (less than 20 km), where the errors are caused mainly by the interaction of atmospheric neutrons with silicon. Initially, the soft error rate (SER) of a static memory was measured for supply voltages from 2 V to 5 V when irradiated by 14 MeV and 100 MeV neutrons. The increased error rate due to the decreased supply voltage has been identified as a potential hazard for the operation of future low-voltage circuits. A novel methodology was proposed for accurate SER characterization of a manufacturing process, and it was validated by measurements on a 0.6 μm process and 100 MeV neutrons. The methodology can be applied to the prediction of SER in the natural environment

  19. Error monitoring issues for common channel signaling

    Science.gov (United States)

    Hou, Victor T.; Kant, Krishna; Ramaswami, V.; Wang, Jonathan L.

    1994-04-01

    Motivated by field data which showed a large number of link changeovers and incidences of link oscillations between in-service and out-of-service states in common channel signaling (CCS) networks, a number of analyses of the link error monitoring procedures in the SS7 protocol were performed by the authors. This paper summarizes the results obtained thus far, which include the following: (1) results of an exact analysis of the performance of the error monitoring procedures under both random and bursty errors; (2) a demonstration that there exists a range of error rates within which the error monitoring procedures of SS7 may induce frequent changeovers and changebacks; (3) an analysis of the performance of the SS7 level-2 transmission protocol to determine the tolerable error rates within which the delay requirements can be met; (4) a demonstration that the tolerable error rate depends strongly on various link and traffic characteristics, thereby implying that a single set of error monitor parameters will not work well in all situations; (5) some recommendations on a customizable/adaptable scheme of error monitoring with a discussion of their implementability. These issues may be particularly relevant in the presence of anticipated increases in SS7 traffic due to widespread deployment of Advanced Intelligent Network (AIN) and Personal Communications Service (PCS) as well as for developing procedures for high-speed SS7 links currently under consideration by standards bodies.
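
    To make the kind of procedure being analysed concrete, here is a hedged sketch of a leaky-bucket signal-unit error-rate monitor. The decrement interval of 256 signal units and the changeover threshold of 64 are commonly cited SS7 SUERM defaults and are used here as assumptions, not as a normative implementation.

```python
# Hedged sketch of a leaky-bucket signal-unit error rate monitor of the kind
# analysed above. D = 256 and T = 64 are commonly cited SS7 SUERM defaults,
# used here as assumptions only.
class SignalUnitErrorRateMonitor:
    def __init__(self, decrement_interval: int = 256, threshold: int = 64):
        self.D = decrement_interval
        self.T = threshold
        self.counter = 0
        self.received = 0

    def on_signal_unit(self, in_error: bool) -> bool:
        """Process one received signal unit; return True if the link
        should be taken out of service (changeover)."""
        if in_error:
            self.counter += 1
        self.received += 1
        if self.received % self.D == 0 and self.counter > 0:
            self.counter -= 1          # leak: forgive one error per D units
        return self.counter >= self.T

# Example: a sustained error rate just above 1/D eventually trips the monitor.
import random
mon = SignalUnitErrorRateMonitor()
for n in range(2_000_000):
    if mon.on_signal_unit(in_error=(random.random() < 0.006)):
        print(f"changeover triggered after {n + 1} signal units")
        break
```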

  20. Neurochemical enhancement of conscious error awareness.

    Science.gov (United States)

    Hester, Robert; Nandam, L Sanjay; O'Connell, Redmond G; Wagner, Joe; Strudwick, Mark; Nathan, Pradeep J; Mattingley, Jason B; Bellgrove, Mark A

    2012-02-22

    How the brain monitors ongoing behavior for performance errors is a central question of cognitive neuroscience. Diminished awareness of performance errors limits the extent to which humans engage in corrective behavior and has been linked to loss of insight in a number of psychiatric syndromes (e.g., attention deficit hyperactivity disorder, drug addiction). These conditions share alterations in monoamine signaling that may influence the neural mechanisms underlying error processing, but our understanding of the neurochemical drivers of these processes is limited. We conducted a randomized, double-blind, placebo-controlled, cross-over design of the influence of methylphenidate, atomoxetine, and citalopram on error awareness in 27 healthy participants. The error awareness task, a go/no-go response inhibition paradigm, was administered to assess the influence of monoaminergic agents on performance errors during fMRI data acquisition. A single dose of methylphenidate, but not atomoxetine or citalopram, significantly improved the ability of healthy volunteers to consciously detect performance errors. Furthermore, this behavioral effect was associated with a strengthening of activation differences in the dorsal anterior cingulate cortex and inferior parietal lobe during the methylphenidate condition for errors made with versus without awareness. Our results have implications for the understanding of the neurochemical underpinnings of performance monitoring and for the pharmacological treatment of a range of disparate clinical conditions that are marked by poor awareness of errors.

  1. [Analysis of intrusion errors in free recall].

    Science.gov (United States)

    Diesfeldt, H F A

    2017-06-01

    Extra-list intrusion errors during five trials of the eight-word list-learning task of the Amsterdam Dementia Screening Test (ADST) were investigated in 823 consecutive psychogeriatric patients (87.1% suffering from major neurocognitive disorder). Almost half of the participants (45.9%) produced one or more intrusion errors on the verbal recall test. Correct responses were lower when subjects made intrusion errors, but learning slopes did not differ between subjects who committed intrusion errors and those who did not. Bivariate regression analyses revealed that participants who committed intrusion errors were more deficient on measures of eight-word recognition memory, delayed visual recognition and tests of executive control (the Behavioral Dyscontrol Scale and the ADST-Graphical Sequences as measures of response inhibition). Using hierarchical multiple regression, only free recall and delayed visual recognition retained an independent effect in the association with intrusion errors, such that deficient scores on tests of episodic memory were sufficient to explain the occurrence of intrusion errors. Measures of inhibitory control did not add significantly to the explanation of intrusion errors in free recall, which makes insufficient strength of memory traces, rather than a primary deficit in inhibition, the preferred account of intrusion errors in free recall.

  2. Human error mechanisms in complex work environments

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1988-01-01

    Human error taxonomies have been developed from analysis of industrial incident reports as well as from psychological experiments. In this paper the results of the two approaches are reviewed and compared. It is found, in both cases, that a fairly small number of basic psychological mechanisms will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations. The implications for system safety are briefly mentioned, together with the implications for system design. (author)

  3. Human error mechanisms in complex work environments

    International Nuclear Information System (INIS)

    Rasmussen, Jens (Danmarks Tekniske Hoejskole, Copenhagen)

    1988-01-01

    Human error taxonomies have been developed from analysis of industrial incident reports as well as from psychological experiments. In this paper the results of the two approaches are reviewed and compared. It is found, in both cases, that a fairly small number of basic psychological mechanisms will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations. The implications for system safety are briefly mentioned, together with the implications for system design. (author)

  4. Learning from errors in super-resolution.

    Science.gov (United States)

    Tang, Yi; Yuan, Yuan

    2014-11-01

    A novel framework of learning-based super-resolution is proposed by employing the process of learning from the estimation errors. The estimation errors generated by different learning-based super-resolution algorithms are statistically shown to be sparse and uncertain. The sparsity of the estimation errors means that most of the estimation errors are small. The uncertainty of the estimation errors means that the locations of the pixels with larger estimation errors are random. Exploiting this prior information about the estimation errors, a nonlinear boosting process of learning from these estimation errors is introduced into the general framework of learning-based super-resolution. Within the novel framework of super-resolution, a low-rank decomposition technique is used to share the information of different super-resolution estimations and to remove the sparse estimation errors arising from different learning algorithms or training samples. The experimental results show the effectiveness and the efficiency of the proposed framework in enhancing the performance of different learning-based algorithms.
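
    The toy example below is not the authors' algorithm; it only illustrates the underlying idea that sparse, randomly located errors in several super-resolution estimates can be suppressed by sharing information across estimates through a low-rank (here rank-1) approximation of the stacked results. All sizes and noise levels are arbitrary.

```python
# Toy illustration only (not the authors' algorithm): sparse, randomly
# located errors in several super-resolution estimates can be suppressed by
# a low-rank approximation of the stacked estimates.
import numpy as np

rng = np.random.default_rng(0)
truth = rng.random(64 * 64)                     # flattened "true" HR image

estimates = []
for _ in range(5):                              # 5 hypothetical SR estimates
    est = truth + 0.01 * rng.standard_normal(truth.size)   # small dense error
    idx = rng.choice(truth.size, size=40, replace=False)   # sparse large errors
    est[idx] += rng.standard_normal(40)
    estimates.append(est)

M = np.stack(estimates, axis=1)                 # columns = individual estimates
U, s, Vt = np.linalg.svd(M, full_matrices=False)
rank1 = s[0] * np.outer(U[:, 0], Vt[0])         # shared (low-rank) component
fused = rank1.mean(axis=1)

print("mean abs error, single estimate:", np.abs(estimates[0] - truth).mean())
print("mean abs error, low-rank fusion:", np.abs(fused - truth).mean())
```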

  5. Random and Systematic Errors Share in Total Error of Probes for CNC Machine Tools

    Directory of Open Access Journals (Sweden)

    Adam Wozniak

    2018-03-01

    Full Text Available Probes for CNC machine tools, like every measurement device, have their accuracy limited by random errors and by systematic errors. Random errors of these probes are described by a parameter called unidirectional repeatability. Manufacturers of probes for CNC machine tools usually specify only this parameter, while parameters describing systematic errors of the probes, such as pre-travel variation or triggering radius variation, are used rarely. Systematic errors of the probes, linked to the differences in pre-travel values for different measurement directions, can be corrected or compensated, but it is not a widely used procedure. In this paper, the shares of systematic errors and random errors in the total error of exemplary probes are determined. In the case of simple, kinematic probes, systematic errors are much greater than random errors, so compensation would significantly reduce the probing error. Moreover, it is shown that in the case of kinematic probes the commonly specified unidirectional repeatability is significantly better than the 2D performance. However, in the case of the more precise strain-gauge probe, systematic errors are of the same order as random errors, which means that error correction or compensation would not yield any significant benefits in this case.

  6. Reducing patient identification errors related to glucose point-of-care testing

    Directory of Open Access Journals (Sweden)

    Gaurav Alreja

    2011-01-01

    Full Text Available Background: Patient identification (ID) errors in point-of-care testing (POCT) can cause test results to be transferred to the wrong patient's chart or prevent results from being transmitted and reported. Despite the implementation of patient barcoding and ongoing operator training at our institution, patient ID errors still occur with glucose POCT. The aim of this study was to develop a solution to reduce identification errors with POCT. Materials and Methods: Glucose POCT was performed by approximately 2,400 clinical operators throughout our health system. Patients are identified by scanning in wristband barcodes or by manual data entry using portable glucose meters. Meters are docked to upload data to a database server, which then transmits data to any medical record matching the financial number of the test result. With a new model, meters connect to an interface manager where the patient ID (a nine-digit account number) is checked against patient registration data from admission, discharge, and transfer (ADT) feeds, and only matched results are transferred to the patient's electronic medical record. With the new process, the patient ID is checked prior to testing, and testing is prevented until ID errors are resolved. Results: When averaged over a period of a month, ID errors were reduced to 3 errors/month (0.015%) in comparison with 61.5 errors/month (0.319%) before implementing the new meters. Conclusion: Patient ID errors may occur with glucose POCT despite patient barcoding. The verification of patient identification should ideally take place at the bedside before testing occurs so that the errors can be addressed in real time. The introduction of an ADT feed directly to glucose meters reduced patient ID errors in POCT.

  7. Medication Errors in Patients with Enteral Feeding Tubes in the Intensive Care Unit.

    Science.gov (United States)

    Sohrevardi, Seyed Mojtaba; Jarahzadeh, Mohammad Hossein; Mirzaei, Ehsan; Mirjalili, Mahtabalsadat; Tafti, Arefeh Dehghani; Heydari, Behrooz

    2017-01-01

    Most patients admitted to Intensive Care Units (ICU) have problems in using oral medication or ingesting solid forms of drugs. Selecting the most suitable dosage form in such patients is a challenge. The current study was conducted to assess the frequency and types of errors of oral medication administration in patients with enteral feeding tubes or suffering swallowing problems. A cross-sectional study was performed in the ICU of Shahid Sadoughi Hospital, Yazd, Iran. Patients were assessed for the incidence and types of medication errors occurring in the process of preparation and administration of oral medicines. Ninety-four patients were involved in this study and 10,250 administrations were observed. In total, 4753 errors occurred among the studied patients. The most commonly used drugs were pantoprazole tablet, piracetam syrup, and losartan tablet. A total of 128 different types of drugs and nine different oral pharmaceutical preparations were prescribed for the patients. Forty-one (35.34%) out of 116 different solid drugs (except effervescent tablets and powders) could have been substituted by liquid or injectable forms. The most common error was the wrong time of administration. Errors of wrong dose preparation and administration accounted for 24.04% and 25.31% of all errors, respectively. In this study, at least three-fourths of the patients experienced medication errors. The occurrence of these errors can greatly impair the quality of the patients' pharmacotherapy, and more attention should be paid to this issue.

  8. Characteristics of the Traumatic Forensic Cases Admitted To Emergency Department and Errors in the Forensic Report Writing.

    Science.gov (United States)

    Aktas, Nurettin; Gulacti, Umut; Lok, Ugur; Aydin, İrfan; Borta, Tayfun; Celik, Murat

    2018-01-01

    To identify errors in forensic reports and to describe the characteristics of traumatic medico-legal cases presenting to the emergency department (ED) at a tertiary care hospital. This study is a retrospective cross-sectional study. The study includes cases resulting in a forensic report among all traumatic patients presenting to the ED of Adiyaman University Training and Research Hospital, Adiyaman, Turkey during a 1-year period. We recorded the demographic characteristics of all the cases, time of presentation to the ED, traumatic characteristics of medico-legal cases, forms of suicide attempt, suspected poisonous substance exposure, the result of follow-up and the type of forensic report. A total of 4300 traumatic medico-legal cases were included in the study and 72% of these cases were male. Traumatic medico-legal cases occurred at the greatest frequency in July (10.1%), and 28.9% of all cases occurred in summer. The most frequent causes of traumatic medico-legal cases in the ED were traffic accidents (43.4%), violent crime (30.5%), and suicide attempt (7.2%). The most common method of attempted suicide was drug intake (86.4%). 12.3% of traumatic medico-legal cases were hospitalized and 24.2% of those hospitalized were admitted to the orthopedics service. The most common error in forensic reports was the incomplete recording of the patient's "cooperation" status (82.7%). Additionally, external traumatic lesions were not documented in 62.4% of forensic reports. The majority of traumatic medico-legal cases involved males aged 18-44 years; the most common source of trauma was traffic accidents, and cases peaked in the summer months. When writing a forensic report, emergency physicians made mistakes in noting physical examination findings and identifying external traumatic lesions. Physicians should make sure that the traumatic medico-legal patients they treat have adequate documentation for reference during legal proceedings. The legal duties and responsibilities of physicians should be

  9. Effect of Pharmacist Participation During Physician Rounds and Prescription Error in the Intensive Care Unit

    Directory of Open Access Journals (Sweden)

    Marlina A. Turnodihardjo

    2016-09-01

    Full Text Available Patient safety is now a prominent issue in pharmaceutical care because adverse drug events are common in hospitalized patients. The majority of errors are likely to occur during prescribing, which is the first stage of the pharmacy process. Prescription errors occur most often in the Intensive Care Unit (ICU), owing to the severity of the illness of its patients as well as the large number of medications prescribed. Pharmacist participation can reduce prescribing errors made by doctors. The main objective of this study was to determine the effect of pharmacist participation during physician rounds on prescription errors in the ICU. This study was a quasi-experimental design with a one-group pre-post test. A prospective study was conducted from April to May 2015 by screening 110 samples of orders. Screening was done to identify the types of prescription errors. A prescription error was defined as an error in the prescription writing process, i.e., incomplete information or not according to agreement. The Mann-Whitney test was used to analyze the differences in prescribing errors. The results showed that there was a difference between prescription errors before and during pharmacist participation (p<0.05). There was also a significant negative correlation between the frequency of pharmacist recommendations on drug ordering and prescription errors (r= -0.638; p<0.05). This indicates that pharmacist participation is one of the strategies that can be adopted to prevent prescribing errors and to implement collaboration between doctors and pharmacists. In other words, a supporting hospital management system that encourages interpersonal communication among health care professionals is needed.

  10. Impact of Stewardship Interventions on Antiretroviral Medication Errors in an Urban Medical Center: A 3-Year, Multiphase Study.

    Science.gov (United States)

    Zucker, Jason; Mittal, Jaimie; Jen, Shin-Pung; Cheng, Lucy; Cennimo, David

    2016-03-01

    There is a high prevalence of HIV infection in Newark, New Jersey, with University Hospital admitting approximately 600 HIV-infected patients per year. Medication errors involving antiretroviral therapy (ART) could significantly affect treatment outcomes. The goal of this study was to evaluate the effectiveness of various stewardship interventions in reducing the prevalence of prescribing errors involving ART. This was a retrospective review of all inpatients receiving ART for HIV treatment during three distinct 6-month intervals over a 3-year period. During the first year, the baseline prevalence of medication errors was determined. During the second year, physician and pharmacist education was provided, and a computerized order entry system with drug information resources and prescribing recommendations was implemented. Prospective audit of ART orders with feedback was conducted in the third year. Analyses and comparisons were made across the three phases of this study. Of the 334 patients with HIV admitted in the first year, 45% had at least one antiretroviral medication error and 38% had uncorrected errors at the time of discharge. After education and computerized order entry, significant reductions in medication error rates were observed compared to baseline rates; 36% of 315 admissions had at least one error and 31% had uncorrected errors at discharge. While the prevalence of antiretroviral errors in year 3 was similar to that of year 2 (37% of 276 admissions), there was a significant decrease in the prevalence of uncorrected errors at discharge (12%) with the use of prospective review and intervention. Interventions, such as education and guideline development, can aid in reducing ART medication errors, but a committed stewardship program is necessary to elicit the greatest impact. © 2016 Pharmacotherapy Publications, Inc.

  11. Peak-counts blood flow model-errors and limitations

    International Nuclear Information System (INIS)

    Mullani, N.A.; Marani, S.K.; Ekas, R.D.; Gould, K.L.

    1984-01-01

    The peak-counts model has several advantages, but its use may be limited due to the condition that the venous egress may not be negligible at the time of peak counts. Consequently, blood flow measurements by the peak-counts model will depend on the bolus size, bolus duration, and the minimum transit time of the bolus through the region of interest. The effect of bolus size on the measurement of extraction fraction and blood flow was evaluated by injecting 1 to 30 ml of rubidium chloride in the femoral vein of a dog and measuring the myocardial activity with a beta probe over the heart. Regional blood flow measurements were not found to vary with bolus sizes up to 30 ml. The effect of bolus duration was studied by injecting a 10 cc bolus of tracer at different speeds in the femoral vein of a dog. All intravenous injections undergo a broadening of the bolus duration due to the transit time of the tracer through the lungs and the heart. This transit time was found to range from 4-6 seconds FWHM and dominates the duration of the bolus to the myocardium for injections of up to 3 seconds. A computer simulation has been carried out in which the different parameters of delay time, extraction fraction, and bolus duration can be changed to assess the errors in the peak-counts model. The results of the simulations show that the error will be greatest for short transit time delays and for low extraction fractions

  12. Automation of a gamma spectrometric analysis method for naturally occuring radionuclides in different materials (NORM)

    International Nuclear Information System (INIS)

    Marzocchi, Olaf

    2009-06-01

    This work presents an improvement over the standard analysis routine used in the Physikalisches Messlabor to detect gamma peaks in spectra from naturally occurring radioactive materials (NORM). The new routine introduces the use of custom libraries of known gamma peaks, in order to ease the work of the software, which can therefore detect more peaks. As a result, the user performing the analysis has fewer chances of making errors and can also analyse more spectra in the same amount of time. A new software tool, with an optimised interface able to further enhance the productivity of the user, is developed and validated. (orig.)
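
    A minimal sketch of the kind of library lookup described above: detected peak energies are matched against a user-defined library of known NORM gamma lines within an energy tolerance. The nuclides and energies are common literature values (in keV); the tolerance and the function name are illustrative assumptions, not the actual routine.

```python
# Sketch of the kind of library lookup described above: detected peak
# energies are matched against a user-defined library of known NORM lines.
# Energies are common literature values (keV); the tolerance is an assumption.
LIBRARY = {
    "K-40": [1460.8],
    "Pb-214": [295.2, 351.9],
    "Bi-214": [609.3, 1120.3, 1764.5],
    "Tl-208": [583.2, 2614.5],
    "Ac-228": [911.2, 969.0],
}

def identify_peaks(detected_kev, tolerance_kev=1.5):
    """Return {detected energy: [(nuclide, library energy), ...]} matches."""
    matches = {}
    for e in detected_kev:
        hits = [(nuc, lib_e)
                for nuc, lines in LIBRARY.items()
                for lib_e in lines
                if abs(e - lib_e) <= tolerance_kev]
        matches[e] = hits
    return matches

print(identify_peaks([352.1, 609.5, 1461.2, 2613.9]))
```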

  13. How to measure a-few-nanometer-small LER occurring in EUV lithography processed feature

    Science.gov (United States)

    Kawada, Hiroki; Kawasaki, Takahiro; Kakuta, Junichi; Ikota, Masami; Kondo, Tsuyoshi

    2018-03-01

    For EUV lithography features, we want to decrease the dose and/or energy of the CD-SEM probe beam, because severe shrinkage of the resist material under the beam reduces LER. Under such conditions, however, the measured LER increases above the true LER due to LER bias, i.e., fake LER caused by random noise in the SEM image. A gap error also occurs between the right and the left LERs. In this work we propose new procedures to obtain the true LER by excluding the LER bias from the measured LER. To verify them, we propose a LER reference metrology using TEM.
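
    One common way to exclude a noise-induced LER bias, shown as a generic sketch below (not the specific procedure proposed in the paper), is to assume that the random-noise contribution adds in quadrature to the true roughness, so the unbiased value is obtained by subtracting the bias in quadrature from the measured LER. The numbers are illustrative.

```python
# Generic sketch (not the specific procedure proposed in the paper): if the
# random-noise contribution adds in quadrature to the true edge roughness,
# an unbiased 3-sigma LER can be recovered by subtracting it in quadrature.
import math

def unbiased_ler(measured_ler_nm: float, noise_bias_nm: float) -> float:
    """3-sigma LER with the noise-induced bias removed (quadrature model)."""
    if measured_ler_nm <= noise_bias_nm:
        return 0.0                      # noise dominates; no resolvable LER
    return math.sqrt(measured_ler_nm ** 2 - noise_bias_nm ** 2)

# e.g. 4.0 nm measured LER with a 2.5 nm noise bias -> ~3.1 nm true LER
print(f"{unbiased_ler(4.0, 2.5):.2f} nm")
```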

  14. A Model of the Acoustic Interactions Occurring Under Arctic Ice

    Science.gov (United States)

    1990-05-22

    agreement at angles near the critical angle. Finally, there is undoubtedly some error in the collected data, as temperature variations were not accounted for... acoustic attenuation in various media will supplement the overall comprehension of reflection and transmission phenomena as well. Continued collection of

  15. The Neural Feedback Response to Error As a Teaching Signal for the Motor Learning System

    Science.gov (United States)

    Shadmehr, Reza

    2016-01-01

    When we experience an error during a movement, we update our motor commands to partially correct for this error on the next trial. How does experience of error produce the improvement in the subsequent motor commands? During the course of an erroneous reaching movement, proprioceptive and visual sensory pathways not only sense the error, but also engage feedback mechanisms, resulting in corrective motor responses that continue until the hand arrives at its goal. One possibility is that this feedback response is co-opted by the learning system and used as a template to improve performance on the next attempt. Here we used electromyography (EMG) to compare neural correlates of learning and feedback to test the hypothesis that the feedback response to error acts as a template for learning. We designed a task in which mixtures of error-clamp and force-field perturbation trials were used to deconstruct EMG time courses into error-feedback and learning components. We observed that the error-feedback response was composed of excitation of some muscles, and inhibition of others, producing a complex activation/deactivation pattern during the reach. Despite this complexity, across muscles the learning response was consistently a scaled version of the error-feedback response, but shifted 125 ms earlier in time. Across people, individuals who produced a greater feedback response to error, also learned more from error. This suggests that the feedback response to error serves as a teaching signal for the brain. Individuals who learn faster have a better teacher in their feedback control system. SIGNIFICANCE STATEMENT Our sensory organs transduce errors in behavior. To improve performance, we must generate better motor commands. How does the nervous system transform an error in sensory coordinates into better motor commands in muscle coordinates? Here we show that when an error occurs during a movement, the reflexes transform the sensory representation of error into motor

  16. Cross-cultural differences in categorical memory errors.

    Science.gov (United States)

    Schwartz, Aliza J; Boduroglu, Aysecan; Gutchess, Angela H

    2014-06-01

    Cultural differences occur in the use of categories to aid accurate recall of information. This study investigated whether culture also contributed to false (erroneous) memories, and extended cross-cultural memory research to Turkish culture, which is shaped by Eastern and Western influences. Americans and Turks viewed word pairs, half of which were categorically related and half unrelated. Participants then attempted to recall the second word from the pair in response to the first word cue. Responses were coded as correct, as blanks, or as different types of errors. Americans committed more categorical errors than did Turks, and Turks mistakenly recalled more non-categorically related list words than did Americans. These results support the idea that Americans use categories either to organize information in memory or to support retrieval strategies to a greater extent than Turks and suggest that culture shapes not only accurate recall but also erroneous distortions of memory. © 2014 Cognitive Science Society, Inc.

  17. GPS/DR Error Estimation for Autonomous Vehicle Localization.

    Science.gov (United States)

    Lee, Byung-Hyun; Song, Jong-Hwa; Im, Jun-Hyuck; Im, Sung-Hyuck; Heo, Moon-Beom; Jee, Gyu-In

    2015-08-21

    Autonomous vehicles require highly reliable navigation capabilities. For example, a lane-following method cannot be applied in an intersection without lanes, and since typical lane detection is performed using a straight-line model, errors can occur when the lateral distance is estimated in curved sections due to a model mismatch. Therefore, this paper proposes a localization method that uses GPS/DR error estimation based on a lane detection method with curved lane models, stop line detection, and curve matching in order to improve the performance during waypoint following procedures. The advantage of using the proposed method is that position information can be provided for autonomous driving through intersections, in sections with sharp curves, and in curved sections following a straight section. The proposed method was applied in autonomous vehicles at an experimental site to evaluate its performance, and the results indicate that the positioning achieved accuracy at the sub-meter level.

  18. GPS/DR Error Estimation for Autonomous Vehicle Localization

    Directory of Open Access Journals (Sweden)

    Byung-Hyun Lee

    2015-08-01

    Full Text Available Autonomous vehicles require highly reliable navigation capabilities. For example, a lane-following method cannot be applied in an intersection without lanes, and since typical lane detection is performed using a straight-line model, errors can occur when the lateral distance is estimated in curved sections due to a model mismatch. Therefore, this paper proposes a localization method that uses GPS/DR error estimation based on a lane detection method with curved lane models, stop line detection, and curve matching in order to improve the performance during waypoint following procedures. The advantage of using the proposed method is that position information can be provided for autonomous driving through intersections, in sections with sharp curves, and in curved sections following a straight section. The proposed method was applied in autonomous vehicles at an experimental site to evaluate its performance, and the results indicate that the positioning achieved accuracy at the sub-meter level.

  19. DOI resolution measurement and error analysis with LYSO and APDs

    International Nuclear Information System (INIS)

    Lee, Chae-hun; Cho, Gyuseong

    2008-01-01

    Spatial resolution degradation in PET occurs at the edge of the Field Of View (FOV) due to parallax error. To improve spatial resolution at the edge of the FOV, Depth-Of-Interaction (DOI) PET has been investigated and several methods for DOI positioning have been proposed. In this paper, a DOI-PET detector module using two 8x4 array avalanche photodiodes (APDs) (Hamamatsu, S8550) and a 2 cm long LYSO scintillation crystal was proposed and its DOI characteristics were investigated experimentally. In order to measure DOI positions, the signals from the two APDs were compared. Energy resolution was obtained from the sum of the two APDs' signals, and the DOI positioning error was calculated. Finally, an optimum DOI step size in a 2 cm long LYSO crystal was suggested to help design a DOI-PET
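
    A hedged sketch of a dual-ended readout estimate like the one described: the depth of interaction is derived from the light-sharing ratio of the two APD signals, (A - B)/(A + B), and the energy from their sum. The linear depth mapping and the gain constant are assumptions for illustration, not the calibration of the actual module.

```python
# Hedged sketch of a dual-ended readout estimate: depth of interaction from
# the light-sharing ratio of the two APD signals, energy from their sum.
# The linear calibration constants are illustrative assumptions.
def doi_and_energy(sig_a: float, sig_b: float,
                   crystal_len_mm: float = 20.0,
                   energy_per_adc: float = 0.05):
    total = sig_a + sig_b
    ratio = (sig_a - sig_b) / total                   # -1 .. +1 across the crystal
    depth_mm = 0.5 * (ratio + 1.0) * crystal_len_mm   # assumed linear mapping
    energy_kev = total * energy_per_adc               # assumed gain
    return depth_mm, energy_kev

# Example: APD A sees more light than APD B -> interaction closer to A's end
print(doi_and_energy(sig_a=7200.0, sig_b=4800.0))
```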

  20. ERF/ERFC, Calculation of Error Function, Complementary Error Function, Probability Integrals

    International Nuclear Information System (INIS)

    Vogel, J.E.

    1983-01-01

    1 - Description of problem or function: ERF and ERFC are used to compute values of the error function and complementary error function for any real number. They may be used to compute other related functions such as the normal probability integrals. 4. Method of solution: The error function and complementary error function are approximated by rational functions. Three such rational approximations are used, depending on the range in which the argument x falls. In the first region the error function is computed directly and the complementary error function is computed via the identity erfc(x)=1.0-erf(x). In the other two regions the complementary error function is computed directly and the error function is computed from the identity erf(x)=1.0-erfc(x). The error function and complementary error function are real-valued functions of any real argument. The range of the error function is (-1,1). The range of the complementary error function is (0,2). 5. Restrictions on the complexity of the problem: The user is cautioned against using ERF to compute the complementary error function via the identity erfc(x)=1.0-erf(x). This subtraction may cause partial or total loss of significance for certain values of x
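
    The caution in the last paragraph is easy to reproduce with any language's standard error-function routines; the Python sketch below shows that for moderately large x the value 1.0 - erf(x) loses all significance (erf(x) rounds to exactly 1.0 in double precision), while the dedicated complementary routine retains full relative accuracy.

```python
# The caution above, reproduced with the standard library: for large x,
# erfc(x) computed as 1 - erf(x) loses all significance, while the direct
# complementary-error-function routine keeps full relative accuracy.
import math

for x in (1.0, 3.0, 6.0, 10.0):
    direct = math.erfc(x)
    via_identity = 1.0 - math.erf(x)
    print(f"x = {x:4.1f}  erfc(x) = {direct:.6e}  1 - erf(x) = {via_identity:.6e}")

# At x = 6, erfc(x) ~ 2.15e-17, but 1 - erf(6) evaluates to 0.0 in double
# precision because erf(6) rounds to exactly 1.0.
```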