WorldWideScience

Sample records for researchers uncover errors

  1. Uncovering the Best Skill Multimap by Constraining the Error Probabilities of the Gain-Loss Model

    Science.gov (United States)

    Anselmi, Pasquale; Robusto, Egidio; Stefanutti, Luca

    2012-01-01

    The Gain-Loss model is a probabilistic skill multimap model for assessing learning processes. In practical applications, more than one skill multimap could be plausible, while none corresponds to the true one. The article investigates whether constraining the error probabilities is a way of uncovering the best skill assignment among a number of…

  2. Hydraulic behaviour of a partially uncovered core

    International Nuclear Information System (INIS)

    Fischer, K.; Hafner, W.

    1989-10-01

    A critical review of experimental data and theoretical models relevant to the thermohydraulic processes in a partially uncovered core has been performed. Presently available optimized thermohydraulic codes should be able to predict swell level elevations within an error band of ± 0.5 m. Rod temperature rising velocities could be predicted within an error bandwidth of ± 10%, provided the correct rod heat capacity is given. A general statement about the accuracy of predicted rod temperatures is not possible because the errors increase with simulation time. The highest errors are expected for long transients with low heating rates and low steam velocities. As a result, three areas for additional research are suggested: - a high-pressure test at 120 bar to complete the void correlation data base, - a low steam flow - low power experiment to improve heat transfer correlations, - a numerical investigation of three-dimensional effects in the reactor core with unequally heated rod bundles. For the present state of 1-dimensional experiments and models, suggestions for a satisfactory modeling have been derived. The suggested further work could considerably improve the modelling capabilities and the code reliability for some limiting cases such as high-pressure boil-off, low-power long-term steam cooling, and unequal heating of neighbouring bundles.

  3. Uncovering cognitive processes: Different techniques that can contribute to cognitive load research and instruction

    NARCIS (Netherlands)

    Van Gog, Tamara; Kester, Liesbeth; Nievelstein, Fleurie; Giesbers, Bas; Fred, Paas

    2009-01-01

    Van Gog, T., Kester, L., Nievelstein, F., Giesbers, B., & Paas, F. (2009). Uncovering cognitive processes: Different techniques that can contribute to cognitive load research and instruction. Computers in Human Behavior, 25, 325-331.

  4. Feminist Approaches to Triangulation: Uncovering Subjugated Knowledge and Fostering Social Change in Mixed Methods Research

    Science.gov (United States)

    Hesse-Biber, Sharlene

    2012-01-01

    This article explores the deployment of triangulation in the service of uncovering subjugated knowledge and promoting social change for women and other oppressed groups. Feminist approaches to mixed methods praxis create a tight link between the research problem and the research design. An analysis of selected case studies of feminist praxis…

  5. Advancing the research agenda for diagnostic error reduction.

    Science.gov (United States)

    Zwaan, Laura; Schiff, Gordon D; Singh, Hardeep

    2013-10-01

    Diagnostic errors remain an underemphasised and understudied area of patient safety research. We briefly summarise the methods that have been used to conduct research on epidemiology, contributing factors and interventions related to diagnostic error and outline directions for future research. Research methods that have studied the epidemiology of diagnostic error provide some estimates of diagnostic error rates. However, there appears to be large variability in the reported rates due to the heterogeneity of definitions and study methods used. Thus, future methods should focus on obtaining more precise estimates in different settings of care. This would lay the foundation for measuring error rates over time to evaluate improvements. Research methods have studied contributing factors for diagnostic error in both naturalistic and experimental settings. Both approaches have revealed important and complementary information. Newer conceptual models from outside healthcare are needed to advance the depth and rigour of analysis of systems and cognitive insights into the causes of error. While the literature has suggested many potentially fruitful interventions for reducing diagnostic errors, most have not been systematically evaluated and/or widely implemented in practice. Research is needed to study promising intervention areas such as enhanced patient involvement in diagnosis, improving diagnosis through the use of electronic tools, and identification and reduction of specific diagnostic process 'pitfalls' (eg, failure to conduct appropriate diagnostic evaluation of a breast lump after a 'normal' mammogram). The last decade of research on diagnostic error has made promising steps and laid a foundation for more rigorous methods to advance the field.

  6. Advancing the research agenda for diagnostic error reduction

    NARCIS (Netherlands)

    Zwaan, L.; Schiff, G.D.; Singh, H.

    2013-01-01

    Diagnostic errors remain an underemphasised and understudied area of patient safety research. We briefly summarise the methods that have been used to conduct research on epidemiology, contributing factors and interventions related to diagnostic error and outline directions for future research.

  7. Research trend on human error reduction

    International Nuclear Information System (INIS)

    Miyaoka, Sadaoki

    1990-01-01

    Human error has been a problem in all industries. In 1988, the Bureau of Mines, Department of the Interior, USA, carried out a worldwide survey on human error in all industries in relation to fatal accidents in mines. The results differed according to the methods of collecting data, but the proportion of total accidents attributable to human error varied widely, from 20∼85%, and was 35% on average. The rate of occurrence of accidents and troubles in Japanese nuclear power stations is shown, and the rate of occurrence of human error is 0∼0.5 cases/reactor-year, which has not varied much. Accordingly, the proportion attributable to human error has tended to increase, and reducing human error has become important for lowering the rate of occurrence of accidents and troubles hereafter. After the TMI accident in 1979 in the USA, research on man-machine interfaces became active, and after the Chernobyl accident in 1986 in the USSR, the problems of organization and management have been studied. In Japan, 'Safety 21' was drawn up by the Advisory Committee for Energy, and the annual reports on nuclear safety also pointed out the importance of human factors. The state of research on human factors in Japan and abroad and three targets for reducing human error are reported. (K.I.)

  8. Overview of error-tolerant cockpit research

    Science.gov (United States)

    Abbott, Kathy

    1990-01-01

    The objectives of research in intelligent cockpit aids and intelligent error-tolerant systems are stated. In intelligent cockpit aids research, the objective is to provide increased aid and support to the flight crew of civil transport aircraft through the use of artificial intelligence techniques combined with traditional automation. In intelligent error-tolerant systems, the objective is to develop and evaluate cockpit systems that provide flight crews with safe and effective ways and means to manage aircraft systems, plan and replan flights, and respond to contingencies. A subsystems fault management functional diagram is given. All information is in viewgraph form.

  9. Frequent methodological errors in clinical research.

    Science.gov (United States)

    Silva Aycaguer, L C

    2018-03-07

    Several errors that are frequently present in clinical research are listed, discussed and illustrated. A distinction is made between what can be considered an "error" arising from ignorance or neglect and what stems from a lack of integrity on the part of researchers, although it is recognized and documented that it is not easy to establish when we are dealing with the former and when with the latter. The work does not intend to make an exhaustive inventory of such problems, but focuses on those that, while frequent, are usually less evident or less emphasized in the various lists of this type of problem that have been published. It was decided to develop in detail the examples that illustrate the problems identified, instead of presenting a list of errors accompanied by a superficial description of their characteristics. Copyright © 2018 Elsevier España, S.L.U. y SEMICYUC. All rights reserved.

  10. Uncovering Black Womanhood in Engineering

    Science.gov (United States)

    Gibson, Sheree L.; Espino, Michelle M.

    2016-01-01

    Despite the growing research that outlines the experiences of Blacks and women undergraduates in engineering, little is known about Black women in this field. The purpose of this qualitative study was to uncover how eight Black undergraduate women in engineering understood their race and gender identities in a culture that can be oppressive to…

  11. Sustained response with ixekizumab treatment of moderate-to-severe psoriasis with scalp involvement: results from three phase 3 trials (UNCOVER-1, UNCOVER-2, UNCOVER-3).

    Science.gov (United States)

    Reich, Kristian; Leonardi, Craig; Lebwohl, Mark; Kerdel, Francisco; Okubo, Yukari; Romiti, Ricardo; Goldblum, Orin; Dennehy, Ellen B; Kerr, Lisa; Sofen, Howard

    2017-06-01

    The scalp is a frequently affected and difficult-to-treat area in psoriasis patients. We assessed the efficacy of ixekizumab in the treatment of patients with scalp psoriasis over 60 weeks using the Psoriasis Scalp Severity Index (PSSI). In three Phase 3, multicenter, double-blind, placebo-controlled trials, patients with moderate-to-severe psoriasis in UNCOVER-1 (N = 1296), UNCOVER-2 (N = 1224) and UNCOVER-3 (N = 1346) were randomized to subcutaneous 80 mg ixekizumab every two weeks (Q2W) or every four weeks (Q4W) after a 160 mg starting dose, or placebo through Week 12. Additional UNCOVER-2 and UNCOVER-3 cohorts were randomized to 50 mg bi-weekly etanercept through Week 12. Patients entering the open-label long-term extension (LTE) (UNCOVER-3) received ixekizumab Q4W; UNCOVER-1 and UNCOVER-2 included a blinded maintenance period in which static physician global assessment (sPGA) 0/1 responders were re-randomized to placebo, ixekizumab Q4W, or 80 mg ixekizumab every 12 weeks (Q12W) through Week 60. In patients with moderate-to-severe psoriasis with baseline scalp involvement, PSSI 90 and 100 were achieved at Week 12 in higher percentages of patients treated with ixekizumab Q2W (81.7% and 74.6%) or ixekizumab Q4W (75.6% and 68.9%) compared with patients treated with placebo (7.6% and 6.7%). Ixekizumab improved scalp psoriasis in patients with moderate-to-severe psoriasis, with most patients achieving complete or near-complete resolution of scalp psoriasis and maintaining this response over 60 weeks.

  12. Spectrum of diagnostic errors in radiology.

    Science.gov (United States)

    Pinto, Antonio; Brunese, Luca

    2010-10-28

    Diagnostic errors are important in all branches of medicine because they are an indication of poor patient care. Since the early 1970s, physicians have been subjected to an increasing number of medical malpractice claims. Radiology is one of the specialties most liable to claims of medical negligence. Most often, a plaintiff's complaint against a radiologist will focus on a failure to diagnose. The etiology of radiological error is multi-factorial. Errors fall into recurrent patterns. Errors arise from poor technique, failures of perception, lack of knowledge and misjudgments. The work of diagnostic radiology consists of the complete detection of all abnormalities in an imaging examination and their accurate diagnosis. Every radiologist should understand the sources of error in diagnostic radiology as well as the elements of negligence that form the basis of malpractice litigation. Error traps need to be uncovered and highlighted, in order to prevent repetition of the same mistakes. This article focuses on the spectrum of diagnostic errors in radiology, including a classification of the errors, and stresses the malpractice issues in mammography, chest radiology and obstetric sonography. Missed fractures in emergency and communication issues between radiologists and physicians are also discussed.

  13. National Suicide Rates a Century after Durkheim: Do We Know Enough to Estimate Error?

    Science.gov (United States)

    Claassen, Cynthia A.; Yip, Paul S.; Corcoran, Paul; Bossarte, Robert M.; Lawrence, Bruce A.; Currier, Glenn W.

    2010-01-01

    Durkheim's nineteenth-century analysis of national suicide rates dismissed prior concerns about mortality data fidelity. Over the intervening century, however, evidence documenting various types of error in suicide data has only mounted, and surprising levels of such error continue to be routinely uncovered. Yet the annual suicide rate remains the…

  14. The uncovered parity properties of the Czech Koruna

    Czech Academy of Sciences Publication Activity Database

    Derviz, Alexis

    2002-01-01

    Vol. 11, No. 1 (2002), pp. 17-37 ISSN 1210-0455 R&D Projects: GA AV ČR KSK1019101 Institutional research plan: CEZ:AV0Z1075907 Keywords: uncovered parity * asset prices * international consumption-based capital asset pricing model Subject RIV: AH - Economics

  15. Error and objectivity: cognitive illusions and qualitative research.

    Science.gov (United States)

    Paley, John

    2005-07-01

    Psychological research has shown that cognitive illusions, of which visual illusions are just a special case, are systematic and pervasive, raising epistemological questions about how error in all forms of research can be identified and eliminated. The quantitative sciences make use of statistical techniques for this purpose, but it is not clear what the qualitative equivalent is, particularly in view of widespread scepticism about validity and objectivity. I argue that, in the light of cognitive psychology, the 'error question' cannot be dismissed as a positivist obsession, and that the concepts of truth and objectivity are unavoidable. However, they constitute only a 'minimal realism', which does not necessarily bring a commitment to 'absolute' truth, certainty, correspondence, causation, reductionism, or universal laws in its wake. The assumption that it does reflects a misreading of positivism and, ironically, precipitates a 'crisis of legitimation and representation', as described by constructivist authors.

  16. Process error rates in general research applications to the Human ...

    African Journals Online (AJOL)

    Objective. To examine process error rates in applications for ethics clearance of health research. Methods. Minutes of 586 general research applications made to a human health research ethics committee (HREC) from April 2008 to March 2009 were examined. Rates of approval were calculated and reasons for requiring ...

  17. Error begat error: design error analysis and prevention in social infrastructure projects.

    Science.gov (United States)

    Love, Peter E D; Lopez, Robert; Edwards, David J; Goh, Yang M

    2012-09-01

    Design errors contribute significantly to cost and schedule growth in social infrastructure projects and to engineering failures, which can result in accidents and loss of life. Despite considerable research addressing error causation in construction projects, design errors remain prevalent. This paper identifies the underlying conditions that contribute to design errors in social infrastructure projects (e.g. hospitals, education, law and order type buildings). A systemic model of error causation is proposed and subsequently used to develop a learning framework for design error prevention. The research suggests that a multitude of strategies should be adopted in concert to prevent design errors from occurring and so ensure that safety and project performance are ameliorated. Copyright © 2011. Published by Elsevier Ltd.

  18. Revisiting Uncovered Interest Rate Parity: Switching Between UIP and the Random Walk

    NARCIS (Netherlands)

    R. Huisman (Ronald); R.J. Mahieu (Ronald)

    2007-01-01

    In this paper, we examine in which periods uncovered interest rate parity was likely to hold. Empirical research has shown mixed evidence on UIP. The main finding is that it does not hold, although some researchers were not able to reject UIP in periods with large interest differentials.

  19. The recovery factors analysis of the human errors for research reactors

    International Nuclear Information System (INIS)

    Farcasiu, M.; Nitoi, M.; Apostol, M.; Turcu, I.; Florescu, Ghe.

    2006-01-01

    The results of many Probabilistic Safety Assessment (PSA) studies show a very significant contribution of human errors to the unavailability of nuclear installation systems. The treatment of human interactions is considered one of the major limitations in the context of PSA. Applying Human Reliability Analysis (HRA) is necessary to identify those human actions that can affect system reliability or availability. The analysis of recovery factors for human actions is an important step in HRA. This paper presents how human error probabilities (HEP) can be reduced using those elements that have the capacity to recover a human error. Recovery factor modeling is intended to identify error-likely situations or situations that lead to the development of an accident. The analysis is realized by the THERP method. The necessary information was obtained from the operating experience of the research reactor TRIGA of the INR Pitesti. The required data were obtained from generic databases. (authors)

  20. Being an honest broker of hydrology: Uncovering, communicating and addressing model error in a climate change streamflow dataset

    Science.gov (United States)

    Chegwidden, O.; Nijssen, B.; Pytlak, E.

    2017-12-01

    Any model simulation has errors, including errors in meteorological data, process understanding, model structure, and model parameters. These errors may express themselves as bias, timing lags, and differences in sensitivity between the model and the physical world. The evaluation and handling of these errors can greatly affect the legitimacy, validity and usefulness of the resulting scientific product. In this presentation we will discuss a case study of handling and communicating model errors during the development of a hydrologic climate change dataset for the Pacific Northwestern United States. The dataset was the result of a four-year collaboration between the University of Washington, Oregon State University, the Bonneville Power Administration, the United States Army Corps of Engineers and the Bureau of Reclamation. Along the way, the partnership facilitated the discovery of multiple systematic errors in the streamflow dataset. Through an iterative review process, some of those errors could be resolved. For the errors that remained, honest communication of the shortcomings promoted the dataset's legitimacy. Thoroughly explaining errors also improved ways in which the dataset would be used in follow-on impact studies. Finally, we will discuss the development of the "streamflow bias-correction" step often applied to climate change datasets that will be used in impact modeling contexts. We will describe the development of a series of bias-correction techniques through close collaboration among universities and stakeholders. Through that process, both universities and stakeholders learned about the others' expectations and workflows. This mutual learning process allowed for the development of methods that accommodated the stakeholders' specific engineering requirements. The iterative revision process also produced a functional and actionable dataset while preserving its scientific merit. We will describe how encountering earlier techniques' pitfalls allowed us…
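
    The record above refers to a "streamflow bias-correction" step without detailing the techniques the collaboration developed. As a rough illustration of the general idea only, here is a minimal empirical quantile-mapping sketch; the function name, data, and distributions are invented for this example and are not from the dataset.

```python
import bisect
import random

def quantile_map(simulated, observed, values):
    """Empirical quantile mapping: replace each model value with the
    observed value at the same empirical quantile."""
    sim_sorted = sorted(simulated)
    obs_sorted = sorted(observed)
    n, m = len(sim_sorted), len(obs_sorted)
    corrected = []
    for v in values:
        # non-exceedance probability of v in the simulated record
        p = bisect.bisect_right(sim_sorted, v) / n
        # observed flow at the same quantile
        corrected.append(obs_sorted[min(int(p * m), m - 1)])
    return corrected

random.seed(0)
obs = [random.gammavariate(2.0, 50.0) for _ in range(2000)]  # synthetic "observed" flows
sim = [1.3 * x + 20.0 for x in obs]                          # systematically biased "model" flows
corrected = quantile_map(sim, obs, sim)
bias_before = sum(sim) / len(sim) - sum(obs) / len(obs)
bias_after = sum(corrected) / len(corrected) - sum(obs) / len(obs)
```

    A mapping of this kind assumes the simulated and observed distributions stay comparable over time; the abstract's point is precisely that such assumptions must be negotiated with stakeholders.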

  21. Familial Brugada syndrome uncovered by hyperkalaemic diabetic ketoacidosis

    NARCIS (Netherlands)

    Postema, Pieter G.; Vlaar, Alexander P. J.; DeVries, J. Hans; Tan, Hanno L.

    2011-01-01

    We describe a case of diabetic ketoacidosis with concomitant hyperkalaemia that uncovered a typical Brugada syndrome electrocardiogram (ECG). Further provocation testing in the patient and his son confirmed familial Brugada syndrome. Diabetic ketoacidosis with hyperkalaemia may uncover an

  22. Sources of medical error in refractive surgery.

    Science.gov (United States)

    Moshirfar, Majid; Simpson, Rachel G; Dave, Sonal B; Christiansen, Steven M; Edmonds, Jason N; Culbertson, William W; Pascucci, Stephen E; Sher, Neal A; Cano, David B; Trattler, William B

    2013-05-01

    To evaluate the causes of laser programming errors in refractive surgery and outcomes in these cases. In this multicenter, retrospective chart review, 22 eyes of 18 patients who had incorrect data entered into the refractive laser computer system at the time of treatment were evaluated. Cases were analyzed to uncover the etiology of these errors, patient follow-up treatments, and final outcomes. The results were used to identify potential methods to avoid similar errors in the future. Every patient experienced compromised uncorrected visual acuity requiring additional intervention, and 7 of 22 eyes (32%) lost corrected distance visual acuity (CDVA) of at least one line. Sixteen patients were suitable candidates for additional surgical correction to address these residual visual symptoms and six were not. Thirteen of 22 eyes (59%) received surgical follow-up treatment; nine eyes were treated with contact lenses. After follow-up treatment, six patients (27%) still had a loss of one line or more of CDVA. Three significant sources of error were identified: errors of cylinder conversion, data entry, and patient identification. Twenty-seven percent of eyes with laser programming errors ultimately lost one or more lines of CDVA. Patients who underwent surgical revision had better outcomes than those who did not. Many of the mistakes identified were likely avoidable had preventive measures been taken, such as strict adherence to patient verification protocol or rigorous rechecking of treatment parameters. Copyright 2013, SLACK Incorporated.

  23. Research progress of the static and dynamic characteristics and motion errors of hydrostatic supports

    Directory of Open Access Journals (Sweden)

    Zhiwei WANG

    2016-06-01

    At present, research on the static and dynamic characteristics of hydrostatic supports depends on the form and structure of the restrictor, and is mainly focused on the influences of recess shape, bearing structure, bearing surface roughness, lubricant and elastic deformations of the bearing. There are few studies on the thermal effect of hydrostatic supports or on the static and dynamic characteristics of hydrostatic guideways. Research on the motion errors of hydrostatic supports is primarily based on the static equilibrium of the moving part; the effects of the motion speed of the moving part and of structural deformation on the motion errors are not considered. Finally, research prospects are outlined for the standardization, modularization and industrialization of hydrostatic supports, the thermal effect of hydrostatic bearings, the static and dynamic characteristics of hydrostatic guideways, and the motion errors of hydrostatic supports under operating conditions.

  24. Uncovering the Hidden Costs of Offshoring

    DEFF Research Database (Denmark)

    Larsen, Marcus M.; Manning, Stephan; Pedersen, Torben

    2013-01-01

    This study investigates estimation errors due to hidden costs—the costs of implementation that are neglected in strategic decision-making processes—in the context of services offshoring. Based on data from the Offshoring Research Network, we find that decision makers are more likely to make cost-estimation errors given increasing configuration and task complexity in captive offshoring and offshore outsourcing, respectively. Moreover, we show that experience and a strong orientation toward organizational design in the offshoring strategy reduce the cost-estimation errors that follow from complexity. Our…

  25. Action errors, error management, and learning in organizations.

    Science.gov (United States)

    Frese, Michael; Keith, Nina

    2015-01-03

    Every organization is confronted with errors. Most errors are corrected easily, but some may lead to negative consequences. Organizations often focus on error prevention as a single strategy for dealing with errors. Our review suggests that error prevention needs to be supplemented by error management--an approach directed at effectively dealing with errors after they have occurred, with the goal of minimizing negative and maximizing positive error consequences (examples of the latter are learning and innovations). After defining errors and related concepts, we review research on error-related processes affected by error management (error detection, damage control). Empirical evidence on positive effects of error management in individuals and organizations is then discussed, along with emotional, motivational, cognitive, and behavioral pathways of these effects. Learning from errors is central, but like other positive consequences, learning occurs under certain circumstances--one being the development of a mind-set of acceptance of human error.

  26. Research on Electronic Transformer Data Synchronization Based on Interpolation Methods and Their Error Analysis

    Directory of Open Access Journals (Sweden)

    Pang Fubin

    2015-09-01

    In this paper, the origin of the data synchronization problem is analyzed first, and then three common interpolation methods are introduced to solve it. Allowing for the most general situation, the paper divides the interpolation error into harmonic and transient components, and the error expression of each method is derived and analyzed. In addition, the interpolation errors of the linear, quadratic and cubic methods are computed at different sampling rates, harmonic orders and transient components. Further, the interpolation accuracy and computational cost of each method are compared. The research results provide theoretical guidance for selecting the interpolation method in the data synchronization application of electronic transformers.
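
    The linear and quadratic methods compared in this record are standard polynomial interpolators. As a minimal sketch of the comparison idea, the example below reconstructs mid-sample instants of a synthetic 50 Hz signal sampled at 1 kHz (assumed values, not the paper's test conditions) and measures the worst-case error of each method.

```python
import math

def linear_interp(t0, y0, t1, y1, t):
    # first-order interpolation between two neighbouring samples
    return y0 + (y1 - y0) * (t - t0) / (t1 - t0)

def quadratic_interp(ts, ys, t):
    # second-order Lagrange interpolation through three samples
    total = 0.0
    for i in range(3):
        term = ys[i]
        for j in range(3):
            if i != j:
                term *= (t - ts[j]) / (ts[i] - ts[j])
        total += term
    return total

# A 50 Hz fundamental sampled at 1 kHz; reconstruct mid-sample instants
fs, f = 1000.0, 50.0
def sig(t):
    return math.sin(2.0 * math.pi * f * t)

err_lin = err_quad = 0.0
for k in range(1, 19):
    t = (k + 0.5) / fs                       # instant to synchronize to
    ts = [(k - 1) / fs, k / fs, (k + 1) / fs]
    ys = [sig(x) for x in ts]
    err_lin = max(err_lin, abs(linear_interp(ts[1], ys[1], ts[2], ys[2], t) - sig(t)))
    err_quad = max(err_quad, abs(quadratic_interp(ts, ys, t) - sig(t)))
```

    For a pure harmonic, the quadratic method's worst-case error is roughly an order of magnitude below the linear method's, at the cost of one more sample per reconstructed point, which matches the accuracy-versus-computation trade-off the paper analyzes.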

  27. Relating Complexity and Error Rates of Ontology Concepts. More Complex NCIt Concepts Have More Errors.

    Science.gov (United States)

    Min, Hua; Zheng, Ling; Perl, Yehoshua; Halper, Michael; De Coronado, Sherri; Ochs, Christopher

    2017-05-18

    Ontologies are knowledge structures that lend support to many health-information systems. A study is carried out to assess the quality of ontological concepts based on a measure of their complexity. The results show a relation between complexity of concepts and error rates of concepts. A measure of lateral complexity defined as the number of exhibited role types is used to distinguish between more complex and simpler concepts. Using a framework called an area taxonomy, a kind of abstraction network that summarizes the structural organization of an ontology, concepts are divided into two groups along these lines. Various concepts from each group are then subjected to a two-phase QA analysis to uncover and verify errors and inconsistencies in their modeling. A hierarchy of the National Cancer Institute thesaurus (NCIt) is used as our test-bed. A hypothesis pertaining to the expected error rates of the complex and simple concepts is tested. Our study was done on the NCIt's Biological Process hierarchy. Various errors, including missing roles, incorrect role targets, and incorrectly assigned roles, were discovered and verified in the two phases of our QA analysis. The overall findings confirmed our hypothesis by showing a statistically significant difference between the amounts of errors exhibited by more laterally complex concepts vis-à-vis simpler concepts. QA is an essential part of any ontology's maintenance regimen. In this paper, we reported on the results of a QA study targeting two groups of ontology concepts distinguished by their level of complexity, defined in terms of the number of exhibited role types. The study was carried out on a major component of an important ontology, the NCIt. The findings suggest that more complex concepts tend to have a higher error rate than simpler concepts. These findings can be utilized to guide ongoing efforts in ontology QA.
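
    The record reports a "statistically significant difference" between error amounts in complex and simple concepts without naming the test; a plain two-proportion z-test is one conventional choice for comparing such error rates. The counts below are illustrative, not the study's data.

```python
import math

def two_proportion_z(err1, n1, err2, n2):
    """z statistic for the difference between two error proportions
    (e.g. complex vs. simple ontology concepts)."""
    p1, p2 = err1 / n1, err2 / n2
    p = (err1 + err2) / (n1 + n2)                    # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se

# Hypothetical audit: 60 of 150 complex concepts erroneous
# vs. 30 of 150 simple ones (made-up numbers for illustration)
z = two_proportion_z(60, 150, 30, 150)
```

    With these made-up counts z exceeds the 1.96 threshold for significance at the 5% level, which is the kind of evidence the study's hypothesis test would need.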

  28. Correction of refractive errors in rhesus macaques (Macaca mulatta) involved in visual research.

    Science.gov (United States)

    Mitchell, Jude F; Boisvert, Chantal J; Reuter, Jon D; Reynolds, John H; Leblanc, Mathias

    2014-08-01

    Macaques are the most common animal model for studies in vision research, and due to their high value as research subjects, often continue to participate in studies well into old age. As is true in humans, visual acuity in macaques is susceptible to refractive errors. Here we report a case study in which an aged macaque demonstrated clear impairment in visual acuity according to performance on a demanding behavioral task. Refraction demonstrated bilateral myopia that significantly affected behavioral and visual tasks. Using corrective lenses, we were able to restore visual acuity. After correction of myopia, the macaque's performance on behavioral tasks was comparable to that of a healthy control. We screened 20 other male macaques to assess the incidence of refractive errors and ocular pathologies in a larger population. Hyperopia was the most frequent ametropia but was mild in all cases. A second macaque had mild myopia and astigmatism in one eye. There were no other pathologies observed on ocular examination. We developed a simple behavioral task that visual research laboratories could use to test visual acuity in macaques. The test was reliable and easily learned by the animals in 1 d. This case study stresses the importance of screening macaques involved in visual science for refractive errors and ocular pathologies to ensure the quality of research; we also provide simple methodology for screening visual acuity in these animals.

  29. Thermal Error Test and Intelligent Modeling Research on the Spindle of High Speed CNC Machine Tools

    Science.gov (United States)

    Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu

    2018-03-01

    Thermal error is the main factor affecting the accuracy of precision machining. Reflecting the current research focus on machine tool thermal error, this paper studies, through experiments, thermal error testing and intelligent modeling for the spindle of vertical high-speed CNC machine tools. Several testing devices for thermal error are designed, in which 7 temperature sensors measure the temperature of the machine tool spindle system and 2 displacement sensors detect the thermal error displacement. A thermal error compensation model with good inversion-prediction ability is established by applying principal component analysis, optimizing the temperature measuring points, extracting the characteristic values closely associated with the thermal error displacement, and using artificial neural network technology.
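
    The record names principal component analysis for optimizing temperature measuring points. As a much-simplified stand-in for that step (not the paper's actual method), the sketch below ranks temperature sensors by the absolute Pearson correlation of their readings with the measured spindle displacement and keeps the top k; all data are synthetic.

```python
def select_temperature_points(temps, displacement, k=2):
    """Rank temperature sensors by |Pearson correlation| with the measured
    thermal displacement and keep the top k sensors."""
    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5
    ranked = sorted(range(len(temps)),
                    key=lambda i: abs(pearson(temps[i], displacement)),
                    reverse=True)
    return ranked[:k]

# Synthetic run: sensor 0 tracks the displacement, sensor 1 barely does
disp = [0.0, 2.1, 4.0, 5.8, 7.1, 8.0]          # spindle drift (microns)
temps = [
    [20.0, 22.0, 24.1, 25.9, 27.2, 28.0],      # strongly coupled sensor
    [20.0, 20.4, 20.1, 20.5, 20.2, 20.6],      # weakly coupled sensor
]
best = select_temperature_points(temps, disp, k=1)
```

    Reducing 7 candidate sensors to a few informative ones, as the paper does with PCA, keeps the compensation model small and avoids feeding it collinear inputs.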

  30. Research on Human-Error Factors of Civil Aircraft Pilots Based On Grey Relational Analysis

    Directory of Open Access Journals (Sweden)

    Guo Yundong

    2018-01-01

    Full Text Available Because civil aviation accidents involve many human-error factors and show the features of typical grey systems, an index system of civil aviation accident human-error factors is built using the Human Factors Analysis and Classification System (HFACS) model. With data on accidents that occurred worldwide between 2008 and 2011, the correlations between human-error factors are analyzed quantitatively using grey relational analysis. The results show that the main factors affecting pilot human error are, in order, preconditions for unsafe acts, unsafe supervision, organizational influences, and unsafe acts. Among the second-level indexes, the factor related most closely to pilot human error is the physical/mental limitations of pilots, followed by supervisory violations. The relevancy between the first-level indexes and their corresponding second-level indexes, and between the second-level indexes themselves, can also be analyzed quantitatively.
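Grey relational analysis of the kind used in this study can be reproduced generically. The sketch below is a minimal implementation with the conventional resolution coefficient ρ = 0.5; the sequences and factor names are hypothetical, not the study's data.

```python
import numpy as np

def grey_relational_grades(reference, factors, rho=0.5):
    """Grey relational grade of each factor sequence against the reference.

    reference: (n,) array; factors: (m, n) array; rho: resolution coefficient.
    Sequences are min-max normalized to [0, 1] before comparison (a common choice).
    """
    data = np.vstack([reference, factors]).astype(float)
    mins = data.min(axis=1, keepdims=True)
    ptp = np.ptp(data, axis=1, keepdims=True)
    data = (data - mins) / np.where(ptp == 0, 1, ptp)
    ref, comp = data[0], data[1:]
    delta = np.abs(comp - ref)                       # absolute difference series
    dmin, dmax = delta.min(), delta.max()            # global extrema
    xi = (dmin + rho * dmax) / (delta + rho * dmax)  # relational coefficients
    return xi.mean(axis=1)                           # relational grades

# Hypothetical yearly counts: accidents (reference) vs. two candidate factors.
accidents = np.array([30, 28, 25, 22])
factor_a  = np.array([12, 11, 10,  9])   # tracks the reference closely
factor_b  = np.array([ 5,  9,  4,  8])   # fluctuates independently
grades = grey_relational_grades(accidents, np.vstack([factor_a, factor_b]))
print(grades)
```

A higher grade marks a factor whose normalized trajectory stays closer to the reference series, which is how the study ranks the HFACS factors.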

  11. The Impact of Error-Management Climate, Error Type and Error Originator on Auditors’ Reporting Errors Discovered on Audit Work Papers

    NARCIS (Netherlands)

    A.H. Gold-Nöteberg (Anna); U. Gronewold (Ulfert); S. Salterio (Steve)

    2010-01-01

    textabstractWe examine factors affecting the auditor’s willingness to report their own or their peers’ self-discovered errors in working papers subsequent to detailed working paper review. Prior research has shown that errors in working papers are detected in the review process; however, such

  12. A New Paradigm for Diagnosing Contributions to Model Aerosol Forcing Error

    Science.gov (United States)

    Jones, A. L.; Feldman, D. R.; Freidenreich, S.; Paynter, D.; Ramaswamy, V.; Collins, W. D.; Pincus, R.

    2017-12-01

    A new paradigm in benchmark absorption-scattering radiative transfer is presented that enables both the globally averaged and spatially resolved testing of climate model radiation parameterizations in order to uncover persistent sources of biases in the aerosol instantaneous radiative effect (IRE). A proof of concept is demonstrated with the Geophysical Fluid Dynamics Laboratory AM4 and Community Earth System Model 1.2.2 climate models. Instead of prescribing atmospheric conditions and aerosols, as in prior intercomparisons, native snapshots of the atmospheric state and aerosol optical properties from the participating models are used as inputs to an accurate radiation solver to uncover model-relevant biases. These diagnostic results show that the models' aerosol IRE bias is of the same magnitude as the persistent range cited (~1 W/m2) and also varies spatially and with intrinsic aerosol optical properties. The findings underscore the significance of native model error analysis and its dispositive ability to diagnose global biases, confirming its fundamental value for the Radiative Forcing Model Intercomparison Project.

  13. Choosing appropriate independent variable in educational experimental research: some errors debunked

    Science.gov (United States)

    Panjaitan, R. L.

    2018-03-01

    It is found that a number of quantitative research reports by beginning researchers, especially undergraduate students, tend to be ‘merely’ quantitative, without a proper understanding of the variables involved in the research. This paper focuses on mistakes related to determining the independent variable in experimental research in education. Using a literature research methodology, data were gathered from an undergraduate student’s thesis as a single non-human subject. The analysis yielded several findings, such as misinterpreted variables that should have represented the research question, and an unsuitable calculation of the determination coefficient due to incorrect determination of the independent variable. When a researcher treats data as if they could serve as the independent variable when in fact they cannot, all of the subsequent data processing becomes pointless. These problems might lead to inaccurate research conclusions. In this paper, the problems are analysed and discussed. To avert such incorrectness in processing data, it is suggested that undergraduate students, as beginning researchers, acquire an adequate mastery of statistics. This study might serve as a resource that helps researchers in education to be aware of and avoid similar errors.

  14. Evaluating the thermal and electrical performance of several uncovered PVT collectors with a field test

    NARCIS (Netherlands)

    de Keizer, C.; de Jong, M.; Mendes, T.; Katiyar, M.; Folkerts, W.; Rindt, C.C.M.; Zondag, H.A.

    Recently, there has been a lot of interest in PV thermal systems, which generate both heat and power. Within the WenSDak project, several companies and research institutes work together to (further) develop several uncovered PVT collectors. The outdoor performance of prototypes of these collectors

  15. The human fallibility of scientists : Dealing with error and bias in academic research

    NARCIS (Netherlands)

    Veldkamp, Coosje

    2017-01-01

    THE HUMAN FALLIBILITY OF SCIENTISTS Dealing with error and bias in academic research Recent studies have highlighted that not all published findings in the scientific literature are trustworthy, suggesting that currently implemented control mechanisms such as high standards for the reporting of

  16. Research on effects of phase error in phase-shifting interferometer

    Science.gov (United States)

    Wang, Hongjun; Wang, Zhao; Zhao, Hong; Tian, Ailing; Liu, Bingcai

    2007-12-01

    In phase-shifting interferometry, the phase-shifting error introduced by the phase shifter is the main factor that directly affects the measurement accuracy of the interferometer. In this paper, the sources and types of phase-shifting error are introduced, and some methods to eliminate these errors are mentioned. Based on the theory of phase-shifting interferometry, the effects of phase-shifting error are analyzed in detail. A Liquid Crystal Display (LCD) used as a new type of phase shifter has the advantage that the phase shift can be controlled digitally, without any moving or rotating mechanical elements. By changing the coded image displayed on the LCD, phase shifts were induced in the measuring system. The LCD's phase-modulation characteristic was analyzed theoretically and tested. Based on the Fourier transform, a model of the phase error arising from the LCD was established for four-step phase-shifting interferometry, and the error range was obtained. To reduce this error, a new error-compensation algorithm is put forward, in which the error is obtained by processing the interferograms; the interferograms can then be compensated, and the measurement results obtained from the four-step phase-shifting interferograms. Theoretical analysis and simulation results demonstrate the feasibility of this approach to improve measurement accuracy.
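For reference, the four-step algorithm named above recovers the wrapped phase from four frames shifted by 0, π/2, π and 3π/2. A minimal simulation (background and modulation values are illustrative) is:

```python
import numpy as np

# Four-step algorithm: with intensities I_k = A + B*cos(phi + k*pi/2) at
# shifts 0, pi/2, pi, 3*pi/2, one gets tan(phi) = (I4 - I2) / (I1 - I3).
def four_step_phase(i1, i2, i3, i4):
    return np.arctan2(i4 - i2, i1 - i3)   # wrapped phase in (-pi, pi]

# Simulate fringe frames for a known test phase map (illustrative values).
phi = np.linspace(-3, 3, 101)             # true phase, radians
A, B = 2.0, 1.0                           # background and modulation
shifts = [0, np.pi/2, np.pi, 3*np.pi/2]
frames = [A + B * np.cos(phi + d) for d in shifts]
recovered = four_step_phase(*frames)
print(np.max(np.abs(recovered - phi)))    # residual at floating-point level
```

In the paper's setting each frame would come from an interferogram captured at one LCD-induced shift, and deviations of the actual shifts from exact quarter-wave steps are what the compensation algorithm corrects.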

  17. Apraxia of Speech: Perceptual Analysis of Trisyllabic Word Productions across Repeated Sampling Occasions

    Science.gov (United States)

    Mauszycki, Shannon C.; Wambaugh, Julie L.; Cameron, Rosalea M.

    2012-01-01

    Purpose: Early apraxia of speech (AOS) research has characterized errors as being variable, resulting in a number of different error types being produced on repeated productions of the same stimuli. Conversely, recent research has uncovered greater consistency in errors, but there are limited data examining sound errors over time (more than one…

  18. Hepatitis C virus host cell interactions uncovered

    DEFF Research Database (Denmark)

    Gottwein, Judith; Bukh, Jens

    2007-01-01

      Insights into virus-host cell interactions as uncovered by Randall et al. (1) in a recent issue of PNAS further our understanding of the hepatitis C virus (HCV) life cycle, persistence, and pathogenesis and might lead to the identification of new therapeutic targets. HCV persistently infects 180 million individuals worldwide, causing chronic hepatitis, liver cirrhosis, and hepatocellular carcinoma. The only approved treatment, combination therapy with IFN-α and ribavirin, targets cellular pathways (2); however, a sustained virologic response is achieved only in approximately half of the patients treated. Therefore, there is a pressing need for the identification of novel drugs against hepatitis C. Although most research focuses on the development of HCV-specific antivirals, such as protease and polymerase inhibitors (3), cellular targets could be pursued and might allow the development of broad...

  19. Minimizing Experimental Error in Thinning Research

    Science.gov (United States)

    C. B. Briscoe

    1964-01-01

    Many diverse approaches have been made to prescribing and evaluating thinnings on an objective basis. None of the techniques proposed has been widely accepted. Indeed, none has been proven superior to the others, nor even widely applicable. There are at least two possible reasons for this: none of the techniques suggested is of any general utility and/or experimental error...

  20. Is a shift from research on individual medical error to research on health information technology underway? A 40-year analysis of publication trends in medical journals.

    Science.gov (United States)

    Erlewein, Daniel; Bruni, Tommaso; Gadebusch Bondio, Mariacarla

    2018-06-07

    In 1983, McIntyre and Popper underscored the need for more openness in dealing with errors in medicine. Since then, much has been written on individual medical errors. Furthermore, at the beginning of the 21st century, researchers and medical practitioners increasingly approached individual medical errors through health information technology. Hence, the question arises whether the attention of biomedical researchers has shifted from individual medical errors to health information technology. We ran a study to determine publication trends concerning individual medical errors and health information technology in medical journals over the last 40 years. We used the Medical Subject Headings (MeSH) taxonomy in the database MEDLINE. For each year, we analyzed the percentage of relevant publications relative to the total number of publications in MEDLINE. The trends identified were tested for statistical significance. Our analysis showed that the percentage of publications dealing with individual medical errors increased from 1976 until the beginning of the 21st century but began to drop in 2003. Both the upward and the downward trends were statistically significant. The percentage of publications dealing with health information technology doubled between 2003 and 2015, an upward trend that was also statistically significant and that parallels the promotion of health information technology in the USA and the UK. © 2018 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  1. Comparison between uncovered and covered self-expandable metal stent placement in malignant duodenal obstruction.

    Science.gov (United States)

    Kim, Ji Won; Jeong, Ji Bong; Lee, Kook Lae; Kim, Byeong Gwan; Ahn, Dong Won; Lee, Jae Kyung; Kim, Su Hwan

    2015-02-07

    To compare the clinical outcomes of uncovered and covered self-expandable metal stent placements in patients with malignant duodenal obstruction, a total of 67 patients were retrospectively enrolled from January 2003 to June 2013. All patients had symptomatic obstruction characterized by nausea, vomiting, reduced oral intake, and weight loss. The exclusion criteria included asymptomatic duodenal obstruction, perforation or peritonitis, concomitant small bowel obstruction, or duodenal obstruction caused by benign strictures. The technical and clinical success rate, complication rate, and stent patency were compared according to the placement of uncovered (n = 38) or covered (n = 29) stents. The technical and clinical success rates did not differ between the uncovered and covered stent groups (100% vs 96.6% and 89.5% vs 82.8%). There were no differences in the overall complication rates between the uncovered and covered stent groups (31.6% vs 41.4%). However, stent migration occurred more frequently with covered than uncovered stents [20.7% (6/29) vs 0% (0/38)], and stent patency was longer in uncovered than in covered stents [251 d (95%CI: 149.8 d-352.2 d) vs 139 d (95%CI: 45.5 d-232.5 d)]. Median survival did not differ between the uncovered stent (70 d) and covered stent (60 d) groups. Uncovered stents may be preferable in malignant duodenal obstruction because of their greater resistance to stent migration and longer stent patency than covered stents.

  2. Research on calibration error of carrier phase against antenna arraying

    Science.gov (United States)

    Sun, Ke; Hou, Xiaomin

    2016-11-01

    A technical difficulty of uplink antenna arraying is that the signals from the separate antennas are not automatically aligned at a target in deep space. The far-field power-combining gain is directly determined by the accuracy of carrier-phase calibration, so the entire arraying system must be analyzed in order to improve that accuracy. This paper analyzes the factors contributing to the carrier-phase calibration error of an uplink antenna arraying system, including phase-measurement and equipment errors, the phase-shift error of the uplink channel, the position errors of the ground antennas, calibration receiver, and target spacecraft, and the error caused by atmospheric turbulence disturbances; spatial and temporal autocorrelation models of the atmospheric disturbances are discussed. The antennas of the uplink array share no common reference signal for continuous calibration, so the system must be calibrated periodically, with calibration referred to communication with one or more spacecraft over a given period. Because deep-space targets cannot align the combined received signal automatically, the alignment must be performed on the ground in advance. The data show that, using existing technology, the error can be controlled within the range demanded by the required accuracy of carrier-phase calibration, and the total error can be kept within a reasonable range.

  3. Prescription Errors in Psychiatry

    African Journals Online (AJOL)

    Arun Kumar Agnihotri

    clinical pharmacists in detecting errors before they have a (sometimes serious) clinical impact should not be underestimated. Research on medication error in mental health care is limited. .... participation in ward rounds and adverse drug.

  4. Common Errors in Ecological Data Sharing

    Directory of Open Access Journals (Sweden)

    Robert B. Cook

    2013-04-01

    Full Text Available Objectives: (1) to identify common errors in data organization and metadata completeness that would preclude a “reader” from being able to interpret and re-use the data for a new purpose; and (2) to develop a set of best practices derived from these common errors that would guide researchers in creating more usable data products that could be readily shared, interpreted, and used. Methods: We used directed qualitative content analysis to assess and categorize data and metadata errors identified by peer reviewers of data papers published in the Ecological Society of America’s (ESA) Ecological Archives. Descriptive statistics provided the relative frequency of the errors identified during the peer review process. Results: There were seven overarching error categories: Collection & Organization, Assure, Description, Preserve, Discover, Integrate, and Analyze/Visualize. These categories represent errors researchers regularly make at each stage of the Data Life Cycle. Collection & Organization and Description errors were some of the most common errors, both of which occurred in over 90% of the papers. Conclusions: Publishing data for sharing and reuse is error prone, and each stage of the Data Life Cycle presents opportunities for mistakes. The most common errors occurred when the researcher did not provide adequate metadata to enable others to interpret and potentially re-use the data. Fortunately, there are ways to minimize these mistakes through carefully recording all details about study context, data collection, QA/QC, and analytical procedures from the beginning of a research project and then including this descriptive information in the metadata.

  5. CORRECTING ERRORS: THE RELATIVE EFFICACY OF DIFFERENT FORMS OF ERROR FEEDBACK IN SECOND LANGUAGE WRITING

    Directory of Open Access Journals (Sweden)

    Chitra Jayathilake

    2013-01-01

    Full Text Available Error correction in ESL (English as a Second Language) classes has been a focal phenomenon in SLA (Second Language Acquisition) research due to some controversial research results and diverse feedback practices. This paper presents a study which explored the relative efficacy of three forms of error correction employed in ESL writing classes: focusing on the acquisition of one grammar element both for immediate and delayed language contexts, and collecting data from university undergraduates, this study employed an experimental research design with a pretest-treatment-posttests structure. The research revealed that the degree of success in acquiring L2 (Second Language) grammar through error correction differs according to the form of the correction and to learning contexts. While the findings are discussed in relation to the previous literature, this paper concludes creating a cline of error correction forms to be promoted in Sri Lankan L2 writing contexts, particularly in ESL contexts in Universities.

  6. An MEG signature corresponding to an axiomatic model of reward prediction error.

    Science.gov (United States)

    Talmi, Deborah; Fuentemilla, Lluis; Litvak, Vladimir; Duzel, Emrah; Dolan, Raymond J

    2012-01-02

    Optimal decision-making is guided by evaluating the outcomes of previous decisions. Prediction errors are theoretical teaching signals which integrate two features of an outcome: its inherent value and prior expectation of its occurrence. To uncover the magnetic signature of prediction errors in the human brain we acquired magnetoencephalographic (MEG) data while participants performed a gambling task. Our primary objective was to use formal criteria, based upon an axiomatic model (Caplin and Dean, 2008a), to determine the presence and timing profile of MEG signals that express prediction errors. We report analyses at the sensor level, implemented in SPM8, time locked to outcome onset. We identified, for the first time, a MEG signature of prediction error, which emerged approximately 320 ms after an outcome and expressed as an interaction between outcome valence and probability. This signal followed earlier, separate signals for outcome valence and probability, which emerged approximately 200 ms after an outcome. Strikingly, the time course of the prediction error signal, as well as the early valence signal, resembled the Feedback-Related Negativity (FRN). In simultaneously acquired EEG data we obtained a robust FRN, but the win and loss signals that comprised this difference wave did not comply with the axiomatic model. Our findings motivate an explicit examination of the critical issue of timing embodied in computational models of prediction errors as seen in human electrophysiological data. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. Error management for musicians: an interdisciplinary conceptual framework.

    Science.gov (United States)

    Kruse-Weber, Silke; Parncutt, Richard

    2014-01-01

    Musicians tend to strive for flawless performance and perfection, avoiding errors at all costs. Dealing with errors while practicing or performing is often frustrating and can lead to anger and despair, which can explain musicians' generally negative attitude toward errors and the tendency to aim for flawless learning in instrumental music education. But even the best performances are rarely error-free, and research in general pedagogy and psychology has shown that errors provide useful information for the learning process. Research in instrumental pedagogy is still neglecting error issues; the benefits of risk management (before the error) and error management (during and after the error) are still underestimated. It follows that dealing with errors is a key aspect of music practice at home, teaching, and performance in public. And yet, to be innovative, or to make their performance extraordinary, musicians need to risk errors. Currently, most music students only acquire the ability to manage errors implicitly - or not at all. A more constructive, creative, and differentiated culture of errors would balance error tolerance and risk-taking against error prevention in ways that enhance music practice and music performance. The teaching environment should lay the foundation for the development of such an approach. In this contribution, we survey recent research in aviation, medicine, economics, psychology, and interdisciplinary decision theory that has demonstrated that specific error-management training can promote metacognitive skills that lead to better adaptive transfer and better performance skills. We summarize how this research can be applied to music, and survey relevant research that is specifically tailored to the needs of musicians, including generic guidelines for risk and error management in music teaching and performance. On this basis, we develop a conceptual framework for risk management that can provide orientation for further music education and

  8. Error management for musicians: an interdisciplinary conceptual framework

    Directory of Open Access Journals (Sweden)

    Silke Kruse-Weber

    2014-07-01

    Full Text Available Musicians tend to strive for flawless performance and perfection, avoiding errors at all costs. Dealing with errors while practicing or performing is often frustrating and can lead to anger and despair, which can explain musicians’ generally negative attitude toward errors and the tendency to aim for errorless learning in instrumental music education. But even the best performances are rarely error-free, and research in general pedagogy and psychology has shown that errors provide useful information for the learning process. Research in instrumental pedagogy is still neglecting error issues; the benefits of risk management (before the error) and error management (during and after the error) are still underestimated. It follows that dealing with errors is a key aspect of music practice at home, teaching, and performance in public. And yet, to be innovative, or to make their performance extraordinary, musicians need to risk errors. Currently, most music students only acquire the ability to manage errors implicitly - or not at all. A more constructive, creative and differentiated culture of errors would balance error tolerance and risk-taking against error prevention in ways that enhance music practice and music performance. The teaching environment should lay the foundation for the development of these abilities. In this contribution, we survey recent research in aviation, medicine, economics, psychology, and interdisciplinary decision theory that has demonstrated that specific error-management training can promote metacognitive skills that lead to better adaptive transfer and better performance skills. We summarize how this research can be applied to music, and survey relevant research that is specifically tailored to the needs of musicians, including generic guidelines for risk and error management in music teaching and performance. On this basis, we develop a conceptual framework for risk management that can provide orientation for further

  9. Research on Error Modelling and Identification of 3 Axis NC Machine Tools Based on Cross Grid Encoder Measurement

    International Nuclear Information System (INIS)

    Du, Z C; Lv, C F; Hong, M S

    2006-01-01

    A new error modelling and identification method based on the cross grid encoder is proposed in this paper. In general, there are 21 error components in the geometric error of a 3-axis NC machine tool. However, according to our theoretical analysis, the squareness error among the different guide ways affects not only the translational error components but also the rotational ones. Therefore, a revised synthetic error model is developed, and the mapping relationship between the error components and the radial motion error of a round workpiece manufactured on the machine tool is deduced. This mapping relationship shows that the radial error of circular motion is the comprehensive result of all the error components of the link, worktable, sliding table, and main spindle block. To overcome the solution-singularity shortcoming of traditional error-component identification methods, a new multi-step identification method using cross grid encoder measurement is proposed, based on the kinematic error model of the NC machine tool. First, the 12 translational error components are measured and identified by the least squares method (LSM) while the machine tool performs linear motions in the three orthogonal planes XOY, XOZ, and YOZ. Second, the circular error tracks are measured while the machine tool performs circular motions in the same planes using the cross grid encoder Heidenhain KGM 182, from which the 9 rotational error components are identified by LSM. Finally, experimental validation of the above modelling theory and identification method is carried out on the 3-axis CNC vertical machining centre Cincinnati 750 Arrow, where all 21 error components are successfully measured by this method. The research shows that the multi-step modelling and identification method is well suited to 'on machine measurement'.
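The least-squares identification step can be illustrated generically: if each measured deviation is a linear function of the unknown error components, r = J·e + noise, the components follow from an ordinary least-squares solve. The sensitivity matrix and component values below are invented for the sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy version of the multi-step idea: many grid-encoder readings, each a
# known linear combination of a handful of unknown error components.
n_meas, n_err = 60, 6
J = rng.normal(size=(n_meas, n_err))        # sensitivity (Jacobian) matrix
e_true = np.array([5.0, -3.0, 2.0, 0.5, -1.0, 4.0]) * 1e-6  # micrometre scale
r = J @ e_true + 1e-8 * rng.normal(size=n_meas)             # noisy deviations

# Overdetermined system: least squares identifies the error components.
e_hat, *_ = np.linalg.lstsq(J, r, rcond=None)
print(np.max(np.abs(e_hat - e_true)))       # small identification residual
```

Measuring in several planes, as the paper does, amounts to stacking more rows into J so that components that are indistinguishable in one plane become separable, which is how the solution singularity is avoided.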

  10. 77 FR 12227 - Long Term 2 Enhanced Surface Water Treatment Rule: Uncovered Finished Water Reservoirs; Public...

    Science.gov (United States)

    2012-02-29

    ... Water Treatment Rule: Uncovered Finished Water Reservoirs; Public Meeting AGENCY: Environmental... review of the uncovered finished water reservoir requirement in the Long Term 2 Enhanced Surface Water... uncovered finished water reservoir requirement and the agency's Six Year Review process. EPA also plans to...

  11. Outlier Removal and the Relation with Reporting Errors and Quality of Psychological Research

    Science.gov (United States)

    Bakker, Marjan; Wicherts, Jelte M.

    2014-01-01

    Background The removal of outliers to acquire a significant result is a questionable research practice that appears to be commonly used in psychology. In this study, we investigated whether the removal of outliers in psychology papers is related to weaker evidence (against the null hypothesis of no effect), a higher prevalence of reporting errors, and smaller sample sizes in these papers compared to papers in the same journals that did not report the exclusion of outliers from the analyses. Methods and Findings We retrieved a total of 2667 statistical results of null hypothesis significance tests from 153 articles in main psychology journals, and compared results from articles in which outliers were removed (N = 92) with results from articles that reported no exclusion of outliers (N = 61). We preregistered our hypotheses and methods and analyzed the data at the level of articles. Results show no significant difference between the two types of articles in median p value, sample sizes, or prevalence of all reporting errors, large reporting errors, and reporting errors that concerned the statistical significance. However, we did find a discrepancy between the reported degrees of freedom of t tests and the reported sample size in 41% of articles that did not report removal of any data values. This suggests common failure to report data exclusions (or missingness) in psychological articles. Conclusions We failed to find that the removal of outliers from the analysis in psychological articles was related to weaker evidence (against the null hypothesis of no effect), sample size, or the prevalence of errors. However, our control sample might be contaminated due to nondisclosure of excluded values in articles that did not report exclusion of outliers. Results therefore highlight the importance of more transparent reporting of statistical analyses. PMID:25072606
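The degrees-of-freedom check described above is easy to automate: for an independent-samples t test, the residual degrees of freedom should equal the total sample size minus the number of groups, so a shortfall hints at unreported exclusions or missingness. A minimal sketch (the function name and counts are hypothetical):

```python
# For an independent-samples t test, df = N - (number of groups), so a
# reported df smaller than that suggests undisclosed exclusions.
def undisclosed_exclusions(reported_df, total_n, groups=2):
    """Return the number of data values unaccounted for (0 = consistent)."""
    expected_df = total_n - groups
    return expected_df - reported_df

print(undisclosed_exclusions(reported_df=58, total_n=60))  # consistent: 0
print(undisclosed_exclusions(reported_df=55, total_n=60))  # 3 values missing
```

Applied across the reported tests of an article, a positive result flags exactly the kind of discrepancy the study found in 41% of papers that reported no exclusions.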

  12. Simultaneous Treatment of Missing Data and Measurement Error in HIV Research Using Multiple Overimputation.

    Science.gov (United States)

    Schomaker, Michael; Hogger, Sara; Johnson, Leigh F; Hoffmann, Christopher J; Bärnighausen, Till; Heumann, Christian

    2015-09-01

    Both CD4 count and viral load in HIV-infected persons are measured with error. There is no clear guidance on how to deal with this measurement error in the presence of missing data. We used multiple overimputation, a method recently developed in the political sciences, to account for both measurement error and missing data in CD4 count and viral load measurements from four South African cohorts of a Southern African HIV cohort collaboration. Our knowledge about the measurement error of ln CD4 and log10 viral load is part of an imputation model that imputes both missing and mismeasured data. In an illustrative example, we estimate the association of CD4 count and viral load with the hazard of death among patients on highly active antiretroviral therapy by means of a Cox model. Simulation studies evaluate the extent to which multiple overimputation is able to reduce bias in survival analyses. Multiple overimputation emphasizes more strongly the influence of having high baseline CD4 counts compared to both a complete case analysis and multiple imputation (hazard ratio for >200 cells/mm³ vs. <25 cells/mm³: 0.21 [95% confidence interval: 0.18, 0.24] vs. 0.38 [0.29, 0.48], and 0.29 [0.25, 0.34], respectively). Similar results are obtained when varying assumptions about measurement error, when using p-splines, and when evaluating time-updated CD4 count in a longitudinal analysis. The estimates of the association with viral load are slightly more attenuated when using multiple imputation instead of multiple overimputation. Our simulation studies suggest that multiple overimputation is able to reduce bias and mean squared error in survival analyses. Multiple overimputation, which can be used with existing software, offers a convenient approach to account for both missing and mismeasured data in HIV research.

  13. Health Detectives: Uncovering the Mysteries of Disease (LBNL Science at the Theater)

    Energy Technology Data Exchange (ETDEWEB)

    Bissell, Mina; Canaria, Christie; Celnicker, Susan; Karpen, Gary

    2012-04-23

    In this April 23, 2012 Science at the Theater event, Berkeley Lab scientists discuss how they uncover the mysteries of disease in unlikely places. Speakers and topics include: World-renowned cancer researcher Mina Bissell's pioneering research on the role of the cellular microenvironment in breast cancer has changed the conversation about the disease. How does DNA instability cause disease? To find out, Christie Canaria images neural networks to study disorders such as Huntington's disease. Fruit flies can tell us a lot about ourselves. Susan Celniker explores the fruit fly genome to learn how our genome works. DNA is not destiny. Gary Karpen explores how environmental factors shape genome function and disease through epigenetics.

  14. Source memory errors in schizophrenia, hallucinations and negative symptoms: a synthesis of research findings.

    Science.gov (United States)

    Brébion, G; Ohlsen, R I; Bressan, R A; David, A S

    2012-12-01

    Previous research has shown associations between source memory errors and hallucinations in patients with schizophrenia. We bring together here findings from a broad memory investigation to specify better the type of source memory failure that is associated with auditory and visual hallucinations. Forty-one patients with schizophrenia and 43 healthy participants underwent a memory task involving recall and recognition of lists of words, recognition of pictures, memory for temporal and spatial context of presentation of the stimuli, and remembering whether target items were presented as words or pictures. False recognition of words and pictures was associated with hallucination scores. The extra-list intrusions in free recall were associated with verbal hallucinations whereas the intra-list intrusions were associated with a global hallucination score. Errors in discriminating the temporal context of word presentation and the spatial context of picture presentation were associated with auditory hallucinations. The tendency to remember verbal labels of items as pictures of these items was associated with visual hallucinations. Several memory errors were also inversely associated with affective flattening and anhedonia. Verbal and visual hallucinations are associated with confusion between internal verbal thoughts or internal visual images and perception. In addition, auditory hallucinations are associated with failure to process or remember the context of presentation of the events. Certain negative symptoms have an opposite effect on memory errors.

  15. Covered versus uncovered self-expandable metal stents for malignant biliary strictures: A meta-analysis and systematic review.

    Science.gov (United States)

    Moole, Harsha; Bechtold, Matthew L; Cashman, Micheal; Volmar, Fritz H; Dhillon, Sonu; Forcione, David; Taneja, Deepak; Puli, Srinivas R

    2016-09-01

    Self-expandable metal stents (SEMS) are used for palliating inoperable malignant biliary strictures. It is unclear whether covered metal stents are superior to uncovered metal stents in these patients. We compared clinical outcomes in patients with covered and uncovered stents. Studies using covered and uncovered metallic stents for palliation in patients with malignant biliary stricture were reviewed. Articles were searched in MEDLINE, PubMed, and Ovid journals. Fixed and random effects models were used to calculate the pooled proportions. The initial search identified 1436 reference articles, of which 132 were selected and reviewed. Thirteen studies (n = 2239) comparing covered and uncovered metallic stents met the inclusion criteria and were included in this analysis. The odds ratio for stent occlusion rates in covered vs. uncovered stents was 0.79 (95 % CI = 0.65 to 0.96). The odds ratio for survival benefit in patients with covered vs. uncovered stents was 1.29 (95 % CI = 0.95 to 1.74). The pooled odds ratio for migration of covered vs. uncovered stents was 9.9 (95 % CI = 4.5 to 22.3). Covered stents appeared to have significantly lower occlusion rates, increased odds of migration, and increased odds of pancreatitis compared to uncovered stents. There was no statistically significant difference in survival benefit, overall adverse event rate, or patency period between covered and uncovered metal stents in patients with malignant biliary strictures.
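    The pooled odds ratios reported above come from weighting per-study log odds ratios; a minimal fixed-effect (inverse-variance) sketch, using hypothetical 2x2 counts rather than the thirteen included studies:

```python
import math

# Hypothetical per-study 2x2 counts (illustrative only, not the trial data):
# (occluded_covered, patent_covered, occluded_uncovered, patent_uncovered)
studies = [
    (12, 88, 20, 80),
    (8, 92, 15, 85),
    (10, 110, 14, 106),
]

def pooled_odds_ratio(studies):
    """Fixed-effect (inverse-variance) pooling of per-study log odds ratios."""
    num = den = 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # Woolf variance of a log OR
        weight = 1.0 / var
        num += weight * log_or
        den += weight
    log_pooled = num / den
    se = math.sqrt(1.0 / den)
    ci = (math.exp(log_pooled - 1.96 * se), math.exp(log_pooled + 1.96 * se))
    return math.exp(log_pooled), ci

pooled, (lo, hi) = pooled_odds_ratio(studies)
# With these counts the pooled OR is below 1 and its 95% CI excludes 1,
# the same form as the reported occlusion result (OR 0.79, CI 0.65-0.96).
```

    A random-effects model would additionally widen the interval by a between-study variance term; the fixed-effect version above shows the core computation.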

  16. Pierre Bourdieu's Theory of Practice offers nurses a framework to uncover embodied knowledge of patients living with disabilities or illnesses: A discussion paper.

    Science.gov (United States)

    Oerther, Sarah; Oerther, Daniel B

    2018-04-01

    To discuss how Bourdieu's theory of practice can be used by nurse researchers to better uncover the embodied knowledge of patients living with disability and illness. Bourdieu's theory of practice has been used in social and healthcare research. This theory emphasizes that an individual's everyday practices are not always explicit and mediated by language; instead, they are often tacit and embodied. Discussion paper. Ovid MEDLINE, CINAHL and SCOPUS were searched for concepts from Bourdieu's theory that were used to understand the embodied knowledge of patients living with disability and illness. The literature search included articles from 2003 - 2017. Nurse researchers should use Bourdieu's theory of practice to uncover the embodied knowledge of patients living with disability and illness, and they should translate these discoveries into policy recommendations and improved evidence-based best practice. The practice of nursing should incorporate an understanding of embodied knowledge to support disabled and ill patients as these patients modify "everyday practices" in the light of their disabilities and illnesses. Bourdieu's theory enriches nursing because it allows for consideration of both the objective and the subjective through the conceptualization of capital, habitus and field. Uncovering individuals' embodied knowledge is critical to implementing best practices that assist patients as they adapt to bodily changes during disability and illness. © 2017 John Wiley & Sons Ltd.

  17. Use of (Time-Domain) Vector Autoregressions to Test Uncovered Interest Parity

    OpenAIRE

    Takatoshi Ito

    1984-01-01

    In this paper, a vector autoregression model (VAR) is proposed in order to test uncovered interest parity (UIP) in the foreign exchange market. Consider a VAR system of the spot exchange rate (yen/dollar), the domestic (US) interest rate and the foreign (Japanese) interest rate, describing the interdependence of the domestic and international financial markets. Uncovered interest parity is stated as a null hypothesis that the current difference between the two interest rates is equal to the d...

  18. The "Measuring Outcomes of Clinical Connectivity" (MOCC) trial: investigating data entry errors in the Electronic Primary Care Research Network (ePCRN).

    Science.gov (United States)

    Fontaine, Patricia; Mendenhall, Tai J; Peterson, Kevin; Speedie, Stuart M

    2007-01-01

    The electronic Primary Care Research Network (ePCRN) enrolled PBRN researchers in a feasibility trial to test the functionality of the network's electronic architecture and investigate error rates associated with two data entry strategies used in clinical trials. PBRN physicians and research assistants who registered with the ePCRN were eligible to participate. After online consent and randomization, participants viewed simulated patient records, presented as either abstracted data (short form) or progress notes (long form). Participants transcribed 50 data elements onto electronic case report forms (CRFs) without integrated field restrictions. Data errors were analyzed. Ten geographically dispersed PBRNs enrolled 100 members and completed the study in less than 7 weeks. The estimated overall error rate if field restrictions had been applied was 2.3%. Participants entering data from the short form had a higher rate of correctly entered data fields (94.5% vs 90.8%, P = .004) and significantly more error-free records (P = .003). Feasibility outcomes integral to completion of an Internet-based, multisite study were successfully achieved. Further development of programmable electronic safeguards is indicated. The error analysis conducted in this study will aid design of specific field restrictions for electronic CRFs, an important component of clinical trial management systems.
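    The short-form vs. long-form comparison above is a two-proportion comparison; a minimal sketch with hypothetical counts chosen to reproduce the reported 94.5% and 90.8% rates (the denominators are assumptions, not the trial's actual field counts):

```python
import math

# Hypothetical field counts; only the rates (94.5% vs 90.8%) come from the
# study, the denominators are assumed for illustration.
short_ok, short_n = 2363, 2500   # short form: 94.5% of fields correct
long_ok, long_n = 2270, 2500     # long form: 90.8% of fields correct

p1, p2 = short_ok / short_n, long_ok / long_n
p_pool = (short_ok + long_ok) / (short_n + long_n)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / short_n + 1 / long_n))
z = (p1 - p2) / se  # two-proportion z statistic; |z| > 1.96 ~ p < .05
```

    With denominators of this size the difference is clearly significant, consistent with the P = .004 reported in the abstract; with much smaller denominators the same rates could fail to reach significance.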

  19. Comparison of Covered Versus Uncovered Stents for Benign Superior Vena Cava (SVC) Obstruction.

    Science.gov (United States)

    Haddad, Mustafa M; Simmons, Benjamin; McPhail, Ian R; Kalra, Manju; Neisen, Melissa J; Johnson, Matthew P; Stockland, Andrew H; Andrews, James C; Misra, Sanjay; Bjarnason, Haraldur

    2018-05-01

    To identify whether long-term symptom relief and stent patency vary with the use of covered versus uncovered stents for the treatment of benign SVC obstruction. We retrospectively identified all patients with benign SVC syndrome treated with stent placement between January 2003 and December 2015 (n = 59). Only cases with both clinical and imaging follow-up were included (n = 47). In 33 (70%) of the patients, the obstruction was due to a central line or pacemaker wires, and in 14 (30%), the cause was fibrosing mediastinitis. Covered stents were placed in 17 (36%) of the patients, and 30 (64%) patients had an uncovered stent. Clinical and treatment outcomes, complications, and the percent stenosis of each stent were evaluated. Technical success was achieved in all cases at first attempt. Average clinical and imaging follow-up in years was 2.7 (range 0.1-11.1) (covered) and 1.7 (range 0.2-10.5) (uncovered), respectively. There was a significant difference (p = 0.044) in the number of patients who reported a return of symptoms between the covered (5/17 or 29.4%) and uncovered (18/30 or 60%) groups. There was also a significant difference in the percent stenosis after stent placement between the covered [17.9% (range 0-100) ± 26.2] and uncovered [48.3% (range 6.8-100) ± 33.5] groups. No significant difference (p = 0.227) was found in the time (days) between the date of the procedure and the date of clinical follow-up where a return of symptoms was reported [covered: 426.6 (range 28-1554) ± 633.9 and uncovered 778.1 (range 23-3851) ± 1066.8]. One patient in the uncovered group had non-endovascular surgical intervention (innominate to right atrial bypass), while none in the covered group required surgical intervention. One major complication (SIR grade C) occurred that consisted of a pericardial hemorrhagic effusion after angioplasty that required covered stent placement. There were no procedure-related deaths. Both covered and uncovered stents can be used for

  20. Human errors and mistakes

    International Nuclear Information System (INIS)

    Wahlstroem, B.

    1993-01-01

    Human errors make a major contribution to the risks of industrial accidents. Accidents have provided important lessons, making it possible to build safer systems. In avoiding human errors it is necessary to adapt the systems to their operators. The complexity of modern industrial systems is, however, increasing the danger of system accidents. Models of the human operator have been proposed, but the models are not able to give accurate predictions of human performance. Human errors can never be eliminated, but their frequency can be decreased by systematic efforts. The paper gives a brief summary of research on human error and concludes with suggestions for further work. (orig.)

  1. Error-related anterior cingulate cortex activity and the prediction of conscious error awareness

    Directory of Open Access Journals (Sweden)

    Catherine Orr

    2012-06-01

    Full Text Available Research examining the neural mechanisms associated with error awareness has consistently identified dorsal anterior cingulate cortex (ACC) activity as necessary but not predictive of conscious error detection. Two recent studies (Steinhauser and Yeung, 2010; Wessel et al., 2011) have found a contrary pattern of greater dorsal ACC activity (in the form of the error-related negativity) during detected errors, but suggested that the greater activity may instead reflect task influences (e.g., response conflict, error probability) and/or individual variability (e.g., statistical power). We re-analyzed fMRI BOLD data from 56 healthy participants who had previously been administered the Error Awareness Task, a motor Go/No-go response inhibition task in which subjects make errors of commission of which they are aware (Aware errors) or unaware (Unaware errors). Consistent with previous data, the activity in a number of cortical regions was predictive of error awareness, including bilateral inferior parietal and insula cortices; however, in contrast to previous studies, including our own smaller-sample studies using the same task, error-related dorsal ACC activity was significantly greater during aware errors than during unaware errors. While the significantly faster RT for aware errors (compared to unaware) was consistent with the hypothesis of higher response conflict increasing ACC activity, we could find no relationship between dorsal ACC activity and the error RT difference. The data suggest that individual variability in error awareness is associated with error-related dorsal ACC activity, and therefore this region may be important to conscious error detection, but it remains unclear what task and individual factors influence error awareness.

  2. DOES UNCOVERED INTEREST RATE PARITY HOLD IN TURKEY?

    Directory of Open Access Journals (Sweden)

    Ozcan Karahan

    2012-01-01

    Full Text Available Most of the earlier empirical studies focusing on developed countries failed to give evidence in favor of Uncovered Interest Rate Parity (UIP). After intensive financial liberalization processes, and with free exchange rate regimes mostly preferred, a new line of research investigates whether UIP holds differently for developing economies. Accordingly, we tested UIP on Turkey’s monthly interest rate and exchange rate data between 2002 and 2011. We ran conventional regressions in the form of Ordinary Least Squares (OLS) and used a simple Generalized Autoregressive Conditional Heteroskedasticity (GARCH) analysis. The empirical results of both methods do not support the validity of UIP for Turkey. Thus, in line with most of the earlier empirical studies focusing on developed countries and detecting the invalidity of UIP, we can argue that the experiences of Turkey and the developed economies are not different.
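    The conventional OLS test of UIP regresses the subsequent exchange-rate depreciation on the interest differential and tests whether the slope equals one; a sketch on synthetic monthly data (the series, true coefficient, and sample size are illustrative assumptions, not the Turkish data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120  # monthly observations, roughly a 2002-2011 span

# Synthetic data: interest differential (domestic minus foreign) and the
# subsequent exchange-rate depreciation. Under UIP the slope is 1.
diff = rng.normal(0.05, 0.02, n)
beta_true = 0.3  # deliberately far from 1, mimicking a UIP failure
deprec = beta_true * diff + rng.normal(0, 0.01, n)

# OLS: deprec = alpha + beta * diff + e
X = np.column_stack([np.ones(n), diff])
alpha, beta = np.linalg.lstsq(X, deprec, rcond=None)[0]

# t statistic for H0: beta = 1 (the UIP null)
resid = deprec - X @ np.array([alpha, beta])
s2 = resid @ resid / (n - 2)
se_beta = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
t_uip = (beta - 1.0) / se_beta  # strongly negative here: UIP rejected
```

    A GARCH specification, as used in the abstract, additionally models time-varying error variance; the OLS slope test above is the core of both approaches.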

  3. Game Design Principles based on Human Error

    Directory of Open Access Journals (Sweden)

    Guilherme Zaffari

    2016-03-01

    Full Text Available This paper presents the results of the authors’ research on incorporating Human Error, through design principles, into video game design. In general, designers must consider Human Error factors throughout video game interface development; however, when it comes to core design, adaptations are needed, since challenge is an important factor for fun, and under the perspective of Human Error, challenge can be considered a flaw in the system. The research utilized Human Error classifications, data triangulation via predictive human error analysis, and the expanded flow theory to design a set of principles that match the design of playful challenges with the principles of Human Error. From the results, it was possible to conclude that the application of Human Error in game design has a positive effect on player experience, allowing the player to interact only with errors associated with the intended aesthetics of the game.

  4. Human Errors in Decision Making

    OpenAIRE

    Mohamad, Shahriari; Aliandrina, Dessy; Feng, Yan

    2005-01-01

    The aim of this paper was to identify human errors in the decision-making process. The study focused on the research question: what human errors can potentially cause decision failure during the evaluation of alternatives in the decision-making process? Two case studies were selected from the literature and analyzed to find the human errors that contributed to decision failure. The analysis of human errors was then linked with mental models in the alternative-evaluation step. The results o...

  5. Uncovering Indicators of Commercial Sexual Exploitation.

    Science.gov (United States)

    Bounds, Dawn; Delaney, Kathleen R; Julion, Wrenetha; Breitenstein, Susan

    2017-07-01

    It is estimated that annually 100,000 to 300,000 youth are at risk for sex trafficking; a commercial sex act induced by force, fraud, or coercion, or any such act where the person induced to perform such an act is younger than 18 years of age. Increasingly, such transactions are occurring online via Internet-based sites that serve the commercial sex industry. Commercial sex transactions involving trafficking are illegal; thus, Internet discussions between those involved must be veiled. Even so, transactions around sex trafficking do occur. Within these transactions are innuendos that provide one avenue for detecting potential activity. The purpose of this study is to identify linguistic indicators of potential commercial sexual exploitation within the online comments of men posted on an Internet site. Six hundred sixty-six posts from five Midwest cities and 363 unique members were analyzed via content analysis. Three main indicators were found: the presence of youth or desire for youthfulness, presence of pimps, and awareness of vulnerability. These findings begin a much-needed dialogue on uncovering online risks of commercial sexual exploitation and support the need for further research on Internet indicators of sex trafficking.

  6. Electronic error-reporting systems: a case study into the impact on nurse reporting of medical errors.

    Science.gov (United States)

    Lederman, Reeva; Dreyfus, Suelette; Matchan, Jessica; Knott, Jonathan C; Milton, Simon K

    2013-01-01

    Underreporting of errors in hospitals persists despite the claims of technology companies that electronic systems will facilitate reporting. This study builds on previous analyses to examine error reporting by nurses in hospitals using electronic media. This research asks whether the electronic media create additional barriers to error reporting and, if so, what practical steps all hospitals can take to reduce these barriers. This is a mixed-method case study of nurses' use of an error-reporting system, RiskMan, in two hospitals. The case study involved one large private hospital and one large public hospital in Victoria, Australia, both of which use the RiskMan medical error reporting system. Information technology-based error reporting systems have unique access problems and time demands and can encourage nurses to develop alternative reporting mechanisms. This research focuses on nurses and raises important findings for hospitals using such systems or considering installation. This article suggests organizational and technical responses that could reduce some of the identified barriers. Crown Copyright © 2013. Published by Mosby, Inc. All rights reserved.

  7. Long Term Follow-up of a Transjugular Intrahepatic Portosystemic Shunt: A Comparison of Covered and Uncovered Stents

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Seung Moon; Park, Jae Hyung; Kim, Hyo Cheol; Jae, Hwan Jun; Chung, Jin Wook [Seoul National University Hospital, Seoul (Korea, Republic of)

    2009-01-15

    To evaluate the long-term patency of transjugular intrahepatic portosystemic shunts (TIPS) and to compare the patency rates of covered and uncovered stents in TIPS. The study population included 78 patients with portal hypertension who underwent TIPS between January 1999 and July 2007 at our institution, using uncovered stents in 53 patients and covered stents in 25 patients. The primary and secondary patency rates of TIPS were estimated to compare the uncovered and covered stent groups. The primary and secondary patency rates of the TIPS patients were found to be 83.9% and 93.9% at the 6-month follow-up and 73.5% and 88.5% at the 12-month follow-up for uncovered and covered stents, respectively. A breakdown of patency rates at the 12-month follow-up revealed that the primary patency rates were 76.6% and 66.3% for uncovered and covered stents, respectively, whereas the secondary patency rates were 94.3% and 73.8% for the uncovered and covered stents, respectively. A comparative analysis did not provide evidence to suggest that a difference exists between the patency rates of the uncovered and covered stent groups (p>0.05). No significant difference was found between the patency rates of the uncovered and covered stent groups. A follow-up to this study would be a more thorough randomized evaluation of the different types of covered stents to compare long-term patency rates.
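    Patency rates like those above are typically Kaplan-Meier estimates, where shunts still patent at last follow-up are censored rather than counted as failures; a minimal sketch with hypothetical follow-up data (not the 78-patient series):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival (patency) curve.

    times  : follow-up time for each shunt (e.g., months)
    events : 1 if the shunt occluded at that time, 0 if censored (still patent)
    Returns a list of (time, estimated patency) steps.
    """
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t, occluded in sorted(zip(times, events)):
        if occluded:  # occlusion event: patency estimate steps down
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1  # events and censorings both leave the risk set
    return curve

# Hypothetical follow-up data for ten shunts: (months, occluded?)
times = [2, 4, 6, 6, 8, 10, 12, 12, 14, 16]
events = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
curve = kaplan_meier(times, events)
```

    The key property is that a censored shunt reduces the number at risk without lowering the estimate, which is why Kaplan-Meier patency differs from a naive "fraction still open" rate.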

  8. Systematic Procedural Error

    National Research Council Canada - National Science Library

    Byrne, Michael D

    2006-01-01

    .... This problem has received surprisingly little attention from cognitive psychologists. The research summarized here examines such errors in some detail both empirically and through computational cognitive modeling...

  9. Students’ Written Production Error Analysis in the EFL Classroom Teaching: A Study of Adult English Learners Errors

    Directory of Open Access Journals (Sweden)

    Ranauli Sihombing

    2016-12-01

    Full Text Available Error analysis has become one of the most interesting issues in the study of Second Language Acquisition. It cannot be denied that some teachers do not know much about error analysis and the related theories of how an L1, L2, or foreign language is acquired. In addition, the students often feel upset since they find a gap between themselves and the teachers regarding the errors the students make and the teachers’ understanding of error correction. The present research aims to investigate what errors adult English learners make in written production of English. The significance of the study is to identify the errors students make in writing so that teachers can find solutions to them for better English language teaching and learning, especially in teaching English to adults. The study employed a qualitative method. The research was undertaken at an airline education center in Bandung. The results showed that syntax errors are more frequently found than morphology errors, especially verb phrase errors. It is recommended that teachers know the theory of second language acquisition in order to understand how students learn and produce their language. In addition, it will be advantageous for teachers to know what errors students frequently make in their learning, so that they can help the students achieve better English language learning outcomes. DOI: https://doi.org/10.24071/llt.2015.180205

  10. Uncovering student ideas in physical science

    CERN Document Server

    Keeley, Page

    2014-01-01

    If you and your students can't get enough of a good thing, Volume 2 of Uncovering Student Ideas in Physical Science is just what you need. The book offers 39 new formative assessment probes, this time with a focus on electric charge, electric current, and magnets and electromagnetism. It can help you do everything from demystify electromagnetic fields to explain the real reason balloons stick to the wall after you rub them on your hair.

  11. Positive Beliefs about Errors as an Important Element of Adaptive Individual Dealing with Errors during Academic Learning

    Science.gov (United States)

    Tulis, Maria; Steuer, Gabriele; Dresel, Markus

    2018-01-01

    Research on learning from errors gives reason to assume that errors provide a high potential to facilitate deep learning if students are willing and able to take these learning opportunities. The first aim of this study was to analyse whether beliefs about errors as learning opportunities can be theoretically and empirically distinguished from…

  12. Partially Covered Metal Stents May Not Prolong Stent Patency Compared to Uncovered Stents in Unresectable Malignant Distal Biliary Obstruction

    Science.gov (United States)

    Kim, Jae Yun; Ko, Gyu Bong; Lee, Tae Hoon; Park, Sang-Heum; Lee, Yun Nah; Cho, Young Sin; Jung, Yunho; Chung, Il-Kwun; Choi, Hyun Jong; Cha, Sang-Woo; Moon, Jong Ho; Cho, Young Deok; Kim, Sun-Joo

    2017-01-01

    Background/Aims Controversy still exists regarding the benefits of covered self-expandable metal stents (SEMSs) compared to uncovered SEMSs. We aimed to compare the patency and stent-related adverse events of partially covered SEMSs (PC-SEMSs) and uncovered SEMSs in unresectable malignant distal biliary obstruction. Methods A total of 134 patients who received a PC-SEMS or uncovered SEMS for palliation of unresectable malignant distal biliary obstruction were reviewed retrospectively. The main outcome measures were stent patency, stent-related adverse events, and overall survival. Results The median stent patency was 118 days (range, 3 to 802 days) with PC-SEMSs and 105 days (range, 2 to 485 days) with uncovered SEMSs (p=0.718). The overall endoscopic revision rate due to stent dysfunction was 36.6% (26/71) with PC-SEMSs and 36.5% (23/63) with uncovered SEMSs (p=0.589). Tumor ingrowth was more frequent with uncovered SEMSs (4.2% vs 19.1%, p=0.013), but migration was more frequent with PC-SEMSs (11.2% vs 1.5%, p=0.04). The incidence of stent-related adverse events was 2.8% (2/71) with PC-SEMSs and 9.5% (6/63) with uncovered SEMSs (p=0.224). The median overall survival was 166 days with PC-SEMSs and 168 days with uncovered SEMSs (p=0.189). Conclusions Compared to uncovered SEMSs, PC-SEMSs did not prolong stent patency in unresectable malignant distal biliary obstruction. Stent migration was more frequent with PC-SEMSs. However, tumor ingrowth was less frequent with PC-SEMSs compared to uncovered SEMSs. PMID:28208003

  13. Malignant Gastroduodenal Obstruction: Treatment with Self-Expanding Uncovered Wallstent

    International Nuclear Information System (INIS)

    Gutzeit, Andreas; Binkert, Christoph A.; Schoch, Eric; Sautter, Thomas; Jost, Res; Zollikofer, Christoph L.

    2009-01-01

    Purpose: To retrospectively evaluate the clinical effectiveness of a self-expanding uncovered Wallstent in patients with malignant gastroduodenal obstruction. Materials and Methods: Under combined endoscopic and fluoroscopic guidance, 29 patients with a malignant gastroduodenal stenosis were treated with a self-expanding uncovered metallic Wallstent. A dysphagia score was assessed before and after the intervention to measure the success of this palliative therapy. The dysphagia score ranged from grade 0 to grade 4: grade 0 = able to tolerate solid food, grade 1 = able to tolerate soft food, grade 2 = able to tolerate thick liquids, grade 3 = able to tolerate water or clear fluids, and grade 4 = unable to tolerate anything perorally. Stent patency and patient survival rates were calculated. Results: The insertion of the gastroduodenal stent was technically successful in 28 patients (96.5%). After stenting, 25 patients (86.2%) showed clinical improvement by at least one score point. During follow-up, 22 (78.5%) of 28 patients showed no stent occlusion until death and did not have to undergo any further intervention. In six patients (20.6%), all of whom were treated with secondary stent insertions, occlusion with tumor ingrowth and/or overgrowth was observed after the intervention. The median period of primary stent patency in our study was 240 days. Conclusion: Placement of an uncovered Wallstent is clinically effective in patients with malignant gastroduodenal obstruction. Stent placement is associated with high technical success, good palliation effect, and high durability of stent function.

  14. 78 FR 75353 - Agency Information Collection Activities: Proposed Collection: Public Comment Request

    Science.gov (United States)

    2013-12-11

    ... identified in a traditional survey interview, such as interpretive errors and recall accuracy, are uncovered... as more basic research on response errors in surveys. HRSA staff use various techniques to evaluate...-based questionnaires. The most common questionnaire evaluation method is the cognitive interview. The...

  15. Error-related brain activity and error awareness in an error classification paradigm.

    Science.gov (United States)

    Di Gregorio, Francesco; Steinhauser, Marco; Maier, Martin E

    2016-10-01

    Error-related brain activity has been linked to error detection enabling adaptive behavioral adjustments. However, it is still unclear which role error awareness plays in this process. Here, we show that the error-related negativity (Ne/ERN), an event-related potential reflecting early error monitoring, is dissociable from the degree of error awareness. Participants responded to a target while ignoring two different incongruent distractors. After responding, they indicated whether they had committed an error, and if so, whether they had responded to one or to the other distractor. This error classification paradigm allowed distinguishing partially aware errors, (i.e., errors that were noticed but misclassified) and fully aware errors (i.e., errors that were correctly classified). The Ne/ERN was larger for partially aware errors than for fully aware errors. Whereas this speaks against the idea that the Ne/ERN foreshadows the degree of error awareness, it confirms the prediction of a computational model, which relates the Ne/ERN to post-response conflict. This model predicts that stronger distractor processing - a prerequisite of error classification in our paradigm - leads to lower post-response conflict and thus a smaller Ne/ERN. This implies that the relationship between Ne/ERN and error awareness depends on how error awareness is related to response conflict in a specific task. Our results further indicate that the Ne/ERN but not the degree of error awareness determines adaptive performance adjustments. Taken together, we conclude that the Ne/ERN is dissociable from error awareness and foreshadows adaptive performance adjustments. Our results suggest that the relationship between the Ne/ERN and error awareness is correlative and mediated by response conflict. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Proceedings of the international workshop on building the new HRA: errors of commission - from research to application

    International Nuclear Information System (INIS)

    2003-01-01

    The main mission of the Working Group on Risk Assessment (RISK) is to advance the understanding and utilisation of probabilistic safety analysis (PSA) in ensuring continued safety of nuclear installations in Member countries. One of the major criticisms of current PSAs is that they do not adequately address an important class of human system interactions, namely inappropriate actions, particularly those that might occur during the response to a transient or accident, that place the plant in a situation of higher risk. This class of inappropriate actions is often referred to as 'errors of commission'. The principal characteristic of an error of commission in a PSA context is that its consequence is a state of unavailability of a component, system or function. This is in contrast to an error of omission, which is characterised by a lack of action and, therefore, preserves the status quo of a system, component, or function. In the PSA context, the most significant errors of commission are those that, in addition to resulting in failure to perform some function, also fail or make unavailable other equipment or functions needed to mitigate the accident scenario, or otherwise exacerbate the situation. The workshop reported herein is an extension of the work of the Working Group on Risk Assessment (RISK) performed to review errors of commission in probabilistic safety analysis (NEA/CSNI/R(2000)17). The main purpose of the meeting was to provide a forum for exchange of information including lessons learned, identification of gaps in our current understanding and knowledge, data needs, and research needs. This workshop also provides a perspective for another workshop, Building the New HRA: Strengthening the Link Between Experience and HRA, to be held in Munich in January of 2002. Individual speakers present a broad international perspective that summarises technical issues, lessons learned, and experiences gained through applying second-generation human reliability

  17. Rotational error in path integration: encoding and execution errors in angle reproduction.

    Science.gov (United States)

    Chrastil, Elizabeth R; Warren, William H

    2017-06-01

    Path integration is fundamental to human navigation. When a navigator leaves home on a complex outbound path, they are able to keep track of their approximate position and orientation and return to their starting location on a direct homebound path. However, there are several sources of error during path integration. Previous research has focused almost exclusively on encoding error-the error in registering the outbound path in memory. Here, we also consider execution error-the error in the response, such as turning and walking a homebound trajectory. In two experiments conducted in ambulatory virtual environments, we examined the contribution of execution error to the rotational component of path integration using angle reproduction tasks. In the reproduction tasks, participants rotated once and then rotated again to face the original direction, either reproducing the initial turn or turning through the supplementary angle. One outstanding difficulty in disentangling encoding and execution error during a typical angle reproduction task is that as the encoding angle increases, so does the required response angle. In Experiment 1, we dissociated these two variables by asking participants to report each encoding angle using two different responses: by turning to walk on a path parallel to the initial facing direction in the same (reproduction) or opposite (supplementary angle) direction. In Experiment 2, participants reported the encoding angle by turning both rightward and leftward onto a path parallel to the initial facing direction, over a larger range of angles. The results suggest that execution error, not encoding error, is the predominant source of error in angular path integration. These findings also imply that the path integrator uses an intrinsic (action-scaled) rather than an extrinsic (objective) metric.

  18. Error framing effects on performance: cognitive, motivational, and affective pathways.

    Science.gov (United States)

    Steele-Johnson, Debra; Kalinoski, Zachary T

    2014-01-01

    Our purpose was to examine whether positive error framing, that is, making errors salient and cuing individuals to see errors as useful, can benefit learning when task exploration is constrained. Recent research has demonstrated the benefits of a newer approach to training, error management training, which includes the opportunity to actively explore the task and frames errors as beneficial to learning complex tasks (Keith & Frese, 2008). Other research has highlighted the important role of errors in on-the-job learning in complex domains (Hutchins, 1995). Participants (N = 168) from a large undergraduate university performed a class scheduling task. Results provided support for a hypothesized path model in which error framing influenced cognitive, motivational, and affective factors, which in turn differentially affected performance quantity and quality. Within this model, error framing had significant direct effects on metacognition and self-efficacy. Our results suggest that positive error framing can have beneficial effects even when tasks cannot be structured to support extensive exploration. Whereas future research can expand our understanding of error framing effects on outcomes, results from the current study suggest that positive error framing can facilitate learning from errors in real-time performance of tasks.

  19. Nonresponse Error in Mail Surveys: Top Ten Problems

    Directory of Open Access Journals (Sweden)

    Jeanette M. Daly

    2011-01-01

    Full Text Available Conducting mail surveys can result in nonresponse error, which occurs when the potential participant is unwilling to participate or impossible to contact. Nonresponse can result in a reduction in precision of the study and may bias results. The purpose of this paper is to describe and make readers aware of a top ten list of mailed survey problems affecting the response rate encountered over time with different research projects, while utilizing the Dillman Total Design Method. Ten nonresponse error problems were identified, such as inserter machine gets sequence out of order, capitalization in databases, and mailing discarded by postal service. These ten mishaps can potentiate nonresponse errors, but there are ways to minimize their frequency. Suggestions offered stem from our own experiences during research projects. Our goal is to increase researchers' knowledge of nonresponse error problems and to offer solutions which can decrease nonresponse error in future projects.

  20. Analysis of Medication Error Reports

    Energy Technology Data Exchange (ETDEWEB)

    Whitney, Paul D.; Young, Jonathan; Santell, John; Hicks, Rodney; Posse, Christian; Fecht, Barbara A.

    2004-11-15

    In medicine, as in many areas of research, technological innovation and the shift from paper based information to electronic records has created a climate of ever increasing availability of raw data. There has been, however, a corresponding lag in our abilities to analyze this overwhelming mass of data, and classic forms of statistical analysis may not allow researchers to interact with data in the most productive way. This is true in the emerging area of patient safety improvement. Traditionally, a majority of the analysis of error and incident reports has been carried out based on an approach of data comparison, and starts with a specific question which needs to be answered. Newer data analysis tools have been developed which allow the researcher to not only ask specific questions but also to “mine” data: approach an area of interest without preconceived questions, and explore the information dynamically, allowing questions to be formulated based on patterns brought up by the data itself. Since 1991, United States Pharmacopeia (USP) has been collecting data on medication errors through voluntary reporting programs. USP’s MEDMARXsm reporting program is the largest national medication error database and currently contains well over 600,000 records. Traditionally, USP has conducted an annual quantitative analysis of data derived from “pick-lists” (i.e., items selected from a list of items) without an in-depth analysis of free-text fields. In this paper, the application of text analysis and data analysis tools used by Battelle to analyze the medication error reports already analyzed in the traditional way by USP is described. New insights and findings were revealed including the value of language normalization and the distribution of error incidents by day of the week. The motivation for this effort is to gain additional insight into the nature of medication errors to support improvements in medication safety.
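The weekday finding mentioned above can be illustrated with a toy version of the analysis: counting normalized incident records by day of the week. The record layout and error categories below are invented for illustration and are not MEDMARX's actual schema.

```python
from collections import Counter
from datetime import date

# Hypothetical incident records: (ISO date string, normalized error type).
# Both fields are illustrative, not MEDMARX's actual format.
records = [
    ("2004-03-01", "omission"), ("2004-03-02", "wrong dose"),
    ("2004-03-02", "omission"), ("2004-03-05", "wrong drug"),
    ("2004-03-06", "omission"), ("2004-03-08", "wrong dose"),
]

# Distribution of error incidents by day of the week.
by_weekday = Counter(
    date.fromisoformat(d).strftime("%A") for d, _ in records
)
for day, n in by_weekday.most_common():
    print(day, n)
```

A real analysis would first normalize the free-text fields (spelling, abbreviations, synonyms) before any such counting, which is the "language normalization" step the abstract highlights.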

  1. A Research on the Responsibility of Accounting Professionals to Determine and Prevent Accounting Errors and Frauds: Edirne Sample

    Directory of Open Access Journals (Sweden)

    Semanur Adalı

    2017-09-01

    Full Text Available In this study, the ethical dimensions of accounting professionals related to accounting errors and frauds were examined. Firstly, general and technical information about accounting was provided. Then, terminology on error, fraud and ethics in accounting was discussed. The study also included recent statistics about accounting errors and fraud, as well as a literature review. As the research methodology, a questionnaire was distributed to 36 accounting professionals residing in the city of Edirne, Turkey. The collected data were then entered into the SPSS package program for analysis. The study revealed very important results. Accounting professionals think that accounting chambers do not organize enough seminars/conferences on errors and fraud. They also believe that the supervision and disciplinary boards of professional accounting chambers fulfill their responsibilities only partially. The attitude of professional accounting chambers towards errors, fraud and ethics is considered neither strict nor lenient. Still, most accounting professionals are aware of colleagues who have received disciplinary penalties. External audit is indicated as the most important and effective tool to prevent errors and fraud, but internal audit and internal control are valued as well. According to accounting professionals, most errors occur due to incorrect data received from clients and mistakes made during recording. Fraud is generally committed in order to obtain credit from banks and to benefit the organization by concealing the firm's real situation. Finally, accounting professionals state that being honest, trustworthy and impartial is the basis of the accounting profession and that accountants must adhere to ethical rules.

  2. Human errors in NPP operations

    International Nuclear Information System (INIS)

    Sheng Jufang

    1993-01-01

    Based on the operational experiences of nuclear power plants (NPPs), the importance of studying human performance problems is described. Statistical analysis of the significance and frequency of various root causes and error modes in a large number of human-error-related events demonstrates that defects in operation/maintenance procedures, workplace factors, communication and training practices are the primary root causes, while omission, transposition and quantitative mistakes are the most frequent error modes. Recommendations for domestic research on human performance problems in NPPs are suggested

  3. An Analysis of Medication Errors at the Military Medical Center: Implications for a Systems Approach for Error Reduction

    National Research Council Canada - National Science Library

    Scheirman, Katherine

    2001-01-01

    An analysis was accomplished of all inpatient medication errors at a military academic medical center during the year 2000, based on the causes of medication errors as described by current research in the field...

  4. The error model and experiment of measuring angular position error based on laser collimation

    Science.gov (United States)

    Cai, Yangyang; Yang, Jing; Li, Jiakun; Feng, Qibo

    2018-01-01

    The rotary axis is the reference component of rotational motion. Among the six degree-of-freedom (DOF) geometric errors of a rotary axis, angular position error is the most critical factor impairing machining precision. In this paper, a method for measuring the angular position error of a rotary axis based on laser collimation is thoroughly researched: the error model is established, and 360° full-range measurement is realized using a high-precision servo turntable. The change in spatial attitude of each moving part is described accurately by 3×3 transformation matrices, and the influences of various factors on the measurement results are analyzed in detail. Experimental results show that the measurement method can achieve high measurement accuracy over a large measurement range.
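The attitude-propagation idea described above can be sketched with plain rotation matrices: represent the commanded and actual attitudes of the rotary axis as 3×3 matrices and recover the angular position error from their relative rotation. The angle values below are illustrative, not taken from the paper's experiments.

```python
import math

def rot_z(theta):
    """3x3 rotation matrix about the z (rotary) axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul_T(a, b):
    """Compute a^T @ b for 3x3 matrices: the relative rotation between
    two attitudes a and b."""
    return [[sum(a[k][i] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Illustrative values: commanded 30 deg, actual has a 12 arcsec error.
commanded = math.radians(30.0)
err = math.radians(12.0 / 3600.0)
R_cmd, R_act = rot_z(commanded), rot_z(commanded + err)

# Angular position error recovered from the relative rotation R_cmd^T R_act.
R_rel = matmul_T(R_cmd, R_act)
recovered = math.atan2(R_rel[1][0], R_rel[0][0])
print(math.degrees(recovered) * 3600.0)  # error in arcseconds
```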

  5. Social learning through prediction error in the brain

    Science.gov (United States)

    Joiner, Jessica; Piva, Matthew; Turrin, Courtney; Chang, Steve W. C.

    2017-06-01

    Learning about the world is critical to survival and success. In social animals, learning about others is a necessary component of navigating the social world, ultimately contributing to increasing evolutionary fitness. How humans and nonhuman animals represent the internal states and experiences of others has long been a subject of intense interest in the developmental psychology tradition, and, more recently, in studies of learning and decision making involving self and other. In this review, we explore how psychology conceptualizes the process of representing others, and how neuroscience has uncovered correlates of reinforcement learning signals to explore the neural mechanisms underlying social learning from the perspective of representing reward-related information about self and other. In particular, we discuss self-referenced and other-referenced types of reward prediction errors across multiple brain structures that effectively allow reinforcement learning algorithms to mediate social learning. Prediction-based computational principles in the brain may be strikingly conserved between self-referenced and other-referenced information.
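The reward prediction errors discussed above follow the standard temporal-difference form delta = r + gamma*V(s') - V(s). A minimal self-referenced sketch, in which a cue comes to predict a reward; the state names, learning rate and reward values are invented for illustration:

```python
# Minimal temporal-difference sketch of a reward prediction error (RPE):
# delta = r + gamma * V(next state) - V(state).
gamma, alpha = 0.9, 0.1          # discount factor and learning rate
V = {"cue": 0.0, "outcome": 0.0}  # learned value estimates

# Repeated pairings of cue -> outcome, with reward 1 at the outcome.
# (An other-referenced RPE would use the observed reward to another agent.)
for _ in range(100):
    # Prediction error at the outcome state (terminal, so no future value).
    delta_outcome = 1.0 + gamma * 0.0 - V["outcome"]
    V["outcome"] += alpha * delta_outcome
    # The error propagates back to the predictive cue.
    delta_cue = 0.0 + gamma * V["outcome"] - V["cue"]
    V["cue"] += alpha * delta_cue

print(round(V["outcome"], 3), round(V["cue"], 3))
```

After enough pairings the cue itself carries the (discounted) reward prediction, which is the conserved prediction-based principle the review describes.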

  6. Infant search and object permanence: a meta-analysis of the A-not-B error.

    Science.gov (United States)

    Wellman, H M; Cross, D; Bartsch, K

    1987-01-01

    Research on Piaget's stage 4 object concept has failed to reveal a clear or consistent pattern of results. Piaget found that 8-12-month-old infants would make perseverative errors; his explanation for this phenomenon was that the infant's concept of the object was contextually dependent on his or her actions. Some studies designed to test Piaget's explanation have replicated Piaget's basic finding, yet many have found no preference for the A location or the B location or an actual preference for the B location. More recently, researchers have attempted to uncover the causes for these results concerning the A-not-B error. Again, however, different studies have yielded different results, and qualitative reviews have failed to yield a consistent explanation for the results of the individual studies. This state of affairs suggests that the phenomenon may simply be too complex to be captured by individual studies varying 1 factor at a time and by reviews based on similar qualitative considerations. Therefore, the current investigation undertook a meta-analysis, a synthesis capturing the quantitative information across the now sizable number of studies. We entered several important factors into the meta-analysis, including the effects of age, the number of A trials, the length of delay between hiding and search, the number of locations, the distances between locations, and the distinctive visual properties of the hiding arrays. Of these, the analysis consistently indicated that age, delay, and number of hiding locations strongly influence infants' search. The pattern of specific findings also yielded new information about infant search. A general characterization of the results is that, at every age, both above-chance and below-chance performance was observed. That is, at each age at least 1 combination of delay and number of locations yielded above-chance A-not-B errors or significant perseverative search. At the same time, at each age at least 1 alternative
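As a schematic of the quantitative synthesis step such a meta-analysis performs, a fixed-effect model pools per-study effect sizes by inverse-variance weighting. The effect sizes and variances below are invented for illustration, not values from the A-not-B literature:

```python
# Fixed-effect meta-analysis sketch: pool per-study effect sizes by
# inverse-variance weighting. All numbers are invented for illustration.
studies = [  # (effect size, within-study variance)
    (0.42, 0.04), (0.31, 0.09), (0.55, 0.02), (0.18, 0.12),
]
weights = [1.0 / v for _, v in studies]

# Pooled effect: precision-weighted mean; its standard error shrinks as
# more (and more precise) studies are combined.
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5
print(round(pooled, 3), round(pooled_se, 3))
```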

  7. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  8. Identifying Lattice, Orbit, And BPM Errors in PEP-II

    International Nuclear Information System (INIS)

    Decker, F.-J.; SLAC

    2005-01-01

    The PEP-II B-Factory is delivering peak luminosities of up to 9.2 × 10^33 cm^-2 s^-1. This is very impressive, especially considering our poor understanding of the lattice, the absolute orbit and the beam position monitor (BPM) system. A few simple MATLAB programs were written to get lattice information from the current machine, such as the betatron functions in a coupled machine (four altogether) and the two dispersions, and compare it to the design. Big orbit deviations in the Low Energy Ring (LER) could be explained not by bad BPMs (only 3), but by many strong correctors (one corrector per four BPMs on average). Additionally, these programs helped to uncover a sign error in the third-order correction of the BPM system. Further analysis of the current information from the BPMs (the sum of all buttons) indicates that there might still be more problematic BPMs

  9. Generalizing human error rates: A taxonomic approach

    International Nuclear Information System (INIS)

    Buffardi, L.; Fleishman, E.; Allen, J.

    1989-01-01

    It is well established that human error plays a major role in malfunctioning of complex, technological systems and in accidents associated with their operation. Estimates of the rate of human error in the nuclear industry range from 20-65% of all system failures. In response to this, the Nuclear Regulatory Commission has developed a variety of techniques for estimating human error probabilities for nuclear power plant personnel. Most of these techniques require the specification of the range of human error probabilities for various tasks. Unfortunately, very little objective performance data on error probabilities exist for nuclear environments. Thus, when human reliability estimates are required, for example in computer simulation modeling of system reliability, only subjective estimates (usually based on experts' best guesses) can be provided. The objective of the current research is to provide guidelines for the selection of human error probabilities based on actual performance data taken in other complex environments and applying them to nuclear settings. A key feature of this research is the application of a comprehensive taxonomic approach to nuclear and non-nuclear tasks to evaluate their similarities and differences, thus providing a basis for generalizing human error estimates across tasks. In recent years significant developments have occurred in classifying and describing tasks. Initial goals of the current research are to: (1) identify alternative taxonomic schemes that can be applied to tasks, and (2) describe nuclear tasks in terms of these schemes. Three standardized taxonomic schemes (Ability Requirements Approach, Generalized Information-Processing Approach, Task Characteristics Approach) are identified, modified, and evaluated for their suitability in comparing nuclear and non-nuclear power plant tasks. An agenda for future research and its relevance to nuclear power plant safety is also discussed

  10. Conformable covered versus uncovered self-expandable metallic stents for palliation of malignant gastroduodenal obstruction: a randomized prospective study.

    Science.gov (United States)

    Lim, Sun Gyo; Kim, Jin Hong; Lee, Kee Myung; Shin, Sung Jae; Kim, Chan Gyoo; Kim, Kyung Ho; Kim, Ho Gak; Yang, Chang Heon

    2014-07-01

    A conformable self-expandable metallic stent was developed to overcome the limitations of previous self-expandable metallic stents. The aim of this study was to evaluate outcomes after placement of conformable covered and uncovered self-expandable metallic stents for palliation of malignant gastroduodenal obstruction. A single-blind, randomized, parallel-group, prospective study was conducted in 4 medical centres between March 2009 and July 2012. 134 patients with unresectable malignant gastroduodenal obstruction were assigned to a covered double-layered (n=66) or uncovered unfixed-cell braided (n=68) stent placement group. The primary analysis compared re-intervention rates between the two groups. 120 patients were analysed (59 in the covered group and 61 in the uncovered group). Overall rates of re-intervention were not significantly different between the two groups: 13/59 (22.0%) in the covered group vs. 13/61 (21.3%) in the uncovered group, p=0.999. Stent migration was more frequent in the covered group than in the uncovered group (p=0.003). The tumour ingrowth rate was higher in the uncovered group than in the covered group (p=0.016). The rates of re-intervention did not significantly differ between the two stents. Conformable covered double-layered and uncovered unfixed-cell braided stents were associated with different patterns of stent malfunction. Copyright © 2014 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.

  11. Hemispheric Asymmetries in the Activation and Monitoring of Memory Errors

    Science.gov (United States)

    Giammattei, Jeannette; Arndt, Jason

    2012-01-01

    Previous research on the lateralization of memory errors suggests that the right hemisphere's tendency to produce more memory errors than the left hemisphere reflects hemispheric differences in semantic activation. However, all prior research that has examined the lateralization of memory errors has used self-paced recognition judgments. Because…

  12. Error and uncertainty in scientific practice

    NARCIS (Netherlands)

    Boumans, M.; Hon, G.; Petersen, A.C.

    2014-01-01

    Assessment of error and uncertainty is a vital component of both natural and social science. Empirical research involves dealing with all kinds of errors and uncertainties, yet there is significant variance in how such results are dealt with. Contributors to this volume present case studies of

  13. Uncovering the Density of Matter from Multiplicity Distribution

    International Nuclear Information System (INIS)

    Bialas, A.

    2010-01-01

    Multiplicity distributions observed in multiparticle production, which take the form of a superposition of Poisson distributions, are interpreted as a reflection of the two-step nature of this process: the creation and evolution of a strongly interacting fluid, followed by its uncorrelated decay into the observed hadrons. A method to uncover the density of the fluid from the observed multiplicity distribution is described. (author)

  14. Multi-frequency complex network from time series for uncovering oil-water flow structure.

    Science.gov (United States)

    Gao, Zhong-Ke; Yang, Yu-Xuan; Fang, Peng-Cheng; Jin, Ning-De; Xia, Cheng-Yi; Hu, Li-Dan

    2015-02-04

    Uncovering complex oil-water flow structure represents a challenge in diverse scientific disciplines. This challenge stimulates us to develop a new distributed conductance sensor for measuring local flow signals at different positions and then propose a novel approach based on multi-frequency complex networks to uncover the flow structures from experimental multivariate measurements. In particular, based on the Fast Fourier transform, we demonstrate how to derive multi-frequency complex networks from multivariate time series. We construct complex networks at different frequencies and then detect community structures. Our results indicate that the community structures faithfully represent the structural features of oil-water flow patterns. Furthermore, we investigate the network statistics at different frequencies for each derived network and find that the frequency clustering coefficient makes it possible to uncover the evolution of flow patterns and yields deep insights into the formation of flow structures. The current results present a first step towards a network visualization of complex flow patterns from a community structure perspective.
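A minimal sketch of the multi-frequency construction, under simplifying assumptions: band-limit each sensor's amplitude spectrum with a discrete Fourier transform, then link sensors whose spectra within that band are strongly correlated. The synthetic signals, band edges and threshold below are illustrative, not the authors' actual pipeline.

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (fine for short signals)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def band_amplitudes(x, lo, hi):
    """Amplitude spectrum of x restricted to DFT bins [lo, hi)."""
    return [abs(c) for c in dft(x)[lo:hi]]

def corr(a, b):
    """Pearson correlation; near-constant inputs count as uncorrelated."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((p - ma) * (q - mb) for p, q in zip(a, b))
    va = sum((p - ma) ** 2 for p in a)
    vb = sum((q - mb) ** 2 for q in b)
    if va < 1e-9 or vb < 1e-9:
        return 0.0
    return cov / math.sqrt(va * vb)

def band_network(signals, lo, hi, thresh=0.8):
    """Edges between sensors whose band spectra are strongly correlated."""
    spectra = [band_amplitudes(s, lo, hi) for s in signals]
    m = len(signals)
    return {(i, j) for i in range(m) for j in range(i + 1, m)
            if corr(spectra[i], spectra[j]) > thresh}

# Three synthetic "sensor" signals: 0 and 1 share a low-frequency component,
# 2 oscillates at a higher frequency outside the chosen band.
n = 64
s0 = [math.sin(2 * math.pi * 2 * k / n) for k in range(n)]
s1 = [0.9 * math.sin(2 * math.pi * 2 * k / n) for k in range(n)]
s2 = [math.sin(2 * math.pi * 13 * k / n) for k in range(n)]
print(band_network([s0, s1, s2], lo=1, hi=8))
```

Repeating this per band and running a community-detection step on each resulting network corresponds to the frequency-resolved analysis the abstract describes.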

  15. Analysis on the dynamic error for optoelectronic scanning coordinate measurement network

    Science.gov (United States)

    Shi, Shendong; Yang, Linghui; Lin, Jiarui; Guo, Siyang; Ren, Yongjie

    2018-01-01

    Large-scale dynamic three-dimensional coordinate measurement techniques are in strong demand in equipment manufacturing. Noted for its advantages of high accuracy, scale expandability and multitask parallel measurement, the optoelectronic scanning measurement network has attracted close attention. It is widely used in the joining of large components, spacecraft rendezvous and docking simulation, digital shipbuilding and automated guided vehicle navigation. At present, most research on optoelectronic scanning measurement networks is focused on static measurement capacity, and research on dynamic accuracy is insufficient. Limited by the measurement principle, the dynamic error is non-negligible and restricts the application. The workshop measurement and positioning system is a representative system that can, in theory, realize dynamic measurement. In this paper we conduct deep research on dynamic error sources and divide them into two parts: phase error and synchronization error. A dynamic error model is constructed. Based on this model, a simulation of the dynamic error is carried out. The dynamic error is quantified, and rules of volatility and periodicity have been found. Dynamic error characteristics are shown in detail. The research results lay the foundation for further accuracy improvement.

  16. The uncorrected refractive error challenge

    Directory of Open Access Journals (Sweden)

    Kovin Naidoo

    2016-11-01

    Full Text Available Refractive error affects people of all ages, socio-economic statuses and ethnic groups. The most recent statistics estimate that, worldwide, 32.4 million people are blind and 191 million people have vision impairment. Vision impairment has been defined based on distance visual acuity only, and uncorrected distance refractive error (mainly myopia is the single biggest cause of vision impairment worldwide. However, when we also consider near vision impairment, it is clear that even more people are affected. Research estimated the number of people with vision impairment due to uncorrected distance refractive error at 107.8 million,1 and the number affected by uncorrected near refractive error at 517 million, giving a total of 624.8 million people.

  17. Modeling coherent errors in quantum error correction

    Science.gov (United States)

    Greenbaum, Daniel; Dutton, Zachary

    2018-01-01

    Analysis of quantum error correcting codes is typically done using a stochastic, Pauli channel error model for describing the noise on physical qubits. However, it was recently found that coherent errors (systematic rotations) on physical data qubits result in both physical and logical error rates that differ significantly from those predicted by a Pauli model. Here we examine the accuracy of the Pauli approximation for noise containing coherent errors (characterized by a rotation angle ε) under the repetition code. We derive an analytic expression for the logical error channel as a function of arbitrary code distance d and concatenation level n, in the small error limit. We find that coherent physical errors result in logical errors that are partially coherent and therefore non-Pauli. However, the coherent part of the logical error is negligible at fewer than ε^(-(dn-1)) error correction cycles when the decoder is optimized for independent Pauli errors, thus providing a regime of validity for the Pauli approximation. Above this number of correction cycles, the persistent coherent logical error will cause logical failure more quickly than the Pauli model would predict, and this may need to be combated with coherent suppression methods at the physical level or larger codes.
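The qualitative distinction above can be checked with a one-line calculation: coherent rotations add in amplitude, so after n identical steps the error probability scales as sin²(nε), whereas independent stochastic errors add (to first order) in probability. A toy comparison, with illustrative parameter values unrelated to the paper's derivation:

```python
import math

eps = 0.01   # per-step rotation angle in radians (illustrative)
n = 50       # number of applications

# Coherent: rotations compose in amplitude, so error probability ~ sin^2(n*eps).
p_coherent = math.sin(n * eps) ** 2

# Stochastic (Pauli-like): each step independently errs with probability
# sin^2(eps), and these accumulate (to first order) linearly.
p_step = math.sin(eps) ** 2
p_stochastic = 1.0 - (1.0 - p_step) ** n

print(p_coherent, p_stochastic)
```

For these values the coherent accumulation is tens of times larger than the stochastic one, which is the sense in which a Pauli model underestimates coherent noise once enough cycles have elapsed.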

  18. Use of geographic information systems to assess the error associated with the use of place of residence in injury research.

    Science.gov (United States)

    Amram, Ofer; Schuurman, Nadine; Yanchar, Natalie L; Pike, Ian; Friger, Michael; Griesdale, Donald

    In any spatial research, the use of accurate location data is critical to the reliability of the results. Unfortunately, many of the administrative data sets used in injury research do not include the location at which the injury took place. The aim of this paper is to examine the error associated with using place of residence, as opposed to place of injury, when identifying injury hotspots and hospital access. Traumatic Brain Injury (TBI) data from the BC Trauma Registry (BCTR) were used to identify all TBI patients admitted to BC hospitals between January 2000 and March 2013. In order to estimate how locational error affects the identification of injury hotspots, the data were aggregated to the level of dissemination area (DA) and census tract (CT), and a linear regression was performed using place of residence as a predictor of place of injury. To assess the impact of locational error in studies examining hospital access, we analysed the driving time between place of injury and place of residence, as well as the difference between the driving time from place of residence to the treatment hospital and that from place of injury to the same hospital. The driving time analysis indicated that 73.3 % of the injuries occurred within 5 min of place of residence, 11.2 % between five and ten minutes and 15.5 % over 20 min. Misclassification error occurs at both the DA and CT levels. The residual map of the DAs clearly shows more detailed misclassification. As expected, the driving time between place of residence and place of injury and the difference between these same two locations and the treatment hospital share a positive relationship. In fact, the larger the distance between the two locations, the larger the error when estimating access to hospital. Our results highlight the need for more systematic recording of place of injury, as this will allow researchers to more accurately pinpoint where injuries occur. It will also allow researchers to
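As a crude stand-in for the driving-time comparison, the great-circle (haversine) distance between place of residence and place of injury gives a lower bound on the locational error. The coordinates below are illustrative and not taken from the BCTR data:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km: a straight-line proxy for the
    locational error between two recorded places."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Illustrative coordinates: a residence in downtown Vancouver and an
# injury site roughly 13 km to the south.
residence = (49.2827, -123.1207)
injury = (49.1666, -123.1336)
print(round(haversine_km(*residence, *injury), 1))
```

A full replication would use a road network and drive-time routing rather than straight-line distance, but the straight-line figure already shows how far apart the two recorded locations can be.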

  19. Errors in causal inference: an organizational schema for systematic error and random error.

    Science.gov (United States)

    Suzuki, Etsuji; Tsuda, Toshihide; Mitsuhashi, Toshiharu; Mansournia, Mohammad Ali; Yamamoto, Eiji

    2016-11-01

    To provide an organizational schema for systematic error and random error in estimating causal measures, aimed at clarifying the concept of errors from the perspective of causal inference. We propose to divide systematic error into structural error and analytic error. With regard to random error, our schema shows its four major sources: nondeterministic counterfactuals, sampling variability, a mechanism that generates exposure events and measurement variability. Structural error is defined from the perspective of counterfactual reasoning and divided into nonexchangeability bias (which comprises confounding bias and selection bias) and measurement bias. Directed acyclic graphs are useful to illustrate this kind of error. Nonexchangeability bias implies a lack of "exchangeability" between the selected exposed and unexposed groups. A lack of exchangeability is not a primary concern of measurement bias, justifying its separation from confounding bias and selection bias. Many forms of analytic errors result from the small-sample properties of the estimator used and vanish asymptotically. Analytic error also results from wrong (misspecified) statistical models and inappropriate statistical methods. Our organizational schema is helpful for understanding the relationship between systematic error and random error from a previously less investigated aspect, enabling us to better understand the relationship between accuracy, validity, and precision. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Automated Classification of Phonological Errors in Aphasic Language

    Science.gov (United States)

    Ahuja, Sanjeev B.; Reggia, James A.; Berndt, Rita S.

    1984-01-01

    Using heuristically guided state-space search, a prototype program has been developed to simulate and classify phonemic errors occurring in the speech of neurologically impaired patients. Simulations are based on an interchangeable rule/operator set of elementary errors which represents a theory of phonemic processing faults. This work introduces and evaluates a novel approach to error simulation and classification; it provides a prototype simulation tool for neurolinguistic research and forms the initial phase of a larger research effort involving computer modelling of neurolinguistic processes.

  1. [Errors in Peruvian medical journals references].

    Science.gov (United States)

    Huamaní, Charles; Pacheco-Romero, José

    2009-01-01

    References are fundamental in our studies; an adequate selection is as important as an adequate description. Objective: to determine the number of errors in a sample of references found in Peruvian medical journals. We reviewed 515 references of scientific papers, selected by systematic randomized sampling, and corroborated the reference information against the original document or its citation in PubMed, LILACS or SciELO-Peru. We found errors in 47.6% (245) of the references, identifying 372 types of errors; the most frequent were errors in presentation style (120), authorship (100) and title (100), mainly due to spelling mistakes (91). The percentage of reference errors was high, varied and multiple. We suggest a systematic revision of references in the editorial process, as well as extending the discussion on this theme. Keywords: references, periodicals, research, bibliometrics.

  2. Soft errors in modern electronic systems

    CERN Document Server

    Nicolaidis, Michael

    2010-01-01

    This book provides a comprehensive presentation of the most advanced research results and technological developments enabling understanding, qualifying and mitigating the soft errors effect in advanced electronics, including the fundamental physical mechanisms of radiation induced soft errors, the various steps that lead to a system failure, the modelling and simulation of soft error at various levels (including physical, electrical, netlist, event driven, RTL, and system level modelling and simulation), hardware fault injection, accelerated radiation testing and natural environment testing, s

  3. Random measurement error: Why worry? An example of cardiovascular risk factors.

    Science.gov (United States)

    Brakenhoff, Timo B; van Smeden, Maarten; Visseren, Frank L J; Groenwold, Rolf H H

    2018-01-01

    With the increased use of data not originally recorded for research, such as routine care data (or 'big data'), measurement error is bound to become an increasingly relevant problem in medical research. A common view among medical researchers on the influence of random measurement error (i.e. classical measurement error) is that its presence leads to some degree of systematic underestimation of studied exposure-outcome relations (i.e. attenuation of the effect estimate). For the common situation where the analysis involves at least one exposure and one confounder, we demonstrate that the direction of effect of random measurement error on the estimated exposure-outcome relations can be difficult to anticipate. Using three example studies on cardiovascular risk factors, we illustrate that random measurement error in the exposure and/or confounder can lead to underestimation as well as overestimation of exposure-outcome relations. We therefore advise medical researchers to refrain from making claims about the direction of effect of measurement error in their manuscripts, unless the appropriate inferential tools are used to study or alleviate the impact of measurement error from the analysis.
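
The point that random measurement error can push an adjusted estimate in either direction is easy to reproduce in a small simulation. This is an illustrative sketch, not the authors' analysis; the data-generating model (a linear outcome with one exposure and one positively correlated confounder) is an assumption made for demonstration:

```python
import numpy as np

# Assumed data-generating model for this sketch:
# outcome = 1.0*exposure + 1.0*confounder + noise, with exposure and
# confounder positively correlated.
rng = np.random.default_rng(0)
n = 200_000
confounder = rng.normal(size=n)
exposure = 0.8 * confounder + rng.normal(size=n)
outcome = exposure + confounder + rng.normal(size=n)

def adjusted_slope(x, c, y):
    """OLS coefficient of x in the model y ~ intercept + x + c."""
    X = np.column_stack([np.ones_like(x), x, c])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

measurement_error = rng.normal(size=n)  # classical (random) error

b_true = adjusted_slope(exposure, confounder, outcome)
b_exposure_err = adjusted_slope(exposure + measurement_error, confounder, outcome)
b_confounder_err = adjusted_slope(exposure, confounder + measurement_error, outcome)

print(f"no measurement error: {b_true:.2f}")           # close to the true 1.0
print(f"error in exposure:    {b_exposure_err:.2f}")   # attenuated, below 1.0
print(f"error in confounder:  {b_confounder_err:.2f}") # inflated: residual confounding
```

Mismeasuring the exposure attenuates its coefficient, but mismeasuring the confounder leaves confounding incompletely adjusted and inflates the same coefficient — the direction of bias depends on where the error sits.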

  4. #2 - An Empirical Assessment of Exposure Measurement Error ...

    Science.gov (United States)

    Background • Differing degrees of exposure error across pollutants • Previous focus on quantifying and accounting for exposure error in single-pollutant models • Examine exposure errors for multiple pollutants and provide insights on the potential for bias and attenuation of effect estimates in single- and bi-pollutant epidemiological models. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of the EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of the EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between, and characterize processes that link, source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.

  5. Students’ Errors in Geometry Viewed from Spatial Intelligence

    Science.gov (United States)

    Riastuti, N.; Mardiyana, M.; Pramudya, I.

    2017-09-01

    Geometry is one of the difficult topics because students must be able to visualize, describe images, draw shapes, and know the kinds of shapes. The aim of this study is to describe student errors, based on Newman's Error Analysis, in solving geometry problems viewed from spatial intelligence. This research uses a descriptive qualitative method with a purposive sampling technique. The data in this research are the results of a geometry test and interviews with 8th graders of a Junior High School in Indonesia. The results of this study show that each category of spatial intelligence has a different type of error in solving problems on geometry material. Errors are mostly made by students with low spatial intelligence because they have deficiencies in visual abilities. Analysis of student errors viewed from spatial intelligence is expected to help students reflect when solving geometry problems.

  6. Errors in Neonatology

    Directory of Open Access Journals (Sweden)

    Antonio Boldrini

    2013-06-01

    Full Text Available Introduction: Danger and errors are inherent in human activities. In medical practice, errors can lead to adverse events for patients, and the mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience at the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main error domains are: medication and total parenteral nutrition, resuscitation and respiratory care, invasive procedures, nosocomial infections, patient identification, and diagnostics. Risk factors include patients' size, prematurity, vulnerability and underlying disease conditions, but also multidisciplinary teams, working conditions that produce fatigue, and the large variety of treatment and investigative modalities needed. Discussion and Conclusions: In our opinion, it is hardly possible to change human beings, but it is possible to change the conditions under which they work. Voluntary error reporting systems can help in preventing adverse events. Education and re-training by means of simulation can be an effective strategy too. In Pisa (Italy), Nina (ceNtro di FormazIone e SimulazioNe NeonAtale) is a simulation center that offers the possibility of continuous retraining in technical and non-technical skills to optimize neonatological care strategies. Furthermore, we have been working on a novel skill trainer for mechanical ventilation (MEchatronic REspiratory System SImulator for Neonatal Applications, MERESSINA). Finally, in our opinion, national health policy indirectly influences the risk for errors. Proceedings of the 9th International Workshop on Neonatology · Cagliari (Italy) · October 23rd-26th, 2013 · Learned lessons, changing practice and cutting-edge research

  7. Reward positivity: Reward prediction error or salience prediction error?

    Science.gov (United States)

    Heydari, Sepideh; Holroyd, Clay B

    2016-08-01

    The reward positivity is a component of the human ERP elicited by feedback stimuli in trial-and-error learning and guessing tasks. A prominent theory holds that the reward positivity reflects a reward prediction error signal that is sensitive to outcome valence, being larger for unexpected positive events relative to unexpected negative events (Holroyd & Coles, 2002). Although the theory has found substantial empirical support, most of these studies have utilized either monetary or performance feedback to test the hypothesis. However, in apparent contradiction to the theory, a recent study found that unexpected physical punishments also elicit the reward positivity (Talmi, Atkinson, & El-Deredy, 2013). The authors of this report argued that the reward positivity reflects a salience prediction error rather than a reward prediction error. To investigate this finding further, in the present study participants navigated a virtual T maze and received feedback on each trial under two conditions. In a reward condition, the feedback indicated that they would either receive a monetary reward or not and in a punishment condition the feedback indicated that they would receive a small shock or not. We found that the feedback stimuli elicited a typical reward positivity in the reward condition and an apparently delayed reward positivity in the punishment condition. Importantly, this signal was more positive to the stimuli that predicted the omission of a possible punishment relative to stimuli that predicted a forthcoming punishment, which is inconsistent with the salience hypothesis. © 2016 Society for Psychophysiological Research.
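
The distinction under test reduces to two lines of arithmetic: a reward prediction error is signed by outcome valence, whereas a salience prediction error is not. A minimal illustrative sketch (the values and function names are assumptions, not the study's model):

```python
# Both signals compare outcome against expectation; only the reward
# prediction error keeps the sign of the outcome's valence.
def reward_prediction_error(expected, outcome):
    return outcome - expected

def salience_prediction_error(expected, outcome):
    return abs(outcome - expected)

neutral_expectation = 0.0
unexpected_reward, unexpected_shock = +1.0, -1.0

print(reward_prediction_error(neutral_expectation, unexpected_reward))    # 1.0
print(reward_prediction_error(neutral_expectation, unexpected_shock))     # -1.0
print(salience_prediction_error(neutral_expectation, unexpected_reward))  # 1.0
print(salience_prediction_error(neutral_expectation, unexpected_shock))   # 1.0
```

Under the salience account the last two values are identical, so feedback predicting an omitted punishment and feedback predicting a delivered punishment should elicit similar signals — which is what the reported results contradict.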

  8. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include evolution of error correction techniques, industrial user needs, architectures, and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and the next generation standards; • Provides coverage of industrial user needs advanced error correcting techniques.
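
As a concrete instance of the kind of machinery the book covers, here is a minimal sketch of single-bit error correction with the classic Hamming(7,4) code; the example is illustrative and not drawn from the book's chapters:

```python
import numpy as np

# A minimal Hamming(7,4) decoder: column i of the parity-check matrix H is
# the 3-bit binary representation of i+1, so the syndrome of a received
# word directly names the position of a single flipped bit.
H = np.array([[(i >> 2) & 1 for i in range(1, 8)],
              [(i >> 1) & 1 for i in range(1, 8)],
              [(i >> 0) & 1 for i in range(1, 8)]])

def correct(received):
    """Correct at most one flipped bit in a 7-bit Hamming codeword."""
    syndrome = H @ received % 2
    pos = int(4 * syndrome[0] + 2 * syndrome[1] + syndrome[2])  # 0 means clean
    fixed = received.copy()
    if pos:
        fixed[pos - 1] ^= 1
    return fixed

codeword = np.zeros(7, dtype=int)  # the all-zero word is always a codeword
noisy = codeword.copy()
noisy[4] ^= 1                      # the channel flips bit 5
assert (correct(noisy) == codeword).all()
```

Hardware implementations discussed in the book optimize exactly this kind of syndrome computation, scaled up to far stronger codes such as Polar and LDPC.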

  9. Uncovering multiple populations with washington photometry. I. The globular cluster NGC 1851

    Energy Technology Data Exchange (ETDEWEB)

    Cummings, Jeffrey D.; Geisler, D.; Villanova, S. [Departamento de Astronomía, Casilla 160-C, Universidad de Concepción (Chile); Carraro, G. [ESO, Alonso de Cordova 3107, Casilla 19001, Santiago de Chile (Chile)

    2014-08-01

    The analysis of multiple populations (MPs) in globular clusters (GCs) has become a forefront area of research in astronomy. Multiple red giant branches (RGBs), subgiant branches (SGBs), and even main sequences (MSs) have now been observed photometrically in many GCs, while broad abundance distributions of certain elements have been detected spectroscopically in most, if not all, GCs. UV photometry has been crucial in discovering and analyzing these MPs, but the Johnson U and the Stromgren and Sloan u filters that have generally been used are relatively inefficient and very sensitive to reddening and atmospheric extinction. In contrast, the Washington C filter is much broader and redder than these competing UV filters, making it far more efficient at detecting MPs and much less sensitive to reddening and extinction. Here, we investigate the use of the Washington system to uncover MPs using only a 1 m telescope. Our analysis of the well-studied GC NGC 1851 finds that the C filter is both very efficient and effective at detecting its previously discovered MPs in the RGB and SGB. Remarkably, we have also detected an intrinsically broad MS best characterized by two distinct but heavily overlapping populations that cannot be explained by binaries, field stars, or photometric errors. The MS distribution is in very good agreement with that seen on the RGB, with ∼30% of the stars belonging to the second population. There is also evidence for two sequences in the red horizontal branch, but this appears to be unrelated to the MPs in this cluster. Neither of these latter phenomena has been observed previously in this cluster. The redder MS stars are also more centrally concentrated than the blue MS. This is the first time MPs in an MS have been discovered from the ground, and using only a 1 m telescope. The Washington system thus proves to be a very powerful tool for investigating MPs, and holds particular promise for extragalactic objects where photons are limited.

  10. Numerical optimization with computational errors

    CERN Document Server

    Zaslavski, Alexander J

    2016-01-01

    This book studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking into account computational errors. The author illustrates that algorithms generate a good approximate solution if computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative. This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods and Newton's meth...
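
The book's central observation — that an algorithm still reaches a good approximate solution when each step's computational error is bounded by a small constant — can be illustrated with a toy gradient method. The objective and step size here are assumptions made for demonstration:

```python
import numpy as np

# Toy objective f(x) = ||x||^2, minimized by gradient descent where every
# gradient evaluation carries a computational error of norm exactly eps.
rng = np.random.default_rng(1)

def noisy_gradient_descent(x0, eps, steps=500, lr=0.1):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        grad = 2 * x  # exact gradient of ||x||^2
        if eps > 0:
            err = rng.normal(size=x.shape)
            x = x - lr * (grad + eps * err / np.linalg.norm(err))
        else:
            x = x - lr * grad
    return x

x_exact = noisy_gradient_descent([5.0, -3.0], eps=0.0)   # converges to ~0
x_noisy = noisy_gradient_descent([5.0, -3.0], eps=1e-2)  # stalls near 0
```

With exact gradients the iterates reach the minimizer; with errors of norm `eps` they settle in a small neighborhood of it whose radius scales with `eps`, matching the qualitative statement above.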

  11. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1982-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEP) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes collected, using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determines HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date

  12. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1981-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEP) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes collected, using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determined HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date
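
The basic HEP computation implied by these records reduces to errors observed over opportunities, typically reported with an uncertainty interval. A hedged sketch — the counts and the normal-approximation interval below are illustrative assumptions, not the EPRI/ORNL data:

```python
import math

# Illustrative counts: a task element attempted 120 times in simulated
# casualties, with 3 operator errors observed.
def hep_estimate(errors, opportunities, z=1.96):
    """Point estimate and normal-approximation 95% interval for an HEP."""
    p = errors / opportunities
    se = math.sqrt(p * (1 - p) / opportunities)
    return p, (max(0.0, p - z * se), min(1.0, p + z * se))

p, (lo, hi) = hep_estimate(errors=3, opportunities=120)
print(f"HEP = {p:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")  # HEP = 0.025, 95% CI (0.000, 0.053)
```

For the rare-event counts typical of HEP work, an exact binomial or Bayesian interval would be preferable in practice; the normal approximation is used here only to keep the sketch short.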

  13. Medication Review and Transitions of Care: A Case Report of a Decade-Old Medication Error.

    Science.gov (United States)

    Comer, Rachel; Lizer, Mitsi

    2017-10-01

    A 69-year-old Caucasian male with a 25-year history of paranoid schizophrenia was brought to the emergency department because of violence toward the staff in his nursing facility. He was diagnosed with a urinary tract infection and was admitted to the behavioral health unit for medication stabilization. History included a five-year state psychiatric hospital admission and nursing facility placement. Because of poor cognitive function, the patient was unable to corroborate medication history, so the pharmacy student on rotation performed an in-depth chart review. The review revealed a transcription error in 2003 deleting amantadine 100 mg twice daily and adding amiodarone 100 mg twice daily. Subsequent hospitalization resulted in another transcription error increasing the amiodarone to 200 mg twice daily. All electrocardiograms conducted were negative for atrial fibrillation. Once detected, the consulted cardiologist discontinued the amiodarone, and the primary care provider was notified via letter and discharge papers. An admission four months later revealed that the nursing facility restarted the amiodarone. Amiodarone was discontinued and the facility was again notified. This case reviews how a 10-year-old medication error went undetected in the electronic medical records through numerous medication reconciliations, but was uncovered when a single comprehensive medication review was conducted.

  14. An Investigation of effective factors on nurses' speech errors

    Directory of Open Access Journals (Sweden)

    Maryam Tafaroji yeganeh

    2017-03-01

    Full Text Available Background: Speech errors are a branch of psycholinguistic science. A speech error, or slip of the tongue, is a natural process that happens to everyone. The importance of this research lies in the sensitivity and importance of nursing, where speech errors may interfere with the treatment of patients; unfortunately, no research has yet been done in this field. This research was conducted to study the factors (personality, stress, fatigue and insomnia) which cause speech errors among nurses of Ilam province. Materials and Methods: The sample of this descriptive-correlational research consists of 50 nurses working in Mustafa Khomeini Hospital of Ilam province, who were selected randomly. Our data were collected using the Minnesota Multiphasic Personality Inventory, the NEO Five-Factor Inventory and the Expanded Nursing Stress Scale, and were analyzed using SPSS version 20 with descriptive, inferential and multivariate linear regression or two-variable statistical methods (significance level: p ≤ 0.05). Results: 30 (60%) of the nurses participating in the study were female and 19 (38%) were male. All three factors (personality type, stress and fatigue) had significant effects on nurses' speech errors. Conclusion: Personality type, stress and fatigue all have significant effects on nurses' speech errors.

  15. The Nature of Error in Adolescent Student Writing

    Science.gov (United States)

    Wilcox, Kristen Campbell; Yagelski, Robert; Yu, Fang

    2014-01-01

    This study examined the nature and frequency of error in high school native English speaker (L1) and English learner (L2) writing. Four main research questions were addressed: Are there significant differences in students' error rates in English language arts (ELA) and social studies? Do the most common errors made by students differ in ELA…

  16. Random measurement error: Why worry? An example of cardiovascular risk factors.

    Directory of Open Access Journals (Sweden)

    Timo B Brakenhoff

    Full Text Available With the increased use of data not originally recorded for research, such as routine care data (or 'big data'), measurement error is bound to become an increasingly relevant problem in medical research. A common view among medical researchers on the influence of random measurement error (i.e. classical measurement error) is that its presence leads to some degree of systematic underestimation of studied exposure-outcome relations (i.e. attenuation of the effect estimate). For the common situation where the analysis involves at least one exposure and one confounder, we demonstrate that the direction of effect of random measurement error on the estimated exposure-outcome relations can be difficult to anticipate. Using three example studies on cardiovascular risk factors, we illustrate that random measurement error in the exposure and/or confounder can lead to underestimation as well as overestimation of exposure-outcome relations. We therefore advise medical researchers to refrain from making claims about the direction of effect of measurement error in their manuscripts, unless the appropriate inferential tools are used to study or alleviate the impact of measurement error from the analysis.

  17. The Uncovered Interest Parity in the Foreign Exchange (FX Markets

    Directory of Open Access Journals (Sweden)

    Silvio Ricardo Micheloto

    2004-12-01

    Full Text Available This work verifies the uncovered interest rate parity (UIP) in the foreign exchange (FX) emerging markets by using the panel cointegration technique. The data involve several developing countries that compose the EMBI+ Global Index. We compare the results of several panel estimators: OLS (ordinary least squares), DOLS (dynamic OLS) and FMOLS (fully modified OLS). This new panel technique can handle problems of either non-stationary series (spurious regression) or the small-sample problem. The latter problem has been considered one of the main causes of distortion in the UIP empirical results. Using this approach, we check the UIP in the FX emerging markets. These markets are more critical because they have been subjected to changing FX regimes and speculative attacks. Our results do not corroborate the uncovered interest parity for the developing countries in recent years. Thus, the forward premium puzzle may hold in the FX emerging markets.

  18. Comparison of covered and uncovered self-expandable stents in the treatment of malignant biliary obstruction.

    Science.gov (United States)

    Flores Carmona, Diana Yamel; Alonso Lárraga, Juan Octavio; Hernández Guerrero, Angélica; Ramírez Solís, Mauro Eduardo

    2016-05-01

    Drainage with metallic stents is the treatment of choice in malignant obstructive jaundice. Technical and clinical success with metallic stents is obtained in over 90% and 80% of cases, respectively. There are self-expandable metallic stents designed to increase permeability. The aim of this study was to describe the results obtained with totally covered self-expandable and uncovered self-expandable metallic stents in the palliative treatment of malignant biliary obstruction. Sixty-eight patients with malignant obstructive jaundice secondary to pancreatobiliary or metastatic disease not amenable to surgery were retrospectively included. Two groups were created: group A (covered self-expandable metallic stents) (n = 22) and group B (uncovered self-expandable metallic stents) (n = 46). Serum total bilirubin, direct bilirubin, alkaline phosphatase and gamma glutamyl transferase levels decreased in both groups and no statistically significant difference was detected (p = 0.800, p = 0.190, p = 0.743, p = 0.521). Migration was greater with covered stents but it was not statistically significant either (p = 0.101). Obstruction was greater in the group with uncovered stents but it was not statistically significant either (p = 0.476). There are no differences when using covered self-expandable stents or uncovered self-expandable stents in terms of technical and clinical success or complications in the palliative treatment of malignant obstructive jaundice.

  19. Understanding and Confronting Our Mistakes: The Epidemiology of Error in Radiology and Strategies for Error Reduction.

    Science.gov (United States)

    Bruno, Michael A; Walker, Eric A; Abujudeh, Hani H

    2015-10-01

    Arriving at a medical diagnosis is a highly complex process that is extremely error prone. Missed or delayed diagnoses often lead to patient harm and missed opportunities for treatment. Since medical imaging is a major contributor to the overall diagnostic process, it is also a major potential source of diagnostic error. Although some diagnoses may be missed because of the technical or physical limitations of the imaging modality, including image resolution, intrinsic or extrinsic contrast, and signal-to-noise ratio, most missed radiologic diagnoses are attributable to image interpretation errors by radiologists. Radiologic interpretation cannot be mechanized or automated; it is a human enterprise based on complex psychophysiologic and cognitive processes and is itself subject to a wide variety of error types, including perceptual errors (those in which an important abnormality is simply not seen on the images) and cognitive errors (those in which the abnormality is visually detected but the meaning or importance of the finding is not correctly understood or appreciated). The overall prevalence of radiologists' errors in practice does not appear to have changed since it was first estimated in the 1960s. The authors review the epidemiology of errors in diagnostic radiology, including a recently proposed taxonomy of radiologists' errors, as well as research findings, in an attempt to elucidate possible underlying causes of these errors. The authors also propose strategies for error reduction in radiology. On the basis of current understanding, specific suggestions are offered as to how radiologists can improve their performance in practice. © RSNA, 2015.

  20. Medication errors: an overview for clinicians.

    Science.gov (United States)

    Wittich, Christopher M; Burkle, Christopher M; Lanier, William L

    2014-08-01

    Medication error is an important cause of patient morbidity and mortality, yet it can be a confusing and underappreciated concept. This article provides a review for practicing physicians that focuses on medication error (1) terminology and definitions, (2) incidence, (3) risk factors, (4) avoidance strategies, and (5) disclosure and legal consequences. A medication error is any error that occurs at any point in the medication use process. It has been estimated by the Institute of Medicine that medication errors cause 1 of 131 outpatient and 1 of 854 inpatient deaths. Medication factors (eg, similar sounding names, low therapeutic index), patient factors (eg, poor renal or hepatic function, impaired cognition, polypharmacy), and health care professional factors (eg, use of abbreviations in prescriptions and other communications, cognitive biases) can precipitate medication errors. Consequences faced by physicians after medication errors can include loss of patient trust, civil actions, criminal charges, and medical board discipline. Methods to prevent medication errors from occurring (eg, use of information technology, better drug labeling, and medication reconciliation) have been used with varying success. When an error is discovered, patients expect disclosure that is timely, given in person, and accompanied with an apology and communication of efforts to prevent future errors. Learning more about medication errors may enhance health care professionals' ability to provide safe care to their patients. Copyright © 2014 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  1. Errors and conflict at the task level and the response level.

    Science.gov (United States)

    Desmet, Charlotte; Fias, Wim; Hartstra, Egbert; Brass, Marcel

    2011-01-26

    In the last decade, research on error and conflict processing has become one of the most influential research areas in the domain of cognitive control. There is now converging evidence that a specific part of the posterior frontomedian cortex (pFMC), the rostral cingulate zone (RCZ), is crucially involved in the processing of errors and conflict. However, error-related research has focused primarily on a specific error type, namely, response errors. The aim of the present study was to investigate whether errors on the task level rely on the same neural and functional mechanisms. Here we report a dissociation of both error types in the pFMC: whereas response errors activate the RCZ, task errors activate the dorsal frontomedian cortex. Although this last region shows an overlap in activation for task and response errors on the group level, a closer inspection of the single-subject data is more in accordance with a functional anatomical dissociation. When investigating brain areas related to conflict on the task and response levels, a clear dissociation was perceived between areas associated with response conflict and with task conflict. Overall, our data support a dissociation between response and task levels of processing in the pFMC. In addition, we provide additional evidence for a dissociation between conflict and errors both at the response level and at the task level.

  2. Errors, error detection, error correction and hippocampal-region damage: data and theories.

    Science.gov (United States)

    MacKay, Donald G; Johnson, Laura W

    2013-11-01

    This review and perspective article outlines 15 observational constraints on theories of errors, error detection, and error correction, and their relation to hippocampal-region (HR) damage. The core observations come from 10 studies with H.M., an amnesic with cerebellar and HR damage but virtually no neocortical damage. Three studies examined the detection of errors planted in visual scenes (e.g., a bird flying in a fish bowl in a school classroom) and sentences (e.g., I helped themselves to the birthday cake). In all three experiments, H.M. detected reliably fewer errors than carefully matched memory-normal controls. Other studies examined the detection and correction of self-produced errors, with controls for comprehension of the instructions, impaired visual acuity, temporal factors, motoric slowing, forgetting, excessive memory load, lack of motivation, and deficits in visual scanning or attention. In these studies, H.M. corrected reliably fewer errors than memory-normal and cerebellar controls, and his uncorrected errors in speech, object naming, and reading aloud exhibited two consistent features: omission and anomaly. For example, in sentence production tasks, H.M. omitted one or more words in uncorrected encoding errors that rendered his sentences anomalous (incoherent, incomplete, or ungrammatical) reliably more often than controls. Besides explaining these core findings, the theoretical principles discussed here explain H.M.'s retrograde amnesia for once familiar episodic and semantic information; his anterograde amnesia for novel information; his deficits in visual cognition, sentence comprehension, sentence production, sentence reading, and object naming; and effects of aging on his ability to read isolated low frequency words aloud. These theoretical principles also explain a wide range of other data on error detection and correction and generate new predictions for future test. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Organization of physical interactomes as uncovered by network schemas.

    Science.gov (United States)

    Banks, Eric; Nabieva, Elena; Chazelle, Bernard; Singh, Mona

    2008-10-01

    Large-scale protein-protein interaction networks provide new opportunities for understanding cellular organization and functioning. We introduce network schemas to elucidate shared mechanisms within interactomes. Network schemas specify descriptions of proteins and the topology of interactions among them. We develop algorithms for systematically uncovering recurring, over-represented schemas in physical interaction networks. We apply our methods to the S. cerevisiae interactome, focusing on schemas consisting of proteins described via sequence motifs and molecular function annotations and interacting with one another in one of four basic network topologies. We identify hundreds of recurring and over-represented network schemas of various complexity, and demonstrate via graph-theoretic representations how more complex schemas are organized in terms of their lower-order constituents. The uncovered schemas span a wide range of cellular activities, with many signaling and transport related higher-order schemas. We establish the functional importance of the schemas by showing that they correspond to functionally cohesive sets of proteins, are enriched in the frequency with which they have instances in the H. sapiens interactome, and are useful for predicting protein function. Our findings suggest that network schemas are a powerful paradigm for organizing, interrogating, and annotating cellular networks.
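
The schema idea can be illustrated on a toy labeled network: a schema fixes node descriptions plus a topology, and finding its instances is a labeled subgraph count. In this sketch the labels, edges, and the leaf-hub-leaf topology are invented for illustration and are not the authors' algorithm:

```python
from collections import defaultdict
from itertools import combinations

# Invented toy interactome: each protein carries one functional label and
# edges are physical interactions. The schema counted below is the labeled
# topology kinase -- phosphatase -- kinase (two leaves sharing a hub).
labels = {"p1": "kinase", "p2": "kinase", "p3": "phosphatase",
          "p4": "kinase", "p5": "transporter"}
edges = [("p1", "p3"), ("p2", "p3"), ("p4", "p3"), ("p1", "p5")]

neighbors = defaultdict(set)
for a, b in edges:
    neighbors[a].add(b)
    neighbors[b].add(a)

def count_schema(hub_label, leaf_label):
    """Count instances of the leaf--hub--leaf schema in the network."""
    total = 0
    for hub, nbrs in neighbors.items():
        if labels[hub] == hub_label:
            leaves = [n for n in nbrs if labels[n] == leaf_label]
            total += sum(1 for _ in combinations(leaves, 2))
    return total

print(count_schema("phosphatase", "kinase"))  # 3 instances: {p1,p2}, {p1,p4}, {p2,p4}
```

Over-representation, as in the paper, would then be judged by comparing such counts against suitably randomized networks rather than by the raw count alone.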

  4. Error Bounds: Necessary and Sufficient Conditions

    Czech Academy of Sciences Publication Activity Database

    Outrata, Jiří; Kruger, A.Y.; Fabian, Marián; Henrion, R.

    2010-01-01

    Roč. 18, č. 2 (2010), s. 121-149 ISSN 1877-0533 R&D Projects: GA AV ČR IAA100750802 Institutional research plan: CEZ:AV0Z10750506; CEZ:AV0Z10190503 Keywords : Error bounds * Calmness * Subdifferential * Slope Subject RIV: BA - General Mathematics Impact factor: 0.333, year: 2010 http://library.utia.cas.cz/separaty/2010/MTR/outrata-error bounds necessary and sufficient conditions.pdf

  5. Error Estimation in Preconditioned Conjugate Gradients

    Czech Academy of Sciences Publication Activity Database

    Strakoš, Zdeněk; Tichý, Petr

    2005-01-01

    Roč. 45, - (2005), s. 789-817 ISSN 0006-3835 R&D Projects: GA AV ČR 1ET400300415; GA AV ČR KJB1030306 Institutional research plan: CEZ:AV0Z10300504 Keywords : preconditioned conjugate gradient method * error bounds * stopping criteria * evaluation of convergence * numerical stability * finite precision arithmetic * rounding errors Subject RIV: BA - General Mathematics Impact factor: 0.509, year: 2005
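The classical identity behind such estimates (in exact arithmetic) is ||x − x_k||²_A = Σ_{i≥k} γ_i ||r_i||², where γ_i is the CG step length and r_i the residual; summing a short window of d terms therefore gives a lower bound on the squared A-norm error. Below is a minimal unpreconditioned sketch on a toy SPD system — an assumption-laden simplification, since the paper treats the preconditioned method and finite-precision effects.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, v):
    return [dot(row, v) for row in A]

def cg_with_error_estimate(A, b, x_true, iters):
    """Plain CG; returns the terms gamma_k*||r_k||^2 and the true squared
    A-norm errors ||x - x_{k+1}||_A^2 after each step."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # r_0 = b - A*0
    p = r[:]
    rr = dot(r, r)
    terms, errs = [], []
    for _ in range(iters):
        Ap = matvec(A, p)
        alpha = rr / dot(p, Ap)
        terms.append(alpha * rr)  # gamma_k * ||r_k||^2
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        e = [xt - xi for xt, xi in zip(x_true, x)]
        errs.append(dot(e, matvec(A, e)))
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rr_new = dot(r, r)
        if rr_new < 1e-30:        # converged; stop before dividing by ~0
            break
        p = [ri + (rr_new / rr) * pi for ri, pi in zip(r, p)]
        rr = rr_new
    return terms, errs

A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]  # toy SPD matrix
x_true = [1.0, 2.0, 3.0]
b = matvec(A, x_true)
terms, errs = cg_with_error_estimate(A, b, x_true, iters=3)
```

For a 3×3 system CG converges in at most three steps, so the window of terms starting at step 1 recovers the squared A-norm error after one step.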

  6. Nursing Errors in Intensive Care Unit by Human Error Identification in Systems Tool: A Case Study

    Directory of Open Access Journals (Sweden)

    Nezamodini

    2016-03-01

    Full Text Available Background Although health services are designed and implemented to improve human health, errors in health services are a very common and sometimes fatal phenomenon in this field. Medical errors and their costs are global issues with serious consequences for the patient community; they are preventable and require serious attention. Objectives The current study aimed to identify possible nursing errors by applying the human error identification in systems tool (HEIST) in the intensive care units (ICUs) of hospitals. Patients and Methods This descriptive research was conducted in the intensive care unit of a hospital in Khuzestan province in 2013. Data were collected through observation of and interviews with nine nurses in this unit over a period of four months. Human error classification was based on the Rouse and Rouse and Swain and Guttmann models. Following the HEIST worksheets, the guide questions were answered and error causes were identified after the type of each error was determined. Results In total, 527 errors were detected. Performing an operation on the wrong path was the most frequent error (150), followed by completing tasks later than the deadline (136). Management causes ranked first among the identified causes (451). Errors mostly occurred in the system observation stage, and among the performance shaping factors (PSFs) time was the most influential in the occurrence of human errors. Conclusions Finally, in order to prevent the occurrence and reduce the consequences of the identified errors, the following suggestions were proposed: appropriate training courses, applying work guidelines and monitoring their implementation, increasing the number of work shifts, hiring a professional workforce, and equipping the workspace with appropriate facilities and equipment.

  7. An adaptive orienting theory of error processing.

    Science.gov (United States)

    Wessel, Jan R

    2018-03-01

    The ability to detect and correct action errors is paramount to safe and efficient goal-directed behaviors. Existing work on the neural underpinnings of error processing and post-error behavioral adaptations has led to the development of several mechanistic theories of error processing. These theories can be roughly grouped into adaptive and maladaptive theories. While adaptive theories propose that errors trigger a cascade of processes that will result in improved behavior after error commission, maladaptive theories hold that error commission momentarily impairs behavior. Neither group of theories can account for all available data, as different empirical studies find both impaired and improved post-error behavior. This article attempts a synthesis between the predictions made by prominent adaptive and maladaptive theories. Specifically, it is proposed that errors invoke a nonspecific cascade of processing that will rapidly interrupt and inhibit ongoing behavior and cognition, as well as orient attention toward the source of the error. It is proposed that this cascade follows all unexpected action outcomes, not just errors. In the case of errors, this cascade is followed by error-specific, controlled processing, which is specifically aimed at (re)tuning the existing task set. This theory combines existing predictions from maladaptive orienting and bottleneck theories with specific neural mechanisms from the wider field of cognitive control, including from error-specific theories of adaptive post-error processing. The article aims to describe the proposed framework and its implications for post-error slowing and post-error accuracy, propose mechanistic neural circuitry for post-error processing, and derive specific hypotheses for future empirical investigations. © 2017 Society for Psychophysiological Research.

  8. Joint Schemes for Physical Layer Security and Error Correction

    Science.gov (United States)

    Adamo, Oluwayomi

    2011-01-01

    The major challenges facing resource constraint wireless devices are error resilience, security and speed. Three joint schemes are presented in this research which could be broadly divided into error correction based and cipher based. The error correction based ciphers take advantage of the properties of LDPC codes and Nordstrom Robinson code. A…

  9. Collection of offshore human error probability data

    International Nuclear Information System (INIS)

    Basra, Gurpreet; Kirwan, Barry

    1998-01-01

    Accidents such as Piper Alpha have increased concern about the effects of human errors in complex systems. Such accidents can in theory be predicted and prevented by risk assessment, and in particular human reliability assessment (HRA), but HRA ideally requires qualitative and quantitative human error data. A research initiative at the University of Birmingham led to the development of CORE-DATA, a Computerised Human Error Data Base. This system currently contains a reasonably large number of human error data points, collected from a variety of mainly nuclear-power related sources. This article outlines a recent offshore data collection study, concerned with collecting lifeboat evacuation data. Data collection methods are outlined and a selection of human error probabilities generated as a result of the study are provided. These data give insights into the type of errors and human failure rates that could be utilised to support offshore risk analyses

  10. Learning a locomotor task: with or without errors?

    Science.gov (United States)

    Marchal-Crespo, Laura; Schneider, Jasmin; Jaeger, Lukas; Riener, Robert

    2014-03-04

    Robotic haptic guidance is the most commonly used robotic training strategy to reduce performance errors during training. However, research on motor learning has emphasized that errors are a fundamental neural signal that drives motor adaptation. Thus, researchers have proposed robotic therapy algorithms that amplify movement errors rather than decrease them. To date, however, no study has analyzed precisely which training strategy is most appropriate for learning an especially simple task. In this study, the impact of robotic training strategies that amplify or reduce errors on muscle activation and motor learning of a simple locomotor task was investigated in twenty-two healthy subjects. The experiment was conducted with the MAgnetic Resonance COmpatible Stepper (MARCOS), a special robotic device developed for investigations in the MR scanner. The robot moved the dominant leg passively, and the subject was requested to actively synchronize the non-dominant leg to achieve an alternating stepping-like movement. Learning with four different training strategies that reduce or amplify errors was evaluated: (i) Haptic guidance: errors were eliminated by passively moving the limbs, (ii) No guidance: no robot disturbances were presented, (iii) Error amplification: existing errors were amplified with repulsive forces, (iv) Noise disturbance: errors were evoked intentionally with a randomly-varying force disturbance on top of the no-guidance strategy. Additionally, the activation of four lower limb muscles was measured by means of surface electromyography (EMG). Strategies that reduce or do not amplify errors limit muscle activation during training and result in poor learning gains. Adding random disturbing forces during training seems to increase attention, and therefore improve motor learning. Error amplification seems to be the most suitable strategy for initially less skilled subjects, perhaps because subjects could better detect their errors and correct them.

  11. Functional requirements for the man-vehicle systems research facility. [identifying and correcting human errors during flight simulation

    Science.gov (United States)

    Clement, W. F.; Allen, R. W.; Heffley, R. K.; Jewell, W. F.; Jex, H. R.; Mcruer, D. T.; Schulman, T. M.; Stapleford, R. L.

    1980-01-01

    The NASA Ames Research Center proposed a man-vehicle systems research facility to support flight simulation studies which are needed for identifying and correcting the sources of human error associated with current and future air carrier operations. The organization of the research facility is reviewed, and functional requirements and related priorities for the facility are recommended based on a review of potentially critical operational scenarios. Requirements are included for the experimenter's simulation control and data acquisition functions, as well as for the visual field, motion, sound, computation, crew station, and intercommunications subsystems. The related issues of functional fidelity and level of simulation are addressed, and specific criteria for quantitative assessment of various aspects of fidelity are offered. Recommendations for facility integration, checkout, and staffing are included.

  12. An Investigation into Soft Error Detection Efficiency at Operating System Level

    OpenAIRE

    Asghari, Seyyed Amir; Kaynak, Okyay; Taheri, Hassan

    2014-01-01

    Electronic equipment operating in harsh environments such as space is subjected to a range of threats. The most important of these is radiation that gives rise to permanent and transient errors on microelectronic components. The occurrence rate of transient errors is significantly more than permanent errors. The transient errors, or soft errors, emerge in two formats: control flow errors (CFEs) and data errors. Valuable research results have already appeared in literature at hardware and soft...

  13. Putting a face on medical errors: a patient perspective.

    Science.gov (United States)

    Kooienga, Sarah; Stewart, Valerie T

    2011-01-01

    Knowledge of the patient's perspective on medical error is limited. Research efforts have centered on how best to disclose error and how patients desire to have medical error disclosed. On the basis of a qualitative descriptive component of a mixed method study, a purposive sample of 30 community members told their stories of medical error. Their experiences focused on lack of communication, missed communication, or provider's poor interpersonal style of communication, greatly contrasting with the formal definition of error as failure to follow a set standard of care. For these participants, being a patient was more important than error or how an error is disclosed. The patient's understanding of error must be a key aspect of any quality improvement strategy. © 2010 National Association for Healthcare Quality.

  14. Learning time-dependent noise to reduce logical errors: real time error rate estimation in quantum error correction

    Science.gov (United States)

    Huo, Ming-Xia; Li, Ying

    2017-12-01

    Quantum error correction is important for quantum information processing, allowing us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. No adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. the surface code. A Gaussian process algorithm is used to estimate and predict error rates based on error correction data from the past. We find that, using these estimated error rates, the probability of error correction failures can be significantly reduced, by a factor that increases with the code distance.
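The protocol in the paper fits a Gaussian process to past error correction data; as a much simpler stand-in for the real-time tracking idea, the sketch below follows a drifting error rate with a bias-corrected exponentially weighted average over simulated error-detection flags. The decay constant and the drift profile are illustrative assumptions, not part of the paper.

```python
import random

def track_error_rate(outcomes, decay=0.99):
    """Bias-corrected EWMA estimate of P(error) from a 0/1 detection stream."""
    est, weight, history = 0.0, 0.0, []
    for o in outcomes:
        est = decay * est + (1 - decay) * o
        weight = decay * weight + (1 - decay)   # 1 - decay**k normalizer
        history.append(est / weight)
    return history

rng = random.Random(1)
# simulated syndrome stream: the underlying error rate drifts from 1% to 5%
true_rate = [0.01 + 0.04 * t / 4000 for t in range(4000)]
flags = [1 if rng.random() < p else 0 for p in true_rate]
est = track_error_rate(flags)
```

The estimate lags the true rate by roughly the effective window (~1/(1−decay) rounds), which is the basic trade-off any real-time tracker, Gaussian process included, must manage.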

  15. Optimizing learning of a locomotor task: amplifying errors as needed.

    Science.gov (United States)

    Marchal-Crespo, Laura; López-Olóriz, Jorge; Jaeger, Lukas; Riener, Robert

    2014-01-01

    Research on motor learning has emphasized that errors drive motor adaptation. Accordingly, several researchers have proposed robotic training strategies that amplify movement errors rather than decrease them. In this study, the effect of different robotic training strategies that amplify errors on learning a complex locomotor task was investigated. The experiment was conducted with a one-degree-of-freedom robotic stepper (MARCOS). Subjects were requested to actively coordinate their legs in a desired gait-like pattern in order to track a Lissajous figure presented on a visual display. Learning with three different training strategies was evaluated: (i) No perturbation: the robot follows the subjects' movement without applying any perturbation, (ii) Error amplification: existing errors were amplified with repulsive forces proportional to the errors, (iii) Noise disturbance: errors were evoked with a randomly-varying force disturbance. Results showed that training without perturbations was especially suitable for a subset of initially less-skilled subjects, while error amplification seemed to benefit more-skilled subjects. Training with error amplification, however, limited transfer of learning. Random disturbing forces benefited learning and promoted transfer in all subjects, probably because they increased attention. These results suggest that learning a locomotor task can be optimized when errors are randomly evoked or amplified according to subjects' initial skill level.
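The three strategies can be caricatured as per-step force laws acting on the instantaneous tracking error. This is a hedged simplification: the gain and noise amplitude below are invented for illustration, not MARCOS controller parameters.

```python
import random

def robot_force(strategy, error, gain=10.0, noise_amp=5.0, rng=random):
    """Force command for one control step, given the tracking error."""
    if strategy == "no_perturbation":
        return 0.0                                     # robot only follows
    if strategy == "error_amplification":
        return gain * error                            # repulsive, grows with error
    if strategy == "noise_disturbance":
        return noise_amp * (2.0 * rng.random() - 1.0)  # random, error-independent
    raise ValueError("unknown strategy: " + strategy)
```

The contrast the study draws is visible in the signatures: the amplification force scales with the subject's own error, while the noise disturbance is bounded and independent of it.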

  16. A Corpus-based Study of EFL Learners’ Errors in IELTS Essay Writing

    Directory of Open Access Journals (Sweden)

    Hoda Divsar

    2017-03-01

    Full Text Available The present study analyzed different types of errors in EFL learners’ IELTS essays. In order to determine the major types of errors, a corpus of 70 IELTS examinees’ writings was collected, and their errors were extracted and categorized qualitatively. Errors were categorized based on a researcher-developed error-coding scheme into 13 aspects. Based on the descriptive statistical analyses, the frequency of each error type was calculated and the commonest errors committed by the EFL learners in IELTS essays were identified. The results indicated that the two most frequent errors committed by IELTS candidates were related to word choice and verb forms. Based on these results, the pedagogical implications highlight the analysis of EFL learners’ writing errors as a useful basis for instruction, including the creation of teaching materials aligned with learners’ linguistic strengths and weaknesses.
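The tallying step behind such frequency results can be sketched with a counter over coded annotations. The category names and essay IDs below are illustrative, not the study's 13-aspect scheme.

```python
from collections import Counter

# hypothetical (essay, error-code) annotations produced by the coding scheme
annotations = [
    ("essay01", "word_choice"), ("essay01", "verb_form"),
    ("essay02", "word_choice"), ("essay02", "article"),
    ("essay03", "verb_form"), ("essay03", "word_choice"),
]
freq = Counter(code for _, code in annotations)
ranked = freq.most_common()   # error types ordered by descending frequency
```

`most_common()` yields the ranking that such a study reports, with the top entries identifying the dominant error types.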

  17. Bootstrap-Based Improvements for Inference with Clustered Errors

    OpenAIRE

    Doug Miller; A. Colin Cameron; Jonah B. Gelbach

    2006-01-01

    Microeconometrics researchers have increasingly realized the essential need to account for any within-group dependence in estimating standard errors of regression parameter estimates. The typical preferred solution is to calculate cluster-robust or sandwich standard errors that permit quite general heteroskedasticity and within-cluster error correlation, but presume that the number of clusters is large. In applications with few (5-30) clusters, standard asymptotic tests can over-reject consid...
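The pairs cluster bootstrap the authors build on resamples whole clusters with replacement, so within-cluster dependence survives in each resample. A minimal sketch on synthetic clustered data follows; cluster count, sizes, and variances are illustrative assumptions.

```python
import random

def cluster_bootstrap_se(clusters, stat, reps=2000, seed=0):
    """Bootstrap SE of stat(all observations), resampling whole clusters."""
    rng = random.Random(seed)
    stats = []
    for _ in range(reps):
        sample = []
        for _ in range(len(clusters)):
            sample.extend(rng.choice(clusters))   # draw a cluster, keep it intact
        stats.append(stat(sample))
    mean = sum(stats) / reps
    return (sum((s - mean) ** 2 for s in stats) / (reps - 1)) ** 0.5

# toy data: 8 clusters whose members share a cluster-level shock
data_rng = random.Random(42)
clusters = []
for _ in range(8):
    shock = data_rng.gauss(0.0, 1.0)              # induces within-cluster correlation
    clusters.append([shock + data_rng.gauss(0.0, 0.5) for _ in range(10)])

se = cluster_bootstrap_se(clusters, lambda xs: sum(xs) / len(xs))
```

With only 8 clusters this point estimate of the standard error is itself noisy, which is exactly the few-clusters regime where the paper's refinements (such as the wild cluster bootstrap) matter.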

  18. Research on the Method of Noise Error Estimation of Atomic Clocks

    Science.gov (United States)

    Song, H. J.; Dong, S. W.; Li, W.; Zhang, J. H.; Jing, Y. J.

    2017-05-01

    Simulation methods for the different noise types of atomic clocks are given. The flicker frequency noise of an atomic clock is studied using Markov process theory. A method for estimating the maximum interval error of white frequency noise is studied using Wiener process theory. Based on the operation of 9 cesium atomic clocks in the time and frequency reference laboratory of NTSC (National Time Service Center), the noise coefficients of the power-law spectrum model are estimated, and simulations are carried out according to the noise models. Finally, maximum interval error estimates of the white frequency noise generated by the 9 cesium atomic clocks are obtained.
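For the white-frequency-noise case, the accumulated time error is a Wiener process, so a maximum interval error can be read directly off a simulated random walk. The sketch below uses illustrative step counts and noise levels, not the paper's cesium-clock coefficients.

```python
import random

def simulate_time_error(n_steps, sigma, seed=0):
    """Integrate white frequency noise into a time-error random walk."""
    rng = random.Random(seed)
    x, walk = 0.0, [0.0]
    for _ in range(n_steps):
        x += rng.gauss(0.0, sigma)
        walk.append(x)
    return walk

def max_interval_error(walk, window):
    """Largest peak-to-peak time-error excursion within any window."""
    mie = 0.0
    for i in range(len(walk) - window):
        seg = walk[i:i + window + 1]
        mie = max(mie, max(seg) - min(seg))
    return mie

walk = simulate_time_error(2000, sigma=1e-9)   # nanosecond-scale steps (assumed)
mie = max_interval_error(walk, window=100)
```

Repeating this over many simulated runs gives the distribution from which a maximum-interval-error estimate of the kind reported in the paper can be drawn.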

  19. KMRR thermal power measurement error estimation

    International Nuclear Information System (INIS)

    Rhee, B.W.; Sim, B.S.; Lim, I.C.; Oh, S.K.

    1990-01-01

    The thermal power measurement error of the Korea Multi-purpose Research Reactor has been estimated by a statistical Monte Carlo method and compared with results obtained by other methods, including deterministic and statistical approaches. The results show that the specified thermal power measurement error of 5% cannot be achieved if commercial RTDs are used to measure the coolant temperatures of the secondary cooling system, and that the error can be reduced below the requirement if the commercial RTDs are replaced by precision RTDs. The possible range of thermal power control operation has been identified as 100% down to 20% of full power.
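The statistical approach can be sketched as a Monte Carlo propagation of RTD temperature errors into thermal power Q = ṁ·cp·ΔT. All flow, temperature, and sensor-error figures below are illustrative assumptions, not KMRR design values.

```python
import random

def power_error_pct(sigma_T, trials=20000, seed=0):
    """95th-percentile relative error (%) of Q = m*cp*dT under RTD noise."""
    rng = random.Random(seed)
    m_dot, cp = 300.0, 4.18          # kg/s, kJ/(kg K) -- illustrative
    T_in, T_out = 35.0, 45.0         # deg C, assumed true values
    q_true = m_dot * cp * (T_out - T_in)
    errs = []
    for _ in range(trials):
        dT = (T_out + rng.gauss(0, sigma_T)) - (T_in + rng.gauss(0, sigma_T))
        errs.append(abs(m_dot * cp * dT - q_true) / q_true)
    errs.sort()
    return 100.0 * errs[int(0.95 * trials)]

coarse = power_error_pct(sigma_T=0.3)    # ~0.3 K sensor error (assumed "commercial")
precise = power_error_pct(sigma_T=0.05)  # ~0.05 K sensor error (assumed "precision")
```

With these invented numbers the coarse-sensor 95th-percentile error exceeds 5% while the precision-sensor error stays well below it, mirroring the abstract's conclusion about replacing the RTDs.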

  20. Medication errors: prescribing faults and prescription errors.

    Science.gov (United States)

    Velo, Giampaolo P; Minuz, Pietro

    2009-06-01

    1. Medication errors are common in general practice and in hospitals. Both errors in the act of writing (prescription errors) and prescribing faults due to erroneous medical decisions can result in harm to patients. 2. Any step in the prescribing process can generate errors. Slips, lapses, or mistakes are sources of errors, as in unintended omissions in the transcription of drugs. Faults in dose selection, omitted transcription, and poor handwriting are common. 3. Inadequate knowledge or competence and incomplete information about clinical characteristics and previous treatment of individual patients can result in prescribing faults, including the use of potentially inappropriate medications. 4. An unsafe working environment, complex or undefined procedures, and inadequate communication among health-care personnel, particularly between doctors and nurses, have been identified as important underlying factors that contribute to prescription errors and prescribing faults. 5. Active interventions aimed at reducing prescription errors and prescribing faults are strongly recommended. These should be focused on the education and training of prescribers and the use of on-line aids. The complexity of the prescribing procedure should be reduced by introducing automated systems or uniform prescribing charts, in order to avoid transcription and omission errors. Feedback control systems and immediate review of prescriptions, which can be performed with the assistance of a hospital pharmacist, are also helpful. Audits should be performed periodically.

  1. Technical errors and complications in orthopaedic trauma surgery

    NARCIS (Netherlands)

    Meeuwis, M.A.; de Jongh, M.A.C.; Roukema, J.A.; van der Heijden, F.H.W.M.; Verhofstad, M. H. J.

    2016-01-01

    Introduction Adverse events and associated morbidity and subsequent costs receive increasing attention in clinical practice and research. As opposed to complications, errors are not described or analysed in literature on fracture surgery. The aim of this study was to provide a description of errors

  2. Quantum error correction for beginners

    International Nuclear Information System (INIS)

    Devitt, Simon J; Nemoto, Kae; Munro, William J

    2013-01-01

    Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large-scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation is now a much larger field and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault-tolerance, not as a detailed guide, but rather as a basic introduction. The development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault-tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future. (review article)
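In the spirit of the review's example-driven introduction, the smallest classical analogue is the three-bit repetition code, which corrects any single bit flip by majority vote but fails once two bits flip. This is a classical simulation only; the quantum version measures stabilizers instead of reading the bits directly.

```python
import random

def encode(bit):
    """Triplicate the logical bit."""
    return [bit, bit, bit]

def flip_one_bit(codeword, rng):
    """Inject a single random bit-flip error."""
    corrupted = codeword[:]
    corrupted[rng.randrange(3)] ^= 1
    return corrupted

def decode(codeword):
    """Majority vote recovers the logical bit despite one flip."""
    return 1 if sum(codeword) >= 2 else 0

rng = random.Random(0)
ok = all(decode(flip_one_bit(encode(b), rng)) == b
         for b in (0, 1) for _ in range(100))
# two simultaneous flips exceed the code's correction capability:
fails = decode([1, 1, 0]) != 0   # encode(0) with two flips decodes wrongly
```

The same distance-3 logic, one correctable error, two uncorrectable, carries over to the quantum codes the review surveys.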

  3. Magnetic field errors tolerances of Nuclotron booster

    Science.gov (United States)

    Butenko, Andrey; Kazinova, Olha; Kostromin, Sergey; Mikhaylov, Vladimir; Tuzikov, Alexey; Khodzhibagiyan, Hamlet

    2018-04-01

    Generation of the magnetic field in the units of the booster synchrotron for the NICA project is one of the most important conditions for achieving the required parameters and high-quality accelerator operation. Research on the linear and nonlinear dynamics of the 197Au31+ ion beam in the booster has been carried out with the MAD-X program. An analytical estimation of the magnetic field error tolerances and a numerical computation of the dynamic aperture of the booster DFO magnetic lattice are presented. Closed orbit distortion due to random magnetic field errors and errors in the layout of booster units was evaluated.

  4. The Perception of Error in Production Plants of a Chemical Organisation

    Science.gov (United States)

    Seifried, Jurgen; Hopfer, Eva

    2013-01-01

    There is considerable current interest in error-friendly corporate culture, one particular research question being how and under what conditions errors are learnt from in the workplace. This paper starts from the assumption that errors are inevitable and considers key factors which affect learning from errors in high responsibility organisations,…

  5. Uncovering the Transnational Networks, Organisational Techniques and State-Corporate Ties Behind Grand Corruption: Building an Investigative Methodology

    Directory of Open Access Journals (Sweden)

    Kristian Lasslett

    2017-11-01

    Full Text Available While grand corruption is a major global governance challenge, researchers notably lack a systematic methodology for conducting qualitative research into its complex forms. To address this lacuna, the following article sets out and applies the corruption investigative framework (CIF, a methodology designed to generate a systematic, transferable approach for grand corruption research. Its utility will be demonstrated employing a case study that centres on an Australian-led megaproject being built in Papua New Guinea’s capital city, Port Moresby. Unlike conventional analyses of corruption in Papua New Guinea, which emphasise its local characteristics and patrimonial qualities, application of CIF uncovered new empirical layers that centre on transnational state-corporate power, the ambiguity of civil society, and the structural inequalities that marginalise resistance movements. The important theoretical consequences of the findings and underpinning methodology are explored.

  6. Nurses' attitude and intention of medication administration error reporting.

    Science.gov (United States)

    Hung, Chang-Chiao; Chu, Tsui-Ping; Lee, Bih-O; Hsiao, Chia-Chi

    2016-02-01

    The aims of this study were to explore the effects of nurses' attitudes and intentions regarding medication administration error reporting on actual reporting behaviours. Underreporting of medication errors is still a common occurrence. Whether attitudes and intentions towards medication administration error reporting connect to actual reporting behaviours remains unclear. This study used a cross-sectional design with self-administered questionnaires, and the theory of planned behaviour was used as the framework. A total of 596 staff nurses who worked in general wards and intensive care units in a hospital were invited to participate. The researchers used instruments measuring nurses' attitudes, nurse managers' and co-workers' attitudes, report control, and nurses' intentions to predict nurses' actual reporting behaviours. Data were collected from September-November 2013. Path analyses were used to examine the hypothesized model. Of the 596 nurses invited to participate, 548 (92%) completed and returned a valid questionnaire. The findings indicated that nurse managers' and co-workers' attitudes are predictors of nurses' attitudes towards medication administration error reporting. Nurses' attitudes also influenced their intention to report medication administration errors; however, no connection was found between intention and actual reporting behaviour. The findings reflect links among colleague perspectives, nurses' attitudes, and intention to report medication administration errors. The researchers suggest that management of medication administration errors should focus on increasing nurses' awareness and recognition of error occurrence: regardless of nurse managers' and co-workers' attitudes, nurses are likely to report medication administration errors if they detect them. © 2015 John Wiley & Sons Ltd.

  7. Slow Learner Errors Analysis in Solving Fractions Problems in Inclusive Junior High School Class

    Science.gov (United States)

    Novitasari, N.; Lukito, A.; Ekawati, R.

    2018-01-01

    A slow learner, whose IQ is between 71 and 89, will have difficulties in solving mathematics problems, which often leads to errors. The errors can be analyzed to determine where they occur and of what type they are. This qualitative descriptive research aims to describe the locations, types, and causes of slow learner errors in an inclusive junior high school class when solving fraction problems. The subject of this research is one seventh-grade slow learner student, selected through direct observation by the researcher and through discussion with the mathematics teacher and the special tutor who handles the slow learner students. Data collection methods used in this study are written tasks and semi-structured interviews. The collected data were analyzed using Newman’s Error Analysis (NEA). Results show that there are four locations of errors, namely comprehension, transformation, process skills, and encoding errors. There are four types of errors, namely concept, principle, algorithm, and counting errors. The results of this error analysis will help teachers identify the causes of the errors made by slow learners.

  8. Entropy Error Model of Planar Geometry Features in GIS

    Institute of Scientific and Technical Information of China (English)

    LI Dajun; GUAN Yunlan; GONG Jianya; DU Daosheng

    2003-01-01

    Positional error of line segments is usually described using the "g-band"; however, its band width depends on the chosen confidence level. In fact, given different confidence levels, a series of concentric bands can be obtained. To overcome the effect of the confidence level on the error indicator, we introduce union entropy theory and propose an entropy error ellipse index for a point, then extend it to line segments and polygons, establishing an entropy error band for a line segment and an entropy error donut for a polygon. The research shows that the entropy error indices are uniquely determined and not influenced by the confidence level, and that they are suitable for describing the positional uncertainty of planar geometry features.

  9. Everyday memory errors in older adults.

    Science.gov (United States)

    Ossher, Lynn; Flegal, Kristin E; Lustig, Cindy

    2013-01-01

    Despite concern about cognitive decline in old age, few studies document the types and frequency of memory errors older adults make in everyday life. In the present study, 105 healthy older adults completed the Everyday Memory Questionnaire (EMQ; Sunderland, Harris, & Baddeley, 1983 , Journal of Verbal Learning and Verbal Behavior, 22, 341), indicating what memory errors they had experienced in the last 24 hours, the Memory Self-Efficacy Questionnaire (MSEQ; West, Thorn, & Bagwell, 2003 , Psychology and Aging, 18, 111), and other neuropsychological and cognitive tasks. EMQ and MSEQ scores were unrelated and made separate contributions to variance on the Mini Mental State Exam (MMSE; Folstein, Folstein, & McHugh, 1975 , Journal of Psychiatric Research, 12, 189), suggesting separate constructs. Tip-of-the-tongue errors were the most commonly reported, and the EMQ Faces/Places and New Things subscales were most strongly related to MMSE. These findings may help training programs target memory errors commonly experienced by older adults, and suggest which types of memory errors could indicate cognitive declines of clinical concern.

  10. Error Tendencies in Processing Student Feedback for Instructional Decision Making.

    Science.gov (United States)

    Schermerhorn, John R., Jr.; And Others

    1985-01-01

    Seeks to assist instructors in recognizing two basic errors that can occur in processing student evaluation data on instructional development efforts; offers a research framework for future investigations of the error tendencies and related issues; and suggests ways in which instructors can confront and manage error tendencies in practice. (MBR)

  11. Dose error analysis for a scanned proton beam delivery system

    International Nuclear Information System (INIS)

    Coutrakon, G; Wang, N; Miller, D W; Yang, Y

    2010-01-01

    All particle beam scanning systems are subject to dose delivery errors due to errors in position, energy and intensity of the delivered beam. In addition, finite scan speeds, beam spill non-uniformities, and delays in detector, detector electronics and magnet responses will all contribute errors in delivery. In this paper, we present dose errors for an 8 × 10 × 8 cm³ target of uniform water equivalent density with 8 cm spread out Bragg peak and a prescribed dose of 2 Gy. Lower doses are also analyzed and presented later in the paper. Beam energy errors and errors due to limitations of scanning system hardware have been included in the analysis. By using Gaussian-shaped pencil beams derived from measurements in the research room of the James M Slater Proton Treatment and Research Center at Loma Linda, CA and executing treatment simulations multiple times, statistical dose errors have been calculated in each 2.5 mm cubic voxel in the target. These errors were calculated by delivering multiple treatments to the same volume and calculating the rms variation in delivered dose at each voxel in the target. The variations in dose were the result of random beam delivery errors such as proton energy, spot position and intensity fluctuations. The results show that with reasonable assumptions of random beam delivery errors, the spot scanning technique yielded an rms dose error in each voxel less than 2% or 3% of the 2 Gy prescribed dose. These calculated errors are within acceptable clinical limits for radiation therapy.
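A one-dimensional caricature of the simulation approach: deliver Gaussian pencil-beam spots with small random position and intensity errors many times, then compute the rms dose deviation in each voxel. Beam width, spot spacing, and error magnitudes below are invented for illustration, not measured Loma Linda values.

```python
import math
import random

def deliver(spots, voxels, sigma, pos_err, int_err, rng):
    """One delivery: sum Gaussian spots with perturbed position/intensity."""
    dose = [0.0] * len(voxels)
    for s in spots:
        x0 = s + rng.gauss(0.0, pos_err)          # spot position error
        w = 1.0 + rng.gauss(0.0, int_err)         # spot intensity error
        for i, x in enumerate(voxels):
            dose[i] += w * math.exp(-0.5 * ((x - x0) / sigma) ** 2)
    return dose

rng = random.Random(0)
voxels = [0.25 * i for i in range(41)]            # 10 cm line, 2.5 mm voxels
spots = [0.5 * j for j in range(21)]              # spots every 5 mm
nominal = deliver(spots, voxels, 0.5, 0.0, 0.0, rng)
runs = [deliver(spots, voxels, 0.5, 0.05, 0.02, rng) for _ in range(50)]
rms_pct = [100.0 * (sum((run[i] - nominal[i]) ** 2 for run in runs)
                    / len(runs)) ** 0.5 / nominal[i]
           for i in range(len(voxels))]
```

The per-voxel rms values play the role of the paper's statistical dose errors; whether they land under a clinical threshold depends entirely on the assumed error magnitudes.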

  12. Sensation seeking and error processing.

    Science.gov (United States)

    Zheng, Ya; Sheng, Wenbin; Xu, Jing; Zhang, Yuanyuan

    2014-09-01

    Sensation seeking is defined by a strong need for varied, novel, complex, and intense stimulation, and a willingness to take risks for such experience. Several theories propose that the insensitivity to negative consequences incurred by risks is one of the hallmarks of sensation-seeking behaviors. In this study, we investigated the time course of error processing in sensation seeking by recording event-related potentials (ERPs) while high and low sensation seekers performed an Eriksen flanker task. Whereas there were no group differences in ERPs to correct trials, sensation seeking was associated with a blunted error-related negativity (ERN), which was female-specific. Further, different subdimensions of sensation seeking were related to ERN amplitude differently. These findings indicate that the relationship between sensation seeking and error processing is sex-specific. Copyright © 2014 Society for Psychophysiological Research.

  13. Challenge and Error: Critical Events and Attention-Related Errors

    Science.gov (United States)

    Cheyne, James Allan; Carriere, Jonathan S. A.; Solman, Grayden J. F.; Smilek, Daniel

    2011-01-01

    Attention lapses resulting from reactivity to task challenges and their consequences constitute a pervasive factor affecting everyday performance errors and accidents. A bidirectional model of attention lapses (error ↔ attention-lapse; Cheyne, Solman, Carriere, & Smilek, 2009) argues that errors beget errors by generating attention…

  14. An experimental approach to validating a theory of human error in complex systems

    Science.gov (United States)

    Morris, N. M.; Rouse, W. B.

    1985-01-01

    The problem of 'human error' is pervasive in engineering systems in which the human is involved. In contrast to the common engineering approach of dealing with error probabilistically, the present research seeks to alleviate problems associated with error by gaining a greater understanding of causes and contributing factors from a human information processing perspective. The general approach involves identifying conditions which are hypothesized to contribute to errors, and experimentally creating the conditions in order to verify the hypotheses. The conceptual framework which serves as the basis for this research is discussed briefly, followed by a description of upcoming research. Finally, the potential relevance of this research to design, training, and aiding issues is discussed.

  15. Error forecasting schemes of error correction at receiver

    International Nuclear Information System (INIS)

    Bhunia, C.T.

    2007-08-01

    To combat errors in computer communication networks, ARQ (Automatic Repeat Request) techniques are used. Recently Chakraborty proposed a simple technique called the packet combining scheme, in which errors are corrected at the receiver from the erroneous copies. The Packet Combining (PC) scheme fails (i) when bit error locations in the erroneous copies are the same and (ii) when multiple bit errors occur. Both of these cases have recently been addressed by two schemes known as the Packet Reversed Packet Combining (PRPC) scheme and the Modified Packet Combining (MPC) scheme, respectively. In this letter, two error forecasting correction schemes are reported which, in combination with PRPC, offer higher throughput. (author)
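
    The basic packet-combining idea underlying these schemes can be sketched as follows (this is not the exact PRPC/MPC algorithm, and the helper names are hypothetical): XOR two erroneous copies of a packet to locate the bits where they disagree, then try combinations of those bits until the checksum matches.

```python
# Illustrative sketch of packet combining at the receiver: where two
# received copies differ, at least one copy is wrong, so search the
# disputed bits for the combination that satisfies the checksum.
import zlib
from itertools import product

def combine(copy_a, copy_b, checksum):
    diff = [i for i, (a, b) in enumerate(zip(copy_a, copy_b)) if a != b]
    for choice in product((0, 1), repeat=len(diff)):
        candidate = list(copy_a)
        for pos, pick in zip(diff, choice):
            candidate[pos] = copy_b[pos] if pick else copy_a[pos]
        if zlib.crc32(bytes(candidate)) == checksum:
            return candidate
    # No combination works, e.g. when both copies err at the same
    # position -- failure mode (i) noted in the abstract.
    return None

sent = [1, 0, 1, 1, 0, 0, 1, 0]
checksum = zlib.crc32(bytes(sent))
copy_a = sent[:]; copy_a[2] ^= 1    # one bit error in copy A
copy_b = sent[:]; copy_b[5] ^= 1    # a different bit error in copy B
print(combine(copy_a, copy_b, checksum) == sent)
```

    If both copies carry an error at the same bit position, the XOR reveals no disagreement there and the scheme fails, which is exactly the limitation PRPC was proposed to address.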

  16. FRamework Assessing Notorious Contributing Influences for Error (FRANCIE): Perspective on Taxonomy Development to Support Error Reporting and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lon N. Haney; David I. Gertman

    2003-04-01

    Beginning in the 1980s, a primary focus of human reliability analysis was estimation of human error probabilities. However, detailed qualitative modeling with comprehensive representation of contextual variables often was lacking. This was likely due to the lack of comprehensive error and performance shaping factor taxonomies, and the limited data available on observed error rates and their relationship to specific contextual variables. In the mid-1990s, Boeing, America West Airlines, NASA Ames Research Center and INEEL partnered in a NASA-sponsored Advanced Concepts grant to: assess the state of the art in human error analysis, identify future needs for human error analysis, and develop an approach addressing these needs. Identified needs included a method to identify and prioritize task and contextual characteristics affecting human reliability, as well as comprehensive taxonomies to support detailed qualitative modeling and to structure meaningful data collection efforts across domains. A result was the development of the FRamework Assessing Notorious Contributing Influences for Error (FRANCIE) with a taxonomy for airline maintenance tasks. The assignment of performance shaping factors to generic errors by experts proved to be valuable to qualitative modeling. Performance shaping factors and error types from such detailed approaches can be used to structure error reporting schemes. In a recent NASA Advanced Human Support Technology grant, FRANCIE was refined and two new taxonomies for use on space missions were developed. The development, sharing, and use of error taxonomies, and the refinement of approaches for increased fidelity of qualitative modeling, are offered as a means to help direct useful data collection strategies.

  17. The assessment of cognitive errors using an observer-rated method.

    Science.gov (United States)

    Drapeau, Martin

    2014-01-01

    Cognitive Errors (CEs) are a key construct in cognitive behavioral therapy (CBT). Integral to CBT is that individuals with depression process information in an overly negative or biased way, and that this bias is reflected in specific depressotypic CEs which are distinct from normal information processing. Despite the importance of this construct in CBT theory, practice, and research, few methods are available to researchers and clinicians to reliably identify CEs as they occur. In this paper, the author presents a rating system, the Cognitive Error Rating Scale, which can be used by trained observers to identify and assess the cognitive errors of patients or research participants in vivo, i.e., as they are used or reported by the patients or participants. The method is described, including some of the more important rating conventions to be considered when using the method. This paper also describes the 15 cognitive errors assessed, and the different summary scores, including valence of the CEs, that can be derived from the method.

  18. Error identification and improvement in English first additional ...

    African Journals Online (AJOL)

    This paper seeks to identify errors committed by learners in EFAL essay writing, focussing on causes behind such errors, and strategies to eliminate them as a way of improving learners' writing skills. Document review was adopted as the research method in this study. 15 Grade 10 essays from Mmapadi Secondary school ...

  19. Speech error and tip-of-the-tongue diary for mobile devices

    Directory of Open Access Journals (Sweden)

    Michael S Vitevitch

    2015-08-01

    Full Text Available Collections of various types of speech errors have increased our understanding of the acquisition, production, and perception of language. Although such collections of naturally occurring language errors are invaluable for a number of reasons, the process of collecting various types of speech errors presents many challenges to the researcher interested in building such a collection, among them a significant investment of time and effort to obtain a sufficient number of examples to enable statistical analysis. Here we describe a freely accessible website (http://spedi.ku.edu that helps users document slips of the tongue, slips of the ear, and tip of the tongue states that they experience firsthand or observe in others. The documented errors are amassed, and made available for other users to analyze, thereby distributing the time and effort involved in collecting errors across a large number of individuals instead of saddling the lone researcher, and facilitating distribution of the collection to other researchers. This approach also addresses some issues related to data curation that hampered previous error collections, and enables the collection to continue to grow over a longer period of time than previous collections. Finally, this web-based tool creates an opportunity for language scientists to engage in outreach efforts to increase the understanding of language disorders and research in the general public.

  20. Operator errors

    International Nuclear Information System (INIS)

    Knuefer; Lindauer

    1980-01-01

    In spectacular events, a combination of component failure and human error is often found. The Rasmussen Report and the German Risk Assessment Study in particular show for pressurised water reactors that human error must not be underestimated. Although operator errors, as a form of human error, can never be eliminated entirely, they can be minimized and their effects kept within acceptable limits if a thorough training of personnel is combined with an adequate design of the plant against accidents. In contrast to the investigation of engineering errors, the investigation of human errors has so far been carried out with relatively small budgets. Intensified investigations in this field appear to be a worthwhile effort. (orig.)

  1. Uncovering Student Ideas in Astronomy 45 Formative Assessment Probes

    CERN Document Server

    Keeley, Page

    2012-01-01

    What do your students know, or think they know, about what causes night and day, why days are shorter in winter, and how to tell a planet from a star? Find out with this book on astronomy, the latest in NSTA's popular Uncovering Student Ideas in Science series. The 45 astronomy probes provide situations that will pique your students' interest while helping you understand how your students think about key ideas related to the universe and how it operates.

  2. Barriers to Medical Error Reporting for Physicians and Nurses.

    Science.gov (United States)

    Soydemir, Dilek; Seren Intepeler, Seyda; Mert, Hatice

    2017-10-01

    The purpose of the study was to determine what barriers to error reporting exist for physicians and nurses. The study, of descriptive qualitative design, was conducted with physicians and nurses working at a training and research hospital. In-depth interviews were held with eight physicians and 15 nurses, a total of 23 participants. Physicians and nurses do not choose to report medical errors that they experience or witness. When barriers to error reporting were examined, it was seen that there were four main themes involved: fear, the attitude of administration, barriers related to the system, and the employees' perceptions of error. It is important in terms of preventing medical errors to identify the barriers that keep physicians and nurses from reporting errors.

  3. A Corpus-based Study of EFL Learners’ Errors in IELTS Essay Writing

    OpenAIRE

    Hoda Divsar; Robab Heydari

    2017-01-01

    The present study analyzed different types of errors in the EFL learners’ IELTS essays. In order to determine the major types of errors, a corpus of 70 IELTS examinees’ writings was collected, and their errors were extracted and categorized qualitatively. Errors were categorized based on a researcher-developed error-coding scheme into 13 aspects. Based on the descriptive statistical analyses, the frequency of each error type was calculated and the commonest errors committed by the EFL learners…

  4. How Do Simulated Error Experiences Impact Attitudes Related to Error Prevention?

    Science.gov (United States)

    Breitkreuz, Karen R; Dougal, Renae L; Wright, Melanie C

    2016-10-01

    The objective of this project was to determine whether simulated exposure to error situations changes attitudes in a way that may have a positive impact on error prevention behaviors. Using a stratified quasi-randomized experiment design, we compared risk perception attitudes of a control group of nursing students who received standard error education (reviewed medication error content and watched movies about error experiences) to an experimental group of students who reviewed medication error content and participated in simulated error experiences. Dependent measures included perceived memorability of the educational experience, perceived frequency of errors, and perceived caution with respect to preventing errors. Experienced nursing students perceived the simulated error experiences to be more memorable than movies. Less experienced students perceived both simulated error experiences and movies to be highly memorable. After the intervention, compared with movie participants, simulation participants believed errors occurred more frequently. Both types of education increased the participants' intentions to be more cautious and reported caution remained higher than baseline for medication errors 6 months after the intervention. This study provides limited evidence of an advantage of simulation over watching movies describing actual errors with respect to manipulating attitudes related to error prevention. Both interventions resulted in long-term impacts on perceived caution in medication administration. Simulated error experiences made participants more aware of how easily errors can occur, and the movie education made participants more aware of the devastating consequences of errors.

  5. Propagation of positional error in 3D GIS

    NARCIS (Netherlands)

    Biljecki, Filip; Heuvelink, Gerard B.M.; Ledoux, Hugo; Stoter, Jantien

    2015-01-01

    While error propagation in GIS is a topic that has received a lot of attention, it has not been researched with 3D GIS data. We extend error propagation to 3D city models using a Monte Carlo simulation on a use case of annual solar irradiation estimation of building rooftops for assessing the

  6. Learning from Errors: A Model of Individual Processes

    Science.gov (United States)

    Tulis, Maria; Steuer, Gabriele; Dresel, Markus

    2016-01-01

    Errors bear the potential to improve knowledge acquisition, provided that learners are able to deal with them in an adaptive and reflexive manner. However, learners experience a host of different--often impeding or maladaptive--emotional and motivational states in the face of academic errors. Research has made few attempts to develop a theory that…

  7. The Applicability of Standard Error of Measurement and Minimal Detectable Change to Motor Learning Research-A Behavioral Study.

    Science.gov (United States)

    Furlan, Leonardo; Sterr, Annette

    2018-01-01

    Motor learning studies face the challenge of differentiating between real changes in performance and random measurement error. While the traditional p-value-based analyses of difference (e.g., t-tests, ANOVAs) provide information on the statistical significance of a reported change in performance scores, they do not inform as to the likely cause or origin of that change, that is, the contribution of both real modifications in performance and random measurement error to the reported change. One way of differentiating between real change and random measurement error is through the utilization of the statistics of standard error of measurement (SEM) and minimal detectable change (MDC). SEM is estimated from the standard deviation of a sample of scores at baseline and a test-retest reliability index of the measurement instrument or test employed. MDC, in turn, is estimated from SEM and a degree of confidence, usually 95%. The MDC value might be regarded as the minimum amount of change that needs to be observed for it to be considered a real change, or a change to which the contribution of real modifications in performance is likely to be greater than that of random measurement error. A computer-based motor task was designed to illustrate the applicability of SEM and MDC to motor learning research. Two studies were conducted with healthy participants. Study 1 assessed the test-retest reliability of the task and Study 2 consisted of a typical motor learning study, where participants practiced the task for five consecutive days. In Study 2, the data were analyzed with a traditional p-value-based analysis of difference (ANOVA) and also with SEM and MDC. The findings showed good test-retest reliability for the task and that the p-value-based analysis alone identified statistically significant improvements in performance over time even when the observed changes could in fact have been smaller than the MDC and thereby caused mostly by random measurement error, as opposed to real changes in performance.
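
    The SEM and MDC estimates described in this abstract can be sketched as follows, assuming the test-retest reliability index is supplied as an ICC; the baseline scores and ICC value below are invented for illustration.

```python
# Minimal sketch of SEM and MDC95: SEM from the baseline standard
# deviation and a reliability index, MDC from SEM at 95% confidence.
import math

baseline_scores = [12.1, 14.3, 11.8, 13.5, 12.9, 15.0, 13.2, 12.4]
icc = 0.90          # assumed test-retest reliability of the task

n = len(baseline_scores)
mean = sum(baseline_scores) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in baseline_scores) / (n - 1))

sem = sd * math.sqrt(1 - icc)            # standard error of measurement
mdc95 = 1.96 * math.sqrt(2) * sem        # minimal detectable change, 95%

# A change smaller than mdc95 may be caused mostly by measurement error.
print(round(sem, 3), round(mdc95, 3))
```

    An observed improvement smaller than `mdc95` would, on this reasoning, not be distinguishable from random measurement error even if a t-test or ANOVA flags it as statistically significant.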

  8. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suited for automatically developing ultra reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  9. Spelling Errors of Iranian School-Level EFL Learners: Potential Sources

    Directory of Open Access Journals (Sweden)

    Mahnaz Saeidi

    2010-05-01

    Full Text Available With the purpose of examining the sources of spelling errors of Iranian school-level EFL learners, the present researchers analyzed the dictation samples of 51 Iranian senior and junior high school male and female students studying at an Iranian school in Baku, Azerbaijan. The content analysis of the data revealed three main sources (intralingual, interlingual, and unique) with seven patterns of errors. The frequency of intralingual errors far exceeds that of interlingual errors, and unique errors were fewer still. Therefore, in-service training programs may include some instruction on raising the teachers’ awareness of the different sources of errors to focus on during the teaching program.

  10. Epistemically Virtuous Risk Management : Financial Due Diligence and Uncovering the Madoff Fraud

    NARCIS (Netherlands)

    de Bruin, Boudewijn; Luetge, Christoph; Jauernig, Johanna

    2014-01-01

    The chapter analyses how Bernard Madoff’s Ponzi scheme was uncovered by Harry Markopolos, an employee of Rampart Investment Management, LLC, and the contribution of so-called epistemic virtues to Markopolos’ success. After Rampart had informed the firm about an allegedly highly successful hedge fund

  11. Integrative Genetic and Epigenetic Analysis Uncovers Regulatory Mechanisms of Autoimmune Disease.

    Science.gov (United States)

    Shooshtari, Parisa; Huang, Hailiang; Cotsapas, Chris

    2017-07-06

    Genome-wide association studies in autoimmune and inflammatory diseases (AID) have uncovered hundreds of loci mediating risk. These associations are preferentially located in non-coding DNA regions and in particular in tissue-specific DNase I hypersensitivity sites (DHSs). While these analyses clearly demonstrate the overall enrichment of disease risk alleles on gene regulatory regions, they are not designed to identify individual regulatory regions mediating risk or the genes under their control, and thus uncover the specific molecular events driving disease risk. To do so we have departed from standard practice by identifying regulatory regions which replicate across samples and connect them to the genes they control through robust re-analysis of public data. We find significant evidence of regulatory potential in 78/301 (26%) risk loci across nine autoimmune and inflammatory diseases, and we find that individual genes are targeted by these effects in 53/78 (68%) of these. Thus, we are able to generate testable mechanistic hypotheses of the molecular changes that drive disease risk. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  12. Statistical errors in Monte Carlo estimates of systematic errors

    Energy Technology Data Exchange (ETDEWEB)

    Roe, Byron P. [Department of Physics, University of Michigan, Ann Arbor, MI 48109 (United States)]. E-mail: byronroe@umich.edu

    2007-01-01

    For estimating the effects of a number of systematic errors on a data sample, one can generate Monte Carlo (MC) runs with systematic parameters varied and examine the change in the desired observed result. Two methods are often used. In the unisim method, the systematic parameters are varied one at a time by one standard deviation, each parameter corresponding to a MC run. In the multisim method (see ), each MC run has all of the parameters varied; the amount of variation is chosen from the expected distribution of each systematic parameter, usually assumed to be a normal distribution. The variance of the overall systematic error determination is derived for each of the two methods and comparisons are made between them. If one focuses not on the error in the prediction of an individual systematic error, but on the overall error due to all systematic errors in the error matrix element in data bin m, the number of events needed is strongly reduced because of the averaging effect over all of the errors. For simple models presented here the multisim model was far better if the statistical error in the MC samples was larger than an individual systematic error, while for the reverse case, the unisim model was better. Exact formulas and formulas for the simple toy models are presented so that realistic calculations can be made. The calculations in the present note are valid if the errors are in a linear region. If that region extends sufficiently far, one can have the unisims or multisims correspond to k standard deviations instead of one. This reduces the number of events required by a factor of k².
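
    The two procedures can be sketched on a toy linear observable, where the abstract notes the calculations are valid; the model coefficients and error sizes below are invented for illustration.

```python
# Toy sketch of the unisim vs. multisim procedures for propagating
# systematic errors through a Monte Carlo observable.
import numpy as np

rng = np.random.default_rng(1)
sigmas = np.array([0.03, 0.05, 0.02])   # sd of three systematic parameters

def observable(params):
    # A toy linear observable; in the linear regime both methods agree.
    return 1.0 + params @ np.array([1.0, -0.5, 2.0])

nominal = observable(np.zeros(3))

# unisim: one run per parameter, varied by one standard deviation;
# the total systematic error is the quadrature sum of the shifts.
shifts = [observable(np.eye(3)[i] * sigmas[i]) - nominal for i in range(3)]
unisim_err = np.sqrt(sum(s ** 2 for s in shifts))

# multisim: every run varies all parameters at once, drawn from their
# assumed normal distributions; the error is the spread over many runs.
draws = rng.normal(0.0, sigmas, size=(10000, 3))
multisim_err = np.std([observable(p) for p in draws])

print(unisim_err, multisim_err)   # close for a linear model
```

    For a nonlinear observable the two estimates can diverge, which is where the trade-off between statistical MC error and individual systematic size discussed above becomes relevant.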

  13. Statistical errors in Monte Carlo estimates of systematic errors

    International Nuclear Information System (INIS)

    Roe, Byron P.

    2007-01-01

    For estimating the effects of a number of systematic errors on a data sample, one can generate Monte Carlo (MC) runs with systematic parameters varied and examine the change in the desired observed result. Two methods are often used. In the unisim method, the systematic parameters are varied one at a time by one standard deviation, each parameter corresponding to a MC run. In the multisim method (see ), each MC run has all of the parameters varied; the amount of variation is chosen from the expected distribution of each systematic parameter, usually assumed to be a normal distribution. The variance of the overall systematic error determination is derived for each of the two methods and comparisons are made between them. If one focuses not on the error in the prediction of an individual systematic error, but on the overall error due to all systematic errors in the error matrix element in data bin m, the number of events needed is strongly reduced because of the averaging effect over all of the errors. For simple models presented here the multisim model was far better if the statistical error in the MC samples was larger than an individual systematic error, while for the reverse case, the unisim model was better. Exact formulas and formulas for the simple toy models are presented so that realistic calculations can be made. The calculations in the present note are valid if the errors are in a linear region. If that region extends sufficiently far, one can have the unisims or multisims correspond to k standard deviations instead of one. This reduces the number of events required by a factor of k².

  14. Destroyed documents: uncovering the science that Imperial Tobacco Canada sought to conceal.

    Science.gov (United States)

    Hammond, David; Chaiton, Michael; Lee, Alex; Collishaw, Neil

    2009-11-10

    In 1992, British American Tobacco had its Canadian affiliate, Imperial Tobacco Canada, destroy internal research documents that could expose the company to liability or embarrassment. Sixty of these destroyed documents were subsequently uncovered in British American Tobacco's files. Legal counsel for Imperial Tobacco Canada provided a list of 60 destroyed documents to British American Tobacco. Information in this list was used to search for copies of the documents in British American Tobacco files released through court disclosure. We reviewed and summarized this information. Imperial Tobacco destroyed documents that included evidence from scientific reviews prepared by British American Tobacco's researchers, as well as 47 original research studies, 35 of which examined the biological activity and carcinogenicity of tobacco smoke. The documents also describe British American Tobacco research on cigarette modifications and toxic emissions, including the ways in which consumers adapted their smoking behaviour in response to these modifications. The documents also depict a comprehensive research program on the pharmacology of nicotine and the central role of nicotine in smoking behaviour. British American Tobacco scientists noted that "... the present scale of the tobacco industry is largely dependent on the intensity and nature of the pharmacological action of nicotine," and that "... should nicotine become less attractive to smokers, the future of the tobacco industry would become less secure." The scientific evidence contained in the documents destroyed by Imperial Tobacco demonstrates that British American Tobacco had collected evidence that cigarette smoke was carcinogenic and addictive. The evidence that Imperial Tobacco sought to destroy had important implications for government regulation of tobacco.

  15. Commission errors of active intentions: the roles of aging, cognitive load, and practice.

    Science.gov (United States)

    Boywitt, C Dennis; Rummel, Jan; Meiser, Thorsten

    2015-01-01

    Performing an intended action when it needs to be withheld, for example, taking a temporarily prescribed medication when it is incompatible with other medication, is referred to as a commission error of prospective memory (PM). While recent research indicates that older adults are especially prone to commission errors for finished intentions, there is a lack of research on the effects of aging on commission errors for still-active intentions. The present research investigates conditions which might contribute to older adults' propensity to perform planned intentions under inappropriate conditions. Specifically, disproportionately higher rates of commission errors for still-active intentions were observed in older than in younger adults with both salient (Experiment 1) and non-salient (Experiment 2) target cues. Practicing the PM task in Experiment 2, however, helped execution of the intended action, in terms of higher PM performance at faster ongoing-task response times, but did not increase the rate of commission errors. The results have important implications for the understanding of older adults' PM commission errors and the processes involved in these errors.

  16. Measuring worst-case errors in a robot workcell

    International Nuclear Information System (INIS)

    Simon, R.W.; Brost, R.C.; Kholwadwala, D.K.

    1997-10-01

    Errors in model parameters, sensing, and control are inevitably present in real robot systems. These errors must be considered in order to automatically plan robust solutions to many manipulation tasks. Lozano-Perez, Mason, and Taylor proposed a formal method for synthesizing robust actions in the presence of uncertainty; this method has been extended by several subsequent researchers. All of these results presume the existence of worst-case error bounds that describe the maximum possible deviation between the robot's model of the world and reality. This paper examines the problem of measuring these error bounds for a real robot workcell. These measurements are difficult, because of the desire to completely contain all possible deviations while avoiding bounds that are overly conservative. The authors present a detailed description of a series of experiments that characterize and quantify the possible errors in visual sensing and motion control for a robot workcell equipped with standard industrial robot hardware. In addition to providing a means for measuring these specific errors, these experiments shed light on the general problem of measuring worst-case errors

  17. AN ERROR ANALYSIS OF ARGUMENTATIVE ESSAY (CASE STUDY AT UNIVERSITY MUHAMMADIYAH OF METRO

    Directory of Open Access Journals (Sweden)

    Fenny - Thresia

    2015-10-01

    Full Text Available The purpose of this study was to analyze the students’ errors in writing argumentative essays. The researcher focuses on errors of verb, concord, and learner language. This study took 20 students from the third semester as the subject of research. The data were taken from observation and documentation. Based on the result of the data analysis, some errors are still found in the students’ argumentative essays in English writing. The most common errors are verb errors; the second is concord, and learner-language errors are the least frequent. Of the 20 samples taken, verb errors account for 12 items (60%), concord errors for 8 items (40%), and learner-language errors for 7 items (35%). As a result, verbs have the biggest number of common errors.

  18. Correcting the Standard Errors of 2-Stage Residual Inclusion Estimators for Mendelian Randomization Studies.

    Science.gov (United States)

    Palmer, Tom M; Holmes, Michael V; Keating, Brendan J; Sheehan, Nuala A

    2017-11-01

    Mendelian randomization studies use genotypes as instrumental variables to test for and estimate the causal effects of modifiable risk factors on outcomes. Two-stage residual inclusion (TSRI) estimators have been used when researchers are willing to make parametric assumptions. However, researchers are currently reporting uncorrected or heteroscedasticity-robust standard errors for these estimates. We compared several different forms of the standard error for linear and logistic TSRI estimates in simulations and in real-data examples. Among others, we consider standard errors modified from the approach of Newey (1987), Terza (2016), and bootstrapping. In our simulations Newey, Terza, bootstrap, and corrected 2-stage least squares (in the linear case) standard errors gave the best results in terms of coverage and type I error. In the real-data examples, the Newey standard errors were 0.5% and 2% larger than the unadjusted standard errors for the linear and logistic TSRI estimators, respectively. We show that TSRI estimators with modified standard errors have correct type I error under the null. Researchers should report TSRI estimates with modified standard errors instead of reporting unadjusted or heteroscedasticity-robust standard errors. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
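
    A hedged sketch of a linear TSRI fit with a bootstrap standard error, one of the corrections compared in this abstract, is given below on simulated data. This is not the authors' implementation, and the variable names and effect sizes are invented.

```python
# Two-stage residual inclusion (TSRI) on simulated Mendelian
# randomization data, with a bootstrap standard error.
import numpy as np

rng = np.random.default_rng(42)
n = 2000
z = rng.binomial(2, 0.3, n).astype(float)    # genotype instrument (0/1/2)
u = rng.normal(size=n)                        # unobserved confounder
x = 0.5 * z + u + rng.normal(size=n)          # exposure
y = 0.4 * x + u + rng.normal(size=n)          # outcome; true effect 0.4

def tsri(z, x, y):
    # Stage 1: regress exposure on instrument, keep residuals.
    Z = np.column_stack([np.ones_like(z), z])
    resid = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    # Stage 2: regress outcome on exposure plus stage-1 residuals;
    # the exposure coefficient is the causal-effect estimate.
    X = np.column_stack([np.ones_like(x), x, resid])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

est = tsri(z, x, y)
boot = [tsri(z[idx], x[idx], y[idx])
        for idx in (rng.integers(0, n, n) for _ in range(200))]
se = np.std(boot, ddof=1)
print(est, se)
```

    A naive ordinary-least-squares standard error from stage 2 alone ignores the uncertainty carried over from stage 1, which is the problem the Newey, Terza, and bootstrap corrections discussed above address.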

  19. Field error lottery

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, C.J.; McVey, B. (Los Alamos National Lab., NM (USA)); Quimby, D.C. (Spectra Technology, Inc., Bellevue, WA (USA))

    1990-01-01

    The level of field errors in an FEL is an important determinant of its performance. We have computed 3D performance of a large laser subsystem subjected to field errors of various types. These calculations have been guided by simple models such as SWOOP. The technique of choice is utilization of the FELEX free electron laser code that now possesses extensive engineering capabilities. Modeling includes the ability to establish tolerances of various types: fast and slow scale field bowing, field error level, beam position monitor error level, gap errors, defocusing errors, energy slew, displacement and pointing errors. Many effects of these errors on relative gain and relative power extraction are displayed and are the essential elements of determining an error budget. The random errors also depend on the particular random number seed used in the calculation. The simultaneous display of the performance versus error level of cases with multiple seeds illustrates the variations attributable to stochasticity of this model. All these errors are evaluated numerically for comprehensive engineering of the system. In particular, gap errors are found to place requirements beyond mechanical tolerances of ±25 μm, and amelioration of these may occur by a procedure utilizing direct measurement of the magnetic fields at assembly time. 4 refs., 12 figs.

  20. MEDICAL ERROR: CIVIL AND LEGAL ASPECT.

    Science.gov (United States)

    Buletsa, S; Drozd, O; Yunin, O; Mohilevskyi, L

    2018-03-01

    The scientific article is focused on the research of the notion of medical error, medical and legal aspects of this notion have been considered. The necessity of the legislative consolidation of the notion of «medical error» and criteria of its legal estimation have been grounded. In the process of writing a scientific article, we used the empirical method, general scientific and comparative legal methods. A comparison of the concept of medical error in civil and legal aspects was made from the point of view of Ukrainian, European and American scientists. It has been marked that the problem of medical errors is known since ancient times and in the whole world, in fact without regard to the level of development of medicine, there is no country, where doctors never make errors. According to the statistics, medical errors in the world are included in the first five reasons of death rate. At the same time the grant of medical services practically concerns all people. As a man and his life, health in Ukraine are acknowledged by a higher social value, medical services must be of high-quality and effective. The grant of not quality medical services causes harm to the health, and sometimes the lives of people; it may result in injury or even death. The right to the health protection is one of the fundamental human rights assured by the Constitution of Ukraine; therefore the issue of medical errors and liability for them is extremely relevant. The authors make conclusions, that the definition of the notion of «medical error» must get the legal consolidation. Besides, the legal estimation of medical errors must be based on the single principles enshrined in the legislation and confirmed by judicial practice.

  1. Managing organizational errors: Three theoretical lenses on a bank collapse

    OpenAIRE

    Giolito, Vincent

    2015-01-01

    Errors have been shown to be a major source of organizational disasters, yet scant research has paid attention to the management of errors, that is, what managers do once errors have occurred and how their actions may determine outcomes. In an early attempt to build a theory of the management of organizational errors, this paper examines how extant theory applies to the collapse of a bank. The financial industry was chosen because of the systemic risks it entails, as demonstrated by the financial cr...

  2. Investigating Medication Errors in Educational Health Centers of Kermanshah

    Directory of Open Access Journals (Sweden)

    Mohsen Mohammadi

    2015-08-01

    Full Text Available Background and objectives: Medication errors can be a threat to the safety of patients, and preventing them requires reporting and investigating such errors. The present study was conducted to investigate medication errors in educational health centers of Kermanshah. Material and Methods: The present research is an applied, descriptive-analytical study conducted as a survey. The Error Report form of the Ministry of Health and Medical Education was used for data collection. The population of the study included all the personnel (nurses, doctors, paramedics) of educational health centers of Kermanshah; among them, those who reported committed errors were selected as the sample. The data analysis was done using descriptive statistics and the chi-square test in SPSS version 18. Results: The findings showed that most errors were related to not using medication properly, the fewest errors were related to improper dose, and the majority of errors occurred in the morning. The most frequent reason for errors was staff negligence and the least frequent was lack of knowledge. Conclusion: The health-care system should create an environment for detecting and reporting errors by the personnel, recognize the factors causing errors, train the personnel, and provide a good working environment with a standard workload.

  3. Errors in clinical laboratories or errors in laboratory medicine?

    Science.gov (United States)

    Plebani, Mario

    2006-01-01

    Laboratory testing is a highly complex process and, although laboratory services are relatively safe, they are not as safe as they could or should be. Clinical laboratories have long focused their attention on quality control methods and quality assessment programs dealing with analytical aspects of testing. However, a growing body of evidence accumulated in recent decades demonstrates that quality in clinical laboratories cannot be assured by merely focusing on purely analytical aspects. The more recent surveys on errors in laboratory medicine conclude that in the delivery of laboratory testing, mistakes occur more frequently before (pre-analytical) and after (post-analytical) the test has been performed. Most errors are due to pre-analytical factors (46-68.2% of total errors), while a high error rate (18.5-47% of total errors) has also been found in the post-analytical phase. Errors due to analytical problems have been significantly reduced over time, but there is evidence that, particularly for immunoassays, interference may have a serious impact on patients. A description of the most frequent and risky pre-, intra- and post-analytical errors and advice on practical steps for measuring and reducing the risk of errors is therefore given in the present paper. Many mistakes in the Total Testing Process are called "laboratory errors", although these may be due to poor communication, action taken by others involved in the testing process (e.g., physicians, nurses and phlebotomists), or poorly designed processes, all of which are beyond the laboratory's control. Likewise, there is evidence that laboratory information is only partially utilized. A recent document from the International Organization for Standardization (ISO) recommends a new, broader definition of the term "laboratory error" and a classification of errors according to different criteria. 
In a modern approach to total quality, centered on patients' needs and satisfaction, the risk of errors and mistakes

  4. Radiologic errors, past, present and future.

    Science.gov (United States)

    Berlin, Leonard

    2014-01-01

    During the 10-year period beginning in 1949 with the publication of five articles in two radiology journals and the UK's The Lancet, a California radiologist named L.H. Garland almost single-handedly shocked the entire medical and especially the radiologic community. He focused their attention on the fact, now known and accepted by all but at that time not previously recognized and acknowledged only with great reluctance, that a substantial degree of observer error was prevalent in radiologic interpretation. In the more than half-century that followed, Garland's pioneering work has been affirmed and reaffirmed by numerous researchers. Retrospective studies disclosed then and still disclose today that diagnostic errors in radiologic interpretations of plain radiographic (as well as CT, MR, ultrasound, and radionuclide) images hover in the 30% range, not too dissimilar to the error rates in clinical medicine. Seventy percent of these errors are perceptual in nature, i.e., the radiologist does not "see" the abnormality on the imaging exam, perhaps due to poor conspicuity, satisfaction of search, or simply the "inexplicable psycho-visual phenomena of human perception." The remainder are cognitive errors: the radiologist sees an abnormality but fails to render a correct diagnosis by attaching the wrong significance to what is seen, perhaps due to inadequate knowledge, or an alliterative or judgmental error. Computer-assisted detection (CAD), a technology that for the past two decades has been utilized primarily in mammographic interpretation, increases sensitivity but at the same time decreases specificity; whether it reduces errors is debatable. Efforts to reduce diagnostic radiologic errors continue, but the degree to which they will be successful remains to be determined.

  5. ERF/ERFC, Calculation of Error Function, Complementary Error Function, Probability Integrals

    International Nuclear Information System (INIS)

    Vogel, J.E.

    1983-01-01

    1 - Description of problem or function: ERF and ERFC are used to compute values of the error function and complementary error function for any real number. They may be used to compute other related functions such as the normal probability integrals. 4. Method of solution: The error function and complementary error function are approximated by rational functions. Three such rational approximations are used, depending on the magnitude of the argument x. In the first region the error function is computed directly and the complementary error function is computed via the identity erfc(x)=1.0-erf(x). In the other two regions the complementary error function is computed directly and the error function is computed from the identity erf(x)=1.0-erfc(x). The error function and complementary error function are real-valued functions of any real argument. The range of the error function is (-1,1). The range of the complementary error function is (0,2). 5. Restrictions on the complexity of the problem: The user is cautioned against using ERF to compute the complementary error function via the identity erfc(x)=1.0-erf(x). This subtraction may cause partial or total loss of significance for certain values of x.
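The caution in the record is easy to demonstrate. The sketch below uses Python's math.erf/math.erfc (not the Fortran routines the record describes) to show why computing the complementary error function via erfc(x) = 1.0 - erf(x) fails for large arguments: once erf(x) rounds to 1.0 in double precision, the subtraction returns exactly zero.

```python
import math

# The identity erfc(x) = 1 - erf(x) is exact in real arithmetic, but in
# floating point the subtraction loses all significance once erf(x)
# rounds to 1.0 (roughly x > 6 for IEEE doubles).
x = 10.0
direct = math.erfc(x)          # dedicated routine: tiny but meaningful value
via_identity = 1.0 - math.erf(x)  # 0.0: total loss of significance

print(direct)        # ~2.09e-45
print(via_identity)  # 0.0
```

This is exactly why the record computes erfc directly in the large-|x| regions instead of relying on the identity.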

  6. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
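The rationale for concatenation can be sketched with a toy stand-in: below, a simple 3x repetition code plays the role of the inner code (the study itself considers modulation block codes), cleaning up most channel errors before the interleaved RS(255,223) outer code would see them. The binary symmetric channel and its error rate are illustrative assumptions.

```python
import random

random.seed(1)

def repetition_encode(bits, r=3):
    # Toy "inner code": repeat each bit r times.
    return [b for b in bits for _ in range(r)]

def repetition_decode(chan, r=3):
    # Majority vote over each r-bit group.
    return [int(sum(chan[i:i + r]) > r // 2) for i in range(0, len(chan), r)]

def bsc(bits, p):
    # Binary symmetric channel: flip each bit independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

msg = [random.randint(0, 1) for _ in range(10000)]
p = 0.05
raw_errors = sum(a != b for a, b in zip(msg, bsc(msg, p)))
coded_errors = sum(a != b for a, b in
                   zip(msg, repetition_decode(bsc(repetition_encode(msg), p))))
print(raw_errors, coded_errors)  # inner decoding leaves far fewer errors for the outer code
```

The residual errors after inner decoding tend to arrive in bursts, which is precisely what the interleaved Reed-Solomon outer code is well suited to correct.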

  7. A methodology for translating positional error into measures of attribute error, and combining the two error sources

    Science.gov (United States)

    Yohay Carmel; Curtis Flather; Denis Dean

    2006-01-01

    This paper summarizes our efforts to investigate the nature, behavior, and implications of positional error and attribute error in spatiotemporal datasets. Estimating the combined influence of these errors on map analysis has been hindered by the fact that these two error types are traditionally expressed in different units (distance units, and categorical units,...

  8. Uncovering Barriers to Teaching Assistants (TAs) Implementing Inquiry Teaching: Inconsistent Facilitation Techniques, Student Resistance, and Reluctance to Share Control over Learning with Students.

    Science.gov (United States)

    Gormally, Cara; Sullivan, Carol Subiño; Szeinbaum, Nadia

    2016-05-01

    Inquiry-based teaching approaches are increasingly being adopted in biology laboratories. Yet teaching assistants (TAs), often novice teachers, teach the majority of laboratory courses in US research universities. This study analyzed the perspectives of TAs and their students and used classroom observations to uncover challenges faced by TAs during their first year of inquiry-based teaching. Our study revealed three insights about barriers to effective inquiry teaching practices: 1) TAs lack sufficient facilitation skills; 2) TAs struggle to share control over learning with students as they reconcile long-standing teaching beliefs with newly learned approaches, consequently undermining their fledgling ability to use inquiry approaches; and 3) student evaluations reinforce teacher-centered behaviors as TAs receive positive feedback conflicting with inquiry approaches. We make recommendations, including changing instructional feedback to focus on learner-centered teaching practices. We urge TA mentors to engage TAs in discussions to uncover teaching beliefs underlying teaching choices and support TAs through targeted feedback and practice.

  9. Uncovering Barriers to Teaching Assistants (TAs) Implementing Inquiry Teaching: Inconsistent Facilitation Techniques, Student Resistance, and Reluctance to Share Control over Learning with Students

    Directory of Open Access Journals (Sweden)

    Cara Gormally

    2016-05-01

    Full Text Available Inquiry-based teaching approaches are increasingly being adopted in biology laboratories. Yet teaching assistants (TAs), often novice teachers, teach the majority of laboratory courses in US research universities. This study analyzed the perspectives of TAs and their students and used classroom observations to uncover challenges faced by TAs during their first year of inquiry-based teaching. Our study revealed three insights about barriers to effective inquiry teaching practices: 1) TAs lack sufficient facilitation skills; 2) TAs struggle to share control over learning with students as they reconcile long-standing teaching beliefs with newly learned approaches, consequently undermining their fledgling ability to use inquiry approaches; and 3) student evaluations reinforce teacher-centered behaviors as TAs receive positive feedback conflicting with inquiry approaches. We make recommendations, including changing instructional feedback to focus on learner-centered teaching practices. We urge TA mentors to engage TAs in discussions to uncover teaching beliefs underlying teaching choices and support TAs through targeted feedback and practice.

  10. Subject-verb agreement: Error production by Tourism undergraduate students

    Directory of Open Access Journals (Sweden)

    Ana Paula Correia

    2014-11-01

    Full Text Available The aim of this paper, which is part of a more extensive research project on verb tense errors, is to investigate subject-verb agreement errors in the simple present in the texts of a group of Tourism undergraduate students. Based on the concept of interlanguage and following the error analysis model, this descriptive non-experimental study applies qualitative and quantitative procedures. Three types of instruments were used to collect data: a sociolinguistic questionnaire (to define the learners’ profile); the Dialang test (to establish their proficiency level in English); and our own learner corpus (140 texts). Errors were identified and classified by an expert panel in accordance with a verb error taxonomy developed for this study based on the taxonomy established by the Cambridge Learner Corpus. The Markin software was used to code errors in the corpus and the Wordsmith Tools software to analyze the data. Subject-verb agreement errors and their relation to the learners’ proficiency levels are described.

  11. Cultural differences in categorical memory errors persist with age.

    Science.gov (United States)

    Gutchess, Angela; Boduroglu, Aysecan

    2018-01-02

    This cross-sectional experiment examined the influence of aging on cross-cultural differences in memory errors. Previous research revealed that Americans committed more categorical memory errors than Turks; we tested whether the cognitive constraints associated with aging impacted the pattern of memory errors across cultures. Furthermore, older adults are vulnerable to memory errors for semantically-related information, and we assessed whether this tendency occurs across cultures. Younger and older adults from the US and Turkey studied word pairs, with some pairs sharing a categorical relationship and some unrelated. Participants then completed a cued recall test, generating the word that was paired with the first. These responses were scored for correct responses or different types of errors, including categorical and semantic. The tendency for Americans to commit more categorical memory errors emerged for both younger and older adults. In addition, older adults across cultures committed more memory errors, and these were for semantically-related information (including both categorical and other types of semantic errors). Heightened vulnerability to memory errors with age extends across cultural groups, and Americans' proneness to commit categorical memory errors occurs across ages. The findings indicate some robustness in the ways that age and culture influence memory errors.

  12. Error or "act of God"? A study of patients' and operating room team members' perceptions of error definition, reporting, and disclosure.

    Science.gov (United States)

    Espin, Sherry; Levinson, Wendy; Regehr, Glenn; Baker, G Ross; Lingard, Lorelei

    2006-01-01

    Calls abound for a culture change in health care to improve patient safety. However, effective change cannot proceed without a clear understanding of perceptions and beliefs about error. In this study, we describe and compare operative team members' and patients' perceptions of error, reporting of error, and disclosure of error. Thirty-nine interviews of team members (9 surgeons, 9 nurses, 10 anesthesiologists) and patients (11) were conducted at 2 teaching hospitals using 4 scenarios as prompts. Transcribed responses to open questions were analyzed by 2 researchers for recurrent themes using the grounded-theory method. Yes/no answers were compared across groups using chi-square analyses. Team members and patients agreed on what constitutes an error. Deviation from standards and negative outcome were emphasized as definitive features. Patients and nurse professionals differed significantly in their perception of whether errors should be reported. Nurses were willing to report only events within their disciplinary scope of practice. Although most patients strongly advocated full disclosure of errors (what happened and how), team members preferred to disclose only what happened. When patients did support partial disclosure, their rationales varied from that of team members. Both operative teams and patients define error in terms of breaking the rules and the concept of "no harm no foul." These concepts pose challenges for treating errors as system failures. A strong culture of individualism pervades nurses' perception of error reporting, suggesting that interventions are needed to foster collective responsibility and a constructive approach to error identification.

  13. Coping with medical error: a systematic review of papers to assess the effects of involvement in medical errors on healthcare professionals' psychological well-being.

    Science.gov (United States)

    Sirriyeh, Reema; Lawton, Rebecca; Gardner, Peter; Armitage, Gerry

    2010-12-01

    Previous research has established health professionals as secondary victims of medical error, identifying a range of emotional and psychological repercussions that may occur as a result of involvement in error. Due to the vast range of emotional and psychological outcomes, research to date has been inconsistent in the variables measured and tools used. Therefore, differing conclusions have been drawn as to the nature of the impact of error on professionals and the subsequent repercussions for their team, patients and healthcare institution. A systematic review was conducted. Data sources were identified using database searches, with additional reference and hand searching. Eligibility criteria were applied to all studies identified, resulting in a total of 24 included studies. Quality assessment was conducted on the included studies using a tool developed as part of this research, but due to the limited number and diverse nature of studies, no exclusions were made on this basis. Review findings suggest that there is consistent evidence for the widespread impact of medical error on health professionals. Psychological repercussions may include negative states such as shame, self-doubt, anxiety and guilt. Despite much attention devoted to the assessment of negative outcomes, the potential for positive outcomes resulting from error also became apparent, with increased assertiveness, confidence and improved colleague relationships reported. It is evident that involvement in a medical error can elicit a significant psychological response from the health professional involved. However, a lack of literature around coping and support, coupled with inconsistencies and weaknesses in methodology, may need to be addressed in future work.

  14. 77 FR 27064 - Agency Forms Undergoing Paperwork Reduction Act Review

    Science.gov (United States)

    2012-05-08

    ... identified in a traditional survey interview, such as interpretive errors and recall accuracy, are uncovered... contract) research, demonstrations, and evaluations respecting new or improved methods for obtaining... CDC surveys (such as the NCHS National Health Interview Survey, OMB No. 0920-0214) and other federally...

  15. Double checking medicines: defence against error or contributory factor?

    Science.gov (United States)

    Armitage, Gerry

    2008-08-01

    The double checking of medicines in health care is a contestable procedure. It occupies an obvious position in health care practice and is understood to be an effective defence against medication error but the process is variable and the outcomes have not been exposed to testing. This paper presents an appraisal of the process using data from part of a larger study on the contributory factors in medication errors and their reporting. Previous research studies are reviewed; data are analysed from a review of 991 drug error reports and a subsequent series of 40 in-depth interviews with health professionals in an acute hospital in northern England. The incident reports showed that errors occurred despite double checking but that action taken did not appear to investigate the checking process. Most interview participants (34) talked extensively about double checking but believed the process to be inconsistent. Four key categories were apparent: deference to authority, reduction of responsibility, automatic processing and lack of time. Solutions to the problems were also offered, which are discussed with several recommendations. Double checking medicines should be a selective and systematic procedure informed by key principles and encompassing certain behaviours. Psychological research may be instructive in reducing checking errors but the aviation industry may also have a part to play in increasing error wisdom and reducing risk.

  16. Applying Intelligent Algorithms to Automate the Identification of Error Factors.

    Science.gov (United States)

    Jin, Haizhe; Qu, Qingxing; Munechika, Masahiko; Sano, Masataka; Kajihara, Chisato; Duffy, Vincent G; Chen, Han

    2018-05-03

    Medical errors are the manifestation of defects occurring in medical processes. Extracting and identifying these defects as medical error factors is an effective approach to preventing medical errors. However, it is a difficult and time-consuming task that requires an analyst with a professional medical background, so a method is needed to extract medical error factors while reducing the difficulty of extraction. In this research, a systematic methodology to extract and identify error factors in the medical administration process was proposed. The design of the error report, extraction of the error factors, and identification of the error factors were analyzed. Based on 624 medical error cases across four medical institutes in Japan and China, 19 error-related items and their levels were extracted and then linked to 12 error factors. The relational model between the error-related items and error factors was established using a genetic algorithm (GA)-back-propagation neural network (BPNN) model. Compared with plain BPNN, partial least squares regression and support vector regression, GA-BPNN exhibited a higher overall prediction accuracy, being able to promptly identify the error factors from the error-related items. The combination of error-related items, their different levels, and the GA-BPNN model was proposed as an error-factor identification technology that can automatically identify medical error factors.

  17. Learning mechanisms to limit medication administration errors.

    Science.gov (United States)

    Drach-Zahavy, Anat; Pud, Dorit

    2010-04-01

    This paper is a report of a study conducted to identify and test the effectiveness of learning mechanisms applied by the nursing staff of hospital wards as a means of limiting medication administration errors. Since the influential report 'To Err Is Human', research has emphasized the role of team learning in reducing medication administration errors. Nevertheless, little is known about the mechanisms underlying team learning. Thirty-two hospital wards were randomly recruited. Data were collected during 2006 in Israel by a multi-method (observations, interviews and administrative data), multi-source (head nurses, bedside nurses) approach. Medication administration error was defined as any deviation from procedures, policies and/or best practices for medication administration, and was identified using semi-structured observations of nurses administering medication. Organizational learning was measured using semi-structured interviews with head nurses, and the previous year's reported medication administration errors were assessed using administrative data. The interview data revealed four learning mechanism patterns employed in an attempt to learn from medication administration errors: integrated, non-integrated, supervisory and patchy learning. Regression analysis results demonstrated that whereas the integrated pattern of learning mechanisms was associated with decreased errors, the non-integrated pattern was associated with increased errors. Supervisory and patchy learning mechanisms were not associated with errors. Superior learning mechanisms are those that represent the whole cycle of team learning, are enacted by nurses who administer medications to patients, and emphasize a system approach to data analysis instead of analysis of individual cases.

  18. [Event-related EEG potentials associated with error detection in psychiatric disorder: literature review].

    Science.gov (United States)

    Balogh, Lívia; Czobor, Pál

    2010-01-01

    Error-related bioelectric signals constitute a special subgroup of event-related potentials. Researchers have identified two evoked potential components to be closely related to error processing, namely error-related negativity (ERN) and error-positivity (Pe), and they linked these to specific cognitive functions. In our article first we give a brief description of these components, then based on the available literature, we review differences in error-related evoked potentials observed in patients across psychiatric disorders. The PubMed and Medline search engines were used in order to identify all relevant articles, published between 2000 and 2009. For the purpose of the current paper we reviewed publications summarizing results of clinical trials. Patients suffering from schizophrenia, anorexia nervosa or borderline personality disorder exhibited a decrease in the amplitude of error-negativity when compared with healthy controls, while in cases of depression and anxiety an increase in the amplitude has been observed. Some of the articles suggest specific personality variables, such as impulsivity, perfectionism, negative emotions or sensitivity to punishment to underlie these electrophysiological differences. Research in the field of error-related electric activity has come to the focus of psychiatry research only recently, thus the amount of available data is significantly limited. However, since this is a relatively new field of research, the results available at present are noteworthy and promising for future electrophysiological investigations in psychiatric disorders.

  19. Recognition of medical errors' reporting system dimensions in educational hospitals.

    Science.gov (United States)

    Yarmohammadian, Mohammad H; Mohammadinia, Leila; Tavakoli, Nahid; Ghalriz, Parvin; Haghshenas, Abbas

    2014-01-01

    Nowadays medical errors are one of the serious issues in the health-care system and pose a threat to patient safety. The most important step toward promoting safety is identifying errors and their causes in order to recognize, correct and eliminate them. Concern about recurring medical errors and the harm they cause has led to the design and establishment of medical error reporting systems for hospitals and centers providing therapeutic services. The aim of this study is the recognition of the dimensions of medical errors' reporting systems in educational hospitals. This research is a descriptive-analytical, qualitative study carried out in the Shahid Beheshti educational therapeutic center in Isfahan during 2012. Relevant information was collected through 15 face-to-face interviews, each lasting about one hour, and five focus group discussions of about 45 minutes each, composed of the matron, educational supervisor, health officer, health educator, and all of the head nurses. Data from the interviews and discussion sessions were coded; the results were then extracted in the presence of experts and, after receiving their feedback, were categorized. To ensure the correctness of the information, the tables were presented to the interviewees and the final corrections were confirmed based on their views. The information extracted from the interviews and discussion groups was divided into nine main categories after content analysis and subject coding, and their subsets have been completely expressed. The achieved dimensions comprise nine domains: the concept of medical error, error cases according to nurses' perceptions, barriers to medical error reporting, employees' motivational factors for error reporting, purposes of a medical error reporting system, error reporting's challenges and opportunities, a desired system

  20. Statistical errors in Monte Carlo estimates of systematic errors

    Science.gov (United States)

    Roe, Byron P.

    2007-01-01

    For estimating the effects of a number of systematic errors on a data sample, one can generate Monte Carlo (MC) runs with systematic parameters varied and examine the change in the desired observed result. Two methods are often used. In the unisim method, the systematic parameters are varied one at a time by one standard deviation, each parameter corresponding to an MC run. In the multisim method, each MC run has all of the parameters varied; the amount of variation is chosen from the expected distribution of each systematic parameter, usually assumed to be a normal distribution. The variance of the overall systematic error determination is derived for each of the two methods and comparisons are made between them. If one focuses not on the error in the prediction of an individual systematic error, but on the overall error due to all systematic errors in the error matrix element in data bin m, the number of events needed is strongly reduced because of the averaging effect over all of the errors. For the simple models presented here, the multisim method was far better if the statistical error in the MC samples was larger than an individual systematic error, while for the reverse case, the unisim method was better. Exact formulas and formulas for the simple toy models are presented so that realistic calculations can be made. The calculations in the present note are valid if the errors are in a linear region. If that region extends sufficiently far, one can have the unisims or multisims correspond to k standard deviations instead of one. This reduces the number of events required by a factor of k². The specific terms unisim and multisim were coined by Peter Meyers and Steve Brice, respectively, for the MiniBooNE experiment. However, the concepts have been developed over time and have been in general use for some time.
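    The unisim/multisim contrast above can be sketched numerically. The toy below is a minimal sketch under assumed values: a linear model with illustrative sensitivities `c` and statistical noise `stat`, not the formulas or parameters of the note itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy linear model: the observable shifts by sum(c_i * a_i) when the
    # systematic parameters a_i (in units of their own sigma) are varied.
    # 'c' plays the role of the unknown sensitivities; 'stat' is the
    # statistical error of each MC run. All values are illustrative.
    c = np.array([0.5, 1.0, 2.0])
    stat = 0.2

    def observable(a):
        return c @ a + rng.normal(0.0, stat)

    # Unisim: one MC run per parameter, each varied by one standard deviation.
    def unisim_variance():
        deltas = np.array([observable(np.eye(len(c))[i]) - observable(np.zeros(len(c)))
                           for i in range(len(c))])
        return np.sum(deltas ** 2)      # estimate of the total systematic variance

    # Multisim: every MC run draws all parameters from their assumed normal priors.
    def multisim_variance(n_runs=1000):
        results = np.array([observable(rng.normal(size=len(c))) for _ in range(n_runs)])
        return results.var(ddof=1)

    true_var = np.sum(c ** 2)           # exact total systematic variance of the toy
    print(true_var, unisim_variance(), multisim_variance())
    ```

    With `stat` comparable to the smallest sensitivity, the multisim estimate averages the statistical noise away over many runs, matching the note's conclusion for that regime.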

  1. Fail Better: Toward a Taxonomy of E-Learning Error

    Science.gov (United States)

    Priem, Jason

    2010-01-01

    The study of student error, important across many fields of educational research, has begun to attract interest in the field of e-learning, particularly in relation to usability. However, it remains unclear when errors should be avoided (as usability failures) or embraced (as learning opportunities). Many domains have benefited from taxonomies of…

  2. Using HET taxonomy to help stop human error

    OpenAIRE

    Li, Wen-Chin; Harris, Don; Stanton, Neville A.; Hsu, Yueh-Ling; Chang, Danny; Wang, Thomas; Young, Hong-Tsu

    2010-01-01

    Flight crews make positive contributions to the safety of aviation operations. Pilots have to assess continuously changing situations, evaluate potential risks, and make quick decisions. However, even well-trained and experienced pilots make errors. Accident investigations have identified that pilots’ performance is influenced significantly by the design of the flightdeck interface. This research applies hierarchical task analysis (HTA) and utilizes the Human Error Template (HET) taxonomy to ...

  3. Medication Errors - A Review

    OpenAIRE

    Vinay BC; Nikhitha MK; Patel Sunil B

    2015-01-01

    This review article explains the definition of medication errors, the scope of the medication error problem, the types of medication errors, their common causes, and the monitoring, consequences, prevention, and management of medication errors, presented clearly with tables that are easy to understand.

  4. On the determinants of measurement error in time-driven costing

    NARCIS (Netherlands)

    Cardinaels, E.; Labro, E.

    2008-01-01

    Although time estimates are used extensively for costing purposes, they are prone to measurement error. In an experimental setting, we research how measurement error in time estimates varies with: (1) the level of aggregation in the definition of costing system activities (aggregated or

  5. The role of hand of error and stimulus orientation in the relationship between worry and error-related brain activity: Implications for theory and practice.

    Science.gov (United States)

    Lin, Yanli; Moran, Tim P; Schroder, Hans S; Moser, Jason S

    2015-10-01

    Anxious apprehension/worry is associated with exaggerated error monitoring; however, the precise mechanisms underlying this relationship remain unclear. The current study tested the hypothesis that the worry-error monitoring relationship involves left-lateralized linguistic brain activity by examining the relationship between worry and error monitoring, indexed by the error-related negativity (ERN), as a function of hand of error (Experiment 1) and stimulus orientation (Experiment 2). Results revealed that worry was exclusively related to the ERN on right-handed errors committed by the linguistically dominant left hemisphere. Moreover, the right-hand ERN-worry relationship emerged only when stimuli were presented horizontally (known to activate verbal processes) but not vertically. Together, these findings suggest that the worry-ERN relationship involves left hemisphere verbal processing, elucidating a potential mechanism to explain error monitoring abnormalities in anxiety. Implications for theory and practice are discussed. © 2015 Society for Psychophysiological Research.

  6. Error Patterns

    NARCIS (Netherlands)

    Hoede, C.; Li, Z.

    2001-01-01

    In coding theory the problem of decoding focuses on error vectors. In the simplest situation code words are $(0,1)$-vectors, as are the received messages and the error vectors. Comparison of a received word with the code words yields a set of error vectors. In deciding on the original code word,
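    The decoding setting described above can be sketched directly: compare the received (0,1)-vector with every code word and take the nearest one, the error vector being their componentwise difference. The code and received word below are illustrative, not from the paper.

    ```python
    # Minimum-Hamming-distance decoding over (0,1)-vectors: the received word
    # is compared with every code word, and the error vector is the XOR of the
    # received word with the nearest code word. Code words are illustrative.
    code_words = [(0, 0, 0, 0, 0), (1, 1, 1, 0, 0), (0, 0, 1, 1, 1), (1, 1, 0, 1, 1)]

    def hamming(u, v):
        return sum(a != b for a, b in zip(u, v))

    def decode(received):
        best = min(code_words, key=lambda w: hamming(w, received))
        error = tuple(a ^ b for a, b in zip(best, received))
        return best, error

    word, err = decode((1, 1, 1, 1, 0))
    print(word, err)   # -> (1, 1, 1, 0, 0) (0, 0, 0, 1, 0)
    ```

    When two code words are equidistant from the received word, `min` silently picks the first, which is exactly the ambiguity in "deciding on the original code word" that the abstract alludes to.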

  7. Researchers Find a Mechanism for Schizophrenia

    Science.gov (United States)

    Scientists uncovered a mechanism behind genetic variations previously linked to schizophrenia. The findings may lead to new clinical approaches.

  8. Errors Analysis of Students in Mathematics Department to Learn Plane Geometry

    Science.gov (United States)

    Mirna, M.

    2018-04-01

    This article describes the results of qualitative descriptive research that reveals the locations, types, and causes of student errors in answering plane geometry problems at the problem-solving level. Answers from 59 students on three test items showed errors ranging from understanding the concepts and principles of geometry itself to errors in applying them to problem solving. The types of error consist of concept errors, principle errors, and operational errors. Reflection with four subjects revealed the causes of the errors: 1) student learning motivation is very low, 2) in their high school learning experience, geometry was treated as unimportant, 3) students have very little experience using their own reasoning to solve problems, and 4) students' reasoning ability is still very low.

  9. The error in total error reduction.

    Science.gov (United States)

    Witnauer, James E; Urcelay, Gonzalo P; Miller, Ralph R

    2014-02-01

    Most models of human and animal learning assume that learning is proportional to the discrepancy between a delivered outcome and the outcome predicted by all cues present during that trial (i.e., total error across a stimulus compound). This total error reduction (TER) view has been implemented in connectionist and artificial neural network models to describe the conditions under which weights between units change. Electrophysiological work has revealed that the activity of dopamine neurons is correlated with the total error signal in models of reward learning. Similar neural mechanisms presumably support fear conditioning, human contingency learning, and other types of learning. Using a computational modeling approach, we compared several TER models of associative learning to an alternative model that rejects the TER assumption in favor of local error reduction (LER), which assumes that learning about each cue is proportional to the discrepancy between the delivered outcome and the outcome predicted by that specific cue on that trial. The LER model provided a better fit to the reviewed data than the TER models. Given the superiority of the LER model with the present data sets, acceptance of TER should be tempered. Copyright © 2013 Elsevier Inc. All rights reserved.
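    The TER/LER contrast can be made concrete with a toy compound-conditioning simulation. This is a minimal sketch, not the authors' computational models: learning rate, trial count, and the two-cue compound are illustrative assumptions.

    ```python
    # Toy contrast between a total-error (Rescorla-Wagner-style, TER) and a
    # local-error (LER) update rule for two cues trained in compound with a
    # single outcome. Learning rate and trial count are illustrative.
    alpha, outcome, n_trials = 0.1, 1.0, 100

    w_ter = [0.0, 0.0]   # TER: both cues share one prediction error
    w_ler = [0.0, 0.0]   # LER: each cue has its own prediction error

    for _ in range(n_trials):
        total_error = outcome - (w_ter[0] + w_ter[1])   # error of the compound
        for i in range(2):
            w_ter[i] += alpha * total_error             # TER update
            w_ler[i] += alpha * (outcome - w_ler[i])    # LER update

    print(w_ter, w_ler)
    ```

    Under TER the cues compete and the summed weight converges to the outcome (each cue ends near 0.5), whereas under LER each cue independently converges to the full outcome, which is the core behavioral difference the paper's model comparison exploits.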

  10. Characteristics of pediatric chemotherapy medication errors in a national error reporting database.

    Science.gov (United States)

    Rinke, Michael L; Shore, Andrew D; Morlock, Laura; Hicks, Rodney W; Miller, Marlene R

    2007-07-01

    Little is known regarding chemotherapy medication errors in pediatrics despite studies suggesting high rates of overall pediatric medication errors. In this study, the authors examined patterns in pediatric chemotherapy errors. The authors queried the United States Pharmacopeia MEDMARX database, a national, voluntary, Internet-accessible error reporting system, for all error reports from 1999 through 2004 that involved chemotherapy medications and patients younger than 18 years. Of the error reports identified, 85% reached the patient, and 15.6% required additional patient monitoring or therapeutic intervention. Forty-eight percent of errors originated in the administering phase of medication delivery, and 30% originated in the drug-dispensing phase. Of the 387 medications cited, 39.5% were antimetabolites, 14.0% were alkylating agents, 9.3% were anthracyclines, and 9.3% were topoisomerase inhibitors. The most commonly involved chemotherapeutic agents were methotrexate (15.3%), cytarabine (12.1%), and etoposide (8.3%). The most common error types were improper dose/quantity (22.9% of 327 cited error types), wrong time (22.6%), omission error (14.1%), and wrong administration technique/wrong route (12.2%). The most common error causes were performance deficit (41.3% of 547 cited error causes), equipment and medication delivery devices (12.4%), communication (8.8%), knowledge deficit (6.8%), and written order errors (5.5%). Four of the 5 most serious errors occurred at community hospitals. Pediatric chemotherapy errors often reached the patient, potentially were harmful, and differed in quality between outpatient and inpatient areas. This study indicated which chemotherapeutic agents most often were involved in errors and that administering errors were common. Investigation is needed regarding targeted medication administration safeguards for these high-risk medications. Copyright (c) 2007 American Cancer Society.

  11. Uncovering Barriers to Teaching Assistants (TAs) Implementing Inquiry Teaching: Inconsistent Facilitation Techniques, Student Resistance, and Reluctance to Share Control over Learning with Students †

    Science.gov (United States)

    Gormally, Cara; Sullivan, Carol Subiño; Szeinbaum, Nadia

    2016-01-01

    Inquiry-based teaching approaches are increasingly being adopted in biology laboratories. Yet teaching assistants (TAs), often novice teachers, teach the majority of laboratory courses in US research universities. This study analyzed the perspectives of TAs and their students and used classroom observations to uncover challenges faced by TAs during their first year of inquiry-based teaching. Our study revealed three insights about barriers to effective inquiry teaching practices: 1) TAs lack sufficient facilitation skills; 2) TAs struggle to share control over learning with students as they reconcile long-standing teaching beliefs with newly learned approaches, consequently undermining their fledgling ability to use inquiry approaches; and 3) student evaluations reinforce teacher-centered behaviors as TAs receive positive feedback conflicting with inquiry approaches. We make recommendations, including changing instructional feedback to focus on learner-centered teaching practices. We urge TA mentors to engage TAs in discussions to uncover teaching beliefs underlying teaching choices and support TAs through targeted feedback and practice. PMID:27158302

  12. Error Analysis in a Written Composition Análisis de errores en una composición escrita

    Directory of Open Access Journals (Sweden)

    David Alberto Londoño Vásquez

    2008-12-01

    Learners make errors in both comprehension and production. Some theoreticians have pointed out the difficulty of assigning the cause of failures in comprehension to an inadequate knowledge of a particular syntactic feature of a misunderstood utterance. Indeed, an error can be defined as a deviation from the norms of the target language. In this investigation, based on personal and professional experience, a written composition entitled "My Life in Colombia" is analyzed based on clinical elicitation (CE) research. CE involves getting the informant to produce data of any sort, for example, by means of a general interview or by asking the learner to write a composition. Some errors produced by a foreign language learner in her acquisition process are analyzed, identifying their possible sources. Finally, four kinds of errors are classified: omission, addition, misinformation, and misordering.

  13. Measurement errors in voice-key naming latency for Hiragana.

    Science.gov (United States)

    Yamada, Jun; Tamaoka, Katsuo

    2003-12-01

    This study makes explicit the limitations and possibilities of voice-key naming latency research on single hiragana symbols (a Japanese syllabic script) by examining three sets of voice-key naming data against Sakuma, Fushimi, and Tatsumi's 1997 speech-analyzer voice-waveform data. Analysis showed that voice-key measurement errors can be substantial in standard procedures as they may conceal the true effects of significant variables involved in hiragana-naming behavior. While one can avoid voice-key measurement errors to some extent by applying Sakuma, et al.'s deltas and by excluding initial phonemes which induce measurement errors, such errors may be ignored when test items are words and other higher-level linguistic materials.

  14. An investigation into soft error detection efficiency at operating system level.

    Science.gov (United States)

    Asghari, Seyyed Amir; Kaynak, Okyay; Taheri, Hassan

    2014-01-01

    Electronic equipment operating in harsh environments such as space is subjected to a range of threats. The most important of these is radiation, which gives rise to permanent and transient errors in microelectronic components. The occurrence rate of transient errors is significantly higher than that of permanent errors. Transient errors, or soft errors, emerge in two forms: control flow errors (CFEs) and data errors. Valuable research results for their alleviation have already appeared in the literature at the hardware and software levels. However, these works share the basic assumption that the operating system is reliable, so their focus is on other system levels. In this paper, we investigate the effects of soft errors on operating system components and compare their vulnerability with that of application-level components. Results show that soft errors in operating system components affect both operating system and application-level components. Therefore, by hardening operating-system-level components against soft errors, both operating system and application-level components gain tolerance.
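    The control-flow-error class mentioned above is commonly detected by signature monitoring: each basic block carries a signature, and a transition is legal only if it appears in the program's control-flow graph. The sketch below is a highly simplified illustration of that idea; the graph and traces are invented, and real schemes work at the instruction level, not on Python lists.

    ```python
    # Minimal sketch of software-based control-flow checking: a run is valid
    # only if every observed block-to-block transition is an edge of the
    # allowed control-flow graph. Graph and traces are illustrative.
    cfg = {                 # allowed successors of each basic block
        "entry": {"loop"},
        "loop": {"loop", "exit"},
        "exit": set(),
    }

    def control_flow_ok(trace):
        return all(nxt in cfg[cur] for cur, nxt in zip(trace, trace[1:]))

    print(control_flow_ok(["entry", "loop", "loop", "exit"]))   # legal path -> True
    print(control_flow_ok(["entry", "exit"]))                   # illegal jump -> False
    ```

    A soft error that corrupts the program counter shows up as a transition absent from `cfg`, which is exactly what a runtime monitor flags.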

  15. The organizational context of error tolerant interface systems

    International Nuclear Information System (INIS)

    Sepanloo, K.; Meshkati, N.; Kozuh, M.

    1995-01-01

    Human error has been recognized as the main contributor to the occurrence of incidents in large technological systems such as nuclear power plants. Recent research has concluded that human errors are unavoidable side effects of the exploration of acceptable performance during adaptation to unknown changes in the environment. To assist operators in coping with unforeseen situations, innovative error tolerant interface systems have been proposed to provide operators with opportunities to make hypothetical tests without having to carry them out directly on the plant in potentially irreversible conditions. On the other hand, the degree of success of introducing any new system into a tightly-coupled complex socio-technological system is known to depend a great deal on the degree of harmony of that system with the organization's framework and attitudes. Error tolerant interface systems, with features of simplicity, transparency, error detectability and recoverability, provide a forgiving cognition environment where the effects of errors are observable and recoverable. The nature of these systems is likely to be more consistent with flexible and rather plain organizational structures, in which static and punitive concepts of human error are modified in favour of dynamic and adaptive approaches. In this paper the features of error tolerant interface systems are explained and their consistent organizational structures are explored. (author)

  16. The organizational context of error tolerant interface systems

    Energy Technology Data Exchange (ETDEWEB)

    Sepanloo, K [Nuclear Safety Department, Tehran (Iran, Islamic Republic of); Meshkati, N [Institute of Safety and Systems Management, Los Angeles (United States); Kozuh, M [Josef Stefan Institute, Ljubljana (Slovenia)

    1996-12-31

    Human error has been recognized as the main contributor to the occurrence of incidents in large technological systems such as nuclear power plants. Recent research has concluded that human errors are unavoidable side effects of the exploration of acceptable performance during adaptation to unknown changes in the environment. To assist operators in coping with unforeseen situations, innovative error tolerant interface systems have been proposed to provide operators with opportunities to make hypothetical tests without having to carry them out directly on the plant in potentially irreversible conditions. On the other hand, the degree of success of introducing any new system into a tightly-coupled complex socio-technological system is known to depend a great deal on the degree of harmony of that system with the organization's framework and attitudes. Error tolerant interface systems, with features of simplicity, transparency, error detectability and recoverability, provide a forgiving cognition environment where the effects of errors are observable and recoverable. The nature of these systems is likely to be more consistent with flexible and rather plain organizational structures, in which static and punitive concepts of human error are modified in favour of dynamic and adaptive approaches. In this paper the features of error tolerant interface systems are explained and their consistent organizational structures are explored. (author) 11 refs.

  17. Relationship between Recent Flight Experience and Pilot Error General Aviation Accidents

    Science.gov (United States)

    Nilsson, Sarah J.

    Aviation insurance agents and fixed-base operation (FBO) owners use recent flight experience, as implied by the 90-day rule, to measure pilot proficiency in physical airplane skills, and to assess the likelihood of a pilot error accident. The generally accepted premise is that more experience in a recent timeframe predicts less of a propensity for an accident, all other factors excluded. Some of these aviation industry stakeholders measure pilot proficiency solely by using time flown within the past 90, 60, or even 30 days, not accounting for extensive research showing aeronautical decision-making and situational awareness training decrease the likelihood of a pilot error accident. In an effort to reduce the pilot error accident rate, the Federal Aviation Administration (FAA) has seen the need to shift pilot training emphasis from proficiency in physical airplane skills to aeronautical decision-making and situational awareness skills. However, current pilot training standards still focus more on the former than on the latter. The relationship between pilot error accidents and recent flight experience implied by the FAA's 90-day rule has not been rigorously assessed using empirical data. The intent of this research was to relate recent flight experience, in terms of time flown in the past 90 days, to pilot error accidents. A quantitative ex post facto approach, focusing on private pilots of single-engine general aviation (GA) fixed-wing aircraft, was used to analyze National Transportation Safety Board (NTSB) accident investigation archival data. The data were analyzed using t-tests and binary logistic regression. T-tests between the mean number of hours of recent flight experience of tricycle gear pilots involved in pilot error accidents (TPE) and non-pilot error accidents (TNPE), t(202) = -.200, p = .842, and conventional gear pilots involved in pilot error accidents (CPE) and non-pilot error accidents (CNPE), t(111) = -.271, p = .787, indicate there is no

  18. Trends in Health Information Technology Safety: From Technology-Induced Errors to Current Approaches for Ensuring Technology Safety

    Science.gov (United States)

    2013-01-01

    Objectives Health information technology (HIT) research findings suggest that new healthcare technologies can reduce some types of medical errors while at the same time introducing new classes of medical errors (i.e., technology-induced errors). Technology-induced errors have their origins in HIT, and/or HIT contributes to their occurrence. The objective of this paper is to review current trends in the published literature on HIT safety. Methods A review and synthesis of the medical and life sciences literature focusing on the area of technology-induced error was conducted. Results There were four main trends in the literature on technology-induced error. The following areas were addressed: definitions of technology-induced errors; models, frameworks, and evidence for understanding how technology-induced errors occur; monitoring; and methods for preventing and learning about technology-induced errors. Conclusions The literature focusing on technology-induced errors continues to grow. Research has focused on defining what an error is, models and frameworks used to understand these new types of errors, monitoring of such errors, and methods that can be used to prevent them. More research will be needed to better understand and mitigate these types of errors. PMID:23882411

  19. Using Generalizability Theory to Disattenuate Correlation Coefficients for Multiple Sources of Measurement Error.

    Science.gov (United States)

    Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat

    2018-05-02

    Over the years, research in the social sciences has been dominated by reporting of reliability coefficients that fail to account for key sources of measurement error. Use of these coefficients, in turn, to correct for measurement error can hinder scientific progress by misrepresenting true relationships among the underlying constructs being investigated. In the research reported here, we addressed these issues using generalizability theory (G-theory) in both traditional and new ways to account for the three key sources of measurement error (random-response, specific-factor, and transient) that affect scores from objectively scored measures. Results from 20 widely used measures of personality, self-concept, and socially desirable responding showed that conventional indices consistently misrepresented reliability and relationships among psychological constructs by failing to account for key sources of measurement error and correlated transient errors within occasions. The results further revealed that G-theory served as an effective framework for remedying these problems. We discuss possible extensions in future research and provide code from the computer package R in an online supplement to enable readers to apply the procedures we demonstrate to their own research.
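    The correction the abstract criticizes and refines is the classic disattenuation formula: an observed correlation divided by the square root of the product of the two measures' reliabilities estimates the correlation between the underlying true scores. The sketch below shows only that baseline formula with invented values; the paper's G-theory coefficients, which partition the three error sources, would replace the single reliability inputs.

    ```python
    import math

    # Classic correction for attenuation: r_true = r_xy / sqrt(rel_x * rel_y).
    # If the reliabilities omit key error sources (as conventional coefficients
    # often do), rel_x and rel_y are overestimated and the correction is too
    # weak. All numeric values are illustrative.
    def disattenuate(r_xy, rel_x, rel_y):
        return r_xy / math.sqrt(rel_x * rel_y)

    print(disattenuate(0.42, 0.80, 0.70))
    ```

    Swapping in smaller, G-theory-based generalizability coefficients for `rel_x` and `rel_y` raises the disattenuated estimate, which is the practical consequence the authors document.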

  20. A preliminary taxonomy of medical errors in family practice.

    Science.gov (United States)

    Dovey, S M; Meyers, D S; Phillips, R L; Green, L A; Fryer, G E; Galliher, J M; Kappus, J; Grob, P

    2002-09-01

    To develop a preliminary taxonomy of primary care medical errors. Qualitative analysis to identify categories of error reported during a randomized controlled trial of computer and paper reporting methods. The National Network for Family Practice and Primary Care Research. Family physicians. Medical error category, context, and consequence. Forty-two physicians made 344 reports: 284 (82.6%) arose from healthcare systems dysfunction; 46 (13.4%) were errors due to gaps in knowledge or skills; and 14 (4.1%) were reports of adverse events, not errors. The main subcategories were: administrative failure (102; 30.9% of errors), investigation failures (82; 24.8%), treatment delivery lapses (76; 23.0%), miscommunication (19; 5.8%), payment systems problems (4; 1.2%), error in the execution of a clinical task (19; 5.8%), wrong treatment decision (14; 4.2%), and wrong diagnosis (13; 3.9%). Most reports were of errors that were recognized and occurred in reporters' practices. Affected patients ranged in age from 8 months to 100 years, were of both sexes, and represented all major US ethnic groups. Almost half the reports were of events which had adverse consequences. Ten errors resulted in patients being admitted to hospital and one patient died. This medical error taxonomy, developed from self-reports of errors observed by family physicians during their routine clinical practice, emphasizes problems in healthcare processes and acknowledges medical errors arising from shortfalls in clinical knowledge and skills. Patient safety strategies with most effect in primary care settings need to be broader than the current focus on medication errors.

  1. Error studies for SNS Linac. Part 1: Transverse errors

    International Nuclear Information System (INIS)

    Crandall, K.R.

    1998-01-01

    The SNS linac consists of a radio-frequency quadrupole (RFQ), a drift-tube linac (DTL), a coupled-cavity drift-tube linac (CCDTL) and a coupled-cavity linac (CCL). The RFQ and DTL are operated at 402.5 MHz; the CCDTL and CCL are operated at 805 MHz. Between the RFQ and DTL is a medium-energy beam-transport system (MEBT). This error study is concerned with the DTL, CCDTL and CCL, and each will be analyzed separately. In fact, the CCL is divided into two sections, and each of these will be analyzed separately. The types of errors considered here are those that affect the transverse characteristics of the beam. The errors that cause the beam center to be displaced from the linac axis are quad displacements and quad tilts. The errors that cause mismatches are quad gradient errors and quad rotations (roll)
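    The effect of quad displacements on the beam center can be illustrated with a toy Monte Carlo: in a thin-lens model, a quadrupole displaced by d deflects an otherwise on-axis beam, and random displacements along a lattice accumulate into a final centroid offset. The FODO parameters below are invented for illustration and are not SNS values or the report's method.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy Monte Carlo of beam-center displacement from random quad
    # misalignments: a thin quad displaced by d kicks the beam about the
    # quad's own center, x' += -sign*(x - d)/f. Focal length f, drift L,
    # quad count, and alignment tolerance are illustrative assumptions.
    f, L, n_quads = 2.0, 1.0, 10
    sigma_d = 1e-4                                  # rms quad displacement [m]

    def final_offset():
        x, xp = 0.0, 0.0
        for sign in [1, -1] * (n_quads // 2):       # alternating FODO polarity
            d = rng.normal(0.0, sigma_d)            # random transverse offset
            xp += -sign * (x - d) / f               # thin-lens kick
            x += L * xp                             # drift to the next quad
        return x

    offsets = np.array([final_offset() for _ in range(2000)])
    print(offsets.std())                            # rms centroid displacement
    ```

    Scaling `sigma_d` up or down scales the rms centroid error linearly, which is why alignment tolerances translate directly into beam-displacement budgets in studies like this one.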

  2. Uncovering the Motivating Factors behind Writing in English in en EFL Context

    Science.gov (United States)

    Büyükyavuz, Oya; Çakir, Ismail

    2014-01-01

    Writing in a language, whether the target or native, is regarded as a complex activity operating on multiple cognitive levels. This study aimed to uncover the factors which motivate teacher trainees of English to write in English in an EFL context. The study also investigated the differences in the ways teacher trainees are motivated in terms of…

  3. A stochastic dynamic model for human error analysis in nuclear power plants

    Science.gov (United States)

    Delgado-Loperena, Dharma

    Nuclear disasters like Three Mile Island and Chernobyl indicate that human performance is a critical safety issue, sending a clear message about the need to include environmental press and competence aspects in research. This investigation was undertaken to serve as a roadmap for studying human behavior through the formulation of a general solution equation. The theoretical model integrates models from two heretofore-disassociated disciplines (behavior specialists and technical specialists) that have historically studied the nature of error and human behavior independently, incorporates concepts derived from fractal and chaos theory, and suggests re-evaluation of base theory regarding human error. The results of this research were based on comprehensive analysis of patterns of error, with the omnipresent underlying structure of chaotic systems. The study of patterns led to a dynamic formulation, serving as a basis for other formulas used to study the consequences of human error. The literature search regarding error yielded insight into the need to include concepts rooted in chaos theory and strange attractors, heretofore unconsidered by mainstream researchers who investigated human error in nuclear power plants or who employed the ecological model in their work. The study of patterns obtained from a steam generator tube rupture (SGTR) event simulation provided a direct application to aspects of control room operations in nuclear power plants. In doing so, a conceptual foundation based on the understanding of patterns of human error analysis can be gleaned, helping to reduce and prevent undesirable events.

  4. Two-dimensional errors

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    This chapter addresses the extension of previous work in one-dimensional (linear) error theory to two-dimensional error analysis. The topics of the chapter include the definition of two-dimensional error, the probability ellipse, the probability circle, elliptical (circular) error evaluation, the application to position accuracy, and the use of control systems (points) in measurements
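    The probability ellipse described above can be computed from a 2-D error covariance: its semi-axes are the square roots of the covariance eigenvalues, scaled by the chi-square quantile for the desired coverage. The sketch below uses an invented covariance matrix purely for illustration.

    ```python
    import numpy as np

    # Sketch of a two-dimensional probability (error) ellipse: eigenvalues of
    # the error covariance give the squared semi-axes, and the chi-square(2)
    # quantile sets the coverage scaling. Covariance values are illustrative.
    cov = np.array([[4.0, 1.2],
                    [1.2, 1.0]])

    eigvals, eigvecs = np.linalg.eigh(cov)   # ascending eigenvalues
    k = np.sqrt(5.991)                       # chi-square(2 dof) quantile, 95% coverage
    semi_axes = k * np.sqrt(eigvals)         # minor semi-axis, then major semi-axis
    tilt = np.degrees(np.arctan2(eigvecs[1, -1], eigvecs[0, -1]))
    print(semi_axes, tilt)
    ```

    Setting the coverage scaling so that both semi-axes are equal to the same radius recovers the probability circle as the special case of an uncorrelated, equal-variance error.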

  5. Errors in chest x-ray interpretation

    International Nuclear Information System (INIS)

    Woznitza, N.; Piper, K.

    2015-01-01

    Full text: Reporting of adult chest x-rays by appropriately trained radiographers is frequently used in the United Kingdom as one method to maintain a patient-focused radiology service in times of increasing workload. With models of advanced practice being developed in Australia, New Zealand and Canada, the spotlight is on the evidence base which underpins radiographer reporting. It is essential that any radiographer who extends their scope of practice to incorporate definitive clinical reporting perform at a level comparable to a consultant radiologist. In any analysis of performance it is important to quantify levels of sensitivity and specificity and to evaluate areas of error and variation. A critical review of the errors made by reporting radiographers in the interpretation of adult chest x-rays will be performed, examining performance in structured clinical examinations, clinical audit and a diagnostic accuracy study from research undertaken by the authors, and including studies which have compared the performance of reporting radiographers and consultant radiologists. Overall performance will be examined and common errors discussed using a case-based approach. Methods of error reduction, including multidisciplinary team meetings and ongoing learning, will be considered

  6. Subroutine library for error estimation of matrix computation (Ver. 1.0)

    International Nuclear Information System (INIS)

    Ichihara, Kiyoshi; Shizawa, Yoshihisa; Kishida, Norio

    1999-03-01

    'Subroutine Library for Error Estimation of Matrix Computation' is a subroutine library which aids users in obtaining the error ranges of linear systems' solutions or Hermitian matrices' eigenvalues. The library contains routines for both sequential and parallel computers. The subroutines for linear system error estimation calculate norms of residual vectors, condition numbers of matrices, error bounds of solutions and so on. The subroutines for error estimation of Hermitian matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. The test matrix generators supply matrices that appear in mathematical research, randomly generated matrices, and matrices that appear in application programs. This user's manual contains a brief mathematical background of error analysis in linear algebra and usage of the subroutines. (author)
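    A bound of the kind such routines compute can be sketched with numpy. This is a simplified illustration, not the library's actual interface: the relative error of an approximate solution of Ax = b is bounded by the condition number times the relative residual.

    ```python
    import numpy as np

    def solution_error_bound(A, b, x_hat):
        """Upper bound on the relative error of an approximate solution
        x_hat of A x = b, from the residual norm and condition number:
            ||x - x_hat|| / ||x|| <= cond(A) * ||r|| / ||b||
        (2-norm throughout)."""
        r = b - A @ x_hat
        return np.linalg.cond(A) * np.linalg.norm(r) / np.linalg.norm(b)

    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x_hat = np.linalg.solve(A, b)
    bound = solution_error_bound(A, b, x_hat)  # tiny: x_hat is near-exact
    ```

    A small residual alone does not guarantee a small error; the condition number factor is what makes the bound honest for ill-conditioned systems.
    
    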

  7. Issues with data and analyses: Errors, underlying themes, and potential solutions.

    Science.gov (United States)

    Brown, Andrew W; Kaiser, Kathryn A; Allison, David B

    2018-03-13

    Some aspects of science, taken at the broadest level, are universal in empirical research. These include collecting, analyzing, and reporting data. In each of these aspects, errors can and do occur. In this work, we first discuss the importance of focusing on statistical and data errors to continually improve the practice of science. We then describe underlying themes of the types of errors and postulate contributing factors. To do so, we describe a case series of relatively severe data and statistical errors coupled with surveys of some types of errors to better characterize the magnitude, frequency, and trends. Having examined these errors, we then discuss the consequences of specific errors or classes of errors. Finally, given the extracted themes, we discuss methodological, cultural, and system-level approaches to reducing the frequency of commonly observed errors. These approaches will plausibly contribute to the self-critical, self-correcting, ever-evolving practice of science, and ultimately to furthering knowledge.

  8. Errors in otology.

    Science.gov (United States)

    Kartush, J M

    1996-11-01

    Practicing medicine successfully requires that errors in diagnosis and treatment be minimized. Malpractice laws encourage litigators to ascribe all medical errors to incompetence and negligence. There are, however, many other causes of unintended outcomes. This article describes common causes of errors and suggests ways to minimize mistakes in otologic practice. Widespread dissemination of knowledge about common errors and their precursors can reduce the incidence of their occurrence. Consequently, laws should be passed to allow for a system of non-punitive, confidential reporting of errors and "near misses" that can be shared by physicians nationwide.

  9. Comparing Measurement Error between Two Different Methods of Measurement of Various Magnitudes

    Science.gov (United States)

    Zavorsky, Gerald S.

    2010-01-01

    Measurement error is a common problem in several fields of research such as medicine, physiology, and exercise science. The standard deviation of repeated measurements on the same person is the measurement error. One way of presenting measurement error is called the repeatability, which is 2.77 multiplied by the within subject standard deviation.…
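    As a worked illustration of the repeatability statistic (the measurement values below are hypothetical): for two repeats per subject, the within-subject standard deviation can be computed from the test-retest differences, and repeatability is 2.77 times that value (2.77 = 1.96 * sqrt(2)).

    ```python
    import numpy as np

    # Two repeated measurements per subject (hypothetical values)
    trial1 = np.array([12.1, 15.3, 9.8, 11.4, 14.0])
    trial2 = np.array([12.9, 14.6, 10.5, 11.0, 14.8])

    # Within-subject standard deviation for paired repeats:
    # s_w = sqrt(mean(d^2) / 2), where d is the test-retest difference.
    d = trial1 - trial2
    s_w = np.sqrt(np.mean(d ** 2) / 2.0)

    # Repeatability: the test-retest difference for a subject is expected
    # to fall below 2.77 * s_w about 95% of the time.
    repeatability = 2.77 * s_w
    ```
    
    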

  10. Learning from Errors

    OpenAIRE

    Martínez-Legaz, Juan Enrique; Soubeyran, Antoine

    2003-01-01

    We present a model of learning in which agents learn from errors. If an action turns out to be an error, the agent rejects not only that action but also neighboring actions. We find that, keeping memory of his errors, under mild assumptions an acceptable solution is asymptotically reached. Moreover, one can take advantage of big errors for a faster learning.

  11. Primary Care Providers' Perspectives on Errors of Omission.

    Science.gov (United States)

    Poghosyan, Lusine; Norful, Allison A; Fleck, Elaine; Bruzzese, Jean-Marie; Talsma, AkkeNeel; Nannini, Angela

    2017-01-01

    Despite recent focus on patient safety in primary care, little attention has been paid to errors of omission, which represent significant gaps in care and threaten patient safety in primary care but are not well studied or categorized. The purpose of this study was to develop a typology of errors of omission from the perspectives of primary care providers (PCPs) and understand what factors within practices lead to or prevent these omissions. A qualitative descriptive design was used to collect data from 26 PCPs, both physicians and nurse practitioners, from New York State through individual interviews. One researcher conducted all interviews, which were audiotaped, transcribed verbatim, and analyzed in ATLAS.ti by 3 researchers using content analysis. They immersed themselves in the data, read transcripts independently, and conducted inductive coding. The final codes were linked to each other to develop the typology of errors of omission and the themes. Data saturation was reached at the 26th interview. PCPs reported that omitting patient teaching, patient follow-up, emotional support, and addressing mental health needs were the main categories of errors of omission. PCPs perceived that time constraints, unplanned patient visits and emergencies, and administrative burden led to these gaps in care. They emphasized that organizational support and infrastructure, effective teamwork and communication, and preparation for the patient encounter were important safeguards against errors of omission within their practices. Errors of omission are common in primary care and could threaten patient safety. Efforts to eliminate them should focus on strengthening organizational attributes of practices, improving teamwork and communication, and assigning manageable workloads to PCPs. Practice and policy change is necessary to address gaps in care and prevent them before they result in patient harm. © Copyright 2017 by the American Board of Family Medicine.

  12. Normalization of Deviation: Quotation Error in Human Factors.

    Science.gov (United States)

    Lock, Jordan; Bearman, Chris

    2018-05-01

    Objective The objective of this paper is to examine quotation error in human factors. Background Science progresses through building on the work of previous research. This requires accurate quotation. Quotation error has a number of adverse consequences: loss of credibility, loss of confidence in the journal, and a flawed basis for academic debate and scientific progress. Quotation error has been observed in a number of domains, including marine biology and medicine, but there has been little or no previous study of this form of error in human factors, a domain that specializes in the causes and management of error. Methods A study was conducted examining quotation accuracy of 187 extracts from 118 published articles that cited a control article (Vaughan's 1996 book: The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA). Results Of the extracts studied, 12.8% (n = 24) were classed as inaccurate and 87.2% (n = 163) as accurate. A second dimension, agreement, was also examined, with 96.3% (n = 180) agreeing with the control article and only 3.7% (n = 7) disagreeing. The categories of accuracy and agreement form a two-by-two matrix. Conclusion Rather than simply blaming individuals for quotation error, systemic factors should also be considered. Vaughan's theory, normalization of deviance, is one systemic theory that can account for quotation error. Application Quotation error is occurring in human factors and should receive more attention. According to Vaughan's theory, the normal everyday systems that promote scholarship may also allow mistakes, mishaps, and quotation error to occur.

  13. Preanalytical Blood Sampling Errors in Clinical Settings

    International Nuclear Information System (INIS)

    Zehra, N.; Malik, A. H.; Arshad, Q.; Sarwar, S.; Aslam, S.

    2016-01-01

    Background: Blood sampling is one of the most common procedures done in every ward for disease diagnosis and prognosis. Hundreds of samples are collected daily from different wards, but lack of appropriate knowledge of blood sampling among paramedical staff and accidental errors make samples inappropriate for testing. Thus the need to avoid these errors for better results remains. We carried out this research with an aim to determine the common errors during blood sampling, find the factors responsible, and propose ways to reduce these errors. Methods: A cross-sectional descriptive study was carried out at the Military and Combined Military Hospital Rawalpindi during February and March 2014. A Venous Blood Sampling questionnaire (VBSQ) was filled by the staff on a voluntary basis in front of the researchers. The staff was briefed on the purpose of the survey before filling the questionnaire. The sample size was 228. Results were analysed using SPSS-21. Results: When asked in the questionnaire, around 61.6 percent of the paramedical staff stated that they cleaned the vein by moving the alcohol swab from inward to outward, while 20.8 percent of the staff reported that they felt the vein after disinfection. Contrary to WHO guidelines, 89.6 percent reported a habit of placing blood in the test tube while holding it in the other hand, which should actually be done after inserting the tube into the stand. Although 86 percent thought that they had ample knowledge regarding the blood sampling process, they did not practice it properly. Conclusion: Pre-analytical blood sampling errors are common in our setup. Although 86 percent of participants thought that they had adequate knowledge regarding blood sampling, most were not adhering to standard protocols. There is a need for continued education and refresher courses. (author)

  14. Online Tools for Uncovering Data Quality (DQ) Issues in Satellite-Based Global Precipitation Products

    Science.gov (United States)

    Liu, Zhong; Heo, Gil

    2015-01-01

    Data quality (DQ) has many attributes or facets (i.e., errors, biases, systematic differences, uncertainties, benchmarks, false trends, false alarm ratio, etc.). Its sources can be complicated (measurements, environmental conditions, surface types, algorithms, etc.) and difficult to identify, especially for multi-sensor and multi-satellite products with bias correction (TMPA, IMERG, etc.). Open questions include how to obtain DQ information quickly and easily, especially quantified information for a region of interest, beyond existing parameters (e.g., random error), the literature, and do-it-yourself analyses, and how to apply that knowledge in research and applications. Here, we focus on online systems for integration of products and parameters, visualization and analysis, as well as investigation and extraction of DQ information.

  15. Skills, rules and knowledge in aircraft maintenance: errors in context

    Science.gov (United States)

    Hobbs, Alan; Williamson, Ann

    2002-01-01

    Automatic or skill-based behaviour is generally considered to be less prone to error than behaviour directed by conscious control. However, researchers who have applied Rasmussen's skill-rule-knowledge human error framework to accidents and incidents have sometimes found that skill-based errors appear in significant numbers. It is proposed that this is largely a reflection of the opportunities for error which workplaces present and does not indicate that skill-based behaviour is intrinsically unreliable. In the current study, 99 errors reported by 72 aircraft mechanics were examined in the light of a task analysis based on observations of the work of 25 aircraft mechanics. The task analysis identified the opportunities for error presented at various stages of maintenance work packages and by the job as a whole. Once the frequency of each error type was normalized in terms of the opportunities for error, it became apparent that skill-based performance is more reliable than rule-based performance, which is in turn more reliable than knowledge-based performance. The results reinforce the belief that industrial safety interventions designed to reduce errors would best be directed at those aspects of jobs that involve rule- and knowledge-based performance.
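    The normalization step described above is simple arithmetic. A sketch with hypothetical counts (the figures below are illustrative, not the study's data): raw error tallies mean little until divided by how often each performance level is actually exercised.

    ```python
    # Hypothetical counts: skill-based work dominates the raw error tally
    # only because it dominates the work itself.
    errors = {"skill": 40, "rule": 35, "knowledge": 24}
    opportunities = {"skill": 10000, "rule": 2000, "knowledge": 400}

    # Errors per opportunity at each performance level
    rates = {level: errors[level] / opportunities[level] for level in errors}

    # Ranked from most reliable to least reliable per opportunity
    ranked = sorted(rates, key=rates.get)
    ```

    With these illustrative numbers the ranking reverses the impression given by the raw counts, which is the paper's central point.
    
    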

  16. An Investigation into Soft Error Detection Efficiency at Operating System Level

    Directory of Open Access Journals (Sweden)

    Seyyed Amir Asghari

    2014-01-01

    Full text available: Electronic equipment operating in harsh environments such as space is subjected to a range of threats. The most important of these is radiation, which gives rise to permanent and transient errors in microelectronic components. The occurrence rate of transient errors is significantly higher than that of permanent errors. Transient errors, or soft errors, emerge in two forms: control flow errors (CFEs) and data errors. Valuable research results have already appeared in the literature at the hardware and software levels for their alleviation. However, these works rest on the basic assumption that the operating system is reliable, and their focus is on other system levels. In this paper, we investigate the effects of soft errors on operating system components and compare their vulnerability with that of application-level components. Results show that soft errors in operating system components affect both operating system and application-level components. Therefore, by making operating system components resilient to soft errors, both operating system and application-level components gain tolerance.

  17. Design Patterns for Mixed-Method Research in HCI

    NARCIS (Netherlands)

    Robert Holwerda; Arthur Bennis; Lambert Zaad; René Bakker; Sabine Craenmehr; Stijn Hoppenbrouwers; Dick Lenior; Marjolein Jacobs; Koen van Turnhout; Ralph Niels

    2014-01-01

    In this paper we discuss mixed-method research in HCI. We report on an empirical literature study of the NordiCHI 2012 proceedings which aimed to uncover and describe common mixed-method approaches, and to identify good practices for mixed-methods research in HCI. We present our results as

  18. Incorporating measurement error in n = 1 psychological autoregressive modeling

    Science.gov (United States)

    Schuurman, Noémi K.; Houtveen, Jan H.; Hamaker, Ellen L.

    2015-01-01

    Measurement error is omnipresent in psychological data. However, the vast majority of applications of autoregressive time series analyses in psychology do not take measurement error into account. Disregarding measurement error when it is present in the data results in a bias of the autoregressive parameters. We discuss two models that take measurement error into account: An autoregressive model with a white noise term (AR+WN), and an autoregressive moving average (ARMA) model. In a simulation study we compare the parameter recovery performance of these models, and compare this performance for both a Bayesian and frequentist approach. We find that overall, the AR+WN model performs better. Furthermore, we find that for realistic (i.e., small) sample sizes, psychological research would benefit from a Bayesian approach in fitting these models. Finally, we illustrate the effect of disregarding measurement error in an AR(1) model by means of an empirical application on mood data in women. We find that, depending on the person, approximately 30–50% of the total variance was due to measurement error, and that disregarding this measurement error results in a substantial underestimation of the autoregressive parameters. PMID:26283988
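    The attenuation described here is easy to reproduce. A minimal sketch assuming numpy (series length, the true autoregressive coefficient of 0.6, and the noise scale are all illustrative): simulate an AR(1) latent process, add white measurement noise, and compare naive lag-1 autoregressive estimates on the clean and noisy series.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate an AR(1) latent process and a noisy observation of it.
    n, phi = 5000, 0.6
    latent = np.zeros(n)
    for t in range(1, n):
        latent[t] = phi * latent[t - 1] + rng.normal()
    observed = latent + rng.normal(scale=1.0, size=n)  # measurement error

    def ar1_estimate(x):
        """Lag-1 autocorrelation, the least-squares AR(1) coefficient."""
        x = x - x.mean()
        return np.dot(x[1:], x[:-1]) / np.dot(x, x)

    phi_latent = ar1_estimate(latent)      # close to 0.6
    phi_observed = ar1_estimate(observed)  # attenuated toward zero
    ```

    Disregarding the measurement error and fitting a plain AR(1) to `observed` recovers the attenuated coefficient, not the true one, which is exactly the bias the AR+WN and ARMA models are designed to remove.
    
    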

  19. Effects of Target Positioning Error on Motion Compensation for Airborne Interferometric SAR

    Directory of Open Access Journals (Sweden)

    Li Yin-wei

    2013-12-01

    Full text available: The measurement inaccuracies of the Inertial Measurement Unit/Global Positioning System (IMU/GPS), as well as the positioning error of the target, may contribute to residual uncompensated motion errors in the MOtion COmpensation (MOCO) approach based on IMU/GPS measurements. Addressing the effects of target positioning error on MOCO for airborne interferometric SAR, the paper first derives a mathematical model of the residual motion error caused by target positioning error under squint conditions. Based on this model, it analyzes the contributions to the residual motion error of the system sampling delay error, the Doppler center frequency error, and the reference DEM error, each of which produces target positioning error. The paper then discusses the effects of the reference DEM error on interferometric SAR image quality, the interferometric phase, and the coherence coefficient. The research provides a theoretical basis for the MOCO precision in signal processing of airborne high-precision SAR and airborne repeat-pass interferometric SAR.

  20. Complementarity based a posteriori error estimates and their properties

    Czech Academy of Sciences Publication Activity Database

    Vejchodský, Tomáš

    2012-01-01

    Roč. 82, č. 10 (2012), s. 2033-2046 ISSN 0378-4754 R&D Projects: GA ČR(CZ) GA102/07/0496; GA AV ČR IAA100760702 Institutional research plan: CEZ:AV0Z10190503 Keywords : error majorant * a posteriori error estimates * method of hypercircle Subject RIV: BA - General Mathematics Impact factor: 0.836, year: 2012 http://www.sciencedirect.com/science/article/pii/S0378475411001509

  1. Error Detection and Error Classification: Failure Awareness in Data Transfer Scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Louisiana State University; Balman, Mehmet; Kosar, Tevfik

    2010-10-27

    Data transfer in distributed environments is prone to frequent failures resulting from back-end system-level problems, such as connectivity failures, which are technically untraceable by users. Error messages are not logged efficiently and are sometimes not relevant or useful from the user's point of view. Our study explores the possibility of an efficient error detection and reporting system for such environments. Prior knowledge about the environment and awareness of the actual reason behind a failure would enable higher-level planners to make better and more accurate decisions. It is necessary to have well-defined error detection and error reporting methods to increase the usability and serviceability of existing data transfer protocols and data management systems. We investigate the applicability of early error detection and error classification techniques and propose an error reporting framework and a failure-aware data transfer life cycle to improve the arrangement of data transfer operations and to enhance the decision making of data transfer schedulers.

  2. Death Certification Errors and the Effect on Mortality Statistics.

    Science.gov (United States)

    McGivern, Lauri; Shulman, Leanne; Carney, Jan K; Shapiro, Steven; Bundock, Elizabeth

    Errors in cause and manner of death on death certificates are common and affect families, mortality statistics, and public health research. The primary objective of this study was to characterize errors in the cause and manner of death on death certificates completed by non-Medical Examiners. A secondary objective was to determine the effects of errors on national mortality statistics. We retrospectively compared 601 death certificates completed between July 1, 2015, and January 31, 2016, from the Vermont Electronic Death Registration System with clinical summaries from medical records. Medical Examiners, blinded to the original certificates, reviewed the summaries, generated mock certificates, and compared the mock certificates with the originals. They then graded errors using a scale from 1 to 4 (higher numbers indicated increased impact on interpretation of the cause) to determine the prevalence of minor and major errors. They also compared International Classification of Diseases, 10th Revision (ICD-10) codes on original certificates with those on mock certificates. Of 601 original death certificates, 319 (53%) had errors; 305 (51%) had major errors; and 59 (10%) had minor errors. We found no significant differences by certifier type (physician vs nonphysician). We did find significant differences in major errors by place of death, with consequences for national mortality statistics. Surveillance and certifier education must expand beyond local and state efforts. Simplifying and standardizing the underlying literal text for cause of death may improve accuracy, decrease coding errors, and improve national mortality statistics.

  3. Research on Measurement Accuracy of Laser Tracking System Based on Spherical Mirror with Rotation Errors of Gimbal Mount Axes

    Science.gov (United States)

    Shi, Zhaoyao; Song, Huixu; Chen, Hongfang; Sun, Yanqiang

    2018-02-01

    This paper presents a novel experimental approach for confirming that the spherical mirror of a laser tracking system can reduce the influence of rotation errors of the gimbal mount axes on measurement accuracy. By simplifying the optical system model of a laser tracking system based on a spherical mirror, we can easily express the laser ranging measurement error caused by rotation errors of the gimbal mount axes in terms of the positions of the spherical mirror, the biconvex lens, the cat's eye reflector, and the measuring beam. The motions of the polarization beam splitter and the biconvex lens along the optical axis and perpendicular to it are driven by error motions of the gimbal mount axes. To simplify the experimental process, the motion of the biconvex lens is substituted by the motion of the spherical mirror according to the principle of relative motion. The laser ranging measurement error caused by the rotation errors of the gimbal mount axes was recorded in the readings of a laser interferometer. The experimental results showed that the laser ranging measurement error caused by rotation errors was less than 0.1 μm when radial and axial error motions were within ±10 μm. The experimental method simplified the experimental procedure, and the spherical mirror reduced the influence of rotation errors of the gimbal mount axes on the measurement accuracy of the laser tracking system.

  4. Ethnomathematics study: uncovering units of length, area, and volume in Kampung Naga Society

    Science.gov (United States)

    Septianawati, T.; Turmudi; Puspita, E.

    2017-02-01

    Mathematics has long been considered neutral and unconnected with culture. This can be seen in school mathematics learning in Indonesia, which adopts much from foreign (Western) approaches that are considered more advanced. In fact, Indonesia is a country rich in cultural diversity, and within its cultural activities lie mathematical ideas that are important for mathematics learning. A study that examines mathematical ideas or practices in a variety of cultural activities is known as ethnomathematics. In Indonesia, some ethnic groups maintain their ancestral traditions; one of them is Kampung Naga. Therefore, this study was conducted in Kampung Naga. It aims to uncover the units of length, area, and volume used by Kampung Naga society. The study used a qualitative approach and ethnographic methods. Data collection was done through the principles of ethnography, such as observation, interviews, documentation, and field notes. The results of this study are the units of length, area, and volume used by Kampung Naga society and their conversion into standard units. This research is expected to inform the public that mathematics has a relationship with culture, and to serve as a recommendation for the mathematics curriculum in Indonesia.

  5. Generalized Gaussian Error Calculus

    CERN Document Server

    Grabe, Michael

    2010-01-01

    For the first time in 200 years, Generalized Gaussian Error Calculus addresses a rigorous, complete and self-consistent revision of the Gaussian error calculus. Since experimentalists realized that measurements in general are burdened by unknown systematic errors, the classical, widely used evaluation procedures scrutinizing the consequences of random errors alone have turned out to be obsolete. As a matter of course, the error calculus to be, treating random and unknown systematic errors side by side, should ensure the consistency and traceability of physical units, physical constants and physical quantities at large. The generalized Gaussian error calculus considers unknown systematic errors to spawn biased estimators. Beyond that, random errors are required to conform to the idea of what the author calls well-defined measuring conditions. The approach features the properties of a building kit: any overall uncertainty turns out to be the sum of a contribution due to random errors, to be taken from a confidence inter...

  6. Families, nurses and organisations contributing factors to medication administration error in paediatrics: a literature review

    Directory of Open Access Journals (Sweden)

    Albara Alomari

    2015-05-01

    Full text available: Background: Medication error is the most common adverse event for hospitalised children and can lead to significant harm. Despite decades of research and the implementation of a number of initiatives, error rates continue to rise, particularly those associated with administration. Objectives: The objective of this literature review is to explore the factors involving nurses, families and healthcare systems that impact on medication administration errors in paediatric patients. Design: A review was undertaken of studies that reported on factors that contribute to a rise or fall in medication administration errors, from family, nurse and organisational perspectives. The following databases were searched: Medline, Embase, CINAHL and the Cochrane Library. The title, abstract and full article were reviewed for relevance. Articles were excluded if they were not research studies, if they related to medications rather than medication administration errors, or if they referred to medical errors rather than medication errors. Results: A total of 15 studies met the inclusion criteria. The factors contributing to medication administration errors are communication failure between parents and healthcare professionals, nurse workload, failure to adhere to policy and guidelines, interruptions, inexperience and insufficient nurse education from organisations. Strategies reported to reduce errors were double-checking by two nurses, implementing educational sessions, and use of computerised prescribing and barcode administration systems. Yet despite such interventions, errors persist. The review highlighted that families have a central role in caring for the child and are therefore key to the administration process, but have largely been ignored in research studies relating to medication administration. Conclusions: While there is a consensus about the factors that contribute to errors, sustainable and effective solutions remain elusive.
To date, families have not

  7. Medication errors of nurses and factors in refusal to report medication errors among nurses in a teaching medical center of iran in 2012.

    Science.gov (United States)

    Mostafaei, Davoud; Barati Marnani, Ahmad; Mosavi Esfahani, Haleh; Estebsari, Fatemeh; Shahzaidi, Shiva; Jamshidi, Ensiyeh; Aghamiri, Seyed Samad

    2014-10-01

    About one third of unwanted reported medication consequences are due to medication errors, resulting in one-fifth of hospital injuries. The aim of this study was to determine nurses' formal and informal medication errors and the importance of factors behind refusal to report medication errors among nurses. This cross-sectional study was done on the nursing staff of Shohada Tajrish Hospital, Tehran, Iran in 2012. The data were gathered through a questionnaire developed by the researchers. The questionnaire's face and content validity were confirmed by experts, and test-retest was used to measure its reliability. The data were analyzed by descriptive statistics using SPSS. The most important factors in refusal to report medication errors were, respectively: lack of a medication error recording and reporting system in the hospital (3.3%), non-significant error reporting to hospital authorities and lack of appropriate feedback (3.1%), and lack of a clear definition for a medication error (3%). There was both formal and informal reporting of medication errors in this study. Factors pertaining to management in hospitals, as well as fear of the consequences of reporting, are two broad fields among the factors that make nurses not report their medication errors. In this regard, providing enough education to nurses, boosting job security for nurses, management support, and revising related processes and definitions are some factors that can help decrease medication errors and increase their reporting when they occur.

  8. Using Text Mining to Uncover Students' Technology-Related Problems in Live Video Streaming

    Science.gov (United States)

    Abdous, M'hammed; He, Wu

    2011-01-01

    Because of their capacity to sift through large amounts of data, text mining and data mining are enabling higher education institutions to reveal valuable patterns in students' learning behaviours without having to resort to traditional survey methods. In an effort to uncover live video streaming (LVS) students' technology-related problems and to…

  9. A framework to assess diagnosis error probabilities in the advanced MCR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Kim, Jong Hyun [Chosun University, Gwangju (Korea, Republic of); Jang, Inseok; Park, Jinkyun [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    The Institute of Nuclear Power Operations (INPO)'s operating experience database revealed that about 48% of the total events in world NPPs over two years (2010-2011) happened due to human errors. The purpose of human reliability analysis (HRA) methods is to evaluate the potential for, and mechanisms of, human errors that may affect plant safety. Accordingly, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), simplified plant analysis risk human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM), and so on. Many researchers have asserted that procedures, alarms, and displays are critical factors affecting operators' generic activities, especially diagnosis activities. None of these HRA methods was explicitly designed to deal with digital systems, and SCHEME (Soft Control Human error Evaluation MEthod) considers only the probability of soft control execution error in the advanced MCR. The necessity of developing HRA methods for various conditions of NPPs has thus been raised. In this research, a framework to estimate diagnosis error probabilities in the advanced MCR is suggested. The assessment framework consists of three steps. The first step is to investigate diagnosis errors and calculate their probabilities. The second step is to quantitatively estimate PSFs' weightings in the advanced MCR. The third step is to suggest an updated TRC model to assess the nominal diagnosis error probabilities. Additionally, the proposed framework was applied using full-scope simulation. Experiments conducted in a domestic full-scope simulator and in HAMMLAB were used as data sources. In total, eighteen tasks were analyzed and twenty-three crews participated.
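    As a hedged illustration of how HRA methods weight performance shaping factors, here is a sketch of the SPAR-H-style adjustment of a nominal human error probability (HEP). The nominal HEP and multipliers below are illustrative, and the paper's own framework and weightings may differ.

    ```python
    def spar_h_hep(nominal_hep, psf_multipliers):
        """SPAR-H-style human error probability: the nominal HEP is scaled
        by the product of performance shaping factor (PSF) multipliers,
        with an adjustment that keeps the result below 1 when the product
        of multipliers is large."""
        composite = 1.0
        for m in psf_multipliers:
            composite *= m
        return (nominal_hep * composite) / (nominal_hep * (composite - 1.0) + 1.0)

    # Hypothetical diagnosis task: poor procedures (x5) and high stress (x2)
    hep = spar_h_hep(0.01, [5.0, 2.0])  # ~0.092 rather than a naive 0.1
    ```

    The adjustment in the denominator matters only when the PSF product is large; with all multipliers equal to 1 the nominal HEP is returned unchanged.
    
    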

  10. Relationship Between Technical Errors and Decision-Making Skills in the Junior Resident.

    Science.gov (United States)

    Nathwani, Jay N; Fiers, Rebekah M; Ray, Rebecca D; Witt, Anna K; Law, Katherine E; DiMarco, Shannon M; Pugh, Carla M

    The purpose of this study is to coevaluate resident technical errors and decision-making capabilities during placement of a subclavian central venous catheter (CVC). We hypothesize that there would be significant correlations between scenario-based decision-making skills and technical proficiency in central line insertion. We also predict residents would face problems in anticipating common difficulties and generating solutions associated with line placement. Participants were asked to insert a subclavian central line on a simulator. After completion, residents were presented with a real-life patient photograph depicting CVC placement and asked to anticipate difficulties and generate solutions. Error rates were analyzed using chi-square tests and a 5% expected error rate. Correlations were sought by comparing technical errors and scenario-based decision-making skills. This study was performed at 7 tertiary care centers. Study participants (N = 46) largely consisted of first-year research residents who could be followed longitudinally; second-year research and clinical residents were not excluded. In total, 6 checklist errors were committed more often than anticipated. Residents committed an average of 1.9 errors, significantly more than the at most 1 error per person expected (t(44) = 3.82). The number of technical errors committed correlated negatively with the total number of commonly identified difficulties and generated solutions (r(33) = -0.429, p = 0.021 and r(33) = -0.383, p = 0.044, respectively). Almost half of the surgical residents committed multiple errors while performing subclavian CVC placement. The correlation between technical errors and decision-making skills suggests a critical need to train residents in both technique and error management. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  11. A Hybrid Unequal Error Protection / Unequal Error Resilience ...

    African Journals Online (AJOL)

    The quality layers are then assigned an Unequal Error Resilience to synchronization loss by unequally allocating the number of headers available for synchronization to them. Following that Unequal Error Protection against channel noise is provided to the layers by the use of Rate Compatible Punctured Convolutional ...

  12. Engagement in Learning after Errors at Work: Enabling Conditions and Types of Engagement

    Science.gov (United States)

    Bauer, Johannes; Mulder, Regina H.

    2013-01-01

    This article addresses two research questions concerning nurses' engagement in social learning activities after errors at work. Firstly, we investigated how this engagement relates to nurses' interpretations of the error situation and perceptions of a safe team climate. The results indicate that the individual estimation of an error as relevant to…

  13. Theoretical explanations and practices regarding the distinction between the concepts: judicial error, error of law and fundamental vice in the legislation of the Republic of Moldova

    Directory of Open Access Journals (Sweden)

    Vasilisa Muntean

    2017-10-01

    Full Text Available In this research, a doctrinal and legal analysis of the concept of legal error is carried out. The author provides a self-defined definition of the concept addressed and highlights the main causes and conditions for the occurrence of judicial errors. At present, in the specialized legal doctrine of the Republic of Moldova, the problem of defining judicial error has been little explored. In this respect, this article is a scientific approach aimed at elucidating the theoretical and normative deficiencies and errors that occur in the area of reparation of the prejudice caused by judicial errors. In order to achieve our goal, we aim to create a core of ideas and referral mechanisms that ensure a certain interpretative and decisional homogeneity in the doctrinal and legal characterization of the phrase "judicial error".

  14. Measuring Articulatory Error Consistency in Children with Developmental Apraxia of Speech

    Science.gov (United States)

    Betz, Stacy K.; Stoel-Gammon, Carol

    2005-01-01

    Error inconsistency is often cited as a characteristic of children with speech disorders, particularly developmental apraxia of speech (DAS); however, few researchers operationally define error inconsistency and the definitions that do exist are not standardized across studies. This study proposes three formulas for measuring various aspects of…

  15. Learning from prescribing errors

    OpenAIRE

    Dean, B

    2002-01-01

    

 The importance of learning from medical error has recently received increasing emphasis. This paper focuses on prescribing errors and argues that, while learning from prescribing errors is a laudable goal, there are currently barriers that can prevent this occurring. Learning from errors can take place on an individual level, at a team level, and across an organisation. Barriers to learning from prescribing errors include the non-discovery of many prescribing errors, lack of feedback to th...

  16. Grammar Errors in the Writing of Iraqi English Language Learners

    Directory of Open Access Journals (Sweden)

    Yasir Bdaiwi Jasim Al-Shujairi

    2017-10-01

    Full Text Available Several studies have been conducted to investigate the grammatical errors of Iraqi postgraduates and undergraduates in their academic writing. However, few studies have focused on the writing challenges that Iraqi pre-university students face. This research aims at examining the written discourse of Iraqi high school students and the common grammatical errors they make in their writing. The study had a mixed methods design. Through the convenience sampling method, 112 compositions were collected from Iraqi pre-university students. For the purpose of triangulation, an interview was conducted. The data were analyzed using Corder's (1967) error analysis model and James' (1998) framework of grammatical errors. Furthermore, Brown's (2000) taxonomy was adopted to classify the types of errors. The results showed that Iraqi high school students have serious problems with the usage of verb tenses, articles, and prepositions. Moreover, the most frequent types of errors were Omission and Addition. Furthermore, it was found that intralanguage was the dominant source of errors. These findings may enlighten Iraqi students on the importance of correct grammar use for writing efficacy.
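
    The frequency counts behind findings such as "Omission and Addition were the most frequent error types" can be reproduced with a simple tally; the coded error list below is invented for illustration.

```python
# Hedged sketch: tallying coded grammatical errors by type, in the
# spirit of the error-analysis frequency counts reported above.
# The coded list is hypothetical, not the study's data.
from collections import Counter

coded_errors = ["omission", "addition", "omission", "misformation",
                "omission", "addition", "misordering"]
freq = Counter(coded_errors)
for err_type, count in freq.most_common():
    print(err_type, count)
```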

  17. Diagnostic Error in Correctional Mental Health: Prevalence, Causes, and Consequences.

    Science.gov (United States)

    Martin, Michael S; Hynes, Katie; Hatcher, Simon; Colman, Ian

    2016-04-01

    While they have important implications for inmates and resourcing of correctional institutions, diagnostic errors are rarely discussed in correctional mental health research. This review seeks to estimate the prevalence of diagnostic errors in prisons and jails and explores potential causes and consequences. Diagnostic errors are defined as discrepancies in an inmate's diagnostic status depending on who is responsible for conducting the assessment and/or the methods used. It is estimated that at least 10% to 15% of all inmates may be incorrectly classified in terms of the presence or absence of a mental illness. Inmate characteristics, relationships with staff, and cognitive errors stemming from the use of heuristics when faced with time constraints are discussed as possible sources of error. A policy example of screening for mental illness at intake to prison is used to illustrate when the risk of diagnostic error might be increased and to explore strategies to mitigate this risk. © The Author(s) 2016.

  18. A Posteriori Error Estimates Including Algebraic Error and Stopping Criteria for Iterative Solvers

    Czech Academy of Sciences Publication Activity Database

    Jiránek, P.; Strakoš, Zdeněk; Vohralík, M.

    2010-01-01

    Roč. 32, č. 3 (2010), s. 1567-1590 ISSN 1064-8275 R&D Projects: GA AV ČR IAA100300802 Grant - others:GA ČR(CZ) GP201/09/P464 Institutional research plan: CEZ:AV0Z10300504 Keywords : second-order elliptic partial differential equation * finite volume method * a posteriori error estimates * iterative methods for linear algebraic systems * conjugate gradient method * stopping criteria Subject RIV: BA - General Mathematics Impact factor: 3.016, year: 2010

  19. Physician assistants and the disclosure of medical error.

    Science.gov (United States)

    Brock, Douglas M; Quella, Alicia; Lipira, Lauren; Lu, Dave W; Gallagher, Thomas H

    2014-06-01

    Evolving state law, professional societies, and national guidelines, including those of the American Medical Association and Joint Commission, recommend that patients receive transparent communication when a medical error occurs. Recommendations for error disclosure typically consist of an explanation that an error has occurred, delivery of an explicit apology, an explanation of the facts around the event, its medical ramifications and how care will be managed, and a description of how similar errors will be prevented in the future. Although error disclosure is widely endorsed in the medical and nursing literature, there is little discussion of the unique role that the physician assistant (PA) might play in these interactions. PAs are trained in the medical model and technically practice under the supervision of a physician. They are also commonly integrated into interprofessional health care teams in surgical and urgent care settings. PA practice is characterized by widely varying degrees of provider autonomy. How PAs should collaborate with physicians in sensitive error disclosure conversations with patients is unclear. With the number of practicing PAs growing rapidly in nearly all domains of medicine, their role in the error disclosure process warrants exploration. The authors call for educational societies and accrediting agencies to support policy to establish guidelines for PA disclosure of error. They encourage medical and PA researchers to explore and report best-practice disclosure roles for PAs. Finally, they recommend that PA educational programs implement trainings in disclosure skills, and hospitals and supervising physicians provide and support training for practicing PAs.

  20. Medical Errors in Cyprus: The 2005 Eurobarometer Survey

    Directory of Open Access Journals (Sweden)

    Andreas Pavlakis

    2012-01-01

    Full Text Available Background: Medical errors have been highlighted in recent years by different agencies, scientific bodies and research teams alike. We sought to explore the issue of medical errors in Cyprus using data from the Eurobarometer survey. Methods: Data from the special Eurobarometer survey conducted in 2005 across all European Union countries (EU-25) and the acceding countries were obtained from the corresponding EU office. Statistical analyses including logistic regression models were performed using SPSS. Results: A total of 502 individuals participated in the Cyprus survey. About 90% reported that they had often or sometimes heard about medical errors, while 22% reported that they or a family member had suffered a serious medical error in a local hospital. In addition, 9.4% reported a serious problem from a prescribed medicine. We also found statistically significant differences across age groups and gender and between rural and urban residents. Finally, using multivariable-adjusted logistic regression models, we found that residents of rural areas were more likely to have suffered a serious medical error in a local hospital or from a prescribed medicine. Conclusion: Our study shows that the vast majority of residents in Cyprus, in parallel with other Europeans, worry about medical errors, and a significant percentage report having suffered a serious medical error at a local hospital or from a prescribed medicine. The results of our study could help the medical community in Cyprus and society at large to enhance vigilance with respect to medical errors in order to improve medical care.
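
    A minimal sketch of the kind of rural-versus-urban effect estimate reported above, using a crude odds ratio from a 2x2 table; the counts are invented, and the survey's actual analyses were multivariable-adjusted logistic regressions run in SPSS.

```python
# Hedged sketch: crude odds ratio for reporting a serious medical error,
# rural vs. urban residents. All counts are hypothetical.
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical table: 40/110 rural reported/not, 70/282 urban reported/not
or_rural = odds_ratio(40, 110, 70, 282)
print(round(or_rural, 2))
```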

  1. 76 FR 4290 - Uncovered Innerspring Units From the People's Republic of China: Final Results of First...

    Science.gov (United States)

    2011-01-25

    ... Avenue, NW., Washington, DC 20230; telephone: (202) 482-1655. Case History With the issuance of the... material and then glued together in a linear fashion. Uncovered innersprings are classified under...

  2. Random and Systematic Errors Share in Total Error of Probes for CNC Machine Tools

    Directory of Open Access Journals (Sweden)

    Adam Wozniak

    2018-03-01

    Full Text Available Probes for CNC machine tools, like every measurement device, have accuracy limited by random errors and by systematic errors. Random errors of these probes are described by a parameter called unidirectional repeatability. Manufacturers of probes for CNC machine tools usually specify only this parameter, while parameters describing systematic errors of the probes, such as pre-travel variation or triggering radius variation, are rarely used. Systematic errors of the probes, linked to the differences in pre-travel values for different measurement directions, can be corrected or compensated, but this is not a widely used procedure. In this paper, the shares of systematic errors and random errors in the total error of exemplary probes are determined. In the case of simple, kinematic probes, systematic errors are much greater than random errors, so compensation would significantly reduce the probing error. Moreover, it is shown that in the case of kinematic probes the commonly specified unidirectional repeatability is significantly better than 2D performance. However, in the case of the more precise strain-gauge probe, systematic errors are of the same order as random errors, which means that error correction or compensation would not yield any significant benefits in this case.
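
    The paper's split between correctable systematic error (direction-dependent mean pre-travel) and random error (repeatability around that mean) can be sketched as follows; the pre-travel readings are hypothetical, not the paper's measurements.

```python
# Hedged sketch: decompose probe triggering error into a systematic part
# (mean pre-travel per approach direction, correctable) and a random part
# (spread within a direction, not correctable). Readings are invented,
# in micrometres.
from statistics import mean, pstdev

pretravel = {  # direction -> repeated pre-travel readings (um)
    "0deg":   [12.1, 12.3, 11.9, 12.2],
    "90deg":  [15.4, 15.6, 15.2, 15.5],
    "180deg": [9.8, 10.0, 9.7, 9.9],
}
systematic = {d: mean(v) for d, v in pretravel.items()}
random_part = {d: pstdev(v) for d, v in pretravel.items()}
# Systematic share: variation of means across directions;
# random share: repeatability within a direction.
spread_of_means = max(systematic.values()) - min(systematic.values())
print(round(spread_of_means, 2), round(max(random_part.values()), 2))
```

    For this kinematic-probe-like example, the direction-to-direction spread dwarfs the within-direction repeatability, which is the regime where compensation pays off.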

  3. LEARNING FROM MISTAKES Error Analysis in the English Speech of Indonesian Tertiary Students

    Directory of Open Access Journals (Sweden)

    Imelda Gozali

    2017-12-01

    Full Text Available This study is part of a series of Classroom Action Research projects conducted with the aim of improving the English speech of students at one of the tertiary institutes in Indonesia. After some years of teaching English conversation, the writer noted that students made various types of errors in their speech, which can be classified generally as morphological, phonological, and lexical. While some of the errors are still generally acceptable, others elicit laughter or inhibit comprehension altogether. Therefore, the writer was keen to analyze the more common errors made by the students, so as to be able to compile teaching material that could be utilized to address those errors more effectively in future classes. This research used Error Analysis by Richards (1971) as the basis of classification. It was carried out in five classes with a total of 80 students over a period of one semester (14 weeks). The results showed that most of the errors were phonological (errors in pronunciation), while others were morphological or grammatical in nature. This prompted the writer to design simple Phonics lessons for future classes.

  4. Evaluation of rotational set-up errors in patients with thoracic neoplasms

    International Nuclear Information System (INIS)

    Wang Yanyang; Fu Xiaolong; Xia Bing; Fan Min; Yang Huanjun; Ren Jun; Xu Zhiyong; Jiang Guoliang

    2010-01-01

    Objective: To assess the rotational set-up errors in patients with thoracic neoplasms. Methods: 224 kilovoltage cone-beam computed tomography (KVCBCT) scans from 20 thoracic tumor patients were evaluated retrospectively. All these patients were involved in the research 'Evaluation of the residual set-up error for online kilovoltage cone-beam CT guided thoracic tumor radiation'. Rotational set-up errors, including pitch, roll and yaw, were calculated by aligning the KVCBCT with the planning CT using the semi-automatic alignment method. Results: The average rotational set-up errors were -0.28° ± 1.52°, 0.21° ± 0.91° and 0.27° ± 0.78° around the left-right, superior-inferior and anterior-posterior axes, respectively. The maximal rotational errors of pitch, roll and yaw were 3.5°, 2.7° and 2.2°, respectively. After correction for translational set-up errors, no statistically significant changes in rotational error were observed. Conclusions: The rotational set-up errors in patients with thoracic neoplasms were all small in magnitude. Rotational errors may not change after correction for translational set-up errors alone, which should be evaluated in a larger sample in the future. (authors)

  5. Uncovering the benefits of participatory research: implications of a realist review for health research and practice.

    Science.gov (United States)

    Jagosh, Justin; Macaulay, Ann C; Pluye, Pierre; Salsberg, Jon; Bush, Paula L; Henderson, Jim; Sirett, Erin; Wong, Geoff; Cargo, Margaret; Herbert, Carol P; Seifer, Sarena D; Green, Lawrence W; Greenhalgh, Trisha

    2012-06-01

    Participatory research (PR) is the co-construction of research through partnerships between researchers and people affected by and/or responsible for action on the issues under study. Evaluating the benefits of PR is challenging for a number of reasons: the research topics, methods, and study designs are heterogeneous; the extent of collaborative involvement may vary over the duration of a project and from one project to the next; and partnership activities may generate a complex array of both short- and long-term outcomes. Our review team consisted of a collaboration among researchers and decision makers in public health, research funding, ethics review, and community-engaged scholarship. We identified, selected, and appraised a large-variety sample of primary studies describing PR partnerships, and in each stage, two team members independently reviewed and coded the literature. We used key realist review concepts (middle-range theory, demi-regularity, and context-mechanism-outcome configurations [CMO]) to analyze and synthesize the data, using the PR partnership as the main unit of analysis. From 7,167 abstracts and 591 full-text papers, we distilled for synthesis a final sample of twenty-three PR partnerships described in 276 publications. The link between process and outcome in these partnerships was best explained using the middle-range theory of partnership synergy, which demonstrates how PR can (1) ensure culturally and logistically appropriate research, (2) enhance recruitment capacity, (3) generate professional capacity and competence in stakeholder groups, (4) result in productive conflicts followed by useful negotiation, (5) increase the quality of outputs and outcomes over time, (6) increase the sustainability of project goals beyond funded time frames and during gaps in external funding, and (7) create system changes and new unanticipated projects and activities. Negative examples illustrated why these outcomes were not a guaranteed product of PR

  6. ERM model analysis for adaptation to hydrological model errors

    Science.gov (United States)

    Baymani-Nezhad, M.; Han, D.

    2018-05-01

    Hydrological conditions change continuously, and these phenomena introduce errors into flood forecasting models that lead to unrealistic results. Therefore, to overcome these difficulties, a concept called model updating has been proposed in hydrological studies. Real-time model updating is one of the challenging processes in the hydrological sciences and has not been entirely solved, due to lack of knowledge about the future state of the catchment under study. Basically, in the flood forecasting process, errors propagated from the rainfall-runoff model are considered the main source of uncertainty in the forecasting model. Hence, to manage these errors, several methods have been proposed by researchers to update rainfall-runoff models, such as parameter updating, model state updating, and correction of input data. The current study focuses on the ability of rainfall-runoff model parameters to cope with three types of errors, timing, shape and volume, which are the common errors in hydrological modelling. The new lumped model, the ERM model, has been selected for this study, and its parameters are evaluated for use in model updating to cope with the stated errors. Investigation of ten events shows that the ERM model parameters can be updated to cope with the errors without the need to recalibrate the model.
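
    One of the updating ideas mentioned above, correcting a volume error, can be sketched as a simple output rescaling; the hydrographs are invented, and the ERM approach described in the abstract updates model parameters rather than model output.

```python
# Hedged sketch: volume-error correction by rescaling a simulated
# hydrograph so its runoff volume matches the observed volume.
# Both hydrographs are hypothetical (m^3/s at fixed time steps).
simulated = [2.0, 5.0, 9.0, 6.0, 3.0]
observed  = [2.5, 6.0, 11.0, 7.5, 3.5]

scale = sum(observed) / sum(simulated)   # volume correction factor
updated = [q * scale for q in simulated]
print(round(scale, 3), round(sum(updated), 1))
```

    Timing and shape errors need more than a scalar factor (e.g., shifting or reshaping the hydrograph), which is why parameter-level updating is of interest.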

  7. Experimental Evaluation of a Mixed Controller That Amplifies Spatial Errors and Reduces Timing Errors

    Directory of Open Access Journals (Sweden)

    Laura Marchal-Crespo

    2017-06-01

    Full Text Available Research on motor learning suggests that training with haptic guidance enhances learning of the timing components of motor tasks, whereas error amplification is better for learning the spatial components. We present a novel mixed guidance controller that combines haptic guidance and error amplification to simultaneously promote learning of the timing and spatial components of complex motor tasks. The controller is realized using a force field around the desired position. This force field has a stable manifold tangential to the trajectory that guides subjects in velocity-related aspects, and an unstable manifold perpendicular to the trajectory, which amplifies the perpendicular (spatial) error. We also designed a controller that applies randomly varying, unpredictable disturbing forces to enhance the subjects’ active participation by pushing them away from their “comfort zone.” We conducted an experiment with thirty-two healthy subjects to evaluate the impact of four different training strategies on motor skill learning and self-reported motivation: (i) no haptics, (ii) mixed guidance, (iii) perpendicular error amplification and tangential haptic guidance provided in sequential order, and (iv) randomly varying disturbing forces. Subjects trained two motor tasks using ARMin IV, a robotic exoskeleton for upper limb rehabilitation: following circles with an ellipsoidal speed profile, and moving along a 3D line following a complex speed profile. Mixed guidance showed no detectable learning advantages over the other groups. The results suggest that the effectiveness of the training strategies depends on the subjects’ initial skill level. Mixed guidance seemed to benefit subjects who performed the circle task with smaller errors during baseline (i.e., initially more skilled subjects), while training with no haptics was more beneficial for subjects who made larger errors (i.e., less skilled subjects). Therefore, perhaps the high functional

  8. Partially covered versus uncovered self-expandable nitinol stents with anti-migration properties for the palliation of malignant distal biliary obstruction: A randomized controlled trial.

    Science.gov (United States)

    Yang, Min Jae; Kim, Jin Hong; Yoo, Byung Moo; Hwang, Jae Chul; Yoo, Jun Hwan; Lee, Ki Seong; Kang, Joon Koo; Kim, Soon Sun; Lim, Sun Gyo; Shin, Sung Jae; Cheong, Jae Youn; Lee, Kee Myung; Lee, Kwang Jae; Cho, Sung Won

    2015-01-01

    Covered self-expandable metal stents (SEMSs) are increasingly used as alternatives to uncovered SEMSs for the palliation of inoperable malignant distal biliary obstruction to counteract tumor ingrowth. We aimed to compare the outcomes of partially covered and uncovered SEMSs with identical mesh structures and anti-migration properties, such as low axial force and flared ends. One hundred and three patients who were diagnosed with inoperable malignant distal biliary obstruction between January 2006 and August 2013 were randomly assigned to either the partially covered (n = 51) or uncovered (n = 52) SEMS group. There were no significant differences in the cumulative stent patency, overall patient survival, stent dysfunction-free survival and overall adverse events, including pancreatitis and cholecystitis, between the two groups. Compared to the uncovered group, stent migration (5.9% vs. 0%, p = 0.118) and tumor overgrowth (7.8% vs. 1.9%, p = 0.205) were non-significantly more frequent in the partially covered group, whereas tumor ingrowth showed a significantly higher incidence in the uncovered group (5.9% vs. 19.2%, p = 0.041). Stent migration in the partially covered group occurred only in patients with short stenosis of the utmost distal bile duct (two in ampullary cancer, one in bile duct cancer), and did not occur in any patients with pancreatic cancer. For the palliation of malignant distal biliary obstruction, endoscopic placement of partially covered SEMSs with anti-migration designs and identical mesh structures to uncovered SEMSs failed to prolong cumulative stent patency or reduce stent migration.

  9. BAYES-HEP: Bayesian belief networks for estimation of human error probability

    International Nuclear Information System (INIS)

    Karthick, M.; Senthil Kumar, C.; Paul, Robert T.

    2017-01-01

    Human errors contribute a significant portion of risk in safety critical applications and methods for estimation of human error probability have been a topic of research for over a decade. The scarce data available on human errors and large uncertainty involved in the prediction of human error probabilities make the task difficult. This paper presents a Bayesian belief network (BBN) model for human error probability estimation in safety critical functions of a nuclear power plant. The developed model using BBN would help to estimate HEP with limited human intervention. A step-by-step illustration of the application of the method and subsequent evaluation is provided with a relevant case study and the model is expected to provide useful insights into risk assessment studies
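
    The core computation in a small belief network for HEP, marginalizing over parent nodes, can be sketched directly; all node names and conditional probability values below are invented for illustration and do not come from the paper.

```python
# Hedged sketch: exact inference in a toy two-parent belief network:
# P(error) = sum over stress, training of
#            P(stress) * P(training) * P(error | stress, training).
# All CPT entries are hypothetical; the paper's actual network is larger.
p_stress = {"high": 0.3, "low": 0.7}
p_training = {"good": 0.8, "poor": 0.2}
p_error_given = {  # (stress, training) -> P(error)
    ("high", "good"): 0.05, ("high", "poor"): 0.20,
    ("low", "good"): 0.01, ("low", "poor"): 0.08,
}
hep = sum(p_stress[s] * p_training[t] * p_error_given[(s, t)]
          for s in p_stress for t in p_training)
print(round(hep, 4))
```

    The appeal of the BBN formulation is that sparse evidence updates individual node distributions, after which the same marginalization yields a revised HEP.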

  10. Understanding error generation in fused deposition modeling

    Science.gov (United States)

    Bochmann, Lennart; Bayley, Cindy; Helu, Moneer; Transchel, Robert; Wegener, Konrad; Dornfeld, David

    2015-03-01

    Additive manufacturing offers completely new possibilities for the manufacturing of parts. The advantages of flexibility and convenience of additive manufacturing have had a significant impact on many industries, and optimizing part quality is crucial for expanding its utilization. This research aims to determine the sources of imprecision in fused deposition modeling (FDM). Process errors in terms of surface quality, accuracy and precision are identified and quantified, and an error-budget approach is used to characterize errors of the machine tool. It was determined that accuracy and precision in the y direction (0.08-0.30 mm) are generally greater than in the x direction (0.12-0.62 mm) and the z direction (0.21-0.57 mm). Furthermore, accuracy and precision tend to decrease at increasing axis positions. The results of this work can be used to identify possible process improvements in the design and control of FDM technology.
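
    An error-budget approach of the kind the paper applies typically combines independent error sources per axis by root-sum-of-squares; the source names and magnitudes below are assumptions, not the paper's measured values.

```python
# Hedged sketch: root-sum-of-squares error budget combining independent
# error sources for one axis of an FDM machine. Magnitudes (mm) are
# hypothetical, chosen only to illustrate the combination rule.
import math

sources_x = {  # error source -> magnitude (mm), assumed independent
    "positioning": 0.10,
    "filament_width_variation": 0.05,
    "thermal_shrinkage": 0.08,
}
total_x = math.sqrt(sum(v ** 2 for v in sources_x.values()))
print(round(total_x, 3))
```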

  11. Error monitoring and empathy: Explorations within a neurophysiological context.

    Science.gov (United States)

    Amiruddin, Azhani; Fueggle, Simone N; Nguyen, An T; Gignac, Gilles E; Clunies-Ross, Karen L; Fox, Allison M

    2017-06-01

    Past literature has proposed that empathy consists of two components: cognitive and affective empathy. Error monitoring mechanisms indexed by the error-related negativity (ERN) have been associated with empathy. Studies have found that a larger ERN is associated with higher levels of empathy. We aimed to expand upon previous work by investigating how error monitoring relates to the independent theoretical domains of cognitive and affective empathy. Study 1 (N = 24) explored the relationship between error monitoring mechanisms and subcomponents of empathy using the Questionnaire of Cognitive and Affective Empathy and found no relationship. Study 2 (N = 38) explored the relationship between the error monitoring mechanisms and overall empathy. Contrary to past findings, there was no evidence to support a relationship between error monitoring mechanisms and scores on empathy measures. A subsequent meta-analysis (Study 3, N = 125) summarizing the relationship across previously published studies together with the two studies reported in the current paper indicated that overall there was no significant association between ERN and empathy and that there was significant heterogeneity across studies. Future investigations exploring the potential variables that may moderate these relationships are discussed. © 2017 Society for Psychophysiological Research.

  12. Children's mathematics 4-15 learning from errors and misconceptions

    CERN Document Server

    Ryan, Julie

    2007-01-01

    Develops concepts for teachers to use in organizing their understanding and knowledge of children's mathematics. This book offers guidance for classroom teaching and concludes with theoretical accounts of learning and teaching. It transforms research on diagnostic errors into knowledge for teaching, teacher education and research on teaching.

  13. Uncorrected refractive errors.

    Science.gov (United States)

    Naidoo, Kovin S; Jaggernath, Jyoti

    2012-01-01

    Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high among children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars for addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  14. Uncorrected refractive errors

    Directory of Open Access Journals (Sweden)

    Kovin S Naidoo

    2012-01-01

    Full Text Available Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high among children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars for addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  15. Comparing different error conditions in filmdosemeter evaluation

    International Nuclear Information System (INIS)

    Roed, H.; Figel, M.

    2005-01-01

    Full text: In the evaluation of a film used as a personal dosemeter, it may be necessary to mark the dosemeters when possible error conditions are recognized. These are errors that might influence the ability to make a correct evaluation of the dose value, and include broken, contaminated or improperly handled dosemeters. In this project we have examined how two services (NIRH, GSF) from two different countries within the EU mark their dosemeters. The services differ greatly in size, customer composition and issuing period, but both use film as their primary dosemeter. The possible error conditions examined here include dosemeters that were contaminated, dosemeters exposed to moisture or light, and missing filters in the dosemeter badges, among others. The data were collected for the year 2003, in which NIRH evaluated approximately 50 thousand and GSF about one million filmdosemeters. For each error condition, the percentage of filmdosemeters affected is calculated, as well as its distribution among different employee categories, i.e. industry, medicine, research, veterinary and other. For some error conditions we see a common pattern, while for others there is a large discrepancy between the services. The differences and possible explanations are discussed. The results of the investigation may motivate further comparisons between the different monitoring services in Europe. (author)

  16. Community Mapping in Action: Uncovering Resources and Assets for Young Children and Their Families

    Science.gov (United States)

    Ordonez-Jasis, Rosario; Myck-Wayne, Janice

    2012-01-01

    Community mapping is a promising practice that can help early intervention/early childhood special education (EI/ECSE) professionals uncover the depth and diversity of community needs, resources, and learning opportunities in the neighborhoods surrounding their schools. Community mapping is an inquiry-based method that situates learning in the…

  17. Biallelic mutations in the 3' exonuclease TOE1 cause pontocerebellar hypoplasia and uncover a role in snRNA processing

    DEFF Research Database (Denmark)

    Lardelli, Rea M.; Schaffer, Ashleigh E.; Eggens, Veerle R C

    2017-01-01

    Pontocerebellar hypoplasia type 7 (PCH7) is a unique recessive syndrome characterized by neurodegeneration and ambiguous genitalia. We studied 12 human families with PCH7, uncovering biallelic, loss-of-function mutations in TOE1, which encodes an unconventional deadenylase. toe1-morphant zebrafish displayed midbrain and hindbrain degeneration... ... of TOE1 accumulated 3'-end-extended pre-snRNAs, and the immunoisolated TOE1 complex was sufficient for 3'-end maturation of snRNAs. Our findings identify the cause of a neurodegenerative syndrome linked to snRNA maturation and uncover a key factor involved in the processing of snRNA 3' ends.

  18. Detected-jump-error-correcting quantum codes, quantum error designs, and quantum computation

    International Nuclear Information System (INIS)

    Alber, G.; Mussinger, M.; Beth, Th.; Charnes, Ch.; Delgado, A.; Grassl, M.

    2003-01-01

    The recently introduced detected-jump-correcting quantum codes are capable of stabilizing qubit systems against spontaneous decay processes arising from couplings to statistically independent reservoirs. These embedded quantum codes exploit classical information about which qubit has emitted spontaneously and correspond to an active error-correcting code embedded in a passive error-correcting code. The construction of a family of one-detected-jump-error-correcting quantum codes is shown and the optimal redundancy, encoding, and recovery as well as general properties of detected-jump-error-correcting quantum codes are discussed. By the use of design theory, multiple-jump-error-correcting quantum codes can be constructed. The performance of one-jump-error-correcting quantum codes under nonideal conditions is studied numerically by simulating a quantum memory and Grover's algorithm

  19. Medication Errors: New EU Good Practice Guide on Risk Minimisation and Error Prevention.

    Science.gov (United States)

    Goedecke, Thomas; Ord, Kathryn; Newbould, Victoria; Brosch, Sabine; Arlett, Peter

    2016-06-01

    A medication error is an unintended failure in the drug treatment process that leads to, or has the potential to lead to, harm to the patient. Reducing the risk of medication errors is a shared responsibility between patients, healthcare professionals, regulators and the pharmaceutical industry at all levels of healthcare delivery. In 2015, the EU regulatory network released a two-part good practice guide on medication errors to support both the pharmaceutical industry and regulators in the implementation of the changes introduced with the EU pharmacovigilance legislation. These changes included a modification of the 'adverse reaction' definition to include events associated with medication errors, and the requirement for national competent authorities responsible for pharmacovigilance in EU Member States to collaborate and exchange information on medication errors resulting in harm with national patient safety organisations. To facilitate reporting and learning from medication errors, a clear distinction has been made in the guidance between medication errors resulting in adverse reactions, medication errors without harm, intercepted medication errors and potential errors. This distinction is supported by an enhanced MedDRA(®) terminology that allows for coding all stages of the medication use process where the error occurred in addition to any clinical consequences. To better understand the causes and contributing factors, individual case safety reports involving an error should be followed-up with the primary reporter to gather information relevant for the conduct of root cause analysis where this may be appropriate. Such reports should also be summarised in periodic safety update reports and addressed in risk management plans. Any risk minimisation and prevention strategy for medication errors should consider all stages of a medicinal product's life-cycle, particularly the main sources and types of medication errors during product development. This article

  20. The 95% confidence intervals of error rates and discriminant coefficients

    Directory of Open Access Journals (Sweden)

    Shuichi Shinmura

    2015-02-01

    Full Text Available Fisher proposed a linear discriminant function (Fisher's LDF). From 1971, we analysed electrocardiogram (ECG) data in order to develop a diagnostic logic for distinguishing normal and abnormal symptoms using Fisher's LDF and a quadratic discriminant function (QDF). Our four-year research effort was inferior to the decision-tree logic developed by the medical doctor. After this experience, we discriminated many data sets and found four problems with discriminant analysis. A revised Optimal LDF by Integer Programming (Revised IP-OLDF), based on the minimum number of misclassifications (minimum NM) criterion, resolves three of these problems entirely [13, 18]. In this research, we discuss the fourth problem of discriminant analysis: there are no standard errors (SEs) for the error rate and the discriminant coefficients. We propose a k-fold cross-validation method. This method offers a model selection technique and 95% confidence intervals (C.I.) of error rates and discriminant coefficients.
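    The cross-validation idea in the abstract can be illustrated with a toy example. The sketch below is not the authors' Revised IP-OLDF: it evaluates a fixed, hypothetical 1-D threshold classifier with repeated 10-fold cross-validation and reads a 95% confidence interval for the error rate off the empirical distribution of fold error rates.

```python
import random
import statistics

# Minimal sketch (not the paper's method): estimate a 95% C.I. for a
# classifier's error rate from repeated k-fold cross-validation.
# The classifier is a fixed, hypothetical 1-D threshold rule.

def classify(x):
    return 1 if x > 0.5 else 0

def fold_error_rates(data, k, rng):
    """Error rate on each of k held-out folds; data is (x, label) pairs."""
    shuffled = data[:]
    rng.shuffle(shuffled)
    folds = [shuffled[i::k] for i in range(k)]
    return [sum(classify(x) != y for x, y in f) / len(f) for f in folds]

rng = random.Random(1)
# Hypothetical two-class sample with overlap around the threshold.
data = [(rng.gauss(0.7, 0.3), 1) for _ in range(100)]
data += [(rng.gauss(0.3, 0.3), 0) for _ in range(100)]

rates = []
for _ in range(20):                       # 20 repetitions of 10-fold CV
    rates += fold_error_rates(data, 10, rng)

rates.sort()
mean = statistics.mean(rates)
lo, hi = rates[int(0.025 * len(rates))], rates[int(0.975 * len(rates))]
print(f"error rate = {mean:.3f}, 95% C.I. = [{lo:.3f}, {hi:.3f}]")
```

Refitting the discriminant inside each fold, as the paper does, would replace the fixed rule with a per-fold training step; the interval construction stays the same.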

  1. Application of impedance spectroscopy to SOFC research

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, G.; Mason, T.O. [Northwestern Univ., Evanston, IL (United States); Pederson, L.R. [Pacific Northwest National Lab., Richland, WA (United States)

    1996-12-31

    With the resurgence of interest in solid oxide fuel cells and other solid-state electrochemical devices, techniques originally developed for characterizing aqueous systems are being adapted and applied to solid-state systems. One of these techniques, three-electrode impedance spectroscopy, is particularly powerful as it allows characterization of subcomponent and interfacial properties. Obtaining accurate impedance spectra, however, is difficult, as the reference electrode impedance is usually non-negligible and solid electrolytes typically have much lower conductance than aqueous solutions. Faidi et al. and Chechirlian et al. have both identified problems associated with low-conductivity media. Other sources of error are still being uncovered. Ford et al. identified resistive contacts with large time constants as a possibility, while Me et al. showed that the small contact capacitance of the reference electrode was at fault. Still others show that instrument limitations play a role. Using the voltage-divider concept, a simplified model is presented that demonstrates the interplay of these various factors, predicts the form of possible distortions, and offers means to minimize errors.
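    The voltage-divider concept mentioned at the end can be illustrated numerically. In the hedged sketch below, the apparent impedance is the true cell impedance scaled by the divider formed between a reference-electrode impedance and the instrument's finite input impedance; all component values are hypothetical, not taken from the paper.

```python
import math

# Toy voltage-divider model of three-electrode measurement distortion.
# All component values are hypothetical illustrations.

def z_capacitor(c_farad, freq_hz):
    """Impedance of an ideal capacitor."""
    return 1 / (1j * 2 * math.pi * freq_hz * c_farad)

def apparent_impedance(z_true, z_ref, z_in):
    """Apparent impedance when z_ref (reference electrode) and z_in
    (instrument input) divide the sensed voltage."""
    return z_true * z_in / (z_ref + z_in)

z_true = 1000 + 0j                     # true cell impedance: 1 kOhm
c_contact = 100e-9                     # small reference contact capacitance

for z_in in (1e9, 1e7):                # high vs. modest input impedance
    z_ref = 10e3 + z_capacitor(c_contact, 0.1)   # reference branch at 0.1 Hz
    z_app = apparent_impedance(z_true, z_ref, z_in)
    err = abs(z_app - z_true) / abs(z_true) * 100
    print(f"z_in = {z_in:.0e} Ohm: |Z_app| = {abs(z_app):6.1f} Ohm, error = {err:.1f}%")
```

The toy run shows the qualitative prediction of the model: distortion at low frequency stays small when the input impedance dwarfs the reference branch, and becomes severe when it does not.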

  2. Perceptual learning eases crowding by reducing recognition errors but not position errors.

    Science.gov (United States)

    Xiong, Ying-Zi; Yu, Cong; Zhang, Jun-Yun

    2015-08-01

    When an observer reports a letter flanked by additional letters in the visual periphery, the response errors (the crowding effect) may result from failure to recognize the target letter (recognition errors), from mislocating a correctly recognized target letter at a flanker location (target misplacement errors), or from reporting a flanker as the target letter (flanker substitution errors). Crowding can be reduced through perceptual learning. However, it is not known how perceptual learning operates to reduce crowding. In this study we trained observers with a partial-report task (Experiment 1), in which they reported the central target letter of a three-letter string presented in the visual periphery, or a whole-report task (Experiment 2), in which they reported all three letters in order. We then assessed the impact of training on recognition of both unflanked and flanked targets, with particular attention to how perceptual learning affected the types of errors. Our results show that training improved target recognition but not single-letter recognition, indicating that training indeed affected crowding. However, training did not reduce target misplacement errors or flanker substitution errors. This dissociation between target recognition and flanker substitution errors supports the view that flanker substitution may be more likely a by-product (due to response bias), rather than a cause, of crowding. Moreover, the dissociation is not consistent with hypothesized mechanisms of crowding that would predict reduced positional errors.

  3. Uncovering client retention antecedents in service organizations

    Directory of Open Access Journals (Sweden)

    Mari Jansen van Rensburg

    2014-01-01

    Full Text Available This paper develops a multi-dimensional model of retention to provide a more complete and integrated view of client retention and its determinants in service contexts. To uncover the antecedents of client retention, social and economic exchanges were reviewed under the fundamental ideas of Social Exchange Theory. Findings from a survey of senior South African advertising executives suggest that client retention is the result of evaluative as well as relational factors that can influence client responses. Despite contractual obligations, advertisers are willing to pay the costs and make the sacrifices of switching should their expectations be unmet. An important contribution of this study is the use of multi-item scales to measure retention. The model developed provides valuable insight for agencies on client retention management and the optimal allocation of resources for maximum customer equity. This model may also be applied to other service organisations to provide insight into client retention.

  4. Error suppression and error correction in adiabatic quantum computation: non-equilibrium dynamics

    International Nuclear Information System (INIS)

    Sarovar, Mohan; Young, Kevin C

    2013-01-01

    While adiabatic quantum computing (AQC) has some robustness to noise and decoherence, it is widely believed that encoding, error suppression and error correction will be required to scale AQC to large problem sizes. Previous works have established at least two different techniques for error suppression in AQC. In this paper we derive a model for describing the dynamics of encoded AQC and show that previous constructions for error suppression can be unified with this dynamical model. In addition, the model clarifies the mechanisms of error suppression and allows the identification of its weaknesses. In the second half of the paper, we utilize our description of non-equilibrium dynamics in encoded AQC to construct methods for error correction in AQC by cooling local degrees of freedom (qubits). While this is shown to be possible in principle, we also identify the key challenge to this approach: the requirement of high-weight Hamiltonians. Finally, we use our dynamical model to perform a simplified thermal stability analysis of concatenated-stabilizer-code encoded many-body systems for AQC or quantum memories. This work is a companion paper to ‘Error suppression and error correction in adiabatic quantum computation: techniques and challenges (2013 Phys. Rev. X 3 041013)’, which provides a quantum information perspective on the techniques and limitations of error suppression and correction in AQC. In this paper we couch the same results within a dynamical framework, which allows for a detailed analysis of the non-equilibrium dynamics of error suppression and correction in encoded AQC. (paper)

  5. Quantum Error Correction and Fault Tolerant Quantum Computing

    CERN Document Server

    Gaitan, Frank

    2008-01-01

    It was once widely believed that quantum computation would never become a reality. However, the discovery of quantum error correction and the proof of the accuracy threshold theorem nearly ten years ago gave rise to extensive development and research aimed at creating a working, scalable quantum computer. Over a decade has passed since this monumental accomplishment yet no book-length pedagogical presentation of this important theory exists. Quantum Error Correction and Fault Tolerant Quantum Computing offers the first full-length exposition on the realization of a theory once thought impo

  6. SIMULATION OF INERTIAL NAVIGATION SYSTEM ERRORS AT AERIAL PHOTOGRAPHY FROM UAV

    Directory of Open Access Journals (Sweden)

    R. Shults

    2017-05-01

    Full Text Available The problem of determining the accuracy of the UAV position using INS during aerial photography can be resolved in two different ways: modelling of measurement errors or in-field calibration of the INS. The paper presents the results of INS error research by mathematical modelling. The following steps were considered: development of an INS computer model; carrying out an INS simulation using reference data without errors; and estimation of the errors and their influence on the accuracy of maps created from UAV data. It must be remembered that the values of the orientation angles and the coordinates of the projection centre may change abruptly due to the influence of the atmosphere (different air density, wind, etc.). Therefore, the mathematical model of the INS was constructed taking into account the use of different models of wind gusts. Typical characteristics of micro-electromechanical (MEMS) INS and parameters of the standard atmosphere were used for the simulation. The simulation established the domination of systematic INS errors, which accumulate during photographing and require a compensation mechanism, especially for the orientation angles. MEMS INS have a high level of noise at the system input. Thanks to the developed model, we are able to investigate separately the impact of noise in the absence of systematic errors. The research found that over an observation interval of 5 seconds the impact of the random and systematic components is almost the same. The developed model of INS errors was implemented in the Matlab software environment and can easily be improved and extended with new blocks.
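    The abstract's observation that random and systematic error components are comparable over a 5-second interval can be reproduced with a toy 1-D integration: a constant gyro bias integrates linearly in time, while white noise integrates as a random walk. The bias and noise figures below are hypothetical MEMS-like values, not the paper's parameters.

```python
import random

# Toy 1-D illustration of MEMS gyro error accumulation. A constant bias
# (systematic error) integrates linearly; white noise integrates as a
# random walk. All parameter values are hypothetical.

def integrate_angle(bias_dps, noise_dps, dt, duration, seed=0):
    """Integrate rate-gyro output (deg/s) that is pure error:
    constant bias plus white noise. Returns the angle error (deg)."""
    rng = random.Random(seed)
    angle = 0.0
    for _ in range(int(duration / dt)):
        rate = bias_dps + rng.gauss(0.0, noise_dps)
        angle += rate * dt
    return angle

dt, duration = 0.01, 5.0          # 100 Hz samples over a 5 s interval
bias_only = integrate_angle(0.05, 0.0, dt, duration)    # 0.05 deg/s bias
noise_only = integrate_angle(0.0, 0.5, dt, duration)    # 0.5 deg/s noise

print(f"bias-only angle error after 5 s:  {bias_only:.3f} deg")
print(f"noise-only angle error after 5 s: {noise_only:.3f} deg")
```

With these (assumed) numbers the bias contributes 0.25 deg after 5 s, while the random walk contributes an error of comparable order; over longer intervals the linear bias term dominates, which is why the abstract calls for a compensation mechanism.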

  7. An advanced SEU tolerant latch based on error detection

    Science.gov (United States)

    Xu, Hui; Zhu, Jianwei; Lu, Xiaoping; Li, Jingzhao

    2018-05-01

    This paper proposes a latch that can mitigate SEUs via an error detection circuit. The error detection circuit is hardened by a C-element and a stacked PMOS. In the hold state, a particle striking the latch or the error detection circuit may cause a faulty logic state of the circuit. The error detection circuit can detect the upset node in the latch, and the faulty output will be corrected. An upset node in the error detection circuit itself can be corrected by the C-element. The power dissipation and propagation delay of the proposed latch are analyzed by HSPICE simulations. The proposed latch consumes about 77.5% less energy and has 33.1% less propagation delay than the triple modular redundancy (TMR) latch. Simulation results demonstrate that the proposed latch can mitigate SEUs effectively. Project supported by the National Natural Science Foundation of China (Nos. 61404001, 61306046), the Anhui Province University Natural Science Research Major Project (No. KJ2014ZD12), the Huainan Science and Technology Program (No. 2013A4011), and the National Natural Science Foundation of China (No. 61371025).

  8. Effects of learning climate and registered nurse staffing on medication errors.

    Science.gov (United States)

    Chang, YunKyung; Mark, Barbara

    2011-01-01

    Despite increasing recognition of the significance of learning from errors, little is known about how learning climate contributes to error reduction. The purpose of this study was to investigate whether learning climate moderates the relationship between error-producing conditions and medication errors. A cross-sectional descriptive study was done using data from 279 nursing units in 146 randomly selected hospitals in the United States. Error-producing conditions included work environment factors (work dynamics and nurse mix), team factors (communication with physicians and nurses' expertise), personal factors (nurses' education and experience), patient factors (age, health status, and previous hospitalization), and medication-related support services. Poisson models with random effects were used with the nursing unit as the unit of analysis. A significant negative relationship was found between learning climate and medication errors. It also moderated the relationship between nurse mix and medication errors: When learning climate was negative, having more registered nurses was associated with fewer medication errors. However, no relationship was found between nurse mix and medication errors at either positive or average levels of learning climate. Learning climate did not moderate the relationship between work dynamics and medication errors. The way nurse mix affects medication errors depends on the level of learning climate. Nursing units with fewer registered nurses and frequent medication errors should examine their learning climate. Future research should be focused on the role of learning climate as related to the relationships between nurse mix and medication errors.

  9. Team errors: definition and taxonomy

    International Nuclear Information System (INIS)

    Sasou, Kunihide; Reason, James

    1999-01-01

    In error analysis or error management, the focus is usually upon individuals who have made errors. In large complex systems, however, most people work in teams or groups. Considering this working environment, insufficient emphasis has been given to 'team errors'. This paper discusses the definition of team errors and their taxonomy. These notions are also applied to events that have occurred in the nuclear power industry, aviation industry and shipping industry. The paper also discusses the relations between team errors and Performance Shaping Factors (PSFs). As a result, the proposed definition and taxonomy are found to be useful in categorizing team errors. The analysis also reveals that deficiencies in communication, deficiencies in resource/task management, an excessive authority gradient, and excessive professional courtesy can cause team errors. Handling human errors as team errors provides an opportunity to reduce human errors.

  10. Potential Errors and Test Assessment in Software Product Line Engineering

    Directory of Open Access Journals (Sweden)

    Hartmut Lackner

    2015-04-01

    Full Text Available Software product lines (SPLs) are a method for the development of variant-rich software systems. Compared to non-variable systems, testing SPLs is extensive due to the increasing number of possible products. Different approaches exist for testing SPLs, but there is little research on assessing the quality of these tests by means of error detection capability. Such test assessment is based on error injection into a correct version of the system under test. To our knowledge, however, potential errors in SPL engineering have never been systematically identified before. This article presents an overview of existing paradigms for specifying software product lines and the errors that can occur during the respective specification processes. For the assessment of test quality, we apply mutation testing techniques to SPL engineering and implement the identified errors as mutation operators. This allows us to run existing tests against defective products for the purpose of test assessment. From the results, we draw conclusions about the error-proneness of the surveyed SPL design paradigms and how the quality of SPL tests can be improved.

  11. The Errors of Our Ways: Understanding Error Representations in Cerebellar-Dependent Motor Learning.

    Science.gov (United States)

    Popa, Laurentiu S; Streng, Martha L; Hewitt, Angela L; Ebner, Timothy J

    2016-04-01

    The cerebellum is essential for error-driven motor learning and is strongly implicated in detecting and correcting for motor errors. Therefore, elucidating how motor errors are represented in the cerebellum is essential in understanding cerebellar function, in general, and its role in motor learning, in particular. This review examines how motor errors are encoded in the cerebellar cortex in the context of a forward internal model that generates predictions about the upcoming movement and drives learning and adaptation. In this framework, sensory prediction errors, defined as the discrepancy between the predicted consequences of motor commands and the sensory feedback, are crucial for both on-line movement control and motor learning. While many studies support the dominant view that motor errors are encoded in the complex spike discharge of Purkinje cells, others have failed to relate complex spike activity with errors. Given these limitations, we review recent findings in the monkey showing that complex spike modulation is not necessarily required for motor learning or for simple spike adaptation. Also, new results demonstrate that the simple spike discharge provides continuous error signals that both lead and lag the actual movements in time, suggesting errors are encoded as both an internal prediction of motor commands and the actual sensory feedback. These dual error representations have opposing effects on simple spike discharge, consistent with the signals needed to generate sensory prediction errors used to update a forward internal model.

  12. Video Error Correction Using Steganography

    Science.gov (United States)

    Robie, David L.; Mersereau, Russell M.

    2002-12-01

    The transmission of any data is always subject to corruption due to errors, but video transmission, because of its real-time nature, must deal with these errors without retransmission of the corrupted data. The errors can be handled using forward error correction in the encoder or error concealment techniques in the decoder. This MPEG-2 compliant codec uses data hiding to transmit error correction information, together with several error concealment techniques in the decoder. The decoder resynchronizes more quickly, and with fewer errors, than traditional resynchronization techniques. It also allows for perfect recovery of differentially encoded DCT-DC components and motion vectors. This provides a much higher quality picture in an error-prone environment while creating an almost imperceptible degradation of the picture in an error-free environment.

  13. A second study of the prediction of cognitive errors using the 'CREAM' technique

    International Nuclear Information System (INIS)

    Collier, Steve; Andresen, Gisle

    2000-03-01

    Some human errors, such as errors of commission and knowledge-based errors, are not adequately modelled in probabilistic safety assessments. Even qualitative methods for handling these sorts of errors are comparatively underdeveloped. The 'Cognitive Reliability and Error Analysis Method' (CREAM) was recently developed for prediction of cognitive error modes. It has not yet been comprehensively established how reliable, valid and generally useful it could be to researchers and practitioners. A previous study of CREAM at Halden was promising, showing a relationship between errors predicted in advance and those that actually occurred in simulated fault scenarios. The present study continues this work. CREAM was used to make predictions of cognitive error modes throughout two rather difficult fault scenarios. Predictions were made of the most likely cognitive error mode, were one to occur at all, at several points throughout the expected scenarios, based upon the scenario design and description. Each scenario was then run 15 times with different operators. Error modes occurring during simulations were later scored using the task description for the scenario, videotapes of operator actions, eye-track recording, operators' verbal protocols and an expert's concurrent commentary. The scoring team had no previous substantive knowledge of the experiment or the techniques used, so as to provide a more stringent test of the data and knowledge needed for scoring. The scored error modes were then compared with the CREAM predictions to assess the degree of agreement. Some cognitive error modes were predicted successfully, but the results were generally not so encouraging as the previous study. Several problems were found with both the CREAM technique and the data needed to complete the analysis. It was felt that further development was needed before this kind of analysis can be reliable and valid, either in a research setting or as a practitioner's tool in a safety assessment

  14. Understanding error generation in fused deposition modeling

    International Nuclear Information System (INIS)

    Bochmann, Lennart; Transchel, Robert; Wegener, Konrad; Bayley, Cindy; Helu, Moneer; Dornfeld, David

    2015-01-01

    Additive manufacturing offers completely new possibilities for the manufacturing of parts. The advantages of flexibility and convenience of additive manufacturing have had a significant impact on many industries, and optimizing part quality is crucial for expanding its utilization. This research aims to determine the sources of imprecision in fused deposition modeling (FDM). Process errors in terms of surface quality, accuracy and precision are identified and quantified, and an error-budget approach is used to characterize errors of the machine tool. It was determined that accuracy and precision in the y direction (0.08–0.30 mm) are generally greater than in the x direction (0.12–0.62 mm) and the z direction (0.21–0.57 mm). Furthermore, accuracy and precision tend to decrease at increasing axis positions. The results of this work can be used to identify possible process improvements in the design and control of FDM technology. (paper)
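    An error-budget roll-up of the kind used above to characterize the machine tool can be sketched by combining independent per-axis error contributions in root-sum-square fashion. The individual contributions below are hypothetical, chosen only to echo the reported per-axis ordering (y best, z worst).

```python
import math

# Sketch of an error-budget roll-up: combine independent error sources
# per axis by root-sum-square. Contribution values are hypothetical.

def root_sum_square(contributions):
    """Combine independent error contributions (mm) into one value."""
    return math.sqrt(sum(c * c for c in contributions))

# Hypothetical per-axis contributions (mm): positioning, extrusion, thermal.
budget = {
    "x": [0.10, 0.05, 0.03],
    "y": [0.06, 0.04, 0.02],
    "z": [0.20, 0.08, 0.05],
}

for axis, parts in budget.items():
    print(f"{axis}: total = {root_sum_square(parts):.3f} mm")
```

Root-sum-square is appropriate only when the sources are independent and zero-mean; systematic offsets, such as the position-dependent drift noted in the abstract, must instead be added linearly or modelled explicitly.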

  15. Uncovering missing links with cold ends

    Science.gov (United States)

    Zhu, Yu-Xiao; Lü, Linyuan; Zhang, Qian-Ming; Zhou, Tao

    2012-11-01

    To evaluate the performance of prediction of missing links, the known data are randomly divided into two parts, the training set and the probe set. We argue that this straightforward and standard method may lead to a serious bias, since in real biological and information networks missing links are more likely to connect low-degree nodes. We therefore study how to uncover missing links with low-degree nodes, namely links in the probe set whose degree products are lower than those of a random sampling. Experimental analysis of ten local similarity indices on four disparate real networks reveals the surprising result that the Leicht-Holme-Newman index [E.A. Leicht, P. Holme, M.E.J. Newman, Vertex similarity in networks, Phys. Rev. E 73 (2006) 026120] performs the best, although it was known as one of the worst indices when the probe set is a random sampling of all links. We further propose a parameter-dependent index, which considerably improves the prediction accuracy. Finally, we show the relevance of the proposed index to three real sampling methods: acquaintance sampling, random-walk sampling and path-based sampling.
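    The Leicht-Holme-Newman index referenced above is a local similarity score: the number of common neighbours divided by the product of the two nodes' degrees, which boosts pairs of low-degree nodes. A minimal sketch on a small hypothetical graph:

```python
import itertools

# Leicht-Holme-Newman (LHN) local similarity on a toy undirected graph:
# s(x, y) = |common neighbours| / (k_x * k_y). Graph is hypothetical.

graph = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "e"},
    "d": {"a"},
    "e": {"c"},
}

def lhn_index(g, x, y):
    common = len(g[x] & g[y])
    return common / (len(g[x]) * len(g[y]))

# Score all unconnected pairs; a higher score suggests a likelier missing link.
scores = {
    (x, y): lhn_index(graph, x, y)
    for x, y in itertools.combinations(sorted(graph), 2)
    if y not in graph[x]
}
for pair, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(pair, round(s, 3))
```

On this toy graph, pairs involving the low-degree nodes b, d and e score highest, illustrating why the degree-product denominator favours exactly the "cold-end" links the paper targets.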

  16. Epistemically Virtuous Risk Management: Financial Due Diligence and Uncovering the Madoff Fraud

    OpenAIRE

    de Bruin, Boudewijn; Luetge, Christoph; Jauernig, Johanna

    2014-01-01

    The chapter analyses how Bernard Madoff’s Ponzi scheme was uncovered by Harry Markopolos, an employee of Rampart Investment Management, LLC, and the contribution of so-called epistemic virtues to Markopolos’ success. After Rampart had informed the firm about an allegedly highly successful hedge fund run by Madoff, Markopolos used qualitative and quantitative methods from financial due diligence to examine Madoff’s risks, returns and strategy, ultimately to conclude that Madoff was running a l...

  17. Video Error Correction Using Steganography

    Directory of Open Access Journals (Sweden)

    Robie David L

    2002-01-01

    Full Text Available The transmission of any data is always subject to corruption due to errors, but video transmission, because of its real-time nature, must deal with these errors without retransmission of the corrupted data. The errors can be handled using forward error correction in the encoder or error concealment techniques in the decoder. This MPEG-2 compliant codec uses data hiding to transmit error correction information, together with several error concealment techniques in the decoder. The decoder resynchronizes more quickly, and with fewer errors, than traditional resynchronization techniques. It also allows for perfect recovery of differentially encoded DCT-DC components and motion vectors. This provides a much higher quality picture in an error-prone environment while creating an almost imperceptible degradation of the picture in an error-free environment.

  18. 78 FR 17635 - Uncovered Innerspring Units From the People's Republic of China: Final Results of Antidumping...

    Science.gov (United States)

    2013-03-22

    ... DEPARTMENT OF COMMERCE International Trade Administration [A-570-928] Uncovered Innerspring Units... AGENCY: Import Administration, International Trade Administration, Department of Commerce. SUMMARY: On... Operations, Office 9, Import Administration, International Trade Administration, U.S. Department of Commerce...

  19. Part two: Error propagation

    International Nuclear Information System (INIS)

    Picard, R.R.

    1989-01-01

    Topics covered in this chapter include a discussion of exact results as related to nuclear materials management and accounting in nuclear facilities; propagation of error for a single measured value; propagation of error for several measured values; error propagation for materials balances; and an application of error propagation to an example of uranium hexafluoride conversion process
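    The propagation rules the chapter covers can be given a minimal numerical sketch: for a sum of independent measurements the absolute variances add, and for a product the relative variances add. The measured values below are hypothetical, not drawn from the chapter.

```python
import math

# First-order error propagation for independent measurements.
# Hypothetical measured values throughout.

def propagate_sum(sigmas):
    """Std. dev. of a sum (or difference) of independent measurements."""
    return math.sqrt(sum(s * s for s in sigmas))

def propagate_product(value, terms):
    """Std. dev. of a product f of independent factors, given the
    product's value and (x_i, sigma_i) pairs for each factor."""
    rel_var = sum((s / x) ** 2 for x, s in terms)
    return abs(value) * math.sqrt(rel_var)

# Example: net mass = gross - tare (absolute variances add) ...
sigma_net = propagate_sum([0.05, 0.02])
# ... then element mass = net mass * concentration factor (relative
# variances add).
net, factor = 12.40, 0.68
sigma_elem = propagate_product(net * factor, [(net, sigma_net), (factor, 0.004)])
print(f"sigma(net)  = {sigma_net:.4f}")
print(f"sigma(elem) = {sigma_elem:.4f}")
```

A materials balance is just a longer chain of such sums and products, so the same two rules, applied term by term, yield the variance of the balance.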

  20. Heuristic thinking: interdisciplinary perspectives on medical error

    Directory of Open Access Journals (Sweden)

    Annegret F. Hannawa

    2013-12-01

    Switzerland to stimulate such interdisciplinary dialogue. International scholars from eight disciplines and 17 countries attended the congress to discuss interdisciplinary ideas and perspectives for advancing safer care. The team of invited COME experts collaborated in compiling this issue of the Journal of Public Health Research entitled Interdisciplinary perspectives on medical error. This particular issue introduces relevant North American and European theorizing and research on preventable adverse events. The caliber of scientists who have contributed to this issue is humbling. But rather than naming their affiliations and summarizing their individual manuscripts here, it is more important to reflect on the contribution of this special issue as a whole. Particularly, I would like to raise two important take-home messages that the articles yield: (i) What new insights can be derived from the papers collected in this issue? (ii) What are the central challenges implied for future research on medical error?

  1. Prediction-error of Prediction Error (PPE)-based Reversible Data Hiding

    OpenAIRE

    Wu, Han-Zhou; Wang, Hong-Xia; Shi, Yun-Qing

    2016-01-01

    This paper presents a novel reversible data hiding (RDH) algorithm for gray-scaled images, in which the prediction-error of prediction error (PPE) of a pixel is used to carry the secret data. In the proposed method, the pixels to be embedded are firstly predicted with their neighboring pixels to obtain the corresponding prediction errors (PEs). Then, by exploiting the PEs of the neighboring pixels, the prediction of the PEs of the pixels can be determined. And, a sorting technique based on th...
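    The paper's own predictor and sorting technique are not reproduced here; as a sketch of the PE/PPE idea using a simple mean-of-two-neighbors predictor (that predictor choice is an assumption for illustration):

```python
def pe(img, r, c):
    """Prediction error: pixel minus the mean of its upper and left neighbors."""
    pred = (img[r - 1][c] + img[r][c - 1]) // 2
    return img[r][c] - pred

def ppe(img, r, c):
    """Prediction-error of prediction error: this pixel's PE minus the
    mean PE of the same two neighbors."""
    neighbor_pe = (pe(img, r - 1, c) + pe(img, r, c - 1)) // 2
    return pe(img, r, c) - neighbor_pe

img = [[10, 11, 12, 13],
       [11, 12, 13, 14],
       [12, 13, 14, 15],
       [13, 14, 15, 16]]
# On a smooth ramp both PE and PPE stay small, so the PPE histogram is
# sharply peaked and can carry data with little distortion.
print(pe(img, 2, 2), ppe(img, 2, 2))  # -> 1 0
```

    The gain over plain PE embedding comes from that second differencing step: in smooth regions the PPE distribution is even more concentrated around zero than the PE distribution.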

  2. Identifying medication error chains from critical incident reports: a new analytic approach.

    Science.gov (United States)

    Huckels-Baumgart, Saskia; Manser, Tanja

    2014-10-01

    Research into the distribution of medication errors usually focuses on isolated stages within the medication use process. Our study aimed to provide a novel process-oriented approach to medication incident analysis focusing on medication error chains. Our study was conducted at a 900-bed teaching hospital in Switzerland. All 1,591 medication errors reported from 2009 to 2012 were categorized using the NCC MERP Medication Error Index and the WHO Classification for Patient Safety Methodology. In order to identify medication error chains, each reported medication incident was allocated to the relevant stage of the hospital medication use process. Only 25.8% of the reported medication errors were detected before they propagated through the medication use process. The majority of medication errors (74.2%) formed an error chain encompassing two or more stages. The most frequent error chain comprised preparation up to and including medication administration (45.2%). "Non-consideration of documentation/prescribing" during drug preparation was the most frequent contributor to "wrong dose" during the administration of medication. Medication error chains provide important insights for detecting and stopping medication errors before they reach the patient. Existing and new safety barriers need to be extended to interrupt error chains and to improve patient safety. © 2014, The American College of Clinical Pharmacology.

  3. Standard error propagation in R-matrix model fitting for light elements

    International Nuclear Information System (INIS)

    Chen Zhenpeng; Zhang Rui; Sun Yeying; Liu Tingjin

    2003-01-01

    The error propagation features of R-matrix model fitting for the 7Li, 11B and 17O systems were researched systematically. Some laws of error propagation were revealed, an empirical formula P_j = U_j^c / U_j^d = K_j · S̄ · √m / √N for describing standard error propagation was established, and the most likely error ranges for the standard cross sections of 6Li(n,t), 10B(n,α0) and 10B(n,α1) were estimated. The problem that the standard errors of light-nuclei standard cross sections may be too small results mainly from the R-matrix model fitting, which is not perfect. Yet R-matrix model fitting is the most reliable evaluation method for such data. The error propagation features of R-matrix model fitting for the compound nucleus systems of 7Li, 11B and 17O have been studied systematically, some laws of error propagation are revealed, and these findings are important in solving the problem mentioned above. Furthermore, these conclusions are suitable for similar model fitting in other scientific fields. (author)

  4. OOK power model based dynamic error testing for smart electricity meter

    International Nuclear Information System (INIS)

    Wang, Xuewei; Chen, Jingxia; Jia, Xiaolu; Zhu, Meng; Yuan, Ruiming; Jiang, Zhenyu

    2017-01-01

    This paper formulates the dynamic error testing problem for a smart meter, with consideration and investigation of both the testing signal and the dynamic error testing method. To solve the dynamic error testing problems, the paper establishes an on-off-keying (OOK) testing dynamic current model and an OOK testing dynamic load energy (TDLE) model. Then two types of TDLE sequences and three modes of OOK testing dynamic power are proposed. In addition, a novel algorithm, which helps to solve the problem of dynamic electric energy measurement’s traceability, is derived for dynamic errors. Based on the above research, OOK TDLE sequence generation equipment is developed and a dynamic error testing system is constructed. Using the testing system, five kinds of meters were tested in the three dynamic power modes. The test results show that the dynamic error is closely related to dynamic power mode and the measurement uncertainty is 0.38%. (paper)

  5. OOK power model based dynamic error testing for smart electricity meter

    Science.gov (United States)

    Wang, Xuewei; Chen, Jingxia; Yuan, Ruiming; Jia, Xiaolu; Zhu, Meng; Jiang, Zhenyu

    2017-02-01

    This paper formulates the dynamic error testing problem for a smart meter, with consideration and investigation of both the testing signal and the dynamic error testing method. To solve the dynamic error testing problems, the paper establishes an on-off-keying (OOK) testing dynamic current model and an OOK testing dynamic load energy (TDLE) model. Then two types of TDLE sequences and three modes of OOK testing dynamic power are proposed. In addition, a novel algorithm, which helps to solve the problem of dynamic electric energy measurement’s traceability, is derived for dynamic errors. Based on the above research, OOK TDLE sequence generation equipment is developed and a dynamic error testing system is constructed. Using the testing system, five kinds of meters were tested in the three dynamic power modes. The test results show that the dynamic error is closely related to dynamic power mode and the measurement uncertainty is 0.38%.

  6. Diagnostic errors in pediatric radiology

    International Nuclear Information System (INIS)

    Taylor, George A.; Voss, Stephan D.; Melvin, Patrice R.; Graham, Dionne A.

    2011-01-01

    Little information is known about the frequency, types and causes of diagnostic errors in imaging children. Our goals were to describe the patterns and potential etiologies of diagnostic error in our subspecialty. We reviewed 265 cases with clinically significant diagnostic errors identified during a 10-year period. Errors were defined as a diagnosis that was delayed, wrong or missed; they were classified as perceptual, cognitive, system-related or unavoidable; and they were evaluated by imaging modality and level of training of the physician involved. We identified 484 specific errors in the 265 cases reviewed (mean = 1.8 errors/case). Most discrepancies involved staff (45.5%). Two hundred fifty-eight individual cognitive errors were identified in 151 cases (mean = 1.7 errors/case). Of these, 83 cases (55%) had additional perceptual or system-related errors. One hundred sixty-five perceptual errors were identified in 165 cases. Of these, 68 cases (41%) also had cognitive or system-related errors. Fifty-four system-related errors were identified in 46 cases (mean = 1.2 errors/case) of which all were multi-factorial. Seven cases were unavoidable. Our study defines a taxonomy of diagnostic errors in a large academic pediatric radiology practice and suggests that most are multi-factorial in etiology. Further study is needed to define effective strategies for improvement. (orig.)

  7. Uncovering growth-suppressive MicroRNAs in lung cancer

    DEFF Research Database (Denmark)

    Liu, Xi; Sempere, Lorenzo F; Galimberti, Fabrizio

    2009-01-01

    PURPOSE: MicroRNA (miRNA) expression profiles improve classification, diagnosis, and prognostic information of malignancies, including lung cancer. This study uncovered unique growth-suppressive miRNAs in lung cancer. EXPERIMENTAL DESIGN: miRNA arrays were done on normal lung tissues...... and adenocarcinomas from wild-type and proteasome degradation-resistant cyclin E transgenic mice to reveal repressed miRNAs in lung cancer. Real-time and semiquantitative reverse transcription-PCR as well as in situ hybridization assays validated these findings. Lung cancer cell lines were derived from each......-malignant human lung tissue bank. RESULTS: miR-34c, miR-145, and miR-142-5p were repressed in transgenic lung cancers. Findings were confirmed by real-time and semiquantitative reverse transcription-PCR as well as in situ hybridization assays. Similar miRNA profiles occurred in human normal versus malignant lung...

  8. Repeat-aware modeling and correction of short read errors.

    Science.gov (United States)

    Yang, Xiao; Aluru, Srinivas; Dorman, Karin S

    2011-02-15

    High-throughput short read sequencing is revolutionizing genomics and systems biology research by enabling cost-effective deep coverage sequencing of genomes and transcriptomes. Error detection and correction are crucial to many short read sequencing applications including de novo genome sequencing, genome resequencing, and digital gene expression analysis. Short read error detection is typically carried out by counting the observed frequencies of kmers in reads and validating those with frequencies exceeding a threshold. In case of genomes with high repeat content, an erroneous kmer may be frequently observed if it has few nucleotide differences with valid kmers with multiple occurrences in the genome. Error detection and correction were mostly applied to genomes with low repeat content and this remains a challenging problem for genomes with high repeat content. We develop a statistical model and a computational method for error detection and correction in the presence of genomic repeats. We propose a method to infer genomic frequencies of kmers from their observed frequencies by analyzing the misread relationships among observed kmers. We also propose a method to estimate the threshold useful for validating kmers whose estimated genomic frequency exceeds the threshold. We demonstrate that superior error detection is achieved using these methods. Furthermore, we break away from the common assumption of uniformly distributed errors within a read, and provide a framework to model position-dependent error occurrence frequencies common to many short read platforms. Lastly, we achieve better error correction in genomes with high repeat content. The software is implemented in C++ and is freely available under GNU GPL3 license and Boost Software V1.0 license at "http://aluru-sun.ece.iastate.edu/doku.php?id=redeem". We introduce a statistical framework to model sequencing errors in next-generation reads, which led to promising results in detecting and correcting errors.
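    The naive kmer-frequency rule that the paper improves on can be sketched in a few lines (illustrative only; the authors' method additionally models misread relationships and repeat content):

```python
from collections import Counter

def kmer_counts(reads, k):
    """Observed frequency of every k-mer across all reads."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def suspect_kmers(reads, k, threshold):
    """k-mers seen fewer than `threshold` times are flagged as likely
    sequencing errors -- the simple rule that breaks down for genomes
    with high repeat content."""
    counts = kmer_counts(reads, k)
    return {km for km, n in counts.items() if n < threshold}

reads = ["ACGTACGT", "ACGTACGT", "ACGTACGA"]  # last read carries one error
print(suspect_kmers(reads, 4, 2))  # -> {'ACGA'}
```

    In a repeat-rich genome an erroneous k-mer can pass this frequency test, which is exactly the failure mode the paper's statistical model addresses.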

  9. Accounting for measurement error: a critical but often overlooked process.

    Science.gov (United States)

    Harris, Edward F; Smith, Richard N

    2009-12-01

    Due to instrument imprecision and human inconsistencies, measurements are not free of error. Technical error of measurement (TEM) is the variability encountered between dimensions when the same specimens are measured at multiple sessions. A goal of a data collection regimen is to minimise TEM. The few studies that actually quantify TEM, regardless of discipline, report that it is substantial and can affect results and inferences. This paper reviews some statistical approaches for identifying and controlling TEM. Statistically, TEM is part of the residual ('unexplained') variance in a statistical test, so accounting for TEM, which requires repeated measurements, enhances the chances of finding a statistically significant difference if one exists. The aim of this paper was to review and discuss common statistical designs relating to types of error and statistical approaches to error accountability. This paper addresses issues of landmark location, validity, technical and systematic error, analysis of variance, scaled measures and correlation coefficients in order to guide the reader towards correct identification of true experimental differences. Researchers commonly infer characteristics about populations from comparatively restricted study samples. Most inferences are statistical and, aside from concerns about adequate accounting for known sources of variation with the research design, an important source of variability is measurement error. Variability in locating landmarks that define variables is obvious in odontometrics, cephalometrics and anthropometry, but the same concerns about measurement accuracy and precision extend to all disciplines. With increasing accessibility to computer-assisted methods of data collection, the ease of incorporating repeated measures into statistical designs has improved. Accounting for this technical source of variation increases the chance of finding biologically true differences when they exist.
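    TEM from duplicate measurements is commonly estimated with Dahlberg's formula, sqrt(sum(d_i^2) / 2N); a short sketch with hypothetical data (one standard estimator among the approaches the paper reviews):

```python
import math

def tem(first, second):
    """Technical error of measurement from paired duplicate measurements
    (Dahlberg's formula): sqrt(sum of squared differences / 2N)."""
    if len(first) != len(second):
        raise ValueError("need paired measurements")
    ss = sum((a - b) ** 2 for a, b in zip(first, second))
    return math.sqrt(ss / (2 * len(first)))

# The same five specimens measured at two sessions (hypothetical data, mm).
session1 = [10.1, 12.4, 11.0, 9.8, 13.2]
session2 = [10.3, 12.2, 11.1, 9.9, 13.0]
print(round(tem(session1, session2), 3))  # -> 0.118
```

    A TEM of ~0.12 mm would then be compared against the size of the biological differences under study to judge whether the measurement regimen is precise enough.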

  10. Stand-alone error characterisation of microwave satellite soil moisture using a Fourier method

    Science.gov (United States)

    Error characterisation of satellite-retrieved soil moisture (SM) is crucial for maximizing their utility in research and applications in hydro-meteorology and climatology. Error characteristics can provide insights for retrieval development and validation, and inform suitable strategies for data fus...

  11. A distinctive avian assemblage (Aves: Passeriformes) in Western Darién, Panama is uncovered through a disease surveillance program

    Directory of Open Access Journals (Sweden)

    Matthew J. Miller

    2014-08-01

    Full Text Available Basic knowledge about the distribution of flora and fauna is lacking for most tropical areas. Improving our knowledge of the tropical biota will help address contemporary global problems, including emerging tropical diseases. Less appreciated is the role that applied studies can have in improving our understanding of basic biological patterns and processes in the tropics. Here, I describe a novel avifauna assemblage in Western Darién province in the Republic of Panama that was uncovered during a vector-borne disease surveillance program. I compared the passerine bird species composition at 16 sites using records from recent ornithological expeditions sponsored by the Smithsonian Tropical Research Institute in Central and Eastern Panama. Based on the results of a Mantel test, geographic distance did not correlate with pairwise distinctiveness of sites. Instead, based on an index of distinctiveness modified from the Chao-Jaccard index, most sites were more or less similarly distinctive, with one site, Aruza Abajo, significantly more distinctive than the rest. I found that the distinctiveness of this site was due not only to the presence of several rare and range-restricted taxa, but also to the absence of taxa that are common elsewhere. This finding provides more evidence of high species composition turnover (beta-diversity in the Panamanian biota, which appears to be driven by a combination of soil and climate differences over narrow distances. Rev. Biol. Trop. 62 (2): 711-717. Epub 2014 June 01.

  12. Effects of Lexico-syntactic Errors on Teaching Materials: A Study of Textbooks Written by Nigerians

    Directory of Open Access Journals (Sweden)

    Peace Chinwendu Israel

    2014-01-01

    Full Text Available This study examined lexico-syntactic errors in selected textbooks written by Nigerians. Our focus was on educated bilinguals (acrolect) who acquired their primary, secondary and tertiary education in Nigeria, and the selected textbooks were published by Vanity Publishers/Press. The participants (authors) cut across the three major ethnic groups in Nigeria – Hausa, Igbo and Yoruba – and the selection of the textbooks covered the major disciplines of study. We adopted a descriptive research design and specifically employed the survey method to accomplish the purpose of our exploratory research. The lexico-syntactic errors in the selected textbooks were identified and classified into various categories. These errors were not different from those identified over the years in students’ essays and exam scripts. This buttressed our argument that students are merely the conveyor belt of errors contained in the teaching material and that we can analyse students’ lexico-syntactic errors in tandem with errors contained in the material used in teaching.

  13. Measurements of Gun Tube Motion and Muzzle Pointing Error of Main Battle Tanks

    Directory of Open Access Journals (Sweden)

    Peter L. McCall

    2001-01-01

    Full Text Available Beginning in 1990, the US Army Aberdeen Test Center (ATC) began testing a prototype cannon mounted in a non-armored turret fitted to an M1A1 Abrams tank chassis. The cannon design incorporated a longer gun tube as a means to increase projectile velocity. A significant increase in projectile impact dispersion was measured early in the test program. Through investigative efforts, the cause of the error was linked to the increased dynamic bending or flexure of the longer tube observed while the vehicle was moving. Research and investigative work was conducted through a collaborative effort with the US Army Research Laboratory, Benet Laboratory, Project Manager – Tank Main Armament Systems, US Army Research and Engineering Center, and Cadillac Gage Textron Inc. New test methods, instrumentation, data analysis procedures, and stabilization control design resulted through this series of investigations into the dynamic tube flexure error source. Through this joint research, improvements in tank fire control design have been developed to improve delivery accuracy. This paper discusses the instrumentation implemented, methods applied, and analysis procedures used to characterize the tube flexure during dynamic tests of a main battle tank and the relationship between gun pointing error and muzzle pointing error.

  14. A systems perspective of managing error recovery and tactical re-planning of operating teams in safety critical domains.

    Science.gov (United States)

    Kontogiannis, Tom

    2011-04-01

    Research in human error has provided useful tools for designing procedures, training, and intelligent interfaces that trap errors at an early stage. However, this "error prevention" policy may not be entirely successful because human errors will inevitably occur. This requires that the error management process (e.g., detection, diagnosis and correction) also be supported. Research has focused almost exclusively on error detection; little is known about error recovery, especially in the context of safety critical systems. The aim of this paper is to develop a research framework that integrates error recovery strategies employed by experienced practitioners in handling their own errors. A control theoretic model of human performance was used to integrate error recovery strategies assembled from reviews of the literature, analyses of near misses from aviation and command & control domains, and observations of abnormal situations training at air traffic control facilities. The method of system dynamics has been used to analyze and compare error recovery strategies in terms of patterns of interaction, system affordances, and types of recovery plans. System dynamics offer a promising basis for studying the nature of error recovery management in the context of team interactions and system characteristics. The proposed taxonomy of error recovery strategies can help human factors and safety experts to develop resilient system designs and training solutions for managing human errors in unforeseen situations; it may also help incident investigators to explore why people's actions and assessments were not corrected at the time. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. An Analysis of Lexical Errors of Korean Language Learners: Some American College Learners' Case

    Science.gov (United States)

    Kang, Manjin

    2014-01-01

    There has been a huge amount of research on errors of language learners. However, most of them have focused on syntactic errors and those about lexical errors are not found easily despite the importance of lexical learning for the language learners. The case is even rarer for Korean language. In line with this background, this study was designed…

  16. Error processing - evidence from intracerebral ERP recordings

    Czech Academy of Sciences Publication Activity Database

    Brázdil, M.; Roman, R.; Falkenstein, M.; Daniel, P.; Jurák, Pavel; Rektor, I.

    2002-01-01

    Roč. 146, č. 4 (2002), s. - ISSN 1432-1106 R&D Projects: GA ČR GA102/95/0467; GA ČR GA102/02/1339 Institutional research plan: CEZ:AV0Z2065902 Keywords : error processing * event-related potentials * intracerebral recordings Subject RIV: FA - Cardiovascular Diseases incl. Cardiotharic Surgery

  17. Data contributed by EPA/ORD/NERL/CED researchers to the manuscript "Advanced Error Diagnostics of the CMAQ and CHIMERE modeling systems within the AQMEII3 Model Evaluation Framework"

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset contains the data contributed by EPA/ORD/NERL/CED researchers to the manuscript "Advanced Error Diagnostics of the CMAQ and CHIMERE modeling systems...

  18. ERRORS AND DIFFICULTIES IN TRANSLATING LEGAL TEXTS

    Directory of Open Access Journals (Sweden)

    Camelia, CHIRILA

    2014-11-01

    Full Text Available Nowadays the accurate translation of legal texts has become highly important as the mistranslation of a passage in a contract, for example, could lead to lawsuits and loss of money. Consequently, the translation of legal texts to other languages faces many difficulties and only professional translators specialised in legal translation should deal with the translation of legal documents and scholarly writings. The purpose of this paper is to analyze translation from three perspectives: translation quality, errors and difficulties encountered in translating legal texts and consequences of such errors in professional translation. First of all, the paper points out the importance of performing a good and correct translation, which is one of the most important elements to be considered when discussing translation. Furthermore, the paper presents an overview of the errors and difficulties in translating texts and of the consequences of errors in professional translation, with applications to the field of law. The paper is also an approach to the differences between languages (English and Romanian that can hinder comprehension for those who have embarked upon the difficult task of translation. The research method that I have used to achieve the objectives of the paper was the content analysis of various Romanian and foreign authors' works.

  19. Error Management in ATLAS TDAQ: An Intelligent Systems approach

    CERN Document Server

    Slopper, John Erik

    2010-01-01

    This thesis is concerned with the use of intelligent system techniques (IST) within a large distributed software system, specifically the ATLAS TDAQ system which has been developed and is currently in use at the European Laboratory for Particle Physics (CERN). The overall aim is to investigate and evaluate a range of IST techniques in order to improve the error management system (EMS) currently used within the TDAQ system via error detection and classification. The thesis work will provide a reference for future research and development of such methods in the TDAQ system. The thesis begins by describing the TDAQ system and the existing EMS, with a focus on the underlying expert system approach, in order to identify areas where improvements can be made using IST techniques. It then discusses measures of evaluating error detection and classification techniques and the factors specific to the TDAQ system. Error conditions are then simulated in a controlled manner using an experimental setup and datasets were gathered fro...

  20. Medical Error Types and Causes Made by Nurses in Turkey

    Directory of Open Access Journals (Sweden)

    Dilek Kucuk Alemdar

    2013-06-01

    Full Text Available AIM: This study was carried out as a descriptive study in order to determine the types, causes and prevalence of medical errors made by nurses in Turkey. METHOD: Seventy-eight (78) nurses working in a hospital randomly selected from five hospitals in Giresun city centre were enrolled in the study. The data were collected by the researchers using the ‘Information Form for Nurses’ and the ‘Medical Error Form’. The Medical Error Form consists of 2 parts and 40 items covering the types and causes of medical errors. Nurses’ socio-demographic variables, medical error types and causes were evaluated using percentage distributions and means. RESULTS: The mean age of the nurses was 25.5 years, with a standard deviation of 6.03 years. 50% of the nurses had graduated from a health professional high school. 53.8% of the nurses were single, 63.1% had worked between 1-5 years, 71.8% worked day and night shifts and 42.3% worked in medical clinics. The most common types of medical errors were hospital infection (15.4%), diagnostic errors (12.8%), and needle or cutting-tool injuries and problems related to drugs with side effects (10.3%). In the study, 38.5% of the nurses reported that they thought the main cause of medical error was tiredness, 36.4% increased workload and 34.6% long working hours. CONCLUSION: As a result of the present study, nurses mentioned hospital infection, diagnostic errors, and needle or cutting-tool injuries as the most common medical errors, and fatigue, work overload and long working hours as the most common reasons for medical error. [TAF Prev Med Bull 2013; 12(3): 307-314]

  1. Addressing the Problem of Negative Lexical Transfer Errors in Chilean University Students

    Science.gov (United States)

    Dissington, Paul Anthony

    2018-01-01

    Studies of second language learning have revealed a connection between first language transfer and errors in second language production. This paper describes an action research study carried out among Chilean university students studying English as part of their degree programmes. The study focuses on common lexical errors made by Chilean…

  2. Understanding Problem-Solving Errors by Students with Learning Disabilities in Standards-Based and Traditional Curricula

    Science.gov (United States)

    Bouck, Emily C.; Bouck, Mary K.; Joshi, Gauri S.; Johnson, Linley

    2016-01-01

    Students with learning disabilities struggle with word problems in mathematics classes. Understanding the type of errors students make when working through such mathematical problems can further describe student performance and highlight student difficulties. Through the use of error codes, researchers analyzed the type of errors made by 14 sixth…

  3. Positional error in automated geocoding of residential addresses

    Directory of Open Access Journals (Sweden)

    Talbot Thomas O

    2003-12-01

    Full Text Available Abstract Background Public health applications using geographic information system (GIS) technology are steadily increasing. Many of these rely on the ability to locate where people live with respect to areas of exposure from environmental contaminants. Automated geocoding is a method used to assign geographic coordinates to an individual based on their street address. This method often relies on street centerline files as a geographic reference. Such a process introduces positional error in the geocoded point. Our study evaluated the positional error caused during automated geocoding of residential addresses and how this error varies between population densities. We also evaluated an alternative method of geocoding using residential property parcel data. Results Positional error was determined for 3,000 residential addresses using the distance between each geocoded point and its true location as determined with aerial imagery. Error was found to increase as population density decreased. In rural areas of an upstate New York study area, 95 percent of the addresses geocoded to within 2,872 m of their true location. Suburban areas revealed less error where 95 percent of the addresses geocoded to within 421 m. Urban areas demonstrated the least error where 95 percent of the addresses geocoded to within 152 m of their true location. As an alternative to using street centerline files for geocoding, we used residential property parcel points to locate the addresses. In the rural areas, 95 percent of the parcel points were within 195 m of the true location. In suburban areas, this distance was 39 m while in urban areas 95 percent of the parcel points were within 21 m of the true location. Conclusion Researchers need to determine if the level of error caused by a chosen method of geocoding may affect the results of their project. As an alternative method, property data can be used for geocoding addresses if the error caused by traditional methods is
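    The distances reported above are the separations between each geocoded point and its true location; with latitude/longitude coordinates that is a great-circle computation, sketched here with the haversine formula (a standard approach, not necessarily the authors' exact procedure; coordinates are hypothetical):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points --
    the positional error between a geocoded point and its true location."""
    r = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Geocoded vs. true location roughly 145 m apart (hypothetical coordinates).
print(round(haversine_m(42.6526, -73.7562, 42.6539, -73.7560)))
```

    Computing this distance for every address and taking the 95th percentile reproduces the kind of summary statistic quoted in the abstract.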

  4. Laboratory errors and patient safety.

    Science.gov (United States)

    Miligy, Dawlat A

    2015-01-01

    Laboratory data are extensively used in medical practice; consequently, laboratory errors have a tremendous impact on patient safety. Therefore, programs designed to identify and reduce laboratory errors, as well as setting specific strategies, are required to minimize these errors and improve patient safety. The purpose of this paper is to identify some of the commonly encountered laboratory errors throughout our practice in laboratory work, their hazards to patient health care and some measures and recommendations to minimize or to eliminate these errors. Recording the encountered laboratory errors during May 2008 and their statistical evaluation (using simple percent distribution) have been done in the laboratory department of one of the private hospitals in Egypt. Errors have been classified according to the laboratory phases and according to their implication on patient health. Data obtained from 1,600 testing procedures revealed that the total number of encountered errors was 14 tests (0.87 percent of total testing procedures). Most of the encountered errors lay in the pre- and post-analytic phases of the testing cycle (representing 35.7 and 50 percent, respectively, of total errors), while the number of test errors encountered in the analytic phase represented only 14.3 percent of total errors. About 85.7 percent of total errors were of non-significant implication on patient health, being detected before test reports had been submitted to the patients. On the other hand, the number of test errors that had already been submitted to patients and reached the physician represented 14.3 percent of total errors. Only 7.1 percent of the errors could have an impact on patient diagnosis. The findings of this study were concomitant with those published from the USA and other countries. This proves that laboratory problems are universal and need general standardization and bench marking measures. Original being the first data published from Arabic countries that

  5. Fluid dynamic analysis and experimental study of a low radiation error temperature sensor

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jie, E-mail: yangjie396768@163.com [Key Laboratory for Aerosol-Cloud-Precipitation of China Meteorological Administration, Nanjing 210044 (China); School of Atmospheric Physics, Nanjing University of Information Science and Technology, Nanjing 210044 (China); Liu, Qingquan, E-mail: andyucd@163.com [Jiangsu Key Laboratory of Meteorological Observation and Information Processing, Nanjing 210044 (China); Jiangsu Collaborative Innovation Center on Atmospheric Environment and Equipment Technology, Nanjing 210044 (China); Dai, Wei, E-mail: daiweiilove@163.com [Key Laboratory for Aerosol-Cloud-Precipitation of China Meteorological Administration, Nanjing 210044 (China); School of Atmospheric Physics, Nanjing University of Information Science and Technology, Nanjing 210044 (China); Ding, Renhui, E-mail: drhabcd@sina.com [Jiangsu Meteorological Observation Center, Nanjing 210008 (China)

    2017-01-30

    To improve air temperature observation accuracy, a low radiation error temperature sensor is proposed. A Computational Fluid Dynamics (CFD) method is implemented to obtain radiation errors under various environmental conditions. For intercomparison, the low radiation error temperature sensor, a naturally ventilated radiation shield, a thermometer screen and an aspirated temperature measurement platform are characterized in the same environment, with the aspirated platform serving as the air temperature reference. The mean radiation errors of the naturally ventilated radiation shield and the thermometer screen are 0.57 °C and 0.32 °C, respectively; in contrast, the mean radiation error of the low radiation error temperature sensor is 0.05 °C. The sensor proposed in this research may therefore provide relatively accurate air temperature measurements. - Highlights: • A CFD method is applied to obtain a quantitative solution of radiation error. • A temperature sensor is proposed to minimize radiation error. • The radiation error of the temperature sensor is on the order of 0.05 °C.

  6. Errors in Neonatology

    OpenAIRE

    Antonio Boldrini; Rosa T. Scaramuzzo; Armando Cuttano

    2013-01-01

    Introduction: Danger and errors are inherent in human activities. In medical practice errors can lean to adverse events for patients. Mass media echo the whole scenario. Methods: We reviewed recent published papers in PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience in Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main err...

  7. Using a Delphi Method to Identify Human Factors Contributing to Nursing Errors.

    Science.gov (United States)

    Roth, Cheryl; Brewer, Melanie; Wieck, K Lynn

    2017-07-01

    The purpose of this study was to identify human factors associated with nursing errors. Using a Delphi technique, this study used feedback from a panel of nurse experts (n = 25) on an initial qualitative survey questionnaire followed by summarizing the results with feedback and confirmation. Synthesized factors regarding causes of errors were incorporated into a quantitative Likert-type scale, and the original expert panel participants were queried a second time to validate responses. The list identified 24 items as most common causes of nursing errors, including swamping and errors made by others that nurses are expected to recognize and fix. The responses provided a consensus top 10 errors list based on means with heavy workload and fatigue at the top of the list. The use of the Delphi survey established consensus and developed a platform upon which future study of nursing errors can evolve as a link to future solutions. This list of human factors in nursing errors should serve to stimulate dialogue among nurses about how to prevent errors and improve outcomes. Human and system failures have been the subject of an abundance of research, yet nursing errors continue to occur. © 2016 Wiley Periodicals, Inc.
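
    The consensus step described above, ranking synthesized factors by their mean Likert-type ratings, can be sketched as follows. The factor names echo the abstract's top findings, but the ratings themselves are hypothetical:

```python
# Rank candidate human-factor causes by mean Likert rating, as in the
# Delphi consensus step (expert ratings here are hypothetical).
from statistics import mean

ratings = {
    "heavy workload": [5, 5, 4, 5, 4],
    "fatigue":        [5, 4, 4, 5, 4],
    "distraction":    [4, 3, 4, 4, 3],
}

ranked = sorted(ratings, key=lambda factor: mean(ratings[factor]), reverse=True)
print(ranked)  # highest mean first
```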

  8. Consolidity: Mystery of inner property of systems uncovered

    Directory of Open Access Journals (Sweden)

    Hassen T. Dorrah

    2012-10-01

    Full Text Available This paper uncovers the mystery of consolidity, a hitherto hidden inner property of systems. Consolidity also reveals why strongly stable and highly controllable systems are not invulnerable to falling and collapsing. Consolidity is measured by the Consolidity Index, defined as the ratio of the overall change of output parameters to the combined change of input and system parameters, all operating in a fully fuzzy environment. Under this notion, systems are classified into consolidated, quasi-consolidated, neutrally consolidated, unconsolidated, quasi-unconsolidated and mixed types. The strategy for implementing consolidity is elaborated for natural and man-made existing systems as well as newly developed ones. An important critique arises that the by-product consolidity of natural or built-as-usual systems could trap such systems in a completely undesired unconsolidity. This suggests that the many conventional techniques that do not take system consolidity into account should gradually be changed and adjusted toward improved consolidity-based techniques. Four Golden Rules are highlighted for handling system consolidity and applied to several illustrative case studies, covering the consolidity analysis of the Drug Concentration problem, Predator-Prey Population problem, Spread of Infectious Disease problem, AIDS Epidemic problem and Arms Race model. It is demonstrated that consolidity changes are contrary (opposite in sign) to changes of both stability and controllability. This is a significant result, showing that our present practice of stressing strong stability and high controllability could already have jeopardized the consolidity behavior of an ample family of existing real-life systems. It is strongly recommended that the four Golden Rules of consolidity be enforced as future strict regulations of systems modeling, analysis, design and
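
    As an illustration of the Consolidity Index defined above, here is a crisp (non-fuzzy) simplification; the paper itself works in a fully fuzzy environment, and the parameter changes below are hypothetical:

```python
def consolidity_index(output_changes, input_changes, system_changes):
    """Ratio of overall output-parameter change to the combined
    input- and system-parameter change (crisp simplification of the
    paper's fully fuzzy formulation)."""
    combined = sum(abs(c) for c in input_changes) + sum(abs(c) for c in system_changes)
    overall = sum(abs(c) for c in output_changes)
    return overall / combined

# A system whose outputs move less than its inputs and parameters
# (index < 1) would be classed as consolidated in the paper's terminology.
idx = consolidity_index(output_changes=[0.2, 0.1],
                        input_changes=[0.5, 0.3],
                        system_changes=[0.2])
print(idx)  # 0.3
```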

  9. Errors in abdominal computed tomography

    International Nuclear Information System (INIS)

    Stephens, S.; Marting, I.; Dixon, A.K.

    1989-01-01

    Sixty-nine patients are presented in whom a substantial error was made on the initial abdominal computed tomography report. Certain features of these errors have been analysed. In 30 (43.5%) a lesion was simply not recognised (error of observation); in 39 (56.5%) the wrong conclusions were drawn about the nature of normal or abnormal structures (error of interpretation). The 39 errors of interpretation were more complex: in 7 patients an abnormal structure was noted but interpreted as normal, whereas in 4 a normal structure was thought to represent a lesion. Other interpretive errors included those where the wrong cause for a lesion was ascribed (24 patients) and those where the abnormality was substantially under-reported (4 patients). Various features of these errors are presented and discussed. Errors were made just as often in relation to small and large lesions. Consultants made as many errors as senior registrar radiologists. It is likely that dual reporting is the best method of avoiding such errors and, indeed, this is widely practised in our unit. (Author). 9 refs.; 5 figs.; 1 tab

  10. Detecting Role Errors in the Gene Hierarchy of the NCI Thesaurus

    Directory of Open Access Journals (Sweden)

    Yehoshua Perl

    2008-01-01

    Full Text Available Gene terminologies are playing an increasingly important role in the ever-growing field of genomic research. While errors in large, complex terminologies are inevitable, gene terminologies are even more susceptible to them due to the rapid growth of genomic knowledge and the nature of its discovery. It is therefore very important to establish quality-assurance protocols for such genomic-knowledge repositories. Different kinds of terminologies often require auditing methodologies adapted to their particular structures. In light of this, an auditing methodology tailored to the characteristics of the NCI Thesaurus's (NCIT's) Gene hierarchy is presented. The Gene hierarchy is of particular interest to the NCIT's designers due to the primary role of genomics in current cancer research. This multiphase methodology focuses on detecting role errors, such as missing roles or roles with incorrect or incomplete target structures, occurring within that hierarchy. The methodology is based on two kinds of abstraction networks, called taxonomies, that highlight the role distribution among concepts within the IS-A (subsumption) hierarchy. These abstract views tend to highlight portions of the hierarchy having a higher concentration of errors. The errors found during an application of the methodology

  11. Scaling prediction errors to reward variability benefits error-driven learning in humans.

    Science.gov (United States)

    Diederen, Kelly M J; Schultz, Wolfram

    2015-09-01

    Effective error-driven learning requires individuals to adapt learning to environmental reward variability. The adaptive mechanism may involve decays in learning rate across subsequent trials, as shown previously, and rescaling of reward prediction errors. The present study investigated the influence of prediction error scaling and, in particular, the consequences for learning performance. Participants explicitly predicted reward magnitudes that were drawn from different probability distributions with specific standard deviations. By fitting the data with reinforcement learning models, we found scaling of prediction errors, in addition to the learning rate decay shown previously. Importantly, the prediction error scaling was closely related to learning performance, defined as accuracy in predicting the mean of reward distributions, across individual participants. In addition, participants who scaled prediction errors relative to standard deviation also presented with more similar performance for different standard deviations, indicating that increases in standard deviation did not substantially decrease "adapters'" accuracy in predicting the means of reward distributions. However, exaggerated scaling beyond the standard deviation resulted in impaired performance. Thus efficient adaptation makes learning more robust to changing variability. Copyright © 2015 the American Physiological Society.
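
    A minimal sketch of the scaling mechanism described above: a delta-rule learner whose reward prediction error is divided by the standard deviation of the reward distribution. This is a simplification of the fitted reinforcement-learning models, with illustrative parameter values:

```python
# Delta-rule estimation of a reward mean with prediction errors
# rescaled to reward variability (illustrative parameters).
import random

def learn_mean(rewards, sd, alpha=0.1):
    """Estimate the reward mean; each prediction error is divided by
    the standard deviation of the reward distribution before updating."""
    estimate = 0.0
    for r in rewards:
        scaled_error = (r - estimate) / sd   # rescale PE to variability
        estimate += alpha * scaled_error
    return estimate

random.seed(0)
sd = 2.0
rewards = [random.gauss(10.0, sd) for _ in range(2000)]
print(learn_mean(rewards, sd))  # converges near the true mean of 10
```

Because the error is divided by the standard deviation, the effective step size shrinks in more variable environments, which is the adaptation that kept the "adapters'" accuracy stable across distributions.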

  12. Evolutionary modeling-based approach for model errors correction

    Directory of Open Access Journals (Sweden)

    S. Q. Wan

    2012-08-01

    Full Text Available The inverse problem of using the information of historical data to estimate model errors is one of the science frontier research topics. In this study, we investigate such a problem using the classic Lorenz (1963) equation as a prediction model and the Lorenz equation with a periodic evolutionary function as an accurate representation of reality to generate "observational data."

    On the basis of the intelligent features of evolutionary modeling (EM), including self-organization, self-adaptation and self-learning, the dynamic information contained in historical data can be identified and extracted automatically by computer. On this basis, the present paper proposes a new approach to estimating model errors using EM. Numerical tests demonstrate the ability of the new approach to correct model structural errors; in fact, it can realize the combination of statistics and dynamics to a certain extent.
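
    The experimental setup can be illustrated by generating the model-error series that EM would be fed: the classic Lorenz (1963) system serves as the prediction model, and the same system with a periodic perturbation stands in for "reality." The evolutionary-modeling step itself is not reproduced here, and the integrator, step size and perturbation are illustrative choices:

```python
import math

def lorenz_step(state, t, sigma=10.0, rho=28.0, beta=8.0 / 3.0,
                dt=0.01, model_error=False):
    """One Euler step of the Lorenz (1963) system; the periodic term
    stands in for the unknown model error in the 'true' system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    if model_error:                      # periodic evolutionary function
        dx += 2.0 * math.sin(0.5 * t)
    return (x + dt * dx, y + dt * dy, z + dt * dz)

model = truth = (1.0, 1.0, 1.0)
errors = []
for k in range(1000):
    t = k * 0.01
    model = lorenz_step(model, t)
    truth = lorenz_step(truth, t, model_error=True)
    errors.append(truth[0] - model[0])   # "observation minus model"

# This error series is the raw material an EM scheme would mine for
# the structural form of the model error.
print(len(errors))
```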

  13. Translating Research Into Practice: Voluntary Reporting of Medication Errors in Critical Access Hospitals

    Science.gov (United States)

    Jones, Katherine J.; Cochran, Gary; Hicks, Rodney W.; Mueller, Keith J.

    2004-01-01

    Context:Low service volume, insufficient information technology, and limited human resources are barriers to learning about and correcting system failures in small rural hospitals. This paper describes the implementation of and initial findings from a voluntary medication error reporting program developed by the Nebraska Center for Rural Health…

  14. Comparing Absolute Error with Squared Error for Evaluating Empirical Models of Continuous Variables: Compositions, Implications, and Consequences

    Science.gov (United States)

    Gao, J.

    2014-12-01

    Reducing modeling error is often a major concern of empirical geophysical models. However, modeling errors can be defined in different ways: When the response variable is continuous, the most commonly used metrics are squared (SQ) and absolute (ABS) errors. For most applications, ABS error is the more natural, but SQ error is mathematically more tractable, so is often used as a substitute with little scientific justification. Existing literature has not thoroughly investigated the implications of using SQ error in place of ABS error, especially not geospatially. This study compares the two metrics through the lens of bias-variance decomposition (BVD). BVD breaks down the expected modeling error of each model evaluation point into bias (systematic error), variance (model sensitivity), and noise (observation instability). It offers a way to probe the composition of various error metrics. I analytically derived the BVD of ABS error and compared it with the well-known SQ error BVD, and found that not only the two metrics measure the characteristics of the probability distributions of modeling errors differently, but also the effects of these characteristics on the overall expected error are different. Most notably, under SQ error all bias, variance, and noise increase expected error, while under ABS error certain parts of the error components reduce expected error. Since manipulating these subtractive terms is a legitimate way to reduce expected modeling error, SQ error can never capture the complete story embedded in ABS error. I then empirically compared the two metrics with a supervised remote sensing model for mapping surface imperviousness. Pair-wise spatially-explicit comparison for each error component showed that SQ error overstates all error components in comparison to ABS error, especially variance-related terms. Hence, substituting ABS error with SQ error makes model performance appear worse than it actually is, and the analyst would more likely accept a
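
    The well-known SQ error decomposition that the study takes as its starting point, expected squared error = bias² + variance + noise, can be checked numerically at a single evaluation point (the simulation parameters below are illustrative):

```python
import random

random.seed(1)
true_value, noise_sd = 5.0, 1.0

# Ensemble of model predictions at one evaluation point
# (spread = variance, offset = bias) and noisy observations.
n = 200_000
preds = [5.5 + random.gauss(0, 0.5) for _ in range(n)]
obs = [true_value + random.gauss(0, noise_sd) for _ in range(n)]

mean_pred = sum(preds) / n
bias_sq = (mean_pred - true_value) ** 2
variance = sum((p - mean_pred) ** 2 for p in preds) / n
noise = sum((o - true_value) ** 2 for o in obs) / n

expected_sq_error = sum((p - o) ** 2 for p, o in zip(preds, obs)) / n
print(expected_sq_error, bias_sq + variance + noise)  # approximately equal
```

All three terms enter the squared-error decomposition additively, which is exactly the property the abstract contrasts with ABS error, where some components enter subtractively.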

  15. Culture and error in space: implications from analog environments.

    Science.gov (United States)

    Helmreich, R L

    2000-09-01

    An ongoing study investigating national, organizational, and professional cultures in aviation and medicine is described. Survey data from 26 nations on 5 continents show highly significant national differences regarding appropriate relationships between leaders and followers, in group vs. individual orientation, and in values regarding adherence to rules and procedures. These findings replicate earlier research on dimensions of national culture. Data collected also isolate significant operational issues in multi-national flight crews. While there are no better or worse cultures, these cultural differences have operational implications for the way crews function in an international space environment. The positive professional cultures of pilots and physicians exhibit a high enjoyment of the job and professional pride. However, a negative component was also identified characterized by a sense of personal invulnerability regarding the effects of stress and fatigue on performance. This misperception of personal invulnerability has operational implications such as failures in teamwork and increased probability of error. A second component of the research examines team error in operational environments. From observational data collected during normal flight operations, new models of threat and error and their management were developed that can be generalized to operations in space and other socio-technological domains. Five categories of crew error are defined and their relationship to training programs in team performance, known generically as Crew Resource Management, is described. The relevance of these data for future spaceflight is discussed.

  16. Abnormal error monitoring in math-anxious individuals: evidence from error-related brain potentials.

    Directory of Open Access Journals (Sweden)

    Macarena Suárez-Pellicioni

    Full Text Available This study used event-related brain potentials to investigate whether math anxiety is related to abnormal error monitoring processing. Seventeen high math-anxious (HMA) and seventeen low math-anxious (LMA) individuals were presented with a numerical and a classical Stroop task. Groups did not differ in terms of trait or state anxiety. We found an enhanced error-related negativity (ERN) in the HMA group when subjects committed an error on the numerical Stroop task, but not on the classical Stroop task. Groups did not differ in terms of the correct-related negativity component (CRN), the error positivity component (Pe), classical behavioral measures or post-error measures. The amplitude of the ERN was negatively related to participants' math anxiety scores, showing a more negative amplitude as the score increased. Moreover, using standardized low resolution electromagnetic tomography (sLORETA), we found greater activation of the insula for errors on a numerical task as compared to errors on a non-numerical task only for the HMA group. The results were interpreted according to the motivational significance theory of the ERN.

  17. Heuristic errors in clinical reasoning.

    Science.gov (United States)

    Rylander, Melanie; Guerrasio, Jeannette

    2016-08-01

    Errors in clinical reasoning contribute to patient morbidity and mortality. The purpose of this study was to determine the types of heuristic errors made by third-year medical students and first-year residents. This study surveyed approximately 150 clinical educators, inquiring about the types of heuristic errors they observed in third-year medical students and first-year residents. Anchoring and premature closure were the two most common errors observed, and there was no difference in the types of errors observed between the two groups. Clinical educators perceived that both third-year medical students and first-year residents committed similar heuristic errors, implying that additional medical knowledge and clinical experience do not affect the types of heuristic errors made. Further work is needed to identify methods that can be used to reduce heuristic errors early in a clinician's education. © 2015 John Wiley & Sons Ltd.

  18. Awareness of technology-induced errors and processes for identifying and preventing such errors.

    Science.gov (United States)

    Bellwood, Paule; Borycki, Elizabeth M; Kushniruk, Andre W

    2015-01-01

    There is a need to determine if organizations working with health information technology are aware of technology-induced errors and how they are addressing and preventing them. The purpose of this study was to: a) determine the degree of technology-induced error awareness in various Canadian healthcare organizations, and b) identify those processes and procedures that are currently in place to help address, manage, and prevent technology-induced errors. We identified a lack of technology-induced error awareness among participants. Participants identified there was a lack of well-defined procedures in place for reporting technology-induced errors, addressing them when they arise, and preventing them.

  19. 76 FR 80337 - Uncovered Innerspring Units From the People's Republic of China: Rescission of Antidumping Duty...

    Science.gov (United States)

    2011-12-23

    ... fashion. Uncovered innersprings are classified under subheading 9404.29.9010, 9404.29.9005 and 9404.29... (``APO'') of their responsibility concerning the return or destruction of proprietary information... written notification of the return or destruction of APO materials or conversion to judicial protective...

  20. CCD image sensor induced error in PIV applications

    Science.gov (United States)

    Legrand, M.; Nogueira, J.; Vargas, A. A.; Ventas, R.; Rodríguez-Hidalgo, M. C.

    2014-06-01

    The readout procedure of charge-coupled device (CCD) cameras is known to generate some image degradation in different scientific imaging fields, especially in astrophysics. In the particular field of particle image velocimetry (PIV), widely extended in the scientific community, the readout procedure of the interline CCD sensor induces a bias in the registered position of particle images. This work proposes simple procedures to predict the magnitude of the associated measurement error. Generally, there are differences in the position bias for the different images of a certain particle at each PIV frame. This leads to a substantial bias error in the PIV velocity measurement (˜0.1 pixels). This is the order of magnitude that other typical PIV errors such as peak-locking may reach. Based on modern CCD technology and architecture, this work offers a description of the readout phenomenon and proposes a modeling for the CCD readout bias error magnitude. This bias, in turn, generates a velocity measurement bias error when there is an illumination difference between two successive PIV exposures. The model predictions match the experiments performed with two 12-bit-depth interline CCD cameras (MegaPlus ES 4.0/E incorporating the Kodak KAI-4000M CCD sensor with 4 megapixels). For different cameras, only two constant values are needed to fit the proposed calibration model and predict the error from the readout procedure. Tests by different researchers using different cameras would allow verification of the model, that can be used to optimize acquisition setups. Simple procedures to obtain these two calibration values are also described.
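
    The core mechanism, a per-frame position bias turning into a velocity bias when illumination differs between the two exposures, can be illustrated with hypothetical numbers. This is not the paper's two-constant calibration model, only the error-propagation step:

```python
# How a per-frame particle-position bias propagates into a PIV velocity
# error: the measured displacement picks up the difference between the
# two frames' readout biases (all numbers are hypothetical).
true_position_1, true_position_2 = 100.00, 105.00   # particle centers, pixels
bias_frame_1, bias_frame_2 = 0.02, 0.12             # readout bias per frame, pixels

measured_disp = (true_position_2 + bias_frame_2) - (true_position_1 + bias_frame_1)
true_disp = true_position_2 - true_position_1

velocity_bias = measured_disp - true_disp
print(velocity_bias)  # ~0.1 pixel/frame, the order of magnitude quoted above
```

When the two exposures are equally illuminated the two biases cancel; the velocity error appears only when they differ, which matches the abstract's statement.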

  1. CCD image sensor induced error in PIV applications

    International Nuclear Information System (INIS)

    Legrand, M; Nogueira, J; Vargas, A A; Ventas, R; Rodríguez-Hidalgo, M C

    2014-01-01

    The readout procedure of charge-coupled device (CCD) cameras is known to generate some image degradation in different scientific imaging fields, especially in astrophysics. In the particular field of particle image velocimetry (PIV), widely extended in the scientific community, the readout procedure of the interline CCD sensor induces a bias in the registered position of particle images. This work proposes simple procedures to predict the magnitude of the associated measurement error. Generally, there are differences in the position bias for the different images of a certain particle at each PIV frame. This leads to a substantial bias error in the PIV velocity measurement (∼0.1 pixels). This is the order of magnitude that other typical PIV errors such as peak-locking may reach. Based on modern CCD technology and architecture, this work offers a description of the readout phenomenon and proposes a modeling for the CCD readout bias error magnitude. This bias, in turn, generates a velocity measurement bias error when there is an illumination difference between two successive PIV exposures. The model predictions match the experiments performed with two 12-bit-depth interline CCD cameras (MegaPlus ES 4.0/E incorporating the Kodak KAI-4000M CCD sensor with 4 megapixels). For different cameras, only two constant values are needed to fit the proposed calibration model and predict the error from the readout procedure. Tests by different researchers using different cameras would allow verification of the model, that can be used to optimize acquisition setups. Simple procedures to obtain these two calibration values are also described. (paper)

  2. Multivariate weighted recurrence network inference for uncovering oil-water transitional flow behavior in a vertical pipe.

    Science.gov (United States)

    Gao, Zhong-Ke; Yang, Yu-Xuan; Cai, Qing; Zhang, Shan-Shan; Jin, Ning-De

    2016-06-01

    Exploring the dynamical behaviors of high water cut and low velocity oil-water flows remains a contemporary and challenging problem of significant importance. This challenge stimulates us to design a high-speed cycle motivation conductance sensor to capture spatial local flow information. We systematically carry out experiments and acquire the multi-channel measurements from different oil-water flow patterns. Then we develop a novel multivariate weighted recurrence network for uncovering the flow behaviors from multi-channel measurements. In particular, we exploit graph energy and weighted clustering coefficient in combination with multivariate time-frequency analysis to characterize the derived complex networks. The results indicate that the network measures are very sensitive to the flow transitions and allow uncovering local dynamical behaviors associated with water cut and flow velocity. These properties render our method particularly useful for quantitatively characterizing dynamical behaviors governing the transition and evolution of different oil-water flow patterns.
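
    The univariate, unweighted recurrence-network construction that the paper's multivariate weighted variant builds on can be sketched as follows; the toy signal, embedding delay and threshold are illustrative:

```python
# Basic recurrence network: embed a signal into state vectors and
# connect pairs of states closer than a threshold eps (toy example).
import math

signal = [math.sin(0.3 * i) for i in range(100)]

# Time-delay embedding into 2-D state vectors (delay = 3 samples)
states = [(signal[i], signal[i + 3]) for i in range(len(signal) - 3)]

eps = 0.15
n = len(states)
adjacency = [[0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        if math.dist(states[i], states[j]) < eps:
            adjacency[i][j] = adjacency[j][i] = 1

degrees = [sum(row) for row in adjacency]
print(sum(degrees) / n)  # mean degree: a first network measure of the signal
```

Measures such as graph energy and the weighted clustering coefficient used in the paper are then computed on (weighted, multi-channel) versions of this adjacency structure.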

  3. Falls and Postural Control in Older Adults With Eye Refractive Errors

    Directory of Open Access Journals (Sweden)

    Afsun Nodehi-Moghadam

    2016-04-01

    Conclusion: Vision impairment of older adults due to refractive error is not associated with an increase in falls. Furthermore, TUG test results did not show balance disorders in these groups. Further research, such as assessment of postural control with advanced devices and considering other falling risk factors is also needed to identify the predictors of falls in older adults with eye refractive errors.

  4. Estimation of heading gyrocompass error using a GPS 3DF system: Impact on ADCP measurements

    Directory of Open Access Journals (Sweden)

    Simón Ruiz

    2002-12-01

    Full Text Available Traditionally, the horizontal orientation of a ship (heading) has been obtained from a gyrocompass. This instrument is still used on research vessels but has an estimated error of about 2-3 degrees, inducing a systematic error in the cross-track velocity measured by an Acoustic Doppler Current Profiler (ADCP). The three-dimensional positioning system (GPS 3DF) provides an independent heading measurement with an accuracy better than 0.1 degree. The Spanish research vessel BIO Hespérides has been operating with this new system since 1996. For the first time on this vessel, data from this new instrument are used to estimate the gyrocompass error. The methodology follows the scheme developed by Griffiths (1994), which compares data from the gyrocompass and the GPS system in order to obtain an interpolated error function. In the present work we apply this methodology to mesoscale surveys performed during the observational phase of the OMEGA project in the Alboran Sea. The heading-dependent gyrocompass error dominated: errors in gyrocompass heading of 1.4-3.4 degrees were found, which give a maximum error in the measured cross-track ADCP velocity of 24 cm s-1.
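
    The comparison step can be sketched: difference the simultaneous gyrocompass and GPS 3DF headings at reference points, then interpolate a heading-dependent error function. Griffiths' (1994) scheme is more elaborate than this linear interpolation, and the headings below are hypothetical:

```python
# Estimate a heading-dependent gyrocompass error by differencing gyro
# and GPS 3DF headings, then interpolating between reference headings
# (hypothetical data; Griffiths' scheme is more elaborate).
import bisect

gyro = [10.0, 90.0, 180.0, 270.0]          # gyrocompass headings (deg)
gps  = [ 8.6, 88.2, 177.3, 267.9]          # simultaneous GPS 3DF headings
error_table = sorted(zip(gyro, [g - p for g, p in zip(gyro, gps)]))

def gyro_error(heading):
    """Linearly interpolated gyro error at a given heading (deg)."""
    headings = [h for h, _ in error_table]
    i = bisect.bisect_left(headings, heading)
    if i == 0:
        return error_table[0][1]
    if i == len(error_table):
        return error_table[-1][1]
    (h0, e0), (h1, e1) = error_table[i - 1], error_table[i]
    return e0 + (e1 - e0) * (heading - h0) / (h1 - h0)

print(gyro_error(135.0))  # between the reference errors at 90 and 180 deg
```

The interpolated error function can then be subtracted from the gyro heading before rotating ADCP velocities, removing the systematic cross-track bias.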

  5. Eliminating cancer stem cells: an interview with CCR’s Steven Hou | Center for Cancer Research

    Science.gov (United States)

    Steven Hou, Ph.D., senior investigator in the Basic Research Laboratory at the Center for Cancer Research describes his latest research that has uncovered potential ways to eliminate cancer stem cells and may offer hope to patients with recurrent tumors.  Learn more...

  6. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

    This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners will also find in the book a detailed presentation of practical data classifiers using MEE. These include multi-layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using an MEE-like concept is also presented. Examples, tests, evaluation experiments and comparisons with similar machines using classic approaches complement the descriptions.
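
    The MEE concept can be illustrated with the error-entropy estimator commonly used in this literature: Rényi's quadratic entropy of the classification errors, estimated through a Parzen-window "information potential." The kernel width and error samples below are illustrative:

```python
import math

def information_potential(errors, sigma=0.5):
    """Parzen estimate of the information potential V; the Renyi
    quadratic entropy of the errors is H2 = -log(V). MEE training
    minimizes H2, i.e. maximizes V."""
    n = len(errors)
    # Gaussian kernel of width sigma*sqrt(2), from convolving two kernels
    c = 1.0 / (2.0 * sigma * math.sqrt(math.pi))
    v = sum(c * math.exp(-(ei - ej) ** 2 / (4.0 * sigma ** 2))
            for ei in errors for ej in errors) / n ** 2
    return v

concentrated = [0.01, -0.02, 0.00, 0.02]   # errors tightly clustered
spread = [1.0, -1.2, 0.1, -0.9]            # errors widely spread

# Tightly clustered errors give a higher potential, hence lower entropy;
# MEE drives a classifier toward the concentrated case.
print(-math.log(information_potential(concentrated)) <
      -math.log(information_potential(spread)))
```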

  7. The Vanishing Site of Mina Shaughnessy's "Errors and Expectations."

    Science.gov (United States)

    Laurence, Patricia

    1993-01-01

    Claims that recent reassessments of Mina Shaughnessy's "Errors and Expectations" and the field of composition in the 1970s overlook the institutional forces that helped shape the rhetoric and methodology of researchers at that time. (HB)

  8. Uncovering transcriptional interactions via an adaptive fuzzy logic approach

    Directory of Open Access Journals (Sweden)

    Chen Chung-Ming

    2009-12-01

    Full Text Available Abstract Background To date, only a limited number of transcriptional regulatory interactions have been uncovered. In a pilot study integrating sequence data with microarray data, a position weight matrix (PWM) performed poorly in inferring transcriptional interactions (TIs), which represent physical interactions between transcription factors (TFs) and upstream sequences of target genes. Inferring a TI means that the promoter sequence of a target is inferred to match the consensus sequence motifs of a potential TF, and their interaction type, such as AT or RT, is also predicted. Thus, a robust PWM (rPWM) was developed to search for consensus sequence motifs. In addition to rPWM, one feature extracted from ChIP-chip data was incorporated to identify potential TIs under specific conditions. An interaction type classifier was assembled to predict activation/repression of potential TIs using microarray data. This approach, combining an adaptive (learning) fuzzy inference system and an interaction type classifier to predict transcriptional regulatory networks, was named AdaFuzzy. Results AdaFuzzy was applied to predict TIs using real genomics data from Saccharomyces cerevisiae. Following one of the latest advances in predicting TIs, constrained probabilistic sparse matrix factorization (cPSMF), and using 19 transcription factors (TFs), we compared AdaFuzzy to four well-known approaches using over-representation analysis and gene set enrichment analysis. AdaFuzzy outperformed these four algorithms. Furthermore, AdaFuzzy was shown to perform comparably to the 'ChIP-experimental method' in inferring TIs identified by two sets of large-scale ChIP-chip data, respectively. AdaFuzzy was also able to classify all predicted TIs into one or more of the four promoter architectures. The results coincided with known promoter architectures in yeast and provided insights into transcriptional regulatory mechanisms. Conclusion AdaFuzzy successfully integrates multiple types of
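
    The position weight matrix at the heart of the approach can be sketched as a log-odds scan of a promoter sequence. The 3-bp motif below is a toy example, and the paper's robust rPWM adds considerably more machinery:

```python
import math

# Toy position weight matrix for a 3-bp motif: per-position base
# probabilities, scored as log-odds against a uniform background.
pwm = [
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
    {"A": 0.1, "C": 0.7, "G": 0.1, "T": 0.1},
]
BACKGROUND = 0.25

def pwm_score(seq):
    """Log-odds score of one sequence window against the motif."""
    return sum(math.log2(pos[base] / BACKGROUND)
               for pos, base in zip(pwm, seq))

def best_window(promoter):
    """Highest-scoring window: the candidate TF binding site."""
    k = len(pwm)
    return max((pwm_score(promoter[i:i + k]), i)
               for i in range(len(promoter) - k + 1))

score, pos = best_window("TTAGCAA")
print(pos)  # the AGC window matches the motif best
```

A TI would be inferred when this best-window score exceeds a chosen threshold; the interaction type (activation/repression) is then predicted by the separate classifier described above.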

  9. The work is never ending: uncovering teamwork sustainability using realistic evaluation.

    Science.gov (United States)

    Frykman, Mandus; von Thiele Schwarz, Ulrica; Muntlin Athlin, Åsa; Hasson, Henna; Mazzocato, Pamela

    2017-03-20

    Purpose: The purpose of this paper is to uncover the mechanisms influencing the sustainability of behavior changes following the implementation of teamwork. Design/methodology/approach: Realistic evaluation was combined with a framework (DCOM®) based on applied behavior analysis to study the sustainability of behavior changes two and a half years after the initial implementation of teamwork at an emergency department. The DCOM® framework was used to categorize the mechanisms of behavior change interventions (BCIs) into the four categories of direction, competence, opportunity, and motivation. Non-participant observation and interview data were used. Findings: The teamwork behaviors were not sustained. A substantial fallback in managerial activities in combination with a complex context contributed to reduced direction, opportunity, and motivation. Reduced direction made staff members unclear about how and why they should work in teams. Deterioration of opportunity was evident from the lack of problem-solving resources, resulting in accumulated barriers to teamwork. Motivation in terms of management support and feedback was reduced. Practical implications: The implementation of complex organizational changes in complex healthcare contexts requires continuous adaptation and managerial activities well beyond the initial implementation period. Originality/value: By integrating the DCOM® framework with realistic evaluation, this study responds to the call for theoretically based research on behavioral mechanisms that can explain how BCIs interact with context and how this interaction influences sustainability.

  10. Comparing different error-conditions in film dosemeter evaluation

    International Nuclear Information System (INIS)

    Roed, H.; Figel, M.

    2007-01-01

    In the evaluation of a film used as a personal dosemeter, it may be necessary to mark the dosemeters when possible error-conditions are recognised, such as errors that influence the ability to make a correct evaluation of the dose value. In this project, a comparison was carried out to examine how two individual monitoring services (IMS) from two different EU countries mark their dosemeters: the National Institute of Radiation Hygiene (Denmark) (NIRH) and the National Research Centre for Environment and Health (Germany) (GSF). The IMS differ in size, type of customers and issuing period, but both use films as their primary dosemeters. The error-conditions examined are dosemeters exposed to moisture or light, contaminated dosemeters, films exposed outside the badge, missing filters in the badge, films inserted incorrectly in the badge, and dosemeters not returned or returned too late to the IMS. The data were collected for the year 2003, in which NIRH evaluated ∼50,000 and GSF ∼1.4 million film dosemeters. The percentage of film dosemeters is calculated for each error-condition, as well as the distribution among eight employee categories: medicine, nuclear medicine, nuclear industry, industry, radiography, laboratories, veterinary and others. It turned out that incorrect insertion of the film in the badge was the most common error-condition observed at both IMS, and that veterinarians, as an employee category, generally have the highest number of errors. NIRH has a significantly higher relative number of dosemeters in most error-conditions than GSF, which perhaps reflects that such a comparison is hampered by systemic and methodological differences between the IMS and the countries, e.g. regulations and monitoring programs. The absence of a common categorisation method for employee categories also makes a comparison like this difficult. (authors)

  11. Einstein's error

    International Nuclear Information System (INIS)

    Winterflood, A.H.

    1980-01-01

    In discussing Einstein's Special Relativity theory it is claimed that it violates the principle of relativity itself and that an anomalous sign in the mathematics is found in the factor which transforms one inertial observer's measurements into those of another inertial observer. The apparent source of this error is discussed. Having corrected the error a new theory, called Observational Kinematics, is introduced to replace Einstein's Special Relativity. (U.K.)

  12. Exploring key considerations when determining bona fide inadvertent errors resulting in understatements

    Directory of Open Access Journals (Sweden)

    Chrizanne de Villiers

    2016-03-01

    Full Text Available Chapter 16 of the Tax Administration Act (28 of 2011) (the TA Act) deals with understatement penalties. In the event of an ‘understatement’, in terms of Section 222 of the TA Act, a taxpayer must pay an understatement penalty, unless the understatement results from a bona fide inadvertent error. Determining a bona fide inadvertent error on taxpayers’ returns is an entirely new concept in the tax fraternity. It is of utmost importance that this section is applied correctly, based on sound evaluation principles rather than professional judgement, when determining whether the error was indeed a bona fide inadvertent error. This research study focuses on exploring key considerations when determining bona fide inadvertent errors resulting in understatements. The role and importance of tax penalty provisions are explored, and the meaning of the different components of the term ‘bona fide inadvertent error’ is critically analysed with the purpose of finding a possible definition for the term. The study also compares the provisions of other tax jurisdictions with regard to errors resulting in tax understatements, in order to find possible guidelines on the application of bona fide inadvertent errors as contained in Section 222 of the TA Act. The findings of the research study revealed that the term ‘bona fide inadvertent error’ contained in Section 222 of the TA Act should be defined urgently and that guidelines must be provided by SARS on the application of the new amendment. SARS should also clarify the application of a bona fide inadvertent error in light of the behaviours contained in Section 223 of the TA Act to avoid any confusion.

  13. Seeing the forest through the trees: uncovering phenomic complexity through interactive network visualization.

    Science.gov (United States)

    Warner, Jeremy L; Denny, Joshua C; Kreda, David A; Alterovitz, Gil

    2015-03-01

    Our aim was to uncover unrecognized phenomic relationships using force-based network visualization methods, based on observed electronic medical record data. A primary phenotype was defined from actual patient profiles in the Multiparameter Intelligent Monitoring in Intensive Care II database. Network visualizations depicting primary relationships were compared to those incorporating secondary adjacencies. Interactivity was enabled through a phenotype visualization software concept: the Phenomics Advisor. Subendocardial infarction with cardiac arrest was demonstrated as a sample phenotype; there were 332 primarily adjacent diagnoses, with 5423 relationships. Primary network visualization suggested a treatment-related complication phenotype and several rare diagnoses; re-clustering by secondary relationships revealed an emergent cluster of smokers with the metabolic syndrome. Network visualization reveals phenotypic patterns that may have remained occult in pairwise correlation analysis. Visualization of complex data, potentially offered as point-of-care tools on mobile devices, may allow clinicians and researchers to quickly generate hypotheses and gain deeper understanding of patient subpopulations. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Controlling errors in unidosis carts

    Directory of Open Access Journals (Sweden)

    Inmaculada Díaz Fernández

    2010-01-01

    Full Text Available Objective: To identify errors in the unidosis system carts. Method: For two months, the Pharmacy Service controlled medication either returned or missing from the unidosis carts, both in the pharmacy and in the wards. Results: Unrevised unidosis carts show 0.9% medication errors (264), versus 0.6% (154) in unidosis carts previously revised. In carts not revised, 70.83% of the errors arise when setting up the unidosis carts. The rest are due to a lack of stock or unavailability (21.6%), errors in the transcription of medical orders (6.81%), or boxes not having been emptied previously (0.76%). The errors found in the units correspond to errors in the transcription of the treatment (3.46%), non-receipt of the unidosis copy (23.14%), the patient not taking the medication (14.36%) or being discharged without medication (12.77%), medication not provided by nurses (14.09%), medication withdrawn from the stocks of the unit (14.62%), and errors of the pharmacy service (17.56%). Conclusions: The results show the need to revise unidosis carts and to use a computerized prescription system to avoid errors in transcription. Discussion: A high percentage of medication errors is caused by human error. If unidosis carts are revised before being sent to hospitalization units, the error rate diminishes to 0.3%.

  15. Using lexical variables to predict picture-naming errors in jargon aphasia

    Directory of Open Access Journals (Sweden)

    Catherine Godbold

    2015-04-01

    Full Text Available Introduction Individuals with jargon aphasia produce fluent output which often comprises high proportions of non-word errors (e.g., maf for dog). Research has been devoted to identifying the underlying mechanisms behind such output. Some accounts posit a reduced flow of spreading activation between levels in the lexical network (e.g., Robson et al., 2003). If activation level differences across the lexical network are a cause of non-word outputs, we would predict improved performance when target items reflect an increased flow of activation between levels (e.g., more frequently-used words are often represented by higher resting levels of activation). This research investigates the effect of lexical properties of targets (e.g., frequency, imageability) on accuracy, error type (real word vs. non-word) and target-error overlap of non-word errors in a picture naming task by individuals with jargon aphasia. Method Participants were 17 individuals with Wernicke’s aphasia, who produced a high proportion of non-word errors (>20% of errors) on the Philadelphia Naming Test (PNT; Roach et al., 1996). The data were retrieved from the Moss Aphasic Psycholinguistic Database Project (MAPPD; Mirman et al., 2010). We used a series of mixed models to test whether lexical variables predicted accuracy, error type (real word vs. non-word) and target-error overlap for the PNT data. As lexical variables tend to be highly correlated, we performed a principal components analysis to reduce the variables to five components representing variables associated with phonology (length, phonotactic probability, neighbourhood density and neighbourhood frequency), semantics (imageability and concreteness), usage (frequency and age-of-acquisition), name agreement and visual complexity. Results and Discussion Table 1 shows the components that made a significant contribution to each model. Individuals with jargon aphasia produced more correct responses and fewer non-word errors relative to

  16. Threat and error management for anesthesiologists: a predictive risk taxonomy

    Science.gov (United States)

    Ruskin, Keith J.; Stiegler, Marjorie P.; Park, Kellie; Guffey, Patrick; Kurup, Viji; Chidester, Thomas

    2015-01-01

    Purpose of review Patient care in the operating room is a dynamic interaction that requires cooperation among team members and reliance upon sophisticated technology. Most human factors research in medicine has been focused on analyzing errors and implementing system-wide changes to prevent them from recurring. We describe a set of techniques that has been used successfully by the aviation industry to analyze errors and adverse events and explain how these techniques can be applied to patient care. Recent findings Threat and error management (TEM) describes adverse events in terms of risks or challenges that are present in an operational environment (threats) and the actions of specific personnel that potentiate or exacerbate those threats (errors). TEM is a technique widely used in aviation, and can be adapted for use in a medical setting to predict high-risk situations and prevent errors in the perioperative period. A threat taxonomy is a novel way of classifying and predicting the hazards that can occur in the operating room. TEM can be used to identify error-producing situations, analyze adverse events, and design training scenarios. Summary TEM offers a multifaceted strategy for identifying hazards, reducing errors, and training physicians. A threat taxonomy may improve analysis of critical events with subsequent development of specific interventions, and may also serve as a framework for training programs in risk mitigation. PMID:24113268

  17. Error Analysis of 3D Metal Micromold Fabricated by Femtosecond Laser Cutting and Microelectric Resistance Slip Welding

    Directory of Open Access Journals (Sweden)

    Bin Xu

    2013-01-01

    Full Text Available We used a micro-double-staged laminated object manufacturing process (micro-DLOM) to fabricate a 3D micromold, and we also studied the error of the micro-DLOM. Firstly, we derived the principle error of the micro-DLOM. Based on the mathematical expression, it can be deduced that the smaller the opening angle α and the steel foil thickness h are, the smaller the principle error δ is. Secondly, we studied the error of femtosecond laser cutting. The experimental results show that the error of femtosecond laser cutting is 0.5 μm under 110 mW femtosecond laser power, 100 μm/s cutting speed, and 0.75 μm dimension compensation. Finally, we investigated the error of microelectric resistance slip welding. Based on the results, the minimum error of the microcavity mold in the height direction is only 0.22 μm when the welding voltage is 0.21 V and the number of slip welding discharges is 160.

  18. 77 FR 21961 - Uncovered Innerspring Units From the People's Republic of China: Final Results and Final...

    Science.gov (United States)

    2012-04-12

    ... material and then glued together in a linear fashion. Uncovered innersprings are classified under... responsibility concerning the return or destruction of proprietary information disclosed under the APO, which... notification of the return/destruction of APO materials or conversion to judicial protective order is hereby...

  19. How EFL students can use Google to correct their “untreatable” written errors

    Directory of Open Access Journals (Sweden)

    Luc Geiller

    2014-09-01

    Full Text Available This paper presents the findings of an experiment in which a group of 17 French post-secondary EFL learners used Google to self-correct several “untreatable” written errors. Whether or not error correction leads to improved writing has been much debated, some researchers dismissing it as useless and others arguing that error feedback leads to more grammatical accuracy. In her response to Truscott (1996), Ferris (1999) explains that it would be unreasonable to abolish correction given the present state of knowledge, and that further research needed to focus on which types of errors were more amenable to which types of error correction. In her attempt to respond more effectively to her students’ errors, she made the distinction between “treatable” and “untreatable” ones: the former occur in “a patterned, rule-governed way” and include problems with verb tense or form, subject-verb agreement, run-ons, noun endings, articles and pronouns, while the latter include a variety of lexical errors and problems with word order and sentence structure, including missing and unnecessary words. Substantial research on the use of search engines as a tool for L2 learners has been carried out, suggesting that the web plays an important role in fostering language awareness and learner autonomy (e.g., Shei 2008a, 2008b; Conroy 2010). According to Bathia and Richie (2009: 547), “the application of Google for language learning has just begun to be tapped.” Within the framework of this study it was assumed that the students, conversant with digital technologies and using Google and the web on a regular basis, could use various search options and the search results to self-correct their errors instead of relying on their teacher to provide direct feedback. After receiving some in-class training on how to formulate Google queries, the students were asked to use a customized Google search engine limiting searches to 28 information websites to correct up to

  20. Most Common Formal Grammatical Errors Committed by Authors

    Science.gov (United States)

    Onwuegbuzie, Anthony J.

    2017-01-01

    Empirical evidence has been provided about the importance of avoiding American Psychological Association (APA) errors in the abstract, body, reference list, and table sections of empirical research articles. Specifically, authors are significantly more likely to have their manuscripts rejected for publication if they commit numerous APA…

  1. Errors and violations

    International Nuclear Information System (INIS)

    Reason, J.

    1988-01-01

    This paper is in three parts. The first part summarizes the human failures responsible for the Chernobyl disaster and argues that, in considering the human contribution to power plant emergencies, it is necessary to distinguish between errors and violations, and between active and latent failures. The second part presents empirical evidence, drawn from driver behavior, which suggests that errors and violations have different psychological origins. The concluding part outlines a resident pathogen view of accident causation, and seeks to identify the various system pathways along which errors and violations may be propagated

  2. Development of a framework to estimate human error for diagnosis tasks in advanced control room

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, In Seok; Seong, Poong Hyun

    2014-01-01

    In an emergency situation at nuclear power plants (NPPs), a diagnosis of the occurring events is crucial for managing or controlling the plant to a safe and stable condition. If the operators fail to diagnose the occurring events or relevant situations, their responses can eventually be inappropriate or inadequate. Accordingly, much research has been performed to identify the causes of diagnosis error and to estimate the probability of diagnosis error. D.I. Gertman et al. asserted that 'the cognitive failures stem from erroneous decision-making, poor understanding of rules and procedures, and inadequate problem solving and these failures may be due to quality of data and people's capacity for processing information'. Many researchers have also asserted that the human-system interface (HSI), procedures, training and available time are critical factors causing diagnosis error. As advanced main control rooms are being adopted in nuclear power plants, the operators may obtain the plant data via computer-based HSI and procedures. In this regard, diagnosis errors and their causes were identified using simulation data. From this study, some useful insights to reduce diagnosis errors of operators in advanced main control rooms were provided

  3. The conditions that promote fear learning: prediction error and Pavlovian fear conditioning.

    Science.gov (United States)

    Li, Susan Shi Yuan; McNally, Gavan P

    2014-02-01

    A key insight of associative learning theory is that learning depends on the actions of prediction error: a discrepancy between the actual and expected outcomes of a conditioning trial. When positive, such error causes increments in associative strength and, when negative, such error causes decrements in associative strength. Prediction error can act directly on fear learning by determining the effectiveness of the aversive unconditioned stimulus or indirectly by determining the effectiveness, or associability, of the conditioned stimulus. Evidence from a variety of experimental preparations in human and non-human animals suggest that discrete neural circuits code for these actions of prediction error during fear learning. Here we review the circuits and brain regions contributing to the neural coding of prediction error during fear learning and highlight areas of research (safety learning, extinction, and reconsolidation) that may profit from this approach to understanding learning. Crown Copyright © 2013. Published by Elsevier Inc. All rights reserved.
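
The error-driven updating described above is classically captured by the Rescorla-Wagner rule: associative strength V moves by a fraction (alpha) of the prediction error between the obtained outcome and the expected outcome. The sketch below is a generic textbook formulation with illustrative parameter values, not a model from the review itself.

```python
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Track associative strength V across conditioning trials.

    On reinforced trials the outcome is lam; on non-reinforced trials it is 0.
    A positive prediction error increments V, a negative one decrements it.
    """
    V = 0.0
    history = []
    for reinforced in trials:
        error = (lam if reinforced else 0.0) - V  # prediction error
        V += alpha * error
        history.append(V)
    return history

# 10 reinforced trials (acquisition) followed by 10 non-reinforced (extinction)
curve = rescorla_wagner([True] * 10 + [False] * 10)
```

The curve rises toward lam during acquisition (shrinking positive error) and decays toward zero during extinction (negative error), mirroring the increments and decrements in associative strength the abstract describes.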

  4. A Comparison of Error-Correction Procedures on Skill Acquisition during Discrete-Trial Instruction

    Science.gov (United States)

    Carroll, Regina A.; Joachim, Brad T.; St. Peter, Claire C.; Robinson, Nicole

    2015-01-01

    Previous research supports the use of a variety of error-correction procedures to facilitate skill acquisition during discrete-trial instruction. We used an adapted alternating treatments design to compare the effects of 4 commonly used error-correction procedures on skill acquisition for 2 children with attention deficit hyperactivity disorder…

  5. The invisible Web uncovering information sources search engines can't see

    CERN Document Server

    Sherman, Chris

    2001-01-01

    Enormous expanses of the Internet are unreachable with standard web search engines. This book provides the key to finding these hidden resources by identifying how to uncover and use invisible web resources. Mapping the invisible Web, when and how to use it, assessing the validity of the information, and the future of Web searching are topics covered in detail. Only 16 percent of Net-based information can be located using a general search engine. The other 84 percent is what is referred to as the invisible Web-made up of information stored in databases. Unlike pages on the visible Web, informa

  6. Characteristics and evidence of nursing scientific production for medication errors at the hospital environment

    Directory of Open Access Journals (Sweden)

    Lolita Dopico da Silva

    2012-06-01

    Full Text Available This study aimed to identify the characteristics of nurses’ publications about medication errors. An integrative review methodology was used, covering January 2005 to October 2010 with the descriptors "medication errors" and "nursing", and data were collected from electronic databases via the “Capes Portal”. Results show four categories: the conduct of health professionals in medication errors, types and rates of errors, medication system weaknesses, and barriers to error. The prevalent conduct discussed was not reporting the error. The prevalent error type was administration error, with rates ranging from 14.8 to 56.7%. Illegible handwriting, communication failures among professionals, and lack of technical knowledge were weaknesses. Among the barriers, those involving the patient, nurses and technology were evident. Advances in research testing barriers were found, and some gaps were apparent, notably the lack of studies addressing pharmacodynamic or pharmacokinetic aspects of the drugs involved in errors.

  7. Stochastic and sensitivity analysis of shape error of inflatable antenna reflectors

    Science.gov (United States)

    San, Bingbing; Yang, Qingshan; Yin, Liwei

    2017-03-01

    Inflatable antennas are promising candidates to realize future satellite communications and space observations since they are lightweight, low-cost and small-packaged-volume. However, due to their high flexibility, inflatable reflectors are difficult to manufacture accurately, which may result in undesirable shape errors, and thus affect their performance negatively. In this paper, the stochastic characteristics of shape errors induced during manufacturing process are investigated using Latin hypercube sampling coupled with manufacture simulations. Four main random error sources are involved, including errors in membrane thickness, errors in elastic modulus of membrane, boundary deviations and pressure variations. Using regression and correlation analysis, a global sensitivity study is conducted to rank the importance of these error sources. This global sensitivity analysis is novel in that it can take into account the random variation and the interaction between error sources. Analyses are parametrically carried out with various focal-length-to-diameter ratios (F/D) and aperture sizes (D) of reflectors to investigate their effects on significance ranking of error sources. The research reveals that RMS (Root Mean Square) of shape error is a random quantity with an exponential probability distribution and features great dispersion; with the increase of F/D and D, both mean value and standard deviation of shape errors are increased; in the proposed range, the significance ranking of error sources is independent of F/D and D; boundary deviation imposes the greatest effect with a much higher weight than the others; pressure variation ranks the second; error in thickness and elastic modulus of membrane ranks the last with very close sensitivities to pressure variation. Finally, suggestions are given for the control of the shape accuracy of reflectors and allowable values of error sources are proposed from the perspective of reliability.
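
The sampling-plus-ranking workflow can be sketched as follows. Latin hypercube samples of the four error sources drive a response model, and sources are ranked by the magnitude of their correlation with the resulting RMS. The linear response model and its coefficients here are invented stand-ins for the paper's manufacture simulations, chosen only so that boundary deviation dominates and pressure ranks second, as in the reported ranking.

```python
import random

def latin_hypercube(n, dims, rng):
    """n samples in [0,1)^dims: one stratified point per interval, shuffled per dimension."""
    cols = []
    for _ in range(dims):
        col = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

rng = random.Random(42)
sources = ["thickness", "modulus", "boundary", "pressure"]
samples = latin_hypercube(200, len(sources), rng)
# Hypothetical RMS response: boundary deviation dominates, pressure second.
rms = [0.3 * t + 0.3 * e + 5.0 * b + 1.5 * p + rng.gauss(0, 0.05)
       for t, e, b, p in samples]
ranking = sorted(
    sources,
    key=lambda s: -abs(pearson([x[sources.index(s)] for x in samples], rms)))
```

Replacing the toy response with an actual simulation (and the correlation with a regression coefficient) recovers the global sensitivity study described above.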

  8. Efficiently characterizing the total error in quantum circuits

    Science.gov (United States)

    Carignan-Dugas, Arnaud; Wallman, Joel J.; Emerson, Joseph

    A promising technological advancement meant to enlarge our computational means is the quantum computer. Such a device would harvest the quantum complexity of the physical world in order to unfold concrete mathematical problems more efficiently. However, the errors emerging from the implementation of quantum operations are likewise quantum, and hence share a similar level of intricacy. Fortunately, randomized benchmarking protocols provide an efficient way to characterize the operational noise within quantum devices. The resulting figures of merit, like the fidelity and the unitarity, are typically attached to a set of circuit components. While important, this doesn't fulfill the main goal: determining if the error rate of the total circuit is small enough in order to trust its outcome. In this work, we fill the gap by providing an optimal bound on the total fidelity of a circuit in terms of component-wise figures of merit. Our bound smoothly interpolates between the classical regime, in which the error rate grows linearly in the circuit's length, and the quantum regime, which can naturally allow quadratic growth. Conversely, our analysis substantially improves the bounds on single circuit element fidelities obtained through techniques such as interleaved randomized benchmarking. This research was supported by the U.S. Army Research Office through Grant W911NF- 14-1-0103, CIFAR, the Government of Ontario, and the Government of Canada through NSERC and Industry Canada.
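
The contrast between linear and quadratic error growth can be made concrete with two textbook worst-case envelopes. These closed forms are generic illustrations of the two regimes, not the optimal bound derived in the work above.

```python
def incoherent_envelope(r, n):
    """Stochastic noise: per-gate error rates add, so total error grows like n*r."""
    return min(1.0, n * r)

def coherent_envelope(r, n):
    """Coherent noise: error *amplitudes* (~sqrt(r)) can add in phase,
    so total error can grow like (n * sqrt(r))**2 = n**2 * r."""
    return min(1.0, (n * r ** 0.5) ** 2)

# For a per-gate error rate of 1e-4 over 10 gates, the incoherent envelope
# gives 1e-3 while the coherent envelope already allows 1e-2.
```

A bound that interpolates between these envelopes, as described above, lets component-wise benchmarking figures certify whether the total circuit error is small enough to trust the outcome.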

  9. Human errors evaluation for muster in emergency situations applying human error probability index (HEPI, in the oil company warehouse in Hamadan City

    Directory of Open Access Journals (Sweden)

    2012-12-01

    Full Text Available Introduction: An emergency situation is one of the factors influencing human error. The aim of this research was to evaluate human error in an emergency situation of fire and explosion at the oil company warehouse in Hamadan city, applying the human error probability index (HEPI). Material and Method: First, the scenario of an emergency fire-and-explosion situation at the oil company warehouse was designed, and then a muster maneuver was performed. The scaled muster questionnaire for the maneuver was completed in the next stage. Collected data were analyzed to calculate the probability of success for the 18 actions required in an emergency situation, from the starting point of the muster to the last action, reaching a temporary safe shelter. Result: The results showed that the highest probability of error occurrence was related to making the workplace safe (evaluation phase), with 32.4%, and the lowest probability of error occurrence was in detecting the alarm (awareness phase), with 1.8%. The highest severity of error was in the evaluation phase and the lowest in the awareness and recovery phases. The maximum risk level was related to evaluating exit routes, selecting one route, and choosing another exit route, and the minimum risk level was related to the four evaluation phases. Conclusion: To reduce the risk of error in the exit phases of an emergency situation, the following actions are recommended, based on the findings of this study: periodic evaluation of the exit phases and modification if necessary, conducting more maneuvers, and analyzing the results along with sufficient feedback to the employees.
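
One way to combine the 18 per-action success probabilities into an overall muster success probability is a simple product, under the assumption that the actions are sequential and independent. That independence assumption, and the example probabilities below, are this sketch's own; only the 32.4% error figure comes from the abstract.

```python
import math

def muster_success(per_action_success):
    """Overall success probability for a chain of independent sequential actions."""
    return math.prod(per_action_success)

# e.g. 17 routine actions plus one high-risk evaluation action with the
# 32.4% error probability reported above (illustrative values)
probs = [0.98] * 17 + [1 - 0.324]
overall = muster_success(probs)
```

The product form makes the study's conclusion visible: a single high-error action (here 0.676 success) can cut the overall success probability roughly in half, which is why the evaluation phase is the natural target for improvement.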

  10. Imagery of Errors in Typing

    Science.gov (United States)

    Rieger, Martina; Martinez, Fanny; Wenke, Dorit

    2011-01-01

    Using a typing task we investigated whether insufficient imagination of errors and error corrections is related to duration differences between execution and imagination. In Experiment 1 spontaneous error imagination was investigated, whereas in Experiment 2 participants were specifically instructed to imagine errors. Further, in Experiment 2 we…

  11. Trellis and turbo coding iterative and graph-based error control coding

    CERN Document Server

    Schlegel, Christian B

    2015-01-01

    This new edition has been extensively revised to reflect the progress in error control coding over the past few years. Over 60% of the material has been completely reworked, and 30% of the material is original. Convolutional, turbo, and low density parity-check (LDPC) coding and polar codes in a unified framework. Advanced research-related developments such as spatial coupling. A focus on algorithmic and implementation aspects of error control coding.

  12. A procedure for the significance testing of unmodeled errors in GNSS observations

    Science.gov (United States)

    Li, Bofeng; Zhang, Zhetao; Shen, Yunzhong; Yang, Ling

    2018-01-01

    It is a crucial task to establish a precise mathematical model for global navigation satellite system (GNSS) observations in precise positioning. Due to the spatiotemporal complexity of, and limited knowledge on, systematic errors in GNSS observations, some residual systematic errors would inevitably remain even after being corrected with empirical models and parameterization. These residual systematic errors are referred to as unmodeled errors. However, most of the existing studies mainly focus on handling the systematic errors that can be properly modeled and then simply ignore the unmodeled errors that may actually exist. To further improve the accuracy and reliability of GNSS applications, such unmodeled errors must be handled, especially when they are significant. The first question, therefore, is how to statistically validate the significance of unmodeled errors. In this research, we propose a procedure to examine the significance of these unmodeled errors by the combined use of hypothesis tests. With this testing procedure, three components of unmodeled errors, i.e., the nonstationary signal, stationary signal and white noise, are identified. The procedure is tested by using simulated data and real BeiDou datasets with varying error sources. The results show that the unmodeled errors can be discriminated by our procedure with approximately 90% confidence. The efficiency of the proposed procedure is further confirmed by applying the time-domain Allan variance analysis and frequency-domain fast Fourier transform. In summary, spatiotemporally correlated unmodeled errors are commonly present in GNSS observations and mainly governed by the residual atmospheric biases and multipath. Their patterns may also be impacted by the receiver.
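
The time-domain check mentioned above can be sketched with a non-overlapped Allan variance: for pure white noise it falls off roughly as 1/m with the cluster size m, so a flat or rising curve flags correlated (unmodeled) structure in the residuals. The series here is simulated white noise, not real GNSS data, and the simple non-overlapped estimator is one of several common variants.

```python
import random

def allan_variance(x, m):
    """Non-overlapped Allan variance for cluster size m:
    average x into K clusters of length m, then take half the mean
    squared difference of successive cluster means."""
    K = len(x) // m
    means = [sum(x[k * m:(k + 1) * m]) / m for k in range(K)]
    return sum((means[k + 1] - means[k]) ** 2
               for k in range(K - 1)) / (2 * (K - 1))

rng = random.Random(0)
white = [rng.gauss(0.0, 1.0) for _ in range(4096)]
avar_1 = allan_variance(white, 1)    # close to the white-noise variance
avar_64 = allan_variance(white, 64)  # roughly 1/64 of avar_1 for white noise
```

Applying the same estimator to positioning residuals, and comparing the observed slope against the 1/m white-noise line, separates the white-noise component from the stationary and nonstationary signals the procedure identifies.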

  13. Residents' numeric inputting error in computerized physician order entry prescription.

    Science.gov (United States)

    Wu, Xue; Wu, Changxu; Zhang, Kan; Wei, Dong

    2016-04-01

    Computerized physician order entry (CPOE) systems with embedded clinical decision support (CDS) can significantly reduce certain types of prescription error. However, prescription errors still occur. Various factors, such as the numeric inputting methods in human computer interaction (HCI), produce different error rates and types, but have received relatively little attention. This study aimed to examine the effects of numeric inputting methods and urgency levels on numeric inputting errors in prescriptions, as well as to categorize the types of errors. Thirty residents participated in four prescribing tasks in which two factors were manipulated: numeric inputting methods (numeric row in the main keyboard vs. numeric keypad) and urgency levels (urgent situation vs. non-urgent situation). Multiple aspects of participants' prescribing behavior were measured in sober prescribing situations. The results revealed that in urgent situations, participants were prone to make mistakes when using the numeric row in the main keyboard. With control of performance in the sober prescribing situation, the effects of the input methods disappeared, and urgency was found to play a significant role in the generalized linear model. Most errors were either omission or substitution types, but the proportion of transposition and intrusion error types was significantly higher than in previous research. Among the numbers 3, 8, and 9, the less common digits used in prescriptions, the error rate was higher, posing a great risk to patient safety. Urgency played a more important role in CPOE numeric typing errors than typing skills and typing habits. It was recommended that inputting be done with the numeric keypad, which had lower error rates in urgent situations. An alternative design could consider increasing the sensitivity of the keys with lower frequency of occurrence and decimals. To improve the usability of CPOE, numeric keyboard design and error detection could benefit from spatial

  14. Coping with human errors through system design: Implications for ecological interface design

    DEFF Research Database (Denmark)

    Rasmussen, Jens; Vicente, Kim J.

    1989-01-01

    Research during recent years has revealed that human errors are not stochastic events which can be removed through improved training programs or optimal interface design. Rather, errors tend to reflect either systematic interference between various models, rules, and schemata, or the effects of the adaptive mechanisms involved in learning. In terms of design implications, these findings suggest that reliable human-system interaction will be achieved by designing interfaces which tend to minimize the potential for control interference and support recovery from errors. In other words, the focus should be on control of the effects of errors rather than on the elimination of errors per se. In this paper, we propose a theoretical framework for interface design that attempts to satisfy these objectives. The goal of our framework, called ecological interface design, is to develop a meaningful representation...

  15. Magnetic Nanoparticle Thermometer: An Investigation of Minimum Error Transmission Path and AC Bias Error

    Directory of Open Access Journals (Sweden)

    Zhongzhou Du

    2015-04-01

    Full Text Available The signal transmission module of a magnetic nanoparticle thermometer (MNPT) was established in this study to analyze the error sources introduced during the signal flow in the hardware system. The underlying error sources that significantly affected the precision of the MNPT were determined through mathematical modeling and simulation. A transfer module path with the minimum error in the hardware system was then proposed through analysis of the variations of the system error caused by the significant error sources as the signal flowed through the signal transmission module. In addition, a system parameter, named the signal-to-AC bias ratio (i.e., the ratio between the signal and the AC bias), was identified as a direct determinant of the precision of the measured temperature. The temperature error was below 0.1 K when the signal-to-AC bias ratio was higher than 80 dB and other system errors were not considered. The temperature error was also below 0.1 K in experiments with a commercial magnetic fluid (Sample SOR-10, Ocean Nanotechnology, Springdale, AR, USA) when the hardware system of the MNPT was designed with the aforementioned method.
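The reported 80 dB threshold can be checked numerically. A minimal sketch, assuming the conventional 20·log10 amplitude-ratio definition of the decibel (the abstract does not spell out which convention it uses), with illustrative amplitudes:

```python
import math

def signal_to_ac_bias_ratio_db(signal_amplitude: float, ac_bias_amplitude: float) -> float:
    """Signal-to-AC bias ratio in decibels (20*log10 amplitude convention assumed)."""
    return 20.0 * math.log10(signal_amplitude / ac_bias_amplitude)

# 80 dB corresponds to an amplitude ratio of 10**4; the amplitudes are illustrative.
print(signal_to_ac_bias_ratio_db(1.0, 1e-4))  # 80.0
```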

  16. Prioritising interventions against medication errors

    DEFF Research Database (Denmark)

    Lisby, Marianne; Pape-Larsen, Louise; Sørensen, Ann Lykkegaard

    Abstract Authors: Lisby M, Larsen LP, Soerensen AL, Nielsen LP, Mainz J Title: Prioritising interventions against medication errors – the importance of a definition Objective: To develop and test a restricted definition of medication errors across health care settings in Denmark Methods: Medication errors constitute a major quality and safety problem in modern healthcare. However, far from all are clinically important. The prevalence of medication errors ranges from 2-75%, indicating a global problem in defining and measuring these [1]. New cut-off levels focusing on the clinical impact of medication errors are therefore needed. Development of definition: A definition of medication errors including an index of error types for each stage in the medication process was developed from existing terminology and through a modified Delphi-process in 2008. The Delphi panel consisted of 25 interdisciplinary...

  17. Uncovering Listeria monocytogenes hypervirulence by harnessing its biodiversity

    Science.gov (United States)

    Charlier, Caroline; Touchon, Marie; Chenal-Francisque, Viviane; Leclercq, Alexandre; Criscuolo, Alexis; Gaultier, Charlotte; Roussel, Sophie; Brisabois, Anne; Disson, Olivier; Rocha, Eduardo P. C.; Brisse, Sylvain; Lecuit, Marc

    2016-01-01

    Microbial pathogenesis studies are typically performed with reference strains, thereby overlooking microbial intra-species virulence heterogeneity. Here we integrated human epidemiological and clinical data with bacterial population genomics to harness the biodiversity of the model foodborne pathogen Listeria monocytogenes and decipher the basis of its neural and placental tropisms. Taking advantage of the clonal structure of this bacterial species, we identify clones epidemiologically associated with either food or human central nervous system (CNS) and maternal-neonatal (MN) listeriosis. The latter are also most prevalent in patients without immunosuppressive comorbidities. Strikingly, CNS and MN clones are hypervirulent in a humanized mouse model of listeriosis. By integrating epidemiological data and comparative genomics, we uncovered multiple novel putative virulence factors and demonstrated experimentally the contribution of the first gene cluster mediating Listeria monocytogenes neural and placental tropisms. This study illustrates the exceptional power of harnessing microbial biodiversity to identify clinically relevant microbial virulence attributes. PMID:26829754

  18. Medication Errors in the Southeast Asian Countries: A Systematic Review.

    Directory of Open Access Journals (Sweden)

    Shahrzad Salmasi

    Full Text Available Medication error (ME) is a worldwide issue, but most studies on ME have been undertaken in developed countries and very little is known about ME in Southeast Asian countries. This study aimed to systematically identify and review research done on ME in Southeast Asian countries in order to identify common types of ME and estimate its prevalence in this region. The literature relating to MEs in Southeast Asian countries was systematically reviewed in December 2014 using Embase, Medline, PubMed, ProQuest Central and CINAHL. Inclusion criteria were studies (in any language) that investigated the incidence and the contributing factors of ME in patients of all ages. The 17 included studies reported data from six of the eleven Southeast Asian countries: five studies in Singapore, four in Malaysia, three in Thailand, three in Vietnam, one in the Philippines and one in Indonesia. There were no data on MEs in Brunei, Laos, Cambodia, Myanmar and Timor. Of the seventeen included studies, eleven measured administration errors, four focused on prescribing errors, three on preparation errors, three on dispensing errors and two on transcribing errors. There was only one study of reconciliation error. Three studies were interventional. The most frequently reported types of administration error were incorrect time, omission error and incorrect dose. Staff shortages, and hence heavy workload for nurses, doctor/nurse distraction, and misinterpretation of the prescription/medication chart were identified as contributing factors of ME. There is a serious lack of studies on this topic in this region, which needs to be addressed if the issue of ME is to be fully understood and tackled.

  19. Working memory load impairs the evaluation of behavioral errors in the medial frontal cortex.

    Science.gov (United States)

    Maier, Martin E; Steinhauser, Marco

    2017-10-01

    Early error monitoring in the medial frontal cortex enables error detection and the evaluation of error significance, which helps prioritize adaptive control. This ability has been assumed to be independent of central capacity, a limited pool of resources assumed to be involved in cognitive control. The present study investigated whether error evaluation depends on central capacity by measuring the error-related negativity (Ne/ERN) in a flanker paradigm while working memory load was varied on two levels. We used a four-choice flanker paradigm in which participants had to classify targets while ignoring flankers. Errors could be due to responding either to the flankers (flanker errors) or to none of the stimulus elements (nonflanker errors). With low load, the Ne/ERN was larger for flanker errors than for nonflanker errors, an effect that has previously been interpreted as reflecting the differential significance of these error types. With high load, no such effect of error type on the Ne/ERN was observable. Our findings suggest that working memory load does not impair the generation of an Ne/ERN per se but rather impairs the evaluation of error significance. They demonstrate that error monitoring is composed of capacity-dependent and capacity-independent mechanisms. © 2017 Society for Psychophysiological Research.

  20. Stereochemical errors and their implications for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Freddolino Peter L

    2011-05-01

    Full Text Available Abstract Background Biological molecules are often asymmetric with respect to stereochemistry, and correct stereochemistry is essential to their function. Molecular dynamics simulations of biomolecules have increasingly become an integral part of biophysical research. However, stereochemical errors in biomolecular structures can have a dramatic impact on the results of simulations. Results Here we illustrate the effects that chirality and peptide bond configuration flips may have on the secondary structure of proteins throughout a simulation. We also analyze the most common sources of stereochemical errors in biomolecular structures and present software tools to identify, correct, and prevent stereochemical errors in molecular dynamics simulations of biomolecules. Conclusions Use of the tools presented here should become a standard step in the preparation of biomolecular simulations and in the generation of predicted structural models for proteins and nucleic acids.
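A common way to flag the chirality flips this abstract describes is the sign of the triple product of bond vectors around a tetrahedral centre: reflection reverses the sign, so comparing it against a reference structure detects an inversion. This is a minimal sketch of that geometric check, not the software tools the paper presents; the coordinates are illustrative.

```python
def signed_volume(center, a, b, c):
    """Triple product (u x v) . w of three bond vectors from a tetrahedral centre.

    The sign flips under reflection, so a sign change relative to a reference
    structure indicates a chirality inversion at this centre.
    """
    u = [a[i] - center[i] for i in range(3)]
    v = [b[i] - center[i] for i in range(3)]
    w = [c[i] - center[i] for i in range(3)]
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
            - u[1] * (v[0] * w[2] - v[2] * w[0])
            + u[2] * (v[0] * w[1] - v[1] * w[0]))

# A centre and its mirror image (z coordinate negated) give opposite signs.
print(signed_volume((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)))   # 1
print(signed_volume((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, -1)))  # -1
```

Run over every chiral centre in each trajectory frame, a check like this catches the configuration flips that would otherwise silently corrupt a simulation.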

  1. Medical errors in hospitalized pediatric trauma patients with chronic health conditions

    Directory of Open Access Journals (Sweden)

    Xiaotong Liu

    2014-01-01

    Full Text Available Objective: This study compares medical errors in pediatric trauma patients with and without chronic conditions. Methods: The 2009 Kids’ Inpatient Database, which included 123,303 trauma discharges, was analyzed. Medical errors were identified by International Classification of Diseases, Ninth Revision, Clinical Modification diagnosis codes. The medical error rates per 100 discharges and per 1000 hospital days were calculated and compared between inpatients with and without chronic conditions. Results: Pediatric trauma patients with chronic conditions experienced a higher medical error rate than patients without chronic conditions: 4.04 (95% confidence interval: 3.75–4.33) versus 1.07 (95% confidence interval: 0.98–1.16) per 100 discharges. The rate of medical error differed by type of chronic condition. After controlling for confounding factors, the presence of a chronic condition increased the adjusted odds ratio of medical error by 37% if one chronic condition existed (adjusted odds ratio: 1.37, 95% confidence interval: 1.21–1.5) and 69% if more than one chronic condition existed (adjusted odds ratio: 1.69, 95% confidence interval: 1.48–1.53). In the adjusted model, length of stay had the strongest association with medical error, but the adjusted odds ratio for chronic conditions and medical error remained significantly elevated even when accounting for length of stay, suggesting that medical complexity plays a role in medical error. Higher adjusted odds ratios were seen in other subgroups. Conclusion: Chronic conditions are associated with a significantly higher rate of medical errors in pediatric trauma patients. Future research should evaluate interventions or guidelines for reducing the risk of medical errors in pediatric trauma patients with chronic conditions.
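The rate and odds-ratio figures above follow standard epidemiological formulas. A minimal sketch with made-up counts (the database's actual cell counts are not reproduced here, so the numbers below are purely illustrative):

```python
def rate_per_100_discharges(n_errors: int, n_discharges: int) -> float:
    """Medical error rate expressed per 100 discharges."""
    return 100.0 * n_errors / n_discharges

def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Odds ratio for a 2x2 table:
    a = exposed with error,   b = exposed without error,
    c = unexposed with error, d = unexposed without error."""
    return (a / b) / (c / d)

# Illustrative counts only, chosen to echo the direction of the reported effect.
print(rate_per_100_discharges(4, 100))         # 4.0
print(round(odds_ratio(40, 960, 10, 990), 3))  # 4.125
```

The adjusted odds ratios in the abstract additionally control for confounders (e.g. length of stay) via a regression model, which a raw 2x2 computation like this does not do.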

  2. Learning from Errors

    Science.gov (United States)

    Metcalfe, Janet

    2017-01-01

    Although error avoidance during learning appears to be the rule in American classrooms, laboratory studies suggest that it may be a counterproductive strategy, at least for neurologically typical students. Experimental investigations indicate that errorful learning followed by corrective feedback is beneficial to learning. Interestingly, the…

  3. Error-information in tutorial documentation: Supporting users' errors to facilitate initial skill learning

    NARCIS (Netherlands)

    Lazonder, Adrianus W.; van der Meij, Hans

    1995-01-01

    Novice users make many errors when they first try to learn how to work with a computer program like a spreadsheet or wordprocessor. No matter how user-friendly the software or the training manual, errors can and will occur. The current view on errors is that they can be helpful or disruptive,

  4. Health | Page 17 | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Researchers with the Africa Health Systems Initiative (AHSI) have uncovered ways to strengthen health systems in sub-Saharan Africa. Weak health systems cause a high burden of preventable and treatable illnesses, especially for those living in rural areas. Strong health systems are needed to provide adequate access ...

  5. The 3 faces of clinical reasoning: Epistemological explorations of disparate error reduction strategies.

    Science.gov (United States)

    Monteiro, Sandra; Norman, Geoff; Sherbino, Jonathan

    2018-03-13

    There is general consensus that clinical reasoning involves 2 stages: a rapid stage where 1 or more diagnostic hypotheses are advanced and a slower stage where these hypotheses are tested or confirmed. The rapid hypothesis generation stage is considered inaccessible for analysis or observation. Consequently, recent research on clinical reasoning has focused specifically on improving the accuracy of the slower, hypothesis confirmation stage. Three perspectives have developed in this line of research, and each proposes different error reduction strategies for clinical reasoning. This paper considers these 3 perspectives and examines the underlying assumptions. Additionally, this paper reviews the evidence, or lack thereof, behind each class of error reduction strategies. The first perspective takes an epidemiological stance, appealing to the benefits of incorporating population data and evidence-based medicine into everyday clinical reasoning. The second builds on the heuristics and biases research programme, appealing to a special class of dual-process reasoning models that pairs a rapid, error-prone cognitive process for problem solving with a slower, more logical cognitive process capable of correcting those errors. Finally, the third perspective borrows from an exemplar model of categorization that explicitly relates clinical knowledge and experience to diagnostic accuracy. © 2018 John Wiley & Sons, Ltd.

  6. Medication error detection in two major teaching hospitals: What are the types of errors?

    Directory of Open Access Journals (Sweden)

    Fatemeh Saghafi

    2014-01-01

    Full Text Available Background: The increasing number of reports on medication errors and the subsequent damage, especially in medical centers, has become a growing concern for patient safety in recent decades. Patient safety, and in particular medication safety, is a major concern and challenge for health care professionals around the world. Our prospective study was designed to detect prescribing, transcribing, dispensing, and administering medication errors in two major university hospitals. Materials and Methods: After choosing 20 similar hospital wards in two large teaching hospitals in the city of Isfahan, Iran, the sequence was randomly selected. Diagrams for drug distribution were drawn with the help of the pharmacy directors. Direct observation was chosen as the method for detecting errors. A total of 50 doses were studied in each ward to detect prescribing, transcribing and administering errors. The dispensing error was studied on 1000 doses dispensed in each hospital pharmacy. Results: A total of 8162 doses of medications were studied during the four stages, of which 8000 yielded complete data for analysis. 73% of prescribing orders were incomplete and did not have all six parameters (name, dosage form, dose and measuring unit, administration route, and intervals of administration). We found 15% transcribing errors. On average, one-third of medication administrations were erroneous in both hospitals. Dispensing errors ranged between 1.4% and 2.2%. Conclusion: Although prescribing and administering account for most medication errors, improvements are needed in all four stages. Clear guidelines must be written and executed in both hospitals to reduce the incidence of medication errors.

  7. Spreadsheet Error Detection: an Empirical Examination in the Context of Greece

    Directory of Open Access Journals (Sweden)

    Dimitrios Maditinos

    2012-06-01

    Full Text Available The personal computer era made advanced programming tasks available to end users. Spreadsheet models are one of the most widely used applications and can produce valuable results with minimal training and effort. However, errors contained in most spreadsheets may be catastrophic and difficult to detect. This study investigates the influence of experience and spreadsheet presentation on end users' error-finding performance. To reach the target of the study, 216 business and finance students participated in a task of finding errors in a simple free cash flow model. The findings of the study reveal that the presentation of the spreadsheet is of major importance to error-finding performance, while experience does not seem to affect students' performance. Further research proposals and limitations of the study are also discussed.

  8. Dissociating response conflict and error likelihood in anterior cingulate cortex.

    Science.gov (United States)

    Yeung, Nick; Nieuwenhuis, Sander

    2009-11-18

    Neuroimaging studies consistently report activity in anterior cingulate cortex (ACC) in conditions of high cognitive demand, leading to the view that ACC plays a crucial role in the control of cognitive processes. According to one prominent theory, the sensitivity of ACC to task difficulty reflects its role in monitoring for the occurrence of competition, or "conflict," between responses to signal the need for increased cognitive control. However, a contrasting theory proposes that ACC is the recipient rather than source of monitoring signals, and that ACC activity observed in relation to task demand reflects the role of this region in learning about the likelihood of errors. Response conflict and error likelihood are typically confounded, making the theories difficult to distinguish empirically. The present research therefore used detailed computational simulations to derive contrasting predictions regarding ACC activity and error rate as a function of response speed. The simulations demonstrated a clear dissociation between conflict and error likelihood: fast response trials are associated with low conflict but high error likelihood, whereas slow response trials show the opposite pattern. Using the N2 component as an index of ACC activity, an EEG study demonstrated that when conflict and error likelihood are dissociated in this way, ACC activity tracks conflict and is negatively correlated with error likelihood. These findings support the conflict-monitoring theory and suggest that, in speeded decision tasks, ACC activity reflects current task demands rather than the retrospective coding of past performance.

  9. Social aspects of clinical errors.

    Science.gov (United States)

    Richman, Joel; Mason, Tom; Mason-Whitehead, Elizabeth; McIntosh, Annette; Mercer, Dave

    2009-08-01

    Clinical errors, whether committed by doctors, nurses or other professions allied to healthcare, remain a sensitive issue requiring open debate and policy formulation in order to reduce them. The literature suggests that the issues underpinning errors made by healthcare professionals involve concerns about patient safety, professional disclosure, apology, litigation, compensation, processes of recording and policy development to enhance quality service. Anecdotally, we are aware of narratives of minor errors, which may well have been covered up and remain officially undisclosed whilst the major errors resulting in damage and death to patients alarm both professionals and public with resultant litigation and compensation. This paper attempts to unravel some of these issues by highlighting the historical nature of clinical errors and drawing parallels to contemporary times by outlining the 'compensation culture'. We then provide an overview of what constitutes a clinical error and review the healthcare professional strategies for managing such errors.

  10. Test-Retest Reliability of the Adaptive Chemistry Assessment Survey for Teachers: Measurement Error and Alternatives to Correlation

    Science.gov (United States)

    Harshman, Jordan; Yezierski, Ellen

    2016-01-01

    Determining the error of measurement is a necessity for researchers engaged in bench chemistry, chemistry education research (CER), and a multitude of other fields. Discussions have occurred regarding what the construct of measurement error entails and how best to measure it, but the critiques of traditional measures have yielded few alternatives.…

  11. Passive quantum error correction of linear optics networks through error averaging

    Science.gov (United States)

    Marshman, Ryan J.; Lund, Austin P.; Rohde, Peter P.; Ralph, Timothy C.

    2018-02-01

    We propose and investigate a method of error detection and noise correction for bosonic linear networks using a method of unitary averaging. The proposed error averaging does not rely on ancillary photons or control and feedforward correction circuits, remaining entirely passive in its operation. We construct a general mathematical framework for this technique and then give a series of proof of principle examples including numerical analysis. Two methods for the construction of averaging are then compared to determine the most effective manner of implementation and probe the related error thresholds. Finally we discuss some of the potential uses of this scheme.

  12. Paediatric Patient Safety and the Need for Aviation Black Box Thinking to Learn From and Prevent Medication Errors.

    Science.gov (United States)

    Huynh, Chi; Wong, Ian C K; Correa-West, Jo; Terry, David; McCarthy, Suzanne

    2017-04-01

    Since the publication of To Err Is Human: Building a Safer Health System in 1999, there has been much research conducted into the epidemiology, nature and causes of medication errors in children, from prescribing and supply to administration. It is reassuring to see growing evidence of improving medication safety in children; however, based on media reports, it can be seen that serious and fatal medication errors still occur. This critical opinion article examines the problem of medication errors in children and provides recommendations for research, training of healthcare professionals and a culture shift towards dealing with medication errors. There are three factors that we need to consider to unravel what is missing and why fatal medication errors still occur. (1) Who is involved and affected by the medication error? (2) What factors hinder staff and organisations from learning from mistakes? Does the fear of litigation and criminal charges deter healthcare professionals from voluntarily reporting medication errors? (3) What are the educational needs required to prevent medication errors? It is important to educate future healthcare professionals about medication errors and human factors to prevent these from happening. Further research is required to apply aviation's 'black box' principles in healthcare to record and learn from near misses and errors to prevent future events. There is an urgent need for the black box investigations to be published and made public for the benefit of other organisations that may have similar potential risks for adverse events. International sharing of investigations and learning is also needed.

  13. Integration of error tolerance into the design of control rooms of nuclear power plants

    International Nuclear Information System (INIS)

    Sepanloo, Kamran

    1998-08-01

    Many complex technological systems' failures have been attributed to human errors. Today, based on extensive research on the role of the human element in technological systems, it is known that human errors cannot be totally eliminated in modern, flexible, or changing work environments by conventional design strategies (e.g. defence in depth) or better instructions, nor should they be. Instead, the operators' ability to explore degrees of freedom should be supported, and means for recovering from the effects of errors should be included. This calls for innovative, error-tolerant design of technological systems. Integration of the error-tolerance concept into the design, construction, startup, and operation of nuclear power plants provides an effective means of reducing human error occurrence during all stages of a plant's life and therefore leads to considerable enhancement of plant safety

  14. Apologies and Medical Error

    Science.gov (United States)

    2008-01-01

    One way in which physicians can respond to a medical error is to apologize. Apologies—statements that acknowledge an error and its consequences, take responsibility, and communicate regret for having caused harm—can decrease blame, decrease anger, increase trust, and improve relationships. Importantly, apologies also have the potential to decrease the risk of a medical malpractice lawsuit and can help settle claims by patients. Patients indicate they want and expect explanations and apologies after medical errors and physicians indicate they want to apologize. However, in practice, physicians tend to provide minimal information to patients after medical errors and infrequently offer complete apologies. Although fears about potential litigation are the most commonly cited barrier to apologizing after medical error, the link between litigation risk and the practice of disclosure and apology is tenuous. Other barriers might include the culture of medicine and the inherent psychological difficulties in facing one’s mistakes and apologizing for them. Despite these barriers, incorporating apology into conversations between physicians and patients can address the needs of both parties and can play a role in the effective resolution of disputes related to medical error. PMID:18972177

  15. Error-related potentials during continuous feedback: using EEG to detect errors of different type and severity

    Science.gov (United States)

    Spüler, Martin; Niethammer, Christian

    2015-01-01

    When a person recognizes an error during a task, an error-related potential (ErrP) can be measured as response. It has been shown that ErrPs can be automatically detected in tasks with time-discrete feedback, which is widely applied in the field of Brain-Computer Interfaces (BCIs) for error correction or adaptation. However, there are only a few studies that concentrate on ErrPs during continuous feedback. With this study, we wanted to answer three different questions: (i) Can ErrPs be measured in electroencephalography (EEG) recordings during a task with continuous cursor control? (ii) Can ErrPs be classified using machine learning methods and is it possible to discriminate errors of different origins? (iii) Can we use EEG to detect the severity of an error? To answer these questions, we recorded EEG data from 10 subjects during a video game task and investigated two different types of error (execution error, due to inaccurate feedback; outcome error, due to not achieving the goal of an action). We analyzed the recorded data to show that during the same task, different kinds of error produce different ErrP waveforms and have a different spectral response. This allows us to detect and discriminate errors of different origin in an event-locked manner. By utilizing the error-related spectral response, we show that also a continuous, asynchronous detection of errors is possible. Although the detection of error severity based on EEG was one goal of this study, we did not find any significant influence of the severity on the EEG. PMID:25859204

  16. Error-related potentials during continuous feedback: using EEG to detect errors of different type and severity

    Directory of Open Access Journals (Sweden)

    Martin eSpüler

    2015-03-01

    Full Text Available When a person recognizes an error during a task, an error-related potential (ErrP) can be measured as response. It has been shown that ErrPs can be automatically detected in tasks with time-discrete feedback, which is widely applied in the field of Brain-Computer Interfaces (BCIs) for error correction or adaptation. However, there are only a few studies that concentrate on ErrPs during continuous feedback. With this study, we wanted to answer three different questions: (i) Can ErrPs be measured in electroencephalography (EEG) recordings during a task with continuous cursor control? (ii) Can ErrPs be classified using machine learning methods and is it possible to discriminate errors of different origins? (iii) Can we use EEG to detect the severity of an error? To answer these questions, we recorded EEG data from 10 subjects during a video game task and investigated two different types of error (execution error, due to inaccurate feedback; outcome error, due to not achieving the goal of an action). We analyzed the recorded data to show that during the same task, different kinds of error produce different ErrP waveforms and have a different spectral response. This allows us to detect and discriminate errors of different origin in an event-locked manner. By utilizing the error-related spectral response, we show that a continuous, asynchronous detection of errors is also possible. Although the detection of error severity based on EEG was one goal of this study, we did not find any significant influence of the severity on the EEG.

  17. Post-error expression of speed and force while performing a simple, monotonous task with a haptic pen

    NARCIS (Netherlands)

    Bruns, M.; Keyson, D.V.; Jabon, M.E.; Hummels, C.C.M.; Hekkert, P.P.M.; Bailenson, J.N.

    2013-01-01

    Control errors often occur in repetitive and monotonous tasks, such as manual assembly tasks. Much research has been done in the area of human error identification; however, most existing systems focus solely on the prediction of errors, not on increasing worker accuracy. The current study examines

  18. Running Records and First Grade English Learners: An Analysis of Language Related Errors

    Science.gov (United States)

    Briceño, Allison; Klein, Adria F.

    2018-01-01

    The purpose of this study was to determine if first-grade English Learners made patterns of language related errors when reading, and if so, to identify those patterns and how teachers coded language related errors when analyzing English Learners' running records. Using research from the fields of both literacy and Second Language Acquisition, we…

  19. Development of safety analysis and constraint detection techniques for process interaction errors

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Chin-Feng, E-mail: csfanc@saturn.yzu.edu.tw [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China); Tsai, Shang-Lin; Tseng, Wan-Hui [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China)

    2011-02-15

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, and it may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation by a logic process. We call them 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.
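The static check described in this abstract — comparing one process's post-conditions against another's pre-conditions for conflicts — can be sketched as a literal-level contradiction test. This is a minimal illustration, not the paper's fault-tree method; the process names and condition labels are hypothetical.

```python
def conflicting_conditions(post_conditions, pre_conditions):
    """Return literals asserted in one set and negated in the other.

    Conditions are strings; a leading '!' marks negation. A non-empty
    result flags a potential semantic interaction error between the
    two processes being checked.
    """
    negated = {c[1:] if c.startswith("!") else "!" + c for c in post_conditions}
    return sorted(negated & set(pre_conditions))

# Hypothetical example: process A's post-conditions vs. process B's pre-conditions.
post_a = {"valve_open", "!pump_running"}
pre_b = {"pump_running", "valve_open"}
print(conflicting_conditions(post_a, pre_b))  # ['pump_running']
```

Here process B requires `pump_running` while process A's post-state asserts its negation, so the pair would be flagged for fault-tree analysis.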

  20. Development of safety analysis and constraint detection techniques for process interaction errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Tsai, Shang-Lin; Tseng, Wan-Hui

    2011-01-01

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, and it may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation by a logic process; we call them 'semantic interaction errors'. Such abnormal interactions are not adequately addressed in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.

  1. Deductive Error Diagnosis and Inductive Error Generalization for Intelligent Tutoring Systems.

    Science.gov (United States)

    Hoppe, H. Ulrich

    1994-01-01

    Examines the deductive approach to error diagnosis for intelligent tutoring systems. Topics covered include the principles of the deductive approach to diagnosis; domain-specific heuristics to solve the problem of generalizing error patterns; and deductive diagnosis and the hypertext-based learning environment. (Contains 26 references.) (JLB)

  2. VOLUMETRIC ERROR COMPENSATION IN FIVE-AXIS CNC MACHINING CENTER THROUGH KINEMATICS MODELING OF GEOMETRIC ERROR

    Directory of Open Access Journals (Sweden)

    Pooyan Vahidi Pashsaki

    2016-06-01

    Full Text Available Accuracy of a five-axis CNC machine tool is affected by a vast number of error sources. This paper investigates volumetric error modeling and its compensation as the basis for creating new tool paths that improve workpiece accuracy. The volumetric error model of a five-axis machine tool with the RTTTR configuration (tilting head B-axis and rotary table A-axis on the workpiece side) was set up taking into consideration rigid body kinematics and homogeneous transformation matrices, in which 43 error components are included. Each of these 43 error components can separately reduce the geometrical and dimensional accuracy of workpieces. The machining accuracy of a workpiece is governed by the position of the cutting tool center point (TCP) relative to the workpiece; when the cutting tool deviates from its ideal position relative to the workpiece, machining error results. The compensation process comprises detecting the present tool path and analyzing the geometric error of the RTTTR five-axis CNC machine tool, translating current component positions to compensated positions using the kinematics error model, converting the newly created components to new tool paths using the compensation algorithms, and finally editing the old G-codes using a G-code generator algorithm.
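
    As a toy illustration of the kinematic error modeling idea (not the paper's 43-component model), a small positioning error inserted into a chain of homogeneous transforms shifts the TCP, and that shift is exactly what a compensated tool path must cancel; all dimensions below are made up.

```python
# Minimal sketch: propagate one geometric error through a chain of 4x4
# homogeneous transforms and measure the resulting TCP deviation.
# Pure-Python matrix math; the axis layout and values are assumptions.

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translate(x, y, z):
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def apply(t, p):
    """Apply a homogeneous transform to a 3D point."""
    x, y, z = p
    return tuple(t[i][0]*x + t[i][1]*y + t[i][2]*z + t[i][3] for i in range(3))

# Nominal kinematic chain: base -> column -> spindle
nominal = matmul(translate(100.0, 0.0, 0.0), translate(0.0, 0.0, 50.0))
# Same chain with a 0.02 mm positioning error on the first axis
actual = matmul(translate(100.02, 0.0, 0.0), translate(0.0, 0.0, 50.0))

tcp_nominal = apply(nominal, (0.0, 0.0, 0.0))
tcp_actual = apply(actual, (0.0, 0.0, 0.0))
deviation = tuple(a - n for a, n in zip(tcp_actual, tcp_nominal))
print(deviation)  # ~(0.02, 0.0, 0.0): the offset the new tool path must cancel
```

    In the paper's setting the same composition is carried out with all 43 error components, and the compensation algorithm subtracts the resulting deviation from the commanded positions.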

  3. Errorful and errorless learning: The impact of cue-target constraint in learning from errors.

    Science.gov (United States)

    Bridger, Emma K; Mecklinger, Axel

    2014-08-01

    The benefits of testing on learning are well described, and attention has recently turned to what happens when errors are elicited during learning: Is testing nonetheless beneficial, or can errors hinder learning? Whilst recent findings have indicated that tests boost learning even if errors are made on every trial, other reports, emphasizing the benefits of errorless learning, have indicated that errors lead to poorer later memory performance. The possibility that this discrepancy is a function of the materials that must be learned, in particular the relationship between the cues and targets, was addressed here. Cued recall after either a study-only errorless condition or an errorful learning condition was contrasted across cue-target associations, for which the extent to which the target was constrained by the cue was either high or low. Experiment 1 showed that whereas errorful learning led to greater recall for low-constraint stimuli, it led to a significant decrease in recall for high-constraint stimuli. This interaction is thought to reflect the extent to which retrieval is constrained by the cue-target association, as well as by the presence of preexisting semantic associations. The advantage of errorful retrieval for low-constraint stimuli was replicated in Experiment 2, and the interaction with stimulus type was replicated in Experiment 3, even when guesses were randomly designated as being either correct or incorrect. This pattern provides support for inferences derived from reports in which participants made errors on all learning trials, whilst highlighting the impact of material characteristics on the benefits and disadvantages that accrue from errorful learning in episodic memory.

  4. Decoding of DBEC-TBED Reed-Solomon codes. [Double-Byte-Error-Correcting, Triple-Byte-Error-Detecting

    Science.gov (United States)

    Deng, Robert H.; Costello, Daniel J., Jr.

    1987-01-01

    A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. In LSI and VLSI technology, memories are often organized on a multiple bit (or byte) per chip basis. For example, some 256K-bit DRAMs are organized as 32K x 8-bit bytes. Byte-oriented codes such as Reed-Solomon (RS) codes can provide efficient low overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. The paper presents a special decoding technique for double-byte-error-correcting, triple-byte-error-detecting RS codes which is capable of high-speed operation. This technique is designed to find the error locations and the error values directly from the syndrome without having to use the iterative algorithm to find the error locator polynomial.
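
    The syndrome, the decoder's starting point in the abstract above, can be sketched for a small GF(2^8) Reed-Solomon-style code; the parameters below (word length 15, five syndromes, primitive polynomial 0x11D) are illustrative choices, not the exact code construction of the paper.

```python
# Build GF(2^8) log/antilog tables and compute RS syndromes.
# Illustrative parameters; only the syndrome mechanism follows the abstract.

PRIM = 0x11D  # a commonly used primitive polynomial for GF(2^8)
EXP, LOG = [0] * 512, [0] * 256
x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= PRIM
for i in range(255, 512):          # duplicate so exponents need no modulo
    EXP[i] = EXP[i - 255]

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

def syndromes(received, num_syms):
    """S_i = r(alpha^i); all zero iff the word is a valid codeword."""
    out = []
    for i in range(1, num_syms + 1):
        s = 0
        for j, c in enumerate(received):
            s ^= gf_mul(c, EXP[(i * j) % 255])
        out.append(s)
    return out

n = 15                       # short illustrative word length
codeword = [0] * n           # the all-zero word is a valid codeword
print(syndromes(codeword, 5))        # [0, 0, 0, 0, 0]
received = codeword[:]
received[3] ^= 0x42          # inject a double-byte error
received[9] ^= 0x17
print(any(syndromes(received, 5)))   # True: the errors are detected
```

    The technique in the paper then maps these nonzero syndromes directly to the two error locations and values, skipping the iterative locator-polynomial step.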

  5. The Countermeasures against the Human Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Yong Hee; Kwon, Ki Chun; Lee, Jung Woon; Lee, Hyun; Jang, Tong Il

    2009-10-01

    Failures of nuclear power facilities caused by human error are essential to prevent, and long-term, comprehensive countermeasure technology based on related research in ergonomics and human factors is urgently required. Continuing attention to the hardware aspects of nuclear facilities has brought definite improvements; attention must now turn to the human factors of the people engaged in nuclear facilities, to ensure their safety together with their economic and industrial performance, and improvement on this point is urgently required. The purpose of this research is to establish comprehensive medium- and long-term preventive measures that minimize the possibility of human error in nuclear power plants and other nuclear facilities by ensuring safety from a human factors engineering perspective

  6. The Countermeasures against the Human Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee; Kwon, Ki Chun; Lee, Jung Woon; Lee, Hyun; Jang, Tong Il

    2009-10-15

    Failures of nuclear power facilities caused by human error are essential to prevent, and long-term, comprehensive countermeasure technology based on related research in ergonomics and human factors is urgently required. Continuing attention to the hardware aspects of nuclear facilities has brought definite improvements; attention must now turn to the human factors of the people engaged in nuclear facilities, to ensure their safety together with their economic and industrial performance, and improvement on this point is urgently required. The purpose of this research is to establish comprehensive medium- and long-term preventive measures that minimize the possibility of human error in nuclear power plants and other nuclear facilities by ensuring safety from a human factors engineering perspective.

  7. High cortisol awakening response is associated with impaired error monitoring and decreased post-error adjustment.

    Science.gov (United States)

    Zhang, Liang; Duan, Hongxia; Qin, Shaozheng; Yuan, Yiran; Buchanan, Tony W; Zhang, Kan; Wu, Jianhui

    2015-01-01

    The cortisol awakening response (CAR), a rapid increase in cortisol levels following morning awakening, is an important aspect of hypothalamic-pituitary-adrenocortical axis activity. Alterations in the CAR have been linked to a variety of mental disorders and cognitive function. However, little is known regarding the relationship between the CAR and error processing, a phenomenon that is vital for cognitive control and behavioral adaptation. Using high-temporal resolution measures of event-related potentials (ERPs) combined with behavioral assessment of error processing, we investigated whether and how the CAR is associated with two key components of error processing: error detection and subsequent behavioral adjustment. Sixty university students performed a Go/No-go task while their ERPs were recorded. Saliva samples were collected at 0, 15, 30 and 60 min after awakening on the two consecutive days following ERP data collection. The results showed that a higher CAR was associated with slowed latency of the error-related negativity (ERN) and a higher post-error miss rate. The CAR was not associated with other behavioral measures such as the false alarm rate and the post-correct miss rate. These findings suggest that high CAR is a biological factor linked to impairments of multiple steps of error processing in healthy populations, specifically, the automatic detection of error and post-error behavioral adjustment. A common underlying neural mechanism of physiological and cognitive control may be crucial for engaging in both CAR and error processing.

  8. Learning from Errors

    Directory of Open Access Journals (Sweden)

    MA. Lendita Kryeziu

    2015-06-01

    Full Text Available “Errare humanum est”, a well known and widespread Latin proverb, states that to err is human: people make mistakes all the time. What counts, however, is that people learn from their mistakes. On these grounds Steve Jobs stated: “Sometimes when you innovate, you make mistakes. It is best to admit them quickly, and get on with improving your other innovations.” Similarly, in learning a new language, learners make mistakes; thus it is important to accept them, learn from them, discover the reasons why they are made, improve and move on. The significance of studying errors is described by Corder: “There have always been two justifications proposed for the study of learners' errors: the pedagogical justification, namely that a good understanding of the nature of error is necessary before a systematic means of eradicating them could be found, and the theoretical justification, which claims that a study of learners' errors is part of the systematic study of the learners' language which is itself necessary to an understanding of the process of second language acquisition” (Corder, 1982, p. 1). Thus the aim of this paper is to analyze errors in the process of second language acquisition and the ways we teachers can benefit from mistakes to help students improve themselves while giving proper feedback.

  9. Nature and frequency of medication errors in a geriatric ward: an Indonesian experience

    Directory of Open Access Journals (Sweden)

    Ernawati DK

    2014-06-01

    Full Text Available Desak Ketut Ernawati,1,2 Ya Ping Lee,2 Jeffery David Hughes2; 1Faculty of Medicine, Udayana University, Denpasar, Bali, Indonesia; 2School of Pharmacy and Curtin Health Innovation and Research Institute, Curtin University, Perth, WA, Australia. Purpose: To determine the nature and frequency of medication errors during medication delivery processes in a public teaching hospital geriatric ward in Bali, Indonesia. Methods: A 20-week prospective study on medication errors occurring during the medication delivery process was conducted in a geriatric ward in a public teaching hospital in Bali, Indonesia. Participants selected were inpatients aged more than 60 years. Patients were excluded if they had a malignancy, were undergoing surgery, or receiving chemotherapy treatment. The occurrence of medication errors in prescribing, transcribing, dispensing, and administration was detected by the investigator providing in-hospital clinical pharmacy services. Results: Seven hundred and seventy drug orders and 7,662 drug doses were reviewed as part of the study. There were 1,563 medication errors detected among the 7,662 drug doses reviewed, representing an error rate of 20.4%. Administration errors were the most frequent medication errors identified (59%), followed by transcription errors (15%), dispensing errors (14%), and prescribing errors (7%). Errors in documentation were the most common form of administration errors. Of these errors, 2.4% were classified as potentially serious and 10.3% as potentially significant. Conclusion: Medication errors occurred in every stage of the medication delivery process, with administration errors being the most frequent. The majority of errors identified in the administration stage were related to documentation. Provision of in-hospital clinical pharmacy services could potentially play a significant role in detecting and preventing medication errors. Keywords: geriatric, medication errors, inpatients, medication delivery process

  10. Errors and Conflict at the Task Level and the Response Level

    NARCIS (Netherlands)

    Desmet, C.; Fias, W.; Hartstra, E.; Brass, M.

    2011-01-01

    In the last decade, research on error and conflict processing has become one of the most influential research areas in the domain of cognitive control. There is now converging evidence that a specific part of the posterior frontomedian cortex (pFMC), the rostral cingulate zone (RCZ), is crucially

  11. Uncovering transcriptional regulation of metabolism by using metabolic network topology

    DEFF Research Database (Denmark)

    Patil, Kiran Raosaheb; Nielsen, Jens

    2005-01-01

    Cellular response to genetic and environmental perturbations is often reflected and/or mediated through changes in the metabolism, because the latter plays a key role in providing Gibbs free energy and precursors for biosynthesis. Such metabolic changes are often exerted through transcriptional regulation. We therefore developed an algorithm that is based on hypothesis-driven data analysis to uncover the transcriptional regulatory architecture of metabolic networks. By using information on the metabolic network topology from genome-scale metabolic reconstruction, we show that it is possible to reveal patterns in the metabolic network that follow a common transcriptional response. Thus, the algorithm enables identification of so-called reporter metabolites (metabolites around which the most significant transcriptional changes occur) and a set of connected genes with significant and coordinated response to genetic and environmental perturbations.

  12. Psychological safety and error reporting within Veterans Health Administration hospitals.

    Science.gov (United States)

    Derickson, Ryan; Fishman, Jonathan; Osatuke, Katerine; Teclaw, Robert; Ramsel, Dee

    2015-03-01

    In psychologically safe workplaces, employees feel comfortable taking interpersonal risks, such as pointing out errors. Previous research suggested that a psychologically safe climate optimizes organizational outcomes. We evaluated psychological safety levels in Veterans Health Administration (VHA) hospitals and assessed their relationship to employee willingness to report medical errors. We conducted an ANOVA on psychological safety scores from a VHA employee census survey (n = 185,879), assessing variability of means across racial and supervisory levels. We examined organizational climate assessment interviews (n = 374), evaluating how many employees asserted willingness to report errors (or not) and their stated reasons. Finally, based on survey data, we identified 2 (psychologically safe versus unsafe) hospitals and compared their numbers of employees who would be willing/unwilling to report an error. Psychological safety increased with supervisory level. Employees at the psychologically unsafe hospital (71% would report, 13% would not) were less willing to report an error than those at the psychologically safe hospital (91% would, 0% would not). A substantial minority would not report an error and were willing to admit so in a private interview setting. Their stated reasons, as well as the higher psychological safety means for supervisory employees, both suggest power as an important determinant. Intentions to report were associated with psychological safety, strongly suggesting this climate aspect as instrumental to improving patient safety and reducing costs.

  13. Blood specimen labelling errors: Implications for nephrology nursing practice.

    Science.gov (United States)

    Duteau, Jennifer

    2014-01-01

    Patient safety is the foundation of high-quality health care, as recognized both nationally and worldwide. Patient blood specimen identification is critical in ensuring the delivery of safe and appropriate care. The practice of nephrology nursing involves frequent patient blood specimen withdrawals to treat and monitor kidney disease. A critical review of the literature reveals that incorrect patient identification is one of the major causes of blood specimen labelling errors. Misidentified samples create a serious risk to patient safety leading to multiple specimen withdrawals, delay in diagnosis, misdiagnosis, incorrect treatment, transfusion reactions, increased length of stay and other negative patient outcomes. Barcode technology has been identified as a preferred method for positive patient identification leading to a definitive decrease in blood specimen labelling errors by as much as 83% (Askeland et al., 2008). The use of a root cause analysis followed by an action plan is one approach to decreasing the occurrence of blood specimen labelling errors. This article will present a review of the evidence-based literature surrounding blood specimen labelling errors, followed by author recommendations for completing a root cause analysis and action plan. A failure modes and effects analysis (FMEA) will be presented as one method to determine root cause, followed by the Ottawa Model of Research Use (OMRU) as a framework for implementation of strategies to reduce blood specimen labelling errors.

  14. Periodic boundary conditions and the error-controlled fast multipole method

    Energy Technology Data Exchange (ETDEWEB)

    Kabadshow, Ivo

    2012-08-22

    The simulation of pairwise interactions in huge particle ensembles is a vital issue in scientific research. Especially the calculation of long-range interactions poses limitations to the system size, since these interactions scale quadratically with the number of particles. Fast summation techniques like the Fast Multipole Method (FMM) can help to reduce the complexity to O(N). This work extends the possible range of applications of the FMM to periodic systems in one, two and three dimensions with one unique approach. Together with a tight error control, this contribution enables the simulation of periodic particle systems for different applications without the need to know and tune the FMM specific parameters. The implemented error control scheme automatically optimizes the parameters to obtain an approximation for the minimal runtime for a given energy error bound.

  15. Putting into practice error management theory: Unlearning and learning to manage action errors in construction.

    Science.gov (United States)

    Love, Peter E D; Smith, Jim; Teo, Pauline

    2018-05-01

    Error management theory is drawn upon to examine how a project-based organization, which took the form of a program alliance, was able to change its established error prevention mindset to one that enacted a learning mindfulness that provided an avenue to curtail its action errors. The program alliance was required to unlearn its existing routines and beliefs to accommodate the practices required to embrace error management. As a result of establishing an error management culture the program alliance was able to create a collective mindfulness that nurtured learning and supported innovation. The findings provide a much-needed context to demonstrate the relevance of error management theory to effectively address rework and safety problems in construction projects. The robust theoretical underpinning that is grounded in practice and presented in this paper provides a mechanism to engender learning from errors, which can be utilized by construction organizations to improve the productivity and performance of their projects. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Rounding errors in weighing

    International Nuclear Information System (INIS)

    Jeach, J.L.

    1976-01-01

    When rounding error is large relative to weighing error, it cannot be ignored when estimating scale precision and bias from calibration data. Further, if the data grouping is coarse, rounding error is correlated with weighing error and may also have a mean quite different from zero. These facts are taken into account in a moment estimation method. A copy of the program listing for the MERDA program that provides moment estimates is available from the author. Experience suggests that if the data fall into four or more cells or groups, it is not necessary to apply the moment estimation method. Rather, the estimate given by equation (3) is valid in this instance. 5 tables
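
    The coupling between rounding and weighing error described above can be illustrated with a short simulation; the grid step, scale noise, and true weight below are assumptions chosen so that the data grouping is very coarse relative to the weighing error.

```python
# Simulate a scale whose readings are rounded to a coarse grid. With
# coarse grouping, the rounding error is correlated with the weighing
# error and its mean is far from zero, as the abstract notes.

import random
random.seed(0)

TRUE_WEIGHT = 100.2   # sits between coarse grid points (assumed value)
SIGMA = 0.05          # weighing (scale) error, small versus the grid step
GRID = 1.0            # coarse rounding grid

weigh_errors, round_errors = [], []
for _ in range(200):
    measured = TRUE_WEIGHT + random.gauss(0.0, SIGMA)
    recorded = round(measured / GRID) * GRID   # every reading rounds to 100
    weigh_errors.append(measured - TRUE_WEIGHT)
    round_errors.append(recorded - measured)

def mean(xs):
    return sum(xs) / len(xs)

def corr(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(mean(round_errors))                 # ≈ -0.2: mean far from zero
print(corr(weigh_errors, round_errors))   # ≈ -1.0: strongly correlated
```

    With a fine grid (many cells spanned by the data), the same simulation gives a near-zero mean and near-zero correlation, matching the suggestion that four or more cells make the moment correction unnecessary.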

  17. Statistical analysis with measurement error or misclassification strategy, method and application

    CERN Document Server

    Yi, Grace Y

    2017-01-01

    This monograph on measurement error and misclassification covers a broad range of problems and emphasizes unique features in modeling and analyzing problems arising from medical research and epidemiological studies. Many measurement error and misclassification problems have been addressed in various fields over the years as well as with a wide spectrum of data, including event history data (such as survival data and recurrent event data), correlated data (such as longitudinal data and clustered data), multi-state event data, and data arising from case-control studies. Statistical Analysis with Measurement Error or Misclassification: Strategy, Method and Application brings together assorted methods in a single text and provides an update of recent developments for a variety of settings. Measurement error effects and strategies of handling mismeasurement for different models are closely examined in combination with applications to specific problems. Readers with diverse backgrounds and objectives can utilize th...

  18. Error-finding and error-correcting methods for the start-up of the SLC

    International Nuclear Information System (INIS)

    Lee, M.J.; Clearwater, S.H.; Kleban, S.D.; Selig, L.J.

    1987-02-01

    During the commissioning of an accelerator, storage ring, or beam transfer line, one of the important tasks of an accelerator physicist is to check the first-order optics of the beam line and to look for errors in the system. Conceptually, it is important to distinguish between techniques for finding the machine errors that are the cause of the problem and techniques for correcting the beam errors that are the result of the machine errors. In this paper we will limit our presentation to certain applications of these two methods for finding or correcting beam-focus errors and beam-kick errors that affect the profile and trajectory of the beam respectively. Many of these methods have been used successfully in the commissioning of SLC systems. In order not to waste expensive beam time we have developed and used a beam-line simulator to test the ideas that have not been tested experimentally. To save valuable physicists' time we have further automated the beam-kick error-finding procedures by adopting methods from the field of artificial intelligence to develop a prototype expert system. Our experience with this prototype has demonstrated the usefulness of expert systems in solving accelerator control problems. The expert system is able to find the same solutions as an expert physicist but in a more systematic fashion. The methods used in these procedures and some of the recent applications will be described in this paper

  19. Benchmark test cases for evaluation of computer-based methods for detection of setup errors: realistic digitally reconstructed electronic portal images with known setup errors

    International Nuclear Information System (INIS)

    Fritsch, Daniel S.; Raghavan, Suraj; Boxwala, Aziz; Earnhart, Jon; Tracton, Gregg; Cullip, Timothy; Chaney, Edward L.

    1997-01-01

    the visible human CT scans from the National Library of Medicine, are essential for producing realistic images. Sets of test cases with systematic and random errors in selected setup parameters and anatomic volumes are suitable for use as standard benchmarks by the radiotherapy community. In addition to serving as an aid to research and development, benchmark images may also be useful for evaluation of commercial systems and as part of a quality assurance program for clinical systems. Test cases and software are available upon request

  20. ERROR ANALYSIS IN THE TRAVEL WRITING MADE BY THE STUDENTS OF ENGLISH STUDY PROGRAM

    Directory of Open Access Journals (Sweden)

    Vika Agustina

    2015-05-01

    Full Text Available This study was conducted to identify the kinds of errors in surface strategy taxonomy and to find the dominant type of errors made by fifth-semester students of the English Department of one State University in Malang, Indonesia, in producing their travel writing. The type of research of this study is document analysis, since it analyses written materials, in this case travel writing texts. The analysis finds that the grammatical errors made by the students based on surface strategy taxonomy theory consist of four types: (1) omission, (2) addition, (3) misformation, and (4) misordering. The most frequent errors, occurring in misformation, are in the use of tense form. Second are errors of omission of noun/verb inflection. Next, many clauses contain an unnecessary added phrase.

  1. Barriers to medical error reporting

    Directory of Open Access Journals (Sweden)

    Jalal Poorolajal

    2015-01-01

    Full Text Available Background: This study was conducted to explore the prevalence of medical error underreporting and associated barriers. Methods: This cross-sectional study was performed from September to December 2012. Five hospitals, affiliated with Hamadan University of Medical Sciences, in Hamedan, Iran were investigated. A self-administered questionnaire was used for data collection. Participants consisted of physicians, nurses, midwives, residents, interns, and staff of radiology and laboratory departments. Results: Overall, 50.26% of subjects had committed but not reported medical errors. The main reasons mentioned for underreporting were lack of an effective medical error reporting system (60.0%), lack of a proper reporting form (51.8%), lack of peer support for a person who has committed an error (56.0%), and lack of personal attention to the importance of medical errors (62.9%). The rate of committing medical errors was higher in men (71.4%), those aged 40-50 years (67.6%), less-experienced personnel (58.7%), those with an educational level of MSc (87.5%), and staff of the radiology department (88.9%). Conclusions: This study outlined the main barriers to reporting medical errors and associated factors that may be helpful for healthcare organizations in improving medical error reporting as an essential component for patient safety enhancement.

  2. Relative Error Evaluation to Typical Open Global dem Datasets in Shanxi Plateau of China

    Science.gov (United States)

    Zhao, S.; Zhang, S.; Cheng, W.

    2018-04-01

    Produced from radar data or stereo remote sensing image pairs, global DEM datasets are among the most important types of DEM data. Relative error relates to the surface quality represented by DEM data, and hence to geomorphologic and hydrologic applications using DEM data. Taking the Shanxi Plateau of China as the study area, this research evaluated the relative error of typical open global DEM datasets, including Shuttle Radar Topography Mission (SRTM) data with 1 arc second resolution (SRTM1), SRTM data with 3 arc second resolution (SRTM3), ASTER global DEM data in the second version (GDEM-v2), and ALOS World 3D-30m (AW3D) data. Through processing and selection, more than 300,000 ICESat/GLA14 points were used as the GCP data, and the vertical error was computed and compared among the four typical global DEM datasets. Then, more than 2,600,000 ICESat/GLA14 point pairs were acquired using a distance threshold between 100 m and 500 m. Meanwhile, the horizontal distance between every point pair was computed, so the relative error was obtained as slope values based on the vertical error difference and the horizontal distance of the point pairs. Finally, a false slope ratio (FSR) index was computed by analyzing the difference between DEM and ICESat/GLA14 values for every point pair. Both the relative error and the FSR index were compared by category for the four DEM datasets under different slope classes. The results show that, overall, AW3D has the lowest relative error values in mean error, mean absolute error, root mean square error, and standard deviation; the SRTM1 values are a little higher than those of AW3D; the SRTM3 and GDEM-v2 data have the highest and similar relative error values. Considering different slope conditions, all four DEM datasets perform better in flat areas and worse in sloping regions; AW3D has the best performance in all the slope classes, a little better than SRTM1; with slope increasing
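
    The point-pair scheme can be sketched as follows; the helper name, the sample elevations, and the resulting statistics are hypothetical, and only the idea (vertical-error difference over horizontal distance, expressed as a slope angle) follows the abstract.

```python
# Hypothetical mini-version of the point-pair relative-error computation:
# for each pair of reference points, the difference of the two vertical
# errors divided by the horizontal distance gives a slope-style error.

import math

def relative_errors(pairs):
    """pairs: (dem_a, ref_a, dem_b, ref_b, horiz_dist_m) tuples.
    Returns slope-style relative errors in degrees."""
    out = []
    for dem_a, ref_a, dem_b, ref_b, dist in pairs:
        dv = (dem_a - ref_a) - (dem_b - ref_b)  # vertical-error difference
        out.append(math.degrees(math.atan2(abs(dv), dist)))
    return out

pairs = [
    (1503.0, 1500.0, 1482.0, 1481.0, 200.0),  # vertical errors +3 m, +1 m
    (987.5, 990.0, 1010.0, 1010.5, 350.0),    # vertical errors -2.5 m, -0.5 m
]
errs = relative_errors(pairs)
rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
print([round(e, 3) for e in errs], round(rmse, 3))  # ≈ [0.573, 0.327] 0.467
```

    Grouping such per-pair values by terrain slope class reproduces the kind of comparison the study reports across the four datasets.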

  3. A Study of Tendencies in Speech Errors (IV)

    OpenAIRE

    伊藤, 克敏; Ito, Katsutoshi

    2007-01-01

    This is the fourth in a series (1988, 1992, 1999) of my research on the tendencies of speech errors committed by adults. Collected speech errors were analyzed on phonological, morphological, syntactic and semantic levels. Similarities and differences between adult and child speech errors were discussed. It was pointed out that the typology of speech errors can be established by comparative study of adult speech errors, developing child language, aphasic speech and speech of senile dementia.

  4. An in-process form error measurement system for precision machining

    International Nuclear Information System (INIS)

    Gao, Y; Huang, X; Zhang, Y

    2010-01-01

    In-process form error measurement for precision machining is studied. Owing to two key problems, the opaque barrier and vibration, optical in-process form error measurement for precision machining has been a hard topic, and so far very few existing research works can be found. In this project, an in-process form error measurement device is proposed to deal with the two key problems. Based on our existing studies, a prototype system has been developed. It is the first of its kind that overcomes the two key problems. The prototype is based on a single laser sensor design of 50 nm resolution together with two techniques, a damping technique and a moving average technique, proposed for use with the device. The proposed damping technique is able to improve vibration attenuation by up to 21 times compared with natural attenuation. The proposed moving average technique is able to reduce errors by seven to ten times without distortion of the form profile results. The two proposed techniques are simple, but they are especially useful for the proposed device. For a workpiece sample, the measurement result under the coolant condition is only 2.5% larger compared with the one under the no-coolant condition. For a certified Wyko test sample, the overall system measurement error can be as low as 0.3 µm. The measurement repeatability error can be as low as 2.2%. The experimental results give confidence in using the proposed in-process form error measurement device. For better results, further improvements in design and further tests are necessary.
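
    A minimal sketch of the moving-average idea, assuming a synthetic slowly varying form profile corrupted by vibration-like noise; the window size, signal, and error metric are illustrative choices, not the paper's.

```python
# Moving-average smoothing of a noisy profile: the noise shrinks while
# a slowly varying form is essentially preserved. All values assumed.

import math, random

def moving_average(signal, window):
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

random.seed(0)
form = [math.sin(i / 50.0) for i in range(300)]      # slowly varying form
noisy = [f + random.gauss(0.0, 0.05) for f in form]  # vibration-like noise

smooth = moving_average(noisy, 9)
mae_before = sum(abs(n - f) for n, f in zip(noisy, form)) / len(form)
mae_after = sum(abs(s - f) for s, f in zip(smooth, form)) / len(form)
print(mae_after < mae_before)  # True: deviation from the form shrinks
```

    The window size trades noise rejection against form distortion; for the slow profile here a 9-point window leaves the underlying shape nearly untouched.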

  5. Metacognitive Unawareness of the Errorful Generation Benefit and Its Effects on Self-Regulated Learning

    Science.gov (United States)

    Yang, Chunliang; Potts, Rosalind; Shanks, David R.

    2017-01-01

    Generating errors followed by corrective feedback enhances retention more effectively than does reading--the benefit of errorful generation--but people tend to be unaware of this benefit. The current research explored this metacognitive unawareness, its effect on self-regulated learning, and how to alleviate or reverse it. People's beliefs about…

  6. Learning from errors in super-resolution.

    Science.gov (United States)

    Tang, Yi; Yuan, Yuan

    2014-11-01

    A novel framework of learning-based super-resolution is proposed that employs a process of learning from estimation errors. The estimation errors generated by different learning-based super-resolution algorithms are statistically shown to be sparse and uncertain: sparsity means that most of the estimation errors are small, while uncertainty means that the locations of pixels with larger estimation errors are random. Exploiting this prior information about the estimation errors, a nonlinear boosting process of learning from these errors is introduced into the general framework of learning-based super-resolution. Within this framework, a low-rank decomposition technique is used to share the information of different super-resolution estimations and to remove the sparse estimation errors arising from different learning algorithms or training samples. The experimental results show the effectiveness and efficiency of the proposed framework in enhancing the performance of different learning-based algorithms.
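
    A minimal sketch of the idea that a low-rank decomposition can pool several super-resolution estimates and strip their sparse errors (the data, the rank-1 truncation, and all sizes here are invented; the paper's actual decomposition is more elaborate):

```python
import numpy as np

rng = np.random.default_rng(1)

# Each row plays the role of one algorithm's estimate of the same
# high-resolution patch (flattened), hit by a few large sparse errors
# at random locations -- the error model described in the abstract.
truth = rng.normal(0.0, 1.0, 400)
estimates = np.tile(truth, (6, 1))
for row in estimates:
    idx = rng.choice(400, size=8, replace=False)
    row[idx] += rng.normal(0.0, 5.0, 8)

# The shared content is rank-1 across estimates, while the sparse errors
# are uncorrelated, so a truncated SVD suppresses them.
U, s, Vt = np.linalg.svd(estimates, full_matrices=False)
fused = (s[0] * np.outer(U[:, 0], Vt[0])).mean(axis=0)

single_rmse = np.sqrt(np.mean((estimates[0] - truth) ** 2))
fused_rmse = np.sqrt(np.mean((fused - truth) ** 2))
print(single_rmse, fused_rmse)  # fused estimate has lower error
```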

  7. Error management process for power stations

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Takeda, Daisuke; Fujimoto, Junzo; Nagasaka, Akihiko

    2016-01-01

    The purpose of this study is to establish 'error management process for power stations' for systematizing activities for human error prevention and for festering continuous improvement of these activities. The following are proposed by deriving concepts concerning error management process from existing knowledge and realizing them through application and evaluation of their effectiveness at a power station: an entire picture of error management process that facilitate four functions requisite for maraging human error prevention effectively (1. systematizing human error prevention tools, 2. identifying problems based on incident reports and taking corrective actions, 3. identifying good practices and potential problems for taking proactive measures, 4. prioritizeng human error prevention tools based on identified problems); detail steps for each activity (i.e. developing an annual plan for human error prevention, reporting and analyzing incidents and near misses) based on a model of human error causation; procedures and example of items for identifying gaps between current and desired levels of executions and outputs of each activity; stages for introducing and establishing the above proposed error management process into a power station. By giving shape to above proposals at a power station, systematization and continuous improvement of activities for human error prevention in line with the actual situation of the power station can be expected. (author)

  8. Error detecting capabilities of the shortened Hamming codes adopted for error detection in IEEE Standard 802.3

    Science.gov (United States)

    Fujiwara, Toru; Kasami, Tadao; Lin, Shu

    1989-09-01

    The error-detecting capabilities of the shortened Hamming codes adopted for error detection in IEEE Standard 802.3 are investigated. These codes are also used for error detection in the data link layer of the Ethernet, a local area network. The weight distributions for various code lengths are calculated to obtain the probability of undetectable error and that of detectable error for a binary symmetric channel with bit-error rate between 0.00001 and 1/2.
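
    The probability of undetectable error follows directly from the weight distribution: P_ud(p) = sum over nonzero weights i of A_i p^i (1-p)^(n-i). A small sketch using the (7,4) Hamming code, whose weight distribution is known exactly (the standard's 32-bit CRC works the same way, just over much longer shortened codes):

```python
# P_ud(p) = sum over nonzero weights i of A_i * p**i * (1-p)**(n-i)
def undetected_error_prob(weights, n, p):
    """weights maps codeword weight i -> number of codewords A_i."""
    return sum(a * p**i * (1 - p) ** (n - i) for i, a in weights.items() if i > 0)

# Weight distribution of the (7,4) Hamming code: A_0=1, A_3=7, A_4=7, A_7=1.
hamming74 = {0: 1, 3: 7, 4: 7, 7: 1}

for p in (1e-5, 1e-3, 0.5):
    print(p, undetected_error_prob(hamming74, 7, p))
```

    At p = 1/2 every error pattern is equally likely, so P_ud reduces to (2^k - 1)/2^n = 15/128, a handy sanity check.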

  9. TIME HORIZON AND UNCOVERED INTEREST PARITY IN EMERGING ECONOMIES

    Directory of Open Access Journals (Sweden)

    Norlida Hanim Mohd Salleh

    2011-07-01

    Full Text Available The aim of this study is to re-examine the well-known empirical puzzle of uncovered interest parity (UIP) for emerging market economies with different prediction time horizons. The empirical results, obtained using dynamic panel and time series techniques for monthly data from January 1995 to December 2009, show that the panel data estimates are more powerful than those obtained by applying individual time series estimations, and that the exchange-rate prediction horizon contributes significantly to determining the status of UIP. This finding reveals that at the longer time horizon, the model has better econometric specification and thus more predictive power for exchange rate movements than at the shorter time horizon. The findings can also signal well-integrated currency markets and provide a reliable guide to international investors as well as for the orderly conduct of monetary authorities.
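
    The standard UIP test behind studies like this one is the "Fama regression" of the realized exchange-rate change on the interest differential, with UIP implying a unit slope. A sketch on synthetic data (the data-generating numbers are invented; the study itself uses panel and time series estimators over 1995-2009):

```python
import numpy as np

rng = np.random.default_rng(2)

# ds_{t+k} = alpha + beta * (i_t - i*_t) + e_t ; UIP predicts beta = 1.
T = 180                                        # e.g. monthly, 15 years
diff = rng.normal(0.02, 0.01, T)               # interest-rate differential
ds = 1.0 * diff + rng.normal(0.0, 0.005, T)    # UIP holds by construction

X = np.column_stack([np.ones(T), diff])
alpha_hat, beta_hat = np.linalg.lstsq(X, ds, rcond=None)[0]
print(alpha_hat, beta_hat)                     # beta_hat should be near 1
```

    In actual data the estimated slope is typically far from one (often negative) at short horizons, which is the UIP puzzle the study revisits; the panel results above suggest it moves closer to one at longer horizons.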

  10. Uncovering Transcriptional Regulatory Networks by Sparse Bayesian Factor Model

    Directory of Open Access Journals (Sweden)

    Yuan (Alan) Qi

    2010-01-01

    Full Text Available Abstract The problem of uncovering transcriptional regulation by transcription factors (TFs) based on microarray data is considered. A novel Bayesian sparse correlated rectified factor model (BSCRFM) is proposed that models the unknown TF protein-level activity, the correlated regulations between TFs, and the sparse nature of TF-regulated genes. The model admits prior knowledge from existing databases regarding TF-regulated target genes through a sparse prior, and by means of a developed Gibbs sampling algorithm, a context-specific transcriptional regulatory network specific to the experimental condition of the microarray data can be obtained. The proposed model and the Gibbs sampling algorithm were evaluated on simulated systems, and the results demonstrated the validity and effectiveness of the proposed approach. The proposed model was then applied to breast cancer microarray data from patients with Estrogen Receptor positive (ER+) status and Estrogen Receptor negative (ER-) status, respectively.

  11. Research on Non-Similarity about Thermal Deformation Error of Mechanical Parts in High-accuracy Measurement

    International Nuclear Information System (INIS)

    Luo, Z; Fei, Y T

    2006-01-01

    Expanding with heat and contracting with cold are common physical phenomena in nature. The conventional theories and calculations of thermal deformation are approximate and linear, and can be applied only in normal- or low-precision fields. The thermal deformation error of mechanical parts does not follow the conventional linear formula; it is related to all the physical dimensions of the part, and the deformation can be expressed by a nonlinear formula of those dimensions. A theory of non-similarity of the thermal deformation error of mechanical parts is presented. Studies on some common mechanical parts in precision technology, including a hollow piece, a gear, and a cube, have been carried out and mathematical models have been set up. The experimental results also make it clear that these models are more reasonable than the traditional ones

  12. Human error probability estimation using licensee event reports

    International Nuclear Information System (INIS)

    Voska, K.J.; O'Brien, J.N.

    1984-07-01

    The objective of this report is to present a method for using field data from nuclear power plants to estimate human error probabilities (HEPs). These HEPs are then used in probabilistic risk activities. This method of estimating HEPs is one of four being pursued in NRC-sponsored research; the other three are structured expert judgment, analysis of training simulator data, and performance modeling. The type of field data analyzed in this report comes from Licensee Event Reports (LERs), which are analyzed using a method specifically developed for that purpose. However, any type of field data on human errors could be analyzed using this method with minor adjustments. This report assesses the practicality, acceptability, and usefulness of estimating HEPs from LERs and comprehensively presents the method for use

  13. Processing graded feedback: electrophysiological correlates of learning from small and large errors.

    Science.gov (United States)

    Luft, Caroline Di Bernardi; Takase, Emilio; Bhattacharya, Joydeep

    2014-05-01

    Feedback processing is important for learning and therefore may affect the consolidation of skills. Considerable research demonstrates electrophysiological differences between correct and incorrect feedback, but how we learn from small versus large errors is usually overlooked. This study investigated electrophysiological differences when processing small or large error feedback during a time estimation task. Data from high-learners and low-learners were analyzed separately. In both high- and low-learners, large error feedback was associated with higher feedback-related negativity (FRN) and small error feedback was associated with a larger P300 and increased amplitude over the motor related areas of the left hemisphere. In addition, small error feedback induced larger desynchronization in the alpha and beta bands with distinctly different topographies between the two learning groups: The high-learners showed a more localized decrease in beta power over the left frontocentral areas, and the low-learners showed a widespread reduction in the alpha power following small error feedback. Furthermore, only the high-learners showed an increase in phase synchronization between the midfrontal and left central areas. Importantly, this synchronization was correlated to how well the participants consolidated the estimation of the time interval. Thus, although large errors were associated with higher FRN, small errors were associated with larger oscillatory responses, which was more evident in the high-learners. Altogether, our results suggest an important role of the motor areas in the processing of error feedback for skill consolidation.

  14. Missed opportunities for diagnosis: lessons learned from diagnostic errors in primary care.

    Science.gov (United States)

    Goyder, Clare R; Jones, Caroline H D; Heneghan, Carl J; Thompson, Matthew J

    2015-12-01

    Because of the difficulties inherent in diagnosis in primary care, it is inevitable that diagnostic errors will occur. However, despite the important consequences associated with diagnostic errors and their estimated high prevalence, teaching and research on diagnostic error is a neglected area. To ascertain the key learning points from GPs' experiences of diagnostic errors and approaches to clinical decision making associated with these. Secondary analysis of 36 qualitative interviews with GPs in Oxfordshire, UK. Two datasets of semi-structured interviews were combined. Questions focused on GPs' experiences of diagnosis and diagnostic errors (or near misses) in routine primary care and out of hours. Interviews were audiorecorded, transcribed verbatim, and analysed thematically. Learning points include GPs' reliance on 'pattern recognition' and the failure of this strategy to identify atypical presentations; the importance of considering all potentially serious conditions using a 'restricted rule out' approach; and identifying and acting on a sense of unease. Strategies to help manage uncertainty in primary care were also discussed. Learning from previous examples of diagnostic errors is essential if these events are to be reduced in the future and this should be incorporated into GP training. At a practice level, learning points from experiences of diagnostic errors should be discussed more frequently; and more should be done to integrate these lessons nationally to understand and characterise diagnostic errors. © British Journal of General Practice 2015.

  15. Arab ESL Secondary School Students' Spelling Errors

    Science.gov (United States)

    Al-Sobhi, Bandar Mohammad Saeed; Rashid, Sabariah Md; Abdullah, Ain Nadzimah; Darmi, Ramiza

    2017-01-01

    English spelling has always been described by many language researchers and teachers as a daunting task especially for learners whose first language is not English. Accordingly, Arab ESL learners commit serious errors when they spell out English words. The primary objective of this paper is to determine the types as well as the causes of spelling…

  16. Evaluating a medical error taxonomy.

    OpenAIRE

    Brixey, Juliana; Johnson, Todd R.; Zhang, Jiajie

    2002-01-01

    Healthcare has been slow in using human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need for reducing medication errors, the National Coordinating Council for Medication Errors Reporting and Prevention (NCC MERP) released the NCC MERP taxonomy that provides a stand...

  17. The surveillance error grid.

    Science.gov (United States)

    Klonoff, David C; Lias, Courtney; Vigersky, Robert; Clarke, William; Parkes, Joan Lee; Sacks, David B; Kirkman, M Sue; Kovatchev, Boris

    2014-07-01

    Currently used error grids for assessing clinical accuracy of blood glucose monitors are based on out-of-date medical practices. Error grids have not been widely embraced by regulatory agencies for clearance of monitors, but this type of tool could be useful for surveillance of the performance of cleared products. Diabetes Technology Society, together with representatives from the Food and Drug Administration, the American Diabetes Association, the Endocrine Society, and the Association for the Advancement of Medical Instrumentation, and representatives of academia, industry, and government, has developed a new error grid, called the surveillance error grid (SEG), as a tool to assess the degree of clinical risk from inaccurate blood glucose (BG) monitors. A total of 206 diabetes clinicians were surveyed about the clinical risk of errors of measured BG levels by a monitor, and the impact of such errors on 4 patient scenarios was surveyed. Each monitor/reference data pair was scored and color-coded on a graph according to its average risk rating. Using modeled data representative of the accuracy of contemporary meters, the relationships between clinical risk and monitor error were calculated for the Clarke error grid (CEG), the Parkes error grid (PEG), and the SEG. SEG action boundaries were consistent across scenarios, regardless of whether the patient had type 1 or type 2 diabetes or was using insulin or not. No significant differences were noted between the responses of adult and pediatric clinicians or among the 4 types of clinicians. Although small specific differences in risk boundaries between US and non-US clinicians were noted, the panel felt they did not justify separate grids for these 2 types of clinicians. The data points of the SEG were classified in 15 zones according to their assigned level of risk, which allowed for comparisons with the classic CEG and PEG. 
Modeled glucose monitor data with realistic self-monitoring of blood glucose errors derived from meter testing experiments plotted on the SEG when compared to

  18. Dopamine reward prediction error coding.

    Science.gov (United States)

    Schultz, Wolfram

    2016-03-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.
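
    The core computation can be written in a few lines. A minimal Rescorla-Wagner-style sketch (learning rate and reward schedule invented for illustration) in which the prediction error delta = r - V is large for surprising rewards and decays to zero as the reward becomes fully predicted:

```python
def learn(rewards, alpha=0.2):
    """Track the value V of a cue; return final V and the prediction errors."""
    V, deltas = 0.0, []
    for r in rewards:
        delta = r - V        # reward prediction error (dopamine-like signal)
        V += alpha * delta   # value update toward the received reward
        deltas.append(delta)
    return V, deltas

V, deltas = learn([1.0] * 30)
print(V, deltas[0], deltas[-1])  # V approaches 1; errors shrink toward 0
```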

  19. Addressing the Problem of Negative Lexical Transfer Errors in Chilean University Students

    Directory of Open Access Journals (Sweden)

    Paul Anthony Dissington

    2018-01-01

    Full Text Available Studies of second language learning have revealed a connection between first language transfer and errors in second language production. This paper describes an action research study carried out among Chilean university students studying English as part of their degree programmes. The study focuses on common lexical errors made by Chilean Spanish-speakers due to negative first language transfer and aims to analyse the effects of systematic instruction and practice of this problematic lexis. It is suggested that raising awareness of lexical transfer through focused attention on common transfer errors is valued by students and seems essential for learners to achieve productive mastery.

  20. Crowdsourcing for error detection in cortical surface delineations.

    Science.gov (United States)

    Ganz, Melanie; Kondermann, Daniel; Andrulis, Jonas; Knudsen, Gitte Moos; Maier-Hein, Lena

    2017-01-01

    With the recent trend toward big data analysis, neuroimaging datasets have grown substantially in the past years. While larger datasets potentially offer important insights for medical research, one major bottleneck is the requirement for resources of medical experts needed to validate automatic processing results. To address this issue, the goal of this paper was to assess whether anonymous nonexperts from an online community can perform quality control of MR-based cortical surface delineations derived by an automatic algorithm. So-called knowledge workers from an online crowdsourcing platform were asked to annotate errors in automatic cortical surface delineations on 100 central, coronal slices of MR images. On average, annotations for 100 images were obtained in less than an hour. When using expert annotations as reference, the crowd on average achieves a sensitivity of 82 % and a precision of 42 %. Merging multiple annotations per image significantly improves the sensitivity of the crowd (up to 95 %), but leads to a decrease in precision (as low as 22 %). Our experiments show that the detection of errors in automatic cortical surface delineations generated by anonymous untrained workers is feasible. Future work will focus on increasing the sensitivity of our method further, such that the error detection tasks can be handled exclusively by the crowd and expert resources can be focused on error correction.
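
    The sensitivity/precision trade-off from merging annotations can be reproduced with a toy simulation (all rates below are invented, not the paper's): taking the union of several imperfect annotators' flags catches more true errors but also accumulates false positives:

```python
import random

random.seed(3)
n_items, n_workers = 1000, 5
truth = [random.random() < 0.2 for _ in range(n_items)]  # true error sites

def annotate(is_error):
    """One worker: 60% sensitivity, 5% false-positive rate (invented)."""
    return random.random() < (0.6 if is_error else 0.05)

votes = [[annotate(t) for t in truth] for _ in range(n_workers)]
merged = [any(w[i] for w in votes) for i in range(n_items)]  # union merge

def sens_prec(pred):
    tp = sum(p and t for p, t in zip(pred, truth))
    fp = sum(p and not t for p, t in zip(pred, truth))
    fn = sum((not p) and t for p, t in zip(pred, truth))
    return tp / (tp + fn), tp / (tp + fp)

s_one, p_one = sens_prec(votes[0])   # single worker
s_mrg, p_mrg = sens_prec(merged)     # merged crowd
print(s_one, p_one, s_mrg, p_mrg)   # sensitivity rises, precision falls
```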

  1. Uncovering the Topic Landscape of Product-Service System Research: from Sustainability to Value Creation

    Directory of Open Access Journals (Sweden)

    Hakyeon Lee

    2018-03-01

    Full Text Available As the product-service system (PSS) is considered a promising business model that can create more value for customers, PSS research has enjoyed remarkable growth in its volume and coverage over the last decade. This study aims to delineate the thematic landscape of PSS research by identifying latent topics from a large amount of scholarly data. Ten topics of PSS research are identified by applying the Latent Dirichlet Allocation (LDA) model to 1229 PSS publications published between 2000 and 2016. The ten PSS topics are briefly reviewed to provide an overview of what has previously been studied in PSS research. We also investigate which topics rise or fall in popularity by identifying hot and cold topics of PSS research. It is observed that the focus of discussions on the benefits of PSS has shifted from sustainability to value creation. Also, increasing attention has been paid to more practical topics such as PSS implementation. The areas of subspecialty of the top ten PSS journals are also examined to explore the interdisciplinary nature of PSS research and thematic differences across disciplines. The findings of this study can provide rich implications for both academia and practice in the field of PSS.

  2. Reducing Technology-Induced Errors: Organizational and Health Systems Approaches.

    Science.gov (United States)

    Borycki, Elizabeth M; Senthriajah, Yalini; Kushniruk, Andre W; Palojoki, Sari; Saranto, Kaija; Takeda, Hiroshi

    2016-01-01

    Technology-induced errors are a growing concern for health care organizations. Such errors arise from the interaction between healthcare and information technology deployed in complex settings and contexts. As the number of health information technologies used to provide patient care rises, so will the need to develop ways to improve the quality and safety of the technology that we use. The objective of the panel is to describe varying approaches to improving software safety from an organizational and health systems perspective. We first define what a technology-induced error is. Then, we discuss how software design and testing can be used to improve health information technologies. This discussion is followed by work in the area of monitoring and reporting at a health district and national level. Lastly, we draw on the quality, safety and resilience literature. The target audience for this work comprises nursing and health informatics researchers, practitioners, administrators, policy makers and students.

  3. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was then applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed most easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process addresses shortcomings in existing methodologies by incorporating improvement efficiency, and it enhances the depth and breadth of human error analysis methodology. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
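
    For orientation, a crisp (non-fuzzy) TOPSIS ranking can be sketched as follows; the decision matrix, weights, and benefit-type criteria are invented placeholders, and the study's fuzzy variant replaces these crisp scores with fuzzy numbers:

```python
import numpy as np

scores = np.array([      # rows: error factors; cols: four benefit criteria
    [7.0, 8.0, 6.0, 7.0],
    [5.0, 6.0, 7.0, 5.0],
    [8.0, 9.0, 8.0, 8.0],
])
weights = np.array([0.3, 0.3, 0.2, 0.2])

norm = scores / np.linalg.norm(scores, axis=0)    # vector normalization
weighted = norm * weights
ideal, anti = weighted.max(axis=0), weighted.min(axis=0)
d_pos = np.linalg.norm(weighted - ideal, axis=1)  # distance to ideal
d_neg = np.linalg.norm(weighted - anti, axis=1)   # distance to anti-ideal
closeness = d_neg / (d_pos + d_neg)               # rank by closeness
print(closeness, closeness.argmax())              # factor 2 ranks first
```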

  4. Thermodynamics of Error Correction

    Directory of Open Access Journals (Sweden)

    Pablo Sartori

    2015-12-01

    Full Text Available Information processing at the molecular scale is limited by thermal fluctuations. This can cause undesired consequences in copying information since thermal noise can lead to errors that can compromise the functionality of the copy. For example, a high error rate during DNA duplication can lead to cell death. Given the importance of accurate copying at the molecular scale, it is fundamental to understand its thermodynamic features. In this paper, we derive a universal expression for the copy error as a function of entropy production and work dissipated by the system during wrong incorporations. Its derivation is based on the second law of thermodynamics; hence, its validity is independent of the details of the molecular machinery, be it any polymerase or artificial copying device. Using this expression, we find that information can be copied in three different regimes. In two of them, work is dissipated to either increase or decrease the error. In the third regime, the protocol extracts work while correcting errors, reminiscent of a Maxwell demon. As a case study, we apply our framework to study a copy protocol assisted by kinetic proofreading, and show that it can operate in any of these three regimes. We finally show that, for any effective proofreading scheme, error reduction is limited by the chemical driving of the proofreading reaction.

  5. Space for human connection in antenatal education: Uncovering women's hopes using Participatory Action Research.

    Science.gov (United States)

    Brady, Vivienne; Lalor, Joan

    2017-12-01

    The aim of this research was to initiate active consultation with women and antenatal educators in the development and delivery of antenatal education that was mutually relevant. A Participatory Action Research approach influenced by feminist concerns was used to guide the research. Data were analysed by the researcher and participants using a Voice Centred Relational Method of Analysis. The setting was an antenatal education service in a consultant-led tertiary referral unit in Ireland. Research findings revealed women's desires to build relationships through ANE to cope with anticipated loneliness and isolation after birth; however, environmental, structural, and organisational factors prohibited opportunity to build space for human connection. Participating women valued external and authoritative knowledge as truth, but concomitantly sought opportunity and space through classes to learn from the real life experiences of other mothers. Women lacked confidence in embodied knowing and their power to birth and demonstrated unquestioning acceptance of the predetermined nature of hospital birth and the biomedical model of maternity care. In this research, we envisioned that hospital-based ANE, relevant and grounded in the needs and life experiences of women, could be developed, with a view to supporting women's decision-making processes and understanding of pregnancy, birth and early motherhood. Participatory Action Research using a Voice Centred Relational Method of Analysis offered an opportunity to foster ethical and dialogic activity between learner and facilitator, underpinned by acknowledgement of the value of women's experiences; however, space for expression of new and useful knowledge in preparation for motherhood was limited by institutional context. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Impact of exposure measurement error in air pollution epidemiology: effect of error type in time-series studies.

    Science.gov (United States)

    Goldman, Gretchen T; Mulholland, James A; Russell, Armistead G; Strickland, Matthew J; Klein, Mitchel; Waller, Lance A; Tolbert, Paige E

    2011-06-22

    Two distinctly different types of measurement error are Berkson and classical. Impacts of measurement error in epidemiologic studies of ambient air pollution are expected to depend on error type. We characterize measurement error due to instrument imprecision and spatial variability as multiplicative (i.e. additive on the log scale) and model it over a range of error types to assess impacts on risk ratio estimates both on a per measurement unit basis and on a per interquartile range (IQR) basis in a time-series study in Atlanta. Daily measures of twelve ambient air pollutants were analyzed: NO2, NOx, O3, SO2, CO, PM10 mass, PM2.5 mass, and the PM2.5 components sulfate, nitrate, ammonium, elemental carbon and organic carbon. Semivariogram analysis was applied to assess spatial variability. Error due to this spatial variability was added to a reference pollutant time-series on the log scale using Monte Carlo simulations. Each of these time-series was exponentiated and introduced to a Poisson generalized linear model of cardiovascular disease emergency department visits. Measurement error resulted in reduced statistical significance for the risk ratio estimates for all amounts (corresponding to different pollutants) and types of error. When modelled as classical-type error, risk ratios were attenuated, particularly for primary air pollutants, with average attenuation in risk ratios on a per unit of measurement basis ranging from 18% to 92% and on an IQR basis ranging from 18% to 86%. When modelled as Berkson-type error, risk ratios per unit of measurement were biased away from the null hypothesis by 2% to 31%, whereas risk ratios per IQR were attenuated (i.e. biased toward the null) by 5% to 34%. For the modelled CO error amount, a range of error types was simulated and the effects on risk ratio bias and significance were observed. For multiplicative error, both the amount and type of measurement error impact health effect estimates in air pollution epidemiology. By modelling
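
    The attenuation from classical-type error can be demonstrated in a few lines of simulation. The sketch below uses plain linear regression and invented variances for clarity; in this linear setting Berkson error leaves the slope approximately unbiased, whereas the study's log-scale multiplicative error in a Poisson model behaves differently, as described above:

```python
import numpy as np

rng = np.random.default_rng(4)
n, beta = 20_000, 0.5

true_x = rng.normal(0.0, 1.0, n)                  # true exposure
y = beta * true_x + rng.normal(0.0, 1.0, n)

# Classical error: measurement = truth + independent noise.
x_classical = true_x + rng.normal(0.0, 1.0, n)

# Berkson error: truth = measurement + independent noise (e.g. a central
# monitor value shared by subjects whose true exposures scatter around it).
x_berkson = rng.normal(0.0, 1.0, n)
y_berkson = beta * (x_berkson + rng.normal(0.0, 1.0, n)) + rng.normal(0.0, 1.0, n)

slope = lambda x, yy: np.cov(x, yy)[0, 1] / np.var(x, ddof=1)
print(slope(x_classical, y))         # attenuated toward 0: ~beta/2 here
print(slope(x_berkson, y_berkson))   # ~beta: unbiased in this linear setting
```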

  7. Medication errors: the role of the patient.

    Science.gov (United States)

    Britten, Nicky

    2009-06-01

    1. Patients and their carers will usually be the first to notice any observable problems resulting from medication errors. They will probably be unable to distinguish between medication errors, adverse drug reactions, or 'side effects'. 2. Little is known about how patients understand drug related problems or how they make attributions of adverse effects. Some research suggests that patients' cognitive models of adverse drug reactions bear a close relationship to models of illness perception. 3. Attributions of adverse drug reactions are related to people's previous experiences and to their level of education. The evidence suggests that on the whole patients' reports of adverse drug reactions are accurate. However, patients do not report all the problems they perceive and are more likely to report those that they do perceive as severe. Patients may not report problems attributed to their medications if they are fearful of doctors' reactions. Doctors may respond inappropriately to patients' concerns, for example by ignoring them. Some authors have proposed the use of a symptom checklist to elicit patients' reports of suspected adverse drug reactions. 4. Many patients want information about adverse drug effects, and the challenge for the professional is to judge how much information to provide and the best way of doing so. Professionals' inappropriate emphasis on adherence may be dangerous when a medication error has occurred. 5. Recent NICE guidelines recommend that professionals should ask patients if they have any concerns about their medicines, and this approach is likely to yield information conducive to the identification of medication errors.

  8. Analysis of errors in forensic science

    Directory of Open Access Journals (Sweden)

    Mingxiao Du

    2017-01-01

    Full Text Available Reliability of expert testimony is one of the foundations of judicial justice. Both expert bias and scientific errors affect the reliability of expert opinion, which in turn affects the trustworthiness of the findings of fact in legal proceedings. Expert bias can be eliminated by replacing experts; however, it may be more difficult to eliminate scientific errors. From the perspective of statistics, errors in the practice of forensic science include systematic errors, random errors, and gross errors. In general, repetition of the process and compliance with ISO/IEC 17025:2005, General requirements for the competence of testing and calibration laboratories, are common measures used to reduce errors that originate from experts and equipment, respectively. For example, to reduce gross errors, the laboratory can ensure that a test is repeated several times by different experts. In applying forensic principles and methods, Federal Rule of Evidence 702 mandates that judges consider factors such as peer review to ensure the reliability of the expert testimony. As the scientific principles and methods may not undergo professional review by specialists in a certain field, peer review serves as an exclusive standard. This study also examines two types of statistical errors. As false-positive errors carry a higher risk of unfair decision-making, they should receive more attention than false-negative errors.
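
    The benefit of repeating a test with several independent experts can be made concrete with a majority-vote calculation (the 10% per-expert error rate is an invented illustration): if each expert errs independently with probability p < 1/2, the probability that the majority errs falls rapidly with the number of experts.

```python
from math import comb

def majority_error(p, n):
    """P(majority of n independent experts is wrong), n odd."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.1
print([round(majority_error(p, n), 6) for n in (1, 3, 5, 7)])
```

    Independence is the crucial assumption here; experts sharing a flawed method or instrument would err together, which is why the text pairs repetition with standards compliance.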

  9. Research on the Factors Influencing the Measurement Errors of the Discrete Rogowski Coil.

    Science.gov (United States)

    Xu, Mengyuan; Yan, Jing; Geng, Yingsan; Zhang, Kun; Sun, Chao

    2018-03-13

    An innovative array of magnetic coils (the discrete Rogowski coil-RC) with the advantages of flexible structure, miniaturization and mass producibility is investigated. First, the mutual inductance between the discrete RC and circular and rectangular conductors is calculated using the magnetic vector potential (MVP) method. The results are found to be consistent with those calculated using the finite element method, but the MVP method is simpler and more practical. Then, the influence of conductor section parameters, inclination, and eccentricity on the accuracy of the discrete RC is calculated to provide a reference. Studying the influence of an external current on the discrete RC's interference error reveals optimal values for length, winding density, and position arrangement of the solenoids. It is also found that eccentricity and interference errors decrease as the number of solenoids increases. Finally, a discrete RC prototype is devised and manufactured. The experimental results show consistent output characteristics, with the calculated sensitivity and mutual inductance of the discrete RC being very close to the experimental results. The influence of an external conductor on the measurement of the discrete RC is analyzed experimentally, and the results show that interference from an external current decreases with increasing distance between the external and measured conductors.

  10. Research on the Factors Influencing the Measurement Errors of the Discrete Rogowski Coil

    Directory of Open Access Journals (Sweden)

    Mengyuan Xu

    2018-03-01

    Full Text Available An innovative array of magnetic coils (the discrete Rogowski coil—RC) with the advantages of flexible structure, miniaturization and mass producibility is investigated. First, the mutual inductance between the discrete RC and circular and rectangular conductors is calculated using the magnetic vector potential (MVP) method. The results are found to be consistent with those calculated using the finite element method, but the MVP method is simpler and more practical. Then, the influence of conductor section parameters, inclination, and eccentricity on the accuracy of the discrete RC is calculated to provide a reference. Studying the influence of an external current on the discrete RC’s interference error reveals optimal values for length, winding density, and position arrangement of the solenoids. It is also found that eccentricity and interference errors decrease as the number of solenoids increases. Finally, a discrete RC prototype is devised and manufactured. The experimental results show consistent output characteristics, with the calculated sensitivity and mutual inductance of the discrete RC being very close to the experimental results. The influence of an external conductor on the measurement of the discrete RC is analyzed experimentally, and the results show that interference from an external current decreases with increasing distance between the external and measured conductors.
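The abstract's finding that external-conductor interference falls off with distance follows from the textbook coupling between a straight conductor and a small coil. The sketch below uses the elementary approximation M ≈ μ0·N·A/(2πr), not the paper's MVP calculation, and all coil parameters are invented:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def mutual_inductance(n_turns, area_m2, r_m):
    """Approximate mutual inductance between an infinite straight
    conductor and a small solenoid at distance r whose axis is aligned
    with the field B = mu0*I/(2*pi*r).  Valid when the solenoid
    cross-section is small compared with r."""
    return MU0 * n_turns * area_m2 / (2 * math.pi * r_m)

# A solenoid of 200 turns and 1 cm^2 cross-section, 5 cm and 10 cm
# from the conductor:
m1 = mutual_inductance(200, 1e-4, 0.05)
m2 = mutual_inductance(200, 1e-4, 0.10)
print(m1, m2)  # doubling the distance halves the coupling (1/r law)
```

The same 1/r dependence explains why an interfering external current contributes less as it is moved away from the array.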

  11. The current approach to human error and blame in the NHS.

    Science.gov (United States)

    Ottewill, Melanie

    There is a large body of research to suggest that serious errors are widespread throughout medicine. The traditional response to these adverse events has been to adopt a 'person approach' - blaming the individual seen as 'responsible'. The culture of medicine is highly complicit in this response. Such an approach results in enormous personal costs to the individuals concerned and does little to address the root causes of errors and thus prevent their recurrence. Other industries, such as aviation, where safety is a paramount concern and which have similar structures to the medical profession, have, over the past decade or so, adopted a 'systems' approach to error, recognizing that human error is ubiquitous and inevitable and that systems need to be developed with this in mind. This approach has been highly successful, but has necessitated, first and foremost, a cultural shift. It is in the best interests of patients, and medical professionals alike, that such a shift is embraced in the NHS.

  12. Doctors' duty to disclose error: a deontological or Kantian ethical analysis.

    Science.gov (United States)

    Bernstein, Mark; Brown, Barry

    2004-05-01

    Medical (surgical) error is being talked about more openly and, besides being the subject of retrospective reviews, is now the subject of prospective research. Disclosure of error has been a difficult issue because of fear of embarrassment for doctors in the eyes of their peers, and fear of punitive action by patients, consisting of medicolegal action and/or complaints to doctors' governing bodies. This paper examines physicians' and surgeons' duty to disclose error from an ethical standpoint, specifically by applying the moral philosophical theory espoused by Immanuel Kant (i.e., deontology). The purpose of this discourse is to apply moral philosophical analysis to a delicate but important issue that all physicians and surgeons will have to confront, probably numerous times, in their professional careers.

  13. Random errors in the magnetic field coefficients of superconducting magnets

    International Nuclear Information System (INIS)

    Herrera, J.; Hogue, R.; Prodell, A.; Wanderer, P.; Willen, E.

    1985-01-01

    Random errors in the multipole magnetic coefficients of superconducting magnets have been of continuing interest in accelerator research. The Superconducting Super Collider (SSC) with its small magnetic aperture only emphasizes this aspect of magnet design, construction, and measurement. With this in mind, we present a magnet model which mirrors the structure of a typical superconducting magnet. By taking advantage of the basic symmetries of a dipole magnet, we use this model to fit the measured multipole rms widths. The fit parameters then allow us to predict the values of the rms multipole errors expected for the SSC dipole reference design D, SSC-C5. With the aid of first-order perturbation theory, we then give an estimate of the effect of these random errors on the emittance growth of a proton beam stored in an SSC. 10 refs., 6 figs., 2 tabs

  14. Study of Errors among Nursing Students

    Directory of Open Access Journals (Sweden)

    Ella Koren

    2007-09-01

    Full Text Available The study of errors in the health system today is a topic of considerable interest aimed at reducing errors through analysis of the phenomenon and the conclusions reached. Errors that occur frequently among health professionals have also been observed among nursing students. True, in most cases they are actually “near errors,” but these could be a future indicator of therapeutic reality and the effect of nurses' work environment on their personal performance. There are two different approaches to such errors: (a) The EPP (error-prone person) approach lays full responsibility at the door of the individual involved in the error, whether a student, nurse, doctor, or pharmacist. According to this approach, handling consists purely in identifying and penalizing the guilty party. (b) The EPE (error-prone environment) approach emphasizes the environment as a primary contributory factor to errors. The environment as an abstract concept includes components and processes of interpersonal communications, work relations, human engineering, workload, pressures, technical apparatus, and new technologies. The objective of the present study was to examine the role played by factors in and components of personal performance as compared to elements and features of the environment. The study was based on both of the aforementioned approaches, which, when combined, enable a comprehensive understanding of the phenomenon of errors among the student population as well as a comparison of factors contributing to human error and to error deriving from the environment. The theoretical basis of the study was a model that combined both approaches: one focusing on the individual and his or her personal performance and the other focusing on the work environment. The findings emphasize the work environment of health professionals as an EPE. However, errors could have been avoided by means of strict adherence to practical procedures. The authors examined error events in the

  15. An overview of intravenous-related medication administration errors as reported to MEDMARX, a national medication error-reporting program.

    Science.gov (United States)

    Hicks, Rodney W; Becker, Shawn C

    2006-01-01

    Medication errors can be harmful, especially if they involve the intravenous (IV) route of administration. A mixed-methodology study using a 5-year review of 73,769 IV-related medication errors from a national medication error reporting program indicates that between 3% and 5% of these errors were harmful. The leading type of error was omission, and the leading cause of error involved clinician performance deficit. Using content analysis, three themes (product shortage, calculation errors, and tubing interconnectivity) emerge and appear to predispose patients to harm. Nurses often participate in IV therapy, and these findings have implications for practice and patient safety. Voluntary medication error-reporting programs afford an opportunity to improve patient care and to further understanding about the nature of IV-related medication errors.

  16. Quantifying and handling errors in instrumental measurements using the measurement error theory

    DEFF Research Database (Denmark)

    Andersen, Charlotte Møller; Bro, R.; Brockhoff, P.B.

    2003-01-01

    …This is a new way of using the measurement error theory. Reliability ratios illustrate that the models for the two fish species are influenced differently by the error. However, the error seems to influence the predictions of the two reference measures in the same way. The effect of using replicated x-measurements… A new general formula is given for how to correct the least squares regression coefficient when a different number of replicated x-measurements is used for prediction than for calibration. It is shown that the correction should be applied when the number of replicates in prediction is less than…
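The core measurement-error-theory mechanics behind this abstract can be sketched numerically: noise in x attenuates the least squares slope by the reliability ratio, and averaging k replicated x-measurements shrinks the error variance by k. This is a minimal illustration with invented numbers, not the paper's general formula:

```python
import random

def slope(xs, ys):
    """Ordinary least squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

rng = random.Random(1)
BETA, VAR_X, VAR_E, K = 2.0, 1.0, 0.5, 4   # K replicated x-measurements

x_true = [rng.gauss(0, VAR_X ** 0.5) for _ in range(20_000)]
y = [BETA * x + rng.gauss(0, 0.1) for x in x_true]
# Observed x: mean of K noisy replicates -> error variance VAR_E / K.
x_obs = [x + rng.gauss(0, (VAR_E / K) ** 0.5) for x in x_true]

b_naive = slope(x_obs, y)              # attenuated towards zero
lam = VAR_X / (VAR_X + VAR_E / K)      # reliability ratio
b_corrected = b_naive / lam            # de-attenuated estimate
print(b_naive, b_corrected)            # b_corrected recovers BETA
```

With more replicates the reliability ratio approaches 1 and the correction becomes negligible, which is the intuition behind applying the correction only when prediction uses fewer replicates than calibration.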

  17. Uncovering values-based practice: VBP's implicit commitments to subjectivism and relativism.

    Science.gov (United States)

    Cassidy, Ben

    2013-06-01

    Despite assertions to the contrary, KWM Fulford's values-based practice is implicitly committed to subjectivism when it comes to reasoning about values. This renders the approach unworkable. The act of merely uncovering underlying values is not enough to effect change and, therefore, resolve problems if we have no way, even in principle, of determining which values are right and which are wrong. Fulford's only departure from subjectivism about value is his commitment to 'framework values', which seems grounded in a version of ethical relativism. I argue that we need to reject both subjectivism and relativism if progress within ethical discussions about practice is to be meaningful and a real possibility. © 2013 John Wiley & Sons Ltd.

  18. Error and its meaning in forensic science.

    Science.gov (United States)

    Christensen, Angi M; Crowder, Christian M; Ousley, Stephen D; Houck, Max M

    2014-01-01

    The discussion of "error" has gained momentum in forensic science in the wake of the Daubert guidelines and has intensified with the National Academy of Sciences' Report. Error has many different meanings, and too often, forensic practitioners themselves as well as the courts misunderstand scientific error and statistical error rates, often confusing them with practitioner error (or mistakes). Here, we present an overview of these concepts as they pertain to forensic science applications, discussing the difference between practitioner error (including mistakes), instrument error, statistical error, and method error. We urge forensic practitioners to ensure that potential sources of error and method limitations are understood and clearly communicated and advocate that the legal community be informed regarding the differences between interobserver errors, uncertainty, variation, and mistakes. © 2013 American Academy of Forensic Sciences.

  19. Error threshold ghosts in a simple hypercycle with error prone self-replication

    International Nuclear Information System (INIS)

    Sardanyes, Josep

    2008-01-01

    A delayed transition due to mutation processes is shown to occur in a simple hypercycle composed of two indistinguishable molecular species with error-prone self-replication. The appearance of a ghost near the hypercycle error threshold causes a delay in the extinction, and thus in the loss of information, of the mutually catalytic replicators, in a kind of information memory. The extinction time, τ, scales near the bifurcation threshold according to the universal square-root scaling law, i.e. τ ∼ (Q_hc − Q)^(−1/2), typical of dynamical systems close to a saddle-node bifurcation. Here, Q_hc represents the bifurcation point, named the hypercycle error threshold, involved in the change between the asymptotically stable phase and the so-called Random Replication State (RRS) of the hypercycle; the parameter Q is the replication quality factor. The ghost involves a longer transient towards extinction once the saddle-node bifurcation has occurred, and this transient is extremely long near the bifurcation threshold. The role of this dynamical effect is expected to be relevant in fluctuating environments. Such a phenomenon should also be found in larger hypercycles when the hypercycle species are considered in competition with their error tail. The implications of the ghost for the survival and evolution of error-prone self-replicating molecules with hypercyclic organization are discussed
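The square-root scaling of the ghost-induced delay can be checked on the generic saddle-node normal form dx/dt = a + x², rather than the hypercycle equations themselves (a stand-in sketch; the parameter a plays the role of Q_hc − Q):

```python
def passage_time(a, x0=-1.0, x1=1.0, dt=0.01):
    """Time for dx/dt = a + x**2 to travel from x0 to x1 (Euler).
    For small a > 0, just past the saddle-node bifurcation, the
    trajectory lingers near the 'ghost' at x = 0 and T ~ pi/sqrt(a)."""
    x, t = x0, 0.0
    while x < x1:
        x += (a + x * x) * dt
        t += dt
    return t

t1 = passage_time(1e-4)
t2 = passage_time(4e-4)
print(t1 / t2)  # ~2: quartering the distance to threshold doubles the transient
```

The ratio close to 2 for a 4-fold change in a is exactly the (−1/2) exponent of the universal scaling law quoted in the abstract.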

  20. A checklist to facilitate objective hypothesis testing in social psychology research.

    Science.gov (United States)

    Washburn, Anthony N; Morgan, G Scott; Skitka, Linda J

    2015-01-01

    Social psychology is not a very politically diverse area of inquiry, something that could negatively affect the objectivity of social psychological theory and research, as Duarte et al. argue in the target article. This commentary offers a number of checks to help researchers uncover possible biases and identify when they are engaging in hypothesis confirmation and advocacy instead of hypothesis testing.

  1. Radiologic Placement of Uncovered Stents for the Treatment of Malignant Colonic Obstruction Proximal to the Descending Colon

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Jehong; Kwon, Se Hwan, E-mail: Kwon98@khu.ac.kr [Kyung Hee University, Department of Radiology, College of Medicine (Korea, Republic of); Lee, Chang-Kyun [Kyung Hee University, Department of Internal Medicine, College of Medicine (Korea, Republic of); Park, Sun Jin [Kyung Hee University, Department of Surgery, College of Medicine (Korea, Republic of); Oh, Ji Young [Kyung Hee University Hospital at Gangdong, Department of Radiology (Korea, Republic of); Oh, Joo Hyeong [Kyung Hee University, Department of Radiology, College of Medicine (Korea, Republic of)

    2017-01-15

    Purpose: To evaluate the safety, feasibility, and patency rates of radiologic placement of uncovered stents for the treatment of malignant colonic obstruction proximal to the descending colon. Materials and Methods: This was a retrospective, single-center study. From May 2003 to March 2015, 53 image-guided placements of uncovered stents (44 initial placements, 9 secondary placements) were attempted in 44 patients (male:female = 23:21; mean age, 71.8 years). The technical and clinical success, complication rates, and patency rates of the stents were evaluated. Technical success was defined as the successful deployment of the stent under fluoroscopic guidance alone, and clinical success was defined as the relief of obstructive symptoms or signs within 48 h of stent deployment. Results: In total, 12 (27.3 %) patients underwent preoperative decompression, while 32 (72.7 %) underwent decompression with palliative intent. The technical success rate was 93.2 % (41/44) for initial placement and 88.9 % (8/9) for secondary placement. Secondary stent placement in the palliative group was required in nine patients after successful initial stent placement, due to stent obstruction from tumor ingrowth (n = 7) and stent migration (n = 2). The symptoms of obstruction were relieved in all successful cases (100 %). In the palliative group, the patency rates were 94.4 % at 1 month, 84.0 % at 3 months, 64.8 % at 6 months, and 48.6 % at 12 months. Conclusions: The radiologic placement of uncovered stents for the treatment of malignant obstruction proximal to the descending colon is feasible and safe, and provides acceptable clinical results.

  2. Strategies and Errors in Translating Tourism Brochures: the case of EFL Learners

    OpenAIRE

    ZAHİRİ, Tahereh; SADEGHİ, Bahador; MALEKİ, Ataollah

    2015-01-01

    Abstract. Tourism English is a highly specialized discourse with its own defining characteristics. In this study, the translation of travel brochures by Iranian EFL learners was studied. The study was carried out to reveal the nature of errors and strategies in Persian translations of English tourism brochures. The errors and strategies in translating travel brochures are under-researched in the tourism literature, and similarly there is little discussion of tourism material in translation resear...

  3. Investigating Surface Bias Errors in the Weather Research and Forecasting (WRF) Model using a Geographic Information System (GIS)

    Science.gov (United States)

    2015-02-01

    Computational and Information Sciences Directorate, Battlefield Environment Division (ATTN: RDRL-CIE-M), White Sands Missile Range, NM 88002-5501. …meteorological parameters, which became our focus. We found that elevation accounts for a significant portion of the variance in the model error of surface temperature and relative humidity predictions.

  4. Research and application of a novel hybrid decomposition-ensemble learning paradigm with error correction for daily PM10 forecasting

    Science.gov (United States)

    Luo, Hongyuan; Wang, Deyun; Yue, Chenqiang; Liu, Yanling; Guo, Haixiang

    2018-03-01

    In this paper, a hybrid decomposition-ensemble learning paradigm combining error correction is proposed for improving the forecast accuracy of daily PM10 concentration. The proposed learning paradigm consists of two sub-models: (1) a PM10 concentration forecasting model; (2) an error correction model. In the proposed model, fast ensemble empirical mode decomposition (FEEMD) and variational mode decomposition (VMD) are applied to disassemble the original PM10 concentration series and the error sequence, respectively. The extreme learning machine (ELM) model optimized by the cuckoo search (CS) algorithm is utilized to forecast the components generated by FEEMD and VMD. In order to prove the effectiveness and accuracy of the proposed model, two real-world PM10 concentration series, collected from Beijing and Harbin in China, are adopted to conduct the empirical study. The results show that the proposed model performs remarkably better than all other considered models without error correction, which indicates the superior performance of the proposed model.
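The two-sub-model idea (forecast the series, then forecast the forecaster's own errors and add the predicted error back) can be shown with a much simpler stand-in than FEEMD/VMD and CS-optimized ELM. Everything below is an invented toy: a persistence forecaster whose error series is predicted by its value one cycle earlier:

```python
import math
import random

rng = random.Random(7)
# Synthetic "daily concentration" series: trend + weekly cycle + noise.
y = [0.05 * t + 10 * math.sin(2 * math.pi * t / 7) + rng.gauss(0, 0.3)
     for t in range(400)]

SPLIT, LAG = 300, 7  # evaluation split point and error-series lag

# Sub-model 1: naive persistence forecast, y_hat[t] = y[t-1].
# Its error series e[t] = y[t] - y[t-1] inherits the cyclic structure.
errors = [y[t] - y[t - 1] for t in range(1, len(y))]  # errors[t-1] is e[t]

def mae(actual, pred):
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(actual)

test_t = range(SPLIT, len(y))
actual = [y[t] for t in test_t]
base_pred = [y[t - 1] for t in test_t]

# Sub-model 2: predict the error itself (here simply by its value one
# cycle earlier) and add the predicted error onto the base forecast.
corr_pred = [y[t - 1] + errors[t - LAG - 1] for t in test_t]

base_mae, corr_mae = mae(actual, base_pred), mae(actual, corr_pred)
print(base_mae > corr_mae)  # error correction tightens the forecast
```

The point is structural: whatever systematic pattern the first model leaves in its residuals, a second model fitted to those residuals can remove it.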

  5. Total Survey Error for Longitudinal Surveys

    NARCIS (Netherlands)

    Lynn, Peter; Lugtig, P.J.

    2016-01-01

    This article describes the application of the total survey error paradigm to longitudinal surveys. Several aspects of survey error, and of the interactions between different types of error, are distinct in the longitudinal survey context. Furthermore, error trade-off decisions in survey design and

  6. On-Error Training (Book Excerpt).

    Science.gov (United States)

    Fukuda, Ryuji

    1985-01-01

    This excerpt from "Managerial Engineering: Techniques for Improving Quality and Productivity in the Workplace" describes the development, objectives, and use of On-Error Training (OET), a method which trains workers to learn from their errors. Also described is New Joharry's Window, a performance-error data analysis technique used in…

  7. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  8. Error Patterns in Problem Solving.

    Science.gov (United States)

    Babbitt, Beatrice C.

    Although many common problem-solving errors within the realm of school mathematics have been previously identified, a compilation of such errors is not readily available within learning disabilities textbooks, mathematics education texts, or teacher's manuals for school mathematics texts. Using data on error frequencies drawn from both the Fourth…

  9. Near Misses in Financial Trading: Skills for Capturing and Averting Error.

    Science.gov (United States)

    Leaver, Meghan; Griffiths, Alex; Reader, Tom

    2018-05-01

    The aims of this study were (a) to determine whether near-miss incidents in financial trading contain information on the operator skills and systems that detect and prevent near misses and the patterns and trends revealed by these data and (b) to explore if particular operator skills and systems are found as important for avoiding particular types of error on the trading floor. In this study, we examine a cohort of near-miss incidents collected from a financial trading organization using the Financial Incident Analysis System and report on the nontechnical skills and systems that are used to detect and prevent error in this domain. One thousand near-miss incidents are analyzed using distribution, mean, chi-square, and associative analysis to describe the data; reliability is provided. Slips/lapses (52%) and human-computer interface problems (21%) often occur alone and are the main contributors to error causation, whereas the prevention of error is largely a result of teamwork (65%) and situation awareness (46%) skills. No matter the cause of error, situation awareness and teamwork skills are used most often to detect and prevent the error. Situation awareness and teamwork skills appear universally important as a "last line" of defense for capturing error, and data from incident-monitoring systems can be analyzed in a fashion more consistent with a "Safety-II" approach. This research provides data for ameliorating risk within financial trading organizations, with implications for future risk management programs and regulation.

  10. Preventing Errors in Laterality

    OpenAIRE

    Landau, Elliot; Hirschorn, David; Koutras, Iakovos; Malek, Alexander; Demissie, Seleshie

    2014-01-01

    An error in laterality is the reporting of a finding that is present on the right side as on the left or vice versa. While different medical and surgical specialties have implemented protocols to help prevent such errors, very few studies have been published that describe these errors in radiology reports and ways to prevent them. We devised a system that allows the radiologist to view reports in a separate window, displayed in a simple font and with all terms of laterality highlighted in sep...

  11. Prevalence of Refractive errors in Primary school children in a rural ...

    African Journals Online (AJOL)

    Prevalence of Refractive errors in Primary school children in a rural community in Ebonyi state of Nigeria. ... PROMOTING ACCESS TO AFRICAN RESEARCH ... However, no previous vision screening study among primary schools children ...

  12. Comparing Interval Management Control Laws for Steady-State Errors and String Stability

    Science.gov (United States)

    Weitz, Lesley A.; Swieringa, Kurt A.

    2018-01-01

    Interval Management (IM) is a future airborne spacing concept that leverages avionics to provide speed guidance to an aircraft to achieve and maintain a specified spacing interval from another aircraft. The design of a speed control law to achieve the spacing goal is a key aspect in the research and development of the IM concept. In this paper, two control laws that are used in much of the contemporary IM research are analyzed and compared to characterize steady-state errors and string stability. Numerical results are used to illustrate how the choice of control-law gains impacts the size of steady-state errors and string performance and the potential trade-offs between those performance characteristics.
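The IM control laws themselves are not reproduced in this record. As a generic illustration of how gain choice sets steady-state spacing error, consider a leader-follower sketch with a first-order speed-response lag (all dynamics, gains, and numbers are assumptions, not the paper's laws):

```python
def steady_state_error(kp, accel=0.5, tau=2.0, dt=0.01, t_end=200.0):
    """Follower commands v_cmd = v_lead + kp * spacing_error, and its
    speed tracks v_cmd with a first-order lag tau.  With the leader at
    constant acceleration, the spacing error settles near
    accel * tau / kp: halving when the gain is doubled."""
    v_lead, v_f, err, t = 0.0, 0.0, 0.0, 0.0
    while t < t_end:
        v_cmd = v_lead + kp * err
        v_f += (v_cmd - v_f) / tau * dt   # first-order speed response
        err += (v_lead - v_f) * dt        # spacing error dynamics
        v_lead += accel * dt              # leader accelerates steadily
        t += dt
    return err

e_low, e_high = steady_state_error(0.5), steady_state_error(1.0)
print(e_low, e_high)  # roughly 2.0 and 1.0: higher gain, smaller error
```

Higher gain shrinks steady-state error but, in string contexts, tends to amplify disturbances along the chain of aircraft, which is the trade-off the paper quantifies.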

  13. Errors in translation made by English major students: A study on types and causes

    Directory of Open Access Journals (Sweden)

    Pattanapong Wongranu

    2017-05-01

    Full Text Available Many Thai English major students have problems when they translate Thai texts into English, as numerous errors can be found. Therefore, a study of translation errors is needed to find solutions to these problems. The objectives of this research were: 1) to examine types of translation errors in translation from Thai into English, 2) to determine the types of translation errors that are most common, and 3) to find possible explanations for the causes of errors. The results of this study will be used to improve translation teaching and the course “Translation from Thai into English”. The participants were 26 third-year, English major students at Kasetsart University. The data were collected from the students' exercises and examinations. Interviews and stimulated recall were also used to determine translation problems and causes of errors. The data were analyzed by considering the frequency and percentage, and by content analysis. The results show that the most frequent translation errors were syntactic errors (65%), followed by semantic errors (26.5%) and miscellaneous errors (8.5%), respectively. The causes of errors found in this study included translation procedures, carelessness, low self-confidence, and anxiety. It is recommended that more class time be spent to address the problematic points. In addition, more authentic translation and group work should be implemented to increase self-confidence and decrease anxiety.

  14. Use of Earth's magnetic field for mitigating gyroscope errors regardless of magnetic perturbation.

    Science.gov (United States)

    Afzal, Muhammad Haris; Renaudin, Valérie; Lachapelle, Gérard

    2011-01-01

    Most portable systems like smart-phones are equipped with low cost consumer grade sensors, making them useful as Pedestrian Navigation Systems (PNS). Measurements of these sensors are severely contaminated by errors caused by instrumentation and environmental issues, rendering the unaided navigation solution with these sensors of limited use. The overall navigation error budget associated with pedestrian navigation can be categorized into position/displacement errors and attitude/orientation errors. Most research addresses the displacement errors, utilizing either Pedestrian Dead Reckoning (PDR) or special constraints like Zero velocity UPdaTes (ZUPT) and Zero Angular Rate Updates (ZARU). This article targets the orientation/attitude errors encountered in pedestrian navigation and develops a novel sensor fusion technique to utilize the Earth's magnetic field, even perturbed, for attitude and rate gyroscope error estimation in pedestrian navigation environments where it is assumed that Global Navigation Satellite System (GNSS) navigation is denied. As the Earth's magnetic field undergoes severe degradations in pedestrian navigation environments, a novel Quasi-Static magnetic Field (QSF) based attitude and angular rate error estimation technique is developed to effectively use magnetic measurements in highly perturbed environments. The QSF scheme is then used for generating the desired measurements for the proposed Extended Kalman Filter (EKF) based attitude estimator. Results indicate that the QSF measurements are capable of effectively estimating attitude and gyroscope errors, reducing the overall navigation error budget by over 80% in urban canyon environment.
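The quasi-static-field idea can be sketched as a simple gate: only trust the magnetometer for attitude updates during windows where the total field magnitude is stable. The window size, threshold, and data below are invented, and this is not the paper's QSF detector:

```python
def quasi_static_windows(field_mags, win=10, tol=0.05):
    """Return one flag per window of magnetometer magnitudes: True when
    the spread within the window is below `tol` times the window mean,
    i.e. the field looks quasi-static and is safe to feed to an
    attitude / gyro-error estimator."""
    flags = []
    for i in range(0, len(field_mags) - win + 1, win):
        w = field_mags[i:i + win]
        mean = sum(w) / win
        spread = max(w) - min(w)
        flags.append(spread < tol * mean)
    return flags

# Synthetic magnitudes (uT): a calm stretch, a ferromagnetic
# disturbance (e.g., passing a steel structure), then calm again.
mags = [50.0, 50.1, 49.9, 50.0, 50.1, 50.0, 49.9, 50.0, 50.1, 50.0,
        50.0, 55.0, 62.0, 48.0, 41.0, 58.0, 50.5, 44.0, 60.0, 50.0,
        50.1, 50.0, 49.9, 50.0, 50.1, 50.0, 50.0, 49.9, 50.1, 50.0]
print(quasi_static_windows(mags))  # [True, False, True]
```

In an EKF-based estimator, magnetometer updates would be applied only during the flagged windows, leaving gyro propagation to bridge the perturbed stretches.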

  15. Compact disk error measurements

    Science.gov (United States)

    Howe, D.; Harriman, K.; Tehranchi, B.

    1993-01-01

    The objectives of this project are as follows: provide hardware and software that will perform simple, real-time, high resolution (single-byte) measurement of the error burst and good data gap statistics seen by a photoCD player read channel when recorded CD write-once discs of variable quality (i.e., condition) are being read; extend the above system to enable measurement of the hard decision (i.e., 1-bit error flags) and soft decision (i.e., 2-bit error flags) decoding information that is produced/used by the Cross-Interleaved Reed-Solomon Code (CIRC) block decoder employed in the photoCD player read channel; construct a model that uses data obtained via the systems described above to produce meaningful estimates of output error rates (due to both uncorrected ECC words and misdecoded ECC words) when a CD disc having specific (measured) error statistics is read (completion date to be determined); and check the hypothesis that current adaptive CIRC block decoders are optimized for pressed (DAD/ROM) CD discs. If warranted, do a conceptual design of an adaptive CIRC decoder that is optimized for write-once CD discs.
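The "error burst and good data gap statistics" reduce, at the software level, to run-length encoding of a per-byte error-flag stream. This is a minimal stand-in sketch, not the project's hardware or decoder interface:

```python
from itertools import groupby

def burst_gap_stats(flags):
    """flags: iterable of 0 (good byte) / 1 (erroneous byte).
    Returns (burst_lengths, gap_lengths): run lengths of consecutive
    error bytes and of consecutive good bytes, in order of occurrence."""
    bursts, gaps = [], []
    for bad, run in groupby(flags):
        (bursts if bad else gaps).append(sum(1 for _ in run))
    return bursts, gaps

# 3 good bytes, a burst of 2, 4 good, a single error, 2 good:
flags = [0, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0]
bursts, gaps = burst_gap_stats(flags)
print(bursts, gaps)  # [2, 1] [3, 4, 2]
```

Histograms of these two length distributions are what a channel model (and a CIRC interleaver analysis) would consume downstream.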

  16. Measurement Error and Bias in Value-Added Models. Research Report. ETS RR-17-25

    Science.gov (United States)

    Kane, Michael T.

    2017-01-01

    By aggregating residual gain scores (the differences between each student's current score and a predicted score based on prior performance) for a school or a teacher, value-added models (VAMs) can be used to generate estimates of school or teacher effects. It is known that random errors in the prior scores will introduce bias into predictions of…
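The claimed bias mechanism, random error in prior scores leading to under-adjustment, can be shown in a toy simulation. Two teachers have zero true effect, but one receives higher-ability students; all numbers are invented and this is not the report's model:

```python
import random

rng = random.Random(3)
N = 5000  # students per teacher

def simulate(noise_sd):
    """Mean residual gain for teacher A, whose true effect is ZERO."""
    # Teacher A gets higher-ability students than teacher B.
    abil = [rng.gauss(0.5, 1) for _ in range(N)] + \
           [rng.gauss(-0.5, 1) for _ in range(N)]
    teacher = ["A"] * N + ["B"] * N
    prior = [a + rng.gauss(0, noise_sd) for a in abil]  # error-prone prior
    current = [a + rng.gauss(0, 0.2) for a in abil]     # no teacher effect

    # OLS of current on prior, then residual gains averaged per teacher.
    n = len(prior)
    mp, mc = sum(prior) / n, sum(current) / n
    b = sum((p - mp) * (c - mc) for p, c in zip(prior, current)) / \
        sum((p - mp) ** 2 for p in prior)
    resid = [c - (mc + b * (p - mp)) for p, c in zip(prior, current)]
    return sum(r for r, t in zip(resid, teacher) if t == "A") / N

print(simulate(0.0))  # ~0: error-free priors adjust intake fully
print(simulate(0.8))  # > 0: noise attenuates b, spuriously crediting teacher A
```

With noisy priors the regression slope is attenuated, the intake advantage is only partially adjusted away, and teacher A appears effective despite a true effect of zero.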

  17. Uncovering Aberrant Mutant PKA Function with Flow Cytometric FRET

    Directory of Open Access Journals (Sweden)

    Shin-Rong Lee

    2016-03-01

    Full Text Available Biology has been revolutionized by tools that allow the detection and characterization of protein-protein interactions (PPIs). Förster resonance energy transfer (FRET)-based methods have become particularly attractive as they allow quantitative studies of PPIs within the convenient and relevant context of living cells. We describe here an approach that allows the rapid construction of live-cell FRET-based binding curves using a commercially available flow cytometer. We illustrate a simple method for absolutely calibrating the cytometer, validating our binding assay against the gold standard isothermal calorimetry (ITC), and using flow cytometric FRET to uncover the structural and functional effects of the Cushing-syndrome-causing mutation (L206R) on PKA’s catalytic subunit. We discover that this mutation not only differentially affects PKAcat’s binding to its multiple partners but also impacts its rate of catalysis. These findings improve our mechanistic understanding of this disease-causing mutation, while illustrating the simplicity, general applicability, and power of flow cytometric FRET.

  18. Uncovering the Geometry of Barrierless Reactions Using Lagrangian Descriptors.

    Science.gov (United States)

    Junginger, Andrej; Hernandez, Rigoberto

    2016-03-03

    Transition-state theories describing barrierless chemical reactions, or more general activated problems, are often hampered by the lack of a saddle around which the dividing surface can be constructed. For example, the time-dependent transition-state trajectory uncovering the nonrecrossing dividing surface in thermal reactions in the framework of the Langevin equation has relied on perturbative approaches in the vicinity of the saddle. We recently obtained an alternative approach using Lagrangian descriptors to construct time-dependent and recrossing-free dividing surfaces. This is a nonperturbative approach making no reference to a putative saddle. Here we show how the Lagrangian descriptor can be used to obtain the transition-state geometry of a dissipated and thermalized reaction across barrierless potentials. We illustrate the method in the case of a 1D Brownian motion for both barrierless and step potentials; however, the method is not restricted and can be directly applied to different kinds of potentials and higher dimensional systems.
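
A minimal sketch of the Lagrangian-descriptor idea, assuming the common arc-length definition (forward plus backward integration of |dx/dt| over a finite window) and deterministic, noiseless dissipated dynamics rather than the full thermalized Langevin setting of the paper:

```python
def lagrangian_descriptor(x0, v0, force, tau=2.0, dt=1e-3, gamma=0.5):
    """Forward-plus-backward arc length of the phase-space trajectory
    through (x0, v0) under dissipated dynamics x'' = force(x) - gamma*x'.
    Noise is omitted here; the paper's thermalized case averages over it."""
    ld = 0.0
    for sign in (1.0, -1.0):          # integrate forward, then backward
        x, v, t = x0, v0, 0.0
        while t < tau:
            a = force(x) - gamma * v
            x += sign * v * dt        # simple Euler step (dt kept small)
            v += sign * a * dt
            ld += abs(v) * dt         # accumulate |dx/dt| dt (arc length)
            t += dt
    return ld

# Barrierless ramp potential U(x) = x, so force(x) = -1 everywhere;
# scan the descriptor over a few initial velocities at x0 = 0.
vals = [lagrangian_descriptor(0.0, v0, lambda x: -1.0) for v0 in (-1.0, 0.0, 1.0)]
```

In the full method, extremal features of this field over initial conditions single out the recrossing-free dividing trajectory; the scan above only illustrates how the descriptor varies across initial conditions for a barrierless potential.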

  19. A theory of human error

    Science.gov (United States)

    Mcruer, D. T.; Clement, W. F.; Allen, R. W.

    1981-01-01

    Human errors tend to be treated in terms of clinical and anecdotal descriptions, from which remedial measures are difficult to derive. Correction of the sources of human error requires an attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A comprehensive analytical theory of the cause-effect relationships governing propagation of human error is indispensable to a reconstruction of the underlying and contributing causes. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation, maritime, automotive, and process control operations is highlighted. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.

  20. A Survey of Soft-Error Mitigation Techniques for Non-Volatile Memories

    Directory of Open Access Journals (Sweden)

    Sparsh Mittal

    2017-02-01

    Full Text Available Non-volatile memories (NVMs) offer superior density and energy characteristics compared to the conventional memories; however, NVMs suffer from severe reliability issues that can easily eclipse their energy efficiency advantages. In this paper, we survey architectural techniques for improving the soft-error reliability of NVMs, specifically PCM (phase change memory) and STT-RAM (spin transfer torque RAM). We focus on soft errors, such as resistance drift and write disturbance, in PCM and read disturbance and write failures in STT-RAM. By classifying the research works based on key parameters, we highlight their similarities and distinctions. We hope that this survey will underline the crucial importance of addressing NVM reliability for ensuring their system integration and will be useful for researchers, computer architects and processor designers.

  1. Decreasing scoring errors on Wechsler Scale Vocabulary, Comprehension, and Similarities subtests: a preliminary study.

    Science.gov (United States)

    Linger, Michele L; Ray, Glen E; Zachar, Peter; Underhill, Andrea T; LoBello, Steven G

    2007-10-01

    Studies of graduate students learning to administer the Wechsler scales have generally shown that training is not associated with the development of scoring proficiency. Many studies report on the reduction of aggregated administration and scoring errors, a strategy that does not highlight the reduction of errors on subtests identified as most prone to error. This study evaluated the development of scoring proficiency specifically on the Wechsler (WISC-IV and WAIS-III) Vocabulary, Comprehension, and Similarities subtests during training by comparing a set of 'early test administrations' to 'later test administrations.' Twelve graduate students enrolled in an intelligence-testing course participated in the study. Scoring errors (e.g., incorrect point assignment) were evaluated on the students' actual practice administration test protocols. Errors on all three subtests declined significantly when scoring errors on 'early' sets of Wechsler scales were compared to those made on 'later' sets. However, correcting these subtest scoring errors did not cause significant changes in subtest scaled scores. Implications for clinical instruction and future research are discussed.

  2. The Neural-fuzzy Thermal Error Compensation Controller on CNC Machining Center

    Science.gov (United States)

    Tseng, Pai-Chung; Chen, Shen-Len

    The geometric errors and structural thermal deformation are factors that influence the machining accuracy of Computer Numerical Control (CNC) machining centers. Therefore, researchers pay attention to thermal error compensation technologies on CNC machine tools. Some real-time error compensation techniques have been successfully demonstrated in both laboratories and industrial sites, but the compensation results still need to be enhanced. In this research, neural-fuzzy theory has been used to derive a thermal prediction model. An IC-type thermometer has been used to detect the heat sources' temperature variation. The thermal drifts are measured online by a touch-triggered probe with a standard bar. A thermal prediction model is then derived by neural-fuzzy theory based on the temperature variation and the thermal drifts. A Graphic User Interface (GUI) system is also built to provide a user-friendly operation interface, implemented with Inprise C++ Builder. The experimental results show that the thermal prediction model developed with the neural-fuzzy methodology can improve machining accuracy from 80µm to 3µm. Compared with multi-variable linear regression analysis, the compensation accuracy is improved from ±10µm to ±3µm.
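
The paper's neural-fuzzy model cannot be reproduced from the abstract; the sketch below instead shows the multi-variable linear regression baseline it is compared against, fitted by ordinary least squares on hypothetical two-sensor calibration data:

```python
def fit_linear(temps, drifts):
    """Least-squares fit drift ~ a*T1 + b*T2 + c via the normal equations.
    This is the regression baseline the paper compares against, not the
    neural-fuzzy model itself."""
    rows = [[t1, t2, 1.0] for t1, t2 in temps]
    # Build X^T X and X^T y for the 3 coefficients.
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * d for r, d in zip(rows, drifts)) for i in range(3)]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, 3):
            f = xtx[r][col] / xtx[col][col]
            xtx[r] = [u - f * w for u, w in zip(xtx[r], xtx[col])]
            xty[r] -= f * xty[col]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):   # back substitution
        coef[r] = (xty[r] - sum(xtx[r][c] * coef[c] for c in range(r + 1, 3))) / xtx[r][r]
    return coef

# Synthetic calibration run: drift = 0.8*T1 + 0.3*T2 + 2 (µm), exactly linear.
temps = [(20, 21), (25, 22), (30, 26), (35, 24), (40, 30)]
drifts = [0.8 * t1 + 0.3 * t2 + 2 for t1, t2 in temps]
a, b, c = fit_linear(temps, drifts)
```

The online compensation then subtracts the predicted drift from the commanded position; the paper's contribution is replacing this linear predictor with a neural-fuzzy one that captures the nonlinear thermal behavior.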

  3. Optical System Error Analysis and Calibration Method of High-Accuracy Star Trackers

    Directory of Open Access Journals (Sweden)

    Zheng You

    2013-04-01

    Full Text Available The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. Research in this field has so far lacked a systematic and universal analysis. This paper proposes in detail an approach for the synthetic error analysis of the star tracker, without the complicated theoretical derivation. This approach can determine the error propagation relationship of the star tracker, and can be used to build an error model intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment is designed and conducted. Excellent calibration results are achieved based on the calibration model. To summarize, the error analysis approach and the calibration method are proved to be adequate and precise, and could provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers.

  4. Optical system error analysis and calibration method of high-accuracy star trackers.

    Science.gov (United States)

    Sun, Ting; Xing, Fei; You, Zheng

    2013-04-08

    The star tracker is a high-accuracy attitude measurement device widely used in spacecraft. Its performance depends largely on the precision of the optical system parameters. Therefore, the analysis of the optical system parameter errors and a precise calibration model are crucial to the accuracy of the star tracker. Research in this field has so far lacked a systematic and universal analysis. This paper proposes in detail an approach for the synthetic error analysis of the star tracker, without the complicated theoretical derivation. This approach can determine the error propagation relationship of the star tracker, and can be used to build an error model intuitively and systematically. The analysis results can be used as a foundation and a guide for the optical design, calibration, and compensation of the star tracker. A calibration experiment is designed and conducted. Excellent calibration results are achieved based on the calibration model. To summarize, the error analysis approach and the calibration method are proved to be adequate and precise, and could provide an important guarantee for the design, manufacture, and measurement of high-accuracy star trackers.

  5. Accounting for covariate measurement error in a Cox model analysis of recurrence of depression.

    Science.gov (United States)

    Liu, K; Mazumdar, S; Stone, R A; Dew, M A; Houck, P R; Reynolds, C F

    2001-01-01

    When a covariate measured with error is used as a predictor in a survival analysis using the Cox model, the parameter estimate is usually biased. In clinical research, covariates measured without error such as treatment procedure or sex are often used in conjunction with a covariate measured with error. In a randomized clinical trial of two types of treatments, we account for the measurement error in the covariate, log-transformed total rapid eye movement (REM) activity counts, in a Cox model analysis of the time to recurrence of major depression in an elderly population. Regression calibration and two variants of a likelihood-based approach are used to account for measurement error. The likelihood-based approach is extended to account for the correlation between replicate measures of the covariate. Using the replicate data decreases the standard error of the parameter estimate for log(total REM) counts while maintaining the bias reduction of the estimate. We conclude that covariate measurement error and the correlation between replicates can affect results in a Cox model analysis and should be accounted for. In the depression data, these methods render comparable results that have less bias than the results when measurement error is ignored.
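
Regression calibration is easiest to see in a linear model (the paper applies it inside a Cox model, which the abstract does not give enough detail to reproduce). The sketch below uses the classical attenuation correction with two replicate measurements; all data are synthetic:

```python
import random

def attenuation_correction(w1, w2, y):
    """Correct the naive slope from regressing y on the average of two
    replicate, error-prone measurements (w1, w2) of a true covariate.
    Linear-model illustration of the regression-calibration idea."""
    n = len(y)
    wbar = [(a + b) / 2 for a, b in zip(w1, w2)]
    mw, my = sum(wbar) / n, sum(y) / n
    var_w = sum((w - mw) ** 2 for w in wbar) / n
    cov_wy = sum((w - mw) * (v - my) for w, v in zip(wbar, y)) / n
    naive = cov_wy / var_w
    # Within-pair differences estimate the measurement-error variance:
    # Var(w1 - w2) = 2*var_u, and the 2-replicate mean carries var_u/2.
    d = [a - b for a, b in zip(w1, w2)]
    md = sum(d) / n
    var_u = sum((x - md) ** 2 for x in d) / n / 2
    lam = (var_w - var_u / 2) / var_w      # estimated reliability of wbar
    return naive, naive / lam

random.seed(0)
truth = [random.gauss(0, 1) for _ in range(2000)]
y = [2 * x for x in truth]                         # true slope = 2
w1 = [x + random.gauss(0, 1) for x in truth]       # replicate 1
w2 = [x + random.gauss(0, 1) for x in truth]       # replicate 2
naive, corrected = attenuation_correction(w1, w2, y)
```

As the abstract notes for the Cox setting, using the replicates both reduces bias (via the reliability estimate) and tightens the precision of the corrected coefficient relative to a single noisy measurement.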

  6. Errors in statistical decision making Chapter 2 in Applied Statistics in Agricultural, Biological, and Environmental Sciences

    Science.gov (United States)

    Agronomic and Environmental research experiments result in data that are analyzed using statistical methods. These data are unavoidably accompanied by uncertainty. Decisions about hypotheses, based on statistical analyses of these data are therefore subject to error. This error is of three types,...

  7. Approximate error conjugation gradient minimization methods

    Science.gov (United States)

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
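
The embodiment can be sketched as conjugate-gradient minimization of a least-squares objective in which the error (residual) is evaluated only on a chosen subset of rays; the toy system below is a stand-in, not the patent's implementation:

```python
def cg_subset(rays, b, subset, n_unknowns, iters=10, tol=1e-12):
    """Conjugate-gradient minimization of the sum of squared ray errors,
    where the error calculation uses only the rays listed in `subset`
    (the full ray set never enters the iteration)."""
    def residual(x, idx):
        # r_i = (A x - b)_i for the chosen rays
        return [sum(rays[i][j] * x[j] for j in range(n_unknowns)) - b[i]
                for i in idx]
    def gradient(x, idx):
        r = residual(x, idx)
        return [2 * sum(rk * rays[i][j] for rk, i in zip(r, idx))
                for j in range(n_unknowns)]
    x = [0.0] * n_unknowns
    g = gradient(x, subset)
    d = [-v for v in g]
    for _ in range(iters):
        gg = sum(v * v for v in g)
        if gg < tol:
            break
        # Exact line-search minimum along d for the quadratic objective.
        ad = [sum(rays[i][j] * d[j] for j in range(n_unknowns)) for i in subset]
        alpha = gg / (2 * sum(v * v for v in ad))
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = gradient(x, subset)
        beta = sum(v * v for v in g_new) / gg   # Fletcher-Reeves update
        d = [-v + beta * w for v, w in zip(g_new, d)]
        g = g_new
    return x

# Six "rays" through two unknowns; b is consistent with x* = (3, -1).
rays = [[1, 0], [0, 1], [1, 1], [1, -1], [2, 1], [1, 2]]
b = [3, -1, 2, 4, 5, 1]
x = cg_subset(rays, b, subset=[0, 2, 4], n_unknowns=2)
```

Because the chosen subset still determines both unknowns and the data are consistent, minimizing the approximate (subset) error here recovers the same solution as the full problem at a fraction of the per-iteration cost, which is the point of the approximate-error approach.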

  8. Uncovering Sundanese Values by Analyzing Symbolic Meaning of Ménak Priangan Clothing (1800-1942)

    Science.gov (United States)

    Karmila, M.; Suciati; Widiaty, I.

    2016-04-01

    This study investigates symbolic meanings found in Sunda ethnic clothing, particularly the Menak Priangan clothing. It aims to uncover and document those symbolic meanings as an effort to develop the Sunda cultural artefacts of West Java. The study applies visual ethnography and aesthetic methods. The visual method is used to uncover local Sundanese cultural values found in the visualization of Menak Priangan clothing, including design, model, name, and representative colours, which are then related to the local Sundanese aesthetic concepts living within the Priangan community. Furthermore, the aesthetic method is used to explore the role of aesthetic values in empowering visual cultural values within a community, particularly Sundanese aesthetic values. The results show that since the 19th century, Sunda ethnic clothing was limited to Priangan Sunda only, while the traditional clothing worn by Priangan people reflected their social strata, consisting of: a. Menak Gede (Menak pangluhurna: mayor), bearing the title raden; b. Menak Leutik/Santana (mayor's assistant), with the titles asep, mas, agus, and ujang (Nyimas for women); c. Somah/Cacah: ordinary people/lower class. Clothing is a cultural phenomenon reflecting a society's experiences. For Menak people, clothing and its accessories carry important meanings: they wear traditional clothing and accessories as a symbol of the power they hold within the bureaucratic structure and of the social status they bear within the traditional community structure.

  9. Machine Translation as a Model for Overcoming Some Common Errors in English-into-Arabic Translation among EFL University Freshmen

    Science.gov (United States)

    El-Banna, Adel I.; Naeem, Marwa A.

    2016-01-01

    This research work aimed at making use of Machine Translation to help students avoid some syntactic, semantic and pragmatic common errors in translation from English into Arabic. Participants were a hundred and five freshmen who studied the "Translation Common Errors Remedial Program" prepared by the researchers. A testing kit that…

  10. Working memory capacity and task goals modulate error-related ERPs.

    Science.gov (United States)

    Coleman, James R; Watson, Jason M; Strayer, David L

    2018-03-01

    The present study investigated individual differences in information processing following errant behavior. Participants were initially classified as high or as low working memory capacity using the Operation Span Task. In a subsequent session, they then performed a high congruency version of the flanker task under both speed and accuracy stress. We recorded ERPs and behavioral measures of accuracy and response time in the flanker task with a primary focus on processing following an error. The error-related negativity was larger for the high working memory capacity group than for the low working memory capacity group. The positivity following an error (Pe) was modulated to a greater extent by speed-accuracy instruction for the high working memory capacity group than for the low working memory capacity group. These data help to explicate the neural bases of individual differences in working memory capacity and cognitive control. © 2017 Society for Psychophysiological Research.

  11. Development of an integrated system for estimating human error probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Auflick, J.L.; Hahn, H.A.; Morzinski, J.A.

    1998-12-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project had as its main objective the development of a Human Reliability Analysis (HRA), knowledge-based expert system that would provide probabilistic estimates for potential human errors within various risk assessments, safety analysis reports, and hazard assessments. HRA identifies where human errors are most likely, estimates the error rate for individual tasks, and highlights the most beneficial areas for system improvements. This project accomplished three major tasks. First, several prominent HRA techniques and associated databases were collected and translated into an electronic format. Next, the project started a knowledge engineering phase where the expertise, i.e., the procedural rules and data, were extracted from those techniques and compiled into various modules. Finally, these modules, rules, and data were combined into a nearly complete HRA expert system.

  12. Responses to Error: Sentence-Level Error and the Teacher of Basic Writing

    Science.gov (United States)

    Foltz-Gray, Dan

    2012-01-01

    In this article, the author talks about sentence-level error, error in grammar, mechanics, punctuation, usage, and the teacher of basic writing. He states that communities are crawling with teachers and administrators and parents and state legislators and school board members who are engaged in sometimes rancorous debate over what to do about…

  13. Influence of calculation error of total field anomaly in strongly magnetic environments

    Science.gov (United States)

    Yuan, Xiaoyu; Yao, Changli; Zheng, Yuanman; Li, Zelin

    2016-04-01

    An assumption made in many magnetic interpretation techniques is that ΔTact (the total field anomaly, i.e., the measurement given by total field magnetometers after the main geomagnetic field T0 is removed) can be approximated mathematically by ΔTpro (the projection of the anomalous field vector in the direction of the earth's normal field). In order to meet the demand for high-precision processing of magnetic prospecting, the approximation error E between ΔTact and ΔTpro is studied in this research. Generally speaking, the error E is extremely small when anomalies are not greater than about 0.2T0. However, the error E may be large in highly magnetic environments, which leads to significant effects on subsequent quantitative inference. Therefore, we investigate the error E through numerical experiments on high-susceptibility bodies. A systematic error analysis was made by using a 2-D elliptic cylinder model. The error analysis shows that the magnitude of ΔTact is usually larger than that of ΔTpro. This implies that a theoretical anomaly computed without accounting for the error E overestimates the anomaly associated with the body. It is demonstrated through numerical experiments that the error E is significant and should not be ignored. It is also shown that the curves of ΔTpro and the error E have a certain symmetry when the directions of magnetization and the geomagnetic field change. To be more specific, Emax (the maximum of the error E) appears above the center of the magnetic body when the magnetic parameters are fixed. Some other characteristics of the error E are discovered: for instance, the curve of Emax with respect to latitude is symmetrical on both sides of the magnetic equator, and the extremum of Emax is always found in the mid-latitudes. It is also demonstrated that the error E has a great influence on magnetic processing, transformation, and inversion results. It is concluded that when bodies have high magnetic susceptibilities, the error E can
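
The quantities ΔTact, ΔTpro, and the error E between them can be computed directly for a single anomaly vector; the sketch below (illustrative field values in nT, not from the paper) shows E staying small for a weak anomaly and becoming large in a strongly magnetic environment:

```python
import math

def norm(v):
    return math.sqrt(sum(c * c for c in v))

def delta_t_act(t0, ta):
    """What a total-field magnetometer measures after removing |T0|."""
    return norm([a + b for a, b in zip(t0, ta)]) - norm(t0)

def delta_t_pro(t0, ta):
    """Projection of the anomalous vector onto the normal-field direction."""
    return sum(a * b for a, b in zip(t0, ta)) / norm(t0)

t0 = (0.0, 0.0, 50000.0)           # normal field, pointing down
weak = (100.0, 0.0, 200.0)         # anomaly well below 0.2*T0
strong = (30000.0, 0.0, 10000.0)   # strongly magnetic environment
e_weak = abs(delta_t_act(t0, weak) - delta_t_pro(t0, weak))
e_strong = abs(delta_t_act(t0, strong) - delta_t_pro(t0, strong))
```

The second case also illustrates the abstract's observation that |ΔTact| typically exceeds |ΔTpro|: the transverse anomaly component, ignored by the projection, still increases the measured total-field magnitude.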

  14. The proteome and phosphoproteome of maize pollen uncovers fertility candidate proteins.

    Science.gov (United States)

    Chao, Qing; Gao, Zhi-Fang; Wang, Yue-Feng; Li, Zhe; Huang, Xia-He; Wang, Ying-Chun; Mei, Ying-Chang; Zhao, Biligen-Gaowa; Li, Liang; Jiang, Yu-Bo; Wang, Bai-Chen

    2016-06-01

    Maize is unique since it is both monoecious and diclinous (separate male and female flowers on the same plant). We investigated the proteome and phosphoproteome of maize pollen containing modified proteins and here we provide a comprehensive pollen proteome and phosphoproteome which contain 100,990 peptides from 6750 proteins and 5292 phosphorylated sites corresponding to 2257 maize phosphoproteins, respectively. Interestingly, among the total 27 overrepresented phosphosite motifs we identified here, 11 were novel motifs, which suggested different modification mechanisms in plants compared to those of animals. Enrichment analysis of pollen phosphoproteins showed that pathways including DNA synthesis/chromatin structure, regulation of RNA transcription, protein modification, cell organization, signal transduction, cell cycle, vesicle transport, transport of ions and metabolisms, which were involved in pollen development, the following germination and pollen tube growth, were regulated by phosphorylation. In this study, we also found 430 kinases and 105 phosphatases in the maize pollen phosphoproteome, among which calcium dependent protein kinases (CDPKs), leucine rich repeat kinase, SNF1 related protein kinases and MAPK family proteins were heavily enriched and further analyzed. From our research, we also uncovered hundreds of male sterility-associated proteins and phosphoproteins that might influence maize productivity and serve as targets for hybrid maize seed production. At last, a putative complex signaling pathway involving CDPKs, MAPKs, ubiquitin ligases and multiple fertility proteins was constructed. Overall, our data provides new insight for further investigation of protein phosphorylation status in mature maize pollen and construction of maize male sterile mutants in the future.

  15. A chance to avoid mistakes human error

    International Nuclear Information System (INIS)

    Amaro, Pablo; Obeso, Eduardo; Gomez, Ruben

    2010-01-01

    human factor contribution to the events. 'The explanations of the error': The evolution of the human error concept and the causes that lie behind it are presented in this chapter, with several examples to facilitate understanding. In Appendix II, we present a series of 'Cause Codes' used in the industry, intended to aid technicians when they are assessing and investigating events. 'The battle against error': This is the main objective of the book. It presents, one after another, the tools used in the nuclear industry in a practical way. What each tool is, who should use it, and when to use it are described in sufficient detail so that anyone can assimilate the tool and, where applicable, pursue its implementation in their own organization. (authors)

  16. ADVANCED MMIS TOWARD SUBSTANTIAL REDUCTION IN HUMAN ERRORS IN NPPS

    Directory of Open Access Journals (Sweden)

    POONG HYUN SEONG

    2013-04-01

    Full Text Available This paper aims to give an overview of the methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve such a situation through advanced information and communication technologies on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, man-machine interface, operator support systems, and procedures. Upon this investigation, we outlined the concept and technical factors to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to guide the future direction of related research and ultimately supplement the safety of NPPs.

  17. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Seong, Poong Hyun; Kang, Hyun Gook [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Na, Man Gyun [Chosun Univ., Gwangju (Korea, Republic of); Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of); Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Jung, Yoensub [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of)

    2013-04-15

    This paper aims to give an overview of the methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve such a situation through advanced information and communication technologies on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, man-machine interface, operator support systems, and procedures. Upon this investigation, we outlined the concept and technical factors to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to guide the future direction of related research and ultimately supplement the safety of NPPs.

  18. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    International Nuclear Information System (INIS)

    Seong, Poong Hyun; Kang, Hyun Gook; Na, Man Gyun; Kim, Jong Hyun; Heo, Gyunyoung; Jung, Yoensub

    2013-01-01

    This paper aims to give an overview of the methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also severely lower public acceptance of nuclear power. We have to recognize that there is always the possibility of human errors occurring, since humans are not perfect, particularly under stressful conditions. However, we have the opportunity to improve such a situation through advanced information and communication technologies on the basis of lessons learned from our experiences. As important lessons, the authors explain key issues associated with automation, man-machine interface, operator support systems, and procedures. Upon this investigation, we outlined the concept and technical factors to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility for nuclear safety obviously belongs to humans, not to machines. Therefore, safety culture, including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days are described. We expect the ideas of the advanced MMIS proposed in this paper to guide the future direction of related research and ultimately supplement the safety of NPPs.

  19. Cost-Sensitive Feature Selection of Numeric Data with Measurement Errors

    Directory of Open Access Journals (Sweden)

    Hong Zhao

    2013-01-01

    Full Text Available Feature selection is an essential process in data mining applications since it reduces a model’s complexity. However, feature selection with various types of costs is still a new research topic. In this paper, we study the cost-sensitive feature selection problem of numeric data with measurement errors. The major contributions of this paper are fourfold. First, a new data model is built to address test costs and misclassification costs as well as error boundaries. It is distinguished from the existing models mainly by its error boundaries. Second, a covering-based rough set model with normal distribution measurement errors is constructed. With this model, coverings are constructed from data rather than assigned by users. Third, a new cost-sensitive feature selection problem is defined on this model. It is more realistic than the existing feature selection problems. Fourth, both backtracking and heuristic algorithms are proposed to deal with the new problem. Experimental results show the efficiency of the pruning techniques for the backtracking algorithm and the effectiveness of the heuristic algorithm. This study is a step toward realistic applications of cost-sensitive learning.
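
The total-cost trade-off the abstract describes (test costs plus expected misclassification cost) can be sketched with an exhaustive search over feature subsets; the cost figures and toy error model below are assumptions, and the paper itself uses backtracking and heuristic algorithms rather than brute force:

```python
from itertools import combinations

def total_cost(features, test_costs, error_rate, mis_cost):
    """Cost of acquiring the features (test costs) plus the expected
    misclassification cost; error_rate is a callable giving the
    classification error achieved with that feature subset."""
    return sum(test_costs[f] for f in features) + error_rate(features) * mis_cost

# Hypothetical 4-feature problem: adding features lowers the (toy) error.
test_costs = {"f1": 2.0, "f2": 5.0, "f3": 1.0, "f4": 8.0}
err = lambda fs: 0.40 / (1 + len(fs))
subsets = (s for k in range(1, 5) for s in combinations(sorted(test_costs), k))
best = min(subsets, key=lambda s: total_cost(s, test_costs, err, mis_cost=100.0))
```

Note how the optimum is neither the cheapest single feature nor the full set: once the marginal error reduction of an expensive feature no longer repays its test cost, it is excluded, which is the behavior the paper's pruning and heuristics are designed to find efficiently.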

  20. Comparison between calorimeter and HLNC errors

    International Nuclear Information System (INIS)

    Goldman, A.S.; De Ridder, P.; Laszlo, G.

    1991-01-01

    This paper summarizes an error analysis that compares systematic and random errors of total plutonium mass estimated for high-level neutron coincidence counter (HLNC) and calorimeter measurements. This task was part of an International Atomic Energy Agency (IAEA) study on the comparison of the two instruments to determine if HLNC measurement errors met IAEA standards and if the calorimeter gave "significantly" better precision. Our analysis was based on propagation of error models that contained all known sources of errors including uncertainties associated with plutonium isotopic measurements. 5 refs., 2 tabs
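
Assuming the usual propagation-of-error convention, independent random and systematic components combine in quadrature; a one-line sketch (the percentage values are illustrative, not from the study):

```python
def combined_sigma(random_pct, systematic_pct):
    """Combine independent random and systematic relative errors in
    quadrature, the standard propagation-of-error assumption."""
    return (random_pct ** 2 + systematic_pct ** 2) ** 0.5

# e.g. 3% random and 4% systematic relative error on total Pu mass
total = combined_sigma(3.0, 4.0)
```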