WorldWideScience

Sample records for voi definition errors

  1. Simplifying volumes-of-interest (VOIs) definition in quantitative SPECT: Beyond manual definition of 3D whole-organ VOIs.

    Science.gov (United States)

    Vicente, Esther M; Lodge, Martin A; Rowe, Steven P; Wahl, Richard L; Frey, Eric C

    2017-05-01

    We investigated the feasibility of using simpler methods than manual whole-organ volume-of-interest (VOI) definition to estimate the organ activity concentration in single photon emission computed tomography (SPECT) in cases where the activity in the organ can be assumed to be uniformly distributed on the scale of the voxel size. In particular, we investigated an anatomic region-of-interest (ROI) defined in a single transaxial slice, and a single sphere placed inside the organ boundaries. The evaluation was carried out using Monte Carlo simulations based on patient (111)In-pentetreotide SPECT and computed tomography (CT) images. We modeled constant activity concentrations in each organ, validating this assumption by comparing the distribution of voxel values inside the organ VOIs of the simulated data with the patient data. We simulated projection data corresponding to 100, 50, and 25% of the clinical count level to study the effects of noise level due to shortened acquisition time. Images were reconstructed using a previously validated quantitative SPECT reconstruction method. The evaluation was performed in terms of the accuracy and precision of the activity concentration estimates. The results demonstrated that the non-uniform image intensity observed in the reconstructed images in the organs with normal uptake was consistent with uniform activity concentration in the organs on the scale of the voxel size; observed non-uniformities in image intensity were due to a combination of partial-volume effects at the boundaries of the organ, artifacts in the reconstructed image due to collimator-detector response compensation, and noise. Using an ROI defined in a single transaxial slice produced biases similar to those of the three-dimensional (3D) whole-organ VOIs, provided that the transaxial slice was near the central plane of the organ and that the pixels from the organ boundaries were not included in the ROI. Although this slice method was sensitive to noise
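The comparison at the heart of this record can be illustrated on a synthetic uniform phantom. The sketch below is not the authors' code; all names and values are illustrative. It contrasts a manual 3D whole-organ VOI mean with the simpler single-sphere estimate, which agree exactly when activity is uniform and the sphere lies inside the organ:

```python
import numpy as np

def sphere_voi_mean(image, center, radius_vox):
    """Mean voxel value inside a spherical VOI placed within the organ."""
    z, y, x = np.ogrid[:image.shape[0], :image.shape[1], :image.shape[2]]
    cz, cy, cx = center
    inside = (z - cz) ** 2 + (y - cy) ** 2 + (x - cx) ** 2 <= radius_vox ** 2
    return float(image[inside].mean())

# Synthetic phantom: uniform activity 5.0 in an ellipsoidal "organ", background 1.0
img = np.ones((40, 40, 40))
z, y, x = np.ogrid[:40, :40, :40]
organ = (z - 20) ** 2 / 15 ** 2 + (y - 20) ** 2 / 10 ** 2 + (x - 20) ** 2 / 10 ** 2 <= 1.0
img[organ] = 5.0

whole_organ = float(img[organ].mean())            # manual 3D whole-organ VOI
sphere = sphere_voi_mean(img, (20, 20, 20), 5)    # simpler single-sphere VOI
print(whole_organ, sphere)  # both 5.0 for a noise-free uniform organ
```

With noise and partial-volume effects at the organ boundary, as in the reconstructed SPECT images, the two estimates diverge; that divergence is what the accuracy and precision evaluation quantifies.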

  2. Oral Definitions of Newly Learned Words: An Error Analysis

    Science.gov (United States)

    Steele, Sara C.

    2012-01-01

    This study examined and compared patterns of errors in the oral definitions of newly learned words. Fifteen 9- to 11-year-old children with language learning disability (LLD) and 15 typically developing age-matched peers inferred the meanings of 20 nonsense words from four novel reading passages. After reading, children provided oral definitions…

  3. What do family physicians consider an error? A comparison of definitions and physician perception

    Directory of Open Access Journals (Sweden)

    Pallerla Harini

    2006-12-01

    Full Text Available Abstract Background Physicians are being asked to report errors from primary care, but little is known about how they apply the term "error." This study qualitatively assesses the relationship between the variety of error definitions found in the medical literature and physicians' assessments of whether an error occurred in a series of clinical scenarios. Methods A systematic literature review and pilot survey results were analyzed qualitatively to search for insights into what may affect the use of the term error. The National Library of Medicine was systematically searched for medical error definitions. Survey participants were a random sample of active members of the American Academy of Family Physicians (AAFP) and a selected sample of family physician patient safety "experts." A survey consisting of 5 clinical scenarios with problems (wrong test performed, abnormal result not followed up, abnormal result overlooked, blood tube broken, and missing scan results) was sent by mail to AAFP members and by e-mail to the experts. Physicians were asked to judge if an error occurred. A qualitative analysis was performed via "immersion and crystallization" of emergent insights from the collected data. Results While one definition, originated by James Reason, predominated in the literature search, we found 25 different definitions of error in the medical literature. Surveys were returned by 28.5% of 1000 AAFP members and 92% of 25 experts. Of the 5 scenarios, 100% felt overlooking an abnormal result was an error. For other scenarios there was less agreement (experts and AAFP members, respectively, agreeing an error occurred: 100 and 87% when the wrong test was performed, 96 and 87% when an abnormal test was not followed up, 74 and 62% when scan results were not available during a patient visit, and 57 and 47% when a blood tube was broken). Through qualitative analysis, we found that three areas may affect how physicians make decisions about error: the

  4. Error rates in forensic DNA analysis: definition, numbers, impact and communication.

    Science.gov (United States)

    Kloosterman, Ate; Sjerps, Marjan; Quak, Astrid

    2014-09-01

    Forensic DNA casework is currently regarded as one of the most important types of forensic evidence, and important decisions in intelligence and justice are based on it. However, errors occasionally occur and may have very serious consequences. In other domains, error rates have been defined and published. The forensic domain is lagging behind concerning this transparency for various reasons. In this paper we provide definitions and observed frequencies for different types of errors at the Human Biological Traces Department of the Netherlands Forensic Institute (NFI) over the years 2008-2012. Furthermore, we assess their actual and potential impact and describe how the NFI deals with the communication of these numbers to the legal justice system. We conclude that the observed relative frequency of quality failures is comparable to studies from clinical laboratories and genetic testing centres. Furthermore, this frequency is constant over the five-year study period. The most common causes of failures related to the laboratory process were contamination and human error. Most human errors could be corrected, whereas gross contamination in crime samples often resulted in irreversible consequences. Hence this type of contamination is identified as the most significant source of error. Of the known contamination incidents, most were detected by the NFI quality control system before the report was issued to the authorities, and thus did not lead to flawed decisions like false convictions. However, in a very limited number of cases crucial errors were detected after the report was issued, sometimes with severe consequences. Many of these errors were made in the post-analytical phase. The error rates reported in this paper are useful for quality improvement and benchmarking, and contribute to an open research culture that promotes public trust. However, they are irrelevant in the context of a particular case. Here, case-specific probabilities of undetected errors are needed.

  5. RTP-based broadcast streaming of high definition H.264/AVC video: an error robustness evaluation

    Institute of Scientific and Technical Information of China (English)

    HILLESTAD Odd Inge; JETLUND Ola; PERKIS Andrew

    2006-01-01

    In this work, we present an evaluation of the performance and error robustness of RTP-based broadcast streaming of high-quality high-definition (HD) H.264/AVC video. Using a fully controlled IP test bed (Hillestad et al., 2005), we broadcast high-definition video over RTP/UDP, and use an IP network emulator to introduce a varying amount of randomly distributed packet loss. A high-performance network interface monitoring card is used to capture the video packets into a trace file. Purpose-built software parses the trace file, analyzes the RTP stream and assembles the correctly received NAL units into an H.264/AVC Annex B byte stream file, which is subsequently decoded by JVT JM 10.1 reference software. The proposed measurement setup is a novel, practical and intuitive approach to perform error resilience testing of real-world H.264/AVC broadcast applications. Through a series of experiments, we evaluate some of the error resilience features of the H.264/AVC standard, and see how they perform at packet loss rates from 0.01% to 5%. The results confirmed that an appropriate slice partitioning scheme is essential for graceful degradation of received quality in the case of packet loss. While flexible macroblock ordering reduces compression efficiency by about 1 dB for our test material, reconstructed video quality is improved for loss rates above 0.25%.
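The trace analysis described above hinges on RTP sequence numbers to decide which packets, and hence which NAL units, were correctly received. A minimal sketch of counting losses from the 16-bit RTP sequence number field (an illustration of standard RTP handling, not the purpose-built software itself):

```python
def count_rtp_losses(seq_numbers):
    """Count lost RTP packets from received 16-bit sequence numbers,
    accounting for the modulo-65536 wrap-around."""
    lost = 0
    for prev, cur in zip(seq_numbers, seq_numbers[1:]):
        gap = (cur - prev) % 65536  # gap is 1 when no packet was lost in between
        lost += gap - 1
    return lost

# Sequence wraps at 65535 -> 0; packet 1 is missing
print(count_rtp_losses([65534, 65535, 0, 2]))  # -> 1
```

Dividing the loss count by the total packet count gives the realized loss rate, which the experiments sweep from 0.01% to 5%.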

  6. Optimal target VOI size for accurate 4D coregistration of DCE-MRI

    Science.gov (United States)

    Park, Brian; Mikheev, Artem; Zaim Wadghiri, Youssef; Bertrand, Anne; Novikov, Dmitry; Chandarana, Hersh; Rusinek, Henry

    2016-03-01

    Dynamic contrast enhanced (DCE) MRI has emerged as a reliable and diagnostically useful functional imaging technique. A DCE protocol typically lasts 3-15 minutes and results in a time series of N volumes. For automated analysis, it is important that volumes acquired at different times be spatially coregistered. We have recently introduced a novel 4D, or volume time series, coregistration tool based on a user-specified target volume of interest (VOI). However, the relationship between coregistration accuracy and target VOI size has not been investigated. In this study, coregistration accuracy was quantitatively measured using various sized target VOIs. Coregistration of 10 DCE-MRI mouse head image sets was performed with various sized VOIs targeting the mouse brain. Accuracy was quantified by measures based on the union and standard deviation of the coregistered volume time series. Coregistration accuracy was determined to improve rapidly as the size of the VOI increased and approached the approximate volume of the target (mouse brain). Further inflation of the VOI beyond the volume of the target only marginally improved coregistration accuracy. The CPU time needed to accomplish coregistration is a linear function of N and varies gradually with VOI size. From the results of this study, we recommend the optimal size of the VOI to be slightly overinclusive of the target, by approximately 5 voxels, for computationally efficient and accurate coregistration.

  7. Challenges of "VoIP" Communication Systems for Air Traffic Management

    Directory of Open Access Journals (Sweden)

    Miroslav Borković

    2006-01-01

    Full Text Available Today's dense air traffic and the worldwide move to Future Air Navigation System (FANS) concepts demand a high level of modern and reliable Air Traffic Control (ATC) equipment to accommodate customer requirements now and in the future. The work tries to answer the questions that indicate why a new VCS generation (Voice Communication System) based on VoIP (Voice over Internet Protocol) has some essential comparative advantages over the conventional VCS. One of the key arguments favoring the VoIP technology is its low price and wide application in modern communications. Because of the PC-based HMI (Human Machine Interface) application, IP-VCS could easily be implemented in the new ATM data network infrastructure. In the future, the VoIP technology development and VoIP VCS will have to prove that they are ready to meet the requirements of a more flexible, safer and cheaper way of air traffic management.

  8. Requirements for Value of Information (VoI) calculation over mission specifications

    Science.gov (United States)

    Michaelis, James R.

    2017-05-01

    Intelligence, Surveillance, and Reconnaissance (ISR) operations center on providing relevant situational understanding to military commanders and analysts to facilitate decision-making for execution of mission tasks. However, limitations exist in tactical-edge environments on the ability to disseminate digital materials to analysts and decision makers. This work investigates novel methods to calculate the Value of Information (VoI) tied to digital materials (termed information objects) for consumer use, based on interpretation of mission specifications. Following a short survey of related VoI calculation efforts, discussion is provided on mission-centric VoI calculation for digital materials via adoption of the preexisting Missions and Means Framework model.

  9. Error rates in forensic DNA analysis: Definition, numbers, impact and communication

    NARCIS (Netherlands)

    Kloosterman, A.; Sjerps, M.; Quak, A.

    2014-01-01

    Forensic DNA casework is currently regarded as one of the most important types of forensic evidence, and important decisions in intelligence and justice are based on it. However, errors occasionally occur and may have very serious consequences. In other domains, error rates have been defined and pub

  10. Estimating study costs for use in VOI, a study of Dutch publicly funded drug related research

    NARCIS (Netherlands)

    Van Asselt, A.D.; Ramaekers, B.L.; Corro Ramos, I.; Joore, M.A.; Al, M.J.; Lesman-Leegte, I.; Postma, M.J.; Vemer, P.; Feenstra, T.F.

    2016-01-01

    Objectives: To perform value of information (VOI) analyses, an estimate of research costs is needed. However, reference values for such costs are not available. This study aimed to analyze empirical data on research budgets and, by means of a cost tool, provide an overview of costs of several types

  11. [Neuronal mechanisms of motor signal transmission in thalamic Voi nucleus in spasmodic torticollis patients].

    Science.gov (United States)

    Sedov, A S; Raeva, S N; Pavlenko, V B

    2014-01-01

    Neural mechanisms of motor signal transmission in the ventrooral (Voi) nucleus of the motor thalamus during the realization of voluntary and involuntary abnormal (dystonic) movements in patients with spasmodic torticollis were investigated by means of the microelectrode technique. The high reactivity of the cellular Voi elements to various functional (mainly motor) tests was demonstrated. Analysis of neuronal activity showed: (1) the difference in neural mechanisms of motor signal transmission during the realization of voluntary movement with and without involvement of the pathological axial neck muscles, as well as of passive and abnormal involuntary dystonic movements; (2) the significance of the sensory component in the mechanisms of sensorimotor interactions during realization of voluntary and involuntary dystonic head and neck movements causing activation of the axial neck muscles; (3) the important role of rhythmic and synchronized neuronal activity in motor signal transmission during the realization of active and passive movements. Participation of the Voi nucleus in the pathological mechanisms of spasmodic torticollis was shown. The data obtained can be used for identification of the Voi thalamic nucleus during stereotactic neurosurgical operations in patients with spasmodic torticollis, for selection of the optimal destruction (stimulation) target and reduction of postoperative side effects.

  12. Medication Errors - A Review

    OpenAIRE

    Vinay BC; Nikhitha MK; Patel Sunil B

    2015-01-01

    This review article explains the definition of medication errors, the scope of the medication error problem, the types of medication errors, their common causes, the monitoring of medication errors, their consequences, and the prevention and management of medication errors, with clear tables that are easy to understand.

  14. Developing a Value of Information (VoI) Enabled System from Collection to Analysis

    Science.gov (United States)

    2016-11-01

    US Army Research Laboratory report: Developing a Value of Information (VoI)-Enabled System from Collection to Analysis, by Mark R Mittrick, John T Richardson, Alex Vertlieb, and Timothy P Hanratty. Approved for public release.

  15. Cross-cultural adaptation of the Chilean version of the Voice Symptom Scale - VoiSS.

    Science.gov (United States)

    Ruston, Francisco Contreras; Moreti, Felipe; Vivero, Martín; Malebran, Celina; Behlau, Mara

    This research aims to establish the cross-cultural equivalence of the Chilean version of the VoiSS protocol through its cultural and linguistic adaptation. After translation of the VoiSS protocol into Chilean Spanish by two bilingual speech therapists and its back-translation into English, we compared the items of the original tool with the previously translated version. The existing discrepancies were resolved by a consensus committee of five speech therapists, and the translated version was entitled Escala de Síntomas Vocales - ESV, with 30 questions and five answers: "Never", "Occasionally", "Sometimes", "Most of the time", "Always". For cross-cultural equivalence, the protocol was applied to 15 individuals with vocal problems. In each question, the option "Not applicable" was added to the answer choices to identify questions not comprehended or not appropriate for the target population. Two individuals had difficulty answering two questions, which made it necessary to adapt the translation of only one of them. The modified ESV was applied to three further individuals with vocal problems, and there were no questions that were incomprehensible or inappropriate for the Chilean culture. The ESV reflects the original English version, both in the number of questions and in the limitations of the emotional and physical domains. There is now a cross-cultural equivalent of the VoiSS in Chilean Spanish, titled ESV. The validation of the ESV for Chilean Spanish is ongoing.

  16. On the Definitions of Ishikawa and Mann Iterative Processes with Errors

    Institute of Scientific and Technical Information of China (English)

    徐裕光; 周兴伟; 温一新

    2008-01-01

    The purpose of this note is to introduce some new definitions of Ishikawa and Mann iterative processes with random errors, as a correction to those introduced in 1995 by L. S. Liu [J. Math. Anal. Appl. 194 (1995), 114-125]. In his definitions, Liu imposed the requirement that the error terms {un} and {vn} be summable sequences, which runs against the randomness of errors. By virtue of the new definitions, many results based on Liu's definitions remain valid after minor revision.
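For context, the summability requirement at issue can be made explicit. A common rendering of Liu's 1995 Ishikawa iteration with errors, for a mapping $T$ on a Banach space $E$ and sequences $\{\alpha_n\},\{\beta_n\}\subset[0,1]$, is:

```latex
y_n     = (1-\beta_n)\,x_n + \beta_n T x_n + v_n, \\
x_{n+1} = (1-\alpha_n)\,x_n + \alpha_n T y_n + u_n,
\qquad \sum_{n}\|u_n\| < \infty, \quad \sum_{n}\|v_n\| < \infty.
```

The objection is that summability forces $u_n, v_n \to 0$, which is incompatible with genuinely random errors. Revised formulations in the literature (e.g., in the style of Y. Xu) instead use convex combinations $x_{n+1} = a_n x_n + b_n T y_n + c_n u_n$ with $a_n + b_n + c_n = 1$ and $\{u_n\}$ only required to be bounded; the note's own definitions are presumably in this spirit, though their exact form is not reproduced in the abstract.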

  17. A two-compartment phantom for VOI profile measurements in small-bore 31P MR spectroscopy

    DEFF Research Database (Denmark)

    Vikhoff, Babro; Stubgaard, Max; Stensgaard, Anders

    1998-01-01

    A two-compartment gel phantom for VOI profile measurements in volume-selective 31P spectroscopy in small-bore units is presented. The phantom is cylindrical with two compartments divided by a very thin (30 microm) polyethene film. This thin film permits measurements with a minimum of susceptibility...... from the two compartments was measured for each position and the data were evaluated following differentiation. We have found this phantom suitable for VOI profile measurements of ISIS in small-bore systems. The phantom forms a useful complement to recommended phantoms for small bore-spectroscopy...

  18. Les kystes hydatiques du foie rompus dans les voies biliaires: à propos de 120 cas

    Science.gov (United States)

    Moujahid, Mountassir; Tajdine, Mohamed Tarik

    2011-01-01

    Retrospective study reporting a series of hydatid cysts ruptured into the bile ducts, collected in the surgery department of the Avicenne military hospital in Marrakech. Between 1990 and 2008, of 536 hepatic hydatid cysts operated on in the department, 120 (22.38%) were complicated by rupture into the bile ducts. There were 82 men and 38 women. The mean age was 35 years, with extremes ranging from 10 to 60 years. The clinical picture was dominated by attacks of cholangitis or right flank pain. Jaundice was isolated in eight cases. The biliocystic fistula was latent in more than 50% of cases. Treatment consisted of resection of the protruding dome in 103 cases (85.84%), pericystectomy in 11 patients (9.16%), and left lobectomy in six cases (5%). Treatment of the biliocystic fistula consisted of suture in 36 patients and bipolar drainage in 25 cases; cysto-biliary disconnection or trans-hepatico-cystic choledochotomy according to Perdomo was performed in 49 cases, and a choledocho-duodenal bilio-digestive anastomosis in 10 cases. The mean hospital stay was 20 days. We report two deaths from septic shock and a third from encephalopathy secondary to biliary cirrhosis. Morbidity comprised eight subphrenic abscesses, twelve prolonged biliary fistulas, and two intestinal obstructions. Hydatid cysts ruptured into the bile ducts are the most serious complication of this benign disease. Treatment relies on radical methods, which are of recognized efficacy but dangerous to perform, and on conservative methods, in particular cysto-biliary disconnection, a simple method that gives good short- and long-term results. PMID:22384289

  19. Les kystes hydatiques du foie rompus dans les voies biliaires: a propos de 120 cas

    Directory of Open Access Journals (Sweden)

    Mountassir Moujahid

    2011-11-01


  20. The Voice Symptom Scale (VoiSS) and the Vocal Handicap Index (VHI): a comparison of structure and content.

    Science.gov (United States)

    Wilson, J A; Webb, A; Carding, P N; Steen, I N; MacKenzie, K; Deary, I J

    2004-04-01

    Self report measures of voice function are in frequent use, but have had inadequate psychometric evaluation. We aimed to perform a substantial factor analysis of two measures of voice impairment, the Voice Symptom Scale (VoiSS) and the Voice Handicap Index (VHI). Both the 30-item questionnaires were completed by 319 dysphonic voice clinic attenders (99M, 220F). Principal components analysis confirmed that both instruments reflected general voice abnormality. The VoiSS comprised three factors - impairment (15 items), emotional (8 items) and related physical symptoms (7 items) - each with a good internal consistency. Analysis of the VHI suggested that it contains only two subscales. When a three-factor solution was imposed on the data, analysis failed to support the currently advised three 10-item subscale interpretations. Instead, we found a physical (voice impairment) domain (8 items), a psychosocial domain (14 items) and a factor with 8 items related to difficulty in being heard. The VHI requires further statistical refinement to identify its subscale structure. The VoiSS was developed from 800 subjects and is psychometrically the most robust and extensively validated self report voice measure available.

  1. Gestion des voies aériennes supérieures et cellulite cervico-faciale

    Science.gov (United States)

    Kechna, Hicham; Nadour, Karim; Ouzzad, Omar; Chkoura, Khalid; Choumi, Faical; Loutid, Jaouad; Moumine, Mohamed; Hachimi, Moulay Ahmed

    2017-01-01

    Introduction: Maxillofacial cellulitis is a medical-surgical emergency. These patients are most often taken to the operating room, on the one hand for laying open and draining the collections, and on the other hand for bacteriological sampling. Anesthesia in this type of patient presents potential difficulties in controlling the upper airways. Methods: This was a retrospective study carried out over 24 months in the anesthesia, intensive care, and emergency division of the Moulay Ismail military hospital in Meknès, in collaboration with the stomatology and otorhinolaryngology departments. All patients admitted to the operating room for surgical treatment under general anesthesia of cervical and/or maxillofacial cellulitis were included. Results: We collected 22 records. The sex ratio was 4.5 in favor of men. The mean age of the patients was 29 years. Regarding airway management, standard laryngoscopic intubation was chosen for most patients (17 patients). Awake fiberoptic intubation was performed in 3 patients, a primary tracheotomy was done in one patient, and a salvage retrograde intubation was chosen in another. Conclusion: Two challenges face any anesthetist managing patients with cervicofacial cellulitis: a risk of difficult ventilation and a concern of laborious intubation. Both risks must be anticipated as a matter of principle, and an anticipatory strategy should be developed.

  2. Artillery Barrel Straightness Definition and Error Assessment Method

    Institute of Scientific and Technical Information of China (English)

    程杰

    2016-01-01

    First, the causes of artillery barrel bending are analyzed. A definition of artillery barrel straightness is then proposed. Finally, the methods for assessing artillery barrel straightness error are elaborated in detail, including the two-endpoint connecting line method, the least-squares method, and the minimum containment (minimum zone) method.
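The first two assessment methods named above can be sketched numerically. This is an illustrative reconstruction under the usual definitions, not the paper's code; the minimum zone method, which requires a Chebyshev (minimax) fit, is omitted:

```python
import numpy as np

def straightness_two_endpoints(x, y):
    """Peak-to-valley deviation about the line joining the two end points."""
    slope = (y[-1] - y[0]) / (x[-1] - x[0])
    dev = y - (y[0] + slope * (x - x[0]))
    return float(dev.max() - dev.min())

def straightness_least_squares(x, y):
    """Peak-to-valley deviation about the least-squares fitted line."""
    a, b = np.polyfit(x, y, 1)
    dev = y - (a * x + b)
    return float(dev.max() - dev.min())

# Hypothetical bore-axis deviations measured along the barrel (mm)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 0.1, 0.3, 0.1, 0.0])
print(straightness_two_endpoints(x, y))
print(straightness_least_squares(x, y))
```

The two estimates coincide for this symmetric example but differ in general; the minimum zone value is always the smallest of the three.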

  3. The definition of the railway position control error in the plane and profile using the optical-electronic system

    Science.gov (United States)

    Nikulin, Anton V.; Timofeev, Alexandr N.; Nekrylov, Ivan S.

    2015-05-01

    The continuous development of high-speed railway traffic worldwide tightens requirements, including those on the accuracy of installation and control of railway track position. For current railway-track maintenance technologies that use the track's absolute coordinates, a promising solution is the creation of a special fiducial network along the railway lines. In this case, optical-electronic systems referenced to these fiducial points can obtain objective information on the actual position of the railway track in longitudinal profile and in the plane, with an error not exceeding 1.5 mm, under the rather severe conditions of continuous operation of track machines at speeds up to 10 km/h.

  4. Measurement of SUVs-Maximum for Normal Region Using VOI in PET/MRI and PET/CT

    Directory of Open Access Journals (Sweden)

    Jeong Kyu Park

    2014-01-01

    Full Text Available The purpose of this research is to establish an overall data set associated with the VOI (volume of interest) that is available for simultaneous assessment of PET/MRI and PET/CT, regardless of the use of contrast media. The participants in this investigation were 26 healthy examinees in Korea. SUV (standardized uptake value) maximum evaluation for whole-body F-18 FDG (fluorodeoxyglucose) PET/MRI images using a VOI of a normal region exhibited a very significant difference from that for whole-body F-18 FDG PET/CT images (significant probability value (P0.8). It is shown that one needs to decide the SUV-maximum for PET/MRI with a reduction of 25.0~26.4% from the evaluated value, and with a reduction of 28.8~29.4% in the same situation but with the use of contrast media. The use of the SUV-LBM maximum (SUV lean-body-mass maximum) is very advantageous to medical doctors and researchers in reading overall images of PET/CT and PET/MRI, considering its convenience and efficiency. We expect that this research enhances the level of early-stage accurate diagnosis with whole-body images of PET/MRI and PET/CT.
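The SUV arithmetic underlying such comparisons is standard and can be sketched as follows. The function name and the numbers are illustrative assumptions, not values from the study:

```python
import numpy as np

def suv_max(voi_kbq_per_ml, injected_dose_mbq, mass_g):
    """SUV-maximum over a VOI: the hottest voxel's activity concentration
    normalised by injected dose per unit mass (body weight for SUV-BW,
    lean body mass for SUV-LBM)."""
    dose_kbq = injected_dose_mbq * 1000.0  # MBq -> kBq
    return float(np.max(voi_kbq_per_ml) * mass_g / dose_kbq)

# Hypothetical numbers: peak 10 kBq/mL in the VOI, 370 MBq injected, 70 kg body weight
voi = np.array([6.0, 8.5, 10.0, 7.2])
print(round(suv_max(voi, 370.0, 70_000.0), 2))  # -> 1.89
```

Substituting lean body mass for body weight in the `mass_g` argument yields the SUV-LBM maximum discussed in the abstract.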

  5. Systematic error revisited

    Energy Technology Data Exchange (ETDEWEB)

    Glosup, J.G.; Axelrod, M.C.

    1996-08-05

    The American National Standards Institute (ANSI) defines systematic error as "an error which remains constant over replicative measurements." It would seem from the ANSI definition that a systematic error is not really an error at all; it is merely a failure to calibrate the measurement system properly, because if the error is constant, why not simply correct for it? Yet systematic errors undoubtedly exist, and they differ in some fundamental way from the kind of errors we call random. Early papers by Eisenhart and by Youden discussed systematic versus random error with regard to measurements in the physical sciences, but not in a fundamental way, and the distinction remains clouded by controversy. The lack of general agreement on definitions has led to a plethora of different and often confusing methods for quantifying the total uncertainty of a measurement that incorporates both its systematic and random errors. Some assert that systematic error should be treated by non-statistical methods. We disagree with this approach; we provide basic definitions based on entropy concepts, and a statistical methodology for combining errors and making statements of total measurement uncertainty. We illustrate our methods with radiometric assay data.

  6. Prioritising interventions against medication errors

    DEFF Research Database (Denmark)

    Lisby, Marianne; Pape-Larsen, Louise; Sørensen, Ann Lykkegaard

    2011-01-01

    Abstract Authors: Lisby M, Larsen LP, Soerensen AL, Nielsen LP, Mainz J Title: Prioritising interventions against medication errors – the importance of a definition Objective: To develop and test a restricted definition of medication errors across health care settings in Denmark Methods: Medication...... errors constitute a major quality and safety problem in modern healthcare. However, far from all are clinically important. The prevalence of medication errors ranges from 2-75%, indicating a global problem in defining and measuring these [1]. New cut-off levels focusing on the clinical impact of medication...... errors are therefore needed. Development of definition: A definition of medication errors including an index of error types for each stage in the medication process was developed from existing terminology and through a modified Delphi-process in 2008. The Delphi panel consisted of 25 interdisciplinary...

  7. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  8. Evaluation of elastix-based propagated align algorithm for VOI- and voxel-based analysis of longitudinal F-18-FDG PET/CT data from patients with non-small cell lung cancer (NSCLC)

    OpenAIRE

    Kerner, Gerald S. M. A.; Fischer, Alexander; Koole, Michel J. B.; Pruim, Jan; Groen, Harry J M

    2015-01-01

    Background: Deformable image registration allows volume of interest (VOI)- and voxel-based analysis of longitudinal changes in fluorodeoxyglucose (FDG) tumor uptake in patients with non-small cell lung cancer (NSCLC). This study evaluates the performance of the elastix toolbox deformable image registration algorithm for VOI and voxel-wise assessment of longitudinal variations in FDG tumor uptake in NSCLC patients. Methods: Evaluation of the elastix toolbox was performed using F-18-FDG PET/CT ...

  9. Evaluation of elastix-based propagated align algorithm for VOI- and voxel-based analysis of longitudinal 18F-FDG PET/CT data from patients with non-small cell lung cancer (NSCLC)

    OpenAIRE

    Kerner, Gerald SMA; Fischer, Alexander; Koole, Michel JB; Pruim, Jan; Groen, Harry JM

    2015-01-01

    Background Deformable image registration allows volume of interest (VOI)- and voxel-based analysis of longitudinal changes in fluorodeoxyglucose (FDG) tumor uptake in patients with non-small cell lung cancer (NSCLC). This study evaluates the performance of the elastix toolbox deformable image registration algorithm for VOI and voxel-wise assessment of longitudinal variations in FDG tumor uptake in NSCLC patients. Methods Evaluation of the elastix toolbox was performed using 18F-FDG PET/CT at ...

  10. Under which conditions, additional monitoring data are worth gathering for improving decision making? Application of the VOI theory in the Bayesian Event Tree eruption forecasting framework

    Science.gov (United States)

    Loschetter, Annick; Rohmer, Jérémy

    2016-04-01

    Standard and new-generation monitoring observations provide, in almost real time, important information about the evolution of a volcanic system. These observations are used to update the model and contribute to a better hazard assessment and to support decision making concerning potential evacuation. The BET_EF framework (based on a Bayesian Event Tree) developed by INGV enables the integration of monitoring information with a view to decision making. Using this framework, the objectives of the present work are: i. to propose a method to assess the added value of information from monitoring, within Value Of Information (VOI) theory; ii. to perform sensitivity analysis on the different parameters that influence the VOI from monitoring. VOI consists in assessing the possible increase in expected value provided by gathering information, for instance through monitoring. Basically, the VOI is the difference between the value with information and the value without additional information, in a cost-benefit approach. This theory is well suited to situations that can be represented in the form of a decision tree, such as the BET_EF tool. Reference values and ranges of variation (for sensitivity analysis) were defined for input parameters, based on data from the MESIMEX exercise (performed at Vesuvius in 2006). Complementary methods for sensitivity analysis were implemented: local, global using Sobol' indices, and regional using Contribution to Sample Mean and Variance plots. The results (specific to the case considered) obtained with the different techniques are in good agreement and enable answering the following questions: i. Which characteristics of monitoring are important for early warning (reliability)? ii. How do experts' opinions influence the hazard assessment and thus the decision? Concerning the characteristics of monitoring, the most influential parameters are the means rather than the variances for the case considered
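
    The cost-benefit definition of VOI given above can be sketched for a toy evacuate-or-stay decision with an imperfect monitoring signal. All probabilities and costs below are hypothetical, and this is not the BET_EF implementation:

    ```python
    def expected_cost(action, p):
        """Expected cost of an action given eruption probability p.
        The cost matrix is hypothetical: evacuation has a fixed cost;
        failing to evacuate before an eruption is ten times worse."""
        cost = {("evac", True): 1.0, ("evac", False): 1.0,
                ("stay", True): 10.0, ("stay", False): 0.0}
        return p * cost[(action, True)] + (1 - p) * cost[(action, False)]

    def best_cost(p):
        # decision maker picks the cheaper action in expectation
        return min(expected_cost(a, p) for a in ("evac", "stay"))

    def voi(p, sensitivity, specificity):
        """Value of monitoring information: expected cost without the
        signal minus expected cost when acting on the signal, using
        Bayes' rule to update p after an alarm or a quiet reading."""
        p_alarm = sensitivity * p + (1 - specificity) * (1 - p)
        p_quiet = 1 - p_alarm
        post_alarm = sensitivity * p / p_alarm if p_alarm else 0.0
        post_quiet = (1 - sensitivity) * p / p_quiet if p_quiet else 0.0
        with_info = (p_alarm * best_cost(post_alarm)
                     + p_quiet * best_cost(post_quiet))
        return best_cost(p) - with_info
    ```

    With this toy model, a reliable signal has positive VOI, a perfectly reliable one has the largest VOI, and an uninformative signal (sensitivity = 1 − specificity) is worth nothing — the pattern the sensitivity analysis above explores on the real decision tree.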

  11. New definitions of pointing stability - ac and dc effects. [constant and time-dependent pointing error effects on image sensor performance

    Science.gov (United States)

    Lucke, Robert L.; Sirlin, Samuel W.; San Martin, A. M.

    1992-01-01

    For most imaging sensors, a constant (dc) pointing error is unimportant (unless large), but time-dependent (ac) errors degrade performance by either distorting or smearing the image. When properly quantified, the separation of the root-mean-square effects of random line-of-sight motions into dc and ac components can be used to obtain the minimum necessary line-of-sight stability specifications. The relation between stability requirements and sensor resolution is discussed, with a view to improving communication between the data analyst and the control systems engineer.
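
    The separation described above can be sketched numerically: the dc component is the mean line-of-sight error, the ac component is the rms of the motion about that mean, and the total mean-square error is the sum of their squares. A minimal illustration (the sample values are made up):

    ```python
    import math

    def dc_ac_decomposition(samples):
        """Split line-of-sight error samples into a constant (dc)
        pointing offset and the rms of the time-varying (ac) motion.
        Identity: total mean-square error = dc**2 + ac_rms**2."""
        n = len(samples)
        dc = sum(samples) / n                                # constant offset
        ac_rms = math.sqrt(sum((x - dc) ** 2 for x in samples) / n)
        return dc, ac_rms

    samples = [1.2, 0.8, 1.1, 0.9]          # hypothetical pointing errors
    dc, ac = dc_ac_decomposition(samples)
    total_rms = math.sqrt(sum(x * x for x in samples) / len(samples))
    # total_rms**2 equals dc**2 + ac**2, so a spec on total rms can be
    # split into separate dc and ac stability requirements.
    ```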

  12. VOI4N02

    African Journals Online (AJOL)

    AFRICAN JOURNAL OF CLINICAL & EXPERIMENTAL MICROBIOLOGY JULY. 2003 ... A survey of the aetiological agents of diarrhoea in children under 5 years of age was carried out in Osogbo, Osun State. .... Case control study of endemic diarrheal disease in Thai children. J. Infect. Dis. 1989 ; ... acute diarrhea in New.

  13. Real-time Error Concealment Algorithm for H.264 High Definition Video Decoding

    Institute of Scientific and Technical Information of China (English)

    谢涛; 李志华; 黄轶伦

    2011-01-01

    Aiming at real-time decoding of H.264 high-definition video over IP networks with packet loss, this paper analyzes the characteristics of high-definition video streams and proposes a real-time error concealment algorithm. The algorithm uses the information of the edge macroblocks of a lost slice to predict the motion vector of each corrupted macroblock as a weighted average, with vertical distance as the weight, and then performs the concealment. Experiments show that, compared with the error concealment algorithm in the Joint Model (JM), the proposed algorithm improves both the subjective and objective quality of reconstructed images, has low computational complexity, achieves good concealment results, and is suitable for real-time high-definition decoding.
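
    The weighted-average prediction described in this abstract can be sketched as follows; the function name and data layout are assumptions for illustration, not the paper's code:

    ```python
    def conceal_motion_vector(lost_row, neighbors):
        """Predict the motion vector of a corrupted macroblock as the
        weighted mean of the motion vectors of correctly received edge
        macroblocks, weighting each neighbor by the inverse of its
        vertical distance so that closer rows contribute more.

        neighbors: list of (row, (mv_x, mv_y)) for edge macroblocks.
        """
        wx = wy = wsum = 0.0
        for row, (mvx, mvy) in neighbors:
            w = 1.0 / max(abs(row - lost_row), 1)   # inverse vertical distance
            wx += w * mvx
            wy += w * mvy
            wsum += w
        return (wx / wsum, wy / wsum)

    # Edge macroblocks two rows above and below the lost one:
    mv = conceal_motion_vector(6, [(4, (2, 0)), (8, (4, 2))])
    ```

    The predicted vector is then used to copy a motion-compensated block from the previous frame, which is what keeps the scheme cheap enough for real-time decoding.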

  14. Application of 3D VOI Technology in SPECT Three-Phase Bone Imaging for the Early Diagnosis of Femoral Head Osteonecrosis

    Institute of Scientific and Technical Information of China (English)

    张森; 席继梅; 庞华

    2015-01-01

    Objective: To combine three-dimensional volume-of-interest (3D VOI) technology with SPECT three-phase bone imaging and study its clinical value in the early diagnosis of femoral head osteonecrosis, with the aim of providing an earlier and more accurate means of detecting femoral head osteonecrosis. Methods: 59 patients underwent SPECT three-phase bone imaging. After image post-processing, three-dimensional reconstructions were made and analyzed with both the two-dimensional (2D VOI) and three-dimensional (3D VOI) volume-of-interest techniques; the results were compared with MRI examinations. Results: Among the 52 subjects there were 58 hips with early femoral head osteonecrosis: 43 hips were positive on 2D VOI, 48 hips on 3D VOI, and 40 hips on both. The detection rate was 74.1% for 2D VOI and 82.3% for 3D VOI. MRI was performed in 38 patients, covering 40 necrotic femoral heads; 29 hips were positive on MRI, 31 on 2D VOI and 33 on 3D VOI, giving detection rates of 72.5%, 77.5% and 82.5%, respectively. Conclusion: 3D VOI has a comparatively high detection rate for early femoral head osteonecrosis.

  15. Diagnosis of the Computer-Controlled Milling Machine, Definition of the Working Errors and Input Corrections on the Basis of Mathematical Model

    Science.gov (United States)

    Starikov, A. I.; Nekrasov, R. Yu; Teploukhov, O. J.; Soloviev, I. V.; Narikov, K. A.

    2016-10-01

    As science and technology advance, machinery and equipment improve in design, and the requirements for quality and longevity grow. In particular, the requirements for the surface quality and manufacturing precision of oil and gas equipment parts are constantly increasing. Producing oil and gas engineering products on modern machine tools with computer numerical control is a complex synthesis of the machine's mechanical and electrical subsystems and of the machining procedure itself. The mechanical part of the machine wears during operation, while mathematical errors accumulate in the electrical part. Deficiencies in any of these parts of the metalworking equipment affect the manufacturing process as a whole and ultimately lead to defects.

  16. Analysis of Errors Encountered in Simultaneous Interpreting

    Institute of Scientific and Technical Information of China (English)

    方峥

    2015-01-01

    I. Introduction 1.1 Definition of an error. An error happens when the interpreter's delivery affects the communicative impact of the speaker's message, including semantic inaccuracies and inaccuracies of presentation. Along with the development of simultaneous interpreting, a number of professional interpreters and linguists have presented their definitions of, and points of view about, such errors

  17. Evaluation of elastix-based propagated align algorithm for VOI- and voxel-based analysis of longitudinal F-18-FDG PET/CT data from patients with non-small cell lung cancer (NSCLC)

    NARCIS (Netherlands)

    Kerner, Gerald S. M. A.; Fischer, Alexander; Koole, Michel J. B.; Pruim, Jan; Groen, Harry J. M.

    2015-01-01

    Background: Deformable image registration allows volume of interest (VOI)- and voxel-based analysis of longitudinal changes in fluorodeoxyglucose (FDG) tumor uptake in patients with non-small cell lung cancer (NSCLC). This study evaluates the performance of the elastix toolbox deformable image

  19. Toward a cognitive taxonomy of medical errors.

    Science.gov (United States)

    Zhang, Jiajie; Patel, Vimla L; Johnson, Todd R; Shortliffe, Edward H

    2002-01-01

    One critical step in addressing and resolving the problems associated with human errors is the development of a cognitive taxonomy of such errors. In the case of errors, such a taxonomy may be developed (1) to categorize all types of errors along cognitive dimensions, (2) to associate each type of error with a specific underlying cognitive mechanism, (3) to explain why, and even predict when and where, a specific error will occur, and (4) to generate intervention strategies for each type of error. Based on Reason's (1992) definition of human errors and Norman's (1986) cognitive theory of human action, we have developed a preliminary action-based cognitive taxonomy of errors that largely satisfies these four criteria in the domain of medicine. We discuss initial steps for applying this taxonomy to develop an online medical error reporting system that not only categorizes errors but also identifies problems and generates solutions.

  20. Refractive Errors

    Science.gov (United States)

    ... does the eye focus light? In order to see clearly, light rays from an object must focus onto the ... The refractive errors are: myopia, hyperopia and astigmatism [See figures 2 and 3]. What is hyperopia (farsightedness)? Hyperopia occurs when light rays focus behind the retina (because the eye ...

  1. Medication Errors

    Science.gov (United States)

    ... Proprietary Names (PDF - 146KB) Draft Guidance for Industry: Best Practices in Developing Proprietary Names for Drugs (PDF - 279KB) ... or (301) 796-3400 druginfo@fda.hhs.gov Human Drug ... in Medication Errors Resources for You Agency for Healthcare Research and Quality: ...

  2. L'analyse des erreurs: etat actuel de la recherche (Error Analysis: Present State of Research). Errors: A New Perspective.

    Science.gov (United States)

    Lange, Michel

    This paper raises questions about the significance of errors made by language learners. The discussion is divided into four parts: (1) definition of error analysis, (2) the present status of error analysis research, including an overview of the theories of Lado, Skinner, Chomsky, Corder, Nemser, and Selinker; (3) the subdivisions of error analysis…

  3. Local protein sources for poultry feed: what avenues for progress?

    Directory of Open Access Journals (Sweden)

    Bouvarel Isabelle

    2014-07-01

    Full Text Available European poultry production depends on soybean supplies, mainly from Brazil. This protein-rich raw material is nutritionally attractive for poultry feed but raises major problems, notably in terms of price and environmental consequences. The protein-rich raw materials produced in France (oilseed and protein crops, and co-products of the starch and distillery industries) only partially replace imported soybean, because of their less favourable nutritional balances. Protein crops are, moreover, scarcely available on the market. European poultry production, and more broadly livestock farming and agriculture, thus face major economic, social and environmental challenges. The constraints on, and levers for, formulating poultry feeds are analysed, and various avenues of progress are considered over the shorter or longer term: developing precision feeding to better adjust nutrient supplies to the animals' needs according to the objectives set; having suitable raw materials available (processing, establishment of dedicated supply chains, new raw materials and additives, varietal selection), as well as adaptable animals. Greater coordination among actors, upstream and downstream, appears indispensable to meet these challenges.

  4. Contributions of information systems to the operation of expressway networks: the case of the Ile-de-France network

    OpenAIRE

    Zhang, Ming-Yu

    1995-01-01

    The main objective of this multidisciplinary research is to analyse and evaluate the new potential for improving the operation of expressway networks (RVR) as they are progressively interconnected, and in particular with the development of user information systems. The research is based mainly on the concrete case of the Ile-de-France RVR, which is representative in its structural complexity and in the problems encountered in its operation, notably ...

  5. Error detection and reduction in blood banking.

    Science.gov (United States)

    Motschman, T L; Moore, S B

    1996-12-01

    Error management plays a major role in facility process improvement efforts. By detecting and reducing errors, quality and, therefore, patient care improve. It begins with a strong organizational foundation: a supportive management attitude, clear and consistent direction for employees, and appropriate physical facilities. Clearly defined critical processes, critical activities, and SOPs act as the framework for operations as well as for active quality monitoring. To ensure that personnel can detect and report errors, they must be trained in both operational duties and error management practices. Use of simulated/intentional errors and incorporation of error detection into competency assessment keeps employees practiced and confident, and diminishes fear of the unknown. Personnel can clearly see that errors are indeed used as opportunities for process improvement and not for punishment. The facility must have a clearly defined and consistently used definition of reportable errors. Reportable errors should include those with potentially harmful outcomes as well as those that are "upstream," and thus further away from the outcome. A well-written error report consists of who, what, when, where, why/how, and follow-up to the error. Before correction can occur, an investigation to determine the underlying cause of the error should be undertaken. Obviously, the best corrective action is prevention. Correction can occur at five different levels; however, only three of these levels are directed at prevention. Prevention requires a method to collect and analyze data concerning errors. In the authors' facility, a functional error classification and a quality-system-based classification have been useful. An active method of searching for problems uncovers them further upstream, before they can have disastrous outcomes.
In the continual quest for improving processes, an error management program is itself a process that needs improvement, and we must strive to always close the circle
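
    The who/what/when/where/why-how/follow-up elements of a well-written error report named above can be collected in a simple record type. This is a hypothetical structure for illustration, not the authors' actual reporting form, and the example values are invented:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class ErrorReport:
        """The elements of a well-written error report listed in the
        text: who, what, when, where, why/how, and follow-up."""
        who: str
        what: str
        when: str
        where: str
        why_how: str
        follow_up: list = field(default_factory=list)

    # A hypothetical "upstream" (intercepted before harm) error report:
    report = ErrorReport(
        who="technologist A",
        what="mislabeled sample tube",
        when="day shift, specimen check-in",
        where="specimen receiving",
        why_how="labels for two concurrently processed samples were swapped",
        follow_up=["investigate underlying cause", "revise labeling SOP"],
    )
    ```

    Structuring reports this way makes the later classification and analysis steps the abstract calls for straightforward to automate.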

  6. [Survey in hospitals. Nursing errors, error culture and error management].

    Science.gov (United States)

    Habermann, Monika; Cramer, Henning

    2010-09-01

    Knowledge on errors is important to design safe nursing practice and its framework. This article presents results of a survey on this topic, including data of a representative sample of 724 nurses from 30 German hospitals. Participants predominantly remembered medication errors. Structural and organizational factors were rated as most important causes of errors. Reporting rates were considered low; this was explained by organizational barriers. Nurses in large part expressed having suffered from mental problems after error events. Nurses' perception focussing on medication errors seems to be influenced by current discussions which are mainly medication-related. This priority should be revised. Hospitals' risk management should concentrate on organizational deficits and positive error cultures. Decision makers are requested to tackle structural problems such as staff shortage.

  7. Introduction. The speech of kings at the end of the Middle Ages: avenues for an inquiry

    Directory of Open Access Journals (Sweden)

    Stéphane PÉQUIGNOT

    2007-10-01

    Full Text Available The article proposes a general inquiry into the speech of kings at the end of the Middle Ages and sketches several possible avenues for it. Drawing on a state of the question for the Crown of Aragon, it examines how royal speech is inscribed in different, interwoven temporalities. The transcription of the words themselves results from a complex process, the "fabric of speech," whose mechanisms and traces are examined. Moreover, representations of royal speech often refer to models from the past and sometimes address a future audience, while also testifying to a necessary adaptation to the circumstances of the moment. These "royal speech acts," and the "expressive styles" they help to consolidate, are examined in the second part of the article. Finally, the time devoted or left to royal words partakes in the long-term evolution of relations to the written word, of political regimes and of their modes of legitimation; it constitutes an important mode of political communication, a resource and also a risk for monarchical power and authority.

  8. Analysis of modeling errors in system identification

    Science.gov (United States)

    Hadaegh, F. Y.; Bekey, G. A.

    1986-01-01

    This paper is concerned with the identification of a system in the presence of several error sources. Following some basic definitions, the notion of 'near-equivalence in probability' is introduced using the concept of near-equivalence between a model and process. Necessary and sufficient conditions for the identifiability of system parameters are given. The effect of structural error on the parameter estimates for both deterministic and stochastic cases are considered.

  9. Generalized Gaussian Error Calculus

    CERN Document Server

    Grabe, Michael

    2010-01-01

    For the first time in 200 years, Generalized Gaussian Error Calculus undertakes a rigorous, complete and self-consistent revision of the Gaussian error calculus. Since experimentalists realized that measurements in general are burdened by unknown systematic errors, the classical, widely used evaluation procedures, which scrutinize the consequences of random errors alone, have turned out to be obsolete. As a matter of course, the error calculus to-be, treating random and unknown systematic errors side by side, should ensure the consistency and traceability of physical units, physical constants and physical quantities at large. The generalized Gaussian error calculus considers unknown systematic errors to spawn biased estimators. Beyond that, random errors are asked to conform to the idea of what the author calls well-defined measuring conditions. The approach features the properties of a building kit: any overall uncertainty turns out to be the sum of a contribution due to random errors, to be taken from a confidence inter...
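
    The "sum of a random contribution and a systematic contribution" structure described above can be sketched as follows. The t-factor and data are illustrative, and this is a simplified reading of the building-kit idea, not Grabe's full calculus:

    ```python
    import math
    from statistics import stdev

    def overall_uncertainty(readings, sys_bound, t_factor=2.0):
        """Overall uncertainty as the linear sum of a confidence
        interval for the random scatter of n readings and a worst-case
        bound on the unknown systematic error. t_factor stands in for
        the Student t quantile; the value here is illustrative."""
        n = len(readings)
        u_random = t_factor * stdev(readings) / math.sqrt(n)
        return u_random + sys_bound   # linear, not quadrature, combination

    x = [9.9, 10.1, 10.0, 10.2, 9.8]        # hypothetical replicates
    u = overall_uncertainty(x, sys_bound=0.05)
    ```

    The linear (rather than root-sum-square) combination is what keeps the unknown systematic term a worst-case bias bound instead of treating it as one more random variable.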

  10. Classification of Spreadsheet Errors

    OpenAIRE

    Rajalingham, Kamalasen; Chadwick, David R.; Knight, Brian

    2008-01-01

    This paper describes a framework for a systematic classification of spreadsheet errors. This classification or taxonomy of errors is aimed at facilitating analysis and comprehension of the different types of spreadsheet errors. The taxonomy is an outcome of an investigation of the widespread problem of spreadsheet errors and an analysis of specific types of these errors. This paper contains a description of the various elements and categories of the classification and is supported by appropri...

  11. Composite Gauss-Legendre Quadrature with Error Control

    Science.gov (United States)

    Prentice, J. S. C.

    2011-01-01

    We describe composite Gauss-Legendre quadrature for determining definite integrals, including a means of controlling the approximation error. We compare the form and performance of the algorithm with standard Newton-Cotes quadrature. (Contains 1 table.)
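
    A minimal sketch of the generic idea: composite two-point Gauss-Legendre quadrature, with the number of subintervals doubled until successive estimates agree to a tolerance. This illustrates error control by comparing estimates, not the paper's specific algorithm:

    ```python
    import math

    # 2-point Gauss-Legendre nodes and weights on [-1, 1]
    _NODES = (-1 / math.sqrt(3), 1 / math.sqrt(3))
    _WEIGHTS = (1.0, 1.0)

    def composite_gl(f, a, b, n):
        """Composite 2-point Gauss-Legendre rule with n subintervals."""
        h = (b - a) / n
        total = 0.0
        for i in range(n):
            mid = a + (i + 0.5) * h
            total += 0.5 * h * sum(w * f(mid + 0.5 * h * x)
                                   for x, w in zip(_NODES, _WEIGHTS))
        return total

    def integrate(f, a, b, tol=1e-10, n=1):
        """Keep doubling the subinterval count until two successive
        estimates agree to tol, then accept the finer estimate."""
        prev = composite_gl(f, a, b, n)
        while True:
            n *= 2
            cur = composite_gl(f, a, b, n)
            if abs(cur - prev) < tol:
                return cur
            prev = cur
    ```

    Because the 2-point rule is exact for cubics on each subinterval, smooth integrands converge rapidly; the difference of successive estimates serves as a cheap, conservative error proxy.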

  12. AN ANALYSIS OF SUBJECT AGREEMENT ERRORS IN ENGLISH ...

    African Journals Online (AJOL)

    Windows User

    grammaticality of the sentences by putting a tick after the correct sentences and a cross after the ... the world and by many people who are not native speakers. ... The study therefore prefers the following working definition: An error is a.

  14. Bringing Definitions into High Definition

    Science.gov (United States)

    Mason, John

    2010-01-01

    Why do definitions play such a central role in mathematics? It may seem obvious that precision about the terms one uses is necessary in order to use those terms reasonably (while reasoning). Definitions are chosen so as to be definite about the terms one uses, but also to make both the statement of, and the reasoning to justify, theorems as…

  15. Error in the learning and teaching of english as a second language at higher education level

    OpenAIRE

    Mestre i Mestre, Eva María

    2011-01-01

    Linguistic error has proven to be a recurrent area of interest for researchers. There exist several types of approach to error: some studies have focused on specific errors, such as grammatical errors; others on more general or exogenous issues, such as the perception of error in the group under study, etc. From the point of view of methodology, some have been dedicated to the definition and description of error, while others have studied the identification of erroneous uses of language...

  16. Medication Errors: New EU Good Practice Guide on Risk Minimisation and Error Prevention.

    Science.gov (United States)

    Goedecke, Thomas; Ord, Kathryn; Newbould, Victoria; Brosch, Sabine; Arlett, Peter

    2016-06-01

    A medication error is an unintended failure in the drug treatment process that leads to, or has the potential to lead to, harm to the patient. Reducing the risk of medication errors is a shared responsibility between patients, healthcare professionals, regulators and the pharmaceutical industry at all levels of healthcare delivery. In 2015, the EU regulatory network released a two-part good practice guide on medication errors to support both the pharmaceutical industry and regulators in the implementation of the changes introduced with the EU pharmacovigilance legislation. These changes included a modification of the 'adverse reaction' definition to include events associated with medication errors, and the requirement for national competent authorities responsible for pharmacovigilance in EU Member States to collaborate and exchange information on medication errors resulting in harm with national patient safety organisations. To facilitate reporting and learning from medication errors, a clear distinction has been made in the guidance between medication errors resulting in adverse reactions, medication errors without harm, intercepted medication errors and potential errors. This distinction is supported by an enhanced MedDRA(®) terminology that allows for coding all stages of the medication use process where the error occurred, in addition to any clinical consequences. To better understand the causes and contributing factors, individual case safety reports involving an error should be followed up with the primary reporter to gather information relevant for the conduct of root cause analysis where this may be appropriate. Such reports should also be summarised in periodic safety update reports and addressed in risk management plans. Any risk minimisation and prevention strategy for medication errors should consider all stages of a medicinal product's life-cycle, particularly the main sources and types of medication errors during product development. This article

  17. Definitely Life but not Definitively

    Science.gov (United States)

    Oliver, Joan D.; Perry, Randall S.

    2006-12-01

    Although there have been attempts at a definition of life from many disciplines, none is accepted by all as definitive. Some people believe that it is impossible to define ‘life’ adequately at the moment. We agree with this point of view on linguistic grounds, examining the different types of definition, the contexts in which they are used and their relative usefulness as aids to arriving at a scientific definition of life. We look at some of the more recent definitions and analyse them in the light of our criteria for a good definition. We argue that since there are so many linguistic and philosophical difficulties with such a definition of life, what is needed is a series of working descriptions, which are suited to the audience and context in which they are used and useful for the intended purpose. We provide some ideas and examples of the forms these may take.

  18. Reducing medication errors.

    Science.gov (United States)

    Nute, Christine

    2014-11-25

    Most nurses are involved in medicines management, which is integral to promoting patient safety. Medicines management is prone to errors, which depending on the error can cause patient injury, increased hospital stay and significant legal expenses. This article describes a new approach to help minimise drug errors within healthcare settings where medications are prescribed, dispensed or administered. The acronym DRAINS, which considers all aspects of medicines management before administration, was devised to reduce medication errors on a cardiothoracic intensive care unit.

  19. Demand Forecasting Errors

    OpenAIRE

    Mackie, Peter; Nellthorp, John; Laird, James

    2005-01-01

    Demand forecasts form a key input to the economic appraisal. As such any errors present within the demand forecasts will undermine the reliability of the economic appraisal. The minimization of demand forecasting errors is therefore important in the delivery of a robust appraisal. This issue is addressed in this note by introducing the key issues, and error types present within demand fore...

  20. When errors are rewarding

    NARCIS (Netherlands)

    Bruijn, E.R.A. de; Lange, F.P. de; Cramon, D.Y. von; Ullsperger, M.

    2009-01-01

    For social beings like humans, detecting one's own and others' errors is essential for efficient goal-directed behavior. Although one's own errors are always negative events, errors from other persons may be negative or positive depending on the social context. We used neuroimaging to disentangle br

  1. THE PRACTICAL ANALYSIS OF FINITE ELEMENTS METHOD ERRORS

    Directory of Open Access Journals (Sweden)

    Natalia Bakhova

    2011-03-01

    Full Text Available Abstract. The most important practical questions of reliable estimation of finite element method errors are considered. Rules for determining the necessary accuracy of calculations are developed. Methods and ways of calculation are offered that allow obtaining the best final results at an economical expenditure of computing work. Keywords: error, given accuracy, finite element method, Lagrangian and Hermitian elements.

  2. Quantum states characterization for the zero-error capacity

    CERN Document Server

    Medeiros, Rex A C; Alleaume, Romain; Cohen, Gerard; De Assis, Francisco M

    2006-01-01

    The zero-error capacity of quantum channels was defined as the least upper bound of rates at which classical information is transmitted through a quantum channel with probability of error equal to zero. This paper investigates some properties of input states used to attain the zero-error capacity of quantum channels. Initially, we reformulate the problem of finding the zero-error capacity in the language of graph theory. We use this alternative definition to prove that the zero-error capacity of any quantum channel is reached by using only pure states.
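The graph-theoretic reformulation mentioned in this abstract can be made concrete with a toy brute-force computation (an illustrative sketch, not taken from the paper): two channel inputs are adjacent in the confusability graph if they can produce the same output, and the number of messages transmittable with zero error in a single channel use equals the independence number of that graph. The function name below is chosen for illustration only:

```python
from itertools import combinations

def independence_number(n, edges):
    """Largest set of mutually non-adjacent vertices in a graph on n vertices.

    For a confusability graph this is the one-shot zero-error message count.
    Brute force: check subsets from largest to smallest (fine for toy sizes).
    """
    adj = {frozenset(e) for e in edges}
    for k in range(n, 0, -1):
        for subset in combinations(range(n), k):
            # an independent set contains no edge between any pair
            if all(frozenset((u, v)) not in adj
                   for u, v in combinations(subset, 2)):
                return k
    return 0
```

For the pentagon C5 (the confusability graph of Shannon's classic five-input noisy typewriter), this returns 2. The asymptotic zero-error capacity additionally requires taking graph powers (the Shannon capacity of the graph), which this sketch does not attempt.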

  3. A Literature Review of Research on Error Analysis Abroad

    Institute of Scientific and Technical Information of China (English)

    肖倩

    2014-01-01

    Error constitutes an important part of interlanguage. Error analysis is an approach influenced by behaviorism and based on cognitive theory. The aim of error analysis is to explore the errors made by second language learners and the mental processes of their second language acquisition, which is of great importance to both learners and teachers. However, as a research tool, error analysis has its limitations. In order to better understand and make the best use of error analysis, its background, definition, basic assumptions, classification, procedure, explanation and implications, as well as its application, will be illustrated. Its limitations will be analyzed from the perspectives of its nature, definition and categories. This literature review of research abroad sheds light on implications for second language teaching.

  4. Floating-Point Numbers with Error Estimates (revised)

    CERN Document Server

    Masotti, Glauco

    2012-01-01

    The study addresses the problem of precision in floating-point (FP) computations. A method for estimating the errors which affect intermediate and final results is proposed and a summary of many software simulations is discussed. The basic idea consists of representing FP numbers by means of a data structure collecting value and estimated error information. Under certain constraints, the estimate of the absolute error is accurate and has a compact statistical distribution. By monitoring the estimated relative error during a computation (an ad-hoc definition of relative error has been used), the validity of results can be ensured. The error estimate enables the implementation of robust algorithms, and the detection of ill-conditioned problems. A dynamic extension of number precision, under the control of error estimates, is advocated, in order to compute results within given error bounds. A reduced time penalty could be achieved by a specialized FP processor. The realization of a hardwired processor incorporat...
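The basic idea described above, a floating-point value paired with a running estimate of its error, can be sketched in a few lines. This is a simplified illustration under first-order error-propagation assumptions; the class name and propagation rules are hypothetical, not the data structure or simulator from the paper:

```python
class EFloat:
    """Hypothetical value-plus-error-estimate number type (illustrative only)."""

    def __init__(self, value, err=0.0):
        self.value = value       # the floating-point value
        self.err = abs(err)      # estimated absolute error bound

    def __add__(self, other):
        # absolute error bounds add under addition (first-order propagation)
        return EFloat(self.value + other.value, self.err + other.err)

    def __mul__(self, other):
        # d(ab) ~= a*db + b*da, so |error| <= |a|*db + |b|*da
        return EFloat(self.value * other.value,
                      abs(self.value) * other.err + abs(other.value) * self.err)

    def rel_err(self):
        # monitoring this during a computation flags ill-conditioned steps
        return self.err / abs(self.value) if self.value != 0 else float("inf")
```

A full implementation would also fold in the rounding error introduced by each operation itself; monitoring `rel_err()` and extending precision when it grows too large is the dynamic-precision strategy the abstract advocates.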

  5. Evaluation of elastix-based propagated align algorithm for VOI- and voxel-based analysis of longitudinal (18)F-FDG PET/CT data from patients with non-small cell lung cancer (NSCLC).

    Science.gov (United States)

    Kerner, Gerald Sma; Fischer, Alexander; Koole, Michel Jb; Pruim, Jan; Groen, Harry Jm

    2015-01-01

    Deformable image registration allows volume of interest (VOI)- and voxel-based analysis of longitudinal changes in fluorodeoxyglucose (FDG) tumor uptake in patients with non-small cell lung cancer (NSCLC). This study evaluates the performance of the elastix toolbox deformable image registration algorithm for VOI and voxel-wise assessment of longitudinal variations in FDG tumor uptake in NSCLC patients. Evaluation of the elastix toolbox was performed using (18)F-FDG PET/CT at baseline and after 2 cycles of therapy (follow-up) data in advanced NSCLC patients. The elastix toolbox, an integrated part of the IMALYTICS workstation, was used to apply a CT-based non-linear image registration of follow-up PET/CT data using the baseline PET/CT data as reference. Lesion statistics were compared to assess the impact on therapy response assessment. Next, CT-based deformable image registration was performed anew on the deformed follow-up PET/CT data using the original follow-up PET/CT data as reference, yielding a realigned follow-up PET dataset. Performance was evaluated by determining the correlation coefficient between original and realigned follow-up PET datasets. The intra- and extra-thoracic tumors were automatically delineated on the original PET using a 41% of maximum standardized uptake value (SUVmax) adaptive threshold. Equivalence between reference and realigned images was tested (determining 95% range of the difference) and estimating the percentage of voxel values that fell within that range. Thirty-nine patients with 191 tumor lesions were included. In 37/39 and 12/39 patients, respectively, thoracic and non-thoracic lesions were evaluable for response assessment. Using the EORTC/SUVmax-based criteria, 5/37 patients had a discordant response of thoracic, and 2/12 a discordant response of non-thoracic lesions between the reference and the realigned image. 
FDG uptake values of corresponding tumor voxels in the original and realigned reference PET correlated well (R

  6. Probabilistic quantum error correction

    CERN Document Server

    Fern, J; Fern, Jesse; Terilla, John

    2002-01-01

    There are well known necessary and sufficient conditions for a quantum code to correct a set of errors. We study weaker conditions under which a quantum code may correct errors with probabilities that may be less than one. We work with stabilizer codes and as an application study how the nine qubit code, the seven qubit code, and the five qubit code perform when there are errors on more than one qubit. As a second application, we discuss the concept of syndrome quality and use it to suggest a way that quantum error correction can be practically improved.

  7. Neuroretinitis -- definition

    Science.gov (United States)


  8. Market Definition

    OpenAIRE

    Kaplow, Louis

    2014-01-01

    Market definition has long held a central place in competition law. This entry surveys recent analytical work that has called the market definition paradigm into question on a number of fronts: whether the process is feasible, whether market share threshold tests are coherent, whether the hypothetical monopolist test in merger guidelines is counterproductive, and whether and when the frequent focus on cross-elasticities is useful.

  9. Correction for quadrature errors

    DEFF Research Database (Denmark)

    Netterstrøm, A.; Christensen, Erik Lintz

    1994-01-01

    In high bandwidth radar systems it is necessary to use quadrature devices to convert the signal to/from baseband. Practical problems make it difficult to implement a perfect quadrature system. Channel imbalance and quadrature phase errors in the transmitter and the receiver result in error signal...

  10. ERRORS AND CORRECTION

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    To err is human. Since the 1960s, most second language teachers and language theorists have regarded errors as natural and inevitable in the language learning process. Instead of regarding them as terrible and disappointing, teachers have come to realize their value. This paper considers these values, analyzes some errors and proposes some effective correction techniques.

  11. ERROR AND ERROR CORRECTION AT ELEMENTARY LEVEL

    Institute of Scientific and Technical Information of China (English)

    1994-01-01

    Introduction Errors are unavoidable in language learning, however, to a great extent, teachers in most middle schools in China regard errors as undesirable, a sign of failure in language learning. Most middle schools are still using the grammar-translation method which aims at encouraging students to read scientific works and enjoy literary works. The other goals of this method are to gain a greater understanding of the first language and to improve the students’ ability to cope with difficult subjects and materials, i.e. to develop the students’ minds. The practical purpose of using this method is to help learners pass the annual entrance examination. "To achieve these goals, the students must first learn grammar and vocabulary,... Grammar is taught deductively by means of long and elaborate explanations... students learn the rules of the language rather than its use." (Tang Lixing, 1983:11-12)

  12. Errors on errors - Estimating cosmological parameter covariance

    CERN Document Server

    Joachimi, Benjamin

    2014-01-01

    Current and forthcoming cosmological data analyses share the challenge of huge datasets alongside increasingly tight requirements on the precision and accuracy of extracted cosmological parameters. The community is becoming increasingly aware that these requirements not only apply to the central values of parameters but, equally important, also to the error bars. Due to non-linear effects in the astrophysics, the instrument, and the analysis pipeline, data covariance matrices are usually not well known a priori and need to be estimated from the data itself, or from suites of large simulations. In either case, the finite number of realisations available to determine data covariances introduces significant biases and additional variance in the errors on cosmological parameters in a standard likelihood analysis. Here, we review recent work on quantifying these biases and additional variances and discuss approaches to remedy these effects.

  13. Proofreading for word errors.

    Science.gov (United States)

    Pilotti, Maura; Chodorow, Martin; Agpawa, Ian; Krajniak, Marta; Mahamane, Salif

    2012-04-01

    Proofreading (i.e., reading text for the purpose of detecting and correcting typographical errors) is viewed as a component of the activity of revising text and thus is a necessary (albeit not sufficient) procedural step for enhancing the quality of a written product. The purpose of the present research was to test competing accounts of word-error detection which predict factors that may influence reading and proofreading differently. Word errors, which change a word into another word (e.g., from --> form), were selected for examination because they are unlikely to be detected by automatic spell-checking functions. Consequently, their detection still rests mostly in the hands of the human proofreader. Findings highlighted the weaknesses of existing accounts of proofreading and identified factors, such as length and frequency of the error in the English language relative to frequency of the correct word, which might play a key role in detection of word errors.

  14. Uncorrected refractive errors

    Directory of Open Access Journals (Sweden)

    Kovin S Naidoo

    2012-01-01

    Full Text Available Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of which 670 million people are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  15. Uncorrected refractive errors.

    Science.gov (United States)

    Naidoo, Kovin S; Jaggernath, Jyoti

    2012-01-01

    Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of which 670 million people are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  16. Errors in Radiologic Reporting

    Directory of Open Access Journals (Sweden)

    Esmaeel Shokrollahi

    2010-05-01

    Full Text Available Given that the report is a professional document and bears the associated responsibilities, all of the radiologist's errors appear in it, either directly or indirectly. It is not easy to distinguish and classify the mistakes made when a report is prepared, because in most cases the errors are complex and attributable to more than one cause, and because many errors depend on the individual radiologist's professional, behavioral and psychological traits. In fact, anyone can make a mistake, but some radiologists make more mistakes, and some types of mistakes are predictable to some extent. Reporting errors can be categorized in different ways:
    Universal vs. individual
    Human-related vs. system-related
    Perceptive vs. cognitive errors
      1. Descriptive
      2. Interpretative
      3. Decision related
    Perceptive errors
      1. False positive
      2. False negative (non-identification or erroneous identification)
    Cognitive errors
      Knowledge-based
      Psychological

  17. Errors in neuroradiology.

    Science.gov (United States)

    Caranci, Ferdinando; Tedeschi, Enrico; Leone, Giuseppe; Reginelli, Alfonso; Gatta, Gianluca; Pinto, Antonio; Squillaci, Ettore; Briganti, Francesco; Brunese, Luca

    2015-09-01

    Approximately 4 % of radiologic interpretations in daily practice contain errors, and discrepancies are reported to occur in 2-20 % of reports. Fortunately, most of them are minor-degree errors or, if serious, are found and corrected with sufficient promptness; obviously, diagnostic errors become critical when misinterpretation or misidentification significantly delays medical or surgical treatment. Errors can be summarized into four main categories: observer errors, errors in interpretation, failure to suggest the next appropriate procedure, and failure to communicate in a timely and clinically appropriate manner. The misdiagnosis/misinterpretation percentage rises in the emergency setting and in the first stages of the learning curve, as in residency. Para-physiological and pathological pitfalls in neuroradiology include calcifications and brain stones, pseudofractures, enlargement of subarachnoid or epidural spaces, ventricular system abnormalities, vascular system abnormalities, intracranial lesions or pseudolesions, and finally neuroradiological emergencies. In order to minimize the possibility of error, it is important to be aware of the various presentations of pathology, obtain clinical information, know current practice guidelines, review after interpreting a diagnostic study, suggest follow-up studies when appropriate, and communicate significant abnormal findings appropriately and in a timely fashion directly to the treatment team.

  18. Uncertainty and error in computational simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

    The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.

  19. Article Errors in the English Writing of Saudi EFL Preparatory Year Students

    Science.gov (United States)

    Alhaisoni, Eid; Gaudel, Daya Ram; Al-Zuoud, Khalid M.

    2017-01-01

    This study aims at providing a comprehensive account of the types of errors produced by Saudi EFL students enrolled in the preparatory year programe in their use of articles, based on the Surface Structure Taxonomies (SST) of errors. The study describes the types, frequency and sources of the definite and indefinite article errors in writing…

  20. Inpatients’ medical prescription errors

    Directory of Open Access Journals (Sweden)

    Aline Melo Santos Silva

    2009-09-01

    Full Text Available Objective: To identify and quantify the most frequent errors in inpatients' medical prescriptions. Methods: A survey of prescription errors was performed in the inpatients' medical prescriptions, from July 2008 to May 2009, for eight hours a day. Results: A total of 3,931 prescriptions was analyzed and 362 (9.2%) prescription errors were found, which involved the healthcare team as a whole. Among the 16 types of errors detected, the most frequent occurrences were lack of information, such as dose (66 cases, 18.2%) and administration route (26 cases, 7.2%); 45 cases (12.4%) of wrong transcriptions to the information system; 30 cases (8.3%) of duplicate drugs; doses higher than recommended (24 events, 6.6%); and 29 cases (8.0%) of prescriptions with an indication but not specifying allergy. Conclusion: Medication errors are a reality at hospitals. All healthcare professionals are responsible for the identification and prevention of these errors, each one in his/her own area. The pharmacist is an essential professional in the drug therapy process. All hospital organizations need a pharmacist team responsible for medical prescription analyses before preparation, dispensation and administration of drugs to inpatients. This study showed that the pharmacist improves the inpatient's safety and the success of prescribed therapy.

  1. A Matroidal Framework for Network-Error Correcting Codes

    CERN Document Server

    Prasad, K

    2012-01-01

    Matroidal networks were introduced by Dougherty et al. and have been well studied in the recent past. It was shown that a network has a scalar linear network coding solution if and only if it is a matroidal network associated with a representable matroid. A particularly interesting feature of this development is the ability to construct (scalar and vector) linearly solvable networks using certain classes of matroids. The current work attempts to establish a connection between matroid theory and network-error correcting codes. In a similar vein to the theory connecting matroids and network coding, we abstract the essential aspects of network-error correcting codes to arrive at the definition of a matroidal error correcting network. An acyclic network (with arbitrary sink demands) is then shown to possess a scalar linear error correcting network code if and only if it is a matroidal error correcting network associated with a representable matroid. Therefore, constructing such network-error correcting codes implies ...

  2. Error monitoring in musicians

    Directory of Open Access Journals (Sweden)

    Clemens Maidhof

    2013-07-01

    Full Text Available To err is human, and hence even professional musicians make errors occasionally during their performances. This paper summarizes recent work investigating error monitoring in musicians, i.e. the processes and their neural correlates associated with the monitoring of ongoing actions and the detection of deviations from intended sounds. EEG studies reported an early component of the event-related potential (ERP) occurring before the onsets of pitch errors. This component, which can be altered in musicians with focal dystonia, likely reflects processes of error detection and/or error compensation, i.e. attempts to cancel the undesired sensory consequence (a wrong tone) a musician is about to perceive. Thus, auditory feedback seems not to be a prerequisite for error detection, consistent with previous behavioral results. In contrast, when auditory feedback is externally manipulated and thus unexpected, motor performance can be severely distorted, although not all feedback alterations result in performance impairments. Recent studies investigating the neural correlates of feedback processing showed that unexpected feedback elicits an ERP component after note onsets, which shows larger amplitudes during music performance than during mere perception of the same musical sequences. Hence, these results stress the role of motor actions for the processing of auditory information. Furthermore, recent methodological advances like the combination of 3D motion capture techniques with EEG will be discussed. Such combinations of different measures can potentially help to disentangle the roles of different feedback types, such as proprioceptive and auditory feedback, and in general to arrive at a better understanding of the complex interactions between the motor and auditory domains during error monitoring. Finally, outstanding questions and future directions in this context will be discussed.

  3. Smoothing error pitfalls

    Science.gov (United States)

    von Clarmann, T.

    2014-09-01

    The difference due to the content of a priori information between a constrained retrieval and the true atmospheric state is usually represented by a diagnostic quantity called smoothing error. In this paper it is shown that, regardless of the usefulness of the smoothing error as a diagnostic tool in its own right, the concept of the smoothing error as a component of the retrieval error budget is questionable because it is not compliant with Gaussian error propagation. The reason for this is that the smoothing error does not represent the expected deviation of the retrieval from the true state but the expected deviation of the retrieval from the atmospheric state sampled on an arbitrary grid, which is itself a smoothed representation of the true state; in other words, to characterize the full loss of information with respect to the true atmosphere, the effect of the representation of the atmospheric state on a finite grid also needs to be considered. The idea of a sufficiently fine sampling of this reference atmospheric state is problematic because atmospheric variability occurs on all scales, implying that there is no limit beyond which the sampling is fine enough. Even the idealization of infinitesimally fine sampling of the reference state does not help, because the smoothing error is applied to quantities which are only defined in a statistical sense, which implies that a finite volume of sufficient spatial extent is needed to meaningfully discuss temperature or concentration. Smoothing differences, however, which play a role when measurements are compared, are still a useful quantity if the covariance matrix involved has been evaluated on the comparison grid rather than resulting from interpolation and if the averaging kernel matrices have been evaluated on a grid fine enough to capture all atmospheric variations that the instruments are sensitive to. This is, under the assumptions stated, because the undefined component of the smoothing error, which is the

  4. Learning from Errors

    Directory of Open Access Journals (Sweden)

    MA. Lendita Kryeziu

    2015-06-01

    Full Text Available “Errare humanum est”, a well-known and widespread Latin proverb, states that to err is human and that people make mistakes all the time. However, what counts is that people must learn from mistakes. On these grounds Steve Jobs stated: “Sometimes when you innovate, you make mistakes. It is best to admit them quickly, and get on with improving your other innovations.” Similarly, in learning a new language, learners make mistakes, thus it is important to accept them, learn from them, discover the reason why they made them, improve and move on. The significance of studying errors is described by Corder as follows: “There have always been two justifications proposed for the study of learners' errors: the pedagogical justification, namely that a good understanding of the nature of error is necessary before a systematic means of eradicating them could be found, and the theoretical justification, which claims that a study of learners' errors is part of the systematic study of the learners' language which is itself necessary to an understanding of the process of second language acquisition” (Corder, 1982: 1). Thus the aim of this paper is to analyze errors in the process of second language acquisition and the way we teachers can benefit from mistakes to help students improve themselves while giving proper feedback.

  5. Error Correction in Classroom

    Institute of Scientific and Technical Information of China (English)

    Dr. Grace Zhang

    2000-01-01

    Error correction is an important issue in foreign language acquisition. This paper investigates how students feel about the way in which error correction should take place in a Chinese-as-a-foreign-language classroom, based on empirical data of a large scale. The study shows that there is a general consensus that error correction is necessary. In terms of correction strategy, the students preferred a combination of direct and indirect corrections, or direct correction only. The former choice indicates that students would be happy to take either, so long as the correction gets done. Most students didn't mind peer correcting provided it is conducted in a constructive way. More than half of the students would feel uncomfortable if the same error they make in class is corrected consecutively more than three times. Taking these findings into consideration, we may want to encourage peer correcting, use a combination of correction strategies (direct only if suitable) and do it in a non-threatening and sensitive way. It is hoped that this study will contribute to the effectiveness of error correction in a Chinese language classroom and it may also have wider implications for other languages.

  6. Methods of Run-Time Error Detection in Distributed Process Control Software

    DEFF Research Database (Denmark)

    Drejer, N.

    In this thesis, methods of run-time error detection in application software for distributed process control are designed. The error detection is based upon a monitoring approach in which application software is monitored by system software during the entire execution. The thesis includes definition of generic run-time error types, design of methods of observing application software behavior during execution, and design of methods of evaluating run-time constraints. In the definition of error types it is attempted to cover all relevant aspects of the application software behavior. Methods of observation and constraint evaluation are designed for the most interesting error types. These include: a) semantical errors in data communicated between application tasks; b) errors in the execution of application tasks; and c) errors in the timing of distributed events emitted by the application software. The design...

  7. Bias in parameter estimation of form errors

    Science.gov (United States)

    Zhang, Xiangchao; Zhang, Hao; He, Xiaoying; Xu, Min

    2014-09-01

    The surface form qualities of precision components are critical to their functionalities. In precision instruments algebraic fitting is usually adopted and the form deviations are assessed in the z direction only, in which case the deviations at steep regions of curved surfaces will be over-weighted, making the fitted results biased and unstable. In this paper the orthogonal distance fitting is performed for curved surfaces and the form errors are measured along the normal vectors of the fitted ideal surfaces. The relative bias of the form error parameters between the vertical assessment and the orthogonal assessment is analytically calculated and represented as a function of the surface slopes. The parameter bias caused by the non-uniformity of data points can be corrected by weighting, i.e. each data point is weighted by the 3D area of the Voronoi cell around its projection point on the fitted surface. Finally, numerical experiments are given to compare different fitting methods and definitions of the form error parameters. The proposed definition is demonstrated to show great superiority in terms of stability and unbiasedness.
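The contrast between vertical and orthogonal residual assessment can be illustrated in the simplest setting where the two definitions differ, a straight line in 2D. This is a self-contained sketch, not the authors' algorithm for curved surfaces; for a line, the orthogonal (total least squares) fit has a closed form via the principal-axis angle:

```python
import math

def vertical_fit(pts):
    """Ordinary least squares: residuals measured vertically (in y only)."""
    n = len(pts)
    sx = sum(p[0] for p in pts); sy = sum(p[1] for p in pts)
    sxx = sum(p[0] ** 2 for p in pts); sxy = sum(p[0] * p[1] for p in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

def orthogonal_fit(pts):
    """Total least squares: residuals measured along the line normal.

    The line direction is the principal axis of the centered data;
    in 2D its angle satisfies tan(2*theta) = 2*sxy / (sxx - syy).
    """
    n = len(pts)
    xm = sum(p[0] for p in pts) / n; ym = sum(p[1] for p in pts) / n
    sxx = sum((p[0] - xm) ** 2 for p in pts)
    syy = sum((p[1] - ym) ** 2 for p in pts)
    sxy = sum((p[0] - xm) * (p[1] - ym) for p in pts)
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    slope = math.tan(theta)
    return slope, ym - slope * xm
```

For points lying exactly on a line the two fits agree; on noisy data from steep regions the vertical fit over-weights the steep samples, which is the bias the abstract analyzes for curved surfaces.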

  8. Errors in Neonatology

    Directory of Open Access Journals (Sweden)

    Antonio Boldrini

    2013-06-01

    Full Text Available Introduction: Danger and errors are inherent in human activities. In medical practice errors can lead to adverse events for patients. Mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience in the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main error domains are: medication and total parenteral nutrition, resuscitation and respiratory care, invasive procedures, nosocomial infections, patient identification, and diagnostics. Risk factors include patients' size, prematurity, vulnerability and underlying disease conditions, but also multidisciplinary teams, working conditions that induce fatigue, and the large variety of treatment and investigative modalities needed. Discussion and Conclusions: In our opinion, it is hardly possible to change human beings, but it is likely possible to change the conditions under which they work. Voluntary error reporting systems can help in preventing adverse events. Education and re-training by means of simulation can be an effective strategy too. In Pisa (Italy), Nina (ceNtro di FormazIone e SimulazioNe NeonAtale) is a simulation centre that offers the possibility of continuous retraining in technical and non-technical skills to optimize neonatological care strategies. Furthermore, we have been working on a novel skill trainer for mechanical ventilation (MEchatronic REspiratory System SImulator for Neonatal Applications, MERESSINA). Finally, in our opinion, national health policy indirectly influences the risk for errors. Proceedings of the 9th International Workshop on Neonatology · Cagliari (Italy) · October 23rd-26th, 2013 · Learned lessons, changing practice and cutting-edge research

  9. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  10. LIBERTARISMO & ERROR CATEGORIAL

    Directory of Open Access Journals (Sweden)

    Carlos G. Patarroyo G.

    2009-01-01

Full Text Available This article offers a defense of libertarianism against two accusations according to which it commits a category error. Gilbert Ryle's philosophy is used as a tool to explain the reasons behind these accusations and to show why, even though certain versions of libertarianism that appeal to agent causation or to Cartesian dualism do commit these errors, a libertarianism that seeks the basis for the possibility of human freedom in physicalist indeterminism cannot necessarily be accused of committing them.

  11. Orwell's Instructive Errors

    Science.gov (United States)

    Julian, Liam

    2009-01-01

    In this article, the author talks about George Orwell, his instructive errors, and the manner in which Orwell pierced worthless theory, faced facts and defended decency (with fluctuating success), and largely ignored the tradition of accumulated wisdom that has rendered him a timeless teacher--one whose inadvertent lessons, while infrequently…

  12. Challenge and Error: Critical Events and Attention-Related Errors

    Science.gov (United States)

    Cheyne, James Allan; Carriere, Jonathan S. A.; Solman, Grayden J. F.; Smilek, Daniel

    2011-01-01

Attention lapses resulting from reactivity to task challenges and their consequences constitute a pervasive factor affecting everyday performance errors and accidents. A bidirectional model of attention lapses (error ⇄ attention-lapse: Cheyne, Solman, Carriere, & Smilek, 2009) argues that errors beget errors by generating attention…

  13. Medical error and related factors during internship and residency.

    Science.gov (United States)

    Ahmadipour, Habibeh; Nahid, Mortazavi

    2015-01-01

It is difficult to determine the real incidence of medical errors due to the lack of a precise definition of errors, as well as the failure to report them under certain circumstances. We carried out a cross-sectional study in Kerman University of Medical Sciences, Iran in 2013. The participants were selected through the census method. The data were collected using a self-administered questionnaire, which consisted of questions on the participants' demographic data and questions on the medical errors committed. The data were analysed by SPSS 19. It was found that 270 participants had committed medical errors. There was no significant difference in the frequency of errors committed by interns and residents. In the case of residents, the most common error was misdiagnosis and in that of interns, errors related to history-taking and physical examination. Considering that medical errors are common in the clinical setting, the education system should train interns and residents to prevent the occurrence of errors. In addition, the system should develop a positive attitude among them so that they can deal better with medical errors.

  14. Les voix/voies de la carte

    Directory of Open Access Journals (Sweden)

    Louise Bénat-Tachot

    2012-01-01

Full Text Available The compilation of the padrón real by the experts of the Casa de la Contratación in Seville during the first half of the sixteenth century was a major undertaking, not only for establishing the new global configuration of the world but also for legitimizing the expansionist enterprises of the Iberian nations by asserting their mastery of navigation. To what extent was the writing of the first chronicles of the Indies linked to this contemporaneous cartographic activity of the state? The epistemological, rhetorical and political links that join these two productions are studied here through the universal map of Diego de Ribeiro and two chronicles: that of Gonzalo Fernández de Oviedo and that of Francisco López de Gómara.

  15. Kuinka voi näytelmä?

    OpenAIRE

    Hannula, Akseli

    2014-01-01

This thesis is a reflective report on a process that began in spring 2013 at the Turku City Theatre (Turun Kaupunginteatteri). The project, titled Tilataan näytelmä ("A play is commissioned"), comprised discussion workshops that mapped the nature and status of dramatic literature within Finnish literature and theatre, and followed the writing processes of three dramaturgy students from the Theatre Academy (Teatterikorkeakoulu). The working group's core question was: how does one talk about an (unfinished) play? The project was led by…

  16. Patient error: a preliminary taxonomy.

    NARCIS (Netherlands)

    Buetow, S.; Kiata, L.; Liew, T.; Kenealy, T.; Dovey, S.; Elwyn, G.

    2009-01-01

PURPOSE: Current research on errors in health care focuses almost exclusively on system and clinician error. It tends to exclude how patients may create errors that influence their health. We aimed to identify the types of errors that patients can contribute and help manage, especially in primary care…

  17. Automatic Error Analysis Using Intervals

    Science.gov (United States)

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
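The interval approach described above can be sketched in a few lines. The following toy Python class is an illustration only (the article itself works with the INTLAB toolbox for MATLAB); it propagates a measurement uncertainty through a formula by tracking lower and upper bounds:

```python
# Minimal interval arithmetic: each measured quantity is carried as a
# [lo, hi] range, and operations produce ranges guaranteed to contain
# the true result.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product range is bounded by the extreme endpoint products.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def width(self):
        return self.hi - self.lo

# A measurement x = 2.0 +/- 0.1 becomes the interval [1.9, 2.1];
# evaluating f(x) = x*x + x in interval arithmetic bounds the result error.
x = Interval(1.9, 2.1)
y = x * x + x
print(y.lo, y.hi)  # bounds on f(x): approximately [5.51, 6.51]
```

Note that naive interval evaluation ignores the dependency between the two occurrences of `x`, so the bounds can be wider than the true range; this conservatism is the usual trade-off of the interval approach.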

  18. Imagery of Errors in Typing

    Science.gov (United States)

    Rieger, Martina; Martinez, Fanny; Wenke, Dorit

    2011-01-01

    Using a typing task we investigated whether insufficient imagination of errors and error corrections is related to duration differences between execution and imagination. In Experiment 1 spontaneous error imagination was investigated, whereas in Experiment 2 participants were specifically instructed to imagine errors. Further, in Experiment 2 we…

  19. Error bars in experimental biology.

    Science.gov (United States)

    Cumming, Geoff; Fidler, Fiona; Vaux, David L

    2007-04-09

    Error bars commonly appear in figures in publications, but experimental biologists are often unsure how they should be used and interpreted. In this article we illustrate some basic features of error bars and explain how they can help communicate data and assist correct interpretation. Error bars may show confidence intervals, standard errors, standard deviations, or other quantities. Different types of error bars give quite different information, and so figure legends must make clear what error bars represent. We suggest eight simple rules to assist with effective use and interpretation of error bars.
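The quantities the authors distinguish (standard deviation, standard error of the mean, confidence interval) are straightforward to compute; a small Python sketch follows, with made-up sample data for illustration:

```python
import math

# Hypothetical replicate measurements of one quantity.
data = [4.8, 5.1, 5.4, 4.9, 5.2, 5.0, 5.3, 4.7]
n = len(data)

mean = sum(data) / n
# Sample standard deviation (n - 1 in the denominator): spread of the data.
sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
# Standard error of the mean: uncertainty of the mean itself.
sem = sd / math.sqrt(n)
# 95% CI half-width via the normal approximation; small samples should
# use the t critical value instead of 1.96.
ci95 = 1.96 * sem

print(f"mean={mean:.3f}  SD={sd:.3f}  SEM={sem:.3f}  95% CI=±{ci95:.3f}")
```

The three half-widths differ substantially (SD > CI > SEM here), which is exactly why the article insists that figure legends state which quantity the bars show.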

  20. Video Error Correction Using Steganography

    Directory of Open Access Journals (Sweden)

    Robie David L

    2002-01-01

Full Text Available The transmission of any data is always subject to corruption due to errors, but video transmission, because of its real-time nature, must deal with these errors without retransmission of the corrupted data. Errors can be handled using forward error correction in the encoder or error concealment techniques in the decoder. This MPEG-2 compliant codec uses data hiding to transmit error correction information, together with several error concealment techniques in the decoder. The decoder resynchronizes more quickly, with fewer errors, than traditional resynchronization techniques. It also allows for perfect recovery of differentially encoded DCT-DC components and motion vectors. This provides for a much higher quality picture in an error-prone environment while creating an almost imperceptible degradation of the picture in an error-free environment.

  1. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suited for automatically developing ultra reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  2. A Characterization of Prediction Errors

    OpenAIRE

    Meek, Christopher

    2016-01-01

Understanding prediction errors and determining how to fix them is critical to building effective predictive systems. In this paper, we delineate four types of prediction errors and demonstrate that these four types characterize all prediction errors. In addition, we describe potential remedies and tools that can be used to reduce the uncertainty when trying to determine the source of a prediction error and when trying to take action to remove it.

  3. Error Analysis and Its Implication

    Institute of Scientific and Technical Information of China (English)

    崔蕾

    2007-01-01

Error analysis is an important theory and approach for exploring the mental processes of language learners in SLA. Its major contribution is pointing out that intralingual errors are the main source of errors during language learning. Researchers' exploration and description of these errors will not only promote the bidirectional study of error analysis as both theory and approach, but also provide implications for second language learning.

  4. Error bars in experimental biology

    OpenAIRE

    2007-01-01

    Error bars commonly appear in figures in publications, but experimental biologists are often unsure how they should be used and interpreted. In this article we illustrate some basic features of error bars and explain how they can help communicate data and assist correct interpretation. Error bars may show confidence intervals, standard errors, standard deviations, or other quantities. Different types of error bars give quite different information, and so figure legends must make clear what er...

  5. Diagnostic errors in pediatric radiology

    Energy Technology Data Exchange (ETDEWEB)

Taylor, George A.; Voss, Stephan D. [Children's Hospital Boston, Department of Radiology, Harvard Medical School, Boston, MA (United States); Melvin, Patrice R. [Children's Hospital Boston, The Program for Patient Safety and Quality, Boston, MA (United States); Graham, Dionne A. [Children's Hospital Boston, The Program for Patient Safety and Quality, Boston, MA (United States); Harvard Medical School, The Department of Pediatrics, Boston, MA (United States)

    2011-03-15

Little information is known about the frequency, types and causes of diagnostic errors in imaging children. Our goals were to describe the patterns and potential etiologies of diagnostic error in our subspecialty. We reviewed 265 cases with clinically significant diagnostic errors identified during a 10-year period. Errors were defined as a diagnosis that was delayed, wrong or missed; they were classified as perceptual, cognitive, system-related or unavoidable; and they were evaluated by imaging modality and level of training of the physician involved. We identified 484 specific errors in the 265 cases reviewed (mean: 1.8 errors/case). Most discrepancies involved staff (45.5%). Two hundred fifty-eight individual cognitive errors were identified in 151 cases (mean: 1.7 errors/case); of these cases, 83 (55%) had additional perceptual or system-related errors. One hundred sixty-five perceptual errors were identified in 165 cases; of these, 68 cases (41%) also had cognitive or system-related errors. Fifty-four system-related errors were identified in 46 cases (mean: 1.2 errors/case), all of which were multi-factorial. Seven cases were unavoidable. Our study defines a taxonomy of diagnostic errors in a large academic pediatric radiology practice and suggests that most are multi-factorial in etiology. Further study is needed to define effective strategies for improvement. (orig.)

  6. Error Consistency Analysis Scheme for Infrared Ultraspectral Sounding Retrieval Error Budget Estimation

    Science.gov (United States)

    Zhou, Daniel K.; Larar, Allen M.; Liu, Xu; Smith, William L.; Strow, Larry, L.

    2013-01-01

    Great effort has been devoted towards validating geophysical parameters retrieved from ultraspectral infrared radiances obtained from satellite remote sensors. An error consistency analysis scheme (ECAS), utilizing fast radiative transfer model (RTM) forward and inverse calculations, has been developed to estimate the error budget in terms of mean difference and standard deviation of error in both spectral radiance and retrieval domains. The retrieval error is assessed through ECAS without relying on other independent measurements such as radiosonde data. ECAS establishes a link between the accuracies of radiances and retrieved geophysical parameters. ECAS can be applied to measurements from any ultraspectral instrument and any retrieval scheme with its associated RTM. In this manuscript, ECAS is described and demonstrated with measurements from the MetOp-A satellite Infrared Atmospheric Sounding Interferometer (IASI). This scheme can be used together with other validation methodologies to give a more definitive characterization of the error and/or uncertainty of geophysical parameters retrieved from ultraspectral radiances observed from current and future satellite remote sensors such as IASI, the Atmospheric Infrared Sounder (AIRS), and the Cross-track Infrared Sounder (CrIS).

  7. Transient Error Data Analysis.

    Science.gov (United States)

    1979-05-01

[Abstract not recoverable: the record contains only extraction residue from the report's table of contents (graphical data analysis; general statistics and confidence intervals; goodness-of-fit tests) and from tables of per-system transient-error data, including MTTF figures for a CMUA PDP-10 (ECL, parity) and a Cm* LSI-11 (NMOS, diagnostics), covering a 1542-hour span beginning 17-Feb-79.]

  8. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners will also find in the book a detailed presentation of practical data classifiers using MEE. These include multi-layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using an MEE-like concept is also presented. Examples, tests, evaluation experiments, and comparisons with similar machines using classic approaches complement the descriptions.

  9. New Directions in Price Test for Market Definition

    OpenAIRE

    Zipitria, Leandro

    2010-01-01

The appropriate definition of the relevant market is the main task in competition cases. But this definition, and its application, have proved difficult in abuse-of-dominance cases, mainly because of the cellophane fallacy. I offer new interpretations of the cointegration test and its vector error correction representation for antitrust market definition, and then apply them to define the beer market in Uruguay as an example.

  10. Errors in CT colonography.

    Science.gov (United States)

    Trilisky, Igor; Ward, Emily; Dachman, Abraham H

    2015-10-01

    CT colonography (CTC) is a colorectal cancer screening modality which is becoming more widely implemented and has shown polyp detection rates comparable to those of optical colonoscopy. CTC has the potential to improve population screening rates due to its minimal invasiveness, no sedation requirement, potential for reduced cathartic examination, faster patient throughput, and cost-effectiveness. Proper implementation of a CTC screening program requires careful attention to numerous factors, including patient preparation prior to the examination, the technical aspects of image acquisition, and post-processing of the acquired data. A CTC workstation with dedicated software is required with integrated CTC-specific display features. Many workstations include computer-aided detection software which is designed to decrease errors of detection by detecting and displaying polyp-candidates to the reader for evaluation. There are several pitfalls which may result in false-negative and false-positive reader interpretation. We present an overview of the potential errors in CTC and a systematic approach to avoid them.

  11. Virtual occlusal definition for orthognathic surgery.

    Science.gov (United States)

    Liu, X J; Li, Q Q; Zhang, Z; Li, T T; Xie, Z; Zhang, Y

    2016-03-01

Computer-assisted surgical simulation is being used increasingly in orthognathic surgery. However, occlusal definition is still undertaken using model surgery with subsequent digitization via surface scanning or cone beam computed tomography. A software tool has been developed and a workflow set up in order to achieve a virtual occlusal definition. The results of a validation study carried out on 60 models of normal occlusion are presented. Inter- and intra-user correlation tests were used to investigate the reproducibility of the manual setting point procedure. The errors between the virtually set positions (test) and the digitized manually set positions (gold standard) were compared. The consistency in virtual set positions performed by three individual users was investigated by a one way analysis of variance test. Inter- and intra-observer correlation coefficients for manual setting points were all greater than 0.95. Overall, the median error between the test and the gold standard positions was 1.06 mm. Errors did not differ among teeth (F=0.371, P>0.05). The errors were not significantly different from 1 mm (P>0.05). There were no significant differences in the errors made by the three independent users (P>0.05). In conclusion, this workflow for virtual occlusal definition was found to be reliable and accurate.

  12. Error Analysis in Mathematics Education.

    Science.gov (United States)

    Rittner, Max

    1982-01-01

    The article reviews the development of mathematics error analysis as a means of diagnosing students' cognitive reasoning. Errors specific to addition, subtraction, multiplication, and division are described, and suggestions for remediation are provided. (CL)

  13. Payment Error Rate Measurement (PERM)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The PERM program measures improper payments in Medicaid and CHIP and produces error rates for each program. The error rates are based on reviews of the...

  14. Error bounds for set inclusions

    Institute of Scientific and Technical Information of China (English)

    ZHENG; Xiyin(郑喜印)

    2003-01-01

    A variant of Robinson-Ursescu Theorem is given in normed spaces. Several error bound theorems for convex inclusions are proved and in particular a positive answer to Li and Singer's conjecture is given under weaker assumption than the assumption required in their conjecture. Perturbation error bounds are also studied. As applications, we study error bounds for convex inequality systems.

  15. Feature Referenced Error Correction Apparatus.

    Science.gov (United States)

    A feature referenced error correction apparatus utilizing the multiple images of the interstage level image format to compensate for positional...images and by the generation of an error correction signal in response to the sub-frame registration errors. (Author)

  16. Errors in causal inference: an organizational schema for systematic error and random error.

    Science.gov (United States)

    Suzuki, Etsuji; Tsuda, Toshihide; Mitsuhashi, Toshiharu; Mansournia, Mohammad Ali; Yamamoto, Eiji

    2016-11-01

    To provide an organizational schema for systematic error and random error in estimating causal measures, aimed at clarifying the concept of errors from the perspective of causal inference. We propose to divide systematic error into structural error and analytic error. With regard to random error, our schema shows its four major sources: nondeterministic counterfactuals, sampling variability, a mechanism that generates exposure events and measurement variability. Structural error is defined from the perspective of counterfactual reasoning and divided into nonexchangeability bias (which comprises confounding bias and selection bias) and measurement bias. Directed acyclic graphs are useful to illustrate this kind of error. Nonexchangeability bias implies a lack of "exchangeability" between the selected exposed and unexposed groups. A lack of exchangeability is not a primary concern of measurement bias, justifying its separation from confounding bias and selection bias. Many forms of analytic errors result from the small-sample properties of the estimator used and vanish asymptotically. Analytic error also results from wrong (misspecified) statistical models and inappropriate statistical methods. Our organizational schema is helpful for understanding the relationship between systematic error and random error from a previously less investigated aspect, enabling us to better understand the relationship between accuracy, validity, and precision. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Firewall Configuration Errors Revisited

    CERN Document Server

    Wool, Avishai

    2009-01-01

The first quantitative evaluation of the quality of corporate firewall configurations appeared in 2004, based on Check Point FireWall-1 rule-sets. In general, that survey indicated that corporate firewalls were often enforcing poorly written rule-sets containing many mistakes. The goal of this work is to revisit the first survey. The current study is much larger and, for the first time, includes configurations from two major vendors. The study also introduces a novel "Firewall Complexity" (FC) measure that applies to both types of firewalls. The findings of the current study indeed validate the 2004 study's main observations: firewalls are (still) poorly configured, and a rule-set's complexity is (still) positively correlated with the number of detected risk items. Thus we can conclude that, for well-configured firewalls, "small is (still) beautiful". However, unlike the 2004 study, we see no significant indication that later software versions have fewer errors (for both vendors).

  18. Beta systems error analysis

    Science.gov (United States)

    1984-01-01

The atmospheric backscatter coefficient, beta, measured with an airborne CO2 Laser Doppler Velocimeter (LDV) system operating in a continuous-wave, focused mode is discussed. The Single Particle Mode (SPM) algorithm was developed from concept through analysis of an extensive amount of data obtained with the system on board a NASA aircraft. The SPM algorithm is intended to be employed in situations where one particle at a time appears in the sensitive volume of the LDV. In addition to giving the backscatter coefficient, the SPM algorithm also produces, as intermediate results, the aerosol density and the aerosol backscatter cross-section distribution. A second method, which measures only the atmospheric backscatter coefficient, is called the Volume Mode (VM) and was employed simultaneously. The results of the two methods differed by slightly less than an order of magnitude. The measurement uncertainties and other errors in the results of the two methods are examined.

  19. Catalytic quantum error correction

    CERN Document Server

    Brun, T; Hsieh, M H; Brun, Todd; Devetak, Igor; Hsieh, Min-Hsiu

    2006-01-01

    We develop the theory of entanglement-assisted quantum error correcting (EAQEC) codes, a generalization of the stabilizer formalism to the setting in which the sender and receiver have access to pre-shared entanglement. Conventional stabilizer codes are equivalent to dual-containing symplectic codes. In contrast, EAQEC codes do not require the dual-containing condition, which greatly simplifies their construction. We show how any quaternary classical code can be made into a EAQEC code. In particular, efficient modern codes, like LDPC codes, which attain the Shannon capacity, can be made into EAQEC codes attaining the hashing bound. In a quantum computation setting, EAQEC codes give rise to catalytic quantum codes which maintain a region of inherited noiseless qubits. We also give an alternative construction of EAQEC codes by making classical entanglement assisted codes coherent.

  20. Experimental repetitive quantum error correction.

    Science.gov (United States)

    Schindler, Philipp; Barreiro, Julio T; Monz, Thomas; Nebendahl, Volckmar; Nigg, Daniel; Chwalla, Michael; Hennrich, Markus; Blatt, Rainer

    2011-05-27

    The computational potential of a quantum processor can only be unleashed if errors during a quantum computation can be controlled and corrected for. Quantum error correction works if imperfections of quantum gate operations and measurements are below a certain threshold and corrections can be applied repeatedly. We implement multiple quantum error correction cycles for phase-flip errors on qubits encoded with trapped ions. Errors are corrected by a quantum-feedback algorithm using high-fidelity gate operations and a reset technique for the auxiliary qubits. Up to three consecutive correction cycles are realized, and the behavior of the algorithm for different noise environments is analyzed.
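The repeated-cycle idea has a simple classical analogue: a three-bit repetition code corrected by majority vote after every noisy step. The sketch below is a hypothetical classical illustration of that analogue, not the trapped-ion phase-flip protocol itself:

```python
import random

random.seed(2)

def encode(bit):
    """Three-fold redundancy: the classical repetition code."""
    return [bit, bit, bit]

def apply_noise(code, p):
    """Flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in code]

def correct(code):
    """Majority vote plays the role of syndrome extraction + correction."""
    majority = int(sum(code) >= 2)
    return [majority] * 3

# Repeated correction cycles: correct after every noisy step, echoing the
# experiment's consecutive cycles (here for classical bit flips, p = 0.05).
code = encode(1)
for _ in range(3):
    code = apply_noise(code, 0.05)
    code = correct(code)
print(code)  # decoded codeword after three correction cycles
```

The key point mirrored here is that correcting between noise steps keeps single flips from accumulating into uncorrectable double flips; the logical error rate stays far below the physical flip rate as long as p is small.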

  1. Register file soft error recovery

    Science.gov (United States)

    Fleischer, Bruce M.; Fox, Thomas W.; Wait, Charles D.; Muff, Adam J.; Watson, III, Alfred T.

    2013-10-15

    Register file soft error recovery including a system that includes a first register file and a second register file that mirrors the first register file. The system also includes an arithmetic pipeline for receiving data read from the first register file, and error detection circuitry to detect whether the data read from the first register file includes corrupted data. The system further includes error recovery circuitry to insert an error recovery instruction into the arithmetic pipeline in response to detecting the corrupted data. The inserted error recovery instruction replaces the corrupted data in the first register file with a copy of the data from the second register file.
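The mechanism described in the patent is hardware circuitry, but its logic can be illustrated with a hypothetical software analogy: a primary array, a mirror array, and a parity check standing in for the error detection circuitry.

```python
def parity(word: int) -> int:
    """Even parity bit over a 32-bit word (stands in for ECC hardware)."""
    return bin(word & 0xFFFFFFFF).count("1") % 2

class MirroredRegisterFile:
    def __init__(self, size=32):
        self.primary = [0] * size   # first register file
        self.mirror = [0] * size    # second file mirroring the first
        self.parity = [0] * size    # stored parity: the error detection

    def write(self, idx, value):
        self.primary[idx] = value
        self.mirror[idx] = value
        self.parity[idx] = parity(value)

    def read(self, idx):
        value = self.primary[idx]
        if parity(value) != self.parity[idx]:   # corrupted data detected
            value = self.mirror[idx]            # the "error recovery step":
            self.primary[idx] = value           # restore from the mirror copy
        return value

rf = MirroredRegisterFile()
rf.write(3, 0xDEADBEEF)
rf.primary[3] ^= 1 << 7        # inject a single-bit soft error
print(hex(rf.read(3)))         # → 0xdeadbeef (recovered from the mirror)
```

A single-bit upset always changes the parity, so it is always detected and repaired from the mirror; multi-bit upsets that preserve parity would escape this simple check, which is why real designs use stronger codes.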

  2. The generalized transmission error of spiral bevel gears

    Science.gov (United States)

    Mark, W. D.

    1987-01-01

    The traditional definition of the transmission error of parallel-axis gear pairs is reviewed and shown to be unsuitable for characterizing the deviation from conjugate action of bevel gear pairs for vibration excitation characterization purposes. This situation is rectified by generalizing the concept of the transmission error of parallel-axis gears to a three-component transmission error for spiral bevel gears of nominal spherical involute design. A general relationship is derived which expresses the contributions to the three-component transmission error from each gear of a meshing spiral bevel pair as a linear transformation of the six coordinates that describe the deviation of the shaft centerline position of each gear of the pair from the position of its rigid perfect involute counterpart.

  3. Controlling errors in unidosis carts

    Directory of Open Access Journals (Sweden)

    Inmaculada Díaz Fernández

    2010-01-01

Full Text Available Objective: To identify errors in the unidosis system carts. Method: For two months, the Pharmacy Service tracked medication either returned or missing from the unidosis carts, both in the pharmacy and on the wards. Results: Unrevised unidosis carts showed a 0.9% medication error rate (264 errors) versus 0.6% (154 errors) in carts that had been previously revised. In carts not revised, 70.83% of the errors arose when setting up the carts; the rest were due to lack of stock or unavailability (21.6%), errors in the transcription of medical orders (6.81%), or boxes not having been emptied previously (0.76%). The errors found in the units corresponded to errors in the transcription of the treatment (3.46%), non-receipt of the unidosis copy (23.14%), the patient not taking the medication (14.36%), discharge without medication (12.77%), medication not provided by nurses (14.09%), withdrawal from the unit's stocks (14.62%), and errors by the pharmacy service (17.56%). Conclusions: Unidosis carts need to be revised, and a computerized prescription system is needed to avoid transcription errors. Discussion: A high percentage of medication errors is caused by human error. If unidosis carts are checked before being sent to the hospitalization units, the error rate falls to 0.3%.

  4. Prediction of discretization error using the error transport equation

    Science.gov (United States)

    Celik, Ismail B.; Parsons, Don Roscoe

    2017-06-01

    This study focuses on an approach to quantify the discretization error associated with numerical solutions of partial differential equations by solving an error transport equation (ETE). The goal is to develop a method that can be used to adequately predict the discretization error using the numerical solution on only one grid/mesh. The primary problem associated with solving the ETE is the formulation of the error source term which is required for accurately predicting the transport of the error. In this study, a novel approach is considered which involves fitting the numerical solution with a series of locally smooth curves and then blending them together with a weighted spline approach. The result is a continuously differentiable analytic expression that can be used to determine the error source term. Once the source term has been developed, the ETE can easily be solved using the same solver that is used to obtain the original numerical solution. The new methodology is applied to the two-dimensional Navier-Stokes equations in the laminar flow regime. A simple unsteady flow case is also considered. The discretization error predictions based on the methodology presented in this study are in good agreement with the 'true error'. While in most cases the error predictions are not quite as accurate as those from Richardson extrapolation, the results are reasonable and only require one numerical grid. The current results indicate that there is much promise going forward with the newly developed error source term evaluation technique and the ETE.
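As a point of comparison for the ETE, Richardson extrapolation estimates the discretization error from solutions on two grids. A minimal Python sketch for a second-order method follows; the composite trapezoid rule and the test integrand are illustrative assumptions, not taken from the paper:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoid rule on n uniform subintervals (2nd-order accurate)."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * s

# Richardson extrapolation: for a method of order p, the error of the
# fine-grid solution is approximately (I_h - I_2h) / (2**p - 1).
f = math.sin
I_2h = trapezoid(f, 0.0, math.pi, 50)    # coarse grid
I_h = trapezoid(f, 0.0, math.pi, 100)    # fine grid
p = 2
err_est = (I_h - I_2h) / (2 ** p - 1)

true_err = 2.0 - I_h   # exact integral of sin over [0, pi] is 2
print(err_est, true_err)
```

The estimate matches the true fine-grid error to leading order, but it needs two grids; the ETE approach described above aims to deliver a comparable estimate from a single grid by solving a transport equation for the error.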

  5. An Empirical State Error Covariance Matrix Orbit Determination Example

    Science.gov (United States)

    Frisbee, Joseph H., Jr.

    2015-01-01

    is suspect. In its most straightforward form, the technique only requires supplemental calculations to be added to existing batch estimation algorithms. In the problem studied here, the truth model uses gravity with spherical, J2 and J4 terms plus a standard exponential atmosphere with simple diurnal and random-walk components. The ability of the empirical state error covariance matrix to account for errors is investigated under four scenarios during orbit estimation. These scenarios are: exact modeling under known measurement errors, exact modeling under corrupted measurement errors, inexact modeling under known measurement errors, and inexact modeling under corrupted measurement errors. For this problem a simple analog of a distributed space surveillance network is used. The sensors in this network make only range measurements, with simple normally distributed measurement errors. The sensors are assumed to have full horizon to horizon viewing at any azimuth. For definiteness, an orbit at the approximate altitude and inclination of the International Space Station is used for the study. The comparison analyses of the data involve only total vectors. No investigation of specific orbital elements is undertaken. The total vector analyses will look at the chi-square values of the error in the difference between the estimated state and the true modeled state using both the empirical and theoretical error covariance matrices for each scenario.
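    The chi-square consistency check described above can be sketched for the simple case of a diagonal error covariance; all names and numbers below are illustrative, not from the study.

```python
# Chi-square consistency check of an estimator: compare the state error
# against the error covariance the estimator claims. With a diagonal
# covariance P the statistic reduces to a sum of sigma-normalized squares.

def chi_square(error_vec, variances):
    """epsilon^T P^-1 epsilon for a diagonal covariance P = diag(variances)."""
    return sum(e * e / v for e, v in zip(error_vec, variances))

error = [12.0, -8.0, 5.0]          # estimated minus true state (m)
variances = [100.0, 100.0, 25.0]   # claimed variances (sigma = 10, 10, 5 m)

chi2_value = chi_square(error, variances)
print(chi2_value)  # ≈ 3.08: close to the chi-square(3 dof) mean of 3,
                   # so the claimed covariance is consistent with the errors
```

    A full covariance only changes the inverse from element-wise division to a matrix solve; the interpretation of the statistic is the same.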

  6. Improved Error Thresholds for Measurement-Free Error Correction

    Science.gov (United States)

    Crow, Daniel; Joynt, Robert; Saffman, M.

    2016-09-01

    Motivated by limitations and capabilities of neutral atom qubits, we examine whether measurement-free error correction can produce practical error thresholds. We show that this can be achieved by extracting redundant syndrome information, giving our procedure extra fault tolerance and eliminating the need for ancilla verification. The procedure is particularly favorable when multiqubit gates are available for the correction step. Simulations of the bit-flip, Bacon-Shor, and Steane codes indicate that coherent error correction can produce threshold error rates that are on the order of 10⁻³ to 10⁻⁴—comparable with or better than measurement-based values, and much better than previous results for other coherent error correction schemes. This indicates that coherent error correction is worthy of serious consideration for achieving protected logical qubits.
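    The threshold idea, that encoding suppresses errors only when the physical error rate is low enough, can be illustrated classically with a 3-bit repetition code and majority vote. This is an illustration of the threshold concept only, not the coherent quantum scheme of the paper.

```python
import random

def logical_error_rate(p, trials=100_000, seed=0):
    """Monte Carlo logical error rate of a 3-bit repetition code with
    majority-vote correction under independent bit-flip noise."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(3))
        if flips >= 2:          # majority vote fails when 2 or 3 bits flip
            failures += 1
    return failures / trials

p = 0.01
p_logical = logical_error_rate(p)
# analytic value: 3p^2 - 2p^3 = 2.98e-4, far below the bare rate p = 1e-2
print(p_logical)
```

    Above p = 0.5 the same code makes things worse; quantum thresholds like the quoted 10⁻³ to 10⁻⁴ play the analogous role for fault-tolerant circuits.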

  7. The cost of human error intervention

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.; Banks, W.W.; Jones, E.D.

    1994-03-01

    DOE has directed that cost-benefit analyses be conducted as part of the review process for all new DOE orders. This new policy will have the effect of ensuring that DOE analysts can justify the implementation costs of the orders that they develop. We would like to argue that a cost-benefit analysis is merely one phase of a complete risk management program -- one that would more than likely start with a probabilistic risk assessment. The safety community defines risk as the probability of failure times the severity of consequence. An engineering definition of failure can be considered in terms of physical performance, as in mean-time-between-failure; or, it can be thought of in terms of human performance, as in probability of human error. The severity of consequence of a failure can be measured along any one of a number of dimensions -- economic, political, or social. Clearly, an analysis along one dimension cannot be directly compared to another; but a set of cost-benefit analyses, based on a series of cost dimensions, can be extremely useful to managers who must prioritize their resources. Over the last two years, DOE has been developing a series of human factors orders, directed at lowering the probability of human error -- or at least changing the distribution of those errors. The following discussion presents a series of cost-benefit analyses using historical events in the nuclear industry. However, we would first like to discuss some of the analytic cautions that must be considered when we deal with human error.
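    The risk definition above (probability of failure times severity of consequence) translates directly into a cost-benefit ranking of interventions. The interventions, probabilities, and dollar figures below are invented for illustration, not taken from the report.

```python
# Risk = probability of failure x severity of consequence, used here to
# rank candidate human-error interventions by benefit/cost ratio.

baseline_p = 0.20      # assumed annual probability of human error, no intervention
severity = 2_000_000   # assumed cost of the consequence ($)

interventions = [
    # (name, residual error probability, implementation cost ($))
    ("operator training",  0.10, 150_000),
    ("interface redesign", 0.05, 400_000),
    ("extra procedures",   0.08,  50_000),
]

def benefit_cost(p_residual, cost):
    avoided_loss = (baseline_p - p_residual) * severity  # expected $/yr avoided
    return avoided_loss / cost

ranked = sorted(interventions, key=lambda i: -benefit_cost(i[1], i[2]))
for name, p_res, cost in ranked:
    print(f"{name}: benefit/cost = {benefit_cost(p_res, cost):.2f}")
# extra procedures (4.80) > operator training (1.33) > interface redesign (0.75)
```

    Repeating the calculation with political or social severity scales gives the set of per-dimension analyses the abstract argues for.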

  8. PREVENTABLE ERRORS: NEVER EVENTS

    Directory of Open Access Journals (Sweden)

    Narra Gopal

    2014-07-01

    Full Text Available An operation or any invasive procedure is a stressful event involving risks and complications. We should be able to offer a guarantee that the right procedure will be done on the right person in the right place on their body. "Never events" are definable: they are avoidable and preventable events. Among people affected by the consequences of surgical mistakes, 60% suffered temporary injury, 33% permanent injury and 7% death. The World Health Organization (WHO) [1] has earlier said that over seven million people across the globe suffer preventable surgical injuries every year, a million of them even dying during or immediately after surgery. The UN body put the number of surgeries taking place every year globally at 234 million; surgery has become common, with one in every 25 people undergoing it at any given time. 50% of never events are preventable. Evidence suggests up to one in ten hospital admissions results in an adverse incident, a rate that would not be acceptable in other industries. In order to move towards a more acceptable level of safety, we need to understand how and why things go wrong and build a reliable system of working. Even with such a system complete prevention may not be possible, but we can reduce the error percentage [2]. To change the present attitude towards the patient, we first have to replace the word patient with medical customer; then our outlook also changes, and we will be more careful towards our customers.

  9. Comparison of analytical error and sampling error for contaminated soil.

    Science.gov (United States)

    Gustavsson, Björn; Luthbom, Karin; Lagerkvist, Anders

    2006-11-16

    Investigation of soil from contaminated sites requires several sample handling steps that, most likely, will induce uncertainties in the sample. The theory of sampling describes seven sampling errors that can be calculated, estimated or discussed in order to get an idea of the size of the sampling uncertainties. With the aim of comparing the size of the analytical error to the total sampling error, these seven errors were applied, estimated and discussed for a case study of a contaminated site. The manageable errors were summarized, showing a range of three orders of magnitude between the examples. The comparisons show that the quotient between the total sampling error and the analytical error is larger than 20 in most calculation examples. Exceptions were samples taken in hot spots, where some components of the total sampling error become small and the analytical error becomes large in comparison. Low concentration of contaminant, small extracted sample size and large particles in the sample all contribute to the extent of uncertainty.
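    Because independent error contributions add in variance, a sampling-to-analytical error ratio of 20 makes the analytical error almost irrelevant to the total. A quick check (units arbitrary, the ratio of 20 taken from the abstract):

```python
import math

def total_error(s_sampling, s_analytical):
    """Combined standard deviation of independent error contributions."""
    return math.sqrt(s_sampling**2 + s_analytical**2)

s_analytical = 1.0
s_sampling = 20.0 * s_analytical   # quotient of 20, as in most examples above

total = total_error(s_sampling, s_analytical)
print(total)  # ≈ 20.02: even halving the analytical error would barely change this
```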

  10. THE SELF-CORRECTION OF ENGLISH SPEECH ERRORS IN SECOND LANGUAGE LEARNING

    Directory of Open Access Journals (Sweden)

    Ketut Santi Indriani

    2015-05-01

    Full Text Available The process of second language (L2) learning is strongly influenced by the error reconstruction that occurs as the language is learned. Errors will definitely appear in the learning process; however, errors can also be used as a step to accelerate understanding of the language, for example through self-correction (with or without cues). In speaking, self-correction is done immediately after the error appears. This study is aimed at finding (i) what speech errors the L2 speakers are able to identify, (ii) of the errors identified, which speech errors the L2 speakers are able to self-correct and (iii) whether the self-correction of speech errors is able to immediately improve L2 learning. Based on the data analysis, it was found that the majority of identified errors are related to nouns (plurality), subject-verb agreement, grammatical structure and pronunciation. L2 speakers tend to correct errors properly: of the 78% of speech errors that were identified, as much as 66% could be self-corrected accurately by the L2 speakers. The analysis also found that self-correction is able to improve L2 learning ability directly, as evidenced by the absence of repetition of the same error after it had been corrected.

  11. Quantum Metrology Enhanced by Repetitive Quantum Error Correction

    Science.gov (United States)

    Unden, Thomas; Balasubramanian, Priya; Louzon, Daniel; Vinkler, Yuval; Plenio, Martin B.; Markham, Matthew; Twitchen, Daniel; Stacey, Alastair; Lovchinsky, Igor; Sushkov, Alexander O.; Lukin, Mikhail D.; Retzker, Alex; Naydenov, Boris; McGuinness, Liam P.; Jelezko, Fedor

    2016-06-01

    We experimentally demonstrate the protection of a room-temperature hybrid spin register against environmental decoherence by performing repeated quantum error correction whilst maintaining sensitivity to signal fields. We use a long-lived nuclear spin to correct multiple phase errors on a sensitive electron spin in diamond and realize magnetic field sensing beyond the time scales set by natural decoherence. The universal extension of sensing time, robust to noise at any frequency, demonstrates the definitive advantage entangled multiqubit systems provide for quantum sensing and offers an important complement to quantum control techniques.

  13. The Usability-Error Ontology

    DEFF Research Database (Denmark)

    2013-01-01

    … in patients coming to harm. Often the root cause analysis of these adverse events can be traced back to Usability Errors in the Health Information Technology (HIT) or its interaction with users. Interoperability of the documentation of HIT-related Usability Errors in a consistent fashion can improve our ability to do systematic reviews and meta-analyses. In an effort to support improved and more interoperable data capture regarding Usability Errors, we have created the Usability Error Ontology (UEO) as a classification method for representing knowledge regarding Usability Errors. We expect the UEO will grow over time to support an increasing number of HIT system types. In this manuscript, we present this Ontology of Usability Error Types and specifically address Computerized Physician Order Entry (CPOE), Electronic Health Records (EHR) and Revenue Cycle HIT systems.

  14. Nested Quantum Error Correction Codes

    CERN Document Server

    Wang, Zhuo; Fan, Hen; Vedral, Vlatko

    2009-01-01

    The theory of quantum error correction was established more than a decade ago as the primary tool for fighting decoherence in quantum information processing. Although great progress has already been made in this field, few methods are available for constructing new quantum error correction codes from old codes. Here we exhibit a simple and general method to construct new quantum error correction codes by nesting certain quantum codes together. The problem of finding long quantum error correction codes is reduced to that of searching for several short quantum codes with certain properties. Our method works for codes of all lengths and distances, and is quite efficient for constructing optimal or near-optimal codes. The two main known methods for constructing new codes from old in quantum error-correction theory, concatenation and pasting, can be understood within the framework of nested quantum error correction codes.

  15. Processor register error correction management

    Science.gov (United States)

    Bose, Pradip; Cher, Chen-Yong; Gupta, Meeta S.

    2016-12-27

    Processor register protection management is disclosed. In embodiments, a method of processor register protection management can include determining a sensitive logical register for executable code generated by a compiler, generating an error-correction table identifying the sensitive logical register, and storing the error-correction table in a memory accessible by a processor. The processor can be configured to generate a duplicate register of the sensitive logical register identified by the error-correction table.

  16. MPDATA error estimator for mesh adaptivity

    Science.gov (United States)

    Szmelter, Joanna; Smolarkiewicz, Piotr K.

    2006-04-01

    In multidimensional positive definite advection transport algorithm (MPDATA) the leading error as well as the first- and second-order solutions are known explicitly by design. This property is employed to construct refinement indicators for mesh adaptivity. Recent progress with the edge-based formulation of MPDATA facilitates the use of the method in an unstructured-mesh environment. In particular, the edge-based data structure allows for flow solvers to operate on arbitrary hybrid meshes, thereby lending itself to implementations of various mesh adaptivity techniques. A novel unstructured-mesh nonoscillatory forward-in-time (NFT) solver for compressible Euler equations is used to illustrate the benefits of adaptive remeshing as well as mesh movement and enrichment for the efficacy of MPDATA-based flow solvers. Validation against benchmark test cases demonstrates robustness and accuracy of the approach.

  17. Calculating error bars for neutrino mixing parameters

    CERN Document Server

    Burroughs, H R; Escamilla-Roa, J; Latimer, D C; Ernst, D J

    2012-01-01

    One goal of contemporary particle physics is to determine the mixing angles and mass-squared differences that constitute the phenomenological constants that describe neutrino oscillations. Of great interest are not only the best fit values of these constants but also their errors. Some of the neutrino oscillation data is statistically poor and cannot be treated by normal (Gaussian) statistics. To extract confidence intervals when the statistics are not normal, one should not utilize the value for chi-square versus confidence level taken from normal statistics. Instead, we propose that one should use the normalized likelihood function as a probability distribution; the relationship between the correct chi-square and a given confidence level can be computed by integrating over the likelihood function. This allows for a definition of confidence level independent of the functional form of the χ² function; it is particularly useful for cases in which the minimum of the χ² function is near a boundary. We present two ...
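    A one-dimensional sketch of the proposed recipe: normalize exp(-χ²/2) as a probability density and integrate it to map Δχ² onto confidence level when the parameter has a physical boundary at zero. The quadratic χ² model and all numbers here are illustrative, not a fit to neutrino data.

```python
import math

def chi2(theta, best=0.3, sigma=1.0):
    # Illustrative quadratic chi-square; the physical region is theta >= 0.
    return ((theta - best) / sigma) ** 2

thetas = [i * 0.001 for i in range(10_000)]         # parameter grid on [0, 10)
weights = [math.exp(-chi2(t) / 2) for t in thetas]  # unnormalized likelihood
norm = sum(weights)
chi2_min = min(chi2(t) for t in thetas)

def coverage(delta_chi2):
    """Probability enclosed by the region chi2 <= chi2_min + delta_chi2."""
    inside = sum(w for t, w in zip(thetas, weights)
                 if chi2(t) <= chi2_min + delta_chi2)
    return inside / norm

# Bisect for the delta-chi2 that encloses 68.27% probability.
lo, hi = 0.0, 10.0
for _ in range(50):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if coverage(mid) < 0.6827 else (lo, mid)

print(round(0.5 * (lo + hi), 2))  # ~0.73 here, not the Gaussian value 1.0:
                                  # the boundary at zero truncates the lower tail
```

    With no boundary the procedure recovers the textbook Δχ² = 1 for one parameter; the point of the likelihood integration is that it stays valid when the Gaussian shortcut does not.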

  19. Anxiety and Error Monitoring: Increased Error Sensitivity or Altered Expectations?

    Science.gov (United States)

    Compton, Rebecca J.; Carp, Joshua; Chaddock, Laura; Fineman, Stephanie L.; Quandt, Lorna C.; Ratliff, Jeffrey B.

    2007-01-01

    This study tested the prediction that the error-related negativity (ERN), a physiological measure of error monitoring, would be enhanced in anxious individuals, particularly in conditions with threatening cues. Participants made gender judgments about faces whose expressions were either happy, angry, or neutral. Replicating prior studies, midline…

  20. Measurement Error and Equating Error in Power Analysis

    Science.gov (United States)

    Phillips, Gary W.; Jiang, Tao

    2016-01-01

    Power analysis is a fundamental prerequisite for conducting scientific research. Without power analysis the researcher has no way of knowing whether the sample size is large enough to detect the effect he or she is looking for. This paper demonstrates how psychometric factors such as measurement error and equating error affect the power of…
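    One psychometric mechanism at work here is attenuation: measurement error shrinks the observed effect, r_obs = r_true · sqrt(rel_x · rel_y), and with it the power. A sketch using the Fisher z approximation for a correlation test; the reliabilities, effect size, and sample size are illustrative.

```python
from math import atanh, sqrt
from statistics import NormalDist

def power_corr(r, n, alpha=0.05):
    """Two-sided power for H0: rho = 0 via the Fisher z approximation."""
    z = NormalDist()
    crit = z.inv_cdf(1 - alpha / 2)
    return 1 - z.cdf(crit - atanh(r) * sqrt(n - 3))

r_true, rel_x, rel_y, n = 0.30, 0.80, 0.70, 100
r_obs = r_true * sqrt(rel_x * rel_y)   # attenuated observed correlation

print(round(power_corr(r_true, n), 2))  # ≈ 0.86 with error-free measures
print(round(power_corr(r_obs, n), 2))   # ≈ 0.61 after attenuation
```

    A power analysis that ignores the reliabilities would therefore overstate the chance of detecting the effect at this sample size.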

  1. Error begat error: design error analysis and prevention in social infrastructure projects.

    Science.gov (United States)

    Love, Peter E D; Lopez, Robert; Edwards, David J; Goh, Yang M

    2012-09-01

    Design errors contribute significantly to cost and schedule growth in social infrastructure projects and to engineering failures, which can result in accidents and loss of life. Despite considerable research addressing error causation in construction projects, design errors remain prevalent. This paper identifies the underlying conditions that contribute to design errors in social infrastructure projects (e.g. hospital, education, and law-and-order type buildings). A systemic model of error causation is propagated and subsequently used to develop a learning framework for design error prevention. The research suggests that a multitude of strategies should be adopted in congruence to prevent design errors from occurring and so ensure that safety and project performance are ameliorated.

  2. Spatial frequency domain error budget

    Energy Technology Data Exchange (ETDEWEB)

    Hauschildt, H; Krulewich, D

    1998-08-27

    The aim of this paper is to describe a methodology for designing and characterizing machines used to manufacture or inspect parts with spatial-frequency-based specifications. At Lawrence Livermore National Laboratory, one of our responsibilities is to design or select the appropriate machine tools to produce advanced optical and weapons systems. Recently, many of the component tolerances for these systems have been specified in terms of the spatial frequency content of residual errors on the surface. We typically use an error budget as a sensitivity analysis tool to ensure that the parts manufactured by a machine will meet the specified component tolerances. Error budgets provide the formalism whereby we account for all sources of uncertainty in a process, and sum them to arrive at a net prediction of how "precisely" a manufactured component can meet a target specification. Using the error budget, we are able to minimize risk during initial stages by ensuring that the machine will produce components that meet specifications before the machine is actually built or purchased. However, the current error budgeting procedure provides no formal mechanism for designing machines that can produce parts with spatial-frequency-based specifications. The output from the current error budgeting procedure is a single number estimating the net worst case or RMS error on the work piece. This procedure has limited ability to differentiate between low spatial frequency form errors versus high frequency surface finish errors. Therefore the current error budgeting procedure can lead us to reject a machine that is adequate or accept a machine that is inadequate. This paper will describe a new error budgeting methodology to aid in the design and characterization of machines used to manufacture or inspect parts with spatial-frequency-based specifications. The output from this new procedure is the continuous spatial frequency content of errors that result on a machined part. 
If the machine
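    The band-by-band roll-up that the new methodology calls for can be sketched directly: independent error sources combine in quadrature (RSS) within each spatial-frequency band rather than as a single worst-case number. The sources, bands, and amplitudes below are invented for illustration.

```python
import math

# Per-band error-budget roll-up: each source contributes an RMS amplitude
# to each spatial-frequency band, and independent sources RSS together.

bands = ["form (<1 /mm)", "mid-spatial (1-10 /mm)", "finish (>10 /mm)"]
sources = {                        # RMS contribution (nm) per band
    "spindle motion": [12.0, 4.0, 0.5],
    "tool vibration": [ 1.0, 8.0, 3.0],
    "thermal drift":  [15.0, 1.0, 0.1],
}

budget = [math.sqrt(sum(amps[i] ** 2 for amps in sources.values()))
          for i in range(len(bands))]

for band, rss in zip(bands, budget):
    print(f"{band}: {rss:.1f} nm RMS")
# form 19.2, mid-spatial 9.0, finish 3.0: thermal drift dominates form error
# and vibration dominates the mid-spatials -- structure a single RMS total hides
```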

  3. Reducing errors in emergency surgery.

    Science.gov (United States)

    Watters, David A K; Truskett, Philip G

    2013-06-01

    Errors are to be expected in health care. Adverse events occur in around 10% of surgical patients and may be even more common in emergency surgery. There is little formal teaching on surgical error in surgical education and training programmes despite their frequency. This paper reviews surgical error and provides a classification system, to facilitate learning. The approach and language used to enable teaching about surgical error was developed through a review of key literature and consensus by the founding faculty of the Management of Surgical Emergencies course, currently delivered by General Surgeons Australia. Errors may be classified as being the result of commission, omission or inition. An error of inition is a failure of effort or will and is a failure of professionalism. The risk of error can be minimized by good situational awareness, matching perception to reality, and, during treatment, reassessing the patient, team and plan. It is important to recognize and acknowledge an error when it occurs and then to respond appropriately. The response will involve rectifying the error where possible but also disclosing, reporting and reviewing at a system level all the root causes. This should be done without shaming or blaming. However, the individual surgeon still needs to reflect on their own contribution and performance. A classification of surgical error has been developed that promotes understanding of how the error was generated, and utilizes a language that encourages reflection, reporting and response by surgeons and their teams. © 2013 The Authors. ANZ Journal of Surgery © 2013 Royal Australasian College of Surgeons.

  4. IMRT QA: Selecting gamma criteria based on error detection sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Steers, Jennifer M. [Department of Radiation Oncology, Cedars-Sinai Medical Center, Los Angeles, California 90048 and Physics and Biology in Medicine IDP, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, California 90095 (United States); Fraass, Benedick A., E-mail: benedick.fraass@cshs.org [Department of Radiation Oncology, Cedars-Sinai Medical Center, Los Angeles, California 90048 (United States)

    2016-04-15

    Purpose: The gamma comparison is widely used to evaluate the agreement between measurements and treatment planning system calculations in patient-specific intensity modulated radiation therapy (IMRT) quality assurance (QA). However, recent publications have raised concerns about the lack of sensitivity when employing commonly used gamma criteria. Understanding the actual sensitivity of a wide range of different gamma criteria may allow the definition of more meaningful gamma criteria and tolerance limits in IMRT QA. We present a method that allows the quantitative determination of gamma criteria sensitivity to induced errors which can be applied to any unique combination of device, delivery technique, and software utilized in a specific clinic. Methods: A total of 21 DMLC IMRT QA measurements (ArcCHECK®, Sun Nuclear) were compared to QA plan calculations with induced errors. Three scenarios were studied: MU errors, multi-leaf collimator (MLC) errors, and the sensitivity of the gamma comparison to changes in penumbra width. Gamma comparisons were performed between measurements and error-induced calculations using a wide range of gamma criteria, resulting in a total of over 20 000 gamma comparisons. Gamma passing rates for each error class and case were graphed against error magnitude to create error curves in order to represent the range of missed errors in routine IMRT QA using 36 different gamma criteria. Results: This study demonstrates that systematic errors and case-specific errors can be detected by the error curve analysis. Depending on the location of the error curve peak (e.g., not centered about zero), 3%/3 mm threshold = 10% at 90% pixels passing may miss errors as large as 15% MU errors and ±1 cm random MLC errors for some cases. As the dose threshold parameter was increased for a given %Diff/distance-to-agreement (DTA) setting, error sensitivity was increased by up to a factor of two for select cases. This increased sensitivity with increasing dose
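    For reference, the gamma comparison itself reduces, in one dimension, to a minimum over calculation points of a combined dose-difference/distance metric, with a point passing when gamma ≤ 1. The profiles and the 3%/3 mm criterion below are a simplified illustration, not the ArcCHECK analysis of the paper.

```python
import math

def gamma_1d(meas, calc, dx, dose_tol, dta):
    """Global 1-D gamma index for two dose profiles sampled at spacing dx."""
    norm = max(meas)                     # global normalization dose
    gammas = []
    for i, dm in enumerate(meas):
        g2 = min(((j - i) * dx / dta) ** 2
                 + ((dc - dm) / (dose_tol * norm)) ** 2
                 for j, dc in enumerate(calc))
        gammas.append(math.sqrt(g2))
    return gammas

meas = [0.0, 0.5, 1.0, 1.0, 0.5, 0.0]
calc = [0.0, 0.5, 0.97, 1.0, 0.5, 0.0]   # a 3% dose deviation at one point
g = gamma_1d(meas, calc, dx=1.0, dose_tol=0.03, dta=3.0)
passing = sum(x <= 1.0 for x in g) / len(g)
print(passing)  # 1.0: every point passes 3%/3 mm despite the 3% deviation
```

    This is exactly the insensitivity the study quantifies: a criterion can report 100% passing while a clinically relevant deviation is present.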

  5. Error Analysis in English Language Learning

    Institute of Scientific and Technical Information of China (English)

    杜文婷

    2009-01-01

    Errors in English language learning are usually classified into interlingual errors and intralingual errors; a clear knowledge of the causes of these errors will help students learn English better.

  6. Error Analysis And Second Language Acquisition

    Institute of Scientific and Technical Information of China (English)

    王惠丽

    2016-01-01

    Based on the theories of error and error analysis, this article explores the effect of error and error analysis on second language acquisition (SLA) and offers advice to language teachers and learners.

  7. Quantifying error distributions in crowding.

    Science.gov (United States)

    Hanus, Deborah; Vul, Edward

    2013-03-22

    When multiple objects are in close proximity, observers have difficulty identifying them individually. Two classes of theories aim to account for this crowding phenomenon: spatial pooling and spatial substitution. Variations of these accounts predict different patterns of errors in crowded displays. Here we aim to characterize the kinds of errors that people make during crowding by comparing a number of error models across three experiments in which we manipulate flanker spacing, display eccentricity, and precueing duration. We find that both spatial intrusions and individual letter confusions play a considerable role in errors. Moreover, we find no evidence that a naïve pooling model that predicts errors based on a nonadditive combination of target and flankers explains errors better than an independent intrusion model (indeed, in our data, an independent intrusion model is slightly, but significantly, better). Finally, we find that manipulating trial difficulty in any way (spacing, eccentricity, or precueing) produces homogenous changes in error distributions. Together, these results provide quantitative baselines for predictive models of crowding errors, suggest that pooling and spatial substitution models are difficult to tease apart, and imply that manipulations of crowding all influence a common mechanism that impacts subject performance.

  8. Discretization error of Stochastic Integrals

    CERN Document Server

    Fukasawa, Masaaki

    2010-01-01

    The asymptotic error distribution for the approximation of a stochastic integral with respect to a continuous semimartingale by a Riemann sum with a general stochastic partition is studied. Effective discretization schemes, whose asymptotic conditional mean-squared error attains a lower bound, are constructed. Two applications are given: efficient delta-hedging strategies with transaction costs, and effective discretization schemes for the Euler-Maruyama approximation.

  9. Dual Processing and Diagnostic Errors

    Science.gov (United States)

    Norman, Geoff

    2009-01-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical,…

  10. Barriers to medical error reporting

    Directory of Open Access Journals (Sweden)

    Jalal Poorolajal

    2015-01-01

    Full Text Available Background: This study was conducted to explore the prevalence of medical error underreporting and the associated barriers. Methods: This cross-sectional study was performed from September to December 2012. Five hospitals affiliated with Hamadan University of Medical Sciences in Hamedan, Iran were investigated. A self-administered questionnaire was used for data collection. Participants consisted of physicians, nurses, midwives, residents, interns, and staff of the radiology and laboratory departments. Results: Overall, 50.26% of subjects had committed but not reported medical errors. The main reasons mentioned for underreporting were lack of an effective medical error reporting system (60.0%), lack of a proper reporting form (51.8%), lack of peer support for a person who has committed an error (56.0%), and lack of personal attention to the importance of medical errors (62.9%). The rate of committing medical errors was higher in men (71.4%), in the age group of 40-50 years (67.6%), among less-experienced personnel (58.7%), at the MSc educational level (87.5%), and among staff of the radiology department (88.9%). Conclusions: This study outlined the main barriers to reporting medical errors and the associated factors, which may be helpful for healthcare organizations in improving medical error reporting as an essential component of patient safety enhancement.

  11. Rectifying calibration error of Goldmann applanation tonometer is easy!

    Directory of Open Access Journals (Sweden)

    Nikhil S Choudhari

    2014-01-01

    Full Text Available Purpose: The Goldmann applanation tonometer (GAT) is the current gold standard tonometer. However, its calibration error is common and can go unnoticed in clinics, and repair by the company has limitations. The purpose of this report is to describe a self-taught technique of rectifying the calibration error of GAT. Materials and Methods: Twenty-nine slit-lamp-mounted Haag-Streit Goldmann tonometers (Model AT 900 C/M; Haag-Streit, Switzerland) were included in this cross-sectional interventional pilot study. The technique of rectification of the calibration error of the tonometer involved cleaning and lubrication of the instrument, followed by alignment of weights when lubrication alone didn't suffice. We followed the South East Asia Glaucoma Interest Group's definition of calibration error tolerance (acceptable GAT calibration error within ±2, ±3 and ±4 mm Hg at the 0, 20 and 60 mm Hg testing levels, respectively). Results: Twelve out of 29 (41.3%) GATs were out of calibration. The range of positive and negative calibration error at the clinically most important 20 mm Hg testing level was 0.5 to 20 mm Hg and -0.5 to -18 mm Hg, respectively. Cleaning and lubrication alone sufficed to rectify the calibration error of 11 (91.6%) faulty instruments; only one (8.3%) faulty GAT required alignment of the counterweight. Conclusions: Rectification of the calibration error of GAT is possible in-house. Cleaning and lubrication of GAT can be carried out even by eye care professionals and may suffice to rectify the calibration error in the majority of faulty instruments. Such an exercise may drastically reduce the downtime of the gold standard tonometer.

  12. Onorbit IMU alignment error budget

    Science.gov (United States)

    Corson, R. W.

    1980-01-01

    The Star Tracker, Crew Optical Alignment Sight (COAS), and Inertial Measurement Unit (IMU) form a complex navigation system with a multitude of error sources. A complete list of the system errors is presented. The errors were combined in a rational way to yield an estimate of the IMU alignment accuracy for STS-1. The expected standard deviation in the IMU alignment error for STS-1 type alignments was determined to be 72 arc seconds per axis for star tracker alignments and 188 arc seconds per axis for COAS alignments. These estimates are based on current knowledge of the star tracker, COAS, IMU, and navigation base error specifications, and were partially verified by preliminary Monte Carlo analysis.
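    A per-axis error budget of this kind combines independent contributions by root-sum-square. The source list below is invented (the amplitudes are merely chosen to total the quoted 72 arc seconds); it is not the actual STS-1 budget.

```python
import math

# Root-sum-square roll-up of independent alignment error sources into a
# single 1-sigma per-axis figure.

star_tracker_budget = {   # 1-sigma contribution (arc seconds per axis)
    "star tracker noise":   40.0,
    "mounting alignment":   45.0,
    "IMU resolver error":   35.0,
    "timing/attitude rate": 20.0,
}

rss = math.sqrt(sum(v ** 2 for v in star_tracker_budget.values()))
print(round(rss))  # 72 arc seconds per axis
```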

  13. Measurement Error Models in Astronomy

    CERN Document Server

    Kelly, Brandon C

    2011-01-01

    I discuss the effects of measurement error on regression and density estimation. I review the statistical methods that have been developed to correct for measurement error that are most popular in astronomical data analysis, discussing their advantages and disadvantages. I describe functional models for accounting for measurement error in regression, with emphasis on the methods of moments approach and the modified loss function approach. I then describe structural models for accounting for measurement error in regression and density estimation, with emphasis on maximum-likelihood and Bayesian methods. As an example of a Bayesian application, I analyze an astronomical data set subject to large measurement errors and a non-linear dependence between the response and covariate. I conclude with some directions for future research.
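
    The attenuation of regression slopes under covariate measurement error, and its method-of-moments correction, can be demonstrated in a few lines. This is a toy simulation, not the paper's astronomical analysis, and it assumes the error variance sigma_u^2 is known:

    ```python
    import numpy as np

    # Classic attenuation bias: regressing y on a covariate observed with
    # error w = x + u shrinks the slope toward zero.  The method-of-moments
    # correction rescales the naive slope by Var(w) / (Var(w) - sigma_u^2).
    rng = np.random.default_rng(0)
    n, beta, sigma_u = 100_000, 2.0, 1.0

    x = rng.normal(0.0, 1.0, n)          # true covariate
    w = x + rng.normal(0.0, sigma_u, n)  # covariate measured with error
    y = beta * x + rng.normal(0.0, 0.5, n)

    naive = np.cov(w, y)[0, 1] / np.var(w)
    corrected = naive * np.var(w) / (np.var(w) - sigma_u**2)
    print(naive, corrected)  # naive ~ 1.0 (attenuated), corrected ~ 2.0
    ```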

  14. Binary Error Correcting Network Codes

    CERN Document Server

    Wang, Qiwen; Li, Shuo-Yen Robert

    2011-01-01

    We consider network coding for networks experiencing worst-case bit-flip errors, and argue that this is a reasonable model for highly dynamic wireless network transmissions. We demonstrate that in this setup prior network error-correcting schemes can be arbitrarily far from achieving the optimal network throughput. We propose a new metric for errors under this model. Using this metric, we prove a new Hamming-type upper bound on the network capacity. We also show a commensurate lower bound based on GV-type codes that can be used for error-correction. The codes used to attain the lower bound are non-coherent (do not require prior knowledge of network topology). The end-to-end nature of our design enables our codes to be overlaid on classical distributed random linear network codes. Further, we free internal nodes from having to implement potentially computationally intensive link-by-link error-correction.
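
    The GV-type lower bound mentioned above has a classical counting form for ordinary binary block codes: a code of length n and minimum distance d with at least 2^n / V(n, d-1) codewords exists, where V is the volume of a Hamming ball. A minimal sketch of that classical version (not the paper's network-coding metric):

    ```python
    from math import comb

    # Gilbert-Varshamov-type lower bound for binary codes:
    # A(n, d) >= 2^n / sum_{i=0}^{d-1} C(n, i).
    def gv_lower_bound(n, d):
        ball = sum(comb(n, i) for i in range(d))  # Hamming ball, radius d-1
        return 2**n // ball  # floor of the guaranteed code size

    print(gv_lower_bound(10, 3))  # → 18: a code this large is guaranteed
    ```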

  15. Error Propagation in the Hypercycle

    CERN Document Server

    Campos, P R A; Stadler, P F

    1999-01-01

    We study analytically the steady-state regime of a network of n error-prone self-replicating templates forming an asymmetric hypercycle and its error tail. We show that the existence of a master template with a higher non-catalyzed self-replicative productivity, a, than the error tail ensures the stability of chains in which m < n templates coexist with the error tail; this stability is guaranteed for catalytic coupling strengths (K) of order of a. We find that the hypercycle becomes more stable than the chains only for K of order of a^2. Furthermore, we show that the minimal replication accuracy per template needed to maintain the hypercycle, the so-called error threshold, vanishes like sqrt(n/K) for large K and n <= 4.

  16. FPU-Supported Running Error Analysis

    OpenAIRE

    T. Zahradnický; R. Lórencz

    2010-01-01

    A-posteriori forward rounding error analyses tend to give sharper error estimates than a-priori ones, as they use actual data quantities. One such a-posteriori analysis – running error analysis – uses expressions consisting of two parts: one generates the error and the other propagates input errors to the output. This paper suggests replacing the error-generating term with an FPU-extracted rounding error estimate, which produces a sharper error bound.
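
    The two-part structure described above can be illustrated on recursive summation, where the textbook running error recurrence accumulates u·|partial sum| per addition. This is a sketch of the classical technique, not the paper's FPU-assisted variant:

    ```python
    import sys
    from fractions import Fraction

    # Running error analysis for recursive summation: alongside the
    # computed sum we accumulate mu = sum of |partial sums|; the
    # a-posteriori bound u * mu then covers the accumulated rounding error.
    u = sys.float_info.epsilon / 2  # unit roundoff of IEEE double precision

    def sum_with_running_error(xs):
        s, mu = 0.0, 0.0
        for x in xs:
            s += x
            mu += abs(s)        # each addition errs by at most u * |result|
        return s, u * mu

    xs = [0.1] * 10
    s, bound = sum_with_running_error(xs)
    # exact rational sum of the represented inputs, for comparison
    true_error = abs(Fraction(s) - sum(Fraction(x) for x in xs))
    print(s, float(true_error), bound)  # true rounding error vs running bound
    ```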

  17. Influence of uncorrected refractive error and unmet refractive error on visual impairment in a Brazilian population.

    Science.gov (United States)

    Ferraz, Fabio H; Corrente, José E; Opromolla, Paula; Schellini, Silvana A

    2014-06-25

    The World Health Organization (WHO) definitions of blindness and visual impairment are widely based on best-corrected visual acuity, excluding uncorrected refractive error (URE) as a cause of visual impairment. Recently, URE was included as a cause of visual impairment, thus emphasizing that the burden of visual impairment due to refractive error (RE) worldwide is substantially higher. The purpose of the present study is to determine the reversal of visual impairment and blindness in the population by correcting RE, and possible associations between RE and individual characteristics. A cross-sectional study was conducted in nine counties of the western region of the state of São Paulo, using systematic and random sampling of households between March 2004 and July 2005. Individuals aged more than 1 year were included and were evaluated for demographic data, eye complaints, history, and eye exam, including non-corrected visual acuity (NCVA), best-corrected visual acuity (BCVA), and automatic and manual refractive examination. URE was defined as NCVA > 0.15 logMAR with BCVA ≤ 0.15 logMAR after refractive correction; unmet refractive error (UREN) was defined as visual impairment or blindness (NCVA > 0.5 logMAR) with BCVA ≤ 0.5 logMAR after optical correction. A total of 70.2% of subjects had normal NCVA. URE was detected in 13.8%. Prevalence of 4.6% of optically reversible low vision and 1.8% of blindness reversible by optical correction were found. UREN was detected in 6.5% of individuals, more frequently in women over the age of 50 and in carriers of higher RE. Visual impairment related to eye diseases is not reversible with spectacles. Using multivariate analysis, associations between URE and UREN with regard to sex, age and RE were observed. RE is an important cause of reversible blindness and low vision in the Brazilian population.

  18. Human Errors - A Taxonomy for Describing Human Malfunction in Industrial Installations

    DEFF Research Database (Denmark)

    Rasmussen, J.

    1982-01-01

    This paper describes the definition and the characteristics of human errors. Different types of human behavior are classified, and their relation to different error mechanisms is analyzed. The effect of conditioning factors related to affective, motivating aspects of the work situation, as well as physiological factors, is also taken into consideration. The taxonomy for event analysis, including human malfunction, is presented. Possibilities for the prediction of human error are discussed. The need for careful studies in actual work situations is expressed; such studies could provide a better understanding of the complexity of human error situations as well as the data needed to characterize these situations.

  19. New Approach for Error Reduction in the Volume Penalization Method

    CERN Document Server

    Iwakami-Nakano, Wakana; Hatakeyama, Nozomu; Hattori, Yuji

    2012-01-01

    The volume penalization method offers an efficient way to numerically simulate flows around complex-shaped bodies which move and/or deform in general. In this method a penalization term, which has permeability eta and a mask function, is added to a governing equation as a forcing term in order to impose different dynamics in solid and fluid regions. In this paper we investigate the accuracy of the volume penalization method in detail. We choose the one-dimensional Burgers' equation as a governing equation since it enables extensive study and has a nonlinear term similar to the Navier-Stokes equations. It is confirmed that the error, which consists of the discretization/truncation error, the penalization error, the round-off error, and others, has the same features as in previous results when we use the standard definition of the mask function. As the number of grid points increases, the error converges to a non-zero constant which is equal to the penalization error. We propose a new approach for reduc...
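
    The flavor of the penalization error can be seen on a much simpler problem than Burgers' equation. The sketch below is our own toy example, not the paper's setup: it solves a steady 1D diffusion problem with a penalized "solid" half-domain and measures the fluid-region error against the exact Dirichlet solution; the error that remains on a fine grid is the penalization error, which shrinks with eta:

    ```python
    import numpy as np

    # Solve -u'' + (chi/eta) * u = 1 on [0, 1], u(0) = u(1) = 0, where the
    # mask chi marks the "solid" half x > 0.5.  As eta -> 0 the fluid-region
    # solution approaches the exact solution of -u'' = 1 on [0, 0.5] with
    # u(0.5) = 0; the remaining gap is the penalization error.
    def penalized_solution(eta, m=400):
        h = 1.0 / m
        x = np.linspace(h, 1.0 - h, m - 1)            # interior nodes
        chi = (x > 0.5).astype(float)                 # mask: 1 in the solid
        main = 2.0 / h**2 + chi / eta
        A = (np.diag(main)
             - np.diag(np.ones(m - 2), 1) / h**2
             - np.diag(np.ones(m - 2), -1) / h**2)
        u = np.linalg.solve(A, np.ones(m - 1))
        exact = np.where(x < 0.5, 0.5 * x * (0.5 - x), 0.0)  # fluid-region exact
        return np.max(np.abs((u - exact)[x < 0.5]))

    for eta in (1e-2, 1e-4, 1e-6):
        print(eta, penalized_solution(eta))  # error shrinks with eta
    ```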

  20. [Communication of medical errors to patients: questions and tools].

    Science.gov (United States)

    Bascuñán, María Luz; Arriagada, Ana María

    2016-09-01

    For several years and in many different ways, medical errors have been studied. As expected, the majority of efforts have been directed at preventing clinical errors during the different phases of health care. Nevertheless, less attention has been given to what happens when a negative effect has already occurred. The present work describes the doubts and difficulties that doctors face when dealing with an error, and the communicational tools that the literature offers to cope with them. The definition of medical error was the starting point used to analyze the evidence about what, why and how to disclose medical errors from an ethical and technical point of view. In the light of new legal exigencies, communication and health protocols are reviewed, distinguishing those used for conveying bad news from those for disclosing medical errors. The importance of the ethical and communicational training of professionals is emphasized, identifying certain hindering aspects of the medical culture: this culture promotes an idea of the doctor as a professional who knows everything, does not make mistakes and acts in isolation, traits that do not reflect the personal attributes required of the professional and the health team for good professional practice.

  1. Writing Errors and Anosognosia in Amyotrophic Lateral Sclerosis with Dementia

    Directory of Open Access Journals (Sweden)

    Hiroo Ichikawa

    2008-01-01

    Amyotrophic lateral sclerosis (ALS) with dementia (ALS-D) is known to exhibit characteristics of frontotemporal dementia. However, in clinical situations, it is often difficult to evaluate these patients' cognitive functions because of impaired voluntary speech and physical disabilities. In order to identify characteristic and diagnostic cognitive symptoms of relatively advanced ALS-D patients, we retrospectively reviewed the clinical features of seven cases of clinically definite ALS who had dementia, impaired voluntary speech, and physical disability. Their medical records showed that six out of seven patients made writing errors, and all of the patients demonstrated anosognosia. The writing errors consisted of paragraphia such as substitution, omission, or syntactic errors, with individual differences in error types. Dissociation between kana and kanji was also observed. Anosognosia was evaluated by a self-rating scale with which the patients and the medical staff evaluated the patient's physical ability; the results indicated a large discrepancy between the evaluations by the patients and the medical staff. We emphasize that aphasic writing errors have been underestimated, particularly in ALS-D patients with impaired voluntary speech. We also report that anosognosia was the most important and quantifiable symptom in ALS-D. The relationship between writing errors and anosognosia should be investigated further.

  3. A precise error bound for quantum phase estimation.

    Directory of Open Access Journals (Sweden)

    James M Chappell

    Quantum phase estimation is one of the key algorithms in the field of quantum computing, but up until now, only approximate expressions have been derived for the probability of error. We revisit these derivations, and find that by ensuring symmetry in the error definitions, an exact formula can be found. This new approach may also have value in solving other related problems in quantum computing, where an expected error is calculated. Expressions for two special cases of the formula are also developed: in the limit as the number of qubits in the quantum computer approaches infinity, and in the limit as the number of extra qubits added to improve reliability goes to infinity. This formula is useful in validating computer simulations of the phase estimation procedure and in avoiding overestimation of the number of qubits required to achieve a given reliability. It thus brings improved precision to the design of quantum computers.
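
    For validating simulations as suggested above, the exact outcome distribution of textbook phase estimation can be computed directly from the amplitude sum. This is a standard-QPE sketch of our own; it does not implement the paper's symmetrized error definition:

    ```python
    import numpy as np

    # Exact outcome distribution of textbook quantum phase estimation with
    # t ancilla qubits and true phase phi in [0, 1):
    #   amplitude(m) = (1/2^t) * sum_k exp(2*pi*i*k*(phi - m/2^t)).
    # The error probability is the total probability off the nearest grid point.
    def qpe_error_probability(phi, t):
        N = 2**t
        k = np.arange(N)
        probs = np.array([abs(np.exp(2j * np.pi * k * (phi - m / N)).sum() / N)**2
                          for m in range(N)])
        best = int(round(phi * N)) % N   # outcome nearest to phi
        return 1.0 - probs[best]

    print(qpe_error_probability(0.25, 4))  # exactly representable phase: 0 error
    print(qpe_error_probability(1/3, 4))   # non-representable phase: finite error
    ```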

  4. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying

    2009-08-27

    Regression quantiles can be substantially biased when the covariates are measured with error. In this paper we propose a new method that produces consistent linear quantile estimation in the presence of covariate measurement error. The method corrects the measurement error induced bias by constructing joint estimating equations that simultaneously hold for all the quantile levels. An iterative EM-type estimation algorithm to obtain the solutions to such joint estimation equations is provided. The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a longitudinal study with an unusual measurement error structure. © 2009 American Statistical Association.
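
    The bias the paper corrects is easy to reproduce. The toy below fits a median (LAD) slope through the origin by brute-force grid search and shows the attenuation caused by covariate noise; it is a naive illustration only, and does not implement the paper's joint estimating-equation method:

    ```python
    import numpy as np

    # Median regression slope through the origin, by grid search over b
    # minimizing mean |y - b * covariate|; covariate measurement error
    # attenuates the fitted slope just as it does for mean regression.
    rng = np.random.default_rng(1)
    n, beta = 20_000, 2.0
    x = rng.normal(0, 1, n)
    w = x + rng.normal(0, 1, n)          # covariate with measurement error
    y = beta * x + rng.normal(0, 0.5, n)

    grid = np.linspace(0.0, 3.0, 301)

    def lad_slope(cov):
        losses = [np.abs(y - b * cov).mean() for b in grid]
        return grid[int(np.argmin(losses))]

    print(lad_slope(x), lad_slope(w))  # ~2.0 on x vs. an attenuated slope on w
    ```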

  5. The uncorrected refractive error challenge

    Directory of Open Access Journals (Sweden)

    Kovin Naidoo

    2016-11-01

    Refractive error affects people of all ages, socio-economic statuses and ethnic groups. The most recent statistics estimate that, worldwide, 32.4 million people are blind and 191 million people have vision impairment. Vision impairment has been defined based on distance visual acuity only, and uncorrected distance refractive error (mainly myopia) is the single biggest cause of worldwide vision impairment. However, when we also consider near visual impairment, it is clear that even more people are affected. Research estimated the number of people with vision impairment due to uncorrected distance refractive error at 107.8 million [1], and the number affected by uncorrected near refractive error at 517 million, giving a total of 624.8 million people.

  6. Numerical optimization with computational errors

    CERN Document Server

    Zaslavski, Alexander J

    2016-01-01

    This book studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking into account computational errors. The author illustrates that algorithms generate a good approximate solution if computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative. The monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.
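
    The book's central theme — that algorithms still reach a good approximate solution when per-step computational errors are bounded by a small constant — can be sketched with gradient descent on a quadratic. This is our own toy example, not taken from the monograph:

    ```python
    import numpy as np

    # Gradient descent on f(x) = ||x||^2 with a bounded error injected into
    # each gradient evaluation.  The iterates settle into a neighborhood of
    # the minimizer whose size shrinks with the error bound delta.
    rng = np.random.default_rng(0)

    def noisy_gradient_descent(delta, steps=500, lr=0.1):
        x = np.array([5.0, -5.0])
        tail = []
        for i in range(steps):
            grad = 2 * x + rng.uniform(-delta, delta, 2)  # inexact gradient
            x = x - lr * grad
            if i >= steps - 100:
                tail.append(np.linalg.norm(x))
        return float(np.mean(tail))  # average distance to the true minimizer

    for delta in (1.0, 0.01):
        print(delta, noisy_gradient_descent(delta))  # smaller bound, smaller residual
    ```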

  7. Error Analysis in Mathematics Education.

    Science.gov (United States)

    Radatz, Hendrik

    1979-01-01

    Five types of errors in an information-processing classification are discussed: language difficulties; difficulties in obtaining spatial information; deficient mastery of prerequisite skills, facts, and concepts; incorrect associations; and application of irrelevant rules. (MP)

  8. Comprehensive Error Rate Testing (CERT)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services (CMS) implemented the Comprehensive Error Rate Testing (CERT) program to measure improper payments in the Medicare...

  9. Aging transition by random errors

    Science.gov (United States)

    Sun, Zhongkui; Ma, Ning; Xu, Wei

    2017-02-01

    In this paper, the effects of random errors on oscillating behaviors are studied theoretically and numerically in a prototypical coupled nonlinear oscillator. Two kinds of noise are employed to represent the measurement errors accompanying the parameter that specifies the distance from a Hopf bifurcation in the Stuart-Landau model. It is demonstrated that when the random errors are uniform random noise, increasing the noise intensity can effectively increase the robustness of the system. When the random errors are normal random noise, increasing the variance can also enhance the robustness of the system, provided that the probability that aging transition occurs reaches a certain threshold; the opposite conclusion is obtained when the probability is below the threshold. These findings provide an alternative candidate for controlling the critical value of the aging transition in coupled oscillator systems, which are composed of active and inactive oscillators in practice.
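
    The aging transition itself — collapse of the collective amplitude as the fraction p of inactive oscillators grows — is straightforward to reproduce in the noise-free globally coupled Stuart-Landau setting that the paper perturbs. A minimal sketch, with parameter values chosen by us for illustration:

    ```python
    import numpy as np

    # Globally coupled Stuart-Landau oscillators: a fraction p of units is
    # "inactive" (negative bifurcation parameter); past a critical p the
    # whole population's oscillation amplitude collapses (aging transition).
    def mean_amplitude(p, N=100, K=5.0, dt=0.01, steps=4000):
        alpha = np.full(N, 2.0)
        alpha[:int(p * N)] = -2.0            # inactive units
        z = np.full(N, 1.0 + 0.0j)
        for _ in range(steps):
            mean = z.mean()
            dz = (alpha + 1j - np.abs(z)**2) * z + K * (mean - z)
            z = z + dt * dz                  # forward Euler step
        return np.abs(z.mean())

    # p = 0.2 keeps the population oscillating; p = 0.9 is past the
    # transition for these parameters and the amplitude dies out.
    print(mean_amplitude(0.2), mean_amplitude(0.9))
    ```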

  11. Error correcting coding for OTN

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Pedersen, Lars A.

    2010-01-01

    Forward error correction codes for 100 Gb/s optical transmission are currently receiving much attention from transport network operators and technology providers. We discuss the performance of hard decision decoding using product type codes that cover a single OTN frame or a small number of such frames. In particular we argue that a three-error-correcting BCH code is the best choice for the component code in such systems.
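
    The product-code idea can be shown with the simplest possible component code. The toy below (our illustration; real OTN FECs use far stronger components, such as the three-error-correcting BCH codes argued for above) uses one parity bit per row and per column, so a single bit flip is located at the intersection of the failing row and column checks:

    ```python
    import numpy as np

    # Toy product code: 8x8 data bits plus one parity bit per row and
    # column.  A single flipped bit makes exactly one row check and one
    # column check fail, which pinpoints and corrects it.
    def encode(bits8x8):
        g = np.zeros((9, 9), dtype=int)
        g[:8, :8] = bits8x8
        g[:8, 8] = g[:8, :8].sum(axis=1) % 2   # row parities
        g[8, :] = g[:8, :].sum(axis=0) % 2     # column parities
        return g

    def correct_single_error(g):
        rows = np.where(g.sum(axis=1) % 2 == 1)[0]
        cols = np.where(g.sum(axis=0) % 2 == 1)[0]
        if len(rows) == 1 and len(cols) == 1:  # single flip located
            g[rows[0], cols[0]] ^= 1
        return g

    data = (np.arange(64).reshape(8, 8) % 3 == 0).astype(int)
    g = encode(data)
    g[3, 5] ^= 1                               # channel flips one bit
    assert (correct_single_error(g)[:8, :8] == data).all()
    print("single bit flip corrected")
    ```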

  12. Errors in Chemical Sensor Measurements

    Directory of Open Access Journals (Sweden)

    Artur Dybko

    2001-06-01

    Various types of errors during the measurements of ion-selective electrodes, ion-sensitive field effect transistors, and fibre optic chemical sensors are described. The errors were divided according to their nature and place of origin into chemical, instrumental and non-chemical. The influence of interfering ions, leakage of the membrane components, liquid junction potential as well as sensor wiring, ambient light and temperature is presented.

  13. Error image aware content restoration

    Science.gov (United States)

    Choi, Sungwoo; Lee, Moonsik; Jung, Byunghee

    2015-12-01

    As TV resolution has increased significantly, content consumers have become increasingly sensitive to the subtlest defects in TV contents. This rising standard of quality demanded by consumers has posed a new challenge in today's context, where the tape-based process has transitioned to the file-based process: the transition necessitated digitizing old archives, a process which inevitably produces errors such as disordered pixel blocks, scattered white noise, or totally missing pixels. Unsurprisingly, detecting and fixing such errors requires a substantial amount of time and human labor to meet the standard demanded by today's consumers. In this paper, we introduce a novel, automated error restoration algorithm which can be applied to different types of classic errors by utilizing adjacent images while preserving the undamaged parts of an error image as much as possible. We tested our method on error images detected by our quality check system in the KBS (Korean Broadcasting System) video archive. We are also implementing the algorithm as a plugin for a well-known NLE (non-linear editing) system, a familiar tool for quality control agents.

  14. Quantum error correction for beginners.

    Science.gov (United States)

    Devitt, Simon J; Munro, William J; Nemoto, Kae

    2013-07-01

    Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large-scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation now form a much larger field, and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault-tolerance, not as a detailed guide, but rather as a basic introduction. Development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault-tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future.
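
    The canonical first example in introductions like this one is the three-qubit repetition code against bit flips, whose error suppression can be seen with a purely classical simulation of the noise and majority-vote decoder:

    ```python
    import random

    # Three-qubit repetition code against independent bit flips with
    # probability p per qubit, decoded by majority vote.  Encoding
    # suppresses the logical error rate from p to roughly 3p^2.
    def logical_error_rate(p, trials=100_000, seed=42):
        rng = random.Random(seed)
        errors = 0
        for _ in range(trials):
            flips = [rng.random() < p for _ in range(3)]
            if sum(flips) >= 2:          # majority vote decodes incorrectly
                errors += 1
        return errors / trials

    p = 0.05
    print(p, logical_error_rate(p))  # analytic value: 3p^2 - 2p^3 = 0.00725
    ```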

  15. Dominant modes via model error

    Science.gov (United States)

    Yousuff, A.; Breida, M.

    1992-01-01

    Obtaining a reduced model of a stable mechanical system with proportional damping is considered. Such systems can be conveniently represented in modal coordinates. Two popular schemes, the modal cost analysis and the balancing method, offer simple means of identifying dominant modes for retention in the reduced model. The dominance is measured via the modal costs in the case of modal cost analysis and via the singular values of the Gramian-product in the case of balancing. Though these measures do not exactly reflect the more appropriate model error, which is the H2 norm of the output-error between the full and the reduced models, they do lead to simple computations. Normally, the model error is computed after the reduced model is obtained, since it is believed that, in general, the model error cannot be easily computed a priori. The authors point out that the model error can also be calculated a priori, just as easily as the above measures. Hence, the model error itself can be used to determine the dominant modes. Moreover, the simplicity of the computations does not presume any special properties of the system, such as small damping, orthogonal symmetry, etc.

  16. Harmless error analysis: How do judges respond to confession errors?

    Science.gov (United States)

    Wallace, D Brian; Kassin, Saul M

    2012-04-01

    In Arizona v. Fulminante (1991), the U.S. Supreme Court opened the door for appellate judges to conduct a harmless error analysis of erroneously admitted, coerced confessions. In this study, 132 judges from three states read a murder case summary, evaluated the defendant's guilt, assessed the voluntariness of his confession, and responded to implicit and explicit measures of harmless error. Results indicated that judges found a high-pressure confession to be coerced and hence improperly admitted into evidence. As in studies with mock jurors, however, the improper confession significantly increased their conviction rate in the absence of other evidence. On the harmless error measures, judges successfully overruled the confession when required to do so, indicating that they are capable of this analysis.

  17. Explaining errors in children's questions.

    Science.gov (United States)

    Rowland, Caroline F

    2007-07-01

    The ability to explain the occurrence of errors in children's speech is an essential component of successful theories of language acquisition. The present study tested some generativist and constructivist predictions about error on the questions produced by ten English-learning children between 2 and 5 years of age. The analyses demonstrated that, as predicted by some generativist theories [e.g. Santelmann, L., Berk, S., Austin, J., Somashekar, S. & Lust, B. (2002). Continuity and development in the acquisition of inversion in yes/no questions: dissociating movement and inflection, Journal of Child Language, 29, 813-842], questions with auxiliary DO attracted higher error rates than those with modal auxiliaries. However, in wh-questions, questions with modals and DO attracted equally high error rates, and these findings could not be explained in terms of problems forming questions with why or negated auxiliaries. It was concluded that the data might be better explained in terms of a constructivist account that suggests that entrenched item-based constructions may be protected from error in children's speech, and that errors occur when children resort to other operations to produce questions [e.g. Dabrowska, E. (2000). From formula to schema: the acquisition of English questions. Cognitive Linguistics, 11, 83-102; Rowland, C. F. & Pine, J. M. (2000). Subject-auxiliary inversion errors and wh-question acquisition: What children do know? Journal of Child Language, 27, 157-181; Tomasello, M. (2003). Constructing a language: A usage-based theory of language acquisition. Cambridge, MA: Harvard University Press]. However, further work on constructivist theory development is required to allow researchers to make predictions about the nature of these operations.

  18. Pauli Exchange Errors in Quantum Computation

    CERN Document Server

    Ruskai, M B

    2000-01-01

    We argue that a physically reasonable model of fault-tolerant computation requires the ability to correct a type of two-qubit error, which we call Pauli exchange errors, as well as one-qubit errors. We give an explicit 9-qubit code which can handle both Pauli exchange errors and all one-qubit errors.

  19. State-independent error-disturbance trade-off for measurement operators

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, S.S. [Kuang Yaming Honors School, Nanjing University, Nanjing, Jiangsu 210093 (China); Wu, Shengjun, E-mail: sjwu@nju.edu.cn [Kuang Yaming Honors School, Nanjing University, Nanjing, Jiangsu 210093 (China); Chau, H.F. [Department of Physics, The University of Hong Kong, Pokfulam Road (Hong Kong)

    2016-05-20

    In general, the classical measurement statistics of a quantum measurement are disturbed by performing an additional incompatible quantum measurement beforehand. Using this observation, we introduce a state-independent definition of disturbance by relating it to the distinguishability problem between two classical statistical distributions – one resulting from a single quantum measurement and the other from a succession of two quantum measurements. Interestingly, we find an error-disturbance trade-off relation for any measurements in two-dimensional Hilbert space and for measurements with mutually unbiased bases in any finite-dimensional Hilbert space. This relation shows that error should be reduced to zero in order to minimize the sum of error and disturbance. We conjecture that a similar trade-off relation with a slightly relaxed definition of error can be generalized to any measurements in an arbitrary finite-dimensional Hilbert space.
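
    The distinguishability notion of disturbance can be illustrated numerically for a qubit. The sketch below (our illustration, using total variation distance as one simple distinguishability measure, not necessarily the paper's exact choice) compares the Z-basis statistics of a state with and without a prior X measurement:

    ```python
    import numpy as np

    # Compare Z-measurement statistics with and without a prior,
    # outcome-forgotten X measurement; the total variation distance
    # between the two outcome distributions quantifies the disturbance.
    def z_probs(rho):
        return np.real(np.diag(rho))          # populations in the Z basis

    def after_x_measurement(rho):
        # project onto |+><+| and |-><-|, then forget the outcome
        plus = np.array([[0.5, 0.5], [0.5, 0.5]])
        minus = np.array([[0.5, -0.5], [-0.5, 0.5]])
        return plus @ rho @ plus + minus @ rho @ minus

    psi = np.array([1.0, 0.0])                # |0>, a Z eigenstate
    rho = np.outer(psi, psi.conj())
    p_direct = z_probs(rho)
    p_disturbed = z_probs(after_x_measurement(rho))
    tv = 0.5 * np.abs(p_direct - p_disturbed).sum()
    print(p_direct, p_disturbed, tv)          # |0> is fully randomized: TV = 0.5
    ```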

  20. An error criterion for determining sampling rates in closed-loop control systems

    Science.gov (United States)

    Brecher, S. M.

    1972-01-01

    The determination of an error criterion which will give a sampling rate for adequate performance of linear, time-invariant closed-loop, discrete-data control systems was studied. The proper modelling of the closed-loop control system for characterization of the error behavior, and the determination of an absolute error definition for performance of the two commonly used holding devices are discussed. The definition of an adequate relative error criterion as a function of the sampling rate and the parameters characterizing the system is established along with the determination of sampling rates. The validity of the expressions for the sampling interval was confirmed by computer simulations. Their application solves the problem of making a first choice in the selection of sampling rates.
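
    A minimal version of the idea above: for a first-order lag, measure the worst-case error of a zero-order-hold reconstruction as a function of the sampling interval T, and pick the largest T whose error stays under a tolerance. This toy (our construction, not the paper's criterion) shows the error shrinking as the sampling rate rises:

    ```python
    import numpy as np

    # Worst-case zero-order-hold reconstruction error for the step
    # response y(t) = 1 - exp(-t/tau), as a function of sample interval T.
    tau = 1.0
    t = np.linspace(0, 5 * tau, 5001)
    y = 1 - np.exp(-t / tau)

    def zoh_max_error(T):
        samples_t = np.arange(0, 5 * tau + T, T)
        samples_y = 1 - np.exp(-samples_t / tau)
        held = samples_y[np.searchsorted(samples_t, t, side="right") - 1]
        return np.max(np.abs(y - held))

    for T in (1.0, 0.5, 0.1, 0.05):
        print(T, zoh_max_error(T))  # error shrinks roughly linearly with T
    ```

    For this signal the worst-case error is approximately 1 - exp(-T/tau), attained over the first (steepest) hold interval, so the tolerance directly dictates the sampling interval.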

  1. Evaluating the Appropriateness and Use of Domain Critical Errors

    Directory of Open Access Journals (Sweden)

    Chad W. Buckendahl

    2012-10-01

    The consequences associated with the uses and interpretations of scores for many credentialing testing programs have important implications for a range of stakeholders. Within licensure settings specifically, results from examination programs are often one of the final steps in the process of assessing whether individuals will be allowed to enter practice. This article focuses on the concept of domain critical errors and suggests a framework for considering their use in practice. Domain critical errors are defined here as knowledge, skills, abilities, or judgments that are essential to the definition of minimum qualifications in a testing program's pass/fail decision-making process. Using domain critical errors has psychometric and policy implications, particularly for licensure programs that are mandatory for entry-level practice. Because these errors greatly influence pass/fail decisions, the measurement community faces an ongoing challenge to promote defensible practices while concurrently providing assessment literacy development about the appropriate design and use of testing methods like domain critical errors.

  2. Error-associated behaviors and error rates for robotic geology

    Science.gov (United States)

    Anderson, Robert C.; Thomas, Geb; Wagner, Jacob; Glasgow, Justin

    2004-01-01

    This study explores human error as a function of the decision-making process. One of many models for human decision-making is Rasmussen's decision ladder [9]. The decision ladder identifies the multiple tasks and states of knowledge involved in decision-making. The tasks and states of knowledge can be classified by the level of cognitive effort required to make the decision, leading to the skill, rule, and knowledge taxonomy (Rasmussen, 1987). Skill based decisions require the least cognitive effort and knowledge based decisions require the greatest cognitive effort. Errors can occur at any of the cognitive levels.

  4. Some Notes on the Positive Definite Problem of a Binary Quartic Form

    Institute of Scientific and Technical Information of China (English)

    CHENBao-xing

    2003-01-01

    In this paper, we discuss the positive definite problem of a binary quartic form and obtain a necessary and sufficient condition. In addition, we give two examples to show that there are some errors in paper [1].

  5. Determination of error measurement by means of the basic magnetization curve

    Science.gov (United States)

    Lankin, M. V.; Lankin, A. M.

    2016-04-01

    The article describes the implementation of a methodology for fault detection in electric cutting machines by means of the basic magnetization curve. The basic magnetization curve, as an integral operating characteristic of the electric machine, allows one to identify the fault type. In the measurement process, the estimation of the error of the basic magnetization curve plays a major role, as inaccuracies in this characteristic can have a deleterious effect.

  6. POSITION ERROR IN STATION-KEEPING SATELLITE

    Science.gov (United States)

    An error in satellite orientation, combined with the sun lying in a plane other than the equatorial plane, may result in errors in position determination. The nature of the errors involved is described and their magnitudes are estimated.

  7. The role of extensive recasts in error detection and correction by adult ESL students

    Directory of Open Access Journals (Sweden)

    Laura Hawkes

    2016-03-01

    Full Text Available Most of the laboratory studies on recasts have examined the role of intensive recasts provided repeatedly on the same target structure. This is different from the original definition of recasts as the reformulation of learner errors as they occur naturally and spontaneously in the course of communicative interaction. Using a within-group research design and a new testing methodology (video-based stimulated correction posttest), this laboratory study examined whether extensive and spontaneous recasts provided during small-group work were beneficial to adult L2 learners. Participants were 26 ESL learners, who were divided into seven small groups (3-5 students per group), and each group participated in an oral activity with a teacher. During the activity, the students received incidental and extensive recasts to half of their errors; the other half of their errors received no feedback. Students' ability to detect and correct their errors was assessed using two types of tests: a stimulated correction test (a video-based computer test) and a written test. Students' reaction time on the error detection portion of the stimulated correction task was also measured. The results showed that students were able to detect more errors in error+recast episodes (an error followed by the provision of a recast) than in error-recast episodes (an error with no recast provided), though this difference did not reach statistical significance. They were also able to successfully and partially successfully correct more errors in error+recast episodes than in error-recast episodes, and this difference was statistically significant on the written test. The reaction time results also point towards a benefit from recasts, as students were able to complete the task slightly more quickly for error+recast episodes than for error-recast episodes.

  8. Orbit IMU alignment: Error analysis

    Science.gov (United States)

    Corson, R. W.

    1980-01-01

    A comprehensive accuracy analysis of orbit inertial measurement unit (IMU) alignments using the shuttle star trackers was completed and the results are presented. Monte Carlo techniques were used in a computer simulation of the IMU alignment hardware and software systems to: (1) determine the expected Space Transportation System 1 Flight (STS-1) manual mode IMU alignment accuracy; (2) investigate the accuracy of alignments in later shuttle flights when the automatic mode of star acquisition may be used; and (3) verify that an analytical model previously used for estimating the alignment error is a valid model. The analysis results do not differ significantly from expectations. The standard deviation in the IMU alignment error for STS-1 alignments was determined to be 68 arc seconds per axis. This corresponds to a 99.7% probability that the magnitude of the total alignment error is less than 258 arc seconds.
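
    The quoted relationship between the per-axis standard deviation and the total-error bound can be checked with a quick Monte Carlo experiment. This is our own illustrative check, not the original simulation: draw three independent Gaussian axis errors with a 68 arc second standard deviation and count how often the error magnitude stays below 258 arc seconds.

```python
# Monte Carlo check (illustrative) of the quoted figures: with sigma = 68
# arc seconds per axis, the 3D error magnitude should be below 258 arc
# seconds roughly 99.7% of the time.
import math
import random

random.seed(1)
SIGMA, BOUND, N = 68.0, 258.0, 100_000
within = 0
for _ in range(N):
    ex = random.gauss(0.0, SIGMA)
    ey = random.gauss(0.0, SIGMA)
    ez = random.gauss(0.0, SIGMA)
    if math.sqrt(ex * ex + ey * ey + ez * ez) < BOUND:
        within += 1
print(within / N)  # close to 0.997
```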

  9. Negligence, genuine error, and litigation

    Directory of Open Access Journals (Sweden)

    Sohn DH

    2013-02-01

    Full Text Available David H Sohn, Department of Orthopedic Surgery, University of Toledo Medical Center, Toledo, OH, USA. Abstract: Not all medical injuries are the result of negligence. In fact, most medical injuries are the result either of the inherent risk in the practice of medicine, or of system errors, which cannot be prevented simply through fear of disciplinary action. This paper will discuss the differences between adverse events, negligence, and system errors; the current medical malpractice tort system in the United States; and review current and future solutions, including medical malpractice reform, alternative dispute resolution, health courts, and no-fault compensation systems. The current political environment favors investigation of non-cap tort reform remedies; investment in more rational oversight systems, such as health courts or no-fault systems, may reap both quantitative and qualitative benefits for a less costly and safer health system. Keywords: medical malpractice, tort reform, no-fault compensation, alternative dispute resolution, system errors

  10. Large errors and severe conditions

    CERN Document Server

    Smith, D L; Van Wormer, L A

    2002-01-01

    Physical parameters that can assume real-number values over a continuous range are generally represented by inherently positive random variables. However, if the uncertainties in these parameters are significant (large errors), conventional means of representing and manipulating the associated variables can lead to erroneous results. Instead, all analyses involving them must be conducted in a probabilistic framework. Several issues must be considered: First, non-linear functional relations between primary and derived variables may lead to significant 'error amplification' (severe conditions). Second, the commonly used normal (Gaussian) probability distribution must be replaced by a more appropriate function that avoids the occurrence of negative sampling results. Third, both primary random variables and those derived through well-defined functions must be dealt with entirely in terms of their probability distributions. Parameter 'values' and 'errors' should be interpreted as specific moments of these probability distributions.
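
    Two of the abstract's points can be illustrated with a short simulation of our own (the numbers and the cubic relation are assumptions, not from the paper): a Gaussian model of an inherently positive quantity with 50% relative uncertainty produces negative samples, a lognormal model with the same median never does, and a non-linear derived variable shows the "error amplification" effect as a much larger relative spread.

```python
# Illustration: Gaussian vs lognormal models of a positive parameter, and
# error amplification through a non-linear relation y = x**3.
import math
import random

random.seed(7)
median, sigma, N = 1.0, 0.5, 50_000            # 50% relative uncertainty

# Gaussian model: some samples of the inherently positive quantity go negative.
gauss_neg = sum(random.gauss(median, sigma) <= 0 for _ in range(N))

# Lognormal model with the same median stays strictly positive.
x = [random.lognormvariate(math.log(median), sigma) for _ in range(N)]

def rel_spread(v):
    """Sample coefficient of variation (std / mean)."""
    m = sum(v) / len(v)
    return math.sqrt(sum((u - m) ** 2 for u in v) / len(v)) / m

y = [u ** 3 for u in x]                        # non-linear derived variable
print(gauss_neg / N, rel_spread(x), rel_spread(y))
```

    The derived variable's relative spread is several times that of the primary variable, which is the "severe conditions" regime where moment-based shortcuts break down.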

  11. Redundant measurements for controlling errors

    Energy Technology Data Exchange (ETDEWEB)

    Ehinger, M. H.; Crawford, J. M.; Madeen, M. L.

    1979-07-01

    Current federal regulations for nuclear materials control require consideration of operating data as part of the quality control program and limits of error propagation. Recent work at the BNFP has revealed that operating data are subject to a number of measurement problems which are very difficult to detect and even more difficult to correct in a timely manner. Thus error estimates based on operational data reflect those problems. During the FY 1978 and FY 1979 R and D demonstration runs at the BNFP, redundant measurement techniques were shown to be effective in detecting these problems to allow corrective action. The net effect is a reduction in measurement errors and a significant increase in measurement sensitivity. Results show that normal operation process control measurements, in conjunction with routine accountability measurements, are sensitive problem indicators when incorporated in a redundant measurement program.

  12. New night vision goggle gain definition

    Science.gov (United States)

    Podobedov, Vyacheslav B.; Eppeldauer, George P.; Larason, Thomas C.

    2015-05-01

    A new definition is proposed for the calibration of Night Vision Goggle (NVG) gains. This definition is based on the measurement of radiometric input and output quantities of the NVG. While the old definition used the "equivalent fL", which is a non-SI-traceable luminance unit, the new definition utilizes radiance quantities that are traceable to the SI units through NIST standards. The new NVG gain matches the previous one as a result of the application of a correction coefficient originating from the conversion of radiance to luminance units. The new definition was tested at the NIST Night Vision Calibration Facility and the measurement results were compared to the data obtained with a Hoffman Test Set Model ANV-126. Comparing the radiometric quantities of the Hoffman Test Set with those measured by the NIST transfer standard radiometer indicates that the observed differences of up to 15% were due to the calibration and experimental errors of the ANV-126 Test Set. In view of the different spectral characteristics of luminophores that can be utilized in NVG design, the simulation of the NVG output for gain measurement was performed. The NVG output was simulated with a sphere-based source using different LEDs and the measured gain was compared to that obtained with the ANV-126 internal luminance meter. The NVG gain uncertainty analysis was performed for Type A, B, and C goggles.

  13. Toward a cognitive taxonomy of medical errors.

    OpenAIRE

    Zhang, Jiajie; Patel, Vimla L.; Johnson, Todd R.; Shortliffe, Edward H.

    2002-01-01

    One critical step in addressing and resolving the problems associated with human errors is the development of a cognitive taxonomy of such errors. In the case of errors, such a taxonomy may be developed (1) to categorize all types of errors along cognitive dimensions, (2) to associate each type of error with a specific underlying cognitive mechanism, (3) to explain why, and even predict when and where, a specific error will occur, and (4) to generate intervention strategies for each type of error.

  14. Robust Quantum Error Correction via Convex Optimization

    CERN Document Server

    Kosut, R L; Lidar, D A

    2007-01-01

    Quantum error correction procedures have traditionally been developed for specific error models, and are not robust against uncertainty in the errors. Using a semidefinite program optimization approach, we find high-fidelity quantum error correction procedures whose encoding and recovery operations remain effective against significant uncertainty in the error system. We present numerical examples for 3-, 5-, and 7-qubit codes. Our approach requires as input a description of the error channel, which can be provided via quantum process tomography.

  15. Errors depending on costs in sample surveys

    OpenAIRE

    Marella, Daniela

    2007-01-01

    "This paper presents a total survey error model that simultaneously treats sampling error, nonresponse error and measurement error. The main aim of developing the model is to determine the optimal allocation of the available resources for total survey error reduction. More precisely, the paper is concerned with obtaining the best possible accuracy in a survey estimate through an overall economic balance between sampling and nonsampling error." (author's abstract)
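
    The allocation idea can be sketched with a toy cost model of our own (the functional forms and all numbers are assumptions, not the paper's model): a fixed budget is split between the number of interviews and the per-interview measurement effort, since more effort lowers measurement-error variance but raises the cost per interview and so reduces the affordable sample size.

```python
# Toy total-survey-error allocation: grid-search the per-interview effort q
# that minimizes the total variance of the mean under a fixed budget.
def total_variance(q, budget=10_000.0, base_cost=5.0, effort_cost=1.0,
                   sampling_var=4.0, meas_var_scale=8.0):
    n = budget / (base_cost + effort_cost * q)   # interviews we can afford
    return (sampling_var + meas_var_scale / q) / n

qs = [0.25 * i for i in range(1, 201)]           # candidate effort levels
best_q = min(qs, key=total_variance)
print(best_q, total_variance(best_q))
```

    The optimum sits at an interior effort level: spending nothing on measurement quality and spending everything on it are both worse than a balanced split, which is the "overall economic balance" the abstract describes.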

  16. Definition of postprandial lipaemia

    DEFF Research Database (Denmark)

    Kolovou, Genovefa D; Mikhailidis, Dimitri P; Nordestgaard, Børge G

    2011-01-01

    At the present time, there is no widely agreed definition of postprandial lipaemia (PPL). This lack of a shared definition limits the identification and treatment of patients with exaggerated PPL as well as the evaluation of potential therapeutic agents. PPL is a complex syndrome characterized by non-fasting hypertriglyceridaemia that is associated with an increased risk of vascular events. This review considers the definition of PPL and the methodology for assessing this process.

  17. Error-tolerant Tree Matching

    CERN Document Server

    Oflazer, K

    1996-01-01

    This paper presents an efficient algorithm for retrieving, from a database of trees, all trees that match a given query tree approximately, that is, within a certain error tolerance. It has natural language processing applications in searching for matches in example-based translation systems, and in retrieval from lexical databases containing entries of complex feature structures. The algorithm has been implemented on SparcStations, and for large randomly generated synthetic tree databases (some having tens of thousands of trees) it can associatively search for trees with a small error in a matter of tenths of a second to a few seconds.
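
    The retrieval task can be sketched with a deliberately naive tree distance of our own (not the paper's algorithm): trees are (label, children) tuples, the distance charges one unit per relabel, insert, or delete with children aligned in order, and a search returns every stored tree within the tolerance.

```python
# Naive error-tolerant tree retrieval: aligned-children edit distance.
import functools

def size(t):
    return 1 + sum(size(c) for c in t[1])

@functools.lru_cache(maxsize=None)
def dist(a, b):
    """Edit distance with in-order alignment of child subtrees."""
    ca, cb = a[1], b[1]
    # Classic sequence-alignment DP over the two child sequences; deleting
    # or inserting a child costs the size of its whole subtree.
    d = [[0] * (len(cb) + 1) for _ in range(len(ca) + 1)]
    for i in range(1, len(ca) + 1):
        d[i][0] = d[i - 1][0] + size(ca[i - 1])
    for j in range(1, len(cb) + 1):
        d[0][j] = d[0][j - 1] + size(cb[j - 1])
    for i in range(1, len(ca) + 1):
        for j in range(1, len(cb) + 1):
            d[i][j] = min(d[i - 1][j] + size(ca[i - 1]),
                          d[i][j - 1] + size(cb[j - 1]),
                          d[i - 1][j - 1] + dist(ca[i - 1], cb[j - 1]))
    return (a[0] != b[0]) + d[len(ca)][len(cb)]

def search(db, query, tolerance):
    return [t for t in db if dist(t, query) <= tolerance]

db = [("S", (("NP", ()), ("VP", ()))),
      ("S", (("NP", ()), ("VP", (("PP", ()),)))),
      ("X", (("Y", ()), ("Z", ()), ("W", ())))]
print(search(db, ("S", (("NP", ()), ("VP", ()))), tolerance=1))
```

    With tolerance 1 the exact match and the near-match (one extra PP node) are returned, while the structurally different third tree is rejected; the tuples are hashable so subtree distances are memoized.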

  18. Immediate error correction process following sleep deprivation

    National Research Council Canada - National Science Library

    HSIEH, SHULAN; CHENG, I‐CHEN; TSAI, LING‐LING

    2007-01-01

    ...) participated in this study. Participants performed a modified letter flanker task and were instructed to make immediate error corrections on detecting performance errors. Event‐related potentials (ERPs...

  19. Frequency and Type of Situational Awareness Errors Contributing to Death and Brain Damage: A Closed Claims Analysis.

    Science.gov (United States)

    Schulz, Christian M; Burden, Amanda; Posner, Karen L; Mincer, Shawn L; Steadman, Randolph; Wagner, Klaus J; Domino, Karen B

    2017-08-01

    Situational awareness errors may play an important role in the genesis of patient harm. The authors examined closed anesthesia malpractice claims for death or brain damage to determine the frequency and type of situational awareness errors. Surgical and procedural anesthesia death and brain damage claims in the Anesthesia Closed Claims Project database were analyzed. Situational awareness error was defined as failure to perceive relevant clinical information, failure to comprehend the meaning of available information, or failure to project, anticipate, or plan. Patient and case characteristics, primary damaging events, and anesthesia payments in claims with situational awareness errors were compared to other death and brain damage claims from 2002 to 2013. Anesthesiologist situational awareness errors contributed to death or brain damage in 198 of 266 claims (74%). Respiratory system damaging events were more common in claims with situational awareness errors (56%) than in other claims (21%, P < 0.001). Payments were made in a larger share of situational awareness error claims, compared to 46% in other claims (P = 0.001), with no significant difference in payment size. Among the 198 claims with anesthesia situational awareness error, perception errors were most common (42%), whereas comprehension errors (29%) and projection errors (29%) were relatively less common. Situational awareness error definitions were operationalized for reliable application to real-world anesthesia cases. Situational awareness errors may have contributed to catastrophic outcomes in three quarters of recent anesthesia malpractice claims. Situational awareness errors resulting in death or brain damage remain prevalent causes of malpractice claims in the 21st century.

  20. The error of our ways

    Science.gov (United States)

    Swartz, Clifford E.

    1999-10-01

    In Victorian literature it was usually some poor female who came to see the error of her ways. How prescient of her! How I wish that all writers of manuscripts for The Physics Teacher would come to similar recognition of this centerpiece of measurement. For, Brothers and Sisters, we all err.

  1. Measurement error in geometric morphometrics.

    Science.gov (United States)

    Fruciano, Carmelo

    2016-06-01

    Geometric morphometrics, a set of methods for the statistical analysis of shape once saluted as a revolutionary advancement in the analysis of morphology, is now mature and routinely used in ecology and evolution. However, a factor often disregarded in empirical studies is the presence and the extent of measurement error. This is potentially a very serious issue because random measurement error can inflate the amount of variance and, since many statistical analyses are based on the amount of "explained" relative to "residual" variance, can result in loss of statistical power. On the other hand, systematic bias can affect statistical analyses by biasing the results (i.e. variation due to bias is incorporated in the analysis and treated as biologically-meaningful variation). Here, I briefly review common sources of error in geometric morphometrics. I then review the most commonly used methods to measure and account for both random and non-random measurement error, providing a worked example using a real dataset.
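
    The two failure modes the review distinguishes can be demonstrated with a small simulation of our own (all values invented for illustration): random measurement error inflates the pooled variance and so shrinks a real group difference measured in standard-deviation units, while a systematic bias on one group masquerades as, or here cancels, biological signal.

```python
# Random vs systematic measurement error on a simulated group comparison.
import random
import statistics

random.seed(3)
group_a = [10.0 + random.gauss(0, 1) for _ in range(500)]   # true values
group_b = [11.0 + random.gauss(0, 1) for _ in range(500)]

def sep(a, b):
    """Group-mean difference in pooled-standard-deviation units."""
    pooled = statistics.pstdev(a + b)
    return abs(statistics.mean(a) - statistics.mean(b)) / pooled

noisy_a = [v + random.gauss(0, 2) for v in group_a]          # random error
noisy_b = [v + random.gauss(0, 2) for v in group_b]
biased_a = [v + 1.0 for v in group_a]                        # systematic bias

print(sep(group_a, group_b), sep(noisy_a, noisy_b), sep(biased_a, group_b))
```

    The noisy comparison retains the true difference in the means but loses most of its standardized effect size (lower power), whereas the biased comparison reports almost no difference at all even though the underlying biology is unchanged.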

  2. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  3. Having Fun with Error Analysis

    Science.gov (United States)

    Siegel, Peter

    2007-01-01

    We present a fun activity that can be used to introduce students to error analysis: the M&M game. Students are told to estimate the number of individual candies plus uncertainty in a bag of M&M's. The winner is the group whose estimate brackets the actual number with the smallest uncertainty. The exercise produces enthusiastic discussions and…

  4. Typical errors of ESP users

    Science.gov (United States)

    Eremina, Svetlana V.; Korneva, Anna A.

    2004-07-01

    The paper presents an analysis of the errors made by ESP (English for specific purposes) users which have been considered typical. They occur as a result of misuse of the resources of English grammar and tend to persist. Their origin and places of occurrence are also discussed.

  5. Theory of Test Translation Error

    Science.gov (United States)

    Solano-Flores, Guillermo; Backhoff, Eduardo; Contreras-Nino, Luis Angel

    2009-01-01

    In this article, we present a theory of test translation whose intent is to provide the conceptual foundation for effective, systematic work in the process of test translation and test translation review. According to the theory, translation error is multidimensional; it is not simply the consequence of defective translation but an inevitable fact…

  6. A brief history of error.

    Science.gov (United States)

    Murray, Andrew W

    2011-10-03

    The spindle checkpoint monitors chromosome alignment on the mitotic and meiotic spindle. When the checkpoint detects errors, it arrests progress of the cell cycle while it attempts to correct the mistakes. This perspective will present a brief history summarizing what we know about the checkpoint, and a list of questions we must answer before we understand it.

  7. Error processing in Huntington's disease.

    Directory of Open Access Journals (Sweden)

    Christian Beste

    Full Text Available BACKGROUND: Huntington's disease (HD) is a genetic disorder expressed by a degeneration of the basal ganglia, inter alia accompanied by dopaminergic alterations. These dopaminergic alterations are related to genetic factors, i.e., CAG-repeat expansion. The error-related negativity (Ne/ERN), a cognitive event-related potential related to performance monitoring, is generated in the anterior cingulate cortex (ACC) and supposed to depend on the dopaminergic system. The Ne is reduced in Parkinson's disease (PD). Due to a dopaminergic deficit in HD, a reduction of the Ne is also likely. Furthermore, it is assumed that movement dysfunction emerges as a consequence of dysfunctional error-feedback processing. Since dopaminergic alterations are related to the CAG-repeat, a Ne reduction may furthermore also be related to the genetic disease load. METHODOLOGY/PRINCIPAL FINDINGS: We assessed the error negativity (Ne) in a speeded reaction task under consideration of the underlying genetic abnormalities. HD patients showed a specific reduction in the Ne, which suggests impaired error processing in these patients. Furthermore, the Ne was closely related to CAG-repeat expansion. CONCLUSIONS/SIGNIFICANCE: The reduction of the Ne is likely to be an effect of the dopaminergic pathology. The result resembles findings in Parkinson's disease. As such, the Ne might be a measure for the integrity of striatal dopaminergic output function. The relation to the CAG-repeat expansion indicates that the Ne could serve as a gene-associated "cognitive" biomarker in HD.

  8. Learner Corpora without Error Tagging

    Directory of Open Access Journals (Sweden)

    Rastelli, Stefano

    2009-01-01

    Full Text Available The article explores the possibility of adopting a form-to-function perspective when annotating learner corpora in order to get deeper insights about systematic features of interlanguage. A split between forms and functions (or categories) is desirable in order to avoid the "comparative fallacy" and because, especially in basic varieties, forms may precede functions (e.g., what resembles a "noun" might have a different function), or a function may show up in unexpected forms. In the computer-aided error analysis tradition, all items produced by learners are traced to a grid of error tags which is based on the categories of the target language. Differently, we believe it is possible to record and make retrievable both words and sequences of characters independently of their functional-grammatical label in the target language. For this purpose, at the University of Pavia we adapted a probabilistic POS tagger designed for L1 to L2 data. Despite the criticism that this operation can raise, we found that it is better to work with "virtual categories" rather than with errors. The article outlines the theoretical background of the project and shows some examples in which some potential of SLA-oriented (non-error-based) tagging will possibly be made clearer.

  9. Input/output error analyzer

    Science.gov (United States)

    Vaughan, E. T.

    1977-01-01

    Program aids in equipment assessment. This independent assembly-language utility program is designed to operate under level 27 or 31 of the EXEC 8 Operating System. It scans user-selected portions of the system log file, whether located on tape or mass storage, and searches for and processes I/O error (type 6) entries.

  10. Amplify Errors to Minimize Them

    Science.gov (United States)

    Stewart, Maria Shine

    2009-01-01

    In this article, the author offers her experience of modeling mistakes and writing spontaneously in the computer classroom to get students' attention and elicit their editorial response. She describes how she taught her class about major sentence errors--comma splices, run-ons, and fragments--through her Sentence Meditation exercise, a rendition…

  11. Error and its meaning in forensic science.

    Science.gov (United States)

    Christensen, Angi M; Crowder, Christian M; Ousley, Stephen D; Houck, Max M

    2014-01-01

    The discussion of "error" has gained momentum in forensic science in the wake of the Daubert guidelines and has intensified with the National Academy of Sciences' Report. Error has many different meanings, and too often, forensic practitioners themselves as well as the courts misunderstand scientific error and statistical error rates, often confusing them with practitioner error (or mistakes). Here, we present an overview of these concepts as they pertain to forensic science applications, discussing the difference between practitioner error (including mistakes), instrument error, statistical error, and method error. We urge forensic practitioners to ensure that potential sources of error and method limitations are understood and clearly communicated and advocate that the legal community be informed regarding the differences between interobserver errors, uncertainty, variation, and mistakes.

  12. Discussing harm-causing errors with patients: an ethics primer for plastic surgeons.

    Science.gov (United States)

    Vercler, Christian J; Buchman, Steven R; Chung, Kevin C

    2015-02-01

    Plastic surgery is a field that demands perfection, yet despite our best efforts errors occur every day. Most errors are minor, but occasionally patients are harmed by our mistakes. Although there is a strong ethical requirement for full disclosure of medical errors, data suggest that surgeons have a difficult time disclosing errors and apologizing. "Conventional wisdom" has been to avoid frank discussion of errors with patients. This concept is fueled by the fear of litigation and the notion that any expression of apology leads to malpractice suits. Recently, there has been an increase in the literature pointing to the inadequacy of this approach. Policies that require disclosure of harm-causing medical errors to the patient and the family, apology, and an offer of compensation cultivate the transparency necessary for quality improvement efforts as well as the positive moral development of trainees. There is little published in the plastic surgery literature regarding error disclosure to provide guidance to practitioners. In this article, we will review the ethical, therapeutic, and practical issues involved in discussing the error with the patient and apologizing by presenting a representative case. This primer will provide an understanding of the definition of medical error, the ethical support of error disclosure, the barriers to disclosure, and how to overcome those barriers.

  13. Polynomial theory of error correcting codes

    CERN Document Server

    Cancellieri, Giovanni

    2015-01-01

    The book offers an original view on channel coding, based on a unitary approach to block and convolutional codes for error correction. It presents both new concepts and new families of codes. For example, lengthened and modified lengthened cyclic codes are introduced as a bridge towards time-invariant convolutional codes and their extension to time-varying versions. The novel families of codes include turbo codes and low-density parity check (LDPC) codes, the features of which are justified from the structural properties of the component codes. Design procedures for regular LDPC codes are proposed, supported by the presented theory. Quasi-cyclic LDPC codes, in block or convolutional form, represent one of the most original contributions of the book. The use of more than 100 examples allows the reader gradually to gain an understanding of the theory, and the provision of a list of more than 150 definitions, indexed at the end of the book, permits rapid location of sought information.

  14. Analytical method for coupled transmission error of helical gear system with machining errors, assembly errors and tooth modifications

    Science.gov (United States)

    Lin, Tengjiao; He, Zeyin

    2017-07-01

    We present a method for analyzing the transmission error of a helical gear system with errors. First, a finite element method is used to model the gear transmission system with machining errors, assembly errors and tooth modifications, and the static transmission error is obtained. Then the bending-torsional-axial coupling dynamic model of the transmission system, based on the lumped mass method, is established and the dynamic transmission error of the gear transmission system is calculated, which provides error excitation data for the analysis and control of vibration and noise of the gear system.

  15. Axiomatic definition of valid 3D parcels, potentially in a space partition

    NARCIS (Netherlands)

    Thompson, R.J.; Van Oosterom, P.J.M.

    2011-01-01

    The definition of a valid 3D parcel must be correct and unambiguous, because an error or ambiguity in the definition of the extent of a property can lead to expensive legal disputes, to problems with handling 3D parcels in information systems, or to problems during data transfer between two systems.

  16. Engineering Definitional Interpreters

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Ramsay, Norman; Larsen, Bradford

    2013-01-01

    A definitional interpreter should be clear and easy to write, but it may run 4--10 times slower than a well-crafted bytecode interpreter. In a case study focused on implementation choices, we explore ways of making definitional interpreters faster without expending much programming effort. We imp...

  19. Productivity of Stream Definitions

    NARCIS (Netherlands)

    Endrullis, Jörg; Grabmayer, Clemens; Hendriks, Dimitri; Isihara, Ariya; Klop, Jan

    2007-01-01

    We give an algorithm for deciding productivity of a large and natural class of recursive stream definitions. A stream definition is called ‘productive’ if it can be evaluated continuously in such a way that a uniquely determined stream is obtained as the limit. Whereas productivity is undecidable
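
    The productivity notion can be illustrated informally with an example of our own: a stream definition is productive when every next element arrives after finitely many unfolding steps. Below, streams are modelled as (head, tail-thunk) pairs; the encodings of `ones` and `alt` are assumptions made for this sketch, not the paper's formalism.

```python
# Productive stream definitions modelled as (head, tail_thunk) pairs.

def ones():                      # productive: ones = 1 : ones
    return (1, ones)

def alt():                       # productive: alt = 0 : 1 : alt
    return (0, lambda: (1, alt))

def take(n, mk_stream):
    """Force the first n elements by repeatedly unfolding the tail thunk."""
    out, s = [], mk_stream()
    for _ in range(n):
        head, tail = s
        out.append(head)
        s = tail()
    return out

print(take(5, ones), take(6, alt))
```

    By contrast, a definition such as `bad = tail(bad)` is not productive: forcing its first element would unfold forever without ever producing output, which is exactly the behaviour a productivity check must rule out.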

  20. Productivity of stream definitions

    NARCIS (Netherlands)

    Endrullis, J.; Grabmayer, C.A.; Hendriks, D.; Isihara, A.; Klop, J.W.

    2008-01-01

    We give an algorithm for deciding productivity of a large and natural class of recursive stream definitions. A stream definition is called ‘productive’ if it can be evaluated continually in such a way that a uniquely determined stream in constructor normal form is obtained as the limit. Whereas prod

  1. Quantum Errors and Disturbances: Response to Busch, Lahti and Werner

    Directory of Open Access Journals (Sweden)

    David Marcus Appleby

    2016-05-01

    Full Text Available Busch, Lahti and Werner (BLW) have recently criticized the operator approach to the description of quantum errors and disturbances. Their criticisms are justified to the extent that the physical meaning of the operator definitions has not hitherto been adequately explained. We rectify that omission. We then examine BLW’s criticisms in the light of our analysis. We argue that, although the approach BLW favour (based on the Wasserstein 2-deviation) has its uses, there are important physical situations where an operator approach is preferable. We also discuss the reason why the error-disturbance relation is still giving rise to controversies almost a century after Heisenberg first stated his microscope argument. We argue that the source of the difficulties is the problem of interpretation, which is not so wholly disconnected from experimental practicalities as is sometimes supposed.

  2. Genetic algorithm-based evaluation of spatial straightness error

    Institute of Scientific and Technical Information of China (English)

    崔长彩; 车仁生; 黄庆成; 叶东; 陈刚

    2003-01-01

    A genetic algorithm (GA)-based approach is proposed to evaluate the straightness error of spatial lines. According to the mathematical definition of spatial straightness, a verification model is established for the straightness error, the fitness function of the GA is given, and the implementation techniques of the proposed algorithm are discussed in detail. These techniques include real-number encoding, adaptive variable-range choosing, roulette-wheel and elitist combination selection strategies, and heuristic crossover and single-point mutation schemes. An application example is quoted to validate the proposed algorithm. The computation results show that the GA-based approach is a superior nonlinear parallel optimization method: the performance of the evolution population can be improved through genetic operations such as reproduction, crossover and mutation until the optimum goal of the minimum zone solution is obtained. The quality of the solution is better and the efficiency of computation is higher than with other methods.
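
    As a rough illustration of the ingredients this record lists (real-number encoding, roulette-wheel and elitist selection, crossover, mutation), here is a minimal GA sketch that estimates the minimum-zone straightness error of a set of 3D points. The line parametrization and all GA settings below are our own assumptions, not taken from the paper.

```python
import math
import random

def line_distance(p, params):
    """Distance from 3D point p to the line y = b1 + m1*x, z = b2 + m2*x,
    i.e. the line through (0, b1, b2) with direction (1, m1, m2)."""
    b1, m1, b2, m2 = params
    dx, dy, dz = p[0], p[1] - b1, p[2] - b2
    cx = dy * m2 - dz * m1
    cy = dz - dx * m2
    cz = dx * m1 - dy
    return math.sqrt(cx*cx + cy*cy + cz*cz) / math.sqrt(1 + m1*m1 + m2*m2)

def straightness(points, params):
    """Minimum-zone objective: the largest point-to-line distance."""
    return max(line_distance(p, params) for p in points)

def ga_straightness(points, pop=60, gens=80, seed=1):
    rng = random.Random(seed)
    population = [[rng.uniform(-0.5, 0.5) for _ in range(4)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda c: straightness(points, c))
        elite = population[:2]                       # elitist selection
        fits = [1.0 / (1e-9 + straightness(points, c)) for c in population]
        total = sum(fits)
        def pick():                                  # roulette-wheel selection
            r, acc = rng.uniform(0, total), 0.0
            for c, f in zip(population, fits):
                acc += f
                if acc >= r:
                    return c
            return population[-1]
        children = list(elite)
        while len(children) < pop:
            a, b = pick(), pick()
            w = rng.random()                         # arithmetic crossover
            child = [w * x + (1 - w) * y for x, y in zip(a, b)]
            child[rng.randrange(4)] += rng.gauss(0, 0.02)  # single-point mutation
            children.append(child)
        population = children
    best = min(population, key=lambda c: straightness(points, c))
    return straightness(points, best)

# Synthetic test data: points on the x-axis with small perturbations.
rng = random.Random(0)
pts = [(x, 0.01 * rng.uniform(-1, 1), 0.01 * rng.uniform(-1, 1)) for x in range(11)]
err = ga_straightness(pts)
print(err)
```

    With noise of magnitude 0.01, the estimated minimum-zone straightness error comes out on the order of the noise level.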

  3. Quantum Errors and Disturbances: Response to Busch, Lahti and Werner

    Science.gov (United States)

    Appleby, David

    2016-05-01

    Busch, Lahti and Werner (BLW) have recently criticized the operator approach to the description of quantum errors and disturbances. Their criticisms are justified to the extent that the physical meaning of the operator definitions has not hitherto been adequately explained. We rectify that omission. We then examine BLW's criticisms in the light of our analysis. We argue that, although the approach BLW favour (based on the Wasserstein 2-deviation) has its uses, there are important physical situations where an operator approach is preferable. We also discuss the reason why the error-disturbance relation is still giving rise to controversies almost a century after Heisenberg first stated his microscope argument. We argue that the source of the difficulties is the problem of interpretation, which is not so wholly disconnected from experimental practicalities as is sometimes supposed.

  4. Definition of intractable epilepsy.

    Science.gov (United States)

    Sinha, Shobhit; Siddiqui, Khurram A

    2011-01-01

    Defining intractable epilepsy is essential not only to identify the up to 40% of patients refractory to pharmacological management, but also to facilitate the selection and comparison of such patients for research purposes. The ideal definition still eludes us. Multiple factors, including the number of antiepileptic drug (AED) failures, seizure frequency and duration of unresponsiveness, etiology, and epilepsy syndromes, are considered in formulating the definition of pharmaco-resistant epilepsy. Most definitions used in the literature agree on the number of AED failures, which seems to be 2 or 3; however, the seizure frequency and time factors vary. The International League Against Epilepsy proposed a definition of drug-resistant epilepsy as a failure of adequate trials of 2 tolerated and appropriately chosen and used AED schedules. This, for now, could provide an operational definition for clinical and research settings. However, with the emergence of new data and novel treatments, the criteria for intractability may change.

  5. Space Saving Statistics: An Introduction to Constant Error, Variable Error, and Absolute Error.

    Science.gov (United States)

    Guth, David

    1990-01-01

    Article discusses research on orientation and mobility (O&M) for individuals with visual impairments, examining constant, variable, and absolute error (descriptive statistics that quantify fundamentally different characteristics of distributions of spatially directed behavior). It illustrates the statistics with examples, noting their…

  6. Discretization vs. Rounding Error in Euler's Method

    Science.gov (United States)

    Borges, Carlos F.

    2011-01-01

    Euler's method for solving initial value problems is an excellent vehicle for observing the relationship between discretization error and rounding error in numerical computation. Reductions in stepsize, in order to decrease discretization error, necessarily increase the number of steps and so introduce additional rounding error. The problem is…
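
    The trade-off this record describes can be observed numerically. The sketch below (our own illustration) applies Euler's method to y' = y, y(0) = 1, whose exact solution at t = 1 is e; halving the stepsize roughly halves the discretization error, while at extremely small stepsizes rounding error would eventually dominate.

```python
import math

def euler_error(h):
    """Global error at t = 1 of Euler's method applied to y' = y, y(0) = 1."""
    y = 1.0
    n = round(1.0 / h)
    for _ in range(n):
        y += h * y          # Euler step: y_{k+1} = y_k + h * f(t_k, y_k)
    return abs(y - math.e)

for h in (0.1, 0.05, 0.025, 0.0125):
    print(h, euler_error(h))
```

    For this problem the global error behaves as O(h), so each halving of h roughly halves the error, at the cost of twice as many steps (and twice as many rounding operations).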

  7. Discretization vs. Rounding Error in Euler's Method

    Science.gov (United States)

    Borges, Carlos F.

    2011-01-01

    Euler's method for solving initial value problems is an excellent vehicle for observing the relationship between discretization error and rounding error in numerical computation. Reductions in stepsize, in order to decrease discretization error, necessarily increase the number of steps and so introduce additional rounding error. The problem is…

  8. Correction of errors in power measurements

    DEFF Research Database (Denmark)

    Pedersen, Knud Ole Helgesen

    1998-01-01

    Small errors in voltage and current measuring transformers cause inaccuracies in power measurements. In this report correction factors are derived to compensate for such errors.
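
    As a sketch of the kind of correction involved (our own simplified first-order model, not the report's derivation): if the voltage and current transformers scale their outputs by (1 + εU) and (1 + εI) and shift the phases by δU and δI radians, the measured power differs from the true power by a computable factor that can be divided out.

```python
import math

def corrected_power(p_measured, phi, eps_u, eps_i, delta_u, delta_i):
    """Correct a power measurement for instrument-transformer errors.

    Assumed model (for illustration only): the measured voltage and current
    magnitudes are scaled by (1 + eps_u) and (1 + eps_i), and the measured
    phase angle is shifted by (delta_u - delta_i), so that
        P_meas = (1 + eps_u) * (1 + eps_i) * U * I * cos(phi + delta_u - delta_i).
    Dividing by the resulting factor recovers the true power U * I * cos(phi).
    """
    k = ((1 + eps_u) * (1 + eps_i)
         * math.cos(phi + delta_u - delta_i) / math.cos(phi))
    return p_measured / k
```

    With zero transformer errors the correction factor is exactly 1 and the measurement is returned unchanged.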

  9. Error Analysis of Band Matrix Method

    OpenAIRE

    Taniguchi, Takeo; Soga, Akira

    1984-01-01

    Numerical error in the solution of the band matrix method based on the elimination method in single precision is investigated theoretically and experimentally, and the behaviour of the truncation error and the roundoff error is clarified. Some important suggestions for the useful application of the band solver are proposed by using the results of above error analysis.

  10. Error Correction in Oral Classroom English Teaching

    Science.gov (United States)

    Jing, Huang; Xiaodong, Hao; Yu, Liu

    2016-01-01

    As is known to all, errors are inevitable in the process of language learning for Chinese students. Should we ignore students' errors in learning English? In common with other questions, different people hold different opinions. All teachers agree that errors students make in written English are not allowed. For the errors students make in oral…

  11. 5 CFR 1601.34 - Error correction.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Error correction. 1601.34 Section 1601.34... Contribution Allocations and Interfund Transfer Requests § 1601.34 Error correction. Errors in processing... in the wrong investment fund, will be corrected in accordance with the error correction...

  12. STRUCTURED BACKWARD ERRORS FOR STRUCTURED KKT SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Xin-xiu Li; Xin-guo Liu

    2004-01-01

    In this paper we study structured backward errors for some structured KKT systems. Normwise structured backward errors for structured KKT systems are defined, and computable formulae of the structured backward errors are obtained. Simple numerical examples show that the structured backward errors may be much larger than the unstructured ones in some cases.
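
    For contrast with the structured case studied in this record, the ordinary (unstructured) normwise backward error of an approximate solution x̃ of Ax = b has the classical computable form η(x̃) = ‖b − Ax̃‖ / (‖A‖‖x̃‖ + ‖b‖). A small sketch, using the Frobenius norm for simplicity:

```python
import math

def norm2(v):
    return math.sqrt(sum(x * x for x in v))

def fro(A):
    return math.sqrt(sum(x * x for row in A for x in row))

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def backward_error(A, x, b):
    """Unstructured normwise backward error of x as a solution of A x = b:
        eta(x) = ||b - A x|| / (||A|| * ||x|| + ||b||)."""
    r = [bi - ri for bi, ri in zip(b, matvec(A, x))]
    return norm2(r) / (fro(A) * norm2(x) + norm2(b))

A = [[2.0, 1.0], [1.0, 3.0]]
b = [3.0, 4.0]
x_exact = [1.0, 1.0]            # A @ [1, 1] = [3, 4]
x_approx = [1.0, 1.001]
print(backward_error(A, x_exact, b))    # zero residual, zero backward error
print(backward_error(A, x_approx, b))   # small but nonzero
```

    A structured backward error additionally constrains the perturbations of A and b to preserve the KKT block structure, which is why it can be much larger than this unstructured quantity.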

  13. Magnetospheric Multiscale (MMS) Mission Commissioning Phase Orbit Determination Error Analysis

    Science.gov (United States)

    Chung, Lauren R.; Novak, Stefan; Long, Anne; Gramling, Cheryl

    2009-01-01

    The Magnetospheric MultiScale (MMS) mission commissioning phase starts in a 185 km altitude x 12 Earth radii (RE) injection orbit and lasts until the Phase 1 mission orbits and orientation to the Earth-Sun line are achieved. During a limited time period in the early part of commissioning, five maneuvers are performed to raise the perigee radius to 1.2 RE, with a maneuver every other apogee. The current baseline is for the Goddard Space Flight Center Flight Dynamics Facility to provide MMS orbit determination support during the early commissioning phase using all available two-way range and Doppler tracking from both the Deep Space Network and Space Network. This paper summarizes the results from a linear covariance analysis to determine the type and amount of tracking data required to accurately estimate the spacecraft state, plan each perigee raising maneuver, and support thruster calibration during this phase. The primary focus of this study is the navigation accuracy required to plan the first and the final perigee raising maneuvers. Absolute and relative position and velocity error histories are generated for all cases and summarized in terms of the maximum root-sum-square consider and measurement noise error contributions over the definitive and predictive arcs and at discrete times including the maneuver planning and execution times. Details of the methodology, orbital characteristics, maneuver timeline, error models, and error sensitivities are provided.

  14. Small-Sample Error Estimation for Bagged Classification Rules

    Science.gov (United States)

    Vu, T. T.; Braga-Neto, U. M.

    2010-12-01

    Application of ensemble classification rules in genomics and proteomics has become increasingly common. However, the problem of error estimation for these classification rules, particularly for bagging under the small-sample settings prevalent in genomics and proteomics, is not well understood. Breiman proposed the "out-of-bag" method for estimating statistics of bagged classifiers, which was subsequently applied by other authors to estimate the classification error. In this paper, we give an explicit definition of the out-of-bag estimator that is intended to remove estimator bias, by formulating carefully how the error count is normalized. We also report the results of an extensive simulation study of bagging of common classification rules, including LDA, 3NN, and CART, applied on both synthetic and real patient data, corresponding to the use of common error estimators such as resubstitution, leave-one-out, cross-validation, basic bootstrap, bootstrap 632, bootstrap 632 plus, bolstering, semi-bolstering, in addition to the out-of-bag estimator. The results from the numerical experiments indicated that the performance of the out-of-bag estimator is very similar to that of leave-one-out; in particular, the out-of-bag estimator is slightly pessimistically biased. The performance of the other estimators is consistent with their performance with the corresponding single classifiers, as reported in other studies.
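
    The out-of-bag idea can be sketched in a few lines (a toy illustration with a 1-nearest-neighbour base classifier of our own choosing, not the authors' experimental setup): each training point is predicted only by those bagged classifiers whose bootstrap sample did not contain it, and the error count is normalized over the points that receive at least one such out-of-bag vote.

```python
import random

def nn1(train, x):
    """1-nearest-neighbour prediction on 1-D data; train is [(value, label)]."""
    return min(train, key=lambda p: abs(p[0] - x))[1]

def oob_error(data, n_bags=25, seed=0):
    rng = random.Random(seed)
    n = len(data)
    bags = [[rng.randrange(n) for _ in range(n)] for _ in range(n_bags)]
    errors = counted = 0
    for i, (x, y) in enumerate(data):
        # Votes come only from classifiers whose bootstrap sample excludes point i.
        votes = [nn1([data[j] for j in bag], x) for bag in bags if i not in bag]
        if votes:
            counted += 1
            majority = max(set(votes), key=votes.count)
            if majority != y:
                errors += 1
    # Normalize over the points that actually received out-of-bag votes.
    return errors / counted

rng = random.Random(1)
data = [(rng.gauss(0.0, 0.3), 0) for _ in range(20)] + \
       [(rng.gauss(2.0, 0.3), 1) for _ in range(20)]
print(oob_error(data))
```

    On this well-separated synthetic data the out-of-bag estimate is near zero; the paper's point is precisely that how the error count is normalized (here, over `counted`) determines the estimator's bias.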

  15. Small-Sample Error Estimation for Bagged Classification Rules

    Directory of Open Access Journals (Sweden)

    Vu TT

    2010-01-01

    Full Text Available Application of ensemble classification rules in genomics and proteomics has become increasingly common. However, the problem of error estimation for these classification rules, particularly for bagging under the small-sample settings prevalent in genomics and proteomics, is not well understood. Breiman proposed the "out-of-bag" method for estimating statistics of bagged classifiers, which was subsequently applied by other authors to estimate the classification error. In this paper, we give an explicit definition of the out-of-bag estimator that is intended to remove estimator bias, by formulating carefully how the error count is normalized. We also report the results of an extensive simulation study of bagging of common classification rules, including LDA, 3NN, and CART, applied on both synthetic and real patient data, corresponding to the use of common error estimators such as resubstitution, leave-one-out, cross-validation, basic bootstrap, bootstrap 632, bootstrap 632 plus, bolstering, semi-bolstering, in addition to the out-of-bag estimator. The results from the numerical experiments indicated that the performance of the out-of-bag estimator is very similar to that of leave-one-out; in particular, the out-of-bag estimator is slightly pessimistically biased. The performance of the other estimators is consistent with their performance with the corresponding single classifiers, as reported in other studies.

  16. Chinese Translation Errors in English/Chinese Bilingual Children's Picture Books

    Science.gov (United States)

    Huang, Qiaoya; Chen, Xiaoning

    2012-01-01

    The aim of this study was to review the Chinese translation errors in 31 English/Chinese bilingual children's picture books. While bilingual children's books make definite contributions to language acquisition, few studies have examined the quality of these books, and even fewer have specifically focused on English/Chinese bilingual books.…

  17. Tracing Error-Related Knowledge in Interview Data: Negative Knowledge in Elder Care Nursing

    Science.gov (United States)

    Gartmeier, Martin; Gruber, Hans; Heid, Helmut

    2010-01-01

    This paper empirically investigates elder care nurses' negative knowledge. This form of experiential knowledge is defined as the outcome of error-related learning processes, focused on how something is not, on what not to do in certain situations or on deficits in one's knowledge or skills. Besides this definition, we presume the existence of…

  18. Chinese Translation Errors in English/Chinese Bilingual Children's Picture Books

    Science.gov (United States)

    Huang, Qiaoya; Chen, Xiaoning

    2012-01-01

    The aim of this study was to review the Chinese translation errors in 31 English/Chinese bilingual children's picture books. While bilingual children's books make definite contributions to language acquisition, few studies have examined the quality of these books, and even fewer have specifically focused on English/Chinese bilingual books.…

  19. Managing human error in aviation.

    Science.gov (United States)

    Helmreich, R L

    1997-05-01

    Crew resource management (CRM) programs were developed to address team and leadership aspects of piloting modern airplanes. The goal is to reduce errors through team work. Human factors research and social, cognitive, and organizational psychology are used to develop programs tailored for individual airlines. Flight crews study accident case histories, group dynamics, and human error. Simulators provide pilots with the opportunity to solve complex flight problems. CRM in the simulator is called line-oriented flight training (LOFT). In automated cockpits CRM promotes the idea of automation as a crew member. Cultural aspects of aviation include professional, business, and national culture. The aviation CRM model has been adapted for training surgeons and operating room staff in human factors.

  20. Robot learning and error correction

    Science.gov (United States)

    Friedman, L.

    1977-01-01

    A model of robot learning is described that associates previously unknown perceptions with the sensed known consequences of robot actions. For these actions, both the categories of outcomes and the corresponding sensory patterns are incorporated in a knowledge base by the system designer. Thus the robot is able to predict the outcome of an action and compare the expectation with the experience. New knowledge about what to expect in the world may then be incorporated by the robot into a pre-existing structure whenever it detects accordance or discrepancy between a predicted consequence and experience. Errors committed during plan execution are detected by the same type of comparison process, and learning may be applied to avoiding the errors.
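
    The predict-compare-update cycle this record describes can be sketched abstractly (our own minimal rendering; the class and names are illustrative, not the paper's system):

```python
class LearningRobot:
    """Minimal sketch of expectation-based learning: a knowledge base maps
    actions to expected outcomes; after acting, the robot compares its
    expectation with experience and updates the knowledge base on discrepancy."""

    def __init__(self, knowledge):
        self.knowledge = dict(knowledge)   # action -> expected outcome

    def act(self, action, observed_outcome):
        expected = self.knowledge.get(action)
        if expected == observed_outcome:
            return "accordance"            # prediction confirmed
        # Discrepancy detected: incorporate the new expectation.
        self.knowledge[action] = observed_outcome
        return "discrepancy"

robot = LearningRobot({"push": "object moves"})
print(robot.act("push", "object moves"))    # accordance
print(robot.act("push", "object topples"))  # discrepancy; knowledge updated
print(robot.knowledge["push"])
```

    Error detection during plan execution uses the same comparison: an outcome that contradicts the prediction is both an execution error and a learning opportunity.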

  1. Manson’s triple error

    Directory of Open Access Journals (Sweden)

    Delaporte F.

    2008-09-01

    Full Text Available The author discusses the significance, implications and limitations of Manson’s work. How did Patrick Manson resolve some of the major problems raised by the filarial worm life cycle? The Amoy physician showed that circulating embryos could only leave the blood via the percutaneous route, thereby requiring a bloodsucking insect. The discovery of a new autonomous, airborne, active host undoubtedly had a considerable impact on the history of parasitology, but the way in which Manson formulated and solved the problem of the transfer of filarial worms from the body of the mosquito to man resulted in failure. This article shows how the epistemological transformation operated by Manson was indissociably related to a series of errors and how a major breakthrough can be the result of a series of false proposals and, consequently, that the history of truth often involves a history of error.

  2. Offset Error Compensation in Roundness Measurement

    Institute of Scientific and Technical Information of China (English)

    朱喜林; 史俊; 李晓梅

    2004-01-01

    This paper analyses three causes of offset error in roundness measurement and presents corresponding compensation methods. The causes of offset error include excursion error, resulting from the deflection of the sensor's line of measurement from the rotational center in measurement (datum center); eccentricity error, resulting from the variance between the workpiece's geometrical center and the rotational center; and tilt error, resulting from the tilt between the workpiece's geometrical axes and the rotational centerline.

  3. Faktor Penyebab Medication Error di Instalasi Rawat Darurat (Factors Affecting Medication Errors at the Emergency Unit)

    OpenAIRE

    2014-01-01

    Background: The incidence of medication errors is an important indicator in patient safety, and medication errors are the most common medical errors. However, most medication errors can be prevented, and efforts to reduce such errors are available. Due to the high number of medication errors in the emergency unit, understanding of their causes is important for designing successful interventions. This research aims to identify the types and causes of medication errors. Method: A qualitative study was used and data were col...

  4. Error-resilient DNA computation

    Energy Technology Data Exchange (ETDEWEB)

    Karp, R.M.; Kenyon, C.; Waarts, O. [Univ. of California, Berkeley, CA (United States)

    1996-12-31

    The DNA model of computation, with test tubes of DNA molecules encoding bit sequences, is based on three primitives, Extract-A-Bit, which splits a test tube into two test tubes according to the value of a particular bit x, Merge-Two-Tubes and Detect-Emptiness. Perfect operations can test the satisfiability of any boolean formula in linear time. However, in reality the Extract operation is faulty; it misclassifies a certain proportion of the strands. We consider the following problem: given an algorithm based on perfect Extract, Merge and Detect operations, convert it to one that works correctly with high probability when the Extract operation is faulty. The fundamental problem in such a conversion is to construct a sequence of faulty Extracts and perfect Merges that simulates a highly reliable Extract operation. We first determine (up to a small constant factor) the minimum number of faulty Extract operations inherently required to simulate a highly reliable Extract operation. We then go on to derive a general method for converting any algorithm based on error-free operations to an error-resilient one, and give optimal error-resilient algorithms for realizing simple n-variable boolean functions such as Conjunction, Disjunction and Parity.
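
    The core difficulty, building a reliable Extract from faulty ones, can be illustrated with a small Monte Carlo sketch. This is our own simplification: we model a strand's repeated faulty classifications and take a majority vote, which mimics the effect of composing faulty Extracts with perfect Merges; it is not the paper's optimal construction.

```python
import random

def faulty_extract(bit, eps, rng):
    """Classify a strand by bit x, misclassifying with probability eps."""
    return bit if rng.random() >= eps else 1 - bit

def resilient_extract(bit, eps, rng, k=5):
    """Majority vote over k faulty classifications (k odd)."""
    votes = sum(faulty_extract(bit, eps, rng) for _ in range(k))
    return 1 if votes > k // 2 else 0

rng = random.Random(42)
eps, n = 0.2, 5000
raw = sum(faulty_extract(1, eps, rng) != 1 for _ in range(n)) / n
maj = sum(resilient_extract(1, eps, rng) != 1 for _ in range(n)) / n
print(raw, maj)   # roughly 0.2 vs. well under 0.1
```

    With eps = 0.2 and k = 5, the majority misclassifies only when at least 3 of 5 classifications fail, so the error rate drops from about 0.2 to about 0.06.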

  5. The definition of cross polarization

    DEFF Research Database (Denmark)

    Ludwig, Arthur

    1973-01-01

    There are at least three different definitions of cross polarization used in the literature. The alternative definitions are discussed with respect to several applications, and the definition which corresponds to one standard measurement practice is proposed as the best choice.

  6. $r$-Tuple Error Functions and Indefinite Theta Series of Higher-Depth

    CERN Document Server

    Nazaroglu, Caner

    2016-01-01

    Theta functions for definite signature lattices constitute a rich source of modular forms. A natural question is then their generalization to indefinite signature lattices. One way to ensure a convergent theta series while keeping the holomorphicity property of definite signature theta series is to restrict the sum over lattice points to a proper subset. Although such series do not have the modular properties that a definite signature theta function has, as shown by Zwegers for signature $(1,n-1)$ lattices, they can be completed to a function that has these modular properties by compromising on the holomorphicity property in a certain way. This construction has recently been generalized to signature $(2,n-2)$ lattices by Alexandrov, Banerjee, Manschot, and Pioline. A crucial ingredient in this work is the notion of double error functions which naturally lends itself to generalizations to higher dimensions. In this work we study the properties of such higher dimensional error functions which we will call $r$-t...

  7. From Slovene into English: Identifying Definiteness

    Directory of Open Access Journals (Sweden)

    Frančiška Lipovšek

    2008-06-01

    Full Text Available The paper addresses some typical instances of the translator’s failure to recognize definite reference in Slovene, which, in turn, results in an inappropriate determiner selection in English. It is argued that errors of this kind are ascribable not solely to the fact that the Slovene determiner system lacks an overt non-selective determiner parallel to the definite article, but to the relatively scarce use of overt determiners in general. Since definiteness is typically signalled by an anaphoric relation, some factors are explored that may help identify textual co-reference despite the absence of explicit anaphoric markers. Besides the translator’s inability to recognize the given phrase as anaphoric, two other major causes of inappropriate determiner selection are discussed: the misconception that the absence of an anaphoric relation entails indefiniteness and the translator’s misinterpreting an anaphoric expression as an ascriptive, non-referential entity. The second part of the paper focuses on the difference in use between the selective demonstrative pronoun and the non-selective definite article.

  8. Foundations of Coding Theory and Applications of Error-Correcting Codes with an Introduction to Cryptography and Information Theory

    CERN Document Server

    Adamek, Jiri

    1991-01-01

    Although devoted to constructions of good codes for error control, secrecy or data compression, the emphasis is on the first direction. Introduces a number of important classes of error-detecting and error-correcting codes as well as their decoding methods. Background material on modern algebra is presented where required. The role of error-correcting codes in modern cryptography is treated as are data compression and other topics related to information theory. The definition-theorem proof style used in mathematics texts is employed through the book but formalism is avoided wherever possible.

  9. Righting errors in writing errors: the Wing and Baddeley (1980) spelling error corpus revisited.

    Science.gov (United States)

    Wing, Alan M; Baddeley, Alan D

    2009-03-01

    We present a new analysis of our previously published corpus of handwriting errors (slips) using the proportional allocation algorithm of Machtynger and Shallice (2009). As previously, the proportion of slips is greater in the middle of the word than at the ends; however, in contrast to before, the proportion is greater at the end than at the beginning of the word. The findings are consistent with the hypothesis of memory effects in a graphemic output buffer.

  10. Effects of Listening Conditions, Error Types, and Ensemble Textures on Error Detection Skills

    Science.gov (United States)

    Waggoner, Dori T.

    2011-01-01

    This study was designed with three main purposes: (a) to investigate the effects of two listening conditions on error detection accuracy, (b) to compare error detection responses for rhythm errors and pitch errors, and (c) to examine the influences of texture on error detection accuracy. Undergraduate music education students (N = 18) listened to…

  11. SENSITIVE ERROR ANALYSIS OF CHAOS SYNCHRONIZATION

    Institute of Scientific and Technical Information of China (English)

    HUANG XIAN-GAO, XU JIAN-XUE, HUANG WEI, LÜ ZE-JUN

    2001-01-01

    We study the synchronizing sensitive errors of chaotic systems for adding other signals to the synchronizing signal. Based on the model of the Henon map masking, we examine the cause of the sensitive errors of chaos synchronization. The modulation ratio and the mean square error are defined to measure the synchronizing sensitive errors by quality. Numerical simulation results of the synchronizing sensitive errors are given for masking direct current, sinusoidal and speech signals, separately. Finally, we give the mean square error curves of chaos synchronizing sensitivity and three-dimensional phase plots of the drive system and the response system for masking the three kinds of signals.

  12. Error signals driving locomotor adaptation

    DEFF Research Database (Denmark)

    Choi, Julia T; Jensen, Peter; Nielsen, Jens Bo

    2016-01-01

    perturbations. Forces were applied to the ankle joint during the early swing phase using an electrohydraulic ankle-foot orthosis. Repetitive 80 Hz electrical stimulation was applied to disrupt cutaneous feedback from the superficial peroneal nerve (foot dorsum) and medial plantar nerve (foot sole) during...... anaesthesia (n = 5) instead of repetitive nerve stimulation. Foot anaesthesia reduced ankle adaptation to external force perturbations during walking. Our results suggest that cutaneous input plays a role in force perception, and may contribute to the 'error' signal involved in driving walking adaptation when...

  13. (Errors in statistical tests)³

    Directory of Open Access Journals (Sweden)

    Kaufman Jay S

    2008-07-01

    Full Text Available Abstract In 2004, Garcia-Berthou and Alcaraz published "Incongruence between test statistics and P values in medical papers," a critique of statistical errors that received a tremendous amount of attention. One of their observations was that the final reported digit of p-values in articles published in the journal Nature departed substantially from the uniform distribution that they suggested should be expected. In 2006, Jeng critiqued that critique, observing that the statistical analysis of those terminal digits had been based on comparing the actual distribution to a uniform continuous distribution, when digits obviously are discretely distributed. Jeng corrected the calculation and reported statistics that did not so clearly support the claim of a digit preference. However delightful it may be to read a critique of statistical errors in a critique of statistical errors, we nevertheless found several aspects of the whole exchange to be quite troubling, prompting our own meta-critique of the analysis. The previous discussion emphasized statistical significance testing. But there are various reasons to expect departure from the uniform distribution in terminal digits of p-values, so that simply rejecting the null hypothesis is not terribly informative. Much more importantly, Jeng found that the original p-value of 0.043 should have been 0.086, and suggested this represented an important difference because it was on the other side of 0.05. Among the most widely reiterated (though often ignored) tenets of modern quantitative research methods is that we should not treat statistical significance as a bright line test of whether we have observed a phenomenon. Moreover, it sends the wrong message about the role of statistics to suggest that a result should be dismissed because of limited statistical precision when it is so easy to gather more data. In response to these limitations, we gathered more data to improve the statistical precision, and
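
    The discreteness point can be made concrete: terminal digits take ten discrete values, so the natural test compares observed digit counts against a discrete uniform expectation using a chi-square statistic on 9 degrees of freedom. A minimal sketch, with hypothetical counts rather than the actual Nature data:

```python
def chi_square_uniform(counts):
    """Chi-square statistic for observed digit counts against a discrete
    uniform distribution over len(counts) categories (df = len(counts) - 1)."""
    n = sum(counts)
    expected = n / len(counts)
    return sum((c - expected) ** 2 / expected for c in counts)

# Hypothetical terminal-digit counts for digits 0..9 (n = 1000, expected = 100).
counts = [104, 98, 110, 95, 102, 99, 107, 93, 96, 96]
stat = chi_square_uniform(counts)
print(stat)  # compare against the df = 9 critical value, 16.92 at alpha = 0.05
```

    Comparing against a continuous uniform distribution instead, as in the original critique, mis-states the reference distribution and can flip a borderline p-value, which is exactly the correction Jeng made.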

  14. Errors associated with outpatient computerized prescribing systems

    Science.gov (United States)

    Rothschild, Jeffrey M; Salzberg, Claudia; Keohane, Carol A; Zigmont, Katherine; Devita, Jim; Gandhi, Tejal K; Dalal, Anuj K; Bates, David W; Poon, Eric G

    2011-01-01

    Objective To report the frequency, types, and causes of errors associated with outpatient computer-generated prescriptions, and to develop a framework to classify these errors to determine which strategies have greatest potential for preventing them. Materials and methods This is a retrospective cohort study of 3850 computer-generated prescriptions received by a commercial outpatient pharmacy chain across three states over 4 weeks in 2008. A clinician panel reviewed the prescriptions using a previously described method to identify and classify medication errors. Primary outcomes were the incidence of medication errors; potential adverse drug events, defined as errors with potential for harm; and rate of prescribing errors by error type and by prescribing system. Results Of 3850 prescriptions, 452 (11.7%) contained 466 total errors, of which 163 (35.0%) were considered potential adverse drug events. Error rates varied by computerized prescribing system, from 5.1% to 37.5%. The most common error was omitted information (60.7% of all errors). Discussion About one in 10 computer-generated prescriptions included at least one error, of which a third had potential for harm. This is consistent with the literature on manual handwritten prescription error rates. The number, type, and severity of errors varied by computerized prescribing system, suggesting that some systems may be better at preventing errors than others. Conclusions Implementing a computerized prescribing system without comprehensive functionality and processes in place to ensure meaningful system use does not decrease medication errors. The authors offer targeted recommendations on improving computerized prescribing systems to prevent errors. PMID:21715428

  15. Antenna motion errors in bistatic SAR imagery

    Science.gov (United States)

    Wang, Ling; Yazıcı, Birsen; Cagri Yanik, H.

    2015-06-01

    Antenna trajectory or motion errors are pervasive in synthetic aperture radar (SAR) imaging. Motion errors typically result in smearing and positioning errors in SAR images. Understanding the relationship between the trajectory errors and position errors in reconstructed images is essential in forming focused SAR images. Existing studies on the effect of antenna motion errors are limited to certain geometries, trajectory error models or monostatic SAR configuration. In this paper, we present an analysis of position errors in bistatic SAR imagery due to antenna motion errors. Bistatic SAR imagery is becoming increasingly important in the context of passive imaging and multi-sensor imaging. Our analysis provides an explicit quantitative relationship between the trajectory errors and the positioning errors in bistatic SAR images. The analysis is applicable to arbitrary trajectory errors and arbitrary imaging geometries including wide apertures and large scenes. We present extensive numerical simulations to validate the analysis and to illustrate the results in commonly used bistatic configurations and certain trajectory error models.

  16. Article Errors in the English Writing of Saudi EFL Preparatory Year Students

    Directory of Open Access Journals (Sweden)

    Eid Alhaisoni

    2017-02-01

    Full Text Available This study aims at providing a comprehensive account of the types of errors produced by Saudi EFL students enrolled in the preparatory year programme in their use of articles, based on the Surface Structure Taxonomies (SST) of errors. The study describes the types, frequency and sources of definite and indefinite article errors in writing compositions. Data were collected from written samples of 150 students, who were given one and a half hours to write on one of four different descriptive topics. Analysis of inter-lingual and intra-lingual sources of article errors revealed that the frequency of omitting both the indefinite articles and the definite article was higher than the frequency of inserting one or substituting one article for the other. The study also shows that errors in using ‘a’ were more common than errors in using ‘an’ and ‘the’ in the written texts. This result also indicates that L1 interference strongly influences the process of second language acquisition of the articles, having a negative effect on the learning process. Pedagogical practices, including comparison of article use across both of the learners’ language systems, may improve learners’ ability to use the articles correctly in writing and in the other language skills.

  17. Medication errors: hospital pharmacist perspective.

    Science.gov (United States)

    Guchelaar, Henk-Jan; Colen, Hadewig B B; Kalmeijer, Mathijs D; Hudson, Patrick T W; Teepe-Twiss, Irene M

    2005-01-01

    In recent years medication error has justly received considerable attention, as it causes substantial mortality, morbidity and additional healthcare costs. Risk assessment models, adapted from commercial aviation and the oil and gas industries, are currently being developed for use in clinical pharmacy. The hospital pharmacist is best placed to oversee the quality of the entire drug distribution chain, from prescribing, drug choice, dispensing and preparation to the administration of drugs, and can fulfil a vital role in improving medication safety. Most elements of the drug distribution chain can be optimised; however, because comparative intervention studies are scarce, there is little scientific evidence available demonstrating improvements in medication safety through such interventions. Possible interventions aimed at reducing medication errors, such as developing methods for detection of patients with increased risk of adverse drug events, performing risk assessment in clinical pharmacy and optimising the drug distribution chain are discussed. Moreover, the specific role of the clinical pharmacist in improving medication safety is highlighted, both at an organisational level and in individual patient care.

  18. Cosine tuning minimizes motor errors.

    Science.gov (United States)

    Todorov, Emanuel

    2002-06-01

    Cosine tuning is ubiquitous in the motor system, yet a satisfying explanation of its origin is lacking. Here we argue that cosine tuning minimizes expected errors in force production, which makes it a natural choice for activating muscles and neurons in the final stages of motor processing. Our results are based on the empirically observed scaling of neuromotor noise, whose standard deviation is a linear function of the mean. Such scaling predicts a reduction of net force errors when redundant actuators pull in the same direction. We confirm this prediction by comparing forces produced with one versus two hands and generalize it across directions. Under the resulting neuromotor noise model, we prove that the optimal activation profile is a (possibly truncated) cosine--for arbitrary dimensionality of the workspace, distribution of force directions, correlated or uncorrelated noise, with or without a separate cocontraction command. The model predicts a negative force bias, truncated cosine tuning at low muscle cocontraction levels, and misalignment of preferred directions and lines of action for nonuniform muscle distributions. All predictions are supported by experimental data.
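
    The variance-reduction argument can be checked with a toy simulation. Assuming, as the abstract states, that neuromotor noise has a standard deviation proportional to the mean force (the coefficient k below is made up for illustration), splitting a target force across two redundant actuators reduces the net standard deviation by a factor of the square root of two:

```python
import numpy as np

rng = np.random.default_rng(0)
k = 0.05          # assumed noise coefficient: std = k * mean force
F = 10.0          # target net force
n_trials = 200_000

# One actuator produces the full force with signal-dependent noise.
one = rng.normal(F, k * F, n_trials)

# Two redundant actuators each produce half the force; their independent
# noises are smaller because each mean force is smaller.
two = (rng.normal(F / 2, k * F / 2, n_trials)
       + rng.normal(F / 2, k * F / 2, n_trials))

print(one.std(), two.std())  # net-force std drops by a factor of sqrt(2)
```

    This mirrors the one-hand versus two-hands comparison reported in the abstract: with constant-variance noise the split would bring no benefit, so the reduction is specific to signal-dependent noise.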

  19. Field errors in hybrid insertion devices

    Energy Technology Data Exchange (ETDEWEB)

    Schlueter, R.D. [Lawrence Berkeley Lab., CA (United States)

    1995-02-01

    Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed.

  20. Medical errors: legal and ethical responses.

    Science.gov (United States)

    Dickens, B M

    2003-04-01

    Liability to err is a human, often unavoidable, characteristic. Errors can be classified as skill-based, rule-based, knowledge-based and other errors, such as of judgment. In law, a key distinction is between negligent and non-negligent errors. To describe a mistake as an error of clinical judgment is legally ambiguous, since an error that a physician might have made when acting with ordinary care and the professional skill the physician claims, is not deemed negligent in law. If errors prejudice patients' recovery from treatment and/or future care, in physical or psychological ways, it is legally and ethically required that they be informed of them in appropriate time. Senior colleagues, facility administrators and others such as medical licensing authorities should be informed of serious forms of error, so that preventive education and strategies can be designed. Errors for which clinicians may be legally liable may originate in systemically defective institutional administration.

  1. Experimental demonstration of topological error correction.

    Science.gov (United States)

    Yao, Xing-Can; Wang, Tian-Xiong; Chen, Hao-Ze; Gao, Wei-Bo; Fowler, Austin G; Raussendorf, Robert; Chen, Zeng-Bing; Liu, Nai-Le; Lu, Chao-Yang; Deng, You-Jin; Chen, Yu-Ao; Pan, Jian-Wei

    2012-02-22

    Scalable quantum computing can be achieved only if quantum bits are manipulated in a fault-tolerant fashion. Topological error correction--a method that combines topological quantum computation with quantum error correction--has the highest known tolerable error rate for a local architecture. The technique makes use of cluster states with topological properties and requires only nearest-neighbour interactions. Here we report the experimental demonstration of topological error correction with an eight-photon cluster state. We show that a correlation can be protected against a single error on any quantum bit. Also, when all quantum bits are simultaneously subjected to errors with equal probability, the effective error rate can be significantly reduced. Our work demonstrates the viability of topological error correction for fault-tolerant quantum information processing.

  2. Game Design Principles based on Human Error

    Directory of Open Access Journals (Sweden)

    Guilherme Zaffari

    2016-03-01

    Full Text Available This paper presents the results of the authors’ research on incorporating Human Error, through design principles, into video game design. In general, designers must consider Human Error factors throughout video game interface development; when it comes to core design, however, adaptations are needed, since challenge is an important factor for fun, and from the perspective of Human Error a challenge can be considered a flaw in the system. The research used Human Error classifications, data triangulation via predictive human error analysis, and expanded flow theory to design a set of principles that match the design of playful challenges with the principles of Human Error. From the results, it was possible to conclude that the application of Human Error in game design has a positive effect on player experience, allowing the player to interact only with errors associated with the intended aesthetics of the game.

  3. L’errore nel laboratorio di Microbiologia

    Directory of Open Access Journals (Sweden)

    Paolo Lanzafame

    2006-03-01

    Full Text Available Error management plays one of the most important roles in facility process improvement efforts: by detecting and reducing errors, quality and patient care improve. Error records were analysed over a period of six months, and a second set of records was used to study potential bias in the registrations. The percentage of errors detected was 0.17% (normalised, 1720 ppm), and errors in the pre-analytical phase made up the largest part. The highest error rate was generated by the peripheral centres, which send microbiology tests only occasionally and do not know well the specific procedures for collecting and storing biological samples. Errors in the management of laboratory supplies were reported too. The conclusion is that improving operator training, particularly concerning sample collection and storage, is very important, and that an effective system of error detection should be employed to determine the causes so that the best corrective actions can be applied.

  4. An Error Analysis on TFL Learners’ Writings

    Directory of Open Access Journals (Sweden)

    Arif ÇERÇİ

    2016-12-01

    Full Text Available The main purpose of the present study is to identify and represent TFL learners’ writing errors through error analysis. All the learners started learning Turkish as a foreign language at the A1 (beginner) level and completed the process by obtaining the C1 (advanced) certificate at TÖMER at Gaziantep University. The data of the present study were collected from 14 students’ writings in proficiency exams for each level. The data were grouped as grammatical, syntactic, spelling, punctuation, and word choice errors. The ratio and categorical distributions of the identified errors were analyzed through error analysis. The data were then analyzed through statistical procedures in an effort to determine whether error types differ according to the levels of the students. The errors in this study are limited to linguistic and intralingual developmental errors.

  5. Keyword Query over Error-Tolerant Knowledge Bases

    Institute of Scientific and Technical Information of China (English)

    Yu-Rong Cheng; Ye Yuan; Jia-Yu Li; Lei Chen; Guo-Ren Wang

    2016-01-01

    With more and more knowledge provided by the WWW, querying and mining knowledge bases have attracted much research attention. Among all the queries over knowledge bases, which are usually modelled as graphs, the keyword query is the most widely used one. Although the problem of keyword query over graphs has been studied in depth for years, knowledge bases, as special error-tolerant graphs, cause the results of traditionally defined keyword queries to fall short of users’ satisfaction. Thus, in this paper, we define a new keyword query, called the confident r-clique, specific to knowledge bases and based on the r-clique definition for keyword query on general graphs, which has been proved to be the best one. However, as we prove in the paper, finding the confident r-cliques is #P-hard. We propose a filtering-and-verification framework to improve the search efficiency. In the filtering phase, we develop the tightest upper bound of the confident r-clique and design an index, together with its search algorithm, which suits the large scale of knowledge bases well. In the verification phase, we develop an efficient sampling method to verify the final answers from the candidates remaining after the filtering phase. Extensive experiments demonstrate that the results derived from our new definition satisfy the users’ requirements better than the traditional r-clique definition, and that our algorithms are efficient.
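
    The filtering-and-verification pattern the authors describe can be sketched generically (the paper’s actual upper bound, index, and sampling-based verifier are far more elaborate; all names and scores below are illustrative):

```python
# Generic filter-and-verify sketch: prune candidates with a cheap,
# never-underestimating upper bound, then run the expensive verifier
# only on the survivors.

def filter_and_verify(candidates, upper_bound, verify, threshold):
    """Keep a candidate only if its cheap upper bound can reach the
    threshold; run the expensive verifier on the survivors alone."""
    survivors = [c for c in candidates if upper_bound(c) >= threshold]
    return [c for c in survivors if verify(c) >= threshold]

# Toy usage: verify() gives the "exact" confidence; upper_bound() is a
# cheap admissible overestimate of it.
cands = [0.2, 0.5, 0.7, 0.9]
result = filter_and_verify(
    cands,
    upper_bound=lambda c: c + 0.1,   # admissible overestimate
    verify=lambda c: c,              # expensive exact check
    threshold=0.6,
)
print(result)  # [0.7, 0.9]
```

    The correctness of the scheme hinges on the bound being tight (to prune aggressively) yet admissible (never below the true score), which is why the paper invests in developing the tightest such bound.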

  6. Error Propagation in a System Model

    Science.gov (United States)

    Schloegel, Kirk (Inventor); Bhatt, Devesh (Inventor); Oglesby, David V. (Inventor); Madl, Gabor (Inventor)

    2015-01-01

    Embodiments of the present subject matter can enable the analysis of signal value errors for system models. In an example, signal value errors can be propagated through the functional blocks of a system model to analyze possible effects as the signal value errors impact incident functional blocks. This propagation of the errors can be applicable to many models of computation including avionics models, synchronous data flow, and Kahn process networks.
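
    The propagation idea can be illustrated with a minimal sketch (the block types and the worst-case additive error model below are assumptions for illustration, not the patented system’s actual semantics):

```python
# Hypothetical sketch: propagate worst-case additive error bounds through
# a small dataflow of functional blocks.

def gain(err_in, k):
    """A gain block scales both the signal and its error bound by |k|."""
    return abs(k) * err_in

def add(err_a, err_b):
    """A sum block: worst-case additive error bounds add."""
    return err_a + err_b

# A source signal arrives with a 0.1-unit error bound; trace it through
# two gain branches that are then summed.
e_src = 0.1
e_out = add(gain(e_src, 3.0), gain(e_src, -2.0))
print(e_out)  # 0.5
```

    Real system models would also propagate errors through nonlinear and stateful blocks, where interval or distributional error models replace this simple worst-case arithmetic.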

  7. Experimental demonstration of topological error correction

    OpenAIRE

    2012-01-01

    Scalable quantum computing can only be achieved if qubits are manipulated fault-tolerantly. Topological error correction - a novel method which combines topological quantum computing and quantum error correction - possesses the highest known tolerable error rate for a local architecture. This scheme makes use of cluster states with topological properties and requires only nearest-neighbour interactions. Here we report the first experimental demonstration of topological error correction with a...

  8. Sampling error of observation impact statistics

    OpenAIRE

    Kim, Sung-Min; Kim, Hyun Mee

    2014-01-01

    An observation impact is an estimate of the forecast error reduction by assimilating observations with numerical model forecasts. This study compares the sampling errors of the observation impact statistics (OBIS) of July 2011 and January 2012 using two methods. One method uses the random error under the assumption that the samples are independent, and the other method uses the error with lag correlation under the assumption that the samples are correlated with each other. The OBIS are obtain...

  9. Errors in veterinary practice: preliminary lessons for building better veterinary teams.

    Science.gov (United States)

    Kinnison, T; Guile, D; May, S A

    2015-11-14

    Case studies in two typical UK veterinary practices were undertaken to explore teamwork, including interprofessional working. Each study involved one week of whole team observation based on practice locations (reception, operating theatre), one week of shadowing six focus individuals (veterinary surgeons, veterinary nurses and administrators) and a final week consisting of semistructured interviews regarding teamwork. Errors emerged as a finding of the study. The definition of errors was inclusive, pertaining to inputs or omitted actions with potential adverse outcomes for patients, clients or the practice. The 40 identified instances could be grouped into clinical errors (dosing/drugs, surgical preparation, lack of follow-up), lost item errors, and most frequently, communication errors (records, procedures, missing face-to-face communication, mistakes within face-to-face communication). The qualitative nature of the study allowed the underlying cause of the errors to be explored. In addition to some individual mistakes, system faults were identified as a major cause of errors. Observed examples and interviews demonstrated several challenges to interprofessional teamworking which may cause errors, including: lack of time, part-time staff leading to frequent handovers, branch differences and individual veterinary surgeon work preferences. Lessons are drawn for building better veterinary teams and implications for Disciplinary Proceedings considered.

  10. Piecewise compensation for the nonlinear error of fiber-optic gyroscope scale factor

    Science.gov (United States)

    Zhang, Yonggang; Wu, Xunfeng; Yuan, Shun; Wu, Lei

    2013-08-01

    The nonlinear error of the Fiber-Optic Gyroscope (FOG) scale factor results in errors in a Strapdown Inertial Navigation System (SINS). In order to reduce the nonlinear error of the FOG scale factor in SINS, a compensation method is proposed in this paper based on piecewise curve fitting of the FOG output. Firstly, the causes of FOG scale factor error are introduced and a definition of the degree of nonlinearity is provided. We then introduce a method that divides the output range of the FOG into several small pieces and performs curve fitting within each piece to obtain the scale factor parameters. Different scale factor parameters are used in different pieces to improve the precision of the FOG output. These parameters are identified using a three-axis turntable, and the nonlinear error of the FOG scale factor can thereby be reduced. Finally, a three-axis swing experiment of the SINS verifies that the proposed method can reduce the attitude output errors of the SINS by compensating the nonlinear error of the FOG scale factor, improving the precision of navigation. The experimental results also demonstrate that the compensation scheme is easy to implement and can effectively compensate the nonlinear error of the FOG scale factor with only slightly increased computational complexity. This method can be used in FOG-based inertial technology to improve precision.
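
    A minimal sketch of the piecewise idea, with an assumed quadratic nonlinearity standing in for real turntable calibration data (none of the numbers below come from the paper):

```python
import numpy as np

# Simulated calibration: true rate vs. raw FOG output with a mild
# quadratic scale-factor nonlinearity (assumed form for illustration).
raw = np.linspace(-100.0, 100.0, 201)
true = 1.0 * raw + 1e-4 * raw**2

# Divide the output range into 4 pieces and fit a linear scale-factor
# model (np.polyfit) separately in each piece.
edges = np.linspace(-100.0, 100.0, 5)
params = []
for lo, hi in zip(edges[:-1], edges[1:]):
    m = (raw >= lo) & (raw <= hi)
    params.append((lo, hi, np.polyfit(raw[m], true[m], 1)))

def compensate(x):
    """Compensate a reading with the parameters of the piece it falls in."""
    for lo, hi, p in params:
        if lo <= x <= hi:
            return np.polyval(p, x)
    raise ValueError("reading outside calibrated range")

x = 37.0
print(abs(compensate(x) - (x + 1e-4 * x**2)))  # small residual
```

    Each piece only has to approximate a short arc of the nonlinear curve, so a low-order fit per piece leaves a much smaller residual than one fit across the whole output range.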

  11. Acoustic Evidence for Phonologically Mismatched Speech Errors

    Science.gov (United States)

    Gormley, Andrea

    2015-01-01

    Speech errors are generally said to accommodate to their new phonological context. This accommodation has been validated by several transcription studies. The transcription methodology is not the best choice for detecting errors at this level, however, as this type of error can be difficult to perceive. This paper presents an acoustic analysis of…

  12. Medication errors: the importance of safe dispensing.

    NARCIS (Netherlands)

    Cheung, K.C.; Bouvy, M.L.; Smet, P.A.G.M. de

    2009-01-01

    1. Although rates of dispensing errors are generally low, further improvements in pharmacy distribution systems are still important because pharmacies dispense such high volumes of medications that even a low error rate can translate into a large number of errors. 2. From the perspective of pharmacy

  13. Understanding EFL Students' Errors in Writing

    Science.gov (United States)

    Phuket, Pimpisa Rattanadilok Na; Othman, Normah Binti

    2015-01-01

    Writing is the most difficult skill in English, so most EFL students tend to make errors in writing. In assisting the learners to successfully acquire writing skill, the analysis of errors and the understanding of their sources are necessary. This study attempts to explore the major sources of errors occurred in the writing of EFL students. It…

  14. Error Analysis of Quadrature Rules. Classroom Notes

    Science.gov (United States)

    Glaister, P.

    2004-01-01

    Approaches to the determination of the error in numerical quadrature rules are discussed and compared. This article considers the problem of the determination of errors in numerical quadrature rules, taking Simpson's rule as the principal example. It suggests an approach based on truncation error analysis of numerical schemes for differential…
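
    The O(h^4) truncation error that such an analysis predicts for composite Simpson's rule is easy to verify numerically; halving the step size should reduce the error by roughly a factor of 16 (a standard check, not taken from the article):

```python
import math

def simpson(f, a, b, n):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    s += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return s * h / 3

exact = 2.0  # integral of sin over [0, pi]
e4 = abs(simpson(math.sin, 0, math.pi, 4) - exact)
e8 = abs(simpson(math.sin, 0, math.pi, 8) - exact)
print(e4 / e8)  # error ratio near 16, confirming O(h**4) convergence
```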

  15. Error Analysis in Mathematics. Technical Report #1012

    Science.gov (United States)

    Lai, Cheng-Fei

    2012-01-01

    Error analysis is a method commonly used to identify the cause of student errors when they make consistent mistakes. It is a process of reviewing a student's work and then looking for patterns of misunderstanding. Errors in mathematics can be factual, procedural, or conceptual, and may occur for a number of reasons. Reasons why students make…

  16. Error Analysis and the EFL Classroom Teaching

    Science.gov (United States)

    Xie, Fang; Jiang, Xue-mei

    2007-01-01

    This paper makes a study of error analysis and its implementation in EFL (English as a Foreign Language) classroom teaching. It starts by giving a systematic review of the concepts and theories concerning EA (Error Analysis), and the various reasons causing errors are comprehensively explored. The author proposes that teachers should employ…

  17. Human Error Mechanisms in Complex Work Environments

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1988-01-01

    will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations...

  18. Errors and Uncertainty in Physics Measurement.

    Science.gov (United States)

    Blasiak, Wladyslaw

    1983-01-01

    Classifies errors as either systematic or blunder and uncertainties as either systematic or random. Discusses use of error/uncertainty analysis in direct/indirect measurement, describing the process of planning experiments to ensure lowest possible uncertainty. Also considers appropriate level of error analysis for high school physics students'…
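
    A standard example of uncertainty analysis in an indirect measurement (not taken from the article itself) is density computed from mass and volume, where independent random relative uncertainties combine in quadrature:

```python
import math

# Indirect measurement: density rho = m / V, with independent random
# uncertainties on the directly measured mass and volume (toy values).
m, dm = 25.0, 0.1    # grams
V, dV = 10.0, 0.2    # cm^3

rho = m / V
# For independent random errors, relative uncertainties add in quadrature.
drho = rho * math.sqrt((dm / m) ** 2 + (dV / V) ** 2)
print(rho, drho)  # 2.5 g/cm^3 with roughly +/-0.051 uncertainty
```

    Planning an experiment then amounts to seeing which term dominates: here the volume term (dV/V = 2%) dwarfs the mass term (0.4%), so improving the volume measurement lowers the overall uncertainty the most.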

  19. Measurement error in a single regressor

    NARCIS (Netherlands)

    Meijer, H.J.; Wansbeek, T.J.

    2000-01-01

    For the setting of multiple regression with measurement error in a single regressor, we present some very simple formulas to assess the result that one may expect when correcting for measurement error. It is shown where the corrected estimated regression coefficients and the error variance may lie,

  20. Jonas Olson's Evidence for Moral Error Theory

    NARCIS (Netherlands)

    Evers, Daan

    2016-01-01

    Jonas Olson defends a moral error theory in (2014). I first argue that Olson is not justified in believing the error theory as opposed to moral nonnaturalism in his own opinion. I then argue that Olson is not justified in believing the error theory as opposed to moral contextualism either (although

  1. AWARENESS OF DENTISTS ABOUT MEDICATION ERRORS

    Directory of Open Access Journals (Sweden)

    Sangeetha

    2014-01-01

    Full Text Available OBJECTIVE: To assess the awareness of medication errors among dentists. METHODS: Medication errors are the most common single preventable cause of adverse events in medication practice. We conducted a survey with a sample of sixty dentists, of whom 30 were general dentists (BDS) and 30 were dental specialists (MDS). Questionnaires with questions regarding medication errors were distributed to them, and they were asked to fill in the questionnaire. Data were collected and subjected to statistical analysis using Fisher's exact and chi-square tests. RESULTS: In our study, sixty percent of general dentists and 76.7% of dental specialists were aware of the components of medication error. Overall, 66.7% of the respondents in each group marked wrong duration as the dispensing error. Almost thirty percent of the general dentists and 56.7% of the dental specialists felt that technological advances could accomplish diverse tasks in reducing medication errors; this was of suggestive statistical significance, with a P value of 0.069. CONCLUSION: Medication errors compromise patient confidence in the healthcare system and increase healthcare costs. Overall, the dental specialists were more knowledgeable than the general dentists about medication errors. KEY WORDS: Medication errors; Dosing error; Prevention of errors; Adverse drug events; Prescribing errors; Medical errors.

  2. Error-Compensated Integrate and Hold

    Science.gov (United States)

    Matlin, M.

    1984-01-01

    Differencing circuit cancels error caused by switching transistors capacitance. In integrate and hold circuit using JFET switch, gate-to-source capacitance causes error in output voltage. Differential connection cancels out error. Applications in systems where very low voltages sampled or many integrate-and -hold cycles before circuit is reset.

  4. Human Errors and Bridge Management Systems

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, A. S.

    Human errors are divided into two groups. The first group contains human errors which affect the reliability directly. The second group contains human errors which will not directly affect the reliability of the structure. The methodology used to estimate so-called reliability distributions on ba...

  5. The Problematic of Second Language Errors

    Science.gov (United States)

    Hamid, M. Obaidul; Doan, Linh Dieu

    2014-01-01

    The significance of errors in explicating Second Language Acquisition (SLA) processes led to the growth of error analysis in the 1970s which has since maintained its prominence in English as a second/foreign language (L2) research. However, one problem with this research is errors are often taken for granted, without problematising them and their…

  6. Error estimate for Doo-Sabin surfaces

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Based on a general bound on the distance error between a uniform Doo-Sabin surface and its control polyhedron, an exponential error bound independent of the subdivision process is presented in this paper. Using the exponential bound, one can predict the depth of recursive subdivision of the Doo-Sabin surface within any user-specified error tolerance.
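
    An exponential bound of the form d_k <= M * lambda**k translates directly into a required subdivision depth for a given tolerance (M and lambda below are made-up values, not the paper's actual constants for Doo-Sabin surfaces):

```python
import math

# If the distance between the level-k control polyhedron and the limit
# surface satisfies d_k <= M * lam**k with contraction 0 < lam < 1,
# the depth needed for tolerance eps follows by solving M * lam**k <= eps.

def required_depth(M, lam, eps):
    if M <= eps:
        return 0
    return math.ceil(math.log(eps / M) / math.log(lam))

k = required_depth(M=1.0, lam=0.25, eps=1e-4)
print(k)  # 7
```

    This is exactly the kind of prediction the abstract describes: because the bound is independent of the subdivision process, the depth can be computed once, up front, for any user-specified tolerance.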

  8. Preventing statistical errors in scientific journals.

    NARCIS (Netherlands)

    Nuijten, M.B.

    2016-01-01

    There is evidence for a high prevalence of statistical reporting errors in psychology and other scientific fields. These errors display a systematic preference for statistically significant results, distorting the scientific literature. There are several possible causes for this systematic error pre

  9. Influence of model errors in optimal sensor placement

    Science.gov (United States)

    Vincenzi, Loris; Simonini, Laura

    2017-02-01

    The paper investigates the role of model errors and parametric uncertainties in optimal or near-optimal sensor placement for structural health monitoring (SHM) and modal testing. The near-optimal set of measurement locations is obtained by Information Entropy theory; the results of the placement process depend considerably on the so-called covariance matrix of prediction error, as well as on the definition of the correlation function. A constant and an exponential correlation function, each depending on the distance between sensors, are first assumed; then a proposal depending on both distance and modal vectors is presented. With reference to a simple case study, the effect of model uncertainties on the results is described, and the reliability and robustness of the proposed correlation function in the presence of model errors are tested on 2D and 3D benchmark case studies. A measure of the quality of the obtained sensor configuration is considered through the use of independent assessment criteria. In conclusion, the results obtained by applying the proposed procedure to a real 5-span steel footbridge are described. The proposed method also allows higher modes to be estimated better when the number of sensors is greater than the number of modes of interest. In addition, the results show a smaller variation in the sensor positions when uncertainties occur.
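
    The distance-based exponential correlation model mentioned in the abstract can be sketched as follows (all numerical values are assumed for illustration, not taken from the paper):

```python
import numpy as np

# Prediction-error covariance with an exponential spatial correlation:
# entries decay with the distance between candidate sensor locations.
pos = np.array([0.0, 1.0, 2.5, 4.0])   # candidate sensor coordinates
sigma = 0.1                             # prediction-error std (assumed)
lam = 2.0                               # correlation length (assumed)

dist = np.abs(pos[:, None] - pos[None, :])
Sigma = sigma**2 * np.exp(-dist / lam)  # exponential correlation model
print(Sigma.shape, Sigma[0, 1])
```

    A constant correlation function would replace the exponential with a fixed off-diagonal value; the paper's proposal additionally makes the entries depend on the modal vectors at each pair of locations, not on distance alone.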

  10. Quantum error-correction failure distributions: Comparison of coherent and stochastic error models

    Science.gov (United States)

    Barnes, Jeff P.; Trout, Colin J.; Lucarelli, Dennis; Clader, B. D.

    2017-06-01

    We compare failure distributions of quantum error correction circuits for stochastic errors and coherent errors. We utilize a fully coherent simulation of a fault-tolerant quantum error correcting circuit for a d =3 Steane and surface code. We find that the output distributions are markedly different for the two error models, showing that no simple mapping between the two error models exists. Coherent errors create very broad and heavy-tailed failure distributions. This suggests that they are susceptible to outlier events and that mean statistics, such as pseudothreshold estimates, may not provide the key figure of merit. This provides further statistical insight into why coherent errors can be so harmful for quantum error correction. These output probability distributions may also provide a useful metric that can be utilized when optimizing quantum error correcting codes and decoding procedures for purely coherent errors.
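
    The statistical difference between the two error models can be seen already in a single-qubit toy picture (this is not the paper's Steane/surface-code simulation): n identical coherent over-rotations add in amplitude, so the failure probability grows like (n*theta)**2, whereas n independent stochastic flips add in probability, growing like n*theta**2:

```python
import math

theta, n = 0.01, 10   # small over-rotation per gate and gate count (toy values)

# Coherent: the same small rotation error on every gate adds in amplitude,
# so the flip probability after n gates is sin^2(n * theta).
p_coherent = math.sin(n * theta) ** 2

# Stochastic: independent flips, each with probability sin^2(theta),
# add in probability to first order.
p_stochastic = n * math.sin(theta) ** 2

print(p_coherent / p_stochastic)  # close to n: coherent errors accumulate faster
```

    This quadratic-versus-linear accumulation is one intuition for why coherent errors produce the broad, heavy-tailed failure distributions reported in the abstract.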

  11. Improving ontologies by automatic reasoning and evaluation of logical definitions

    Directory of Open Access Journals (Sweden)

    Köhler Sebastian

    2011-10-01

    Full Text Available Abstract Background Ontologies are widely used to represent knowledge in biomedicine. Systematic approaches for detecting errors and disagreements are needed for large ontologies with hundreds or thousands of terms and semantic relationships. A recent approach of defining terms using logical definitions is now increasingly being adopted as a method for quality control as well as for facilitating interoperability and data integration. Results We show how automated reasoning over logical definitions of ontology terms can be used to improve ontology structure. We provide the Java software package GULO (Getting an Understanding of LOgical definitions), which allows fast and easy evaluation for any kind of logically decomposed ontology by generating a composite OWL ontology from appropriate subsets of the referenced ontologies and comparing the inferred relationships with the relationships asserted in the target ontology. As a case study we show how to use GULO to evaluate the logical definitions that have been developed for the Mammalian Phenotype Ontology (MPO). Conclusions Logical definitions of terms from biomedical ontologies represent an important resource for error and disagreement detection. GULO gives ontology curators a fast and simple tool for validation of their work.

  12. The Definitions of Safety and Security

    Directory of Open Access Journals (Sweden)

    Selçuk Nas

    2015-12-01

    Full Text Available The words "safety" and "security" are often used interchangeably in everyday language, and indeed many dictionaries define them as synonyms. On the other hand, for a long while there has been an attempt to clarify how "security" differs from "safety" in meaning in aviation and maritime transportation. The following definitions have been made in the academic literature in order to distinguish between these two words. In conclusion, the definitions of "safety" and "security" will be used in JEMS articles as stated below. Safety: the state of being away from hazards caused randomly by natural forces or human errors; the source of hazard is formed by natural forces and/or human errors. Security: the state of being away from hazards caused by the deliberate intention of humans to cause harm; the source of hazard is posed by humans deliberately.

  13. Correlated measurement error hampers association network inference.

    Science.gov (United States)

    Kaduk, Mateusz; Hoefsloot, Huub C J; Vis, Daniel J; Reijmers, Theo; van der Greef, Jan; Smilde, Age K; Hendriks, Margriet M W B

    2014-09-01

    Modern chromatography-based metabolomics measurements generate large amounts of data in the form of abundances of metabolites. An increasingly popular way of representing and analyzing such data is by means of association networks. Ideally, such a network can be interpreted in terms of the underlying biology. A property of chromatography-based metabolomics data is that the measurement error structure is complex: apart from the usual (random) instrumental error there is also correlated measurement error. This is intrinsic to the way the samples are prepared and the analyses are performed and cannot be avoided. The impact of correlated measurement errors on (partial) correlation networks can be large and is not always predictable. The interplay between relative amounts of uncorrelated measurement error, correlated measurement error and biological variation defines this impact. Using chromatography-based time-resolved lipidomics data obtained from a human intervention study we show how partial correlation based association networks are influenced by correlated measurement error. We show how the effect of correlated measurement error on partial correlations is different for direct and indirect associations. For direct associations the correlated measurement error usually has no negative effect on the results, while for indirect associations, depending on the relative size of the correlated measurement error, results can become unreliable. The aim of this paper is to generate awareness of the existence of correlated measurement errors and their influence on association networks. Time series lipidomics data is used for this purpose, as it makes it possible to visually distinguish the correlated measurement error from a biological response. Underestimating the phenomenon of correlated measurement error will result in the suggestion of biologically meaningful results that in reality rest solely on complicated error structures. 
Using proper experimental designs that allow
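The mechanism described above can be sketched numerically: a measurement-error component shared by two analytes in the same sample (e.g. from sample preparation) inflates their observed correlation even when the underlying biological signals are independent. The variable names, variances, and sample size below are illustrative assumptions, not values from the study.

```python
import random

random.seed(7)
n = 2000

def pearson(xs, ys):
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Two independent "biological" signals: their true correlation is ~0.
a_true = [random.gauss(0, 1) for _ in range(n)]
b_true = [random.gauss(0, 1) for _ in range(n)]

# A correlated measurement error: the same per-sample disturbance hits both
# metabolites, plus a small uncorrelated instrumental error for each.
shared = [random.gauss(0, 1) for _ in range(n)]
a_obs = [a + s + random.gauss(0, 0.2) for a, s in zip(a_true, shared)]
b_obs = [b + s + random.gauss(0, 0.2) for b, s in zip(b_true, shared)]

print(round(pearson(a_true, b_true), 2))  # near 0
print(round(pearson(a_obs, b_obs), 2))    # strongly positive (~0.5 here)
```

With these variances the observed correlation is roughly 1/2.04 ≈ 0.49, a spurious association created entirely by the correlated error structure.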

  14. Avoiding and identifying errors in health technology assessment models: qualitative study and methodological review.

    Science.gov (United States)

    Chilcott, J; Tappenden, P; Rawdin, A; Johnson, M; Kaltenthaler, E; Paisley, S; Papaioannou, D; Shippam, A

    2010-05-01

    Health policy decisions must be relevant, evidence-based and transparent. Decision-analytic modelling supports this process but its role is reliant on its credibility. Errors in mathematical decision models or simulation exercises are unavoidable but little attention has been paid to processes in model development. Numerous error avoidance/identification strategies could be adopted but it is difficult to evaluate the merits of strategies for improving the credibility of models without first developing an understanding of error types and causes. The study aims to describe the current comprehension of errors in the HTA modelling community and generate a taxonomy of model errors. Four primary objectives are to: (1) describe the current understanding of errors in HTA modelling; (2) understand current processes applied by the technology assessment community for avoiding errors in development, debugging and critically appraising models for errors; (3) use HTA modellers' perceptions of model errors with the wider non-HTA literature to develop a taxonomy of model errors; and (4) explore potential methods and procedures to reduce the occurrence of errors in models. It also describes the model development process as perceived by practitioners working within the HTA community. A methodological review was undertaken using an iterative search methodology. Exploratory searches informed the scope of interviews; later searches focused on issues arising from the interviews. Searches were undertaken in February 2008 and January 2009. In-depth qualitative interviews were performed with 12 HTA modellers from academic and commercial modelling sectors. All qualitative data were analysed using the Framework approach. Descriptive and explanatory accounts were used to interrogate the data within and across themes and subthemes: organisation, roles and communication; the model development process; definition of error; types of model error; strategies for avoiding errors; strategies for

  15. High-Definition Medicine.

    Science.gov (United States)

    Torkamani, Ali; Andersen, Kristian G; Steinhubl, Steven R; Topol, Eric J

    2017-08-24

    The foundation for a new era of data-driven medicine has been set by recent technological advances that enable the assessment and management of human health at an unprecedented level of resolution-what we refer to as high-definition medicine. Our ability to assess human health in high definition is enabled, in part, by advances in DNA sequencing, physiological and environmental monitoring, advanced imaging, and behavioral tracking. Our ability to understand and act upon these observations at equally high precision is driven by advances in genome editing, cellular reprogramming, tissue engineering, and information technologies, especially artificial intelligence. In this review, we will examine the core disciplines that enable high-definition medicine and project how these technologies will alter the future of medicine. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Defining excellence: next steps for practicing clinicians seeking to prevent diagnostic error

    Directory of Open Access Journals (Sweden)

    Paul N. Foster

    2016-09-01

Full Text Available The Institute of Medicine (IOM) released its report on diagnostic errors in September 2015. The report highlights the urgency of reducing errors and calls for system-level intervention and changes in our basic clinical interactions. Using the report's controversial definition of diagnostic error as a starting point, we introduce the issues and the potential impact on practicing physicians. We report a case used to illustrate this in an academic conference. Finally, we turn to the challenge of integrating these ideas into the traditional peer-review process. We argue that the medical community must evolve from understanding diagnostic failures to redesigning the diagnostic process. We should see errors as steps toward diagnostic excellence and reliable processes that minimize the risk of mislabeling and harm.

  17. Model error estimation in ensemble data assimilation

    Directory of Open Access Journals (Sweden)

    S. Gillijns

    2007-01-01

Full Text Available A new methodology is proposed to estimate and account for systematic model error in linear filtering as well as in nonlinear ensemble-based filtering. Our results extend the work of Dee and Todling (2000) on constant bias errors to time-varying model errors. In contrast to existing methodologies, the new filter can also deal with the case where no dynamical model for the systematic error is available. In the latter case, the applicability is limited by a matrix rank condition which has to be satisfied in order for the filter to exist. The performance of the filter developed in this paper is limited by the availability and the accuracy of observations and by the variance of the stochastic model error component. The effect of these aspects on the estimation accuracy is investigated in several numerical experiments using the Lorenz (1996) model. Experimental results indicate that the availability of a dynamical model for the systematic error significantly reduces the variance of the model error estimates, but has only minor effect on the estimates of the system state. The filter is able to estimate additive model error of any type, provided that the rank condition is satisfied and that the stochastic errors and measurement errors are significantly smaller than the systematic errors. The results of this study are encouraging. However, it remains to be seen how the filter performs in more realistic applications.
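A minimal sketch of the bias-estimation idea, in the spirit of the constant-bias case of Dee and Todling rather than the paper's time-varying filter: augment the state with the unknown bias and let a standard Kalman filter estimate both jointly. The scalar model, coefficients, and noise levels are invented for illustration.

```python
import random
random.seed(1)

a, true_bias = 0.9, 0.5          # dynamics coefficient and unknown model bias
q, r = 0.05, 0.1                 # process / measurement noise std devs

# Simulate the "truth" and noisy observations.
x = 0.0
obs = []
for _ in range(500):
    x = a * x + true_bias + random.gauss(0, q)
    obs.append(x + random.gauss(0, r))

# Augmented-state Kalman filter: estimate [x, b] jointly, with b modelled
# as a constant (no process noise on the bias component).
xe, be = 0.0, 0.0
P = [[1.0, 0.0], [0.0, 1.0]]     # error covariance of the augmented state
for y in obs:
    # Predict: x' = a*x + b, b' = b, i.e. F = [[a, 1], [0, 1]]
    xp, bp = a * xe + be, be
    # P' = F P F^T + Q, with Q = [[q^2, 0], [0, 0]]
    p00 = a * (a * P[0][0] + P[1][0]) + (a * P[0][1] + P[1][1]) + q * q
    p01 = a * P[0][1] + P[1][1]
    p10 = a * P[1][0] + P[1][1]
    p11 = P[1][1]
    # Update with H = [1, 0]: only x is observed directly.
    s = p00 + r * r
    k0, k1 = p00 / s, p10 / s
    innov = y - xp
    xe, be = xp + k0 * innov, bp + k1 * innov
    P = [[p00 - k0 * p00, p01 - k0 * p01],
         [p10 - k1 * p00, p11 - k1 * p01]]

print(round(be, 2))  # close to the true bias 0.5
```

The bias is observable here because the dynamics coefficient is known, so a persistent innovation can only be explained by the bias term; this is the simplest instance of the augmented-state approach the abstract generalizes.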

  18. Analysis of errors in forensic science

    Directory of Open Access Journals (Sweden)

    Mingxiao Du

    2017-01-01

Full Text Available Reliability of expert testimony is one of the foundations of judicial justice. Both expert bias and scientific errors affect the reliability of expert opinion, which in turn affects the trustworthiness of the findings of fact in legal proceedings. Expert bias can be eliminated by replacing experts; however, it may be more difficult to eliminate scientific errors. From the perspective of statistics, errors in the operation of forensic science include systematic errors, random errors, and gross errors. In general, process repetition and abiding by the standard ISO/IEC 17025:2005, General requirements for the competence of testing and calibration laboratories, during operation are common measures used to reduce errors that originate from experts and equipment, respectively. For example, to reduce gross errors, the laboratory can ensure that a test is repeated several times by different experts. In applying forensic principles and methods, Federal Rule of Evidence 702 mandates that judges consider factors such as peer review to ensure the reliability of the expert testimony. As the scientific principles and methods may not undergo professional review by specialists in a certain field, peer review serves as an exclusive standard. This study also examines two types of statistical errors. As false-positive errors involve a higher possibility of unfair decision-making, they should receive more attention than false-negative errors.
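The repetition strategy for gross errors can be illustrated with a toy simulation (the error magnitudes and the 10% blunder rate are invented): a robust summary of several independent repetitions survives an occasional gross error that could dominate a single test.

```python
import random
random.seed(5)

true_value = 100.0

def one_test():
    # Random error is always present; a "gross error" (blunder) is added
    # with 10% probability to mimic an occasional procedural mistake.
    v = true_value + random.gauss(0, 1)
    if random.random() < 0.1:
        v += 25.0
    return v

single = one_test()                          # one expert, one run
repeats = sorted(one_test() for _ in range(7))
median = repeats[3]                          # robust summary of 7 repetitions

print(abs(single - true_value), abs(median - true_value))
```

A single run can be off by the full blunder magnitude, while the median of seven repetitions is only corrupted if a majority of runs contain gross errors, which is far less likely.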

  19. Errors in quantum tomography: diagnosing systematic versus statistical errors

    Science.gov (United States)

    Langford, Nathan K.

    2013-03-01

    A prime goal of quantum tomography is to provide quantitatively rigorous characterization of quantum systems, be they states, processes or measurements, particularly for the purposes of trouble-shooting and benchmarking experiments in quantum information science. A range of techniques exist to enable the calculation of errors, such as Monte-Carlo simulations, but their quantitative value is arguably fundamentally flawed without an equally rigorous way of authenticating the quality of a reconstruction to ensure it provides a reasonable representation of the data, given the known noise sources. A key motivation for developing such a tool is to enable experimentalists to rigorously diagnose the presence of technical noise in their tomographic data. In this work, I explore the performance of the chi-squared goodness-of-fit test statistic as a measure of reconstruction quality. I show that its behaviour deviates noticeably from expectations for states lying near the boundaries of physical state space, severely undermining its usefulness as a quantitative tool precisely in the region which is of most interest in quantum information processing tasks. I suggest a simple, heuristic approach to compensate for these effects and present numerical simulations showing that this approach provides substantially improved performance.
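As a toy illustration of the chi-squared goodness-of-fit statistic discussed here (on a simple multinomial model rather than tomographic data; the outcome probabilities are invented): when the counts really are drawn from the fitted model, chi-squared per degree of freedom should be of order one, and values far above one flag technical noise or model mismatch.

```python
import random
random.seed(3)

# Hypothetical model: probabilities for four measurement outcomes.
p = [0.4, 0.3, 0.2, 0.1]
n = 10000

# Draw counts from the model itself, so the fit "should" be good.
counts = [0, 0, 0, 0]
for _ in range(n):
    u, acc = random.random(), 0.0
    for i, pi in enumerate(p):
        acc += pi
        if u < acc:
            counts[i] += 1
            break

expected = [n * pi for pi in p]
chi2 = sum((o - e) ** 2 / e for o, e in zip(counts, expected))
dof = len(p) - 1   # the probabilities sum to 1, removing one degree of freedom

# For a well-behaved fit, chi2 / dof is of order 1.
print(chi2 / dof)
```

The abstract's point is precisely that this expectation breaks down near the boundary of physical state space, where the statistic's distribution deviates from the nominal chi-squared law.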

  20. Prebiotics: why definitions matter.

    Science.gov (United States)

    Hutkins, Robert W; Krumbeck, Janina A; Bindels, Laure B; Cani, Patrice D; Fahey, George; Goh, Yong Jun; Hamaker, Bruce; Martens, Eric C; Mills, David A; Rastal, Robert A; Vaughan, Elaine; Sanders, Mary Ellen

    2016-02-01

    The prebiotic concept was introduced twenty years ago, and despite several revisions to the original definition, the scientific community has continued to debate what it means to be a prebiotic. How prebiotics are defined is important not only for the scientific community, but also for regulatory agencies, the food industry, consumers and healthcare professionals. Recent developments in community-wide sequencing and glycomics have revealed that more complex interactions occur between putative prebiotic substrates and the gut microbiota than previously considered. A consensus among scientists on the most appropriate definition of a prebiotic is necessary to enable continued use of the term.

  1. Definition of Entity Authentication

    DEFF Research Database (Denmark)

    Ahmed, Naveed; Jensen, Christian D.

    2010-01-01

Authentication is considered a pre-requisite for communication security, but the definition of authentication is generally not agreed upon. Many attacks on authentication protocols are the result of misunderstanding of the goals of authentication. This state of affairs indicates limitations in the theoretical understanding of the meanings of authentication. We provide a new insight in this direction and formalize it in CFPS (Common Framework for authentication Protocols' Specifications). CFPS provides a precise scope of definition for authentication protocols, which could make the design and analysis…

  2. Prebiotics: why definitions matter

    Science.gov (United States)

    Hutkins, Robert W; Krumbeck, Janina A; Bindels, Laure B; Cani, Patrice D; Fahey, George; Goh, Yong Jun; Hamaker, Bruce; Martens, Eric C; Mills, David A; Rastal, Robert A; Vaughan, Elaine; Sanders, Mary Ellen

    2015-01-01

    The prebiotic concept was introduced twenty years ago, and despite several revisions to the original definition, the scientific community has continued to debate what it means to be a prebiotic. How prebiotics are defined is important not only for the scientific community, but also for regulatory agencies, the food industry, consumers and healthcare professionals. Recent developments in community-wide sequencing and glycomics have revealed that more complex interactions occur between putative prebiotic substrates and the gut microbiota than previously considered. A consensus among scientists on the most appropriate definition of a prebiotic is necessary to enable continued use of the term. PMID:26431716

  3. Definition of Information

    Directory of Open Access Journals (Sweden)

    Boris Sunik

    2011-12-01

Full Text Available This definition can be applied to information of every kind, level and complexity. Information is considered as the feature manifesting itself in the relations between certain real-world entities. The real world has to be seen in terms of objects, actions, relations and properties. The definition is used as the basis of the Theory of Meaningful Information [1] that explains the nature and functionality of information and enables the production of relevant definitions regarding language and knowledge, which remain operative also in the case of non-human languages and knowledge systems.

  4. Impact of Measurement Error on Synchrophasor Applications

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yilu [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gracia, Jose R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ewing, Paul D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Zhao, Jiecheng [Univ. of Tennessee, Knoxville, TN (United States); Tan, Jin [Univ. of Tennessee, Knoxville, TN (United States); Wu, Ling [Univ. of Tennessee, Knoxville, TN (United States); Zhan, Lingwei [Univ. of Tennessee, Knoxville, TN (United States)

    2015-07-01

    Phasor measurement units (PMUs), a type of synchrophasor, are powerful diagnostic tools that can help avert catastrophic failures in the power grid. Because of this, PMU measurement errors are particularly worrisome. This report examines the internal and external factors contributing to PMU phase angle and frequency measurement errors and gives a reasonable explanation for them. It also analyzes the impact of those measurement errors on several synchrophasor applications: event location detection, oscillation detection, islanding detection, and dynamic line rating. The primary finding is that dynamic line rating is more likely to be influenced by measurement error. Other findings include the possibility of reporting nonoscillatory activity as an oscillation as the result of error, failing to detect oscillations submerged by error, and the unlikely impact of error on event location and islanding detection.

  5. Adjoint Error Estimation for Linear Advection

    Energy Technology Data Exchange (ETDEWEB)

    Connors, J M; Banks, J W; Hittinger, J A; Woodward, C S

    2011-03-30

    An a posteriori error formula is described when a statistical measurement of the solution to a hyperbolic conservation law in 1D is estimated by finite volume approximations. This is accomplished using adjoint error estimation. In contrast to previously studied methods, the adjoint problem is divorced from the finite volume method used to approximate the forward solution variables. An exact error formula and computable error estimate are derived based on an abstractly defined approximation of the adjoint solution. This framework allows the error to be computed to an arbitrary accuracy given a sufficiently well resolved approximation of the adjoint solution. The accuracy of the computable error estimate provably satisfies an a priori error bound for sufficiently smooth solutions of the forward and adjoint problems. The theory does not currently account for discontinuities. Computational examples are provided that show support of the theory for smooth solutions. The application to problems with discontinuities is also investigated computationally.

  6. On the Combination Procedure of Correlated Errors

    CERN Document Server

    Erler, Jens

    2015-01-01

When averages of different experimental determinations of the same quantity are computed, each with statistical and systematic error components, then frequently the statistical and systematic components of the combined error are quoted explicitly. These are important pieces of information since statistical errors scale differently and often more favorably with the sample size than most systematic or theoretical errors. In this communication we describe a transparent procedure by which the statistical and systematic error components of the combination uncertainty can be obtained. We develop a general method and derive a general formula for the case of Gaussian errors with or without correlations. The method can easily be applied to other error distributions, as well. For the case of two measurements, we also define disparity and misalignment angles, and discuss their relation to the combination weight factors.
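A sketch of the kind of combination the abstract describes, for the two-measurement Gaussian case: build the full covariance from independent statistical parts plus a correlated systematic part, form the minimum-variance (BLUE) weights, and read off the statistical and systematic components of the combined error. All numbers are illustrative, and the paper's disparity and misalignment angles are not computed here.

```python
import numpy as np

# Two hypothetical measurements of the same quantity:
# central values, statistical errors, systematic errors.
x = np.array([10.2, 9.8])
stat = np.array([0.3, 0.4])
syst = np.array([0.5, 0.5])
rho = 1.0   # assume fully correlated systematics (e.g. a shared calibration)

# Full covariance: independent statistical part + correlated systematic part.
S = rho * np.outer(syst, syst)
np.fill_diagonal(S, syst ** 2)      # each measurement's own systematic variance
C = np.diag(stat ** 2) + S

# Minimum-variance weights: w = C^-1 1 / (1^T C^-1 1)
ones = np.ones(2)
Cinv = np.linalg.inv(C)
w = Cinv @ ones / (ones @ Cinv @ ones)

mean = w @ x
stat_comb = np.sqrt(w @ np.diag(stat ** 2) @ w)   # statistical component
syst_comb = np.sqrt(w @ S @ w)                    # systematic component
total = np.sqrt(w @ C @ w)

# combined value 10.056 with 0.24 (stat) and 0.50 (syst) components
print(mean, stat_comb, syst_comb, total)
```

Note that with fully correlated, equal systematics the combined systematic error stays at 0.5 regardless of the weights: averaging reduces the statistical component but cannot average away a shared systematic shift.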

  7. On the combination procedure of correlated errors

    Energy Technology Data Exchange (ETDEWEB)

    Erler, Jens [Universidad Nacional Autonoma de Mexico, Instituto de Fisica, Mexico D.F. (Mexico)

    2015-09-15

When averages of different experimental determinations of the same quantity are computed, each with statistical and systematic error components, then frequently the statistical and systematic components of the combined error are quoted explicitly. These are important pieces of information since statistical errors scale differently and often more favorably with the sample size than most systematic or theoretical errors. In this communication we describe a transparent procedure by which the statistical and systematic error components of the combination uncertainty can be obtained. We develop a general method and derive a general formula for the case of Gaussian errors with or without correlations. The method can easily be applied to other error distributions, as well. For the case of two measurements, we also define disparity and misalignment angles, and discuss their relation to the combination weight factors. (orig.)

  8. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include evolution of error correction techniques, industrial user needs, architectures, and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and the next generation standards; • Provides coverage of industrial user needs advanced error correcting techniques.

  9. Human error: A significant information security issue

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W.W.

    1994-12-31

One of the major threats to information security, human error, is often ignored or dismissed with statements such as "There is not much we can do about it." This type of thinking runs counter to reality, because studies have shown that, of all systems threats, human error has the highest probability of occurring and that, with professional assistance, human errors can be prevented or significantly reduced. Security analysts often overlook human error as a major threat; however, other professionals, such as human factors engineers, are trained to deal with these probabilistic occurrences and mitigate them. In a recent study, 55% of the respondents surveyed considered human error the most important security threat. Documentation exists to show that human error was a major cause of the consequences suffered at Three Mile Island, Chernobyl, Bhopal, and the Exxon tanker Valdez. Ironically, causes of human error can usually be quickly and easily eliminated.

  10. Radar error statistics for the space shuttle

    Science.gov (United States)

    Lear, W. M.

    1979-01-01

    Radar error statistics of C-band and S-band that are recommended for use with the groundtracking programs to process space shuttle tracking data are presented. The statistics are divided into two parts: bias error statistics, using the subscript B, and high frequency error statistics, using the subscript q. Bias errors may be slowly varying to constant. High frequency random errors (noise) are rapidly varying and may or may not be correlated from sample to sample. Bias errors were mainly due to hardware defects and to errors in correction for atmospheric refraction effects. High frequency noise was mainly due to hardware and due to atmospheric scintillation. Three types of atmospheric scintillation were identified: horizontal, vertical, and line of sight. This was the first time that horizontal and line of sight scintillations were identified.

  11. Relationships of Measurement Error and Prediction Error in Observed-Score Regression

    Science.gov (United States)

    Moses, Tim

    2012-01-01

    The focus of this paper is assessing the impact of measurement errors on the prediction error of an observed-score regression. Measures are presented and described for decomposing the linear regression's prediction error variance into parts attributable to the true score variance and the error variances of the dependent variable and the predictor…

  12. Chimera and other fertilization errors.

    Science.gov (United States)

    Malan, V; Vekemans, M; Turleau, C

    2006-11-01

The finding of a mixture of 46,XX and 46,XY cells in an individual has rarely been reported in the literature. It usually results in individuals with ambiguous genitalia. Approximately 10% of true human hermaphrodites show this type of karyotype. However, the underlying mechanisms are poorly understood. It may be the result of mosaicism or chimerism. By definition, a chimera is produced by the fusion of two different zygotes in a single embryo, while a mosaic contains genetically different cells issued from a single zygote. Several mechanisms are involved in the production of chimera. Stricto sensu, chimerism occurs from the post-zygotic fusion of two distinct embryos leading to a tetragametic chimera. In addition, there are other entities which are also referred to as chimera: parthenogenetic chimera and chimera resulting from fertilization of the second polar body. Furthermore, a particular type of chimera called 'androgenetic chimera', recently described in fetuses with placental mesenchymal dysplasia and in rare patients with Beckwith-Wiedemann syndrome, is discussed. Strategies to study mechanisms leading to the production of chimera and mosaics are also proposed.

  13. Orthogonality of inductosyn angle-measuring system error and error-separating technology

    Institute of Scientific and Technical Information of China (English)

    任顺清; 曾庆双; 王常虹

    2003-01-01

Round inductosyn is widely used in inertial navigation test equipment, and its accuracy has a significant effect on the general accuracy of the equipment. Four main errors of the round inductosyn, i.e. the first-order long-period (360°) harmonic error, the second-order long-period harmonic error, the first-order short-period harmonic error and the second-order short-period harmonic error, are described, and the orthogonality of these four kinds of errors is studied. An error-separating technology is proposed to separate these four kinds of errors, and in the process of separating the short-period harmonic errors, the arrangement in the order of the decimal part of the angle pitch number can be omitted. The effectiveness of the proposed technology is proved through measuring and adjusting the angular errors.
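The orthogonality argument can be illustrated with synthetic data: over a full revolution, the four harmonic families (first- and second-order, long- and short-period) are mutually orthogonal, so a least-squares fit separates their amplitudes exactly. The pitch count N and the amplitudes below are hypothetical.

```python
import numpy as np

N = 36                       # hypothetical number of pitches per revolution
theta = np.deg2rad(np.arange(0.0, 360.0, 0.25))   # one full revolution

# Synthetic angle-measurement error with known harmonic amplitudes
err = (3.0 * np.sin(theta)             # first-order long-period (360°)
       + 1.5 * np.sin(2 * theta)       # second-order long-period
       + 0.8 * np.sin(N * theta)       # first-order short-period
       + 0.4 * np.sin(2 * N * theta))  # second-order short-period

# The sine/cosine columns of the four families are mutually orthogonal over
# a full turn, so ordinary least squares recovers each amplitude independently.
A = np.column_stack([np.sin(theta), np.cos(theta),
                     np.sin(2 * theta), np.cos(2 * theta),
                     np.sin(N * theta), np.cos(N * theta),
                     np.sin(2 * N * theta), np.cos(2 * N * theta)])
coef, *_ = np.linalg.lstsq(A, err, rcond=None)
print(np.round(coef, 3))   # recovers 3.0, 1.5, 0.8, 0.4 on the sine terms
```

Because of the orthogonality, each fitted coefficient is just the projection of the measured error onto one harmonic, which is why the four error kinds can be separated without interfering with one another.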

  14. Error processing network dynamics in schizophrenia.

    Science.gov (United States)

    Becerril, Karla E; Repovs, Grega; Barch, Deanna M

    2011-01-15

    Current theories of cognitive dysfunction in schizophrenia emphasize an impairment in the ability of individuals suffering from this disorder to monitor their own performance, and adjust their behavior to changing demands. Detecting an error in performance is a critical component of evaluative functions that allow the flexible adjustment of behavior to optimize outcomes. The dorsal anterior cingulate cortex (dACC) has been repeatedly implicated in error-detection and implementation of error-based behavioral adjustments. However, accurate error-detection and subsequent behavioral adjustments are unlikely to rely on a single brain region. Recent research demonstrates that regions in the anterior insula, inferior parietal lobule, anterior prefrontal cortex, thalamus, and cerebellum also show robust error-related activity, and integrate into a functional network. Despite the relevance of examining brain activity related to the processing of error information and supporting behavioral adjustments in terms of a distributed network, the contribution of regions outside the dACC to error processing remains poorly understood. To address this question, we used functional magnetic resonance imaging to examine error-related responses in 37 individuals with schizophrenia and 32 healthy controls in regions identified in the basic science literature as being involved in error processing, and determined whether their activity was related to behavioral adjustments. Our imaging results support previous findings showing that regions outside the dACC are sensitive to error commission, and demonstrated that abnormalities in brain responses to errors among individuals with schizophrenia extend beyond the dACC to almost all of the regions involved in error-related processing in controls. However, error related responses in the dACC were most predictive of behavioral adjustments in both groups. Moreover, the integration of this network of regions differed between groups, with the

  15. Subjective poverty line definitions

    NARCIS (Netherlands)

    J. Flik; B.M.S. van Praag (Bernard)

    1991-01-01

    textabstractIn this paper we will deal with definitions of subjective poverty lines. To measure a poverty threshold value in terms of household income, which separates the poor from the non-poor, we take into account the opinions of all people in society. Three subjective methods will be discussed

  17. Romanian definite article revisited

    Directory of Open Access Journals (Sweden)

    Sorin Paliga

    1999-12-01

Full Text Available I shall attempt to resume a long, almost endless discussion: the origin of the Romanian definite article. Any grammar of Romanian or any comparative grammar of the Romance languages (e.g. Tagliavini 1977) always observes that Romanian, an isolated case in the Romance family, has an agglutinated definite article. The typology is not indeed rare: Bulgarian, Albanian, Armenian, Basque and Swedish witness the same mechanism. We cannot approach the topic by analysing all these languages, yet a comparative analysis would be finally useful. In our case, it is obvious that Romanian cannot be isolated from Albanian and Bulgarian. A potential solution must explain the situation in ALL these three "Balkanic" languages, even if Romanian is not Balkanic stricto sensu. The paper shall focus on the deep roots of the Romanian and Albanian definite article, its typological relations with other linguistic areas, and shall attempt to explain this isolated situation in the field of Romance linguistics. For sure, the Romanian definite article mainly reflects the Latin heritage. Nevertheless, by saying only this, the tableau is not complete: some forms are not Latin but Pre-Latin, Thracian. This paper will try to substantiate this assertion.

  19. The definition of sarcopenia

    NARCIS (Netherlands)

    Bijlsma, Astrid Y.

    2013-01-01

    Sarcopenia in old age has been associated with a higher mortality, poor physical functioning, poor outcome of surgery and higher drug toxicity. There is no general consensus on the definition of sarcopenia. The aim of the research presented in this thesis was to assess the implications of the use of

  20. Definition af primitiver

    DEFF Research Database (Denmark)

    Christensen, Morten

    1997-01-01

point in time, much time can be saved later, when it would otherwise be cumbersome to redefine the primitives completely. In this report, the primitives to be used for Sydkraft's network are reviewed. Each individual state of the various primitives has been considered. This has been done to obtain a definition that is…

  1. Definition of Professional Development

    Science.gov (United States)

    Learning Forward, 2015

    2015-01-01

    President Obama signed into law the Every Student Succeeds Act, the reauthorization of the Elementary and Secondary Education Act, on December 10, 2015. "Learning Forward's focus in this new law is its improved definition of professional learning," said Stephanie Hirsh, executive director of Learning Forward. "We've long advocated…

  3. COPD: Definition and Phenotypes

    DEFF Research Database (Denmark)

    Vestbo, J.

    2014-01-01

    particles or gases. Exacerbations and comorbidities contribute to the overall severity in individual patients. The evolution of this definition and the diagnostic criteria currently in use are discussed. COPD is increasingly divided in subgroups or phenotypes based on specific features and association...

  4. Embedded wavelet video coding with error concealment

    Science.gov (United States)

    Chang, Pao-Chi; Chen, Hsiao-Ching; Lu, Ta-Te

    2000-04-01

We present an error-concealed embedded wavelet (ECEW) video coding system for transmission over the Internet or wireless networks. This system consists of two types of frames: intra (I) frames and inter, or predicted (P), frames. Inter frames are constructed from the residual frames formed by variable block-size multiresolution motion estimation (MRME). Motion vectors are compressed by arithmetic coding. The image data of intra frames and residual frames are coded by error-resilient embedded zerotree wavelet (ER-EZW) coding. The ER-EZW coding partitions the wavelet coefficients into several groups and each group is coded independently. Therefore, the error propagation effect resulting from an error is confined to a single group. In EZW coding, any single error may result in a totally undecodable bitstream. To further reduce the error damage, we use error concealment at the decoding end. In intra frames, the erroneous wavelet coefficients are replaced by neighbors. In inter frames, erroneous blocks of wavelet coefficients are replaced by data from the previous frame. Simulations show that the performance of ECEW is superior to ECEW without error concealment by 7 to approximately 8 dB at an error rate of 10^-3 in intra frames. The improvement is still 2 to approximately 3 dB at a higher error rate of 10^-2 in inter frames.
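The two concealment rules can be sketched directly on toy coefficient tiles. The abstract only says erroneous intra-frame coefficients are "replaced by neighbors", so the mean-of-valid-neighbours rule below is an assumption standing in for the unspecified detail; the inter-frame rule (copy from the previous frame) is as stated.

```python
import numpy as np

def conceal_inter(coeffs, prev_coeffs, bad_mask):
    """Inter-frame concealment: take damaged blocks from the previous frame."""
    out = coeffs.copy()
    out[bad_mask] = prev_coeffs[bad_mask]
    return out

def conceal_intra(coeffs, bad_mask):
    """Intra-frame concealment: replace each damaged coefficient by the mean
    of its valid 4-neighbours (an assumed stand-in for 'replaced by neighbors')."""
    out = coeffs.astype(float).copy()
    h, w = out.shape
    for i, j in zip(*np.nonzero(bad_mask)):
        vals = [out[y, x]
                for y, x in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                if 0 <= y < h and 0 <= x < w and not bad_mask[y, x]]
        out[i, j] = sum(vals) / len(vals) if vals else 0.0
    return out

frame = np.arange(16.0).reshape(4, 4)   # toy wavelet-coefficient tile
bad = np.zeros((4, 4), dtype=bool)
bad[1, 1] = True                        # pretend this coefficient was corrupted
fixed = conceal_intra(frame, bad)
print(fixed[1, 1])   # mean of neighbours 1, 9, 4, 6 -> 5.0
```

Because each ER-EZW group decodes independently, a bit error only invalidates one such region, and these local repairs are enough to keep the rest of the frame intact.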

  5. Regression calibration with heteroscedastic error variance.

    Science.gov (United States)

    Spiegelman, Donna; Logan, Roger; Grove, Douglas

    2011-01-01

    The problem of covariate measurement error with heteroscedastic measurement error variance is considered. Standard regression calibration assumes that the measurement error has a homoscedastic measurement error variance. An estimator is proposed to correct regression coefficients for covariate measurement error with heteroscedastic variance. Point and interval estimates are derived. Validation data containing the gold standard must be available. This estimator is a closed-form correction of the uncorrected primary regression coefficients, which may be of logistic or Cox proportional hazards model form, and is closely related to the version of regression calibration developed by Rosner et al. (1990). The primary regression model can include multiple covariates measured without error. The use of these estimators is illustrated in two data sets, one taken from occupational epidemiology (the ACE study) and one taken from nutritional epidemiology (the Nurses' Health Study). In both cases, although there was evidence of moderate heteroscedasticity, there was little difference in estimation or inference using this new procedure compared to standard regression calibration. It is shown theoretically that unless the relative risk is large or measurement error severe, standard regression calibration approximations will typically be adequate, even with moderate heteroscedasticity in the measurement error model variance. In a detailed simulation study, standard regression calibration performed either as well as or better than the new estimator. When the disease is rare and the errors normally distributed, or when measurement error is moderate, standard regression calibration remains the method of choice.
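
    The calibration step can be illustrated with a minimal simulation. This sketch uses a linear primary model with homoscedastic measurement error, i.e. the textbook form of regression calibration, not the heteroscedastic logistic/Cox estimator the paper derives; all data, coefficients, and variances are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Validation data containing the gold standard: true exposure x, surrogate w.
n_val = 500
x_val = rng.normal(0, 1, n_val)
w_val = x_val + rng.normal(0, 0.5, n_val)          # additive measurement error

# Calibration step: regress x on w in the validation data -> E[X|W] ~ g0 + g1*W.
A = np.column_stack([np.ones(n_val), w_val])
gamma, *_ = np.linalg.lstsq(A, x_val, rcond=None)

# Main study: only w is observed; the outcome depends on the true x.
n = 2000
x = rng.normal(0, 1, n)
w = x + rng.normal(0, 0.5, n)
y = 2.0 * x + rng.normal(0, 1, n)                  # true slope is 2.0

# Naive fit on w is attenuated; the calibrated fit substitutes E[X|W].
B = np.column_stack([np.ones(n), w])
beta_naive = np.linalg.lstsq(B, y, rcond=None)[0][1]
x_hat = gamma[0] + gamma[1] * w
C = np.column_stack([np.ones(n), x_hat])
beta_cal = np.linalg.lstsq(C, y, rcond=None)[0][1]
```

    With these variances the attenuation factor is 1/(1 + 0.25) = 0.8, so the naive slope sits near 1.6 while the calibrated slope recovers a value near 2.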

  6. Medical errors recovered by critical care nurses.

    Science.gov (United States)

    Dykes, Patricia C; Rothschild, Jeffrey M; Hurley, Ann C

    2010-05-01

    The frequency and types of medical errors are well documented, but less is known about potential errors that were intercepted by nurses. We studied the type, frequency, and potential harm of recovered medical errors reported by critical care registered nurses (CCRNs) during the previous year. Nurses are known to protect patients from harm. Several studies on medical errors found that there would have been more medical errors reaching the patient had not potential errors been caught earlier by nurses. The Recovered Medical Error Inventory, a 25-item empirically derived and internally consistent (alpha = .90) list of medical errors, was posted on the Internet. Participants were recruited via e-mail and healthcare-related listservs using a nonprobability snowball sampling technique. Investigators e-mailed contacts working in hospitals or who managed healthcare-related listservs and asked the contacts to pass the link on to others with contacts in acute care settings. During 1 year, 345 CCRNs reported that they recovered 18,578 medical errors, of which they rated 4,183 as potentially lethal. Surveillance, clinical judgment, and interventions by CCRNs to identify, interrupt, and correct medical errors protected seriously ill patients from harm.

  7. Common errors in disease mapping

    Directory of Open Access Journals (Sweden)

    Ricardo Ocaña-Riola

    2010-05-01

    Many morbidity-mortality atlases and small-area studies have been carried out over the last decade. However, the methods used to design such research, the interpretation of results and the conclusions published are often inaccurate. The proliferation of this practice has often led to inefficient decision-making, implementation of inappropriate health policies and a negative impact on the advancement of scientific knowledge. This paper reviews the most frequent errors in the design, analysis and interpretation of small-area epidemiological studies and proposes a diagnostic evaluation test that should enable the scientific quality of published papers to be ascertained. Nine common mistakes in disease mapping methods are discussed. From this framework, and following the theory of diagnostic evaluation, a standardised test to evaluate the scientific quality of a small-area epidemiology study has been developed. Optimal quality is achieved with the maximum score (16 points), average quality with a score between 8 and 15 points, and low quality with a score of 7 or below. A systematic evaluation of scientific papers, together with enhanced quality in future research, will contribute towards increased efficacy in epidemiological surveillance and in health planning based on the spatio-temporal analysis of ecological information.
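
    The scoring rule quoted above maps directly onto a small helper. The function name is hypothetical; the thresholds are exactly those stated in the abstract.

```python
def quality_category(score: int) -> str:
    """Map a 16-point checklist score to the quality category used in the
    paper: 16 = optimal, 8-15 = average, 7 or below = low."""
    if not 0 <= score <= 16:
        raise ValueError("score must be between 0 and 16")
    if score == 16:
        return "optimal"
    if score >= 8:
        return "average"
    return "low"
```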

  8. On Nautical Observation Errors Evaluation

    Directory of Open Access Journals (Sweden)

    Wlodzimierz Filipowicz

    2015-12-01

    Mathematical Theory of Evidence (MTE) enables upgrading models and solving crucial problems in many disciplines, and it offers unique new opportunities once the possibilistic concept is engaged. Since fuzziness is widely perceived as a way of encoding knowledge, models built on fuzzy platforms can accept an expert's skill within a given field. At the same time, the evidence-combining scheme is a mechanism for enriching the informative context of the initial data. It can therefore be exploited in many cases where uncertainty and imprecision prevail. In nautical applications, for example, it can be used to handle data featuring systematic and random deflections. The theoretical background is discussed, and a computer application was successfully implemented to cope with erroneous and uncertain data. The output of the application is a position fix together with an a posteriori evaluation of its quality. It is also shown that the approach can be useful for calibrating measurement appliances. A unique feature of the combination scheme, proven by the author in a previous paper, enables identification of systematic measurement deflection. Building on that theorem, this paper further explores practical aspects of the problem, concentrating on reduction of the hypothesis frame and on identification of random and systematic errors.

  9. Medication errors in anesthesia: unacceptable or unavoidable?

    Directory of Open Access Journals (Sweden)

    Ira Dhawan

    Abstract Medication errors are common causes of patient morbidity and mortality, and they add a financial burden to the institution as well. Though the impact varies from no harm to serious adverse effects, including death, the issue needs attention on a priority basis since medication errors are preventable. In today's world, where people are aware and medical claims are on the rise, it is of utmost priority that we curb this issue. Individual effort to decrease medication errors alone might not succeed until a change in the existing protocols and systems is incorporated. Often, drug errors that occur cannot be reversed; the best way to 'treat' drug errors is to prevent them. Wrong medication (due to syringe swap), overdose (due to misunderstanding or preconception of the dose, pump misuse, and dilution error), incorrect administration route, underdosing, and omission are common causes of medication errors that occur perioperatively. Drug omission and calculation mistakes occur commonly in the ICU. Medication errors can occur perioperatively during preparation, administration, or record keeping. Numerous human and system errors can be blamed for the occurrence of medication errors. The need of the hour is to stop the blame game, accept mistakes, and develop a safe and 'just' culture in order to prevent medication errors. Newly devised systems like VEINROM, a fluid delivery system, are a novel approach to preventing drug errors due to the most commonly used medications in anesthesia. Such developments, along with vigilant doctors, a safe workplace culture, and organizational support, can together help prevent these errors.

  10. [Medication errors in anesthesia: unacceptable or unavoidable?]

    Science.gov (United States)

    Dhawan, Ira; Tewari, Anurag; Sehgal, Sankalp; Sinha, Ashish Chandra

    Medication errors are the common causes of patient morbidity and mortality. It adds financial burden to the institution as well. Though the impact varies from no harm to serious adverse effects including death, it needs attention on priority basis since medication errors' are preventable. In today's world where people are aware and medical claims are on the hike, it is of utmost priority that we curb this issue. Individual effort to decrease medication error alone might not be successful until a change in the existing protocols and system is incorporated. Often drug errors that occur cannot be reversed. The best way to 'treat' drug errors is to prevent them. Wrong medication (due to syringe swap), overdose (due to misunderstanding or preconception of the dose, pump misuse and dilution error), incorrect administration route, under dosing and omission are common causes of medication error that occur perioperatively. Drug omission and calculation mistakes occur commonly in ICU. Medication errors can occur perioperatively either during preparation, administration or record keeping. Numerous human and system errors can be blamed for occurrence of medication errors. The need of the hour is to stop the blame - game, accept mistakes and develop a safe and 'just' culture in order to prevent medication errors. The newly devised systems like VEINROM, a fluid delivery system is a novel approach in preventing drug errors due to most commonly used medications in anesthesia. Similar developments along with vigilant doctors, safe workplace culture and organizational support all together can help prevent these errors. Copyright © 2016. Published by Elsevier Editora Ltda.

  12. Error correction maintains post-error adjustments after one night of total sleep deprivation.

    Science.gov (United States)

    Hsieh, Shulan; Tsai, Cheng-Yin; Tsai, Ling-Ling

    2009-06-01

    Previous behavioral and electrophysiologic evidence indicates that one night of total sleep deprivation (TSD) impairs error monitoring, including error detection, error correction, and post-error adjustments (PEAs). This study examined the hypothesis that error correction, manifesting as overtly expressed self-generated performance feedback to errors, can effectively prevent TSD-induced impairment in the PEAs. Sixteen healthy right-handed adults (seven women and nine men) aged 19-23 years were instructed to respond to a target arrow flanked by four distractor arrows and to correct their errors immediately after committing them. Task performance and electroencephalogram (EEG) data were collected after normal sleep (NS) and after one night of TSD in a counterbalanced repeated-measures design. With the demand of error correction, participants maintained the same level of PEAs in reducing the error rate for trial N + 1 after TSD as after NS. Corrective behavior further affected the PEAs for trial N + 1 in the omission rate and response speed, which decreased and sped up, respectively, following corrected errors, particularly after TSD. These results show that error correction effectively maintains the post-error reduction in both committed and omitted errors after TSD. A cerebral mechanism might be involved in the effect of error correction, as EEG beta (17-24 Hz) activity was increased after erroneous responses compared with correct responses. The practical application of error correction to increasing work safety, which can be jeopardized by repeated errors, is suggested for workers who are involved in monotonous but attention-demanding monitoring tasks.

  13. Medical error and decision making: Learning from the past and present in intensive care.

    Science.gov (United States)

    Bucknall, Tracey K

    2010-08-01

    Human error occurs in every occupation. Medical errors may result in a near miss or an actual injury to a patient that has nothing to do with the underlying medical condition. Intensive care has one of the highest incidences of medical error and patient injury in any specialty medical area; thought to be related to the rapidly changing patient status and complex diagnoses and treatments. The aims of this paper are to: (1) outline the definition, classifications and aetiology of medical error; (2) summarise key findings from the literature with a specific focus on errors arising from intensive care areas; and (3) conclude with an outline of approaches for analysing clinical information to determine adverse events and inform practice change in intensive care. Database searches of articles and textbooks using keywords: medical error, patient safety, decision making and intensive care. Sociology and psychology literature cited therein. Critically ill patients require numerous medications, multiple infusions and procedures. Although medical errors are often detected by clinicians at the bedside, organisational processes and systems may contribute to the problem. A systems approach is thought to provide greater insight into the contributory factors and potential solutions to avoid preventable adverse events. It is recommended that a variety of clinical information and research techniques are used as a priority to prevent hospital acquired injuries and address patient safety concerns in intensive care. 2010 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.

  14. Accurate and robust estimation of phase error and its uncertainty of 50 GHz bandwidth sampling circuit

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper discusses the dependence of the phase error on the 50 GHz bandwidth oscilloscope's sampling circuitry. We define the phase error as the difference between the impulse response of the nose-to-nose (NTN) estimate and the true response of the sampling circuit. We develop a method to predict the NTN phase response arising from the internal sampling circuitry of the oscilloscope. For the default sampling-circuit configuration that we examine, the phase error is approximately 7.03° at 50 GHz. We study the sensitivity of the oscilloscope's phase response to parametric changes in sampling-circuit component values, and we develop procedures to quantify the sensitivity of the phase error to each component and to combinations of components, assuming the same fractional uncertainty, 10%, in each of the model parameters. We predict upper and lower bounds on the phase error: that is, we vary all of the circuit parameters simultaneously so as to increase the phase error, and then so as to decrease it. Based on a Type B evaluation, this method quantifies the impact of all parameters of the sampling circuit and gives a standard uncertainty of 1.34°. This result is derived for the first time and has important practical uses; it can be used for phase calibration in 50 GHz bandwidth large-signal network analyzers (LSNAs).
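
    The bounding procedure, varying all component values simultaneously to push the phase error up and then down, can be sketched on a toy model. The first-order RC response and the nominal component values below are stand-ins for the paper's far richer sampling-circuit model.

```python
import numpy as np
from itertools import product

def phase_deg(R, C, f=50e9):
    """Phase (degrees) of a first-order RC low-pass at frequency f,
    a deliberately simple stand-in for the sampling-circuit response."""
    return np.degrees(np.arctan2(-2 * np.pi * f * R * C, 1.0))

R0, C0 = 50.0, 20e-15        # hypothetical nominal component values

# Evaluate every +/-10% corner of the parameter space; the extreme
# corners bound the phase error for a monotonic model like this one.
corners = [phase_deg(R0 * (1 + dr), C0 * (1 + dc))
           for dr, dc in product((-0.10, 0.10), repeat=2)]
lo, hi = min(corners), max(corners)
```

    For models that are not monotonic in every parameter, a corner search is not guaranteed to bracket the extremes, which is why the paper varies the parameters in the directions that provably increase or decrease the phase error.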

  15. Learning without Borders: A Review of the Implementation of Medical Error Reporting in Medecins Sans Frontieres.

    Directory of Open Access Journals (Sweden)

    Leslie Shanks

    To analyse the results from the first 3 years of implementation of a medical error reporting system in Médecins Sans Frontières-Operational Centre Amsterdam (MSF) programs. A medical error reporting policy was developed with input from frontline workers and introduced to the organisation in June 2010. The definition of medical error used was "the failure of a planned action to be completed as intended or the use of a wrong plan to achieve an aim." All confirmed error reports were entered into a database without the use of personal identifiers. 179 errors were reported from 38 projects in 18 countries over the period June 2010 to May 2013. The rate of reporting was 31, 42, and 106 incidents/year for reporting years 1, 2, and 3, respectively. The majority of errors were categorized as dispensing errors (62 cases, 34.6%), errors or delays in diagnosis (24 cases, 13.4%), and inappropriate treatment (19 cases, 10.6%). The impact of the error was categorized as no harm in 58 reports (32.4%), harm in 70 (39.1%), death in 42 (23.5%), and unknown in 9 (5.0%). Disclosure to the patient took place in 34 cases (19.0%), did not take place in 46 (25.7%), was not applicable in 5 (2.8%), and was not reported in 94 (52.5%). Remedial actions introduced at headquarters level included guideline revisions and changes to medical supply procedures. At field level, improvements included increased training and supervision, adjustments in staffing levels, and adaptations to the organization of the pharmacy. It was feasible to implement a voluntary reporting system for medical errors despite the complex contexts in which MSF intervenes. The reporting policy led to system changes that improved patient safety and accountability to patients. Challenges remain in achieving widespread acceptance of the policy, as evidenced by the low reporting and disclosure rates.

  16. On some positive definite functions

    OpenAIRE

    Bhatia, Rajendra; Jain, Tanvi

    2014-01-01

    We study the function $(1 - \|x\|)/(1 - \|x\|^r)$, and its reciprocal, on the Euclidean space $\mathbb{R}^n$, with respect to properties like being positive definite, conditionally positive definite, and infinitely divisible.

  17. Definitions of Health Terms: Nutrition

    Science.gov (United States)

    ... gov/definitions/nutritiondefinitions.html (Definitions of Health Terms: Nutrition; National Institutes of Health, Office of Dietary Supplements). Nutrition: this field of study focuses on foods and ...

  18. Identification errors in pathology and laboratory medicine.

    Science.gov (United States)

    Valenstein, Paul N; Sirota, Ronald L

    2004-12-01

    Identification errors involve misidentification of a patient or a specimen. Either has the potential to cause patients harm. Identification errors can occur during any part of the test cycle; however, most occur in the preanalytic phase. Patient identification errors in transfusion medicine occur in 0.05% of specimens; for general laboratory specimens the rate is much higher, around 1%. Anatomic pathology, which involves multiple specimen transfers and hand-offs, may have the highest identification error rate. Certain unavoidable cognitive failures lead to identification errors. Technology, ranging from bar-coded specimen labels to radio frequency identification tags, can be incorporated into protective systems that have the potential to detect and correct human error and reduce the frequency with which patients and specimens are misidentified.

  19. Error handling strategies in multiphase inverse modeling

    Energy Technology Data Exchange (ETDEWEB)

    Finsterle, S.; Zhang, Y.

    2010-12-01

    Parameter estimation by inverse modeling involves the repeated evaluation of a function of residuals. These residuals represent both errors in the model and errors in the data. In practical applications of inverse modeling of multiphase flow and transport, the error structure of the final residuals often significantly deviates from the statistical assumptions that underlie standard maximum likelihood estimation using the least-squares method. Large random or systematic errors are likely to lead to convergence problems, biased parameter estimates, misleading uncertainty measures, or poor predictive capabilities of the calibrated model. The multiphase inverse modeling code iTOUGH2 supports strategies that identify and mitigate the impact of systematic or non-normal error structures. We discuss these approaches and provide an overview of the error handling features implemented in iTOUGH2.
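
    One common way to mitigate non-normal residual structures of this kind is a robust estimator. The sketch below shows generic iteratively reweighted least squares with Huber weights for a linear-in-parameters model; it is an illustration of the idea, not the iTOUGH2 implementation.

```python
import numpy as np

def huber_irls(A, y, delta=1.345, iters=20):
    """Iteratively reweighted least squares with Huber weights: residuals
    larger than `delta` robust-sigma units are down-weighted, reducing
    the pull of outliers on the parameter estimates."""
    p = np.linalg.lstsq(A, y, rcond=None)[0]           # plain LSQ start
    for _ in range(iters):
        r = y - A @ p
        s = 1.4826 * np.median(np.abs(r))              # robust scale (MAD)
        if s == 0.0:
            s = 1.0
        u = np.abs(r) / s
        w = np.where(u <= delta, 1.0,                  # Huber weight function
                     delta / np.maximum(u, 1e-12))
        sw = np.sqrt(w)
        p = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)[0]
    return p
```

    Fitting a straight line to data containing a few grossly biased points, the Huber estimate stays close to the true parameters while ordinary least squares is pulled off by the outliers.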

  20. Beyond the answer: post-error processes.

    Science.gov (United States)

    Kleiter, G D; Schwarzenbacher, K

    1989-08-01

    When you suspect that you have just given an erroneous answer to a question, you stop and rethink. Suspected errors lead to a shift in the control and content of cognitive processes. In the present experiment we investigated the influence of errors upon heart rate and response latencies. Sixty-four subjects participated in an experiment in which each subject solved a sequence of 60 verbal analogies. The results demonstrated increased latencies after errors and decelerated heart rates during the post-error period. The results were explained by a psychophysiological model in which the septo-hippocampal system functions as a control system that coordinates the priority and selection of cognitive processes. Error detection suppresses strategies which otherwise prevent looping and iterative reanalyses of old material. The inhibition is also responsible for the cardiac slowing during the post-error period.

  1. Meteorological Error Budget Using Open Source Data

    Science.gov (United States)

    2016-09-01

    (VBA) script was created that would read the model-based output and corresponding sounding data for each message type (METCM or METB3), output type ... produce artillery MET error budget tables that account for expected errors when using MET model-based systems. Representatives of the US and other nations within the North Atlantic Treaty Organization expressed a need for shareable model-based MET error budgets. Use of an openly available civilian ...

  2. Soft errors in modern electronic systems

    CERN Document Server

    Nicolaidis, Michael

    2010-01-01

    This book provides a comprehensive presentation of the most advanced research results and technological developments enabling the understanding, qualification and mitigation of soft-error effects in advanced electronics, including the fundamental physical mechanisms of radiation-induced soft errors, the various steps that lead to a system failure, the modelling and simulation of soft errors at various levels (including physical, electrical, netlist, event-driven, RTL, and system-level modelling and simulation), hardware fault injection, accelerated radiation testing and natural environment testing ...

  3. Medication errors in anesthesia: unacceptable or unavoidable?

    OpenAIRE

    Ira Dhawan; Anurag Tewari; Sankalp Sehgal; Ashish Chandra Sinha

    2017-01-01

    Abstract Medication errors are the common causes of patient morbidity and mortality. It adds financial burden to the institution as well. Though the impact varies from no harm to serious adverse effects including death, it needs attention on priority basis since medication errors' are preventable. In today's world where people are aware and medical claims are on the hike, it is of utmost priority that we curb this issue. Individual effort to decrease medication error alone might not be succes...

  4. Group representations, error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E

    1996-01-01

    This report continues the discussion of unitary error bases and quantum codes. Nice error bases are characterized in terms of the existence of certain characters in a group. A general construction for error bases which are non-abelian over the center is given. The method for obtaining codes due to Calderbank et al. is generalized and expressed purely in representation theoretic terms. The significance of the inertia subgroup both for constructing codes and obtaining the set of transversally implementable operations is demonstrated.
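
    A concrete instance of a nice unitary error basis is the one-qubit Pauli basis. The check below verifies the two defining properties: unitarity of each element, and pairwise orthogonality under the trace inner product.

```python
import numpy as np

# The Pauli matrices form the standard nice unitary error basis for one
# qubit: each element is unitary, and Tr(A† B) = d·δ_AB with d = 2.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
basis = [I, X, Y, Z]

def is_unitary(U):
    return np.allclose(U.conj().T @ U, np.eye(len(U)))

# Gram matrix of trace inner products <A, B> = Tr(A† B)
gram = np.array([[np.trace(A.conj().T @ B) for B in basis] for A in basis])

assert all(is_unitary(U) for U in basis)
assert np.allclose(gram, 2 * np.eye(4))
```

    For n qubits, tensor products of these four matrices give a nice error basis of size 4^n, which is the setting in which stabilizer code constructions like that of Calderbank et al. operate.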

  5. ERROR CORRECTION IN HIGH SPEED ARITHMETIC,

    Science.gov (United States)

    The errors due to a faulty high speed multiplier are shown to be iterative in nature. These errors are analyzed in various aspects. The arithmetic coding technique is suggested for the improvement of high speed multiplier reliability. Through a number theoretic investigation, a large class of arithmetic codes for single iterative error correction are developed. The codes are shown to have near-optimal rates and to render a simple decoding method. The implementation of these codes seems highly practical. (Author)
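
    A minimal illustration of an arithmetic code is the classic AN code: encode N as A·N and check divisibility by A after each arithmetic operation. With A = 3, a single-bit flip changes the word by ±2^k, which is never divisible by 3, so all single-bit errors are detected; the codes developed in the paper go further and correct iterative errors.

```python
A = 3  # check modulus; 2**k mod 3 is always 1 or 2, never 0, so any
       # single-bit flip of a codeword breaks divisibility by 3.

def encode(n: int) -> int:
    return A * n

def check(word: int) -> bool:
    """True if the word is a valid codeword (divisible by A)."""
    return word % A == 0

def decode(word: int) -> int:
    if not check(word):
        raise ValueError("arithmetic error detected")
    return word // A

codeword = encode(25)             # 75
corrupted = codeword ^ (1 << 4)   # flip one bit: 75 -> 91, not divisible by 3
```

    Because the code is preserved by addition (A·m + A·n = A·(m + n)), the same divisibility check also catches single errors arising inside adders and multipliers built from them.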

  6. Medication errors recovered by emergency department pharmacists.

    Science.gov (United States)

    Rothschild, Jeffrey M; Churchill, William; Erickson, Abbie; Munz, Kristin; Schuur, Jeremiah D; Salzberg, Claudia A; Lewinski, Daniel; Shane, Rita; Aazami, Roshanak; Patka, John; Jaggers, Rondell; Steffenhagen, Aaron; Rough, Steve; Bates, David W

    2010-06-01

    We assess the impact of emergency department (ED) pharmacists on reducing potentially harmful medication errors. We conducted this observational study in 4 academic EDs. Trained pharmacy residents observed a convenience sample of ED pharmacists' activities. The primary outcome was medication errors recovered by pharmacists, including errors intercepted before reaching the patient (near miss or potential adverse drug event), caught after reaching the patient but before causing harm (mitigated adverse drug event), or caught after some harm but before further or worsening harm (ameliorated adverse drug event). Pairs of physician and pharmacist reviewers confirmed recovered medication errors and assessed their potential for harm. Observers were unblinded and clinical outcomes were not evaluated. We conducted 226 observation sessions spanning 787 hours and observed pharmacists reviewing 17,320 medications ordered or administered to 6,471 patients. We identified 504 recovered medication errors, or 7.8 per 100 patients and 2.9 per 100 medications. Most of the recovered medication errors were intercepted potential adverse drug events (90.3%), with fewer mitigated adverse drug events (3.9%) and ameliorated adverse drug events (0.2%). The potential severities of the recovered errors were most often serious (47.8%) or significant (36.2%). The most common medication classes associated with recovered medication errors were antimicrobial agents (32.1%), central nervous system agents (16.2%), and anticoagulant and thrombolytic agents (14.1%). The most common error types were dosing errors, drug omission, and wrong frequency errors. ED pharmacists can identify and prevent potentially harmful medication errors. Controlled trials are necessary to determine the net costs and benefits of ED pharmacist staffing on safety, quality, and costs, especially important considerations for smaller EDs and pharmacy departments. Copyright (c) 2009 American College of Emergency Physicians

  7. Error Estimates of Theoretical Models: a Guide

    CERN Document Server

    Dobaczewski, J; Reinhard, P -G

    2014-01-01

    This guide offers suggestions/insights on uncertainty quantification of nuclear structure models. We discuss a simple approach to statistical error estimates, strategies to assess systematic errors, and show how to uncover inter-dependencies by correlation analysis. The basic concepts are illustrated through simple examples. By providing theoretical error bars on predicted quantities and using statistical methods to study correlations between observables, theory can significantly enhance the feedback between experiment and nuclear modeling.
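
    A simple statistical error estimate of the kind the guide advocates can be sketched for a linear model: propagate the residual variance through the normal matrix to obtain a parameter covariance, error bars, and a correlation matrix. The model, data, and noise level here are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical model y = a + b*x fitted to noisy data (true a=1.0, b=0.5)
x = np.linspace(0, 10, 50)
sigma = 0.3
y = 1.0 + 0.5 * x + rng.normal(0, sigma, x.size)

J = np.column_stack([np.ones_like(x), x])      # Jacobian d(model)/d(params)
p, *_ = np.linalg.lstsq(J, y, rcond=None)

# Statistical error bars: residual variance times inverse normal matrix
dof = x.size - len(p)
s2 = np.sum((y - J @ p) ** 2) / dof
cov = s2 * np.linalg.inv(J.T @ J)
errs = np.sqrt(np.diag(cov))                   # 1-sigma parameter errors
corr = cov / np.outer(errs, errs)              # correlation matrix
```

    The off-diagonal entry of `corr` exposes the inter-dependency between intercept and slope (negative here, since the data lie at positive x); for nonlinear models, J is the Jacobian at the best fit and the same formulas give the linearized estimate.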

  8. Maximum privacy without coherence, zero-error

    Science.gov (United States)

    Leung, Debbie; Yu, Nengkun

    2016-09-01

    We study the possible difference between the quantum and the private capacities of a quantum channel in the zero-error setting. For a family of channels introduced by Leung et al. [Phys. Rev. Lett. 113, 030512 (2014)], we demonstrate an extreme difference: the zero-error quantum capacity is zero, whereas the zero-error private capacity is maximum given the quantum output dimension.

  9. A New Definition of Creativity

    Science.gov (United States)

    Dorin, Alan; Korb, Kevin B.

    Creative artifacts can be generated by employing A-Life software, but programmers must first consider, explicitly or implicitly, what would count as creative. Most apply standard definitions that incorporate notions of novelty, value and appropriateness. Here we re-assess this approach. Some basic facts about creativity suggest criteria that guide us to a new definition of creativity. We briefly defend our definition against some plausible objections and explore the ways in which this new definition differs and improves upon the alternatives.

  10. A Definition of Artificial Intelligence

    OpenAIRE

    2012-01-01

    In this paper we offer a formal definition of Artificial Intelligence, and this directly gives us an algorithm for constructing this object. In practice, this algorithm is useless due to combinatorial explosion. The main innovation in our definition is that it does not include knowledge as a part of intelligence; so, according to our definition, a newly born baby is also an Intellect. Here we differ from Turing's definition, which suggests that an Intellect is a person with knowledge gai...

  11. How social is error observation? The neural mechanisms underlying the observation of human and machine errors.

    Science.gov (United States)

    Desmet, Charlotte; Deschrijver, Eliane; Brass, Marcel

    2014-04-01

    Recently, it has been shown that the medial prefrontal cortex (MPFC) is involved in error execution as well as error observation. Based on this finding, it has been argued that recognizing each other's mistakes might rely on motor simulation. In the current functional magnetic resonance imaging (fMRI) study, we directly tested this hypothesis by investigating whether medial prefrontal activity in error observation is restricted to situations that enable simulation. To this aim, we compared brain activity related to the observation of errors that can be simulated (human errors) with brain activity related to errors that cannot be simulated (machine errors). We show that medial prefrontal activity is not only restricted to the observation of human errors but also occurs when observing errors of a machine. In addition, our data indicate that the MPFC reflects a domain general mechanism of monitoring violations of expectancies.

  12. Research on the technology for processing errors of photoelectric theodolite based on error design idea

    Science.gov (United States)

    Guo, Xiaosong; Pu, Pengcheng; Zhou, Zhaofa; Wang, Kunming

    2012-10-01

    The errors existing in the photoelectric theodolite were studied according to the error design idea, that is, correction of theodolite errors is achieved by actively analyzing the effect of the errors instead of passively processing the data. For the shafting error, the relationship between the different errors was analyzed with an error model based on coordinate transformation, and a real-time error compensation method based on the normal-reversed measuring method and levelness auto-detection was proposed. For the eccentric error of the dial, the idea of eccentric residual error was presented and its influence on measuring precision was studied; a dynamic compensation model was then built, so the influence of the dial's eccentric error on measuring precision can be eliminated. For the centering deviation in the process of measuring angles, a compensation method based on the error model was proposed, in which the centering deviation is detected automatically using computer vision. The above methods, based on the error design idea, effectively reduce the influence of errors on the measuring result through software compensation and improve the degree of automation of theodolite azimuth measurement, while the precision is not degraded.

  13. Spelling Errors in University Students’ English Writing

    Institute of Scientific and Technical Information of China (English)

    王祥德; 邓兆红

    2012-01-01

    This paper investigated the spelling errors made by university students in Hong Kong. By analyzing the spelling errors in untimed essays and exam scripts, we found that students are prone to make more spelling mistakes in exam scripts, that the same types of errors occur in both kinds of texts, and that the frequency rankings of the errors are also the same.

  14. Error measuring system of rotary Inductosyn

    Science.gov (United States)

    Liu, Chengjun; Zou, Jibin; Fu, Xinghe

    2008-10-01

    The inductosyn is a kind of high-precision angle-position sensor with important applications in servo tables, precision machine tools and other products. The precision of an inductosyn is calibrated by its error, and error measurement is an important problem in the production and application of the inductosyn. At present, the error of an inductosyn is mainly obtained by manual measurement, with unavoidable disadvantages such as high labour intensity for the operator, easily introduced measurement errors, and poor repeatability. To solve these problems, a new automatic measurement method based on a high-precision optical dividing head is put forward in this paper. An error signal can be obtained by precisely processing the output signals of the inductosyn and the optical dividing head. While the inductosyn rotates continuously, its zero-position error can be measured dynamically and zero-error curves can be output automatically. Measuring and calculating errors caused by human factors are overcome by this method, and it makes the measuring process quicker, more exact and more reliable. Experiment proves that the accuracy of the error measuring system is 1.1 arc-seconds (peak-to-peak value).

  15. Estimating IMU heading error from SAR images.

    Energy Technology Data Exchange (ETDEWEB)

    Doerry, Armin Walter

    2009-03-01

    Angular orientation errors of the real antenna for Synthetic Aperture Radar (SAR) will manifest as undesired illumination gradients in SAR images. These gradients can be measured, and the pointing error can be calculated. This can be done for single images, but done more robustly using multi-image methods. Several methods are provided in this report. The pointing error can then be fed back to the navigation Kalman filter to correct for problematic heading (yaw) error drift. This can mitigate the need for uncomfortable and undesired IMU alignment maneuvers such as S-turns.

  16. Beam positioning error budget in ICF driver

    CERN Document Server

    Shi Zhi Quan; Su Jing Qin

    2002-01-01

    The author presents a linear weighted-sum method for the beam positioning error budget based on the ICF targeting requirements, together with an equal- or unequal-probability approach to allocate errors to each optical element. Based on the relationship between the motion of the optical components and the beam position on target, the position error of the optical components was evaluated, which was referred to as the maximum range. Numerous ray traces were performed, and the position error budget was modified according to the normal distribution law. An overview of the position error budget of the components is provided.

  17. Study of Errors among Nursing Students

    Directory of Open Access Journals (Sweden)

    Ella Koren

    2007-09-01

    Full Text Available The study of errors in the health system is today a topic of considerable interest, aimed at reducing errors through analysis of the phenomenon and the conclusions reached. Errors that occur frequently among health professionals have also been observed among nursing students. True, in most cases they are actually “near errors,” but these could be a future indicator of therapeutic reality and of the effect of nurses' work environment on their personal performance. There are two different approaches to such errors: (a) The EPP (error-prone person) approach lays full responsibility at the door of the individual involved in the error, whether a student, nurse, doctor, or pharmacist. According to this approach, handling consists purely in identifying and penalizing the guilty party. (b) The EPE (error-prone environment) approach emphasizes the environment as a primary contributory factor to errors. The environment as an abstract concept includes components and processes of interpersonal communication, work relations, human engineering, workload, pressures, technical apparatus, and new technologies. The objective of the present study was to examine the role played by factors in and components of personal performance as compared to elements and features of the environment. The study was based on both of the aforementioned approaches, which, when combined, enable a comprehensive understanding of the phenomenon of errors among the student population as well as a comparison of factors contributing to human error and to error deriving from the environment. The theoretical basis of the study was a model that combined both approaches: one focusing on the individual and his or her personal performance and the other focusing on the work environment. The findings emphasize the work environment of health professionals as an EPE. However, errors could have been avoided by means of strict adherence to practical procedures. The authors examined error events in the

  18. Error Analysis and English Language Teaching

    Institute of Scientific and Technical Information of China (English)

    Ma; Jinling

    2015-01-01

    The theory of Error Analysis is a crucial part of research into second language acquisition, and it has significantly influenced the exploration of patterns in English teaching. Although there are some limitations in error analysis, both in theory and in practice, its significant role has been proved and recognized. How to treat errors scientifically will inevitably become more and more important in modern English teaching. The aim of this paper is to show the importance of error analysis in English teaching and to present how well it can function in English language teaching.

  19. Error computation for adaptive finite element analysis

    CERN Document Server

    Khan, A A; Memon, I R; Ming, X Y

    2002-01-01

    The paper gives a simple numerical procedure for computing the errors generated by the discretisation process of the finite element method. The procedure is based on the ZZ error estimator, which is believed to be reasonably accurate and can thus be readily implemented in any existing finite element code. The devised procedure not only estimates the global energy norm error but also evaluates the local errors in individual elements. In the example, the given procedure is combined with an adaptive refinement procedure, which provides guidance for optimal mesh design and allows the user to obtain a desired accuracy with a limited number of iterations. (author)

  20. The NASTRAN Error Correction Information System (ECIS)

    Science.gov (United States)

    Rosser, D. C., Jr.; Rogers, J. L., Jr.

    1975-01-01

    A data management procedure, called Error Correction Information System (ECIS), is described. The purpose of this system is to implement the rapid transmittal of error information between the NASTRAN Systems Management Office (NSMO) and the NASTRAN user community. The features of ECIS and its operational status are summarized. The mode of operation for ECIS is compared to the previous error correction procedures. It is shown how the user community can have access to error information much more rapidly when using ECIS. Flow charts and time tables characterize the convenience and time saving features of ECIS.

  1. Assigning error to an M2 measurement

    Science.gov (United States)

    Ross, T. Sean

    2006-02-01

    The ISO 11146:1999 standard has been published for 6 years and sets forth the proper way to measure the M2 parameter. In spite of the strong experimental guidance given by this standard and the many commercial devices based upon ISO 11146, it is still the custom to quote M2 measurements without any reference to significant figures or error estimation. To the author's knowledge, no commercial M2 measurement device includes error estimation. There exists, perhaps, a false belief that M2 numbers are high precision and of insignificant error. This paradigm causes program managers and purchasers to over-specify a beam quality parameter and researchers not to question the accuracy and precision of their M2 measurements. This paper will examine the experimental sources of error in an M2 measurement, including discretization error, CCD noise, discrete filter sets, noise equivalent aperture estimation, laser fluctuation and curve fitting error. These sources of error will be explained in their experimental context, and convenient formulas will be given to properly estimate the error in a given M2 measurement. This work results from the author's inability to find error estimation and disclosure of methods in commercial beam quality measurement devices, and from the lessons learned and concepts developed while building an ISO 11146-compliant, computer-automated M2 measurement device.

  2. Definitions of Health Terms: Fitness

    Science.gov (United States)

    Definitions of Health Terms: Fitness (MedlinePlus glossary): https://medlineplus.gov/definitions/fitnessdefinitions.html

  3. Definitions of Health Terms: Minerals

    Science.gov (United States)

    Definitions of Health Terms: Minerals (MedlinePlus glossary): https://medlineplus.gov/definitions/mineralsdefinitions.html

  4. Oil and Gas Exploration Planning using VOI Technique

    Science.gov (United States)

    Peskova, D. N.; Sizykh, A. V.; Rukavishnikov, V. S.

    2016-03-01

    This paper deals with the practical problem of making decisions during field development. The main aim was to apply the value-of-information (VOI) method to estimate the necessity of field exploration works and to show the effectiveness of this method. The object of analysis is field X, located in Eastern Siberia. The reservoir is the B13 formation of Vendian age. The field has a complex structure and is divided into blocks by faults. During evaluation of the project, the main uncertainties and oil in place were obtained for three blocks of the field. Based on the uncertainty analysis, drilling a new exploration well was suggested, and the value-of-information method was applied to estimate the results of these exploration works. The economic evaluation of the value of information was made by choosing the optimal development strategy. According to the results obtained, drilling exploration wells in blocks 1 and 3 of field X is a good decision, while drilling a well in the second block is risky and not recommended. Using the value of information, optimal well locations were also advised: well l_le for the first block and well 33 for the third block.

  5. Finance d'entreprise:voix nouvelles et nouvelles voies

    OpenAIRE

    Hélène Rainelli-Le Montagner

    2008-01-01

    (French version) This article examines the implications for corporate finance research of the recent renewal of theoretical conceptions of how financial markets work. By distinguishing the approaches derived from behavioral finance from those advanced by conventionalist theorists or by proponents of the social studies of finance or of economic sociology, the work presented here identifies a number of perspectives for future resear...

  6. Maven The Definitive Guide

    CERN Document Server

    Company, Sonatype

    2009-01-01

    Written by Maven creator Jason Van Zyl and his team at Sonatype, Maven: The Definitive Guide clearly explains how this popular tool can bring order to your software development projects. The first part of the book demonstrates Maven's capabilities through the development of several sample applications from ideation to deployment, and the second part offers a complete reference guide. Concise and to the point, this is the only guide you need to manage your project.

  7. The Logic of Definition

    Science.gov (United States)

    2009-05-01

    Wittgenstein, in his Philosophical Investigations (1953), observed that, for many phenomena, there are no necessary conditions common to all members of... (Wittgenstein 1953: §66, p. 27e). Wittgenstein refers to these overlapping similarities as “family resemblances” (ibid., §67, p. 27e). Few, if any, of... (that is, we may resort to stipulative definition). But this is not necessary for the concept to be usable. Indeed, as Wittgenstein says, sometimes

  8. XMPP The Definitive Guide

    CERN Document Server

    Saint-Andre, Peter; Smith, Kevin

    2009-01-01

    This practical book provides everything you need to know about the Extensible Messaging and Presence Protocol (XMPP) -- the open technology for real-time communication used in instant messaging, Voice over IP, real-time collaboration, social networking, microblogging, lightweight middleware, cloud computing, and more. XMPP: The Definitive Guide walks you through the thought processes and design decisions involved in building a complete XMPP-enabled application, and adding real-time interfaces to existing applications.

  9. Definition Study PHARUS

    Science.gov (United States)

    1991-11-01

    [From the Dutch summary:] ...of the definition study were used in the preliminary design of the PHARUS system. ...University of Technology, Laboratory for Telecommunication and Remote Sensing Technology. FEL-TNO had the lead and was responsible for the project management. ...Aerospace Programs in Delft was responsible for the program management. The definition study PHARUS was started in the first half of 1988 and ended

  10. Controversy around the definition of waste

    CSIR Research Space (South Africa)

    Oelofse, Suzanna HH

    2009-11-20

    Full Text Available This paper presents information concerning the definition of waste, discussing the importance of a clear definition, the ongoing debates, the broad definition of waste, problems with the broad definition, its interpretation, and the current waste management model...

  11. A New Definition of Virus

    Institute of Scientific and Technical Information of China (English)

    HE Hongjun; CAO Sihua; LUO Li; FENG Tao; PAN Li; ZOU Zhiji

    2006-01-01

    Security experts have not formally defined the distinction between viruses and normal programs. The paper takes the user's intention as the criterion for malice, gives a formal definition of viruses that aim at stealing or destroying files, and proposes an algorithm to detect viruses correctly. Compared with traditional definitions, this new definition is easy to understand, covers more malware, adapts to the development of virus technology, and defines viruses on the spot. The paper also analyzes more than 250 real viruses and finds that they all fall within the domain of the new definition, which implies that the new definition has great practical significance.

  12. Application of an Error Statistics Estimation Method to the PSAS Forecast Error Covariance Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    In atmospheric data assimilation systems, the forecast error covariance model is an important component. However, the parameters required by a forecast error covariance model are difficult to obtain due to the absence of the truth. This study applies an error statistics estimation method to the Physical-space Statistical Analysis System (PSAS) height-wind forecast error covariance model. This method consists of two components: the first component computes the error statistics by using the National Meteorological Center (NMC) method, which is a lagged-forecast difference approach, within the framework of the PSAS height-wind forecast error covariance model; the second obtains a calibration formula to rescale the error standard deviations provided by the NMC method. The calibration is against the error statistics estimated by using a maximum-likelihood estimation (MLE) with rawindsonde height observed-minus-forecast residuals. A complete set of formulas for estimating the error statistics and for the calibration is applied to a one-month-long dataset generated by a general circulation model of the Global Model and Assimilation Office (GMAO), NASA. There is a clear constant relationship between the error statistics estimates of the NMC-method and MLE. The final product provides a full set of 6-hour error statistics required by the PSAS height-wind forecast error covariance model over the globe. The features of these error statistics are examined and discussed.

  13. Influenza infection rates, measurement errors and the interpretation of paired serology.

    Directory of Open Access Journals (Sweden)

    Simon Cauchemez

    Full Text Available Serological studies are the gold standard method to estimate influenza infection attack rates (ARs) in human populations. In a common protocol, blood samples are collected before and after the epidemic in a cohort of individuals, and a rise in haemagglutination-inhibition (HI) antibody titers during the epidemic is considered a marker of infection. Because of inherent measurement errors, a 2-fold rise is usually considered insufficient evidence for infection, and seroconversion is therefore typically defined as a 4-fold rise or more. Here, we revisit this widely accepted 70-year-old criterion. We develop a Markov chain Monte Carlo data augmentation model to quantify measurement errors and reconstruct the distribution of latent true serological status in a Vietnamese 3-year serological cohort, in which replicate measurements were available. We estimate that the 1-sided probability of a 2-fold error is 9.3% (95% Credible Interval, CI: 3.3%, 17.6%) when the antibody titer is below 10 but is 20.2% (95% CI: 15.9%, 24.0%) otherwise. After correction for measurement errors, we find that the proportion of individuals with 2-fold rises in antibody titers was too large to be explained by measurement errors alone. Estimates of ARs vary greatly depending on whether those individuals are included in the definition of the infected population. A simulation study shows that our method is unbiased. The 4-fold rise case definition is relevant when aiming at a specific diagnostic for individual cases, but the justification is less obvious when the objective is to estimate ARs. In particular, it may lead to large underestimates of ARs. Determining which biological phenomenon contributes most to 2-fold rises in antibody titers is essential to assess bias with the traditional case definition and to offer improved estimates of influenza ARs.

  14. Arabic Spelling: Errors, Perceptions, and Strategies

    Science.gov (United States)

    Brosh, Hezi

    2015-01-01

    This study investigated common spelling errors among first language English speakers who study Arabic at the college level. A sample of 63 students (45 males and 18 females) was asked to write texts about a variety of topics and then to answer survey questions regarding their perceptions and strategies. Their writing produced 457 spelling errors,…

  15. Analysis of Pronominal Errors: A Case Study.

    Science.gov (United States)

    Oshima-Takane, Yuriko

    1992-01-01

    Reports on a study of a normally developing boy who made pronominal errors for about 10 months. Comprehension and production data clearly indicate that the child persistently made pronominal errors because of semantic confusion in the use of first- and second-person pronouns. (28 references) (GLR)

  16. Measurement error analysis of taxi meter

    Science.gov (United States)

    He, Hong; Li, Dan; Li, Hang; Zhang, Da-Jian; Hou, Ming-Feng; Zhang, Shi-pu

    2011-12-01

    The error test of the taximeter is divided into two aspects: (1) a test of the time error of the taximeter and (2) a test of the distance (usage) error of the machine. The paper first gives the working principle of the meter and the principle of the error verification device. Based on JJG 517-2009, "Taximeter Verification Regulation", the paper focuses on analyzing the machine error and the test error of the taxi meter, and the detection methods for time error and distance error are discussed as well. Under the same conditions, standard uncertainty components (Class A) are evaluated, while under different conditions, standard uncertainty components (Class B) are also evaluated and measured repeatedly. Comparison and analysis of the results show that the meter conforms to JJG 517-2009, "Taximeter Verification Regulation", which greatly improves accuracy and efficiency. In practice, the meter not only makes up for the lack of accuracy but also ensures that deals between drivers and passengers are fair, enriching the value of the taxi as a mode of transportation.

  17. Sources of Error in Satellite Navigation Positioning

    Directory of Open Access Journals (Sweden)

    Jacek Januszewski

    2017-09-01

    Full Text Available Uninterrupted information about the user's position can generally be obtained from a satellite navigation system (SNS). At the time of this writing (January 2017), two global SNSs, GPS and GLONASS, are fully operational, and two more global systems, Galileo and BeiDou, are under construction. In each SNS the accuracy of the user's position is affected by three main factors: the accuracy of each satellite position, the accuracy of the pseudorange measurement, and the satellite geometry. The user's position error is a function of both the pseudorange error, called UERE (User Equivalent Range Error), and the user/satellite geometry, expressed by the appropriate Dilution Of Precision (DOP) coefficient. This error is decomposed into two types of errors: the signal-in-space ranging error, called URE (User Range Error), and the user equipment error, UEE. Detailed analyses of URE, UEE, UERE and DOP coefficients, and of the changes of DOP coefficients on different days, are presented in this paper.

  18. Compounding errors in 2 dogs receiving anticonvulsants.

    Science.gov (United States)

    McConkey, Sandra E; Walker, Susan; Adams, Cathy

    2012-04-01

    Two cases that involve drug compounding errors are described. One dog exhibited increased seizure activity due to a compounded, flavored phenobarbital solution that deteriorated before the expiration date provided by the compounder. The other dog developed clinical signs of hyperkalemia and bromine toxicity following a 5-fold compounding error in the concentration of potassium bromide (KBr).

  19. Optimal correction of independent and correlated errors

    OpenAIRE

    Jacobsen, Sol H.; Mintert, Florian

    2013-01-01

    We identify optimal quantum error correction codes for situations that do not admit perfect correction. We provide analytic n-qubit results for standard cases with correlated errors on multiple qubits and demonstrate significant improvements to the fidelity bounds and optimal entanglement decay profiles.

  20. Temperature error in digital bathythermograph data

    Digital Repository Service at National Institute of Oceanography (India)

    Pankajakshan, T.; Reddy, G.V.; Ratnakaran, L.; Sarupria, J.S.; RameshBabu, V.

    depth; however, there is no apparent or measurable systematic dependence of the error on depth. Considering the given temperature accuracy of 0.05 degrees C, the observed DBT error, varying from -0.3 to -1 degrees C, is significant, and such offsets should...

  1. Cardiac manifestations of inborn errors of metabolism.

    NARCIS (Netherlands)

    Evangeliou, A.; Papadopoulou-Legbelou, K.; Daphnis, E.; Ganotakis, E.; Vavouranakis, I.; Michailidou, H.; Hitoglou-Makedou, A.; Nicolaidou, P.; Wevers, R.A.; Varlamis, G.

    2007-01-01

    AIM: The aim of the study was to investigate the frequency and type of cardiac manifestations in a defined group of patients with inborn errors of metabolism. This paper also explores the key role of cardiac manifestations in the diagnosis of inborn errors of metabolism in daily practice. METHODS: O

  2. The Impact of Error-Management Climate, Error Type and Error Originator on Auditors’ Reporting Errors Discovered on Audit Work Papers

    NARCIS (Netherlands)

    A.H. Gold-Nöteberg (Anna); U. Gronewold (Ulfert); S. Salterio (Steve)

    2010-01-01

    textabstractWe examine factors affecting the auditor’s willingness to report their own or their peers’ self-discovered errors in working papers subsequent to detailed working paper review. Prior research has shown that errors in working papers are detected in the review process; however, such

  3. Sudden Possibilities: Porpoises, Eggcorns, and Error

    Science.gov (United States)

    Crovitz, Darren

    2011-01-01

    This article discusses how amusing mistakes can make for serious language instruction. The notion that close analysis of language errors can yield insight into how one thinks and learns seems fundamentally obvious. Yet until relatively recently, language errors were primarily treated as indicators of learner deficiency rather than opportunities to…

  5. A generalization error estimate for nonlinear systems

    DEFF Research Database (Denmark)

    Larsen, Jan

    1992-01-01

    models of linear and simple neural network systems. Within the linear system GEN is compared to the final prediction error criterion and the leave-one-out cross-validation technique. It was found that the GEN estimate of the true generalization error is less biased on the average. It is concluded...

  6. A Hybrid Approach for Correcting Grammatical Errors

    Science.gov (United States)

    Lee, Kiyoung; Kwon, Oh-Woog; Kim, Young-Kil; Lee, Yunkeun

    2015-01-01

    This paper presents a hybrid approach for correcting grammatical errors in the sentences uttered by Korean learners of English. The error correction system plays an important role in GenieTutor, which is a dialogue-based English learning system designed to teach English to Korean students. During the talk with GenieTutor, grammatical error…

  7. Error tracking in a clinical biochemistry laboratory

    DEFF Research Database (Denmark)

    Szecsi, Pal Bela; Ødum, Lars

    2009-01-01

    BACKGROUND: We report our results for the systematic recording of all errors in a standard clinical laboratory over a 1-year period. METHODS: Recording was performed using a commercial database program. All individuals in the laboratory were allowed to report errors. The testing processes were cl...

  8. Identifying systematic DFT errors in catalytic reactions

    DEFF Research Database (Denmark)

    Christensen, Rune; Hansen, Heine Anton; Vegge, Tejs

    2015-01-01

    Using CO2 reduction reactions as examples, we present a widely applicable method for identifying the main source of errors in density functional theory (DFT) calculations. The method has broad applications for error correction in DFT calculations in general, as it relies on the dependence...

  9. Quantum Error Correction Beyond Completely Positive Maps

    OpenAIRE

    Shabani, A.; Lidar, D. A.

    2006-01-01

    By introducing an operator sum representation for arbitrary linear maps, we develop a generalized theory of quantum error correction (QEC) that applies to any linear map, in particular maps that are not completely positive (CP). This theory of "linear quantum error correction" is applicable in cases where the standard and restrictive assumption of a factorized initial system-bath state does not apply.

  10. 5 CFR 1604.6 - Error correction.

    Science.gov (United States)

    2010-01-01

    5 CFR 1604.6 (Administrative Personnel; Federal Retirement Thrift Investment Board; Uniformed Services Accounts), Error correction. (a) General rule. A service member's employing agency must correct the service member's...

  11. Textbook Errors & Misconceptions in Biology: Cell Metabolism.

    Science.gov (United States)

    Storey, Richard D.

    1991-01-01

    The idea that errors and misconceptions in biology textbooks are often slow to be discovered and corrected is discussed. Selected errors, misconceptions, and topics of confusion about cell metabolism are described. Fermentation, respiration, Krebs cycle, pentose phosphate pathway, uniformity of catabolism, and metabolic pathways as models are…

  12. Real-time Texture Error Detection

    Directory of Open Access Journals (Sweden)

    Dan Laurentiu Lacrama

    2008-01-01

    Full Text Available This paper advocates an improved solution for the real-time detection of texture errors that occur in the production process in the textile industry. The research is focused on mono-color products with a 3D texture model (Jacquard fabrics), which is a more difficult task than, for example, 2D multicolor textures.

  14. Error Analysis: Past, Present, and Future

    Science.gov (United States)

    McCloskey, George

    2017-01-01

    This commentary will take an historical perspective on the Kaufman Test of Educational Achievement (KTEA) error analysis, discussing where it started, where it is today, and where it may be headed in the future. In addition, the commentary will compare and contrast the KTEA error analysis procedures that are rooted in psychometric methodology and…

  15. Hypercorrection of High Confidence Errors in Children

    Science.gov (United States)

    Metcalfe, Janet; Finn, Bridgid

    2012-01-01

    Three experiments investigated whether the hypercorrection effect--the finding that errors committed with high confidence are easier, rather than more difficult, to correct than are errors committed with low confidence--occurs in grade school children as it does in young adults. All three experiments showed that Grade 3-6 children hypercorrected…

  17. Judicial error by groups and individuals

    NARCIS (Netherlands)

    F. van Dijk; J. Sonnemans; E. Bauw

    2014-01-01

    In criminal cases judges evaluate and combine probabilistic evidence to reach verdicts. Unavoidably, errors are made, resulting in unwarranted conviction or acquittal of defendants. This paper addresses the questions (1) whether hearing cases by teams of three persons leads to less error than hearing…

  18. Judicial error by groups and individuals

    NARCIS (Netherlands)

    F. van Dijk; J.H. Sonnemans; E. Bauw

    2012-01-01

    In criminal cases judges evaluate and combine probabilistic evidence to reach verdicts. Unavoidably, errors are made, resulting in unwarranted conviction or acquittal of defendants. This paper addresses the questions (1) whether hearing cases by teams of three persons leads to less error than hearing…

  19. Judicial error by groups and individuals

    NARCIS (Netherlands)

    F. van Dijk; J. Sonnemans; E. Bauw

    2013-01-01

    In criminal cases judges evaluate and combine probabilistic evidence to reach verdicts. Unavoidably, errors are made, resulting in unwarranted conviction or acquittal of defendants. This paper addresses the questions (1) whether hearing cases by teams of three persons leads to less error than hearing…

  20. Judicial error by groups and individuals

    NARCIS (Netherlands)

    Bauw, Eddy; van Dijk, F.; Sonnemans, J.

    2014-01-01

    In criminal cases judges evaluate and combine probabilistic evidence to reach verdicts. Unavoidably, errors are made, resulting in unwarranted conviction or acquittal of defendants. This paper addresses the questions (1) whether hearing cases by teams of three persons leads to less error than hearing…

  2. A New Positive Definite Expanded Mixed Finite Element Method for Parabolic Integrodifferential Equations

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2012-01-01

    Full Text Available A new positive definite expanded mixed finite element method is proposed for parabolic partial integrodifferential equations. Compared to expanded mixed scheme, the new expanded mixed element system is symmetric positive definite and both the gradient equation and the flux equation are separated from its scalar unknown equation. The existence and uniqueness for semidiscrete scheme are proved and error estimates are derived for both semidiscrete and fully discrete schemes. Finally, some numerical results are provided to confirm our theoretical analysis.

  3. The Ryu-Takayanagi Formula from Quantum Error Correction

    CERN Document Server

    Harlow, Daniel

    2016-01-01

    I argue that a version of the quantum-corrected Ryu-Takayanagi formula holds in any quantum error-correcting code. I present this result as a series of theorems of increasing generality, with the final statement expressed in the language of operator-algebra quantum error correction. In AdS/CFT this gives a "purely boundary" interpretation of the formula. I also extend a recent theorem, which established entanglement-wedge reconstruction in AdS/CFT, when interpreted as a subsystem code, to the more general, and I argue more physical, case of subalgebra codes. For completeness, I include a self-contained presentation of the theory of von Neumann algebras on finite-dimensional Hilbert spaces, as well as the algebraic definition of entropy. The results confirm a close relationship between bulk gauge transformations, edge-modes/soft-hair on black holes, and the Ryu-Takayanagi formula. They also suggest a new perspective on the homology constraint, which basically is to get rid of it in a way that preserves the validity of the formula, but which removes any tension with the linearity of quantum mechanics. Moreover, they suggest a boundary interpretation of the "bit threads" recently introduced by Freedman and Headrick.

  4. Transition State Theory: Variational Formulation, Dynamical Corrections, and Error Estimates

    Science.gov (United States)

    vanden-Eijnden, Eric

    2009-03-01

    Transition state theory (TST) is discussed from an original viewpoint: it is shown how to compute exactly the mean frequency of transition between two predefined sets which either partition phase space (as in TST) or are taken to be well separated metastable sets corresponding to long-lived conformation states (as necessary to obtain the actual transition rate constants between these states). Exact and approximate criteria for the optimal TST dividing surface with minimum recrossing rate are derived. Some issues about the definition and meaning of the free energy in the context of TST are also discussed. Finally, precise error estimates for the numerical procedure to evaluate the transmission coefficient κS of the TST dividing surface are given, and it is shown that the relative error on κS scales as 1/√κS when κS is small. This implies that dynamical corrections to the TST rate constant can be computed efficiently if and only if the TST dividing surface has a transmission coefficient κS which is not too small. In particular the TST dividing surface must be optimized upon (for otherwise κS is generally very small), but this may not be sufficient to make the procedure numerically efficient (because the optimal dividing surface has maximum κS, but this coefficient may still be very small).
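    The 1/√κS scaling of the relative error quoted in this abstract follows from simple binomial counting statistics; the toy Monte Carlo below (an illustration, not the authors' procedure) makes the scaling concrete by treating each trajectory as a Bernoulli trial with success probability κ.

```python
import math
import random

def estimate_kappa(kappa_true, n_samples, rng):
    """Estimate a small transmission coefficient by counting 'reactive' trials."""
    hits = sum(1 for _ in range(n_samples) if rng.random() < kappa_true)
    return hits / n_samples

rng = random.Random(0)
n = 10_000
for kappa in (0.5, 0.05, 0.005):
    estimates = [estimate_kappa(kappa, n, rng) for _ in range(100)]
    mean = sum(estimates) / len(estimates)
    var = sum((e - mean) ** 2 for e in estimates) / (len(estimates) - 1)
    rel_err = math.sqrt(var) / kappa
    # Binomial sampling predicts rel_err ≈ sqrt((1 - kappa) / (n * kappa)),
    # i.e. roughly 1/sqrt(n * kappa) when kappa is small.
    predicted = math.sqrt((1 - kappa) / (n * kappa))
    print(f"kappa={kappa:<6} measured rel. err. {rel_err:.3f}  predicted {predicted:.3f}")
```

    The smaller κ is, the more trajectories are needed for a given relative accuracy, which is the abstract's efficiency argument.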

  5. The Ryu-Takayanagi Formula from Quantum Error Correction

    Science.gov (United States)

    Harlow, Daniel

    2017-09-01

    I argue that a version of the quantum-corrected Ryu-Takayanagi formula holds in any quantum error-correcting code. I present this result as a series of theorems of increasing generality, with the final statement expressed in the language of operator-algebra quantum error correction. In AdS/CFT this gives a "purely boundary" interpretation of the formula. I also extend a recent theorem, which established entanglement-wedge reconstruction in AdS/CFT, when interpreted as a subsystem code, to the more general, and I argue more physical, case of subalgebra codes. For completeness, I include a self-contained presentation of the theory of von Neumann algebras on finite-dimensional Hilbert spaces, as well as the algebraic definition of entropy. The results confirm a close relationship between bulk gauge transformations, edge-modes/soft-hair on black holes, and the Ryu-Takayanagi formula. They also suggest a new perspective on the homology constraint, which basically is to get rid of it in a way that preserves the validity of the formula, but which removes any tension with the linearity of quantum mechanics. Moreover, they suggest a boundary interpretation of the "bit threads" recently introduced by Freedman and Headrick.

  6. Monitoring and reporting of preanalytical errors in laboratory medicine: the UK situation.

    Science.gov (United States)

    Cornes, Michael P; Atherton, Jennifer; Pourmahram, Ghazaleh; Borthwick, Hazel; Kyle, Betty; West, Jamie; Costelloe, Seán J

    2016-03-01

    Most errors in the clinical laboratory occur in the preanalytical phase. This study aimed to comprehensively describe the prevalence and nature of preanalytical quality monitoring practices in UK clinical laboratories. A survey was sent on behalf of the Association for Clinical Biochemistry and Laboratory Medicine Preanalytical Working Group (ACB-WG-PA) to all heads of department of clinical laboratories in the UK. The survey captured data on the analytical platform and Laboratory Information Management System in use; which preanalytical errors were recorded and how they were classified and gauged interest in an external quality assurance scheme for preanalytical errors. Of the 157 laboratories asked to participate, responses were received from 104 (66.2%). Laboratory error rates were recorded per number of specimens, rather than per number of requests in 51% of respondents. Aside from serum indices for haemolysis, icterus and lipaemia, which were measured in 80% of laboratories, the most common errors recorded were booking-in errors (70.1%) and sample mislabelling (56.9%) in laboratories who record preanalytical errors. Of the laboratories surveyed, 95.9% expressed an interest in guidance on recording preanalytical error and 91.8% expressed interest in an external quality assurance scheme. This survey observes a wide variation in the definition, repertoire and collection methods for preanalytical errors in the UK. Data indicate there is a lot of interest in improving preanalytical data collection. The ACB-WG-PA aims to produce guidance and support for laboratories to standardize preanalytical data collection and to help establish and validate an external quality assurance scheme for interlaboratory comparison. © The Author(s) 2015.

  7. Quantifying and handling errors in instrumental measurements using the measurement error theory

    DEFF Research Database (Denmark)

    Andersen, Charlotte Møller; Bro, R.; Brockhoff, P.B.

    2003-01-01

    Measurement error modelling is used for investigating the influence of measurement/sampling error on univariate predictions of water content and water-holding capacity (reference measurements) from nuclear magnetic resonance (NMR) relaxations (instrumental) measured on two gadoid fish species. This is a new way of using the measurement error theory. Reliability ratios illustrate that the models for the two fish species are influenced differently by the error. However, the error seems to influence the predictions of the two reference measures in the same way. The effect of using replicated x-measurements is illustrated by simulated data and by NMR relaxations measured several times on each fish. The standard error of the physical determination of the reference values is lower than the standard error of the NMR measurements. In this case, lower prediction error is obtained by replicating the instrumental…
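    The benefit of replicated x-measurements noted in this abstract reduces, under an i.i.d.-noise assumption, to the familiar σ/√n law for the mean of the replicates; a minimal sketch with hypothetical numbers (not the study's data):

```python
import random
import statistics

def replicate_mean(true_value, noise_sd, n_replicates, rng):
    """Average n noisy instrument readings of the same specimen."""
    return statistics.mean(true_value + rng.gauss(0, noise_sd)
                           for _ in range(n_replicates))

rng = random.Random(42)
true_value, noise_sd = 75.0, 4.0   # e.g. water content (%) and instrument noise, hypothetical
for n in (1, 4, 16):
    errors = [replicate_mean(true_value, noise_sd, n, rng) - true_value
              for _ in range(2000)]
    sd = statistics.stdev(errors)
    # Theory: standard deviation of the mean = noise_sd / sqrt(n)
    print(f"n={n:<3} empirical sd {sd:.2f}  theoretical {noise_sd / n ** 0.5:.2f}")
```

    Averaging 16 replicates cuts the instrumental error contribution by a factor of four, which is why replication lowers the prediction error when instrument noise dominates.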

  8. Impacts of motivational valence on the error-related negativity elicited by full and partial errors.

    Science.gov (United States)

    Maruo, Yuya; Schacht, Annekathrin; Sommer, Werner; Masaki, Hiroaki

    2016-02-01

    Affect and motivation influence the error-related negativity (ERN) elicited by full errors; however, it is unknown whether they also influence ERNs to correct responses accompanied by covert incorrect response activation (partial errors). Here we compared a neutral condition with conditions, where correct responses were rewarded or where incorrect responses were punished with gains and losses of small amounts of money, respectively. Data analysis distinguished ERNs elicited by full and partial errors. In the reward and punishment conditions, ERN amplitudes to both full and partial errors were larger than in the neutral condition, confirming participants' sensitivity to the significance of errors. We also investigated the relationships between ERN amplitudes and the behavioral inhibition and activation systems (BIS/BAS). Regardless of reward/punishment condition, participants scoring higher on BAS showed smaller ERN amplitudes in full error trials. These findings provide further evidence that the ERN is related to motivational valence and that similar relationships hold for both full and partial errors.

  9. Quantifying truncation errors in effective field theory

    CERN Document Server

    Furnstahl, R J; Phillips, D R; Wesolowski, S

    2015-01-01

    Bayesian procedures designed to quantify truncation errors in perturbative calculations of quantum chromodynamics observables are adapted to expansions in effective field theory (EFT). In the Bayesian approach, such truncation errors are derived from degree-of-belief (DOB) intervals for EFT predictions. Computation of these intervals requires specification of prior probability distributions ("priors") for the expansion coefficients. By encoding expectations about the naturalness of these coefficients, this framework provides a statistical interpretation of the standard EFT procedure where truncation errors are estimated using the order-by-order convergence of the expansion. It also permits exploration of the ways in which such error bars are, and are not, sensitive to assumptions about EFT-coefficient naturalness. We first demonstrate the calculation of Bayesian probability distributions for the EFT truncation error in some representative examples, and then focus on the application of chiral EFT to neutron-pr...
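    The "standard EFT procedure" this abstract refers to, in which the truncation error is estimated from the order-by-order convergence of the expansion, can be caricatured as taking the first omitted term with a natural-sized coefficient. A hedged sketch (all coefficient values and the expansion parameter are hypothetical, and this is the naive estimate, not the paper's Bayesian DOB intervals):

```python
def eft_truncation_estimate(coefficients, Q, k):
    """Naive order-by-order truncation error for an EFT expansion
    X = X_ref * sum_i c_i Q^i truncated at order k: the first omitted
    term, with its unknown coefficient replaced by the largest
    coefficient seen so far (a 'naturalness' assumption)."""
    c_max = max(abs(c) for c in coefficients[: k + 1])
    return c_max * Q ** (k + 1)

# Hypothetical dimensionless coefficients and expansion parameter
coeffs = [1.0, -0.6, 1.3, 0.4]
Q = 0.33
for k in range(len(coeffs)):
    print(f"order {k}: estimated truncation error "
          f"{eft_truncation_estimate(coeffs, Q, k):.4f}")
```

    The Bayesian framework in the paper replaces the hard max with a prior over coefficient sizes, turning this point estimate into a degree-of-belief interval.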

  10. FUZZY ECCENTRICITY AND GROSS ERROR IDENTIFICATION

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The dominant and recessive effects of exceptional interference in a measurement system are analyzed on the basis of its response characteristics, and a gross-error model of fuzzy clustering based on fuzzy relations and the fuzzy equipollence relation is built. The concept and calculation formula of fuzzy eccentricity are defined to derive the evaluation rule and function for gross errors; on this basis, a fuzzy clustering method for separating and discriminating gross errors is established. Applied to a dynamic circular division measurement system, the method can identify and eliminate gross errors in measured data and reduce the dispersion of the measured data. Experimental results indicate that using the method and model improves the repeatability of the system by 80% over the previous system, to 3.5 s, with an angle measurement error of less than 7 s.
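    The paper's fuzzy-clustering criterion is not reproduced in this record; as a simplified stand-in, gross-error screening can be sketched with a robust median/MAD outlier test (the 3.5 cutoff and the readings below are hypothetical, not the paper's fuzzy eccentricity):

```python
import statistics

def flag_gross_errors(values, threshold=3.5):
    """Flag gross errors with a robust z-score (median/MAD), a simplified
    stand-in for a fuzzy-clustering gross-error criterion."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return [False] * len(values)
    # 0.6745 rescales the MAD to the standard deviation of a normal distribution
    return [abs(0.6745 * (v - med) / mad) > threshold for v in values]

readings = [10.01, 10.03, 9.98, 10.02, 13.7, 10.00]  # one gross error (hypothetical)
print(flag_gross_errors(readings))  # → [False, False, False, False, True, False]
```

    Using the median and MAD rather than the mean and standard deviation keeps the screen itself from being distorted by the very gross errors it is meant to find.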

  11. A Comparative Study on Error Analysis

    DEFF Research Database (Denmark)

    Wu, Xiaoli; Zhang, Chun

    2015-01-01

    Title: A Comparative Study on Error Analysis. Subtitle: Belgian (L1) and Danish (L1) learners' use of Chinese (L2) comparative sentences in written production. Xiaoli Wu, Chun Zhang. Abstract: Making errors is an inevitable and necessary part of learning. The collection, classification and analysis … the occurrence of errors either in linguistic or pedagogical terms. The purpose of the current study is to demonstrate the theoretical and practical relevance of the error analysis approach in CFL by investigating two cases: (1) Belgian (L1) learners' use of Chinese (L2) comparative sentences in written production … Finally, pedagogical implications for CFL are discussed and future research is suggested. Keywords: error analysis, comparative sentences, comparative structure 'bǐ 比', Chinese as a foreign language (CFL), written production …

  12. Adaptive Error Resilience for Video Streaming

    Directory of Open Access Journals (Sweden)

    Lakshmi R. Siruvuri

    2009-01-01

    Full Text Available Compressed video sequences are vulnerable to channel errors, to the extent that minor errors and/or small losses can result in substantial degradation. Thus, protecting compressed data against channel errors is imperative. The use of channel coding schemes can be effective in reducing the impact of channel errors, although this requires extra parity bits to be transmitted, thus utilizing more bandwidth. However, this can be ameliorated if the transmitter can tailor the parity data rate based on its knowledge of current channel conditions. This can be achieved via feedback from the receiver to the transmitter. This paper describes a channel emulation system comprising a server/proxy/client combination that utilizes feedback from the client to adapt the number of Reed-Solomon parity symbols used to protect compressed video sequences against channel errors.
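    The feedback loop described in this abstract, in which the client's loss reports drive the number of Reed-Solomon parity symbols, can be sketched as follows; the thresholds and the doubling/halving policy are assumptions for illustration, not the paper's exact controller:

```python
def adapt_parity(current_parity, observed_loss_rate,
                 min_parity=2, max_parity=32):
    """Adjust Reed-Solomon parity symbols per block from client feedback.
    Thresholds are hypothetical; an RS(n, k) code with p = n - k parity
    symbols can correct up to p erasures per block."""
    if observed_loss_rate > 0.05:      # heavy loss: add protection
        current_parity = min(max_parity, current_parity * 2)
    elif observed_loss_rate < 0.01:    # clean channel: reclaim bandwidth
        current_parity = max(min_parity, current_parity // 2)
    return current_parity

parity = 8
for loss in (0.00, 0.12, 0.12, 0.03, 0.002):
    parity = adapt_parity(parity, loss)
    print(f"loss={loss:.3f} -> parity symbols {parity}")
```

    Multiplicative increase with a floor and ceiling reacts quickly to bursts of loss while gradually reclaiming bandwidth on a clean channel.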

  13. Error Resilient Video Compression Using Behavior Models

    Directory of Open Access Journals (Sweden)

    Jacco R. Taal

    2004-03-01

    Full Text Available Wireless and Internet video applications are inherently subject to bit errors and packet errors, respectively. This is especially so if constraints on the end-to-end compression and transmission latencies are imposed. Therefore, it is necessary to develop methods to optimize the video compression parameters and the rate allocation of these applications that take into account residual channel bit errors. In this paper, we study the behavior of a predictive (interframe) video encoder and model the encoder's behavior using only the statistics of the original input data and of the underlying channel prone to bit errors. The resulting data-driven behavior models are then used to carry out group-of-pictures partitioning and to control the rate of the video encoder in such a way that the overall quality of the decoded video with compression and channel errors is optimized.

  14. Medication Error, What Is the Reason?

    Directory of Open Access Journals (Sweden)

    Ali Banaozar Mohammadi

    2015-09-01

    Full Text Available Background: Medication errors, for various reasons, may alter the outcome of all patients, especially patients with drug poisoning. We introduce one of the most common types of medication error in the present article. Case: A 48-year-old woman with suspected organophosphate poisoning died due to a lethal medication error. Unfortunately, these types of errors are not rare and have some preventable causes, including a lack of suitable and sufficient training and practice for medical students and some failures in the medical students' educational curriculum. Conclusion: Some important causes are discussed here because their consequences can be tremendous. We found that most of them are easily preventable. If clinicians are aware of the method of use, complications, dosage and contraindications of drugs, most of these fatal errors can be minimized.

  15. Quantum Error Correction in the Zeno Regime

    CERN Document Server

    Erez, N; Reznik, B; Vaidman, L; Erez, Noam; Aharonov, Yakir; Reznik, Benni; Vaidman, Lev

    2003-01-01

    In order to reduce errors, error correction codes (ECCs) need to be implemented fast. They can correct the errors corresponding to the first few orders in the Taylor expansion of the Hamiltonian of the interaction with the environment. If implemented fast enough, the zeroth order error predominates and the dominant effect is of error prevention by measurement (Zeno Effect) rather than correction. In this ``Zeno Regime'', codes with less redundancy are sufficient for protection. We describe such a simple scheme, which uses two ``noiseless'' qubits to protect a large number, $n$, of information qubits from noise from the environment. The ``noiseless qubits'' can be realized by treating them as logical qubits to be encoded by one of the previously introduced encoding schemes.

  16. Flux Sampling Errors for Aircraft and Towers

    Science.gov (United States)

    Mahrt, Larry

    1998-01-01

    Various errors and influences leading to differences between tower- and aircraft-measured fluxes are surveyed. This survey is motivated by reports in the literature that aircraft fluxes are sometimes smaller than tower-measured fluxes. Both tower and aircraft flux errors are larger with surface heterogeneity due to several independent effects. Surface heterogeneity may cause tower flux errors to increase with decreasing wind speed. Techniques to assess flux sampling error are reviewed. Such error estimates suffer various degrees of inapplicability in real geophysical time series due to nonstationarity of tower time series (or inhomogeneity of aircraft data). A new measure for nonstationarity is developed that eliminates assumptions on the form of the nonstationarity inherent in previous methods. When this nonstationarity measure becomes large, the surface energy imbalance increases sharply. Finally, strategies for obtaining adequate flux sampling using repeated aircraft passes and grid patterns are outlined.

  17. The District Nursing Clinical Error Reduction Programme.

    Science.gov (United States)

    McGraw, Caroline; Topping, Claire

    2011-01-01

    The District Nursing Clinical Error Reduction (DANCER) Programme was initiated in NHS Islington following an increase in the number of reported medication errors. The objectives were to reduce the actual degree of harm and the potential risk of harm associated with medication errors and to maintain the existing positive reporting culture, while robustly addressing performance issues. One hundred medication errors reported in 2007/08 were analysed using a framework that specifies the factors that predispose to adverse medication events in domiciliary care. Various contributory factors were identified and interventions were subsequently developed to address poor drug calculation and medication problem-solving skills and incorrectly transcribed medication administration record charts. Follow up data were obtained at 12 months and two years. The evaluation has shown that although medication errors do still occur, the programme has resulted in a marked shift towards a reduction in the associated actual degree of harm and the potential risk of harm.

  18. Interactions of timing and prediction error learning.

    Science.gov (United States)

    Kirkpatrick, Kimberly

    2014-01-01

    Timing and prediction error learning have historically been treated as independent processes, but growing evidence has indicated that they are not orthogonal. Timing emerges at the earliest time point when conditioned responses are observed, and temporal variables modulate prediction error learning in both simple conditioning and cue competition paradigms. In addition, prediction errors, through changes in reward magnitude or value alter timing of behavior. Thus, there appears to be a bi-directional interaction between timing and prediction error learning. Modern theories have attempted to integrate the two processes with mixed success. A neurocomputational approach to theory development is espoused, which draws on neurobiological evidence to guide and constrain computational model development. Heuristics for future model development are presented with the goal of sparking new approaches to theory development in the timing and prediction error fields.

  19. El error en el delito imprudente (Error in negligent offenses)

    Directory of Open Access Journals (Sweden)

    Miguel Angel Muñoz García

    2011-12-01

    Full Text Available The theory of error in negligent offenses is a difficult and controversial topic in criminal-law dogmatics: there are in fact very few references, and no reasonable consensus has been reached. Starting from an analysis of the dogmatic structure of the negligent offense, in which the objective duty of care stands out as the element of the offense type upon which the error bears, and from the different doctrinal positions that defend the applicability of mistake of type (error de tipo) and mistake of prohibition (error de prohibición), the viability of the latter is argued on dogmatic and criminal-policy grounds, with the breach of the objective duty of care, as a consequence of the error, remaining an issue to be analyzed at the level of culpability.

  20. Errors in Junior English Writing: Resources and Strategies

    Institute of Scientific and Technical Information of China (English)

    XU Shu-ling

    2013-01-01

    The research on common errors in junior English writing examines the categories of errors, the sources of errors, and how to deal with errors in effective ways. Errors are divided into two types: intralingual errors and interlingual errors. The research finds that Chinese junior students depend heavily on their native language in English writing, and identifies some effective strategies for avoiding errors in writing.

  1. Analyzing temozolomide medication errors: potentially fatal.

    Science.gov (United States)

    Letarte, Nathalie; Gabay, Michael P; Bressler, Linda R; Long, Katie E; Stachnik, Joan M; Villano, J Lee

    2014-10-01

    The EORTC-NCIC regimen for glioblastoma requires different dosing of temozolomide (TMZ) during radiation and maintenance therapy. This complexity is exacerbated by the availability of multiple TMZ capsule strengths. TMZ is an alkylating agent and the major toxicity of this class is dose-related myelosuppression. Inadvertent overdose can be fatal. The websites of the Institute for Safe Medication Practices (ISMP), and the Food and Drug Administration (FDA) MedWatch database were reviewed. We searched the MedWatch database for adverse events associated with TMZ and obtained all reports including hematologic toxicity submitted from 1st November 1997 to 30th May 2012. The ISMP describes errors with TMZ resulting from the positioning of information on the label of the commercial product. The strength and quantity of capsules on the label were in close proximity to each other, and this has been changed by the manufacturer. MedWatch identified 45 medication errors. Patient errors were the most common, accounting for 21 or 47% of errors, followed by dispensing errors, which accounted for 13 or 29%. Seven reports or 16% were errors in the prescribing of TMZ. Reported outcomes ranged from reversible hematological adverse events (13%), to hospitalization for other adverse events (13%) or death (18%). Four error reports lacked detail and could not be categorized. Although the FDA issued a warning in 2003 regarding fatal medication errors and the product label warns of overdosing, errors in TMZ dosing occur for various reasons and involve both healthcare professionals and patients. Overdosing errors can be fatal.

  2. Error in Monte Carlo, quasi-error in Quasi-Monte Carlo

    OpenAIRE

    Kleiss, R. H. P.; Lazopoulos, A.

    2006-01-01

    While the Quasi-Monte Carlo method of numerical integration achieves smaller integration error than standard Monte Carlo, its use in particle physics phenomenology has been hindered by the absence of a reliable way to estimate that error. The standard Monte Carlo error estimator relies on the assumption that the points are generated independently of each other and, therefore, fails to account for the error improvement advertised by the Quasi-Monte Carlo method. We advocate the construction o...
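    The smaller integration error of Quasi-Monte Carlo mentioned in this abstract is easy to observe on a one-dimensional example using a van der Corput low-discrepancy sequence (a sketch for illustration, not the error estimator the authors construct):

```python
import random

def van_der_corput(n, base=2):
    """n-th element of the base-b van der Corput low-discrepancy sequence."""
    q, denom = 0.0, 1.0
    while n:
        n, r = divmod(n, base)
        denom *= base
        q += r / denom
    return q

def integrate(points, f):
    """Sample-mean estimate of the integral of f over [0, 1]."""
    return sum(f(x) for x in points) / len(points)

f = lambda x: x * x          # ∫₀¹ x² dx = 1/3
N = 4096
rng = random.Random(1)
mc = integrate([rng.random() for _ in range(N)], f)
qmc = integrate([van_der_corput(i + 1) for i in range(N)], f)
print(f"MC error  {abs(mc - 1/3):.2e}")
print(f"QMC error {abs(qmc - 1/3):.2e}")
```

    The deterministic sequence fills the interval far more evenly than pseudo-random points, but, as the abstract notes, the usual independence-based variance estimate no longer describes its error.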

  3. The discovery and tectonic implication of ultrahigh-temperature metamorphic rocks in the Day Nui Con Voi, northwestern Vietnam

    Institute of Scientific and Technical Information of China (English)

    WU Hujun; LIU Junlai; TRAN My Dung; NGUYEN Quang Luat; PHAM Binh; WU Wenbin; CHEN Wen; ZHANG Zhaochong

    2011-01-01

    Within the khondalite series of the Day Nui Con Voi, northwestern Vietnam, there occurs a suite of Al-rich rock blocks containing the assemblage corundum + spinel + garnet + sillimanite, present as lenticular enclaves. The development of this assemblage indicates that the rocks were overprinted by ultrahigh-temperature metamorphism. The coexistence of spinel and quartz indicates metamorphic temperatures above 900℃, while the biotite-garnet thermometer and the biotite-plagioclase-garnet-quartz thermobarometer applied to the retrograde mineral assemblages yield temperature-pressure conditions of 879~917℃ and 0.90~0.94GPa. The early corundum + sillimanite assemblage shows that metamorphism progressed from upper amphibolite facies into granulite facies, and the peak assemblage spinel + quartz indicates ultrahigh-temperature metamorphism above 900℃. The development of ilmenite during retrogression indicates that the rocks experienced rapid uplift and decompression. The P-T path reveals early simultaneous heating and compression, rapid heating to peak conditions, and subsequent near-isothermal decompression, broadly consistent with the thermal anomaly produced by asthenospheric upwelling following slab break-off during plate convergence. Zircon SIMS U-Pb dating of the ultrahigh-temperature rocks yields ages older than 58Ma, suggesting that this ultrahigh-temperature metamorphism was related to the initial convergence and collision between India and Eurasia during the Himalayan orogeny.

  4. Relative and Interaction Effects of Errors in Physics Practical

    Directory of Open Access Journals (Sweden)

    Owolabi Olabode Thomas

    2013-07-01

    Full Text Available The importance of physics in human endeavour cannot be glossed over, for it plays a vital role in all human endeavour, especially in science and technology. The study was designed to examine the relative and interaction effects of errors in physics practicals in Nigerian secondary schools. A quasi-experimental three-group pre-test, post-test, control design was employed. The sample for the study consisted of sixty (60) students from three selected secondary schools in Nigeria. Equal numbers of male and female students were selected using a stratified random sampling technique. Physics Practical Questions (PPQ) were validated and used before and after treatment in the groups. The findings revealed that when students are exposed to the idea of errors in practical physics, the degree of accuracy increases, enhancing their performance in the subject. Physics and related courses are recommended for both male and female students in secondary schools, since sex is not a major factor in physics practical work. If students are taught how to obtain accurate results and errors are minimised in physics practicals, good performance in the subject will be enhanced.

  5. An evaluation and regional error modeling methodology for near-real-time satellite rainfall data over Australia

    Science.gov (United States)

    Pipunic, Robert C.; Ryu, Dongryeol; Costelloe, Justin F.; Su, Chun-Hsu

    2015-10-01

    In providing uniform spatial coverage, satellite-based rainfall estimates can potentially benefit hydrological modeling, particularly for flood prediction. Maximizing the value of information from such data requires knowledge of its error. The most recent Tropical Rainfall Measuring Mission (TRMM) 3B42RT (TRMM-RT) satellite product version 7 (v7) was used for examining evaluation procedures against in situ gauge data across mainland Australia at a daily time step, over a 9 year period. This provides insights into estimating uncertainty and informing quantitative error model development, with methodologies relevant to the recently operational Global Precipitation Measurement mission that builds upon the TRMM legacy. Important error characteristics highlighted for daily aggregated TRMM-RT v7 include increasing (negative) bias and error variance with increasing daily gauge totals, and more reliability at detecting larger gauge totals with a probability of detection of …. Data have increasing (positive) bias and error variance with increasing TRMM-RT estimates. Difference errors binned within 10 mm/d increments of TRMM-RT v7 estimates highlighted negatively skewed error distributions for all bins, suitably approximated by the generalized extreme value distribution. An error model based on this distribution enables bias correction and definition of quantitative uncertainty bounds, which are expected to be valuable for hydrological modeling and/or merging with other rainfall products. These error characteristics are also an important benchmark for assessing if/how future satellite rainfall products have improved.
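    The binned error model described in this abstract supports bias correction; a much-simplified stand-in (subtracting each bin's mean error rather than fitting a generalized extreme value distribution, with hypothetical daily totals) might look like:

```python
def binned_bias_correction(sat_values, errors, bin_width=10.0):
    """Build a lookup of mean error (satellite minus gauge) per
    satellite-estimate bin, then correct new satellite values by
    subtracting the bias of their bin. A simplified stand-in for a
    GEV-based satellite rainfall error model."""
    sums, counts = {}, {}
    for s, e in zip(sat_values, errors):
        b = int(s // bin_width)
        sums[b] = sums.get(b, 0.0) + e
        counts[b] = counts.get(b, 0) + 1
    bias = {b: sums[b] / counts[b] for b in sums}

    def correct(s):
        # Values falling in a bin with no training data are left unchanged
        return s - bias.get(int(s // bin_width), 0.0)
    return correct

# Hypothetical daily totals (mm/d): satellite estimates and their errors vs gauge
sat = [5.0, 8.0, 14.0, 16.0, 25.0, 28.0]
err = [1.0, 1.2, 3.0, 3.4, 6.0, 6.4]   # satellite minus gauge
correct = binned_bias_correction(sat, err)
print(correct(7.0))   # 7.0 minus the bin-0 mean bias of 1.1
```

    Fitting a full GEV per bin, as the paper does, additionally yields uncertainty bounds rather than only a central correction.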

  6. Reducing medication errors: Teaching strategies that increase nursing students' awareness of medication errors and their prevention.

    Science.gov (United States)

    Latimer, Sharon; Hewitt, Jayne; Stanbrough, Rebecca; McAndrew, Ron

    2017-02-14

    Medication errors are a patient safety and quality of care issue. There is evidence to suggest many undergraduate nursing curricula do not adequately educate students about the factors that contribute to medication errors and possible strategies to prevent them. We designed and developed a suite of teaching strategies that raise students' awareness of medication error producing situations and their prevention.

  7. How does prostate biopsy guidance error impact pathologic cancer risk assessment?

    Science.gov (United States)

    Martin, Peter R.; Gaed, Mena; Gómez, José A.; Moussa, Madeleine; Gibson, Eli; Cool, Derek W.; Chin, Joseph L.; Pautler, Stephen; Fenster, Aaron; Ward, Aaron D.

    2016-03-01

    Magnetic resonance imaging (MRI)-targeted, 3D transrectal ultrasound (TRUS)-guided "fusion" prostate biopsy aims to reduce the 21-47% false negative rate of clinical 2D TRUS-guided sextant biopsy, but still has a substantial false negative rate. This could be improved via biopsy needle target optimization, accounting for uncertainties due to guidance system errors, image registration errors, and irregular tumor shapes. As an initial step toward the broader goal of optimized prostate biopsy targeting, in this study we elucidated the impact of biopsy needle delivery error on the probability of obtaining a tumor sample, and on the core involvement. These are both important parameters for patient risk stratification and for the decision between active surveillance and definitive therapy. We addressed these questions for cancer of all grades, and separately for high grade (>= Gleason 4+3) cancer. We used expert-contoured gold-standard prostatectomy histology to simulate targeted biopsies using an isotropic Gaussian needle delivery error from 1 to 6 mm, and investigated the amount of cancer obtained in each biopsy core as determined by histology. Needle delivery error resulted in variability in core involvement that could influence treatment decisions; the presence or absence of cancer in 1/3 or more of each needle core can be attributed to a needle delivery error of 4 mm. However, our data showed that by making multiple biopsy attempts at selected tumor foci, we may increase the probability of correctly characterizing the extent and grade of the cancer.
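
    The effect of an isotropic Gaussian needle delivery error can be illustrated with a small Monte Carlo sketch. This is not the paper's histology-based simulation: the circular tumor focus, its 5 mm radius, and the function names are all hypothetical, chosen only to show how hit probability falls with delivery error and rises with repeated attempts.

```python
import numpy as np

def hit_probability(tumor_radius_mm, sigma_mm, n=200_000, rng=None):
    """Monte Carlo estimate of the probability that a needle aimed at the
    center of a circular tumor focus lands inside it, given an isotropic
    Gaussian needle-delivery error with standard deviation sigma_mm."""
    rng = rng or np.random.default_rng(42)
    offsets = rng.normal(0.0, sigma_mm, size=(n, 2))  # in-plane (x, y) error
    dist = np.hypot(offsets[:, 0], offsets[:, 1])
    return float(np.mean(dist <= tumor_radius_mm))

# For a hypothetical 5 mm radius focus, the sampling probability falls as
# the delivery error grows over the 1-6 mm range studied in the paper.
probs = [hit_probability(5.0, s) for s in (1.0, 2.0, 4.0, 6.0)]

# Multiple independent attempts at the same focus raise the chance of
# obtaining at least one tumor sample.
p_single = hit_probability(5.0, 4.0)
p_three = 1.0 - (1.0 - p_single) ** 3
```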

  8. Preanalytical errors in medical laboratories: a review of the available methodologies of data collection and analysis.

    Science.gov (United States)

    West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael

    2017-01-01

    Preanalytical errors have previously been shown to contribute a significant proportion of errors in laboratory processes and contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory Quality Management Systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is a wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effect analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, the Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group reviews the various data collection methods available. Our recommendation is the use of the laboratory information management system as the recording mechanism for preanalytical errors, as this provides the easiest and most standardized mechanism of data capture.

  9. Investigation of error sources in regional inverse estimates of greenhouse gas emissions in Canada

    Directory of Open Access Journals (Sweden)

    E. Chan

    2015-08-01

    Full Text Available Inversion models can use atmospheric concentration measurements to estimate surface fluxes. This study is an evaluation of the errors in a regional flux inversion model for different provinces of Canada: Alberta (AB), Saskatchewan (SK) and Ontario (ON). Using CarbonTracker model results as the target, the synthetic data experiment analyses examined the impacts of the errors from the Bayesian optimisation method, prior flux distribution and the atmospheric transport model, as well as their interactions. The scaling factors for different sub-regions were estimated by the Markov chain Monte Carlo (MCMC) simulation and cost function minimization (CFM) methods. The CFM method results are sensitive to the relative size of the assumed model-observation mismatch and prior flux error variances. Experiment results show that the estimation error increases with the number of sub-regions using the CFM method. For the region definitions that lead to realistic flux estimates, the numbers of sub-regions for the western region of AB/SK combined and the eastern region of ON are 11 and 4 respectively. The corresponding annual flux estimation errors for the western and eastern regions using the MCMC (CFM) method are -7 and -3 % (0 and 8 %) respectively, when there is only prior flux error. The estimation errors increase to 36 and 94 % (40 and 232 %) resulting from transport model error alone. When prior and transport model errors co-exist in the inversions, the estimation errors become 5 and 85 % (29 and 201 %). This result indicates that estimation errors are dominated by the transport model error, can in fact cancel each other, and propagate to the flux estimates non-linearly. In addition, it is possible for the posterior flux estimates to have larger differences from the target fluxes than the prior, and the posterior uncertainty estimates can be unrealistically small, failing to cover the target. The systematic evaluation of the different components of the

  10. Investigation of error sources in regional inverse estimates of greenhouse gas emissions in Canada

    Science.gov (United States)

    Chan, E.; Chan, D.; Ishizawa, M.; Vogel, F.; Brioude, J.; Delcloo, A.; Wu, Y.; Jin, B.

    2015-08-01

    Inversion models can use atmospheric concentration measurements to estimate surface fluxes. This study is an evaluation of the errors in a regional flux inversion model for different provinces of Canada: Alberta (AB), Saskatchewan (SK) and Ontario (ON). Using CarbonTracker model results as the target, the synthetic data experiment analyses examined the impacts of the errors from the Bayesian optimisation method, prior flux distribution and the atmospheric transport model, as well as their interactions. The scaling factors for different sub-regions were estimated by the Markov chain Monte Carlo (MCMC) simulation and cost function minimization (CFM) methods. The CFM method results are sensitive to the relative size of the assumed model-observation mismatch and prior flux error variances. Experiment results show that the estimation error increases with the number of sub-regions using the CFM method. For the region definitions that lead to realistic flux estimates, the numbers of sub-regions for the western region of AB/SK combined and the eastern region of ON are 11 and 4 respectively. The corresponding annual flux estimation errors for the western and eastern regions using the MCMC (CFM) method are -7 and -3 % (0 and 8 %) respectively, when there is only prior flux error. The estimation errors increase to 36 and 94 % (40 and 232 %) resulting from transport model error alone. When prior and transport model errors co-exist in the inversions, the estimation errors become 5 and 85 % (29 and 201 %). This result indicates that estimation errors are dominated by the transport model error, can in fact cancel each other, and propagate to the flux estimates non-linearly. In addition, it is possible for the posterior flux estimates to have larger differences from the target fluxes than the prior, and the posterior uncertainty estimates can be unrealistically small, failing to cover the target. The systematic evaluation of the different components of the inversion
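
    The cost-function minimization step can be sketched with a toy Bayesian inversion. Everything below is synthetic: the "transport" operator H, the true scaling factors, and the noise levels are invented for illustration, and the closed-form quadratic minimizer stands in for the paper's full CFM machinery. The sketch does, however, show the sensitivity to the assumed error-variance ratio noted in the abstract.

```python
import numpy as np

# Toy setup: observations y relate to per-region flux scaling factors s via a
# synthetic "transport" operator H; the prior scaling is 1 for every sub-region.
rng = np.random.default_rng(1)
n_obs, n_reg = 50, 4
H = rng.uniform(0.0, 1.0, size=(n_obs, n_reg))
s_true = np.array([0.8, 1.2, 1.0, 0.9])
y = H @ s_true + rng.normal(0.0, 0.05, n_obs)  # model-observation mismatch noise

def cfm_estimate(H, y, s_prior, sigma_obs, sigma_prior):
    """Minimize J(s) = ||y - Hs||^2/sigma_obs^2 + ||s - s_prior||^2/sigma_prior^2.
    This quadratic cost has the closed-form (ridge-like) minimizer below."""
    A = H.T @ H / sigma_obs**2 + np.eye(len(s_prior)) / sigma_prior**2
    b = H.T @ y / sigma_obs**2 + s_prior / sigma_prior**2
    return np.linalg.solve(A, b)

s_prior = np.ones(n_reg)
loose = cfm_estimate(H, y, s_prior, sigma_obs=0.05, sigma_prior=1.0)
tight = cfm_estimate(H, y, s_prior, sigma_obs=0.05, sigma_prior=0.001)
# A loose prior lets the observations dominate; a very tight prior pins the
# estimate near s_prior, illustrating the sensitivity to assumed variances.
```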

  11. THE ROLES OF CLINICAL PHARMACY IN REDUCING MEDICATION ERRORS

    Directory of Open Access Journals (Sweden)

    Alsaraf Khulood Majid

    2012-09-01

    Full Text Available Potential activation of the clinical pharmacist role is of great importance in reducing medication errors, which are a well-known problem in hospitals. Medication errors can be prescribing errors, dispensing errors, or administering errors. In this study, medication errors were randomly collected by a clinical pharmacist and an inpatient pharmacist from different wards at a hospital in Dubai, UAE, from July to October 2011. The results showed that the highest percentage of medication errors was prescribing errors, followed by administering errors and then dispensing errors. Among prescribing errors, the highest percentage was stat errors, followed by pro re nata (PRN) errors, then incomplete or unclear prescriptions, and finally antibiotic errors. The study shows that the clinical pharmacist plays an important role in reducing medication errors arising on the pharmacist and nursing side; moreover, prescribing errors were reduced by up to 23% with the medication review system.

  12. Is a genome a codeword of an error-correcting code?

    Directory of Open Access Journals (Sweden)

    Luzinete C B Faria

    Full Text Available Since a genome is a discrete sequence, the elements of which belong to a set of four letters, the question as to whether or not there is an error-correcting code underlying DNA sequences is unavoidable. The most common approach to answering this question is to propose a methodology to verify the existence of such a code. However, none of the methodologies proposed so far, although quite clever, has achieved that goal. In a recent work, we showed that DNA sequences can be identified as codewords in a class of cyclic error-correcting codes known as Hamming codes. In this paper, we show that a complete intron-exon gene, and even a plasmid genome, can be identified as a Hamming code codeword as well. Although this does not constitute a definitive proof that there is an error-correcting code underlying DNA sequences, it is the first evidence in this direction.
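
    The codeword-membership test at the heart of the argument above can be sketched for the binary Hamming(7,4) code: a word is a codeword exactly when its syndrome (the product with the parity-check matrix, mod 2) is zero. This is only an illustration of the test itself; mapping nucleotides to bits, and the longer cyclic Hamming codes the paper actually uses, are beyond this sketch.

```python
import numpy as np

# Parity-check matrix of the binary Hamming(7,4) code: its columns are the
# binary representations of 1..7, so a zero syndrome means "codeword".
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def is_codeword(bits):
    """A length-7 binary word w is a Hamming(7,4) codeword iff H @ w = 0 (mod 2)."""
    return not np.any(H @ np.asarray(bits) % 2)

assert is_codeword([0, 0, 0, 0, 0, 0, 0])      # the zero word is always a codeword
assert is_codeword([1, 1, 1, 1, 1, 1, 1])      # the all-ones word is in Hamming(7,4)
assert not is_codeword([1, 0, 0, 0, 0, 0, 0])  # a single bit flip leaves the code
```

    Because Hamming codes correct single-bit errors, a non-zero syndrome also identifies which position to flip, which is what gives an underlying code its error-correcting interpretation.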

  13. Apache The Definitive Guide

    CERN Document Server

    Laurie, Ben

    2003-01-01

    Apache is far and away the most widely used web server platform in the world. This versatile server runs more than half of the world's existing web sites. Apache is both free and rock-solid, running more than 21 million web sites ranging from huge e-commerce operations to corporate intranets and smaller hobby sites. With this new third edition of Apache: The Definitive Guide, web administrators new to Apache will come up to speed quickly, and experienced administrators will find the logically organized, concise reference sections indispensable, and system programmers interested in customizing

  14. Jenkins The Definitive Guide

    CERN Document Server

    Smart, John

    2011-01-01

    Streamline software development with Jenkins, the popular Java-based open source tool that has revolutionized the way teams think about Continuous Integration (CI). This complete guide shows you how to automate your build, integration, release, and deployment processes with Jenkins-and demonstrates how CI can save you time, money, and many headaches. Ideal for developers, software architects, and project managers, Jenkins: The Definitive Guide is both a CI tutorial and a comprehensive Jenkins reference. Through its wealth of best practices and real-world tips, you'll discover how easy it is

  15. Cassandra the definitive guide

    CERN Document Server

    Hewitt, Eben

    2011-01-01

    What could you do with data if scalability wasn't a problem? With this hands-on guide, you'll learn how Apache Cassandra handles hundreds of terabytes of data while remaining highly available across multiple data centers -- capabilities that have attracted Facebook, Twitter, and other data-intensive companies. Cassandra: The Definitive Guide provides the technical details and practical examples you need to assess this database management system and put it to work in a production environment. Author Eben Hewitt demonstrates the advantages of Cassandra's nonrelational design, and pays special

  16. Hadoop The Definitive Guide

    CERN Document Server

    White, Tom

    2009-01-01

    Hadoop: The Definitive Guide helps you harness the power of your data. Ideal for processing large datasets, the Apache Hadoop framework is an open source implementation of the MapReduce algorithm on which Google built its empire. This comprehensive resource demonstrates how to use Hadoop to build reliable, scalable, distributed systems: programmers will find details for analyzing large datasets, and administrators will learn how to set up and run Hadoop clusters. Complete with case studies that illustrate how Hadoop solves specific problems, this book helps you: Use the Hadoop Distributed

  17. Wind power error estimation in resource assessments.

    Science.gov (United States)

    Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment based on 28 wind turbine power curves, which were fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.
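
    Propagating a wind-speed measurement error through a power curve can be sketched with Monte Carlo sampling. The cubic-then-rated curve and all its parameters below are hypothetical stand-ins for the paper's 28 Lagrange-fitted manufacturer curves. Note that at a single speed in the cubic region the relative power error is roughly three times the speed error; the paper's aggregate 5% figure reflects averaging over measured speed distributions and the flat rated region.

```python
import numpy as np

def power_curve(v, rated_power=2000.0, cut_in=3.0, rated_speed=12.0, cut_out=25.0):
    """Hypothetical turbine power curve (kW): cubic between cut-in and rated
    speed, flat at rated power up to cut-out, zero elsewhere."""
    v = np.asarray(v, dtype=float)
    p = np.zeros_like(v)
    ramp = (v >= cut_in) & (v < rated_speed)
    p[ramp] = rated_power * (v[ramp]**3 - cut_in**3) / (rated_speed**3 - cut_in**3)
    p[(v >= rated_speed) & (v < cut_out)] = rated_power
    return p

def propagate_speed_error(v_measured, rel_error, n=100_000, rng=None):
    """Monte Carlo propagation of a relative wind-speed measurement error
    through the power curve; returns mean power and its relative spread."""
    rng = rng or np.random.default_rng(7)
    v = v_measured * (1.0 + rng.normal(0.0, rel_error, n))
    p = power_curve(v)
    return p.mean(), p.std() / p.mean()

mean_p, rel_p = propagate_speed_error(8.0, 0.10)  # 10% speed error at 8 m/s
```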

  18. Sources of error in emergency ultrasonography.

    Science.gov (United States)

    Pinto, Antonio; Pinto, Fabio; Faggian, Angela; Rubini, Giuseppe; Caranci, Ferdinando; Macarini, Luca; Genovese, Eugenio Annibale; Brunese, Luca

    2013-07-15

    To evaluate the common sources of diagnostic errors in emergency ultrasonography. The authors performed a Medline search using PubMed (National Library of Medicine, Bethesda, Maryland) for original research and review publications examining the common sources of errors in diagnosis with specific reference to emergency ultrasonography. The search design utilized different associations of the following terms: (1) emergency ultrasonography, (2) error, (3) malpractice and (4) medical negligence. This review was restricted to human studies and to English-language literature. Four authors reviewed all the titles, and subsequently the abstracts, of the 171 articles that appeared appropriate. Other articles were identified by reviewing the reference lists of significant papers. Finally, the full text of 48 selected articles was reviewed. Several studies indicate that the etiology of error in emergency ultrasonography is multi-factorial. Common sources of error in emergency ultrasonography are: lack of attention to the clinical history and examination, lack of communication with the patient, lack of knowledge of the technical equipment, use of inappropriate probes, inadequate optimization of the images, failure of perception, lack of knowledge of the possible differential diagnoses, over-estimation of one's own skill, and failure to suggest further ultrasound examinations or other imaging techniques. To reduce errors in the interpretation of ultrasonographic findings, the sonographer needs to be aware of the limitations of ultrasonography in the emergency setting, and of the similarities in the appearances of various physiological and pathological processes. Adequate clinical information is essential. Diagnostic errors should be considered not as signs of failure, but as learning opportunities.

  19. Target Uncertainty Mediates Sensorimotor Error Correction.

    Science.gov (United States)

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects' scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one's response. By suggesting that subjects' decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated.

  20. Wind Power Error Estimation in Resource Assessments

    Science.gov (United States)

    Rodríguez, Osvaldo; del Río, Jesús A.; Jaramillo, Oscar A.; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment based on 28 wind turbine power curves, which were fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies. PMID:26000444

  1. Gear Transmission Error Measurement System Made Operational

    Science.gov (United States)

    Oswald, Fred B.

    2002-01-01

    A system directly measuring the transmission error between meshing spur or helical gears was installed at the NASA Glenn Research Center and made operational in August 2001. This system employs light beams directed by lenses and prisms through gratings mounted on the two gear shafts. The amount of light that passes through both gratings is directly proportional to the transmission error of the gears. The device is capable of resolution better than 0.1 µm (one thousandth the thickness of a human hair). The measured transmission error can be displayed in a "map" that shows how the transmission error varies with the gear rotation, or it can be converted to spectra to show the components at the meshing frequencies. Accurate transmission error data will help researchers better understand the mechanisms that cause gear noise and vibration. The Design Unit at the University of Newcastle in England designed the new system specifically for NASA. It is the only device in the United States that can measure dynamic transmission error at high rotational speeds. The new system will be used to develop new techniques to reduce dynamic transmission error along with the resulting noise and vibration of aeronautical transmissions.

  2. Hierarchical error representation in medial prefrontal cortex.

    Science.gov (United States)

    Zarr, Noah; Brown, Joshua W

    2016-01-01

    The medial prefrontal cortex (mPFC) is reliably activated by both performance and prediction errors. Error signals have typically been treated as a scalar, and it is unknown to what extent multiple error signals may co-exist within mPFC. Previous studies have shown that lateral frontal cortex (LFC) is arranged in a hierarchy of abstraction, such that more abstract concepts and rules are represented in more anterior cortical regions. Given the close interaction between lateral and medial prefrontal cortex, we explored the hypothesis that mPFC would be organized along a similar rostro-caudal gradient of abstraction, such that more abstract prediction errors are represented further anterior and more concrete errors further posterior. We show that multiple prediction error signals can be found in mPFC, and furthermore, these are arranged in a rostro-caudal gradient of abstraction which parallels that found in LFC. We used a task that requires a three-level hierarchy of rules to be followed, in which the rules changed without warning at each level of the hierarchy. Task feedback indicated which level of the rule hierarchy changed and led to corresponding prediction error signals in mPFC. Moreover, each identified region of mPFC was preferentially functionally connected to correspondingly anterior regions of LFC. These results suggest the presence of a parallel structure between lateral and medial prefrontal cortex, with the medial regions monitoring and evaluating performance based on rules maintained in the corresponding lateral regions.

  3. Target Uncertainty Mediates Sensorimotor Error Correction

    Science.gov (United States)

    Vijayakumar, Sethu; Wolpert, Daniel M.

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323

  4. Personality and error monitoring: an update

    Directory of Open Access Journals (Sweden)

    Sven eHoffmann

    2012-06-01

    Full Text Available People differ considerably with respect to their ability to initiate and maintain cognitive control. A core control function is the processing and evaluation of errors, from which we learn to prevent maladaptive behavior. People strongly differ in the degree of error processing and in how errors are interpreted and appraised. In the present study we investigated whether a correlate of error monitoring, the error negativity (Ne or ERN), is related to personality factors. The EEG was measured continuously during a task which provoked errors, and the Ne was tested with respect to its relation to personality traits. Our results indicate a substantial trait-like relation between error processing and personality factors: the Ne was more pronounced for subjects scoring low on the Openness, Impulsiveness and Emotionality scales. Conversely, the Ne was less pronounced for subjects scoring low on the Social Orientation scale. The results imply that personality traits related to emotional valence and rigidity are reflected in the way people monitor and adapt to erroneous actions.

  5. Prediction with measurement errors in finite populations.

    Science.gov (United States)

    Singer, Julio M; Stanek, Edward J; Lencina, Viviana B; González, Luz Mery; Li, Wenjun; Martino, Silvina San

    2012-02-01

    We address the problem of selecting the best linear unbiased predictor (BLUP) of the latent value (e.g., serum glucose fasting level) of sample subjects with heteroskedastic measurement errors. Using a simple example, we compare the usual mixed model BLUP to a similar predictor based on a mixed model framed in a finite population (FPMM) setup with two sources of variability, the first of which corresponds to simple random sampling and the second, to heteroskedastic measurement errors. Under this last approach, we show that when measurement errors are subject-specific, the BLUP shrinkage constants are based on a pooled measurement error variance as opposed to the individual ones generally considered for the usual mixed model BLUP. In contrast, when the heteroskedastic measurement errors are measurement condition-specific, the FPMM BLUP involves different shrinkage constants. We also show that in this setup, when measurement errors are subject-specific, the usual mixed model predictor is biased but has a smaller mean squared error than the FPMM BLUP, which points to some difficulties in the interpretation of such predictors.
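
    The contrast between pooled and subject-specific shrinkage constants can be sketched numerically. This toy uses a simple normal-normal model, not the paper's finite-population framework, and all variances and sample sizes are invented; under this simple model the subject-specific predictor is the posterior mean and attains the smaller mean squared error, in line with the abstract's remark about the usual mixed model predictor.

```python
import numpy as np

rng = np.random.default_rng(5)
n_subj = 5000
latent = rng.normal(100.0, 10.0, n_subj)      # true latent values
sigma_i = rng.uniform(2.0, 12.0, n_subj)      # subject-specific error SDs
observed = latent + rng.normal(0.0, sigma_i)  # heteroskedastic measurements

mu = observed.mean()
tau2 = 100.0  # between-subject variance, assumed known for the sketch

def blup(y, mu, tau2, sigma2):
    """Shrink each observation toward the mean; the shrinkage constant is
    tau2 / (tau2 + sigma2), with sigma2 either pooled or subject-specific."""
    k = tau2 / (tau2 + sigma2)
    return mu + k * (y - mu)

pred_pooled = blup(observed, mu, tau2, np.mean(sigma_i**2))  # one pooled variance
pred_indiv = blup(observed, mu, tau2, sigma_i**2)            # per-subject variances

mse_pooled = np.mean((pred_pooled - latent)**2)
mse_indiv = np.mean((pred_indiv - latent)**2)
```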

  6. Wind power error estimation in resource assessments.

    Directory of Open Access Journals (Sweden)

    Osvaldo Rodríguez

    Full Text Available Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, probability density function, and wind turbine power curves. This method uses the actual wind speed data without prior statistical treatment based on 28 wind turbine power curves, which were fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of power production or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.

  7. Error model identification of inertial navigation platform based on errors-in-variables model

    Institute of Scientific and Technical Information of China (English)

    Liu Ming; Liu Yu; Su Baoku

    2009-01-01

    Because the real input acceleration cannot be obtained during the error model identification of inertial navigation platform, both the input and output data contain noises. In this case, the conventional regression model and the least squares (LS) method will result in bias. Based on the models of inertial navigation platform error and observation error, the errors-in-variables (EV) model and the total least squares (TLS) method are proposed to identify the error model of the inertial navigation platform. The estimation precision is improved and the result is better than the conventional regression model based LS method. The simulation results illustrate the effectiveness of the proposed method.
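
    The LS-versus-TLS contrast described above can be sketched for a single-coefficient errors-in-variables model. The coefficient, noise levels, and sample size below are hypothetical, and the SVD-based TLS solver is a generic textbook construction rather than the paper's platform-specific identification; it illustrates why LS is biased (attenuated) when the input is noisy and how TLS removes that bias when the input and output noise variances are equal.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
a_true = 0.5                                   # true model coefficient
x = rng.normal(0.0, 1.0, n)                    # true (unobserved) input
x_obs = x + rng.normal(0.0, 0.3, n)            # input measured with noise
y_obs = a_true * x + rng.normal(0.0, 0.3, n)   # output measured with noise

# Ordinary LS regresses y on the noisy input and is attenuated toward zero.
a_ls = (x_obs @ y_obs) / (x_obs @ x_obs)

# TLS: the smallest right singular vector of [x_obs, y_obs] is proportional
# to (a, -1) for the line a*x - y = 0 that minimizes orthogonal distance,
# accounting for noise in both variables.
_, _, Vt = np.linalg.svd(np.column_stack([x_obs, y_obs]), full_matrices=False)
a_tls = -Vt[-1, 0] / Vt[-1, 1]
```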

  8. Error estimation in plant growth analysis

    Directory of Open Access Journals (Sweden)

    Andrzej Gregorczyk

    2014-01-01

    Full Text Available A scheme is presented for calculating the errors of dry matter values which occur during approximation of data with growth curves, determined by the analytical method (logistic function) and by the numerical method (Richards function). Formulae are also shown which describe the absolute errors of the growth characteristics: Growth rate (GR), Relative growth rate (RGR), Unit leaf rate (ULR) and Leaf area ratio (LAR). Calculation examples concerning the growth course of oat and maize plants are given. A critical analysis of the estimation of the obtained results has been carried out, and the usefulness of jointly applying statistical methods and error calculus in plant growth analysis has been established.
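
    The abstract's formulae are not reproduced in this listing, but the flavour of the error calculus can be sketched with generic first-order propagation for the relative growth rate. The dry-matter values and their errors below are hypothetical.

```python
import math

def rgr(w1, w2, t1, t2):
    """Relative growth rate between two harvests: RGR = (ln w2 - ln w1) / (t2 - t1)."""
    return (math.log(w2) - math.log(w1)) / (t2 - t1)

def rgr_abs_error(w1, w2, t1, t2, dw1, dw2):
    """First-order absolute error of RGR from dry-matter errors dw1, dw2,
    via the partial derivatives: |dRGR| <= (dw1/w1 + dw2/w2) / (t2 - t1)."""
    return (dw1 / w1 + dw2 / w2) / (t2 - t1)

# Hypothetical example: dry matter grows from 2.0 g to 6.0 g over 10 days,
# with an absolute error of 0.1 g on each measurement.
r = rgr(2.0, 6.0, 0.0, 10.0)                       # in g g^-1 day^-1
dr = rgr_abs_error(2.0, 6.0, 0.0, 10.0, 0.1, 0.1)
```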

  9. Experimental research on English vowel errors analysis

    Directory of Open Access Journals (Sweden)

    Huang Qiuhua

    2016-01-01

    Full Text Available We analyzed the relevant acoustic parameters of speech samples and compared them with standard English pronunciation using the methods of experimental phonetics, with phonetic analysis software and statistical analysis software. We then summarized the pronunciation errors of college students through an analysis of their English vowel pronunciation, finding that college students readily make tongue-position and lip-shape errors when pronouncing vowels. Based on this analysis of pronunciation errors, we devised targeted voice training for college students' English pronunciation, which ultimately increased the students' interest in learning and improved the teaching of English phonetics.

  10. Error estimation and adaptive chemical transport modeling

    Directory of Open Access Journals (Sweden)

    Malte Braack

    2014-09-01

    Full Text Available We present a numerical method to use several chemical transport models of increasing accuracy and complexity in an adaptive way. In large parts of the domain, a simplified chemical model may be used, whereas in certain regions a more complex model is needed for accuracy reasons. A mathematically derived error estimator measures the modeling error and indicates where to use more accurate models. The error is measured in terms of output functionals; therefore, one has to consider adjoint problems which carry sensitivity information. The concept is demonstrated by means of ozone formation and pollution emission.

  11. Comparison of Prediction-Error-Modelling Criteria

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    Single and multi-step prediction-error-methods based on the maximum likelihood and least squares criteria are compared. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model, which ... is a realization of a continuous-discrete multivariate stochastic transfer function model. The proposed prediction-error methods are demonstrated for a SISO system parameterized by the transfer functions with time delays of a continuous-discrete-time linear stochastic system. The simulations for this case suggest ... computational resources. The identification method is suitable for predictive control.

  12. MPC-Relevant Prediction-Error Identification

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    A prediction-error method tailored for model based predictive control is presented. The prediction-error method studied is based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model. The linear discrete-time stochastic state space ... model is realized from a continuous-discrete-time linear stochastic system specified using transfer functions with time-delays. It is argued that the prediction-error criterion should be selected such that it is compatible with the objective function of the predictive controller in which the model ...

  13. When a Surgical Colleague Makes an Error.

    Science.gov (United States)

    Antiel, Ryan M; Blinman, Thane A; Rentea, Rebecca M; Gonzalez, Katherine W; Knott, E Marty; Juang, David; Oyetunji, Tolulope; Holcomb, G W; Angelos, Peter; Lantos, John D

    2016-03-01

    Professionalism requires that doctors acknowledge their errors and figure out how to avoid making similar ones in the future. Over the last few decades, doctors have gotten better at acknowledging mistakes and apologizing to patients when a mistake happens. Such disclosure is especially complicated when one becomes aware of an error made by a colleague. We present a case in which consultant surgeons became aware that a colleague seemed to have made a serious error. Experts in surgery and bioethics comment on appropriate responses to this situation.

  14. Systematic error mitigation in multiple field astrometry

    CERN Document Server

    Gai, Mario

    2011-01-01

    Combination of more than two fields provides constraints on the systematic error of simultaneous observations. The concept is investigated in the context of the Gravitation Astrometric Measurement Experiment (GAME), which aims at measurement of the PPN parameter $\\gamma$ at the $10^{-7}-10^{-8}$ level. Robust self-calibration and control of systematic error is crucial to the achievement of the precision goal. The present work is focused on the concept investigation and practical implementation strategy of systematic error control over four simultaneously observed fields, implementing a "double differential" measurement technique. Some basic requirements on geometry, observing and calibration strategy are derived, discussing the fundamental characteristics of the proposed concept.

  15. Superdense Coding Interleaved with Forward Error Correction

    Directory of Open Access Journals (Sweden)

    Sadlier Ronald J.

    2016-01-01

    Full Text Available Superdense coding promises increased classical capacity and communication security, but this advantage may be undermined by noise in the quantum channel. We present a numerical study of how forward error correction (FEC) applied to the encoded classical message can be used to mitigate against quantum channel noise. By studying the bit error rate under different FEC codes, we identify the unique role that burst errors play in superdense coding, and we show how these can be mitigated by interleaving the FEC codewords prior to transmission. We conclude that classical FEC with interleaving is a useful method to improve the performance in near-term demonstrations of superdense coding.
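The burst-mitigation idea can be shown with a deliberately tiny toy code (a 3x repetition code and a block interleaver of my own choosing, far simpler than the FEC codes the study evaluates):

```python
# Toy demonstration of why interleaving helps against burst errors:
# a 3x repetition code corrects any single flipped bit per codeword,
# but a burst of three consecutive flips wipes out one whole codeword
# unless the coded bits are first spread out by a block interleaver.

def encode(bits):
    return [b for b in bits for _ in range(3)]          # repeat each bit 3x

def decode(coded):
    return [int(sum(coded[i:i + 3]) >= 2)               # majority vote
            for i in range(0, len(coded), 3)]

def interleave(seq, rows):
    cols = len(seq) // rows
    # write row by row into a rows x cols block, read it out column by column
    return [seq[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(seq, rows):
    cols = len(seq) // rows
    out = [0] * len(seq)
    i = 0
    for c in range(cols):
        for r in range(rows):
            out[r * cols + c] = seq[i]
            i += 1
    return out

def burst_flip(seq, start, length):
    out = list(seq)
    for i in range(start, start + length):
        out[i] ^= 1                                     # a contiguous burst
    return out
```

Without interleaving, a 3-bit burst destroys one codeword; after interleaving, the same burst lands in three different codewords, each of which the majority vote can still correct.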

  16. Measurement Error with Different Computer Vision Techniques

    Science.gov (United States)

    Icasio-Hernández, O.; Curiel-Razo, Y. I.; Almaraz-Cabral, C. C.; Rojas-Ramirez, S. R.; González-Barbosa, J. J.

    2017-09-01

    The goal of this work is to offer a comparison of measurement error for different computer vision techniques for 3D reconstruction and to allow a metrological discrimination based on our evaluation results. The present work implements four 3D reconstruction techniques: passive stereoscopy, active stereoscopy, shape from contour, and fringe profilometry, to find the measurement error and its uncertainty using different gauges. We measured several known dimensional and geometric standards and compared the techniques' average errors, standard deviations, and uncertainties, obtaining a guide for identifying the tolerances that each technique can achieve and choosing the best one.

  17. MEASUREMENT ERROR WITH DIFFERENT COMPUTER VISION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    O. Icasio-Hernández

    2017-09-01

    Full Text Available The goal of this work is to offer a comparison of measurement error for different computer vision techniques for 3D reconstruction and to allow a metrological discrimination based on our evaluation results. The present work implements four 3D reconstruction techniques: passive stereoscopy, active stereoscopy, shape from contour, and fringe profilometry, to find the measurement error and its uncertainty using different gauges. We measured several known dimensional and geometric standards and compared the techniques' average errors, standard deviations, and uncertainties, obtaining a guide for identifying the tolerances that each technique can achieve and choosing the best one.

  18. 42 CFR 440.2 - Specific definitions; definitions of services for FFP purposes.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Specific definitions; definitions of services for... Definitions § 440.2 Specific definitions; definitions of services for FFP purposes. (a) Specific definitions...) Definitions of services for FFP purposes. Except as limited in part 441, FFP is available in...

  19. Can the Misinterpretation Amendment Rate Be Used as a Measure of Interpretive Error in Anatomic Pathology?: Implications of a Survey of the Directors of Anatomic and Surgical Pathology.

    Science.gov (United States)

    Parkash, Vinita; Fadare, Oluwole; Dewar, Rajan; Nakhleh, Raouf; Cooper, Kumarasen

    2017-03-01

    A repeat survey of the Association of Directors of Anatomic and Surgical Pathology, conducted 10 years after the original, was used to assess trends and variability in classifying scenarios as errors, and the preferred post-signout report modification for correcting errors, among the membership of the Association. The results were analyzed to determine whether interpretive amendment rates might act as surrogate measures of interpretive error in pathology. An analysis of the responses indicated that primary-level misinterpretations (benign to malignant and vice versa) were universally qualified as errors; secondary-level misinterpretations or misclassifications were inconsistently labeled as errors. There was added variability in the preferred post-signout report modification used to correct report alterations. The classification of a scenario as an error appeared to correlate with the severity of potential harm from the missed call, the perceived subjectivity of the diagnosis, and the ambiguity of reporting terminology. Substantial differences in policies for error detection and optimal reporting format were documented between departments. In conclusion, the inconsistency in labeling scenarios as errors, disagreement about the optimal post-signout report modification for correcting them, and variability in error detection policies preclude the use of the misinterpretation amendment rate as a surrogate measure of error in anatomic pathology. There has been little change in the uniformity of definitions, attitudes, and perceptions of interpretive error in anatomic pathology over the last 10 years.

  20. Understanding and Confronting Our Mistakes: The Epidemiology of Error in Radiology and Strategies for Error Reduction.

    Science.gov (United States)

    Bruno, Michael A; Walker, Eric A; Abujudeh, Hani H

    2015-10-01

    Arriving at a medical diagnosis is a highly complex process that is extremely error prone. Missed or delayed diagnoses often lead to patient harm and missed opportunities for treatment. Since medical imaging is a major contributor to the overall diagnostic process, it is also a major potential source of diagnostic error. Although some diagnoses may be missed because of the technical or physical limitations of the imaging modality, including image resolution, intrinsic or extrinsic contrast, and signal-to-noise ratio, most missed radiologic diagnoses are attributable to image interpretation errors by radiologists. Radiologic interpretation cannot be mechanized or automated; it is a human enterprise based on complex psychophysiologic and cognitive processes and is itself subject to a wide variety of error types, including perceptual errors (those in which an important abnormality is simply not seen on the images) and cognitive errors (those in which the abnormality is visually detected but the meaning or importance of the finding is not correctly understood or appreciated). The overall prevalence of radiologists' errors in practice does not appear to have changed since it was first estimated in the 1960s. The authors review the epidemiology of errors in diagnostic radiology, including a recently proposed taxonomy of radiologists' errors, as well as research findings, in an attempt to elucidate possible underlying causes of these errors. The authors also propose strategies for error reduction in radiology. On the basis of current understanding, specific suggestions are offered as to how radiologists can improve their performance in practice.

  1. Analysis of error-prone survival data under additive hazards models: measurement error effects and adjustments.

    Science.gov (United States)

    Yan, Ying; Yi, Grace Y

    2016-07-01

    Covariate measurement error occurs commonly in survival analysis. Under the proportional hazards model, measurement error effects have been well studied, and various inference methods have been developed to correct for error effects under such a model. In contrast, error-contaminated survival data under the additive hazards model have received relatively less attention. In this paper, we investigate this problem by exploring measurement error effects on parameter estimation and the change of the hazard function. New insights of measurement error effects are revealed, as opposed to well-documented results for the Cox proportional hazards model. We propose a class of bias correction estimators that embraces certain existing estimators as special cases. In addition, we exploit the regression calibration method to reduce measurement error effects. Theoretical results for the developed methods are established, and numerical assessments are conducted to illustrate the finite sample performance of our methods.

  2. Error effects in anterior cingulate cortex reverse when error likelihood is high

    Science.gov (United States)

    Jessup, Ryan K.; Busemeyer, Jerome R.; Brown, Joshua W.

    2010-01-01

    Strong error-related activity in medial prefrontal cortex (mPFC) has been shown repeatedly with neuroimaging and event-related potential studies for the last several decades. Multiple theories have been proposed to account for error effects, including comparator models and conflict detection models, but the neural mechanisms that generate error signals remain in dispute. Typical studies use relatively low error rates, confounding the expectedness and the desirability of an error. Here we show with a gambling task and fMRI that when losses are more frequent than wins, the mPFC error effect disappears, and moreover, exhibits the opposite pattern by responding more strongly to unexpected wins than losses. These findings provide perspective on recent ERP studies and suggest that mPFC error effects result from a comparison between actual and expected outcomes. PMID:20203206

  3. Error recovery to enable error-free message transfer between nodes of a computer network

    Energy Technology Data Exchange (ETDEWEB)

    Blumrich, Matthias A.; Coteus, Paul W.; Chen, Dong; Gara, Alan; Giampapa, Mark E.; Heidelberger, Philip; Hoenicke, Dirk; Takken, Todd; Steinmacher-Burow, Burkhard; Vranas, Pavlos M.

    2016-01-26

    An error-recovery method to enable error-free message transfer between nodes of a computer network. A first node of the network sends a packet to a second node of the network over a link between the nodes, and the first node keeps a copy of the packet on a sending end of the link until the first node receives acknowledgment from the second node that the packet was received without error. The second node tests the packet to determine if the packet is error free. If the packet is not error free, the second node sets a flag to mark the packet as corrupt. The second node returns acknowledgement to the first node specifying whether the packet was received with or without error. When the packet is received with error, the link is returned to a known state and the packet is sent again to the second node.
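A minimal sketch of the keep-a-copy-until-acknowledged loop described above (hypothetical helper names; CRC-32 stands in for whatever integrity test the actual hardware uses):

```python
import zlib

# Sketch of the keep-a-copy-until-acknowledged protocol. The link model
# and all helper names here are illustrative, not the patented design.

def make_packet(payload: bytes) -> bytes:
    return payload + zlib.crc32(payload).to_bytes(4, "big")  # CRC trailer

def check_packet(packet: bytes):
    payload, trailer = packet[:-4], packet[-4:]
    return payload if zlib.crc32(payload).to_bytes(4, "big") == trailer else None

def send_reliably(payload: bytes, link):
    copy = make_packet(payload)          # sender keeps a copy of the packet
    attempts = 0
    while True:
        attempts += 1
        received = check_packet(link(copy))
        if received is not None:         # positive ack: copy may be dropped
            return received, attempts
        # negative ack: packet flagged corrupt, link reset, send again

# Illustrative fault model: the link corrupts the first two transmissions.
_faults = iter([True, True])
def flaky_link(pkt: bytes) -> bytes:
    if next(_faults, False):
        return bytes([pkt[0] ^ 0xFF]) + pkt[1:]   # flip bits in the first byte
    return pkt
```

The sender only discards its copy once the receiver's check succeeds, so two corrupted transmissions simply cost two retries.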

  4. Error Detection and Error Classification: Failure Awareness in Data Transfer Scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Louisiana State University; Balman, Mehmet; Kosar, Tevfik

    2010-10-27

    Data transfer in distributed environments is prone to frequent failures resulting from back-end system-level problems, such as connectivity failures, which are technically untraceable by users. Error messages are not logged efficiently and are sometimes not relevant or useful from the user's point of view. Our study explores the possibility of an efficient error detection and reporting system for such environments. Prior knowledge about the environment and awareness of the actual reason behind a failure would enable higher-level planners to make better and more accurate decisions. It is necessary to have well-defined error detection and error reporting methods to increase the usability and serviceability of existing data transfer protocols and data management systems. We investigate the applicability of early error detection and error classification techniques and propose an error reporting framework and a failure-aware data transfer life cycle to improve the arrangement of data transfer operations and to enhance the decision making of data transfer schedulers.

  5. Phenotype definition in epilepsy.

    Science.gov (United States)

    Winawer, Melodie R

    2006-05-01

    Phenotype definition consists of the use of epidemiologic, biological, molecular, or computational methods to systematically select features of a disorder that might result from distinct genetic influences. By carefully defining the target phenotype, or dividing the sample by phenotypic characteristics, we can hope to narrow the range of genes that influence risk for the trait in the study population, thereby increasing the likelihood of finding them. In this article, fundamental issues that arise in phenotyping in epilepsy and other disorders are reviewed, and factors complicating genotype-phenotype correlation are discussed. Methods of data collection, analysis, and interpretation are addressed, focusing on epidemiologic studies. With this foundation in place, the epilepsy subtypes and clinical features that appear to have a genetic basis are described, and the epidemiologic studies that have provided evidence for the heritability of these phenotypic characteristics, supporting their use in future genetic investigations, are reviewed. Finally, several molecular approaches to phenotype definition are discussed, in which the molecular defect, rather than the clinical phenotype, is used as a starting point.

  6. Definitions of oppression.

    Science.gov (United States)

    LeBlanc, R G

    1997-12-01

    How we begin to serve others as healthcare professionals and how it is we define underserved and oppressed peoples is important in understanding issues in the organization and allocation of health care. This exploration, based on feminist post-structuralist theory, explores how nurses formulate definitions of 'underserved' and 'vulnerable'. This study goes beyond prescribed definitions to locate meanings used to identify oppression. I present, through an analysis of a literature search, the context and construction of the terms 'vulnerable population' and 'medically underserved'. Professional nursing journals are the source of relevant text. The implications of working with particular understandings of the terms 'vulnerable' and 'underserved' is based on my assumption that healthcare providers need to critically focus on the personal and political meanings they attribute to the labels used to define the clients and communities with whom they work. The presentation of this material supports that the way we define 'vulnerable' and 'underserved' is related to how our work fighting racism, ageism, AIDS, poverty, homophobia, addiction, domestic violence, sexism and colonization is realized.

  7. PSD Definition of Modification

    Science.gov (United States)

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  8. Error analysis of compensation cutting technique for wavefront error of KH2PO4 crystal.

    Science.gov (United States)

    Tie, Guipeng; Dai, Yifan; Guan, Chaoliang; Zhu, Dengchao; Song, Bing

    2013-09-20

    Because the wavefront error of KH(2)PO(4) (KDP) crystal is difficult to control in the face fly-cutting process owing to surface shape deformation during vacuum suction, an error compensation technique based on a spiral turning method is put forward. An in situ measurement device is applied to measure the deformed surface shape after vacuum suction, and the initial surface figure error, which is obtained off-line, is added to the in situ surface shape to obtain the final surface figure to be compensated. A three-axis servo technique is then utilized to cut the final surface shape. In addition to the error sources common to traditional cutting processes, such as error in the straightness of guide ways, spindle rotation error, and error caused by ambient environment variance, three other errors, the in situ measurement error, the position deviation error, and the servo-following error, are the main sources affecting compensation accuracy. This paper discusses the effect of these three errors on compensation accuracy and provides strategies to improve the final surface quality. Experimental verification was carried out on one piece of KDP crystal with the size of Φ270 mm×11 mm. After one compensation process, the peak-to-valley value of the transmitted wavefront error dropped from 1.9λ (λ=632.8 nm) to approximately 1/3λ, and the mid-spatial-frequency error did not become worse when the frequency of the cutting tool trajectory was controlled by use of a low-pass filter.

  9. Size definitions for particle sampling

    Energy Technology Data Exchange (ETDEWEB)

    1981-05-01

    The recommendations of an ad hoc working group appointed by Committee TC 146 of the International Standards Organization on size definitions for particle sampling are reported. The task of the group was to collect the various definitions of 'respirable dust' and to propose a practical definition and recommendations for handling standardization on this matter. One of the two proposed cut-sizes with regard to division at the larynx will be adopted after a ballot.

  10. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

    Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid...... is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance...... analysis using point process methods. We then analyze three different models for the error process, calculate exact expressions for the variances, and derive asymptotic variances. Errors in the placement of sample points can lead to substantial inflation of the variance, dampening of zitterbewegung...
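A quick Monte Carlo experiment (a one-dimensional toy of my own construction, not the authors' point-process analysis) makes the variance inflation caused by errors in sample positions visible:

```python
import numpy as np

# One-dimensional toy: estimate the integral of f over [0, 1] from n
# systematic sample points, with and without Gaussian errors in the
# sample locations. All parameters are arbitrary illustrative choices.
rng = np.random.default_rng(1)
f = lambda x: x ** 2                 # true integral over [0, 1] is 1/3
n, reps, sigma = 10, 2000, 0.05

est_exact, est_jitter = [], []
for _ in range(reps):
    u = rng.uniform(0.0, 1.0 / n)                     # random start
    grid = u + np.arange(n) / n                       # exactly periodic grid
    est_exact.append(f(grid).mean())
    noisy = (grid + rng.normal(0.0, sigma, n)) % 1.0  # perturbed locations
    est_jitter.append(f(noisy).mean())

var_exact = np.var(est_exact)
var_jitter = np.var(est_jitter)    # placement errors inflate the variance
```

Both estimators remain close to unbiased, but the perturbed grid loses the variance advantage that makes systematic sampling attractive in the first place.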

  11. Backtracking and error correction in DNA transcription

    Science.gov (United States)

    Voliotis, Margaritis; Cohen, Netta; Molina-Paris, Carmen; Liverpool, Tanniemola

    2008-03-01

    Genetic information is encoded in the nucleotide sequence of the DNA. This sequence contains the instruction code of the cell, determining protein structure and function, and hence cell function and fate. The viability and endurance of organisms crucially depend on the fidelity with which genetic information is transcribed/translated (during mRNA and protein production) and replicated (during DNA replication). However, thermodynamics introduces significant fluctuations which would incur massive error rates if efficient proofreading mechanisms were not in place. Here, we examine a putative mechanism for error correction during DNA transcription, which relies on backtracking of the RNA polymerase (RNAP). We develop an error correction model that incorporates RNAP translocation, backtracking pauses and mRNA cleavage. We calculate the error rate as a function of the relevant rates (translocation, cleavage, backtracking and polymerization) and show that its theoretical limit is equivalent to that accomplished by a multiple-step kinetic proofreading mechanism.
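The scaling behind multi-step kinetic proofreading, which the abstract cites as the theoretical limit, can be sketched with a back-of-the-envelope formula (a standard textbook approximation of my own choosing, not the authors' backtracking model; `delta_g` is the discrimination free energy in units of kT):

```python
import math

def error_rate(delta_g: float, proofreading_steps: int) -> float:
    """Misincorporation probability when right/wrong substrates are
    discriminated by a Boltzmann factor f = exp(-delta_g) (delta_g in kT).
    Each of k independent proofreading steps multiplies the discrimination
    by another factor of f, so the error floor scales like f**(k + 1)."""
    f = math.exp(-delta_g) ** (proofreading_steps + 1)
    return f / (1.0 + f)
```

One proofreading step roughly squares the single-selection error rate, which is why even a single backtracking-and-cleavage cycle can buy orders of magnitude in fidelity.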

  12. Quadratic dynamical decoupling with nonuniform error suppression

    Energy Technology Data Exchange (ETDEWEB)

    Quiroz, Gregory; Lidar, Daniel A. [Department of Physics and Center for Quantum Information Science and Technology, University of Southern California, Los Angeles, California 90089 (United States); Departments of Electrical Engineering, Chemistry, and Physics, and Center for Quantum Information Science and Technology, University of Southern California, Los Angeles, California 90089 (United States)

    2011-10-15

    We analyze numerically the performance of the near-optimal quadratic dynamical decoupling (QDD) single-qubit decoherence errors suppression method [J. West et al., Phys. Rev. Lett. 104, 130501 (2010)]. The QDD sequence is formed by nesting two optimal Uhrig dynamical decoupling sequences for two orthogonal axes, comprising N{sub 1} and N{sub 2} pulses, respectively. Varying these numbers, we study the decoherence suppression properties of QDD directly by isolating the errors associated with each system basis operator present in the system-bath interaction Hamiltonian. Each individual error scales with the lowest order of the Dyson series, therefore immediately yielding the order of decoherence suppression. We show that the error suppression properties of QDD are dependent upon the parities of N{sub 1} and N{sub 2}, and near-optimal performance is achieved for general single-qubit interactions when N{sub 1}=N{sub 2}.
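The nesting construction is concrete enough to sketch. UDD pulse times over an interval of length T follow the published formula t_j = T sin^2(j pi / (2N + 2)), and QDD places an inner sequence inside every interval of the outer one (a timing sketch only; the pulse-axis labels follow the two-orthogonal-axes description above):

```python
import math

def udd_times(n, start=0.0, end=1.0):
    """Uhrig DD pulse times over [start, end]: t_j = T sin^2(j pi / (2n + 2))."""
    T = end - start
    return [start + T * math.sin(j * math.pi / (2 * n + 2)) ** 2
            for j in range(1, n + 1)]

def qdd_schedule(n1, n2, total=1.0):
    """QDD: an outer n1-pulse UDD sequence (X axis) whose every interval
    contains a nested inner n2-pulse UDD sequence (Z axis)."""
    outer = udd_times(n1, 0.0, total)
    edges = [0.0] + outer + [total]
    pulses = [(t, "X") for t in outer]
    for a, b in zip(edges[:-1], edges[1:]):
        pulses += [(t, "Z") for t in udd_times(n2, a, b)]
    return sorted(pulses)
```

For N1 = N2 = 3 this yields 3 outer X pulses and 12 nested Z pulses, i.e. N1 + (N1 + 1) N2 pulses in total.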

  13. Understanding Human Error Based on Automated Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a report on a continuing study of automated analyses of experiential textual reports to gain insight into the causal factors of human errors in aviation...

  14. Diet History Questionnaire II: Missing & Error Codes

    Science.gov (United States)

    A missing code indicates that the respondent skipped a question when a response was required. An error character indicates that the respondent marked two or more responses to a question where only one answer was appropriate.

  15. Long Burst Error Correcting Codes Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Long burst error mitigation is an enabling technology for the use of Ka band for high rate commercial and government users. Multiple NASA, government, and commercial...

  16. Simulating Bosonic Baths with Error Bars

    Science.gov (United States)

    Woods, M. P.; Cramer, M.; Plenio, M. B.

    2015-09-01

    We derive rigorous truncation-error bounds for the spin-boson model and its generalizations to arbitrary quantum systems interacting with bosonic baths. For the numerical simulation of such baths, the truncation of both the number of modes and the local Hilbert-space dimensions is necessary. We derive superexponential Lieb-Robinson-type bounds on the error when restricting the bath to finitely many modes and show how the error introduced by truncating the local Hilbert spaces may be efficiently monitored numerically. In this way we give error bounds for approximating the infinite system by a finite-dimensional one. As a consequence, numerical simulations such as the time-evolving density with orthogonal polynomials algorithm (TEDOPA) now allow for the fully certified treatment of the system-environment interaction.

  17. Motivation for error-tolerant communication

    Science.gov (United States)

    Halbach, Till

    2002-01-01

    The transmission of large data streams over error-prone channels, as e.g. in multimedia applications, is inherently linked to long transmission delays if automatic repeat request schemes are used. As this article will show, the delay can reasonably be traded against residual bit errors if a short transmission time has highest priority. The dependency of the delay on two important factors, packet length and channel bit error rate, is determined to be non-linear and strictly monotonically increasing. Furthermore, the transmission behavior and properties of a plain binary symmetric channel and of one with an additional repeat request technique are simulated and compared to previous research. The simulations finally lead to a redefinition of the formula for the estimation of the residual bit error rate of a non-transparent channel.
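The non-linear delay growth is easy to see for an idealized ARQ scheme over a binary symmetric channel (a textbook approximation, not the article's simulation setup):

```python
def expected_transmissions(packet_len: int, ber: float) -> float:
    """Expected number of sends of one packet over a binary symmetric
    channel with ideal stop-and-wait ARQ: retransmit until the whole
    packet arrives error-free."""
    p_packet_ok = (1.0 - ber) ** packet_len
    return 1.0 / p_packet_ok
```

Because the packet success probability decays exponentially in the packet length, doubling the length more than doubles the expected number of transmissions, and hence the delay, which is the non-linearity the abstract refers to.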

  18. Application of Interval Analysis to Error Control.

    Science.gov (United States)

    1976-09-01

    We give simple examples of ways in which interval arithmetic can be used to alert users to instabilities in computer algorithms, roundoff error accumulation, and even the effects of hardware inadequacies. This paper is primarily tutorial. (Author)
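A minimal interval class shows the idea (plain floats without the outward directed rounding a real interval library would apply, so this is a didactic sketch only, not tied to the report's examples):

```python
class Interval:
    """Didactic interval arithmetic: each operation produces bounds that
    enclose every possible real result for operands within the inputs.
    Plain floats are used, without the outward (directed) rounding a real
    interval library would apply, so this is a sketch of the idea only."""

    def __init__(self, lo, hi=None):
        self.lo = lo
        self.hi = lo if hi is None else hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        products = [a * b for a in (self.lo, self.hi)
                    for b in (other.lo, other.hi)]
        return Interval(min(products), max(products))

    def width(self):
        return self.hi - self.lo
```

Subtracting two nearly equal intervals yields a result whose width dwarfs its magnitude, which is exactly the kind of cancellation alert interval arithmetic provides.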

  19. Assessing Measurement Error in Medicare Coverage

    Data.gov (United States)

    U.S. Department of Health & Human Services — Assessing Measurement Error in Medicare Coverage From the National Health Interview Survey Using linked administrative data, to validate Medicare coverage estimates...

  20. Error Sonification of a Complex Motor Task

    Directory of Open Access Journals (Sweden)

    Riener Robert

    2011-12-01

    Full Text Available Visual information is mainly used to master complex motor tasks. Thus, additional information providing augmented feedback should be displayed in other modalities than vision, e.g. hearing. The present work evaluated the potential of error sonification to enhance learning of a rowing-type motor task. In contrast to a control group receiving self-controlled terminal feedback, the experimental group could not significantly reduce spatial errors. Thus, motor learning was not enhanced by error sonification, although during the training the participant could benefit from it. It seems that the motor task was too slow, resulting in immediate corrections of the movement rather than in an internal representation of the general characteristics of the motor task. Therefore, further studies should elaborate the impact of error sonification when general characteristics of the motor tasks are already known.

  1. Increasing sensing resolution with error correction.

    Science.gov (United States)

    Arrad, G; Vinkler, Y; Aharonov, D; Retzker, A

    2014-04-18

    The signal-to-noise ratio of quantum sensing protocols scales with the square root of the coherence time. Thus, increasing this time is a key goal in the field. By utilizing quantum error correction, we present a novel way of prolonging such coherence times beyond the fundamental limits of current techniques. We develop an implementable sensing protocol that incorporates error correction, and discuss the characteristics of these protocols in different noise and measurement scenarios. We examine the use of entangled versus unentangled states, and whether error correction can reach the Heisenberg limit. The effects of error correction on coherence times are calculated, and we show that measurement precision can be enhanced for both one-directional and general noise.

  2. Measurement error in longitudinal film badge data

    CERN Document Server

    Marsh, J L

    2002-01-01

    Initial logistic regressions turned up some surprising contradictory results which led to a re-sampling of Sellafield mortality controls without the date of employment matching factor. It is suggested that over matching is the cause of the contradictory results. Comparisons of the two measurements of radiation exposure suggest a strongly linear relationship with non-Normal errors. A method has been developed using the technique of Regression Calibration to deal with these in a case-control study context, and applied to this Sellafield study. The classical measurement error model is that of a simple linear regression with unobservable variables. Information about the covariates is available only through error-prone measurements, usually with an additive structure. Ignoring errors has been shown to result in biased regression coefficients, reduced power of hypothesis tests and increased variability of parameter estimates. Radiation is known to be a causal factor for certain types of leukaemia. This link is main...

  3. Rank Modulation for Translocation Error Correction

    CERN Document Server

    Farnoud, Farzad; Milenkovic, Olgica

    2012-01-01

We consider rank modulation codes for flash memories that allow for handling arbitrary charge drop errors. Unlike classical rank modulation codes used for correcting errors that manifest themselves as swaps of two adjacently ranked elements, the proposed translocation rank codes account for more general forms of errors that arise in storage systems. Translocations represent a natural extension of the notion of adjacent transpositions and as such may be analyzed using related concepts in combinatorics and rank modulation coding. Our results include tight bounds on the capacity of translocation rank codes, construction techniques for asymptotically good codes, as well as simple decoding methods for one class of structured codes. As part of our exposition, we also highlight the close connections between the new code family and permutations with short common subsequences, deletion and insertion error-correcting codes for permutations and permutation arrays.
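The translocation errors discussed here are measured by the Ulam distance between permutations, which — consistent with the common-subsequence connection the abstract highlights — equals n minus the length of the longest common subsequence. A minimal sketch (the function name and the reduction of LCS on permutations to a longest-increasing-subsequence computation are standard constructions, not taken from the paper):

```python
from bisect import bisect_left

def ulam_distance(p, q):
    """Translocation (Ulam) distance between permutations p and q:
    n minus the length of their longest common subsequence. For
    permutations, the LCS reduces to a longest increasing subsequence
    after relabeling q in p's coordinates."""
    pos = {v: i for i, v in enumerate(p)}
    seq = [pos[v] for v in q]          # q expressed in p's coordinate order
    tails = []                         # patience-sorting LIS in O(n log n)
    for value in seq:
        i = bisect_left(tails, value)
        if i == len(tails):
            tails.append(value)
        else:
            tails[i] = value
    return len(p) - len(tails)
```

For example, moving a single element to a new position — one translocation — costs distance 1: `ulam_distance([1, 2, 3, 4], [2, 3, 4, 1])` returns 1, since the common subsequence 2, 3, 4 survives.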

  4. Assessing Measurement Error in Medicare Coverage

    Data.gov (United States)

    U.S. Department of Health & Human Services — Assessing Measurement Error in Medicare Coverage From the National Health Interview Survey Using linked administrative data, to validate Medicare coverage estimates...

  5. Morphological Errors Made By Jordanian University Students

    Directory of Open Access Journals (Sweden)

    Ramadan Saleh

    2015-11-01

Full Text Available This study tries to identify, classify, describe and find out the causes of the morphological errors made by fourth-year university students majoring in English in Jordan. The students who participated in the study were 20 students from Al–Zaytoonah Private University of Jordan. The procedure followed was essay writing. After analyzing the errors, the study shows that (a) the students’ competence in English morphology is poor and (b) the errors are caused by some factors such as the inconsistency in English as well as misapplication of rules. Interference and overgeneralization are also other causes. Since the course of morphology is elective in the university plan, this is also considered an important cause. In order to reduce their errors, the researcher has suggested some remedies.

  6. Error estimation and adaptivity for incompressible hyperelasticity

    KAUST Repository

    Whiteley, J.P.

    2014-04-30

    SUMMARY: A Galerkin FEM is developed for nonlinear, incompressible (hyper) elasticity that takes account of nonlinearities in both the strain tensor and the relationship between the strain tensor and the stress tensor. By using suitably defined linearised dual problems with appropriate boundary conditions, a posteriori error estimates are then derived for both linear functionals of the solution and linear functionals of the stress on a boundary, where Dirichlet boundary conditions are applied. A second, higher order method for calculating a linear functional of the stress on a Dirichlet boundary is also presented together with an a posteriori error estimator for this approach. An implementation for a 2D model problem with known solution, where the entries of the strain tensor exhibit large, rapid variations, demonstrates the accuracy and sharpness of the error estimators. Finally, using a selection of model problems, the a posteriori error estimate is shown to provide a basis for effective mesh adaptivity. © 2014 John Wiley & Sons, Ltd.

  7. Soft error mechanisms, modeling and mitigation

    CERN Document Server

    Sayil, Selahattin

    2016-01-01

    This book introduces readers to various radiation soft-error mechanisms such as soft delays, radiation induced clock jitter and pulses, and single event (SE) coupling induced effects. In addition to discussing various radiation hardening techniques for combinational logic, the author also describes new mitigation strategies targeting commercial designs. Coverage includes novel soft error mitigation techniques such as the Dynamic Threshold Technique and Soft Error Filtering based on Transmission gate with varied gate and body bias. The discussion also includes modeling of SE crosstalk noise, delay and speed-up effects. Various mitigation strategies to eliminate SE coupling effects are also introduced. Coverage also includes the reliability of low power energy-efficient designs and the impact of leakage power consumption optimizations on soft error robustness. The author presents an analysis of various power optimization techniques, enabling readers to make design choices that reduce static power consumption an...

  8. Formal Definition of Artificial Intelligence

    OpenAIRE

    Dobrev, Dimiter

    2005-01-01

    * This publication is partially supported by the KT-DigiCult-Bg project. A definition of Artificial Intelligence (AI) was proposed in [1] but this definition was not absolutely formal at least because the word "Human" was used. In this paper we will formalize the definition from [1]. The biggest problem in this definition was that the level of intelligence of AI is compared to the intelligence of a human being. In order to change this we will introduce some parameters to which AI ...

  9. Open quantum systems and error correction

    Science.gov (United States)

    Shabani Barzegar, Alireza

Quantum effects can be harnessed to manipulate information in a desired way. Quantum systems designed for this purpose suffer from harmful interactions with their surrounding environment and from inaccuracies in control forces. Engineering methods to combat errors in quantum devices is therefore in high demand. In this thesis, I focus on realistic formulations of quantum error correction methods. A realistic formulation is one that incorporates experimental challenges. This thesis is presented in two sections, on open quantum systems and on quantum error correction. Chapters 2 and 3 cover the material on open quantum system theory. It is essential to first study a noise process and then to contemplate methods to cancel its effect. In the second chapter, I present the non-completely positive formulation of quantum maps. Most of these results are published in [Shabani and Lidar, 2009b,a], except a subsection on the geometric characterization of the positivity domain of a quantum map. The real-time formulation of the dynamics is the topic of the third chapter. After introducing the concept of the Markovian regime, a new post-Markovian quantum master equation is derived, published in [Shabani and Lidar, 2005a]. The quantum error correction section is presented in chapters 4 through 7. In chapter 4, we introduce a generalized theory of decoherence-free subspaces and subsystems (DFSs), which do not require accurate initialization (published in [Shabani and Lidar, 2005b]). In chapter 5, we present a semidefinite program optimization approach to quantum error correction that yields codes and recovery procedures that are robust against significant variations in the noise channel. Our approach allows us to optimize the encoding, recovery, or both, and is amenable to approximations that significantly improve computational cost while retaining fidelity (see [Kosut et al., 2008] for a published version). Chapter 6 is devoted to a theory of quantum error correction (QEC

  10. Demonstrating the robustness of population surveillance data: implications of error rates on demographic and mortality estimates

    Directory of Open Access Journals (Sweden)

    Berhane Yemane

    2008-03-01

Full Text Available Abstract Background As in any measurement process, a certain amount of error may be expected in routine population surveillance operations such as those in demographic surveillance sites (DSSs). Vital events are likely to be missed and errors made no matter what method of data capture is used or what quality control procedures are in place. The extent to which random errors in large, longitudinal datasets affect overall health and demographic profiles has important implications for the role of DSSs as platforms for public health research and clinical trials. Such knowledge is also of particular importance if the outputs of DSSs are to be extrapolated and aggregated with realistic margins of error and validity. Methods This study uses the first 10-year dataset from the Butajira Rural Health Project (BRHP) DSS, Ethiopia, covering approximately 336,000 person-years of data. Simple programmes were written to introduce random errors and omissions into new versions of the definitive 10-year Butajira dataset. Key parameters of sex, age, death, literacy and roof material (an indicator of poverty) were selected for the introduction of errors based on their obvious importance in demographic and health surveillance and their established significant associations with mortality. Defining the original 10-year dataset as the 'gold standard' for the purposes of this investigation, population, age and sex compositions and Poisson regression models of mortality rate ratios were compared between each of the intentionally erroneous datasets and the original 'gold standard' 10-year data. Results The composition of the Butajira population was well represented despite introducing random errors, and differences between population pyramids based on the derived datasets were subtle. Regression analyses of well-established mortality risk factors were largely unaffected even by relatively high levels of random errors in the data. Conclusion The low sensitivity of parameter
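The study's error-injection procedure can be sketched for a single binary field. The field, error rates, and seed below are illustrative assumptions rather than the BRHP programmes themselves; the point is that random, symmetric errors barely shift an aggregate composition:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 336_000                       # roughly the person-years in the study
sex = rng.integers(0, 2, n)       # a 0/1-coded field in the "gold standard" data

def corrupt(field, error_rate, rng):
    """Flip a random fraction of a binary field, mimicking random
    recording errors introduced into a derived dataset."""
    flip = rng.random(field.size) < error_rate
    return np.where(flip, 1 - field, field)

# Even a 10% random error rate barely shifts the aggregate proportion,
# because symmetric flips around a roughly 50/50 split largely cancel.
drift = {rate: abs(corrupt(sex, rate, rng).mean() - sex.mean())
         for rate in (0.01, 0.05, 0.10)}
```

Comparing regression models fit on the corrupted versus original datasets, as the study does, follows the same pattern: corrupt each key field, refit, and compare coefficients against the gold standard.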

  11. Error bounds from extra precise iterative refinement

    Energy Technology Data Exchange (ETDEWEB)

    Demmel, James; Hida, Yozo; Kahan, William; Li, Xiaoye S.; Mukherjee, Soni; Riedy, E. Jason

    2005-02-07

We present the design and testing of an algorithm for iterative refinement of the solution of linear equations, where the residual is computed with extra precision. This algorithm was originally proposed in the 1960s [6, 22] as a means to compute very accurate solutions to all but the most ill-conditioned linear systems of equations. However, two obstacles have until now prevented its adoption in standard subroutine libraries like LAPACK: (1) There was no standard way to access the higher precision arithmetic needed to compute residuals, and (2) it was unclear how to compute a reliable error bound for the computed solution. The completion of the new BLAS Technical Forum Standard [5] has recently removed the first obstacle. To overcome the second obstacle, we show how a single application of iterative refinement can be used to compute an error bound in any norm at small cost, and use this to compute both an error bound in the usual infinity norm, and a componentwise relative error bound. We report extensive test results on over 6.2 million matrices of dimension 5, 10, 100, and 1000. As long as a normwise (resp. componentwise) condition number computed by the algorithm is less than 1/(max{10, √n}·ε_w), the computed normwise (resp. componentwise) error bound is at most 2·max{10, √n}·ε_w, and indeed bounds the true error. Here, n is the matrix dimension and ε_w is the single precision roundoff error. For worse conditioned problems, we get similarly small correct error bounds in over 89.4% of cases.
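A minimal sketch of the core idea: factor and solve in a working precision (float32 here), but accumulate the residual r = b − Ax in a higher precision (float64). This illustrates the technique under those assumptions, not the LAPACK-level implementation the paper describes:

```python
import numpy as np

def solve_refined(A, b, iters=3):
    """Solve Ax = b in float32, then apply iterative refinement with
    the residual computed in float64 ("extra precision")."""
    A32, b32 = A.astype(np.float32), b.astype(np.float32)
    x = np.linalg.solve(A32, b32).astype(np.float64)
    for _ in range(iters):
        r = b - A @ x                               # extra-precision residual
        d = np.linalg.solve(A32, r.astype(np.float32))
        x = x + d.astype(np.float64)                # correct the solution
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(100, 100))
x_true = rng.normal(size=100)
b = A @ x_true

err_naive = np.linalg.norm(
    np.linalg.solve(A.astype(np.float32), b.astype(np.float32)) - x_true)
err_refined = np.linalg.norm(solve_refined(A, b) - x_true)
```

For a reasonably conditioned system, each refinement step shrinks the error by roughly a factor of cond(A)·ε_w, so a few iterations recover far more accuracy than the plain float32 solve; the norm of the final correction also serves as a cheap error estimate, which is the basis of the paper's error bounds.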

  12. Typeview: A Tool for Understanding Type Errors

    OpenAIRE

    Simon, Axel; Chitil, Olaf; Huch, Frank

    2000-01-01

    In modern statically typed functional languages, type inference is used to determine the type of each function automatically. Whenever this fails, the compiler emits an error message that is often very complex. Sometimes the expression mentioned in the type error message is not the one that is wrong. We therefore implement an interactive tool that allows programmers to browse through the source code of their program and query the types of each expression. If a variable cannot be typed, we wou...

  13. Errors and mistakes in breast ultrasound diagnostics.

    Science.gov (United States)

    Jakubowski, Wiesław; Dobruch-Sobczak, Katarzyna; Migda, Bartosz

    2012-09-01

    Sonomammography is often the first additional examination performed in the diagnostics of breast diseases. The development of ultrasound imaging techniques, particularly the introduction of high frequency transducers, matrix transducers, harmonic imaging and finally, elastography, influenced the improvement of breast disease diagnostics. Nevertheless, as in each imaging method, there are errors and mistakes resulting from the technical limitations of the method, breast anatomy (fibrous remodeling), insufficient sensitivity and, in particular, specificity. Errors in breast ultrasound diagnostics can be divided into impossible to be avoided and potentially possible to be reduced. In this article the most frequently made errors in ultrasound have been presented, including the ones caused by the presence of artifacts resulting from volumetric averaging in the near and far field, artifacts in cysts or in dilated lactiferous ducts (reverberations, comet tail artifacts, lateral beam artifacts), improper setting of general enhancement or time gain curve or range. Errors dependent on the examiner, resulting in the wrong BIRADS-usg classification, are divided into negative and positive errors. The sources of these errors have been listed. The methods of minimization of the number of errors made have been discussed, including the ones related to the appropriate examination technique, taking into account data from case history and the use of the greatest possible number of additional options such as: harmonic imaging, color and power Doppler and elastography. In the article examples of errors resulting from the technical conditions of the method have been presented, and those dependent on the examiner which are related to the great diversity and variation of ultrasound images of pathological breast lesions.

  14. Errors and mistakes in breast ultrasound diagnostics

    Science.gov (United States)

    Jakubowski, Wiesław; Migda, Bartosz

    2012-01-01

    Sonomammography is often the first additional examination performed in the diagnostics of breast diseases. The development of ultrasound imaging techniques, particularly the introduction of high frequency transducers, matrix transducers, harmonic imaging and finally, elastography, influenced the improvement of breast disease diagnostics. Nevertheless, as in each imaging method, there are errors and mistakes resulting from the technical limitations of the method, breast anatomy (fibrous remodeling), insufficient sensitivity and, in particular, specificity. Errors in breast ultrasound diagnostics can be divided into impossible to be avoided and potentially possible to be reduced. In this article the most frequently made errors in ultrasound have been presented, including the ones caused by the presence of artifacts resulting from volumetric averaging in the near and far field, artifacts in cysts or in dilated lactiferous ducts (reverberations, comet tail artifacts, lateral beam artifacts), improper setting of general enhancement or time gain curve or range. Errors dependent on the examiner, resulting in the wrong BIRADS-usg classification, are divided into negative and positive errors. The sources of these errors have been listed. The methods of minimization of the number of errors made have been discussed, including the ones related to the appropriate examination technique, taking into account data from case history and the use of the greatest possible number of additional options such as: harmonic imaging, color and power Doppler and elastography. In the article examples of errors resulting from the technical conditions of the method have been presented, and those dependent on the examiner which are related to the great diversity and variation of ultrasound images of pathological breast lesions. PMID:26675358

  15. Errors and mistakes in breast ultrasound diagnostics

    Directory of Open Access Journals (Sweden)

    Wiesław Jakubowski

    2012-09-01

Full Text Available Sonomammography is often the first additional examination performed in the diagnostics of breast diseases. The development of ultrasound imaging techniques, particularly the introduction of high frequency transducers, matrix transducers, harmonic imaging and finally, elastography, influenced the improvement of breast disease diagnostics. Nevertheless, as in each imaging method, there are errors and mistakes resulting from the technical limitations of the method, breast anatomy (fibrous remodeling), insufficient sensitivity and, in particular, specificity. Errors in breast ultrasound diagnostics can be divided into impossible to be avoided and potentially possible to be reduced. In this article the most frequently made errors in ultrasound have been presented, including the ones caused by the presence of artifacts resulting from volumetric averaging in the near and far field, artifacts in cysts or in dilated lactiferous ducts (reverberations, comet tail artifacts, lateral beam artifacts), improper setting of general enhancement or time gain curve or range. Errors dependent on the examiner, resulting in the wrong BIRADS-usg classification, are divided into negative and positive errors. The sources of these errors have been listed. The methods of minimization of the number of errors made have been discussed, including the ones related to the appropriate examination technique, taking into account data from case history and the use of the greatest possible number of additional options such as: harmonic imaging, color and power Doppler and elastography. In the article examples of errors resulting from the technical conditions of the method have been presented, and those dependent on the examiner which are related to the great diversity and variation of ultrasound images of pathological breast lesions.

  16. Case study: error rates and paperwork design.

    Science.gov (United States)

    Drury, C G

    1998-01-01

    A job instruction document, or workcard, for civil aircraft maintenance produced a number of paperwork errors when used operationally. The design of the workcard was compared to the guidelines of Patel et al [1994, Applied Ergonomics, 25 (5), 286-293]. All of the errors occurred in work instructions which did not meet these guidelines, demonstrating that the design of documentation does affect operational performance.

  17. Engaging with learners’ errors when teaching mathematics

    OpenAIRE

    Ingrid Sapire; Yael Shalem; Bronwen Wilson-Thompson; Ronél Paulsen

    2016-01-01

    Teachers come across errors not only in tests but also in their mathematics classrooms virtually every day. When they respond to learners’ errors in their classrooms, during or after teaching, teachers are actively carrying out formative assessment. In South Africa the Annual National Assessment, a written test under the auspices of the Department of Basic Education, requires that teachers use learner data diagnostically. This places a new and complex cognitive demand on teachers’ pedagogical...

  18. Interlanguage Signs and Lexical Transfer Errors

    CERN Document Server

    Ro, A

    1994-01-01

    A theory of interlanguage (IL) lexicons is outlined, with emphasis on IL lexical entries, based on the HPSG notion of lexical sign. This theory accounts for idiosyncratic or lexical transfer of syntactic subcategorisation and idioms from the first language to the IL. It also accounts for developmental stages in IL lexical grammar, and grammatical variation in the use of the same lexical item. The theory offers a tool for robust parsing of lexical transfer errors and diagnosis of such errors.

  19. Counting OCR errors in typeset text

    Science.gov (United States)

    Sandberg, Jonathan S.

    1995-03-01

Frequently object recognition accuracy is a key component in the performance analysis of pattern matching systems. In the past three years, the results of numerous excellent and rigorous studies of OCR system typeset-character accuracy (henceforth OCR accuracy) have been published, encouraging performance comparisons between a variety of OCR products and technologies. These published figures are important; OCR vendor advertisements in the popular trade magazines lead readers to believe that published OCR accuracy figures affect market share in the lucrative OCR market. Curiously, a detailed review of many of these OCR error occurrence counting results reveals that they are not reproducible as published, and they are not strictly comparable due to larger variances in the counts than would be expected from sampling variance. Naturally, since OCR accuracy is based on a ratio of the number of OCR errors over the size of the text searched for errors, imprecise OCR error accounting leads to similar imprecision in OCR accuracy. Some published papers use informal, non-automatic, or intuitively correct OCR error accounting. Still other published results present OCR error accounting methods based on string matching algorithms such as dynamic programming using Levenshtein (edit) distance but omit critical implementation details (such as the existence of suspect markers in the OCR generated output or the weights used in the dynamic programming minimization procedure). The problem with not specifically revealing the accounting method is that the numbers of errors found by different methods are significantly different. This paper identifies the basic accounting methods used to measure OCR errors in typeset text and offers an evaluation and comparison of the various accounting methods.
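The dynamic-programming accounting the paper examines can be made concrete with a unit-weight Levenshtein distance. This minimal sketch deliberately omits the suspect markers and tuned edit weights the paper notes real OCR scoring pipelines use, which is exactly the kind of implementation detail that makes published counts diverge:

```python
def ocr_error_count(reference, output):
    """Count OCR errors as the Levenshtein (edit) distance between the
    ground-truth text and the OCR output, with unit weights for
    insertions, deletions, and substitutions."""
    m, n = len(reference), len(output)
    prev = list(range(n + 1))            # distances for the empty prefix
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if reference[i - 1] == output[j - 1] else 1
            cur[j] = min(prev[j] + 1,        # deletion from reference
                         cur[j - 1] + 1,     # insertion into reference
                         prev[j - 1] + cost) # substitution (or match)
        prev = cur
    return prev[n]
```

With a count in hand, the accuracy ratio the paper discusses is simply `1 - ocr_error_count(ref, out) / len(ref)`; changing the weights or the handling of suspect characters changes the count, and therefore the published accuracy.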

  20. THE ROLES OF CLINICAL PHARMACY IN REDUCING MEDICATION ERRORS

    OpenAIRE

    Alsaraf Khulood Majid

    2012-01-01

Activating the clinical pharmacist's role is of great importance in reducing medication errors, which are a well-known problem in hospitals. Medication errors may be prescribing errors, dispensing errors, or administering errors. In this study, medication errors were randomly collected by a clinical pharmacist and an inpatient pharmacist from different wards at a hospital in Dubai, UAE, from July to October 2011. The results showed that the highest percentage of medication errors w...