WorldWideScience

Sample records for human error rates

  1. Payment Error Rate Measurement (PERM)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The PERM program measures improper payments in Medicaid and CHIP and produces error rates for each program. The error rates are based on reviews of the...

  2. Comprehensive Error Rate Testing (CERT)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services (CMS) implemented the Comprehensive Error Rate Testing (CERT) program to measure improper payments in the Medicare...

  3. The effect of retinal image error update rate on human vestibulo-ocular reflex gain adaptation.

    Science.gov (United States)

    Fadaee, Shannon B; Migliaccio, Americo A

    2016-04-01

    The primary function of the angular vestibulo-ocular reflex (VOR) is to stabilise images on the retina during head movements. Retinal image movement is the likely feedback signal that drives VOR modification/adaptation for different viewing contexts. However, it is not clear whether a retinal image position or velocity error is used primarily as the feedback signal. Recent studies examining this signal are limited because they used near viewing to modify the VOR. However, it is not known whether near viewing drives VOR adaptation or is a pre-programmed contextual cue that modifies the VOR. Our study is based on analysis of the VOR evoked by horizontal head impulses during an established adaptation task. Fourteen human subjects underwent incremental unilateral VOR adaptation training and were tested using the scleral search coil technique over three separate sessions. The update rate of the laser target position (source of the retinal image error signal) used to drive VOR adaptation was different for each session [50 (once every 20 ms), 20 and 15/35 Hz]. Our results show unilateral VOR adaptation occurred at 50 and 20 Hz for both the active (23.0 ± 9.6 and 11.9 ± 9.1% increase on adapting side, respectively) and passive VOR (13.5 ± 14.9, 10.4 ± 12.2%). At 15 Hz, unilateral adaptation no longer occurred in the subject group for both the active and passive VOR, whereas individually, 4/9 subjects tested at 15 Hz had significant adaptation. Our findings suggest that 1-2 retinal image position error signals every 100 ms (i.e. target position update rate 15-20 Hz) are sufficient to drive VOR adaptation.

  4. Errors in Human Performance

    Science.gov (United States)

    1980-08-15

    No abstract survives in this record; the indexed text consists of OCR fragments from the report's reference list, including Collins, A. M., & Loftus, E. F., A spreading activation theory of semantic processing; LaBerge, D., & Samuels, S. J., Toward a theory of automatic information processing; and Norman, D. A., Errors in human performance (Tech. Rep. 8004), University of California, San Diego, July 1980.

  5. Error-associated behaviors and error rates for robotic geology

    Science.gov (United States)

    Anderson, Robert C.; Thomas, Geb; Wagner, Jacob; Glasgow, Justin

    2004-01-01

    This study explores human error as a function of the decision-making process. One of many models for human decision-making is Rasmussen's decision ladder [9]. The decision ladder identifies the multiple tasks and states of knowledge involved in decision-making. The tasks and states of knowledge can be classified by the level of cognitive effort required to make the decision, leading to the skill, rule, and knowledge taxonomy (Rasmussen, 1987). Skill based decisions require the least cognitive effort and knowledge based decisions require the greatest cognitive effort. Errors can occur at any of the cognitive levels.

  7. Managing human error in aviation.

    Science.gov (United States)

    Helmreich, R L

    1997-05-01

    Crew resource management (CRM) programs were developed to address team and leadership aspects of piloting modern airplanes. The goal is to reduce errors through team work. Human factors research and social, cognitive, and organizational psychology are used to develop programs tailored for individual airlines. Flight crews study accident case histories, group dynamics, and human error. Simulators provide pilots with the opportunity to solve complex flight problems. CRM in the simulator is called line-oriented flight training (LOFT). In automated cockpits CRM promotes the idea of automation as a crew member. Cultural aspects of aviation include professional, business, and national culture. The aviation CRM model has been adapted for training surgeons and operating room staff in human factors.

  8. The Detection of Human Spreadsheet Errors by Humans versus Inspection (Auditing) Software

    CERN Document Server

    Aurigemma, Salvatore

    2010-01-01

    Previous spreadsheet inspection experiments have had human subjects look for seeded errors in spreadsheets. In this study, subjects attempted to find errors in human-developed spreadsheets to avoid the potential artifacts created by error seeding. Human subject success rates were compared to the success rates for error-flagging by spreadsheet static analysis tools (SSATs) applied to the same spreadsheets. The human error detection results were comparable to those of studies using error seeding. However, Excel Error Check and Spreadsheet Professional were almost useless for correctly flagging natural (human) errors in this study.

  9. Human Errors and Bridge Management Systems

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, A. S.

    Human errors are divided into two groups. The first group contains human errors that affect the reliability directly. The second group contains human errors that will not directly affect the reliability of the structure. The methodology used to estimate so-called reliability distributions on ba...

  10. Game Design Principles based on Human Error

    Directory of Open Access Journals (Sweden)

    Guilherme Zaffari

    2016-03-01

    This paper presents the results of the authors' research on incorporating Human Error, through design principles, into video game design. In general, designers must consider Human Error factors throughout video game interface development; for the core design, however, adaptations are needed, since challenge is an important factor for fun, and from the perspective of Human Error a challenge can be considered a flaw in the system. The research used Human Error classifications, data triangulation via predictive human error analysis, and expanded flow theory to design a set of principles that match the design of playful challenges with the principles of Human Error. From the results, it was possible to conclude that applying Human Error to game design has a positive effect on player experience, allowing the player to interact only with errors associated with the intended aesthetics of the game.

  11. Human error: A significant information security issue

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W.W.

    1994-12-31

    One of the major threats to information security, human error, is often ignored or dismissed with statements such as "There is not much we can do about it." This type of thinking runs counter to reality, because studies have shown that, of all systems threats, human error has the highest probability of occurring and that, with professional assistance, human errors can be prevented or significantly reduced. Security analysts often overlook human error as a major threat; however, other professionals such as human factors engineers are trained to deal with these probabilistic occurrences and mitigate them. In a recent study, 55% of the respondents surveyed considered human error as the most important security threat. Documentation exists to show that human error was a major cause of the consequences suffered at Three Mile Island, Chernobyl, Bhopal, and the Exxon tanker Valdez. Ironically, causes of human error can usually be quickly and easily eliminated.

  12. Information systems and human error in the lab.

    Science.gov (United States)

    Bissell, Michael G

    2004-01-01

    Health system costs in clinical laboratories are incurred daily due to human error. Indeed, a major impetus for automating clinical laboratories has always been the opportunity it presents to simultaneously reduce cost and improve quality of operations by decreasing human error. But merely automating these processes is not enough: to the extent that the introduction of these systems results in operators having less practice in dealing with unexpected events or becoming deskilled in problem-solving, new kinds of error will likely appear. Clinical laboratories could potentially benefit by integrating findings on human error from modern behavioral science into their operations. Fully understanding human error requires a deep understanding of human information processing and cognition. Predicting and preventing negative consequences requires application of this understanding to laboratory operations. Although the occurrence of a particular error at a particular instant cannot be absolutely prevented, human error rates can be reduced. The following principles are key: an understanding of the process of learning in relation to error; understanding the origin of errors, since this knowledge can be used to reduce their occurrence; optimal systems should be forgiving to the operator by absorbing errors, at least for a time; although much is known by industrial psychologists about how to write operating procedures and instructions in ways that reduce the probability of error, this expertise is hardly ever put to use in the laboratory; and a feedback mechanism must be designed into the system that enables the operator to recognize in real time that an error has occurred.

  13. Human Error Mechanisms in Complex Work Environments

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1988-01-01

    will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations...

  14. Correcting the optimal resampling-based error rate by estimating the error rate of wrapper algorithms.

    Science.gov (United States)

    Bernau, Christoph; Augustin, Thomas; Boulesteix, Anne-Laure

    2013-09-01

    High-dimensional binary classification tasks, for example, the classification of microarray samples into normal and cancer tissues, usually involve a tuning parameter. By reporting the performance of the best tuning parameter value only, over-optimistic prediction errors are obtained. For correcting this tuning bias, we develop a new method which is based on a decomposition of the unconditional error rate involving the tuning procedure, that is, we estimate the error rate of wrapper algorithms as introduced in the context of internal cross-validation (ICV) by Varma and Simon (2006, BMC Bioinformatics 7, 91). Our subsampling-based estimator can be written as a weighted mean of the errors obtained using the different tuning parameter values, and thus can be interpreted as a smooth version of ICV, which is the standard approach for avoiding tuning bias. In contrast to ICV, our method guarantees intuitive bounds for the corrected error. Additionally, we suggest to use bias correction methods also to address the conceptually similar method selection bias that results from the optimal choice of the classification method itself when evaluating several methods successively. We demonstrate the performance of our method on microarray and simulated data and compare it to ICV. This study suggests that our approach yields competitive estimates at a much lower computational price.
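    A minimal Python sketch of the weighted-mean idea described above, using scikit-learn. The weighting scheme (how often each tuning value wins on a split), the classifier, and the parameter grid are illustrative assumptions, not the authors' exact estimator:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import ShuffleSplit

        def tuning_bias_corrected_error(X, y, tuning_values, n_splits=50, seed=0):
            # For each subsampling split, record the test error of every tuning value
            # and note which value would have been picked as "best" on that split.
            splitter = ShuffleSplit(n_splits=n_splits, test_size=0.25, random_state=seed)
            errors = np.zeros((n_splits, len(tuning_values)))
            wins = np.zeros(len(tuning_values))
            for i, (tr, te) in enumerate(splitter.split(X)):
                for j, C in enumerate(tuning_values):
                    clf = LogisticRegression(C=C, max_iter=1000).fit(X[tr], y[tr])
                    errors[i, j] = np.mean(clf.predict(X[te]) != y[te])
                wins[np.argmin(errors[i])] += 1
            naive = errors.mean(axis=0).min()                        # over-optimistic "best value" error
            corrected = errors.mean(axis=0) @ (wins / wins.sum())    # weighted mean over tuning values
            return naive, corrected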

  15. Monitoring Error Rates In Illumina Sequencing

    Science.gov (United States)

    Manley, Leigh J.; Ma, Duanduan; Levine, Stuart S.

    2016-01-01

    Guaranteeing high-quality next-generation sequencing data in a rapidly changing environment is an ongoing challenge. The introduction of the Illumina NextSeq 500 and the depreciation of specific metrics from Illumina's Sequencing Analysis Viewer (SAV; Illumina, San Diego, CA, USA) have made it more difficult to determine directly the baseline error rate of sequencing runs. To improve our ability to measure base quality, we have created an open-source tool to construct the Percent Perfect Reads (PPR) plot, previously provided by the Illumina sequencers. The PPR program is compatible with HiSeq 2000/2500, MiSeq, and NextSeq 500 instruments and provides an alternative to Illumina's quality value (Q) scores for determining run quality. Whereas Q scores are representative of run quality, they are often overestimated and are sourced from different look-up tables for each platform. The PPR’s unique capabilities as a cross-instrument comparison device, as a troubleshooting tool, and as a tool for monitoring instrument performance can provide an increase in clarity over SAV metrics that is often crucial for maintaining instrument health. These capabilities are highlighted. PMID:27672352
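    As an illustration of the Percent Perfect Reads idea (the fraction of reads that remain error-free up to each sequencing cycle), here is a hedged Python sketch; the input format and the function name are assumptions, and real tools derive the per-cycle match information from alignments rather than taking it as given:

        import numpy as np

        def percent_perfect_reads(match):
            # match[read, cycle] is True where the called base agrees with the reference.
            # A read counts as "perfect" at cycle n if it has no mismatch in cycles 1..n.
            perfect_so_far = np.cumprod(match, axis=1).astype(bool)
            return 100.0 * perfect_so_far.mean(axis=0)   # PPR curve, one value per cycle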

  16. Understanding Human Error Based on Automated Analyses

    Data.gov (United States)

    National Aeronautics and Space Administration — This is a report on a continuing study of automated analyses of experiential textual reports to gain insight into the causal factors of human errors in aviation...

  17. Case study: error rates and paperwork design.

    Science.gov (United States)

    Drury, C G

    1998-01-01

    A job instruction document, or workcard, for civil aircraft maintenance produced a number of paperwork errors when used operationally. The design of the workcard was compared to the guidelines of Patel et al [1994, Applied Ergonomics, 25 (5), 286-293]. All of the errors occurred in work instructions which did not meet these guidelines, demonstrating that the design of documentation does affect operational performance.

  18. Simultaneous control of error rates in fMRI data analysis.

    Science.gov (United States)

    Kang, Hakmook; Blume, Jeffrey; Ombao, Hernando; Badre, David

    2015-12-01

    The key idea of statistical hypothesis testing is to fix, and thereby control, the Type I error (false positive) rate across samples of any size. Multiple comparisons inflate the global (family-wise) Type I error rate and the traditional solution to maintaining control of the error rate is to increase the local (comparison-wise) Type II error (false negative) rates. However, in the analysis of human brain imaging data, the number of comparisons is so large that this solution breaks down: the local Type II error rate ends up being so large that scientifically meaningful analysis is precluded. Here we propose a novel solution to this problem: allow the Type I error rate to converge to zero along with the Type II error rate. It works because when the Type I error rate per comparison is very small, the accumulation (or global) Type I error rate is also small. This solution is achieved by employing the likelihood paradigm, which uses likelihood ratios to measure the strength of evidence on a voxel-by-voxel basis. In this paper, we provide theoretical and empirical justification for a likelihood approach to the analysis of human brain imaging data. In addition, we present extensive simulations that show the likelihood approach is viable, leading to "cleaner"-looking brain maps and operational superiority (lower average error rate). Finally, we include a case study on cognitive control related activation in the prefrontal cortex of the human brain.
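    A toy Python sketch of the voxel-by-voxel likelihood idea: for each voxel, the likelihood of the observed mean under an assumed activation effect is compared with its likelihood under the null, and voxels whose likelihood ratio exceeds an evidence threshold are flagged. The Gaussian model, effect size mu1, and threshold k are illustrative assumptions, not the paper's analysis:

        import numpy as np
        from scipy import stats

        def voxelwise_likelihood_ratios(data, mu1=1.0, k=8.0):
            # data has shape (n_subjects, n_voxels)
            n_subj = data.shape[0]
            mean = data.mean(axis=0)
            se = data.std(axis=0, ddof=1) / np.sqrt(n_subj)   # standard error of the mean
            l0 = stats.norm.pdf(mean, loc=0.0, scale=se)      # likelihood under the null
            l1 = stats.norm.pdf(mean, loc=mu1, scale=se)      # likelihood under activation
            return (l1 / l0) > k                              # voxels with strong evidence

        # usage: flags = voxelwise_likelihood_ratios(np.random.randn(20, 10000))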

  19. Logical error rate in the Pauli twirling approximation.

    Science.gov (United States)

    Katabarwa, Amara; Geller, Michael R

    2015-09-30

    Knowledge of the performance of error correction protocols is necessary for understanding the operation of potential quantum computers, but this requires physical error models that can be simulated efficiently with classical computers. The Gottesman-Knill theorem guarantees a class of such error models. Of these, one of the simplest is the Pauli twirling approximation (PTA), which is obtained by twirling an arbitrary completely positive error channel over the Pauli basis, resulting in a Pauli channel. In this work, we test the PTA's accuracy at predicting the logical error rate by simulating the 5-qubit code using a 9-qubit circuit with realistic decoherence and unitary gate errors. We find evidence for good agreement with exact simulation, with the PTA overestimating the logical error rate by a factor of 2 to 3. Our results suggest that the PTA is a reliable predictor of the logical error rate, at least for low-distance codes.
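    The twirling operation itself is easy to state concretely: averaging a channel over conjugation by the Pauli group projects it onto a Pauli (stochastic) channel. A short Python sketch for a single qubit, using an amplitude-damping channel with an illustrative decay probability (the example channel and values are assumptions, not the paper's 9-qubit circuit):

        import numpy as np

        I = np.eye(2); X = np.array([[0, 1], [1, 0]])
        Y = np.array([[0, -1j], [1j, 0]]); Z = np.diag([1.0, -1.0])
        PAULIS = [I, X, Y, Z]

        def channel(kraus, rho):
            return sum(K @ rho @ K.conj().T for K in kraus)

        def pauli_twirl(kraus, rho):
            # Average the channel over conjugation by the Pauli group.
            out = np.zeros_like(rho, dtype=complex)
            for P in PAULIS:
                out += P.conj().T @ channel(kraus, P @ rho @ P.conj().T) @ P
            return out / 4

        gamma = 0.1   # illustrative amplitude-damping probability
        K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]])
        K1 = np.array([[0, np.sqrt(gamma)], [0, 0]])
        rho_plus = np.full((2, 2), 0.5, dtype=complex)   # |+><+| input state
        print(np.round(pauli_twirl([K0, K1], rho_plus), 3))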

  20. Medicare FFS Jurisdiction Error Rate Contribution Data

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services CMS is dedicated to continually strengthening and improving the Medicare program, which provides vital services to...

  1. PERM Error Rate Findings and Reports

    Data.gov (United States)

    U.S. Department of Health & Human Services — Federal agencies are required to annually review programs they administer and identify those that may be susceptible to significant improper payments, to estimate...

  2. Human error in daily intensive nursing care

    Directory of Open Access Journals (Sweden)

    Sabrina da Costa Machado Duarte

    2015-12-01

    Objectives: to identify the errors in daily intensive nursing care and analyze them according to the theory of human error. Method: quantitative, descriptive and exploratory study, undertaken at the Intensive Care Center of a hospital in the Brazilian Sentinel Hospital Network. The participants were 36 professionals from the nursing team. The data were collected through semi-structured interviews, observation and lexical analysis in the software ALCESTE®. Results: human error in nursing care can be related to the approach of the system, through active faults and latent conditions. The active faults are represented by the errors in medication administration and not raising the bedside rails. The latent conditions can be related to the communication difficulties in the multiprofessional team, lack of standards and institutional routines and absence of material resources. Conclusion: the errors identified interfere with nursing care and the clients' recovery and can cause damage. Nevertheless, they are treated as common events inherent in daily practice. The need to acknowledge these events is emphasized, stimulating the safety culture at the institution.

  3. Human Error Assessment in Minefield Cleaning Operation Using Human Event Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Hajiakbari

    2015-12-01

    Background & objective: Human error is one of the main causes of accidents. Due to the unreliability of the human element and the high-risk nature of demining operations, this study aimed to assess and manage the human errors likely to occur in such operations. Methods: This study was performed at a demining site in war zones located in the west of Iran. After acquiring an initial familiarity with the operations, methods, and tools of clearing minefields, the job tasks related to clearing landmines were specified. Next, these tasks were studied using HTA, and the related possible errors were assessed using ATHEANA. Results: The de-mining task was composed of four main operations: primary detection, technical identification, investigation, and neutralization. Four main causes of accidents in such operations were found: walking on mines, leaving mines with no action, errors in the neutralizing operation, and environmental explosion. The probability of human error in mine clearance operations was calculated as 0.010. Conclusion: The main causes of human error in de-mining operations can be attributed to factors such as poor weather and operating conditions (outdoor work), inappropriate personal protective equipment, personality characteristics, insufficient accuracy in the work, and insufficient available time. To reduce the probability of human error in de-mining operations, these factors should be managed properly.

  4. Error rate information in attention allocation pilot models

    Science.gov (United States)

    Faulkner, W. H.; Onstott, E. D.

    1977-01-01

    The Northrop urgency decision pilot model was used in a command tracking task to compare the optimized performance of multiaxis attention allocation pilot models whose urgency functions were (1) based on tracking error alone, and (2) based on both tracking error and error rate. A matrix of system dynamics and command inputs was employed to create both symmetric and asymmetric two-axis compensatory tracking tasks. All tasks were single loop on each axis. Analysis showed that a model that allocates control attention through nonlinear urgency functions using only error information could not achieve the performance of the full model, whose attention shifting algorithm included both error and error rate terms. Subsequent to this analysis, tracking performance predictions for the full model were verified by piloted flight simulation. Complete model and simulation data are presented.
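    A hedged Python illustration of the distinction drawn above: an attention-allocation rule attends to whichever axis currently has the larger urgency, computed either from tracking error alone or from error plus error rate. The functional form and gains are assumptions for illustration, not the Northrop urgency functions:

        def urgency(error, error_rate, a=1.0, b=0.5, use_rate=True):
            # Urgency from error alone (use_rate=False) or from error plus error rate.
            return a * abs(error) + (b * abs(error_rate) if use_rate else 0.0)

        def attended_axis(errors, error_rates, use_rate=True):
            # Index of the axis that receives control attention at this instant.
            u = [urgency(e, r, use_rate=use_rate) for e, r in zip(errors, error_rates)]
            return max(range(len(u)), key=u.__getitem__)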

  5. The cost of human error intervention

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.; Banks, W.W.; Jones, E.D.

    1994-03-01

    DOE has directed that cost-benefit analyses be conducted as part of the review process for all new DOE orders. This new policy will have the effect of ensuring that DOE analysts can justify the implementation costs of the orders that they develop. We would like to argue that a cost-benefit analysis is merely one phase of a complete risk management program -- one that would more than likely start with a probabilistic risk assessment. The safety community defines risk as the probability of failure times the severity of consequence. An engineering definition of failure can be considered in terms of physical performance, as in mean-time-between-failure; or, it can be thought of in terms of human performance, as in probability of human error. The severity of consequence of a failure can be measured along any one of a number of dimensions -- economic, political, or social. Clearly, an analysis along one dimension cannot be directly compared to another but, a set of cost-benefit analyses, based on a series of cost-dimensions, can be extremely useful to managers who must prioritize their resources. Over the last two years, DOE has been developing a series of human factors orders, directed at lowering the probability of human error -- or at least changing the distribution of those errors. The following discussion presents a series of cost-benefit analyses using historical events in the nuclear industry. However, we would first like to discuss some of the analytic cautions that must be considered when we deal with human error.
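    A toy numerical illustration of the risk definition used above (probability of failure times severity of consequence) applied to a cost-benefit comparison; every number here is invented for illustration only:

        # Expected annual loss before and after a hypothetical human factors intervention.
        p_error_before, p_error_after = 1e-2, 2e-3     # per-year probability of a critical human error
        consequence = 5_000_000                         # cost of one such failure, in dollars
        intervention_cost = 25_000                      # annual cost of the intervention
        benefit = (p_error_before - p_error_after) * consequence
        print(benefit, benefit - intervention_cost)     # expected annual benefit and net value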

  6. Human error mitigation initiative (HEMI) : summary report.

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, Susan M.; Ramos, M. Victoria; Wenner, Caren A.; Brannon, Nathan Gregory

    2004-11-01

    Despite continuing efforts to apply existing hazard analysis methods and comply with requirements, human errors persist across the nuclear weapons complex. Due to a number of factors, current retroactive and proactive methods to understand and minimize human error are highly subjective, inconsistent in numerous dimensions, and are cumbersome to characterize as thorough. An alternative and proposed method begins with leveraging historical data to understand what the systemic issues are and where resources need to be brought to bear proactively to minimize the risk of future occurrences. An illustrative analysis was performed using existing incident databases specific to Pantex weapons operations indicating systemic issues associated with operating procedures that undergo notably less development rigor relative to other task elements such as tooling and process flow. Future recommended steps to improve the objectivity, consistency, and thoroughness of hazard analysis and mitigation were delineated.

  7. Total Dose Effects on Error Rates in Linear Bipolar Systems

    Science.gov (United States)

    Buchner, Stephen; McMorrow, Dale; Bernard, Muriel; Roche, Nicholas; Dusseau, Laurent

    2007-01-01

    The shapes of single event transients in linear bipolar circuits are distorted by exposure to total ionizing dose radiation. Some transients become broader and others become narrower. Such distortions may affect SET system error rates in a radiation environment. If the transients are broadened by TID, the error rate could increase during the course of a mission, a possibility that has implications for hardness assurance.

  8. Forecasting the Euro exchange rate using vector error correction models

    NARCIS (Netherlands)

    Aarle, B. van; Bos, M.; Hlouskova, J.

    2000-01-01

    Forecasting the Euro Exchange Rate Using Vector Error Correction Models. — This paper presents an exchange rate model for the Euro exchange rates of four major currencies, namely the US dollar, the British pound, the Japanese yen and the Swiss franc. The model is based on the monetary approach of ex

  9. Design of Work Facilities to Reduce Human Error (Perancangan Fasilitas Kerja untuk Mereduksi Human Error)

    Directory of Open Access Journals (Sweden)

    Harmein Nasution

    2012-01-01

    Work equipment and environments that are not designed ergonomically can cause physical exhaustion in workers. As a result of that physical exhaustion, many defects can occur in the production lines due to human error, and musculoskeletal complaints can also arise. To overcome those effects, we applied methods for analyzing the workers' posture based on the SNQ (Standard Nordic Questionnaire), PLIBEL, QEC (Quick Exposure Check) and biomechanics. Moreover, we applied those methods to design the rolling machines and egrek grip ergonomically, so that the defects on those production lines can be minimized.

  10. Framed bit error rate testing for 100G ethernet equipment

    DEFF Research Database (Denmark)

    Rasmussen, Anders; Ruepp, Sarah Renée; Berger, Michael Stübert

    2010-01-01

    Internet users' behavioural patterns are migrating towards bandwidth-intensive applications, which require a corresponding capacity extension. The emerging 100 Gigabit Ethernet (GE) technology is a promising candidate for providing a ten-fold increase of today's available Internet transmission rate. As the need for 100 Gigabit Ethernet equipment rises, so does the need for equipment that can properly test these systems during development, deployment and use. This paper presents early results from a work-in-progress academia-industry collaboration project and elaborates on the challenges of performing bit error rate testing at 100 Gbps. In particular, we show how Bit Error Rate Testing (BERT) can be performed over an aggregated 100G Attachment Unit Interface (CAUI) by encapsulating the test data in Ethernet frames at line speed. Our results show that framed bit error rate testing can...

  11. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.; and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  12. Individual Differences and Rating Errors in First Impressions of Psychopathy

    Directory of Open Access Journals (Sweden)

    Christopher T. A. Gillen

    2016-10-01

    The current study is the first to investigate whether individual differences in personality are related to improved first impression accuracy when appraising psychopathy in female offenders from thin-slices of information. The study also investigated the types of errors laypeople make when forming these judgments. Sixty-seven undergraduates assessed 22 offenders on their level of psychopathy, violence, likability, and attractiveness. Psychopathy rating accuracy improved as rater extroversion-sociability and agreeableness increased and when neuroticism and lifestyle and antisocial characteristics decreased. These results suggest that traits associated with nonverbal rating accuracy or social functioning may be important in threat detection. Raters also made errors consistent with error management theory, suggesting that laypeople overappraise danger when rating psychopathy.

  13. Modeling of Bit Error Rate in Cascaded 2R Regenerators

    DEFF Research Database (Denmark)

    Öhman, Filip; Mørk, Jesper

    2006-01-01

    This paper presents a simple and efficient model for estimating the bit error rate in a cascade of optical 2R-regenerators. The model includes the influences of amplifier noise, finite extinction ratio and nonlinear reshaping. The interplay between the different signal impairments and the rege...

  14. Neighbourhood effects on error rates in speech production.

    Science.gov (United States)

    Stemberger, Joseph Paul

    2004-01-01

    Models of speech production differ on whether phonological neighbourhoods should affect processing, and on whether effects should be facilitatory or inhibitory. Inhibitory effects of large neighbourhoods have been argued to underlie apparent anti-frequency effects, whereby high-frequency default features are more prone to mispronunciation errors than low-frequency nondefault features. Data from the original SLIPs experiments that found apparent anti-frequency effects are analysed for neighbourhood effects. Effects are facilitatory: errors are significantly less likely for words with large numbers of neighbours that share the characteristic that is being primed for error ("friends"). Words in the neighbourhood that do not share the target characteristic ("enemies") have little effect on error rates. Neighbourhood effects do not underlie the apparent anti-frequency effects. Implications for models of speech production are discussed.

  15. How social is error observation? The neural mechanisms underlying the observation of human and machine errors.

    Science.gov (United States)

    Desmet, Charlotte; Deschrijver, Eliane; Brass, Marcel

    2014-04-01

    Recently, it has been shown that the medial prefrontal cortex (MPFC) is involved in error execution as well as error observation. Based on this finding, it has been argued that recognizing each other's mistakes might rely on motor simulation. In the current functional magnetic resonance imaging (fMRI) study, we directly tested this hypothesis by investigating whether medial prefrontal activity in error observation is restricted to situations that enable simulation. To this aim, we compared brain activity related to the observation of errors that can be simulated (human errors) with brain activity related to errors that cannot be simulated (machine errors). We show that medial prefrontal activity is not only restricted to the observation of human errors but also occurs when observing errors of a machine. In addition, our data indicate that the MPFC reflects a domain general mechanism of monitoring violations of expectancies.

  16. The 95% confidence intervals of error rates and discriminant coefficients

    Directory of Open Access Journals (Sweden)

    Shuichi Shinmura

    2015-02-01

    Fisher proposed a linear discriminant function (Fisher's LDF). From 1971, we analysed electrocardiogram (ECG) data in order to develop the diagnostic logic between normal and abnormal symptoms using Fisher's LDF and a quadratic discriminant function (QDF). Our four-year research was inferior to the decision tree logic developed by the medical doctor. After this experience, we discriminated many datasets and found four problems with discriminant analysis. A revised Optimal LDF by Integer Programming (Revised IP-OLDF), based on the minimum number of misclassifications (minimum NM) criterion, resolves three problems entirely [13, 18]. In this research, we discuss the fourth problem of discriminant analysis: there are no standard errors (SEs) of the error rate and discriminant coefficients. We propose a k-fold cross-validation method. This method offers a model selection technique and 95% confidence intervals (C.I.) of error rates and discriminant coefficients.
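    A minimal Python sketch of attaching an empirical 95% interval to a discriminant function's error rate via repeated k-fold cross-validation. The repetition scheme and percentile interval are illustrative assumptions, not the author's exact procedure:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import StratifiedKFold

        def cv_error_interval(X, y, k=10, n_repeats=100, seed=0):
            rng = np.random.RandomState(seed)
            errors = []
            for _ in range(n_repeats):
                skf = StratifiedKFold(n_splits=k, shuffle=True,
                                      random_state=rng.randint(1 << 30))
                fold_err = [np.mean(LinearDiscriminantAnalysis()
                                    .fit(X[tr], y[tr]).predict(X[te]) != y[te])
                            for tr, te in skf.split(X, y)]
                errors.append(np.mean(fold_err))
            lo, hi = np.percentile(errors, [2.5, 97.5])
            return np.mean(errors), (lo, hi)   # point estimate and empirical 95% interval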

  17. Controlling the Type I Error Rate in Stepwise Regression Analysis.

    Science.gov (United States)

    Pohlmann, John T.

    Three procedures used to control Type I error rate in stepwise regression analysis are forward selection, backward elimination, and true stepwise. In the forward selection method, a model of the dependent variable is formed by choosing the single best predictor; then the second predictor which makes the strongest contribution to the prediction of…

  18. Assessment of salivary flow rate: biologic variation and measure error.

    NARCIS (Netherlands)

    Jongerius, P.H.; Limbeek, J. van; Rotteveel, J.J.

    2004-01-01

    OBJECTIVE: To investigate the applicability of the swab method in the measurement of salivary flow rate in multiple-handicap drooling children. To quantify the measurement error of the procedure and the biologic variation in the population. STUDY DESIGN: Cohort study. METHODS: In a repeated measurem

  19. Relating Complexity and Error Rates of Ontology Concepts. More Complex NCIt Concepts Have More Errors.

    Science.gov (United States)

    Min, Hua; Zheng, Ling; Perl, Yehoshua; Halper, Michael; De Coronado, Sherri; Ochs, Christopher

    2017-05-18

    Ontologies are knowledge structures that lend support to many health-information systems. A study is carried out to assess the quality of ontological concepts based on a measure of their complexity. The results show a relation between complexity of concepts and error rates of concepts. A measure of lateral complexity defined as the number of exhibited role types is used to distinguish between more complex and simpler concepts. Using a framework called an area taxonomy, a kind of abstraction network that summarizes the structural organization of an ontology, concepts are divided into two groups along these lines. Various concepts from each group are then subjected to a two-phase QA analysis to uncover and verify errors and inconsistencies in their modeling. A hierarchy of the National Cancer Institute thesaurus (NCIt) is used as our test-bed. A hypothesis pertaining to the expected error rates of the complex and simple concepts is tested. Our study was done on the NCIt's Biological Process hierarchy. Various errors, including missing roles, incorrect role targets, and incorrectly assigned roles, were discovered and verified in the two phases of our QA analysis. The overall findings confirmed our hypothesis by showing a statistically significant difference between the amounts of errors exhibited by more laterally complex concepts vis-à-vis simpler concepts. QA is an essential part of any ontology's maintenance regimen. In this paper, we reported on the results of a QA study targeting two groups of ontology concepts distinguished by their level of complexity, defined in terms of the number of exhibited role types. The study was carried out on a major component of an important ontology, the NCIt. The findings suggest that more complex concepts tend to have a higher error rate than simpler concepts. These findings can be utilized to guide ongoing efforts in ontology QA.

  20. Human reliability, error, and human factors in power generation

    CERN Document Server

    Dhillon, B S

    2014-01-01

    Human reliability, error, and human factors in the area of power generation have been receiving increasing attention in recent years. Each year billions of dollars are spent in the area of power generation to design, construct/manufacture, operate, and maintain various types of power systems around the globe, and such systems often fail due to human error. This book compiles various recent results and data into one volume, and eliminates the need to consult many diverse sources to obtain vital information.  It enables potential readers to delve deeper into a specific area, providing the source of most of the material presented in references at the end of each chapter. Examples along with solutions are also provided at appropriate places, and there are numerous problems for testing the reader’s comprehension.  Chapters cover a broad range of topics, including general methods for performing human reliability and error analysis in power plants, specific human reliability analysis methods for nuclear power pl...

  1. DNA barcoding: error rates based on comprehensive sampling.

    Directory of Open Access Journals (Sweden)

    Christopher P Meyer

    2005-12-01

    DNA barcoding has attracted attention with promises to aid in species identification and discovery; however, few well-sampled datasets are available to test its performance. We provide the first examination of barcoding performance in a comprehensively sampled, diverse group (cypraeid marine gastropods, or cowries). We utilize previous methods for testing performance and employ a novel phylogenetic approach to calculate intraspecific variation and interspecific divergence. Error rates are estimated for (1) identifying samples against a well-characterized phylogeny, and (2) assisting in species discovery for partially known groups. We find that the lowest overall error for species identification is 4%. In contrast, barcoding performs poorly in incompletely sampled groups. Here, species delineation relies on the use of thresholds, set to differentiate between intraspecific variation and interspecific divergence. Whereas proponents envision a "barcoding gap" between the two, we find substantial overlap, leading to minimal error rates of approximately 17% in cowries. Moreover, error rates double if only traditionally recognized species are analyzed. Thus, DNA barcoding holds promise for identification in taxonomically well-understood and thoroughly sampled clades. However, the use of thresholds does not bode well for delineating closely related species in taxonomically understudied groups. The promise of barcoding will be realized only if based on solid taxonomic foundations.
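    A small Python sketch of the thresholding step discussed above: with a fixed distance cut-off, identification errors come both from conspecific pairs whose divergence exceeds the threshold and from heterospecific pairs that fall below it. The 3% threshold and the input format are illustrative assumptions:

        import numpy as np

        def threshold_error_rates(dist, species, threshold=0.03):
            # dist: square matrix of pairwise genetic distances; species: label per sample.
            species = np.asarray(species)
            same = species[:, None] == species[None, :]
            iu = np.triu_indices(len(species), k=1)
            intra = dist[iu][same[iu]]                 # intraspecific distances
            inter = dist[iu][~same[iu]]                # interspecific distances
            false_split = np.mean(intra > threshold)   # conspecifics flagged as distinct
            false_merge = np.mean(inter <= threshold)  # distinct species lumped together
            return false_split, false_merge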

  2. Research Workshop on Expert Judgment, Human Error, and Intelligent Systems

    OpenAIRE

    Silverman, Barry G.

    1993-01-01

    This workshop brought together 20 computer scientists, psychologists, and human-computer interaction (HCI) researchers to exchange results and views on human error and judgment bias. Human error is typically studied when operators undertake actions, but judgment bias is an issue in thinking rather than acting. Both topics are generally ignored by the HCI community, which is interested in designs that eliminate human error and bias tendencies. As a result, almost no one at the workshop had met...

  3. Modeling human response errors in synthetic flight simulator domain

    Science.gov (United States)

    Ntuen, Celestine A.

    1992-01-01

    This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. Experimental verification of the models will be tested in a flight quality handling simulation.

  4. Human Error Classification for the Permit to Work System by SHERPA in a Petrochemical Industry

    Directory of Open Access Journals (Sweden)

    Arash Ghasemi

    2015-12-01

    Background & objective: Occupational accidents may occur in any type of activity. Carrying out daily activities such as repair and maintenance is one of the work phases with high risk. Despite the issuance of work permits or work license systems for controlling the risks of non-routine activities, the high rate of accidents during such activities indicates the inadequacy of these systems. A major portion of this shortcoming is attributed to human errors. It is therefore necessary to identify and control the probable human errors during permit issuance. Methods: In the present study, the probable errors for four categories of work permits were identified using the SHERPA method. Then, an expert team analyzed 25500 permits issued during a period of approximately one year, and the most frequent human errors and their types were determined. Results: The "Excavation" and "Entry to confined space" permits had the most errors. Approximately 28.5 percent of all errors were related to excavation permits. The implementation error was recognized as the most frequent error type across the error taxonomy. For every category of permits, about 40% of all errors were attributed to implementation errors. Conclusion: The results may indicate weak points in the practical training for the permit-to-work system. Human error identification methods can be used to predict and reduce human errors.

  5. Individual Differences and Rating Errors in First Impressions of Psychopathy

    OpenAIRE

    Christopher T. A. Gillen; Henriette Bergstrøm; Adelle E. Forth

    2016-01-01

    The current study is the first to investigate whether individual differences in personality are related to improved first impression accuracy when appraising psychopathy in female offenders from thin-slices of information. The study also investigated the types of errors laypeople make when forming these judgments. Sixty-seven undergraduates assessed 22 offenders on their level of psychopathy, violence, likability, and attractiveness. Psychopathy rating accuracy improved as rater extroversion-...

  7. CREME96 and Related Error Rate Prediction Methods

    Science.gov (United States)

    Adams, James H., Jr.

    2012-01-01

    Predicting the rate of occurrence of single event effects (SEEs) in space requires knowledge of the radiation environment and the response of electronic devices to that environment. Several analytical models have been developed over the past 36 years to predict SEE rates. The first error rate calculations were performed by Binder, Smith and Holman. Bradford, and Pickel and Blandford in their CRIER (Cosmic-Ray-Induced-Error-Rate) analysis code, introduced the basic Rectangular Parallelepiped (RPP) method for error rate calculations. For the radiation environment at the part, both made use of the Cosmic Ray LET (Linear Energy Transfer) spectra calculated by Heinrich for various absorber depths. A more detailed model for the space radiation environment within spacecraft was developed by Adams and co-workers. This model, together with a reformulation of the RPP method published by Pickel and Blandford, was used to create the CREME (Cosmic Ray Effects on Micro-Electronics) code. About the same time, Shapiro wrote the CRUP (Cosmic Ray Upset Program) based on the RPP method published by Bradford. It was the first code to specifically take into account charge collection from outside the depletion region due to deformation of the electric field caused by the incident cosmic ray. Other early rate prediction methods and codes include the Single Event Figure of Merit, NOVICE, the Space Radiation code, and the effective flux method of Binder, which is the basis of the SEFA (Scott Effective Flux Approximation) model. By the early 1990s it was becoming clear that CREME and the other early models needed revision. This revision, CREME96, was completed and released as a WWW-based tool, one of the first of its kind. The revisions in CREME96 included improved environmental models and improved models for calculating single event effects. The need for a revision of CREME also stimulated the development of the CHIME (CRRES/SPACERAD Heavy Ion Model of the Environment) and MACREE (Modeling and
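    For orientation, rate calculations of this kind fold the device response into the environment's LET spectrum; in the simplest cross-section form (a textbook simplification assumed here, not the full RPP chord-length treatment used in CREME),

        R = \int_{L_{0}}^{L_{\max}} \sigma(L)\, \frac{d\Phi}{dL}\, dL ,

    where \sigma(L) is the measured single-event cross-section at linear energy transfer L and d\Phi/dL is the differential particle flux.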

  8. Error Rates in Users of Automatic Face Recognition Software.

    Science.gov (United States)

    White, David; Dunn, James D; Schmid, Alexandra C; Kemp, Richard I

    2015-01-01

    In recent years, wide deployment of automatic face recognition systems has been accompanied by substantial gains in algorithm performance. However, benchmarking tests designed to evaluate these systems do not account for the errors of human operators, who are often an integral part of face recognition solutions in forensic and security settings. This causes a mismatch between evaluation tests and operational accuracy. We address this by measuring user performance in a face recognition system used to screen passport applications for identity fraud. Experiment 1 measured target detection accuracy in algorithm-generated 'candidate lists' selected from a large database of passport images. Accuracy was notably poorer than in previous studies of unfamiliar face matching: participants made over 50% errors for adult target faces, and over 60% when matching images of children. Experiment 2 then compared the performance of student participants to trained passport officers, who use the system in their daily work, and found equivalent performance in these groups. Encouragingly, a group of highly trained and experienced "facial examiners" outperformed these groups by 20 percentage points. We conclude that human performance curtails the accuracy of face recognition systems, potentially reducing benchmark estimates by 50% in operational settings. Mere practice does not attenuate these limits, but the superior performance of trained examiners suggests that recruitment and selection of human operators, in combination with effective training and mentorship, can improve the operational accuracy of face recognition systems.

  9. Promoting safety improvements via potential human error audits

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, G.C. (International Mining Consultants (United Kingdom). Ergonomics and Safety Management)

    1994-08-01

    It has become increasingly recognised that human error plays a major role in mining accident causation. Moreover, it is also recognised that this aspect of accident causation has had relatively little systematic attention in the past. Recent studies within British Coal have succeeded in developing a Potential Human Error Audit as a means of targeting accident prevention initiatives. 7 refs., 2 tabs.

  10. Structured methods for identifying and correcting potential human errors in space operations.

    Science.gov (United States)

    Nelson, W R; Haney, L N; Ostrom, L T; Richards, R E

    1998-01-01

    Human performance plays a significant role in the development and operation of any complex system, and human errors are significant contributors to degraded performance, incidents, and accidents for technologies as diverse as medical systems, commercial aircraft, offshore oil platforms, nuclear power plants, and space systems. To date, serious accidents attributed to human error have fortunately been rare in space operations. However, as flight rates go up and the duration of space missions increases, the accident rate could increase unless proactive action is taken to identify and correct potential human errors in space operations. The Idaho National Engineering and Environmental Laboratory (INEEL) has developed and applied structured methods of human error analysis to identify potential human errors, assess their effects on system performance, and develop strategies to prevent the errors or mitigate their consequences. These methods are being applied in NASA-sponsored programs to the domain of commercial aviation, focusing on airplane maintenance and air traffic management. The application of human error analysis to space operations could help minimize the risks associated with human error in the design and operation of future space systems.

  11. Minimizing Symbol Error Rate for Cognitive Relaying with Opportunistic Access

    KAUST Repository

    Zafar, Ammar

    2012-12-29

    In this paper, we present an optimal resource allocation scheme (ORA) for an all-participate (AP) cognitive relay network that minimizes the symbol error rate (SER). The SER is derived and different constraints are considered on the system. We consider the cases of both individual and global power constraints, individual constraints only, and global constraints only. Numerical results show that the ORA scheme outperforms the schemes with direct link only and uniform power allocation (UPA) in terms of minimizing the SER for all three cases of constraints. Numerical results also show that the individual-constraints-only case provides the best performance at large signal-to-noise ratio (SNR).
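    As a point of reference for such SER expressions, the single-link building block for BPSK on an AWGN channel with instantaneous SNR \gamma is P_s = Q(\sqrt{2\gamma}), and averaging over Rayleigh fading with mean SNR \bar{\gamma} gives the standard result

        \bar{P}_s = \frac{1}{2}\left(1 - \sqrt{\frac{\bar{\gamma}}{1 + \bar{\gamma}}}\right).

    These are textbook formulas quoted here for orientation; the paper's end-to-end SER for the relay network under the stated power constraints is more involved.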

  12. Impact Propagation of Human Errors on Software Requirements Volatility

    Directory of Open Access Journals (Sweden)

    Zahra Askarinejadamiri

    2017-02-01

    Requirements volatility (RV) is one of the key risk sources in software development and maintenance projects because of the frequent changes made to the software. Human faults and errors are major factors contributing to requirements change in software development projects. As such, predicting requirements volatility is a challenge for risk management in the software area. Previous studies only focused on certain aspects of human error in this area. This study specifically identifies and analyses the impact of human errors on requirements gathering and requirements volatility. It proposes a model based on responses to a survey questionnaire administered to 215 participants who have experience in software requirements gathering. Exploratory factor analysis (EFA) and structural equation modelling (SEM) were used to analyse the correlation of human errors and requirements volatility. The results of the analysis confirm the correlation between human errors and RV. The results show that human actions have a higher impact on RV than human perception. The study provides insights for software management to understand the socio-technical aspects of requirements volatility in order to control risk management. Human actions and perceptions, respectively, are root causes contributing to the human errors that lead to RV.

  13. Error rates in forensic DNA analysis: definition, numbers, impact and communication.

    Science.gov (United States)

    Kloosterman, Ate; Sjerps, Marjan; Quak, Astrid

    2014-09-01

    Forensic DNA casework is currently regarded as one of the most important types of forensic evidence, and important decisions in intelligence and justice are based on it. However, errors occasionally occur and may have very serious consequences. In other domains, error rates have been defined and published. The forensic domain is lagging behind concerning this transparency for various reasons. In this paper we provide definitions and observed frequencies for different types of errors at the Human Biological Traces Department of the Netherlands Forensic Institute (NFI) over the years 2008-2012. Furthermore, we assess their actual and potential impact and describe how the NFI deals with the communication of these numbers to the legal justice system. We conclude that the observed relative frequency of quality failures is comparable to studies from clinical laboratories and genetic testing centres. Furthermore, this frequency is constant over the five-year study period. The most common causes of failures related to the laboratory process were contamination and human error. Most human errors could be corrected, whereas gross contamination in crime samples often resulted in irreversible consequences. Hence this type of contamination is identified as the most significant source of error. Of the known contamination incidents, most were detected by the NFI quality control system before the report was issued to the authorities, and thus did not lead to flawed decisions like false convictions. However in a very limited number of cases crucial errors were detected after the report was issued, sometimes with severe consequences. Many of these errors were made in the post-analytical phase. The error rates reported in this paper are useful for quality improvement and benchmarking, and contribute to an open research culture that promotes public trust. However, they are irrelevant in the context of a particular case. Here case-specific probabilities of undetected errors are needed

  14. Estimation of the minimum mRNA splicing error rate in vertebrates.

    Science.gov (United States)

    Skandalis, A

    2016-01-01

    The majority of protein coding genes in vertebrates contain several introns that are removed by the mRNA splicing machinery. Errors during splicing can generate aberrant transcripts and degrade the transmission of genetic information thus contributing to genomic instability and disease. However, estimating the error rate of constitutive splicing is complicated by the process of alternative splicing which can generate multiple alternative transcripts per locus and is particularly active in humans. In order to estimate the error frequency of constitutive mRNA splicing and avoid bias by alternative splicing we have characterized the frequency of splice variants at three loci, HPRT, POLB, and TRPV1 in multiple tissues of six vertebrate species. Our analysis revealed that the frequency of splice variants varied widely among loci, tissues, and species. However, the lowest observed frequency is quite constant among loci and approximately 0.1% aberrant transcripts per intron. Arguably this reflects the "irreducible" error rate of splicing, which consists primarily of the combination of replication errors by RNA polymerase II in splice consensus sequences and spliceosome errors in correctly pairing exons.
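    A short back-of-the-envelope consequence of the estimate above: if each intron is mis-spliced with probability \varepsilon \approx 10^{-3}, then a transcript with n introns is aberrant with probability roughly 1 - (1 - \varepsilon)^{n}. For an illustrative gene with n = 8 introns this is about 1 - 0.999^{8} \approx 0.8\% of transcripts (illustrative arithmetic, not a figure from the paper).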

  15. Error-rate performance analysis of opportunistic regenerative relaying

    KAUST Repository

    Tourki, Kamel

    2011-09-01

    In this paper, we investigate an opportunistic relaying scheme where the selected relay assists the source-destination (direct) communication. In our study, we consider a regenerative opportunistic relaying scheme in which the direct path can be considered unusable, and which takes into account the effect of possibly erroneously detected and transmitted data at the best relay. We first derive the exact statistics of each hop in terms of the probability density function (PDF). Then, the PDFs are used to determine accurate closed-form expressions for the end-to-end bit-error rate (BER) of binary phase-shift keying (BPSK) modulation, where the detector may use maximum ratio combining (MRC) or selection combining (SC). Finally, we validate our analysis by showing that performance simulation results coincide with our analytical results over a linear network (LN) architecture and considering Rayleigh fading channels. © 2011 IEEE.

  16. Measurements of Aperture Averaging on Bit-Error-Rate

    Science.gov (United States)

    Bastin, Gary L.; Andrews, Larry C.; Phillips, Ronald L.; Nelson, Richard A.; Ferrell, Bobby A.; Borbath, Michael R.; Galus, Darren J.; Chin, Peter G.; Harris, William G.; Marin, Jose A.; Burdge, Geoffrey L.; Wayne, David; Pescatore, Robert

    2005-01-01

    We report on measurements made at the Shuttle Landing Facility (SLF) runway at Kennedy Space Center of receiver aperture averaging effects on a propagating optical Gaussian beam wave over a propagation path of 1,000 m. A commercially available instrument with both transmit and receive apertures was used to transmit a modulated laser beam operating at 1550 nm through a transmit aperture of 2.54 cm. An identical model of the same instrument was used as a receiver with a single aperture that was varied in size up to 20 cm to measure the effect of receiver aperture averaging on Bit Error Rate. Simultaneous measurements were also made with a scintillometer instrument and local weather station instruments to characterize atmospheric conditions along the propagation path during the experiments.
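
    For context on what receiver aperture averaging buys, a commonly cited weak-turbulence, plane-wave approximation for the reduction in scintillation with a collecting aperture of diameter D is (quoted here from memory of the Andrews-Phillips literature, so the constant should be checked against the original):

    $$ A = \frac{\sigma_I^2(D)}{\sigma_I^2(0)} \approx \left[ 1 + 1.062\,\frac{k D^2}{4L} \right]^{-7/6}, $$

    where $k = 2\pi/\lambda$ is the optical wavenumber and $L$ is the path length. A lower irradiance variance at the detector translates directly into a lower bit error rate at a given mean received power, which is the effect the experiment above measures.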

  17. Selecting Human Error Types for Cognitive Modelling and Simulation

    NARCIS (Netherlands)

    Mioch, T.; Osterloh, J.P.; Javaux, D.

    2010-01-01

    This paper presents a method that has enabled us to make a selection of error types and error production mechanisms relevant to the HUMAN European project, and discusses the reasons underlying those choices. We claim that this method has the advantage that it is very exhaustive in determining the re

  18. Avoiding Human Error in Mission Operations: Cassini Flight Experience

    Science.gov (United States)

    Burk, Thomas A.

    2012-01-01

    Operating spacecraft is a never-ending challenge and the risk of human error is ever- present. Many missions have been significantly affected by human error on the part of ground controllers. The Cassini mission at Saturn has not been immune to human error, but Cassini operations engineers use tools and follow processes that find and correct most human errors before they reach the spacecraft. What is needed are skilled engineers with good technical knowledge, good interpersonal communications, quality ground software, regular peer reviews, up-to-date procedures, as well as careful attention to detail and the discipline to test and verify all commands that will be sent to the spacecraft. Two areas of special concern are changes to flight software and response to in-flight anomalies. The Cassini team has a lot of practical experience in all these areas and they have found that well-trained engineers with good tools who follow clear procedures can catch most errors before they get into command sequences to be sent to the spacecraft. Finally, having a robust and fault-tolerant spacecraft that allows ground controllers excellent visibility of its condition is the most important way to ensure human error does not compromise the mission.

  19. Application of human error analysis to aviation and space operations

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W.R.

    1998-03-01

    For the past several years at the Idaho National Engineering and Environmental Laboratory (INEEL) the authors have been working to apply methods of human error analysis to the design of complex systems. They have focused on adapting human reliability analysis (HRA) methods that were developed for Probabilistic Safety Assessment (PSA) for application to system design. They are developing methods so that human errors can be systematically identified during system design, the potential consequences of each error can be assessed, and potential corrective actions (e.g. changes to system design or procedures) can be identified. The primary vehicle the authors have used to develop and apply these methods has been a series of projects sponsored by the National Aeronautics and Space Administration (NASA) to apply human error analysis to aviation operations. They are currently adapting their methods and tools of human error analysis to the domain of air traffic management (ATM) systems. Under the NASA-sponsored Advanced Air Traffic Technologies (AATT) program they are working to address issues of human reliability in the design of ATM systems to support the development of a free flight environment for commercial air traffic in the US. They are also currently testing the application of their human error analysis approach for space flight operations. They have developed a simplified model of the critical habitability functions for the space station Mir, and have used this model to assess the effects of system failures and human errors that have occurred in the wake of the collision incident last year. They are developing an approach so that lessons learned from Mir operations can be systematically applied to design and operation of long-term space missions such as the International Space Station (ISS) and the manned Mars mission.

  20. CLOSED-FORM ERROR RATES OF STBC SYSTEMS AND ITS PERFORMANCE ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Hu Xianbin; Gao Yuanyuan; Yi Xiaoxin

    2006-01-01

    The closed-form solutions for the error rates of Space-Time Block Code (STBC) Multiple Phase Shift Keying (MPSK) systems are derived in this paper. With the characteristic-function-based method and the partial-integration-based method, respectively, the exact expressions of the error rates are obtained for (2,1) STBC with and without channel estimation error. Simulations show that the practical error rates accord with the theoretical ones, so the closed-form error rates are accurate references for STBC performance evaluation. With the error of pilot-assisted channel estimation, the performance of a (2,1) STBC system is degraded by about 3 dB.
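
    The kind of simulation used above to cross-check closed-form results can be sketched quickly. The snippet below is a generic Monte Carlo estimate of the (2,1) Alamouti STBC bit error rate for BPSK over flat Rayleigh fading with perfect channel knowledge; it is not the paper's characteristic-function derivation, and the SNR convention and block count are arbitrary choices.

    ```python
    # Generic Monte Carlo sketch of (2,1) Alamouti STBC with BPSK over flat
    # Rayleigh fading and perfect channel knowledge (not the cited derivation).
    import numpy as np

    rng = np.random.default_rng(0)

    def alamouti_ber(snr_db, n_blocks=200_000):
        snr = 10 ** (snr_db / 10)
        s = rng.choice([-1.0, 1.0], size=(n_blocks, 2))          # two BPSK symbols per block
        h = (rng.standard_normal((n_blocks, 2)) + 1j * rng.standard_normal((n_blocks, 2))) / np.sqrt(2)
        noise = (rng.standard_normal((n_blocks, 2)) + 1j * rng.standard_normal((n_blocks, 2))) / np.sqrt(2 * snr)
        # two symbol periods, total transmit power split between the two antennas
        r1 = (h[:, 0] * s[:, 0] + h[:, 1] * s[:, 1]) / np.sqrt(2) + noise[:, 0]
        r2 = (-h[:, 0] * s[:, 1] + h[:, 1] * s[:, 0]) / np.sqrt(2) + noise[:, 1]
        # standard Alamouti combining, then hard BPSK decisions
        y1 = np.conj(h[:, 0]) * r1 + h[:, 1] * np.conj(r2)
        y2 = np.conj(h[:, 1]) * r1 - h[:, 0] * np.conj(r2)
        decisions = np.sign(np.real(np.stack([y1, y2], axis=1)))
        return np.mean(decisions != s)

    for snr_db in (0, 5, 10, 15):
        print(snr_db, "dB ->", alamouti_ber(snr_db))
    ```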

  1. Derivation of main drivers affecting the possibility of human errors during low power and shutdown operation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Park, Jin Kyun; Kim, Jae Whan [KAERI, Daejeon (Korea, Republic of)]

    2016-05-15

    In order to estimate the possibility of human error and identify its nature, human reliability analysis (HRA) methods have been implemented. For this, various HRA methods have been developed so far: the Technique for Human Error Rate Prediction (THERP), the cause based decision tree (CBDT), the cognitive reliability and error analysis method (CREAM) and so on. Most HRA methods have been developed with a focus on full power operation of NPPs, even though human performance may affect the safety of the system more during low power and shutdown (LPSD) operation than it does when the system is in full power operation. In this regard, it is necessary to conduct research on developing an HRA method to be used in LPSD operation. As the first step of the study, the main drivers which affect the possibility of human error have been developed. Drivers, commonly called performance shaping factors (PSFs), are aspects of the human's individual characteristics, environment, organization, or task that specifically degrade or improve human performance, thus respectively increasing or decreasing the likelihood of human errors.

  2. Testing Theories of Transfer Using Error Rate Learning Curves.

    Science.gov (United States)

    Koedinger, Kenneth R; Yudelson, Michael V; Pavlik, Philip I

    2016-07-01

    We analyze naturally occurring datasets from student use of educational technologies to explore a long-standing question of the scope of transfer of learning. We contrast a faculty theory of broad transfer with a component theory of more constrained transfer. To test these theories, we develop statistical models of them. These models use latent variables to represent mental functions that are changed while learning to cause a reduction in error rates for new tasks. Strong versions of these models provide a common explanation for the variance in task difficulty and transfer. Weak versions decouple difficulty and transfer explanations by describing task difficulty with parameters for each unique task. We evaluate these models in terms of both their prediction accuracy on held-out data and their power in explaining task difficulty and learning transfer. In comparisons across eight datasets, we find that the component models provide both better predictions and better explanations than the faculty models. Weak model variations tend to improve generalization across students, but hurt generalization across items and make a sacrifice to explanatory power. More generally, the approach could be used to identify malleable components of cognitive functions, such as spatial reasoning or executive functions. Copyright © 2016 Cognitive Science Society, Inc.

  3. HUMAN ERROR QUANTIFICATION USING PERFORMANCE SHAPING FACTORS IN THE SPAR-H METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Harold S. Blackman; David I. Gertman; Ronald L. Boring

    2008-09-01

    This paper describes a cognitively based human reliability analysis (HRA) quantification technique for estimating the human error probabilities (HEPs) associated with operator and crew actions at nuclear power plants. The method described here, Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method, was developed to aid in characterizing and quantifying human performance at nuclear power plants. The intent was to develop a defensible method that would consider all factors that may influence performance. In the SPAR-H approach, calculation of HEP rates is especially straightforward, starting with pre-defined nominal error rates for cognitive vs. action-oriented tasks, and incorporating performance shaping factor multipliers upon those nominal error rates.
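
    The quantification logic described above (a nominal error rate scaled by performance shaping factor multipliers) can be illustrated in a few lines of code. The sketch below is a simplified illustration, not the official SPAR-H worksheets: the nominal values (0.01 for diagnosis, 0.001 for action) and the composite-PSF adjustment are quoted from memory of NUREG/CR-6883 and should be verified there, and the example multipliers are invented.

    ```python
    # Simplified sketch of SPAR-H-style HEP quantification: nominal HEP for the
    # task type multiplied by the performance shaping factor (PSF) multipliers.
    # Nominal values and the composite-PSF adjustment are quoted from memory of
    # NUREG/CR-6883; the example multipliers below are invented.
    from math import prod

    NOMINAL_HEP = {"diagnosis": 1e-2, "action": 1e-3}  # assumed nominal error rates

    def spar_h_hep(task_type, psf_multipliers):
        nhep = NOMINAL_HEP[task_type]
        composite = prod(psf_multipliers)
        negative = sum(1 for m in psf_multipliers if m > 1.0)
        if negative >= 3:
            # adjustment used when several negative PSFs apply; keeps HEP below 1
            return nhep * composite / (nhep * (composite - 1.0) + 1.0)
        return min(nhep * composite, 1.0)

    # Example: action task under high stress (x2), poor ergonomics (x10),
    # and a marginally adequate procedure (x5)
    print(spar_h_hep("action", [2.0, 10.0, 5.0]))   # ~0.09
    ```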

  4. Human oocytes. Error-prone chromosome-mediated spindle assembly favors chromosome segregation defects in human oocytes.

    Science.gov (United States)

    Holubcová, Zuzana; Blayney, Martyn; Elder, Kay; Schuh, Melina

    2015-06-05

    Aneuploidy in human eggs is the leading cause of pregnancy loss and several genetic disorders such as Down syndrome. Most aneuploidy results from chromosome segregation errors during the meiotic divisions of an oocyte, the egg's progenitor cell. The basis for particularly error-prone chromosome segregation in human oocytes is not known. We analyzed meiosis in more than 100 live human oocytes and identified an error-prone chromosome-mediated spindle assembly mechanism as a major contributor to chromosome segregation defects. Human oocytes assembled a meiotic spindle independently of either centrosomes or other microtubule organizing centers. Instead, spindle assembly was mediated by chromosomes and the small guanosine triphosphatase Ran in a process requiring ~16 hours. This unusually long spindle assembly period was marked by intrinsic spindle instability and abnormal kinetochore-microtubule attachments, which favor chromosome segregation errors and provide a possible explanation for high rates of aneuploidy in human eggs.

  5. Error detection in spoken human-machine interaction

    NARCIS (Netherlands)

    Krahmer, E.; Swerts, M.; Theune, Mariet; Weegels, M.

    Given the state of the art of current language and speech technology, errors are unavoidable in present-day spoken dialogue systems. Therefore, one of the main concerns in dialogue design is how to decide whether or not the system has understood the user correctly. In human-human communication,

  6. Error detection in spoken human-machine interaction

    NARCIS (Netherlands)

    Krahmer, E.; Swerts, M.; Theune, M.; Weegels, M.

    2001-01-01

    Given the state of the art of current language and speech technology, errors are unavoidable in present-day spoken dialogue systems. Therefore, one of the main concerns in dialogue design is how to decide whether or not the system has understood the user correctly. In human-human communication, dial

  7. Rates of computational errors for scoring the SIRS primary scales.

    Science.gov (United States)

    Tyner, Elizabeth A; Frederick, Richard I

    2013-12-01

    We entered item scores for the Structured Interview of Reported Symptoms (SIRS; Rogers, Bagby, & Dickens, 1991) into a spreadsheet and compared computed scores with those hand-tallied by examiners. We found that about 35% of the tests had at least 1 scoring error. Of SIRS scale scores tallied by examiners, about 8% were incorrectly summed. When the errors were corrected, only 1 SIRS classification was reclassified in the fourfold scheme used by the SIRS. We note that mistallied scores on psychological tests are common, and we review some strategies for reducing scale score errors on the SIRS. (c) 2013 APA, all rights reserved.

  8. Applications of human error analysis to aviation and space operations

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W.R.

    1998-07-01

    For the past several years at the Idaho National Engineering and Environmental Laboratory (INEEL) we have been working to apply methods of human error analysis to the design of complex systems. We have focused on adapting human reliability analysis (HRA) methods that were developed for Probabilistic Safety Assessment (PSA) for application to system design. We are developing methods so that human errors can be systematically identified during system design, the potential consequences of each error can be assessed, and potential corrective actions (e.g. changes to system design or procedures) can be identified. These applications lead to different requirements when compared with HRAs performed as part of a PSA. For example, because the analysis will begin early during the design stage, the methods must be usable when only partial design information is available. In addition, the ability to perform numerous "what if" analyses to identify and compare multiple design alternatives is essential. Finally, since the goals of such human error analyses focus on proactive design changes rather than the estimate of failure probabilities for PRA, there is more emphasis on qualitative evaluations of error relationships and causal factors than on quantitative estimates of error frequency. The primary vehicle we have used to develop and apply these methods has been a series of projects sponsored by the National Aeronautics and Space Administration (NASA) to apply human error analysis to aviation operations. The first NASA-sponsored project had the goal to evaluate human errors caused by advanced cockpit automation. Our next aviation project focused on the development of methods and tools to apply human error analysis to the design of commercial aircraft. This project was performed by a consortium comprised of INEEL, NASA, and Boeing Commercial Airplane Group. The focus of the project was aircraft design and procedures that could lead to human errors during

  9. Advanced MMIS Toward Substantial Reduction in Human Errors in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Seong, Poong Hyun; Kang, Hyun Gook [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Na, Man Gyun [Chosun Univ., Gwangju (Korea, Republic of); Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of); Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Jung, Yoensub [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of)]

    2013-04-15

    This paper aims to give an overview of the methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also make public acceptance of nuclear power extremely lower. We have to recognize there must be the possibility of human errors occurring since humans are not essentially perfect particularly under stressful conditions. However, we have the opportunity to improve such a situation through advanced information and communication technologies on the basis of lessons learned from our experiences. As important lessons, authors explained key issues associated with automation, man-machine interface, operator support systems, and procedures. Upon this investigation, we outlined the concept and technical factors to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility of nuclear safety obviously belongs to humans not to machines. Therefore, safety culture including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days were described. We expect the ideas of the advanced MMIS proposed in this paper to lead in the future direction of related researches and finally supplement the safety of NPPs.

  10. ADVANCED MMIS TOWARD SUBSTANTIAL REDUCTION IN HUMAN ERRORS IN NPPS

    Directory of Open Access Journals (Sweden)

    POONG HYUN SEONG

    2013-04-01

    This paper aims to give an overview of the methods to inherently prevent human errors and to effectively mitigate the consequences of such errors by securing defense-in-depth during plant management through the advanced man-machine interface system (MMIS). It is needless to stress the significance of human error reduction during an accident in nuclear power plants (NPPs). Unexpected shutdowns caused by human errors not only threaten nuclear safety but also make public acceptance of nuclear power extremely lower. We have to recognize there must be the possibility of human errors occurring since humans are not essentially perfect particularly under stressful conditions. However, we have the opportunity to improve such a situation through advanced information and communication technologies on the basis of lessons learned from our experiences. As important lessons, authors explained key issues associated with automation, man-machine interface, operator support systems, and procedures. Upon this investigation, we outlined the concept and technical factors to develop advanced automation, operation and maintenance support systems, and computer-based procedures using wired/wireless technology. It should be noted that the ultimate responsibility of nuclear safety obviously belongs to humans not to machines. Therefore, safety culture including education and training, which is a kind of organizational factor, should be emphasized as well. In regard to safety culture for human error reduction, several issues that we are facing these days were described. We expect the ideas of the advanced MMIS proposed in this paper to lead in the future direction of related researches and finally supplement the safety of NPPs.

  11. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    Science.gov (United States)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishap incidents are attributed to human error. As a part of quality within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factor Analysis and Classification System (HFACS) as analysis tools to identify contributing factors, assess their impact on human error events, and predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of Launch Vehicle related mishap data. This modifiable framework can be used and followed by other space and similar complex operations.
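
    For readers unfamiliar with the HEART arithmetic mentioned above, a human error probability is typically built up from a generic task unreliability scaled, for each error-producing condition (EPC), by a factor weighted by its assessed proportion of affect. The sketch below shows that calculation with invented numbers; it is a generic illustration of the published HEART formula, not the NASA analysis described in this record.

    ```python
    # Generic illustration of the HEART calculation: generic task unreliability
    # multiplied, for each error-producing condition (EPC), by
    # ((maximum effect - 1) * assessed proportion of affect + 1).
    # All numbers below are placeholders, not values from the cited study.
    def heart_hep(generic_task_unreliability, epcs):
        """epcs: iterable of (max_effect, assessed_proportion_of_affect) pairs."""
        hep = generic_task_unreliability
        for max_effect, proportion in epcs:
            hep *= (max_effect - 1.0) * proportion + 1.0
        return min(hep, 1.0)  # a probability cannot exceed 1

    # Example: generic task unreliability 0.003 with two EPCs:
    # shortage of time (max effect x11, assessed at 0.4) and
    # operator inexperience (max effect x3, assessed at 0.5)
    print(heart_hep(0.003, [(11.0, 0.4), (3.0, 0.5)]))  # -> 0.03
    ```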

  12. Human Errors - A Taxonomy for Describing Human Malfunction in Industrial Installations

    DEFF Research Database (Denmark)

    Rasmussen, J.

    1982-01-01

    This paper describes the definition and the characteristics of human errors. Different types of human behavior are classified, and their relation to different error mechanisms is analyzed. The effect of conditioning factors related to affective, motivating aspects of the work situation as well as physiological factors is also taken into consideration. The taxonomy for event analysis, including human malfunction, is presented. Possibilities for the prediction of human error are discussed. The need for careful studies in actual work situations is expressed. Such studies could provide a better understanding of the complexity of human error situations as well as the data needed to characterize these situations.

  13. Normal accidents: human error and medical equipment design.

    Science.gov (United States)

    Dain, Steven

    2002-01-01

    High-risk systems, which are typical of our technologically complex era, include not just nuclear power plants but also hospitals, anesthesia systems, and the practice of medicine and perfusion. In high-risk systems, no matter how effective safety devices are, some types of accidents are inevitable because the system's complexity leads to multiple and unexpected interactions. It is important for healthcare providers to apply a risk assessment and management process to decisions involving new equipment and procedures or staffing matters in order to minimize the residual risks of latent errors, which are amenable to correction because of the large window of opportunity for their detection. This article provides an introduction to basic risk management and error theory principles and examines ways in which they can be applied to reduce and mitigate the inevitable human errors that accompany high-risk systems. The article also discusses "human factor engineering" (HFE), the process which is used to design equipment/ human interfaces in order to mitigate design errors. The HFE process involves interaction between designers and endusers to produce a series of continuous refinements that are incorporated into the final product. The article also examines common design problems encountered in the operating room that may predispose operators to commit errors resulting in harm to the patient. While recognizing that errors and accidents are unavoidable, organizations that function within a high-risk system must adopt a "safety culture" that anticipates problems and acts aggressively through an anonymous, "blameless" reporting mechanism to resolve them. We must continuously examine and improve the design of equipment and procedures, personnel, supplies and materials, and the environment in which we work to reduce error and minimize its effects. Healthcare providers must take a leading role in the day-to-day management of the "Perioperative System" and be a role model in

  14. SCHEME (Soft Control Human error Evaluation MEthod) for advanced MCR HRA

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Inseok; Jung, Wondea [KAERI, Daejeon (Korea, Republic of); Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)]

    2015-05-15

    A number of HRA methods have been applied in relation to NPP maintenance and operation, including the Technique for Human Error Rate Prediction (THERP), Korean Human Reliability Analysis (K-HRA), Human Error Assessment and Reduction Technique (HEART), A Technique for Human Event Analysis (ATHEANA), Cognitive Reliability and Error Analysis Method (CREAM), and Simplified Plant Analysis Risk Human Reliability Assessment (SPAR-H). Most of these methods were developed considering the conventional type of Main Control Rooms (MCRs). They are still used for HRA in advanced MCRs even though the operating environment of advanced MCRs in NPPs has been considerably changed by the adoption of new human-system interfaces such as computer-based soft controls. Among the many features in advanced MCRs, soft controls are an important feature because operator actions in NPP advanced MCRs are performed through soft controls. Consequently, those conventional methods may not sufficiently consider the features of soft control execution human errors. To this end, a new framework of an HRA method for evaluating soft control execution human error is suggested by performing a soft control task analysis and literature reviews regarding widely accepted human error taxonomies. In this study, the framework of an HRA method for evaluating soft control execution human error in advanced MCRs is developed. First, the factors which an HRA method in advanced MCRs should encompass are derived based on the literature review and the soft control task analysis. Based on the derived factors, an execution HRA framework in advanced MCRs is developed, mainly focusing on the features of soft control. Moreover, since most current HRA databases deal with operation in the conventional type of MCRs and are not explicitly designed to deal with digital HSIs, an HRA database is developed under lab-scale simulation.

  15. Human error in strabismus surgery: Quantification with a sensitivity analysis

    NARCIS (Netherlands)

    S. Schutte (Sander); J.R. Polling; F.C.T. van der Helm (Frans); H.J. Simonsz (Huib)

    2009-01-01

    Background: Reoperations are frequently necessary in strabismus surgery. The goal of this study was to analyze human-error related factors that introduce variability in the results of strabismus surgery in a systematic fashion. Methods: We identified the primary factors that influence th

  16. Human error in strabismus surgery: quantification with a sensitivity analysis

    NARCIS (Netherlands)

    Schutte, S.; Polling, J.R.; Van der Helm, F.C.T.; Simonsz, H.J.

    2008-01-01

    Background- Reoperations are frequently necessary in strabismus surgery. The goal of this study was to analyze human-error related factors that introduce variability in the results of strabismus surgery in a systematic fashion. Methods- We identified the primary factors that influence the outcome of

  17. A Six Sigma Trial For Reduction of Error Rates in Pathology Laboratory.

    Science.gov (United States)

    Tosuner, Zeynep; Gücin, Zühal; Kiran, Tuğçe; Büyükpinarbaşili, Nur; Turna, Seval; Taşkiran, Olcay; Arici, Dilek Sema

    2016-01-01

    A major target of quality assurance is the minimization of error rates in order to enhance patient safety. Six Sigma is a method targeting zero error (3.4 errors per million events) used in industry. The five main principles of Six Sigma are defining, measuring, analysing, improving and controlling. Using this methodology, the causes of errors can be examined and process improvement strategies can be identified. The aim of our study was to evaluate the utility of Six Sigma methodology in error reduction in our pathology laboratory. The errors encountered between April 2014 and April 2015 were recorded by the pathology personnel. Error follow-up forms were examined by the quality control supervisor, administrative supervisor and the head of the department. Using Six Sigma methodology, the rate of errors was measured monthly and the distribution of errors at the preanalytic, analytic and postanalytical phases was analysed. Improvement strategies were proposed in the monthly intradepartmental meetings, and control of the units with high error rates was provided. Fifty-six (52.4%) of 107 recorded errors in total were at the pre-analytic phase. Forty-five errors (42%) were recorded as analytical and 6 errors (5.6%) as post-analytical. Two of the 45 errors were major irrevocable errors. The error rate was 6.8 per million in the first half of the year and 1.3 per million in the second half, decreasing by 79.77%. The Six Sigma trial in our pathology laboratory resulted in a reduction of the error rates, mainly in the pre-analytic and analytic phases.

  18. A long lifetime, low error rate RRAM design with self-repair module

    Science.gov (United States)

    Zhiqiang, You; Fei, Hu; Liming, Huang; Peng, Liu; Jishun, Kuang; Shiying, Li

    2016-11-01

    Resistive random access memory (RRAM) is one of the promising candidates for future universal memory. However, it suffers from serious error rate and endurance problems. Therefore, a technical solution to enhance endurance and reduce the error rate is greatly needed. In this paper, we propose a reliable RRAM architecture that includes two reliability modules: an error correction code (ECC) module and a self-repair module. The ECC module is used to detect errors and decrease the error rate. The self-repair module, which is proposed for the first time for RRAM, can obtain the information of error bits and repair worn-out cells by applying a repair voltage. Simulation results show that the proposed architecture achieves the lowest error rate and the longest lifetime compared to previous reliable designs. Project supported by the New Century Excellent Talents in University (No. NCET-12-0165) and the National Natural Science Foundation of China (Nos. 61472123, 61272396).

  19. Impact of translational error-induced and error-free misfolding on the rate of protein evolution.

    Science.gov (United States)

    Yang, Jian-Rong; Zhuang, Shi-Mei; Zhang, Jianzhi

    2010-10-19

    What determines the rate of protein evolution is a fundamental question in biology. Recent genomic studies revealed a surprisingly strong anticorrelation between the expression level of a protein and its rate of sequence evolution. This observation is currently explained by the translational robustness hypothesis in which the toxicity of translational error-induced protein misfolding selects for higher translational robustness of more abundant proteins, which constrains sequence evolution. However, the impact of error-free protein misfolding has not been evaluated. We estimate that a non-negligible fraction of misfolded proteins are error free and demonstrate by a molecular-level evolutionary simulation that selection against protein misfolding results in a greater reduction of error-free misfolding than error-induced misfolding. Thus, an overarching protein-misfolding-avoidance hypothesis that includes both sources of misfolding is superior to the translational robustness hypothesis. We show that misfolding-minimizing amino acids are preferentially used in highly abundant yeast proteins and that these residues are evolutionarily more conserved than other residues of the same proteins. These findings provide unambiguous support to the role of protein-misfolding-avoidance in determining the rate of protein sequence evolution.

  20. A Six Sigma approach to the rate and clinical effect of registration errors in a laboratory.

    Science.gov (United States)

    Vanker, Naadira; van Wyk, Johan; Zemlin, Annalise E; Erasmus, Rajiv T

    2010-05-01

    Laboratory errors made during the pre-analytical phase can have an impact on clinical care. Quality management tools such as Six Sigma may help improve error rates. To use elements of a Six Sigma model to establish the error rate of test registration onto the laboratory information system (LIS), and to deduce the potential clinical impact of these errors. In this retrospective study, test request forms were compared with the tests registered onto the LIS, and all errors were noted before being rectified. The error rate was calculated. The corresponding patient records were then examined to determine the actual outcome, and to deduce the potential clinical impact of the registration errors. Of the 47 543 tests requested, 72 errors were noted, resulting in an error rate of 0.151%, equating to a sigma score of 4.46. The patient records reviewed indicated that these errors could, in various ways, have impacted on clinical care. This study highlights the clinical effect of errors made during the pre-analytical phase of the laboratory testing process. Reduction of errors may be achieved through implementation of a Six Sigma programme.
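
    The sigma score quoted above can be reproduced from the reported counts with the conventional defects-per-million-opportunities (DPMO) conversion, including the customary 1.5-sigma shift; the snippet below is a back-of-the-envelope check, not the authors' calculation.

    ```python
    # Reproduce the reported sigma score from the error count using the
    # conventional DPMO-to-sigma conversion with the customary 1.5-sigma shift.
    from scipy.stats import norm

    errors, opportunities = 72, 47543
    dpmo = errors / opportunities * 1_000_000      # defects per million opportunities
    sigma = norm.ppf(1 - dpmo / 1_000_000) + 1.5   # short-term sigma level
    print(f"error rate = {errors / opportunities:.3%}, sigma = {sigma:.2f}")
    # -> error rate = 0.151%, sigma = 4.47 (the abstract reports 4.46)
    ```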

  1. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)]

    2016-10-15

    Human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of Probabilistic Safety Assessment (PSA). Several methods for analyzing human error, such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H), are used, and new methods for human reliability analysis (HRA) are under development at this time. This paper presents a dynamic HRA method for assessing human failure events, and an estimation of the human error probability for the filtered containment venting system (FCVS) is performed. The action associated with implementation of containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze the FCVS-related operator action. The distributions of the required time and the available time were developed by the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for estimating human error probability, and it can be applied to any kind of operator action, including the severe accident management strategy.
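
    The time-based logic in this record (a human failure occurs when the time required for the venting action exceeds the time available) lends itself to a simple sampling estimate. In the sketch below the lognormal distributions are invented placeholders standing in for the MAAP-derived curves and LHS samples, so only the structure of the calculation is meaningful.

    ```python
    # Hedged sketch of a time-reliability HEP estimate: sample the required and
    # available times and count how often the action cannot be completed in time.
    # The lognormal parameters are placeholders, not results from the cited paper.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    t_required = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=n)   # minutes, assumed
    t_available = rng.lognormal(mean=np.log(60.0), sigma=0.3, size=n)  # minutes, assumed

    hep = np.mean(t_required > t_available)
    print(f"estimated HEP (venting not completed in time): {hep:.4f}")
    ```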

  2. Examining rating quality in writing assessment: rater agreement, error, and accuracy.

    Science.gov (United States)

    Wind, Stefanie A; Engelhard, George

    2012-01-01

    The use of performance assessments in which human raters evaluate student achievement has become increasingly prevalent in high-stakes assessment systems such as those associated with recent policy initiatives (e.g., Race to the Top). In this study, indices of rating quality are compared between two measurement perspectives. Within the context of a large-scale writing assessment, this study focuses on the alignment between indices of rater agreement, error, and accuracy based on traditional and Rasch measurement theory perspectives. Major empirical findings suggest that Rasch-based indices of model-data fit for ratings provide information about raters that is comparable to direct measures of accuracy. The use of easily obtained approximations of direct accuracy measures holds significant implications for monitoring rating quality in large-scale rater-mediated performance assessments.

  3. Frame Rate and Human Vision

    Science.gov (United States)

    Watson, Andrew B.

    2012-01-01

    To enhance the quality of the theatre experience, the film industry is interested in achieving higher frame rates for capture and display. In this talk I will describe the basic spatio-temporal sensitivities of human vision, and how they respond to the time sequence of static images that is fundamental to cinematic presentation.

  4. Hard Data on Soft Errors: A Large-Scale Assessment of Real-World Error Rates in GPGPU

    CERN Document Server

    Haque, Imran S

    2009-01-01

    Graphics processing units (GPUs) are gaining widespread use in computational chemistry and other scientific simulation contexts because of their huge performance advantages relative to conventional CPUs. However, the reliability of GPUs in error-intolerant applications is largely unproven. In particular, a lack of error checking and correcting (ECC) capability in the memory subsystems of graphics cards has been cited as a hindrance to the acceptance of GPUs as high-performance coprocessors, but the impact of this design has not been previously quantified. In this article we present MemtestG80, our software for assessing memory error rates on NVIDIA G80 and GT200-architecture-based graphics cards. Furthermore, we present the results of a large-scale assessment of GPU error rate, conducted by running MemtestG80 on over 20,000 hosts on the Folding@home distributed computing network. Our control experiments on consumer-grade and dedicated-GPGPU hardware in a controlled environment found no errors. However, our su...

  5. Medication Error Reporting Rate and its Barriers and Facilitators among Nurses

    Directory of Open Access Journals (Sweden)

    Snor Bayazidi

    2012-11-01

    Introduction: Medication errors are among the most prevalent medical errors leading to morbidity and mortality. Effective prevention of this type of error depends on the presence of a well-organized reporting system. The purpose of this study was to explore the medication error reporting rate and its barriers and facilitators among nurses in teaching hospitals of Urmia University of Medical Sciences (Iran). Methods: In a descriptive study in 2011, 733 nurses working in Urmia teaching hospitals were included. Data was collected using a questionnaire based on the Haddon matrix. The questionnaire consisted of three items about medication error reporting rate, eight items on barriers of reporting, and seven items on facilitators of reporting. The collected data was analyzed by descriptive statistics in SPSS 14. Results: The rate of reporting medication errors among nurses was far less than the medication errors they had made. Nurses perceived that the most important barriers of reporting medication errors were blaming individuals instead of the system, consequences of reporting errors, and fear of reprimand and punishment. Some facilitating factors were also determined. Conclusion: Overall, the rate of medication errors was found to be much more than what had been reported by nurses. Therefore, it is suggested to train nurses and hospital administrators on facilitators and barriers of error reporting in order to enhance patient safety.

  6. Influenza infection rates, measurement errors and the interpretation of paired serology.

    Directory of Open Access Journals (Sweden)

    Simon Cauchemez

    Serological studies are the gold standard method to estimate influenza infection attack rates (ARs) in human populations. In a common protocol, blood samples are collected before and after the epidemic in a cohort of individuals, and a rise in haemagglutination-inhibition (HI) antibody titers during the epidemic is considered as a marker of infection. Because of inherent measurement errors, a 2-fold rise is usually considered as insufficient evidence for infection and seroconversion is therefore typically defined as a 4-fold rise or more. Here, we revisit this widely accepted 70-year old criterion. We develop a Markov chain Monte Carlo data augmentation model to quantify measurement errors and reconstruct the distribution of latent true serological status in a Vietnamese 3-year serological cohort, in which replicate measurements were available. We estimate that the 1-sided probability of a 2-fold error is 9.3% (95% Credible Interval, CI: 3.3%, 17.6%) when antibody titer is below 10 but is 20.2% (95% CI: 15.9%, 24.0%) otherwise. After correction for measurement errors, we find that the proportion of individuals with 2-fold rises in antibody titers was too large to be explained by measurement errors alone. Estimates of ARs vary greatly depending on whether those individuals are included in the definition of the infected population. A simulation study shows that our method is unbiased. The 4-fold rise case definition is relevant when aiming at a specific diagnostic for individual cases, but the justification is less obvious when the objective is to estimate ARs. In particular, it may lead to large underestimates of ARs. Determining which biological phenomenon contributes most to 2-fold rises in antibody titers is essential to assess bias with the traditional case definition and offer improved estimates of influenza ARs.
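
    To see why the 2-fold error probabilities above matter for attack-rate estimation, note that an uninfected person can show a spurious 4-fold rise when the pre-season measurement errs one dilution low and the post-season measurement errs one dilution high. The back-of-the-envelope below assumes symmetric, independent, at-most-2-fold errors, which is a simplification of the paper's model.

    ```python
    # Rough illustration (not the paper's MCMC model): probability that an
    # uninfected individual shows an apparent >=4-fold rise purely through
    # measurement error, assuming independent, symmetric, at-most-2-fold errors.
    p_2fold = 0.202   # one-sided 2-fold error rate for titers >= 10 (from the abstract)

    p_spurious_4fold = p_2fold * p_2fold   # pre-season reads low AND post-season reads high
    print(f"~{p_spurious_4fold:.1%} of uninfected individuals could appear to seroconvert")
    # -> ~4.1%, enough to bias attack-rate estimates if errors are not modelled
    ```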

  7. Rate-distortion theory and human perception.

    Science.gov (United States)

    Sims, Chris R

    2016-07-01

    The fundamental goal of perception is to aid in the achievement of behavioral objectives. This requires extracting and communicating useful information from noisy and uncertain sensory signals. At the same time, given the complexity of sensory information and the limitations of biological information processing, it is necessary that some information must be lost or discarded in the act of perception. Under these circumstances, what constitutes an 'optimal' perceptual system? This paper describes the mathematical framework of rate-distortion theory as the optimal solution to the problem of minimizing the costs of perceptual error subject to strong constraints on the ability to communicate or transmit information. Rate-distortion theory offers a general and principled theoretical framework for developing computational-level models of human perception (Marr, 1982). Models developed in this framework are capable of producing quantitatively precise explanations for human perceptual performance, while yielding new insights regarding the nature and goals of perception. This paper demonstrates the application of rate-distortion theory to two benchmark domains where capacity limits are especially salient in human perception: discrete categorization of stimuli (also known as absolute identification) and visual working memory. A software package written for the R statistical programming language is described that aids in the development of models based on rate-distortion theory.
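
    For reference, the rate-distortion function invoked in this framework is the standard information-theoretic quantity

    $$ R(D) = \min_{p(\hat{x}\mid x)\,:\ \mathbb{E}[d(X,\hat{X})] \le D} I(X;\hat{X}), $$

    i.e., the minimum information rate (in bits per sample) at which the source X can be communicated so that the expected perceptual cost d(X, X̂) stays within the distortion budget D. On this reading, an 'optimal' perceptual system is one that operates on this bound given the capacity limits of biological information processing.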

  8. Bit error rate analysis of free-space optical communication over general Malaga turbulence channels with pointing error

    KAUST Repository

    Alheadary, Wael G.

    2016-12-24

    In this work, we present a bit error rate (BER) and achievable spectral efficiency (ASE) performance of a free-space optical (FSO) link with pointing errors based on intensity modulation/direct detection (IM/DD) and heterodyne detection over a general Malaga turbulence channel. More specifically, we present exact closed-form expressions for adaptive and non-adaptive transmission. The closed-form expressions are presented in terms of generalized power series of the Meijer G-function. Moreover, asymptotic closed-form expressions are provided to validate our work. In addition, all the presented analytical results are illustrated using a selected set of numerical results.

  9. Beneficial Effects of Population Bottlenecks in an RNA Virus Evolving at Increased Error Rate

    OpenAIRE

    Cases-González, Clara E.; Arribas, María; Domingo, Esteban; Lázaro, Ester

    2008-01-01

    RNA viruses replicate their genomes with a very high error rate and constitute highly heterogeneous mutant distributions similar to the molecular quasispecies introduced to explain the evolution of prebiotic replicators. The genetic information included in a quasispecies can only be faithfully transmitted below a critical error rate. When the error threshold is crossed, the population structure disorganizes, and it is substituted by a randomly distributed mutant spectrum. For viral quasispeci...

  10. Conserved rates and patterns of transcription errors across bacterial growth states and lifestyles

    Science.gov (United States)

    Traverse, Charles C.; Ochman, Howard

    2016-01-01

    Errors that occur during transcription have received much less attention than the mutations that occur in DNA because transcription errors are not heritable and usually result in a very limited number of altered proteins. However, transcription error rates are typically several orders of magnitude higher than the mutation rate. Also, individual transcripts can be translated multiple times, so a single error can have substantial effects on the pool of proteins. Transcription errors can also contribute to cellular noise, thereby influencing cell survival under stressful conditions, such as starvation or antibiotic stress. Implementing a method that captures transcription errors genome-wide, we measured the rates and spectra of transcription errors in Escherichia coli and in endosymbionts for which mutation and/or substitution rates are greatly elevated over those of E. coli. Under all tested conditions, across all species, and even for different categories of RNA sequences (mRNA and rRNAs), there were no significant differences in rates of transcription errors, which ranged from 2.3 × 10−5 per nucleotide in mRNA of the endosymbiont Buchnera aphidicola to 5.2 × 10−5 per nucleotide in rRNA of the endosymbiont Carsonella ruddii. The similarity of transcription error rates in these bacterial endosymbionts to that in E. coli (4.63 × 10−5 per nucleotide) is all the more surprising given that genomic erosion has resulted in the loss of transcription fidelity factors in both Buchnera and Carsonella. PMID:26884158

  11. Conserved rates and patterns of transcription errors across bacterial growth states and lifestyles.

    Science.gov (United States)

    Traverse, Charles C; Ochman, Howard

    2016-03-22

    Errors that occur during transcription have received much less attention than the mutations that occur in DNA because transcription errors are not heritable and usually result in a very limited number of altered proteins. However, transcription error rates are typically several orders of magnitude higher than the mutation rate. Also, individual transcripts can be translated multiple times, so a single error can have substantial effects on the pool of proteins. Transcription errors can also contribute to cellular noise, thereby influencing cell survival under stressful conditions, such as starvation or antibiotic stress. Implementing a method that captures transcription errors genome-wide, we measured the rates and spectra of transcription errors in Escherichia coli and in endosymbionts for which mutation and/or substitution rates are greatly elevated over those of E. coli. Under all tested conditions, across all species, and even for different categories of RNA sequences (mRNA and rRNAs), there were no significant differences in rates of transcription errors, which ranged from 2.3 × 10(-5) per nucleotide in mRNA of the endosymbiont Buchnera aphidicola to 5.2 × 10(-5) per nucleotide in rRNA of the endosymbiont Carsonella ruddii. The similarity of transcription error rates in these bacterial endosymbionts to that in E. coli (4.63 × 10(-5) per nucleotide) is all the more surprising given that genomic erosion has resulted in the loss of transcription fidelity factors in both Buchnera and Carsonella.

  12. Mutual information, bit error rate and security in Wójcik's scheme

    CERN Document Server

    Zhang, Z

    2004-01-01

    In this paper the correct calculations of the mutual information of the whole transmission and the quantum bit error rate (QBER) are presented. Mistakes in the general conclusions relative to the mutual information, the quantum bit error rate (QBER) and the security in Wójcik's paper [Phys. Rev. Lett. 90, 157901 (2003)] have been pointed out and corrected.

  13. National Suicide Rates a Century after Durkheim: Do We Know Enough to Estimate Error?

    Science.gov (United States)

    Claassen, Cynthia A.; Yip, Paul S.; Corcoran, Paul; Bossarte, Robert M.; Lawrence, Bruce A.; Currier, Glenn W.

    2010-01-01

    Durkheim's nineteenth-century analysis of national suicide rates dismissed prior concerns about mortality data fidelity. Over the intervening century, however, evidence documenting various types of error in suicide data has only mounted, and surprising levels of such error continue to be routinely uncovered. Yet the annual suicide rate remains the…

  14. Agreeableness and Conscientiousness as Predictors of University Students' Self/Peer-Assessment Rating Error

    Science.gov (United States)

    Birjandi, Parviz; Siyyari, Masood

    2016-01-01

    This paper presents the results of an investigation into the role of two personality traits (i.e. Agreeableness and Conscientiousness from the Big Five personality traits) in predicting rating error in the self-assessment and peer-assessment of composition writing. The average self/peer-rating errors of 136 Iranian English major undergraduates…

  16. Applying lessons learned to enhance human performance and reduce human error for ISS operations

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W.R.

    1998-09-01

    A major component of reliability, safety, and mission success for space missions is ensuring that the humans involved (flight crew, ground crew, mission control, etc.) perform their tasks and functions as required. This includes compliance with training and procedures during normal conditions, and successful compensation when malfunctions or unexpected conditions occur. A very significant issue that affects human performance in space flight is human error. Human errors can invalidate carefully designed equipment and procedures. If certain errors combine with equipment failures or design flaws, mission failure or loss of life can occur. The control of human error during operation of the International Space Station (ISS) will be critical to the overall success of the program. As experience from Mir operations has shown, human performance plays a vital role in the success or failure of long duration space missions. The Department of Energy's Idaho National Engineering and Environmental Laboratory (INEEL) has developed a systematic approach to enhance human performance and reduce human errors for ISS operations. This approach is based on the systematic identification and evaluation of lessons learned from past space missions such as Mir to enhance the design and operation of ISS. This paper describes previous INEEL research on human error sponsored by NASA and how it can be applied to enhance human reliability for ISS.

  17. The Relationship between Human Operators' Psycho-physiological Condition and Human Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Arryum; Jang, Inseok; Kang, Hyungook; Seong, Poonghyun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)]

    2013-05-15

    The safe operation of nuclear power plants (NPPs) is substantially dependent on the performance of the human operators who operate the systems. In this environment, human errors caused by inappropriate operator performance have been considered to be critical, since they may lead to serious problems in safety-critical plants. In order to provide meaningful insights to prevent human errors and enhance human performance, operators' physiological conditions such as stress and workload have been investigated. Physiological measurements were considered reliable tools to assess stress and workload. T. Q. Tran et al. and J. B. Brooking et al. pointed out that operators' workload can be assessed using eye tracking, galvanic skin response, electroencephalograms (EEGs), heart rate, respiration and other measurements. The purpose of this study is to investigate the effect of the human operators' tension level and knowledge level on the number of human errors. For this study, experiments were conducted in a mimic of the main control room (MCR) of an NPP. It utilized the compact nuclear simulator (CNS), which is modeled on the three-loop pressurized water reactor (993 MWe, Kori units 3 and 4) in Korea, and the subjects were asked to follow the tasks described in the emergency operating procedures (EOP). During the simulation, three kinds of physiological measurement were utilized: electrocardiogram (ECG), EEG and nose temperature. Also, subjects were divided into three groups based on their knowledge of plant operation. The result shows that subjects who are tense make fewer errors. In addition, subjects at a higher knowledge level tend to be tense and make fewer errors. For the ECG data, subjects who make fewer human errors tend to be located in the higher-tension area of high SNS activity and low PSNS activity. The EEG results are also similar to the ECG results. The beta power ratio of subjects who made fewer errors was higher. Since beta

  18. Algorithm-supported visual error correction (AVEC) of heart rate measurements in dogs, Canis lupus familiaris.

    Science.gov (United States)

    Schöberl, Iris; Kortekaas, Kim; Schöberl, Franz F; Kotrschal, Kurt

    2015-12-01

    Dog heart rate (HR) is characterized by a respiratory sinus arrhythmia, which makes an automatic algorithm for error correction of HR measurements hard to apply. Here, we present a new method of error correction for HR data collected with the Polar system, including (1) visual inspection of the data, (2) a standardized way to decide with the aid of an algorithm whether or not a value is an outlier (i.e., "error"), and (3) the subsequent removal of this error from the data set. We applied our new error correction method to the HR data of 24 dogs and compared the uncorrected and corrected data, as well as the algorithm-supported visual error correction (AVEC) with the Polar error correction. The results showed that fewer values were identified as errors after AVEC than after the Polar error correction. We conclude that algorithm-supported visual error correction is more suitable for dog HR and HR variability than is the customized Polar error correction, especially because AVEC decreases the likelihood of Type I errors, preserves the natural variability in HR, and does not lead to a time shift in the data.
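
    The idea of an algorithm proposing outliers that a human then confirms can be approximated with a simple rule such as flagging inter-beat intervals that deviate from a local median by more than a tolerance. The sketch below is a generic illustration of that style of screening; the window size and the 30% tolerance are arbitrary choices, not the authors' published criterion.

    ```python
    # Generic illustration of algorithm-assisted outlier flagging for inter-beat
    # (RR) intervals: flag values deviating from a centred rolling median by more
    # than a relative tolerance, then let a human rater confirm before removal.
    import numpy as np

    def flag_rr_outliers(rr_ms, window=11, tolerance=0.30):
        rr_ms = np.asarray(rr_ms, dtype=float)
        half = window // 2
        padded = np.pad(rr_ms, half, mode="edge")
        local_median = np.array([np.median(padded[i:i + window]) for i in range(len(rr_ms))])
        return np.abs(rr_ms - local_median) > tolerance * local_median  # True = candidate error

    rr = [620, 640, 615, 300, 655, 630, 1250, 645, 660, 640]   # ms, artificial example
    print(np.where(flag_rr_outliers(rr))[0])                   # indices offered for human review
    ```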

  19. Study of bit error rate (BER) for multicarrier OFDM

    Science.gov (United States)

    Alshammari, Ahmed; Albdran, Saleh; Matin, Mohammad

    2012-10-01

    Orthogonal Frequency Division Multiplexing (OFDM) is a multicarrier technique that is being used more and more in recent wideband digital communications. It is known for its ability to handle severe channel conditions, its efficient use of spectrum and its high data rate. Therefore, it has been used in many wired and wireless communication systems such as DSL, wireless networks and 4G mobile communications. Data streams are modulated and sent over multiple subcarriers using either M-QAM or M-PSK. OFDM has lower inter-symbol interference (ISI) levels because of the low data rates of the individual carriers, which result in long symbol periods. In this paper, the BER performance of OFDM with respect to signal-to-noise ratio (SNR) is evaluated. BPSK modulation is used in a simulation-based system in order to obtain the BER over different wireless channels. These channels include additive white Gaussian noise (AWGN) and fading channels that are based on Doppler spread and delay spread. Plots of the results are compared with each other after varying some of the key parameters of the system, such as the IFFT size, the number of carriers, and the SNR. The simulation results give a visualization of what kind of BER to expect when the signal goes through those channels.
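
    As a point of reference for simulations of this kind, the textbook BPSK bit error rates that the AWGN and flat Rayleigh fading cases are usually validated against are

    $$ P_b^{\mathrm{AWGN}} = Q\!\left(\sqrt{2\gamma}\right), \qquad P_b^{\mathrm{Rayleigh}} = \frac{1}{2}\left(1 - \sqrt{\frac{\bar{\gamma}}{1 + \bar{\gamma}}}\right), $$

    where $\gamma$ is the per-bit SNR and $\bar{\gamma}$ its average over the fading; with an adequate cyclic prefix, each OFDM subcarrier sees an approximately flat channel, so per-subcarrier BPSK performance is expected to track these curves.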

  20. Multipath error in range rate measurement by PLL-transponder/GRARR/TDRS

    Science.gov (United States)

    Sohn, S. J.

    1970-01-01

Range rate errors due to specular and diffuse multipath are calculated for a tracking and data relay satellite (TDRS) using an S band Goddard range and range rate (GRARR) system modified with a phase-locked loop transponder. Carrier signal processing in the coherent turn-around transponder and the GRARR receiver is taken into account. The root-mean-square (rms) range rate error was computed for the GRARR Doppler extractor and N-cycle count range rate measurement. Curves of worst-case range rate error are presented as a function of grazing angle at the reflection point. At very low grazing angles specular scattering predominates over diffuse scattering as expected, whereas for grazing angles greater than approximately 15 deg, the diffuse multipath predominates. The range rate errors at different low orbit altitudes peaked between 5 and 10 deg grazing angles.

  1. Comparing measurement error correction methods for rate-of-change exposure variables in survival analysis.

    Science.gov (United States)

    Veronesi, Giovanni; Ferrario, Marco M; Chambless, Lloyd E

    2013-12-01

    In this article we focus on comparing measurement error correction methods for rate-of-change exposure variables in survival analysis, when longitudinal data are observed prior to the follow-up time. Motivational examples include the analysis of the association between changes in cardiovascular risk factors and subsequent onset of coronary events. We derive a measurement error model for the rate of change, estimated through subject-specific linear regression, assuming an additive measurement error model for the time-specific measurements. The rate of change is then included as a time-invariant variable in a Cox proportional hazards model, adjusting for the first time-specific measurement (baseline) and an error-free covariate. In a simulation study, we compared bias, standard deviation and mean squared error (MSE) for the regression calibration (RC) and the simulation-extrapolation (SIMEX) estimators. Our findings indicate that when the amount of measurement error is substantial, RC should be the preferred method, since it has smaller MSE for estimating the coefficients of the rate of change and of the variable measured without error. However, when the amount of measurement error is small, the choice of the method should take into account the event rate in the population and the effect size to be estimated. An application to an observational study, as well as examples of published studies where our model could have been applied, are also provided.
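
    To make the setup concrete, the sketch below simulates subjects with error-prone longitudinal measurements, estimates each subject's rate of change by ordinary least squares, and applies the simplest form of regression calibration: shrink the observed slope toward the population mean by the reliability ratio. A linear outcome stands in for the Cox model, and the measurement error variance is treated as known; both are simplifying assumptions made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n_subj, n_visits, sigma_meas = 2000, 4, 1.0
times = np.arange(n_visits, dtype=float)                 # e.g. visits at years 0..3

true_slope = rng.normal(0.5, 0.3, n_subj)                # true rate of change per subject
baseline = rng.normal(10.0, 2.0, n_subj)
obs = (baseline[:, None] + true_slope[:, None] * times
       + rng.normal(0, sigma_meas, (n_subj, n_visits)))  # error-prone measurements

# Subject-specific OLS slope = the naive rate-of-change exposure
t_c = times - times.mean()
obs_slope = (obs * t_c).sum(axis=1) / (t_c ** 2).sum()

# Regression calibration: shrink toward the mean by the reliability ratio
var_err = sigma_meas ** 2 / (t_c ** 2).sum()             # sampling variance of an OLS slope
lam = (obs_slope.var() - var_err) / obs_slope.var()      # reliability ratio
calib_slope = obs_slope.mean() + lam * (obs_slope - obs_slope.mean())

# Downstream regression of an outcome on the exposure (linear stand-in for the Cox model)
beta_true = 1.5
y = beta_true * true_slope + rng.normal(0, 0.5, n_subj)
fit = lambda x: np.polyfit(x, y, 1)[0]
print(f"true effect {beta_true}, naive {fit(obs_slope):.3f}, calibrated {fit(calib_slope):.3f}")
```

    The naive estimate is attenuated toward zero roughly by the reliability ratio, which is the bias that both RC and SIMEX aim to remove.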

  2. Topological quantum computing with a very noisy network and local error rates approaching one percent.

    Science.gov (United States)

    Nickerson, Naomi H; Li, Ying; Benjamin, Simon C

    2013-01-01

    A scalable quantum computer could be built by networking together many simple processor cells, thus avoiding the need to create a single complex structure. The difficulty is that realistic quantum links are very error prone. A solution is for cells to repeatedly communicate with each other and so purify any imperfections; however prior studies suggest that the cells themselves must then have prohibitively low internal error rates. Here we describe a method by which even error-prone cells can perform purification: groups of cells generate shared resource states, which then enable stabilization of topologically encoded data. Given a realistically noisy network (≥10% error rate) we find that our protocol can succeed provided that intra-cell error rates for initialisation, state manipulation and measurement are below 0.82%. This level of fidelity is already achievable in several laboratory systems.

  3. Error rates in forensic DNA analysis: Definition, numbers, impact and communication

    NARCIS (Netherlands)

    Kloosterman, A.; Sjerps, M.; Quak, A.

    2014-01-01

    Forensic DNA casework is currently regarded as one of the most important types of forensic evidence, and important decisions in intelligence and justice are based on it. However, errors occasionally occur and may have very serious consequences. In other domains, error rates have been defined and pub

  4. Human factors and error prevention in emergency medicine.

    Science.gov (United States)

    Bleetman, Anthony; Sanusi, Seliat; Dale, Trevor; Brace, Samantha

    2012-05-01

Emergency departments are one of the highest risk areas in health care. Emergency physicians have to assemble and manage unrehearsed multidisciplinary teams with little notice and manage critically ill patients. With greater emphasis on management and leadership skills, there is an increasing awareness of the importance of human factors in making changes to improve patient safety. Non-clinical skills are required to achieve this in an information-poor environment and to minimise the risk of errors. Training in these non-clinical skills is a mandatory component in other high-risk industries, such as aviation, and needs to be part of an emergency physician's skill set. Therefore, there remains an educational gap that we need to fill before an emergency physician is equipped to function as a team leader and manager. This review will examine the lessons from aviation and how these are applicable to emergency medicine. Solutions to averting errors are discussed and the need for formal human factors training in emergency medicine.

  5. Type I Error Rates and Power Estimates of Selected Parametric and Nonparametric Tests of Scale.

    Science.gov (United States)

    Olejnik, Stephen F.; Algina, James

    1987-01-01

    Estimated Type I Error rates and power are reported for the Brown-Forsythe, O'Brien, Klotz, and Siegal-Tukey procedures. The effect of aligning the data using deviations from group means or group medians is investigated. (RB)
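
    Type I error rates of this kind are usually estimated by Monte Carlo simulation: generate many data sets under the null hypothesis of equal scale and record how often the test rejects. The sketch below does this for the median-centred Levene test (the Brown-Forsythe procedure listed above); the sample sizes, distribution and replication count are arbitrary assumptions.

```python
import numpy as np
from scipy.stats import levene

rng = np.random.default_rng(1)
n_rep, n_per_group, alpha = 5000, 20, 0.05

rejections = 0
for _ in range(n_rep):
    # Null hypothesis true: both groups share the same scale (and location)
    a = rng.normal(0.0, 1.0, n_per_group)
    b = rng.normal(0.0, 1.0, n_per_group)
    _, p = levene(a, b, center='median')      # Brown-Forsythe variant
    rejections += p < alpha

print(f"estimated Type I error rate: {rejections / n_rep:.3f} (nominal {alpha})")
```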

  6. An error criterion for determining sampling rates in closed-loop control systems

    Science.gov (United States)

    Brecher, S. M.

    1972-01-01

    The determination of an error criterion which will give a sampling rate for adequate performance of linear, time-invariant closed-loop, discrete-data control systems was studied. The proper modelling of the closed-loop control system for characterization of the error behavior, and the determination of an absolute error definition for performance of the two commonly used holding devices are discussed. The definition of an adequate relative error criterion as a function of the sampling rate and the parameters characterizing the system is established along with the determination of sampling rates. The validity of the expressions for the sampling interval was confirmed by computer simulations. Their application solves the problem of making a first choice in the selection of sampling rates.

  8. Graphical algorithms and threshold error rates for the 2d colour code

    CERN Document Server

    Wang, D S; Hill, C D; Hollenberg, L C L

    2009-01-01

    Recent work on fault-tolerant quantum computation making use of topological error correction shows great potential, with the 2d surface code possessing a threshold error rate approaching 1% (NJoP 9:199, 2007), (arXiv:0905.0531). However, the 2d surface code requires the use of a complex state distillation procedure to achieve universal quantum computation. The colour code of (PRL 97:180501, 2006) is a related scheme partially solving the problem, providing a means to perform all Clifford group gates transversally. We review the colour code and its error correcting methodology, discussing one approximate technique based on graph matching. We derive an analytic lower bound to the threshold error rate of 6.25% under error-free syndrome extraction, while numerical simulations indicate it may be as high as 13.3%. Inclusion of faulty syndrome extraction circuits drops the threshold to approximately 0.1%.

  9. Influence of the FEC Channel Coding on Error Rates and Picture Quality in DVB Baseband Transmission

    Directory of Open Access Journals (Sweden)

    T. Kratochvil

    2006-09-01

    Full Text Available The paper deals with the component analysis of DTV (Digital Television and DVB (Digital Video Broadcasting baseband channel coding. Used FEC (Forward Error Correction error-protection codes principles are shortly outlined and the simulation model applied in Matlab is presented. Results of achieved bit and symbol error rates and corresponding picture quality evaluation analysis are presented, including the evaluation of influence of the channel coding on transmitted RGB images and their noise rates related to MOS (Mean Opinion Score. Conclusion of the paper contains comparison of DVB channel codes efficiency.

  10. Conjunction error rates on a continuous recognition memory test: little evidence for recollection.

    Science.gov (United States)

    Jones, Todd C; Atchley, Paul

    2002-03-01

    Two experiments examined conjunction memory errors on a continuous recognition task where the lag between parent words (e.g., blackmail, jailbird) and later conjunction lures (blackbird) was manipulated. In Experiment 1, contrary to expectations, the conjunction error rate was highest at the shortest lag (1 word) and decreased as the lag increased. In Experiment 2 the conjunction error rate increased significantly from a 0- to a 1-word lag, then decreased slightly from a 1- to a 5-word lag. The results provide mixed support for simple familiarity and dual-process accounts of recognition. Paradoxically, searching for an item in memory does not appear to be a good encoding task.

  11. Predicting errors from reconfiguration patterns in human brain networks.

    Science.gov (United States)

    Ekman, Matthias; Derrfuss, Jan; Tittgemeyer, Marc; Fiebach, Christian J

    2012-10-09

    Task preparation is a complex cognitive process that implements anticipatory adjustments to facilitate future task performance. Little is known about quantitative network parameters governing this process in humans. Using functional magnetic resonance imaging (fMRI) and functional connectivity measurements, we show that the large-scale topology of the brain network involved in task preparation shows a pattern of dynamic reconfigurations that guides optimal behavior. This network could be decomposed into two distinct topological structures, an error-resilient core acting as a major hub that integrates most of the network's communication and a predominantly sensory periphery showing more flexible network adaptations. During task preparation, core-periphery interactions were dynamically adjusted. Task-relevant visual areas showed a higher topological proximity to the network core and an enhancement in their local centrality and interconnectivity. Failure to reconfigure the network topology was predictive for errors, indicating that anticipatory network reconfigurations are crucial for successful task performance. On the basis of a unique network decoding approach, we also develop a general framework for the identification of characteristic patterns in complex networks, which is applicable to other fields in neuroscience that relate dynamic network properties to behavior.

  12. Joint Estimation of Contamination, Error and Demography for Nuclear DNA from Ancient Humans.

    Directory of Open Access Journals (Sweden)

    Fernando Racimo

    2016-04-01

Full Text Available When sequencing an ancient DNA sample from a hominin fossil, DNA from present-day humans involved in excavation and extraction will be sequenced along with the endogenous material. This type of contamination is problematic for downstream analyses as it will introduce a bias towards the population of the contaminating individual(s). Quantifying the extent of contamination is a crucial step as it allows researchers to account for possible biases that may arise in downstream genetic analyses. Here, we present an MCMC algorithm to co-estimate the contamination rate, sequencing error rate and demographic parameters (including drift times and admixture rates) for an ancient nuclear genome obtained from human remains, when the putative contaminating DNA comes from present-day humans. We assume we have a large panel representing the putative contaminant population (e.g. European, East Asian or African). The method is implemented in a C++ program called 'Demographic Inference with Contamination and Error' (DICE). We applied it to simulations and genome data from ancient Neanderthals and modern humans. With reasonable levels of genome sequence coverage (>3X), we find we can recover accurate estimates of all these parameters, even when the contamination rate is as high as 50%.

  13. Foreign Exchange Rate Futures Trends: Foreign Exchange Risk or Systematic Forecasting Errors?

    Directory of Open Access Journals (Sweden)

    Marcelo Cunha Medeiros

    2006-12-01

Full Text Available The forward exchange rate is widely used in international finance whenever the analysis of the expected depreciation is needed. It is also used to identify currency risk premium. The difference between the spot rate and the forward rate is supposed to be a predictor of the future movements of the spot rate. This prediction is hardly precise. The fact that the forward rate is a biased predictor of the future change in the spot rate can be attributed to a currency risk premium. The bias can also be attributed to systematic errors of the future depreciation of the currency. This paper analyzes the nature of the risk premium and of the prediction errors in using the forward rate. It will look into the efficiency and rationality of the futures market in Brazil from April 1995 to December 1998, a period of controlled exchange rates.

  14. The effect of sampling on estimates of lexical specificity and error rates.

    Science.gov (United States)

    Rowland, Caroline F; Fletcher, Sarah L

    2006-11-01

    Studies based on naturalistic data are a core tool in the field of language acquisition research and have provided thorough descriptions of children's speech. However, these descriptions are inevitably confounded by differences in the relative frequency with which children use words and language structures. The purpose of the present work was to investigate the impact of sampling constraints on estimates of the productivity of children's utterances, and on the validity of error rates. Comparisons were made between five different sized samples of wh-question data produced by one child aged 2;8. First, we assessed whether sampling constraints undermined the claim (e.g. Tomasello, 2000) that the restricted nature of early child speech reflects a lack of adultlike grammatical knowledge. We demonstrated that small samples were equally likely to under- as overestimate lexical specificity in children's speech, and that the reliability of estimates varies according to sample size. We argued that reliable analyses require a comparison with a control sample, such as that from an adult speaker. Second, we investigated the validity of estimates of error rates based on small samples. The results showed that overall error rates underestimate the incidence of error in some rarely produced parts of the system and that analyses on small samples were likely to substantially over- or underestimate error rates in infrequently produced constructions. We concluded that caution must be used when basing arguments about the scope and nature of errors in children's early multi-word productions on analyses of samples of spontaneous speech.

  15. Robot perception errors and human resolution strategies in situated human-robot dialogue

    OpenAIRE

    Schutte, Niels; Kelleher, John; MacNamee, Brian

    2017-01-01

    We performed an experiment in which human participants interacted through a natural language dialogue interface with a simulated robot to fulfil a series of object manipulation tasks. We introduced errors into the robot’s perception, and observed the resulting problems in the dialogues and their resolutions. We then introduced different methods for the user to request information about the robot’s understanding of the environment. We quantify the impact of perception errors on the dialogues, ...

  16. Latent human error analysis and efficient improvement strategies by fuzzy TOPSIS in aviation maintenance tasks.

    Science.gov (United States)

    Chiu, Ming-Chuan; Hsieh, Min-Chih

    2016-05-01

    The purposes of this study were to develop a latent human error analysis process, to explore the factors of latent human error in aviation maintenance tasks, and to provide an efficient improvement strategy for addressing those errors. First, we used HFACS and RCA to define the error factors related to aviation maintenance tasks. Fuzzy TOPSIS with four criteria was applied to evaluate the error factors. Results show that 1) adverse physiological states, 2) physical/mental limitations, and 3) coordination, communication, and planning are the factors related to airline maintenance tasks that could be addressed easily and efficiently. This research establishes a new analytic process for investigating latent human error and provides a strategy for analyzing human error using fuzzy TOPSIS. Our analysis process complements shortages in existing methodologies by incorporating improvement efficiency, and it enhances the depth and broadness of human error analysis methodology.
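
    The ranking step in such a study rests on TOPSIS. The fuzzy membership functions and criterion weights are not given in the abstract, so the sketch below shows only the crisp TOPSIS core that the fuzzy variant extends: normalise the decision matrix, weight it, measure each alternative's distance to the ideal and anti-ideal solutions, and rank by closeness coefficient. The scores and weights are invented for illustration.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Crisp TOPSIS ranking.

    matrix  : alternatives x criteria scores
    weights : criterion weights (summing to 1)
    benefit : True where larger is better, False where smaller is better
    """
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)            # vector normalisation
    v = norm * np.asarray(weights)                  # weighted normalised matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best  = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)             # closeness coefficient in [0, 1]

# Three hypothetical error factors scored on four criteria
scores  = [[7, 6, 8, 5],
           [5, 8, 6, 7],
           [8, 5, 7, 6]]
weights = [0.3, 0.3, 0.2, 0.2]
benefit = [True, True, False, True]                 # third criterion: lower is better
cc = topsis(scores, weights, benefit)
print("ranking (best first):", np.argsort(-cc))
```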

  17. On Kolmogorov asymptotics of estimators of the misclassification error rate in linear discriminant analysis

    KAUST Repository

    Zollanvari, Amin

    2013-05-24

    We provide a fundamental theorem that can be used in conjunction with Kolmogorov asymptotic conditions to derive the first moments of well-known estimators of the actual error rate in linear discriminant analysis of a multivariate Gaussian model under the assumption of a common known covariance matrix. The estimators studied in this paper are plug-in and smoothed resubstitution error estimators, both of which have not been studied before under Kolmogorov asymptotic conditions. As a result of this work, we present an optimal smoothing parameter that makes the smoothed resubstitution an unbiased estimator of the true error. For the sake of completeness, we further show how to utilize the presented fundamental theorem to achieve several previously reported results, namely the first moment of the resubstitution estimator and the actual error rate. We provide numerical examples to show the accuracy of the succeeding finite sample approximations in situations where the number of dimensions is comparable or even larger than the sample size.
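
    To make the two estimators concrete, the sketch below draws two Gaussian classes with a common covariance, fits the usual linear discriminant, and computes the resubstitution estimate (the error rate of the rule on its own training data) and the plug-in estimate Phi(-D/2), where D is the estimated Mahalanobis distance between the class means. It is a toy illustration of the quantities being studied, not the Kolmogorov-asymptotic analysis of the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
p, n = 5, 60                                   # dimension, samples per class
mu0, mu1 = np.zeros(p), np.full(p, 0.6)

x0 = rng.multivariate_normal(mu0, np.eye(p), n)
x1 = rng.multivariate_normal(mu1, np.eye(p), n)

m0, m1 = x0.mean(axis=0), x1.mean(axis=0)
pooled = ((x0 - m0).T @ (x0 - m0) + (x1 - m1).T @ (x1 - m1)) / (2 * n - 2)
w = np.linalg.solve(pooled, m1 - m0)           # discriminant direction
c = w @ (m0 + m1) / 2                          # midpoint threshold

# Resubstitution: apply the trained rule to its own training data
labels = np.r_[np.zeros(n), np.ones(n)]
scores = np.r_[x0 @ w, x1 @ w]
resub = np.mean((scores > c) != labels)

# Plug-in: Gaussian error formula evaluated at the estimated parameters
d_hat = np.sqrt((m1 - m0) @ np.linalg.solve(pooled, m1 - m0))
plug_in = norm.cdf(-d_hat / 2)

print(f"resubstitution estimate: {resub:.3f}, plug-in estimate: {plug_in:.3f}")
```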

  18. On Kolmogorov Asymptotics of Estimators of the Misclassification Error Rate in Linear Discriminant Analysis.

    Science.gov (United States)

    Zollanvari, Amin; Genton, Marc G

    2013-08-01

    We provide a fundamental theorem that can be used in conjunction with Kolmogorov asymptotic conditions to derive the first moments of well-known estimators of the actual error rate in linear discriminant analysis of a multivariate Gaussian model under the assumption of a common known covariance matrix. The estimators studied in this paper are plug-in and smoothed resubstitution error estimators, both of which have not been studied before under Kolmogorov asymptotic conditions. As a result of this work, we present an optimal smoothing parameter that makes the smoothed resubstitution an unbiased estimator of the true error. For the sake of completeness, we further show how to utilize the presented fundamental theorem to achieve several previously reported results, namely the first moment of the resubstitution estimator and the actual error rate. We provide numerical examples to show the accuracy of the succeeding finite sample approximations in situations where the number of dimensions is comparable or even larger than the sample size.

  19. On zero-rate error exponent for BSC with noisy feedback

    CERN Document Server

    Burnashev, Marat V

    2008-01-01

    For the information transmission a binary symmetric channel is used. There is also another noisy binary symmetric channel (feedback channel), and the transmitter observes without delay all the outputs of the forward channel via that feedback channel. The transmission of a nonexponential number of messages (i.e. the transmission rate equals zero) is considered. The achievable decoding error exponent for such a combination of channels is investigated. It is shown that if the crossover probability of the feedback channel is less than a certain positive value, then the achievable error exponent is better than the similar error exponent of the no-feedback channel. The transmission method described and the corresponding lower bound for the error exponent can be strengthened, and also extended to the positive transmission rates.

  20. Entanglement can increase asymptotic rates of zero-error classical communication over classical channels

    CERN Document Server

    Leung, Debbie; Matthews, William; Ozols, Maris; Roy, Aidan

    2010-01-01

    It is known that the number of different classical messages which can be communicated with a single use of a classical channel with zero probability of decoding error can sometimes be increased by using entanglement shared between sender and receiver. It has been an open question to determine whether entanglement can ever offer an advantage in terms of the zero-error communication rates achievable in the limit of many channel uses. In this paper we show, by explicit examples, that entanglement can indeed increase asymptotic zero-error capacity. Interestingly, in our examples the quantum protocols are based on the root systems of the exceptional Lie groups E7 and E8.

  1. A Very Efficient Transfer Function Bounding Technique on Bit Error Rate for Viterbi Decoded, Rate 1/N Convolutional Codes

    Science.gov (United States)

    Lee, P. J.

    1984-01-01

    For rate 1/N convolutional codes, a recursive algorithm for finding the transfer function bound on bit error rate (BER) at the output of a Viterbi decoder is described. This technique is very fast and requires very little storage since all the unnecessary operations are eliminated. Using this technique, we find and plot bounds on the BER performance of known codes of rate 1/2 with K 18, rate 1/3 with K 14. When more than one reported code with the same parameter is known, we select the code that minimizes the required signal to noise ratio for a desired bit error rate of 0.000001. This criterion of determining goodness of a code had previously been found to be more useful than the maximum free distance criterion and was used in the code search procedures of very short constraint length codes. This very efficient technique can also be used for searches of longer constraint length codes.

  2. Difference of soft error rates in SOI SRAM induced by various high energy ion species

    Energy Technology Data Exchange (ETDEWEB)

    Abo, Satoshi, E-mail: abo@cqst.osaka-u.ac.jp [Center for Quantum Science and Technology Under Extreme Conditions, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka 560-8531 (Japan); Masuda, Naoyuki; Wakaya, Fujio; Lohner, Tivadar [Center for Quantum Science and Technology Under Extreme Conditions, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka 560-8531 (Japan); Onoda, Shinobu; Makino, Takahiro; Hirao, Toshio; Ohshima, Takeshi [Semiconductor Analysis and Radiation Effects Group, Environment and Industrial Materials Research Division, Quantum Beam Science Directorate, Japan Atomic Energy Agency, 1233 Watanuki-machi, Takasaki, Gunma 370-1292 (Japan); Iwamatsu, Toshiaki; Oda, Hidekazu [Advanced Device Technology Department, Production and Technology Unit, Devices and Analysis Technology Division, Renesas Electronics Corporation, 751, Horiguchi, Hitachinaka, Ibaraki 312-8504 (Japan); Takai, Mikio [Center for Quantum Science and Technology Under Extreme Conditions, Osaka University, 1-3 Machikaneyama, Toyonaka, Osaka 560-8531 (Japan)

    2012-02-15

    Soft error rates in silicon-on-insulator (SOI) static random access memories (SRAMs) with a technology node of 90 nm have been investigated by beryllium and carbon ion probes. The soft error rates induced by beryllium and carbon probes started to increase with probe energies of 5.0 and 8.5 MeV, in which probes slightly penetrated the over-layer, and were saturated with energies at and above 7.0 and 9.0 MeV, in which the generated charge in the SOI body was more than the critical charge. The soft error rates in the SOI SRAMs by various ion probes were also compared with the generated charge in the SOI body. The soft error rates induced by hydrogen and helium ion probes were 1-2 orders of magnitude lower than those by beryllium, carbon and oxygen ion probes. The soft error rates depend not only on the generated charge in the SOI body but also on the incident ion species.

  3. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    Energy Technology Data Exchange (ETDEWEB)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-08-15

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces that are based on computer-based technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are a particularly important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, and touch screens, operators can select a specific screen, then choose the controller, and finally manipulate the given devices. Due to the different interfaces between soft control and hardwired conventional type control, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for advanced MCRs. In other words, new human error modes should be considered for interface management tasks such as navigation tasks, and icon (device) selection tasks in monitors and a new framework of HRA method taking these newly generated human error modes into account should be considered. In this paper, a conceptual framework for a HRA method for the evaluation of soft control execution human error in advanced MCRs is suggested by analyzing soft control tasks.

  4. High speed and adaptable error correction for megabit/s rate quantum key distribution.

    Science.gov (United States)

    Dixon, A R; Sato, H

    2014-12-02

    Quantum Key Distribution is moving from its theoretical foundation of unconditional security to rapidly approaching real world installations. A significant part of this move is the orders of magnitude increases in the rate at which secure key bits are distributed. However, these advances have mostly been confined to the physical hardware stage of QKD, with software post-processing often being unable to support the high raw bit rates. In a complete implementation this leads to a bottleneck limiting the final secure key rate of the system unnecessarily. Here we report details of equally high rate error correction which is further adaptable to maximise the secure key rate under a range of different operating conditions. The error correction is implemented both in CPU and GPU using a bi-directional LDPC approach and can provide 90-94% of the ideal secure key rate over all fibre distances from 0-80 km.

  5. Block Recovery Rate-Based Unequal Error Protection for Three-Screen TV

    Directory of Open Access Journals (Sweden)

    Hojin Ha

    2017-02-01

Full Text Available This paper describes a three-screen television system using a block recovery rate (BRR)-based unequal error protection (UEP) scheme. The proposed in-home wireless network uses scalable video coding (SVC) and UEP with forward error correction (FEC) for maximizing the quality of service (QoS) over error-prone wireless networks. For efficient FEC packet assignment, this paper proposes a simple and efficient performance metric, the BRR, defined as the recovery rate of a temporal and quality layer from FEC assignment, obtained by analyzing the hierarchical prediction structure including the current packet loss. It also explains the SVC layer switching scheme according to network conditions such as packet loss rate (PLR) and available bandwidth (ABW). In the experiments conducted, gains in video quality with the proposed UEP scheme vary from 1 to 3 dB in Y-peak signal-to-noise ratio (PSNR) with corresponding subjective video quality improvements.

  6. Per-beam, planar IMRT QA passing rates do not predict clinically relevant patient dose errors

    Energy Technology Data Exchange (ETDEWEB)

    Nelms, Benjamin E.; Zhen Heming; Tome, Wolfgang A. [Canis Lupus LLC and Department of Human Oncology, University of Wisconsin, Merrimac, Wisconsin 53561 (United States); Department of Medical Physics, University of Wisconsin, Madison, Wisconsin 53705 (United States); Departments of Human Oncology, Medical Physics, and Biomedical Engineering, University of Wisconsin, Madison, Wisconsin 53792 (United States)

    2011-02-15

Purpose: The purpose of this work is to determine the statistical correlation between per-beam, planar IMRT QA passing rates and several clinically relevant, anatomy-based dose errors for per-patient IMRT QA. The intent is to assess the predictive power of a common conventional IMRT QA performance metric, the Gamma passing rate per beam. Methods: Ninety-six unique data sets were created by inducing four types of dose errors in 24 clinical head and neck IMRT plans, each planned with 6 MV Varian 120-leaf MLC linear accelerators using a commercial treatment planning system and step-and-shoot delivery. The error-free beams/plans were used as "simulated measurements" (for generating the IMRT QA dose planes and the anatomy dose metrics) to compare to the corresponding data calculated by the error-induced plans. The degree of the induced errors was tuned to mimic IMRT QA passing rates that are commonly achieved using conventional methods. Results: Analysis of clinical metrics (parotid mean doses, spinal cord max and D1cc, CTV D95, and larynx mean) vs IMRT QA Gamma analysis (3%/3 mm, 2/2, 1/1) showed that in all cases, there were only weak to moderate correlations (range of Pearson's r-values: -0.295 to 0.653). Moreover, the moderate correlations actually had positive Pearson's r-values (i.e., clinically relevant metric differences increased with increasing IMRT QA passing rate), indicating that some of the largest anatomy-based dose differences occurred in the cases of high IMRT QA passing rates, which may be called "false negatives." The results also show numerous instances of false positives or cases where low IMRT QA passing rates do not imply large errors in anatomy dose metrics. In none of the cases was there correlation consistent with high predictive power of planar IMRT passing rates, i.e., in none of the cases did high IMRT QA Gamma passing rates predict low errors in anatomy dose metrics or vice versa.

  7. The study of error for analysis in dynamic image from the error of count rates in Nal (Tl) scintillation camera

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Joo Young; Kang, Chun Goo; Kim, Jung Yul; Oh, Ki Baek; Kim, Jae Sam [Dept. of Nuclear Medicine, Severance Hospital, Yonsei University, Seoul (Korea, Republic of); Park, Hoon Hee [Dept. of Radiological Technology, Shingu college, Sungnam (Korea, Republic of)

    2013-12-15

This study aimed to evaluate the effect of T1/2 on count rates in the analysis of dynamic scans using a NaI(Tl) scintillation camera, and to suggest a new quality control method based on this effect. We produced point sources of 99mTcO4- with 18.5 to 185 MBq in 2 mL syringes, and acquired 30 frames of dynamic images at 10 to 60 seconds each using an Infinia gamma camera (GE, USA). In the second experiment, 90 frames of dynamic images were acquired from a 74 MBq point source by 5 gamma cameras (Infinia 2, Forte 2, Argus 1). In the first experiment there were no significant differences in average count rates for sources of 18.5 to 92.5 MBq analyzed at 10 to 60 seconds/frame in 10-second intervals (p>0.05), but average count rates were significantly lower for sources over 111 MBq at 60 seconds/frame (p<0.01). In the second analysis, linear regression of the count rates of the 5 gamma cameras acquired over 90 minutes showed that the counting efficiency of the fourth gamma camera was lowest at 0.0064%, while its gradient and coefficient of variation were highest at 0.0042 and 0.229, respectively. We found no abnormal fluctuation in a χ² test of the count rates (p>0.02), and Levene's F-test showed homogeneity of variance among the gamma cameras (p>0.05). In the correlation analysis, the only significant correlation was a negative correlation between counting efficiency and gradient (r=-0.90, p<0.05). Lastly, calculating the T1/2 error from a change of gradient between -0.25% and +0.25% showed that the error increases when T1/2 is relatively long or the gradient is high. Estimating the value for the 4th camera, which has the highest gradient, we could not see a T1/2 error within 60 minutes at that value. In conclusion, it is necessary for scintillation gamma cameras in the medical field to manage hard for the quality of radiation

  8. A novel multitemporal insar model for joint estimation of deformation rates and orbital errors

    KAUST Repository

    Zhang, Lei

    2014-06-01

Orbital errors, characterized typically as long-wavelength artifacts, commonly exist in interferometric synthetic aperture radar (InSAR) imagery as a result of inaccurate determination of the sensor state vector. Orbital errors degrade the precision of multitemporal InSAR products (i.e., ground deformation). Although research on orbital error reduction has been ongoing for nearly two decades and several algorithms for reducing the effect of the errors are already in existence, the errors cannot always be corrected efficiently and reliably. We propose a novel model that is able to jointly estimate deformation rates and orbital errors based on the different spatiotemporal characteristics of the two types of signals. The proposed model is able to isolate a long-wavelength ground motion signal from the orbital error even when the two types of signals exhibit similar spatial patterns. The proposed algorithm is efficient and requires no ground control points. In addition, the method is built upon wrapped phases of interferograms, eliminating the need of phase unwrapping. The performance of the proposed model is validated using both simulated and real data sets. The demo codes of the proposed model are also provided for reference. © 2013 IEEE.

  9. Analytical expression for the bit error rate of cascaded all-optical regenerators

    DEFF Research Database (Denmark)

    Mørk, Jesper; Öhman, Filip; Bischoff, S.

    2003-01-01

We derive an approximate analytical expression for the bit error rate of cascaded fiber links containing all-optical 2R-regenerators. A general analysis of the interplay between noise due to amplification and the degree of reshaping (nonlinearity) of the regenerator is performed.

  10. Air pollution and human fertility rates

    NARCIS (Netherlands)

    Nieuwenhuijsen, Mark J.; Basagaña, Xavier; Dadvand, Payam; Martinez, David; Cirach, Marta; Beelen, Rob; Jacquemin, Bénédicte

    2014-01-01

    Background: Some reports have suggested effects of air pollution on semen quality and success rates of in vitro fertilization (IVF) in humans and lower fertility rates in mice. However, no studies have evaluated the impact of air pollution on human fertility rates. Aims: We assessed the association

  12. Comparison of risk sensitivity to human errors in the Oconee and LaSalle PRAs

    Energy Technology Data Exchange (ETDEWEB)

    Wong, S.; Higgins, J.

    1991-01-01

This paper describes the comparative analyses of plant risk sensitivity to human errors in the Oconee and La Salle Probabilistic Risk Assessments (PRAs). These analyses were performed to determine the reasons for the observed differences in the sensitivity of core melt frequency (CMF) to changes in human error probabilities (HEPs). Plant-specific design features, PRA methods, and the level of detail and assumptions in the human error modeling were evaluated to assess their influence on risk estimates and sensitivities.

  13. Demonstrating the robustness of population surveillance data: implications of error rates on demographic and mortality estimates

    Directory of Open Access Journals (Sweden)

    Berhane Yemane

    2008-03-01

    Full Text Available Abstract Background As in any measurement process, a certain amount of error may be expected in routine population surveillance operations such as those in demographic surveillance sites (DSSs. Vital events are likely to be missed and errors made no matter what method of data capture is used or what quality control procedures are in place. The extent to which random errors in large, longitudinal datasets affect overall health and demographic profiles has important implications for the role of DSSs as platforms for public health research and clinical trials. Such knowledge is also of particular importance if the outputs of DSSs are to be extrapolated and aggregated with realistic margins of error and validity. Methods This study uses the first 10-year dataset from the Butajira Rural Health Project (BRHP DSS, Ethiopia, covering approximately 336,000 person-years of data. Simple programmes were written to introduce random errors and omissions into new versions of the definitive 10-year Butajira dataset. Key parameters of sex, age, death, literacy and roof material (an indicator of poverty were selected for the introduction of errors based on their obvious importance in demographic and health surveillance and their established significant associations with mortality. Defining the original 10-year dataset as the 'gold standard' for the purposes of this investigation, population, age and sex compositions and Poisson regression models of mortality rate ratios were compared between each of the intentionally erroneous datasets and the original 'gold standard' 10-year data. Results The composition of the Butajira population was well represented despite introducing random errors, and differences between population pyramids based on the derived datasets were subtle. Regression analyses of well-established mortality risk factors were largely unaffected even by relatively high levels of random errors in the data. Conclusion The low sensitivity of parameter

  15. Finding the right coverage : The impact of coverage and sequence quality on SNP genotyping error rates

    NARCIS (Netherlands)

    Fountain, Emily D; Pauli, Jonathan N; Reid, Brendan N; Palsbøll, Per J; Peery, M Zachariah

    2016-01-01

    Restriction-enzyme-based sequencing methods enable the genotyping of thousands of single nucleotide polymorphism (SNP) loci in non-model organisms. However, in contrast to traditional genetic markers, genotyping error rates in SNPs derived from restriction-enzyme-based methods remain largely unknown

  16. Impact of Spacecraft Shielding on Direct Ionization Soft Error Rates for sub-130 nm Technologies

    Science.gov (United States)

    Pellish, Jonathan A.; Xapsos, Michael A.; Stauffer, Craig A.; Jordan, Michael M.; Sanders, Anthony B.; Ladbury, Raymond L.; Oldham, Timothy R.; Marshall, Paul W.; Heidel, David F.; Rodbell, Kenneth P.

    2010-01-01

    We use ray tracing software to model various levels of spacecraft shielding complexity and energy deposition pulse height analysis to study how it affects the direct ionization soft error rate of microelectronic components in space. The analysis incorporates the galactic cosmic ray background, trapped proton, and solar heavy ion environments as well as the October 1989 and July 2000 solar particle events.

  17. Performance Analysis for Bit Error Rate of DS- CDMA Sensor Network Systems with Source Coding

    Directory of Open Access Journals (Sweden)

    Haider M. AlSabbagh

    2012-03-01

Full Text Available The minimum energy (ME) coding combined with a DS-CDMA wireless sensor network is analyzed in order to reduce the energy consumed and the multiple access interference (MAI) in relation to the number of users (receivers). The minimum energy coding exploits redundant bits to save power, utilizing an RF link and on-off keying modulation. The relations are presented and discussed for several levels of errors expected in the employed channel, in terms of the bit error rate and the SNR for different numbers of users (receivers).
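
    Minimum energy coding for on-off keying saves energy by spending codeword '1's, and hence transmitted chips, only where they are hardest to avoid: the most probable source symbols receive the lowest-weight codewords. The sketch below illustrates the idea with an assumed 4-bit codeword alphabet and an invented symbol distribution; the DS-CDMA interference analysis of the paper is not reproduced.

```python
from itertools import product

def me_codebook(n_symbols, n_bits):
    """Assign the lowest Hamming-weight n-bit codewords to the most probable symbols."""
    words = sorted(product((0, 1), repeat=n_bits), key=sum)
    return words[:n_symbols]

# Hypothetical source: 8 symbols with probabilities in descending order
probs = [0.30, 0.20, 0.15, 0.12, 0.10, 0.06, 0.04, 0.03]
code = me_codebook(len(probs), n_bits=4)

# Average number of 'on' chips per symbol, a proxy for transmitted energy under OOK
me_energy = sum(p * sum(w) for p, w in zip(probs, code))
plain_energy = sum(p * bin(i).count("1") for i, p in enumerate(probs))   # natural 3-bit binary
print(f"ME coding:    {me_energy:.2f} 'on' chips per symbol (4-bit codewords)")
print(f"plain binary: {plain_energy:.2f} 'on' chips per symbol (3-bit codewords)")
```

    The redundant fourth bit is the price paid for the larger pool of low-weight codewords, which is how ME coding trades extra bits for lower transmitted energy.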

  18. A minimum bit error-rate detector for amplify and forward relaying systems

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2012-05-01

In this paper, a new detector is proposed for an amplify-and-forward (AF) relaying system when communicating with the assistance of L relays. The major goal of this detector is to improve the bit error rate (BER) performance of the system. The complexity of the system is further reduced by implementing this detector adaptively. The proposed detector is free from channel estimation. Our results demonstrate that the proposed detector is capable of achieving a gain of more than 1 dB at a BER of 10^-5 as compared to the conventional minimum mean square error detector when communicating over a correlated Rayleigh fading channel. © 2012 IEEE.

  19. Linear transceiver design for nonorthogonal amplify-and-forward protocol using a bit error rate criterion

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2014-04-01

    The ever growing demand of higher data rates can now be addressed by exploiting cooperative diversity. This form of diversity has become a fundamental technique for achieving spatial diversity by exploiting the presence of idle users in the network. This has led to new challenges in terms of designing new protocols and detectors for cooperative communications. Among various amplify-and-forward (AF) protocols, the half duplex non-orthogonal amplify-and-forward (NAF) protocol is superior to other AF schemes in terms of error performance and capacity. However, this superiority is achieved at the cost of higher receiver complexity. Furthermore, in order to exploit the full diversity of the system an optimal precoder is required. In this paper, an optimal joint linear transceiver is proposed for the NAF protocol. This transceiver operates on the principles of minimum bit error rate (BER), and is referred as joint bit error rate (JBER) detector. The BER performance of JBER detector is superior to all the proposed linear detectors such as channel inversion, the maximal ratio combining, the biased maximum likelihood detectors, and the minimum mean square error. The proposed transceiver also outperforms previous precoders designed for the NAF protocol. © 2002-2012 IEEE.

  20. Tissue pattern recognition error rates and tumor heterogeneity in gastric cancer.

    Science.gov (United States)

    Potts, Steven J; Huff, Sarah E; Lange, Holger; Zakharov, Vladislav; Eberhard, David A; Krueger, Joseph S; Hicks, David G; Young, George David; Johnson, Trevor; Whitney-Miller, Christa L

    2013-01-01

    The anatomic pathology discipline is slowly moving toward a digital workflow, where pathologists will evaluate whole-slide images on a computer monitor rather than glass slides through a microscope. One of the driving factors in this workflow is computer-assisted scoring, which depends on appropriate selection of regions of interest. With advances in tissue pattern recognition techniques, a more precise region of the tissue can be evaluated, no longer bound by the pathologist's patience in manually outlining target tissue areas. Pathologists use entire tissues from which to determine a score in a region of interest when making manual immunohistochemistry assessments. Tissue pattern recognition theoretically offers this same advantage; however, error rates exist in any tissue pattern recognition program, and these error rates contribute to errors in the overall score. To provide a real-world example of tissue pattern recognition, 11 HER2-stained upper gastrointestinal malignancies with high heterogeneity were evaluated. HER2 scoring of gastric cancer was chosen due to its increasing importance in gastrointestinal disease. A method is introduced for quantifying the error rates of tissue pattern recognition. The trade-off between fully sampling tumor with a given tissue pattern recognition error rate versus randomly sampling a limited number of fields of view with higher target accuracy was modeled with a Monte-Carlo simulation. Under most scenarios, stereological methods of sampling-limited fields of view outperformed whole-slide tissue pattern recognition approaches for accurate immunohistochemistry analysis. The importance of educating pathologists in the use of statistical sampling is discussed, along with the emerging role of hybrid whole-tissue imaging and stereological approaches.
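
    The trade-off modelled in the paper can be sketched with a small Monte Carlo experiment: treat a slide as a set of fields with known positivity, let whole-slide pattern recognition score every field but mislabel a fraction of non-tumor fields as tumor, and let the stereological approach score only a few randomly chosen fields with perfect region selection. All numbers below (field counts, error rate, positivity levels) are invented solely to illustrate the comparison and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
n_fields, n_trials = 400, 2000
tumor_frac, fp_rate = 0.5, 0.10            # half the fields are tumor; 10% of stroma fields mislabelled
n_fov = 10                                 # fields sampled in the stereological approach

is_tumor = rng.random(n_fields) < tumor_frac
tumor_pos  = np.clip(rng.normal(0.60, 0.15, n_fields), 0, 1)   # per-field positivity if tumor
stroma_pos = np.clip(rng.normal(0.05, 0.03, n_fields), 0, 1)   # per-field positivity if stroma
true_pos = np.where(is_tumor, tumor_pos, stroma_pos)
truth = true_pos[is_tumor].mean()                              # target: positivity over tumor only

wsi_err, fov_err = [], []
for _ in range(n_trials):
    # Whole-slide pattern recognition: all tumor fields plus some mislabelled stroma
    mislabelled = ~is_tumor & (rng.random(n_fields) < fp_rate)
    wsi_err.append(true_pos[is_tumor | mislabelled].mean() - truth)

    # Stereological sampling: few fields, all genuinely tumor
    sample = rng.choice(np.flatnonzero(is_tumor), size=n_fov, replace=False)
    fov_err.append(true_pos[sample].mean() - truth)

print(f"whole-slide: bias {np.mean(wsi_err):+.3f}, sd {np.std(wsi_err):.3f}")
print(f"sampled FOV: bias {np.mean(fov_err):+.3f}, sd {np.std(fov_err):.3f}")
```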

  1. Putative extremely high rate of proteome innovation in lancelets might be explained by high rate of gene prediction errors.

    Science.gov (United States)

    Bányai, László; Patthy, László

    2016-08-01

    A recent analysis of the genomes of Chinese and Florida lancelets has concluded that the rate of creation of novel protein domain combinations is orders of magnitude greater in lancelets than in other metazoa and it was suggested that continuous activity of transposable elements in lancelets is responsible for this increased rate of protein innovation. Since morphologically Chinese and Florida lancelets are highly conserved, this finding would contradict the observation that high rates of protein innovation are usually associated with major evolutionary innovations. Here we show that the conclusion that the rate of proteome innovation is exceptionally high in lancelets may be unjustified: the differences observed in domain architectures of orthologous proteins of different amphioxus species probably reflect high rates of gene prediction errors rather than true innovation.

  2. Parental Cognitive Errors Mediate Parental Psychopathology and Ratings of Child Inattention.

    Science.gov (United States)

    Haack, Lauren M; Jiang, Yuan; Delucchi, Kevin; Kaiser, Nina; McBurnett, Keith; Hinshaw, Stephen; Pfiffner, Linda

    2017-09-01

    We investigate the Depression-Distortion Hypothesis in a sample of 199 school-aged children with ADHD-Predominantly Inattentive presentation (ADHD-I) by examining relations and cross-sectional mediational pathways between parental characteristics (i.e., levels of parental depressive and ADHD symptoms) and parental ratings of child problem behavior (inattention, sluggish cognitive tempo, and functional impairment) via parental cognitive errors. Results demonstrated a positive association between parental factors and parental ratings of inattention, as well as a mediational pathway between parental depressive and ADHD symptoms and parental ratings of inattention via parental cognitive errors. Specifically, higher levels of parental depressive and ADHD symptoms predicted higher levels of cognitive errors, which in turn predicted higher parental ratings of inattention. Findings provide evidence for core tenets of the Depression-Distortion Hypothesis, which state that parents with high rates of psychopathology hold negative schemas for their child's behavior and subsequently, report their child's behavior as more severe. © 2016 Family Process Institute.

  3. Pupillary response predicts multiple object tracking load, error rate, and conscientiousness, but not inattentional blindness.

    Science.gov (United States)

    Wright, Timothy J; Boot, Walter R; Morgan, Chelsea S

    2013-09-01

    Research on inattentional blindness (IB) has uncovered few individual difference measures that predict failures to detect an unexpected event. Notably, no clear relationship exists between primary task performance and IB. This is perplexing as better task performance is typically associated with increased effort and should result in fewer spare resources to process the unexpected event. We utilized a psychophysiological measure of effort (pupillary response) to explore whether differences in effort devoted to the primary task (multiple object tracking) are related to IB. Pupillary response was sensitive to tracking load and differences in primary task error rates. Furthermore, pupillary response was a better predictor of conscientiousness than primary task errors; errors were uncorrelated with conscientiousness. Despite being sensitive to task load, individual differences in performance and conscientiousness, pupillary response did not distinguish between those who noticed the unexpected event and those who did not. Results provide converging evidence that effort and primary task engagement may be unrelated to IB.

  4. An Approach to Human Error Hazard Detection of Unexpected Situations in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sangjun; Oh, Yeonju; Shin, Youmin; Lee, Yong-Hee [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

The Fukushima accident is a typical complex event, including extreme situations induced by the succeeding earthquake, tsunami, explosion, and human errors. From a human engineering point of view, it is judged to involve incomplete causes such as deficiencies in system build-up, procedures and response manuals, education and training, team capability, and the discharge of operators. In particular, the guidelines of currently operating NPPs do not sufficiently include countermeasures to human errors in extreme situations. Therefore, this paper describes a trial to detect the hazards of human errors in extreme situations and to define countermeasures that can properly respond to those hazards when individuals, teams, organizations, and working entities encounter an extreme situation in NPPs. We propose an approach to analyzing and extracting human error hazards in order to suggest additional countermeasures to human errors in unexpected situations. These might be utilized to develop contingency guidelines, especially for reducing human error accidents in NPPs. However, the trial application in this study is currently limited, since it is not easy to find accident cases described in enough detail to enumerate the proposed steps. Therefore, we will try to analyze as many cases as possible and consider other environmental factors and human error conditions.

  5. Accurate and fast methods to estimate the population mutation rate from error prone sequences

    Directory of Open Access Journals (Sweden)

    Miyamoto Michael M

    2009-08-01

Full Text Available Abstract Background The population mutation rate (θ) remains one of the most fundamental parameters in genetics, ecology, and evolutionary biology. However, its accurate estimation can be seriously compromised when working with error prone data such as expressed sequence tags, low coverage draft sequences, and other such unfinished products. This study is premised on the simple idea that a random sequence error due to a chance accident during data collection or recording will be distributed within a population dataset as a singleton (i.e., as a polymorphic site where one sampled sequence exhibits a unique base relative to the common nucleotide of the others). Thus, one can avoid these random errors by ignoring the singletons within a dataset. Results This strategy is implemented under an infinite sites model that focuses on only the internal branches of the sample genealogy where a shared polymorphism can arise (i.e., a variable site where each alternative base is represented by at least two sequences). This approach is first used to derive independently the same new Watterson and Tajima estimators of θ, as recently reported by Achaz [1] for error prone sequences. It is then used to modify the recent, full, maximum-likelihood model of Knudsen and Miyamoto [2], which incorporates various factors for experimental error and design with those for coalescence and mutation. These new methods are all accurate and fast according to evolutionary simulations and analyses of a real complex population dataset for the California sea hare. Conclusion In light of these results, we recommend the use of these three new methods for the determination of θ from error prone sequences. In particular, we advocate the new maximum likelihood model as a starting point for the further development of more complex coalescent/mutation models that also account for experimental error and design.
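
    A minimal numerical illustration of the singleton-exclusion idea: under the standard infinite-sites Watterson estimator, E[S] = theta * a_n with a_n = sum_{i=1}^{n-1} 1/i, and since singletons contribute the i = 1 term, dropping them suggests theta ~ S_shared / (a_n - 1). This is a restatement of the strategy described in the abstract applied to a toy alignment; the full coalescent/likelihood machinery of the paper is not reproduced, and the estimator form here is an assumption based on that reasoning.

```python
def watterson_theta(seqs, drop_singletons=False):
    """Watterson-type estimate of theta from aligned sequences (infinite-sites idealisation).

    With drop_singletons=True, sites whose rarer base occurs exactly once are ignored
    and the denominator loses the singleton term (a_n - 1), reflecting the idea of
    discarding error-prone singleton polymorphisms.
    """
    n = len(seqs)
    a_n = sum(1.0 / i for i in range(1, n))
    s_all = s_shared = 0
    for column in zip(*seqs):
        counts = {b: column.count(b) for b in set(column)}
        if len(counts) < 2:
            continue                       # monomorphic site
        s_all += 1
        if min(counts.values()) > 1:
            s_shared += 1                  # shared polymorphism, no singleton base
    return s_shared / (a_n - 1.0) if drop_singletons else s_all / a_n

seqs = ["ACGTACGTAC",
        "ACGAACGTAC",
        "ACGTACGTTC",
        "ACGAACGTAC",
        "ACTTACGTAC"]                      # toy alignment; two sites carry singleton bases
print("theta, all sites     :", round(watterson_theta(seqs), 3))
print("theta, no singletons :", round(watterson_theta(seqs, drop_singletons=True), 3))
```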

  6. The treatment of commission errors in first generation human reliability analysis methods

    Energy Technology Data Exchange (ETDEWEB)

    Alvarengga, Marco Antonio Bayout; Fonseca, Renato Alves da, E-mail: bayout@cnen.gov.b, E-mail: rfonseca@cnen.gov.b [Comissao Nacional de Energia Nuclear (CNEN) Rio de Janeiro, RJ (Brazil); Melo, Paulo Fernando Frutuoso e, E-mail: frutuoso@nuclear.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear

    2011-07-01

Human errors in human reliability analysis can be classified generically as errors of omission and errors of commission. Omission errors are related to the omission of a human action that should have been performed but does not occur. Errors of commission are those related to human actions that should not be performed but which in fact are performed. Both involve specific types of cognitive error mechanisms; however, errors of commission are more difficult to model because they are characterized by non-anticipated actions that are performed instead of others that are omitted (omission errors) or that enter an operational task without being part of its normal sequence. The identification of actions that are not supposed to occur depends on the operational context, which will influence or facilitate certain unsafe actions of the operator depending on the operational performance of its parameters and variables. The survey of operational contexts and associated unsafe actions is a characteristic of second-generation models, unlike first-generation models. This paper discusses how first-generation models can treat errors of commission in the steps of detection, diagnosis, decision-making and implementation in human information processing, particularly with the use of THERP tables for error quantification. (author)

  7. Error-related EEG patterns during tactile human-machine interaction

    NARCIS (Netherlands)

    Lehne, M.; Ihme, K.; Brouwer, A.M.; Erp, J.B.F. van; Zander, T.O.

    2009-01-01

    Recently, the use of brain-computer interfaces (BCIs) has been extended from active control to passive detection of cognitive user states. These passive BCI systems can be especially useful for automatic error detection in human-machine systems by recording EEG potentials related to human error processing.

  8. Quality of IT service delivery — Analysis and framework for human error prevention

    KAUST Repository

    Shwartz, L.

    2010-12-01

    In this paper, we address the problem of reducing the occurrence of Human Errors that cause service interruptions in IT Service Support and Delivery operations. Analysis of a large volume of service interruption records revealed that more than 21% of interruptions were caused by human error. We focus on Change Management, the process with the largest risk of human error, and identify the main instances of human errors as the 4 Wrongs: request, time, configuration item, and command. Analysis of change records revealed that human error prevention by partial automation is highly relevant. We propose the HEP Framework, a framework for execution of IT Service Delivery operations that reduces human error by addressing the 4 Wrongs using content integration, contextualization of operation patterns, partial automation of command execution, and controlled access to resources.

  9. Study on Cell Error Rate of a Satellite ATM System Based on CDMA

    Institute of Scientific and Technical Information of China (English)

    赵彤宇; 张乃通

    2003-01-01

    In this paper, the cell error rate (CER) of a CDMA-based satellite ATM system is analyzed. Two fading models, i.e. the partial fading model and the total fading model, are presented according to multi-path propagation fading and shadow effect. Based on the total shadow model, the relation of CER vs. the number of subscribers at various elevations under 2D-RAKE receiving and non-diversity receiving is obtained. The impact of pseudo-noise (PN) code length on cell error rate is also considered. It is also found that maximum-likelihood combining of multi-path signals does not improve system performance when multiple access interference (MAI) is small; on the contrary, performance may even be worse.

  10. Novel relations between the ergodic capacity and the average bit error rate

    KAUST Repository

    Yilmaz, Ferkan

    2011-11-01

    Ergodic capacity and average bit error rate have been widely used to compare the performance of different wireless communication systems. Accordingly, much recent research has focused on designing and implementing wireless technologies on the basis of these two performance indicators. However, and to the best of our knowledge, the direct links between these two performance indicators have not been explicitly proposed in the literature so far. In this paper, we propose novel relations between the ergodic capacity and the average bit error rate of an overall communication system using binary modulation schemes for signaling with a limited bandwidth and operating over generalized fading channels. More specifically, we show that these two performance measures can be represented in terms of each other, without the need to know the exact end-to-end statistical characterization of the communication channel. We validate the correctness and accuracy of our newly proposed relations and illustrate their usefulness by considering some classical examples. © 2011 IEEE.
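    As a rough numerical companion to these two performance indicators, the sketch below estimates both the ergodic capacity and the average bit error rate of BPSK by Monte Carlo averaging over an assumed Rayleigh-faded channel with ideal coherent detection. It only illustrates the quantities involved; it does not reproduce the closed-form relations derived in the paper, and the function name and parameter values are assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def ergodic_capacity_and_avg_ber(avg_snr_db, n=1_000_000):
    """Monte Carlo estimates of ergodic capacity (bit/s/Hz) and average BER
    of coherent BPSK over Rayleigh fading with average SNR avg_snr_db."""
    gamma_bar = 10 ** (avg_snr_db / 10.0)
    # instantaneous SNR of a Rayleigh channel is exponentially distributed
    gamma = rng.exponential(gamma_bar, size=n)
    capacity = np.mean(np.log2(1.0 + gamma))
    # conditional BPSK error probability Q(sqrt(2*gamma)), averaged over fading
    avg_ber = np.mean(norm.sf(np.sqrt(2.0 * gamma)))
    return capacity, avg_ber

for snr_db in (0, 5, 10, 15, 20):
    c, p = ergodic_capacity_and_avg_ber(snr_db)
    print(f"{snr_db:2d} dB: capacity = {c:5.2f} bit/s/Hz, average BER = {p:.3e}")
```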

  11. The examination of commercial printing defects to assess common origin, batch variation, and error rate.

    Science.gov (United States)

    LaPorte, Gerald M; Stephens, Joseph C; Beuchel, Amanda K

    2010-01-01

    The examination of printing defects, or imperfections, found on printed or copied documents has been recognized as a generally accepted approach for linking questioned documents to a common source. This research paper will highlight the results from two mutually exclusive studies. The first involved the examination and characterization of printing defects found in a controlled production run of 500,000 envelopes bearing text and images. It was concluded that printing defects are random occurrences and that morphological differences can be used to identify variations within the same production batch. The second part incorporated a blind study to assess the error rate of associating randomly selected envelopes from different retail locations to a known source. The examination was based on the comparison of printing defects in the security patterns found in some envelopes. The results demonstrated that it is possible to associate envelopes to a common origin with a 0% error rate.

  12. Confidence Intervals Verification for Simulated Error Rate Performance of Wireless Communication System

    KAUST Repository

    Smadi, Mahmoud A.

    2012-12-06

    In this paper, we derive an efficient simulation method to evaluate the error rate of a wireless communication system. A coherent binary phase-shift keying system is considered with imperfect channel phase recovery. The results presented demonstrate the system performance under very realistic Nakagami-m fading and additive white Gaussian noise channel conditions. The accuracy of the obtained results is verified by running the simulation with a 95% confidence interval. We see that as the number of simulation runs N increases, the simulated error rate approaches the actual one and the confidence interval narrows. Hence our results are expected to be of significant practical use for such scenarios. © 2012 Springer Science+Business Media New York.
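    The sketch below illustrates the kind of Monte Carlo error-rate estimate with a 95% confidence interval described above, for the simpler case of coherent BPSK over AWGN with perfect phase recovery; the Nakagami-m fading and imperfect phase recovery treated in the paper are omitted, and the function name and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_bpsk_ber(ebn0_db, n_bits=200_000):
    """Monte Carlo BER of coherent BPSK over AWGN, with a 95% confidence
    interval from the normal approximation to the binomial error count."""
    ebn0 = 10 ** (ebn0_db / 10.0)
    bits = rng.integers(0, 2, n_bits)
    symbols = 2.0 * bits - 1.0                        # map {0,1} -> {-1,+1}
    noise = rng.normal(0.0, np.sqrt(1.0 / (2.0 * ebn0)), n_bits)
    decisions = (symbols + noise) > 0
    errors = np.count_nonzero(decisions != bits.astype(bool))
    p_hat = errors / n_bits
    half_width = 1.96 * np.sqrt(p_hat * (1.0 - p_hat) / n_bits)
    return p_hat, (p_hat - half_width, p_hat + half_width)

ber, ci = simulate_bpsk_ber(6.0)
print(f"BER ~ {ber:.2e}, 95% CI = ({ci[0]:.2e}, {ci[1]:.2e})")
```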

  13. Novel Relations between the Ergodic Capacity and the Average Bit Error Rate

    CERN Document Server

    Yilmaz, Ferkan

    2012-01-01

    Ergodic capacity and average bit error rate have been widely used to compare the performance of different wireless communication systems. Accordingly, much recent research has focused on designing and implementing wireless technologies on the basis of these two performance indicators. However, and to the best of our knowledge, the direct links between these two performance indicators have not been explicitly proposed in the literature so far. In this paper, we propose novel relations between the ergodic capacity and the average bit error rate of an overall communication system using binary modulation schemes for signaling with a limited bandwidth and operating over generalized fading channels. More specifically, we show that these two performance measures can be represented in terms of each other, without the need to know the exact end-to-end statistical characterization of the communication channel. We validate the correctness and accuracy of our newly proposed relations and illustrate their...

  14. Optical codeword demodulation with error rates below standard quantum limit using a conditional nulling receiver

    CERN Document Server

    Chen, Jian; Dutton, Zachary; Lazarus, Richard; Guha, Saikat

    2011-01-01

    The quantum states of two laser pulses---coherent states---are never mutually orthogonal, making perfect discrimination impossible. Even so, coherent states can achieve the ultimate quantum limit for capacity of a classical channel, the Holevo capacity. Attaining this requires the receiver to make joint-detection measurements on long codeword blocks, optical implementations of which remain unknown. We report the first experimental demonstration of a joint-detection receiver, demodulating quaternary pulse-position-modulation (PPM) codewords at a word error rate of up to 40% (2.2 dB) below that attained with direct-detection, the largest error-rate improvement over the standard quantum limit reported to date. This is accomplished with a conditional nulling receiver, which uses optimized-amplitude coherent pulse nulling, single photon detection and quantum feedforward. We further show how this translates into coding complexity improvements for practical PPM systems, such as in deep-space communication. We antici...

  15. Faces in places: humans and machines make similar face detection errors.

    Directory of Open Access Journals (Sweden)

    Bernard Marius 't Hart

    Full Text Available The human visual system seems to be particularly efficient at detecting faces. This efficiency sometimes comes at the cost of wrongfully seeing faces in arbitrary patterns, including famous examples such as a rock configuration on Mars or a toast's roast patterns. In machine vision, face detection has made considerable progress and has become a standard feature of many digital cameras. The arguably most widespread algorithm for such applications (the "Viola-Jones" algorithm) achieves high detection rates at high computational efficiency. To what extent do the patterns that the algorithm mistakenly classifies as faces also fool humans? We selected three kinds of stimuli from real-life, first-person perspective movies based on the algorithm's output: correct detections ("real faces"), false positives ("illusory faces") and correctly rejected locations ("non faces"). Observers were shown pairs of these for 20 ms and had to direct their gaze to the location of the face. We found that illusory faces were mistaken for faces more frequently than non faces. In addition, rotation of the real face yielded more errors, while rotation of the illusory face yielded fewer errors. Using colored stimuli increases overall performance, but does not change the pattern of results. When replacing the eye movement by a manual response, however, the preference for illusory faces over non faces disappeared. Taken together, our data show that humans make similar face-detection errors as the Viola-Jones algorithm when directing their gaze to briefly presented stimuli. In particular, the relative spatial arrangement of oriented filters seems of relevance. This suggests that efficient face detection in humans is likely to be pre-attentive and based on rather simple features as those encoded in the early visual system.

  16. A minimum-error, energy-constrained neural code is an instantaneous-rate code.

    Science.gov (United States)

    Johnson, Erik C; Jones, Douglas L; Ratnam, Rama

    2016-04-01

    Sensory neurons code information about stimuli in their sequence of action potentials (spikes). Intuitively, the spikes should represent stimuli with high fidelity. However, generating and propagating spikes is a metabolically expensive process. It is therefore likely that neural codes have been selected to balance energy expenditure against encoding error. Our recently proposed optimal, energy-constrained neural coder (Jones et al. Frontiers in Computational Neuroscience, 9, 61 2015) postulates that neurons time spikes to minimize the trade-off between stimulus reconstruction error and expended energy by adjusting the spike threshold using a simple dynamic threshold. Here, we show that this proposed coding scheme is related to existing coding schemes, such as rate and temporal codes. We derive an instantaneous rate coder and show that the spike-rate depends on the signal and its derivative. In the limit of high spike rates the spike train maximizes fidelity given an energy constraint (average spike-rate), and the predicted interspike intervals are identical to those generated by our existing optimal coding neuron. The instantaneous rate coder is shown to closely match the spike-rates recorded from P-type primary afferents in weakly electric fish. In particular, the coder is a predictor of the peristimulus time histogram (PSTH). When tested against in vitro cortical pyramidal neuron recordings, the instantaneous spike-rate approximates DC step inputs, matching both the average spike-rate and the time-to-first-spike (a simple temporal code). Overall, the instantaneous rate coder relates optimal, energy-constrained encoding to the concepts of rate-coding and temporal-coding, suggesting a possible unifying principle of neural encoding of sensory signals.

  17. Modeling Human Error Mechanism for Soft Control in Advanced Control Rooms (ACRs)

    Energy Technology Data Exchange (ETDEWEB)

    Aljneibi, Hanan Salah Ali [Khalifa Univ., Abu Dhabi (United Arab Emirates); Ha, Jun Su; Kang, Seongkeun; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2015-10-15

    To achieve the switch from conventional analog-based design to digital design in ACRs, a large number of manual operating controls and switches have to be replaced by a few common multi-function devices, called the soft control system. The soft controls in APR-1400 ACRs are classified into safety-grade and non-safety-grade soft controls, each designed using different and independent input devices. Operating with soft controls requires operators to perform new tasks that were not necessary with conventional controls, such as navigating computerized displays to monitor plant information and control devices. These computerized displays and soft controls may make operations more convenient, but they might also cause new types of human error. In this study, the human error mechanism during soft control is studied and modeled for use in analyzing and enhancing human performance (or reducing human errors) during NPP operation. The developed model is based on the assumptions that a human operator has a certain amount of cognitive-resource capacity, and that if the resources required by operating tasks are greater than the resources invested by the operator, human error (or poor human performance) is likely to occur (especially 'slips'); good HMI (human-machine interface) design decreases the required resources; the operator's skillfulness decreases the required resources; and high vigilance increases the invested resources. The developed model is expected to contribute to many applications for improving human performance (or reducing human errors), HMI designs, and operators' training programs in ACRs.

  18. Reducing error rates in straintronic multiferroic nanomagnetic logic by pulse shaping.

    Science.gov (United States)

    Munira, Kamaram; Xie, Yunkun; Nadri, Souheil; Forgues, Mark B; Fashami, Mohammad Salehi; Atulasimha, Jayasimha; Bandyopadhyay, Supriyo; Ghosh, Avik W

    2015-06-19

    Dipole-coupled nanomagnetic logic (NML), where nanomagnets (NMs) with bistable magnetization states act as binary switches and information is transferred between them via dipole-coupling and Bennett clocking, is a potential replacement for conventional transistor logic since magnets dissipate less energy than transistors when they switch in a logic circuit. Magnets are also 'non-volatile' and hence can store the results of a computation after the computation is over, thereby doubling as both logic and memory, a feat that transistors cannot achieve. However, dipole-coupled NML is much more error-prone than transistor logic at room temperature because thermal noise can easily disrupt magnetization dynamics. Here, we study a particularly energy-efficient version of dipole-coupled NML known as straintronic multiferroic logic (SML) where magnets are clocked/switched with electrically generated mechanical strain. By appropriately 'shaping' the voltage pulse that generates strain, we show that the error rate in SML can be reduced to tolerable limits. We describe the error probabilities associated with various stress pulse shapes and discuss the trade-off between error rate and switching speed in SML. The lowest error probability is obtained when a 'shaped' high voltage pulse is applied to strain the output NM followed by a low voltage pulse. The high voltage pulse quickly rotates the output magnet's magnetization by 90° and aligns it roughly along the minor (or hard) axis of the NM. Next, the low voltage pulse produces the critical strain to overcome the shape anisotropy energy barrier in the NM and produce a monostable potential energy profile in the presence of dipole coupling from the neighboring NM. The magnetization of the output NM then migrates to the global energy minimum in this monostable profile and completes a 180° rotation (magnetization flip) with high likelihood.

  19. A forward error correction technique using a high-speed, high-rate single chip codec

    Science.gov (United States)

    Boyd, R. W.; Hartman, W. F.; Jones, Robert E.

    1989-01-01

    The authors describe an error-correction coding approach that allows operation in either burst or continuous modes at data rates of multiple hundreds of megabits per second. Bandspreading is low since the code rate is 7/8 or greater, which is consistent with high-rate link operation. The encoder, along with a hard-decision decoder, fits on a single application-specific integrated circuit (ASIC) chip. Soft-decision decoding is possible utilizing applique hardware in conjunction with the hard-decision decoder. Expected coding gain is a function of the application and is approximately 2.5 dB for hard-decision decoding at a 10^-5 bit-error rate with phase-shift-keying modulation and additive Gaussian white noise interference. The principal use envisioned for this technique is to achieve a modest amount of coding gain on high-data-rate, bandwidth-constrained channels. Data rates of up to 300 Mb/s can be accommodated by the codec chip. The major objective is burst-mode communications, where code words are composed of 32 n data bits followed by 32 overhead bits.

  20. Comparing Response Times and Error Rates in a Simultaneous Masking Paradigm

    Directory of Open Access Journals (Sweden)

    F Hermens

    2014-08-01

    Full Text Available In simultaneous masking, performance on a foveally presented target is impaired by one or more flanking elements. Previous studies have demonstrated strong effects of the grouping of the target and the flankers on the strength of masking (e.g., Malania, Herzog & Westheimer, 2007). These studies have predominantly examined performance by measuring offset discrimination thresholds as a measure of performance, and it is therefore unclear whether other measures of performance provide similar outcomes. A recent study, which examined the role of grouping on error rates and response times in a speeded vernier offset discrimination task similar to that used by Malania et al. (2007), suggested a possible dissociation between the two measures, with error rates mimicking threshold performance, but response times showing differential results (Panis & Hermens, 2014). We here report the outcomes of three experiments examining this possible dissociation, and demonstrate an overall similar pattern of results for error rates and response times across a broad range of mask layouts. Moreover, the pattern of results in our experiments strongly correlates with threshold performance reported earlier (Malania et al., 2007). Our results suggest that outcomes in a simultaneous masking paradigm do not critically depend on the outcome measure used, and therefore provide evidence for a common underlying mechanism.

  1. Human error and the problem of causality in analysis of accidents

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1990-01-01

    Present technology is characterized by complexity, rapid change and growing size of technical systems. This has caused increasing concern with the human involvement in system safety. Analyses of the major accidents during recent decades have concluded that human errors on part of operators ... and for termination of the search for 'causes'. In addition, the concept of human error is analysed and its intimate relation with human adaptation and learning is discussed. It is concluded that identification of errors as a separate class of behaviour is becoming increasingly difficult in modern work environments ...

  2. Bit error rate testing of fiber optic data links for MMIC-based phased array antennas

    Science.gov (United States)

    Shalkhauser, K. A.; Kunath, R. R.; Daryoush, A. S.

    1990-06-01

    The measured bit-error-rate (BER) performance of a fiber optic data link to be used in satellite communications systems is presented and discussed. In the testing, the link was measured for its ability to carry high burst rate, serial-minimum shift keyed (SMSK) digital data similar to those used in actual space communications systems. The fiber optic data link, as part of a dual-segment injection-locked RF fiber optic link system, offers a means to distribute these signals to the many radiating elements of a phased array antenna. Test procedures, experimental arrangements, and test results are presented.

  3. Error resilient H.264/AVC Video over Satellite for low Packet Loss Rates

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren; Andersen, Jakob Dahl

    2007-01-01

    The performance of video over satellite is simulated. The error resilience tools of intra macroblock refresh and slicing are optimized for live broadcast video over satellite. The improved performance using feedback, via a cross-layer approach, over the satellite link is also simulated. The ... Inmarsat BGAN system at 256 kbit/s is used as test case. This system operates at low loss rates, guaranteeing a packet loss rate of not more than 10^-3. For high-end applications such as 'reporter-in-the-field' live broadcast, it is crucial to obtain high quality without increasing delay.

  4. Automation of Commanding at NASA: Reducing Human Error in Space Flight

    Science.gov (United States)

    Dorn, Sarah J.

    2010-01-01

    Automation has been implemented in many different industries to improve efficiency and reduce human error. Reducing or eliminating the human interaction in tasks has been proven to increase productivity in manufacturing and lessen the risk of mistakes by humans in the airline industry. Human space flight requires the flight controllers to monitor multiple systems and react quickly when failures occur so NASA is interested in implementing techniques that can assist in these tasks. Using automation to control some of these responsibilities could reduce the number of errors the flight controllers encounter due to standard human error characteristics. This paper will investigate the possibility of reducing human error in the critical area of manned space flight at NASA.

  5. Air pollution and human fertility rates.

    Science.gov (United States)

    Nieuwenhuijsen, Mark J; Basagaña, Xavier; Dadvand, Payam; Martinez, David; Cirach, Marta; Beelen, Rob; Jacquemin, Bénédicte

    2014-09-01

    Some reports have suggested effects of air pollution on semen quality and success rates of in vitro fertilization (IVF) in humans and lower fertility rates in mice. However, no studies have evaluated the impact of air pollution on human fertility rates. We assessed the association between traffic related air pollution and fertility rates in humans in Barcelona, Spain (2011-2012). We hypothesized that higher air pollution levels would be associated with lower fertility rates. We calculated the general fertility rate which is the number of live births per 1000 women between the ages of 15 and 44 years per census tract. We used land use regression (LUR) modeling to estimate the air pollution concentrations (particulate matter, NO2/NOx) per census tract. We used Besag-York-Mollié models to quantify the relationship between air pollution and fertility rates with adjustment for a number of potential confounders such as maternal age and area level socio-economic status. We found a statistically significant reduction of fertility rates with an increase in traffic related air pollution levels, particularly for the coarse fraction of particulate matter (IRR=0.87 95% CI 0.82, 0.94 per IQR). This is the first study in humans to show an association between reduced fertility rates and higher traffic related air pollution levels. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Error baseline rates of five sample preparation methods used to characterize RNA virus populations

    Science.gov (United States)

    Kugelman, Jeffrey R.; Wiley, Michael R.; Nagle, Elyse R.; Reyes, Daniel; Pfeffer, Brad P.; Kuhn, Jens H.; Sanchez-Lockhart, Mariano; Palacios, Gustavo F.

    2017-01-01

    Individual RNA viruses typically occur as populations of genomes that differ slightly from each other due to mutations introduced by the error-prone viral polymerase. Understanding the variability of RNA virus genome populations is critical for understanding virus evolution because individual mutant genomes may gain evolutionary selective advantages and give rise to dominant subpopulations, possibly even leading to the emergence of viruses resistant to medical countermeasures. Reverse transcription of virus genome populations followed by next-generation sequencing is the only available method to characterize variation for RNA viruses. However, both steps may lead to the introduction of artificial mutations, thereby skewing the data. To better understand how such errors are introduced during sample preparation, we determined and compared error baseline rates of five different sample preparation methods by analyzing in vitro transcribed Ebola virus RNA from an artificial plasmid-based system. These methods included: shotgun sequencing from plasmid DNA or in vitro transcribed RNA as a basic “no amplification” method, amplicon sequencing from the plasmid DNA or in vitro transcribed RNA as a “targeted” amplification method, sequence-independent single-primer amplification (SISPA) as a “random” amplification method, rolling circle reverse transcription sequencing (CirSeq) as an advanced “no amplification” method, and Illumina TruSeq RNA Access as a “targeted” enrichment method. The measured error frequencies indicate that RNA Access offers the best tradeoff between sensitivity and sample preparation error (1.4 × 10^-5) of all compared methods. PMID:28182717

  7. Error-rate performance analysis of incremental decode-and-forward opportunistic relaying

    KAUST Repository

    Tourki, Kamel

    2011-06-01

    In this paper, we investigate an incremental opportunistic relaying scheme where the selected relay chooses to cooperate only if the source-destination channel is of an unacceptable quality. In our study, we consider regenerative relaying in which the decision to cooperate is based on a signal-to-noise ratio (SNR) threshold and takes into account the effect of the possible erroneously detected and transmitted data at the best relay. We derive a closed-form expression for the end-to-end bit-error rate (BER) of binary phase-shift keying (BPSK) modulation based on the exact probability density function (PDF) of each hop. Furthermore, we evaluate the asymptotic error performance and the diversity order is deduced. We show that performance simulation results coincide with our analytical results. © 2011 IEEE.

  8. On the symmetric α-stable distribution with application to symbol error rate calculations

    KAUST Repository

    Soury, Hamza

    2016-12-24

    The probability density function (PDF) of the symmetric α-stable distribution is investigated using the inverse Fourier transform of its characteristic function. For general values of the stable parameter α, it is shown that the PDF and the cumulative distribution function of the symmetric stable distribution can be expressed in closed form in terms of the Fox H function. As an application, the probability of error of single input single output communication systems using different modulation schemes with an α-stable perturbation is studied. In more detail, a generic formula is derived for a generalized fading distribution, such as the extended generalized-k distribution. Later, simpler expressions of these error rates are deduced for some selected special cases and compact approximations are derived using asymptotic expansions.
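    For intuition, the sketch below estimates by simulation the symbol error rate of BPSK under additive symmetric α-stable noise with a minimum-distance detector, using SciPy's levy_stable distribution. This is a toy check only: the closed-form Fox H-function expressions and the averaging over extended generalized-K fading are the paper's contributions and are not attempted here, and the function name, scale value, and omission of fading are assumptions.

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(2)

def bpsk_ser_alpha_stable(alpha, scale, n_sym=100_000):
    """Monte Carlo symbol error rate of unit-amplitude BPSK corrupted by
    additive symmetric alpha-stable noise (beta = 0), using the
    minimum-distance (sign) detector."""
    symbols = rng.choice([-1.0, 1.0], size=n_sym)
    noise = levy_stable.rvs(alpha, 0.0, loc=0.0, scale=scale,
                            size=n_sym, random_state=2)
    decisions = np.where(symbols + noise >= 0.0, 1.0, -1.0)
    return np.mean(decisions != symbols)

for a in (2.0, 1.5, 1.0):   # alpha = 2 is the Gaussian special case
    print(f"alpha = {a}: SER ~ {bpsk_ser_alpha_stable(a, scale=0.3):.4f}")
```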

  9. Asymptotic correctability of Bell-diagonal quantum states and maximum tolerable bit error rates

    CERN Document Server

    Ranade, K S; Ranade, Kedar S.; Alber, Gernot

    2005-01-01

    The general conditions are discussed which quantum state purification protocols have to fulfill in order to be capable of purifying Bell-diagonal qubit-pair states, provided they consist of steps that map Bell-diagonal states to Bell-diagonal states and they finally apply a suitably chosen Calderbank-Shor-Steane code to the outcome of such steps. As a main result a necessary and a sufficient condition on asymptotic correctability are presented, which relate this problem to the magnitude of a characteristic exponent governing the relation between bit and phase errors under the purification steps. These conditions allow a straightforward determination of maximum tolerable bit error rates of quantum key distribution protocols whose security analysis can be reduced to the purification of Bell-diagonal states.

  10. Stability Comparison of Recordable Optical Discs—A Study of Error Rates in Harsh Conditions

    Science.gov (United States)

    Slattery, Oliver; Lu, Richang; Zheng, Jian; Byers, Fred; Tang, Xiao

    2004-01-01

    The reliability and longevity of any storage medium is a key issue for archivists and preservationists as well as for the creators of important information. This is particularly true in the case of digital media such as DVD and CD where a sufficient number of errors may render the disc unreadable. This paper describes an initial stability study of commercially available recordable DVD and CD media using accelerated aging tests under conditions of increased temperature and humidity. The effect of prolonged exposure to direct light is also investigated and shown to have an effect on the error rates of the media. Initial results show that high quality optical media have very stable characteristics and may be suitable for long-term storage applications. However, results also indicate that significant differences exist in the stability of recordable optical media from different manufacturers. PMID:27366630

  11. Learning High-Dimensional Markov Forest Distributions: Analysis of Error Rates

    CERN Document Server

    Tan, Vincent Y F; Willsky, Alan S

    2010-01-01

    The problem of learning forest-structured discrete graphical models from i.i.d. samples is considered. An algorithm based on pruning of the Chow-Liu tree through adaptive thresholding is proposed. It is shown that this algorithm is both structurally consistent and risk consistent and the error probability of structure learning decays faster than any polynomial in the number of samples under fixed model size. For the high-dimensional scenario where the size of the model d and the number of edges k scale with the number of samples n, sufficient conditions on (n,d,k) are given for the algorithm to satisfy structural and risk consistencies. In addition, the extremal structures for learning are identified; we prove that the independent (resp. tree) model is the hardest (resp. easiest) to learn using the proposed algorithm in terms of error rates for structure learning.
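    A rough sketch of the pruning idea summarized above: estimate pairwise mutual information from the samples, build a maximum-weight spanning tree (the Chow-Liu tree), and then drop edges whose mutual information falls below a threshold to obtain a forest. The adaptive threshold choice and the consistency analysis that are the paper's actual results are not reproduced here; the function names, the fixed threshold, and the toy data are assumptions.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def empirical_mutual_info(x, y, k):
    """Plug-in mutual information (in nats) between two discrete columns
    taking values in {0, ..., k-1}."""
    joint = np.zeros((k, k))
    for a, b in zip(x, y):
        joint[a, b] += 1.0
    joint /= len(x)
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / np.outer(px, py)[nz])))

def chow_liu_forest(data, k, threshold):
    """Chow-Liu tree on pairwise mutual information, pruned to a forest by
    removing edges whose mutual information is below `threshold`."""
    d = data.shape[1]
    mi = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            mi[i, j] = empirical_mutual_info(data[:, i], data[:, j], k)
    # negate weights so a minimum spanning tree is a maximum-MI spanning tree
    mst = minimum_spanning_tree(-mi).toarray()
    return [(int(i), int(j)) for i, j in zip(*np.nonzero(mst))
            if -mst[i, j] >= threshold]

rng = np.random.default_rng(3)
toy = rng.integers(0, 2, size=(500, 4))
toy[:, 1] = toy[:, 0] ^ (rng.random(500) < 0.1)   # couple columns 0 and 1
print(chow_liu_forest(toy, k=2, threshold=0.05))  # expect roughly [(0, 1)]
```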

  12. THE ASYMPTOTICS OF THE INTEGRATED SQUARE ERROR FOR THE KERNEL HAZARD RATE ESTIMATORS WITH LEFT TRUNCATED AND RIGHT CENSORED DATA

    Institute of Scientific and Technical Information of China (English)

    SUN Liuquan; ZHENG Zhongguo

    1999-01-01

    A central limit theorem for the integrated square error (ISE) of the kernel hazard rate estimators is obtained based on left truncated and right censored data. An asymptotic representation of the mean integrated square error (MISE) for the kernel hazard rate estimators is also presented.

  13. Cognitive tests predict real-world errors: the relationship between drug name confusion rates in laboratory-based memory and perception tests and corresponding error rates in large pharmacy chains

    Science.gov (United States)

    Schroeder, Scott R; Salomon, Meghan M; Galanter, William L; Schiff, Gordon D; Vaida, Allen J; Gaunt, Michael J; Bryson, Michelle L; Rash, Christine; Falck, Suzanne; Lambert, Bruce L

    2017-01-01

    Background Drug name confusion is a common type of medication error and a persistent threat to patient safety. In the USA, roughly one per thousand prescriptions results in the wrong drug being filled, and most of these errors involve drug names that look or sound alike. Prior to approval, drug names undergo a variety of tests to assess their potential for confusability, but none of these preapproval tests has been shown to predict real-world error rates. Objectives We conducted a study to assess the association between error rates in laboratory-based tests of drug name memory and perception and real-world drug name confusion error rates. Methods Eighty participants, comprising doctors, nurses, pharmacists, technicians and lay people, completed a battery of laboratory tests assessing visual perception, auditory perception and short-term memory of look-alike and sound-alike drug name pairs (eg, hydroxyzine/hydralazine). Results Laboratory test error rates (and other metrics) significantly predicted real-world error rates obtained from a large, outpatient pharmacy chain, with the best-fitting model accounting for 37% of the variance in real-world error rates. Cross-validation analyses confirmed these results, showing that the laboratory tests also predicted errors from a second pharmacy chain, with 45% of the variance being explained by the laboratory test data. Conclusions Across two distinct pharmacy chains, there is a strong and significant association between drug name confusion error rates observed in the real world and those observed in laboratory-based tests of memory and perception. Regulators and drug companies seeking a validated preapproval method for identifying confusing drug names ought to consider using these simple tests. By using a standard battery of memory and perception tests, it should be possible to reduce the number of confusing look-alike and sound-alike drug name pairs that reach the market, which will help protect patients from potentially

  14. A Simple Exact Error Rate Analysis for DS-CDMA with Arbitrary Pulse Shape in Flat Nakagami Fading

    Science.gov (United States)

    Rahman, Mohammad Azizur; Sasaki, Shigenobu; Kikuchi, Hisakazu; Harada, Hiroshi; Kato, Shuzo

    A simple exact error rate analysis is presented for random binary direct sequence code division multiple access (DS-CDMA) considering a general pulse shape and flat Nakagami fading channel. First of all, a simple model is developed for the multiple access interference (MAI). Based on this, a simple exact expression of the characteristic function (CF) of MAI is developed in a straightforward manner. Finally, an exact expression of error rate is obtained following the CF method of error rate analysis. The exact error rate so obtained can be much more easily evaluated than the only reliable approximate error rate expression currently available, which is based on the Improved Gaussian Approximation (IGA).

  15. Analysis of measured data of human body based on error correcting frequency

    Science.gov (United States)

    Jin, Aiyan; Peipei, Gao; Shang, Xiaomei

    2014-04-01

    Anthropometry measures all parts of the human body surface, and the measured data form the basis for analysis and study of the human body, for establishing and modifying garment sizes, and for building and operating online clothing stores. In this paper, several groups of measured data are collected, and the data errors are analyzed by examining error frequency and applying the analysis-of-variance method from mathematical statistics. The paper also determines the accuracy of the measured data, identifies body parts that are difficult to measure, studies the causes of data errors, and summarizes key points for minimizing errors. By analyzing the measured data on the basis of error frequency, the paper provides reference material to support the development of the garment industry.

  16. Prediction of human errors by maladaptive changes in event-related brain networks

    NARCIS (Netherlands)

    Eichele, T.; Debener, S.; Calhoun, V.D.; Specht, K.; Engel, A.K.; Hugdahl, K.; Cramon, D.Y. von; Ullsperger, M.

    2008-01-01

    Humans engaged in monotonous tasks are susceptible to occasional errors that may lead to serious consequences, but little is known about brain activity patterns preceding errors. Using functional MRI and applying independent component analysis followed by deconvolution of hemodynamic responses, we

  17. Human errors evaluation for muster in emergency situations applying human error probability index (HEPI, in the oil company warehouse in Hamadan City

    Directory of Open Access Journals (Sweden)

    2012-12-01

    Full Text Available Introduction: An emergency situation is one of the factors that influence human error. The aim of this research was to evaluate human error in an emergency situation of fire and explosion at the oil company warehouse in Hamadan city by applying the human error probability index (HEPI). Material and Method: First, the scenario of an emergency situation of fire and explosion at the oil company warehouse was designed, and a corresponding maneuver was then performed. The scaled muster questionnaire for the maneuver was completed in the next stage. The collected data were analyzed to calculate the probability of success for the 18 actions required in an emergency situation, from the starting point of the muster until the final action of reaching the temporary safe shelter. Result: The results showed that the highest probability of error was related to making the workplace safe (evaluation phase, 32.4% probability) and the lowest probability of error was in detecting the alarm (awareness phase, 1.8% probability). The highest error severity was in the evaluation phase and the lowest was in the awareness and recovery phases. The maximum risk level was related to evaluating exit routes, selecting one route and choosing an alternative exit route, and the minimum risk level was related to the four evaluation phases. Conclusion: To reduce the risk in the exit phases of an emergency situation, the following actions are recommended based on the findings of this study: periodic evaluation of the exit phases and their modification if necessary, and conducting more maneuvers and analyzing their results, along with sufficient feedback to the employees.

  18. Detection of error related neuronal responses recorded by electrocorticography in humans during continuous movements.

    Directory of Open Access Journals (Sweden)

    Tomislav Milekovic

    Full Text Available BACKGROUND: Brain-machine interfaces (BMIs) can translate the neuronal activity underlying a user's movement intention into movements of an artificial effector. In spite of continuous improvements, errors in movement decoding are still a major problem of current BMI systems. If the difference between the decoded and intended movements becomes noticeable, it may lead to an execution error. Outcome errors, where subjects fail to reach a certain movement goal, are also present during online BMI operation. Detecting such errors can be beneficial for BMI operation: (i) errors can be corrected online after being detected and (ii) the adaptive BMI decoding algorithm can be updated to make fewer errors in the future. METHODOLOGY/PRINCIPAL FINDINGS: Here, we show that error events can be detected from human electrocorticography (ECoG) during a continuous task with high precision, given a temporal tolerance of 300-400 milliseconds. We quantified the error detection accuracy and showed that, using only a small subset of 2×2 ECoG electrodes, 82% of detection information for outcome errors and 74% of detection information for execution errors available from all ECoG electrodes could be retained. CONCLUSIONS/SIGNIFICANCE: The error detection method presented here could be used to correct errors made during BMI operation or to adapt a BMI algorithm to make fewer errors in the future. Furthermore, our results indicate that a smaller ECoG implant could be used for error detection. Reducing the size of an ECoG electrode implant used for BMI decoding and error detection could significantly reduce the medical risk of implantation.

  19. Symbol Error Rate of MPSK over EGK Channels Perturbed by a Dominant Additive Laplacian Noise

    KAUST Repository

    Souri, Hamza

    2015-06-01

    The Laplacian noise has received much attention during the recent years since it affects many communication systems. We consider in this paper the probability of error of an M-ary phase shift keying (PSK) constellation operating over a generalized fading channel in presence of a dominant additive Laplacian noise. In this context, the decision regions of the receiver are determined using the maximum likelihood and the minimum distance detectors. Once the decision regions are extracted, the resulting symbol error rate expressions are computed and averaged over an Extended Generalized-K fading distribution. Generic closed form expressions of the conditional and the average probability of error are obtained in terms of the Fox’s H function. Simplifications for some special cases of fading are presented and the resulting formulas end up being often expressed in terms of well known elementary functions. Finally, the mathematical formalism is validated using some selected analytical-based numerical results as well as Monte- Carlo simulation-based results.

  20. Creation and implementation of department-wide structured reports: an analysis of the impact on error rate in radiology reports.

    Science.gov (United States)

    Hawkins, C Matthew; Hall, Seth; Zhang, Bin; Towbin, Alexander J

    2014-10-01

    The purpose of this study was to evaluate and compare textual error rates and subtypes in radiology reports before and after implementation of department-wide structured reports. Randomly selected radiology reports that were generated following the implementation of department-wide structured reports were evaluated for textual errors by two radiologists. For each report, the text was compared to the corresponding audio file. Errors in each report were tabulated and classified. Error rates were compared to results from a prior study performed prior to implementation of structured reports. Calculated error rates included the average number of errors per report, average number of nongrammatical errors per report, the percentage of reports with an error, and the percentage of reports with a nongrammatical error. Identical versions of voice-recognition software were used for both studies. A total of 644 radiology reports were randomly evaluated as part of this study. There was a statistically significant reduction in the percentage of reports with nongrammatical errors (33 to 26%; p = 0.024). The likelihood of at least one missense omission error (omission errors that changed the meaning of a phrase or sentence) occurring in a report was significantly reduced from 3.5 to 1.2% (p = 0.0175). A statistically significant reduction in the likelihood of at least one commission error (retained statements from a standardized report that contradict the dictated findings or impression) occurring in a report was also observed (3.9 to 0.8%; p = 0.0007). Carefully constructed structured reports can help to reduce certain error types in radiology reports.
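    As an illustration of how such before/after error-rate comparisons can be tested, the sketch below runs a one-sided two-proportion z-test on hypothetical report counts. The study reports a drop from 33% to 26% of reports containing a nongrammatical error, with 644 reports evaluated after implementation; the pre-implementation sample size used here is invented, and the study's own statistical method may differ.

```python
from math import sqrt
from scipy.stats import norm

def two_proportion_z(errors_before, n_before, errors_after, n_after):
    """One-sided two-proportion z-test for a drop in the report error rate."""
    p1, p2 = errors_before / n_before, errors_after / n_after
    p_pool = (errors_before + errors_after) / (n_before + n_after)
    se = sqrt(p_pool * (1.0 - p_pool) * (1.0 / n_before + 1.0 / n_after))
    z = (p1 - p2) / se
    return z, norm.sf(z)   # p-value for H1: error rate decreased

# hypothetical pre-implementation sample of 615 reports at a 33% error rate,
# versus 167 of 644 post-implementation reports (about 26%)
z, p = two_proportion_z(203, 615, 167, 644)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```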

  1. Equilibrating errors: reliable estimation of information transmission rates in biological systems with spectral analysis-based methods.

    Science.gov (United States)

    Ignatova, Irina; French, Andrew S; Immonen, Esa-Ville; Frolov, Roman; Weckström, Matti

    2014-06-01

    Shannon's seminal approach to estimating information capacity is widely used to quantify information processing by biological systems. However, the Shannon information theory, which is based on power spectrum estimation, necessarily contains two sources of error: time delay bias error and random error. These errors are particularly important for systems with relatively large time delay values and for responses of limited duration, as is often the case in experimental work. The window function type and size chosen, as well as the values of inherent delays cause changes in both the delay bias and random errors, with possibly strong effect on the estimates of system properties. Here, we investigated the properties of these errors using white-noise simulations and analysis of experimental photoreceptor responses to naturalistic and white-noise light contrasts. Photoreceptors were used from several insect species, each characterized by different visual performance, behavior, and ecology. We show that the effect of random error on the spectral estimates of photoreceptor performance (gain, coherence, signal-to-noise ratio, Shannon information rate) is opposite to that of the time delay bias error: the former overestimates information rate, while the latter underestimates it. We propose a new algorithm for reducing the impact of time delay bias error and random error, based on discovering, and then using that size of window, at which the absolute values of these errors are equal and opposite, thus cancelling each other, allowing minimally biased measurement of neural coding.

  2. Minimum Symbol Error Rate Detection in Single-Input Multiple-Output Channels with Markov Noise

    DEFF Research Database (Denmark)

    Christensen, Lars P.B.

    2005-01-01

    Minimum symbol error rate detection in Single-Input Multiple-Output (SIMO) channels with Markov noise is presented. The special case of zero-mean Gauss-Markov noise is examined closer as it only requires knowledge of the second-order moments. In this special case, it is shown that optimal detection can be achieved by a Multiple-Input Multiple-Output (MIMO) whitening filter followed by a traditional BCJR algorithm. The Gauss-Markov noise model provides a reasonable approximation for co-channel interference, making it an interesting single-user detector for many multiuser communication systems...

  3. Human Error and the International Space Station: Challenges and Triumphs in Science Operations

    Science.gov (United States)

    Harris, Samantha S.; Simpson, Beau C.

    2016-01-01

    Any system with a human component is inherently risky. Studies in human factors and psychology have repeatedly shown that human operators will inevitably make errors, regardless of how well they are trained. Onboard the International Space Station (ISS) where crew time is arguably the most valuable resource, errors by the crew or ground operators can be costly to critical science objectives. Operations experts at the ISS Payload Operations Integration Center (POIC), located at NASA's Marshall Space Flight Center in Huntsville, Alabama, have learned that from payload concept development through execution, there are countless opportunities to introduce errors that can potentially result in costly losses of crew time and science. To effectively address this challenge, we must approach the design, testing, and operation processes with two specific goals in mind. First, a systematic approach to error and human centered design methodology should be implemented to minimize opportunities for user error. Second, we must assume that human errors will be made and enable rapid identification and recoverability when they occur. While a systematic approach and human centered development process can go a long way toward eliminating error, the complete exclusion of operator error is not a reasonable expectation. The ISS environment in particular poses challenging conditions, especially for flight controllers and astronauts. Operating a scientific laboratory 250 miles above the Earth is a complicated and dangerous task with high stakes and a steep learning curve. While human error is a reality that may never be fully eliminated, smart implementation of carefully chosen tools and techniques can go a long way toward minimizing risk and increasing the efficiency of NASA's space science operations.

  4. Accuracy of cited "facts" in medical research articles: A review of study methodology and recalculation of quotation error rate.

    Science.gov (United States)

    Mogull, Scott A

    2017-01-01

    Previous reviews estimated that approximately 20 to 25% of assertions cited from original research articles, or "facts," are inaccurately quoted in the medical literature. These reviews noted that the original studies were dissimilar and only began to compare the methods of the original studies. The aim of this review is to examine the methods of the original studies and provide a more specific rate of incorrectly cited assertions, or quotation errors, in original research articles published in medical journals. Additionally, the estimate of quotation errors calculated here is based on the ratio of quotation errors to quotations examined (a percent) rather than the more prevalent and weighted metric of quotation errors to the references selected. Overall, this resulted in a lower estimate of the quotation error rate in original medical research articles. A total of 15 studies met the criteria for inclusion in the primary quantitative analysis. Quotation errors were divided into two categories: content ("factual") or source (improper indirect citation) errors. Content errors were further subdivided into major and minor errors depending on the degree that the assertion differed from the original source. The rate of quotation errors recalculated here is 14.5% (10.5% to 18.6% at a 95% confidence interval). These content errors are predominantly, 64.8% (56.1% to 73.5% at a 95% confidence interval), major errors or cited assertions in which the referenced source either fails to substantiate, is unrelated to, or contradicts the assertion. Minor errors, which are an oversimplification, overgeneralization, or trivial inaccuracies, are 35.2% (26.5% to 43.9% at a 95% confidence interval). Additionally, improper secondary (or indirect) citations, which are distinguished from calculations of quotation accuracy, occur at a rate of 10.4% (3.4% to 17.5% at a 95% confidence interval).
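    A small sketch of the metric choice emphasized above: computing the quotation error rate as errors per quotations examined (a percentage), rather than per reference selected, with a normal-approximation 95% confidence interval. The counts are illustrative assumptions, not the review's pooled data.

```python
from math import sqrt

def quotation_error_rate(errors, quotations_examined, z=1.96):
    """Error rate per quotation examined, with a Wald 95% confidence interval."""
    p = errors / quotations_examined
    half = z * sqrt(p * (1.0 - p) / quotations_examined)
    return 100.0 * p, (100.0 * (p - half), 100.0 * (p + half))

# illustrative counts: 29 erroneous quotations out of 200 examined
rate, (low, high) = quotation_error_rate(29, 200)
print(f"quotation error rate = {rate:.1f}% (95% CI {low:.1f}% to {high:.1f}%)")
```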

  5. Resilience to evolving drinking water contamination risks: a human error prevention perspective

    OpenAIRE

    Tang, Yanhong; Wu, Shaomin; Miao, Xin; Pollard, Simon J.T.; Hrudey, Steve E.

    2013-01-01

    Human error contributes to one of the major causes of the prevalence of drinking water contamination incidents. It has, however, attracted insufficient attention in the cleaner production management community. This paper analyzes human error appearing in each stage of the gestation of 40 drinking water incidents and their causes, proposes resilience-based mechanisms and tools within three groups: consumers, drinking water companies, and policy regulators. The mechanism analysis involves conce...

  6. Data-driven region-of-interest selection without inflating Type I error rate.

    Science.gov (United States)

    Brooks, Joseph L; Zoumpoulaki, Alexia; Bowman, Howard

    2017-01-01

    In ERP and other large multidimensional neuroscience data sets, researchers often select regions of interest (ROIs) for analysis. The method of ROI selection can critically affect the conclusions of a study by causing the researcher to miss effects in the data or to detect spurious effects. In practice, to avoid inflating Type I error rate (i.e., false positives), ROIs are often based on a priori hypotheses or independent information. However, this can be insensitive to experiment-specific variations in effect location (e.g., latency shifts) reducing power to detect effects. Data-driven ROI selection, in contrast, is nonindependent and uses the data under analysis to determine ROI positions. Therefore, it has potential to select ROIs based on experiment-specific information and increase power for detecting effects. However, data-driven methods have been criticized because they can substantially inflate Type I error rate. Here, we demonstrate, using simulations of simple ERP experiments, that data-driven ROI selection can indeed be more powerful than a priori hypotheses or independent information. Furthermore, we show that data-driven ROI selection using the aggregate grand average from trials (AGAT), despite being based on the data at hand, can be safely used for ROI selection under many circumstances. However, when there is a noise difference between conditions, using the AGAT can inflate Type I error and should be avoided. We identify critical assumptions for use of the AGAT and provide a basis for researchers to use, and reviewers to assess, data-driven methods of ROI localization in ERP and other studies.

  7. Coping with human errors through system design: Implications for ecological interface design

    DEFF Research Database (Denmark)

    Rasmussen, Jens; Vicente, Kim J.

    1989-01-01

    Research during recent years has revealed that human errors are not stochastic events which can be removed through improved training programs or optimal interface design. Rather, errors tend to reflect either systematic interference between various models, rules, and schemata, or the effects of the adaptive mechanisms involved in learning. In terms of design implications, these findings suggest that reliable human-system interaction will be achieved by designing interfaces which tend to minimize the potential for control interference and support recovery from errors. In other words, the focus should be on control of the effects of errors rather than on the elimination of errors per se. In this paper, we propose a theoretical framework for interface design that attempts to satisfy these objectives. The goal of our framework, called ecological interface design, is to develop a meaningful representation...

  8. Estimation of hominoid ancestral population sizes under bayesian coalescent models incorporating mutation rate variation and sequencing errors.

    Science.gov (United States)

    Burgess, Ralph; Yang, Ziheng

    2008-09-01

    Estimation of population parameters for the common ancestors of humans and the great apes is important in understanding our evolutionary history. In particular, inference of population size for the human-chimpanzee common ancestor may shed light on the process by which the 2 species separated and on whether the human population experienced a severe size reduction in its early evolutionary history. In this study, the Bayesian method of ancestral inference of Rannala and Yang (2003. Bayes estimation of species divergence times and ancestral population sizes using DNA sequences from multiple loci. Genetics. 164:1645-1656) was extended to accommodate variable mutation rates among loci and random species-specific sequencing errors. The model was applied to analyze a genome-wide data set of approximately 15,000 neutral loci (7.4 Mb) aligned for human, chimpanzee, gorilla, orangutan, and macaque. We obtained robust and precise estimates for effective population sizes along the hominoid lineage extending back approximately 30 Myr to the cercopithecoid divergence. The results showed that ancestral populations were 5-10 times larger than modern humans along the entire hominoid lineage. The estimates were robust to the priors used and to model assumptions about recombination. The unusually low X chromosome divergence between human and chimpanzee could not be explained by variation in the male mutation bias or by current models of hybridization and introgression. Instead, our parameter estimates were consistent with a simple instantaneous process for human-chimpanzee speciation but showed a major reduction in X chromosome effective population size peculiar to the human-chimpanzee common ancestor, possibly due to selective sweeps on the X prior to separation of the 2 species.

  9. Modified Golden Codes for Improved Error Rates Through Low Complex Sphere Decoder

    Directory of Open Access Journals (Sweden)

    K.Thilagam

    2013-05-01

    Full Text Available In recent years, the golden codes have proven to exhibit superior performance in a wireless MIMO (Multiple Input Multiple Output) scenario compared with any other code. However, a serious limitation is their increased decoding complexity. This paper attempts to resolve this challenge through a suitable modification of the golden code such that a less complex sphere decoder can be used without much compromising the error rates. A minimum polynomial equation is introduced to obtain a reduced golden ratio (RGR) number for the golden code, which demands only a low-complexity decoding procedure. One of the attractive approaches used in this paper is that the effective channel matrix is exploited to perform symbol-wise decoding of single symbols, instead of grouped symbols, using a sphere decoder with a tree-search algorithm. A low decoding complexity of O(q^1.5) is obtained, against O(q^2.5) for the conventional method. Simulation analysis shows that, in addition to reduced decoding complexity, improved error rates are also obtained.

  10. Threshold based Bit Error Rate Optimization in Four Wave Mixing Optical WDM Systems

    Directory of Open Access Journals (Sweden)

    Er. Karamjeet Kaur

    2016-07-01

    Full Text Available Optical communication is communication at a distance using light to carry information, which can be performed visually or by using electronic devices. The trend toward higher bit rates in light-wave communication has increased interest in dispersion-shifted fibre to reduce dispersion penalties. At the same time, optical amplifiers have increased interest in wavelength multiplexing. This paper describes optical communication systems and discusses different optical multiplexing schemes. The effect of channel power depletion due to the generation of Four Wave Mixing waves and the effect of FWM crosstalk on the performance of a WDM receiver are studied in this paper. The main focus is to minimize the Bit Error Rate in order to increase the QoS of the optical WDM system.

  11. Error rate performance of Hybrid QAM-FSK in OFDM systems exhibiting low PAPR

    Institute of Scientific and Technical Information of China (English)

    LATIF Asma; GOHAR Nasir D.

    2009-01-01

    Multicarrier transmission systems like orthogonal frequency division multiplexing (OFDM) support high data rates and generally require no equalization at the receiver, making them simple and efficient. This paper studies the design and performance analysis of a hybrid modulation system derived from multi-frequency and MQAM signals, employed in OFDM. This modulation scheme has better bit error rate (BER) performance and exhibits low PAPR. The proposed hybrid modulator reduces PAPR while keeping the OFDM transceiver design simple, as it requires either no side information or very little side information (only one bit) to be sent, and it is efficient for an arbitrary number of subcarriers. The results of the implementations are compared with those of a conventional OFDM system.
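
    A simple way to see the PAPR metric that this hybrid scheme is designed to reduce is to compute, for one OFDM symbol, the ratio of peak to mean instantaneous power of the time-domain signal. The sketch below is illustrative only; the subcarrier count, QPSK mapping and oversampling factor are assumptions, not parameters from the paper.

        import numpy as np

        def ofdm_papr_db(symbols, oversample=4):
            """PAPR (dB) of one OFDM symbol: peak over mean instantaneous power."""
            n = len(symbols)
            # Zero-pad the middle of the spectrum to approximate the continuous-time peak.
            spectrum = np.concatenate([symbols[:n // 2],
                                       np.zeros((oversample - 1) * n, dtype=complex),
                                       symbols[n // 2:]])
            x = np.fft.ifft(spectrum)
            power = np.abs(x) ** 2
            return 10 * np.log10(power.max() / power.mean())

        rng = np.random.default_rng(0)
        # 64 QPSK-modulated subcarriers (assumed configuration).
        qpsk = (rng.choice([-1, 1], 64) + 1j * rng.choice([-1, 1], 64)) / np.sqrt(2)
        print(f"PAPR of one random QPSK-OFDM symbol: {ofdm_papr_db(qpsk):.2f} dB")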

  12. Bit Error Rate Measurements on Prototype Digital Optical Links for the CMS Tracker

    CERN Document Server

    Azevedo, C S; Faccio, F; Gill, Karl; Grabit, Robert; Jensen, Fredrik Bjorn Henning; Vasey, François

    2000-01-01

    Two prototypes of a four-channel digital optical link to be used for the slow control of the CMS Tracker detector were tested for bit error rate, at transmission rates of 40 Mbit/s and 80 Mbit/s. Both prototypes used the same transmitter and PIN photodiode, but different receiver configurations: one used COTS electronics, whilst the other used a digital receiver ASIC developed at CERN in a 0.25 μm process. Both links proved to be well within the specification limits even after the ASIC receiver was irradiated to a 20 Mrad total dose, and the PIN photodiode to a 6.5×10^14 n/cm^2 fluence.
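
    A practical question behind this kind of link qualification is how long one must test to claim a given BER. A standard rule of thumb (general BER-testing statistics, not specific to this paper) is that observing zero errors in n bits supports BER <= p at confidence CL when n >= -ln(1 - CL)/p:

        import math

        def bits_required(ber_target, confidence=0.95):
            """Error-free bits needed to assert BER <= ber_target at the given
            confidence level, using the standard zero-error Poisson bound."""
            return math.ceil(-math.log(1.0 - confidence) / ber_target)

        # Example: claiming BER <= 1e-12 at 95% confidence needs about 3e12 clean bits,
        # i.e. roughly 21 hours of continuous testing at 40 Mbit/s.
        n = bits_required(1e-12, 0.95)
        print(f"{n:.3e} bits ~ {n / 40e6 / 3600:.1f} hours at 40 Mbit/s")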

  13. Asymptotics for partly linear regression with dependent samples and ARCH errors: consistency with rates

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Partly linear regression model is useful in practice, but little is investigated in the literature to adapt it to real data which are dependent and conditionally heteroscedastic. In this paper, the estimators of the regression components are constructed via local polynomial fitting and their large sample properties are explored. Under certain mild regularity conditions, conditions are obtained to ensure that the estimators of the nonparametric component and its derivatives are consistent up to convergence rates which are optimal in the i.i.d. case, and that the estimator of the parametric component is root-n consistent with the same rate as for a parametric model. The technique adopted in the proof differs from that used by Hamilton and Truong for i.i.d. samples and corrects errors in that reference.

  14. Soft error rate simulation and initial design considerations of neutron intercepting silicon chip (NISC)

    Science.gov (United States)

    Celik, Cihangir

    Advances in microelectronics result in sub-micrometer electronic technologies as predicted by Moore's Law, 1965, which states that the number of transistors in a given space would double every two years. Most available memory architectures today have sub-micrometer transistor dimensions. The International Technology Roadmap for Semiconductors (ITRS), a continuation of Moore's Law, predicts that Dynamic Random Access Memory (DRAM) will have an average half pitch size of 50 nm and Microprocessor Units (MPU) will have an average gate length of 30 nm over the period of 2008-2012. Decreases in the dimensions satisfy the producer and consumer requirements of low power consumption, more data storage for a given space, faster clock speed, and portability of integrated circuits (IC), particularly memories. On the other hand, these properties also lead to a higher susceptibility of IC designs to temperature, magnetic interference, power supply and environmental noise, and radiation. Radiation can directly or indirectly affect device operation. When a single energetic particle strikes a sensitive node in the micro-electronic device, it can cause a permanent or transient malfunction in the device. This behavior is called a Single Event Effect (SEE). SEEs are mostly transient errors that generate an electric pulse which alters the state of a logic node in the memory device without having a permanent effect on the functionality of the device. This is called a Single Event Upset (SEU) or Soft Error. In contrast to SEUs, Single Event Latchup (SEL), Single Event Gate Rupture (SEGR), and Single Event Burnout (SEB) have permanent effects on device operation, and a system reset or recovery is needed to return to proper operation. The rate at which a device or system encounters soft errors is defined as the Soft Error Rate (SER). The semiconductor industry has been struggling with SEEs and is taking necessary measures in order to continue to improve system designs in nano

  15. A Human Reliability Analysis of Post- Accident Human Errors in the Low Power and Shutdown PSA of KSNP

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Daeil; Kim, J. H.; Jang, S. C

    2007-03-15

    Korea Atomic Energy Research Institute, using the ANS low power and shutdown (LPSD) probabilistic risk assessment (PRA) Standard, evaluated the LPSD PSA model of the KSNP, Yonggwang Units 5 and 6, and identified the items to be improved. The evaluation results of the human reliability analysis (HRA) of post-accident human errors in the LPSD PSA model for the KSNP showed that 10 of the 19 supporting requirements for such errors in the ANS PRA Standard were identified as items to be improved. Thus, we newly carried out an HRA for post-accident human errors in the LPSD PSA model for the KSNP. The following are the improvements in the HRA of post-accident human errors of the LPSD PSA model for the KSNP compared with the previous one: interviews with operators and site visits for the interpretation of the procedures, the modeling of operator actions, and the quantification of human errors; application of a limiting value to the combined post-accident human errors; and documentation of all the inputs and bases for the detailed quantifications and the dependency analysis using the quantification sheets. The assessment results for the new HRA of post-accident human errors using the ANS LPSD PRA Standard show that more than 80% of its supporting requirements for post-accident human errors were graded as Category II. The number of re-estimated human errors using the LPSD Korea Standard HRA method is 385; among them, the number of individual post-accident human errors is 253 and the number of dependent post-accident human errors is 135. The quantification results of the LPSD PSA model for the KSNP with the new HEPs show that the core damage frequency (CDF) is increased by 5.1% compared with the previous baseline CDF. It is expected that these results will be greatly helpful in improving the PSA quality for domestic nuclear power plants because they have sufficient PSA quality to meet Category II of the supporting requirements for the post

  16. An experimental approach to validating a theory of human error in complex systems

    Science.gov (United States)

    Morris, N. M.; Rouse, W. B.

    1985-01-01

    The problem of 'human error' is pervasive in engineering systems in which the human is involved. In contrast to the common engineering approach of dealing with error probabilistically, the present research seeks to alleviate problems associated with error by gaining a greater understanding of causes and contributing factors from a human information processing perspective. The general approach involves identifying conditions which are hypothesized to contribute to errors, and experimentally creating the conditions in order to verify the hypotheses. The conceptual framework which serves as the basis for this research is discussed briefly, followed by a description of upcoming research. Finally, the potential relevance of this research to design, training, and aiding issues is discussed.

  17. Integrated Framework for Understanding Relationship Between Human Error and Aviation Safety

    Institute of Scientific and Technical Information of China (English)

    徐锡东

    2009-01-01

    Introducing a framework for understanding the relationship between human error and aviation safety from multiple perspectives and using multiple models. The first part of the framework is the perspective of the individual operator, using the information processing model. The second part is the group perspective, with the Crew Resource Management (CRM) model. The third and final part is the organization perspective, using Reason's Swiss cheese model. Each of the perspectives and models has been in existence for a long time, but the integrated framework presented allows a systematic understanding of the complex relationship between human error and aviation safety, along with the numerous factors that cause or influence error. The framework also allows the identification of mitigation measures to systematically reduce human error and improve aviation safety.

  18. Finding the right coverage: the impact of coverage and sequence quality on single nucleotide polymorphism genotyping error rates.

    Science.gov (United States)

    Fountain, Emily D; Pauli, Jonathan N; Reid, Brendan N; Palsbøll, Per J; Peery, M Zachariah

    2016-07-01

    Restriction-enzyme-based sequencing methods enable the genotyping of thousands of single nucleotide polymorphism (SNP) loci in nonmodel organisms. However, in contrast to traditional genetic markers, genotyping error rates in SNPs derived from restriction-enzyme-based methods remain largely unknown. Here, we estimated genotyping error rates in SNPs genotyped with double digest RAD sequencing from Mendelian incompatibilities in known mother-offspring dyads of Hoffman's two-toed sloth (Choloepus hoffmanni) across a range of coverage and sequence quality criteria, for both reference-aligned and de novo-assembled data sets. Genotyping error rates were more sensitive to coverage than sequence quality, and low coverage yielded high error rates, particularly in de novo-assembled data sets. For example, coverage ≥5 yielded median genotyping error rates of ≥0.03 and ≥0.11 in reference-aligned and de novo-assembled data sets, respectively. Genotyping error rates declined to ≤0.01 in reference-aligned data sets with a coverage ≥30, but remained ≥0.04 in the de novo-assembled data sets. We observed approximately 10- and 13-fold declines in the number of loci sampled in the reference-aligned and de novo-assembled data sets when coverage was increased from ≥5 to ≥30 at quality score ≥30, respectively. Finally, we assessed the effects of genotyping coverage on a common population genetic application, parentage assignments, and showed that the proportion of incorrectly assigned maternities was relatively high at low coverage. Overall, our results suggest that the trade-off between sample size and genotyping error rates be considered prior to building sequencing libraries, that reporting genotyping error rates become standard practice, and that the effects of genotyping errors on inference be evaluated in restriction-enzyme-based SNP studies.
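
    The core idea, estimating a genotyping error signal from Mendelian incompatibilities in known mother-offspring dyads after applying a coverage filter, can be sketched as follows. This is an illustration only: the genotype coding, the toy data and the single shared depth filter are assumptions, not the authors' ddRAD pipeline.

        import numpy as np

        def mendelian_incompatible(mother, offspring):
            """True if a biallelic genotype pair is impossible under Mendelian inheritance
            (allele-count coding 0/1/2): opposite homozygotes share no allele."""
            return (mother == 0 and offspring == 2) or (mother == 2 and offspring == 0)

        def incompatibility_rate(mother_gt, offspring_gt, depth, min_depth):
            """Fraction of dyad-locus comparisons that are Mendelian-incompatible,
            among loci passing the coverage filter (a lower bound on the error rate)."""
            keep = depth >= min_depth
            pairs = [(m, o) for m, o, k in zip(mother_gt, offspring_gt, keep) if k]
            if not pairs:
                return float("nan")
            return sum(mendelian_incompatible(m, o) for m, o in pairs) / len(pairs)

        rng = np.random.default_rng(1)
        mother = rng.integers(0, 3, 10_000)
        offspring = mother.copy()                       # toy dyads, fully compatible to start
        flip = rng.random(10_000) < 0.02                # inject ~2% random genotyping errors
        offspring[flip] = rng.integers(0, 3, flip.sum())
        depth = rng.poisson(20, 10_000)
        print(incompatibility_rate(mother, offspring, depth, min_depth=5))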

  19. Behind Human Error: Cognitive Systems, Computers and Hindsight

    Science.gov (United States)

    1994-12-01

    squeeze became on the powers of the operator.... And as Norbert Wiener noted some years later (1964, p. 63): The gadget-minded people often have the...for one exception see Woods and Elias, 1988). This failure to develop representations that reveal change and highlight events in the monitored...Woods, D. D., and Elias, G. (1988). Significance messages: An integral display concept. In Proceedings of the 32nd Annual Meeting of the Human

  20. "It's All Human Error!": When a School Science Experiment Fails

    Science.gov (United States)

    Viechnicki, Gail Brendel; Kuipers, Joel

    2006-01-01

    This paper traces the sophisticated negotiations to re-inscribe the authority of Nature when a school science experiment fails during the enactment of a highly rated science curriculum unit. Drawing on transcriptions from classroom videotapes, we identify and describe four primary patterns of interaction that characterize this process, arguing…

  1. "It's All Human Error!": When a School Science Experiment Fails

    Science.gov (United States)

    Viechnicki, Gail Brendel; Kuipers, Joel

    2006-01-01

    This paper traces the sophisticated negotiations to re-inscribe the authority of Nature when a school science experiment fails during the enactment of a highly rated science curriculum unit. Drawing on transcriptions from classroom videotapes, we identify and describe four primary patterns of interaction that characterize this process, arguing…

  2. Respiration rate in human pituitary tumor explants.

    Science.gov (United States)

    Anniko, M; Bagger-Sjöbäck, D; Hultborn, R

    1982-01-01

    Studies on the respiration rate of human pituitary tumor tissue have so far been lacking in the literature. This study presents the results from four adenomas causing acromegaly, all with different clinical degrees of the disease. Determination of oxygen uptake was performed in vitro with a spectrophotorespirometric system. Pieces of the tumors were explanted to an organ culture system with a high degree of stability. The secretion rates of growth hormone (GH) and prolactin (PRL) were determined. After 4-8 days in vitro, specimens were analyzed for respiration rate, which was approximately 1-1.5 microliters O2/h/microgram dry weight. The activity of the pituitary tumor tissue was characterized by both the hormone secretion rate and the respiration rate. Particularly active foci were found to occur in the adenoma tissue. Depending on the individual tumor, the GH secretion rate was approximately 0.1-100 pmol/microgram dry weight/h and the PRL secretion rate approximately 0.4-18 micrograms/microgram dry weight/h. The respiration rate, like the hormone secretion rate, is dependent on the time in vitro prior to analysis. The respiration rate in individual tumors is a parameter which does not reflect GH or PRL serum levels or the clinical activity of the disease.

  3. Minimizing the symbol-error-rate for amplify-and-forward relaying systems using evolutionary algorithms

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2015-02-01

    In this paper, a new detector is proposed for an amplify-and-forward (AF) relaying system. The detector is designed to minimize the symbol-error-rate (SER) of the system. The SER surface is non-linear and may have multiple minima; therefore, designing an SER detector for cooperative communications becomes an optimization problem. Evolutionary algorithms have the capability to find the global minimum, so evolutionary algorithms such as particle swarm optimization (PSO) and differential evolution (DE) are exploited to solve this optimization problem. The performance of the proposed detectors is compared with that of conventional detectors such as the maximum likelihood (ML) and minimum mean square error (MMSE) detectors. In the simulation results, it can be observed that the SER performance of the proposed detectors is less than 2 dB away from the ML detector. Significant improvement in SER performance is also observed when comparing with the MMSE detector. The computational complexity of the proposed detector is much lower than that of the ML and MMSE algorithms. Moreover, in contrast to the ML and MMSE detectors, the computational complexity of the proposed detectors increases linearly with respect to the number of relays.
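
    As a hedged illustration of the approach (using SciPy's differential evolution rather than the authors' implementation, with a made-up two-branch BPSK detection problem as the objective), the detector weights can be treated as the decision vector and the Monte Carlo SER as the cost to minimize:

        import numpy as np
        from scipy.optimize import differential_evolution

        rng = np.random.default_rng(2)
        N = 20_000
        bits = rng.integers(0, 2, N)
        s = 2.0 * bits - 1.0                                    # BPSK symbols
        h = np.array([1.0, 0.6])                                # toy two-branch gains (assumption)
        sigma = np.sqrt(0.5 * np.sum(h**2) / 10**(8.0 / 10))    # roughly 8 dB SNR
        r = np.column_stack([h[0] * s + sigma * rng.standard_normal(N),
                             h[1] * s + sigma * rng.standard_normal(N)])

        def ser(weights):
            """Monte Carlo symbol error rate of a linear combiner with the given weights."""
            decisions = (r @ np.asarray(weights)) > 0
            return np.mean(decisions != bits)

        result = differential_evolution(ser, bounds=[(-2, 2), (-2, 2)], seed=3)
        print("best weights:", result.x, "estimated SER:", result.fun)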

  4. Type I Error Rates, Coverage of Confidence Intervals, and Variance Estimation in Propensity-Score Matched Analyses

    National Research Council Canada - National Science Library

    Austin, Peter C

    2009-01-01

    ... the statistical significance of the treatment effect. We conducted a series of Monte Carlo simulations to examine the impact of ignoring the matched nature of the propensity-score matched sample on Type I error rates, coverage of confidence...

  5. Error-rate performance analysis of cooperative OFDMA system with decode-and-forward relaying

    KAUST Repository

    Fareed, Muhammad Mehboob

    2014-06-01

    In this paper, we investigate the performance of a cooperative orthogonal frequency-division multiple-access (OFDMA) system with decode-and-forward (DaF) relaying. Specifically, we derive a closed-form approximate symbol-error-rate expression and analyze the achievable diversity orders. Depending on the relay location, a diversity order up to (L_SkD + 1) + Σ_{m=1}^{M} min(L_SkRm + 1, L_RmD + 1) is available, where M is the number of relays, and L_SkD + 1, L_SkRm + 1, and L_RmD + 1 are the lengths of the channel impulse responses of the source-to-destination, source-to-mth-relay, and mth-relay-to-destination links, respectively. Monte Carlo simulation results are also presented to confirm the analytical findings. © 2013 IEEE.
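
    A small numeric reading of that diversity-order expression may help; the channel impulse response lengths below are arbitrary assumptions chosen only to show how the per-relay terms combine.

        def diversity_order(L_sd, L_sr, L_rd):
            """(L_SD + 1) + sum over relays m of min(L_SRm + 1, L_RmD + 1)."""
            assert len(L_sr) == len(L_rd), "one (L_SRm, L_RmD) pair per relay"
            return (L_sd + 1) + sum(min(sr + 1, rd + 1) for sr, rd in zip(L_sr, L_rd))

        # Two relays, a 3-tap source-destination channel (L_sd = 2), and assumed
        # relay-link memory L_sr = [1, 3], L_rd = [2, 1]:
        print(diversity_order(2, [1, 3], [2, 1]))   # (2+1) + min(2,3) + min(4,2) = 7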

  6. Error-rate performance analysis of incremental decode-and-forward opportunistic relaying

    KAUST Repository

    Tourki, Kamel

    2010-10-01

    In this paper, we investigate an incremental opportunistic relaying scheme where the selected relay chooses to cooperate only if the source-destination channel is of an unacceptable quality. In our study, we consider regenerative relaying in which the decision to cooperate is based on a signal-to-noise ratio (SNR) threshold and takes into account the effect of possible erroneously detected and transmitted data at the best relay. We derive a closed-form expression for the end-to-end bit error rate (BER) of binary phase-shift keying (BPSK) modulation based on the exact probability density function (PDF) of each hop. Furthermore, we evaluate the asymptotic error performance and deduce the diversity order. We show that performance simulation results coincide with our analytical results. ©2010 IEEE.

  7. Evaluate the Word Error Rate of Binary Block Codes with Square Radius Probability Density Function

    CERN Document Server

    Chen, Xiaogang; Gu, Jian; Yang, Hongkui

    2007-01-01

    The word error rate (WER) of soft-decision-decoded binary block codes rarely has a closed form. Bounding techniques are widely used to evaluate the performance of the maximum-likelihood decoding algorithm, but the existing bounds are not tight enough, especially for low signal-to-noise ratios, and become looser when a suboptimum decoding algorithm is used. This paper proposes a new concept, the square radius probability density function (SR-PDF) of the decision region, to evaluate the WER. Based on the SR-PDF, the WER of binary block codes can be calculated precisely for ML and suboptimum decoders. Furthermore, for a long binary block code, the SR-PDF can be approximated by a Gamma distribution with only two parameters that can be measured easily. Using this property, two closed-form approximate expressions are proposed which are very close to the simulated WER of the codes of interest.

  8. Channel Capacity and Bit Error Rate of D-MIMO Systems under Spatial Variation of the Coverage Area

    Directory of Open Access Journals (Sweden)

    Nyoman Gunantara

    2009-05-01

    Full Text Available With advances in communication technology, the D-MIMO (Distributed MIMO) system has been developed, following the earlier use of C-MIMO (Conventional co-located MIMO) systems. C-MIMO systems make spectrum usage more efficient, reduce transmit power, and increase channel capacity. With a D-MIMO system, the distance between transmitter and receiver can be shortened, macrodiversity is obtained, and a service coverage area is provided. This paper investigates the channel capacity and Bit Error Rate (BER) over spatial variations of the coverage area. The study considers the theoretical channel capacity and the BER with the waterfilling technique. The channel capacity and BER performance of a D-MIMO system over spatial variations of the coverage area depend on the D-MIMO system configuration. Receiver locations close to a transmit antenna port have larger channel capacity but poorer BER performance.
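
    The waterfilling technique mentioned for the capacity evaluation can be sketched generically as follows; this is a textbook implementation, not the authors' code, and the sub-channel gains and power budget are assumed values.

        import numpy as np

        def waterfilling(gains, total_power):
            """Allocate total_power across parallel sub-channels with the given
            gain-to-noise ratios (e.g., squared singular values of the MIMO channel
            divided by the noise power)."""
            gains = np.asarray(gains, dtype=float)
            order = np.argsort(gains)[::-1]                      # strongest first
            g = gains[order]
            for k in range(len(g), 0, -1):
                mu = (total_power + np.sum(1.0 / g[:k])) / k     # candidate water level
                p = mu - 1.0 / g[:k]
                if p[-1] >= 0:                                   # weakest active channel still >= 0
                    alloc = np.zeros_like(gains)
                    alloc[order[:k]] = p
                    return alloc
            return np.zeros_like(gains)

        gains = [4.0, 1.5, 0.3]
        p = waterfilling(gains, total_power=2.0)
        print("allocation:", p, "capacity:",
              round(float(np.sum(np.log2(1 + p * np.asarray(gains)))), 3), "bit/s/Hz")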

  9. Bit Error Rate Analysis for MC-CDMA Systems in Nakagami-m Fading Channels

    Directory of Open Access Journals (Sweden)

    Li Zexian

    2004-01-01

    Full Text Available Multicarrier code division multiple access (MC-CDMA) is a promising technique that combines orthogonal frequency division multiplexing (OFDM) with CDMA. In this paper, based on an alternative expression for the Q-function, the characteristic function and a Gaussian approximation, we present a new practical technique for determining the bit error rate (BER) of multiuser MC-CDMA systems in frequency-selective Nakagami-m fading channels. The results are applicable to systems employing coherent demodulation with maximal ratio combining (MRC) or equal gain combining (EGC). The analysis assumes that different subcarriers experience independent fading channels, which are not necessarily identically distributed. The final average BER is expressed in the form of a single finite-range integral with an integrand composed of tabulated functions which can be easily computed numerically. The accuracy of the proposed approach is demonstrated with computer simulations.
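
    The flavor of such single finite-range integral BER expressions can be shown with a standard textbook result (the MGF-based form for BPSK with L-branch maximal ratio combining over independent Nakagami-m branches, not the multiuser MC-CDMA expression of the paper): BER = (1/pi) * integral from 0 to pi/2 of the product over branches of (1 + gbar_k/(m*sin^2(theta)))^(-m) d(theta).

        import numpy as np
        from scipy.integrate import quad

        def ber_bpsk_mrc_nakagami(mean_snrs, m):
            """Average BER of BPSK with MRC over independent Nakagami-m branches,
            evaluated as a single finite-range (MGF-based) integral."""
            def integrand(theta):
                s2 = np.sin(theta) ** 2
                prod = 1.0
                for g in mean_snrs:
                    prod *= (1.0 + g / (m * s2)) ** (-m)
                return prod
            value, _ = quad(integrand, 0.0, np.pi / 2)
            return value / np.pi

        # Two branches at 10 dB and 7 dB average SNR, fading figure m = 2 (assumed values).
        print(ber_bpsk_mrc_nakagami([10.0, 10 ** 0.7], m=2))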

  10. Threshold-Based Bit Error Rate for Stopping Iterative Turbo Decoding in a Varying SNR Environment

    Science.gov (United States)

    Mohamad, Roslina; Harun, Harlisya; Mokhtar, Makhfudzah; Adnan, Wan Azizun Wan; Dimyati, Kaharudin

    2017-01-01

    Online bit error rate (BER) estimation (OBE) has been used as a stopping criterion for iterative turbo decoding. However, such stopping criteria only work at high signal-to-noise ratios (SNRs) and fail to terminate early at low SNRs, which adds iterations and increases computational complexity. The failure of the stopping criteria is caused by an unsuitable BER threshold, which is obtained by estimating the expected BER performance at high SNRs; this threshold does not indicate the correct termination according to convergence and non-convergence outputs (CNCO). Hence, in this paper, a threshold computation based on the BER of the CNCO is proposed for an OBE stopping criterion (OBEsc). The results show that the OBEsc is capable of terminating early in a varying SNR environment. The optimum number of iterations achieved by the OBEsc allows large savings in the number of decoding iterations and decreases the delay of iterative turbo decoding.

  11. SITE project. Phase 1: Continuous data bit-error-rate testing

    Science.gov (United States)

    Fujikawa, Gene; Kerczewski, Robert J.

    1992-01-01

    The Systems Integration, Test, and Evaluation (SITE) Project at NASA LeRC encompasses a number of research and technology areas of satellite communications systems. Phase 1 of this project established a complete satellite link simulator system. The evaluation of proof-of-concept microwave devices, radiofrequency (RF) and bit-error-rate (BER) testing of hardware, testing of remote airlinks, and other tests were performed as part of this first testing phase. This final report covers the test results produced in phase 1 of the SITE Project. The data presented include 20-GHz high-power-amplifier testing, 30-GHz low-noise-receiver testing, amplitude equalization, transponder baseline testing, switch matrix tests, and continuous-wave and modulated interference tests. The report also presents the methods used to measure the RF and BER performance of the complete system. Correlations of the RF and BER data are summarized to note the effects of the RF responses on the BER.

  12. The New Tapered Fiber Connector and the Test of Its Error Rate and Coupling Characteristics

    Directory of Open Access Journals (Sweden)

    Qinggui Hu

    2017-01-01

    Full Text Available Since the fiber core is very small, the communication fiber connector requires high precision. In this paper, the effect of lateral deviation on the coupling efficiency of a fiber connector is analyzed. Then, considering the fact that optical fiber is generally used in pairs, one for transmitting data and the other for receiving, a novel directional tapered communication optical fiber connector is designed. In the new connector, the structure of the fiber head is tapered according to the signal transmission direction. In order to study the performance of the new connector, several samples were made in the laboratory of the corporation CDSEI and two test experiments were performed. The experimental results show that, compared with the traditional connector, for the same lateral deviation the coupling efficiency of the tapered connector is higher and the error rate is lower.
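
    For the lateral-deviation effect analyzed here, a commonly used first-order model (a generic fiber-optics approximation, not this paper's analysis) treats both fiber modes as identical Gaussians with mode-field radius w0, which gives a power coupling efficiency of exp(-(d/w0)^2) for a lateral offset d:

        import numpy as np

        def lateral_offset_coupling(d_um, w0_um):
            """Power coupling efficiency of two identical Gaussian fiber modes
            (mode-field radius w0) with lateral offset d; angular and axial
            misalignments are neglected."""
            return np.exp(-(d_um / w0_um) ** 2)

        w0 = 5.2                      # assumed single-mode mode-field radius, micrometers
        for d in (0.5, 1.0, 2.0):
            eff = lateral_offset_coupling(d, w0)
            print(f"offset {d} um: efficiency {eff:.3f} ({-10 * np.log10(eff):.2f} dB loss)")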

  13. Human error and the problem of causality in analysis of accidents

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1990-01-01

    Present technology is characterized by complexity, rapid change and growing size of technical systems. This has caused increasing concern with the human involvement in system safety. Analyses of the major accidents during recent decades have concluded that human errors on the part of operators, designers or managers have played a major role. There are, however, several basic problems in analysis of accidents and identification of human error. This paper addresses the nature of causal explanations and the ambiguity of the rules applied for identification of the events to include in analysis.

  14. Errors in Seismic Hazard Assessment are Creating Huge Human Losses

    Science.gov (United States)

    Bela, J.

    2015-12-01

    The current practice of representing earthquake hazards to the public based upon their perceived likelihood or probability of occurrence is proven now by the global record of actual earthquakes to be not only erroneous and unreliable, but also too deadly. Earthquake occurrence is sporadic, and therefore assumptions of earthquake frequency and return period are not only misleading, but also categorically false. More than 700,000 people lost their lives between 2000 and 2011, a period in which 11 of the world's deadliest earthquakes occurred in locations where probability-based seismic hazard assessments had predicted only low seismic hazard. Unless seismic hazard assessment and the setting of minimum earthquake design safety standards for buildings and bridges are based on a more realistic deterministic recognition of "what can happen" rather than on what mathematical models suggest is "most likely to happen", such huge human losses can only be expected to continue. The actual earthquake events that did occur were at or near the maximum potential-size event that either had already occurred in the past or was geologically known to be possible. Haiti's M7 earthquake of 2010 (with > 222,000 fatalities) meant the dead could not even be buried with dignity. Japan's catastrophic Tohoku earthquake of 2011, an M9 megathrust earthquake, unleashed a tsunami that not only obliterated coastal communities along the northern Japanese coast, but also claimed > 20,000 lives. This tsunami flooded nuclear reactors at Fukushima, causing 4 explosions and 3 reactors to melt down. But while this history of huge human losses due to erroneous and misleading seismic hazard estimates, despite its wrenching pain, cannot be unlived, if faced with courage and a more realistic deterministic estimate of "what is possible", it need not be lived again. An objective testing of the results of global probability-based seismic hazard maps against real occurrences has never been done by the

  15. Convergence Rates and Explicit Error Bounds of Hill's Method for Spectra of Self-Adjoint Differential Operators

    OpenAIRE

    Tanaka, Ken'ichiro; Murashige, Sunao

    2012-01-01

    We present the convergence rates and the explicit error bounds of Hill's method, which is a numerical method for computing the spectra of ordinary differential operators with periodic coefficients. This method approximates the operator by a finite dimensional matrix. On the assumption that the operator is self-adjoint, it is shown that, under some conditions, we can obtain the convergence rates of eigenvalues with respect to the dimension and the explicit error bounds. Numerical examples demonstrate...
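
    The essence of Hill's method, truncating the operator in a Fourier basis and taking the eigenvalues of the resulting finite matrix, can be sketched for a simple self-adjoint example, L y = -y'' + 2*cos(x)*y with periodic boundary conditions on [0, 2*pi]; the potential and the truncation sizes below are arbitrary choices for illustration.

        import numpy as np

        def hill_eigenvalues(N=40, num=5):
            """Lowest periodic eigenvalues of -y'' + 2*cos(x)*y approximated by
            truncating the operator in the Fourier basis exp(i*n*x), n = -N..N."""
            n = np.arange(-N, N + 1)
            A = np.diag(n.astype(float) ** 2)          # -d^2/dx^2 is diagonal: n^2
            # 2*cos(x) couples modes n and n +/- 1 with coefficient 1.
            A += np.diag(np.ones(2 * N), 1) + np.diag(np.ones(2 * N), -1)
            return np.sort(np.linalg.eigvalsh(A))[:num]

        for N in (5, 10, 40):
            print(N, hill_eigenvalues(N))   # eigenvalues settle as the truncation grows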

  16. Computational analysis of splicing errors and mutations in human transcripts

    Directory of Open Access Journals (Sweden)

    Gelfand Mikhail S

    2008-01-01

    Full Text Available Abstract. Background: Most retained introns found in human cDNAs generated by high-throughput sequencing projects seem to result from underspliced transcripts, and thus they capture intermediate steps of pre-mRNA splicing. On the other hand, mutations in splice sites cause exon skipping of the respective exon or activation of pre-existing cryptic sites. Both types of events reflect properties of the splicing mechanism. Results: The retained introns were significantly shorter than constitutive ones, and skipped exons are shorter than exons with cryptic sites. Both donor and acceptor splice sites of retained introns were weaker than splice sites of constitutive introns. The authentic acceptor sites affected by mutations were significantly weaker in exons with activated cryptic sites than in skipped exons. The distance from a mutated splice site to the nearest equivalent site is significantly shorter in cases of activated cryptic sites compared to exon skipping events. The prevalence of retained introns within genes monotonically increased in the 5'-to-3' direction (more retained introns close to the 3'-end), consistent with the model of co-transcriptional splicing. The density of exonic splicing enhancers was higher, and the density of exonic splicing silencers lower, in retained introns compared to constitutive ones and in exons with cryptic sites compared to skipped exons. Conclusion: Thus the analysis of retained introns in human cDNA, exons skipped due to mutations in splice sites and exons with cryptic sites produced results consistent with the intron definition mechanism of splicing of short introns, co-transcriptional splicing, dependence of splicing efficiency on the splice site strength and the density of candidate exonic splicing enhancers and silencers. These results are consistent with other, recently published analyses.

  17. Human and organizational errors in loading and discharge operations at marine terminals: Reduction of tanker oil and chemical spills. Organizing to minimize human and organizational errors

    Energy Technology Data Exchange (ETDEWEB)

    Mannarelli, T.; Roberts, K.; Bea, R.

    1995-11-01

    This report summarizes organizational and managerial findings, and proposes corresponding recommendations, based on a program of research conducted at two major locations: Chevron USA Products Company Refinery in Richmond, California and Arco Marine Incorporated shipping operations in Long Beach, California. The Organizational Behavior and Industrial Relations group from the Business School approached the project with the same objective (of reducing the risk of accidents resulting from human and/or organizational errors), but used a different means of achieving those ends. On the Business side, the aim of the project is to identify organizational and managerial practices, problems, and potential problems, analyze them, and then make recommendations that offer potential solutions to those circumstances which pose a human and/or organizational error (HOE) risk.

  18. Human error identification for laparoscopic surgery: Development of a motion economy perspective.

    Science.gov (United States)

    Al-Hakim, Latif; Sevdalis, Nick; Maiping, Tanaphon; Watanachote, Damrongpan; Sengupta, Shomik; Dissaranan, Charuspong

    2015-09-01

    This study postulates that traditional human error identification techniques fail to consider motion economy principles and, accordingly, their applicability in operating theatres may be limited. This study addresses this gap in the literature with a dual aim. First, it identifies the principles of motion economy that suit the operative environment and, second, it develops a new error mode taxonomy for human error identification techniques which recognises motion economy deficiencies affecting the performance of surgeons and predisposing them to errors. A total of 30 principles of motion economy were developed and categorised into five areas. A hierarchical task analysis was used to break down the main tasks of a urological laparoscopic surgery (hand-assisted laparoscopic nephrectomy) into their elements, and the new taxonomy was used to identify errors and their root causes resulting from violation of motion economy principles. The approach was prospectively tested in 12 observed laparoscopic surgeries performed by 5 experienced surgeons. A total of 86 errors were identified and linked to the motion economy deficiencies. Results indicate that the developed methodology is promising. Our methodology allows error prevention in surgery, and the developed set of motion economy principles could be useful for training surgeons on motion economy principles. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  19. Safety coaches in radiology: decreasing human error and minimizing patient harm

    Energy Technology Data Exchange (ETDEWEB)

    Dickerson, Julie M.; Adams, Janet M. [Cincinnati Children's Hospital Medical Center, Department of Radiology, MLC 5031, Cincinnati, OH (United States); Koch, Bernadette L.; Donnelly, Lane F. [Cincinnati Children's Hospital Medical Center, Department of Radiology, MLC 5031, Cincinnati, OH (United States); Cincinnati Children's Hospital Medical Center, Department of Pediatrics, Cincinnati, OH (United States); Goodfriend, Martha A. [Cincinnati Children's Hospital Medical Center, Department of Quality Improvement, Cincinnati, OH (United States)

    2010-09-15

    Successful programs to improve patient safety require a component aimed at improving safety culture and environment, resulting in a reduced number of human errors that could lead to patient harm. Safety coaching provides peer accountability. It involves observing for safety behaviors and use of error prevention techniques and provides immediate feedback. For more than a decade, behavior-based safety coaching has been a successful strategy for reducing error within the context of occupational safety in industry. We describe the use of safety coaches in radiology. Safety coaches are an important component of our comprehensive patient safety program. (orig.)

  20. Asymptotics for partly linear regression with dependent samples and ARCH errors: consistency with rates

    Institute of Scientific and Technical Information of China (English)

    LU; Zudi

    2001-01-01

    [1]Engle, R. F., Granger, C. W. J., Rice, J. et al., Semiparametric estimates of the relation between weather and electricity sales, Journal of the American Statistical Association, 1986, 81: 310.[2]Heckman, N. E., Spline smoothing in partly linear models, Journal of the Royal Statistical Society, Ser. B, 1986, 48: 244.[3]Rice, J., Convergence rates for partially splined models, Statistics & Probability Letters, 1986, 4: 203.[4]Chen, H., Convergence rates for parametric components in a partly linear model, Annals of Statistics, 1988, 16: 136.[5]Robinson, P. M., Root-n-consistent semiparametric regression, Econometrica, 1988, 56: 931.[6]Speckman, P., Kernel smoothing in partial linear models, Journal of the Royal Statistical Society, Ser. B, 1988, 50: 413.[7]Cuzick, J., Semiparametric additive regression, Journal of the Royal Statistical Society, Ser. B, 1992, 54: 831.[8]Cuzick, J., Efficient estimates in semiparametric additive regression models with unknown error distribution, Annals of Statistics, 1992, 20: 1129.[9]Chen, H., Shiau, J. H., A two-stage spline smoothing method for partially linear models, Journal of Statistical Planning & Inference, 1991, 27: 187.[10]Chen, H., Shiau, J. H., Data-driven efficient estimators for a partially linear model, Annals of Statistics, 1994, 22: 211.[11]Schick, A., Root-n consistent estimation in partly linear regression models, Statistics & Probability Letters, 1996, 28: 353.[12]Hamilton, S. A., Truong, Y. K., Local linear estimation in partly linear model, Journal of Multivariate Analysis, 1997, 60: 1.[13]Mills, T. C., The Econometric Modeling of Financial Time Series, Cambridge: Cambridge University Press, 1993, 137.[14]Engle, R. F., Autoregressive conditional heteroscedasticity with estimates of United Kingdom inflation, Econometrica, 1982, 50: 987.[15]Bera, A. K., Higgins, M. L., A survey of ARCH models: properties of estimation and testing, Journal of Economic

  1. A Human Reliability Analysis of Pre-Accident Human Errors in the Low Power and Shutdown PSA of the KSNP

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Daeil; Jang, Seungchul

    2007-03-15

    Korea Atomic Energy Research Institute, using the ANS Low Power/Shutdown (LPSD) PRA Standard, evaluated the LPSD PSA model of the KSNP, Younggwang (YGN) Units 5 and 6, and identified the items to be improved. The evaluation results of the human reliability analysis (HRA) of the pre-accident human errors in the LPSD PSA model of the KSNP showed that 13 of the 15 supporting requirements for such errors in the ANS PRA Standard were identified as items to be improved. Thus, we newly carried out an HRA for pre-accident human errors in the LPSD PSA model for the KSNP to improve its quality. We considered potential pre-accident human errors for all manual valves and control/instrumentation equipment of the systems modeled in the KSNP LPSD PSA model, except the reactor protection system/engineered safety features actuation system. We reviewed 160 manual valves and 56 items of control/instrumentation equipment. The number of newly identified pre-accident human errors is 101. Among them, the number related to testing/maintenance tasks is 56, the number related to calibration tasks is 45, and the number related only to shutdown operation is 10. It was shown that the contribution of the pre-accident human errors related only to shutdown operation to the core damage frequency of the LPSD PSA model for the KSNP was negligible. The self-assessment results for the new HRA of pre-accident human errors using the ANS LPSD PRA Standard show that more than 80% of its supporting requirements for post-accident human errors were graded as Category II or III. It is expected that the HRA results for the pre-accident human errors presented in this study will be greatly helpful in improving the PSA quality for domestic nuclear power plants because they have sufficient PSA quality to meet Category II of the supporting requirements for the post-accident human errors in the ANS LPSD PRA Standard.

  3. Human error in medical practice: an unavoidable presence El error en la práctica médica: una presencia ineludible

    OpenAIRE

    Gladis Adriana Vélez Álvarez

    2006-01-01

    Making mistakes is a human characteristic and a mechanism for learning, but at the same time it may become a threat to human beings in some scenarios. Aviation and Medicine are good examples of this. Some data are presented about the frequency of error in Medicine, its ubiquity and the circumstances that favor it. A reflection is offered on how error is being managed and why it is not more often discussed. It is proposed that the first step in learning from an error is to accept it as an unavoidable ...

  4. Assessment of the rate and etiology of pharmacological errors by nurses of two major teaching hospitals in Shiraz

    Directory of Open Access Journals (Sweden)

    Fatemeh Vizeshfar

    2015-06-01

    Full Text Available Medication errors have serious consequences for patients, their families and caregivers. Reduction of these faults by caregivers such as nurses can increase the safety of patients. The goal of the study was to assess the rate and etiology of medication errors in pediatric and medical wards. This cross-sectional analytic study was done on 101 registered nurses who had the duty of drug administration in medical pediatric and adults' wards. Data were collected by a questionnaire including demographic information, self-reported faults, etiology of medication error and researcher observations. The results showed that nurses' faults in pediatric wards were 51.6% and in adults' wards were 47.4%. The most common faults in adults' wards were late or early drug administration (48.6%), and administration of drugs without prescription and administering wrong drugs were the most common medication errors in pediatric wards (each 49.2%). According to the researchers' observations, the medication error rate of 57.9% was rated low in adults' wards and the rate of 69.4% in pediatric wards was rated moderate. The most frequent medication error in both adults' and pediatric wards was that nurses did not explain the reason for and type of drug they were going to administer to patients. The independent t-test showed a significant difference in observed faults in pediatric wards (p=0.000) and in adults' wards (p=0.000). Several studies have shown medication errors all over the world, especially in pediatric wards. However, by designing a suitable reporting system and using a multidisciplinary approach, the occurrence of medication errors and their negative consequences can be reduced.

  5. Estimating gene gain and loss rates in the presence of error in genome assembly and annotation using CAFE 3.

    Science.gov (United States)

    Han, Mira V; Thomas, Gregg W C; Lugo-Martinez, Jose; Hahn, Matthew W

    2013-08-01

    Current sequencing methods produce large amounts of data, but genome assemblies constructed from these data are often fragmented and incomplete. Incomplete and error-filled assemblies result in many annotation errors, especially in the number of genes present in a genome. This means that methods attempting to estimate rates of gene duplication and loss often will be misled by such errors and that rates of gene family evolution will be consistently overestimated. Here, we present a method that takes these errors into account, allowing one to accurately infer rates of gene gain and loss among genomes even with low assembly and annotation quality. The method is implemented in the newest version of the software package CAFE, along with several other novel features. We demonstrate the accuracy of the method with extensive simulations and reanalyze several previously published data sets. Our results show that errors in genome annotation do lead to higher inferred rates of gene gain and loss but that CAFE 3 sufficiently accounts for these errors to provide accurate estimates of important evolutionary parameters.

  6. Leak in the breathing circuit: CO2 absorber and human error.

    Science.gov (United States)

    Umesh, Goneppanavar; Jasvinder, Kaur; Sagarnil, Roy

    2010-04-01

    A couple of reports in literature have mentioned CO2 absorbers to be the cause for breathing circuit leak during anesthesia. Defective canister, failure to close the absorber chamber and overfilling of the chamber with sodalime were the problems in these reports. Among these, the last two are reports of human error resulting in problems. We report a case where despite taking precautions in this regard, we experienced a significant leak in the system due to a problem with the CO2 absorber, secondary to human error.

  7. A Conceptual Framework for Predicting Error in Complex Human-Machine Environments

    Science.gov (United States)

    Freed, Michael; Remington, Roger; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    We present a Goals, Operators, Methods, and Selection Rules-Model Human Processor (GOMS-MHP) style model-based approach to the problem of predicting human habit capture errors. Habit captures occur when the model fails to allocate limited cognitive resources to retrieve task-relevant information from memory. Lacking the unretrieved information, decision mechanisms act in accordance with implicit default assumptions, resulting in error when relied upon assumptions prove incorrect. The model helps interface designers identify situations in which such failures are especially likely.

  8. Bit error rate analysis of free-space optical system with spatial diversity over strong atmospheric turbulence channel with pointing errors

    Science.gov (United States)

    Krishnan, Prabu; Sriram Kumar, D.

    2014-12-01

    Free-space optical communication (FSO) is emerging as an attractive alternative for overcoming connectivity problems. It can be used for transmitting signals over common lands and properties that the sender or receiver may not own. The performance of an FSO system depends on random environmental conditions. The bit error rate (BER) performance of a differential phase shift keying FSO system is investigated. A distributed strong atmospheric turbulence channel with pointing errors is considered for the BER analysis. Here, system models are developed for single-input, single-output FSO (SISO-FSO) and single-input, multiple-output FSO (SIMO-FSO) systems. Closed-form mathematical expressions are derived for the average BER with various combining schemes in terms of the Meijer G-function.

  9. Does raising type 1 error rate improve power to detect interactions in linear regression models? A simulation study.

    Directory of Open Access Journals (Sweden)

    Casey P Durand

    Full Text Available INTRODUCTION: Statistical interactions are a common component of data analysis across a broad range of scientific disciplines. However, the statistical power to detect interactions is often undesirably low. One solution is to elevate the Type 1 error rate so that important interactions are not missed in a low power situation. To date, no study has quantified the effects of this practice on power in a linear regression model. METHODS: A Monte Carlo simulation study was performed. A continuous dependent variable was specified, along with three types of interactions: continuous variable by continuous variable; continuous by dichotomous; and dichotomous by dichotomous. For each of the three scenarios, the interaction effect sizes, sample sizes, and Type 1 error rate were varied, resulting in a total of 240 unique simulations. RESULTS: In general, power to detect the interaction effect was either so low or so high at α = 0.05 that raising the Type 1 error rate only served to increase the probability of including a spurious interaction in the model. A small number of scenarios were identified in which an elevated Type 1 error rate may be justified. CONCLUSIONS: Routinely elevating Type 1 error rate when testing interaction effects is not an advisable practice. Researchers are best served by positing interaction effects a priori and accounting for them when conducting sample size calculations.
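
    A minimal version of such a simulation (a sketch, not the authors' code; the effect sizes, sample size and replicate count are arbitrary) estimates how often the continuous-by-dichotomous interaction term is declared significant at a chosen alpha, which gives the power when the interaction is real and the Type 1 error rate when it is absent:

        import numpy as np
        from scipy import stats

        def interaction_rejection_rate(n=200, beta_int=0.2, alpha=0.05, reps=2000, seed=4):
            """Share of simulated OLS fits in which the interaction coefficient is
            significant at level alpha."""
            rng = np.random.default_rng(seed)
            hits = 0
            for _ in range(reps):
                x = rng.standard_normal(n)              # continuous predictor
                g = rng.integers(0, 2, n)               # dichotomous predictor
                y = 0.3 * x + 0.3 * g + beta_int * x * g + rng.standard_normal(n)
                X = np.column_stack([np.ones(n), x, g, x * g])
                beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
                dof = n - X.shape[1]
                se = np.sqrt(rss[0] / dof * np.linalg.inv(X.T @ X)[3, 3])
                p_value = 2 * stats.t.sf(abs(beta[3] / se), dof)
                hits += p_value < alpha
            return hits / reps

        print("power at alpha = 0.05:", interaction_rejection_rate(beta_int=0.2))
        print("Type 1 error at alpha = 0.10:", interaction_rejection_rate(beta_int=0.0, alpha=0.10))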

  10. Bit error rate estimation for galvanic-type intra-body communication using experimental eye-diagram and jitter characteristics.

    Science.gov (United States)

    Li, Jia Wen; Chen, Xi Mei; Pun, Sio Hang; Mak, Peng Un; Gao, Yue Ming; Vai, Mang I; Du, Min

    2013-01-01

    Bit error rate (BER), which indicates the reliability of a communication channel, is one of the most important figures of merit in all kinds of communication systems, including intra-body communication (IBC). In order to learn more about the IBC channel, this paper presents a new method of BER estimation for galvanic-type IBC using experimental eye-diagram and jitter characteristics. To lay the foundation for our methodology, the fundamental relationships between eye-diagram, jitter and BER are first reviewed. Then experiments based on human lower arm IBC are carried out using a quadrature phase shift keying (QPSK) modulation scheme and a 500 kHz carrier frequency. In our IBC experiments, the symbol rate is from 10 Ksps to 100 Ksps, with two transmitted power settings, 0 dBm and -5 dBm. Finally, BER results were obtained from the experimental data through the relationships among eye-diagram, jitter and BER. These results were then compared with theoretical values and show good agreement, especially when the SNR is between 6 dB and 11 dB. Additionally, these results demonstrate that modeling the noise of the galvanic-type IBC channel as Additive White Gaussian Noise (AWGN), as assumed in previous studies, is applicable.
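
    A common eye-diagram shortcut (the generic Gaussian Q-factor relation, not this paper's jitter-based model) estimates BER from the measured level means and standard deviations as BER ~ 0.5*erfc(Q/sqrt(2)), with Q = (mu1 - mu0)/(sigma1 + sigma0):

        import math

        def ber_from_eye(mu1, mu0, sigma1, sigma0):
            """BER estimate from eye-diagram statistics via the Q-factor, assuming
            Gaussian noise on both logic levels and an optimum decision threshold."""
            q = (mu1 - mu0) / (sigma1 + sigma0)
            return 0.5 * math.erfc(q / math.sqrt(2.0)), q

        # Assumed eye statistics (arbitrary units) at the sampling instant.
        ber, q = ber_from_eye(mu1=1.0, mu0=0.0, sigma1=0.08, sigma0=0.06)
        print(f"Q = {q:.2f}, estimated BER = {ber:.2e}")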

  11. IMPROVING THE PERFORMANCE AND REDUCING BIT ERROR RATE ON WIRELESS DEEP FADING ENVIRONMENT RECEIVERS

    Directory of Open Access Journals (Sweden)

    K. Jayanthi

    2014-01-01

    Full Text Available One of the major challenges in wireless communication systems is the increasing complexity and reduced performance of detecting the received digital information in indoor and outdoor environments. To overcome this problem, we analyze the delay performance of a multiuser system with perfect channel state information transmitting data in a deep fading environment. In the proposed system, a Wireless Deep Fading Environment (WDFE) producing a Nakagami multipath fading channel with fading figure 'm' is used to improve the delay performance over the existing Rayleigh fading channel. In this WDFE, receivers obtain coherent, synchronized, secure and improved-strength signals using Multiuser Coherent Joint Diversity (MCJD) with Multi Carrier-Code Division Multiple Access (MC-CDMA). The MCJD with 'M' antenna branches is used to reduce the Bit Error Rate (BER), and the MC-CDMA method is used to improve performance. The proposed combination of MCJD and MC-CDMA is therefore a very good transceiver candidate for next-generation wireless systems beyond the existing 3G wireless system. Overall, the experimental results show improved performance in different multiuser wireless systems under different multipath fading conditions.

  12. Bit Error Rate Performance Analysis on Modulation Techniques of Wideband Code Division Multiple Access

    CERN Document Server

    Masud, M A; Rahman, M A

    2010-01-01

    In the beginning of the 21st century there has been a dramatic shift in the market dynamics of telecommunication services. The transmission from base station to mobile, or downlink transmission, using M-ary Quadrature Amplitude Modulation (QAM) and Quadrature Phase Shift Keying (QPSK) modulation schemes is considered in the Wideband Code Division Multiple Access (W-CDMA) system. We have analyzed the performance of these modulation techniques when the system is subjected to Additive White Gaussian Noise (AWGN) and multipath Rayleigh fading in the channel. The research has been performed using MATLAB 7.6 for simulation and evaluation of the Bit Error Rate (BER) and Signal-to-Noise Ratio (SNR) for the W-CDMA system models. The analysis considers Quadrature Phase Shift Keying and 16-ary Quadrature Amplitude Modulation, which are being used in the wideband code division multiple access system; therefore, the system could adopt the modulation technique better suited to the channel quality, thus we can d...
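
    For reference, the textbook AWGN bit error rates of the two modulations compared in that study (standard Gray-mapping theory, not the paper's W-CDMA simulation) are BER_QPSK = 0.5*erfc(sqrt(Eb/N0)) and, approximately, BER_16QAM = (3/8)*erfc(sqrt(0.4*Eb/N0)):

        import numpy as np
        from scipy.special import erfc

        def ber_qpsk(ebn0_db):
            """Gray-coded QPSK bit error rate in AWGN."""
            g = 10 ** (np.asarray(ebn0_db) / 10)
            return 0.5 * erfc(np.sqrt(g))

        def ber_16qam(ebn0_db):
            """Gray-coded square 16-QAM bit error rate in AWGN (nearest-neighbour approximation)."""
            g = 10 ** (np.asarray(ebn0_db) / 10)
            return 0.375 * erfc(np.sqrt(0.4 * g))

        for snr_db in (4, 8, 12):
            print(f"Eb/N0 = {snr_db:2d} dB: QPSK {ber_qpsk(snr_db):.2e}, 16-QAM {ber_16qam(snr_db):.2e}")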

  13. Methodological Approach for Performing Human Reliability and Error Analysis in Railway Transportation System

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2011-10-01

    Full Text Available Today, billions of dollars are being spent annually worldwide to develop, manufacture, and operate transportation systems such as trains, ships, aircraft, and motor vehicles. Around 70 to 90 percent of transportation crashes are, directly or indirectly, the result of human error. In fact, with the development of technology, system reliability has increased dramatically during the past decades, while human reliability has remained unchanged over the same period. Accordingly, human error is now considered the most significant source of accidents or incidents in safety-critical systems. The aim of the paper is to propose a methodological approach to improve transportation system reliability, and in particular railway transportation system reliability. The methodology presented is based on Failure Modes, Effects and Criticality Analysis (FMECA) and Human Reliability Analysis (HRA).

  14. A human error analysis methodology, AGAPE-ET, for emergency tasks in nuclear power plants and its application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    This report presents a procedurised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), for both the qualitative error analysis and the quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. AGAPE-ET is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified considering the characteristics of the performance of each cognitive function and the influencing mechanism of PIFs on the cognitive function. Error analysis items have then been determined from the identified error causes or error-likely situations to cue or guide the analysts in the overall human error analysis. A human error analysis procedure based on the error analysis items is organised. The basic scheme for the quantification of the HEP consists in multiplying the BHEP assigned to the error analysis item by the weight from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method can be characterised by the structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts need only consider the relevant cognitive functions. The report also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results. 42 refs., 7 figs., 36 tabs. (Author)
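
    A toy rendering of that quantification scheme, a base HEP for an error analysis item multiplied by a weight read from a PIF decision tree, might look like the following; the item names, base probabilities and weights are invented for illustration and are not values from AGAPE-ET.

        # Hypothetical base human error probabilities per error analysis item.
        BASE_HEP = {
            "misread_indicator": 3e-3,
            "skip_procedure_step": 1e-3,
        }

        # Hypothetical decision-tree weights of performance influencing factors,
        # grouped by the cognitive function they act on.
        PIF_WEIGHTS = {
            "information_gathering": {"poor_labelling": 3.0, "time_pressure": 2.0},
            "response_execution": {"awkward_layout": 2.0},
        }

        def quantify_hep(item, cognitive_function, active_pifs, cap=1.0):
            """HEP = base HEP for the item multiplied by the weights of the active
            PIFs on the relevant cognitive function, capped at 1.0."""
            hep = BASE_HEP[item]
            for pif in active_pifs:
                hep *= PIF_WEIGHTS[cognitive_function].get(pif, 1.0)
            return min(hep, cap)

        print(quantify_hep("misread_indicator", "information_gathering",
                           ["poor_labelling", "time_pressure"]))   # 3e-3 * 3 * 2 = 0.018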

  15. How we learn to make decisions: rapid propagation of reinforcement learning prediction errors in humans.

    Science.gov (United States)

    Krigolson, Olav E; Hassall, Cameron D; Handy, Todd C

    2014-03-01

    Our ability to make decisions is predicated upon our knowledge of the outcomes of the actions available to us. Reinforcement learning theory posits that actions followed by a reward or punishment acquire value through the computation of prediction errors-discrepancies between the predicted and the actual reward. A multitude of neuroimaging studies have demonstrated that rewards and punishments evoke neural responses that appear to reflect reinforcement learning prediction errors [e.g., Krigolson, O. E., Pierce, L. J., Holroyd, C. B., & Tanaka, J. W. Learning to become an expert: Reinforcement learning and the acquisition of perceptual expertise. Journal of Cognitive Neuroscience, 21, 1833-1840, 2009; Bayer, H. M., & Glimcher, P. W. Midbrain dopamine neurons encode a quantitative reward prediction error signal. Neuron, 47, 129-141, 2005; O'Doherty, J. P. Reward representations and reward-related learning in the human brain: Insights from neuroimaging. Current Opinion in Neurobiology, 14, 769-776, 2004; Holroyd, C. B., & Coles, M. G. H. The neural basis of human error processing: Reinforcement learning, dopamine, and the error-related negativity. Psychological Review, 109, 679-709, 2002]. Here, we used the brain ERP technique to demonstrate that not only do rewards elicit a neural response akin to a prediction error but also that this signal rapidly diminished and propagated to the time of choice presentation with learning. Specifically, in a simple, learnable gambling task, we show that novel rewards elicited a feedback error-related negativity that rapidly decreased in amplitude with learning. Furthermore, we demonstrate the existence of a reward positivity at choice presentation, a previously unreported ERP component that has a similar timing and topography as the feedback error-related negativity that increased in amplitude with learning. The pattern of results we observed mirrored the output of a computational model that we implemented to compute reward
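    A minimal sketch of the reinforcement-learning prediction error the study builds on (a simple delta-rule update with an assumed learning rate); it illustrates why the error signal at feedback shrinks across trials as the predicted value converges on the delivered reward.

```python
alpha = 0.2          # learning rate (assumed)
value = 0.0          # predicted reward for the chosen option
reward = 1.0         # delivered reward on each trial

for trial in range(1, 11):
    prediction_error = reward - value    # delta, the quantity the ERP tracks
    value += alpha * prediction_error    # prediction moves toward the reward
    print(f"trial {trial:2d}: prediction error = {prediction_error:.3f}")
```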

  16. The Role of Human Error in Design, Construction, and Reliability of Marine Structures.

    Science.gov (United States)

    1994-10-01

    [Garbled excerpt from the scanned report; figure/table residue listing organization, human resources, systems, facilities, equipment, materials, construction procedures, and design has been condensed.] The entire process is iterative (the design spiral) [Taggart, 1980]. The preliminary design...quantitative analyses. ...of the MSIP project [Bea, 1993] indicated that there were four general approaches that should be considered in developing human error tolerant

  17. Error Rates of the Maximum-Likelihood Detector for Arbitrary Constellations: Convex/Concave Behavior and Applications

    CERN Document Server

    Loyka, Sergey; Gagnon, Francois

    2009-01-01

    Motivated by a recent surge of interest in convex optimization techniques, convexity/concavity properties of error rates of the maximum likelihood detector operating in the AWGN channel are studied and extended to frequency-flat slow-fading channels. Generic conditions are identified under which the symbol error rate (SER) is convex/concave for arbitrary multi-dimensional constellations. In particular, the SER is convex in SNR for any one- and two-dimensional constellation, and also in higher dimensions at high SNR. Pairwise error probability and bit error rate are shown to be convex at high SNR, for arbitrary constellations and bit mapping. Universal bounds for the SER 1st and 2nd derivatives are obtained, which hold for arbitrary constellations and are tight for some of them. Applications of the results are discussed, which include optimum power allocation in spatial multiplexing systems, optimum power/time sharing to decrease or increase (jamming problem) error rate, an implication for fading channels ("fa...
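    A small numerical check (not from the paper) of the convexity claim for the simplest one-dimensional case, BPSK, whose symbol error rate in AWGN is Q(sqrt(2·SNR)) with SNR in linear units; the second derivative with respect to SNR stays positive over the sampled range.

```python
import numpy as np
from scipy.stats import norm

def ser_bpsk(snr_lin):
    return norm.sf(np.sqrt(2.0 * snr_lin))   # Q(sqrt(2*SNR))

snr = np.linspace(0.1, 10, 200)
second_derivative = np.gradient(np.gradient(ser_bpsk(snr), snr), snr)
# Ignore the edge points, where numerical differentiation is least accurate.
print("SER convex over this range:", bool(np.all(second_derivative[2:-2] > 0)))
```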

  18. Forensic comparison and matching of fingerprints: using quantitative image measures for estimating error rates through understanding and predicting difficulty.

    Directory of Open Access Journals (Sweden)

    Philip J Kellman

    Full Text Available Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. The results indicate the plausibility of using objective image metrics to predict expert
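    A minimal sketch of the kind of multiple-regression modelling described above, using simulated data and hypothetical feature names standing in for the paper's image metrics (contrast, intensity, fingerprint area, ridge clarity); it only illustrates fitting and cross-validating such a difficulty predictor, not the study itself.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_pairs = 200
# Hypothetical image metrics for each fingerprint pair:
# contrast, intensity, fingerprint area, ridge clarity.
X = rng.normal(size=(n_pairs, 4))
true_weights = np.array([0.5, -0.2, -0.4, -0.6])
difficulty = X @ true_weights + 0.3 * rng.normal(size=n_pairs)

model = LinearRegression()
r2 = cross_val_score(model, X, difficulty, cv=5)   # default scorer is R^2
print("cross-validated R^2:", round(r2.mean(), 3))
```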

  19. Forensic comparison and matching of fingerprints: using quantitative image measures for estimating error rates through understanding and predicting difficulty.

    Science.gov (United States)

    Kellman, Philip J; Mnookin, Jennifer L; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E

    2014-01-01

    Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. The results indicate the plausibility of using objective image metrics to predict expert performance and

  20. Support of protective work of human error in a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Yoshizawa, Yuriko [Tokyo Electric Power Co., Inc. (Japan)

    1999-12-01

    The nuclear power plant human factors group of the Tokyo Electric Power Co., Ltd. supports various human error prevention activities conducted at the company's nuclear power plants. Its main research themes are human factors in plant operation, error recovery, and common fundamental studies of human factors. On the basis of the information obtained, the group also assists the error prevention work carried out at the plants and develops it for practical use. In particular, to support the sharing of hazard information, several forms of assistance were promoted: a case analysis method that helps staff understand hazard information substantively rather than superficially, a database for conveniently sharing such information, and near-miss (non-accident) surveys that provide hints for promoting prevention work effectively. This paper introduces the assistance and investigations for effective sharing of hazard information in support of human error prevention activities conducted mainly at nuclear power plants. (G.K.)

  1. In-plant reliability data base for nuclear plant components: a feasibility study on human error information

    Energy Technology Data Exchange (ETDEWEB)

    Borkowski, R.J.; Fragola, J.R.; Schurman, D.L.; Johnson, J.W.

    1984-03-01

    This report documents the procedure and final results of a feasibility study which examined the usefulness of nuclear plant maintenance work requests in the IPRDS as tools for understanding human error and its influence on component failure and repair. Developed in this study were (1) a set of criteria for judging the quality of a plant maintenance record set for studying human error; (2) a scheme for identifying human errors in the maintenance records; and (3) two taxonomies (engineering-based and psychology-based) for categorizing and coding human error-related events.

  2. [Evaluation and improvement of a measure of drug name similarity, vwhtfrag, in relation to subjective similarities and experimental error rates].

    Science.gov (United States)

    Tamaki, Hirofumi; Satoh, Hiroki; Hori, Satoko; Sawada, Yasufumi

    2012-01-01

    Confusion of drug names is one of the most common causes of drug-related medical errors. A similarity measure of drug names, "vwhtfrag", was developed to discriminate whether drug name pairs are likely to cause confusion errors and to provide information that would be helpful for avoiding such errors. The aim of the present study was to evaluate and improve vwhtfrag. First, we evaluated the correlation of vwhtfrag with the subjective similarity or error rate of drug name pairs in psychological experiments. Vwhtfrag showed a higher correlation to subjective similarity (college students: r=0.84) or error rate than did other conventional similarity measures (htco, cos1, edit). Moreover, among name pairs with the same vwhtfrag value, those whose initial character strings coincided had a higher subjective similarity than those whose end character strings coincided. Therefore, we developed a new similarity measure (vwhtfrag+), in which coincidence of initial character strings in name pairs is weighted 1.53 times more heavily than coincidence of end character strings. Vwhtfrag+ showed a higher correlation to subjective similarity than did unmodified vwhtfrag. Further studies appear warranted to examine in detail whether vwhtfrag+ has superior ability to discriminate drug name pairs likely to cause confusion errors.
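    A toy illustration (not the vwhtfrag algorithm itself, whose definition is not given here) of the weighting idea behind vwhtfrag+: agreement at the start of two names is counted 1.53 times as heavily as agreement at the end. The function and example name pairs are hypothetical.

```python
def common_prefix_len(a: str, b: str) -> int:
    n = 0
    for x, y in zip(a, b):
        if x != y:
            break
        n += 1
    return n

def toy_weighted_similarity(a: str, b: str, head_weight: float = 1.53) -> float:
    """Toy similarity: weight shared prefix 1.53x over shared suffix."""
    prefix = common_prefix_len(a, b)
    suffix = common_prefix_len(a[::-1], b[::-1])
    max_len = max(len(a), len(b))
    return (head_weight * prefix + suffix) / ((head_weight + 1) * max_len)

print(toy_weighted_similarity("taxol", "taxotere"))
print(toy_weighted_similarity("celebrex", "cerebyx"))
```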

  3. Convergence Rates and Explicit Error Bounds of Hill's Method for Spectra of Self-Adjoint Differential Operators

    CERN Document Server

    Tanaka, Ken'ichiro

    2012-01-01

    We present the convergence rates and explicit error bounds of Hill's method, a numerical method for computing the spectra of ordinary differential operators with periodic coefficients. The method approximates the operator by a finite-dimensional matrix. On the assumption that the operator is self-adjoint, it is shown that, under some conditions, we can obtain the convergence rates of the eigenvalues with respect to the dimension together with explicit error bounds. Numerical examples demonstrate that these conditions can be verified using Gershgorin's theorem for some real problems. The main theorems are proved using Dunford integrals, which project onto the corresponding eigenspaces.

  4. Step angles to reduce the north-finding error caused by rate random walk with fiber optic gyroscope.

    Science.gov (United States)

    Wang, Qin; Xie, Jun; Yang, Chuanchuan; He, Changhong; Wang, Xinyue; Wang, Ziyu

    2015-10-20

    We study the relationship between the step angles and the accuracy of north finding with fiber optic gyroscopes. A north-finding method with optimized step angles is proposed to reduce the errors caused by rate random walk (RRW). Based on this method, the errors caused by both angle random walk and RRW are reduced by increasing the number of positions. When the number of positions is even, we propose a north-finding method with symmetric step angles that reduces the error caused by RRW and is not affected by the azimuth angle. Experimental results show that, compared with the traditional north-finding method, the proposed methods with the optimized step angles and the symmetric step angles can reduce the north-finding errors by 67.5% and 62.5%, respectively. The method with symmetric step angles is not affected by the azimuth angle and offers consistently high accuracy for any azimuth angle.

  5. Development of an improved HRA method: A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, J.H.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [John Wreathall & Co., Dublin, OH (United States)] [and others

    1996-03-01

    Probabilistic risk assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission - those errors associated with inappropriate interventions by operators in operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification.

  6. Human Error Probabilites (HEPs) for generic tasks and Performance Shaping Factors (PSFs) selected for railway operations

    DEFF Research Database (Denmark)

    Thommesen, Jacob; Andersen, Henning Boje

    at task level, which can be performed with fewer resources than a more detailed analysis of specific errors for each task. The generic tasks are presented with estimated Human Error Probabilities (HEPs) based on and extrapolated from the HRA literature, and estimates are compared with samples of measures...... on estimates derived from industries other than rail and the general warning that a task-based analysis is less precise than an error-based one. The authors recommend that estimates be adjusted to actual measures of task failures when feasible....... collaboration with Banedanmark. The estimates provided are based on the HRA literature and primarily the HEART method, which has recently been adapted for railway tasks by the British Rail Safety and Standards Board (RSSB). The method presented in this report differs from the RSSB tool by supporting an analysis

  7. Considering the role of time budgets on copy-error rates in material culture traditions: an experimental assessment.

    Science.gov (United States)

    Schillinger, Kerstin; Mesoudi, Alex; Lycett, Stephen J

    2014-01-01

    Ethnographic research highlights that there are constraints placed on the time available to produce cultural artefacts in differing circumstances. Given that copying error, or cultural 'mutation', can have important implications for the evolutionary processes involved in material culture change, it is essential to explore empirically how such 'time constraints' affect patterns of artefactual variation. Here, we report an experiment that systematically tests whether, and how, varying time constraints affect shape copying error rates. A total of 90 participants copied the shape of a 3D 'target handaxe form' using a standardized foam block and a plastic knife. Three distinct 'time conditions' were examined, whereupon participants had either 20, 15, or 10 minutes to complete the task. One aim of this study was to determine whether reducing production time produced a proportional increase in copy error rates across all conditions, or whether the concept of a task specific 'threshold' might be a more appropriate manner to model the effect of time budgets on copy-error rates. We found that mean levels of shape copying error increased when production time was reduced. However, there were no statistically significant differences between the 20 minute and 15 minute conditions. Significant differences were only obtained between conditions when production time was reduced to 10 minutes. Hence, our results more strongly support the hypothesis that the effects of time constraints on copying error are best modelled according to a 'threshold' effect, below which mutation rates increase more markedly. Our results also suggest that 'time budgets' available in the past will have generated varying patterns of shape variation, potentially affecting spatial and temporal trends seen in the archaeological record. Hence, 'time-budgeting' factors need to be given greater consideration in evolutionary models of material culture change.

  8. Controlling Type I Error Rate in Evaluating Differential Item Functioning for Four DIF Methods: Use of Three Procedures for Adjustment of Multiple Item Testing

    Science.gov (United States)

    Kim, Jihye

    2010-01-01

    In DIF studies, a Type I error refers to the mistake of identifying non-DIF items as DIF items, and a Type I error rate refers to the proportion of Type I errors in a simulation study. The possibility of making a Type I error in DIF studies is always present and high possibility of making such an error can weaken the validity of the assessment.…

  9. Error-rate estimation in discriminant analysis of non-linear longitudinal data: A comparison of resampling methods.

    Science.gov (United States)

    de la Cruz, Rolando; Fuentes, Claudio; Meza, Cristian; Núñez-Antón, Vicente

    2016-07-08

    Consider longitudinal observations across different subjects such that the underlying distribution is determined by a non-linear mixed-effects model. In this context, we look at the misclassification error rate for allocating future subjects using cross-validation, bootstrap algorithms (parametric bootstrap, leave-one-out, .632 and [Formula: see text]), and bootstrap cross-validation (which combines the first two approaches), and conduct a numerical study to compare the performance of the different methods. The simulation and comparisons in this study are motivated by real observations from a pregnancy study in which one of the main objectives is to predict normal versus abnormal pregnancy outcomes based on information gathered at early stages. Since in this type of studies it is not uncommon to have insufficient data to simultaneously solve the classification problem and estimate the misclassification error rate, we put special attention to situations when only a small sample size is available. We discuss how the misclassification error rate estimates may be affected by the sample size in terms of variability and bias, and examine conditions under which the misclassification error rate estimates perform reasonably well.
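    A minimal sketch (using a plain logistic classifier rather than the paper's non-linear mixed-effects setting) of two of the resampling estimators compared above, k-fold cross-validation and the .632 bootstrap, run on a deliberately small simulated sample.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.utils import resample

rng = np.random.default_rng(0)
n = 40                                    # deliberately small sample
X = rng.normal(size=(n, 3))
y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)

clf = LogisticRegression(max_iter=1000)
cv_error = 1 - cross_val_score(clf, X, y, cv=5).mean()

# .632 bootstrap: blend the apparent (resubstitution) error with the
# out-of-bag error averaged over bootstrap resamples.
apparent = 1 - clf.fit(X, y).score(X, y)
oob_errors = []
for _ in range(200):
    idx = resample(np.arange(n))
    oob = np.setdiff1d(np.arange(n), idx)
    if len(oob) == 0 or len(np.unique(y[idx])) < 2:
        continue
    fit = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    oob_errors.append(1 - fit.score(X[oob], y[oob]))
boot632 = 0.368 * apparent + 0.632 * np.mean(oob_errors)

print(f"5-fold CV error: {cv_error:.3f}   .632 bootstrap error: {boot632:.3f}")
```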

  10. Maximum type I error rate inflation from sample size reassessment when investigators are blind to treatment labels.

    Science.gov (United States)

    Żebrowska, Magdalena; Posch, Martin; Magirr, Dominic

    2016-05-30

    Consider a parallel group trial for the comparison of an experimental treatment to a control, where the second-stage sample size may depend on the blinded primary endpoint data as well as on additional blinded data from a secondary endpoint. For the setting of normally distributed endpoints, we demonstrate that this may lead to an inflation of the type I error rate if the null hypothesis holds for the primary but not the secondary endpoint. We derive upper bounds for the inflation of the type I error rate, both for trials that employ random allocation and for those that use block randomization. We illustrate the worst-case sample size reassessment rule in a case study. For both randomization strategies, the maximum type I error rate increases with the effect size in the secondary endpoint and the correlation between endpoints. The maximum inflation increases with smaller block sizes if information on the block size is used in the reassessment rule. Based on our findings, we do not question the well-established use of blinded sample size reassessment methods with nuisance parameter estimates computed from the blinded interim data of the primary endpoint. However, we demonstrate that the type I error rate control of these methods relies on the application of specific, binding, pre-planned and fully algorithmic sample size reassessment rules and does not extend to general or unplanned sample size adjustments based on blinded data. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  11. Finding the right coverage : The impact of coverage and sequence quality on single nucleotide polymorphism genotyping error rates

    NARCIS (Netherlands)

    Fountain, Emily D.; Pauli, Jonathan N.; Reid, Brendan N.; Palsboll, Per J.; Peery, M. Zachariah

    2016-01-01

    Restriction-enzyme-based sequencing methods enable the genotyping of thousands of single nucleotide polymorphism (SNP) loci in nonmodel organisms. However, in contrast to traditional genetic markers, genotyping error rates in SNPs derived from restriction-enzyme-based methods remain largely unknown.

  12. El error en la práctica médica: una presencia ineludible Human error in medical practice: an unavoidable presence

    Directory of Open Access Journals (Sweden)

    Gladis Adriana Vélez Álvarez

    2006-01-01

    Full Text Available Making mistakes is a human characteristic and a mechanism for learning, but at the same time it may become a threat to human beings in some scenarios; aviation and medicine are good examples of this. Some data are presented about the frequency of error in medicine, its ubiquity and the circumstances that favor it. A reflection is offered on how error has been dealt with and why it is not discussed more openly. It is proposed that the first step in learning from an error is to accept it as an unavoidable presence.

  13. On the Symbol Error Rate of M-ary MPSK over Generalized Fading Channels with Additive Laplacian Noise

    KAUST Repository

    Soury, Hamza

    2015-01-07

    This work considers the symbol error rate of M-ary phase shift keying (MPSK) constellations over extended Generalized-K fading with Laplacian noise and using a minimum distance detector. A generic closed form expression of the conditional and the average probability of error is obtained and simplified in terms of the Fox’s H function. More simplifications to well known functions for some special cases of fading are also presented. Finally, the mathematical formalism is validated with some numerical results examples done by computer based simulations [1].

  14. On the symbol error rate of M-ary MPSK over generalized fading channels with additive Laplacian noise

    KAUST Repository

    Soury, Hamza

    2014-06-01

    This paper considers the symbol error rate of M-ary phase shift keying (MPSK) constellations over extended Generalized-K fading with Laplacian noise and using a minimum distance detector. A generic closed form expression of the conditional and the average probability of error is obtained and simplified in terms of the Fox's H function. More simplifications to well known functions for some special cases of fading are also presented. Finally, the mathematical formalism is validated with some numerical results examples done by computer based simulations. © 2014 IEEE.

  15. Sharp Threshold Detection Based on Sup-norm Error rates in High-dimensional Models

    DEFF Research Database (Denmark)

    Callot, Laurent; Caner, Mehmet; Kock, Anders Bredahl

    focused almost exclusively on estimation errors in stronger norms. We show that this sup-norm bound can be used to distinguish between zero and non-zero coefficients at a much finer scale than would have been possible using classical oracle inequalities. Thus, our sup-norm bound is tailored to consistent...

  16. Sharp threshold detection based on sup-norm error rates in high-dimensional models

    DEFF Research Database (Denmark)

    Callot, Laurent; Caner, Mehmet; Kock, Anders Bredahl

    2017-01-01

    almost exclusively on ℓ1 and ℓ2 estimation errors. We show that this sup-norm bound can be used to distinguish between zero and non-zero coefficients at a much finer scale than would have been possible using classical oracle inequalities. Thus, our sup-norm bound is tailored to consistent variable...

  17. Methylphenidate improves diminished error and feedback sensitivity in ADHD : An Evoked Heart Rate analysis

    NARCIS (Netherlands)

    Groen, Yvonne; Mulder, Lambertus J. M.; Wijers, Albertus A.; Minderaa, Ruud B.; Althaus, Monika

    2009-01-01

    Attention Deficit Hyperactivity Disorder (ADHD) is a developmental disorder that has previously been related to a decreased sensitivity to errors and feedback. Supplementary to the traditional performance measures, this study uses autonomic measures to study this decreased sensitivity in ADHD and th

  18. Determination of corrosion rate of reinforcement with a modulated guard ring electrode; analysis of errors due to lateral current distribution

    Energy Technology Data Exchange (ETDEWEB)

    Wojtas, H

    2004-07-01

    The main source of errors in measuring the corrosion rate of rebars on site is a non-uniform current distribution between the small counter electrode (CE) on the concrete surface and the large rebar network. Guard ring electrodes (GEs) are used in an attempt to confine the excitation current within a defined area. In order to better understand the functioning of modulated guard ring electrode and to assess its effectiveness in eliminating errors due to lateral spread of current signal from the small CE, measurements of the polarisation resistance performed on a concrete beam have been numerically simulated. Effect of parameters such as rebar corrosion activity, concrete resistivity, concrete cover depth and size of the corroding area on errors in the estimation of polarisation resistance of a single rebar has been examined. The results indicate that modulated GE arrangement fails to confine the lateral spread of the CE current within a constant area. Using the constant diameter of confinement for the calculation of corrosion rate may lead to serious errors when test conditions change. When high corrosion activity of rebar and/or local corrosion occur, the use of the modulated GE confinement may lead to significant underestimation of the corrosion rate.

  19. Dual-mass vibratory rate gyroscope with suppressed translational acceleration response and quadrature-error correction capability

    Science.gov (United States)

    Clark, William A. (Inventor); Juneau, Thor N. (Inventor); Lemkin, Mark A. (Inventor); Roessig, Allen W. (Inventor)

    2001-01-01

    A microfabricated vibratory rate gyroscope to measure rotation includes two proof-masses mounted in a suspension system anchored to a substrate. The suspension has two principal modes of compliance, one of which is driven into oscillation. The driven oscillation combined with rotation of the substrate about an axis perpendicular to the substrate results in Coriolis acceleration along the other mode of compliance, the sense-mode. The sense-mode is designed to respond to Coriolis acceleration while suppressing the response to translational acceleration. This is accomplished using one or more rigid levers connecting the two proof-masses. The lever allows the proof-masses to move in opposite directions in response to Coriolis acceleration. The invention includes a means for canceling errors, termed quadrature error, due to imperfections in implementation of the sensor. Quadrature-error cancellation utilizes electrostatic forces to cancel out undesired sense-axis motion in phase with drive-mode position.

  20. PS-022 Complex automated medication systems reduce medication administration error rates in an acute medical ward

    DEFF Research Database (Denmark)

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    2017-01-01

    Background Medication errors have received extensive attention in recent decades and are of significant concern to healthcare organisations globally. Medication errors occur frequently, and adverse events associated with medications are one of the largest causes of harm to hospitalised patients....... Reviews have suggested that up to 50% of the adverse events in the medication process may be preventable. Thus the medication process is an important means to improve safety. Purpose The objective of this study was to evaluate the effectiveness of two automated medication systems in reducing...... the medication administration error rate in comparison with current practice. Material and methods This was a controlled before and after study with follow-up after 7 and 14 months. The study was conducted in two acute medical hospital wards. Two automated medication systems were tested: (1) automated dispensing...

  1. DISTANCE MEASURING MODELING AND ERROR ANALYSIS OF DUAL CCD VISION SYSTEM SIMULATING HUMAN EYES AND NECK

    Institute of Scientific and Technical Information of China (English)

    Wang Xuanyin; Xiao Baoping; Pan Feng

    2003-01-01

    A dual-CCD vision system simulating the human eyes and neck (DSHEN) is put forward, and its structure and principle are introduced. The DSHEN vision system can perform movements simulating the human eyes and neck by means of four rotating joints, and can realize precise object recognition and distance measurement in all orientations. The mathematical model of the DSHEN vision system is built and its movement equation is solved. The coordinate error and measurement precision affected by the movement parameters are analyzed by means of the intersection measurement method, providing a theoretical foundation for further research on automatic object recognition and precise target tracking.

  2. Does the A-not-B error in adult pet dogs indicate sensitivity to human communication?

    Science.gov (United States)

    Kis, Anna; Topál, József; Gácsi, Márta; Range, Friederike; Huber, Ludwig; Miklósi, Adám; Virányi, Zsófia

    2012-07-01

    Recent dog-infant comparisons have indicated that the experimenter's communicative signals in object hide-and-search tasks increase the probability of perseverative (A-not-B) errors in both species (Topál et al. 2009). These behaviourally similar results, however, might reflect different mechanisms in dogs and in children. Similar errors may occur if the motor response of retrieving the object during the A trials cannot be inhibited in the B trials or if the experimenter's movements and signals toward the A hiding place in the B trials ('sham-baiting') distract the dogs' attention. In order to test these hypotheses, we tested dogs similarly to Topál et al. (2009) but eliminated the motor search in the A trials and 'sham-baiting' in the B trials. We found that neither an inability to inhibit previously rewarded motor response nor insufficiencies in their working memory and/or attention skills can explain dogs' erroneous choices. Further, we replicated the finding that dogs have a strong tendency to commit the A-not-B error after ostensive-communicative hiding and demonstrated the crucial effect of socio-communicative cues as the A-not-B error diminishes when location B is ostensively enhanced. These findings further support the hypothesis that the dogs' A-not-B error may reflect a special sensitivity to human communicative cues. Such object-hiding and search tasks provide a typical case for how susceptibility to human social signals could (mis)lead domestic dogs.

  3. Generalized additive models and Lucilia sericata growth: assessing confidence intervals and error rates in forensic entomology.

    Science.gov (United States)

    Tarone, Aaron M; Foran, David R

    2008-07-01

    Forensic entomologists use blow fly development to estimate a postmortem interval. Although accurate, fly age estimates can be imprecise for older developmental stages and no standard means of assigning confidence intervals exists. Presented here is a method for modeling growth of the forensically important blow fly Lucilia sericata, using generalized additive models (GAMs). Eighteen GAMs were created to predict the extent of juvenile fly development, encompassing developmental stage, length, weight, strain, and temperature data, collected from 2559 individuals. All measures were informative, explaining up to 92.6% of the deviance in the data, though strain and temperature exerted negligible influences. Predictions made with an independent data set allowed for a subsequent examination of error. Estimates using length and developmental stage were within 5% of true development percent during the feeding portion of the larval life cycle, while predictions for postfeeding third instars were less precise, but within expected error.
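    A minimal sketch (simulated data, not the paper's fly measurements; the pygam package is assumed to be available) of fitting a generalized additive model that predicts development percentage from smooth terms for larval length and rearing temperature.

```python
import numpy as np
from pygam import LinearGAM, s   # assumes pygam is installed

rng = np.random.default_rng(2)
n = 500
length = rng.uniform(2, 18, n)          # hypothetical larval length, mm
temperature = rng.uniform(20, 33, n)    # hypothetical rearing temperature, deg C
dev_pct = (100 / (1 + np.exp(-(length - 10) / 2))   # smooth, saturating growth
           + 0.3 * (temperature - 26)                # weak temperature effect
           + rng.normal(0, 3, n))                    # measurement noise

X = np.column_stack([length, temperature])
gam = LinearGAM(s(0) + s(1)).fit(X, dev_pct)         # one smooth term per predictor
pred = gam.predict(X)
print("in-sample R^2:", round(1 - np.var(dev_pct - pred) / np.var(dev_pct), 3))
```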

  4. Error rates, PCR recombination, and sampling depth in HIV-1 whole genome deep sequencing.

    Science.gov (United States)

    Zanini, Fabio; Brodin, Johanna; Albert, Jan; Neher, Richard A

    2016-12-27

    Deep sequencing is a powerful and cost-effective tool to characterize the genetic diversity and evolution of virus populations. While modern sequencing instruments readily cover viral genomes many thousand fold and very rare variants can in principle be detected, sequencing errors, amplification biases, and other artifacts can limit sensitivity and complicate data interpretation. For this reason, the number of studies using whole genome deep sequencing to characterize viral quasi-species in clinical samples is still limited. We have previously undertaken a large scale whole genome deep sequencing study of HIV-1 populations. Here we discuss the challenges, error profiles, control experiments, and computational test we developed to quantify the accuracy of variant frequency estimation.

  5. Error Baseline Rates of Five Sequencing Strategies Used for RNA Virus Population Characterization

    Science.gov (United States)

    2017-01-31

    viral evolution, including the emergence of resistance to medical countermeasures. To explore the sources of error in the determination of the...pressure on evolution of viral genotypes and phenotypes, optimizing vaccine design, and identifying virus genome mutations that may lead to...NGS) technologies have had a dramatic impact on the experimental analysis of viral genetic diversity. With NGS, a virus population's genomic

  6. Assessment of error rates in acoustic monitoring with the R package monitoR

    Science.gov (United States)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    Detecting population-scale reactions to climate change and land-use change may require monitoring many sites for many years, a process that is suited for an automated system. We developed and tested monitoR, an R package for long-term, multi-taxa acoustic monitoring programs. We tested monitoR with two northeastern songbird species: black-throated green warbler (Setophaga virens) and ovenbird (Seiurus aurocapilla). We compared detection results from monitoR in 52 10-minute surveys recorded at 10 sites in Vermont and New York, USA to a subset of songs identified by a human that were of a single song type and had visually identifiable spectrograms (e.g. a signal:noise ratio of at least 10 dB: 166 out of 439 total songs for black-throated green warbler, 502 out of 990 total songs for ovenbird). monitoR’s automated detection process uses a ‘score cutoff’, which is the minimum match needed for an unknown event to be considered a detection and results in a true positive, true negative, false positive or false negative detection. At the chosen score cut-offs, monitoR correctly identified presence for black-throated green warbler and ovenbird in 64% and 72% of the 52 surveys using binary point matching, respectively, and 73% and 72% of the 52 surveys using spectrogram cross-correlation, respectively. Of individual songs, 72% of black-throated green warbler songs and 62% of ovenbird songs were identified by binary point matching. Spectrogram cross-correlation identified 83% of black-throated green warbler songs and 66% of ovenbird songs. False positive rates were  for song event detection.

  7. Is human muscle spindle afference dependent on perceived size of error in visual tracking?

    Science.gov (United States)

    Kakuda, N; Wessberg, J; Vallbo, A B

    1997-04-01

    Impulses of 16 muscle spindle afferents from finger extensor muscles were recorded from the radial nerve along with electromyographic (EMG) activity and kinematics of joint movement. Twelve units were classified as Ia and 4 as II spindle afferents. Subjects were requested to perform precision movements at a single metacarpophalangeal joint in an indirect visual tracking task. Similar movements were executed under two different conditions, i.e. with high and low error gain. The purpose was to explore whether different precision demands were associated with different spindle firing rates. With high error gain, a small but significantly higher impulse rate was found in pooled data from Ia afferents during lengthening movements but not during shortening movements, nor with II afferents. EMG was also significantly higher with high error gain in recordings with Ia afferents. When the effect of EMG was factored out, using partial correlation analysis, the significant difference in Ia firing rate vanished. The findings suggest that fusimotor drive as well as skeletomotor activity were both marginally higher when the precision demand was higher, whereas no indication of independent fusimotor adjustments was found. These results are discussed with respect to data from behaving animals and the role of fusimotor independence in various limb muscles proposed.

  8. Metabolically Derived Human Ventilation Rates: A Revised Approach Based Upon Oxygen Consumption Rates (Final Report, 2009)

    Science.gov (United States)

    EPA announced the availability of the final report, Metabolically Derived Human Ventilation Rates: A Revised Approach Based Upon Oxygen Consumption Rates. This report provides a revised approach for calculating an individual's ventilation rate directly from their oxygen c...

  9. Bit-error-rate testing of high-power 30-GHz traveling-wave tubes for ground-terminal applications

    Science.gov (United States)

    Shalkhauser, Kurt A.

    1987-01-01

    Tests were conducted at NASA Lewis to measure the bit-error-rate performance of two 30-GHz 200-W coupled-cavity traveling-wave tubes (TWTs). The transmission effects of each TWT on a band-limited 220-Mbit/s SMSK signal were investigated. The tests relied on the use of a recently developed digital simulation and evaluation system constructed at Lewis as part of the 30/20-GHz technology development program. This paper describes the approach taken to test the 30-GHz tubes and discusses the test data. A description of the bit-error-rate measurement system and the adaptations needed to facilitate TWT testing are also presented.

  10. Effect of Vertical Rate Error on Recovery from Loss of Well Clear Between UAS and Non-Cooperative Intruders

    Science.gov (United States)

    Cone, Andrew; Thipphavong, David; Lee, Seung Man; Santiago, Confesor

    2016-01-01

    When an Unmanned Aircraft System (UAS) encounters an intruder and is unable to maintain required temporal and spatial separation between the two vehicles, it is referred to as a loss of well-clear. In this state, the UAS must make its best attempt to regain separation while maximizing the minimum separation between itself and the intruder. When encountering a non-cooperative intruder (an aircraft operating under visual flight rules without ADS-B or an active transponder) the UAS must rely on the radar system to provide the intruder's location, velocity, and heading information. As many UAS have limited climb and descent performance, vertical position and/or vertical rate errors make it difficult to determine whether an intruder will pass above or below them. To account for that, there is a proposal by RTCA Special Committee 228 to prohibit guidance systems from providing vertical guidance to regain well-clear to UAS in an encounter with a non-cooperative intruder unless their radar system has vertical position error below 175 feet (95%) and vertical velocity errors below 200 fpm (95%). Two sets of fast-time parametric studies were conducted, each with 54000 pairwise encounters between a UAS and non-cooperative intruder, to determine the suitability of offering vertical guidance to regain well-clear to a UAS in the presence of radar sensor noise. The UAS was not allowed to maneuver until it received well-clear recovery guidance. The maximum severity of the loss of well-clear was logged and used as the primary indicator of the separation achieved by the UAS. One set of 54000 encounters allowed the UAS to maneuver either vertically or horizontally, while the second permitted horizontal maneuvers only. Comparing the two data sets allowed researchers to see the effect of allowing vertical guidance to a UAS for a particular encounter and vertical rate error. Study results show there is a small reduction in the average severity of a loss of well-clear when vertical maneuvers

  11. Results of a nuclear power plant application of A New Technique for Human Error Analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Whitehead, D.W.; Forester, J.A. [Sandia National Labs., Albuquerque, NM (United States); Bley, D.C. [Buttonwood Consulting, Inc. (United States)] [and others

    1998-03-01

    A new method to analyze human errors has been demonstrated at a pressurized water reactor (PWR) nuclear power plant. This was the first application of the new method referred to as A Technique for Human Error Analysis (ATHEANA). The main goals of the demonstration were to test the ATHEANA process as described in the frame-of-reference manual and the implementation guideline, test a training package developed for the method, test the hypothesis that plant operators and trainers have significant insight into the error-forcing-contexts (EFCs) that can make unsafe actions (UAs) more likely, and to identify ways to improve the method and its documentation. A set of criteria to evaluate the success of the ATHEANA method as used in the demonstration was identified. A human reliability analysis (HRA) team was formed that consisted of an expert in probabilistic risk assessment (PRA) with some background in HRA (not ATHEANA) and four personnel from the nuclear power plant. Personnel from the plant included two individuals from their PRA staff and two individuals from their training staff. Both individuals from training are currently licensed operators and one of them was a senior reactor operator on shift until a few months before the demonstration. The demonstration was conducted over a 5-month period and was observed by members of the Nuclear Regulatory Commission's ATHEANA development team, who also served as consultants to the HRA team when necessary. Example results of the demonstration to date, including identified human failure events (HFEs), UAs, and EFCs are discussed. Also addressed is how simulator exercises are used in the ATHEANA demonstration project.

  12. Single Event Test Methodologies and System Error Rate Analysis for Triple Modular Redundant Field Programmable Gate Arrays

    Science.gov (United States)

    Allen, Gregory; Edmonds, Larry D.; Swift, Gary; Carmichael, Carl; Tseng, Chen Wei; Heldt, Kevin; Anderson, Scott Arlo; Coe, Michael

    2010-01-01

    We present a test methodology for estimating system error rates of Field Programmable Gate Arrays (FPGAs) mitigated with Triple Modular Redundancy (TMR). The test methodology is founded in a mathematical model, which is also presented. Accelerator data from a 90 nm Xilinx Military/Aerospace grade FPGA are shown to fit the model. Fault injection (FI) results are discussed and related to the test data. Design implementation and the corresponding impact of multiple bit upset (MBU) are also discussed.
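    An illustrative back-of-the-envelope calculation (not the paper's model) of why TMR lowers the system error rate: with an assumed per-module upset probability p over some exposure window and a perfect voter, the voted output is wrong only when at least two of the three modules are upset.

```python
def tmr_failure_prob(p: float) -> float:
    """Probability that a 2-of-3 voted output fails, given per-module upset probability p."""
    return 3 * p**2 * (1 - p) + p**3

for p in (1e-2, 1e-4, 1e-6):
    print(f"module p={p:.0e}  unmitigated={p:.1e}  TMR={tmr_failure_prob(p):.1e}")
```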

  13. Bit Error Rate Due to Misalignment of Earth Station Antenna Pointing to Satellite

    Directory of Open Access Journals (Sweden)

    Wahyu Pamungkas

    2010-04-01

    Full Text Available One problem causing a loss of energy in satellite communication systems is misalignment of the earth station antenna pointing to the satellite. A pointing error degrades the energy per bit of the information signal received at the earth station. In this research, the pointing angle error occurred only at the receive (Rx) antenna, while the transmit (Tx) antenna pointed precisely to the satellite. The research was conducted with two satellites, namely TELKOM-1 and TELKOM-2. First, a measurement was made by directing the Tx antenna precisely to the satellite, producing an antenna pattern shown on a spectrum analyzer. The spectrum analyzer output is drawn to scale to describe the drift of the azimuth and elevation pointing angles towards the satellite. Drifting away from precise pointing influenced the received link budget, as indicated by the antenna pattern, which shows a reduction of the received power level as a result of pointing misalignment. In conclusion, increasing misalignment of pointing to the satellite reduces the received signal parameters in the link budget of the downlink traffic.
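    An illustrative sketch (not taken from the paper) of how a pointing offset translates into lost link margin, using the common parabolic-antenna approximation that the pointing loss in dB is roughly 12·(offset/half-power beamwidth)²; the half-power beamwidth value below is assumed, and the lost decibels come straight out of the link's Eb/N0.

```python
def pointing_loss_db(offset_deg: float, hpbw_deg: float) -> float:
    """Approximate pointing loss in dB for a parabolic antenna."""
    return 12.0 * (offset_deg / hpbw_deg) ** 2

hpbw = 0.4   # assumed half-power beamwidth of the earth station antenna, degrees
for offset in (0.05, 0.1, 0.2):
    print(f"offset {offset:.2f} deg -> pointing loss {pointing_loss_db(offset, hpbw):.2f} dB")
```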

  14. Multidisciplinary framework for human reliability analysis with an application to errors of commission and dependencies

    Energy Technology Data Exchange (ETDEWEB)

    Barriere, M.T.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States); Cooper, S.E. [Science Applications International Corp., Reston, VA (United States); Bley, D.C. [PLG, Inc., Newport Beach, CA (United States); Ramey-Smith, A. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-08-01

    Since the early 1970s, human reliability analysis (HRA) has been considered to be an integral part of probabilistic risk assessments (PRAs). Nuclear power plant (NPP) events, from Three Mile Island through the mid-1980s, showed the importance of human performance to NPP risk. Recent events demonstrate that human performance continues to be a dominant source of risk. In light of these observations, the current limitations of existing HRA approaches become apparent when the role of humans is examined explicitly in the context of real NPP events. The development of new or improved HRA methodologies to more realistically represent human performance is recognized by the Nuclear Regulatory Commission (NRC) as a necessary means to increase the utility of PRAS. To accomplish this objective, an Improved HRA Project, sponsored by the NRC`s Office of Nuclear Regulatory Research (RES), was initiated in late February, 1992, at Brookhaven National Laboratory (BNL) to develop an improved method for HRA that more realistically assesses the human contribution to plant risk and can be fully integrated with PRA. This report describes the research efforts including the development of a multidisciplinary HRA framework, the characterization and representation of errors of commission, and an approach for addressing human dependencies. The implications of the research and necessary requirements for further development also are discussed.

  15. Bit-error-rate testing of fiber optic data links for MMIC-based phased array antennas

    Science.gov (United States)

    Shalkhauser, K. A.; Kunath, R. R.; Daryoush, A. S.

    1990-01-01

    The measured bit-error-rate (BER) performance of a fiber optic data link to be used in satellite communications systems is presented and discussed. In the testing, the link was measured for its ability to carry high burst rate, serial-minimum shift keyed (SMSK) digital data similar to those used in actual space communications systems. The fiber optic data link, as part of a dual-segment injection-locked RF fiber optic link system, offers a means to distribute these signals to the many radiating elements of a phased array antenna. Test procedures, experimental arrangements, and test results are presented.

  16. Optimal classifier selection and negative bias in error rate estimation: an empirical study on high-dimensional prediction

    Directory of Open Access Journals (Sweden)

    Boulesteix Anne-Laure

    2009-12-01

    Full Text Available Abstract Background In biometric practice, researchers often apply a large number of different methods in a "trial-and-error" strategy to get as much as possible out of their data and, due to publication pressure or pressure from the consulting customer, present only the most favorable results. This strategy may induce a substantial optimistic bias in prediction error estimation, which is quantitatively assessed in the present manuscript. The focus of our work is on class prediction based on high-dimensional data (e.g. microarray data), since such analyses are particularly exposed to this kind of bias. Methods In our study we consider a total of 124 variants of classifiers (possibly including variable selection or tuning steps) within a cross-validation evaluation scheme. The classifiers are applied to original and modified real microarray data sets, some of which are obtained by randomly permuting the class labels to mimic non-informative predictors while preserving their correlation structure. Results We assess the minimal misclassification rate over the different variants of classifiers in order to quantify the bias arising when the optimal classifier is selected a posteriori in a data-driven manner. The bias resulting from the parameter tuning (including gene selection parameters as a special case) and the bias resulting from the choice of the classification method are examined both separately and jointly. Conclusions The median minimal error rate over the investigated classifiers was as low as 31% and 41% based on permuted uninformative predictors from studies on colon cancer and prostate cancer, respectively. We conclude that the strategy to present only the optimal result is not acceptable because it yields a substantial bias in error rate estimation, and suggest alternative approaches for properly reporting classification accuracy.
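    A much smaller sketch of the bias mechanism described above (a handful of classifiers instead of the paper's 124 variants): on permuted, uninformative labels the true error rate is 50%, yet reporting only the best cross-validated classifier yields a visibly lower number.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 500))            # high-dimensional, microarray-like data
y = rng.permutation([0, 1] * 30)          # permuted labels carry no information

classifiers = [LogisticRegression(max_iter=1000),
               KNeighborsClassifier(3),
               KNeighborsClassifier(7),
               DecisionTreeClassifier(max_depth=3)]
errors = [1 - cross_val_score(c, X, y, cv=5).mean() for c in classifiers]
print("true error: 0.50   mean CV error: %.2f   minimum ('best') CV error: %.2f"
      % (np.mean(errors), np.min(errors)))
```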

  17. On the Orientation Error of IMU: Investigating Static and Dynamic Accuracy Targeting Human Motion.

    Science.gov (United States)

    Ricci, Luca; Taffoni, Fabrizio; Formica, Domenico

    2016-01-01

    The accuracy in orientation tracking attainable by using inertial measurement units (IMU) when measuring human motion is still an open issue. This study presents a systematic quantification of the accuracy under static conditions and typical human dynamics, simulated by means of a robotic arm. Two sensor fusion algorithms, selected from the classes of the stochastic and complementary methods, are considered. The proposed protocol implements controlled and repeatable experimental conditions and validates accuracy for an extensive set of dynamic movements, that differ in frequency and amplitude of the movement. We found that dynamic performance of the tracking is only slightly dependent on the sensor fusion algorithm. Instead, it is dependent on the amplitude and frequency of the movement and a major contribution to the error derives from the orientation of the rotation axis w.r.t. the gravity vector. Absolute and relative errors upper bounds are found respectively in the range [0.7° ÷ 8.2°] and [1.0° ÷ 10.3°]. Alongside dynamic, static accuracy is thoroughly investigated, also with an emphasis on convergence behavior of the different algorithms. Reported results emphasize critical issues associated with the use of this technology and provide a baseline level of performance for the human motion related application.
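    A minimal sketch of a complementary filter, a representative of one of the two algorithm classes mentioned above (all signals below are simulated and hypothetical): it blends the integrated gyroscope rate, accurate over short intervals, with the accelerometer-derived angle, which is noisy but drift-free.

```python
import numpy as np

dt, alpha = 0.01, 0.98                     # sample period (s) and filter constant
t = np.arange(0, 10, dt)
true_angle = 20 * np.sin(2 * np.pi * 0.5 * t)                      # degrees
gyro = np.gradient(true_angle, dt) + np.random.default_rng(4).normal(0, 2, t.size)
accel_angle = true_angle + np.random.default_rng(5).normal(0, 1.5, t.size)

est = np.zeros_like(t)
for k in range(1, t.size):
    # Blend integrated gyro rate (good short-term) with accelerometer angle
    # (good long-term, no drift).
    est[k] = alpha * (est[k - 1] + gyro[k] * dt) + (1 - alpha) * accel_angle[k]

print("RMS orientation error (deg):", round(float(np.sqrt(np.mean((est - true_angle) ** 2))), 2))
```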

  18. On the Orientation Error of IMU: Investigating Static and Dynamic Accuracy Targeting Human Motion

    Science.gov (United States)

    Ricci, Luca; Taffoni, Fabrizio

    2016-01-01

    The accuracy in orientation tracking attainable by using inertial measurement units (IMU) when measuring human motion is still an open issue. This study presents a systematic quantification of the accuracy under static conditions and typical human dynamics, simulated by means of a robotic arm. Two sensor fusion algorithms, selected from the classes of the stochastic and complementary methods, are considered. The proposed protocol implements controlled and repeatable experimental conditions and validates accuracy for an extensive set of dynamic movements, that differ in frequency and amplitude of the movement. We found that dynamic performance of the tracking is only slightly dependent on the sensor fusion algorithm. Instead, it is dependent on the amplitude and frequency of the movement and a major contribution to the error derives from the orientation of the rotation axis w.r.t. the gravity vector. Absolute and relative errors upper bounds are found respectively in the range [0.7° ÷ 8.2°] and [1.0° ÷ 10.3°]. Alongside dynamic, static accuracy is thoroughly investigated, also with an emphasis on convergence behavior of the different algorithms. Reported results emphasize critical issues associated with the use of this technology and provide a baseline level of performance for the human motion related application. PMID:27612100

  19. Determining The Factors Causing Human Error Deficiencies At A Public Utility Company

    Directory of Open Access Journals (Sweden)

    F. W. Badenhorst

    2004-11-01

    Full Text Available According to Neff (1977), as cited by Bergh (1995), westernised culture considers work important for industrial mental health. Most individuals experience work positively, which creates a positive attitude. Should this positive attitude be inhibited, workers could lose concentration and become bored, potentially resulting in some form of human error. The aim of this research was to determine the factors responsible for human error events which lead to power supply failures at Eskom power stations. Proposals were made for reducing these contributing factors in order to improve plant performance. The target population was 700 panel operators in Eskom's Power Generation Group. The results showed that factors leading to human error can be reduced or even eliminated.

  20. A human error taxonomy for analysing healthcare incident reports: assessing reporting culture and its effects on safety performance

    DEFF Research Database (Denmark)

    Itoh, Kenji; Omata, N.; Andersen, Henning Boje

    2009-01-01

    The present paper reports on a human error taxonomy system developed for healthcare risk management and on its application to evaluating safety performance and reporting culture. The taxonomy comprises dimensions for classifying errors, for performance-shaping factors, and for the maturity...

  1. Capacity Versus Bit Error Rate Trade-Off in the DVB-S2 Forward Link

    Directory of Open Access Journals (Sweden)

    Matteo Berioli

    2007-05-01

    Full Text Available The paper presents an approach to optimize the use of satellite capacity in DVB-S2 forward links. By reducing the so-called safety margins in the adaptive coding and modulation technique, it is possible to increase the spectral efficiency at the expense of an increased BER on the transmission. The work shows how a system can be tuned to operate at different degrees of this trade-off, and also the performance that can be achieved in terms of BER/PER, spectral efficiency, and the interarrival time, duration, and strength of the error bursts. The paper also describes how a Markov chain can be used to model the ModCod transitions in a DVB-S2 system, and it presents results for the calculation of the transition probabilities in two cases.
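    The record above models ModCod transitions as a Markov chain. The sketch below shows, on a purely illustrative three-state transition matrix (the paper estimates the actual probabilities from link measurements), the generic quantities such a model yields: the steady-state occupancy of each ModCod and the per-frame probability of switching.

```python
import numpy as np

# Hypothetical 3-state ModCod Markov chain (rows sum to 1); the real
# transition probabilities would be estimated from channel traces.
P = np.array([[0.90, 0.09, 0.01],
              [0.05, 0.90, 0.05],
              [0.01, 0.09, 0.90]])

# Steady-state distribution: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()

# Probability of changing ModCod on any given frame.
switch_prob = sum(pi[i] * (1 - P[i, i]) for i in range(len(pi)))
print("steady-state occupancy:", np.round(pi, 3))
print("per-frame switching probability:", round(switch_prob, 3))
```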

  2. Capacity Versus Bit Error Rate Trade-Off in the DVB-S2 Forward Link

    Directory of Open Access Journals (Sweden)

    Berioli Matteo

    2007-01-01

    Full Text Available The paper presents an approach to optimize the use of satellite capacity in DVB-S2 forward links. By reducing the so-called safety margins in the adaptive coding and modulation technique, it is possible to increase the spectral efficiency at the expense of an increased BER on the transmission. The work shows how a system can be tuned to operate at different degrees of this trade-off, and also the performance that can be achieved in terms of BER/PER, spectral efficiency, and the interarrival time, duration, and strength of the error bursts. The paper also describes how a Markov chain can be used to model the ModCod transitions in a DVB-S2 system, and it presents results for the calculation of the transition probabilities in two cases.

  3. Error Rate Improvement in Underwater MIMO Communications Using Sparse Partial Response Equalization

    Science.gov (United States)

    2006-09-01

    $\sum_{k=1}^{n} \lambda^{n-k}\,\mathbf{v}_i(k)\,\mathbf{v}_i^{H}(k)$ (13) and $\boldsymbol{\theta}_i(n) = \sum_{k=1}^{n} \lambda^{n-k}\,\mathbf{v}_i(k)\,x_i^{(s)H}(k)$ (14) are the (time-averaged) output correlation matrix and the input-output cross-correlation vector. The a priori error [5] and the RLS gain are defined as $\alpha_i(n) = x_i^{(s)}(n) - \mathbf{c}_i^{H}(n-1)\,\mathbf{v}_i(n)$ (17) and $\mathbf{K}_i(n) = \mathbf{P}_i(n-1)\,\mathbf{v}_i(n) \big/ \big(\lambda_i + \mathbf{v}_i^{H}(n)\,\mathbf{P}_i(n-1)\,\mathbf{v}_i(n)\big)$ (18). Using equations (13), (14), and the matrix inversion lemma [5], the inverse correlation matrix $\mathbf{P}_i(n)$ can be updated as $\mathbf{P}_i(n) = \big[\mathbf{I} - \mathbf{K}_i(n)\,\mathbf{v}_i^{H}(n)\big]\,\mathbf{P}_i(n-1)$.
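    The excerpt above is the standard exponentially weighted recursive least-squares (RLS) recursion. The following is a minimal, self-contained sketch of that recursion for a single scalar output; the variable names follow the excerpt loosely, and the scalar form is an illustrative simplification of the MIMO partial-response equalizer in the report.

```python
import numpy as np

def rls(V, x, lam=0.99, delta=1e2):
    """Exponentially weighted RLS: estimate taps c so that c^H v(n) ~ x(n).
    V: (N, L) complex regressor snapshots; x: (N,) desired samples."""
    N, L = V.shape
    c = np.zeros(L, dtype=complex)
    P = delta * np.eye(L, dtype=complex)              # inverse correlation matrix
    for n in range(N):
        v = V[n]
        alpha = x[n] - np.conj(c) @ v                 # a priori error, cf. eq. (17)
        K = (P @ v) / (lam + np.conj(v) @ P @ v)      # RLS gain, cf. eq. (18)
        c = c + K * np.conj(alpha)                    # tap update
        P = (P - np.outer(K, np.conj(v) @ P)) / lam   # inverse-correlation update
    return c

# Toy usage: identify a 3-tap channel from noisy observations.
rng = np.random.default_rng(0)
c_true = np.array([0.8, -0.4 + 0.2j, 0.1])
V = rng.standard_normal((500, 3)) + 1j * rng.standard_normal((500, 3))
x = V @ np.conj(c_true) + 0.01 * rng.standard_normal(500)
print(np.round(rls(V, x), 3))
```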

  4. Formal safety assessment and application of the navigation simulators for preventing human error in ship operations

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The International Maritime Organization (IMO) has encouraged its member countries to introduce Formal Safety Assessment (FSA) for ship operations since the end of the last century. FSA can be used, through a set of formal assessment steps, to generate effective recommendations and cautions for controlling marine risks and improving the safety of ships. After a brief introduction to FSA, this paper describes the idea of applying FSA to the prevention of human error in ship operations. In particular, it discusses the collection and analysis of information and data using navigation simulators and puts forward some suggestions for the introduction and development of FSA research work for safer ship operations.

  5. Report: Human biochemical genetics: an insight into inborn errors of metabolism

    Institute of Scientific and Technical Information of China (English)

    YU Chunli; SCOTT C. Ronald

    2006-01-01

    Inborn errors of metabolism (IEM) include a broad spectrum of defects of various gene products that affect intermediary metabolism in the body. Studying the molecular and biochemical mechanisms of these inherited disorders, systematically summarizing the disease phenotypes and natural history, and providing diagnostic rationale, methodology, and treatment strategies together make up the field of human biochemical genetics. This session focused on: (1) manifestations of representative metabolic disorders; (2) the emergent technology and application of newborn screening of metabolic disorders using tandem mass spectrometry; (3) principles of managing IEM; (4) the concept of carrier testing aimed at prevention. Early detection of patients with IEM allows early intervention and more options for treatment.

  6. Structural basis of error-prone replication and stalling at a thymine base by human DNA polymerase

    Energy Technology Data Exchange (ETDEWEB)

    Kirouac, Kevin N.; Ling, Hong; (UWO)

    2009-06-30

    Human DNA polymerase iota (pol iota) is a unique member of the Y-family polymerases, which preferentially misincorporates nucleotides opposite thymines (T) and halts replication at T bases. The structural basis of the high error rates remains elusive. We present three crystal structures of pol iota complexed with DNA containing a thymine base, paired with correct or incorrect incoming nucleotides. A narrowed active site supports a pyrimidine to pyrimidine mismatch and excludes Watson-Crick base pairing by pol iota. The template thymine remains in an anti conformation irrespective of incoming nucleotides. Incoming ddATP adopts a syn conformation with reduced base stacking, whereas incorrect dGTP and dTTP maintain anti conformations with normal base stacking. Further stabilization of dGTP by H-bonding with Gln59 of the finger domain explains the preferential T to G mismatch. A template 'U-turn' is stabilized by pol iota and the methyl group of the thymine template, revealing the structural basis of T stalling. Our structural and domain-swapping experiments indicate that the finger domain is responsible for pol iota's high error rates on pyrimidines and determines the incorporation specificity.

  7. The effect of administrative boundaries and geocoding error on cancer rates in California.

    Science.gov (United States)

    Goldberg, Daniel W; Cockburn, Myles G

    2012-04-01

    Geocoding is often used to produce maps of disease rates from the diagnosis addresses of incident cases to assist with disease surveillance, prevention, and control. In this process, diagnosis addresses are converted into latitude/longitude pairs which are then aggregated to produce rates at varying geographic scales such as Census tracts, neighborhoods, cities, counties, and states. The specific techniques used within geocoding systems have an impact on where the output geocode is located and can therefore have an effect on the derivation of disease rates at different geographic aggregations. This paper investigates how county-level cancer rates are affected by the choice of interpolation method when case data are geocoded to the ZIP code level. Four commonly used areal unit interpolation techniques are applied and the output of each is used to compute crude county-level five-year incidence rates of all cancers in California. We found that the rates observed for 44 out of the 58 counties in California vary based on which interpolation method is used, with rates in some counties increasing by nearly 400% between interpolation methods.

  8. Correcting for binomial measurement error in predictors in regression with application to analysis of DNA methylation rates by bisulfite sequencing.

    Science.gov (United States)

    Buonaccorsi, John; Prochenka, Agnieszka; Thoresen, Magne; Ploski, Rafal

    2016-09-30

    Motivated by a genetic application, this paper addresses the problem of fitting regression models when the predictor is a proportion measured with error. While the problem of dealing with additive measurement error in fitting regression models has been extensively studied, the problem where the additive error is of a binomial nature has not been addressed. The measurement errors here are heteroscedastic for two reasons: dependence on the underlying true value and changing sampling effort over observations. While some of the previously developed methods for treating additive measurement error with heteroscedasticity can be used in this setting, other methods need modification. A new version of simulation extrapolation is developed, and we also explore a variation on the standard regression calibration method that uses a beta-binomial model based on the fact that the true value is a proportion. Although most of the methods introduced here can be used for fitting non-linear models, this paper will focus primarily on their use in fitting a linear model. While previous work has focused mainly on estimation of the coefficients, we will, with motivation from our example, also examine estimation of the variance around the regression line. In addressing these problems, we also discuss the appropriate manner in which to bootstrap for both inferences and bias assessment. The various methods are compared via simulation, and the results are illustrated using our motivating data, for which the goal is to relate the methylation rate of a blood sample to the age of the individual providing the sample. Copyright © 2016 John Wiley & Sons, Ltd.
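    As a toy illustration of the problem the record describes (not of the authors' correction methods), the sketch below simulates a linear model whose predictor is a true proportion observed only through binomial sampling, and shows how naive least squares attenuates the slope as the per-observation sampling effort shrinks; the sample sizes and coefficients are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
beta0, beta1 = 1.0, 4.0

p_true = rng.uniform(0.1, 0.9, n)            # true methylation-like proportions
y = beta0 + beta1 * p_true + rng.normal(0, 0.5, n)

for m in (5, 20, 200):                       # binomial sampling effort per observation
    p_obs = rng.binomial(m, p_true) / m      # error-prone proportion actually observed
    X = np.column_stack([np.ones(n), p_obs])
    slope = np.linalg.lstsq(X, y, rcond=None)[0][1]
    print(f"reads per observation = {m:>3}: naive slope ~ {slope:.2f} (true {beta1})")
```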

  9. Bit Error-Rate Minimizing Detector for Amplify-and-Forward Relaying Systems Using Generalized Gaussian Kernel

    KAUST Repository

    Ahmed, Qasim Zeeshan

    2013-01-01

    In this letter, a new detector is proposed for amplify-and-forward (AF) relaying systems communicating with the assistance of relays. The major goal of this detector is to improve the bit error rate (BER) performance of the receiver. The probability density function is estimated with the help of the kernel density technique. A generalized Gaussian kernel is proposed. This new kernel provides more flexibility and encompasses Gaussian and uniform kernels as special cases. The optimal window width of the kernel is calculated. Simulation results show that a gain of more than 1 dB can be achieved in terms of BER performance as compared to the minimum mean square error (MMSE) receiver when communicating over Rayleigh fading channels.
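    The generalized Gaussian kernel referred to above has the form K(u) ∝ exp(−|u/α|^β), which reduces to a Gaussian shape for β = 2 and approaches a uniform (boxcar) kernel as β grows. The sketch below builds an illustrative kernel density estimator on that family; the normalization constant is the standard one for this family, while the window width and shape parameter are assumed values rather than the letter's optimized choices.

```python
import numpy as np
from scipy.special import gamma

def gg_kernel(u, alpha=1.0, beta=2.0):
    """Generalized Gaussian kernel: beta=2 gives a Gaussian shape,
    large beta approaches a uniform (boxcar) kernel."""
    norm = beta / (2.0 * alpha * gamma(1.0 / beta))
    return norm * np.exp(-np.abs(u / alpha) ** beta)

def kde(x_grid, samples, h=0.3, beta=2.0):
    """Kernel density estimate of the sample pdf on x_grid with window width h."""
    u = (x_grid[:, None] - samples[None, :]) / h
    return gg_kernel(u, alpha=1.0, beta=beta).mean(axis=1) / h

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, 500)          # stand-in for received decision statistics
grid = np.linspace(-4, 4, 9)
print(np.round(kde(grid, samples, h=0.3, beta=2.0), 3))
```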

  10. Can the Misinterpretation Amendment Rate Be Used as a Measure of Interpretive Error in Anatomic Pathology?: Implications of a Survey of the Directors of Anatomic and Surgical Pathology.

    Science.gov (United States)

    Parkash, Vinita; Fadare, Oluwole; Dewar, Rajan; Nakhleh, Raouf; Cooper, Kumarasen

    2017-03-01

    A repeat survey of the Association of the Directors of Anatomic and Surgical Pathology, done 10 years after the original, was used to assess trends and variability in classifying scenarios as errors, and the preferred post-signout report modification for correcting error, by the membership of the Association of the Directors of Anatomic and Surgical Pathology. The results were analyzed to inform on whether interpretive amendment rates might act as surrogate measures of interpretive error in pathology. An analysis of the responses indicated that primary-level misinterpretations (benign to malignant and vice versa) were universally qualified as error; secondary-level misinterpretations or misclassifications were inconsistently labeled error. There was added variability in the preferred post-signout report modification used to correct report alterations. The classification of a scenario as error appeared to correlate with the severity of potential harm of the missed call, the perceived subjectivity of the diagnosis, and the ambiguity of reporting terminology. Substantial differences in policies for error detection and optimal reporting format were documented between departments. In conclusion, the inconsistency in labeling scenarios as error, disagreement about the optimal post-signout report modification for the correction of the error, and variability in error detection policies preclude the use of the misinterpretation amendment rate as a surrogate measure for error in anatomic pathology. There has been little change in the uniformity of definition, attitudes, and perception of interpretive error in anatomic pathology over the last 10 years.

  11. Human errors and violations in computer and information security: the viewpoint of network administrators and security specialists.

    Science.gov (United States)

    Kraemer, Sara; Carayon, Pascale

    2007-03-01

    This paper describes human errors and violations of end users and network administrators in computer and information security. This information is summarized in a conceptual framework for examining the human and organizational factors contributing to computer and information security. This framework includes human error taxonomies to describe the work conditions that contribute adversely to computer and information security, i.e. to security vulnerabilities and breaches. The issue of human error and violation in computer and information security was explored through a series of 16 interviews with network administrators and security specialists. The interviews were audio-taped, transcribed, and analyzed by coding specific themes in a node structure. The result is an expanded framework that classifies types of human error and identifies specific human and organizational factors that contribute to computer and information security. Network administrators tended to view errors created by end users as more intentional than unintentional, while viewing errors created by network administrators as more unintentional than intentional. Organizational factors, such as communication, security culture, policy, and organizational structure, were the most frequently cited factors associated with computer and information security.

  12. How to Cope with the Rare Human Error Events Involved with Organizational Factors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Luo, Meiling; Lee, Yong Hee [Korea Atomic Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    The current human error guidelines (e.g. US DOD handbooks, US NRC Guidelines) are representative tools to prevent human errors. These tools, however, have the limitation that they do not adapt to all operating situations and circumstances, such as design-basis events. In other words, these tools are adapted only to foreseeable, standardized operating situations and circumstances. In this study, our research team proposed an evidence-based approach, such as the UK's safety case, to cope with rare human error events such as the TMI, Chernobyl, and Fukushima accidents. These accidents are representative events involving rare human errors. Our research team defined 'rare human errors' as events with the following three characteristics: extremely low frequency; an extremely complicated structure; and extremely serious damage to human life and property. A safety case is a structured argument, supported by evidence, intended to justify that a system is acceptably safe. The definition in UK defence standard 00-56 issue 4 states that such an evidence-based approach can be contrasted with a prescriptive approach to safety certification, which requires safety to be justified using a prescribed process. Safety management and safety regulatory activities based on a safety case are effective for controlling organizational factors in terms of integrated safety management. In particular, for safety issues relevant to public acceptance, a safety case is useful for providing practical evidence to the public in a reasonable way. The European Union, including the UK, has developed the concept of an engineered safety management system to deal with public acceptance using the safety case. In the Korean nuclear industry, the Korea Atomic Energy Research Institute (KAERI) has performed a first basic study to adapt the safety case in the field of radioactive waste according to IAEA SSG-23 (KAERI/TR-4497, 4531). Apart from radioactive waste, there has been no attempt to adopt the safety case yet. Most incidents and accidents involving humans during the operation of NPPs have a tendency

  13. Error-free 5.1 Tbit/s data generation on a single-wavelength channel using a 1.28 Tbaud symbol rate

    DEFF Research Database (Denmark)

    Mulvad, Hans Christian Hansen; Galili, Michael; Oxenløwe, Leif Katsuo

    2009-01-01

    We demonstrate a record bit rate of 5.1 Tbit/s on a single wavelength using a 1.28 Tbaud OTDM symbol rate, DQPSK data-modulation, and polarisation-multiplexing. Error-free performance (BER...

  14. Soft error rate estimations of the Kintex-7 FPGA within the ATLAS Liquid Argon (LAr) Calorimeter

    Science.gov (United States)

    Wirthlin, M. J.; Takai, H.; Harding, A.

    2014-01-01

    This paper summarizes the radiation testing performed on the Xilinx Kintex-7 FPGA in an effort to determine if the Kintex-7 can be used within the ATLAS Liquid Argon (LAr) Calorimeter. The Kintex-7 device was tested with wide-spectrum neutrons, protons, heavy-ions, and mixed high-energy hadron environments. The results of these tests were used to estimate the configuration ram and block ram upset rate within the ATLAS LAr. These estimations suggest that the configuration memory will upset at a rate of 1.1 × 10^-10 upsets/bit/s and the block ram memory will upset at a rate of 9.06 × 10^-11 upsets/bit/s. For the Kintex 7K325 device, this translates to 6.85 × 10^-3 upsets/device/s for configuration memory and 1.49 × 10^-3 upsets/device/s for block memory.
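    The device-level figures above follow from multiplying the per-bit upset rate by the number of bits of each memory type in the device. The sketch below re-expresses that arithmetic and backs out the implied bit counts from the quoted rates; the bit counts are therefore derived quantities, not values taken from the paper.

```python
# Per-bit upset rates quoted in the record (upsets/bit/s).
cfg_rate_per_bit = 1.1e-10
bram_rate_per_bit = 9.06e-11

# Device-level rates quoted for the Kintex 7K325 (upsets/device/s).
cfg_rate_per_device = 6.85e-3
bram_rate_per_device = 1.49e-3

# Implied memory sizes: device rate = per-bit rate * number of bits.
cfg_bits = cfg_rate_per_device / cfg_rate_per_bit
bram_bits = bram_rate_per_device / bram_rate_per_bit
print(f"implied configuration bits: {cfg_bits:.2e}")
print(f"implied block-ram bits:     {bram_bits:.2e}")
print(f"mean time between config upsets: {1 / cfg_rate_per_device / 3600:.2f} h")
```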

  15. Procedures for using expert judgment to estimate human-error probabilities in nuclear power plant operations. [PWR; BWR

    Energy Technology Data Exchange (ETDEWEB)

    Seaver, D.A.; Stillwell, W.G.

    1983-03-01

    This report describes and evaluates several procedures for using expert judgment to estimate human-error probabilities (HEPs) in nuclear power plant operations. These HEPs are currently needed for several purposes, particularly for probabilistic risk assessments. Data do not exist for estimating these HEPs, so expert judgment can provide these estimates in a timely manner. Five judgmental procedures are described here: paired comparisons, ranking and rating, direct numerical estimation, indirect numerical estimation and multiattribute utility measurement. These procedures are evaluated in terms of several criteria: quality of judgments, difficulty of data collection, empirical support, acceptability, theoretical justification, and data processing. Situational constraints such as the number of experts available, the number of HEPs to be estimated, the time available, the location of the experts, and the resources available are discussed in regard to their implications for selecting a procedure for use.

  16. Rate Constants for Fine-structure Excitations in O-H Collisions with Error Bars Obtained by Machine Learning

    Science.gov (United States)

    Vieira, Daniel; Krems, Roman V.

    2017-02-01

    We present an approach using a combination of coupled channel scattering calculations with a machine-learning technique based on Gaussian Process regression to determine the sensitivity of the rate constants for non-adiabatic transitions in inelastic atomic collisions to variations of the underlying adiabatic interaction potentials. Using this approach, we improve the previous computations of the rate constants for the fine-structure transitions in collisions of O(^3P_j) with atomic H. We compute the error bars of the rate constants corresponding to 20% variations of the ab initio potentials and show that this method can be used to determine which of the individual adiabatic potentials are more or less important for the outcome of different fine-structure changing collisions.
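    The record uses Gaussian Process regression to propagate potential-surface uncertainty into rate-constant error bars. The sketch below shows the generic mechanics of that step only: fit a GP to a handful of (input, rate) pairs and read off a predictive mean and standard deviation. The kernel choice and the synthetic data are illustrative and unrelated to the O-H calculations.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Synthetic training data: a scalar "potential scaling" input vs. a rate-like output.
X = np.linspace(0.8, 1.2, 9).reshape(-1, 1)
y = np.exp(-3.0 * (X.ravel() - 1.0)) + 0.01 * rng.standard_normal(len(X))

gp = GaussianProcessRegressor(
    kernel=ConstantKernel(1.0) * RBF(length_scale=0.1),
    alpha=1e-4,                 # assumed observation noise variance
    normalize_y=True,
)
gp.fit(X, y)

# Predictive mean and standard deviation give the "error bar" on the interpolated rate.
X_new = np.array([[0.95], [1.05]])
mean, std = gp.predict(X_new, return_std=True)
for x, m, s in zip(X_new.ravel(), mean, std):
    print(f"input {x:.2f}: predicted rate ~ {m:.3f} +/- {s:.3f}")
```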

  17. Optimal JPWL Forward Error Correction Rate Allocation for Robust JPEG 2000 Images and Video Streaming over Mobile Ad Hoc Networks

    Science.gov (United States)

    Agueh, Max; Diouris, Jean-François; Diop, Magaye; Devaux, François-Olivier; De Vleeschouwer, Christophe; Macq, Benoit

    2008-12-01

    Based on the analysis of real mobile ad hoc network (MANET) traces, we derive in this paper an optimal wireless JPEG 2000 compliant forward error correction (FEC) rate allocation scheme for a robust streaming of images and videos over MANET. The packet-based proposed scheme has a low complexity and is compliant to JPWL, the 11th part of the JPEG 2000 standard. The effectiveness of the proposed method is evaluated using a wireless Motion JPEG 2000 client/server application; and the ability of the optimal scheme to guarantee quality of service (QoS) to wireless clients is demonstrated.

  18. Influence of beam wander on bit-error rate in a ground-to-satellite laser uplink communication system.

    Science.gov (United States)

    Ma, Jing; Jiang, Yijun; Tan, Liying; Yu, Siyuan; Du, Wenhe

    2008-11-15

    Based on weak fluctuation theory and the beam-wander model, the bit-error rate of a ground-to-satellite laser uplink communication system is analyzed, in comparison with the condition in which beam wander is not taken into account. Considering the combined effect of scintillation and beam wander, optimum divergence angle and transmitter beam radius for a communication system are researched. Numerical results show that both of them increase with the increment of total link margin and transmitted wavelength. This work can benefit the ground-to-satellite laser uplink communication system design.

  19. Accurate Bit-Error Rate Evaluation for TH-PPM Systems in Nakagami Fading Channels Using Moment Generating Functions

    Science.gov (United States)

    Liang, Bin; Gunawan, Erry; Law, Choi Look; Teh, Kah Chan

    Analytical expressions based on the Gauss-Chebyshev quadrature (GCQ) rule technique are derived to evaluate the bit-error rate (BER) for the time-hopping pulse position modulation (TH-PPM) ultra-wide band (UWB) systems under a Nakagami-m fading channel. The analyses are validated by the simulation results and adopted to assess the accuracy of the commonly used Gaussian approximation (GA) method. The influence of the fading severity on the BER performance of TH-PPM UWB system is investigated.
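    The Gauss-Chebyshev quadrature (GCQ) step mentioned above replaces the MGF-based error-probability integral with a finite sum over Chebyshev nodes. A generic form of the two ingredients (not the paper's final expression, which also carries the Nakagami-m fading and multiple-access interference terms) is, with $M_\gamma(s) = \mathrm{E}[e^{-s\gamma}]$ the MGF of the instantaneous SNR and $g$ a modulation-dependent constant,

$$P_b = \frac{1}{\pi}\int_0^{\pi/2} M_\gamma\!\left(\frac{g}{\sin^2\theta}\right)d\theta, \qquad \int_{-1}^{1}\frac{f(x)}{\sqrt{1-x^2}}\,dx \;\approx\; \frac{\pi}{n}\sum_{k=1}^{n} f\!\left(\cos\frac{(2k-1)\pi}{2n}\right).$$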

  20. Optimal JPWL Forward Error Correction Rate Allocation for Robust JPEG 2000 Images and Video Streaming over Mobile Ad Hoc Networks

    Directory of Open Access Journals (Sweden)

    Benoit Macq

    2008-07-01

    Full Text Available Based on the analysis of real mobile ad hoc network (MANET) traces, we derive in this paper an optimal wireless JPEG 2000 compliant forward error correction (FEC) rate allocation scheme for a robust streaming of images and videos over MANET. The packet-based proposed scheme has a low complexity and is compliant to JPWL, the 11th part of the JPEG 2000 standard. The effectiveness of the proposed method is evaluated using a wireless Motion JPEG 2000 client/server application; and the ability of the optimal scheme to guarantee quality of service (QoS) to wireless clients is demonstrated.

  1. Human reliability analysis of errors of commission: a review of methods and applications

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B

    2007-06-15

    Illustrated by specific examples relevant to contemporary probabilistic safety assessment (PSA), this report presents a review of human reliability analysis (HRA) addressing post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions. The review addressed both methods and applications. Emerging HRA methods providing advanced features and explicit guidance suitable for PSA are: A Technique for Human Event Analysis (ATHEANA, key publications in 1998/2000), Methode d'Evaluation de la Realisation des Missions Operateur pour la Surete (MERMOS, 1998/2000), the EOC HRA method developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS, 2003), the Misdiagnosis Tree Analysis (MDTA) method (2005/2006), the Cognitive Reliability and Error Analysis Method (CREAM, 1998), and the Commission Errors Search and Assessment (CESA) method (2002/2004). As a result of a thorough investigation of various PSA/HRA applications, this paper furthermore presents an overview of EOCs (termination of safety injection, shutdown of secondary cooling, etc.) referred to in predictive studies and a qualitative review of cases of EOC quantification. The main conclusions of the review of both the methods and the EOC HRA cases are: (1) the CESA search scheme, which proceeds from possible operator actions to the affected systems to scenarios, may be preferable because this scheme provides a formalized way for identifying relatively important scenarios with EOC opportunities; (2) an EOC identification guidance like CESA, which is strongly based on procedural guidance and on important measures of systems or components affected by inappropriate actions, should nevertheless pay some attention to EOCs associated with familiar but non-procedural actions and to EOCs leading to failures of manually initiated safety functions; (3) orientations of advanced EOC quantification comprise a) modeling of multiple contexts for a given scenario, b) accounting for

  2. A web-based team-oriented medical error communication assessment tool: development, preliminary reliability, validity, and user ratings.

    Science.gov (United States)

    Kim, Sara; Brock, Doug; Prouty, Carolyn D; Odegard, Peggy Soule; Shannon, Sarah E; Robins, Lynne; Boggs, Jim G; Clark, Fiona J; Gallagher, Thomas

    2011-01-01

    Multiple-choice exams are not well suited for assessing communication skills. Standardized patient assessments are costly and patient and peer assessments are often biased. Web-based assessment using video content offers the possibility of reliable, valid, and cost-efficient means for measuring complex communication skills, including interprofessional communication. We report development of the Web-based Team-Oriented Medical Error Communication Assessment Tool, which uses videotaped cases for assessing skills in error disclosure and team communication. Steps in development included (a) defining communication behaviors, (b) creating scenarios, (c) developing scripts, (d) filming video with professional actors, and (e) writing assessment questions targeting team communication during planning and error disclosure. Using valid data from 78 participants in the intervention group, coefficient alpha estimates of internal consistency were calculated based on the Likert-scale questions and ranged from α=.79 to α=.89 for each set of 7 Likert-type discussion/planning items and from α=.70 to α=.86 for each set of 8 Likert-type disclosure items. The preliminary test-retest Pearson correlation based on the scores of the intervention group was r=.59 for discussion/planning and r=.25 for error disclosure sections, respectively. Content validity was established through reliance on empirically driven published principles of effective disclosure as well as integration of expert views across all aspects of the development process. In addition, data from 122 medicine and surgical physicians and nurses showed high ratings for video quality (4.3 of 5.0), acting (4.3), and case content (4.5). Web assessment of communication skills appears promising. Physicians and nurses across specialties respond favorably to the tool.

  3. Human factors evaluation of remote afterloading brachytherapy: Human error and critical tasks in remote afterloading brachytherapy and approaches for improved system performance. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Callan, J.R.; Kelly, R.T.; Quinn, M.L. [Pacific Science and Engineering Group, San Diego, CA (United States)] [and others]

    1995-05-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety-significant problems related to human error.

  4. On the Error Rate Analysis of Dual-Hop Amplify-and-Forward Relaying in Generalized-K Fading Channels

    Directory of Open Access Journals (Sweden)

    George P. Efthymoglou

    2010-01-01

    Full Text Available We present novel and easy-to-evaluate expressions for the error rate performance of cooperative dual-hop relaying with maximal ratio combining operating over independent generalized-K fading channels. For this system, it is hard to obtain a closed-form expression for the moment generating function (MGF) of the end-to-end signal-to-noise ratio (SNR) at the destination, even for the case of a single dual-hop relay link. Therefore, we employ two different upper bound approximations for the output SNR, of which one is based on the minimum SNR of the two hops for each dual-hop relay link and the other is based on the geometric mean of the SNRs of the two hops. Lower bounds for the symbol and bit error rates for a variety of digital modulations can then be evaluated using the MGF-based approach. The final expressions are useful in the performance evaluation of amplify-and-forward relaying in a generalized composite radio environment.
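    For context, the two SNR upper bounds mentioned above are the standard simplifications used for dual-hop amplify-and-forward links, and they feed into an MGF-based error-rate integral. One common way of writing them (the exact expressions and constants in the paper may differ) is, with $\gamma_1,\gamma_2$ the per-hop SNRs and $M_{\gamma}(s)=\mathrm{E}[e^{-s\gamma}]$,

$$\gamma_{\mathrm{end}} = \frac{\gamma_1\gamma_2}{\gamma_1+\gamma_2+1} \;\le\; \min(\gamma_1,\gamma_2), \qquad \gamma_{\mathrm{end}} \;\le\; \sqrt{\gamma_1\gamma_2}, \qquad P_b = \frac{1}{\pi}\int_0^{\pi/2} M_{\gamma_{\mathrm{ub}}}\!\left(\frac{g}{\sin^2\theta}\right)d\theta,$$

where $\gamma_{\mathrm{ub}}$ denotes whichever bound is used and $g$ is a modulation-dependent constant (e.g. $g=1$ for BPSK). Because the bound overstates the SNR, the resulting error rates are lower bounds, as the record states.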

  5. The type I error rate for in vivo Comet assay data when the hierarchical structure is disregarded

    DEFF Research Database (Denmark)

    Hansen, Merete Kjær; Kulahci, Murat

    The Comet assay is a sensitive technique for detection of DNA strand breaks. The experimental design of in vivo Comet assay studies is often hierarchically structured, which should be reflected in the statistical analysis. However, the hierarchical structure sometimes seems to be disregarded, and this imposes considerable impact on the type I error rate. This study aims to demonstrate the implications that result from disregarding the hierarchical structure. Different combinations of the factor levels as they appear in a literature study give type I error rates up to 0.51, and for all combinations ... the exposition of the statistical methodology and to suitably account for the hierarchical structure of Comet assay data whenever present.
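    One standard way to respect the hierarchical structure the record refers to is a mixed-effects model with a random intercept per experimental unit (e.g. per animal), rather than treating all cells as independent. The sketch below shows that approach on synthetic data; the grouping names, effect sizes, and sample sizes are invented for illustration and are not taken from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
animals, cells_per_animal = 8, 50

rows = []
for a in range(animals):
    treated = a % 2                      # half the animals exposed
    animal_effect = rng.normal(0, 1.0)   # between-animal variability
    for _ in range(cells_per_animal):
        damage = 5 + 1.5 * treated + animal_effect + rng.normal(0, 2.0)
        rows.append({"animal": a, "treated": treated, "damage": damage})
df = pd.DataFrame(rows)

# Random intercept per animal accounts for the hierarchical structure.
model = smf.mixedlm("damage ~ treated", df, groups=df["animal"]).fit()
print(model.summary())
```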

  6. Between‐Batch Pharmacokinetic Variability Inflates Type I Error Rate in Conventional Bioequivalence Trials: A Randomized Advair Diskus Clinical Trial

    Science.gov (United States)

    Carroll, KJ; Mielke, J; Benet, LZ; Jones, B

    2016-01-01

    We previously demonstrated pharmacokinetic differences among manufacturing batches of a US Food and Drug Administration (FDA)‐approved dry powder inhalation product (Advair Diskus 100/50) large enough to establish between‐batch bio‐inequivalence. Here, we provide independent confirmation of pharmacokinetic bio‐inequivalence among Advair Diskus 100/50 batches, and quantify residual and between‐batch variance component magnitudes. These variance estimates are used to consider the type I error rate of the FDA's current two‐way crossover design recommendation. When between‐batch pharmacokinetic variability is substantial, the conventional two‐way crossover design cannot accomplish the objectives of FDA's statistical bioequivalence test (i.e., cannot accurately estimate the test/reference ratio and associated confidence interval). The two‐way crossover, which ignores between‐batch pharmacokinetic variability, yields an artificially narrow confidence interval on the product comparison. The unavoidable consequence is type I error rate inflation, to ∼25%, when between‐batch pharmacokinetic variability is nonzero. This risk of a false bioequivalence conclusion is substantially higher than asserted by regulators as acceptable consumer risk (5%). PMID:27727445

  7. Investigation on the bit error rate performance of 40Gb/s space optical communication system based on BPSK scheme

    Science.gov (United States)

    Li, Mi; Li, Bowen; Zhang, Xuping; Song, Yuejiang; Liu, Jia; Tu, Guojie

    2015-08-01

    Space optical communication is attracting increasing attention because it offers advantages such as high security and high communication quality compared with microwave communication. As space optical communication develops, data rates on the order of Gb/s have already been achieved, and the next generation of space optical systems targets a higher data rate of 40 Gb/s. However, traditional optical communication systems cannot satisfy such a high data rate. This paper introduces a ground optical communication system with a 40 Gb/s data rate as a step towards space optical communication at high data rates. At 40 Gb/s, a waveguide modulator must be applied to modulate the optical signal, which is then amplified by a laser amplifier. Moreover, a more sensitive avalanche photodiode (APD) is used as the detector to improve communication quality. Based on the system above, we analyze the communication quality in the downlink of a space optical communication system at a data rate of 40 Gb/s. The bit error rate (BER) performance, an important factor in judging communication quality, is discussed as a function of several parameter ratios. The results show that there exists an optimum ratio of gain factor to divergence angle that gives the best BER performance, and that the ratio of receiving diameter to divergence angle can be increased for better communication quality. These results help to characterize optical communication systems at high data rates and can contribute to system design.

  8. An Analysis of the Contributing Factors to the Fiscal Year 1985 MCDOSET (Marine Corps Disbursing On-Site Examination Teams) Error Rates of the Marine Corps Infantry Battalion.

    Science.gov (United States)

    1986-03-01

    [Figure captions recoverable from the scanned report: Figure 7, Monetary Error Rate in Relation to the Number of Additional Duties of the Personnel Officer; Figure 12, Monetary Error Rate in Relation to the Number of MOS 0131 ...]

  9. The dynamic effect of exchange-rate volatility on Turkish exports: Parsimonious error-correction model approach

    Directory of Open Access Journals (Sweden)

    Demirhan Erdal

    2015-01-01

    Full Text Available This paper aims to investigate the effect of exchange-rate stability on real export volume in Turkey, using monthly data for the period February 2001 to January 2010. The Johansen multivariate cointegration method and the parsimonious error-correction model are applied to determine long-run and short-run relationships between real export volume and its determinants. In this study, the conditional variance of the GARCH(1,1) model is taken as a proxy for exchange-rate stability, and generalized impulse-response functions and variance-decomposition analyses are applied to analyze the dynamic effects of variables on real export volume. The empirical findings suggest that exchange-rate stability has a significant positive effect on real export volume, both in the short and the long run.
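    For reference, the GARCH(1,1) conditional variance used here as the volatility proxy has the standard form below, where $\varepsilon_t$ denotes the innovation of the underlying exchange-rate equation; the paper's estimated coefficient values are not reproduced here.

$$\sigma_t^2 = \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2, \qquad \omega > 0,\; \alpha,\beta \ge 0,\; \alpha+\beta < 1 .$$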

  10. The mutation rate of the human mtDNA deletion mtDNA4977

    Energy Technology Data Exchange (ETDEWEB)

    Shenkar, R. [Univ. of Colorado Health Science Center, Denver, CO (United States); Navidi, W. [Colorado School of Mines, Golden, CO (United States); Tavare, S. [Univ. of California, Los Angeles, CA (United States)] [and others]

    1996-10-01

    The human mitochondrial mutation mtDNA4977 is a 4,977-bp deletion that originates between two 13-bp direct repeats. We grew 220 colonies of cells, each from a single human cell. For each colony, we counted the number of cells and amplified the DNA by PCR to test for the presence of a deletion. To estimate the mutation rate, we used a model that describes the relationship between the mutation rate and the probability that a colony of a given size will contain no mutants, taking into account such factors as possible mitochondrial turnover and mistyping due to PCR error. We estimate that the mutation rate for mtDNA4977 in cultured human cells is 5.95 × 10^-8 per mitochondrial genome replication. This method can be applied to specific chromosomal, as well as mitochondrial, mutations. 17 refs., 1 fig., 1 tab.
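    The core of the estimation idea can be seen in a simplified version of the model that ignores mitochondrial turnover and PCR mistyping (both of which the authors' full model accounts for): if roughly $R$ mitochondrial genome replications occur while a colony grows and each independently produces the deletion with probability $\mu$, then the probability that the colony contains no mutant genomes is

$$P_0 \approx (1-\mu)^{R} \approx e^{-\mu R}, \qquad \hat{\mu} \approx -\frac{\ln \hat{P}_0}{\bar{R}},$$

where $\hat{P}_0$ is the observed fraction of deletion-free colonies and $\bar{R}$ is a typical number of genome replications per colony.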

  11. Temporal and Developmental-Stage Variation in the Occurrence of Mitotic Errors in Tripronuclear Human Preimplantation Embryos

    NARCIS (Netherlands)

    Mantikou, Eleni; van Echten-Arends, Jannie; Sikkema-Raddatz, Birgit; van der Veen, Fulco; Repping, Sjoerd; Mastenbroek, Sebastiaan

    2013-01-01

    Mitotic errors during early development of human preimplantation embryos are common, rendering a large proportion of embryos chromosomally mosaic. It is also known that the percentage of diploid cells in human diploid-aneuploid mosaic embryos is higher at the blastocyst than at the cleavage stage.

  12. AN IV CATHETER FRAGMENTS DURING MDCT SCANNING OF HUMAN ERROR: EXPERIMENTAL AND REPRODUCIBLE MICROSCOPIC MAGNIFICATION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Kweon, Dae Cheol [Dept. of Radiologic Science, Shin Heung College, Uijeongbu (Korea, Republic of); Lee, Jong Woong [Dept. of Radiology, Kyung Hee University Hospital at Gang-dong, Seoul (Korea, Republic of); Choi, Ji Won [Dept. of Radiological Science, Jeonju University, Jeonju (Korea, Republic of); Yang, Sung Hwan [Dept. of Prosthetics and Orthotics, Korean National College of Rehabilitation and Welfare, Pyeongtaek (Korea, Republic of); Dong, Kyung Rae [Dept. of Radiological Technology, Gwangju Health College University, Gwangju (Korea, Republic of); Chung, Won Kwan [Dept. of Nuclear Engineering, Chosun University, Gwangju (Korea, Republic of)

    2011-12-15

    The use of intravenous catheters is occasionally complicated by intravascular fragments and swelling of the catheter fragments. We present a patient in whom an intravenous catheter fragment was retrieved from the dorsal metacarpal vein following its incidental detection on CT examination. The case demonstrates the utility of microscopy and multi-detector CT in localizing small or subtle intravenous catheter fragments resulting from a human error. In this case of IV catheter fragments in the metacarpal vein, reproducible and microscopy data allowed complete localization of the missing fragments and guided surgery with respect to the optimal incision site for fragment removal. Such reproducible studies may help to determine the best course of action and treatment for a patient who presents with such a case.

  13. Adaptation of hybrid human-computer interaction systems using EEG error-related potentials.

    Science.gov (United States)

    Chavarriaga, Ricardo; Biasiucci, Andrea; Forster, Killian; Roggen, Daniel; Troster, Gerhard; Millan, Jose Del R

    2010-01-01

    Performance improvement in both humans and artificial systems strongly relies on the ability to recognize erroneous behavior or decisions. This paper, which builds upon previous studies of EEG error-related signals, presents a hybrid approach for human-computer interaction that uses human gestures to send commands to a computer and exploits brain activity to provide implicit feedback about the recognition of such commands. Using a simple computer game as a case study, we show that EEG activity evoked by erroneous gesture recognition can be classified in single trials above random levels. Automatic artifact rejection techniques are used, taking into account that subjects are allowed to move during the experiment. Moreover, we present a simple adaptation mechanism that uses the EEG signal to label newly acquired samples and can be used to re-calibrate the gesture recognition system in a supervised manner. Offline analyses show that, although the achieved EEG decoding accuracy is far from being perfect, these signals convey sufficient information to significantly improve the overall system performance.

  14. Effects of two commercial electronic prescribing systems on prescribing error rates in hospital in-patients: a before and after study.

    Directory of Open Access Journals (Sweden)

    Johanna I Westbrook

    2012-01-01

    Full Text Available BACKGROUND: Considerable investments are being made in commercial electronic prescribing systems (e-prescribing) in many countries. Few studies have measured or evaluated their effectiveness at reducing prescribing error rates, and interactions between system design and errors are not well understood, despite increasing concerns regarding new errors associated with system use. This study evaluated the effectiveness of two commercial e-prescribing systems in reducing prescribing error rates and their propensities for introducing new types of error. METHODS AND RESULTS: We conducted a before and after study involving medication chart audit of 3,291 admissions (1,923 at baseline and 1,368 post e-prescribing system) at two Australian teaching hospitals. In Hospital A, the Cerner Millennium e-prescribing system was implemented on one ward, and three wards, which did not receive the e-prescribing system, acted as controls. In Hospital B, the iSoft MedChart system was implemented on two wards and we compared before and after error rates. Procedural (e.g., unclear and incomplete prescribing orders) and clinical (e.g., wrong dose, wrong drug) errors were identified. Prescribing error rates per admission and per 100 patient days; rates of serious errors (5-point severity scale, those ≥3 were categorised as serious) by hospital and study period; and rates and categories of postintervention "system-related" errors (where system functionality or design contributed to the error) were calculated. Use of an e-prescribing system was associated with a statistically significant reduction in error rates in all three intervention wards (respectively reductions of 66.1% [95% CI 53.9%-78.3%], 57.5% [33.8%-81.2%], and 60.5% [48.5%-72.4%]). The use of the system resulted in a decline in errors at Hospital A from 6.25 per admission (95% CI 5.23-7.28) to 2.12 (95% CI 1.71-2.54; p<0.0001) and at Hospital B from 3.62 (95% CI 3.30-3.93) to 1.46 (95% CI 1.20-1.73; p<0.0001). This

  15. Human factors engineering in healthcare systems: the problem of human error and accident management.

    Science.gov (United States)

    Cacciabue, P C; Vella, G

    2010-04-01

    This paper discusses some crucial issues associated with the exploitation of data and information about health care for the improvement of patient safety. In particular, the issues of human factors and safety management are analysed in relation to exploitation of reports about non-conformity events and field observations. A methodology for integrating field observation and theoretical approaches for safety studies is described. Two sample cases are discussed in detail: the first one makes reference to the use of data collected in the aviation domain and shows how these can be utilised to define hazard and risk; the second one concerns a typical ethnographic study in a large hospital structure for the identification of most relevant areas of intervention. The results show that, if national authorities find a way to harmonise and formalize critical aspects, such as the severity of standard events, it is possible to estimate risk and define auditing needs, well before the occurrence of serious incidents, and to indicate practical ways forward for improving safety standards.

  16. Correlation Between Human Development Index and Infant Mortality Rate Worldwide

    Directory of Open Access Journals (Sweden)

    Alijanzadeh

    2016-02-01

    Full Text Available Background Infant mortality rate (per 1000 live births) is a vital index to monitor the standard of health and social inequality which is related to human development dimensions worldwide. Human development index (HDI) includes basic social indicators such as life expectancy, education and income. Objectives The current study aimed to find the correlation between human development index and infant mortality rate. Patients and Methods This descriptive study that represents the relationship of infant mortality rate with human development index and human development index dimensions was performed on the profiles of 135 countries worldwide [Africa (35 countries), America (26 countries), Asia (30 countries), the Pacific (2 countries) and Europe (42 countries)]. Two databases were used in the study: the world health organization (WHO) database (2010) and the human development database (2010). Data were analyzed using the Pearson correlation test by SPSS software. Results The study found that socio-economic factors or human development dimensions are significantly correlated with risk of chance mortality in the world. The per capita income (r = -0.625), life expectancy (r = -0.925) and education (r = -0.843) were negatively correlated with the infant mortality rate; the human development index (r = -0.844) was also negatively correlated with the infant mortality rate (P < 0.01). Conclusions Human development index is one of the best indicators and predictors to perceive healthcare inequities. Worldwide improvement of these indicators, especially the education level, might promote infant life expectancy and decrease infant mortality.

  17. A study on fatigue measurement of operators for human error prevention in NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Ju, Oh Yeon; Il, Jang Tong; Meiling, Luo; Hee, Lee Young [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    The identification and analysis of individual factors of operators, which are among the various causes of adverse effects on human performance, is not easy in NPPs. Individual factors for operators include work type (including shift work), environment, personality, qualification, training, education, cognition, fatigue, job stress, workload, etc. Research at the Finnish Institute of Occupational Health (FIOH) reported that 'burnout' (extreme fatigue) is related to alcohol-dependent habits and must be dealt with using a stress management program. The USNRC (U.S. Nuclear Regulatory Commission) developed FFD (Fitness for Duty) requirements for improving task efficiency and preventing human errors. 'Managing Fatigue' in 10CFR26 is presented as a requirement to control operator fatigue in NPPs. The committee explained that excessive fatigue is due to stressful work environments, working hours, shifts, sleep disorders, and unstable circadian rhythms. In addition, the International Labor Organization (ILO) developed and suggested a checklist to manage fatigue and job stress. In Korea, a systematic evaluation approach is presented in the Final Safety Analysis Report (FSAR) chapter 18, Human Factors, in the licensing process. However, it focuses mostly on interface design such as the HMI (Human Machine Interface), not on individual factors. In particular, because Korea is in the process of exporting NPPs to the UAE, the development and setting of a fatigue management technique is important and urgent in order to present technical standards and FFD criteria to the UAE. It is also anticipated that the domestic regulatory body will apply the FFD program as a regulatory requirement, so that preparation for that situation is required. In this paper, advanced research is investigated to find fatigue measurement and evaluation methods for operators in high-reliability industries. This study also tries to review the NRC report and discuss the causal factors and

  18. Sources of error in the estimation of mosquito infection rates used to assess risk of arbovirus transmission.

    Science.gov (United States)

    Bustamante, Dulce M; Lord, Cynthia C

    2010-06-01

    Infection rate is an estimate of the prevalence of arbovirus infection in a mosquito population. It is assumed that when infection rate increases, the risk of arbovirus transmission to humans and animals also increases. We examined some of the factors that can invalidate this assumption. First, we used a model to illustrate how the proportion of mosquitoes capable of virus transmission, or infectious, is not a constant fraction of the number of infected mosquitoes. Thus, infection rate is not always a straightforward indicator of risk. Second, we used a model that simulated the process of mosquito sampling, pooling, and virus testing and found that mosquito infection rates commonly underestimate the prevalence of arbovirus infection in a mosquito population. Infection rate should always be used in conjunction with other surveillance indicators (mosquito population size, age structure, weather) and historical baseline data when assessing the risk of arbovirus transmission.
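    Mosquito infection rates are usually estimated from pooled testing. As background for the record's discussion (not the authors' own model), the sketch below contrasts the classic minimum infection rate with the standard maximum-likelihood estimate for equal-sized pools; the pool sizes and counts are invented for illustration.

```python
def minimum_infection_rate(positive_pools, total_mosquitoes, per=1000):
    """Classic MIR: assumes at most one infected mosquito per positive pool."""
    return per * positive_pools / total_mosquitoes

def pooled_mle(positive_pools, n_pools, pool_size, per=1000):
    """Maximum-likelihood infection rate for equal-sized pools:
    the fraction of negative pools estimates (1 - p) ** pool_size."""
    neg_fraction = 1.0 - positive_pools / n_pools
    p = 1.0 - neg_fraction ** (1.0 / pool_size)
    return per * p

# Toy numbers: 200 pools of 50 mosquitoes each, 12 pools test positive.
print(minimum_infection_rate(12, 200 * 50))   # ~1.2 per 1000
print(round(pooled_mle(12, 200, 50), 2))      # ~1.24 per 1000
```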

  19. Systematic analysis of video data from different human-robot interaction studies: a categorization of social signals during error situations.

    Science.gov (United States)

    Giuliani, Manuel; Mirnig, Nicole; Stollnberger, Gerald; Stadler, Susanne; Buchner, Roland; Tscheligi, Manfred

    2015-01-01

    Human-robot interactions are often affected by error situations that are caused by either the robot or the human. Therefore, robots would profit from the ability to recognize when error situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human-robot interaction experiments. For that, we analyzed 201 videos of five human-robot interaction user studies with varying tasks from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction. Technical failures are caused by technical shortcomings of the robot. The results of the video analysis show that the study participants use many head movements and very few gestures, but they often smile, when in an error situation with the robot. Another result is that the participants sometimes stop moving at the beginning of error situations. We also found that the participants talked more in the case of social norm violations and less during technical failures. Finally, the participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking), when they are interacting with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human-robot interaction systems. The builders need to consider including modules for recognition and classification of head movements to the robot input channels. The evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.

  20. Inflation of type I error rates by unequal variances associated with parametric, nonparametric, and Rank-Transformation Tests

    Directory of Open Access Journals (Sweden)

    Donald W. Zimmerman

    2004-01-01

    Full Text Available It is well known that the two-sample Student t test fails to maintain its significance level when the variances of treatment groups are unequal, and, at the same time, sample sizes are unequal. However, introductory textbooks in psychology and education often maintain that the test is robust to variance heterogeneity when sample sizes are equal. The present study discloses that, for a wide variety of non-normal distributions, especially skewed distributions, the Type I error probabilities of both the t test and the Wilcoxon-Mann-Whitney test are substantially inflated by heterogeneous variances, even when sample sizes are equal. The Type I error rate of the t test performed on ranks replacing the scores (rank-transformed data is inflated in the same way and always corresponds closely to that of the Wilcoxon-Mann-Whitney test. For many probability densities, the distortion of the significance level is far greater after transformation to ranks and, contrary to known asymptotic properties, the magnitude of the inflation is an increasing function of sample size. Although nonparametric tests of location also can be sensitive to differences in the shape of distributions apart from location, the Wilcoxon-Mann-Whitney test and rank-transformation tests apparently are influenced mainly by skewness that is accompanied by specious differences in the means of ranks.
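    A small Monte Carlo sketch of the phenomenon described above is given below: with equal sample sizes but unequal variances drawn from a skewed distribution, the empirical type I error of both the t test and the Wilcoxon-Mann-Whitney test can drift away from the nominal 5% level. The number of replications and the distribution parameters are arbitrary choices, not values from the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, reps, alpha = 25, 5000, 0.05

rej_t = rej_w = 0
for _ in range(reps):
    # Both groups have the same mean (H0 true) but unequal variances,
    # and the underlying distribution is skewed (exponential).
    a = rng.exponential(1.0, n) - 1.0
    b = 4.0 * (rng.exponential(1.0, n) - 1.0)
    rej_t += stats.ttest_ind(a, b, equal_var=True).pvalue < alpha
    rej_w += stats.mannwhitneyu(a, b, alternative="two-sided").pvalue < alpha

print(f"empirical type I error, t test:   {rej_t / reps:.3f}")
print(f"empirical type I error, Wilcoxon: {rej_w / reps:.3f}")
```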

  1. Advanced Communications Technology Satellite (ACTS) Fade Compensation Protocol Impact on Very Small-Aperture Terminal Bit Error Rate Performance

    Science.gov (United States)

    Cox, Christina B.; Coney, Thom A.

    1999-01-01

    The Advanced Communications Technology Satellite (ACTS) communications system operates at Ka band. ACTS uses an adaptive rain fade compensation protocol to reduce the impact of signal attenuation resulting from propagation effects. The purpose of this paper is to present the results of an analysis characterizing the improvement in VSAT performance provided by this protocol. The metric for performance is VSAT bit error rate (BER) availability. The acceptable availability defined by communication system design specifications is 99.5% for a BER of 5E-7 or better. VSAT BER availabilities with and without rain fade compensation are presented. A comparison shows the improvement in BER availability realized with rain fade compensation. Results are presented for an eight-month period and for 24 months spread over a three-year period. The two time periods represent two different configurations of the fade compensation protocol. Index Terms-Adaptive coding, attenuation, propagation, rain, satellite communication, satellites.

  2. The mutation rate of the human mtDNA deletion mtDNA4977.

    Science.gov (United States)

    Shenkar, R; Navidi, W; Tavaré, S; Dang, M H; Chomyn, A; Attardi, G; Cortopassi, G; Arnheim, N

    1996-10-01

    The human mitochondrial mutation mtDNA4977 is a 4,977-bp deletion that originates between two 13-bp direct repeats. We grew 220 colonies of cells, each from a single human cell. For each colony, we counted the number of cells and amplified the DNA by PCR to test for the presence of a deletion. To estimate the mutation rate, we used a model that describes the relationship between the mutation rate and the probability that a colony of a given size will contain no mutants, taking into account such factors as possible mitochondrial turnover and mistyping due to PCR error. We estimate that the mutation rate for mtDNA4977 in cultured human cells is 5.95 × 10^-8 per mitochondrial genome replication. This method can be applied to specific chromosomal, as well as mitochondrial, mutations.

  3. A Preliminary Study on the Measures to Assess the Organizational Safety: The Cultural Impact on Human Error Potential

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Hee [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2011-10-15

    The Fukushima I nuclear accident following the Tohoku earthquake and tsunami on 11 March 2011 occurred twelve years after the JCO accident, which was caused by an error made by JCO employees. These accidents, along with the Chernobyl accident, were associated with characteristic problems of various organizations, caused severe social and economic disruption, and have had significant environmental and health impacts. The cultural problems behind human errors occur for various reasons, and different actions are needed to prevent different errors. Unfortunately, much of the research on organizations and human error has shown widely varying results, which call for different approaches. In other words, we have to find more practical solutions from the various research results for nuclear safety and take a systematic approach to the organizational deficiencies causing human error. This paper reviews Hofstede's criteria, the IAEA safety culture, the safety areas of the periodic safety review (PSR), teamwork and performance, and an evaluation of HANARO safety culture to verify the measures used to assess organizational safety.

  4. Measurement of low bit-error-rates of adiabatic quantum-flux-parametron logic using a superconductor voltage driver

    Science.gov (United States)

    Takeuchi, Naoki; Suzuki, Hideo; Yoshikawa, Nobuyuki

    2017-05-01

    Adiabatic quantum-flux-parametron (AQFP) is an energy-efficient superconductor logic. The advantage of AQFP is that the switching energy can be reduced by lowering operation frequencies or by increasing the quality factors of Josephson junctions, while keeping the energy barrier height much larger than thermal energy. In other words, both low energy dissipation and low bit error rates (BERs) can be achieved. In this paper, we report the first measurement results of the low BERs of AQFP logic. We used a superconductor voltage driver with a stack of dc superconducting-quantum-interference-devices to amplify the logic signals of AQFP gates into mV-range voltage signals for the BER measurement. Our measurement results showed 3.3 dB and 2.6 dB operation margins, in which BERs were less than 10^-20, for 1 Gbps and 2 Gbps data rates, respectively. While the observed BERs were very low, the estimated switching energy for the 1-Gbps operation was only 2 zJ, or 30 k_B T, where k_B is the Boltzmann's constant and T is the temperature. Unlike conventional non-adiabatic logic, BERs are not directly associated with switching energy in AQFP.
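
    As a quick consistency check on the quoted switching energy, 30 k_B T evaluated at liquid-helium temperature (T = 4.2 K is assumed here; the abstract does not state the operating temperature) indeed comes out at roughly 2 zJ:

```python
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 4.2              # assumed operating temperature in K (typical for superconductor logic)
E = 30 * k_B * T
print(f"{E:.2e} J  ~= {E*1e21:.1f} zJ")   # ~1.7e-21 J, i.e. about 2 zJ
```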

  5. Bit Error Rate Performance for Multicarrier Code Division Multiple Access over Generalized η-μ Fading Environment

    Directory of Open Access Journals (Sweden)

    James Osuru Mark

    2011-01-01

    Full Text Available The multicarrier code division multiple access (MC-CDMA) system has received considerable attention from researchers owing to its great potential for achieving high data rate transmission in wireless communications. The performance of the system degrades due to the detrimental effects of multipath fading. Similarly, non-orthogonality of the spreading codes can cause multiple access interference. This paper addresses the performance of an MC-CDMA system under the influence of a frequency-selective generalized η-µ fading channel and multiple access interference caused by other active users to the desired one. We apply the Gaussian approximation technique to analyse the performance of the system. The average bit error rate is derived and expressed in terms of Gauss hypergeometric functions. Maximal ratio combining diversity is utilized to alleviate the deleterious effect of multipath fading. We observe that system performance improves as the parameter η increases (format 1) or decreases (format 2).

  6. Error-Correcting Output Codes in Classification of Human Induced Pluripotent Stem Cell Colony Images

    Directory of Open Access Journals (Sweden)

    Henry Joutsijoki

    2016-01-01

    Full Text Available The purpose of this paper is to examine how well human induced pluripotent stem cell (hiPSC) colony images can be classified using error-correcting output codes (ECOC). Our image dataset includes hiPSC colony images from three classes (bad, semigood, and good), which makes our classification task a multiclass problem. ECOC is a general framework to model multiclass classification problems. We focus on four different coding designs of ECOC and apply to each of them k-Nearest Neighbor (k-NN) searching, naïve Bayes, classification tree, and discriminant analysis variant classifiers. We use Scale-Invariant Feature Transform (SIFT) based features in classification. The best accuracy (62.4%) is obtained with the ternary complete ECOC coding design and the k-NN classifier (standardized Euclidean distance measure and inverse weighting). The best result is comparable with our earlier research. The quality identification of hiPSC colony images is an essential problem to be solved before hiPSCs can be used in practice on a large scale. The ECOC methods examined are promising techniques for solving this challenging problem.

  7. Error-Correcting Output Codes in Classification of Human Induced Pluripotent Stem Cell Colony Images.

    Science.gov (United States)

    Joutsijoki, Henry; Haponen, Markus; Rasku, Jyrki; Aalto-Setälä, Katriina; Juhola, Martti

    2016-01-01

    The purpose of this paper is to examine how well human induced pluripotent stem cell (hiPSC) colony images can be classified using error-correcting output codes (ECOC). Our image dataset includes hiPSC colony images from three classes (bad, semigood, and good), which makes our classification task a multiclass problem. ECOC is a general framework to model multiclass classification problems. We focus on four different coding designs of ECOC and apply to each of them k-Nearest Neighbor (k-NN) searching, naïve Bayes, classification tree, and discriminant analysis variant classifiers. We use Scale-Invariant Feature Transform (SIFT) based features in classification. The best accuracy (62.4%) is obtained with the ternary complete ECOC coding design and the k-NN classifier (standardized Euclidean distance measure and inverse weighting). The best result is comparable with our earlier research. The quality identification of hiPSC colony images is an essential problem to be solved before hiPSCs can be used in practice on a large scale. The ECOC methods examined are promising techniques for solving this challenging problem.
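
    The ECOC-plus-k-NN setup described in the two records above can be reproduced in outline with scikit-learn. The sketch below is a generic illustration only: it uses a random codebook rather than the ternary complete design from the paper, plain standardization in place of the standardized Euclidean distance, and placeholder feature matrices instead of SIFT descriptors.

```python
# Sketch of an ECOC multiclass classifier with a k-NN base learner, assuming
# feature vectors have already been extracted (e.g. pooled SIFT descriptors).
import numpy as np
from sklearn.multiclass import OutputCodeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 64))          # placeholder features (150 images, 64-D)
y = rng.integers(0, 3, size=150)        # three classes: bad / semigood / good

X_std = StandardScaler().fit_transform(X)   # standardization stands in for the
                                            # "standardized Euclidean" distance
ecoc = OutputCodeClassifier(
    KNeighborsClassifier(n_neighbors=5, weights="distance"),  # inverse-distance weighting
    code_size=2.0, random_state=0)
ecoc.fit(X_std, y)
print("training accuracy:", ecoc.score(X_std, y))
```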

  8. Adaptive planning strategy for high dose rate prostate brachytherapy—a simulation study on needle positioning errors.

    Science.gov (United States)

    Borot de Battisti, M; Denis de Senneville, B; Maenhout, M; Hautvast, G; Binnekamp, D; Lagendijk, J J W; van Vulpen, M; Moerland, M A

    2016-03-01

    The development of magnetic resonance (MR) guided high dose rate (HDR) brachytherapy for prostate cancer has gained increasing interest for delivering a high tumor dose safely in a single fraction. To support needle placement in the limited workspace inside the closed-bore MRI, a single-needle MR-compatible robot is currently under development at the University Medical Center Utrecht (UMCU). This robotic device taps the needle into the prostate in a divergent pattern from a single rotation point. With this setup, the irradiation dose has to be delivered by successive insertions of the needle. Although robot-assisted needle placement is expected to be more accurate than manual template-guided insertion, needle positioning errors may occur and are likely to modify the pre-planned dose distribution. In this paper, we propose a dose plan adaptation strategy for HDR prostate brachytherapy with feedback on the needle position: a dose plan is made at the beginning of the interventional procedure and updated after each needle insertion in order to compensate for possible needle positioning errors. The introduced procedure can be used with the single-needle MR-compatible robot developed at the UMCU. The proposed feedback strategy was tested by simulating complete HDR procedures with and without feedback on eight patients with different numbers of needle insertions (varying from 4 to 12). In of the cases tested, the number of clinically acceptable plans obtained at the end of the procedure was larger with feedback than without feedback. Furthermore, the computation time of the feedback between each insertion was below 100 s, which makes it suitable for intra-operative use.
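
    A structural outline of the feedback strategy described above (plan, insert one needle, measure its actual position, re-optimize the remaining plan) is sketched below. All functions are trivial stand-ins so that the sketch runs; they are not the study's dose optimizer, robot control, or imaging code.

```python
import random

# All functions below are trivial stand-ins so the sketch runs; they are not
# the study's dose optimizer, robot, or imaging steps.
def optimize_plan(executed, remaining):
    return {"dwell_weights": [1.0] * (len(executed) + len(remaining))}

def insert_needle(target):                       # robot inserts the needle towards `target`
    return target + random.gauss(0.0, 1.0)       # actual position = target + error (mm)

def plan_is_acceptable(plan):
    return len(plan["dwell_weights"]) > 0

def adaptive_hdr_procedure(planned_needles):
    executed, remaining = [], list(planned_needles)
    plan = optimize_plan(executed, remaining)
    while remaining:
        target = remaining.pop(0)
        actual = insert_needle(target)           # feedback: measured (actual) position
        executed.append(actual)
        plan = optimize_plan(executed, remaining)  # re-optimize after every insertion
    return plan, plan_is_acceptable(plan)

print(adaptive_hdr_procedure(planned_needles=[-10, -5, 0, 5, 10]))
```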

  9. Analysis of human error in occupational accidents in the power plant industries using combining innovative FTA and meta-heuristic algorithms

    Directory of Open Access Journals (Sweden)

    M. Omidvari

    2015-09-01

    Full Text Available Introduction: Occupational accidents are among the main issues in industry, and it is necessary to identify their main root causes in order to control them. Several models have been proposed for determining the root causes of accidents. Fault tree analysis (FTA) is one of the most widely used models, as it can graphically establish the root causes of accidents. Non-linear functions are one of the main challenges in applying FTA, and meta-heuristic algorithms can be used to obtain exact values. Material and Method: The present research was carried out in the power plant industry during the construction phase. In this study, a pattern for the analysis of human error in work-related accidents was provided by combining neural network algorithms with the FTA analytical model. Finally, using this pattern, the potential rate of all causes was determined. Result: The results showed that training, age, and non-compliance with safety principles in the workplace were the most important factors influencing human error in occupational accidents. Conclusion: According to the obtained results, it can be concluded that human errors can be greatly reduced by training, the appropriate selection of workers with regard to the type of occupation, and the provision of appropriate safety conditions in the workplace.

  10. Methodical errors of measurement of the human body tissues electrical parameters

    OpenAIRE

    Antoniuk, O.; Pokhodylo, Y.

    2015-01-01

    Sources of methodical measurement errors of the immittance parameters of biological tissues are described. The modeling of measurement errors of the RC parameters of biological tissue equivalent circuits over the frequency range is analyzed. Recommendations on the choice of test-signal frequency for the measurement of these elements are provided.

  11. The Differences in Error Rate and Type between IELTS Writing Bands and Their Impact on Academic Workload

    Science.gov (United States)

    Müller, Amanda

    2015-01-01

    This paper attempts to demonstrate the differences in writing between International English Language Testing System (IELTS) bands 6.0, 6.5 and 7.0. An analysis of exemplars provided by the IELTS test makers reveals that IELTS 6.0, 6.5 and 7.0 writers can make a minimum of 206 errors, 96 errors and 35 errors per 1000 words, respectively. The following section…

  12. How much do we know about spontaneous human mutation rates

    Energy Technology Data Exchange (ETDEWEB)

    Crow, J.F. (Univ. of Wisconsin, Madison, WI (United States))

    1993-01-01

    The much larger number of cell divisions between zygote and sperm than between zygote and egg, the increased age of fathers of children with new dominant mutations, and the greater evolution rate of pseudogenes on the Y chromosome than of those on autosomes all point to a much higher mutation rate in human males than in females, as first pointed out by Haldane in his classical study of X-linked hemophilia. The age of the father is the main factor determining the human spontaneous mutation rate, and probably the total mutation rate. The total mutation rate in Drosophila males of genes causing minor reductions in viability is at least 0.4 per sperm and may be considerably higher. The great mutation load implied by a rate of ~1 per zygote can be greatly ameliorated by quasi-truncation selection. Corresponding data are not available for the human population. The evolution rate of pseudogenes in primates suggests some 10^2 new mutations per zygote. Presumably the overwhelming majority of these are neutral, but even the approximate fraction is not known. Statistical evidence in Drosophila shows that mutations with minor effects cause about the same heterozygous impairment of fitness as those that are lethal when homozygous. The magnitude of the heterozygous effect is such that almost all mutant genes are eliminated as heterozygotes before ever becoming homozygous. Although quantitative data in the human species are lacking, anecdotal information supports the conclusion that partial dominance is the rule here as well. This suggests that if the human mutation rate were increased or decreased, the effects would be spread over a period of 50-100 generations. 31 refs., 3 figs., 2 tabs.

  13. Management and Evaluation System on Human Error, Licence Requirements, and Job-aptitude in Rail and the Other Industries

    Energy Technology Data Exchange (ETDEWEB)

    Koo, In Soo; Suh, S. M.; Park, G. O. (and others)

    2006-07-15

    The rail system is very closely related to public life. When an accident happens, members of the public using the system can be injured or even killed. The accident that recently took place in the Taegu subway system because of inappropriate human task performance demonstrated how tragic the consequences can be. Many studies have shown that most accidents occur because tasks are performed in an inappropriate way. It is generally recognised that a rail system without the human element will not be realized for quite a long time, so the human element will remain the major factor in the next tragic accident. This state-of-the-art report studied cases of management and evaluation systems related to human errors, licence requirements, and job aptitude in rail and other industries, with the aim of improving the task performance of personnel, who constitute a key element of the system, and ultimately enhancing rail safety. The management of human errors, licence requirements, and evaluation of the job aptitude of people engaged in agencies closely related to rail do much to develop and preserve their abilities. However, due to various internal and external factors, these systems may to some extent be limited in their ability to reflect overall trends in society, technology, and values in a timely manner. Removing and controlling the factors behind human error will play an epochal role in the safety of the rail system, as shown through the case studies of this report. The analytical results of these case studies will be used in the project 'Development of Management Criteria on Human Error and Evaluation Criteria on Job-aptitude of Rail Safe-operation Personnel', which has been carried out as a part of the 'Integrated R and D Program for Railway Safety'.

  14. Determination of human muscle protein fractional synthesis rate

    DEFF Research Database (Denmark)

    Bornø, Andreas; Hulston, Carl J; van Hall, Gerrit

    2014-01-01

    In the present study, different MS methods for the determination of human muscle protein fractional synthesis rate (FSR) using [ring-(13)C6 ]phenylalanine as a tracer were evaluated. Because the turnover rate of human skeletal muscle is slow, only minute quantities of the stable isotopically......-MS/MS) and GC-tandem MS (GC-MS/MS) have made these techniques an option for human muscle FSR measurements. Human muscle biopsies were freeze dried, cleaned, and hydrolyzed, and the amino acids derivatized using either N-acetyl-n-propyl, phenylisothiocyanate, or N.......89 ± 0.01, P muscle FSR, (2) LC-MS/MS comes quite close and is a good alternative when tissue quantities are too small for GC-C-IRMS, and (3) If GC-MS/MS is to be used, then the HFBA derivative should be used instead...

  15. Stochastic Analysis of a Repairable System with Constant Error Rates and Arbitrary System Repair Rates

    Institute of Scientific and Technical Information of China (English)

    徐光甫; 李洪霞

    2004-01-01

    A mathematical model of a repairable system with constant human error rates, common-cause failure rates, and arbitrarily distributed repair times is studied. The system is first transformed into a Volterra integral equation in a Banach space, and the existence and uniqueness of a non-negative solution of the system are obtained.

  16. The Relation Between Inflation in Type-I and Type-II Error Rate and Population Divergence in Genome-Wide Association Analysis of Multi-Ethnic Populations.

    Science.gov (United States)

    Derks, E M; Zwinderman, A H; Gamazon, E R

    2017-02-10

    Population divergence impacts the degree of population stratification in Genome Wide Association Studies. We aim to: (i) investigate the type-I error rate as a function of population divergence (FST) in multi-ethnic (admixed) populations; (ii) evaluate the statistical power and effect size estimates; and (iii) investigate the impact of population stratification on the results of gene-based analyses. Quantitative phenotypes were simulated. The type-I error rate was investigated for Single Nucleotide Polymorphisms (SNPs) with varying levels of FST between the ancestral European and African populations. The type-II error rate was investigated for a SNP characterized by a high value of FST. In all tests, genomic MDS components were included to correct for population stratification. Type-I and type-II error rates were adequately controlled in a population that included two distinct ethnic populations but not in admixed samples. Statistical power was reduced in the admixed samples. Gene-based tests showed no residual inflation in the type-I error rate.

  17. Packet error rate analysis of digital pulse interval modulation in intersatellite optical communication systems with diversified wavefront deformation.

    Science.gov (United States)

    Zhu, Jin; Wang, Dayan; Xie, Wanqing

    2015-02-20

    Diversified wavefront deformation is an inevitable phenomenon in intersatellite optical communication systems, which will decrease system performance. In this paper, we investigate the description of wavefront deformation and its influence on the packet error rate (PER) of digital pulse interval modulation (DPIM). With the wavelet method, the diversified wavefront deformation can be described by wavelet parameters: coefficient, dilation, and shift factors, where the coefficient factor represents the depth, dilation factor represents the area, and shift factor is for location. Based on this, the relationship between PER and wavelet parameters is analyzed from a theoretical viewpoint. Numerical results illustrate the validity of theoretical analysis: PER increases with the depth and area and decreases if location gets farther from the center of the optical antenna. In addition to describing diversified deformation, the advantage of the wavelet method over Zernike polynomials in computational complexity is shown via numerical example. This work provides a feasible method for the description along with influence analysis of diversified wavefront deformation from a practical viewpoint and will be helpful for designing optical systems.

  18. Write error rate of spin-transfer-torque random access memory including micromagnetic effects using rare event enhancement

    CERN Document Server

    Roy, Urmimala; Register, Leonard F; Banerjee, Sanjay K

    2016-01-01

    Spin-transfer-torque random access memory (STT-RAM) is a promising candidate for the next generation of random-access memory due to improved scalability, read-write speeds and endurance. However, the write pulse duration must be long enough to ensure a low write error rate (WER), the probability that a bit will remain unswitched after the write pulse is turned off, in the presence of stochastic thermal effects. WERs on the scale of 10^-9 or lower are desired. Within a macrospin approximation, WERs can be calculated analytically using the Fokker-Planck method to this point and beyond. However, dynamic micromagnetic effects within the bit can affect and lead to faster switching. Such micromagnetic effects can be addressed via numerical solution of the stochastic Landau-Lifshitz-Gilbert-Slonczewski (LLGS) equation. However, determining WERs approaching 10^-9 would require well over 10^9 such independent simulations, which is infeasible. In this work, we explore calculation of WER using "rare event en...
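
    The claim that a brute-force stochastic LLGS approach would need well over 10^9 runs follows from elementary Monte Carlo statistics: estimating a probability p requires on the order of 1/p trials just to observe a single event, and far more for a tight confidence interval. A rough illustration:

```python
# Rough Monte Carlo sample-size estimate for a rare write-error probability.
p = 1e-9                      # target write error rate (WER)
n_one_event = 1 / p           # trials needed on average to observe one error
n_rule_of_three = 3 / p       # ~95% upper bound if zero errors are observed
rel_err = 0.10                # desired 10% relative standard error on the estimate
n_precise = (1 - p) / (p * rel_err**2)
print(f"{n_one_event:.0e}, {n_rule_of_three:.0e}, {n_precise:.0e} trials")
```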

  19. Bit error rate analysis of Wi-Fi and bluetooth under the interference of 2.45 GHz RFID

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    IEEE 802.11b WLAN (Wi-Fi) and IEEE 802.15.1 WPAN (bluetooth) are prevalent nowadays, and radio frequency identification (RFID) is an emerging technology with ever wider applications. 802.11b occupies the unlicensed industrial, scientific and medical (ISM) band (2.4-2.4835 GHz) and uses direct sequence spread spectrum (DSSS) to alleviate narrowband interference and fading. Bluetooth is also a user of the ISM band and adopts frequency hopping spread spectrum (FHSS) to avoid mutual interference. RFID can operate on multiple frequency bands, such as 135 kHz, 13.56 MHz and 2.45 GHz. When a 2.45 GHz RFID device, which uses FHSS, is collocated with 802.11b or bluetooth, mutual interference is inevitable. Although DSSS and FHSS are applied to mitigate the interference, the performance degradation may be very significant. Therefore, in this article, the impact of 2.45 GHz RFID on 802.11b and bluetooth is investigated. The bit error rates (BER) of 802.11b and bluetooth are analyzed by establishing a mathematical model, and the simulation results are compared with the theoretical analysis to justify this mathematical model.

  20. Bit Error Rate Performance Analysis of a Threshold-Based Generalized Selection Combining Scheme in Nakagami Fading Channels

    Directory of Open Access Journals (Sweden)

    Sulyman Ahmed Iyanda

    2005-01-01

    Full Text Available The severity of fading on mobile communication channels calls for the combining of multiple diversity sources to achieve acceptable error rate performance. Traditional approaches perform the combining of the different diversity sources using the conventional selective diversity combining (CSC), equal-gain combining (EGC), or maximal-ratio combining (MRC) schemes. CSC and MRC are the two extremes of the compromise between performance quality and complexity. Some researchers have proposed a generalized selection combining scheme (GSC) that combines the best M branches out of the L available diversity resources (M ≤ L). In this paper, we analyze a generalized selection combining scheme based on a threshold criterion rather than a fixed-size subset of the best channels. In this scheme, only those diversity branches whose energy levels are above a specified threshold are combined. Closed-form analytical solutions for the BER performance of this scheme over Nakagami fading channels are derived. We also discuss the merits of this scheme over GSC.
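
    A small Monte Carlo sketch of the threshold-based combining rule described above, under Nakagami-m fading (branch SNRs are Gamma distributed) with maximal-ratio combining of the selected branches, is given below. The parameter values and the fallback rule for the case where no branch exceeds the threshold are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(1)
L, m, avg_snr, threshold = 4, 2.0, 10.0, 3.0   # branches, Nakagami m, mean SNR, SNR threshold
n_trials = 100_000

# Branch instantaneous SNRs under Nakagami-m fading are Gamma(m, avg_snr/m) distributed.
branch_snr = rng.gamma(shape=m, scale=avg_snr / m, size=(n_trials, L))
above = branch_snr >= threshold
combined_snr = np.where(
    above.any(axis=1),
    np.where(above, branch_snr, 0.0).sum(axis=1),  # MRC of the branches above threshold
    branch_snr.max(axis=1))                        # fallback (assumed): best single branch

ber = 0.5 * erfc(np.sqrt(combined_snr))            # coherent BPSK: Pb = Q(sqrt(2*SNR))
print("average BER with threshold-based GSC:", ber.mean())
```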

  1. Superior bit error rate and jitter due to improved switching field distribution in exchange spring magnetic recording media.

    Science.gov (United States)

    Suess, D; Fuger, M; Abert, C; Bruckner, F; Vogler, C

    2016-06-01

    We report two effects that lead to a significant reduction of the switching field distribution in exchange spring media. The first effect relies on a subtle mechanism of the interplay between the exchange coupling between soft and hard layers and the anisotropy, which allows a significant reduction of the switching field distribution in exchange spring media. This effect reduces the switching field distribution by about 30% compared to single-phase media. A second effect is that, due to the improved thermal stability of exchange spring media over single-phase media, the jitter due to thermal fluctuations is significantly smaller for exchange spring media than for single-phase media. The influence of this overall improved switching field distribution on the transition jitter in granular recording and the bit error rate in bit-patterned magnetic recording is discussed. The transition jitter in granular recording for a distribution of K_hard values of 3% in the hard layer, taking into account thermal fluctuations during recording, is estimated to be a = 0.78 nm, which is similar to the best reported calculated jitter in optimized heat-assisted recording media.

  2. Identification of chromosomal errors in human preimplantation embryos with oligonucleotide DNA microarray.

    Directory of Open Access Journals (Sweden)

    Lifeng Liang

    Full Text Available A previous study comparing the performance of different platforms for DNA microarray found that the oligonucleotide (oligo) microarray platform containing 385K isothermal probes had the best performance when evaluating dosage sensitivity, precision, specificity, sensitivity and copy number variation border definition. Although the oligo microarray platform has been used in some research fields and clinics, it has not been used for aneuploidy screening in human embryos. The present study was designed to use this new microarray platform for preimplantation genetic screening in the human. A total of 383 blastocysts from 72 infertility patients with either advanced maternal age or with previous miscarriage were analyzed after biopsy and microarray. Euploid blastocysts were transferred to patients and clinical pregnancy and implantation rates were measured. Chromosomes in some aneuploid blastocysts were further analyzed by fluorescence in-situ hybridization (FISH) to evaluate the accuracy of the results. We found that most (58.1%) of the blastocysts had chromosomal abnormalities that included single or multiple gains and/or losses of chromosome(s), partial chromosome deletions and/or duplications in both euploid and aneuploid embryos. Transfer of normal euploid blastocysts in 34 cycles resulted in 58.8% clinical pregnancy and 54.4% implantation rates. Examination of abnormal blastocysts by FISH showed that all embryos had matching results comparing microarray and FISH analysis. The present study indicates that oligo microarray conducted with a higher resolution and a greater number of probes is able to detect not only aneuploidy, but also minor chromosomal abnormalities, such as partial chromosome deletion and/or duplication in human embryos. Preimplantation genetic screening of aneuploidy by DNA microarray is an advanced technology used to select embryos for transfer and improved embryo implantation can be obtained after transfer of the screened normal

  3. Age-dependent recombination rates in human pedigrees.

    Directory of Open Access Journals (Sweden)

    Julie Hussin

    2011-09-01

    Full Text Available In humans, chromosome-number abnormalities have been associated with altered recombination and increased maternal age. Therefore, age-related effects on recombination are of major importance, especially in relation to the mechanisms involved in human trisomies. Here, we examine the relationship between maternal age and recombination rate in humans. We localized crossovers at high resolution by using over 600,000 markers genotyped in a panel of 69 French-Canadian pedigrees, revealing recombination events in 195 maternal meioses. Overall, we observed the general patterns of variation in fine-scale recombination rates previously reported in humans. However, we make the first observation of a significant decrease in recombination rates with advancing maternal age in humans, likely driven by chromosome-specific effects. The effect appears to be localized in the middle section of chromosomal arms and near subtelomeric regions. We postulate that, for some chromosomes, protection against non-disjunction provided by recombination becomes less efficient with advancing maternal age, which can be partly responsible for the higher rates of aneuploidy in older women. We propose a model that reconciles our findings with reported associations between maternal age and recombination in cases of trisomies.

  4. Error rate of multi-level rapid prototyping trajectories for pedicle screw placement in lumbar and sacral spine

    Institute of Scientific and Technical Information of China (English)

    Matjaz Merc; Igor Drstvensek; Matjaz Vogrin; Tomaz Brajlih; Tomaz Friedrich; Gregor Recnik

    2014-01-01

    Objective: Free-hand pedicle screw placement has a high incidence of pedicle perforation, which can be reduced with fluoroscopy, navigation, or an alternative rapid prototyping drill guide template. In our study, the error rate of multi-level templates for pedicle screw placement in the lumbar and sacral regions was evaluated. Methods: A case series study was performed on 11 patients. Seventy-two screws were implanted using multi-level drill guide templates manufactured with selective laser sintering. According to the optimal screw direction defined preoperatively, an analysis of screw misplacement was performed. Displacement, deviation, and screw length difference were measured. The learning curve was also estimated. Results: Twelve screws (17%) were placed more than 3.125 mm out of their optimal position in the centre of the pedicle. The tips of 16 screws (22%) were misplaced more than 6.25 mm out of the predicted optimal position. According to our predefined goal, 19 screws (26%) were implanted inaccurately. In 10 cases the screw length was selected incorrectly: 1 screw (1%) was too long and 9 (13%) were too short. No clinical signs of neurovascular lesion were observed. The learning curve was insignificant (P=0.129). Conclusion: In our study, the procedure of manufacturing and applying multi-level drill guide templates has a 26% chance of screw misplacement. However, that rate does not coincide with the incidence of pedicle perforation and neurovascular injury. These facts, along with a comparison to comparable studies, make it possible to conclude that multi-level templates are satisfactorily accurate and allow precise screw placement with a clinically irrelevant error factor. Therefore, templates could potentially represent a useful tool for routine pedicle screw placement.

  5. Accurate Time-Dependent Traveling-Wave Tube Model Developed for Computational Bit-Error-Rate Testing

    Science.gov (United States)

    Kory, Carol L.

    2001-01-01

    prohibitively expensive, as it would require manufacturing numerous amplifiers, in addition to acquiring the required digital hardware. As an alternative, the time-domain TWT interaction model developed here provides the capability to establish a computational test bench where ISI or bit error rate can be simulated as a function of TWT operating parameters and component geometries. Intermodulation products, harmonic generation, and backward waves can also be monitored with the model for similar correlations. The advancements in computational capabilities and corresponding potential improvements in TWT performance may prove to be the enabling technologies for realizing unprecedented data rates for near real time transmission of the increasingly larger volumes of data demanded by planned commercial and Government satellite communications applications. This work is in support of the Cross Enterprise Technology Development Program in Headquarters' Advanced Technology & Mission Studies Division and the Air Force Office of Scientific Research Small Business Technology Transfer programs.

  6. Medication Errors

    Science.gov (United States)

  7. The effect of speaking rate on serial-order sound-level errors in normal healthy controls and persons with aphasia.

    Science.gov (United States)

    Fossett, Tepanta R D; McNeil, Malcolm R; Pratt, Sheila R; Tompkins, Connie A; Shuster, Linda I

    Although many speech errors can be generated at either a linguistic or a motoric level of production, phonetically well-formed sound-level serial-order errors are generally assumed to result from disruption of phonologic encoding (PE) processes. An influential model of PE (Dell, 1986; Dell, Burger & Svec, 1997) predicts that speaking rate should affect the relative proportion of these serial-order sound errors (anticipations, perseverations, exchanges). These predictions have been extended to, and have special relevance for, persons with aphasia (PWA) because of the increased frequency with which speech errors occur and because their localization within the functional linguistic architecture may help in diagnosis and treatment. Supporting evidence regarding the effect of speaking rate on phonological encoding has been provided by studies using young normal language (NL) speakers and computer simulations. Limited data exist for older NL users, and no group data exist for PWA. This study tested the phonologic encoding properties of Dell's model of speech production (Dell, 1986; Dell et al., 1997), which predicts that increasing speaking rate affects the relative proportion of serial-order sound errors (i.e., anticipations, perseverations, and exchanges). The effects of speech rate on the error ratios of anticipation/exchange (AE), anticipation/perseveration (AP) and vocal reaction time (VRT) were examined in 16 normal healthy controls (NHC) and 16 PWA without concomitant motor speech disorders. The participants were recorded performing a phonologically challenging (tongue twister) speech production task at their typical rate and at two faster speaking rates. A significant effect of increased rate was obtained for the AP but not the AE ratio. Significant effects of group and rate were obtained for VRT. Although the significant effect of rate for the AP ratio provided evidence that changes in speaking rate did affect PE, the results failed to support the model-derived predictions

  8. An error management system in a veterinary clinical laboratory.

    Science.gov (United States)

    Hooijberg, Emma; Leidinger, Ernst; Freeman, Kathleen P

    2012-05-01

    Error recording and management is an integral part of a clinical laboratory quality management system. Analysis and review of recorded errors lead to corrective and preventive actions through modification of existing processes and, ultimately, to quality improvement. Laboratory errors can be divided into preanalytical, analytical, and postanalytical errors depending on where in the laboratory cycle the errors occur. The purpose of the current report is to introduce an error management system in use in a veterinary diagnostic laboratory as well as to examine the amount and types of error recorded during the 8-year period from 2003 to 2010. Annual error reports generated during this period by the error recording system were reviewed, and annual error rates were calculated. In addition, errors were divided into preanalytical, analytical, postanalytical, and "other" categories, and their frequency was examined. Data were further compared to that available from human diagnostic laboratories. Finally, sigma metrics were calculated for the various error categories. Annual error rates per total number of samples ranged from 1.3% in 2003 to 0.7% in 2010. Preanalytical errors ranged from 52% to 77%, analytical from 4% to 14%, postanalytical from 9% to 21%, and other error from 6% to 19% of total errors. Sigma metrics ranged from 4.1 to 4.7. All data were comparable to that reported in human clinical laboratories. The incremental annual reduction of error shows that use of an error management system led to quality improvement.
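
    Sigma metrics of the kind reported above are commonly derived from an error (defect) rate by converting it to a normal-quantile z-value and adding the conventional 1.5-sigma long-term shift. The sketch below shows that generic convention; the report's exact calculation may differ in detail.

```python
# Convert an observed error rate into a sigma metric using the common
# convention sigma = z(1 - p) + 1.5, where the 1.5 accounts for long-term
# process shift. Illustrative only; the report's exact method may differ.
from scipy.stats import norm

def sigma_metric(error_rate, shift=1.5):
    return norm.ppf(1.0 - error_rate) + shift

for p in (0.013, 0.007):           # annual error rates quoted in the abstract (1.3%, 0.7%)
    print(f"error rate {p:.1%} -> sigma ~ {sigma_metric(p):.2f}")
```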

  9. Cost effectiveness of a pharmacist-led information technology intervention for reducing rates of clinically important errors in medicines management in general practices (PINCER).

    Science.gov (United States)

    Elliott, Rachel A; Putman, Koen D; Franklin, Matthew; Annemans, Lieven; Verhaeghe, Nick; Eden, Martin; Hayre, Jasdeep; Rodgers, Sarah; Sheikh, Aziz; Avery, Anthony J

    2014-06-01

    We recently showed that a pharmacist-led information technology-based intervention (PINCER) was significantly more effective in reducing medication errors in general practices than providing simple feedback on errors, with a cost per error avoided of £79 (US$131). We aimed to estimate the cost effectiveness of the PINCER intervention by combining its effectiveness in error reduction and its intervention costs with the effect of the individual errors on patient outcomes and healthcare costs, in order to estimate the effect on overall costs and QALYs. We developed Markov models for each of six medication errors targeted by PINCER. Clinical event probabilities, treatment pathways, resource use and costs were extracted from the literature and costing tariffs. A composite probabilistic model combined the patient-level error models with practice-level error rates and intervention costs from the trial. Cost per extra QALY and cost-effectiveness acceptability curves were generated from the perspective of NHS England, with a 5-year time horizon. The PINCER intervention generated £2,679 less cost and 0.81 more QALYs per practice [incremental cost-effectiveness ratio (ICER): -£3,037 per QALY] in the deterministic analysis. In the probabilistic analysis, PINCER generated 0.001 extra QALYs per practice compared with simple feedback, at £4.20 less per practice. Despite this extremely small set of differences in costs and outcomes, PINCER dominated simple feedback with a mean ICER of -£3,936 (standard error £2,970). At a ceiling 'willingness-to-pay' of £20,000/QALY, PINCER reaches a 59% probability of being cost effective. PINCER produced marginal health gain at slightly reduced overall cost. Results are uncertain due to the poor quality of the data used to inform the effect of avoiding errors.
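
    For readers unfamiliar with the metric, the incremental cost-effectiveness ratio is simply the incremental cost divided by the incremental QALYs. Using the rounded deterministic figures quoted above gives a value close to the reported -£3,037 per QALY; the small difference presumably reflects rounding of the published summary figures.

```python
# ICER = (cost_intervention - cost_comparator) / (QALY_intervention - QALY_comparator).
# Using the rounded deterministic figures from the abstract: PINCER cost
# £2,679 less and produced 0.81 more QALYs per practice than simple feedback.
delta_cost = -2679.0   # GBP per practice (negative: PINCER is cheaper)
delta_qaly = 0.81      # QALYs per practice
icer = delta_cost / delta_qaly
print(f"ICER ~ £{icer:,.0f} per QALY")   # ~ -£3,307; the abstract reports -£3,037 from unrounded inputs
```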

  10. Scaling Behaviour and Memory in Heart Rate of Healthy Human

    Institute of Scientific and Technical Information of China (English)

    CAI Shi-Min; PENG Hu; YANG Hui-Jie; ZHOU Tao; ZHOU Pei-Ling; WANG Bing-Hong

    2007-01-01

    We investigate a set of complex heart rate time series from healthy humans in different behaviour states with detrended fluctuation analysis and the diffusion entropy (DE) method. It is proposed that the scaling properties are influenced by the behaviour states. The memory detected by DE exhibits approximately the same pattern after a detrending procedure. Both demonstrate strong long-range correlations in heart rate. These findings may be helpful for understanding the underlying dynamical evolution process in the heart rate control system, as well as for modelling the cardiac dynamic process.
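
    For reference, detrended fluctuation analysis boils down to integrating the series, detrending it in windows of varying size n, and reading the scaling exponent off the slope of log F(n) versus log n. The sketch below is a minimal illustration on surrogate data, not the paper's analysis pipeline.

```python
# Minimal detrended fluctuation analysis (DFA) sketch: the scaling exponent is
# the slope of log F(n) versus log n. Illustrative implementation only; the
# paper's analysis (and its diffusion-entropy method) involves further details.
import numpy as np

def dfa(x, window_sizes):
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                 # integrated (profile) series
    fluctuations = []
    for n in window_sizes:
        n_windows = len(y) // n
        segs = y[: n_windows * n].reshape(n_windows, n)
        t = np.arange(n)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)        # local linear detrending
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    alpha = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)[0]
    return alpha

rr = np.random.default_rng(2).normal(0.8, 0.05, 4096)   # surrogate RR intervals (white noise)
print("DFA exponent ~", dfa(rr, window_sizes=[8, 16, 32, 64, 128, 256]))  # ~0.5 for white noise
```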

  11. Equivalent dose rate by muons to the human body.

    Science.gov (United States)

    Băcioiu, I

    2011-11-01

    In this paper, the relative sensitivity of different tissues of the human body to muon cosmic radiation at ground level has been studied. The aim of this paper was to provide information on the equivalent dose rates received by the human body from atmospheric muons at ground level. The calculated effective dose rate from atmospheric muons, added to the natural annual background radiation dose at ground level, over the cosmic-ray muon momentum interval 0.2-120.0 GeV/c, is about 2.106±0.001 mSv/y, which is insignificant in comparison with the doses at the top of the atmosphere.

  12. Human Reliability Method Analysis Based on Human Error Correcting Ability

    Institute of Scientific and Technical Information of China (English)

    陈炉云; 张裕芳

    2011-01-01

    Based on the time-sequence and error-correcting characteristics of human operator behaviour in man-machine systems, and combined with an analysis of the key performance shaping factors, the human reliability of a vessel chamber is investigated. Using the time-sequence and error-correction parameters of human error analysis, an operator behaviour shaping model for the man-machine system and a human error event tree are proposed. Through the analysis of error-correcting ability, the quantitative model and the allowance theory of human reliability analysis are discussed. Finally, taking the monitoring task at the operation desk of a vessel chamber as an example, a human reliability analysis is conducted to quantitatively assess the mission reliability of the operator.
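
    The core quantitative idea, that an operator's chance of detecting and correcting an initial error reduces the effective human error probability, can be illustrated with a very small sketch. The independence assumption and the parameter values below are illustrative and are not taken from the paper, whose event-tree model is more detailed.

```python
# Illustrative only: effective human error probability when an initial error
# can still be detected and corrected before it has consequences.
# Assumes detection and correction are independent of the initial error,
# which is a simplification relative to the paper's event-tree model.
def effective_hep(p_error, p_detect, p_correct_given_detect):
    p_recovery = p_detect * p_correct_given_detect
    return p_error * (1.0 - p_recovery)

print(effective_hep(p_error=1e-2, p_detect=0.9, p_correct_given_detect=0.8))  # 2.8e-3
```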

  13. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions.

    Science.gov (United States)

    Kujala, Miiamaaria V; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to the conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans but not dogs higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expression in a similar manner, and the perception of both species is influenced by psychological factors of the evaluators. Especially empathy affects both the speed and intensity of rating dogs' emotional facial expressions.

  14. The analysis of human error as causes in the maintenance of machines: a case study in mining companies

    Directory of Open Access Journals (Sweden)

    Kovacevic, Srdja

    2016-12-01

    Full Text Available This paper describes the two-step method used to analyse the factors and aspects influencing human error during the maintenance of mining machines. The first step is the cause-effect analysis, supported by brainstorming, in which five factors and 21 aspects are identified. During the second step, the group fuzzy analytic hierarchy process is used to rank the identified factors and aspects. A case study was done on mining companies in Serbia. The key aspects are ranked according to an analysis that included experts who assess risks in mining companies (a maintenance engineer, a technologist, an ergonomist, a psychologist, and an organisational scientist). Failure to follow technical maintenance instructions, poor organisation of the training process, inadequate diagnostic equipment, and a lack of understanding of the work process are identified as the most important causes of human error.

  15. The application of SHERPA (Systematic Human Error Reduction and Prediction Approach) in the development of compensatory cognitive rehabilitation strategies for stroke patients with left and right brain damage.

    Science.gov (United States)

    Hughes, Charmayne M L; Baber, Chris; Bienkiewicz, Marta; Worthington, Andrew; Hazell, Alexa; Hermsdörfer, Joachim

    2015-01-01

    Approximately 33% of stroke patients have difficulty performing activities of daily living, often committing errors during the planning and execution of such activities. The objective of this study was to evaluate the ability of the human error identification (HEI) technique SHERPA (Systematic Human Error Reduction and Prediction Approach) to predict errors during the performance of daily activities in stroke patients with left and right hemisphere lesions. Using SHERPA we successfully predicted 36 of the 38 observed errors, with analysis indicating that the proportion of predicted and observed errors was similar for all sub-tasks and severity levels. HEI results were used to develop compensatory cognitive strategies that clinicians could employ to reduce or prevent errors from occurring. This study provides evidence for the reliability and validity of SHERPA in the design of cognitive rehabilitation strategies in stroke populations.

  16. Prospects for DNA methods to measure human heritable mutation rates

    Energy Technology Data Exchange (ETDEWEB)

    Mendelsohn, M.L.

    1985-06-14

    A workshop cosponsored by ICPEMC and the US Department of Energy was held in Alta, Utah, December 9-13, 1984 to examine the extent to which DNA-oriented methods might provide new approaches to the important but intractable problem of measuring mutation rates in control and exposed human populations. The workshop identified and analyzed six DNA methods for detection of human heritable mutation, including several created at the meeting, and concluded that none of the methods combine sufficient feasibility and efficiency to be recommended for general application. 8 refs.

  17. N-dimensional measurement-device-independent quantum key distribution with N + 1 un-characterized sources: zero quantum-bit-error-rate case.

    Science.gov (United States)

    Hwang, Won-Young; Su, Hong-Yi; Bae, Joonwoo

    2016-01-01

    We study an N-dimensional measurement-device-independent quantum-key-distribution protocol in which one checking state is used. Assuming only that the checking state is a superposition of the other N sources, we show that the protocol is secure in the zero quantum-bit-error-rate case, suggesting the feasibility of the protocol. The method may be applied in other quantum information processing tasks.

  18. Multifractal heart rate dynamics in human cardiovascular model

    Science.gov (United States)

    Kotani, Kiyoshi; Takamasu, Kiyoshi; Safonov, Leonid; Yamamoto, Yoshiharu

    2003-05-01

    Human cardiovascular and/or cardio-respiratory systems are shown to exhibit both multifractal and synchronous dynamics, and we recently developed a nonlinear, physiologically plausible model for the synchronization between heartbeat and respiration (Kotani, et al. Phys. Rev. E 65: 051923, 2002). By using the same model, we now show the multifractality in the heart rate dynamics. We find that beat-to-beat monofractal noise (fractional Brownian motion) added to the brain stem cardiovascular areas results in significantly broader singularity spectra for heart rate through interactions between sympathetic and parasympathetic nervous systems. We conclude that the model proposed here would be useful in studying the complex cardiovascular and/or cardio- respiratory dynamics in humans.

  19. Internal consistency, test-retest reliability and measurement error of the self-report version of the social skills rating system in a sample of Australian adolescents.

    Directory of Open Access Journals (Sweden)

    Sharmila Vaz

    Full Text Available The social skills rating system (SSRS) is used to assess social skills and competence in children and adolescents. While its characteristics based on United States (US) samples are published, corresponding Australian figures are unavailable. Using a 4-week retest design, we examined the internal consistency, retest reliability and measurement error (ME) of the SSRS secondary student form (SSF) in a sample of Year 7 students (N = 187) from five randomly selected public schools in Perth, Western Australia. Internal consistency (IC) of the total scale and most subscale scores (except empathy) on the frequency rating scale was adequate to permit independent use. On the importance rating scale, most IC estimates for girls fell below the benchmark. Test-retest estimates of the total scale and subscales were insufficient to permit reliable use. ME of the total scale score (frequency rating) for boys was equivalent to the US estimate, while that for girls was lower than the US error. ME of the total scale score (importance rating) was larger than the error using the frequency rating scale. The study findings support the idea of using multiple informants (e.g. teacher and parent reports, not just the student), as recommended in the manual. Future research needs to substantiate the clinical meaningfulness of the MEs calculated in this study by corroborating them against the respective Minimum Clinically Important Difference (MCID).

  20. Variation in human recombination rates and its genetic determinants.

    Directory of Open Access Journals (Sweden)

    Adi Fledel-Alon

    Full Text Available BACKGROUND: Despite the fundamental role of crossing-over in the pairing and segregation of chromosomes during human meiosis, the rates and placements of events vary markedly among individuals. Characterizing this variation and identifying its determinants are essential steps in our understanding of the human recombination process and its evolution. STUDY DESIGN/RESULTS: Using three large sets of European-American pedigrees, we examined variation in five recombination phenotypes that capture distinct aspects of crossing-over patterns. We found that the mean recombination rate in males and females and the historical hotspot usage are significantly heritable and are uncorrelated with one another. We then conducted a genome-wide association study in order to identify loci that influence them. We replicated associations of RNF212 with the mean rate in males and in females as well as the association of Inversion 17q21.31 with the female mean rate. We also replicated the association of PRDM9 with historical hotspot usage, finding that it explains most of the genetic variance in this phenotype. In addition, we identified a set of new candidate regions for further validation. SIGNIFICANCE: These findings suggest that variation at broad and fine scales is largely separable and that, beyond three known loci, there is no evidence for common variation with large effects on recombination phenotypes.

  1. Search strategy has influenced the discovery rate of human viruses.

    Science.gov (United States)

    Rosenberg, Ronald; Johansson, Michael A; Powers, Ann M; Miller, Barry R

    2013-08-20

    A widely held concern is that the pace of infectious disease emergence has been increasing. We have analyzed the rate of discovery of pathogenic viruses, the preeminent source of newly discovered causes of human disease, from 1897 through 2010. The rate was highest during 1950-1969, after which it moderated. This general picture masks two distinct trends: for arthropod-borne viruses, which comprised 39% of pathogenic viruses, the discovery rate peaked at three per year during 1960-1969, but subsequently fell nearly to zero by 1980; however, the rate of discovery of nonarboviruses remained stable at about two per year from 1950 through 2010. The period of highest arbovirus discovery coincided with a comprehensive program supported by The Rockefeller Foundation of isolating viruses from humans, animals, and arthropod vectors at field stations in Latin America, Africa, and India. The productivity of this strategy illustrates the importance of location, approach, long-term commitment, and sponsorship in the discovery of emerging pathogens.

  2. Bit error rate analysis of an X-ray communication system

    Institute of Scientific and Technical Information of China (English)

    王律强; 苏桐; 赵宝升; 盛立志; 刘永安; 刘舵

    2015-01-01

    X-ray communication, first introduced by Keith Gendreau in 2007, has the potential to compete with conventional communication methods, such as microwave and laser communication, in the space environment. As a result, a great deal of time and effort has been devoted to turning the initial idea into reality in recent years. Eventually, an X-ray communication demonstration system based on a grid-controlled X-ray source and a microchannel plate detector was able to deliver both audio and video information across a 6-meter vacuum tunnel. The question is how to evaluate this space X-ray demonstration system in a typical experimental way. The approach is to design a specific board to measure the relationship between bit error rate and emitting power at various communication distances. In addition, the data should be compared with the calculation and simulation results to assess the underlying theoretical model. The concept of using X-rays as signal carriers is confirmed by our first-generation X-ray communication demonstration system. Specifically, the method is to use a grid-controlled emission source as the transmitter while implementing a photon-counting detector, which can be regarded as an important orientation for future deep-space X-ray communication applications. As the key specification of any communication system, the bit error rate should be determined first. In addition, a theoretical analysis using a Poisson noise model has been carried out to support this novel communication concept. Previous experimental results indicated that the X-ray audio demonstration system requires a 10^-4 bit error rate level at a 25 kbps communication rate. The system bit error rate based on on-off keying (OOK) modulation is calculated and measured, and it agrees well with the theoretical calculation. Another point that should be taken into consideration is the emitting energy, which is the main restriction of the current X-ray communication system. The designed
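
    Under the Poisson photon-counting noise model mentioned above, a common textbook simplification for OOK is that a transmitted '1' is received in error only when zero signal photons are detected; with equiprobable bits, no background or dark counts, and a detect-at-least-one-photon threshold, the BER is 0.5*exp(-Ns) for a mean of Ns detected photons per '1' bit. This is a generic approximation, not necessarily the exact model used in the paper.

```python
# Idealized photon-counting OOK bit error rate under Poisson statistics:
# a "1" bit is missed only if zero photons are detected (probability exp(-Ns));
# "0" bits are never in error because background/dark counts are ignored here.
import math

def ook_poisson_ber(mean_photons_per_one_bit):
    return 0.5 * math.exp(-mean_photons_per_one_bit)

for ns in (5, 10, 15, 20):
    print(f"Ns = {ns:2d} photons/bit -> BER ~ {ook_poisson_ber(ns):.2e}")
```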

  3. Direct measurement of fluence rate in the human brain

    Science.gov (United States)

    Melnik, Ivan S.; Rusina, Tatyana V.; Denisov, Nikolay A.; Dets, Sergiy M.; Steiner, Rudolf W.; Rozumenko, Vladimir D.

    1996-01-01

    Fluence rate was measured in normal and cancerous (glioma) human brain samples using a multichannel detector. The detector consisted of 8 isotropic fiber probes positioned around a central irradiating probe. The detecting probes were spaced 0.5 mm apart along the central irradiating fiber. The bare ends of the detecting fibers were coupled to a photodiode array. He-Ne (633 nm) or Nd:YAG (1064 nm) lasers were coupled to the irradiating probe. Fluence rate was measured at each of the 8 points over a depth range of 5 mm. The measured mean penetration depths of 633 nm light were 0.70 mm, 0.50 mm and 0.40 mm for white matter, grey matter and glioma, respectively. For the Nd:YAG laser, the penetration depth was about 2.3 mm for both normal tissue and glioma. The multichannel computerized detector allows minimally invasive, real-time measurements of fluence rate in different tissues.
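
    In the simplest single-exponential picture, a penetration depth delta means the fluence rate falls off roughly as exp(-z/delta) with depth z, so the measured penetration depths translate directly into how quickly light is attenuated in each tissue. The sketch below uses that approximation, which ignores source geometry and scattering details.

```python
# Single-exponential attenuation sketch: fluence(z) ~ fluence(0) * exp(-z / delta).
# delta values (mm) for 633 nm are taken from the abstract; the exponential
# form itself is a simplification of light transport in tissue.
import math

penetration_depth_mm = {"white matter": 0.70, "grey matter": 0.50, "glioma": 0.40}
z = 2.0  # depth in mm

for tissue, delta in penetration_depth_mm.items():
    relative_fluence = math.exp(-z / delta)
    print(f"{tissue:12s}: {100 * relative_fluence:.2f}% of surface fluence at {z} mm")
```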

  4. Fast transcription rates of RNA polymerase II in human cells

    Science.gov (United States)

    Maiuri, Paolo; Knezevich, Anna; De Marco, Alex; Mazza, Davide; Kula, Anna; McNally, Jim G; Marcello, Alessandro

    2011-01-01

    Averaged estimates of RNA polymerase II (RNAPII) elongation rates in mammalian cells have been shown to range between 1.3 and 4.3 kb min−1. In this work, nascent RNAs from an integrated human immunodeficiency virus type 1-derived vector were detectable at the single living cell level by fluorescent RNA tagging. At steady state, a constant number of RNAs was measured corresponding to a minimal density of polymerases with negligible fluctuations over time. Recovery of fluorescence after photobleaching was complete within seconds, indicating a high rate of RNA biogenesis. The calculated transcription rate above 50 kb min−1 points towards a wide dynamic range of RNAPII velocities in living cells. PMID:22015688

  5. ATHEANA: "a technique for human error analysis" entering the implementation phase

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, J.; O'Hara, J.; Luckas, W. [Brookhaven National Lab., Upton, NY (United States)] [and others]

    1997-02-01

    Probabilistic Risk Assessment (PRA) has become an increasingly important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. The NRC recently published a final policy statement, SECY-95-126, encouraging the use of PRA in regulatory activities. Human reliability analysis (HRA), while a critical element of PRA, has limitations in the analysis of human actions in PRAs that have long been recognized as a constraint when using PRA. In fact, better integration of HRA into the PRA process has long been an NRC issue. Of particular concern has been the omission of errors of commission, those errors that are associated with inappropriate interventions by operators in operating systems. To address these concerns, the NRC identified the need to develop an improved HRA method, so that human reliability can be better represented and integrated into PRA modeling and quantification. The purpose of the Brookhaven National Laboratory (BNL) project, entitled 'Improved HRA Method Based on Operating Experience', is to develop a new method for HRA which is supported by the analysis of risk-significant operating experience. This approach will allow a more realistic assessment and representation of the human contribution to plant risk, and thereby increase the utility of PRA. The project's completed, ongoing, and future efforts fall into four phases: (1) Assessment phase (FY 92/93); (2) Analysis and Characterization phase (FY 93/94); (3) Development phase (FY 95/96); and (4) Implementation phase (FY 96/97, ongoing).

  6. Human Empathy, Personality and Experience Affect the Emotion Ratings of Dog and Human Facial Expressions

    OpenAIRE

    Kujala, Miiamaaria V.; Somppi, Sanni; Jokela, Markus; Vainio, Outi; Parkkonen, Lauri

    2017-01-01

    Facial expressions are important for humans in communicating emotions to conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions and which psychological factors influence people's perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggr...

  7. Distinguishing science from pseudoscience in school psychology: science and scientific thinking as safeguards against human error.

    Science.gov (United States)

    Lilienfeld, Scott O; Ammirati, Rachel; David, Michal

    2012-02-01

    Like many domains of professional psychology, school psychology continues to struggle with the problem of distinguishing scientific from pseudoscientific and otherwise questionable clinical practices. We review evidence for the scientist-practitioner gap in school psychology and provide a user-friendly primer on science and scientific thinking for school psychologists. Specifically, we (a) outline basic principles of scientific thinking, (b) delineate widespread cognitive errors that can contribute to belief in pseudoscientific practices within school psychology and allied professions, (c) provide a list of 10 key warning signs of pseudoscience, illustrated by contemporary examples from school psychology and allied disciplines, and (d) offer 10 user-friendly prescriptions designed to encourage scientific thinking among school psychology practitioners and researchers. We argue that scientific thinking, although fallible, is ultimately school psychologists' best safeguard against a host of errors in thinking.

  8. Effects of body mass index and step rate on pedometer error in a free-living environment.

    Science.gov (United States)

    Tyo, Brian M; Fitzhugh, Eugene C; Bassett, David R; John, Dinesh; Feito, Yuri; Thompson, Dixie L

    2011-02-01

    Pedometers could provide great insights into walking habits if they are found to be accurate for people of all weight categories. The purposes of this study were to determine whether the New Lifestyles NL-2000 (NL) and the Digi-Walker SW-200 (DW) yield daily step counts similar to those of the StepWatch 3 (SW) in a free-living environment, and to determine whether pedometer error is influenced by body mass index (BMI) and speed of walking. The SW served as the criterion because of its accuracy across a range of speeds and BMI categories. Slow walking was defined as ≤80 steps per minute. Fifty-six adults (mean ± SD: age = 32.7 ± 14.5 yr) wore the devices for 7 d. There were 20 normal weight, 18 overweight, and 18 obese participants. A two-way repeated-measures ANOVA was performed to determine whether BMI and device were related to the number of steps counted per day. Stepwise linear regressions were performed to determine what variables contributed to NL and DW error. Both the NL and the DW recorded fewer steps than the SW (P < 0.001). In the normal weight and overweight groups, error was similar for the DW and NL. In the obese group, the DW underestimated steps more than the NL (P < 0.01). DW error was positively related to BMI and percentage of slow steps, whereas NL error was linearly related to percentage of slow steps. A surprising finding was that many healthy, community-dwelling adults accumulated a large percentage of steps through slow walking. The NL is more accurate than the DW for obese individuals, and neither pedometer is accurate for people who walk slowly. Researchers and practitioners must weigh the strengths and limitations of step counters before making an informed decision about which device to use.

  9. Estimating the designated use attainment decision error rates of US Environmental Protection Agency's proposed numeric total phosphorus criteria for Florida, USA, colored lakes.

    Science.gov (United States)

    McLaughlin, Douglas B

    2012-01-01

    The utility of numeric nutrient criteria established for certain surface waters is likely to be affected by the uncertainty that exists in the presence of a causal link between nutrient stressor variables and designated use-related biological responses in those waters. This uncertainty can be difficult to characterize, interpret, and communicate to a broad audience of environmental stakeholders. The US Environmental Protection Agency (USEPA) has developed a systematic planning process to support a variety of environmental decisions, but this process is not generally applied to the development of national or state-level numeric nutrient criteria. This article describes a method for implementing such an approach and uses it to evaluate the numeric total P criteria recently proposed by USEPA for colored lakes in Florida, USA. An empirical, log-linear relationship between geometric mean concentrations of total P (a potential stressor variable) and chlorophyll a (a nutrient-related response variable) in these lakes, which is assumed to be causal in nature, forms the basis for the analysis. The use of the geometric mean total P concentration of a lake to correctly indicate designated use status, defined in terms of a 20 µg/L geometric mean chlorophyll a threshold, is evaluated. Rates of decision errors analogous to the Type I and Type II error rates familiar in hypothesis testing, and a 3rd error rate, E(ni), referred to as the nutrient criterion-based impairment error rate, are estimated. The results show that USEPA's proposed "baseline" and "modified" nutrient criteria approach, in which data on both total P and chlorophyll a may be considered in establishing numeric nutrient criteria for a given lake within a specified range, provides a means for balancing and minimizing designated use attainment decision errors.
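
    The decision error rates described above can be illustrated with a small Monte Carlo sketch built on a log-linear total P to chlorophyll a relationship. All coefficients, the candidate criterion value and the simulated lake population below are invented for illustration; they are not USEPA's fitted values or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (not USEPA's) log-linear stressor-response model:
# log10(chl-a) = a + b*log10(TP) + N(0, sigma)
a, b, sigma = -0.5, 1.0, 0.25          # assumed coefficients
chl_threshold = 20.0                    # ug/L designated-use chlorophyll a threshold
tp_criterion = 30.0                     # ug/L candidate numeric TP criterion

tp = rng.uniform(5, 100, size=100_000)  # hypothetical lake geometric-mean TP values
chl = 10 ** (a + b * np.log10(tp) + rng.normal(0, sigma, tp.size))

impaired = chl > chl_threshold          # "true" use status from the response variable
exceeds = tp > tp_criterion             # decision based on the TP criterion alone

type_i = np.mean(exceeds & ~impaired)   # criterion flags a lake that attains its use
type_ii = np.mean(~exceeds & impaired)  # criterion passes a lake that is impaired
print(f"Type I analog: {type_i:.3f}, Type II analog: {type_ii:.3f}")
```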

  10. El error en la práctica médica: una presencia ineludible Human error in medical practice: an unavoidable presence

    OpenAIRE

    Gladis Adriana Vélez Álvarez

    2006-01-01

    Making mistakes is a human characteristic and a learning mechanism, but it becomes a threat to humans themselves in settings such as aviation and medicine. Some data are presented on the frequency of error in medicine, its ubiquity and the circumstances that favor it, together with a reflection on how error has been confronted and why it is not discussed openly. It is proposed that the first step in learning from error is to accept it as an unavoidable presence...

  11. Analysis of Task Types and Error Types of the Human Actions Involved in the Human-related Unplanned Reactor Trip Events

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Park, Jin Kyun; Jung, Won Dea

    2008-02-15

    This report provides the task types and error types involved in the unplanned reactor trip events that occurred during 1986-2006. Events caused by the secondary system of the nuclear power plants amounted to 67%, and the remaining 33% were caused by the primary system. The contribution of the activities of the plant personnel was identified in the following order: corrective maintenance (25.7%), planned maintenance (22.8%), planned operation (19.8%), periodic preventive maintenance (14.9%), response to a transient (9.9%), and design/manufacturing/installation (9.9%). According to the analysis of error modes, the error modes of control failure (22.2%), wrong object (18.5%), omission (14.8%), wrong action (11.1%), and inadequate (8.3%) account for about 75% of all the unplanned trip events. The analysis of the cognitive functions involved showed that the planning function makes the highest contribution to the human actions leading to unplanned reactor trips, followed by the observation function (23.4%), the execution function (17.8%), and the interpretation function (10.3%). The results of this report are to be used as important bases for development of error reduction measures or development of an error mode prediction system for the test and maintenance tasks in nuclear power plants.

  12. Analysis of Human Errors in Industrial Incidents and Accidents for Improvement of Work Safety

    DEFF Research Database (Denmark)

    Leplat, J.; Rasmussen, Jens

    1984-01-01

    ...recommendations, the proposed method identifies very explicit countermeasures. Improvements require a change in human decisions during equipment design, work planning, or the execution itself. The use of a model of human behavior that distinguishes between automated skill-based behavior, rule-based 'know-how' and knowledge-based analysis is proposed for identification of the human decisions which are most sensitive to improvements...

  13. Throughput Estimation Method in Burst ACK Scheme for Optimizing Frame Size and Burst Frame Number Appropriate to SNR-Related Error Rate

    Science.gov (United States)

    Ohteru, Shoko; Kishine, Keiji

    The Burst ACK scheme enhances effective throughput by reducing ACK overhead when a transmitter sends multiple data frames sequentially to a destination. IEEE 802.11e is one such example. The size of the data frame body and the number of burst data frames are important burst transmission parameters that affect throughput. The larger the burst transmission parameters are, the better the throughput under error-free conditions becomes. However, a large data frame can reduce throughput under error-prone conditions caused by signal-to-noise ratio (SNR) deterioration. If the throughput can be calculated from the burst transmission parameters and error rate, the appropriate ranges of the burst transmission parameters can be narrowed down, and the necessary buffer size for temporarily storing transmitted or received data can be estimated. In this paper, we present a method that features a simple algorithm for estimating the effective throughput from the burst transmission parameters and error rate. The calculated throughput values agree well with the measured ones for actual wireless boards based on the IEEE 802.11-based original MAC protocol. We also calculate throughput values for larger values of the burst transmission parameters outside the assignable values of the wireless boards and find the appropriate values of the burst transmission parameters.
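
    A minimal version of the idea, estimating throughput from frame size, burst length and bit error rate, can be sketched as below. The timing constants, PHY rate and the assumption of independent bit errors are illustrative simplifications, not the paper's estimation algorithm.

```python
def burst_throughput(ber, payload_bytes, burst_frames,
                     phy_rate=54e6, overhead_us=100.0, ack_us=50.0):
    """Rough effective-throughput model for a Burst ACK scheme.

    Assumes independent bit errors (a frame survives with (1 - BER)^bits),
    lost frames are simply not delivered, and one block ACK closes the burst.
    The PHY rate and per-frame/ACK overheads are illustrative constants only.
    """
    bits = payload_bytes * 8
    p_ok = (1.0 - ber) ** bits
    t_frame = bits / phy_rate + overhead_us * 1e-6
    t_burst = burst_frames * t_frame + ack_us * 1e-6
    return burst_frames * bits * p_ok / t_burst   # delivered bit/s

for size in (500, 1500, 4000):                     # candidate frame-body sizes (bytes)
    print(size, f"{burst_throughput(1e-5, size, burst_frames=8) / 1e6:.1f} Mbit/s")
```

    Sweeping the frame size and burst length this way shows the trade-off the abstract describes: larger parameters help until the per-frame loss probability starts to dominate.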

  14. Onboard Sensor Data Qualification in Human-Rated Launch Vehicles

    Science.gov (United States)

    Wong, Edmond; Melcher, Kevin J.; Maul, William A.; Chicatelli, Amy K.; Sowers, Thomas S.; Fulton, Christopher; Bickford, Randall

    2012-01-01

    The avionics system software for human-rated launch vehicles requires an implementation approach that is robust to failures, especially the failure of sensors used to monitor vehicle conditions that might result in an abort determination. Sensor measurements provide the basis for operational decisions on human-rated launch vehicles. This data is often used to assess the health of system or subsystem components, to identify failures, and to take corrective action. An incorrect conclusion and/or response may result if the sensor itself provides faulty data, or if the data provided by the sensor has been corrupted. Operational decisions based on faulty sensor data have the potential to be catastrophic, resulting in loss of mission or loss of crew. To prevent these latter situations from occurring, a Modular Architecture and Generalized Methodology for Sensor Data Qualification in Human-rated Launch Vehicles has been developed. Sensor Data Qualification (SDQ) is a set of algorithms that can be implemented in onboard flight software and used to qualify data obtained from flight-critical sensors prior to the data being used by other flight software algorithms. Qualified data has been analyzed by SDQ and is determined to be a true representation of the sensed system state; that is, the sensor data is determined not to be corrupted by sensor faults or signal transmission faults. Sensor data can become corrupted by faults at any point in the signal path between the sensor and the flight computer. Qualifying the sensor data has the benefit of ensuring that erroneous data is identified and flagged before otherwise being used for operational decisions, thus increasing confidence in the response of the other flight software processes using the qualified data, and decreasing the probability of false alarms or missed detections.

  15. Minimising human error in malaria rapid diagnosis: clarity of written instructions and health worker performance.

    Science.gov (United States)

    Rennie, Waverly; Phetsouvanh, Rattanaxay; Lupisan, Socorro; Vanisaveth, Viengsay; Hongvanthong, Bouasy; Phompida, Samlane; Alday, Portia; Fulache, Mila; Lumagui, Richard; Jorgensen, Pernille; Bell, David; Harvey, Steven

    2007-01-01

    The usefulness of rapid diagnostic tests (RDT) in malaria case management depends on the accuracy of the diagnoses they provide. Despite their apparent simplicity, previous studies indicate that RDT accuracy is highly user-dependent. As malaria RDTs will frequently be used in remote areas with little supervision or support, minimising mistakes is crucial. This paper describes the development of new instructions (job aids) to improve health worker performance, based on observations of common errors made by remote health workers and villagers in preparing and interpreting RDTs in the Philippines and Laos. Initial preparation using the instructions provided by the manufacturer was poor, but improved significantly with the job aids (e.g. correct use of both the dipstick and the cassette increased in the Philippines by 17%). However, mistakes in preparation remained commonplace, especially for dipstick RDTs, as did mistakes in interpretation of results. A short orientation on correct use and interpretation further improved accuracy, from 70% to 80%. The results indicate that apparently simple diagnostic tests can be poorly performed and interpreted, but provision of clear, simple instructions can reduce these errors. Preparation of appropriate instructions and training, as well as monitoring of user behaviour, are an essential part of rapid test implementation.

  16. Human errors: their psychophysical bases and the Proprioceptive Diagnosis of Temperament and Character (DP-TC) as a tool for measuring.

    Directory of Open Access Journals (Sweden)

    Tous Ral J.M.

    2014-07-01

    Human error is commonly differentiated into three types: errors in perception, errors in decision, and errors in sensation. This analysis is based on classical psychophysics (Fechner, 1860) and describes the errors of detection and perception. Decision-making errors are evaluated in terms of the theory of signal detection (McNicholson, 1974), and errors of sensation or sensitivity are evaluated in terms of proprioceptive information (van Beers, 2001). Each of these stages developed its own method of evaluation, which has influenced the development of ergonomics in the case of errors in perception, and the verbal assessment of personality (stress, impulsiveness, burnout, etc.) in the case of decision-making errors. Here we present the method we have developed, the Proprioceptive Diagnosis of Temperament and Character (DP-TC) test, for the specific assessment of errors of perception or expressivity, which are based on fine motor precision performance. Each of the described error types is interdependent with the others, in such a manner that observable stress in behaviour may be caused by: inadequate performance of a task due to the perception of the person (i.e. from right to left for a right-handed person); performing a task that requires attentive decision-making too hastily; or undertaking a task that does not correspond to the prevailing disposition of the person.

  17. Identification errors in pathology and laboratory medicine.

    Science.gov (United States)

    Valenstein, Paul N; Sirota, Ronald L

    2004-12-01

    Identification errors involve misidentification of a patient or a specimen. Either has the potential to cause patients harm. Identification errors can occur during any part of the test cycle; however, most occur in the preanalytic phase. Patient identification errors in transfusion medicine occur in 0.05% of specimens; for general laboratory specimens the rate is much higher, around 1%. Anatomic pathology, which involves multiple specimen transfers and hand-offs, may have the highest identification error rate. Certain unavoidable cognitive failures lead to identification errors. Technology, ranging from bar-coded specimen labels to radio frequency identification tags, can be incorporated into protective systems that have the potential to detect and correct human error and reduce the frequency with which patients and specimens are misidentified.

  18. Research on the Mechanism of Human Error in Ship Building

    Institute of Scientific and Technical Information of China (English)

    石小岗; 周宏; 莫一峰

    2014-01-01

    The complexity of the man-machine-environment system in ship building results in a high probability of human error during construction. Preventing and reducing human error and improving human reliability have become key factors in ensuring safe shipbuilding. This paper studies the characteristics of human error, classifies human errors in the building process according to human cognitive behavior, summarizes the factors that influence human error in shipbuilding, and puts forward effective measures to prevent it.

  19. The Measure of Human Error: Direct and Indirect Performance Shaping Factors

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; Candice D. Griffith; Jeffrey C. Joe

    2007-08-01

    The goal of performance shaping factors (PSFs) is to provide measures to account for human performance. PSFs fall into two categories—direct and indirect measures of human performance. While some PSFs such as “time to complete a task” are directly measurable, other PSFs, such as “fitness for duty,” can only be measured indirectly through other measures and PSFs, such as through fatigue measures. This paper explores the role of direct and indirect measures in human reliability analysis (HRA) and the implications that measurement theory has on analyses and applications using PSFs. The paper concludes with suggestions for maximizing the reliability and validity of PSFs.

  20. Low dose rate gamma ray induced loss and data error rate of multimode silica fibre links; Affaiblissement et taux d'erreur de transmission de fibres optiques multimodes soumises a une irradiation gamma a faible debit de dose

    Energy Technology Data Exchange (ETDEWEB)

    Breuze, G.; Fanet, H.; Serre, J. [CEA Centre d'Etudes de Saclay, 91 - Gif-sur-Yvette (France). Dept. d'Electronique et d'Instrumentation Nucleaire]; Colas, D.; Garnero, E.; Hamet, T. [Electricite de France (EDF), 77 - Ecuelles (France)

    1993-12-31

    Fiber optic data transmission from numerous multiplexed sensors is potentially attractive for nuclear plant applications. The behaviour of multimode silica fibre during steady-state gamma ray exposure was studied in a joint programme between LETI CE/SACLAY and EDF Renardieres: transmitted optical power and bit error rate were measured on a 100 m optical fiber.

  1. Characterization of semiconductor-laser phase noise and estimation of bit-error rate performance with low-speed offline digital coherent receivers.

    Science.gov (United States)

    Kikuchi, Kazuro

    2012-02-27

    We develop a systematic method for characterizing semiconductor-laser phase noise, using a low-speed offline digital coherent receiver. The field spectrum, the FM-noise spectrum, and the phase-error variance measured with such a receiver can completely describe the phase-noise characteristics of lasers under test. The sampling rate of the digital coherent receiver should be much higher than the phase-fluctuation speed; however, 1 GS/s is high enough for most single-mode semiconductor lasers. In addition to such phase-noise characterization, by interpolating the data taken at 1.25 GS/s to form a data stream at 10 GS/s, we can predict the bit-error rate (BER) performance of multi-level modulated optical signals at 10 Gsymbol/s. The BER degradation due to the phase noise is well explained by the result of the phase-noise measurements.
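
    The quantities mentioned above (instantaneous phase, FM-noise spectrum and phase-error variance) can be extracted from digitized complex samples roughly as sketched below. The Wiener phase-noise test signal, the periodogram normalization and the linear detrend used to remove a residual frequency offset are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def phase_noise_stats(samples, fs):
    """Estimate an FM-noise periodogram and the phase-error variance from
    complex baseband samples of a beat signal. A linear detrend removes a
    residual carrier frequency offset before the variance is computed."""
    phase = np.unwrap(np.angle(samples))
    freq_dev = np.diff(phase) * fs / (2 * np.pi)              # instantaneous frequency deviation (Hz)
    fm_psd = np.abs(np.fft.rfft(freq_dev)) ** 2 / (fs * freq_dev.size)
    n = np.arange(phase.size)
    detrended = phase - np.polyval(np.polyfit(n, phase, 1), n)
    return fm_psd, np.var(detrended)

# Hypothetical laser with a 100 kHz Lorentzian linewidth (Wiener phase noise), 1 GS/s sampling
fs, n_samp, linewidth = 1e9, 2**16, 100e3
rng = np.random.default_rng(1)
phase = np.cumsum(rng.normal(0.0, np.sqrt(2 * np.pi * linewidth / fs), n_samp))
field = np.exp(1j * phase)
_, phase_err_var = phase_noise_stats(field, fs)
print(f"phase-error variance: {phase_err_var:.3e} rad^2")
```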

  2. [Survey in hospitals. Nursing errors, error culture and error management].

    Science.gov (United States)

    Habermann, Monika; Cramer, Henning

    2010-09-01

    Knowledge on errors is important to design safe nursing practice and its framework. This article presents results of a survey on this topic, including data of a representative sample of 724 nurses from 30 German hospitals. Participants predominantly remembered medication errors. Structural and organizational factors were rated as most important causes of errors. Reporting rates were considered low; this was explained by organizational barriers. Nurses in large part expressed having suffered from mental problems after error events. Nurses' perception focussing on medication errors seems to be influenced by current discussions which are mainly medication-related. This priority should be revised. Hospitals' risk management should concentrate on organizational deficits and positive error cultures. Decision makers are requested to tackle structural problems such as staff shortage.

  3. Leveraging Distant Relatedness to Quantify Human Mutation and Gene-Conversion Rates.

    Science.gov (United States)

    Palamara, Pier Francesco; Francioli, Laurent C; Wilton, Peter R; Genovese, Giulio; Gusev, Alexander; Finucane, Hilary K; Sankararaman, Sriram; Sunyaev, Shamil R; de Bakker, Paul I W; Wakeley, John; Pe'er, Itsik; Price, Alkes L

    2015-12-01

    The rate at which human genomes mutate is a central biological parameter that has many implications for our ability to understand demographic and evolutionary phenomena. We present a method for inferring mutation and gene-conversion rates by using the number of sequence differences observed in identical-by-descent (IBD) segments together with a reconstructed model of recent population-size history. This approach is robust to, and can quantify, the presence of substantial genotyping error, as validated in coalescent simulations. We applied the method to 498 trio-phased sequenced Dutch individuals and inferred a point mutation rate of 1.66 × 10(-8) per base per generation, a rate of 1.26 × 10(-9) for short indels, and a probability of 5.99 × 10(-6) that a site is involved in non-crossover gene conversion. We found that recombination does not have observable mutagenic effects after gene conversion is accounted for and that local gene-conversion rates reflect recombination rates. We detected a strong enrichment of recent deleterious variation among mismatching variants found within IBD regions and observed summary statistics of local sharing of IBD segments to closely match previously proposed metrics of background selection; however, we found no significant effects of selection on our mutation-rate estimates. We detected no evidence of strong variation of mutation rates in a number of genomic annotations obtained from several recent studies. Our analysis suggests that a mutation-rate estimate higher than that reported by recent pedigree-based studies should be adopted in the context of DNA-based demographic reconstruction.

  4. General anesthesia suppresses normal heart rate variability in humans

    Science.gov (United States)

    Matchett, Gerald; Wood, Philip

    2014-06-01

    The human heart normally exhibits robust beat-to-beat heart rate variability (HRV). The loss of this variability is associated with pathology, including disease states such as congestive heart failure (CHF). The effect of general anesthesia on intrinsic HRV is unknown. In this prospective, observational study we enrolled 100 human subjects having elective major surgical procedures under general anesthesia. We recorded continuous heart rate data via continuous electrocardiogram before, during, and after anesthesia, and we assessed HRV of the R-R intervals. We assessed HRV using several common metrics including Detrended Fluctuation Analysis (DFA), Multifractal Analysis, and Multiscale Entropy Analysis. Each of these analyses was done in each of the four clinical phases for each study subject over the course of 24 h: Before anesthesia, during anesthesia, early recovery, and late recovery. On average, we observed a loss of variability on the aforementioned metrics that appeared to correspond to the state of general anesthesia. Following the conclusion of anesthesia, most study subjects appeared to regain their normal HRV, although this did not occur immediately. The resumption of normal HRV was especially delayed on DFA. Qualitatively, the reduction in HRV under anesthesia appears similar to the reduction in HRV observed in CHF. These observations will need to be validated in future studies, and the broader clinical implications of these observations, if any, are unknown.
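
    Multiscale entropy, one of the HRV metrics used above, coarse-grains the beat-interval series and computes sample entropy at each scale. The naive O(N²) implementation below and the synthetic R-R series are illustrative only; real analyses typically use the refined composite variant on measured intervals.

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """Sample entropy of a 1-D series (naive O(N^2) pairwise version)."""
    x = np.asarray(x, float)
    r = r_frac * x.std()
    def match_count(mm):
        templ = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=-1)
        return np.sum(d <= r) - len(templ)          # ordered pairs, self-matches excluded
    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 6)):
    """Coarse-grain the series at each scale, then compute sample entropy."""
    out = []
    for s in scales:
        n = len(x) // s
        coarse = np.asarray(x[: n * s], float).reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse))
    return out

# Hypothetical R-R interval series (seconds); a real analysis uses measured intervals
rr = np.random.default_rng(2).normal(0.8, 0.05, 800)
print([round(v, 2) for v in multiscale_entropy(rr)])
```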

  5. Analytical bit error rate performance evaluation of an orthogonal frequency division multiplexing power line communication system impaired by impulsive and Gaussian channel noise

    OpenAIRE

    Munshi Mahbubur Rahman; Satya Prasad Majumder

    2015-01-01

    An analytical approach is presented to evaluate the bit error rate (BER) performance of a power line (PL) communication system considering the combined influence of impulsive noise and background PL Gaussian noise. Middleton class-A noise model is considered to evaluate the effect of impulsive noise. The analysis is carried out to find the expression of the signal-to-noise ratio and BER considering orthogonal frequency division multiplexing (OFDM) with binary phase shift keying modulation wit...

  6. The role of usability in the evaluation of accidents: human error or design flaw?

    Science.gov (United States)

    Correia, Walter; Soares, Marcelo; Barros, Marina; Campos, Fábio

    2012-01-01

    This article aims to highlight the role of consumer product companies in the occurrence and extent of accidents involving these types of products, and how such undesired events influence consumers' and users' purchasing decisions. Drawing on references, interviews and case studies, the article demonstrates how poorly designed products and design errors can influence the usage behavior of users, leading to accidents, and can also negatively affect a company's image. The discussion of these questions aims to raise awareness among users and consumers in general of reliable usability and the safe use of consumer products, and to safeguard their rights under a legal system of consumer protection, such as the CDC (Code of Consumer Protection).

  7. Electrophysiological correlates of reward prediction error recorded in the human prefrontal cortex

    Science.gov (United States)

    Oya, Hiroyuki; Adolphs, Ralph; Kawasaki, Hiroto; Bechara, Antoine; Damasio, Antonio; Howard, Matthew A.

    2005-01-01

    Lesion and functional imaging studies have shown that the ventromedial prefrontal cortex is critically involved in the avoidance of risky choices. However, detailed descriptions of the mechanisms that underlie the establishment of such behaviors remain elusive, due in part to the spatial and temporal limitations of available research techniques. We investigated this issue by recording directly from prefrontal depth electrodes in a rare neurosurgical patient while he performed the Iowa Gambling Task, and we concurrently measured behavioral, autonomic, and electrophysiological responses. We found a robust alpha-band component of event-related potentials that reflected the mismatch between expected outcomes and actual outcomes in the task, correlating closely with the reward-related error obtained from a reinforcement learning model of the patient's choice behavior. The finding implicates this brain region in the acquisition of choice bias by means of a continuous updating of expectations about reward and punishment. PMID:15928095

  8. Who Do Hospital Physicians and Nurses Go to for Advice About Medications? A Social Network Analysis and Examination of Prescribing Error Rates.

    Science.gov (United States)

    Creswick, Nerida; Westbrook, Johanna Irene

    2015-09-01

    To measure the weekly medication advice-seeking networks of hospital staff, to compare patterns across professional groups, and to examine these in the context of prescribing error rates. A social network analysis was conducted. All 101 staff in 2 wards in a large, academic teaching hospital in Sydney, Australia, were surveyed (response rate, 90%) using a detailed social network questionnaire. The extent of weekly medication advice seeking was measured by density of connections, proportion of reciprocal relationships (reciprocity), number of colleagues to whom each person provided advice (in-degree), and perceptions of the amount and impact of advice seeking between physicians and nurses. Data on prescribing error rates from the 2 wards were compared. Weekly medication advice-seeking networks were sparse (density: 7% ward A and 12% ward B). Information sharing across professional groups was modest, and rates of reciprocation of advice were low (9% ward A, 14% ward B). Pharmacists provided advice to most people, and junior physicians also played central roles. Senior physicians provided medication advice to few people. Many staff perceived that physicians rarely sought advice from nurses when prescribing, but almost all believed that an increase in communication between physicians and nurses about medications would improve patient safety. The medication networks in ward B had higher measures of density and reciprocation, and fewer senior physicians who were isolates. Ward B had a significantly lower rate of both procedural and clinical prescribing errors than ward A (0.63 clinical prescribing errors per admission [95% CI, 0.47-0.79] versus 1.81 per admission [95% CI, 1.49-2.13]). Medication advice-seeking networks among staff on hospital wards are limited. Hubs of advice provision include pharmacists, junior physicians, and senior nurses. Senior physicians are poorly integrated into medication advice networks. Strategies to improve the advice-giving networks between senior
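
    The network measures reported above (density, reciprocity, in-degree) are straightforward to compute once advice-seeking ties are recorded. The sketch below uses `networkx` on a small invented ward network; the node names and edges are hypothetical, not the study's data.

```python
import networkx as nx

# Hypothetical directed advice-seeking network: an edge (a, b) means
# "a seeks medication advice from b". Names and ties are invented.
edges = [("jnr_doc_1", "pharmacist"), ("nurse_1", "pharmacist"),
         ("nurse_1", "snr_nurse"), ("jnr_doc_2", "jnr_doc_1"),
         ("pharmacist", "jnr_doc_1"), ("nurse_2", "snr_nurse")]
g = nx.DiGraph(edges)
g.add_node("snr_doc")                              # an isolate: gives and seeks no advice

density = nx.density(g)                            # proportion of possible ties present
reciprocity = nx.overall_reciprocity(g)            # share of ties that are reciprocated
in_degree = dict(g.in_degree())                    # how many colleagues each person advises
providers = sorted(in_degree, key=in_degree.get, reverse=True)[:3]

print(f"density = {density:.2f}, reciprocity = {reciprocity:.2f}")
print("top advice providers:", providers)
```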

  9. When errors are rewarding

    NARCIS (Netherlands)

    Bruijn, E.R.A. de; Lange, F.P. de; Cramon, D.Y. von; Ullsperger, M.

    2009-01-01

    For social beings like humans, detecting one's own and others' errors is essential for efficient goal-directed behavior. Although one's own errors are always negative events, errors from other persons may be negative or positive depending on the social context. We used neuroimaging to disentangle br

  10. The Error Is the Clue: Breakdown In Human-Machine Interaction

    Science.gov (United States)

    2006-01-01

    prolonged vowel on line 35 above, utterance 16 in Figure 1. After two unsuccessful attempts to book a train the user tries one more time. At that point she has... "is because it is sought in fusion," writes Levinas in his essay "The Other in Proust" [10]. Levinas meant fusion of humans, of views, of perspectives... 'styles' or to get their hats and leave. Thus on one hand, we don't need to work for fusion between humans and machines by frenetically trying to

  11. Error Rates of M-PAM and M-QAM in Generalized Fading and Generalized Gaussian Noise Environments

    KAUST Repository

    Soury, Hamza

    2013-07-01

    This letter investigates the average symbol error probability (ASEP) of pulse amplitude modulation and quadrature amplitude modulation coherent signaling over flat fading channels subject to additive white generalized Gaussian noise. The new ASEP results are derived in a generic closed-form in terms of the Fox H function and the bivariate Fox H function for the extended generalized-K fading case. The utility of this new general closed-form is that it includes some special fading distributions, like the Generalized-K, Nakagami-m, and Rayleigh fading and special noise distributions such as Gaussian and Laplacian. Some of these special cases are also treated and are shown to yield simplified results.

  12. Emergence of dynamical complexity related to human heart rate variability

    Science.gov (United States)

    Chang, Mei-Chu; Peng, C.-K.; Stanley, H. Eugene

    2014-12-01

    We apply the refined composite multiscale entropy (MSE) method to a one-dimensional directed small-world network composed of nodes whose states are binary and whose dynamics obey the majority rule. We find that the resulting fluctuating signal becomes dynamically complex. This dynamical complexity is caused (i) by the presence of both short-range connections and long-range shortcuts and (ii) by how well the system can adapt to the noisy environment. By tuning the adaptability of the environment and the long-range shortcuts we can increase or decrease the dynamical complexity, thereby modeling trends found in the MSE of a healthy human heart rate in different physiological states. When the shortcut and adaptability values increase, the complexity in the system dynamics becomes uncorrelated.

  13. Analysis of calibration data for the uranium active neutron coincidence counting collar with attention to errors in the measured neutron coincidence rate

    Energy Technology Data Exchange (ETDEWEB)

    Croft, Stephen [Oak Ridge National Laboratory (ORNL), One Bethel Valley Road, Oak Ridge, TN (United States); Burr, Tom [International Atomic Energy Agency (IAEA), Vienna (Austria); Favalli, Andrea [Los Alamos National Laboratory (LANL), MS E540, Los Alamos, NM 87545 (United States); Nicholson, Andrew [Oak Ridge National Laboratory (ORNL), One Bethel Valley Road, Oak Ridge, TN (United States)

    2016-03-01

    The declared linear density of ²³⁸U and ²³⁵U in fresh low enriched uranium light water reactor fuel assemblies can be verified for nuclear safeguards purposes using a neutron coincidence counter collar in passive and active mode, respectively. The active mode calibration of the Uranium Neutron Collar – Light water reactor fuel (UNCL) instrument is normally performed using a non-linear fitting technique. The fitting technique relates the measured neutron coincidence rate (the predictor) to the linear density of ²³⁵U (the response) in order to estimate model parameters of the nonlinear Padé equation, which traditionally is used to model the calibration data. Alternatively, following a simple data transformation, the fitting can also be performed using standard linear fitting methods. This paper compares performance of the nonlinear technique to the linear technique, using a range of possible error variance magnitudes in the measured neutron coincidence rate. We develop the required formalism and then apply the traditional (nonlinear) and alternative (linear) approaches to the same experimental and corresponding simulated representative datasets. We find that, in this context, because of the magnitude of the errors in the predictor, it is preferable not to transform to a linear model, and it is preferable not to adjust for the errors in the predictor when inferring the model parameters.
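
    The comparison above, fitting a nonlinear Padé calibration curve versus fitting its transformed linear form, can be sketched as follows. The specific Padé form, the parameter values and the 2% response error are assumptions for illustration, not the UNCL calibration data; the paper's point about errors in the predictor can be explored by also perturbing `rate`.

```python
import numpy as np
from scipy.optimize import curve_fit

def pade(rate, a, b):
    """One illustrative Pade-type calibration form: 235U linear density vs doubles rate."""
    return a * rate / (1.0 + b * rate)

# Hypothetical calibration points: measured doubles rate (1/s) -> declared 235U density (g/cm)
rate = np.array([20.0, 40.0, 60.0, 80.0, 100.0])
rng = np.random.default_rng(3)
density = pade(rate, a=0.5, b=0.01) * (1 + rng.normal(0, 0.02, rate.size))

# (a) Nonlinear least squares directly on the Pade form
(a_nl, b_nl), _ = curve_fit(pade, rate, density, p0=(0.4, 0.005))

# (b) Linearised fit: rate/density = 1/a + (b/a) * rate, ordinary least squares
slope, intercept = np.polyfit(rate, rate / density, 1)
a_lin, b_lin = 1.0 / intercept, slope / intercept

print("nonlinear fit :", a_nl, b_nl)
print("linearised fit:", a_lin, b_lin)
```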

  14. The rate of spontaneous mutations in human myeloid cells

    Energy Technology Data Exchange (ETDEWEB)

    Araten, David J., E-mail: david.araten@nyumc.org [Division of Hematology, Department of Veterans Affairs New York Harbor Healthcare System (United States); Division of Hematology, Department of Medicine, NYU School of Medicine and the NYU Langone Cancer Center (United States); Krejci, Ondrej [Division of Experimental Hematology and Cancer Biology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH (United States); DiTata, Kimberly [Division of Hematology, Department of Medicine, NYU School of Medicine and the NYU Langone Cancer Center (United States); Wunderlich, Mark [Division of Experimental Hematology and Cancer Biology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH (United States); Sanders, Katie J.; Zamechek, Leah [Division of Hematology, Department of Medicine, NYU School of Medicine and the NYU Langone Cancer Center (United States); Mulloy, James C. [Division of Experimental Hematology and Cancer Biology, Cincinnati Children's Hospital Medical Center, Cincinnati, OH (United States)

    2013-09-15

    Highlights: • We provide the first measurement of the mutation rate (μ) in human myeloid cells. • μ is measured to be 3.6–23 × 10⁻⁷ per cell division. • The AML-ETO and MLL-AF9 fusions do not seem to increase μ. • Cooperating mutations in NRAS, FLT3 and p53 do not seem to increase μ. • Hypermutability may be required to explain leukemogenesis. - Abstract: The mutation rate (μ) is likely to be a key parameter in leukemogenesis, but historically, it has been difficult to measure in humans. The PIG-A gene has some advantages for the detection of spontaneous mutations because it is X-linked, and therefore only one mutation is required to disrupt its function. Furthermore, the PIG-A-null phenotype is readily detected by flow cytometry. Using PIG-A, we have now provided the first in vitro measurement of μ in myeloid cells, using cultures of CD34+ cells that are transduced with either the AML-ETO or the MLL-AF9 fusion genes and expanded with cytokines. For the AML-ETO cultures, the median μ value was ∼9.4 × 10⁻⁷ (range ∼3.6–23 × 10⁻⁷) per cell division. In contrast, few spontaneous mutations were observed in the MLL-AF9 cultures. Knockdown of p53 or introduction of mutant NRAS or FLT3 alleles did not have much of an effect on μ. Based on these data, we provide a model to predict whether hypermutability must occur in the process of leukemogenesis.

  15. Common errors in textbook descriptions of muscle fiber size in nontrained humans.

    Science.gov (United States)

    Chalmers, Gordon R; Row, Brandi S

    2011-09-01

    Exercise science and human anatomy and physiology textbooks commonly report that type IIB muscle fibers have the largest cross-sectional area of the three fiber types. These descriptions of muscle fiber sizes do not match with the research literature examining muscle fibers in young adult nontrained humans. For men, most commonly type IIA fibers were significantly larger than other fiber types (six out of 10 cases across six different muscles). For women, either type I, or both I and IIA muscle fibers were usually significantly the largest (five out of six cases across four different muscles). In none of these reports were type IIB fibers significantly larger than both other fiber types. In 27 studies that did not include statistical comparisons of mean fiber sizes across fiber types, in no cases were type IIB or fast glycolytic fibers larger than both type I and IIA, or slow oxidative and fast oxidative glycolytic fibers. The likely reason for mistakes in textbook descriptions of human muscle fiber sizes is that animal data were presented without being labeled as such, and without any warning that there are interspecies differences in muscle fiber properties. Correct knowledge of muscle fiber sizes may facilitate interpreting training and aging adaptations.

  16. Biased parameter estimates and inflated Type I error rates in analysis of covariance (and analysis of partial variance) arising from unreliability: alternatives and remedial strategies.

    Science.gov (United States)

    Zinbarg, Richard E; Suzuki, Satoru; Uliaszek, Amanda A; Lewis, Alison R

    2010-05-01

    Miller and Chapman (2001) argued that 1 major class of misuse of analysis of covariance (ANCOVA) or its multiple regression counterpart, analysis of partial variance (APV), arises from attempts to use an ANCOVA/APV to answer a research question that is not meaningful in the 1st place. Unfortunately, there is another misuse of ANCOVAs/APVs that arises frequently in psychopathology studies even when addressing consensually meaningful research questions. This misuse arises from inflated Type I error rates in ANCOVA/APV inferential tests of the unique association of the independent variable with the dependent variable when the covariate and independent variables are correlated and measured with error. Alternatives to conventional ANCOVAs/APVs are discussed, as are steps that can be taken to minimize the impact of this bias on drawing valid inferences when conventional ANCOVAs/APVs are used.
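
    The inflation described above is easy to reproduce by simulation: give the independent variable no unique effect on the outcome, correlate it with the true covariate, and degrade the covariate's reliability. The design below (sample size, correlation, reliability values) is an illustrative assumption, not the authors' analysis.

```python
import numpy as np
from scipy import stats

def type1_rate(reliability, n=100, r_xz=0.5, n_sims=2000, alpha=0.05, seed=0):
    """Empirical Type I error rate of the ANCOVA/APV test for a unique effect of X
    when the covariate Z is measured with the given reliability. Y depends only on
    the true Z, so every significant X effect is a false positive."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        z = rng.normal(size=n)                                     # true covariate
        x = r_xz * z + np.sqrt(1 - r_xz**2) * rng.normal(size=n)   # X correlated with Z
        y = z + rng.normal(size=n)                                 # no unique X effect
        z_obs = np.sqrt(reliability) * z + np.sqrt(1 - reliability) * rng.normal(size=n)
        design = np.column_stack([np.ones(n), x, z_obs])
        beta, res, *_ = np.linalg.lstsq(design, y, rcond=None)
        dof = n - design.shape[1]
        se = np.sqrt(res[0] / dof * np.linalg.inv(design.T @ design)[1, 1])
        p = 2 * stats.t.sf(abs(beta[1] / se), dof)
        hits += p < alpha
    return hits / n_sims

for rel in (1.0, 0.8, 0.6):
    print(f"covariate reliability {rel}: empirical Type I rate = {type1_rate(rel):.3f}")
```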

  17. Realistic glottal motion and airflow rate during human breathing.

    Science.gov (United States)

    Scheinherr, Adam; Bailly, Lucie; Boiron, Olivier; Lagier, Aude; Legou, Thierry; Pichelin, Marine; Caillibotte, Georges; Giovanni, Antoine

    2015-09-01

    The glottal geometry is a key factor in the aerosol delivery efficiency for treatment of lung diseases. However, while glottal vibrations have been extensively studied during human phonation, the realistic glottal motion during breathing is poorly understood. Therefore, most current studies assume an idealized steady glottis in the context of respiratory dynamics, and thus neglect the flow unsteadiness related to this motion. This is particularly important to assess the aerosol transport mechanisms in upper airways. This article presents a clinical study conducted on 20 volunteers to examine the realistic glottal motion during several breathing tasks. Nasofibroscopy was used to investigate the glottal geometrical variations simultaneously with accurate airflow rate measurements. In total, 144 breathing sequences of 30 s were recorded. Across the whole database, two cases of glottal time-variations were found: "static" or "dynamic" ones. Typically, the peak value of glottal area during slow breathing narrowed from 217 ± 54 mm(2) (mean ± STD) during inspiration to 178 ± 35 mm(2) during expiration. Considering flow unsteadiness, it is shown that the harmonic approximation of the airflow rate underestimates the inertial effects as compared to realistic patterns, especially at the onset of the breathing cycle. These measurements provide input data to conduct realistic numerical simulations of laryngeal airflow and particle deposition. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.

  18. Human heart rate variability relation is unchanged during motion sickness

    Science.gov (United States)

    Mullen, T. J.; Berger, R. D.; Oman, C. M.; Cohen, R. J.

    1998-01-01

    In a study of 18 human subjects, we applied a new technique, estimation of the transfer function between instantaneous lung volume (ILV) and instantaneous heart rate (HR), to assess autonomic activity during motion sickness. Two control recordings of ILV and electrocardiogram (ECG) were made prior to the development of motion sickness. During the first, subjects were seated motionless, and during the second they were seated rotating sinusoidally about an earth vertical axis. Subjects then wore prism goggles that reverse the left-right visual field and performed manual tasks until they developed moderate motion sickness. Finally, ILV and ECG were recorded while subjects maintained a relatively constant level of sickness by intermittent eye closure during rotation with the goggles. Based on analyses of ILV to HR transfer functions from the three conditions, we were unable to demonstrate a change in autonomic control of heart rate due to rotation alone or due to motion sickness. These findings do not support the notion that moderate motion sickness is manifested as a generalized autonomic response.

  19. Human rights of children with intellectual disabilities: comparing self-ratings and proxy ratings.

    Science.gov (United States)

    Huus, K; Granlund, M; Bornman, J; Lygnegård, F

    2015-11-01

    A child rights-based approach to research articulates well with Article 12 of the United Nations Convention on the Rights of the Child (CRC) and highlights the importance and value of including children's own views about aspects that concern them. The aim of this study is to compare children with intellectual disability's own ratings (as self-raters) to those of their primary caregivers (as proxy raters) regarding human rights of children. The study also aims to establish whether there is an inter-rater agreement between the self-raters and proxy raters concerning Maslow's hierarchy of needs. This study is nested in a larger study examining the human rights of children with intellectual disability in South Africa. In total, 162 children with intellectual disability from 11 schools across three provinces and their primary caregivers participated by answering parts of a Children's Rights Questionnaire (CRQ) developed by the researchers based on the United Nation's CRC. We compared the answers for six questions in the questionnaire that were addressed to self-raters (children) and proxy raters (primary caregivers) in the same way. Questions regarding basic needs, such as access to clean water or whether the child had food to eat at home, were answered similarly by self-raters and proxy raters. Larger differences were found when self-raters and proxy raters were asked about whether the child had things or friends to play with at home. Socio-economic variables seemed to affect whether self-raters and proxy raters answered similarly. The results underscore the importance of promoting children's rights to express themselves by considering the opinions of both the children as self-raters and their primary caregivers as proxy raters - not only the latter. The results indicate that it is especially important to include children's own voices when more complex needs are surveyed. Agreement between self- and proxy ratings could be affected by socio-economic circumstances.
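
    Self-proxy agreement of the kind examined above is often summarized with raw agreement plus a chance-corrected index such as Cohen's kappa. The ratings below are invented yes/no answers for illustration; the study's own agreement statistics are not reproduced here.

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two raters on the same items."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    po = np.mean(r1 == r2)                                   # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical yes/no answers from children (self) and caregivers (proxy)
self_ratings  = [1, 1, 0, 1, 1, 0, 1, 0, 1, 1]
proxy_ratings = [1, 1, 0, 0, 1, 0, 1, 1, 1, 1]
print(f"raw agreement: {np.mean(np.array(self_ratings) == np.array(proxy_ratings)):.2f}")
print(f"Cohen's kappa: {cohens_kappa(self_ratings, proxy_ratings):.2f}")
```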

  20. Higher order scrambled digital nets achieve the optimal rate of the root mean square error for smooth integrands

    CERN Document Server

    Dick, Josef

    2010-01-01

    We study numerical approximations of integrals $\int_{[0,1]^s} f(\mathbf{x}) \,\mathrm{d}\mathbf{x}$ by averaging the function at some sampling points. Monte Carlo (MC) sampling yields a convergence of the root mean square error (RMSE) of order $N^{-1/2}$ (where $N$ is the number of samples). Quasi-Monte Carlo (QMC) sampling on the other hand achieves a convergence of order $N^{-1+\varepsilon}$, for any $\varepsilon > 0$. Randomized QMC (RQMC), a combination of MC and QMC, achieves a RMSE of order $N^{-3/2+\varepsilon}$. A combination of RQMC with local antithetic sampling achieves a convergence of the RMSE of order $N^{-3/2-1/s+\varepsilon}$ (where $s \ge 1$ is the dimension). QMC, RQMC and RQMC with local antithetic sampling require that the integrand has some smoothness (for instance, bounded variation). Stronger smoothness assumptions on the integrand do not improve the convergence of the above algorithms further. This paper introduces a new RQMC algorithm, for which we prove that it achieves a convergence of the RMS...
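
    The MC versus RQMC convergence rates quoted above can be checked empirically with scrambled Sobol' points. The smooth test integrand below (exact integral 1 in any dimension) and the replication counts are illustrative choices; the local antithetic construction of the paper is not implemented here.

```python
import numpy as np
from scipy.stats import qmc

def f(x):
    """Smooth test integrand on [0,1]^s with exact integral 1 in any dimension."""
    return np.prod(x + 0.5, axis=1)

def rmse(point_set, n, s, reps=50):
    """Root mean square integration error over independent replications."""
    errs = [f(point_set(n, s, rep)).mean() - 1.0 for rep in range(reps)]
    return np.sqrt(np.mean(np.square(errs)))

mc = lambda n, s, rep: np.random.default_rng(rep).random((n, s))
rqmc = lambda n, s, rep: qmc.Sobol(d=s, scramble=True, seed=rep).random(n)

s = 3
for n in (2**8, 2**10, 2**12):
    print(f"N={n:5d}  MC RMSE={rmse(mc, n, s):.2e}  scrambled-Sobol RMSE={rmse(rqmc, n, s):.2e}")
```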

  1. Study on the Human Error Mechanism in Ship Accidents

    Institute of Scientific and Technical Information of China (English)

    彭陈; 张圆圆

    2015-01-01

    Due to the complexity of the man-machine-environment system in ship accidents, human error is highly likely; reducing human error is therefore important for the prevention of ship accidents. This essay analyzes the causes of human error, constructs a human error model and a mathematical model of human reliability in ship accidents, and gives an outlook on future research into human error in ship accidents.

  2. Estimates of rates and errors for measurements of direct-γ and direct-γ + jet production by polarized protons at RHIC

    Energy Technology Data Exchange (ETDEWEB)

    Beddo, M.E.; Spinka, H.; Underwood, D.G.

    1992-08-14

    Studies of inclusive direct-γ production by pp interactions at RHIC energies were performed. Rates and the associated uncertainties on spin-spin observables for this process were computed for the planned PHENIX and STAR detectors at energies between √s = 50 and 500 GeV. Also, rates were computed for direct-γ + jet production for the STAR detector. The goal was to study the gluon spin distribution functions with such measurements. Recommendations concerning the electromagnetic calorimeter design and the need for an endcap calorimeter for STAR are made.

  3. Adaptive planning strategy for high dose rate prostate brachytherapy—a simulation study on needle positioning errors

    NARCIS (Netherlands)

    Borot, Maxence; Denis de Senneville, B; Maenhout, M; Hautvast, G; Binnekamp, D; Lagendijk, J J W; van Vulpen, M; Moerland, M A

    2016-01-01

    The development of magnetic resonance (MR) guided high dose rate (HDR) brachytherapy for prostate cancer has gained increasing interest for delivering a high tumor dose safely in a single fraction. To support needle placement in the limited workspace inside the closed-bore MRI, a single-needle MR-co

  4. De novo centriole formation in human cells is error-prone and does not require SAS-6 self-assembly.

    Science.gov (United States)

    Wang, Won-Jing; Acehan, Devrim; Kao, Chien-Han; Jane, Wann-Neng; Uryu, Kunihiro; Tsou, Meng-Fu Bryan

    2015-11-26

    Vertebrate centrioles normally propagate through duplication, but in the absence of preexisting centrioles, de novo synthesis can occur. Consistently, centriole formation is thought to strictly rely on self-assembly, involving self-oligomerization of the centriolar protein SAS-6. Here, through reconstitution of de novo synthesis in human cells, we surprisingly found that normal looking centrioles capable of duplication and ciliation can arise in the absence of SAS-6 self-oligomerization. Moreover, whereas canonically duplicated centrioles always form correctly, de novo centrioles are prone to structural errors, even in the presence of SAS-6 self-oligomerization. These results indicate that centriole biogenesis does not strictly depend on SAS-6 self-assembly, and may require preexisting centrioles to ensure structural accuracy, fundamentally deviating from the current paradigm.

  5. A Bayesian method for using simulator data to enhance human error probabilities assigned by existing HRA methods

    Energy Technology Data Exchange (ETDEWEB)

    Katrinia M. Groth; Curtis L. Smith; Laura P. Swiler

    2014-08-01

    In the past several years, several international organizations have begun to collect data on human performance in nuclear power plant simulators. The data collected provide a valuable opportunity to improve human reliability analysis (HRA), but these improvements will not be realized without implementation of Bayesian methods. Bayesian methods are widely used to incorporate sparse data into models in many parts of probabilistic risk assessment (PRA), but Bayesian methods have not been adopted by the HRA community. In this paper, we provide a Bayesian methodology to formally use simulator data to refine the human error probabilities (HEPs) assigned by existing HRA methods. We demonstrate the methodology with a case study, wherein we use simulator data from the Halden Reactor Project to update the probability assignments from the SPAR-H method. The case study demonstrates the ability to use performance data, even sparse data, to improve existing HRA methods. Furthermore, this paper also serves as a demonstration of the value of Bayesian methods to improve the technical basis of HRA.
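
    The core idea, treating an existing HRA estimate as a prior and refining it with simulator counts, can be illustrated with a conjugate beta-binomial update. The prior strength, the hypothetical simulator counts and the helper name `update_hep` are assumptions for illustration; the paper's own Bayesian model is richer than this.

```python
import numpy as np
from scipy import stats

def update_hep(prior_hep, prior_strength, errors, trials):
    """Conjugate beta-binomial update of a human error probability (HEP).

    prior_hep and prior_strength encode an existing HRA estimate (e.g. a SPAR-H
    style value) as a Beta(a0, b0) prior; simulator data (errors out of trials)
    then refine it. This sketches the idea only."""
    a0 = prior_hep * prior_strength
    b0 = (1 - prior_hep) * prior_strength
    a, b = a0 + errors, b0 + trials - errors
    post_mean = a / (a + b)
    lo, hi = stats.beta.ppf([0.05, 0.95], a, b)     # 90% credible interval
    return post_mean, (lo, hi)

# Hypothetical case: prior HEP of 1e-2 worth "50 trials" of information,
# combined with simulator observations of 1 error in 120 crew trials.
print(update_hep(prior_hep=1e-2, prior_strength=50, errors=1, trials=120))
```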

  6. TECHNOLOGY VS NATURE: HUMAN ERROR IN DEALING WITH NATURE IN CRICHTON'S JURASSIC PARK

    Directory of Open Access Journals (Sweden)

    Sarah Prasasti

    2000-01-01

    Witnessing the euphoria of the era of biotechnology in the late twentieth century, Crichton exposes the theme of biotechnology in his works. In Jurassic Park, he voices his concern about the impact of using biotechnology to preserve nature and its living creatures. He further describes how the purpose of preserving nature and its creatures has turned out to be destructive. This article discusses Crichton's main character, Hammond, who attempts to control nature by genetically recreating the extinct fossil animals. The attempt ignores his human limitations. Although he is confident that he has been equipped with the technology, he fails to get along with nature. His way of using technology to accomplish his purpose proves not to be in harmony with nature. As a consequence, nature fights back, and he is conquered.

  7. Non-contact Laser-based Human Respiration Rate Measurement

    Science.gov (United States)

    Scalise, L.; Marchionni, P.; Ercoli, I.

    2011-08-01

    At present, the majority of the instrumentation used in clinical environments to measure human respiration rate is based on invasive, contact devices. The gold standard instrument is considered to be the spirometer, which is widely used; it requires direct contact and collaboration by the patient. Laser Doppler Vibrometry (LDVi) is an optical, non-contact measurement technique for the assessment of surface velocity and displacement. LDVi has already been used for the measurement of cardiac activity and of chest-wall displacements. The aims of this work are to select the best measurement point on the thoracic surface for LDVi monitoring of the respiration rate (RR) and to compare the measured data with the RR values provided by a spirometer. The measurement system is composed of an LDV system and a data acquisition board installed on a PC. Tests were made at 10 different points of the thorax for each patient. The patient population was composed of 33 subjects (17 male and 16 female). The optimal measurement point was chosen as the one with the maximum peak-to-peak value of the displacement measured by LDV. Before extracting RR we used a wavelet decomposition for better selection of the expiration peaks. A standard spirometer was used for validation of the data. The tests show that the optimal measurement point is located on the inferior part of the thoracic region (left, front side). We obtained a close correlation between the RR values measured by the spirometer and those measured by the proposed method: a difference of 14 ± 211 ms in the RR value is reported for the entire population of 33 subjects. Our method allows non-contact measurement of lung activity (respiration period), reducing electrical and biological risks. Moreover, it allows measurement in critical environments, such as during MRI or on burned skin, where it is difficult or impossible to apply electrodes.
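
    Once the displacement signal from the optimal point is available, the respiration rate can be estimated by detecting expiration peaks and averaging the breath-to-breath period. The sketch below substitutes a plain peak picker and a synthetic 0.25 Hz breathing trace for the wavelet-based peak selection described above; the thresholds and sampling rate are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def respiration_rate(displacement, fs):
    """Estimate respiration rate (breaths/min) from a chest-wall displacement
    trace by detecting expiration peaks with a simple peak picker."""
    min_gap = int(1.5 * fs)                    # assume successive breaths are >1.5 s apart
    peaks, _ = find_peaks(displacement, distance=min_gap,
                          prominence=0.3 * np.std(displacement))
    if len(peaks) < 2:
        return float("nan")
    mean_period = np.mean(np.diff(peaks)) / fs
    return 60.0 / mean_period

# Hypothetical 30 s recording: 0.25 Hz breathing plus noise, sampled at 100 Hz
fs = 100
t = np.arange(0, 30, 1 / fs)
disp = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.default_rng(4).normal(size=t.size)
print(f"{respiration_rate(disp, fs):.1f} breaths/min")   # ~15 expected
```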

  8. An evaluation of a Low-Dose-Rate (LDR) brachytherapy procedure using a systems engineering & error analysis methodology for health care (SEABH) - (SAVE)

    LENUS (Irish Health Repository)

    Chadwick, Liam

    2012-03-12

    Health Care Failure Modes and Effects Analysis (HFMEA®) is an established tool for risk assessment in health care. A number of deficiencies have been identified in the method. A new method called the Systems and Error Analysis Bundle for Health Care (SEABH) was developed to address these deficiencies. SEABH has been applied to a number of medical processes as part of its validation and testing. One of these, Low Dose Rate (LDR) prostate brachytherapy, is reported in this paper. The case study supported the validity of SEABH with respect to its capacity to address the weaknesses of HFMEA®.

  9. Bit-error-rate performance analysis of self-heterodyne detected radio-over-fiber links using phase and intensity modulation

    DEFF Research Database (Denmark)

    Yin, Xiaoli; Yu, Xianbin; Tafur Monroy, Idelfonso

    2010-01-01

    We theoretically and experimentally investigate the performance of two self-heterodyne detected radio-over-fiber (RoF) links employing phase modulation (PM) and quadrature biased intensity modulation (IM), in terms of bit-error-rate (BER) and optical signal-to-noise-ratio (OSNR). In both links, self-heterodyne receivers perform down-conversion of the radio frequency (RF) subcarrier signal. A theoretical model including noise analysis is constructed to calculate the Q factor and estimate the BER performance. Furthermore, we experimentally validate our prediction in the theoretical modeling. Both the experimental...
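
    In models of this kind the BER is typically obtained from the Q factor through the Gaussian-noise relation BER = 0.5·erfc(Q/√2). The sketch below applies that relation to illustrative signal levels; it is not the paper's full noise model, which also accounts for the self-heterodyne down-conversion.

```python
import numpy as np
from scipy.special import erfc

def q_from_levels(mu1, mu0, sigma1, sigma0):
    """Q factor from the means and standard deviations of the two received levels."""
    return (mu1 - mu0) / (sigma1 + sigma0)

def ber_from_q(q):
    """BER of binary detection in Gaussian noise: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * erfc(q / np.sqrt(2))

# Illustrative mark/space levels and noise standard deviations (arbitrary units)
q = q_from_levels(mu1=1.0, mu0=0.1, sigma1=0.08, sigma0=0.05)
print(f"Q = {q:.2f}, estimated BER = {ber_from_q(q):.2e}")
```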

  10. Analysis of Errors of Deep Space X-Band Range-Rate Measurement%深空X频段测速数据误差分析

    Institute of Scientific and Technical Information of China (English)

    樊敏; 王宏; 李海涛; 赵华

    2013-01-01

    X-band is the primary frequency band used by deep space TT&C (Tracking, Telemetry and Command) systems. X-band range-rate measurement is more accurate than that of S-band, as validated in the X-band deep space TT&C system experiments of the Chang'E-2 spacecraft; the precision of the range-rate measurement is about 1 mm/s. For the X-band range rate, the theoretical error caused by the approximate Doppler calculation formula is analyzed. This error can reach 1 cm/s during the translunar and lunar-orbiting phases. Furthermore, the measurement residual error is analyzed based on the precise ephemerides from post-flight orbit determination for the X-band deep space TT&C system experiment of the Chang'E-2 spacecraft. The results show that the range-rate residual error increases by 1 mm/s when the approximate formula is used instead of the exact formula, which is comparable to the actual measurement precision. Therefore, the approximate Doppler calculation formula is no longer applicable, and the exact formula should be used in future lunar and deep space exploration projects.
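
    For orientation, the textbook two-way Doppler relation below illustrates why a first-order approximation becomes inadequate at X-band precision; it is an illustrative form and not necessarily the exact formulation used in the paper. With uplink frequency $f_t$, received frequency $f_r$ and radial velocity $v$,

        $$ f_r = f_t\,\frac{c - v}{c + v} \qquad\Longrightarrow\qquad v = c\,\frac{f_t - f_r}{f_t + f_r}, $$

    whereas the common first-order estimate $v \approx \tfrac{c}{2}\,(f_t - f_r)/f_t$ neglects terms of order $v^2/c$. At translunar radial velocities of roughly 1-2 km/s, $v^2/c$ is of the order of millimetres to a centimetre per second, i.e. comparable to or larger than the ~1 mm/s measurement precision quoted above.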

  11. Bit-error-rate performance analysis of self-heterodyne detected radio-over-fiber links using phase and intensity modulation

    DEFF Research Database (Denmark)

    Yin, Xiaoli; Yu, Xianbin; Tafur Monroy, Idelfonso

    2010-01-01

    We theoretically and experimentally investigate the performance of two self-heterodyne detected radio-over-fiber (RoF) links employing phase modulation (PM) and quadrature biased intensity modulation (IM), in terms of bit-error-rate (BER) and optical signal-to-noise-ratio (OSNR). In both links, self-heterodyne receivers perform down-conversion of the radio frequency (RF) subcarrier signal. A theoretical model including noise analysis is constructed to calculate the Q factor and estimate the BER performance. Furthermore, we experimentally validate our prediction in the theoretical modeling. Both the experimental...

  12. Inborn errors of the Krebs cycle: a group of unusual mitochondrial diseases in human.

    Science.gov (United States)

    Rustin, P; Bourgeron, T; Parfait, B; Chretien, D; Munnich, A; Rötig, A

    1997-08-22

    Krebs cycle disorders constitute a group of rare human diseases which present an amazing complexity considering our current knowledge of Krebs cycle function and biogenesis. Acting as a turntable of cell metabolism, the cycle is ubiquitously distributed in the organism and its enzyme components are encoded by supposedly typical house-keeping genes. However, the investigation of patients presenting specific defects of Krebs cycle enzymes, resulting from deleterious mutations of the corresponding genes, leads us to reconsider this simple view by revealing organ-specific impairments, mostly affecting the neuromuscular system. This often spares organs whose metabolism also depends strongly on mitochondrial energy metabolism, such as the heart, kidney or liver. Additionally, in some patients, a complex pattern of tissue-specific enzyme defects was also observed. The lack of functional additional copies of Krebs cycle genes suggests that this complex expression pattern should be ascribed to tissue-specific regulation of transcriptional and/or translational activities, together with a variable cell adaptability to Krebs cycle functional defects.

  13. Choice of reference sequence and assembler for alignment of Listeria monocytogenes short-read sequence data greatly influences rates of error in SNP analyses.

    Directory of Open Access Journals (Sweden)

    Arthur W Pightling

    Full Text Available The wide availability of whole-genome sequencing (WGS) and an abundance of open-source software have made detection of single-nucleotide polymorphisms (SNPs) in bacterial genomes an increasingly accessible and effective tool for comparative analyses. Thus, ensuring that real nucleotide differences between genomes (i.e., true SNPs) are detected at high rates and that the influences of errors (such as false positive SNPs, ambiguously called sites, and gaps) are mitigated is of utmost importance. The choices researchers make regarding the generation and analysis of WGS data can greatly influence the accuracy of short-read sequence alignments and, therefore, the efficacy of such experiments. We studied the effects of some of these choices, including: (i) depth of sequencing coverage, (ii) choice of reference-guided short-read sequence assembler, (iii) choice of reference genome, and (iv) whether to perform read-quality filtering and trimming, on our ability to detect true SNPs and on the frequencies of errors. We performed benchmarking experiments, during which we assembled simulated and real Listeria monocytogenes strain 08-5578 short-read sequence datasets of varying quality with four commonly used assemblers (BWA, MOSAIK, Novoalign, and SMALT), using reference genomes of varying genetic distances, and with or without read pre-processing (i.e., quality filtering and trimming). We found that assemblies of at least 50-fold coverage provided the most accurate results. In addition, MOSAIK yielded the fewest errors when reads were aligned to a nearly identical reference genome, while using SMALT to align reads against a reference sequence that is ∼0.82% distant from 08-5578 at the nucleotide level resulted in the detection of the greatest numbers of true SNPs and the fewest errors. Finally, we show that whether read pre-processing improves SNP detection depends upon the choice of reference sequence and assembler. In total, this study demonstrates that researchers

  14. Design Considerations for Human Rating of Liquid Rocket Engines

    Science.gov (United States)

    Parkinson, Douglas

    2010-01-01

    I. Human-rating is specific to each engine: a. The context of the program/project must be understood. b. The engine cannot be discussed independently from the vehicle and mission. II. Utilize a logical combination of design, manufacturing, and test approaches. a. Design: 1) It is crucial to know the potential ways a system can fail, and how a failure can propagate; 2) Fault avoidance, fault tolerance, DFMR, and caution and warning all have roles to play. b. Manufacturing and Assembly: 1) As-built vs. as-designed; 2) Review procedures for assembly and maintenance periodically; 3) Keep personnel trained and certified. c. There is no substitute for test: 1) Analytical tools are constantly advancing, but test data are still needed for anchoring assumptions; 2) Demonstrate robustness and explore sensitivities; 3) Ideally, flight will be encompassed by ground test experience. III. Consistency and repeatability are key in production: a. Maintain robust processes and procedures for inspection and quality control based upon development and qualification experience; b. Establish methods to "spot check" quality and consistency in parts: 1) Dedicated ground test engines; 2) Random components pulled from the line/lot to go through "enhanced" testing.

  15. Nonlinear Control of Heart Rate Variability in Human Infants

    Science.gov (United States)

    Sugihara, George; Allan, Walter; Sobel, Daniel; Allan, Kenneth D.

    1996-03-01

    Nonlinear analyses of infant heart rhythms reveal a marked rise in the complexity of the electrocardiogram with maturation. We find that normal mature infants (gestation >= 35 weeks) have complex and distinctly nonlinear heart rhythms (consistent with recent reports for healthy adults) but that such nonlinearity is lacking in preterm infants (gestation < 35 weeks), in whom parasympathetic-sympathetic interaction and function are presumed to be less well developed. Our study further shows that infants with clinical brain death and those treated with atropine exhibit a similar lack of nonlinear feedback control. These three lines of evidence support the hypothesis championed by Goldberger et al. [Goldberger, A. L., Rigney, D. R. & West, B. J. (1990) Sci. Am. 262, 43-49] that autonomic nervous system control underlies the nonlinearity and possible chaos of normal heart rhythms. This report demonstrates the acquisition of nonlinear heart rate dynamics and possible chaos in developing human infants and its loss in brain death and with the administration of atropine. It parallels earlier work documenting changes in the variability of heart rhythms in each of these cases and suggests that nonlinearity may provide additional power in characterizing physiological states.

  16. Disease model: a simplified approach for analysis and management of human error: a quality improvement study.

    Science.gov (United States)

    Ahmad-Sabry, Mohammad H I

    2015-04-01

    During 6 weeks, we had 4 incidents of echocardiography machine malfunction. Three occurred in the operating room, where machines were damaged by intravenous (IV) fluid spilling over the keyboard and burning the keyboard's electrical connections, and 1 occurred in the cardiology department, where a machine was damaged by coffee spilled on it. The malfunctions had an economic impact on the hospital (about $20,000), in addition to the nonavailability of the ultrasound (US) machine for the cardiac patient after the incident until the end of the case, and for subsequent cases until the machine was repaired. We undertook an analysis of the incidents using a simplified approach. The first incident happened when changing an empty IV fluid bag for a full one led to spillage of some fluid onto the keyboard. The second incident was due to the use of a needle to depressurize a medication bottle for a continuous IV drip, and the third was due to disconnection of the IV set from the bottle during transfer of the patient from the operating room to the intensive care unit. The fundamental problem is of course that fluid is harmful to the US machine. In addition, the machines are positioned between the patient bed and the anesthesia machine, which means that IV poles are on each side of the patient bed, leaving the machine vulnerable to fluid spillage. We considered a machine modification to create a protective cover, but this was hindered by the complexity of the US machine's keyboard, technical and financial challenges, and the time it would take to achieve. Second, we considered the creation of a protocol, placing the machine in a position with no IV poles around it and transferring the machine out of the room whenever transferring the patient would endanger the machine through IV fluid. Third, changing human behavior; to do this, we announced the protocol in our anesthesia conference to make it known to everyone. We taught residents, fellows, and staff about the new

  17. Increased heart rate variability but normal resting metabolic rate in hypocretin/orexin-deficient human narcolepsy.

    NARCIS (Netherlands)

    Fronczek, R.; Overeem, S.; Reijntjes, R.; Lammers, G.J.; Dijk, J.G.M.; Pijl, H.

    2008-01-01

    STUDY OBJECTIVES: We investigated autonomic balance and resting metabolic rate to explore their possible involvement in obesity in hypocretin/orexin-deficient narcoleptic subjects. METHODS: Resting metabolic rate (using indirect calorimetry) and variability in heart rate and blood pressure were

  18. Increased error rates in preliminary reports issued by radiology residents working more than 10 consecutive hours overnight.

    Science.gov (United States)

    Ruutiainen, Alexander T; Durand, Daniel J; Scanlon, Mary H; Itri, Jason N

    2013-03-01

    To determine if the rate of major discrepancies between resident preliminary reports and faculty final reports increases during the final hours of consecutive 12-hour overnight call shifts. Institutional review board exemption status was obtained for this study. All overnight radiology reports interpreted by residents on call between January 2010 and June 2010 were reviewed by board-certified faculty and categorized as major discrepancies if they contained a change in interpretation with the potential to impact patient management or outcome. Initial determination of a major discrepancy was at the discretion of individual faculty radiologists based on this general definition. Studies categorized as major discrepancies were secondarily reviewed by the residency program director (M.H.S.) to ensure consistent application of the major discrepancy designation. Multiple variables associated with each report were collected and analyzed, including the time of preliminary interpretation, the time into the shift at which the study was interpreted, the volume of studies interpreted during each shift, the day of the week, patient location (inpatient or emergency department), block of shift (2-hour blocks for 12-hour shifts), imaging modality, patient age and gender, resident identification, and faculty identification. Univariate risk factor analysis was performed to determine the optimal data format of each variable (i.e., continuous versus categorical). A multivariate logistic regression model was then constructed to account for confounding between variables and identify independent risk factors for major discrepancies. We analyzed 8062 preliminary resident reports with 79 major discrepancies (1.0%). There was a statistically significant increase in the major discrepancy rate during the final 2 hours of consecutive 12-hour call shifts. Multivariate analysis confirmed that interpretation during the last 2 hours of 12-hour call shifts (odds ratio (OR) 1.94, 95% confidence interval (CI) 1.18-3.21), cross
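
    For readers unfamiliar with how such odds ratios and confidence intervals are obtained, the sketch below fits a logistic regression and converts coefficients to ORs with 95% CIs. It uses invented toy data and the statsmodels library; it is not the authors' analysis code, and the variable names and coefficients are illustrative only.

        # Minimal sketch (Python): odds ratios with 95% CIs from a logistic regression.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 8000
        last_two_hours = rng.integers(0, 2, n)              # hypothetical predictor
        inpatient = rng.integers(0, 2, n)                    # hypothetical predictor
        logit = -4.6 + 0.66 * last_two_hours + 0.2 * inpatient
        y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)   # simulated discrepancy flag

        X = sm.add_constant(np.column_stack([last_two_hours, inpatient]))
        fit = sm.Logit(y, X).fit(disp=0)
        odds_ratios = np.exp(fit.params)                      # OR = exp(coefficient)
        ci = np.exp(fit.conf_int())                           # 95% CI on the OR scale
        print(odds_ratios)
        print(ci)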

  19. Downlink Error Rates of Half-duplex Users in Full-duplex Networks over a Laplacian Inter-User Interference Limited and EGK fading

    KAUST Repository

    Soury, Hamza

    2017-03-14

    This paper develops a mathematical framework to study downlink error rates and throughput for half-duplex (HD) terminals served by a full-duplex (FD) base station (BS). The developed model is used to motivate long-term pairing for users that have a non-line-of-sight (NLOS) interfering link. Consequently, we study the interference-limited problem that appears between NLOS HD user pairs scheduled on the same FD channel. The distribution of the interference is first characterized via its distribution function, which is derived in closed form. Then, a comprehensive performance assessment for the proposed pairing scheme is provided by assuming Extended Generalized-$\mathcal{K}$ (EGK) fading for the downlink and studying different modulation schemes. To this end, a unified closed-form expression for the average symbol error rate is derived. Furthermore, we show the effective downlink throughput gain harvested by pairing NLOS users as a function of the average signal-to-interference ratio when compared to an idealized HD scenario with neither interference nor noise. Finally, we show the minimum required channel-gain pairing threshold to harvest downlink throughput via the FD operation when compared to the HD case for each modulation scheme.

  20. Determination of the Contamination Rate and the Associated Error for Targets Observed by CoRoT in the Exoplanet Channel

    Science.gov (United States)

    Gardes, B.; Chabaud, P.-Y.; Guterman, P.

    2012-09-01

    In the CoRoT exoplanet field of view, photometric measurements are obtained by aperture integration using a generic collection of masks. The total flux held within the photometric mask may be split in two parts, the target flux itself and the flux due to the nearest neighbours considered as contaminants. So far ExoDat (http://cesam.oamp.fr/exodat) gives a rough estimate of the contamination rate for all potential exoplanet targets (level-0) based on generic PSF shapes built before CoRoT launch. Here, we present the updated estimate of the contamination rate (level-1) with its associated error. This estimate is done for each target observed by CoRoT in the exoplanet channel using a new catalog of PSF built from the first available flight images and taking into account the line of sight of the satellite (i.e. the satellite orientation).

  1. Certification of COTS Software in NASA Human Rated Flight Systems

    Science.gov (United States)

    Goforth, Andre

    2012-01-01

    Adoption of commercial off-the-shelf (COTS) products in safety critical systems has been seen as a promising acquisition strategy to improve mission affordability and, yet, has come with significant barriers and challenges. Attempts to integrate COTS software components into NASA human rated flight systems have been, for the most part, complicated by verification and validation (V&V) requirements necessary for flight certification per NASA's own standards. For software that is from COTS sources, and, in general, from 3rd party sources, whether commercial, government, modified or open source, the expectation is that it meets the same certification criteria as those used for in-house software, and that it does so as if it were built in-house. The latter is a critical and hidden issue. This paper examines the longstanding barriers and challenges in the use of 3rd party software in safety critical systems and covers recent efforts to use COTS software in NASA's Multi-Purpose Crew Vehicle (MPCV) project. It identifies some core artifacts without which the use of COTS and 3rd party software is, for all practical purposes, a nonstarter for affordable and timely insertion into flight critical systems. The paper covers the first use in a flight critical system by NASA of COTS software with prior FAA certification heritage, which was shown to meet the RTCA-DO-178B standard, and how this certification may, in some cases, be leveraged to allow the use of analysis in lieu of testing. Finally, the paper proposes the establishment of an open source forum for development of safety critical 3rd party software.

  2. [Analysis, identification and correction of some errors of model refseqs appeared in NCBI Human Gene Database by in silico cloning and experimental verification of novel human genes].

    Science.gov (United States)

    Zhang, De-Li; Ji, Liang; Li, Yan-Da

    2004-05-01

    We found that human genome coding regions annotated by computers contain many kinds of errors in the public domain, identified through homologous BLAST of our cloned genes against the non-redundant (nr) database; these include insertions, deletions or mutations of one base pair or of a segment at the cDNA level, or different permutations and combinations of these errors. Basically, we used three means for validating and identifying some errors of the model genes appearing in NCBI GENOME ANNOTATION PROJECT REFSEQs: (1) Evaluating the degree of support from human EST clustering and BLAST against the draft human genome. (2) Preparing chromosomal mappings of our verified genes and analyzing the genomic organization of the genes; all of the exon/intron boundaries should be consistent with the GT/AG rule, and consensuses surrounding the splice boundaries should be found as well. (3) Experimental verification by RT-PCR of the in silico cloned genes and further by cDNA sequencing. We then used three further means as reference: (1) Web searching or in silico cloning of the genes of different species, especially mouse and rat homologous genes, and thus judging the gene's existence by orthology. (2) Using the released genes in the public domain as a standard, which should be highly homologous to our verified genes, especially the released human genes appearing in NCBI GENOME ANNOTATION PROJECT REFSEQs, we tried to clone a highly homologous complete gene similar to each released gene according to the strategy developed in this paper; if we could not obtain it, our verified gene may be correct and the released gene in the public domain may be wrong. (3) To find more evidence, we verified our cloned genes by RT-PCR or hybridization techniques. Here we list some errors we found in NCBI GENOME ANNOTATION PROJECT REFSEQs: (1) Insertion of a base in the ORF by mistake, which causes a frame shift of the coding amino acids. In detail, a base in the ORF of a gene is a redundant insertion, which causes a reading frame

  3. Cognition Analysis of Human Errors in ATC Based on HERA-JANUS Model%基于HERA-JANUS模型的空管人误认知分析

    Institute of Scientific and Technical Information of China (English)

    吴聪; 解佳妮; 杜红兵; 袁乐平

    2012-01-01

    Classification and analysis of human errors are a basis for human factors studies of the ATM system. With professional knowledge of ATM and cognitive psychology theory, the principle and flowchart of the HERA-JANUS model, developed by the European Aviation Safety Agency and the Federal Aviation Administration, were introduced in detail in order to research controllers' errors more systematically. An unsafe ATC incident in China was investigated by employing the model, and three human errors made by a controller in this case were identified. These errors were classified from three respects: human error type, human error cognition, and influencing factors. Twenty-one causal factors of the human errors in the unsafe occurrence were ultimately obtained. The results show that the model can analyze controllers' errors comprehensively and from a deeper level, and that its classification scheme is helpful in compiling statistics on controllers' errors.

  4. 认知控制模式下的CREAM方法概率量化%Quantification of human error probability of CREAM in cognitive control mode

    Institute of Scientific and Technical Information of China (English)

    蒋英杰; 孙志强; 宫二玲; 谢红卫

    2011-01-01

    Human errors have become a main factor reducing the reliability and safety of human-machine systems and therefore deserve special attention. For this reason, the quantification of human error probability, a key ingredient of human reliability analysis (HRA), is the research topic of this paper. We first introduce the basic approach of the cognitive reliability and error analysis method (CREAM), a widely accepted HRA method, together with the basic theory it involves, and then describe the steps for quantifying human error probability in detail. Considering that the cognitive control mode provided by CREAM should be continuous, we put forward two methods by which HRA practitioners can define probabilistic control modes, based on Bayesian networks and on fuzzy logic, respectively. Because the control mode is then described probabilistically, a method is needed to quantify the human error probability under probabilistic control modes. In preparing such a method, we take the lognormal function as the probability density function of human error probability in each mode, and express the probability density function of human error probability in the probabilistic cognitive control mode as a linear combination of the functions in the individual modes. The human error probability in the probabilistic mode is then quantified through theoretical inference. To improve computational efficiency, the Monte Carlo algorithm is also applied. Finally, the validity of the method is demonstrated by means of a sample study showing the process of the method.

  5. Increased heart rate variability but normal resting metabolic rate in hypocretin/orexin-deficient human narcolepsy.

    NARCIS (Netherlands)

    Fronczek, R.; Overeem, S.; Reijntjes, R.; Lammers, G.J.; Dijk, J.G.M.; Pijl, H.

    2008-01-01

    STUDY OBJECTIVES: We investigated autonomic balance and resting metabolic rate to explore their possible involvement in obesity in hypocretin/orexin-deficient narcoleptic subjects. METHODS: Resting metabolic rate (using indirect calorimetry) and variability in heart rate and blood pressure were dete

  6. Ventilator-associated pneumonia: the influence of bacterial resistance, prescription errors, and de-escalation of antimicrobial therapy on mortality rates

    Directory of Open Access Journals (Sweden)

    Ana Carolina Souza-Oliveira

    Full Text Available Ventilator-associated pneumonia is the most prevalent nosocomial infection in intensive care units and is associated with high mortality rates (14–70%). Aim This study evaluated factors influencing mortality of patients with ventilator-associated pneumonia (VAP), including bacterial resistance, prescription errors, and de-escalation of antibiotic therapy. Methods This retrospective study included 120 cases of ventilator-associated pneumonia admitted to the adult intensive care unit of the Federal University of Uberlândia. The chi-square test was used to compare qualitative variables. Student's t-test was used for quantitative variables and multiple logistic regression analysis to identify independent predictors of mortality. Findings De-escalation of antibiotic therapy and resistant bacteria did not influence mortality. Mortality was 4 times and 3 times higher, respectively, in patients who received an inappropriate antibiotic loading dose and in patients whose antibiotic dose was not adjusted for renal function. Multiple logistic regression analysis revealed that incorrect adjustment for renal function was the only independent factor associated with increased mortality. Conclusion Prescription errors influenced mortality of patients with ventilator-associated pneumonia, underscoring the challenge of proper ventilator-associated pneumonia treatment, which requires continuous reevaluation to ensure that clinical response to therapy meets expectations.

  7. ERRORS AND CORRECTION

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    To err is human. Since the 1960s, most second language teachers or language theorists have regarded errors as natural and inevitable in the language learning process. Instead of regarding them as terrible and disappointing, teachers have come to realize their value. This paper will consider these values, analyze some errors and propose some effective correction techniques.

  8. Science, practice, and human errors in controlling Clostridium botulinum in heat-preserved food in hermetic containers.

    Science.gov (United States)

    Pflug, Irving J

    2010-05-01

    The incidence of botulism in canned food in the last century is reviewed along with the background science; a few conclusions are reached based on analysis of published data. There are two primary aspects to botulism control: the design of an adequate process and the delivery of the adequate process to containers of food. The probability that the designed process will not be adequate to control Clostridium botulinum is very small, probably less than 1.0 × 10^-6, based on containers of food, whereas the failure of the operator of the processing equipment to deliver the specified process to containers of food may be of the order of 1 in 40 to 1 in 100, based on processing units (retort loads). In the commercial food canning industry, failure to deliver the process will probably be of the order of 1.0 × 10^-4 to 1.0 × 10^-6 when U.S. Food and Drug Administration (FDA) regulations are followed. Botulism incidents have occurred in food canning plants that have not followed the FDA regulations. It is possible but very rare to have botulism result from postprocessing contamination. It may thus be concluded that botulism incidents in canned food are primarily the result of human failure in the delivery of the designed or specified process to containers of food that, in turn, results in the survival, outgrowth, and toxin production of C. botulinum spores. Therefore, efforts in C. botulinum control should be concentrated on reducing human errors in the delivery of the specified process to containers of food.

  9. Who cares about consent requirements for sourcing human embryonic stem cells? Are errors in the past really errors of the past?

    Science.gov (United States)

    Krahn, Timothy M; Wallwork, Thomas E

    2011-01-01

    Through an Access to Information Act request, we have obtained the consent forms used by the providers of every human embryonic stem cell (hESC) line approved for use by the Canadian Institutes of Health Research (CIHR), and examined them to verify whether or not they meet the consent requirements established by Canadian law and regulations. Our findings show that at least seven out of ten consent forms studied did not satisfy these minimum requirements. We then outline various options for responding to this situation in terms of: (i) remedial measures for dealing with executive problems with regulatory oversight procedures; and (ii) remedial measures for dealing with the impugned lines.

  10. Bit Error Rate Performance of a MIMO-CDMA System Employing Parity-Bit-Selected Spreading in Frequency Nonselective Rayleigh Fading

    Directory of Open Access Journals (Sweden)

    Claude D'Amours

    2011-01-01

    Full Text Available We analytically derive the upper bound for the bit error rate (BER) performance of a single user multiple input multiple output code division multiple access (MIMO-CDMA) system employing parity-bit-selected spreading in slowly varying, flat Rayleigh fading. The analysis is done for spatially uncorrelated links. The analysis presented demonstrates that parity-bit-selected spreading provides an asymptotic gain of 10log(Nt) dB over conventional MIMO-CDMA when the receiver has perfect channel estimates. This analytical result concurs with previous works where the BER is determined by simulation methods and provides insight into why the different techniques provide improvement over conventional MIMO-CDMA systems.

  11. A point-process model of human heartbeat intervals: new definitions of heart rate and heart rate variability.

    Science.gov (United States)

    Barbieri, Riccardo; Matten, Eric C; Alabi, Abdulrasheed A; Brown, Emery N

    2005-01-01

    Heart rate is a vital sign, whereas heart rate variability is an important quantitative measure of cardiovascular regulation by the autonomic nervous system. Although the design of algorithms to compute heart rate and assess heart rate variability is an active area of research, none of the approaches considers the natural point-process structure of human heartbeats, and none gives instantaneous estimates of heart rate variability. We model the stochastic structure of heartbeat intervals as a history-dependent inverse Gaussian process and derive from it an explicit probability density that gives new definitions of heart rate and heart rate variability: instantaneous R-R interval and heart rate standard deviations. We estimate the time-varying parameters of the inverse Gaussian model by local maximum likelihood and assess model goodness-of-fit by Kolmogorov-Smirnov tests based on the time-rescaling theorem. We illustrate our new definitions in an analysis of human heartbeat intervals from 10 healthy subjects undergoing a tilt-table experiment. Although several studies have identified deterministic, nonlinear dynamical features in human heartbeat intervals, our analysis shows that a highly accurate description of these series at rest and in extreme physiological conditions may be given by an elementary, physiologically based, stochastic model.
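
    For reference, the standard inverse Gaussian waiting-time density on which such point-process heart-rate models are built has the form below; in the model described above the history dependence enters through the mean parameter, and the exact parameterization used by the authors may differ slightly from this generic statement.

        $$ p(w \mid \mu, \lambda) \;=\; \sqrt{\frac{\lambda}{2\pi w^{3}}}\;\exp\!\left[-\frac{\lambda\,(w-\mu)^{2}}{2\,\mu^{2}\,w}\right], \qquad w > 0, $$

    where w is the R-R interval, μ its (history-dependent) mean, and λ a shape parameter; an instantaneous heart-rate density then follows from the change of variables r = c/w (e.g., c = 60 when w is measured in seconds and r in beats per minute).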

  12. Post-manufacturing, 17-times acceptable raw bit error rate enhancement, dynamic codeword transition ECC scheme for highly reliable solid-state drives, SSDs

    Science.gov (United States)

    Tanakamaru, Shuhei; Fukuda, Mayumi; Higuchi, Kazuhide; Esumi, Atsushi; Ito, Mitsuyoshi; Li, Kai; Takeuchi, Ken

    2011-04-01

    A dynamic codeword transition ECC scheme is proposed for highly reliable solid-state drives (SSDs). By monitoring the error count or the write/erase cycles, the ECC codeword dynamically increases from 512 Byte (+parity) to 1 KByte, 2 KByte, 4 KByte … 32 KByte. The proposed ECC with a larger codeword decreases the failure rate after ECC. As a result, the acceptable raw bit error rate (BER) before ECC is enhanced. Assuming a NAND Flash memory which requires 8-bit correction in a 512 Byte codeword ECC, a 17-times higher acceptable raw BER than the conventional fixed 512 Byte codeword ECC is realized for the mobile phone application without interleaving. For the MP3 player, digital still camera and high-speed memory card applications with dual-channel interleaving, a 15-times higher acceptable raw BER is achieved. Finally, for the SSD application with 8-channel interleaving, a 13-times higher acceptable raw BER is realized. Because the ratio of user data to parity bits is the same in each ECC codeword, no additional memory area is required. Note that the reliability of the SSD is improved after manufacturing without a cost penalty. Compared with a conventional ECC with a fixed large 32 KByte codeword, the proposed scheme achieves lower power consumption by introducing a "best-effort" type of operation. In the proposed scheme, during most of the lifetime of the SSD, a weak ECC with a shorter codeword such as 512 Byte (+parity), 1 KByte or 2 KByte is used, and 98% lower power consumption is realized. At the end of the SSD's life, a strong ECC with a 32 KByte codeword is used and highly reliable operation is achieved. The random read performance is also discussed and is estimated by the latency. The latency is below 1.5 ms for ECC codewords up to 32 KByte, which is below the 2 ms average latency of a 15,000 rpm HDD.
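
    To make the dynamic-transition idea concrete, the sketch below picks a codeword size from a measured raw bit error rate. The size ladder follows the abstract, but the per-size acceptable-BER thresholds are invented for illustration and are not the paper's figures.

        # Hypothetical sketch (Python) of dynamic ECC codeword selection.
        CODEWORD_SIZES_KB = [0.5, 1, 2, 4, 8, 16, 32]            # 512 B ... 32 KB
        # Assumed acceptable raw BER per codeword size (illustrative values only;
        # larger codewords can carry stronger codes, so the limits grow).
        ACCEPTABLE_RAW_BER = [1e-5, 2e-5, 4e-5, 8e-5, 1.6e-4, 3.2e-4, 6.4e-4]

        def select_codeword_kb(raw_ber):
            """Return the smallest codeword size whose assumed BER limit
            still covers the currently observed raw bit error rate."""
            for size, limit in zip(CODEWORD_SIZES_KB, ACCEPTABLE_RAW_BER):
                if raw_ber <= limit:
                    return size
            return CODEWORD_SIZES_KB[-1]                          # worst case: 32 KB

        print(select_codeword_kb(5e-6))    # early in life    -> 0.5 (512 B + parity)
        print(select_codeword_kb(3e-4))    # near end of life -> 16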

  13. Analyzing the propagation behavior of scintillation index and bit error rate of a partially coherent flat-topped laser beam in oceanic turbulence.

    Science.gov (United States)

    Yousefi, Masoud; Golmohammady, Shole; Mashal, Ahmad; Kashani, Fatemeh Dabbagh

    2015-11-01

    In this paper, on the basis of the extended Huygens-Fresnel principle, a semianalytical expression describing the on-axis scintillation index of a partially coherent flat-topped (PCFT) laser beam in weak to moderate oceanic turbulence is derived; consequently, by using the log-normal intensity probability density function, the bit error rate (BER) is evaluated. The effects of source factors (such as wavelength, order of flatness, and beam width) and turbulent ocean parameters (such as Kolmogorov microscale, relative strengths of temperature and salinity fluctuations, rate of dissipation of the mean squared temperature, and rate of dissipation of the turbulent kinetic energy per unit mass of fluid) on the propagation behavior of the scintillation index, and hence on the BER, are studied in detail. Results indicate that, in comparison with a Gaussian beam, a PCFT laser beam with a higher order of flatness is found to have lower scintillations. In addition, the scintillation index and BER are most affected when salinity fluctuations in the ocean dominate temperature fluctuations.

  14. Predicting sex offender recidivism. I. Correcting for item overselection and accuracy overestimation in scale development. II. Sampling error-induced attenuation of predictive validity over base rate information.

    Science.gov (United States)

    Vrieze, Scott I; Grove, William M

    2008-06-01

    The authors demonstrate a statistical bootstrapping method for obtaining unbiased item selection and predictive validity estimates from a scale development sample, using data (N = 256) of Epperson et al. [2003 Minnesota Sex Offender Screening Tool-Revised (MnSOST-R) technical paper: Development, validation, and recommended risk level cut scores. Retrieved November 18, 2006 from Iowa State University Department of Psychology web site: http://www.psychology.iastate.edu/~dle/mnsost_download.htm] from which the Minnesota Sex Offender Screening Tool-Revised (MnSOST-R) was developed. Validity (area under receiver operating characteristic curve) reported by Epperson et al. was .77 with 16 items selected. The present analysis yielded an asymptotically unbiased estimator AUC = .58. The present article also focused on the degree to which sampling error renders estimated cutting scores (appropriate to local [varying] recidivism base rates) nonoptimal, so that the long-run performance (measured by correct fraction, the total proportion of correct classifications) of these estimated cutting scores is poor, when they are applied to their parent populations (having assumed values for AUC and recidivism rate). This was investigated by Monte Carlo simulation over a range of AUC and recidivism rate values. Results indicate that, except for AUC values higher than have ever been cross-validated, in combination with recidivism base rates severalfold higher than the literature average [Hanson and Morton-Bourgon, 2004, Predictors of sexual recidivism: An updated meta-analysis. (User report 2004-02.) Ottawa: Public Safety and Emergency Preparedness Canada], the user of an instrument similar in performance to the MnSOST-R cannot expect to achieve correct fraction performance notably in excess of what is achievable from knowing the population recidivism rate alone. The authors discuss the legal implications of their findings for procedural and substantive due process in
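
    As background on why same-sample validity estimates shrink under proper resampling, the sketch below shows one standard bootstrap "optimism correction" for an apparent AUC (a Harrell-style recipe). It is not the authors' exact procedure; the model, toy data, and number of resamples are illustrative assumptions.

        # Minimal sketch (Python): bootstrap optimism-corrected AUC.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        def optimism_corrected_auc(X, y, n_boot=200, seed=0):
            rng = np.random.default_rng(seed)
            model = LogisticRegression(max_iter=1000).fit(X, y)
            apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])
            optimism = []
            for _ in range(n_boot):
                idx = rng.integers(0, len(y), len(y))      # resample with replacement
                if len(np.unique(y[idx])) < 2:
                    continue                               # need both classes present
                m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
                auc_boot = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
                auc_orig = roc_auc_score(y, m.predict_proba(X)[:, 1])
                optimism.append(auc_boot - auc_orig)       # how much the refit flatters itself
            return apparent - np.mean(optimism)

        # Toy usage with simulated data (weak signal, so the corrected AUC drops).
        rng = np.random.default_rng(1)
        X = rng.standard_normal((256, 16))
        y = (X[:, 0] * 0.3 + rng.standard_normal(256) > 0).astype(int)
        print(round(optimism_corrected_auc(X, y), 3))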

  15. A quantitative assay for lysosomal acidification rates in human osteoclasts

    DEFF Research Database (Denmark)

    Jensen, Vicki Kaiser; Nosjean, Olivier; Dziegiel, Morten Hanefeld;

    2011-01-01

    lacunae. The electroneutrality of the lacunae is maintained by chloride transport through the chloride-proton antiporter chloride channel 7. Inhibition of either proton or chloride transport prevents bone resorption. The aims of this study were to validate the human osteoclastic microsome-based influx ..., the effect of valinomycin, inhibitor sensitivity, and the ion profile of the human osteoclast microsomes. The expression level of chloride channel 7 was increased in the human osteoclastic microsomes compared with whole osteoclasts. Acid influx was induced by 1.25 mM adenosine triphosphate. Further, 1.1 μM valinomycin increased the acid influx by 129%. Total abrogation of acid influx was observed using both H(+) and Cl(-) ionophores. Finally, investigation of the anion profile demonstrated that Cl(-) and Br(-) are the preferred anions for the transporter. In conclusion, the acid influx assay based on microsomes...

  16. High triacylglycerol turnover rate in human skeletal muscle

    DEFF Research Database (Denmark)

    Sacchetti, Massimo; Saltin, Bengt; Olsen, David B

    2004-01-01

    could be due to the observed decline in plasma insulin concentration (-74%, P pool(-1)). An increase in FA level ..., as a consequence of short-term fasting, does not seem to increase IMTAG synthesis rate and pool size.

  17. The "human" statistics of terrestrial impact cratering rate

    CERN Document Server

    Jetsu, L

    1997-01-01

    The most significant periodicities in the terrestrial impact crater record are due to the human-signal: the bias of assigning integer values for the crater ages. This bias seems to have eluded the proponents and opponents of real periodicity in the occurrence of these events, as well as the theorists searching for an extraterrestrial explanation for such periodicity. The human-signal should be seriously considered by scientists in astronomy, geology and paleontology when searching for a connection between terrestrial major comet or asteroid impacts and mass extinctions of species.

  18. Somatic microindels in human cancer: the insertions are highly error-prone and derive from nearby but not adjacent sense and antisense templates.

    Science.gov (United States)

    Scaringe, William A; Li, Kai; Gu, Dongqing; Gonzalez, Kelly D; Chen, Zhenbin; Hill, Kathleen A; Sommer, Steve S

    2008-09-15

    Somatic microindels (microdeletions with microinsertions) have been studied in normal mouse tissues using the Big Blue lacI transgenic mutation detection system. Here we analyze microindels in human cancers using an endogenous and transcribed gene, the TP53 gene. Microindel frequency, the enhancement of 1-2 microindels and other features are generally similar to that observed in the non-transcribed lacI gene in normal mouse tissues. The current larger sample of somatic microindels reveals recurroids: mutations in which deletions are identical and the co-localized insertion is similar. The data reveal that the inserted sequences derive from nearby but not adjacent sequences in contrast to the slippage that characterizes the great majority of pure microinsertions. The microindel inserted sequences derive from a template on the sense or antisense strand with similar frequency. The estimated error rate of the insertion process of 13% per bp is by far the largest reported in vivo, with the possible exception of somatic hypermutation in the immunoglobulin gene. The data constrain possible mechanisms of microindels and raise the question of whether microindels are 'scars' from the bypass of large DNA adducts by a translesional polymerase, e.g. the 'Tarzan model' presented herein.

  19. Analysis on Error Tolerance Rate Theory and Its Relative Effecting Factors in Braille Reading%盲文阅读中的容错率理论及相关影响因素分析

    Institute of Scientific and Technical Information of China (English)

    孙宇; 李纯莲; 钟经华

    2016-01-01

    The Braille error tolerance rate includes two aspects: the scheme error tolerance rate, corresponding to the Braille scheme, and the spelling error tolerance rate, corresponding to readers. In order to reasonably evaluate the spelling efficiency of the Chinese Braille scheme and further improve it, this paper presents the concept of the scheme error tolerance rate and makes a statistical analysis of it. The results show that the error tolerance rate is objectively necessary and controllable, and indicate that a Braille scheme with a greater error tolerance rate will be easier to use and popularize. Finally, an optimization function for the scheme error tolerance rate is given, which is helpful for improving the current Braille scheme. The paper also discusses the influence of readers' psychological factors on the Braille error tolerance rate during reading, and reveals the relations of mutual influence, mutual promotion and mutual compensation between the scheme error tolerance rate of the Braille scheme and the spelling error tolerance rate of Braille readers.

  20. A Wearable Capacitive Sensor for Monitoring Human Respiratory Rate

    Science.gov (United States)

    Kundu, Subrata Kumar; Kumagai, Shinya; Sasaki, Minoru

    2013-04-01

    Realizing an untethered, low-cost, and comfortably wearable respiratory rate sensor for long-term breathing monitoring application still remains a challenge. In this paper, a conductive-textile-based wearable respiratory rate sensing technique based on the capacitive sensing approach is proposed. The sensing unit consists of two conductive textile electrodes that can be easily fabricated, laminated, and integrated in garments. Respiration cycle is detected by measuring the capacitance of two electrodes placed on the inner anterior and posterior sides of a T-shirt at either the abdomen or chest position. A convenient wearable respiratory sensor setup with a capacitance-to-voltage converter has been devised. Respiratory rate as well as breathing mode can be accurately identified using the designed sensor. The sensor output provides significant information on respiratory flow. The effectiveness of the proposed system for different breathing patterns has been evaluated by experiments.

  1. Unavoidable Human Errors of Tumor Size Measurement during Specimen Attachment after Endoscopic Resection: A Clinical Prospective Study

    Science.gov (United States)

    Mori, Hirohito; Kobara, Hideki; Tsushimi, Takaaki; Nishiyama, Noriko; Fujihara, Shintaro; Masaki, Tsutomu

    2015-01-01

    Objective Objective evaluation of resected specimen and tumor size is critical because the tumor diameter after endoscopic submucosal dissection affects therapeutic strategies. In this study, we investigated whether the true tumor diameter of gastrointestinal cancer specimens measured by flexible endoscopy is subjective, by testing whether the specimen is correctly attached to the specimen board after endoscopic submucosal dissection resection and whether the measured size differs depending on the endoscopist who attached the specimen. Methods Seventy-two patients diagnosed with early gastric cancer who satisfied the endoscopic submucosal dissection expanded-indication guideline were enrolled. Three endoscopists were randomly selected before every endoscopic submucosal dissection. Each endoscopist separately attached the same resected specimen, measured the maximum resection diameter and tumor size, and removed the lesion from the attachment board. Results The resected specimen diameters measured by the 3 endoscopists were 44.5±13.9 mm (95% confidence interval (CI): 23–67), 37.4±12.0 mm (95% CI: 18–60), and 41.1±13.3 mm (95% CI: 20–63). In the comparison among the 3 groups (Kruskal-Wallis H-test), there were significant differences (H = 6.397, P = 0.040); the recorded tumor sizes were 38.3±13.1 mm (95% CI: 16–67), 31.1±11.2 mm (95% CI: 12.5–53.3), and 34.8±12.8 mm (95% CI: 11.5–62.3). In the comparison among the 3 groups, there were again significant differences (H = 6.917, P = 0.031). Conclusions Human errors regarding the size of attached resected specimens are unavoidable, but they cannot be ignored because they affect the patient's additional treatment and/or surgical intervention. We must develop a more precise methodology to obtain accurate tumor size. Trial Registration University hospital Medical Information Network UMIN No. 000012915 PMID:25856397

  2. Differentiated Bit Error Rate Estimation for Wireless Networks%无线网络的差异化比特错误率估计方法

    Institute of Scientific and Technical Information of China (English)

    张招亮; 陈海明; 黄庭培; 崔莉

    2014-01-01

    In wireless networks, estimation of the bit error rate (BER) underlies many upper-layer protocols and has an important impact on data transmission performance; it has therefore become an important research topic. However, existing BER-estimation codes do not take into account the BER distribution characteristics of real networks, and their estimation error is relatively large. Based on a measurement study of the BER distribution in 802.11 wireless networks, a method called differentiated error estimation (DEE) is proposed, which uses a differentiation idea to improve BER estimation accuracy. Its main idea is to insert into each packet multiple levels of error-estimation bits with different estimation capabilities, and to distribute these bits randomly and uniformly. The BER is then estimated with the help of the theoretical relationship between the BER and the parity-check error probability. In addition, DEE exploits the non-uniform distribution of BER to optimize the capability of each level of estimation bits, improving the estimation accuracy for BER values that occur with higher probability and thus reducing the average estimation error. The performance of DEE was evaluated on a testbed composed of 7 nodes. Experimental results show that, compared with the recent error estimation code (EEC), DEE reduces the estimation error by about 44% on average, and by about 68% when the error-estimation redundancy is low. In addition, DEE exhibits a smaller estimation bias than EEC.
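
    The parity-check relationship that such estimators build on can be illustrated as follows; this is the generic relation for independent bit errors, and the multi-level bit placement that DEE adds on top of it is not reproduced here.

        # Sketch (Python): relation between bit error rate p and the probability q
        # that a parity bit computed over k data bits is observed in error,
        # assuming independent bit errors:  q = (1 - (1 - 2p)^k) / 2.
        def parity_error_prob(p, k):
            return 0.5 * (1.0 - (1.0 - 2.0 * p) ** k)

        def estimate_ber(q_hat, k):
            """Invert the relation to estimate p from an observed parity-error fraction."""
            q_hat = min(max(q_hat, 0.0), 0.5 - 1e-12)       # keep the inversion well defined
            return 0.5 * (1.0 - (1.0 - 2.0 * q_hat) ** (1.0 / k))

        print(parity_error_prob(0.01, 16))   # ~0.138
        print(estimate_ber(0.138, 16))       # ~0.01 recovered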

  3. Comparison of Bit Error Rate Performance of Multi Tone Channel Utilising De-OQPSK and De-Off Set 16 QAM with Guard Interval

    Directory of Open Access Journals (Sweden)

    Ibrahim A.Z. Qatawneh

    2005-01-01

    Full Text Available Digital communications systems use Multi tone Channel (MC) transmission techniques with differentially encoded and differentially coherent demodulation. Today there are two principal MC applications: one is the high-speed digital subscriber loop and the other is the broadcasting of digital audio and video signals. In this study the comparison of multicarrier systems with OQPSK and Offset 16 QAM for high-bit-rate wireless applications is considered. The comparison of the Bit Error Rate (BER) performance of a Multi tone Channel (MC) with offset quadrature amplitude modulation (Offset 16 QAM) and offset quadrature phase shift keying modulation (OQPSK) with guard interval in a fading environment is considered via the use of Monte Carlo simulation methods. BER results are presented for Offset 16 QAM using a guard interval to protect against multipath delay for frequency-selective Rayleigh fading channels and for two-path fading channels in the presence of Additive White Gaussian Noise (AWGN). BER results are also presented for the Multi tone Channel (MC) with differentially encoded offset 16 Quadrature Amplitude Modulation (Offset 16 QAM) and MC with differentially encoded offset quadrature phase shift keying modulation (OQPSK) using a guard interval for a frequency-flat Rician channel in the presence of Additive White Gaussian Noise (AWGN). The performance of multitone systems is also compared with equivalent differentially encoded offset quadrature amplitude modulation (Offset 16 QAM) and differentially encoded offset quadrature phase shift keying modulation (OQPSK) with and without guard interval in the same fading environment.
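
    As a reference point for the Monte Carlo technique itself, the sketch below estimates the BER of plain Gray-coded QPSK in AWGN, which is far simpler than the multitone OQPSK / Offset 16 QAM fading setups studied above; it is shown only to illustrate the simulation method.

        # Minimal Monte Carlo BER loop (Python) for Gray-coded QPSK in AWGN.
        import numpy as np

        def qpsk_ber_monte_carlo(ebn0_db, n_bits=200_000, seed=1):
            rng = np.random.default_rng(seed)
            bits = rng.integers(0, 2, n_bits)
            # Map bit pairs to Gray-coded QPSK symbols with unit average energy.
            i = 1 - 2 * bits[0::2]
            q = 1 - 2 * bits[1::2]
            symbols = (i + 1j * q) / np.sqrt(2)
            ebn0 = 10 ** (ebn0_db / 10)
            sigma = np.sqrt(1 / (4 * ebn0))          # noise std per real dimension (Es = 2 Eb)
            noise = sigma * (rng.standard_normal(symbols.size)
                             + 1j * rng.standard_normal(symbols.size))
            r = symbols + noise
            bits_hat = np.empty(n_bits, dtype=int)
            bits_hat[0::2] = (r.real < 0).astype(int)
            bits_hat[1::2] = (r.imag < 0).astype(int)
            return np.mean(bits_hat != bits)

        for snr_db in (0, 4, 8):
            print(snr_db, "dB ->", qpsk_ber_monte_carlo(snr_db))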

  4. Muscle metaboreflex and autonomic regulation of heart rate in humans

    DEFF Research Database (Denmark)

    Fisher, James P; Adlan, Ahmed M; Shantsila, Alena

    2013-01-01

    We elucidated the autonomic mechanisms whereby heart rate (HR) is regulated by the muscle metaboreflex. Eight male participants (22 ± 3 years) performed three exercise protocols: (1) enhanced metaboreflex activation with partial flow restriction (bi-lateral thigh cuff inflation) during leg cycling...

  5. 情景环境与人为差错的对应关系分析方法%Method for correlation analysis between scenario and human error

    Institute of Scientific and Technical Information of China (English)

    蒋英杰; 孙志强; 宫二玲; 谢红卫

    2011-01-01

    A new method is proposed to analyze the correlation between scenario and human error. The scenario is decomposed into six aspects: operator, machine, task, organization, environment and assistant devices. Based on this scenario decomposition, a taxonomy of performance shaping factors is constructed, which includes thirty-eight items and provides a reference template for the investigation of human error causes. Based on the skill-based, rule-based and knowledge-based (SRK) model, the slip/lapse/mistake framework is introduced to classify human errors, which are categorized as skill-based slips, skill-based lapses, rule-based slips, rule-based mistakes, and knowledge-based mistakes. Grey relational analysis is introduced to analyze the correlation between performance shaping factors and human error types, in which the correlations of "consequent-antecedent" and "antecedent-consequent" are both analyzed. With this method, the performance shaping factors related to a specified human error type and the human error types caused by a specified performance shaping factor can both be sorted according to their correlation degrees. A case study is provided, which shows that the proposed method is applicable to analyzing the correlation between scenario and human error, and can provide important implications for human error prediction and reduction.
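
    For readers unfamiliar with grey relational analysis, the sketch below computes grey relational degrees between a reference sequence and several comparison sequences in the usual way (distinguishing coefficient ρ = 0.5). The data are invented, and the sequences are assumed to be pre-normalized to a common scale; this is a generic illustration, not the paper's implementation.

        # Sketch (Python): grey relational degree of comparison sequences vs. a reference.
        import numpy as np

        def grey_relational_degree(reference, comparisons, rho=0.5):
            """reference: 1-D array; comparisons: 2-D array, one row per factor.
            Sequences are assumed to be normalized to comparable scales."""
            x0 = np.asarray(reference, dtype=float)
            xi = np.atleast_2d(np.asarray(comparisons, dtype=float))
            delta = np.abs(xi - x0)                          # absolute differences
            d_min, d_max = delta.min(), delta.max()
            coeff = (d_min + rho * d_max) / (delta + rho * d_max)
            return coeff.mean(axis=1)                        # one grade per comparison row

        # Toy example: one error type (reference) vs. three candidate shaping factors.
        ref = [0.8, 0.4, 0.6, 0.9]
        factors = [[0.7, 0.5, 0.6, 0.8],
                   [0.2, 0.9, 0.1, 0.3],
                   [0.8, 0.4, 0.5, 0.9]]
        print(grey_relational_degree(ref, factors))          # higher = stronger relation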

  6. EMG versus torque control of human-machine systems: equalizing control signal variability does not equalize error or uncertainty.

    Science.gov (United States)

    Johnson, Reva E; Koerding, Konrad P; Hargrove, Levi J; Sensinger, Jonathon W

    2016-08-25

    In this paper we asked the question: if we artificially raise the variability of torque control signals to match that of EMG, do subjects make similar errors and have similar uncertainty about their movements? We answered this question using two experiments in which subjects used three different control signals: torque, torque+noise, and EMG. First, we measured error on a simple target-hitting task in which subjects received visual feedback only at the end of their movements. We found that even when the signal-to-noise ratio was equal across EMG and torque+noise control signals, EMG resulted in larger errors. Second, we quantified uncertainty by measuring the just-noticeable difference of a visual perturbation. We found that for equal errors, EMG resulted in higher movement uncertainty than both torque and torque+noise. The differences suggest that performance and confidence are influenced by more than just the noisiness of the control signal, and suggest that other factors, such as the user's ability to incorporate feedback and develop accurate internal models, also have significant impacts on the performance and confidence of a person's actions. We theorize that users have difficulty distinguishing between random and systematic errors for EMG control, and future work should examine in more detail the types of errors made with EMG control.

  7. Human disturbance influences reproductive success and growth rate in California sea lions (Zalophus californianus).

    Science.gov (United States)

    French, Susannah S; González-Suárez, Manuela; Young, Julie K; Durham, Susan; Gerber, Leah R

    2011-03-16

    The environment is currently undergoing changes at both global (e.g., climate change) and local (e.g., tourism, pollution, habitat modification) scales that have the capacity to affect the viability of animal and plant populations. Many of these changes, such as human disturbance, have an anthropogenic origin and therefore may be mitigated by management action. To do so requires an understanding of the impact of human activities and changing environmental conditions on population dynamics. We investigated the influence of human activity on important life history parameters (reproductive rate, and body condition, and growth rate of neonate pups) for California sea lions (Zalophus californianus) in the Gulf of California, Mexico. Increased human presence was associated with lower reproductive rates, which translated into reduced long-term population growth rates and suggested that human activities are a disturbance that could lead to population declines. We also observed higher body growth rates in pups with increased exposure to humans. Increased growth rates in pups may reflect a density dependent response to declining reproductive rates (e.g., decreased competition for resources). Our results highlight the potentially complex changes in life history parameters that may result from human disturbance, and their implication for population dynamics. We recommend careful monitoring of human activities in the Gulf of California and emphasize the importance of management strategies that explicitly consider the potential impact of human activities such as ecotourism on vertebrate populations.

  8. Human disturbance influences reproductive success and growth rate in California sea lions (Zalophus californianus).

    Directory of Open Access Journals (Sweden)

    Susannah S French

    Full Text Available The environment is currently undergoing changes at both global (e.g., climate change) and local (e.g., tourism, pollution, habitat modification) scales that have the capacity to affect the viability of animal and plant populations. Many of these changes, such as human disturbance, have an anthropogenic origin and therefore may be mitigated by management action. To do so requires an understanding of the impact of human activities and changing environmental conditions on population dynamics. We investigated the influence of human activity on important life history parameters (reproductive rate, and body condition and growth rate of neonate pups) for California sea lions (Zalophus californianus) in the Gulf of California, Mexico. Increased human presence was associated with lower reproductive rates, which translated into reduced long-term population growth rates and suggested that human activities are a disturbance that could lead to population declines. We also observed higher body growth rates in pups with increased exposure to humans. Increased growth rates in pups may reflect a density dependent response to declining reproductive rates (e.g., decreased competition for resources). Our results highlight the potentially complex changes in life history parameters that may result from human disturbance, and their implication for population dynamics. We recommend careful monitoring of human activities in the Gulf of California and emphasize the importance of management strategies that explicitly consider the potential impact of human activities such as ecotourism on vertebrate populations.

  9. Previous estimates of mitochondrial DNA mutation level variance did not account for sampling error: comparing the mtDNA genetic bottleneck in mice and humans.

    Science.gov (United States)

    Wonnapinij, Passorn; Chinnery, Patrick F; Samuels, David C

    2010-04-09

    In cases of inherited pathogenic mitochondrial DNA (mtDNA) mutations, a mother and her offspring generally have large and seemingly random differences in the amount of mutated mtDNA that they carry. Comparisons of measured mtDNA mutation level variance values have become an important issue in determining the mechanisms that cause these large random shifts in mutation level. These variance measurements have been made with samples of quite modest size, which should be a source of concern because higher-order statistics, such as variance, are poorly estimated from small sample sizes. We have developed an analysis of the standard error of variance from a sample of size n, and we have defined error bars for variance measurements based on this standard error. We calculate variance error bars for several published sets of measurements of mtDNA mutation level variance and show how the addition of the error bars alters the interpretation of these experimental results. We compare variance measurements from human clinical data and from mouse models and show that the mutation level variance is clearly higher in the human data than it is in the mouse models at both the primary oocyte and offspring stages of inheritance. We discuss how the standard error of variance can be used in the design of experiments measuring mtDNA mutation level variance. Our results show that variance measurements based on fewer than 20 measurements are generally unreliable and ideally more than 50 measurements are required to reliably compare variances with less than a 2-fold difference.
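
    The error-bar construction this record refers to can be illustrated with a short numerical sketch. The snippet below is not the authors' code: it estimates the standard error of a sample variance from the general moment-based approximation for Var(s^2), which reduces to the familiar s^2*sqrt(2/(n-1)) under a normality assumption, and the mutation-level values used are hypothetical.

        # Illustrative sketch (not the paper's code): standard error of a sample
        # variance via the moment-based approximation
        #   Var(s^2) ~= m4/n - s^4 * (n - 3) / (n * (n - 1)),
        # which reduces to Var(s^2) ~= 2*s^4/(n - 1) for normally distributed data.
        import numpy as np

        def variance_standard_error(x):
            """Return the sample variance and an estimate of its standard error."""
            x = np.asarray(x, dtype=float)
            n = x.size
            s2 = x.var(ddof=1)                    # unbiased sample variance
            m4 = np.mean((x - x.mean()) ** 4)     # fourth central moment
            var_of_s2 = m4 / n - s2 ** 2 * (n - 3) / (n * (n - 1))
            return s2, np.sqrt(max(var_of_s2, 0.0))

        # Hypothetical mutation-level measurements (percent mutant mtDNA)
        levels = np.array([12.0, 35.5, 48.2, 5.1, 22.3, 60.7, 18.9, 41.0])
        s2, se = variance_standard_error(levels)
        print(f"sample variance = {s2:.1f}, standard error = {se:.1f}")

    As the abstract notes, such error bars widen rapidly for small n, which is why variance comparisons based on fewer than about 20 measurements are unreliable.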

  10. Determination of the Optimized Automation Rate considering Effects of Automation on Human Operators in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejon (Korea, Republic of)]; Kim, Jong Hyun [KEPCO International Nuclear Graduate School, Seosaeng (Korea, Republic of)]; Kim, Man Cheol [Chung-Ang University, Seoul (Korea, Republic of)]

    2015-05-15

    Automation refers to the use of a device or a system to perform a function previously performed by a human operator. It is introduced to reduce human errors and to enhance performance in various industrial fields, including the nuclear industry. However, these positive effects are not always achieved in complex systems such as nuclear power plants (NPPs). An excessive introduction of automation can generate new roles for human operators and change activities in unexpected ways. As more automation systems are accepted, the ability of human operators to detect automation failures and resume manual control is diminished. This disadvantage of automation is called the Out-of-the-Loop (OOTL) problem. The positive and negative effects of automation should be considered at the same time to determine the appropriate level of automation, yet the conventional concept of the automation rate is limited in that it does not consider the effects of automation on human operators. Thus, in this paper a new estimation method for the automation rate is suggested that accounts for both the positive and negative effects of automation in determining the appropriate level of automation.

  11. Sampling Errors in Monthly Rainfall Totals for TRMM and SSM/I, Based on Statistics of Retrieved Rain Rates and Simple Models

    Science.gov (United States)

    Bell, Thomas L.; Kundu, Prasun K.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    Estimates from TRMM satellite data of monthly total rainfall over an area are subject to substantial sampling errors due to the limited number of visits to the area by the satellite during the month. Quantitative comparisons of TRMM averages with data collected by other satellites and by ground-based systems require some estimate of the size of this sampling error. A method of estimating this sampling error based on the actual statistics of the TRMM observations and on some modeling work has been developed. "Sampling error" in TRMM monthly averages is defined here relative to the monthly total a hypothetical satellite permanently stationed above the area would have reported. "Sampling error" therefore includes contributions from the random and systematic errors introduced by the satellite remote sensing system. As part of our long-term goal of providing error estimates for each grid point accessible to the TRMM instruments, sampling error estimates for TRMM based on rain retrievals from TRMM microwave (TMI) data are compared for different times of the year and different oceanic areas (to minimize changes in the statistics due to algorithmic differences over land and ocean). Changes in sampling error estimates arising from changes in rain statistics due 1) to evolution of the official algorithms used to process the data, and 2) to differences from other remote sensing systems such as the Defense Meteorological Satellite Program (DMSP) Special Sensor Microwave/Imager (SSM/I), are analyzed.

  12. Effect of Transducer Orientation on Errors in Ultrasound Image-Based Measurements of Human Medial Gastrocnemius Muscle Fascicle Length and Pennation.

    Science.gov (United States)

    Bolsterlee, Bart; Gandevia, Simon C; Herbert, Robert D

    2016-01-01

    Ultrasound imaging is often used to measure muscle fascicle lengths and pennation angles in human muscles in vivo. Theoretically the most accurate measurements are made when the transducer is oriented so that the image plane aligns with muscle fascicles and, for measurements of pennation, when the image plane also intersects the aponeuroses perpendicularly. However this orientation is difficult to achieve and usually there is some degree of misalignment. Here, we used simulated ultrasound images based on three-dimensional models of the human medial gastrocnemius, derived from magnetic resonance and diffusion tensor images, to describe the relationship between transducer orientation and measurement errors. With the transducer oriented perpendicular to the surface of the leg, the error in measurement of fascicle lengths was about 0.4 mm per degree of misalignment of the ultrasound image with the muscle fascicles. If the transducer is then tipped by 20°, the error increases to 1.1 mm per degree of misalignment. For a given degree of misalignment of muscle fascicles with the image plane, the smallest absolute error in fascicle length measurements occurs when the transducer is held perpendicular to the surface of the leg. Misalignment of the transducer with the fascicles may cause fascicle length measurements to be underestimated or overestimated. Contrary to widely held beliefs, it is shown that pennation angles are always overestimated if the image is not perpendicular to the aponeurosis, even when the image is perfectly aligned with the fascicles. An analytical explanation is provided for this finding.
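
    As a worked illustration of the error magnitudes quoted above, the sketch below simply scales the reported per-degree fascicle-length errors (0.4 mm per degree with the transducer perpendicular to the leg, 1.1 mm per degree when tipped by 20 degrees) by a few hypothetical misalignment angles; the angles themselves are not from the study.

        # Illustrative arithmetic only: fascicle-length error for a given
        # misalignment, using the per-degree error rates quoted in the abstract.
        ERROR_PER_DEG_PERPENDICULAR = 0.4   # mm/degree, transducer perpendicular to the leg
        ERROR_PER_DEG_TIPPED_20DEG = 1.1    # mm/degree, transducer tipped by 20 degrees

        def fascicle_length_error(misalignment_deg, error_per_deg):
            """Approximate absolute fascicle-length measurement error in mm."""
            return misalignment_deg * error_per_deg

        for angle in (2, 5, 10):  # hypothetical misalignment angles in degrees
            print(f"{angle} deg misaligned: "
                  f"{fascicle_length_error(angle, ERROR_PER_DEG_PERPENDICULAR):.1f} mm (perpendicular) vs "
                  f"{fascicle_length_error(angle, ERROR_PER_DEG_TIPPED_20DEG):.1f} mm (tipped 20 deg)")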

  13. Biochemical analysis of six genetic variants of error-prone human DNA polymerase ι involved in translesion DNA synthesis.

    Science.gov (United States)

    Kim, Jinsook; Song, Insil; Jo, Ara; Shin, Joo-Ho; Cho, Hana; Eoff, Robert L; Guengerich, F Peter; Choi, Jeong-Yun

    2014-10-20

    DNA polymerase (pol) ι is the most error-prone among the Y-family polymerases that participate in translesion synthesis (TLS). Pol ι can bypass various DNA lesions, e.g., N(2)-ethyl(Et)G, O(6)-methyl(Me)G, 8-oxo-7,8-dihydroguanine (8-oxoG), and an abasic site, though frequently with low fidelity. We assessed the biochemical effects of six reported genetic variations of human pol ι on its TLS properties, using the recombinant pol ι (residues 1-445) proteins and DNA templates containing a G, N(2)-EtG, O(6)-MeG, 8-oxoG, or abasic site. The Δ1-25 variant, which is the N-terminal truncation of 25 residues resulting from an initiation codon variant (c.3G > A) and also is the formerly misassigned wild-type, exhibited considerably higher polymerase activity than wild-type with Mg(2+) (but not with Mn(2+)), coinciding with its steady-state kinetic data showing a ∼10-fold increase in kcat/Km for nucleotide incorporation opposite templates (only with Mg(2+)). The R96G variant, which lacks a R96 residue known to interact with the incoming nucleotide, lost much of its polymerase activity, consistent with the kinetic data displaying 5- to 72-fold decreases in kcat/Km for nucleotide incorporation opposite templates either with Mg(2+) or Mn(2+), except for that opposite N(2)-EtG with Mn(2+) (showing a 9-fold increase for dCTP incorporation). The Δ1-25 variant bound DNA 20- to 29-fold more tightly than wild-type (with Mg(2+)), but the R96G variant bound DNA 2-fold less tightly than wild-type. The DNA-binding affinity of wild-type, but not of the Δ1-25 variant, was ∼7-fold stronger with 0.15 mM Mn(2+) than with Mg(2+). The results indicate that the R96G variation severely impairs most of the Mg(2+)- and Mn(2+)-dependent TLS abilities of pol ι, whereas the Δ1-25 variation selectively and substantially enhances the Mg(2+)-dependent TLS capability of pol ι, emphasizing the potential translational importance of these pol ι genetic variations, e.g., individual differences

  14. Analytical bit error rate performance evaluation of an orthogonal frequency division multiplexing power line communication system impaired by impulsive and Gaussian channel noise

    Directory of Open Access Journals (Sweden)

    Munshi Mahbubur Rahman

    2015-02-01

    Full Text Available An analytical approach is presented to evaluate the bit error rate (BER) performance of a power line (PL) communication system considering the combined influence of impulsive noise and background PL Gaussian noise. The Middleton class-A noise model is used to evaluate the effect of impulsive noise. The analysis is carried out to find expressions for the signal-to-noise ratio and BER considering orthogonal frequency division multiplexing (OFDM) with binary phase shift keying modulation and coherent demodulation of the OFDM sub-channels. The results are evaluated numerically considering the multipath transfer function model of the PL with a non-flat power spectral density of the PL background noise over a bandwidth of 0.3–100 MHz. The results are plotted for several system and noise parameters, and the power penalty due to impulsive noise is determined at a BER of 10^−6. The computed results show that the system suffers a significant power penalty because of impulsive noise, which is higher at higher channel bandwidth and can be reduced to some extent by increasing the number of OFDM subcarriers. The analytical results conform well with the simulation results reported earlier.
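
    The kind of calculation the record describes can be sketched compactly if the Middleton class-A noise is treated as a Poisson-weighted Gaussian mixture, which is the usual approach; the snippet below is only an illustration, not the paper's derivation, and the impulsive index A, the Gaussian-to-impulsive power ratio GAMMA and the SNR value are hypothetical (the exact SNR convention may also differ from the paper's).

        # Illustrative sketch: BER of coherent BPSK in Middleton class-A noise,
        # modelled as a Poisson-weighted mixture of Gaussian components whose
        # variance is scaled by (m/A + GAMMA) / (1 + GAMMA) for component m.
        import math

        def ber_bpsk_class_a(snr_linear, A=0.1, gamma=0.01, terms=40):
            ber = 0.0
            for m in range(terms):
                weight = math.exp(-A) * A ** m / math.factorial(m)   # Poisson weight
                scale = (m / A + gamma) / (1.0 + gamma)              # noise-variance scaling
                ber += weight * 0.5 * math.erfc(math.sqrt(snr_linear / scale))
            return ber

        snr_db = 20.0                                   # hypothetical average SNR
        print(ber_bpsk_class_a(10 ** (snr_db / 10.0)))

    Sweeping such a curve over SNR is what produces the impulsive-noise power penalty read off at a target BER such as 10^-6.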

  15. Cooling Rates of Humans in Air and in Water: An Experiment

    Science.gov (United States)

    Bohren, Craig F.

    2012-12-01

    In a previous article I analyzed in detail the physical factors resulting in greater cooling rates of objects in still water than in still air, emphasizing cooling of the human body. By cooling rate I mean the rate of decrease of core temperature uncompensated by metabolism. I concluded that the "correct ratio for humans is closer to 2 than to 10." To support this assertion I subsequently did experiments, which I report following a digression on hypothermia.

  16. Proofreading for word errors.

    Science.gov (United States)

    Pilotti, Maura; Chodorow, Martin; Agpawa, Ian; Krajniak, Marta; Mahamane, Salif

    2012-04-01

    Proofreading (i.e., reading text for the purpose of detecting and correcting typographical errors) is viewed as a component of the activity of revising text and thus is a necessary (albeit not sufficient) procedural step for enhancing the quality of a written product. The purpose of the present research was to test competing accounts of word-error detection which predict factors that may influence reading and proofreading differently. Word errors, which change a word into another word (e.g., from --> form), were selected for examination because they are unlikely to be detected by automatic spell-checking functions. Consequently, their detection still rests mostly in the hands of the human proofreader. Findings highlighted the weaknesses of existing accounts of proofreading and identified factors, such as length and frequency of the error in the English language relative to frequency of the correct word, which might play a key role in detection of word errors.

  17. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory]; Anderson, Mark C [Los Alamos National Laboratory]; Habib, Salman [Los Alamos National Laboratory]; Klein, Richard [Los Alamos National Laboratory]; Berliner, Mark [OHIO STATE UNIV.]; Covey, Curt [LLNL]; Ghattas, Omar [UNIV OF TEXAS]; Graziani, Carlo [UNIV OF CHICAGO]; Seager, Mark [LLNL]; Sefcik, Joseph [LLNL]; Stark, Philip [UC/BERKELEY]; Stewart, James [SNL]

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  18. At least some errors are randomly generated (Freud was wrong)

    Science.gov (United States)

    Sellen, A. J.; Senders, J. W.

    1986-01-01

    An experiment was carried out to expose something about human error generating mechanisms. In the context of the experiment, an error was made when a subject pressed the wrong key on a computer keyboard or pressed no key at all in the time allotted. These might be considered, respectively, errors of substitution and errors of omission. Each of seven subjects saw a sequence of three digital numbers, made an easily learned binary judgement about each, and was to press the appropriate one of two keys. Each session consisted of 1,000 presentations of randomly permuted, fixed numbers broken into 10 blocks of 100. One of two keys should have been pressed within one second of the onset of each stimulus. These data were subjected to statistical analyses in order to probe the nature of the error generating mechanisms. Goodness of fit tests for a Poisson distribution for the number of errors per 50 trial interval and for an exponential distribution of the length of the intervals between errors were carried out. There is evidence for an endogenous mechanism that may best be described as a random error generator. Furthermore, an item analysis of the number of errors produced per stimulus suggests the existence of a second mechanism operating on task driven factors producing exogenous errors. Some errors, at least, are the result of constant probability generating mechanisms with error rate idiosyncratically determined for each subject.
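
    The goodness-of-fit analysis described above can be illustrated with a short sketch: fit a Poisson rate to error counts per block and compare observed and expected category frequencies with a chi-square statistic. The counts below are hypothetical, not the experiment's data.

        # Illustrative sketch of a chi-square goodness-of-fit test of a Poisson
        # model for the number of errors per block of trials.
        import numpy as np
        from scipy import stats

        errors_per_block = np.array([1, 0, 2, 1, 3, 0, 1, 2, 0, 1,
                                     2, 1, 0, 1, 1, 2, 0, 3, 1, 1])
        lam = errors_per_block.mean()                  # fitted Poisson rate

        # Observed and expected frequencies for 0, 1, 2 and ">= 3" errors per block
        observed = np.array([(errors_per_block == k).sum() for k in range(3)]
                            + [(errors_per_block >= 3).sum()])
        expected = np.array([stats.poisson.pmf(k, lam) for k in range(3)]
                            + [stats.poisson.sf(2, lam)]) * errors_per_block.size

        chi2 = ((observed - expected) ** 2 / expected).sum()
        dof = len(observed) - 1 - 1                    # one parameter estimated from the data
        print("chi2 =", round(chi2, 2), "p =", round(stats.chi2.sf(chi2, dof), 3))

    A non-significant result is consistent with the constant-probability (random) error-generating mechanism the authors describe; an exponential fit to the intervals between errors can be tested in the same way.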

  19. The Eigenvalue of a Repairable System with a Deteriorating Standby Unit Under Common-Cause Failure and Critical Human Error

    Institute of Scientific and Technical Information of China (English)

    金瑞星; 徐光甫

    2007-01-01

    A stochastic mathematical model of a repairable system consisting of two units and one standby unit, subject to critical human errors and common-cause failures, is discussed. The expression of the resolvent and the number of eigenvalues of the system are studied, and it is concluded, with proof, that each eigenvalue corresponds to one eigenvector.

  20. APJE-SLIM Based Method for Marine Human Error Probability Estimation

    Institute of Scientific and Technical Information of China (English)

    席永涛; 陈伟炯; 夏少生; 张晓东

    2011-01-01

    Safety is the eternal theme of the shipping industry. Research shows that human error is the main cause of maritime accidents. In order to study marine human errors, the performance shaping factors (PSF) are discussed and the human error probability (HEP) is estimated under the influence of PSF. Based on a detailed investigation of human errors in collision avoidance behavior, the most critical task in navigation, and of the associated PSF, the human reliability of mariners during collision avoidance is analyzed by combining APJE and SLIM. Results show that PSF such as fatigue and health status, knowledge, experience and training, task complexity, and safety management and organizational effectiveness have varying degrees of influence on HEP, and that raising the level of these PSF can greatly reduce the HEP. Using APJE to determine the absolute human error probabilities of the endpoint tasks solves the problem that reference-point probabilities are hard to obtain in the SLIM method, and yields marine HEP under different types and levels of PSF influence.
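
    The SLIM part of the approach can be sketched in a few lines: the success likelihood index (SLI) is a weighted sum of PSF ratings, log10(HEP) is assumed to be linear in the SLI, and two anchor tasks with known (here, APJE-style expert-judged) HEPs calibrate the line. This is only an illustration of the general SLIM idea, not the authors' implementation; all weights, ratings and anchor values below are hypothetical.

        # Illustrative SLIM-style sketch: log10(HEP) = a * SLI + b, calibrated
        # from two anchor points whose absolute HEPs come from expert judgment.
        import math

        def sli(weights, ratings):
            """Success likelihood index: weighted sum of PSF ratings on a 0-1 scale."""
            return sum(w * r for w, r in zip(weights, ratings))

        def calibrate(sli_low, hep_low, sli_high, hep_high):
            """Return (a, b) of the line log10(HEP) = a * SLI + b."""
            a = (math.log10(hep_high) - math.log10(hep_low)) / (sli_high - sli_low)
            b = math.log10(hep_low) - a * sli_low
            return a, b

        weights = [0.3, 0.25, 0.2, 0.15, 0.1]      # e.g. fatigue, training, complexity, ...
        a, b = calibrate(sli_low=0.2, hep_low=1e-1, sli_high=0.9, hep_high=1e-4)
        task_ratings = [0.6, 0.7, 0.5, 0.8, 0.6]   # hypothetical PSF ratings for one task
        hep = 10 ** (a * sli(weights, task_ratings) + b)
        print(f"estimated HEP = {hep:.2e}")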

  1. Uncorrected refractive errors

    Directory of Open Access Journals (Sweden)

    Kovin S Naidoo

    2012-01-01

    Full Text Available Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of which 670 million people are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low-and-middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  2. Uncorrected refractive errors.

    Science.gov (United States)

    Naidoo, Kovin S; Jaggernath, Jyoti

    2012-01-01

    Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of which 670 million people are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low-and-middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  3. Influence of partially coherent beam passing through strong turbulence on bit error rate of laser communication systems

    Institute of Scientific and Technical Information of China (English)

    王江安; 赵英俊; 吴荣华; 任席闯

    2009-01-01

    The influence of partially coherent beams propagating through strong turbulence on the bit error rate of laser communication systems was studied, providing a theoretical basis for using multibeam transmission and reception in ship laser communication systems. By analytically solving the equation for laser propagation through the atmospheric turbulence field (ignoring other noise in the system and considering only the bit error rate caused by atmospheric turbulence), the relation between system bit error rate and transmission range was obtained for different turbulence inner scales, transmission laser wavelengths and light source coherence parameters. The results indicate that, under strong turbulence, when the number of transmitting antennas reaches a certain value, the system bit error rate increases gradually with transmission range but tends to saturation once the range grows beyond a certain point; the larger the light source coherence parameter, the lower the system bit error rate; the larger the turbulence inner scale, the higher the system bit error rate; and variation of the transmission laser wavelength has no obvious influence on the system bit error rate.

  4. Resonance of about-weekly human heart rate rhythm with solar activity change.

    Science.gov (United States)

    Cornelissen, G; Halberg, F; Wendt, H W; Bingham, C; Sothern, R B; Haus, E; Kleitman, E; Kleitman, N; Revilla, M A; Revilla, M; Breus, T K; Pimenov, K; Grigoriev, A E; Mitish, M D; Yatsyk, G V; Syutkina, E V

    1996-12-01

    In several human adults, certain solar activity rhythms may influence an about 7-day rhythm in heart rate. When no about-weekly feature was found in the rate of change in sunspot area, a measure of solar activity, the double amplitude of the heart rate rhythm approximated by the fit of a 7-day cosine curve was lower, as was heart rate itself, suggesting that the about-weekly heart rate rhythm corresponds to about-weekly features in solar activity and/or relates to a sunspot cycle.

  5. Effects of Digital Human-Machine Interface Characteristics on Human Error in Nuclear Power Plants

    Institute of Scientific and Technical Information of China (English)

    李鹏程; 张力; 戴立操; 黄卫刚

    2011-01-01

    In order to identify the effects of digital human-machine interface characteristics on human error in nuclear power plants, the new characteristics of the digital human-machine interface are identified by comparison with traditional analog control systems in the aspects of information display, user interface interaction and management, control systems, alarm systems and the procedure system. The negative effects of these characteristics on human error are identified through field research and interviews with operators, and mainly take the form of increased cognitive load and workload, mode confusion, and loss of situation awareness. For these adverse effects, corresponding human error prevention and control measures are provided to support the prevention and minimization of human errors and the optimization of human-machine interface design.

  6. Analysis of the "naming game" with learning errors in communications.

    Science.gov (United States)

    Lou, Yang; Chen, Guanrong

    2015-07-16

    The naming game simulates the process of naming an object by a population of agents organized in a certain communication network. By pair-wise iterative interactions, the population reaches consensus asymptotically. We study the naming game with communication errors during pair-wise conversations, with error rates following a uniform probability distribution. First, a model of the naming game with learning errors in communications (NGLE) is proposed. Then, a strategy for agents to prevent learning errors is suggested. To that end, three typical topologies of communication networks, namely random-graph, small-world and scale-free networks, are employed to investigate the effects of various learning errors. Simulation results on these models show that 1) learning errors slightly affect the convergence speed but distinctively increase the requirement for memory of each agent during lexicon propagation; 2) the maximum number of different words held by the population increases linearly as the error rate increases; 3) without applying any strategy to eliminate learning errors, there is a threshold of the learning errors which impairs the convergence. The new findings may help to better understand the role of learning errors in the naming game as well as in human language development from a network science perspective.
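
    A minimal naming-game simulation with a learning-error probability is sketched below. It is not the paper's NGLE model: it uses a fully connected population rather than random-graph, small-world or scale-free networks, and the error mechanism (a spoken word occasionally misheard as a distinct token) is deliberately simplified; the population size and error rate are hypothetical.

        # Minimal naming-game sketch with learning errors (fully connected population).
        import random

        def naming_game(n_agents=50, error_rate=0.05, max_steps=200_000, seed=1):
            random.seed(seed)
            vocab = [set() for _ in range(n_agents)]        # each agent's word inventory
            next_word = 0
            for step in range(max_steps):
                speaker, hearer = random.sample(range(n_agents), 2)
                if not vocab[speaker]:                      # invent a word if inventory is empty
                    vocab[speaker].add(next_word)
                    next_word += 1
                word = random.choice(tuple(vocab[speaker]))
                # Learning error: with some probability the hearer mislearns the word
                heard = -word - 1 if random.random() < error_rate else word
                if heard in vocab[hearer]:                  # success: both collapse to that word
                    vocab[speaker] = {heard}
                    vocab[hearer] = {heard}
                else:                                       # failure: hearer adds the heard word
                    vocab[hearer].add(heard)
                if all(len(v) == 1 for v in vocab) and len(set.union(*vocab)) == 1:
                    return step                             # global consensus reached
            return None

        print("interactions to consensus:", naming_game())

    Raising error_rate in such a toy model mainly inflates the agents' word inventories before consensus, which mirrors the memory-requirement effect reported above.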

  7. Innovative Reduced Mass TPS Designs for Human-Rated Aeroassist Vehicles Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses Item #2 of Topic X7.04 Aeroassist Systems and proposes innovative heat shield thermal protection systems (TPS) designs for human-rated...

  8. Effective use of pre-job briefing as tool for the prevention of human error; Effektive Nutzung der Arbeitsvorbesprechung als Werkzeug zur Vermeidung von Fehlhandlungen

    Energy Technology Data Exchange (ETDEWEB)

    Schlump, Ansgar [KLE GmbH, Lingen (Germany). Kernkraftwerk Emsland]

    2015-06-15

    There is a fundamental demand to minimise the risks for workers and facilities while executing maintenance work. To ensure that facilities are secure and reliable, any deviation from normal operating behaviour has to be avoided. Accurate planning is the basis for minimising mistakes and making work more secure. All workers involved should understand how the work is to be done and what is expected, in order to avoid human errors. Especially in nuclear power plants, the human performance tools (HPT) have proved to be an effective instrument to minimise human errors. These human performance tools consist of numerous different tools that complement each other (e.g. pre-job briefing). The safety culture of the plants is also characterised by these tools. Choosing the right HP tool is often a difficult task for the work planner: on the one hand, he wants to avoid mistakes during the execution of work, but on the other hand he does not want to irritate the workers with unnecessary requirements. The proposed concept uses a simple risk analysis that takes into account the complexity of the task, past experience and the consequences of failure. One main result of this risk analysis is a recommendation on the level of detail of the pre-job briefing, to reduce the risks for the staff involved to a minimum.

  9. Differentia-based bit error rate estimation method in wireless sensor network

    Institute of Scientific and Technical Information of China (English)

    裴祥喜; 崔炳德; 李珉; 周志敏

    2014-01-01

    The performance of interference detection in wireless sensor networks depends on the performance of bit error rate (BER) estimation; however, existing BER estimation methods are either too complicated to implement or have low precision. To solve this problem, the differentiated error estimation (DEE) method is proposed to enhance the precision of BER estimation. The main idea is that the sender inserts multi-level error-estimation bits with different error-estimation abilities, randomly and uniformly distributed, into its packets, and the receiver estimates the BER by using the relation between the BER and the parity-check failure probability. Meanwhile, DEE exploits the non-uniform distribution of BER values to optimize the error-estimation ability of each level, improving the estimation precision for BERs that occur with higher probability and lowering the average estimation error. Experiments show that, compared with the error estimating coding (EEC) method, DEE decreases the average estimation error by about 44%, and by as much as 68% when the estimation redundancy is reduced.
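
    The parity-based estimation idea mentioned above can be illustrated with a short sketch (this is not the full DEE scheme): for a group of k bits with independent errors at rate p, a single parity check fails with probability f = (1 - (1 - 2p)^k) / 2, so an observed failure fraction can be inverted to estimate p. The group length, number of groups and true BER below are hypothetical.

        # Illustrative sketch of BER estimation from parity-check failures.
        import random

        def estimate_ber(failure_fraction, k):
            """Invert f = (1 - (1 - 2p)**k) / 2 to estimate the bit error rate p."""
            f = min(failure_fraction, 0.4999)          # keep the inverse well defined
            return 0.5 * (1.0 - (1.0 - 2.0 * f) ** (1.0 / k))

        random.seed(0)
        true_ber, k, n_groups = 0.01, 16, 5000         # hypothetical channel and framing
        failures = 0
        for _ in range(n_groups):
            flips = sum(random.random() < true_ber for _ in range(k))
            failures += flips % 2 == 1                 # odd number of flips -> parity fails
        print("estimated BER:", round(estimate_ber(failures / n_groups, k), 4))

    Using several group lengths (multi-level estimation bits) is what lets a DEE-like scheme keep acceptable precision across both low and high BER regions.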

  10. An investigation on the assessed thermal sensation and human body exergy consumption rate

    DEFF Research Database (Denmark)

    Simone, Angela; Kolarik, Jakub; Iwamatsu, Toshiya

    2010-01-01

    -environment research has been explored in the present work. The relationship of subjectively assessed thermal sensation data, from earlier thermal comfort studies, to the calculated human-body exergy consumption has been analysed. The results show that the minimum human body exergy consumption rate was related...

  11. Errors in Radiologic Reporting

    Directory of Open Access Journals (Sweden)

    Esmaeel Shokrollahi

    2010-05-01

    Full Text Available Given that the report is a professional document and bears the associated responsibilities, all of the radiologist's errors appear in it, either directly or indirectly. It is not easy to distinguish and classify the mistakes made when a report is prepared, because in most cases the errors are complex and attributable to more than one cause, and because many errors depend on the individual radiologist's professional, behavioral and psychological traits. In fact, anyone can make a mistake, but some radiologists make more mistakes, and some types of mistakes are predictable to some extent. Reporting errors can be categorized differently:
    Universal vs. individual
    Human related vs. system related
    Perceptive vs. cognitive errors
    1. Descriptive
    2. Interpretative
    3. Decision related
    Perceptive errors:
    1. False positive
    2. False negative
    Nonidentification
    Erroneous identification
    Cognitive errors:
    Knowledge-based
    Psychological

  12. Refractive Errors

    Science.gov (United States)

    ... does the eye focus light? In order to see clearly, light rays from an object must focus onto the ... The refractive errors are: myopia, hyperopia and astigmatism [See figures 2 and 3]. What is hyperopia (farsightedness)? Hyperopia occurs when light rays focus behind the retina (because the eye ...

  13. Human errors in medical practice and their prevention

    Institute of Scientific and Technical Information of China (English)

    周大春; 陈肖敏; 赵彩莲; 蔡秀军

    2009-01-01

    Human errors are errors found in planning or implementation, and those found in medical practice are often major causes of mishaps. To name a few: wrong-site surgery, medication errors, wrong treatment, and inadvertent equipment operation. Errors of this category can be prevented by learning from experience and achievements worldwide. Preventive measures include those taken in the human aspect and the system aspect, such as reinforced education and training, process optimization, and hardware redesign. These measures can be aided by multiple safety steps in risky technical operations, in an effort to break the accident chain; for example, pre-operative surgical site marking, multi-department cooperation in patient identification, bar-coded medication delivery, read-back during verbal communication, and observation of clinical pathways. Continuous quality improvement may be achieved when both management and staff see medical errors in the correct sense, and frontline staff are willing to report their errors.

  14. Human-rating Automated and Robotic Systems - (How HAL Can Work Safely with Astronauts)

    Science.gov (United States)

    Baroff, Lynn; Dischinger, Charlie; Fitts, David

    2009-01-01

    Long duration human space missions, as planned in the Vision for Space Exploration, will not be possible without applying unprecedented levels of automation to support the human endeavors. The automated and robotic systems must carry the load of routine housekeeping for the new generation of explorers, as well as assist their exploration science and engineering work with new precision. Fortunately, the state of automated and robotic systems is sophisticated and sturdy enough to do this work - but the systems themselves have never been human-rated as all other NASA physical systems used in human space flight have. Our intent in this paper is to provide perspective on requirements and architecture for the interfaces and interactions between human beings and the astonishing array of automated systems; and the approach we believe necessary to create human-rated systems and implement them in the space program. We will explain our proposed standard structure for automation and robotic systems, and the process by which we will develop and implement that standard as an addition to NASA's Human Rating requirements. Our work here is based on real experience with both human system and robotic system designs; for surface operations as well as for in-flight monitoring and control; and on the necessities we have discovered for human-systems integration in NASA's Constellation program. We hope this will be an invitation to dialog and to consideration of a new issue facing new generations of explorers and their outfitters.

  15. Toward a cognitive taxonomy of medical errors.

    Science.gov (United States)

    Zhang, Jiajie; Patel, Vimla L; Johnson, Todd R; Shortliffe, Edward H

    2002-01-01

    One critical step in addressing and resolving the problems associated with human errors is the development of a cognitive taxonomy of such errors. In the case of errors, such a taxonomy may be developed (1) to categorize all types of errors along cognitive dimensions, (2) to associate each type of error with a specific underlying cognitive mechanism, (3) to explain why, and even predict when and where, a specific error will occur, and (4) to generate intervention strategies for each type of error. Based on Reason's (1992) definition of human errors and Norman's (1986) cognitive theory of human action, we have developed a preliminary action-based cognitive taxonomy of errors that largely satisfies these four criteria in the domain of medicine. We discuss initial steps for applying this taxonomy to develop an online medical error reporting system that not only categorizes errors but also identifies problems and generates solutions.

  16. FRamework Assessing Notorious Contributing Influences for Error (FRANCIE): Perspective on Taxonomy Development to Support Error Reporting and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lon N. Haney; David I. Gertman

    2003-04-01

    Beginning in the 1980s a primary focus of human reliability analysis was estimation of human error probabilities. However, detailed qualitative modeling with comprehensive representation of contextual variables often was lacking. This was likely due to the lack of comprehensive error and performance shaping factor taxonomies, and the limited data available on observed error rates and their relationship to specific contextual variables. In the mid 90s Boeing, America West Airlines, NASA Ames Research Center and INEEL partnered in a NASA sponsored Advanced Concepts grant to: assess the state of the art in human error analysis, identify future needs for human error analysis, and develop an approach addressing these needs. Identified needs included the need for a method to identify and prioritize task and contextual characteristics affecting human reliability. Other needs identified included developing comprehensive taxonomies to support detailed qualitative modeling and to structure meaningful data collection efforts across domains. A result was the development of the FRamework Assessing Notorious Contributing Influences for Error (FRANCIE) with a taxonomy for airline maintenance tasks. The assignment of performance shaping factors to generic errors by experts proved to be valuable to qualitative modeling. Performance shaping factors and error types from such detailed approaches can be used to structure error reporting schemes. In a recent NASA Advanced Human Support Technology grant FRANCIE was refined, and two new taxonomies for use on space missions were developed. The development, sharing, and use of error taxonomies, and the refinement of approaches for increased fidelity of qualitative modeling is offered as a means to help direct useful data collection strategies.

  17. Implementation of Safety and Human-Rating on Lockheed Martin's Crew Exploration Vehicle

    Science.gov (United States)

    Saemisch, Michael K.

    2005-12-01

    Lockheed Martin leads an industry and academic team to develop requirements and the design of NASA's Crew Exploration Vehicle (CEV) in support of the United States' Vision for Space Exploration. This paper discusses the safety and human-rating requirements, challenges, and approaches taken by the team focusing on safety and human-rating design decisions and trade-offs. Examples of these requirements are failure-tolerance, crew abort/escape, "design for minimum risk", computer-based control, all reviewed by a new NASA human-rating process. NASA allowed contractors freedom in the approaches they could pursue, which offered the opportunity for safety and human-rating goals to influence the basic concepts and major design decisions made early in the program, which drive the major safety features (and limitations) of the CEV project. The paper discusses the method developed by Lockheed Martin, HazComp, to evaluate hazards of proposed concept options, without the benefit of detailed design data used to provide a hazard-based "safety figure of merit" and substantiating data to the trade study decision process. The importance of a well-developed preliminary hazard analysis to support these evaluations is discussed. Major NASA safety and human-rating requirements and their evolution are also discussed along with issues, concerns and recommendations for future human space exploration safety requirements and safety focus.

  18. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Data manual. Part 2: Human error probability (HEP) data; Volume 5, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Reece, W.J.; Gilbert, B.G.; Richards, R.E. [EG and G Idaho, Inc., Idaho Falls, ID (United States)]

    1994-09-01

    This data manual contains a hard copy of the information in the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) Version 3.5 database, which is sponsored by the US Nuclear Regulatory Commission. NUCLARR was designed as a tool for risk analysis. Many of the nuclear reactors in the US and several outside the US are represented in the NUCLARR database. NUCLARR includes both human error probability estimates for workers at the plants and hardware failure data for nuclear reactor equipment. Aggregations of these data yield valuable reliability estimates for probabilistic risk assessments and human reliability analyses. The data manual is organized to permit manual searches of the information if the computerized version is not available. Originally, the manual was published in three parts. In this revision the introductory material located in the original Part 1 has been incorporated into the text of Parts 2 and 3. The user can now find introductory material either in the original Part 1, or in Parts 2 and 3 as revised. Part 2 contains the human error probability data, and Part 3, the hardware component reliability data.

  19. Effect of alteration of translation error rate on enzyme microheterogeneity as assessed by variation in single molecule electrophoretic mobility and catalytic activity.

    Science.gov (United States)

    Nichols, Ellert R; Shadabi, Elnaz; Craig, Douglas B

    2009-06-01

    The role of translation error for Escherichia coli individual beta-galactosidase molecule catalytic and electrophoretic heterogeneity was investigated using CE-LIF. An E. coli rpsL mutant with a hyperaccurate translation phenotype produced enzyme molecules that exhibited significantly less catalytic heterogeneity but no reduction of electrophoretic heterogeneity. Enzyme expressed with streptomycin-induced translation error had increased thermolability, lower activity, and no significant change to catalytic or electrophoretic heterogeneity. Modeling of the electrophoretic behaviour of beta-galactosidase suggested that variation of the hydrodynamic radius may be the most significant contributor to electrophoretic heterogeneity.

  20. Mouse heart rate in a human: diagnostic mystery of an extreme tachyarrhythmia.

    Science.gov (United States)

    Chhabra, Lovely; Goel, Narender; Prajapat, Laxman; Spodick, David H; Goyal, Sanjeev

    2012-01-01

    We report telemetry recording of an extreme non-fatal tachyarrhythmia noted in a hospitalized quadriplegic male with history of atrial fibrillation where the average ventricular conduction rate was found to be about 600 beats per minute and was associated with transient syncope. A medical literature review suggests that the fastest human ventricular conduction rate reported to date in a tachyarrhythmia is 480 beats per minute. We therefore report the fastest human heart rate noted in a tachyarrhythmia and the most probable mechanism of this arrhythmia being a rapid atrial fibrillation with 1:1 conduction in the setting of probable co-existing multiple bypass tracts.

  1. Subjective thermal sensation and human body exergy consumption rate: analysis and correlation

    DEFF Research Database (Denmark)

    Simone, Angela; Dovjak, M.; Kolarik, Jakub

    2011-01-01

    The exergy approach to design and operation of climate conditioning systems is relatively well established, while its exploitation in connection to human perception of the indoor environment is relatively rare. As a building should provide healthy and comfortable environment for its occupants, it is reasonable to consider both the exergy flows in building and those within the human body. There is a need to verify the human-body exergy model with the Thermal-Sensation (TS) response of subjects exposed to different combinations of indoor climate parameters (temperature, humidity, etc.). First results available on the relation between human-body exergy consumption rates and subjectively assessed thermal sensation showed that the minimum human body exergy consumption rate is associated with thermal sensation votes close to thermal neutrality, tending to slightly cool side of thermal sensation. By applying…

  3. A relation between calculated human body exergy consumption rate and subjectively assessed thermal sensation

    DEFF Research Database (Denmark)

    Simone, Angela; Kolarik, Jakub; Iwamatsu, Toshiya

    2011-01-01

    …occupants, it is reasonable to consider both the exergy flows in building and those within the human body. Until now, no data have been available on the relation between human-body exergy consumption rates and subjectively assessed thermal sensation. The objective of the present work was to relate thermal… Generally, the relationship between air temperature and the exergy consumption rate, as a first approximation, shows an increasing trend. Taking account of both convective and radiative heat exchange between the human body and the surrounding environment by using the calculated operative temperature, exergy consumption rates increase as the operative temperature increases above 24 °C or decreases below 22 °C. With the data available so far, a second-order polynomial relationship between thermal sensation and the exergy consumption rate was established.

  4. Human error probability quantification method based on Bayesian information fusion

    Institute of Scientific and Technical Information of China (English)

    蒋英杰; 孙志强; 谢红卫; 宫二玲

    2011-01-01

    The quantification of human error probability is researched. Firstly, the data sources that can be used in the quantification of human error probability are introduced, including generic data, expert data, simulation data, and field data, and their characteristics are analyzed. Secondly, the basic idea of Bayesian information fusion is analyzed, emphasizing two key problems: the formation of prior distributions and the determination of fusion weights. Finally, a new method is presented which quantifies the human error probability based on Bayesian information fusion: the first three kinds of data are regarded as prior information and fused to form the prior distribution, the Bayesian method is used to synthesize them with the field data to obtain the posterior distribution, and the human error probability is quantified from this posterior distribution. An example is analyzed, which shows the process of the method and proves its validity.
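
    A conjugate Beta-Binomial version of the fusion idea can be sketched in a few lines: generic, expert and simulation information are each expressed as Beta pseudo-counts over the HEP, pooled with fusion weights into a single prior, and then updated with field data (errors observed in a number of task demands). This is only an illustration of the general approach, not the authors' method, and all pseudo-counts, weights and field data below are hypothetical.

        # Illustrative Beta-Binomial sketch of Bayesian information fusion for HEP.
        generic    = (2.0, 400.0)     # Beta(alpha, beta) pseudo-counts from a generic database
        expert     = (1.0, 150.0)     # from expert judgment
        simulation = (3.0, 500.0)     # from simulator experiments
        weights    = (0.5, 0.2, 0.3)  # hypothetical fusion weights, summing to 1

        # Weighted pooling of pseudo-counts forms the fused prior
        alpha0 = sum(w * a for w, (a, _) in zip(weights, (generic, expert, simulation)))
        beta0 = sum(w * b for w, (_, b) in zip(weights, (generic, expert, simulation)))

        # Field data: errors observed in a number of demands (hypothetical plant records)
        errors, demands = 1, 250
        alpha_post = alpha0 + errors
        beta_post = beta0 + (demands - errors)

        hep_mean = alpha_post / (alpha_post + beta_post)
        print(f"fused prior Beta({alpha0:.1f}, {beta0:.1f}), posterior mean HEP = {hep_mean:.4f}")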

  5. Impact Analysis of Human Error on Protection System Reliability

    Institute of Scientific and Technical Information of China (English)

    张晶晶; 丁明; 李生虎

    2012-01-01

    For the single main protection scheme and the main-and-backup protection system, a detailed protection system reliability model considering the impact of human error is developed for the first time, based on a condition-based maintenance environment. Corresponding reliability indices are defined, and the impact of human error on protection system reliability is analyzed through an example. The analysis results show that human error has a great impact on the reliability of both the single main protection and the main-and-backup protection system, so human error must be reduced as far as possible during normal operation and maintenance in order to improve human reliability and protection system reliability. In a multiple protection system, not only the reliability of the main protection but also that of the backup protection should be improved, with the prevention of malfunction as the guiding principle.

  6. Physical security and cyber security issues and human error prevention for 3D printed objects: detecting the use of an incorrect printing material

    Science.gov (United States)

    Straub, Jeremy

    2017-06-01

    A wide variety of characteristics of 3D printed objects have been linked to impaired structural integrity and use-efficacy. The printing material can also have a significant impact on the quality, utility and safety characteristics of a 3D printed object. Material issues can be created by vendor issues, physical security issues and human error. This paper presents and evaluates a system that can be used to detect incorrect material use in a 3D printer, using visible light imaging. Specifically, it assesses the ability to ascertain the difference between materials of different color and different types of material with similar coloration.

  7. Toward a cognitive taxonomy of medical errors.

    OpenAIRE

    Zhang, Jiajie; Patel, Vimla L.; Johnson, Todd R.; Shortliffe, Edward H.

    2002-01-01

    One critical step in addressing and resolving the problems associated with human errors is the development of a cognitive taxonomy of such errors. In the case of errors, such a taxonomy may be developed (1) to categorize all types of errors along cognitive dimensions, (2) to associate each type of error with a specific underlying cognitive mechanism, (3) to explain why, and even predict when and where, a specific error will occur, and (4) to generate intervention strategies for each type of e...

  8. Minimal changes in heart rate of incubating American Oystercatchers (Haematopus palliatus) in response to human activity

    Science.gov (United States)

    Borneman, Tracy E.; Rose, Eli T.; Simons, Theodore R.

    2014-01-01

    An organism's heart rate is commonly used as an indicator of physiological stress due to environmental stimuli. We used heart rate to monitor the physiological response of American Oystercatchers (Haematopus palliatus) to human activity in their nesting environment. We placed artificial eggs with embedded microphones in 42 oystercatcher nests to record the heart rate of incubating oystercatchers continuously for up to 27 days. We used continuous video and audio recordings collected simultaneously at the nests to relate physiological response of birds (heart rate) to various types of human activity. We observed military and civilian aircraft, off-road vehicles, and pedestrians around nests. With the exception of high-speed, low-altitude military overflights, we found little evidence that oystercatcher heart rates were influenced by most types of human activity. The low-altitude flights were the only human activity to significantly increase average heart rates of incubating oystercatchers (12% above baseline). Although statistically significant, we do not consider the increase in heart rate during high-speed, low-altitude military overflights to be of biological significance. This noninvasive technique may be appropriate for other studies of stress in nesting birds.

  9. Method to control depth error when ablating human dentin with numerically controlled picosecond laser: a preliminary study.

    Science.gov (United States)

    Sun, Yuchun; Yuan, Fusong; Lv, Peijun; Wang, Dangxiao; Wang, Lei; Wang, Yong

    2015-07-01

    A three-axis numerically controlled picosecond laser was used to ablate dentin to investigate the quantitative relationships among the number of additive pulse layers in two-dimensional scans starting from the focal plane, step size along the normal of the focal plane (focal plane normal), and ablation depth error. A method to control the ablation depth error, suitable to control stepping along the focal plane normal, was preliminarily established. Twenty-four freshly removed mandibular first molars were cut transversely along the long axis of the crown and prepared as 48 tooth sample slices with approximately flat surfaces. Forty-two slices were used in the first section. The picosecond laser was 1,064 nm in wavelength, 3 W in power, and 10 kHz in repetition frequency. For a varying number (n = 5-70) of focal plane additive pulse layers (14 groups, three repetitions each), two-dimensional scanning and ablation were performed on the dentin regions of the tooth sample slices, which were fixed on the focal plane. The ablation depth, d, was measured, and the quantitative function between n and d was established. Six slices were used in the second section. The function was used to calculate and set the timing of stepwise increments, and the single-step size along the focal plane normal was d micrometer after ablation of n layers (n = 5-50; 10 groups, six repetitions each). Each sample underwent three-dimensional scanning and ablation to produce 2 × 2-mm square cavities. The difference, e, between the measured cavity depth and theoretical value was calculated, along with the difference, e 1, between the measured average ablation depth of a single-step along the focal plane normal and theoretical value. Values of n and d corresponding to the minimum values of e and e 1, respectively, were obtained. In two-dimensional ablation, d was largest (720.61 μm) when n = 65 and smallest when n = 5 (45.00 μm). Linear regression yielded the quantitative

  10. A Meta-Meta-Analysis: Empirical Review of Statistical Power, Type I Error Rates, Effect Sizes, and Model Selection of Meta-Analyses Published in Psychology

    Science.gov (United States)

    Cafri, Guy; Kromrey, Jeffrey D.; Brannick, Michael T.

    2010-01-01

    This article uses meta-analyses published in "Psychological Bulletin" from 1995 to 2005 to describe meta-analyses in psychology, including examination of statistical power, Type I errors resulting from multiple comparisons, and model choice. Retrospective power estimates indicated that univariate categorical and continuous moderators, individual…

  11. Types and causes of medication errors from nurse's viewpoint.

    Science.gov (United States)

    Cheragi, Mohammad Ali; Manoocheri, Human; Mohammadnejad, Esmaeil; Ehsani, Syyedeh R

    2013-05-01

    The main professional goal of nurses is to provide and improve human health. Medication errors are among the most common health threatening mistakes that affect patient care. Such mistakes are considered as a global problem which increases mortality rates, length of hospital stay, and related costs. This study was conducted to evaluate the types and causes of nursing medication errors. This cross-sectional study was conducted in 2009. A total number of 237 nurses were randomly selected from nurses working in Imam Khomeini Hospital (Tehran, Iran). They filled out a questionnaire including 10 items on demographic characteristics and 7 items about medication errors. Data were analyzed using descriptive and inferential statistics in SPSS for Windows 16.0. Medication errors had been made by 64.55% of the nurses. In addition, 31.37% of the participants reported medication errors on the verge of occurrence. The most common types of reported errors were wrong dosage and infusion rate. The most common causes were using abbreviations instead of full names of drugs and similar names of drugs. Therefore, the most important cause of medication errors was lack of pharmacological knowledge. There were no statistically significant relationships between medication errors and years of working experience, age, and working shifts. However, a significant relationship was found between errors in intravenous injections and gender. Likewise, errors in oral administration were significantly related with number of patients. Medication errors are a major problem in nursing. Since most cases of medication errors are not reported by nurses, nursing managers must demonstrate positive responses to nurses who report medication errors in order to improve patient safety.

  12. Effect of pre-freezing conditions on the progressive motility recovery rate of human frozen spermatozoa.

    Science.gov (United States)

    Zhang, X; Zhou, Y; Xia, W; Wu, H; Yao, K; Liu, H; Xiong, C

    2012-10-01

    We evaluated the effects of sperm concentration, progressive motility, sperm morphology, duration of abstinence and collection season on the progressive motility recovery rate of human frozen spermatozoa to identify characteristics that predict the progressive motility recovery rate of human frozen spermatozoa and improve the protocol for sperm collecting in sperm banks. A total of 14 190 semen samples donated at Zhejiang human sperm bank of China between September 2006 and June 2011 were collected from 1624 donors. Semen was evaluated according to WHO standard procedures for sperm concentration. Progressive motility, sperm morphology, ejaculate collection season and abstinence time were recorded. After freezing and thawing, the progressive motility was assessed. Results showed that sperm concentration, progressive motility and normal morphology were significantly associated with the progressive motility recovery rate of human frozen spermatozoa. In addition, the abstinence time and collection season also significantly affected progressive motility recovery rate. Our results indicated that sperm concentration, progressive motility and normal morphology could be valuable in predicting the progressive motility recovery rate of human frozen spermatozoa. As such, progressive motility recovery may be improved by donating semen when abstinent for 3-5 days and during seasons other than summer.
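
    The outcome variable in this record is the progressive motility recovery rate. A common definition, assumed here, is the post-thaw progressive motility expressed as a percentage of the pre-freeze value; the numbers below are illustrative only.

```python
def motility_recovery_rate(pre_freeze_pct: float, post_thaw_pct: float) -> float:
    """Progressive motility recovery rate (%), assuming the usual post/pre ratio."""
    return 100.0 * post_thaw_pct / pre_freeze_pct

# Illustrative values: 60% progressive motility before freezing, 33% after thawing.
print(f"{motility_recovery_rate(60.0, 33.0):.1f}%")  # -> 55.0%
```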

  13. Human errors and work performance in a nuclear power plant control room: associations with work-related factors and behavioral coping

    Energy Technology Data Exchange (ETDEWEB)

    Kecklund, Lena Jacobsson; Svenson, Ola

    1997-04-01

    The present study investigated the relationships between the operator's appraisal of his own work situation and the quality of his own work performance as well as self-reported errors in a nuclear power plant control room. In all, 98 control room operators from two nuclear power units filled out a questionnaire and several diaries during two operational conditions, annual outage and normal operation. As expected, the operators reported higher work demands during annual outage compared with normal operation. In response to the increased demands, the operators reported that they used coping strategies such as increased effort, a decreased aspiration level for work performance quality, and increased delegation of tasks to others. This way of coping does not reflect less positive motivation for the work during the outage period. Instead, the operators maintain the same positive motivation for their work, and succeed in being more alert during morning and night shifts. However, the operators feel less satisfied with their work result. The operators also perceive the risk of making minor errors as increasing during outage. The decreased level of satisfaction with the work result during outage persists despite the lowering of the aspiration level for work performance quality during outage. In order to decrease the relative frequency of minor errors, special attention should be given to reducing work demands, such as time pressure and memory demands. In order to decrease misinterpretation errors, special attention should be given to organizational factors such as planning and shift turnovers, in addition to training. In summary, the outage period seems to be a significantly more vulnerable window in the management of a nuclear power plant than the normal power production state. Thus, an increased focus on the outage period and on human factors issues, addressing the synergistic effects of work demands, organizational factors, and coping resources, is an important area for improvement.

  14. Transcription-induced mutational strand bias and its effect on substitution rates in human genes.

    Science.gov (United States)

    Mugal, Carina F; von Grünberg, Hans-Hennig; Peifer, Martin

    2009-01-01

    If substitution rates are not the same on the two complementary DNA strands, a substitution is considered strand asymmetric. Such substitutional strand asymmetries are determined here for the three most frequent types of substitution on the human genome (C --> T, A --> G, and G --> T). Substitution rate differences between both strands are estimated for 4,590 human genes by aligning all repeats occurring within the introns with their ancestral consensus sequences. For 1,630 of these genes, both coding strand and noncoding strand rates could be compared with rates in gene-flanking regions. All three rates considered are found to be on average higher on the coding strand and lower on the transcribed strand in comparison to their values in the gene-flanking regions. This finding points to the simultaneous action of rate-increasing effects on the coding strand--such as increased adenine and cytosine deamination--and transcription-coupled repair as a rate-reducing effect on the transcribed strand. The common behavior of the three rates leads to strong correlations of the rate asymmetries: Whenever one rate is strand biased, the other two rates are likely to show the same bias. Furthermore, we determine all three rate asymmetries as a function of time: the A --> G and G --> T rate asymmetries are both found to be constant in time, whereas the C --> T rate asymmetry shows a pronounced time dependence, an observation that explains the difference between our results and those of an earlier work by Green et al. (2003. Transcription-associated mutational asymmetry in mammalian evolution. Nat Genet. 33:514-517.). Finally, we show that in addition to transcription also the replication process biases the substitution rates in genes.
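
    The strand asymmetry discussed in this record can be summarized as the ratio (or log-ratio) of a substitution rate estimated on the coding strand to the same rate on the transcribed strand, with each rate taken as substitutions per ancestral site in the intronic repeat alignments. A rough sketch under those assumptions, with invented counts:

```python
import math

def substitution_rate(sub_count: int, ancestral_sites: int) -> float:
    """Substitutions per ancestral site (simple per-site rate estimate)."""
    return sub_count / ancestral_sites

def strand_asymmetry(coding_rate: float, transcribed_rate: float) -> float:
    """Log2 ratio: > 0 means the rate is higher on the coding strand."""
    return math.log2(coding_rate / transcribed_rate)

# Invented counts for one gene's intronic repeats (not real data):
c_to_t_coding = substitution_rate(120, 40_000)
c_to_t_transcribed = substitution_rate(90, 40_000)
print(f"C->T asymmetry (log2): {strand_asymmetry(c_to_t_coding, c_to_t_transcribed):.2f}")
```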

  15. Heart rate responses provide an objective evaluation of human disturbance stimuli in breeding birds.

    Science.gov (United States)

    Ellenberg, Ursula; Mattern, Thomas; Seddon, Philip J

    2013-01-01

    Intuition is a poor guide for evaluating the effects of human disturbance on wildlife. Using the endangered Yellow-eyed penguin, Megadyptes antipodes, as an example, we show that heart rate responses provide an objective tool to evaluate human disturbance stimuli and encourage the wider use of this simple and low-impact approach. Yellow-eyed penguins are a flagship species for New Zealand's wildlife tourism; however, unregulated visitor access has recently been associated with reduced breeding success and lower first year survival. We measured heart rate responses of Yellow-eyed penguins via artificial eggs to evaluate a range of human stimuli regularly occurring at their breeding sites. We found the duration of a stimulus to be the most important factor, with elevated heart rate being sustained while a person remained within sight. Human activity was the next important component; a simulated wildlife photographer, crawling slowly around during his stay, elicited a significantly higher heart rate response than an entirely motionless human spending the same time at the same distance. Stimuli we subjectively might perceive as low impact, such as the careful approach of a 'wildlife photographer', resulted in a stronger response than a routine nest-check that involved lifting a bird up to view nest contents. A single, slow-moving human spending 20 min within 2 m from the nest may provoke a response comparable to that of 10 min handling a bird for logger deployment. To reduce cumulative impact of disturbance, any human presence in the proximity of Yellow-eyed penguins needs to be kept at a minimum. Our results highlight the need for objective quantification of the effects of human disturbance in order to provide a sound basis for guidelines to manage human activity around breeding birds.

  16. Strengthening Drug Management to Reduce the Medication Error Rate in the Drug Dispensing Center

    Institute of Scientific and Technical Information of China (English)

    郭玉侠

    2015-01-01

    Objective: To investigate the causes of errors in the hospital's pharmacy intravenous admixture center and to analyze the effect of strengthened drug management on the center's error rate. Methods: Errors occurring in the hospital's intravenous admixture center between June 2014 and June 2015 were analyzed retrospectively, and drug-use error rates before and after the strengthening of management were compared. Results: From June 2014 to December 2014, the center compounded 23,080 intravenous preparations with 63 errors (an error rate of 2.73%); from January 2015 to June 2015, it compounded 23,510 preparations with 10 errors (an error rate of 0.47%). After drug management was strengthened, the rates of errors in order review, dispensing, drug mix-ups, and drug labeling, as well as the total error rate, were all significantly lower than before, and the differences were statistically significant (P < 0.05). Conclusion: Strengthening hospital drug management can effectively reduce the drug-use error rate, promote the safe operation of the hospital's intravenous admixture center, ensure the safety of patient treatment, and reduce medical disputes.
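
    The before/after comparison in this record rests on two error proportions (63 of 23,080 versus 10 of 23,510 preparations). A two-proportion chi-square test, which is one plausible way to obtain the reported significance and is our choice rather than a method stated in the record, can be run as follows.

```python
from scipy.stats import chi2_contingency

# Error counts and totals as reported in the record.
before = (63, 23080)   # errors, total preparations, Jun-Dec 2014
after = (10, 23510)    # errors, total preparations, Jan-Jun 2015

table = [[before[0], before[1] - before[0]],
         [after[0], after[1] - after[0]]]

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```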

  17. Nocturnal variations in peripheral blood flow, systemic blood pressure, and heart rate in humans

    DEFF Research Database (Denmark)

    Sindrup, J H; Kastrup, J; Christensen, H

    1991-01-01

    Subcutaneous adipose tissue blood flow rate, together with systemic arterial blood pressure and heart rate under ambulatory conditions, was measured in the lower legs of 15 normal human subjects for 12-20 h. The 133Xe-washout technique, portable CdTe(Cl) detectors, and a portable data storage unit were used for measurement of blood flow rates. An automatic portable blood pressure recorder and processor unit was used for measurement of systolic blood pressure, diastolic blood pressure, and heart rate every 15 min. The change from upright to supine position at the beginning of the night period was associated with a 30-40% increase in blood flow rate and a highly significant decrease in mean arterial blood pressure and heart rate (P less than 0.001 for all). Approximately 100 min after the subjects went to sleep an additional blood flow rate increment (mean 56%) and a simultaneous significant decrease…
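
    The 133Xe-washout technique mentioned here estimates local blood flow from the mono-exponential decay of tracer counts over the tissue: fit the washout rate constant, then multiply by the tissue-blood partition coefficient. The sketch below assumes that standard formulation; the count data and the partition coefficient of 10 ml/g for xenon in adipose tissue are illustrative assumptions, not values from the study.

```python
import numpy as np

# Illustrative 133Xe clearance counts from subcutaneous adipose tissue.
time_min = np.array([0, 5, 10, 15, 20, 25, 30], dtype=float)
counts = np.array([1000, 983, 966, 950, 933, 918, 902], dtype=float)

# Fit ln(counts) = ln(C0) - k * t to obtain the washout rate constant k (1/min).
k = -np.polyfit(time_min, np.log(counts), 1)[0]

# Perfusion estimate: flow = lambda * k, scaled to ml per 100 g per min,
# where lambda is the assumed tissue-blood partition coefficient for xenon.
partition_coeff_ml_per_g = 10.0
flow_ml_per_100g_min = partition_coeff_ml_per_g * k * 100

print(f"k = {k:.4f} 1/min, flow = {flow_ml_per_100g_min:.1f} ml/100 g/min")
```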

  18. Human error probability quantification using fuzzy methodology in nuclear plants; Aplicacao da metodologia fuzzy na quantificacao da probabilidade de erro humano em instalacoes nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Claudio Souza do

    2010-07-01

    This work obtains Human Error Probability (HEP) estimates for operators' actions in response to hypothesized emergency situations at IPEN's IEA-R1 research reactor. A Performance Shaping Factors (PSF) evaluation was also carried out in order to classify the PSFs according to their level of influence on the operators' actions and to determine their actual states in the plant. Both the HEP estimation and the PSF evaluation were based on specialist judgment gathered through interviews and questionnaires; the specialist group was composed of selected IEA-R1 operators. The specialists' knowledge was represented as linguistic variables, and group evaluation values were obtained using fuzzy logic and fuzzy set theory. The HEP values obtained show good agreement with data published in the literature, supporting the proposed methodology as a viable alternative for Human Reliability Analysis (HRA). (author)
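
    The record maps specialists' linguistic ratings onto fuzzy sets and defuzzifies the aggregate into a numeric HEP. The sketch below illustrates that idea with triangular membership functions on a log-probability scale, invented linguistic categories, and made-up ratings; it is not the study's actual scales or aggregation scheme.

```python
import numpy as np

# Probability axis for the HEP estimate (log-spaced, since HEPs span decades).
hep = np.logspace(-4, -1, 500)
log_hep = np.log10(hep)

def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Hypothetical fuzzy sets for linguistic HEP ratings (on the log10 scale).
sets = {
    "low":    triangular(log_hep, -4.0, -3.5, -2.5),
    "medium": triangular(log_hep, -3.0, -2.5, -1.5),
    "high":   triangular(log_hep, -2.0, -1.5, -1.0),
}

# Hypothetical ratings from five specialists for one operator action.
ratings = ["medium", "medium", "high", "low", "medium"]

# Weight each fuzzy set by how often its label was chosen, then sum the curves.
weights = {label: ratings.count(label) / len(ratings) for label in sets}
aggregate = sum(weights[label] * mu for label, mu in sets.items())

# Centroid defuzzification on the log scale, back-transformed to a probability.
hep_estimate = 10 ** (np.sum(log_hep * aggregate) / np.sum(aggregate))
print(f"defuzzified HEP = {hep_estimate:.2e}")
```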

  19. Effect of UVA Fluence Rate on Indicators of Oxidative Stress in Human Dermal Fibroblasts

    Directory of Open Access Journals (Sweden)

    James D. Hoerter, Christopher S. Ward, Kyle D. Bale, Admasu N. Gizachew, Rachelle Graham, Jaclyn Reynolds, Melanie E. Ward, Chesca Choi, Jean-Leonard Kagabo, Michael Sauer, Tara Kuipers, Timothy Hotchkiss, Nate Banner, Renee A. Chellson, Theresa Ohaeri, L

    2008-01-01

    During the course of a day, human skin is exposed to solar UV radiation that fluctuates in fluence rate within the UVA (315-400 nm) and UVB (290-315 nm) spectrum. Variables affecting the fluence rate reaching skin cells include differences in UVA and UVB penetrating ability, the presence or absence of sunscreens, atmospheric conditions, and the season and geographical location where the exposure occurs. Our study determined the effect of UVA fluence rate in solar-simulated radiation (SSR) and tanning-bed radiation (TBR) on four indicators of oxidative stress (protein oxidation, glutathione, heme oxygenase-1, and reactive oxygen species) in human dermal fibroblasts after equivalent UVA and UVB doses. Our results show that the higher UVA fluence rate in TBR increases the level of all four indicators of oxidative stress. In sequential exposures in which cells are exposed first to SSR, the lower UVA fluence rate in SSR induces a protective response against oxidative stress following a second exposure to a higher UVA fluence rate. Our study underscores the important role of UVA fluence rate in determining how human skin cells respond to a given dose of radiation containing both UVA and UVB.
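
    Delivering "equivalent UVA and UVB doses" at different fluence rates, as this record describes, only requires adjusting exposure time, since dose is fluence rate integrated over time. A small illustration with made-up fluence rates for SSR and TBR:

```python
# Dose (J/cm^2) = fluence rate (mW/cm^2) * time (s) / 1000.
def exposure_time_s(dose_j_cm2: float, fluence_rate_mw_cm2: float) -> float:
    return dose_j_cm2 * 1000.0 / fluence_rate_mw_cm2

target_uva_dose = 10.0  # J/cm^2, illustrative

# Hypothetical fluence rates: lower for solar-simulated radiation (SSR),
# higher for tanning-bed radiation (TBR), mirroring the contrast in the record.
for label, rate_mw_cm2 in [("SSR", 5.0), ("TBR", 20.0)]:
    minutes = exposure_time_s(target_uva_dose, rate_mw_cm2) / 60
    print(f"{label}: {minutes:.1f} min to deliver {target_uva_dose} J/cm^2")
```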

  20. Intergenic DNA sequences from the human X chromosome reveal high rates of global gene flow

    Directory of Open Access Journals (Sweden)

    Wall Jeffrey D

    2008-11-01

    Background Despite intensive efforts devoted to collecting human polymorphism data, little is known about the role of gene flow in the ancestry of human populations. This is partly because most analyses have applied one of two simple models of population structure, the island model or the splitting model, which make unrealistic biological assumptions. Results Here, we analyze 98 kb of DNA sequence from 20 independently evolving intergenic regions on the X chromosome in a sample of 90 humans from six globally diverse populations. We employ an isolation-with-migration (IM) model, which assumes that populations split and subsequently exchange migrants, to independently estimate effective population sizes and migration rates. While the maximum effective size of modern humans is estimated at ~10,000, individual populations vary substantially in size, with African populations tending to be larger (2,300–9,000) than non-African populations (300–3,300). We estimate mean rates of bidirectional gene flow at 4.8 × 10⁻⁴/generation. Bidirectional migration rates are ~5-fold higher among non-African populations (1.5 × 10⁻³) than among African populations (2.7 × 10⁻⁴). Interestingly, because effective sizes and migration rates are inversely related in African and non-African populations, population migration rates are similar within Africa and Eurasia (e.g., global mean Nm = 2.4). Conclusion We conclude that gene flow has played an important role in structuring global human populations and that migration rates should be incorporated as critical parameters in models of human demography.
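
    The population migration rate Nm cited here is the product of the effective population size Ne and the per-generation migration rate m. Using the record's mean migration rate and a mid-range Ne of 5,000 (our illustrative choice; the record reports population-specific ranges), the arithmetic reproduces the stated global mean.

```python
# Nm = effective population size (Ne) * per-generation migration rate (m).
mean_m = 4.8e-4   # mean bidirectional gene flow, from the record
ne = 5_000        # illustrative mid-range effective size

print(f"Nm = {ne * mean_m:.1f}")   # -> 2.4
```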