WorldWideScience

Sample records for unequal error protection

  1. A Hybrid Unequal Error Protection / Unequal Error Resilience ...

    African Journals Online (AJOL)

    The quality layers are then assigned an Unequal Error Resilience to synchronization loss by unequally allocating the number of headers available for synchronization to them. Following that, Unequal Error Protection against channel noise is provided to the layers by the use of Rate Compatible Punctured Convolutional ...

  2. Enhancement of Unequal Error Protection Properties of LDPC Codes

    Directory of Open Access Journals (Sweden)

    Poulliat Charly

    2007-01-01

    It has been widely recognized in the literature that irregular low-density parity-check (LDPC) codes naturally exhibit an unequal error protection (UEP) behavior. In this paper, we propose a general method to emphasize and control the UEP properties of LDPC codes. The method is based on a hierarchical optimization of the bit-node irregularity profile for each sensitivity class within the codeword, maximizing the average bit-node degree while guaranteeing a minimum degree as high as possible. We show that this optimization strategy is efficient, since the codes that we optimize show better UEP capabilities than the codes optimized for the additive white Gaussian noise channel.
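
    As a concrete illustration of the hierarchical optimization described above, the following toy sketch greedily assigns higher bit-node degrees to the more sensitive classes under a fixed edge budget, after guaranteeing a minimum degree for every node. The class sizes, edge budget and degree bounds are invented for illustration; this is not the authors' actual optimizer, which works on the full irregularity profile via density evolution.

        # Toy sketch of hierarchical UEP degree allocation for LDPC variable nodes.
        # Assumptions (not from the paper): nodes are split into sensitivity classes,
        # there is a fixed total edge budget, and one degree is assigned per class.

        def allocate_degrees(class_sizes, edge_budget, d_min=2, d_max=20):
            """Assign one bit-node degree per class, most sensitive class first.

            class_sizes is ordered from most to least sensitive.
            """
            n = sum(class_sizes)
            degrees = [d_min] * len(class_sizes)    # guarantee a minimum degree
            remaining = edge_budget - d_min * n
            if remaining < 0:
                raise ValueError("edge budget too small for the minimum degree")
            for i, size in enumerate(class_sizes):  # most sensitive first
                extra = min(d_max - d_min, remaining // size)
                degrees[i] += extra
                remaining -= extra * size
            return degrees

        # e.g. 100 highly sensitive, 300 medium, 600 low-sensitivity bits
        print(allocate_degrees([100, 300, 600], edge_budget=4000))  # -> [20, 2, 2]

    High-degree variable nodes collect more check-node messages under iterative decoding, which is why raising a class's degree improves its protection.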

  3. Improved Design of Unequal Error Protection LDPC Codes

    Directory of Open Access Journals (Sweden)

    Sandberg Sara

    2010-01-01

    We propose an improved method for designing unequal error protection (UEP) low-density parity-check (LDPC) codes. The method is based on density evolution. The degree distribution with the best UEP properties is found, under the constraint that the threshold should not exceed the threshold of a non-UEP code plus some threshold offset. For different codeword lengths and different construction algorithms, we search for good threshold offsets for the UEP code design. The choice of the threshold offset is based on the average a posteriori variable node mutual information. Simulations reveal the counterintuitive result that the short-to-medium length codes designed with a suitable threshold offset all outperform the corresponding non-UEP codes in terms of average bit-error rate. The proposed codes are also compared to other UEP-LDPC codes found in the literature.

  4. Error-Resilient Unequal Error Protection of Fine Granularity Scalable Video Bitstreams

    Science.gov (United States)

    Cai, Hua; Zeng, Bing; Shen, Guobin; Xiong, Zixiang; Li, Shipeng

    2006-12-01

    This paper deals with the optimal packet loss protection issue for streaming fine granularity scalable (FGS) video bitstreams over IP networks. Unlike many other existing protection schemes, we develop an error-resilient unequal error protection (ER-UEP) method that adds redundant information optimally for loss protection and, at the same time, completely cancels the dependency within the bitstream after loss recovery. In our ER-UEP method, the FGS enhancement-layer bitstream is first packetized into a group of independent and scalable data packets. Parity packets, which are also scalable, are then generated. Unequal protection is finally achieved by properly shaping the data packets and the parity packets. We present an algorithm that can optimally allocate the rate budget between data packets and parity packets, together with several simplified versions that have lower complexity. Compared with conventional UEP schemes that suffer from bit contamination (caused by the bit dependency within a bitstream), our method guarantees successful decoding of all received bits, thus leading to strong error resilience (at any fixed channel bandwidth) and high robustness (under varying and/or unclean channel conditions).
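
    A minimal sketch of the rate-budget split between data and parity packets is given below. The utility model (expected layer gain under i.i.d. packet loss with maximum-distance-separable parity) and the greedy marginal-gain rule are assumptions for illustration; the paper derives an optimal allocation and faster simplified versions.

        # Greedy split of a packet budget into parity packets per scalable layer.
        from math import comb

        def p_decode(k, m, loss):
            """P(recovering k data packets out of k + m sent), MDS erasure code."""
            n = k + m
            return sum(comb(n, r) * (1 - loss)**r * loss**(n - r)
                       for r in range(k, n + 1))

        def allocate(gains, k, loss, budget):
            """gains: quality gain of each layer; k: data packets per layer."""
            parity = [0] * len(gains)
            for _ in range(budget):
                # add the parity packet with the largest marginal expected gain
                deltas = [g * (p_decode(k, m + 1, loss) - p_decode(k, m, loss))
                          for g, m in zip(gains, parity)]
                parity[deltas.index(max(deltas))] += 1
            return parity

        # more important layers (larger gain) end up with more parity packets
        print(allocate(gains=[6.0, 3.0, 1.5], k=8, loss=0.1, budget=9))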

  5. Designing an efficient LT-code with unequal error protection for image transmission

    Science.gov (United States)

    S. Marques, F.; Schwartz, C.; Pinho, M. S.; Finamore, W. A.

    2015-10-01

    The use of images from Earth observation satellites spans different applications, such as car navigation systems and disaster monitoring. In general, those images are captured by on-board imaging devices and must be transmitted to the Earth using a communication system. Even though a high-resolution image can produce a better quality of service, it leads to transmitters with high bit rates, which require a large bandwidth and expend a large amount of energy. Therefore, it is very important to design efficient communication systems. From communication theory, it is well known that a source encoder is crucial in an efficient system. In remote-sensing satellite image transmission, this efficiency is achieved by using an image compressor to reduce the amount of data which must be transmitted. The Consultative Committee for Space Data Systems (CCSDS), a multinational forum for the development of communications and data system standards for space flight, establishes a recommended standard for a data compression algorithm for images from space systems. Unfortunately, in the satellite communication channel, the transmitted signal is corrupted by the presence of noise, interference signals, etc. Therefore, the receiver of a digital communication system may fail to recover the transmitted bit. A channel code can be used to reduce the effect of this failure. In 2002, the Luby Transform code (LT-code) was introduced and shown to be very efficient under the binary erasure channel model. Since the effect of a bit recovery failure depends on the position of the bit in the compressed image stream, in the last decade many efforts have been made to develop LT-codes with unequal error protection. In 2012, Arslan et al. showed improvements when LT-codes with unequal error protection were used for images compressed by the SPIHT algorithm. The techniques presented by Arslan et al. can be adapted to work with the algorithm for image compression ...
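
    The sketch below shows the core of an LT encoder with unequal error protection obtained by biasing neighbor selection towards important source symbols. The ideal soliton distribution and the weighting are simplifications; practical designs (including those of Arslan et al.) use a robust soliton distribution and more refined UEP constructions.

        import random

        def ideal_soliton(k):
            # dist[d] = P(degree = d); index 0 is unused (probability 0)
            return [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

        def lt_encode_symbol(source, weights, dist):
            k = len(source)
            degree = random.choices(range(k + 1), weights=dist)[0]
            idx = set()
            while len(idx) < degree:
                # UEP: important (high-weight) symbols are chosen more often
                idx.add(random.choices(range(k), weights=weights)[0])
            value = 0
            for i in idx:
                value ^= source[i]   # encoded symbol = XOR of the chosen sources
            return sorted(idx), value

        random.seed(1)
        src = [random.randrange(256) for _ in range(16)]
        w = [4.0] * 4 + [1.0] * 12   # favor the first 4 symbols (e.g. SPIHT header)
        print(lt_encode_symbol(src, w, ideal_soliton(16)))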

  6. Unequal Error Protected JPEG 2000 Broadcast Scheme with Progressive Fountain Codes

    OpenAIRE

    Chen, Zhao; Xu, Mai; Yin, Luiguo; Lu, Jianhua

    2012-01-01

    This paper proposes a novel scheme, based on progressive fountain codes, for broadcasting JPEG 2000 multimedia. In such a broadcast scheme, progressive resolution levels of images/video have been unequally protected when transmitted using the proposed progressive fountain codes. With progressive fountain codes applied in the broadcast scheme, the resolutions of images (JPEG 2000) or videos (MJPEG 2000) received by different users can be automatically adaptive to their channel qualities, i.e. ...

  7. Robust Transmission of H.264/AVC Streams Using Adaptive Group Slicing and Unequal Error Protection

    Science.gov (United States)

    Thomos, Nikolaos; Argyropoulos, Savvas; Boulgouris, Nikolaos V.; Strintzis, Michael G.

    2006-12-01

    We present a novel scheme for the transmission of H.264/AVC video streams over lossy packet networks. The proposed scheme exploits the error-resilient features of the H.264/AVC codec and employs Reed-Solomon codes to protect the streams effectively. A novel technique for adaptive classification of macroblocks into three slice groups is also proposed. The optimal classification of macroblocks and the optimal channel rate allocation are achieved by iterating two interdependent steps. Dynamic programming techniques are used for the channel rate allocation process in order to reduce complexity. Simulations clearly demonstrate the superiority of the proposed method over other recent algorithms for transmission of H.264/AVC streams.

  8. Unequal error control scheme for dimmable visible light communication systems

    Science.gov (United States)

    Deng, Keyan; Yuan, Lei; Wan, Yi; Li, Huaan

    2017-01-01

    Visible light communication (VLC), which has the advantages of a very large bandwidth, high security, and freedom from license-related restrictions and electromagnetic interference, has attracted much interest. Because a VLC system simultaneously performs illumination and communication functions, dimming control, efficiency, and reliable transmission are significant and challenging issues of such systems. In this paper, we propose a novel unequal error control (UEC) scheme in which expanding window fountain (EWF) codes in an on-off keying (OOK)-based VLC system are used to support different dimming target values. To evaluate the performance of the scheme for various dimming target values, we apply it to H.264 scalable video coding bitstreams in a VLC system. The results of simulations performed using additive white Gaussian noise (AWGN) at different signal-to-noise ratios (SNRs) are used to compare the performance of the proposed scheme for various dimming target values. It is found that the proposed UEC scheme enables earlier base layer recovery compared to the use of the equal error control (EEC) scheme for different dimming target values, and therefore affords robust transmission for scalable video multicast over optical wireless channels. This is because of the unequal error protection (UEP) and unequal recovery time (URT) of the EWF code in the proposed scheme.
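
    The expanding window fountain (EWF) idea at the heart of the scheme can be sketched as follows: each encoded symbol first picks a window (a nested prefix of the message, with the base layer in the smallest window) and then behaves like an LT symbol inside it. The window sizes, probabilities and degree distribution below are invented illustration values, and the coupling with OOK dimming control is omitted.

        import random

        def ewf_encode_symbol(symbols, windows, window_probs, degree_dist):
            """windows: nested prefix lengths, e.g. [k_base, k_total]."""
            w_end = random.choices(windows, weights=window_probs)[0]
            degree = min(random.choices(range(1, len(degree_dist) + 1),
                                        weights=degree_dist)[0], w_end)
            idx = random.sample(range(w_end), degree)  # uniform within the window
            value = 0
            for i in idx:
                value ^= symbols[i]
            return sorted(idx), value

        random.seed(7)
        data = [random.randrange(256) for _ in range(20)]
        # base layer = first 8 symbols; the small window is picked often, which
        # is what yields the earlier base-layer recovery (URT) mentioned above
        print(ewf_encode_symbol(data, windows=[8, 20], window_probs=[0.6, 0.4],
                                degree_dist=[0.3, 0.4, 0.2, 0.1]))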

  10. Inflation of type I error rates by unequal variances associated with parametric, nonparametric, and Rank-Transformation Tests

    Directory of Open Access Journals (Sweden)

    Donald W. Zimmerman

    2004-01-01

    It is well known that the two-sample Student t test fails to maintain its significance level when the variances of treatment groups are unequal, and, at the same time, sample sizes are unequal. However, introductory textbooks in psychology and education often maintain that the test is robust to variance heterogeneity when sample sizes are equal. The present study discloses that, for a wide variety of non-normal distributions, especially skewed distributions, the Type I error probabilities of both the t test and the Wilcoxon-Mann-Whitney test are substantially inflated by heterogeneous variances, even when sample sizes are equal. The Type I error rate of the t test performed on ranks replacing the scores (rank-transformed data) is inflated in the same way and always corresponds closely to that of the Wilcoxon-Mann-Whitney test. For many probability densities, the distortion of the significance level is far greater after transformation to ranks and, contrary to known asymptotic properties, the magnitude of the inflation is an increasing function of sample size. Although nonparametric tests of location also can be sensitive to differences in the shape of distributions apart from location, the Wilcoxon-Mann-Whitney test and rank-transformation tests apparently are influenced mainly by skewness that is accompanied by specious differences in the means of ranks.
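
    The inflation reported above is easy to reproduce. The Monte Carlo sketch below, with illustrative parameter choices, runs the Student t test, the Wilcoxon-Mann-Whitney test and the rank-transformation t test on skewed data whose groups share a mean but not a variance; all three rejection rates drift above the nominal 0.05.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n, reps, alpha = 25, 20000, 0.05
        rej_t = rej_w = rej_rt = 0
        for _ in range(reps):
            # equal sample sizes, equal means (null true), unequal variances
            x = rng.exponential(1.0, n) - 1.0
            y = 4.0 * (rng.exponential(1.0, n) - 1.0)
            rej_t += stats.ttest_ind(x, y).pvalue < alpha       # Student t
            rej_w += stats.mannwhitneyu(x, y).pvalue < alpha    # WMW
            ranks = stats.rankdata(np.concatenate([x, y]))      # rank transform
            rej_rt += stats.ttest_ind(ranks[:n], ranks[n:]).pvalue < alpha

        print(f"t: {rej_t/reps:.3f}  WMW: {rej_w/reps:.3f}  "
              f"rank-transform t: {rej_rt/reps:.3f}  (nominal {alpha})")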

  11. Retaliation against reporters of unequal treatment: Failing employee protection in The Netherlands

    NARCIS (Netherlands)

    Svensson, Jorgen S.; van Genugten, M.L.

    2013-01-01

    Purpose – Equal treatment in the workplace is considered one of the most fundamental rights of employees. This right also implies that employees must be able to address any form of unequal treatment freely and effectively, without fear of retaliation. The purpose of this paper is to investigate the ...

  12. Unequal Protection of Video Streaming through Adaptive Modulation with a Trizone Buffer over Bluetooth Enhanced Data Rate

    Directory of Open Access Journals (Sweden)

    Razavi Rouzbeh

    2008-01-01

    The Bluetooth enhanced data rate wireless channel can support higher-quality video streams than previous versions of Bluetooth. Packet loss when transmitting compressed data has an effect on the delivered video quality that endures over multiple frames. To reduce the impact of radio frequency noise and interference, this paper proposes adaptive modulation based on content type at the video frame level and content importance at the macroblock level. Because the bit rate of protected data is reduced, the paper proposes buffer management to reduce the risk of buffer overflow. A trizone buffer is introduced, with a varying unequal protection policy in each zone. Application of this policy together with adaptive modulation results in up to 4 dB improvement in objective video quality compared to a fixed-rate scheme for an additive white Gaussian noise channel and around 10 dB for a Gilbert-Elliott channel. The paper also reports a consistent improvement in video quality over a scheme that adapts to channel conditions by varying the data rate without accounting for the video frame packet type or buffer congestion.
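
    A toy version of the resulting policy is sketched below: the modulation chosen for a packet depends on the buffer zone and on content importance. The zone thresholds and the mapping are invented for illustration; the paper's policy is also driven by measured channel conditions.

        # Bluetooth EDR offers GFSK (1 Mb/s, most robust), pi/4-DQPSK (2 Mb/s)
        # and 8DPSK (3 Mb/s, least robust); lower rates give more protection.
        def pick_modulation(buffer_fill, importance):
            """buffer_fill in [0, 1]; importance: 'high' or 'low'."""
            if buffer_fill > 0.8:       # danger zone: drain the buffer quickly
                return "8DPSK (3 Mb/s)"
            if buffer_fill > 0.5:       # middle zone: protect key content only
                return "GFSK (1 Mb/s)" if importance == "high" else "pi/4-DQPSK (2 Mb/s)"
            return "GFSK (1 Mb/s)"      # safe zone: maximum protection

        print(pick_modulation(0.6, "high"))   # -> GFSK (1 Mb/s)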

  14. Reducing Error, Fraud and Corruption (EFC) in Social Protection Programs

    OpenAIRE

    Tesliuc, Emil Daniel; Milazzo, Annamaria

    2007-01-01

    Social Protection (SP) and Social Safety Net (SSN) programs channel a large amount of public resources, so it is important to make sure that these reach the intended beneficiaries. Error, fraud, or corruption (EFC) reduces the economic efficiency of these interventions by decreasing the amount of money that goes to the intended beneficiaries, and erodes the political support for the program. ...

  15. The 1996 European Directive and radiation protection at CERN, or why 15 plus 4 is unequal to 19

    International Nuclear Information System (INIS)

    Hoefert, M.

    1997-04-01

    The recommendations of the 1996 EU Directive on radiation protection are compared with the practice at CERN as laid down in the 1996 Radiation Safety Manual which is largely based on the Swiss Radiation Protection Ordinance of 1994. The three topics discussed are individual dosimetry for persons exposed in the exercise of their profession, exemption values and clearance levels for radioactivity and committed effective dose coefficients, and reference levels for members of the public. (author)

  16. Is equal moral consideration really compatible with unequal moral status?

    Science.gov (United States)

    Rossi, John

    2010-09-01

    The issue of moral considerability, or how much moral importance a being's interests deserve, is one of the most important in animal ethics. Some leading theorists--most notably David DeGrazia--have argued that a principle of "equal moral consideration" is compatible with "unequal moral status." Such a position would reconcile the egalitarian force of equal consideration with more stringent obligations to humans than animals. The article presents arguments that equal consideration is not compatible with unequal moral status, thereby forcing those who would justify significantly different moral protections for humans and animals to argue for unequal consideration.

  17. Automated reactor protection testing saves time and avoids errors

    International Nuclear Information System (INIS)

    Raimondo, E.

    1990-01-01

    When the Pressurized Water Reactor units in the French 900MWe series were designed, the instrumentation and control systems were equipped for manual periodic testing. Manual reactor protection system testing has since been successfully replaced by an automatic system, which is also applicable to other instrumentation testing. A study on the complete automation of process instrumentation testing has been carried out. (author)

  18. Humanitarianism and Unequal Exchange

    Directory of Open Access Journals (Sweden)

    Raja Swamy

    2017-08-01

    This article examines the relationship between humanitarian aid and ecologically unequal exchange in the context of post-disaster reconstruction. I assess the manner in which humanitarian aid became a central part of the reconstruction process in India's Tamil Nadu state following the devastating 2004 Indian Ocean tsunami. This article focuses on how the humanitarian “gift” of housing became a central plank of the state's efforts to push fishers inland while opening up coastal lands for various economic development projects such as ports, infrastructure, industries, and tourism. As part of the state and multilateral agency financed reconstruction process, the humanitarian aid regime provided “free” houses as gifts to recipients while expecting in return the formal abandonment of all claims to the coast. The humanitarian “gift” therefore helped depoliticize critical issues of land and resources, location and livelihood, which prior to the tsunami were subjects of long-standing political conflicts between local fisher populations and the state. The gift economy in effect played into an ongoing conflict over land and resources and effectively sought to ease the alienation of fishers from their coastal commons and near shore marine resource base. I argue that humanitarian aid, despite its associations with benevolence and generosity, presents a troubling and disempowering set of options for political struggles over land, resources, and social entitlements such as housing, thereby intensifying existing ecological and economic inequalities.

  19. HIV / AIDS: An Unequal Burden

    Science.gov (United States)

    Past Issues / Summer 2009 ... high-risk category, emphasizes Dr. Cargill. ... HIV and Pregnancy: Are there ways to help HIV- ...

  20. [Maternal death: unequal risks].

    Science.gov (United States)

    Defossez, A C; Fassin, D

    1989-01-01

    ... rates include political, geographic, and economic mechanisms of exclusion which affect the vast majority of the population in developing countries. Political power is concentrated in the hands of relatively small groups whose decisions about such expenditures as health care are usually more favorable to the privileged. A consequence of the very unequal regional development in most Third World countries is that health, educational, and most other resources are concentrated in large cities and perhaps 1 or 2 strategic regions, leaving most of the population underserved. The low social position of women leaves them doubly vulnerable. The social factors adding to risks of maternal mortality should be considered in programs of prevention if the causes and not just the consequences are to be addressed.

  1. Unequal recognition, misrecognition and injustice

    DEFF Research Database (Denmark)

    Lægaard, Sune

    2012-01-01

    ... by the state of religious minorities. It argues that state–religion relations can be analysed as relations of recognition, which are not only unequal but also multi-dimensional, and that it is difficult to answer the question whether multi-dimensional recognitive inequalities are unjust or wrong if one ...

  2. Hecke algebras with unequal parameters

    CERN Document Server

    Lusztig, G

    2003-01-01

    Hecke algebras arise in representation theory as endomorphism algebras of induced representations. One of the most important classes of Hecke algebras is related to representations of reductive algebraic groups over p-adic or finite fields. In 1979, in the simplest (equal parameter) case of such Hecke algebras, Kazhdan and Lusztig discovered a particular basis (the KL-basis) in a Hecke algebra, which is very important in studying relations between representation theory and geometry of the corresponding flag varieties. It turned out that the elements of the KL-basis also possess very interesting combinatorial properties. In the present book, the author extends the theory of the KL-basis to a more general class of Hecke algebras, the so-called algebras with unequal parameters. In particular, he formulates conjectures describing the properties of Hecke algebras with unequal parameters and presents examples verifying these conjectures in particular cases. Written in the author's precise style, the book gives rese...

  3. Equidistant Linear Network Codes with maximal Error-protection from Veronese Varieties

    DEFF Research Database (Denmark)

    Hansen, Johan P.

    2012-01-01

    Linear network coding transmits information in terms of a basis of a vector space, and the information is received as a basis of a possibly altered vector space. Ralf Koetter and Frank R. Kschischang, in Coding for errors and erasures in random network coding (IEEE Transactions on Information Theory ...), ... construct explicit families of vector spaces of constant dimension where any pair of distinct vector spaces is equidistant in the above metric. The parameters of the resulting linear network codes, which have maximal error protection, are determined.

  4. Support of protective work of human error in a nuclear power plant

    International Nuclear Information System (INIS)

    Yoshizawa, Yuriko

    1999-01-01

    The nuclear power plant human factors group of the Tokyo Electric Power Co., Ltd. supports various human-error prevention activities conducted at nuclear power plants. Its main research themes are studies on human factors in nuclear power plant operation, on error recovery, and common fundamental studies on human factors. In addition, on the basis of the information obtained, it has promoted assistance for human-error prevention work conducted at nuclear power plants, as well as development for practical use. In particular, for the sharing of hazard information, various forms of assistance were promoted, such as proposing case-analysis methods for understanding hazard information effectively and faithfully rather than superficially, building a database for conveniently sharing such hazard information, and conducting surveys of routine (non-accident) work for hints on the effective promotion of prevention work. Introduced here are the assistance and investigations for the effective sharing of hazard information across the various human-error prevention actions conducted mainly at nuclear power plants. (G.K.)

  5. Unequal-Arms Michelson Interferometers

    Science.gov (United States)

    Tinto, Massimo; Armstrong, J. W.

    2000-01-01

    Michelson interferometers allow phase measurements many orders of magnitude below the phase stability of the laser light injected into their two almost equal-length arms. If, however, the two arms are unequal, the laser fluctuations cannot be removed by simply recombining the two beams. This is because the laser jitters experience different time delays in the two arms, and therefore cannot cancel at the photodetector. We present here a method for achieving exact laser noise cancellation, even in an unequal-arm interferometer. The method presented in this paper requires a separate readout of the relative phase in each arm, made by interfering the returning beam in each arm with a fraction of the outgoing beam. By linearly combining the two data sets with themselves, after they have been properly time shifted, we show that it is possible to construct a new data set that is free of laser fluctuations. An application of this technique to future planned space-based laser interferometer detectors of gravitational radiation is discussed.
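
    The cancellation can be checked numerically. With per-arm phase readouts s1(t) = C(t - 2 L1) - C(t) and s2(t) = C(t - 2 L2) - C(t), where C is the laser phase noise and L1, L2 the one-way delays, the combination X(t) = [s1(t) - s2(t)] - [s1(t - 2 L2) - s2(t - 2 L1)] is free of C. The sketch below uses integer-sample delays and ignores the gravitational-wave and instrument-noise terms; it illustrates the idea rather than any mission's actual processing.

        import numpy as np

        rng = np.random.default_rng(2)
        N, L1, L2 = 10000, 300, 170          # unequal arm delays, in samples
        C = rng.standard_normal(N)           # laser phase noise

        def delay(x, d):
            return np.concatenate([np.zeros(d), x[:-d]])

        s1 = delay(C, 2 * L1) - C            # arm-1 phase readout
        s2 = delay(C, 2 * L2) - C            # arm-2 phase readout
        X = (s1 - s2) - (delay(s1, 2 * L2) - delay(s2, 2 * L1))

        start = 2 * (L1 + L2)                # skip the start-up transient
        print(np.max(np.abs(X[start:])))     # ~1e-16: laser noise cancels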

  6. Error Correction and Calibration of a Sun Protection Measurement System for Textile Fabrics

    International Nuclear Information System (INIS)

    Moss, A.R.L.

    2000-01-01

    Clothing is increasingly being labelled with a Sun Protection Factor number which indicates the protection against sunburn provided by the textile fabric. This Factor is obtained by measuring the transmittance of samples of the fabric in the ultraviolet region (290-400 nm). The accuracy and hence the reliability of the label depends on the accuracy of the measurement. Some sun protection measurement systems quote a transmittance accuracy at 2%T of ± 1.5%T. This means a fabric classified under the Australian standard (AS/NZ 4399:1996) with an Ultraviolet Protection Factor (UPF) of 40 would have an uncertainty of +15 or -10. This would not allow classification to the nearest 5, and a UVR protection category of 'excellent protection' might in fact be only 'very good protection'. An accuracy of ±0.1%T is required to give a UPF uncertainty of ±2.5. The measurement system then does not contribute significantly to the error, and the problems are now limited to sample conditioning, position and consistency. A commercial sun protection measurement system has been developed by Camspec Ltd which used traceable neutral density filters and appropriate design to ensure high accuracy. The effects of small zero offsets are corrected and the effect of the reflectivity of the sample fabric on the integrating sphere efficiency is measured and corrected. Fabric orientation relative to the light patch is considered. Signal stability is ensured by means of a reference beam. Traceable filters also allow wavelength accuracy to be conveniently checked. (author)

  8. Recent study, but not retrieval, of knowledge protects against learning errors.

    Science.gov (United States)

    Mullet, Hillary G; Umanath, Sharda; Marsh, Elizabeth J

    2014-11-01

    Surprisingly, people incorporate errors into their knowledge bases even when they have the correct knowledge stored in memory (e.g., Fazio, Barber, Rajaram, Ornstein, & Marsh, 2013). We examined whether heightening the accessibility of correct knowledge would protect people from later reproducing misleading information that they encountered in fictional stories. In Experiment 1, participants studied a series of target general knowledge questions and their correct answers either a few minutes (high accessibility of knowledge) or 1 week (low accessibility of knowledge) before exposure to misleading story references. In Experiments 2a and 2b, participants instead retrieved the answers to the target general knowledge questions either a few minutes or 1 week before the rest of the experiment. Reading the relevant knowledge directly before the story-reading phase protected against reproduction of the misleading story answers on a later general knowledge test, but retrieving that same correct information did not. Retrieving stored knowledge from memory might actually enhance the encoding of relevant misinformation.

  9. Human error probability evaluation as part of reliability analysis of digital protection system of advanced pressurized water reactor - APR 1400

    International Nuclear Information System (INIS)

    Varde, P. V.; Lee, D. Y.; Han, J. B.

    2003-03-01

    A case study on human reliability analysis has been performed as part of the reliability analysis of the digital protection system of the reactor. The protection system automatically actuates the shutdown system of the reactor when demanded. However, the safety analysis takes credit for operator action as a diverse means of tripping the reactor in the (low-probability) ATWS scenario. Based on the available information, two cases, viz., human error in tripping the reactor and calibration error for instrumentation in the protection system, have been analyzed. Wherever applicable, a parametric study has also been performed.

  10. Analysis of covariance with pre-treatment measurements in randomized trials: comparison of equal and unequal slopes.

    Science.gov (United States)

    Funatogawa, Ikuko; Funatogawa, Takashi

    2011-09-01

    In randomized trials, an analysis of covariance (ANCOVA) is often used to analyze post-treatment measurements with pre-treatment measurements as a covariate to compare two treatment groups. Random allocation guarantees only equal variances of pre-treatment measurements. We hence consider data with unequal covariances and variances of post-treatment measurements, without assuming normality. Recently, we showed that the actual type I error rate of the usual ANCOVA assuming equal slopes and equal residual variances is asymptotically at a nominal level under equal sample sizes, and that of the ANCOVA with unequal variances is asymptotically at a nominal level, even under unequal sample sizes. In this paper, we investigated the asymptotic properties of the ANCOVA with unequal slopes for such data. The estimators of the treatment effect at the observed mean are identical between the equal and unequal variance assumptions, and they are asymptotically normal estimators of the treatment effect at the true mean. However, the variances of these estimators based on standard formulas are biased, and the actual type I error rates are not at a nominal level, irrespective of the variance assumptions. Under equal sample sizes, the efficiency of the usual ANCOVA assuming equal slopes and equal variances is asymptotically the same as that of the ANCOVA with unequal slopes, and higher than that of the ANCOVA with equal slopes and unequal variances. Therefore, the use of the usual ANCOVA is appropriate under equal sample sizes. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
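
    The two competing models are straightforward to fit side by side. The hedged sketch below uses simulated data and statsmodels, with placeholder variable names, to show where the extra pre:group interaction term of the unequal-slopes ANCOVA enters.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n = 100
        g = np.repeat([0, 1], n)                         # randomized groups
        pre = rng.normal(50, 10, 2 * n)                  # pre-treatment measure
        post = 5 + 0.8 * pre + 2.0 * g + rng.normal(0, 8, 2 * n)
        df = pd.DataFrame({"pre": pre, "post": post, "group": g})

        equal_slopes = smf.ols("post ~ pre + C(group)", data=df).fit()
        unequal_slopes = smf.ols("post ~ pre * C(group)", data=df).fit()

        print(equal_slopes.params["C(group)[T.1]"])      # treatment effect
        print(unequal_slopes.params)                     # separate slope per group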

  11. Reply to "Comment on `Protecting bipartite entanglement by quantum interferences' "

    Science.gov (United States)

    Das, Sumanta; Agarwal, G. S.

    2018-03-01

    In a recent Comment [Nair and Arun, Phys. Rev. A 97, 036301 (2018), 10.1103/PhysRevA.97.036301], it was concluded that the two-qubit entanglement protection reported in our work [Das and Agarwal, Phys. Rev. A 81, 052341 (2010), 10.1103/PhysRevA.81.052341] is erroneous. While we acknowledge the error in the analytical results on concurrence when the dipole matrix elements were unequal, the essential conclusions on entanglement protection are not affected.

  12. Algorithms for Unequal-Arm Michelson Interferometers

    Science.gov (United States)

    Giampieri, Giacomo; Hellings, Ronald W.; Tinto, Massimo; Bender, Peter L.; Faller, James E.

    1994-01-01

    A method of data acquisition and data analysis is described in which the performance of Michelson-type interferometers with unequal arms can be made nearly the same as interferometers with equal arms. The method requires a separate readout of the relative phase in each arm, made by interfering the returning beam in each arm with a fraction of the outgoing beam.

  13. On the Efficient Broadcasting of Heterogeneous Services over Band-Limited Channels: Unequal Power Allocation for Wavelet Packet Division Multiplexing

    Directory of Open Access Journals (Sweden)

    Maurizio Murroni

    2008-01-01

    Multiple transmission of heterogeneous services is a central aspect of broadcasting technology. Often, in this framework, the design of efficient communication systems is complicated by stringent bandwidth constraints. In wavelet packet division multiplexing (WPDM), the message signals are waveform coded onto wavelet packet basis functions. The overlapping nature of such waveforms in both time and frequency improves performance over the commonly used FDM and TDM schemes, while their orthogonality properties permit extraction of the message signals by a simple correlator receiver. Furthermore, the scalable structure of WPDM makes it suitable for broadcasting heterogeneous services. This work investigates unequal error protection (UEP) of data which exhibit different sensitivities to channel errors, to improve the performance of WPDM for transmission over band-limited channels. To cope with the bandwidth constraint, an appropriate distribution of power among waveforms is proposed, driven by the channel error sensitivities of the carried message signals in the case of Gaussian noise. We address this problem by means of genetic algorithms (GAs), which allow a flexible suboptimal solution with reduced complexity. The mean square error (MSE) between the original and the decoded message, which has a strong correlation with subjective perception, is used as the optimization criterion.
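
    A compact genetic-algorithm sketch for the power allocation step is given below. The distortion model mse_i = s_i / (1 + p_i) is a toy stand-in for the channel error sensitivities of the waveforms, and the GA operators are deliberately minimal; the paper optimizes the true MSE of the decoded messages.

        import numpy as np

        rng = np.random.default_rng(5)
        s = np.array([8.0, 4.0, 2.0, 1.0])    # error sensitivities of 4 messages
        P_total, pop_size, gens = 4.0, 60, 200

        def fitness(p):
            return -np.sum(s / (1.0 + p), axis=-1)   # minimize weighted MSE

        def normalize(p):
            p = np.abs(p)
            return P_total * p / p.sum(axis=-1, keepdims=True)  # power budget

        pop = normalize(rng.random((pop_size, len(s))))
        for _ in range(gens):
            f = fitness(pop)
            parents = pop[np.argsort(f)[-pop_size // 2:]]            # selection
            children = parents + rng.normal(0, 0.05, parents.shape)  # mutation
            pop = normalize(np.vstack([parents, children]))

        print(pop[np.argmax(fitness(pop))])   # more power to sensitive messages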

  14. Isolating Graphical Failure-Inducing Input for Privacy Protection in Error Reporting Systems

    Directory of Open Access Journals (Sweden)

    Matos João

    2016-04-01

    This work proposes a new privacy-enhancing system that minimizes the disclosure of information in error reports. Error reporting mechanisms are of the utmost importance for correcting software bugs but, unfortunately, the transmission of an error report may reveal users’ private information. Some privacy-enhancing systems for error reporting have been presented in past years, yet they rely on path condition analysis, which we show in this paper to be ineffective when it comes to graphical-based input. Knowing that numerous applications have graphical user interfaces (GUIs), it is very important to overcome this limitation. This work describes a new privacy-enhancing error reporting system, based on a new input minimization algorithm called GUIᴍɪɴ that is geared towards GUIs, to remove input that is unnecessary to reproduce the observed failure. Before deciding whether to submit the error report, the user is provided with a step-by-step graphical replay of the minimized input, to evaluate whether it still yields sensitive information. We also provide an open source implementation of the proposed system and evaluate it with well-known applications.
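
    The input-minimization step can be sketched with a simple greedy event-sequence reducer: repeatedly drop a GUI event and keep the shorter sequence whenever the failure still reproduces. The oracle below is a toy stand-in for replaying events against the application, and GUIᴍɪɴ itself is more sophisticated than this one-event-at-a-time loop.

        def minimize(events, reproduces):
            i = 0
            while i < len(events):
                candidate = events[:i] + events[i + 1:]  # try dropping one event
                if reproduces(candidate):
                    events = candidate                   # keep the smaller input
                else:
                    i += 1
            return events

        # toy failure: crashes whenever "paste" is later followed by "undo"
        def crash(ev):
            return ("paste" in ev and "undo" in ev
                    and ev.index("paste") < ev.index("undo"))

        print(minimize(["click", "paste", "type", "scroll", "undo", "close"],
                       crash))                           # -> ['paste', 'undo']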

  15. Human Factors Risk Analyses of a Doffing Protocol for Ebola-Level Personal Protective Equipment: Mapping Errors to Contamination.

    Science.gov (United States)

    Mumma, Joel M; Durso, Francis T; Ferguson, Ashley N; Gipson, Christina L; Casanova, Lisa; Erukunuakpor, Kimberly; Kraft, Colleen S; Walsh, Victoria L; Zimring, Craig; DuBose, Jennifer; Jacob, Jesse T

    2018-03-05

    Doffing protocols for personal protective equipment (PPE) are critical for keeping healthcare workers (HCWs) safe during care of patients with Ebola virus disease. We assessed the relationship between errors and self-contamination during doffing. Eleven HCWs experienced with doffing Ebola-level PPE participated in simulations in which HCWs donned PPE marked with surrogate viruses (ɸ6 and MS2), completed a clinical task, and were assessed for contamination after doffing. Simulations were video recorded, and a failure modes and effects analysis and fault tree analyses were performed to identify errors during doffing, quantify their risk (risk index), and predict contamination data. Fifty-one types of errors were identified, many having the potential to spread contamination. Hand hygiene and removing the powered air purifying respirator (PAPR) hood had the highest total risk indexes (111 and 70, respectively) and number of types of errors (9 and 13, respectively). ɸ6 was detected on 10% of scrubs and the fault tree predicted a 10.4% contamination rate, likely occurring when the PAPR hood inadvertently contacted scrubs during removal. MS2 was detected on 10% of hands, 20% of scrubs, and 70% of inner gloves and the predicted rates were 7.3%, 19.4%, 73.4%, respectively. Fault trees for MS2 and ɸ6 contamination suggested similar pathways. Ebola-level PPE can both protect and put HCWs at risk for self-contamination throughout the doffing process, even among experienced HCWs doffing with a trained observer. Human factors methodologies can identify error-prone steps, delineate the relationship between errors and self-contamination, and suggest remediation strategies.
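
    The fault-tree arithmetic behind such predictions combines independent basic-event probabilities through AND/OR gates, as in the generic sketch below. The probabilities are invented placeholders, not the study's measured risk indexes.

        def or_gate(probs):
            """P(at least one independent basic event occurs)."""
            p = 1.0
            for q in probs:
                p *= (1.0 - q)
            return 1.0 - p

        def and_gate(probs):
            """P(all independent basic events occur)."""
            p = 1.0
            for q in probs:
                p *= q
            return p

        # e.g. scrubs contaminated if (hood touches scrubs AND touch transfers
        # virus) OR (glove error AND hand touches scrubs) -- placeholder numbers
        p_scrubs = or_gate([and_gate([0.30, 0.35]), and_gate([0.10, 0.05])])
        print(f"{p_scrubs:.3f}")   # 0.109, i.e. a ~10% contamination rate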

  16. Compact Unequal Power Divider with Filtering Response

    Directory of Open Access Journals (Sweden)

    Wei-Qiang Pan

    2015-01-01

    We present a novel unequal power divider with bandpass responses. The proposed power divider consists of five resonators and a resistor. The power division ratio is controlled by altering the coupling strength among the resonators. The output ports have the characteristic impedance of 50 Ω, and the impedance transformers of classical Wilkinson power dividers are not required in this design. The use of resonators enables the filtering function of the power divider. Two transmission zeros are generated near the passband edges, resulting in quasi-elliptic bandpass responses. For validation, a 2:1 filtering power divider is implemented. The fabricated circuit size is 0.22 λg × 0.08 λg, featuring a compact size for unequal filtering power dividers, which is suitable for the feeding networks of antenna arrays.

  17. Kazhdan-Lusztig cells with unequal parameters

    CERN Document Server

    Bonnafé, Cédric

    2017-01-01

    This monograph provides a comprehensive introduction to the Kazhdan-Lusztig theory of cells in the broader context of the unequal parameter case. Serving as a useful reference, the present volume offers a synthesis of significant advances made since Lusztig’s seminal work on the subject was published in 2002. The focus lies on the combinatorics of the partition into cells for general Coxeter groups, with special attention given to induction methods, cellular maps and the role of Lusztig's conjectures. Using only algebraic and combinatorial methods, the author carefully develops proofs, discusses open conjectures, and presents recent research, including a chapter on the action of the cactus group. Kazhdan-Lusztig Cells with Unequal Parameters will appeal to graduate students and researchers interested in related subject areas, such as Lie theory, representation theory, and combinatorics of Coxeter groups. Useful examples and various exercises make this book suitable for self-study and use alongside lecture c...

  18. Assessing Visibility of Individual Transmission Errors in Networked Video

    DEFF Research Database (Denmark)

    Korhonen, Jari; Mantel, Claire

    2016-01-01

    ... could benefit from information about the subjective visibility of individual packet losses; for example, computational resources could be directed more efficiently to unequal error protection and concealment by focusing on the visually most disturbing artifacts. In this paper, we present a novel subjective methodology for packet loss artifact detection by tapping a touchscreen where a defect is observed. To validate the proposed methodology, the results of a pilot study are presented and analyzed. According to the results, the proposed method can be used to derive qualitatively and statistically meaningful data on the subjective visibility of individual packet loss artifacts.

  19. Error Floor Analysis of Coded Slotted ALOHA over Packet Erasure Channels

    DEFF Research Database (Denmark)

    Ivanov, Mikhail; Graell i Amat, Alexandre; Brannstrom, F.

    2014-01-01

    We present a framework for the analysis of the error floor of coded slotted ALOHA (CSA) for finite frame lengths over the packet erasure channel. The error floor is caused by stopping sets in the corresponding bipartite graph, whose enumeration is, in general, not a trivial problem. We therefore identify the most dominant stopping sets for the distributions of practical interest. The derived analytical expressions allow us to accurately predict the error floor at low to moderate channel loads and characterize the unequal error protection inherent in CSA.

  20. Unequal-time correlators for cosmology

    Science.gov (United States)

    Kitching, T. D.; Heavens, A. F.

    2017-03-01

    Measurements of the power spectrum from large-scale structure surveys have, to date, assumed an equal-time approximation, where the full cross-correlation power spectrum of the matter density field evaluated at different times (or distances) has been approximated either by the power spectrum at a fixed time or, in an improved fashion, by a geometric mean P(k; r1, r2) = [P(k; r1) P(k; r2)]^(1/2). In this paper we investigate the expected impact of the geometric-mean ansatz and present an application assessing the impact on weak-gravitational-lensing cosmological parameter inference, using a perturbative unequal-time correlator. As one might expect, we find that the impact of this assumption is greatest at large separations in redshift, Δz ≳ 0.3, where the change in the amplitude of the matter power spectrum can be as much as 10 percent for k ≳ 5 h Mpc^-1. However, of more concern is that the corrections for small separations, where the clustering is not close to zero, may not be negligibly small. In particular, we find that for a Euclid- or LSST-like weak lensing experiment, the assumption of equal-time correlators may result in biased predictions of the cosmic shear power spectrum, and that the impact is strongly dependent on the amplitude of the intrinsic alignment signal. To compute unequal-time correlations to sufficient accuracy will require advances in either perturbation theory to high k modes or extensive use of simulations.

  1. The Unequal Power Relation in the Final Interpretation

    DEFF Research Database (Denmark)

    Almlund, Pernille

    2013-01-01

    ... if the interpretation also takes the unequal power relation into account. Consequently, interpreting the researched in a respectful manner is difficult. This article demonstrates the necessity of increasing awareness of the unequal power relation by posing, discussing and, to some extent, answering three methodological questions inspired by meta-theory that are significant for qualitative research and qualitative researchers to reflect on. This article concludes that respectful interpretation and consciously paying attention to the unequal power relation in the final interpretation require decentring the subject ...

  2. Professional installation of overvoltage protection devices. The most common installation errors; Fachgerechte Installation von Ueberspannungsschutzgeraeten. Die haeufigsten Installationsfehler

    Energy Technology Data Exchange (ETDEWEB)

    Gmelch, L. [Dehn und Soehne GmbH und Co. KG, Neumarkt (Germany)

    2007-07-01

    Increasingly sensitive electronic equipment and high demands on plant and system availability necessitate effective protection against lightning and overvoltage. Apart from the measures that must be taken already in the projecting phase in order to minimize the risk of interference, disturbance and destruction of plants and systems, there are also some basic principles that must be observed when installing overvoltage protection equipment. If these are neglected, the protective function of overvoltage protection systems may be seriously impaired. (orig.)

  3. The Theory of Unequal Exchange: The End of the Debate?

    NARCIS (Netherlands)

    R. Brown (Richard)

    1978-01-01

    The overall objective of this work is to examine the theory of Unequal Exchange, the recent critiques of it, and its interrelation with questions concerning the effects and role of foreign investment in underdeveloped countries. Interest in this debate was stimulated largely by the ...

  4. The Dugdale solution for two unequal straight cracks weakening

    Indian Academy of Sciences (India)

    A crack arrest model is proposed for an infinite elastic perfectly-plastic plate weakened by two unequal, quasi-static, collinear straight cracks. The Dugdale model solution is obtained for the above problem when the developed plastic zones are subjected to normal cohesive quadratically varying yield point stress. Employing ...

  5. Random linear network coding for streams with unequally sized packets

    DEFF Research Database (Denmark)

    Taghouti, Maroua; Roetter, Daniel Enrique Lucani; Pedersen, Morten Videbæk

    2016-01-01

    State-of-the-art Random Linear Network Coding (RLNC) schemes assume that data streams generate packets with equal sizes. This assumption results in the highest efficiency gains for RLNC. A typical solution for managing unequal packet sizes is to zero-pad the smallest packets. However, ...

  6. K-harmonic solution for three bound unequal particles

    International Nuclear Information System (INIS)

    Coelho, H.T.; Consoni, L.; Vallieres, M.

    1978-01-01

    The problem of three bound unequal particles is analysed using K-harmonics, with regard to how the nature of the interactions and the asymmetries of the system affect the convergence of the solutions. The Coulomb interaction, which gives closed expressions for the matrix elements of the potential in this method, is discussed.

  7. Design and analysis of unequal split Bagley power dividers

    Science.gov (United States)

    Abu-Alnadi, Omar; Dib, Nihad; Al-Shamaileh, Khair; Sheta, Abdelfattah

    2015-03-01

    In this article, we propose a general design procedure for unequal split Bagley power dividers (BPDs). Based on a mathematical approach using simple circuit and transmission-line theories, exact design equations for 3-way and 5-way BPDs are derived. Utilising the developed equations leads to power dividers that can offer different output power ratios through a suitable choice of the characteristic impedances of the interconnecting transmission lines. For verification purposes, 1:2:1 3-way, 1:2:1:2:1 5-way, and 1:3:1:3:1 5-way BPDs are designed and fabricated. The experimental and full-wave simulation results prove the validity of the designed unequal split BPDs.

  8. A Two Unequal Fluids (TUF) model for thermalhydraulics analysis

    International Nuclear Information System (INIS)

    Bonalumi, R.A.; Liu, W.S.; Yousef, W.W.; Pascoe, J.

    1983-01-01

    TUF is an advanced two-phase flow computer code being developed at Ontario Hydro for the analysis of thermalhydraulics transients in which the Homogeneous Equilibrium Model is not adequate, i.e., when the two phases (vapor and liquid) have Unequal Velocities (UV) and Unequal Temperatures (UT). The paper covers only one of the several development areas encompassed by TUF, namely its mathematical aspects. TUF's basic features include numerical solution of the mass-energy balance equations over fixed control volumes and semi-analytical solution of the momentum equations at junctions (such that the solution is unconditionally stable and has UV-UT choking and flooding limitations built in). Two strategies are being developed: one based on the Porsching approach (for short-term use in an existing system code) and the other based on a two-step pressure field approach (computationally more efficient and unconditionally stable). Some simple test cases are presented.

  9. Protecting DNA from errors and damage: an overview of DNA repair mechanisms in plants compared to mammals.

    Science.gov (United States)

    Spampinato, Claudia P

    2017-05-01

    The genome integrity of all organisms is constantly threatened by replication errors and DNA damage arising from endogenous and exogenous sources. Such base pair anomalies must be accurately repaired to prevent mutagenesis and/or lethality. Thus, it is not surprising that cells have evolved multiple and partially overlapping DNA repair pathways to correct specific types of DNA errors and lesions. Great progress in unraveling these repair mechanisms at the molecular level has been made by several talented researchers, among them Tomas Lindahl, Aziz Sancar, and Paul Modrich, all three Nobel laureates in Chemistry for 2015. Much of this knowledge comes from studies performed in bacteria, yeast, and mammals and has impacted research in plant systems. Two plant features should be mentioned. Plants differ from higher eukaryotes in that they lack a reserve germline and cannot avoid environmental stresses. Therefore, plants have evolved different strategies to sustain genome fidelity through generations and continuous exposure to genotoxic stresses. These strategies include the presence of unique or multiple paralogous genes with partially overlapping DNA repair activities. Yet, in spite (or because) of these differences, plants, especially Arabidopsis thaliana, can be used as a model organism for functional studies. Some advantages of this model system are worth mentioning: short life cycle, availability of both homozygous and heterozygous lines for many genes, plant transformation techniques, tissue culture methods and reporter systems for gene expression and function studies. Here, I provide a current understanding of DNA repair genes in plants, with a special focus on A. thaliana. It is expected that this review will be a valuable resource for future functional studies in the DNA repair field, both in plants and animals.

  10. The Theory of Exploitation as the Unequal Exchange of Labour

    OpenAIRE

    Veneziani, Roberto; Yoshihara, Naoki

    2016-01-01

    This paper analyses the normative and positive foundations of the theory of exploitation as the unequal exchange of labour (UEL). The key intuitions behind all of the main approaches to UEL exploitation are explicitly analysed as a series of formal claims in a general economic environment. It is then argued that these intuitions can be captured by one fundamental axiom - called Labour Exploitation - which defines the basic domain of all UEL exploitation forms and identifies the formal and the...

  12. An Unequal Secure Encryption Scheme for H.264/AVC Video Compression Standard

    Science.gov (United States)

    Fan, Yibo; Wang, Jidong; Ikenaga, Takeshi; Tsunoo, Yukiyasu; Goto, Satoshi

    H.264/AVC is the newest video coding standard. It has many new features which can easily be used for video encryption. In this paper, we propose a new scheme for video encryption under the H.264/AVC video compression standard. We define Unequal Secure Encryption (USE) as an approach that applies different encryption schemes (with different security strengths) to different parts of the compressed video data. The USE scheme includes two parts: video data classification and unequal secure video data encryption. First, we classify the video data into two partitions: an important data partition and an unimportant data partition. The important data partition is small in size and receives strong protection, while the unimportant data partition is large in size and receives weaker protection. Second, we use AES as a block cipher to encrypt the important data partition and LEX as a stream cipher to encrypt the unimportant data partition. AES is the most widely used symmetric cipher and ensures high security. LEX is a new stream cipher based on AES whose computational cost is much lower than that of AES. In this way, our scheme can achieve both high security and low computational cost. Besides the USE scheme, we propose a low-cost design of a hybrid AES/LEX encryption module. Our experimental results show that the computational cost of the USE scheme is low (about 25% of naive encryption at Level 0 with VEA used). The hardware cost of the hybrid AES/LEX module is 4678 gates and the AES encryption throughput is about 50 Mbps.
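
    The partitioning idea can be sketched as below. LEX is not available in common cryptographic libraries, so ChaCha20 stands in here for the low-cost stream cipher; the partition contents are placeholders, and a real system would derive separate keys for the two ciphers.

        import os
        from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

        def use_encrypt(important: bytes, unimportant: bytes, key: bytes):
            nonce = os.urandom(16)
            # important partition (e.g. headers, motion vectors): block cipher
            aes = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
            ct_imp = aes.update(important) + aes.finalize()
            # unimportant partition (e.g. residuals): cheap stream cipher
            # (ChaCha20 as a stand-in for LEX)
            stream = Cipher(algorithms.ChaCha20(key, nonce), mode=None).encryptor()
            ct_unimp = stream.update(unimportant) + stream.finalize()
            return nonce, ct_imp, ct_unimp

        key = os.urandom(32)
        nonce, ct_i, ct_u = use_encrypt(b"headers+MVs", b"residual data" * 100, key)
        print(len(ct_i), len(ct_u))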

  13. Monte Carlo Simulations Comparing Fisher Exact Test and Unequal Variances t Test for Analysis of Differences Between Groups in Brief Hospital Lengths of Stay.

    Science.gov (United States)

    Dexter, Franklin; Bayman, Emine O; Dexter, Elisabeth U

    2017-12-01

    We examined type I and II error rates for analysis of (1) mean hospital length of stay (LOS) versus (2) percentage of hospital LOS that are overnight. These 2 end points are suitable for when LOS is treated as a secondary economic end point. We repeatedly resampled LOS for 5052 discharges of thoracoscopic wedge resections and lung lobectomy at 26 hospitals. Unequal variances t test (Welch method) and Fisher exact test both were conservative (ie, type I error rate less than nominal level). The Wilcoxon rank sum test was included as a comparator; the type I error rates did not differ from the nominal level of 0.05 or 0.01. Fisher exact test was more powerful than the unequal variances t test at detecting differences among hospitals; estimated odds ratio for obtaining P < .05 with Fisher exact test versus unequal variances t test = 1.94, with 95% confidence interval, 1.31-3.01. Fisher exact test and Wilcoxon-Mann-Whitney had comparable statistical power in terms of differentiating LOS between hospitals. For studies with LOS to be used as a secondary end point of economic interest, there is currently considerable interest in the planned analysis being for the percentage of patients suitable for ambulatory surgery (ie, hospital LOS equals 0 or 1 midnight). Our results show that there need not be a loss of statistical power when groups are compared using this binary end point, as compared with either Welch method or Wilcoxon rank sum test.
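
    The comparison can be sketched with a small resampling experiment: Welch's unequal variances t test on the raw LOS values versus Fisher's exact test on the count of stays of 0-1 midnights. The LOS distributions below are invented, not the study's discharge data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        los_a = rng.choice([1, 2, 3, 4, 7], size=120,
                           p=[0.45, 0.25, 0.15, 0.10, 0.05])
        los_b = rng.choice([1, 2, 3, 4, 7], size=140,
                           p=[0.30, 0.30, 0.20, 0.12, 0.08])

        welch = stats.ttest_ind(los_a, los_b, equal_var=False)  # Welch method

        short = lambda x: int(np.sum(x <= 1))                   # 0-1 midnights
        table = [[short(los_a), len(los_a) - short(los_a)],
                 [short(los_b), len(los_b) - short(los_b)]]
        fisher = stats.fisher_exact(table)

        print(f"Welch p = {welch.pvalue:.4f}, Fisher p = {fisher.pvalue:.4f}")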

  14. Ecologically unequal exchange, recessions, and climate change: A longitudinal study.

    Science.gov (United States)

    Huang, Xiaorui

    2018-07-01

    This study investigates how the ecologically unequal exchange of carbon dioxide emissions varies with economic recessions. I propose a country-specific approach to examine (1) the relationship between carbon dioxide emissions in developing countries and the "vertical flow" of exports to the United States; and (2) the variations of the relationship before, during, and after two recent economic recessions in 2001 and 2008. Using data on 69 developing nations between 2000 and 2010, I estimate time-series cross-sectional regression models with two-way fixed effects. Results suggest that the vertical flow of exports to the United States is positively associated with carbon dioxide emissions in developing countries. The magnitude of this relationship increased in 2001, 2009, and 2010, and decreased in 2008, but remained stable in non-recession periods, suggesting that economic recessions in the United States are associated with variations of ecologically unequal exchange. Results highlight the impacts of U.S. recessions on carbon emissions in developing countries through the structure of international trade. Copyright © 2018 Elsevier Inc. All rights reserved.
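
    A two-way fixed-effects TSCS specification of the kind described can be sketched with statsmodels: country and year dummies absorb unit and period effects, with standard errors clustered by country. The synthetic panel and variable names below are illustrative stand-ins, not the study's data.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Synthetic country-year panel standing in for the study's 69 nations, 2000-2010.
      rng = np.random.default_rng(0)
      countries, years = [f"c{i}" for i in range(69)], list(range(2000, 2011))
      df = pd.DataFrame([(c, y) for c in countries for y in years],
                        columns=["country", "year"])
      df["exports_to_us"] = rng.gamma(2.0, 1.0, len(df))
      df["co2"] = 0.3 * df["exports_to_us"] + rng.normal(0, 1, len(df))

      # Two-way fixed effects via country and year dummies.
      fit = smf.ols("co2 ~ exports_to_us + C(country) + C(year)", data=df).fit(
          cov_type="cluster", cov_kwds={"groups": df["country"]})
      print(fit.params["exports_to_us"])  # positive, as the study reports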

  15. Scattering cross section of unequal length dipole arrays

    CERN Document Server

    Singh, Hema; Jha, Rakesh Mohan

    2016-01-01

    This book presents a detailed and systematic analytical treatment of scattering by an arbitrary dipole array configuration with unequal-length dipoles, different inter-element spacing and load impedance. It provides a physical interpretation of the scattering phenomena within the phased array system. The antenna radar cross section (RCS) depends on the field scattered by the antenna towards the receiver. It has two components, viz. structural RCS and antenna mode RCS. The latter component dominates the former, especially if the antenna is mounted on a low observable platform. The reduction in the scattering due to the presence of antennas on the surface is one of the concerns towards stealth technology. In order to achieve this objective, a detailed and accurate analysis of antenna mode scattering is required. In practical phased array, one cannot ignore the finite dimensions of antenna elements, coupling effect and the role of feed network while estimating the antenna RCS. This book presents the RCS estimati...

  16. Sequential boundaries approach in clinical trials with unequal allocation ratios

    Directory of Open Access Journals (Sweden)

    Ayatollahi Seyyed

    2006-01-01

    Background: In clinical trials, both unequal randomization designs and sequential analyses have ethical and economic advantages. With the single-stage design (SSD), however, if the sample size is not adjusted for unequal randomization, the power of the trial will decrease, whereas with sequential analysis the power always remains constant. Our aim was to compare the sequential boundaries approach with the SSD when the allocation ratio (R) was not equal. Methods: Using multiple simulations, we evaluated the influence of R, the ratio of patients in the experimental group to the standard group, on the statistical properties of two-sided tests, including the two-sided single triangular test (TT), the double triangular test (DTT) and the SSD. The average sample numbers (ASNs) and power (1 - β) were evaluated for all tests. Results: Our simulation study showed that choosing R = 2 instead of R = 1 increases the sample size of the SSD by 12% and the ASN of the TT and DTT by the same proportion. Moreover, when R = 2, using the TT or DTT recovers, relative to the adjusted SSD, the well-known reductions in ASN observed for R = 1 relative to the SSD. In addition, when R = 2 the TT and DTT yield smaller reductions in ASN relative to the SSD than when R = 1, but maintain the power of the test at its planned value. Conclusion: This study indicates that when the allocation ratio is not equal among the treatment groups, sequential analysis can indeed serve as a compromise between ethicists, economists and statisticians.
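
    The 12% figure follows from standard sample-size algebra: for fixed power, the total sample size of a two-group comparison scales as (R + 1)^2 / (4R) relative to equal allocation, which is 9/8 = 1.125 for R = 2. A minimal sketch, assuming a two-sample z-test of means:

      from scipy.stats import norm

      def total_n(R, alpha=0.05, power=0.9, effect=0.5):
          # Total N for a two-sample z-test of means with allocation ratio R
          # (experimental:standard) and a standardized effect size.
          z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
          n_std = (1 + 1 / R) * (z / effect) ** 2   # standard-group size
          return n_std * (1 + R)

      print(total_n(2) / total_n(1))   # (R+1)^2 / (4R) = 9/8 = 1.125 for R = 2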

  17. Error Patterns

    NARCIS (Netherlands)

    Hoede, C.; Li, Z.

    2001-01-01

    In coding theory the problem of decoding focuses on error vectors. In the simplest situation code words are $(0,1)$-vectors, as are the received messages and the error vectors. Comparison of a received word with the code words yields a set of error vectors. In deciding on the original code word,
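
    The record breaks off here, but the decoding rule it describes, comparing the received word against every codeword and treating each XOR as a candidate error vector, can be sketched as follows; the toy code below is made up for illustration.

      import numpy as np

      # Minimum-distance decoding: decode to the codeword whose candidate
      # error vector has the smallest Hamming weight.
      codewords = np.array([[0, 0, 0, 0, 0],
                            [1, 1, 1, 0, 0],
                            [0, 0, 1, 1, 1],
                            [1, 1, 0, 1, 1]])
      received = np.array([1, 0, 1, 0, 0])

      error_vectors = codewords ^ received        # one candidate error vector per codeword
      weights = error_vectors.sum(axis=1)         # Hamming weights
      best = int(np.argmin(weights))
      print(codewords[best], error_vectors[best]) # decoded word and its error pattern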

  18. An Unequal Information Society: How Information Access Initiatives Contribute to the Construction of Inequality

    Science.gov (United States)

    Sanfilippo, Madelyn Rose

    2016-01-01

    Unequal access to information has significant social and political consequences, and is itself a consequence of sociotechnical systems born of social, cultural, economic, and institutional context. Information is unequally distributed both within and between communities. While many factors that shape information inequality shift subtly over time,…

  19. Legal Marriage, Unequal Recognition, and Mental Health among Same-Sex Couples.

    Science.gov (United States)

    LeBlanc, Allen J; Frost, David M; Bowen, Kayla

    2018-04-01

    The authors examined whether the perception of unequal relationship recognition, a novel, couple-level minority stressor, has negative consequences for mental health among same-sex couples. Data came from a dyadic study of 100 (N = 200) same-sex couples in the U.S. Being in a legal marriage was associated with lower perceived unequal recognition and better mental health; being in a registered domestic partnership or civil union, not also legally married, was associated with greater perceived unequal recognition and worse mental health. Actor-Partner Interdependence Models tested associations between legal relationship status, unequal relationship recognition, and mental health (nonspecific psychological distress, depressive symptomatology, and problematic drinking), net of controls (age, gender, race/ethnicity, education, and income). Unequal recognition was consistently associated with worse mental health, independent of legal relationship status. Legal changes affecting relationship recognition should not be seen as simple remedies for addressing the mental health effects of institutionalized discrimination.

  20. Operator errors

    International Nuclear Information System (INIS)

    Knuefer; Lindauer

    1980-01-01

    Moreover, at spectacular events a combination of component failure and human error is often found. The Rasmussen Report and the German Risk Assessment Study in particular show for pressurised water reactors that human error must not be underestimated. Although operator errors, as a form of human error, can never be eliminated entirely, they can be minimized and their effects kept within acceptable limits if thorough training of personnel is combined with an adequate design of the plant against accidents. In contrast to the investigation of engineering errors, the investigation of human errors has so far been carried out with relatively small budgets. Intensified investigations in this field appear to be a worthwhile effort. (orig.)

  1. Unequal subfamily proportions among honey bee queen and worker brood

    Science.gov (United States)

    Tilley; Oldroyd

    1997-12-01

    Queens from three colonies of feral honey bees, Apis mellifera, were removed and placed in separate nucleus colonies. For each colony, eggs and larvae were taken from the nucleus and placed in the main hive on each of 3-4 consecutive weeks. Workers in the queenless parts selected young larvae to rear as queens. Queen pupae, together with the surrounding worker pupae, were removed from each colony and analysed at two to three microsatellite loci to determine their paternity. In all three colonies, the paternity of larvae chosen by the bees to rear as queens was not a random sample of the paternities in the worker brood, with certain subfamilies being over-represented in queens. These results support an important prediction of kin selection theory: when colonies are queenless, unequal relatedness within colonies could lead to the evolution of reproductive competition, that is, some subfamilies achieving greater reproductive success than others. The mechanism by which such dominance is achieved could be through a system of kin recognition and nepotism, but we conclude that genetically based differential attractiveness of larvae for rearing as queens is more likely. Copyright 1997 The Association for the Study of Animal Behaviour.

  2. Unequal arm space-borne gravitational wave detectors

    International Nuclear Information System (INIS)

    Larson, Shane L.; Hellings, Ronald W.; Hiscock, William A.

    2002-01-01

    Unlike ground-based interferometric gravitational wave detectors, large space-based systems will not be rigid structures. When the end stations of the laser interferometer are freely flying spacecraft, the armlengths will change due to variations in the spacecraft positions along their orbital trajectories, so the precise equality of the arms that is required in a laboratory interferometer to cancel laser phase noise is not possible. However, using a method discovered by Tinto and Armstrong, a signal can be constructed in which laser phase noise exactly cancels out, even in an unequal arm interferometer. We examine the case where the ratio of the armlengths is a variable parameter, and compute the averaged gravitational wave transfer function as a function of that parameter. Example sensitivity curve calculations are presented for the expected design parameters of the proposed LISA interferometer, comparing it to a similar instrument with one arm shortened by a factor of 100, showing how the ratio of the armlengths will affect the overall sensitivity of the instrument
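
    The Tinto-Armstrong cancellation is easy to verify numerically. In the sketch below, both round-trip phase readouts contain the same laser phase noise C(t); forming the unequal-arm combination X(t) = [y1(t) - y2(t)] - [y1(t - 2L2) - y2(t - 2L1)] removes it identically. Delays are integer sample counts and the gravitational-wave signal is omitted; this is a toy demonstration, not the LISA transfer-function calculation of the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      N, L1, L2 = 10_000, 33, 50   # samples; unequal one-way arm delays (in samples)
      C = rng.normal(size=N)       # laser phase noise, common to both arms

      def d(x, k):                 # delay operator x(t - k) (circular, for the toy)
          return np.roll(x, k)

      # Round-trip phase readouts of the two arms:
      y1 = d(C, 2 * L1) - C
      y2 = d(C, 2 * L2) - C

      # Tinto-Armstrong combination for an unequal-arm Michelson:
      X = (y1 - y2) - (d(y1, 2 * L2) - d(y2, 2 * L1))
      print(np.abs(X).max())       # ~1e-16: laser phase noise cancels to round-off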

  3. Quasi-human seniority-order algorithm for unequal circles packing

    International Nuclear Information System (INIS)

    Zhu, Dingju

    2016-01-01

    In existing methods for solving unequal-circle packing problems, the initial configuration is given arbitrarily or randomly, yet different initial configurations have a large impact on the speed with which a packing algorithm solves the problem. The quasi-human seniority-order algorithm proposed in this paper generates a better initial configuration for existing packing algorithms, accelerating their solution of unequal-circle packing problems. In experiments, the quasi-human seniority-order algorithm is applied to generate better initial configurations for quasi-physical elasticity methods, and the experimental results show that the proposed algorithm greatly improves the speed of solving the problem.

  4. Einstein's error

    International Nuclear Information System (INIS)

    Winterflood, A.H.

    1980-01-01

    In discussing Einstein's Special Relativity theory, it is claimed that the theory violates the principle of relativity itself and that an anomalous sign appears in the mathematical factor which transforms one inertial observer's measurements into those of another inertial observer. The apparent source of this error is discussed. Having corrected the error, a new theory, called Observational Kinematics, is introduced to replace Einstein's Special Relativity. (U.K.)

  5. Unequal Gain of Equal Resources across Racial Groups

    Directory of Open Access Journals (Sweden)

    Shervin Assari

    2018-01-01

    The health effects of economic resources (eg, education, employment, and living place) and psychological assets (eg, self-efficacy, perceived control over life, anger control, and emotions) are well-known. This article summarizes the results of a growing body of evidence documenting Blacks' diminished return, defined as a systematically smaller health gain from economic resources and psychological assets for Blacks in comparison to Whites. Due to structural barriers that Blacks face in their daily lives, the very same resources and assets generate smaller health gains for Blacks compared to Whites. Even in the presence of equal access to resources and assets, such unequal health gain constantly generates a racial health gap between Blacks and Whites in the United States. In this paper, a number of public policies are recommended based on these findings. First and foremost, public policies should not merely focus on equalizing access to resources and assets, but also reduce the societal and structural barriers that hinder Blacks. Policy solutions should aim to reduce various manifestations of structural racism, including but not limited to differential pay, residential segregation, lower quality of education, and crime in Black and urban communities. As income was not found to follow the same pattern demonstrated for other resources and assets (ie, income generated a similar decline in risk of mortality for Whites and Blacks), policies that enforce equal income and increase the minimum wage for marginalized populations are essential. Improving the quality of education of youth and the employability of young adults will enable Blacks to compete for high-paying jobs. Policies that reduce racism and discrimination in the labor market are also needed. Without such policies, it will be very difficult, if not impossible, to eliminate the sustained racial health gap in the United States.

  6. Using the PLUM procedure of SPSS to fit unequal variance and generalized signal detection models.

    Science.gov (United States)

    DeCarlo, Lawrence T

    2003-02-01

    The recent addition of a procedure in SPSS for the analysis of ordinal regression models offers a simple means for researchers to fit the unequal variance normal signal detection model and other extended signal detection models. The present article shows how to implement the analysis and how to interpret the SPSS output. Examples of fitting the unequal variance normal model and other generalized signal detection models are given. The approach offers a convenient means for applying signal detection theory to a variety of research.
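
    Outside SPSS, the same unequal-variance normal model can be fit by direct maximum likelihood. The sketch below, with made-up confidence-rating counts, estimates the sensitivity d and the signal standard deviation s (s = 1 recovers the equal-variance model); the criteria are parameterized through log-gaps to keep them ordered.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      # Hypothetical 6-point rating counts (1 = "sure noise" ... 6 = "sure signal").
      noise_counts  = np.array([174, 172, 104, 92, 41, 8])
      signal_counts = np.array([46, 57, 66, 101, 154, 173])

      def nll(params):
          d, log_s, c0, *log_gaps = params
          s = np.exp(log_s)                                # signal SD
          c = c0 + np.concatenate(([0.0], np.cumsum(np.exp(log_gaps))))  # ordered criteria
          pn = np.diff(np.concatenate(([0.0], norm.cdf(c), [1.0])))          # noise probs
          ps = np.diff(np.concatenate(([0.0], norm.cdf((c - d) / s), [1.0])))  # signal probs
          eps = 1e-12
          return -(noise_counts @ np.log(pn + eps) + signal_counts @ np.log(ps + eps))

      x0 = np.array([1.0, 0.0, -1.0, *np.log(np.full(4, 0.5))])
      fit = minimize(nll, x0, method="Nelder-Mead", options={"maxiter": 20000})
      print(fit.x[0], np.exp(fit.x[1]))   # estimated d and signal SD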

  7. Unequal Gain of Equal Resources across Racial Groups.

    Science.gov (United States)

    Assari, Shervin

    2017-08-05

    The health effects of economic resources (eg, education, employment, and living place) and psychological assets (eg, self-efficacy, perceived control over life, anger control, and emotions) are well-known. This article summarizes the results of a growing body of evidence documenting Blacks' diminished return, defined as a systematically smaller health gain from economic resources and psychological assets for Blacks in comparison to Whites. Due to structural barriers that Blacks face in their daily lives, the very same resources and assets generate smaller health gains for Blacks compared to Whites. Even in the presence of equal access to resources and assets, such unequal health gain constantly generates a racial health gap between Blacks and Whites in the United States. In this paper, a number of public policies are recommended based on these findings. First and foremost, public policies should not merely focus on equalizing access to resources and assets, but also reduce the societal and structural barriers that hinder Blacks. Policy solutions should aim to reduce various manifestations of structural racism, including but not limited to differential pay, residential segregation, lower quality of education, and crime in Black and urban communities. As income was not found to follow the same pattern demonstrated for other resources and assets (ie, income generated a similar decline in risk of mortality for Whites and Blacks), policies that enforce equal income and increase the minimum wage for marginalized populations are essential. Improving the quality of education of youth and the employability of young adults will enable Blacks to compete for high-paying jobs. Policies that reduce racism and discrimination in the labor market are also needed. Without such policies, it will be very difficult, if not impossible, to eliminate the sustained racial health gap in the United States. © 2018 The Author(s); Published by Kerman University of Medical Sciences.

  8. Protective

    Directory of Open Access Journals (Sweden)

    Wessam M. Abdel-Wahab

    2013-10-01

    Many active ingredients extracted from herbal and medicinal plants are extensively studied for their beneficial effects. The antioxidant activity and free-radical scavenging properties of thymoquinone (TQ) have been reported. The present study evaluated the possible protective effects of TQ against the toxicity and oxidative stress of sodium fluoride (NaF) in the liver of rats. Rats were divided into four groups: the first group served as the control group and was administered distilled water; the NaF group received NaF orally at a dose of 10 mg/kg for 4 weeks; the TQ group was administered TQ orally at a dose of 10 mg/kg for 5 weeks; and the NaF-TQ group was first given TQ for 1 week and then administered 10 mg/kg/day NaF together with 10 mg/kg TQ for 4 weeks. Rats intoxicated with NaF showed a significant increase in lipid peroxidation, whereas the level of reduced glutathione (GSH) and the activity of superoxide dismutase (SOD), catalase (CAT), glutathione S-transferase (GST) and glutathione peroxidase (GPx) were reduced in hepatic tissues. The proper functioning of the liver was also disrupted, as indicated by alterations in the measured liver function indices and biochemical parameters. TQ supplementation counteracted the NaF-induced hepatotoxicity, probably due to its strong antioxidant activity. In conclusion, the results obtained clearly indicated the role of oxidative stress in the induction of NaF toxicity and suggested hepatoprotective effects of TQ against the toxicity of fluoride compounds.

  9. 40 CFR 73.37 - Account error.

    Science.gov (United States)

    2010-07-01

    40 CFR 73.37: Account error. Protection of Environment; ENVIRONMENTAL PROTECTION AGENCY (CONTINUED); AIR PROGRAMS (CONTINUED); SULFUR DIOXIDE ALLOWANCE SYSTEM; Allowance Tracking System. The Administrator may, at his or her sole...

  10. THE EFFECT OF UNEQUAL DISTRIBUTION OF THE STANDARD AND QUALITY OF

    Directory of Open Access Journals (Sweden)

    JELENA TOSKOVIC

    2015-10-01

    In the early 1990s the Western Balkan countries entered the transition process, which involved moving from a centrally-planned to a market economy. These countries were thus obliged to adhere to the basic neoliberal principles of the Washington consensus, which promoted liberalization, privatization and stabilization. In all transition countries, and thus in the Western Balkan countries (Albania, Bosnia and Herzegovina, FYR Macedonia, Montenegro, Serbia), this model followed the same guidelines. The consequence, however, was an economic and social crisis that led to decades of growing economic inequality. This distorted, unequal distribution produced an economic and social stratification that has prevented these countries from escaping the economic crisis in which they find themselves. In this paper we analyze the Gini coefficient, the standard of living and quality of life as indicators that shape the lives of the inhabitants of the Western Balkans. The Gini coefficient, the most commonly used measure of economic inequality, measures how far the distribution of income or consumption expenditure among households or individuals within an economy deviates from a perfectly equal distribution. The standard of living is determined by the totality of the living and working conditions of the various strata of a country's population in a given period. It is related to quality of life and is used for comparative reviews across geographic areas and across different periods in history. Quality of life is an intangible concept that includes the assessment of factors such as employment, income, health, education, science, energy, knowledge and technology, the environment, human rights, protection and recreation, infrastructure, national security and public safety. All these factors combine to affect the life of the population, which from the beginning of the...
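
    For reference, the Gini coefficient the paper relies on can be computed directly from an income vector; the sketch below uses the standard sorted-rank formulation, equivalent to half the relative mean absolute difference.

      import numpy as np

      def gini(income):
          # G = 2*sum(i * x_i) / (n * sum(x)) - (n + 1)/n, for x sorted ascending.
          x = np.sort(np.asarray(income, dtype=float))
          n = len(x)
          i = np.arange(1, n + 1)
          return 2.0 * (i * x).sum() / (n * x.sum()) - (n + 1) / n

      print(gini([10, 10, 10, 10]))   # 0.0: perfect equality
      print(gini([0, 0, 0, 100]))     # 0.75: near-maximal inequality for n = 4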

  11. Harmonic elimination technique for a single-phase multilevel converter with unequal DC link voltage levels

    DEFF Research Database (Denmark)

    Ghasemi, N.; Zare, F.; Boora, A.A.

    2012-01-01

    Because of their ability to generate high-quality output voltage, multilevel converters are used in several applications. Researchers have introduced various modulation and control techniques to control the output voltage of multilevel converters, such as space vector modulation and harmonic elimination (HE) methods. Multilevel converters may have a DC link with equal or unequal DC voltages. In this study, a new HE technique is proposed for multilevel converters with unequal DC link voltages. The DC link voltage levels are considered as additional...
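
    The truncated passage describes treating the unequal DC-link voltages as additional degrees of freedom in the harmonic-elimination equations. A minimal sketch of that equation system, for a hypothetical three-cell cascaded bridge with per-unit voltages chosen only for illustration, solves for switching angles that null the 5th and 7th harmonics (fsolve may fail to converge for some targets; the initial guess matters).

      import numpy as np
      from scipy.optimize import fsolve

      V = np.array([1.0, 0.8, 0.6])   # unequal DC link voltages (per unit, illustrative)
      m_target = 1.8                  # desired fundamental amplitude (per unit)

      def equations(theta):
          return [V @ np.cos(theta) - m_target,   # set the fundamental
                  V @ np.cos(5 * theta),          # eliminate the 5th harmonic
                  V @ np.cos(7 * theta)]          # eliminate the 7th harmonic

      theta = fsolve(equations, np.radians([15.0, 35.0, 60.0]))
      print(np.degrees(np.sort(theta)))           # switching angles in degrees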

  12. Errors in macromolecular synthesis after stress. A study of the possible protective role of the small heat shock proteins

    NARCIS (Netherlands)

    Marin Vinader, L.

    2006-01-01

    The general goal of this thesis was to gain insight in what small heat shock proteins (sHsps) do with respect to macromolecular synthesis during a stressful situation in the cell. It is known that after a non-lethal heat shock, cells are better protected against a subsequent more severe heat shock,

  13. Medication Errors - A Review

    OpenAIRE

    Vinay BC; Nikhitha MK; Patel Sunil B

    2015-01-01

    In this review article, the definition of medication errors, the scope of the medication error problem, the types of medication errors, their common causes, the monitoring of medication errors, their consequences, and the prevention and management of medication errors are explained clearly, with supporting tables that are easy to understand.

  14. The association between unequal parental treatment and the sibling relationship in Finland: The difference between full and half-siblings.

    Science.gov (United States)

    Danielsbacka, Mirkka; Tanskanen, Antti O

    2015-06-24

    Studies have shown that unequal parental treatment is associated with relationship quality between siblings. However, it is unclear how it affects the relationship between full and half-siblings. Using data from the Generational Transmissions in Finland project (n = 1,537 younger adults), we study whether those who have half-siblings perceive more unequal parental treatment than those who have full siblings only. In addition, we study how unequal parental treatment is associated with the sibling relationship between full, maternal, and paternal half-siblings. First, we found that individuals who have maternal and/or paternal half-siblings are more likely to have encountered unequal maternal treatment than individuals who have full siblings only. Second, we found that unequal parental treatment impairs full as well as maternal and paternal half-sibling relations in adulthood. Third, unequal parental treatment mediates the effect of genetic relatedness on sibling relations in the case of maternal half-siblings, but not in the case of paternal half-siblings. After controlling for unequal parental treatment, the quality of maternal half-sibling relationships did not differ from that of full siblings, whereas the quality of paternal half-sibling relationships still did. Fourth, the qualitative comments (n = 206) from the same population reveal that unequal parental treatment presents itself in several ways, such as differential financial, emotional, or practical support.

  15. The Association between Unequal Parental Treatment and the Sibling Relationship in Finland: The Difference between Full and Half-Siblings

    Directory of Open Access Journals (Sweden)

    Mirkka Danielsbacka

    2015-04-01

    Studies have shown that unequal parental treatment is associated with relationship quality between siblings. However, it is unclear how it affects the relationship between full and half-siblings. Using data from the Generational Transmissions in Finland project (n = 1,537 younger adults), we study whether those who have half-siblings perceive more unequal parental treatment than those who have full siblings only. In addition, we study how unequal parental treatment is associated with the sibling relationship between full, maternal, and paternal half-siblings. First, we found that individuals who have maternal and/or paternal half-siblings are more likely to have encountered unequal maternal treatment than individuals who have full siblings only. Second, we found that unequal parental treatment impairs full as well as maternal and paternal half-sibling relations in adulthood. Third, unequal parental treatment mediates the effect of genetic relatedness on sibling relations in the case of maternal half-siblings, but not in the case of paternal half-siblings. After controlling for unequal parental treatment, the quality of maternal half-sibling relationships did not differ from that of full siblings, whereas the quality of paternal half-sibling relationships still did. Fourth, the qualitative comments (n = 206) from the same population reveal that unequal parental treatment presents itself in several ways, such as differential financial, emotional, or practical support.

  16. Multi-type Step-wise group screening designs with unequal A-priori ...

    African Journals Online (AJOL)

    ... design with unequal group sizes and obtain values of the group sizes that minimize the expected number of runs. Keywords: group screening, group factors, multi-type step-wise group screening, expected number of runs, optimum group screening designs. East African Journal of Statistics Vol. 1(1) 2005: pp. 49-67.

  17. Taxation and the unequal reach of the state: mapping state capacity in Ecuador

    NARCIS (Netherlands)

    Harbers, I.

    2015-01-01

    Even though the unequal reach of the state has become an important concern in the literature on developing democracies in Latin America, empirical measures of intracountry variation in state capacity are scarce. So far, attempts to develop valid measures of the reach of the state have often been

  18. Direct fourier method reconstruction based on unequally spaced fast fourier transform

    International Nuclear Information System (INIS)

    Wu Xiaofeng; Zhao Ming; Liu Li

    2003-01-01

    First, we present an Unequally Spaced Fast Fourier Transform (USFFT) method that is more exact and theoretically more comprehensible than its predecessor. Then, using an interpolation scheme, we discuss how to apply USFFT to Direct Fourier Method (DFM) reconstruction of parallel projection data. Finally, a simulation result is given. (authors)
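
    For orientation, the object a USFFT accelerates is the unequally spaced discrete Fourier transform, which costs O(N·M) when evaluated naively as below; USFFT/NUFFT methods approximate the same sums in roughly O(M log M) by interpolating onto a uniform grid. The sampling positions and test tone here are illustrative.

      import numpy as np

      def ndft(samples, positions, n_modes):
          # Naive type-1 nonuniform DFT: f_hat[k] = sum_j samples[j] * exp(-2j*pi*k*pos[j]).
          k = np.arange(-n_modes // 2, n_modes // 2)
          return np.exp(-2j * np.pi * np.outer(k, positions)) @ samples

      # Unequally spaced samples, as arise in direct Fourier reconstruction:
      pos = np.sort(np.random.default_rng(0).uniform(0, 1, 64))
      vals = np.sin(2 * np.pi * 3 * pos)     # a pure k = 3 tone
      spectrum = ndft(vals, pos, 16)
      print(np.abs(spectrum).round(1))       # energy concentrates near k = +/-3,
                                             # with leakage from the nonuniform sampling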

  19. Selections from Unequal Partners: Teaching about Power, Consent, and Healthy Relationships

    Science.gov (United States)

    deFur, Kirsten

    2016-01-01

    The Center for Sex Education recently published the fourth edition of "Unequal Partners: Teaching about Power, Consent, and Healthy Relationships, Volumes 1 and 2." Included here are two lesson plans about sexual consent selected from each volume. "What does it take … to give sexual consent?" [Sue Montfort and Peggy Brick] is…

  20. Ecological Unequal Exchange: International Trade and Uneven Utilization of Environmental Space in the World System

    Science.gov (United States)

    Rice, James

    2007-01-01

    We evaluate the argument that international trade influences disproportionate cross-national utilization of global renewable natural resources. Such uneven dynamics are relevant to the consideration of inequitable appropriation of environmental space in particular and processes of ecological unequal exchange more generally. Using OLS regression…

  1. Globalization as Continuing Colonialism: Critical Global Citizenship Education in an Unequal World

    Science.gov (United States)

    Mikander, Pia

    2016-01-01

    In an unequal world, education about global inequality can be seen as a controversial but necessary topic for social science to deal with. Even though the world no longer consists of colonies and colonial powers, many aspects of the global economy follow the same patterns as during colonial times, with widening gaps between the world's richest and…

  2. Wide brick tunnel randomization - an unequal allocation procedure that limits the imbalance in treatment totals.

    Science.gov (United States)

    Kuznetsova, Olga M; Tymofyeyev, Yevgen

    2014-04-30

    In open-label studies, partial predictability of permuted block randomization provides potential for selection bias. To lessen the selection bias in two-arm studies with equal allocation, a number of allocation procedures that limit the imbalance in treatment totals at a pre-specified level but do not require the exact balance at the ends of the blocks were developed. In studies with unequal allocation, however, the task of designing a randomization procedure that sets a pre-specified limit on imbalance in group totals is not resolved. Existing allocation procedures either do not preserve the allocation ratio at every allocation or do not include all allocation sequences that comply with the pre-specified imbalance threshold. Kuznetsova and Tymofyeyev described the brick tunnel randomization for studies with unequal allocation that preserves the allocation ratio at every step and, in the two-arm case, includes all sequences that satisfy the smallest possible imbalance threshold. This article introduces wide brick tunnel randomization for studies with unequal allocation that allows all allocation sequences with imbalance not exceeding any pre-specified threshold while preserving the allocation ratio at every step. In open-label studies, allowing a larger imbalance in treatment totals lowers selection bias because of the predictability of treatment assignments. The applications of the technique in two-arm and multi-arm open-label studies with unequal allocation are described. Copyright © 2013 John Wiley & Sons, Ltd.
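
    The exact wide brick tunnel construction is beyond a short sketch, but its core constraint, preserving a 2:1 allocation ratio while capping the deviation from the target allocation path, can be illustrated as below. This simplified procedure is not the authors' algorithm and ignores their unconditional-allocation-probability requirement.

      import random

      def constrained_unequal_randomization(n, ratio=(2, 1), max_imbalance=2, seed=7):
          # At each step, exclude arms that would push the deviation from the
          # target allocation path beyond max_imbalance; draw the rest with
          # probabilities proportional to the target ratio.
          rng = random.Random(seed)
          w = [r / sum(ratio) for r in ratio]
          counts, sequence = [0, 0], []
          for t in range(1, n + 1):
              allowed = []
              for arm in (0, 1):
                  trial = counts.copy()
                  trial[arm] += 1
                  dev = max(abs(trial[a] - w[a] * t) for a in (0, 1))
                  if dev <= max_imbalance:
                      allowed.append(arm)
              arm = rng.choices(allowed, weights=[w[a] for a in allowed])[0]
              counts[arm] += 1
              sequence.append(arm)
          return sequence, counts

      seq, totals = constrained_unequal_randomization(30)
      print(totals)   # close to 20:10, never straying far from the 2:1 path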

  3. Error Control Techniques for Efficient Multicast Streaming in UMTS Networks: Proposals andPerformance Evaluation

    Directory of Open Access Journals (Sweden)

    Michele Rossi

    2004-06-01

    In this paper we introduce techniques for efficient multicast video streaming in UMTS networks, where a video content has to be conveyed to multiple users in the same cell. Efficient multicast data delivery in UMTS is still an open issue. In particular, suitable solutions have to be found to cope with wireless channel errors while maintaining both acceptable channel utilization and a controlled delivery delay over the wireless link between the serving base station and the mobile terminals. Here, we first highlight that standard solutions such as unequal error protection (UEP) of the video flow are ineffective in UMTS due to the inherently large feedback delay at the link layer (Radio Link Control, RLC). Subsequently, we propose a local approach that solves errors directly at the UMTS link layer while keeping channel efficiency reasonably high and saving, as much as possible, system resources. The solution that we propose in this paper is based on the usage of the common channel to serve all the interested users in a cell. In this way, we can save resources with respect to the case where multiple dedicated channels are allocated for every user. In addition, we present a hybrid ARQ (HARQ) proactive protocol that, at the cost of some redundancy added to the link-layer flow, consistently improves channel efficiency with respect to the plain ARQ case, thereby making the use of a single common channel for multicast data delivery feasible. In the last part of the paper we give some directions for future research, envisioning the usage of the aforementioned error control protocols with suitably encoded video streams.

  4. Error Budgeting

    Energy Technology Data Exchange (ETDEWEB)

    Vinyard, Natalia Sergeevna [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Perry, Theodore Sonne [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Usov, Igor Olegovich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-04

    We calculate opacity from k(hν) = -ln[T(hν)]/(pL), where T(hν) is the transmission for photon energy hν, p is sample density, and L is path length through the sample. The density and path length are measured together by Rutherford backscatter. Differentiating, Δk = (∂k/∂T)ΔT + (∂k/∂(pL))Δ(pL), which can be re-written in terms of fractional error as Δk/k = Δln(T)/T + Δ(pL)/(pL). Transmission itself is calculated from T = (U-E)/(V-E) = B/B0, where B is the transmitted backlighter (BL) signal and B0 is the unattenuated backlighter signal. Then ΔT/T = Δln(T) = ΔB/B + ΔB0/B0, and consequently Δk/k = (1/T)(ΔB/B + ΔB0/B0) + Δ(pL)/(pL). Transmission is measured in the range of 0.2

  5. Class-specific Error Bounds for Ensemble Classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Prenger, R; Lemmond, T; Varshney, K; Chen, B; Hanley, W

    2009-10-06

    The generalization error, or probability of misclassification, of ensemble classifiers has been shown to be bounded above by a function of the mean correlation between the constituent (i.e., base) classifiers and their average strength. This bound suggests that increasing the strength and/or decreasing the correlation of an ensemble's base classifiers may yield improved performance under the assumption of equal error costs. However, this and other existing bounds do not directly address application spaces in which error costs are inherently unequal. For applications involving binary classification, Receiver Operating Characteristic (ROC) curves, performance curves that explicitly trade off false alarms and missed detections, are often utilized to support decision making. To address performance optimization in this context, we have developed a lower bound for the entire ROC curve that can be expressed in terms of the class-specific strength and correlation of the base classifiers. We present empirical analyses demonstrating the efficacy of these bounds in predicting relative classifier performance. In addition, we specify performance regions of the ROC curve that are naturally delineated by the class-specific strengths of the base classifiers and show that each of these regions can be associated with a unique set of guidelines for performance optimization of binary classifiers within unequal error cost regimes.

  6. Optimal erasure protection for scalably compressed video streams with limited retransmission.

    Science.gov (United States)

    Taubman, David; Thie, Johnson

    2005-08-01

    This paper shows how the priority encoding transmission (PET) framework may be leveraged to exploit both unequal error protection and limited retransmission for RD-optimized delivery of streaming media. Previous work on scalable media protection with PET has largely ignored the possibility of retransmission. Conversely, the PET framework has not been harnessed by the substantial body of previous work on RD optimized hybrid forward error correction/automatic repeat request schemes. We limit our attention to sources which can be modeled as independently compressed frames (e.g., video frames), where each element in the scalable representation of each frame can be transmitted in one or both of two transmission slots. An optimization algorithm determines the level of protection which should be assigned to each element in each slot, subject to transmission bandwidth constraints. To balance the protection assigned to elements which are being transmitted for the first time with those which are being retransmitted, the proposed algorithm formulates a collection of hypotheses concerning its own behavior in future transmission slots. We show how the PET framework allows for a decoupled optimization algorithm with only modest complexity. Experimental results obtained with Motion JPEG2000 compressed video demonstrate that substantial performance benefits can be obtained using the proposed framework.

  7. University of Mauritius Research Journal - Vol 13 (2007)

    African Journals Online (AJOL)

    A Hybrid Unequal Error Protection / Unequal Error Resilience Scheme for JPEG Image Transmission using OFDM. TP Fowdur, KMS Soyjaudah, 57-68.

  8. Numerical study of two side-by-side cylinders with unequal diameters at low Reynolds number

    International Nuclear Information System (INIS)

    Gao, Y Y; Wang, X K; Tan, S K

    2012-01-01

    Two-dimensional laminar flow about two side-by-side unequal cylinders with different diameter ratios d/D and centre-to-centre spacing ratios T/D at Re = 300 (based on the larger cylinder diameter) was simulated using CFD software. Comparisons of experimental and numerical results were made to elucidate the degree of interference due to d/D and T/D and their effects on the flow patterns and vortex shedding frequencies. The findings showed that the flow patterns behind two unequal cylinders are distinctly different from those behind two equal side-by-side cylinders, with distinct in-phase and anti-phase vortex shedding, and random switching of modes of vortex shedding.

  9. Effect of unequal fuel and oxidizer Lewis numbers on flame dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Shamim, Tariq [Department of Mechanical Engineering, The University of Michigan-Dearborn, Dearborn, MI 48128-1491 (United States)

    2006-12-15

    The interaction of non-unity Lewis number (due to preferential diffusion and/or unequal rates of heat and mass transfer) with the coupled effect of radiation, chemistry and unsteadiness alters several characteristics of a flame. The present study numerically investigates this interaction with a particular emphasis on the effect of unequal and non-unity fuel and oxidizer Lewis numbers in a transient diffusion flame. The unsteadiness is simulated by considering the flame subjected to modulations in reactant concentration. Flames with different Lewis numbers (ranging from 0.5 to 2) and subjected to different modulating frequencies are considered. The results show that the coupled effect of Lewis number and unsteadiness strongly influences the flame dynamics. The impact is stronger at high modulating frequencies and strain rates, particularly for large values of Lewis numbers. Compared to the oxidizer side Lewis number, the fuel side Lewis number has greater influence on flame dynamics. (author)

  10. Load bearing capacity of welded joints between dissimilar pipelines with unequal wall thickness

    Energy Technology Data Exchange (ETDEWEB)

    Beak, Jonghyun; Kim, Youngpyo; Kim, Woosik [Korea Gas Corporation, Suwon (Korea, Republic of)

    2012-09-15

    The behavior of the load bearing capacity of a pipeline with unequal wall thickness was evaluated using finite element analyses. Pipelines with a wall thickness ratio of 1.22-1.89 were adopted to investigate plastic collapse under tensile, internal pressure, or bending stress. A parametric study showed that the tensile strength and moment of a pipeline with a wall thickness ratio less than 1.5 were not influenced by the wall thickness ratio and taper angle; however, those of a pipeline with a wall thickness ratio more than 1.5 decreased considerably at a low taper angle. The failure pressure of a pipeline with unequal wall thickness was not influenced by the wall thickness ratio and taper angle.

  11. Modeling imperfectly repaired system data via grey differential equations with unequal-gapped times

    International Nuclear Information System (INIS)

    Guo Renkuan

    2007-01-01

    In this paper, we argue that grey differential equation models are useful in repairable system modeling. The argument starts with a review of the GM(1,1) model with equal- and unequal-spaced stopping-time sequences. In terms of two-stage GM(1,1) filtering, system stopping time can be partitioned into a system intrinsic function and a repair effect. Furthermore, we propose an approach that uses the grey differential equation to specify a semi-statistical membership function for system intrinsic function times. We also use the GM(1,N) model to model system stopping times and the associated operating covariates, and propose an unequal-gapped GM(1,N) model for such analysis. Finally, we investigate GM(1,1)-embedded systematic grey equation system modeling of imperfectly repaired system operating data. Practical examples are given in a step-by-step manner to illustrate grey differential equation modeling of repairable system data
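
    For concreteness, a minimal GM(1,1) fit proceeds by accumulating the series, regressing on the background sequence, and inverting the accumulation. The sketch below assumes equal gaps (the paper's unequal-gapped variants first rescale the time axis), and the stopping-time data are made up.

      import numpy as np

      def gm11(x0):
          # Fit GM(1,1) to a positive, equal-gapped series x0.
          x0 = np.asarray(x0, dtype=float)
          x1 = np.cumsum(x0)                    # accumulated generating operation (AGO)
          z1 = 0.5 * (x1[1:] + x1[:-1])         # background (mean) sequence
          B = np.column_stack([-z1, np.ones(len(z1))])
          (a, b), *_ = np.linalg.lstsq(B, x0[1:], rcond=None)

          def predict(k):                       # k = 0, 1, 2, ...
              x1_hat = lambda t: (x0[0] - b / a) * np.exp(-a * t) + b / a
              return x0[0] if k == 0 else x1_hat(k) - x1_hat(k - 1)  # inverse AGO

          return a, b, predict

      a, b, f = gm11([2.87, 3.28, 3.34, 3.57, 3.72])   # made-up stopping-time data
      print(a, b, [round(f(k), 2) for k in range(5)])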

  12. Vibration energy harvesting using piezoelectric unimorph cantilevers with unequal piezoelectric and nonpiezoelectric lengths

    OpenAIRE

    Gao, Xiaotong; Shih, Wei-Heng; Shih, Wan Y.

    2010-01-01

    We have examined a piezoelectric unimorph cantilever (PUC) with unequal piezoelectric and nonpiezoelectric lengths for vibration energy harvesting theoretically by extending the analysis of a PUC with equal piezoelectric and nonpiezoelectric lengths. The theoretical approach was validated by experiments. A case study showed that for a fixed vibration frequency, the maximum open-circuit induced voltage which was important for charge storage for later use occurred with a PUC that had a nonpiezo...

  13. [Work and health inequalities: The unequal distribution of exposures at work in Germany and Europe].

    Science.gov (United States)

    Dragano, Nico; Wahrendorf, Morten; Müller, Kathrin; Lunau, Thorsten

    2016-02-01

    Health inequalities in the working population may partly be due to unequal exposure to work-related risk factors across occupational positions. However, empirical data exploring the distribution of exposures at work according to occupational position are missing for Germany. This paper summarizes the existing literature on occupational inequalities and discusses the role of working conditions. In addition, using European survey data, we study how various exposures at work vary by occupational class. Analyses are based on the European Working Conditions Survey, and we compare the German sample (n = 2096) with the sample from the EU-27 countries (n = 34,529). To measure occupational position we use occupational class (EGP classes). First, we describe the prevalence of 16 different exposures at work by occupational class for men and women. Second, we estimate regression models, thereby investigating whether associations between occupational class and self-perceived health are related to an unequal distribution of exposures at work. For various exposures at work we found a higher prevalence among manual workers and lower-skilled employees, for both physical and psychosocial conditions. With few exceptions, this finding was true for men and women and consistent for Germany and Europe. The results indicate that the unequal distribution of health-adverse working conditions contributes to existing health inequalities among the working population.

  14. Modeling coherent errors in quantum error correction

    Science.gov (United States)

    Greenbaum, Daniel; Dutton, Zachary

    2018-01-01

    Analysis of quantum error correcting codes is typically done using a stochastic, Pauli channel error model for describing the noise on physical qubits. However, it was recently found that coherent errors (systematic rotations) on physical data qubits result in both physical and logical error rates that differ significantly from those predicted by a Pauli model. Here we examine the accuracy of the Pauli approximation for noise containing coherent errors (characterized by a rotation angle ε) under the repetition code. We derive an analytic expression for the logical error channel as a function of arbitrary code distance d and concatenation level n, in the small error limit. We find that coherent physical errors result in logical errors that are partially coherent and therefore non-Pauli. However, the coherent part of the logical error is negligible at fewer than ε^{-(dn-1)} error correction cycles when the decoder is optimized for independent Pauli errors, thus providing a regime of validity for the Pauli approximation. Above this number of correction cycles, the persistent coherent logical error will cause logical failure more quickly than the Pauli model would predict, and this may need to be combated with coherent suppression methods at the physical level or larger codes.
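
    The qualitative difference between the two noise models is easy to see numerically: coherent rotations add in amplitude, so the failure probability grows quadratically in the number of cycles, while Pauli (stochastic) errors add in probability and grow linearly. The sketch below is a toy single-qubit illustration of that scaling, not the repetition-code calculation of the paper.

      import numpy as np

      eps = 0.01            # small over-rotation angle per cycle
      n = np.arange(1, 201) # number of error-correction cycles

      # Coherent errors: rotations add in amplitude, so after n cycles the total
      # rotation is n*eps and the failure probability is sin^2(n*eps/2).
      p_coherent = np.sin(n * eps / 2) ** 2

      # Stochastic (Pauli-twirled) errors: probabilities add, growing as
      # n * sin^2(eps/2).
      p_pauli = n * np.sin(eps / 2) ** 2

      ratio = p_coherent / p_pauli   # grows ~ n: coherent noise overtakes the Pauli prediction
      print(ratio[::50])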

  15. Unequal Expectations

    DEFF Research Database (Denmark)

    Karlson, Kristian Bernt

    the role of causal inference in social science; and it discusses the potential of the findings of the dissertation to inform educational policy. In Chapters II and III, constituting the substantive contribution of the dissertation, I examine the process through which students form expectations...... of the relation between the self and educational prospects; evaluations that are socially bounded in that students take their family's social position into consideration when forming their educational expectations. One important consequence of this learning process is that equally talented students tend to make...... for their educational futures. Focusing on the causes rather than the consequences of educational expectations, I argue that students shape their expectations in response to the signals about their academic performance they receive from institutionalized performance indicators in schools. Chapter II considers...

  16. Unequal Solidarity?

    DEFF Research Database (Denmark)

    Holck, Lotte; Muhr, Sara Louise

    2017-01-01

    Due to the fact that immigration in Denmark is a more recent phenomenon, diversity management has had a much shorter history in politics as well as in business, and has not yet been institutionalized to the same degree as in, for example, North America, from where the concept originates. When crossing the Atlantic, the concept of diversity management merged with Danish universal welfare logics that offer a particular view on equality as sameness together with solidarity through corporate social responsibility. Drawing on 94 employee narratives about difference in a Danish workplace renowned for its diversity work, this article argues that a translation of the original American concept has taken place that turns diversity management into an ambiguous corporate activity when practised through Danish welfare logics. Paradoxically, corporate practices of social responsibility aimed at fostering...

  17. Snail family members unequally trigger EMT and thereby differ in their ability to promote the neoplastic transformation of mammary epithelial cells.

    Directory of Open Access Journals (Sweden)

    Baptiste Gras

    By fostering cell commitment to the epithelial-to-mesenchymal transition (EMT), SNAIL proteins endow cells with motility, thereby favoring the metastatic spread of tumor cells. Whether the phenotypic change additionally facilitates tumor initiation has never been addressed. Here we demonstrate that when a SNAIL protein is ectopically produced in non-transformed mammary epithelial cells, the cells are protected from anoikis and proliferate under low-adherence conditions: a hallmark of cancer cells. The three SNAIL proteins show unequal oncogenic potential, strictly correlating with their ability to promote EMT. SNAIL3 in particular behaves as a poor EMT inducer, supporting the concept that the transcription factor functionally diverges from its two related proteins.

  18. Learning from prescribing errors

    OpenAIRE

    Dean, B

    2002-01-01

    

 The importance of learning from medical error has recently received increasing emphasis. This paper focuses on prescribing errors and argues that, while learning from prescribing errors is a laudable goal, there are currently barriers that can prevent this occurring. Learning from errors can take place on an individual level, at a team level, and across an organisation. Barriers to learning from prescribing errors include the non-discovery of many prescribing errors, lack of feedback to th...

  19. ANALISIS PENGARUH KONFIGURASI EIGRP EQUAL DAN UNEQUAL COST LOAD BALANCING TERHADAP KINERJA ROUTER

    Directory of Open Access Journals (Sweden)

    Dian Bagus Saptonugroho

    2015-04-01

    A routing protocol is tasked with finding the best route for sending packets, assessed using a metric. If there is more than one route with the same metric value, Routing Information Protocol (RIP), Open Shortest Path First (OSPF), and Enhanced Interior Gateway Routing Protocol (EIGRP) support equal-cost load balancing to send packets to the destination. If there is more than one route with different metric values, EIGRP can additionally perform unequal-cost load balancing. This research was conducted to determine the effect of EIGRP equal- and unequal-cost load balancing configurations on router performance, which can serve as the proof-of-concept testing that forms part of a network design document. The research network used EIGRP as the routing protocol. Equal and unequal load balancing were enabled by configuring variance, CEF, per-destination load balancing, per-packet load balancing, or traffic sharing, and their effects were analyzed on the neighbor table, topology table, routing table, data transmission, survivability, convergence, throughput, and utilization. This study used the GNS3 emulator to model Cisco 2691 routers running Cisco IOS 12.4(25c) with the advanced enterprise image (c2691-adventerprisek9-mz.124-25c.bin), and OPNET Modeler 14.5 for simulation. The results of the study can be used as proof-of-concept testing in a design document and later as input to the implementation plan and verification plan.
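
    The variance mechanism the study configures can be summarized in a few lines: EIGRP installs a non-successor route for unequal-cost load balancing only if it satisfies the feasibility condition (reported distance strictly below the feasible distance) and its metric is within variance times the feasible distance. The metrics below are illustrative, not from the study's topology.

      # (next_hop, metric_via_this_hop, reported_distance_from_neighbor)
      routes = [
          ("R2", 20, 10),   # successor: lowest metric -> feasible distance (FD) = 20
          ("R3", 35, 15),   # candidate for unequal-cost load balancing
          ("R4", 70, 25),   # fails feasibility (RD >= FD): possible routing loop
          ("R5", 30, 40),   # also fails feasibility despite a low metric
      ]
      variance = 2
      fd = min(m for _, m, _ in routes)

      installed = [(h, m) for h, m, rd in routes if rd < fd and m <= variance * fd]
      print(installed)   # [('R2', 20), ('R3', 35)]: traffic shared inversely to metric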

  20. Differences between Inequalities and Unequal Exchange: Comments on the Papers by Chaves and Köhler

    OpenAIRE

    Raffer, Kunibert

    2006-01-01

    Köhler's critique of global wages, where he presents the concept of productivity with great clarity, combines very well with Chaves' presentation of Köhler's model of Unequal Exchange (UE). A brief and solid common position emerges. As I wrote that "the dimension of non-equivalence in a strict, logical sense" can only be shown by comparing real wages, I fully second Köhler's use of Purchasing Power Parity (PPP) data. In the 1980s, I explicitly referred to the research on PPP comparisons. Theref...

  1. Notes on a Dramaturgical Analysis of Unequal Small-Scale Corruption Experiences

    Directory of Open Access Journals (Sweden)

    Edgar Daniel Manchinelly Mota

    2017-10-01

    In the last two decades, corruption has emerged as a relevant subject on a worldwide scale, because of its negative effects on the economy and State institutions, among other things. Research has focused on the macro aspects of corruption, emphasizing its causes and consequences. However, small-scale corruption has not been studied in such detail. This document proposes a theoretical-methodological framework for a dramaturgical analysis of small-scale corruption, with the aim of demonstrating that it is a stratified interaction. In this sense, corruption is an unequal experience for citizens, which depends on individuals' social position.

  2. Two-dimensional errors

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    This chapter addresses the extension of previous work in one-dimensional (linear) error theory to two-dimensional error analysis. The topics of the chapter include the definition of two-dimensional error, the probability ellipse, the probability circle, elliptical (circular) error evaluation, the application to position accuracy, and the use of control systems (points) in measurements

  3. Part two: Error propagation

    International Nuclear Information System (INIS)

    Picard, R.R.

    1989-01-01

    Topics covered in this chapter include a discussion of exact results as related to nuclear materials management and accounting in nuclear facilities; propagation of error for a single measured value; propagation of error for several measured values; error propagation for materials balances; and an application of error propagation to an example of uranium hexafluoride conversion process

  4. Learning from Errors

    OpenAIRE

    Martínez-Legaz, Juan Enrique; Soubeyran, Antoine

    2003-01-01

    We present a model of learning in which agents learn from errors. If an action turns out to be an error, the agent rejects not only that action but also neighboring actions. We find that, keeping memory of his errors, under mild assumptions an acceptable solution is asymptotically reached. Moreover, one can take advantage of big errors for a faster learning.

  5. Generalized Gaussian Error Calculus

    CERN Document Server

    Grabe, Michael

    2010-01-01

    For the first time in 200 years, Generalized Gaussian Error Calculus addresses a rigorous, complete and self-consistent revision of the Gaussian error calculus. Since experimentalists realized that measurements in general are burdened by unknown systematic errors, the classical, widely used evaluation procedures scrutinizing the consequences of random errors alone turned out to be obsolete. As a matter of course, the error calculus to be, treating random and unknown systematic errors side by side, should ensure the consistency and traceability of physical units, physical constants and physical quantities at large. The generalized Gaussian error calculus considers unknown systematic errors to spawn biased estimators. Beyond that, random errors are asked to conform to the idea of what the author calls well-defined measuring conditions. The approach features the properties of a building kit: any overall uncertainty turns out to be the sum of a contribution due to random errors, to be taken from a confidence inter...

  6. High-Dimensional Multivariate Repeated Measures Analysis with Unequal Covariance Matrices

    Science.gov (United States)

    Harrar, Solomon W.; Kong, Xiaoli

    2015-01-01

    In this paper, test statistics for repeated measures designs are introduced for the case where the dimension is large, meaning that the number of repeated measures and the total sample size grow together, with either one possibly larger than the other. Asymptotic distributions of the statistics are derived for the equal as well as unequal covariance cases, in balanced as well as unbalanced settings. The asymptotic framework considered requires proportional growth of the sample sizes and the dimension of the repeated measures in the unequal covariance case. In the equal covariance case, one can grow at a much faster rate than the other. The derivations of the asymptotic distributions mimic that of the Central Limit Theorem, with some important peculiarities addressed with sufficient rigor. Consistent and unbiased estimators of the asymptotic variances, which make efficient use of all the observations, are also derived. A simulation study provides favorable evidence for the accuracy of the asymptotic approximation under the null hypothesis. Power simulations show that the new methods have power comparable to that of a popular method known to work well in low-dimensional situations, but enormous advantages when the dimension is large. Data from an electroencephalograph (EEG) experiment are analyzed to illustrate the application of the results. PMID:26778861

  7. Material conditions of kindergartens as producers of experiences: uses of diversity and unequal relationships

    Directory of Open Access Journals (Sweden)

    Lucía Petrelli

    2017-09-01

    This article presents preliminary results of socio-anthropological research carried out in Early Education schools of the City of Buenos Aires. It focuses on the uses of sociocultural diversity and unequal relationships and their articulation with the material conditions of everyday life in the institutions where fieldwork took place. It includes the analytical description of a social situation: the visit of an officer of the Ministry of Education to one of the Initial Schools where research was carried out. These descriptions show the ways in which the different institutional subjects (teachers, parents, school directors, the staff member) refer to and produce the spaces in which their practices take place and, simultaneously, struggle for their places and put their diverse and unequal relationships under strain. Afterwards, taking into account the Initial Education schools of a whole educational district of the city, we analyze the features of the "educational offer" for initial schooling, the incorporation of devices such as cell phones into the everyday work of the kindergartens as an aid to teachers, and the beginnings of online school enrolment. These issues allow us to highlight the recent history of Argentina's Initial Level teaching. The article shows that materiality, as the ethnographic descriptions evidence, makes it possible to understand how relations of diversity and inequality come together today.

  8. Understanding determinants of unequal distribution of stillbirth in Tehran, Iran: a concentration index decomposition approach.

    Science.gov (United States)

    Almasi-Hashiani, Amir; Sepidarkish, Mahdi; Safiri, Saeid; Khedmati Morasae, Esmaeil; Shadi, Yahya; Omani-Samani, Reza

    2017-05-17

    The present inquiry set out to determine the economic inequality in history of stillbirth and to understand the determinants of the unequal distribution of stillbirth in Tehran, Iran. A population-based cross-sectional study was conducted on 5170 pregnancies in Tehran, Iran, in 2015. Principal component analysis (PCA) was applied to measure asset-based economic status. The concentration index was used to measure socioeconomic inequality in stillbirth and was then decomposed into its determinants. The concentration index and its 95% CI for stillbirth was -0.121 (-0.235 to -0.002). Decomposition of the concentration index showed that mother's education (50%), mother's occupation (30%), economic status (26%) and father's age (12%) had the highest positive contributions to the measured inequality in stillbirth history in Tehran. Mother's age (17%) had the highest negative contribution to inequality. Stillbirth is unequally distributed among Iranian women and is mostly concentrated among people of low economic status. Mother-related factors had the highest positive and negative contributions to inequality, highlighting the need for interventions targeted at mothers to redress the inequality. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
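
    The record above relies on the rank-based concentration index. As a minimal sketch under assumed data (hypothetical arrays standing in for the survey's asset scores and stillbirth indicator), the widely used "convenient covariance" form C = 2·cov(y, r)/μ can be computed as follows; the decomposition step reported in the abstract is not reproduced here.

```python
import numpy as np

def concentration_index(outcome, rank_var):
    """Concentration index via the convenient covariance formula:
    C = 2 * cov(outcome, fractional rank) / mean(outcome)."""
    order = np.argsort(rank_var)                 # rank subjects by economic status
    y = np.asarray(outcome, dtype=float)[order]
    n = len(y)
    frac_rank = (np.arange(1, n + 1) - 0.5) / n  # fractional rank in [0, 1]
    cov = np.mean((y - y.mean()) * (frac_rank - frac_rank.mean()))
    return 2.0 * cov / y.mean()

# Illustrative data: a binary outcome concentrated among low-status subjects.
rng = np.random.default_rng(0)
wealth = rng.normal(size=5170)                   # PCA-style asset score (made up)
p = 0.04 - 0.02 * (wealth > 0)                   # higher risk at low wealth
stillbirth = rng.binomial(1, p)
print(concentration_index(stillbirth, wealth))   # negative => concentrated among the poor
```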

  9. The Political Economy of the Water Footprint: A Cross-National Analysis of Ecologically Unequal Exchange

    Directory of Open Access Journals (Sweden)

    Jared B. Fitzgerald

    2016-12-01

    Full Text Available Water scarcity is an important social and ecological issue that is becoming increasingly problematic with the onset of climate change. This study explores the extent to which water resources in developing countries are affected by the vertical flow of exports to high-income countries. In examining this question, the authors engage the sociological theory of ecologically unequal exchange, which argues that high-income countries are able to partially externalize the environmental costs of their consumption to lower-income countries. The authors use a relatively new and underutilized measure of water usage, the water footprint, which quantifies the amount of water used in the entire production process. Ordinary least squares (OLS) and robust regression techniques are employed in the cross-national analysis of 138 countries. The results provide partial support for the propositions of ecologically unequal exchange theory. In particular, the results highlight the importance of structural position in the global economy for understanding the effects of trade on water resources.

  10. Unequal Exchange of Air Pollution and Economic Benefits Embodied in China's Exports.

    Science.gov (United States)

    Zhang, Wei; Wang, Feng; Hubacek, Klaus; Liu, Yu; Wang, Jinnan; Feng, Kuishuang; Jiang, Ling; Jiang, Hongqiang; Zhang, Bing; Bi, Jun

    2018-04-03

    As the world's factory, China has enjoyed huge economic benefits from international exports but has also suffered severe environmental consequences. Most studies investigating the unequal environmental exchange associated with trade have taken China as a homogeneous entity, ignoring considerable inequality and outsourcing of pollution within China. This paper traces the regional mismatch of export-induced economic benefits and environmental costs along national supply chains by using the latest multiregional input-output model and emission inventory for 2012. The results indicate that approximately 56% of the national GDP induced by exports has been received by developed coastal regions, while about 72% of the air pollution embodied in national exports, measured as aggregated atmospheric pollutant equivalents (APE), has been incurred mainly by less developed central and western regions. For each yuan of export-induced GDP, developed regions incurred only 0.4-0.6 g of APE emissions, whereas less developed regions in western or central China had to suffer 4-8 times that amount. This is because poorer regions provide lower-value-added and more emission-intensive inputs and have lower environmental standards and less efficient technologies. Our results may pave the way to mitigating the unequal relationship between developed and less developed regions from the perspective of the environment-economy nexus.

  11. Medication errors: prescribing faults and prescription errors.

    Science.gov (United States)

    Velo, Giampaolo P; Minuz, Pietro

    2009-06-01

    1. Medication errors are common in general practice and in hospitals. Both errors in the act of writing (prescription errors) and prescribing faults due to erroneous medical decisions can result in harm to patients. 2. Any step in the prescribing process can generate errors. Slips, lapses, or mistakes are sources of errors, as in unintended omissions in the transcription of drugs. Faults in dose selection, omitted transcription, and poor handwriting are common. 3. Inadequate knowledge or competence and incomplete information about clinical characteristics and previous treatment of individual patients can result in prescribing faults, including the use of potentially inappropriate medications. 4. An unsafe working environment, complex or undefined procedures, and inadequate communication among health-care personnel, particularly between doctors and nurses, have been identified as important underlying factors that contribute to prescription errors and prescribing faults. 5. Active interventions aimed at reducing prescription errors and prescribing faults are strongly recommended. These should be focused on the education and training of prescribers and the use of on-line aids. The complexity of the prescribing procedure should be reduced by introducing automated systems or uniform prescribing charts, in order to avoid transcription and omission errors. Feedback control systems and immediate review of prescriptions, which can be performed with the assistance of a hospital pharmacist, are also helpful. Audits should be performed periodically.

  12. Unequal Marriages within the Russian Imperial Home and the 1911 Meeting of the Grand Dukes

    Directory of Open Access Journals (Sweden)

    Stanislav V. Dumin

    2013-12-01

    Full Text Available In celebrating the 400th anniversary of the Russian Imperial Household in 2013, it is important to remember that the historic dynasty did not disappear in 1917 and, as with the earlier ruling dynasties, it still retains its status and its structure based on the Law on Succession and the Provision on the Imperial Family. These documents, in part, define that the dynasty includes only the Romanov descendants born from marital unions with ruling or previously ruling dynasties. The remaining Romanov descendants, born from unequal, morganatic marriages, did not belong to the Russian Imperial Family and, correspondingly, did not have a right to the throne. Until 1911, such marriages were simply forbidden for all members of the dynasty. Popular literature and the media often overlook this circumstance and instead include individuals who were not part of the dynasty, even though they were descended from the Russian Emperors through the paternal or the maternal line. Representatives of the so-called "Association of the Romanov Household", descendants of Grand Dukes or Dukes of Imperial Blood from unequal marriages, would often point to a decree issued by Nicholas II in 1911. The decree stated that the lesser Romanovs, dukes and duchesses of imperial blood, that is, great-grandchildren and more distant descendants of Emperors, could enter into unequal marriages with Royal Permission. However, the theory that this decree somehow still gave these descendants dynastic rights is refuted by the materials we have uncovered in the State Archives of the Russian Federation detailing the meeting of the Grand Dukes, called together by the order of Nicholas II, and the resolution of the Emperor at the end of this meeting, which is published in the article. As such, out of all the Romanov descendants still alive today, the status of being a true member of the dynasty only

  13. Field error lottery

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, C.J.; McVey, B. (Los Alamos National Lab., NM (USA)); Quimby, D.C. (Spectra Technology, Inc., Bellevue, WA (USA))

    1990-01-01

    The level of field errors in an FEL is an important determinant of its performance. We have computed the 3D performance of a large laser subsystem subjected to field errors of various types. These calculations have been guided by simple models such as SWOOP. The technique of choice is utilization of the FELEX free electron laser code, which now possesses extensive engineering capabilities. Modeling includes the ability to establish tolerances of various types: fast and slow scale field bowing, field error level, beam position monitor error level, gap errors, defocusing errors, energy slew, displacement and pointing errors. Many effects of these errors on relative gain and relative power extraction are displayed and are the essential elements of determining an error budget. The random errors also depend on the particular random number seed used in the calculation. The simultaneous display of the performance versus error level for cases with multiple seeds illustrates the variations attributable to the stochasticity of this model. All these errors are evaluated numerically for comprehensive engineering of the system. In particular, gap errors are found to place requirements beyond mechanical tolerances of ±25 µm, and amelioration of these may occur by a procedure utilizing direct measurement of the magnetic fields at assembly time. 4 refs., 12 figs.
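
    As an illustration of the seed-dependence noted above, the toy Monte Carlo below scans a purely fictitious gain-degradation model over several random seeds at each field-error level. It is not FELEX, and its physics is only a placeholder for how such a performance-versus-error scatter might be produced.

```python
import numpy as np

# Toy stand-in for a full FEL simulation: relative gain degrades with the
# RMS field-error level, with seed-to-seed scatter coming from the random
# walk of the trajectory. Purely illustrative, not FELEX physics.
def relative_gain(error_level, seed):
    rng = np.random.default_rng(seed)
    kicks = rng.normal(0.0, error_level, size=100)   # per-period field errors
    wander = np.cumsum(kicks)                         # trajectory walk-off
    return np.exp(-np.mean(wander**2))                # gain penalty ~ exp(-<x^2>)

for err in (0.01, 0.05, 0.1):
    gains = [relative_gain(err, seed) for seed in range(20)]
    print(f"error={err:.2f}  mean gain={np.mean(gains):.3f}  spread={np.std(gains):.3f}")
```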

  14. Prescription Errors in Psychiatry

    African Journals Online (AJOL)

    Arun Kumar Agnihotri

    clinical pharmacists in detecting errors before they have a (sometimes serious) clinical impact should not be underestimated. Research on medication error in mental health care is limited. .... participation in ward rounds and adverse drug.

  15. A Hybrid Fuzzy Multi-hop Unequal Clustering Algorithm for Dense Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Shawkat K. Guirguis

    2017-01-01

    Full Text Available Clustering is carried out to explore and solve the power dissipation problem in wireless sensor networks (WSNs). Hierarchical network architecture, based on clustering, can reduce energy consumption, balance traffic load, improve scalability, and prolong network lifetime. However, clustering faces two main challenges: the hotspot problem and the search for effective techniques to perform clustering. This paper introduces a fuzzy unequal clustering technique for heterogeneous dense WSNs that determines both the final cluster heads and their radii. The proposed fuzzy system blends three effective parameters: the distance to the base station, the density of the cluster, and the deviation of the node's residual energy from the average network energy. Our objectives are gains in network lifetime, energy distribution, and energy consumption. To evaluate the proposed algorithm, WSN clustering-based routing algorithms are analyzed and simulated, and the obtained results are compared. These protocols are LEACH, SEP, HEED, EEUC, and MOFCA.
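
    As a rough sketch of how the three inputs named above could be blended into a competition radius, the snippet below uses a crisp weighted score in place of a genuine fuzzy rule base; the weights, radius bounds, and parameter names are illustrative assumptions, not the paper's actual membership functions. Larger radii go to nodes far from the sink, consistent with unequal clustering's hotspot mitigation.

```python
import numpy as np

def competition_radius(d_bs, d_bs_max, density, density_max,
                       e_residual, e_avg, r_min=20.0, r_max=80.0):
    """Crisp stand-in for the fuzzy blend of the three stated inputs:
    distance to base station, cluster density, and the deviation of
    residual energy from the network average. Weights are illustrative."""
    far = d_bs / d_bs_max                      # 0 = near the sink, 1 = far away
    sparse = 1.0 - density / density_max       # sparse regions get larger radii
    energetic = np.clip((e_residual - e_avg) / e_avg + 0.5, 0.0, 1.0)
    score = 0.4 * far + 0.3 * sparse + 0.3 * energetic
    return r_min + (r_max - r_min) * score

print(competition_radius(d_bs=60, d_bs_max=200, density=8, density_max=20,
                         e_residual=0.9, e_avg=1.0))
```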

  16. Discrete- vs. Continuous-Time Modeling of Unequally Spaced Experience Sampling Method Data

    Directory of Open Access Journals (Sweden)

    Silvia de Haan-Rietdijk

    2017-10-01

    Full Text Available The Experience Sampling Method (ESM) is a common approach in psychological research for collecting intensive longitudinal data with high ecological validity. One characteristic of ESM data is that it is often unequally spaced, because the measurement intervals within a day are deliberately varied, and measurement continues over several days. This poses a problem for discrete-time (DT) modeling approaches, which are based on the assumption that all measurements are equally spaced. Nevertheless, DT approaches such as (vector) autoregressive modeling are often used to analyze ESM data, for instance in the context of affective dynamics research. There are equivalent continuous-time (CT) models, but they are more difficult to implement. In this paper we take a pragmatic approach and evaluate the practical relevance of the violated model assumption in DT AR(1) and VAR(1) models for the N = 1 case. We use simulated data under an ESM measurement design to investigate the bias in the parameters of interest under four different model implementations, ranging from the true CT model that accounts for all the exact measurement times, to the crudest possible DT model implementation, where even the nighttime is treated as a regular interval. An analysis of empirical affect data illustrates how the differences between DT and CT modeling can play out in practice. We find that the size and the direction of the bias in DT (V)AR models for unequally spaced ESM data depend quite strongly on the true parameter in addition to data characteristics. Our recommendation is to use CT modeling whenever possible, especially now that new software implementations have become available.
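
    The DT-CT correspondence at issue is that, for a first-order process, the autoregressive effect over an interval Δt is φ(Δt) = exp(a·Δt), with a < 0 the continuous-time drift. The sketch below simulates an Ornstein-Uhlenbeck (CT-AR(1)) series on unequally spaced ESM-style intervals, including a long night gap, and shows how a naive single-φ DT estimate blends the interval-specific effects; all parameter values are illustrative.

```python
import numpy as np

# CT-AR(1) (Ornstein-Uhlenbeck): over an interval dt the autoregressive
# effect is phi(dt) = exp(a * dt), a < 0. A single DT phi is misspecified
# when the intervals vary.
rng = np.random.default_rng(1)
a, sigma = -0.5, 1.0
dts = rng.choice([1.0, 2.0, 3.0, 9.0], size=2000)   # 9.0 ~ the night gap
x = [0.0]
for dt in dts:
    phi = np.exp(a * dt)
    innov_sd = np.sqrt(sigma**2 * (1 - phi**2) / (-2 * a))  # exact OU transition
    x.append(phi * x[-1] + innov_sd * rng.normal())
x = np.array(x)

# Naive DT estimate treats every interval as one time unit:
phi_dt = np.sum(x[:-1] * x[1:]) / np.sum(x[:-1] ** 2)
print("naive DT phi:", round(phi_dt, 3), " vs phi at dt=1:", round(np.exp(a), 3))
```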

  17. Unequal cluster sizes in stepped-wedge cluster randomised trials: a systematic review.

    Science.gov (United States)

    Kristunas, Caroline; Morris, Tom; Gray, Laura

    2017-11-15

    Objectives: To investigate the extent to which cluster sizes vary in stepped-wedge cluster randomised trials (SW-CRTs) and whether any variability is accounted for during the sample size calculation and analysis of these trials. Setting: Any, not limited to healthcare settings. Participants: Any taking part in an SW-CRT published up to March 2016. Outcome measures: The primary outcome is the variability in cluster sizes, measured by the coefficient of variation (CV) in cluster size. Secondary outcomes include the difference between the cluster sizes assumed during the sample size calculation and those observed during the trial, any reported variability in cluster sizes and whether the methods of sample size calculation and methods of analysis accounted for any variability in cluster sizes. Results: Of the 101 included SW-CRTs, 48% mentioned that the included clusters were known to vary in size, yet only 13% of these accounted for this during the calculation of the sample size. However, 69% of the trials did use a method of analysis appropriate for when clusters vary in size. Full trial reports were available for 53 trials. The CV was calculated for 23 of these: the median CV was 0.41 (IQR 0.22-0.52). Actual cluster sizes could be compared with those assumed during the sample size calculation for 14 (26%) of the trial reports; the cluster sizes were between 29% and 480% of those that had been assumed. Conclusions: Cluster sizes often vary in SW-CRTs. Reporting of SW-CRTs also remains suboptimal. The effect of unequal cluster sizes on the statistical power of SW-CRTs needs further exploration, and methods appropriate to studies with unequal cluster sizes need to be employed. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
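
    The coefficient of variation used as the primary outcome above is simply the standard deviation of the cluster sizes divided by their mean. A two-line sketch with made-up cluster sizes:

```python
import numpy as np

def cluster_size_cv(sizes):
    """Coefficient of variation of cluster sizes: CV = sd / mean."""
    sizes = np.asarray(sizes, dtype=float)
    return sizes.std(ddof=1) / sizes.mean()

print(cluster_size_cv([120, 80, 200, 95, 150]))  # illustrative cluster sizes
```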

  18. Unequal Probability Marking Approach to Enhance Security of Traceback Scheme in Tree-Based WSNs.

    Science.gov (United States)

    Huang, Changqin; Ma, Ming; Liu, Xiao; Liu, Anfeng; Zuo, Zhengbang

    2017-06-17

    Fog (from core to edge) computing is a newly emerging computing platform, which utilizes a large number of network devices at the edge of a network to provide ubiquitous computing, and thus has great development potential. However, the issue of security poses an important challenge for fog computing. In particular, the Internet of Things (IoT) that constitutes the fog computing platform is crucial for preserving the security of a huge number of wireless sensors, which are vulnerable to attack. In this paper, a new unequal probability marking approach is proposed to enhance the security performance of logging and migration traceback (LM) schemes in tree-based wireless sensor networks (WSNs). The main contribution of this paper is to overcome the deficiencies of the LM scheme and achieve a higher network lifetime and larger storage space. In the unequal probability marking logging and migration (UPLM) scheme of this paper, different marking probabilities are adopted for different nodes according to their distances to the sink. A large marking probability is assigned to nodes in remote areas (areas at a long distance from the sink), while a small marking probability is applied to nodes in nearby areas (areas at a short distance from the sink). This reduces the consumption of storage and energy in addition to enhancing the security performance, lifetime, and storage capacity. Marking information is migrated to nodes at a longer distance from the sink to increase the amount of stored marking information, thus enhancing the security performance in the process of migration. Experimental simulation shows that for general tree-based WSNs, the UPLM scheme proposed in this paper can store 1.12-1.28 times the amount of marking information that the equal probability marking approach achieves, and has 1.15-1.26 times the storage utilization efficiency of other schemes.
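
    A minimal sketch of the distance-dependent marking idea: nodes far from the sink, which in a tree-based WSN carry less relay traffic and so have spare energy and storage, mark with higher probability. The linear ramp and the probability endpoints are illustrative assumptions, not the UPLM paper's exact assignment.

```python
import numpy as np

def marking_probability(hop_distance, max_hops, p_near=0.05, p_far=0.5):
    """Unequal marking: probability grows with hop distance from the sink.
    Linear ramp and endpoint values are illustrative placeholders."""
    return p_near + (p_far - p_near) * (hop_distance / max_hops)

rng = np.random.default_rng(2)
for hops in (1, 5, 10):
    p = marking_probability(hops, max_hops=10)
    marks = rng.random(10_000) < p                  # per-packet marking decisions
    print(f"hops={hops:2d}  p={p:.2f}  observed marking rate={marks.mean():.3f}")
```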

  19. Merger of binary neutron stars of unequal mass in full general relativity

    International Nuclear Information System (INIS)

    Shibata, Masaru; Taniguchi, Keisuke; Uryū, Kōji

    2003-01-01

    We present results of three-dimensional numerical simulations of the merger of unequal-mass binary neutron stars in full general relativity. A Γ-law equation of state P=(Γ-1)ρε is adopted, where P, ρ, ε, and Γ are the pressure, rest-mass density, specific internal energy, and the adiabatic constant, respectively. We take Γ=2 and the baryon rest-mass ratio Q_M to be in the range 0.85-1. The typical grid size is (633,633,317) for (x,y,z). We have improved several implementations since our latest work. In the present code, the radiation reaction of gravitational waves is taken into account with good accuracy. This enables us to follow the coalescence all the way from the late inspiral phase through the merger phase, for which the transition is triggered by the radiation reaction. It is found that if the total rest mass of the system is more than ∼1.7 times the maximum allowed rest mass of spherical neutron stars, a black hole is formed after the merger, irrespective of the mass ratio. The gravitational waveforms and outcomes in the merger of unequal-mass binaries are compared with those in equal-mass binaries. It is found that the disk mass around the black holes so formed increases with decreasing rest-mass ratio and decreases with increasing compactness of the neutron stars. The merger process and the gravitational waveforms also depend strongly on the rest-mass ratio, even for the range Q_M = 0.85-1.

  20. Errors in otology.

    Science.gov (United States)

    Kartush, J M

    1996-11-01

    Practicing medicine successfully requires that errors in diagnosis and treatment be minimized. Malpractice laws encourage litigators to ascribe all medical errors to incompetence and negligence. There are, however, many other causes of unintended outcomes. This article describes common causes of errors and suggests ways to minimize mistakes in otologic practice. Widespread dissemination of knowledge about common errors and their precursors can reduce the incidence of their occurrence. Consequently, laws should be passed to allow for a system of non-punitive, confidential reporting of errors and "near misses" that can be shared by physicians nationwide.

  1. Cross-Country Variation in Adult Skills Inequality: Why Are Skill Levels and Opportunities so Unequal in Anglophone Countries?

    Science.gov (United States)

    Green, Andy; Green, Francis; Pensiero, Nicola

    2015-01-01

    This article examines cross-country variations in adult skills inequality and asks why skills in Anglophone countries are so unequal. Drawing on the Organization for Economic Cooperation and Development's recent Survey of Adult Skills and other surveys, it investigates the differences across countries and country groups in inequality in both…

  2. X-ray- and TEM-induced mitotic recombination in Drosophila melanogaster: Unequal and sister-strand recombination

    International Nuclear Information System (INIS)

    Becker, H.J.

    1975-01-01

    Twin mosaic spots of dark-apricot and light-apricot ommatidia were found in the eyes of w^a/w^a females, of w^a males, of females homozygous for In(1)sc^4, w^a, and of attached-X females homozygous for w^a. The flies were raised from larvae which had been treated with 1,630 R of X-rays at the age of 48-52 hours. An additional group of w^a/w^a females and w^a males came from larvae that had been fed with triethylene melamine (TEM) at the age of 22-24 hours. The twin spots apparently were the result of induced unequal mitotic recombination, i.e. from unequal sister-strand recombination in the males and from unequal sister-strand recombination as well as, possibly, unequal recombination between homologous strands in the females. That is, a duplication resulted in w^a Dp w^a / w^a dark-apricot ommatidia and the corresponding deficiency in an adjacent area of w^a / Df w^a light-apricot ommatidia. In an additional experiment, sister-strand mitotic recombination in the ring-X chromosome of ring-X/rod-X females heterozygous for w and w^co is believed to be the cause of X-ray-induced single mosaic spots that show the phenotype of the rod-X marker. (orig.)

  3. Statistical Power and Optimum Sample Allocation Ratio for Treatment and Control Having Unequal Costs Per Unit of Randomization

    Science.gov (United States)

    Liu, Xiaofeng

    2003-01-01

    This article considers optimal sample allocation between the treatment and control condition in multilevel designs when the costs per sampling unit vary due to treatment assignment. Optimal unequal allocation may reduce the cost from that of a balanced design without sacrificing any power. The optimum sample allocation ratio depends only on the…
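
    The abstract is truncated, but the classical result for a two-group comparison under a linear cost constraint is the square-root rule: the optimal ratio of treatment to control sample size equals the square root of the inverse cost ratio. A minimal sketch of that rule follows; the multilevel generalization developed in the article is not reproduced here.

```python
import math

def optimal_allocation_ratio(cost_treatment, cost_control):
    """Square-root rule for a two-group mean comparison with unequal
    per-unit costs: n_treatment / n_control = sqrt(cost_control / cost_treatment).
    The multilevel version in the article generalizes this idea."""
    return math.sqrt(cost_control / cost_treatment)

# Treatment units cost 4x control units -> assign half as many to treatment.
print(optimal_allocation_ratio(cost_treatment=4.0, cost_control=1.0))  # 0.5
```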

  4. Random Shift and XOR of Unequal-sized Packets (RaSOR) to Shave off Transmission Overhead

    DEFF Research Database (Denmark)

    Taghouti, Maroua; Roetter, Daniel Enrique Lucani; Fitzek, Frank Hanns Paul

    2017-01-01

    We propose the design of a novel coding scheme for unequal-sized packets. Unlike the conventional approach of brute-force zero-padding in Random Linear Network Coding (RLNC), we exploit this heterogeneity to shave off the trailing overhead and transmit considerably fewer coded packets...
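
    A sketch of the shift-and-XOR idea under assumptions: instead of padding a short packet with zeros to the length of a long one, the short packet is XORed into the long one at a (pseudo-randomly chosen) byte offset. The framing, offset signalling, and decoding order of the actual RaSOR scheme are not reproduced; this only illustrates why a receiver that already knows one packet can recover the other without padding overhead.

```python
import os

def shift_xor(base: bytes, short: bytes, shift: int) -> bytes:
    """XOR a shorter packet into a longer one at a byte offset, rather than
    zero-padding the short packet to full length (illustrative only)."""
    assert 0 <= shift <= len(base) - len(short)
    coded = bytearray(base)
    for i, b in enumerate(short):
        coded[shift + i] ^= b
    return bytes(coded)

p1, p2 = os.urandom(1200), os.urandom(300)  # two unequal-sized packets
shift = 7                                   # would come from a shared pseudo-random seed
coded = shift_xor(p1, p2, shift)

# A receiver that already holds p1 strips it off and reads p2 at the offset:
residual = bytes(a ^ b for a, b in zip(coded, p1))
assert residual[shift:shift + len(p2)] == p2
print("recovered", len(p2), "bytes with no zero-padding overhead")
```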

  5. The error in total error reduction.

    Science.gov (United States)

    Witnauer, James E; Urcelay, Gonzalo P; Miller, Ralph R

    2014-02-01

    Most models of human and animal learning assume that learning is proportional to the discrepancy between a delivered outcome and the outcome predicted by all cues present during that trial (i.e., total error across a stimulus compound). This total error reduction (TER) view has been implemented in connectionist and artificial neural network models to describe the conditions under which weights between units change. Electrophysiological work has revealed that the activity of dopamine neurons is correlated with the total error signal in models of reward learning. Similar neural mechanisms presumably support fear conditioning, human contingency learning, and other types of learning. Using a computational modeling approach, we compared several TER models of associative learning to an alternative model that rejects the TER assumption in favor of local error reduction (LER), which assumes that learning about each cue is proportional to the discrepancy between the delivered outcome and the outcome predicted by that specific cue on that trial. The LER model provided a better fit to the reviewed data than the TER models. Given the superiority of the LER model with the present data sets, acceptance of TER should be tempered. Copyright © 2013 Elsevier Inc. All rights reserved.
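
    To make the TER/LER contrast concrete, the sketch below trains two-cue compound weights under both update rules. Under TER (the Rescorla-Wagner style rule) the cues share the compound prediction error and settle near 0.5 each; under LER each cue reduces its own error and settles near 1.0. Learning rates and trial structure are illustrative choices, not the paper's simulations.

```python
import numpy as np

def ter_update(weights, present, outcome, alpha=0.1):
    """Total error reduction: every present cue learns from the error of
    the summed compound prediction (Rescorla-Wagner style)."""
    error = outcome - weights[present].sum()
    weights = weights.copy()
    weights[present] += alpha * error
    return weights

def ler_update(weights, present, outcome, alpha=0.1):
    """Local error reduction: each present cue learns from its own error."""
    weights = weights.copy()
    for cue in present:
        weights[cue] += alpha * (outcome - weights[cue])
    return weights

w_ter = w_ler = np.zeros(2)
for _ in range(100):                        # compound AB followed by the outcome
    w_ter = ter_update(w_ter, [0, 1], 1.0)
    w_ler = ler_update(w_ler, [0, 1], 1.0)
print("TER (cues share the outcome):", w_ter)   # ~[0.5, 0.5]
print("LER (each cue fully predicts):", w_ler)  # ~[1.0, 1.0]
```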

  6. Errors in Neonatology

    OpenAIRE

    Antonio Boldrini; Rosa T. Scaramuzzo; Armando Cuttano

    2013-01-01

    Introduction: Danger and errors are inherent in human activities. In medical practice, errors can lead to adverse events for patients. Mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience in the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main err...

  7. Systematic Procedural Error

    National Research Council Canada - National Science Library

    Byrne, Michael D

    2006-01-01

    .... This problem has received surprisingly little attention from cognitive psychologists. The research summarized here examines such errors in some detail both empirically and through computational cognitive modeling...

  8. Human errors and mistakes

    International Nuclear Information System (INIS)

    Wahlstroem, B.

    1993-01-01

    Human errors make a major contribution to the risks of industrial accidents. Accidents have provided important lessons, making it possible to build safer systems. In avoiding human errors it is necessary to adapt the systems to their operators. The complexity of modern industrial systems is, however, increasing the danger of system accidents. Models of the human operator have been proposed, but these models are not able to give accurate predictions of human performance. Human errors can never be eliminated, but their frequency can be decreased by systematic efforts. The paper gives a brief summary of research on human error and concludes with suggestions for further work. (orig.)

  9. Learning from Errors

    Science.gov (United States)

    Metcalfe, Janet

    2017-01-01

    Although error avoidance during learning appears to be the rule in American classrooms, laboratory studies suggest that it may be a counterproductive strategy, at least for neurologically typical students. Experimental investigations indicate that errorful learning followed by corrective feedback is beneficial to learning. Interestingly, the…

  10. Investigation of Unequal Planar Wireless Electricity Device for Efficient Wireless Power Transfer

    Directory of Open Access Journals (Sweden)

    M. H. Mohd Salleh

    2017-04-01

    Full Text Available This article focuses on the design and investigation of a pair of unequally sized wireless electricity (Witricity) devices that are equipped with integrated planar coil strips. The proposed pair of devices consists of two different square-shaped resonator sizes of 120 mm × 120 mm and 80 mm × 80 mm, acting as a transmitter and receiver, respectively. The devices are designed, simulated and optimized using the CST Microwave Studio software prior to being fabricated and verified using a vector network analyzer (VNA). The surface current results of the coupled devices indicate a good current density in the 10 mm to 30 mm distance range. This good current density demonstrates that the coupled devices' surface has more electric current per unit area, which leads to good performance up to a 30 mm range. Hence, the results also reveal good coupling efficiency between the coupled devices, approximately 54.5% at up to a 30 mm distance, with both devices axially aligned. In addition, a coupling efficiency of 50% is achieved when a maximum lateral misalignment (LM) of 10 mm and a varied angular misalignment (AM) from 0° to 40° are applied to the proposed device.

  11. Income inequality and status seeking: searching for positional goods in unequal U.S. States.

    Science.gov (United States)

    Walasek, Lukasz; Brown, Gordon D A

    2015-04-01

    It is well established that income inequality is associated with lower societal well-being, but the psychosocial causes of this relationship are poorly understood. A social-rank hypothesis predicts that members of unequal societies are likely to devote more of their resources to status-seeking behaviors such as acquiring positional goods. We used Google Correlate to find search terms that correlated with our measure of income inequality, and we controlled for income and other socioeconomic factors. We found that of the 40 search terms used more frequently in states with greater income inequality, more than 70% were classified as referring to status goods (e.g., designer brands, expensive jewelry, and luxury clothing). In contrast, 0% of the 40 search terms used more frequently in states with less income inequality were classified as referring to status goods. Finally, we showed how residual-based analysis offers a new methodology for using Google Correlate to provide insights into societal attitudes and motivations while avoiding confounds and high risks of spurious correlations. © The Author(s) 2015.

  12. Comparative Racialization and Unequal Justice in the Era of Black Lives Matter: The Dylan Yang Case

    Directory of Open Access Journals (Sweden)

    Pao Lee Vue

    2016-12-01

    Full Text Available Through a close examination of the Dylan Yang-Isaiah Powell case in Wausau, Wisconsin, we argue that while Hmong experiences may have remained marginalized or invisible in the era of Black Lives Matter, this case and the mobilization efforts around it suggest both commonalities and disjunctures among boys of color, especially in relation to the US justice system. The Dylan Yang case, in which a Hmong teen was convicted of murder for the stabbing of another boy, perceived to be black Latino, in an altercation at his home, demands comparative racialization analytics to gain perspective on the implementation of unequal justice. Unpacking the effects of the gangster stereotype, especially for Southeast Asian youth, we suggest how, despite the Asian American model minority trope, Hmong American boys have been racialized as monstrous thugs comparable (but not identical to their black and Latino counterparts, and thus treated by law enforcement as suspects in need of “cataloging” as part of the school-to-prison pipeline. We also delve into the actual practices of young men in order to reveal their strategies in tense and conflictual multiracial contexts, then turn to issues such as long sentences and juvenile solitary confinement that imply the disposability of young lives of color. We conclude with a curation of links to articles, blogs and social media that we invite readers to explore using the critical lens we provide.

  13. An improved energy aware distributed unequal clustering protocol for heterogeneous wireless sensor networks

    Directory of Open Access Journals (Sweden)

    Vrinda Gupta

    2016-06-01

    Full Text Available In this paper, an improved version of the energy aware distributed unequal clustering protocol (EADUC) is presented. The EADUC protocol is commonly used for solving the energy hole problem in multi-hop wireless sensor networks. In the EADUC, the location of the base station and residual energy are given importance as clustering parameters. Based on these parameters, different competition radii are assigned to nodes. Herein, a new approach is proposed to improve the working of EADUC by electing cluster heads considering the number of nodes in the neighborhood in addition to the above two parameters. The inclusion of the neighborhood information in the computation of the competition radii provides better balancing of energy in comparison with the existing approach. Furthermore, for the selection of the next-hop node, the relay metric is defined directly in terms of energy expense instead of only the distance information used in the EADUC, and the data transmission phase is extended in every round by performing data collection a number of times through the use of major slots and mini-slots. The methodology of retaining the same clusters for a few rounds is effective in reducing the clustering overhead. The performance of the proposed protocol has been evaluated under three different scenarios and compared with existing protocols through simulations. The results show that the proposed scheme outperforms the existing protocols in terms of network lifetime in all the scenarios.
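
    A toy sketch of a next-hop choice driven by energy expense rather than distance alone, as described above. The cost model (sum of squared hop distances scaled by the candidate's residual energy) and the candidate list are illustrative assumptions, not the paper's exact relay metric.

```python
# Among candidate neighbours closer to the base station, pick the relay
# whose estimated energy expense is smallest; squared distances mimic the
# usual path-loss-driven transmission cost and are scaled by residual energy.
def relay_cost(d_hop, d_bs, residual_energy):
    return (d_hop**2 + d_bs**2) / residual_energy

candidates = [
    {"id": "A", "d_hop": 30.0, "d_bs": 70.0, "energy": 0.9},
    {"id": "B", "d_hop": 20.0, "d_bs": 85.0, "energy": 0.4},
    {"id": "C", "d_hop": 45.0, "d_bs": 55.0, "energy": 0.8},
]
best = min(candidates, key=lambda c: relay_cost(c["d_hop"], c["d_bs"], c["energy"]))
print("chosen relay:", best["id"])
```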

  14. Action errors, error management, and learning in organizations.

    Science.gov (United States)

    Frese, Michael; Keith, Nina

    2015-01-03

    Every organization is confronted with errors. Most errors are corrected easily, but some may lead to negative consequences. Organizations often focus on error prevention as a single strategy for dealing with errors. Our review suggests that error prevention needs to be supplemented by error management--an approach directed at effectively dealing with errors after they have occurred, with the goal of minimizing negative and maximizing positive error consequences (examples of the latter are learning and innovations). After defining errors and related concepts, we review research on error-related processes affected by error management (error detection, damage control). Empirical evidence on positive effects of error management in individuals and organizations is then discussed, along with emotional, motivational, cognitive, and behavioral pathways of these effects. Learning from errors is central, but like other positive consequences, learning occurs under certain circumstances--one being the development of a mind-set of acceptance of human error.

  15. Chocolate and The Consumption of Forests: A Cross-National Examination of Ecologically Unequal Exchange in Cocoa Exports

    Directory of Open Access Journals (Sweden)

    Mark D. Noble

    2017-08-01

    Full Text Available This study explores the potential links between specialization in cocoa exports and deforestation in developing nations through the lens of ecologically unequal exchange. Although chocolate production was once considered to have only minimal impacts on forests, recent reports suggest damaging trends due to increased demand and changing cultivation strategies. I use two sets of regression analyses to show the increased impact of cocoa export concentration on deforestation over time for less-developed nations. Overall, the results confirm that cocoa exports are associated with deforestation in the most recent time period, and suggest that specialization in cocoa exports is an important form of ecologically unequal exchange, where the environmental costs of chocolate consumption in the Global North are externalized to nations in the Global South, further impairing possibilities for successful or sustainable development.

  16. China, Japan, and the United States in World War II: The Relinquishment of Unequal Treaties in 1943

    Directory of Open Access Journals (Sweden)

    Xiaohua Ma

    2015-08-01

    Full Text Available This paper aims to examine how the United States transformed its foreign policy to promote China as an "equal state" in international politics during World War II, with a focus on the process of the American relinquishment of its unequal treaties with China in 1943. In particular, it concentrates on analyzing the conflicts between the United States and Japan in the process of relinquishment. By examining the rivalry between the United States and Japan in social warfare, namely propaganda, we can see that the relinquishment of the unequal treaties in 1943 not only marked a historical turning point in America's China policy, but also had a great impact on the transformation of East Asian politics in World War II and its influence on world politics.

  17. The Capability Threshold: Re-examining the Definition of the Middle Class in an Unequal Developing Country

    OpenAIRE

    Burger, Ronelle; McAravey, Camren; van der Berg, Servaas

    2015-01-01

    In a polarised and highly unequal country such as South Africa, it is unlikely that a definition of the middle class that is based on an income threshold will adequately capture the political and social meanings of being middle class. We therefore propose a multi-dimensional definition, rooted in the ideas of empowerment and capability, and find that the 'empowered middle class' has expanded significantly since 1993 also across vulnerable subgroups such as blacks, female-headed households and...

  18. Uncorrected refractive errors.

    Science.gov (United States)

    Naidoo, Kovin S; Jaggernath, Jyoti

    2012-01-01

    Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  19. Uncorrected refractive errors

    Directory of Open Access Journals (Sweden)

    Kovin S Naidoo

    2012-01-01

    Full Text Available Global estimates indicate that more than 2.3 billion people in the world suffer from poor vision due to refractive error, of whom 670 million are considered visually impaired because they do not have access to corrective treatment. Refractive errors, if uncorrected, result in an impaired quality of life for millions of people worldwide, irrespective of their age, sex and ethnicity. Over the past decade, a series of studies using a survey methodology, referred to as the Refractive Error Study in Children (RESC), were performed in populations with different ethnic origins and cultural settings. These studies confirmed that the prevalence of uncorrected refractive errors is considerably high for children in low- and middle-income countries. Furthermore, uncorrected refractive error has been noted to have extensive social and economic impacts, such as limiting the educational and employment opportunities of economically active persons, healthy individuals and communities. The key public health challenges presented by uncorrected refractive errors, the leading cause of vision impairment across the world, require urgent attention. To address these issues, it is critical to focus on the development of human resources and sustainable methods of service delivery. This paper discusses three core pillars to addressing the challenges posed by uncorrected refractive errors: Human Resource (HR) Development, Service Development and Social Entrepreneurship.

  20. Genesis by meiotic unequal crossover of a de novo deletion that contributes to steroid 21-hydroxylase deficiency

    International Nuclear Information System (INIS)

    Sinnott, P.; Collier, S.; Dyer, P.A.; Harris, R.; Strachan, T.; Costigan, C.

    1990-01-01

    The HLA-linked human steroid 21-hydroxylase gene CYP21B and its closely homologous pseudogene CYP21A are each normally located centromeric to a fourth component of complement (C4) gene, C4B and C4A, respectively, in an organization suggesting tandem duplication of a ca. 30-kilobase DNA unit containing a CYP21 gene and a C4 gene. Such an organization has been considered to facilitate gene deletion and addition events by unequal crossover between the tandem repeats. The authors have identified a steroid 21-hydroxylase deficiency patient who has a maternally inherited disease haplotype that carries a de novo deletion of a ca. 30-kilobase repeat unit including the CYP21B gene and the associated C4B gene. This disease haplotype appears to have been generated as a result of meiotic unequal crossover between maternal homologous chromosomes. One of the maternal haplotypes is the frequently occurring HLA-DR3,B8,A1 haplotype that normally carries a deletion of a ca. 30-kilobase unit including the CYP21A gene and C4A gene. Haplotypes of this type may possibly act as premutations, increasing the susceptibility to developing a 21-hydroxylase deficiency mutation by facilitating unequal chromosome pairing

  1. Preventing Errors in Laterality

    OpenAIRE

    Landau, Elliot; Hirschorn, David; Koutras, Iakovos; Malek, Alexander; Demissie, Seleshie

    2014-01-01

    An error in laterality is the reporting of a finding that is present on the right side as on the left or vice versa. While different medical and surgical specialties have implemented protocols to help prevent such errors, very few studies have been published that describe these errors in radiology reports and ways to prevent them. We devised a system that allows the radiologist to view reports in a separate window, displayed in a simple font and with all terms of laterality highlighted in sep...

  2. Errors and violations

    International Nuclear Information System (INIS)

    Reason, J.

    1988-01-01

    This paper is in three parts. The first part summarizes the human failures responsible for the Chernobyl disaster and argues that, in considering the human contribution to power plant emergencies, it is necessary to distinguish between errors and violations, and between active and latent failures. The second part presents empirical evidence, drawn from driver behavior, which suggests that errors and violations have different psychological origins. The concluding part outlines a resident pathogen view of accident causation, and seeks to identify the various system pathways along which errors and violations may be propagated

  3. Performance analysis of amplify-and-forward two-way relaying with co-channel interference and channel estimation error

    KAUST Repository

    Yang, Liang

    2013-04-01

    In this paper, we consider the performance of a two-way amplify-and-forward relaying network (AF TWRN) in the presence of unequal-power co-channel interferers (CCI). Specifically, we consider an AF TWRN with an interference-limited relay and two noisy nodes with channel estimation error and CCI. We derive approximate signal-to-interference-plus-noise ratio expressions and then use these expressions to evaluate the outage probability and error probability. Numerical results show that the approximate closed-form expressions are very close to the exact ones. © 2013 IEEE.
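
    The paper's closed-form expressions are not reproduced here, but the outage metric itself is easy to illustrate: the sketch below Monte Carlo-estimates the probability that the SINR at an interference-limited node falls below a rate-derived threshold, under Rayleigh fading with unequal-power interferers. All powers and the threshold are made-up values.

```python
import numpy as np

# Generic Monte Carlo estimate of outage probability under Rayleigh fading
# with unequal-power co-channel interferers. This only illustrates the
# metric; it does not reproduce the paper's closed-form analysis.
rng = np.random.default_rng(3)
n = 200_000
signal_power, interferer_powers, noise = 10.0, [1.0, 0.3, 0.1], 0.1
s = signal_power * rng.exponential(size=n)                 # |h|^2 ~ Exp(1)
i = sum(p * rng.exponential(size=n) for p in interferer_powers)
sinr = s / (i + noise)
threshold = 2**1.0 - 1                                      # 1 bit/s/Hz rate target
print("outage probability:", np.mean(sinr < threshold))
```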

  4. Help prevent hospital errors

    Science.gov (United States)

    MedlinePlus patient instructions (//medlineplus.gov/ency/patientinstructions/000618.htm) on helping to prevent hospital errors, including steps patients can take to keep themselves safe when having surgery.

  5. Pedal Application Errors

    Science.gov (United States)

    2012-03-01

    This project examined the prevalence of pedal application errors and the driver, vehicle, roadway and/or environmental characteristics associated with pedal misapplication crashes based on a literature review, analysis of news media reports, a panel ...

  6. Rounding errors in weighing

    International Nuclear Information System (INIS)

    Jeach, J.L.

    1976-01-01

    When rounding error is large relative to weighing error, it cannot be ignored when estimating scale precision and bias from calibration data. Further, if the data grouping is coarse, rounding error is correlated with weighing error and may also have a mean quite different from zero. These facts are taken into account in a moment estimation method. A copy of the program listing for the MERDA program that provides moment estimates is available from the author. Experience suggests that if the data fall into four or more cells or groups, it is not necessary to apply the moment estimation method. Rather, the estimate given by equation (3) is valid in this instance. 5 tables

  7. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  8. Errors in energy bills

    International Nuclear Information System (INIS)

    Kop, L.

    2001-01-01

    On request, the Dutch Association for Energy, Environment and Water (VEMW) checks the energy bills of its customers. It appeared that in the year 2000 many small, but also big, errors were discovered in the bills of 42 businesses

  9. Medical Errors Reduction Initiative

    National Research Council Canada - National Science Library

    Mutter, Michael L

    2005-01-01

    The Valley Hospital of Ridgewood, New Jersey, is proposing to extend a limited but highly successful specimen management and medication administration medical errors reduction initiative on a hospital-wide basis...

  10. The surveillance error grid.

    Science.gov (United States)

    Klonoff, David C; Lias, Courtney; Vigersky, Robert; Clarke, William; Parkes, Joan Lee; Sacks, David B; Kirkman, M Sue; Kovatchev, Boris

    2014-07-01

    Currently used error grids for assessing clinical accuracy of blood glucose monitors are based on out-of-date medical practices. Error grids have not been widely embraced by regulatory agencies for clearance of monitors, but this type of tool could be useful for surveillance of the performance of cleared products. Diabetes Technology Society together with representatives from the Food and Drug Administration, the American Diabetes Association, the Endocrine Society, and the Association for the Advancement of Medical Instrumentation, and representatives of academia, industry, and government, have developed a new error grid, called the surveillance error grid (SEG) as a tool to assess the degree of clinical risk from inaccurate blood glucose (BG) monitors. A total of 206 diabetes clinicians were surveyed about the clinical risk of errors of measured BG levels by a monitor. The impact of such errors on 4 patient scenarios was surveyed. Each monitor/reference data pair was scored and color-coded on a graph per its average risk rating. Using modeled data representative of the accuracy of contemporary meters, the relationships between clinical risk and monitor error were calculated for the Clarke error grid (CEG), Parkes error grid (PEG), and SEG. SEG action boundaries were consistent across scenarios, regardless of whether the patient was type 1 or type 2 or using insulin or not. No significant differences were noted between responses of adult/pediatric or 4 types of clinicians. Although small specific differences in risk boundaries between US and non-US clinicians were noted, the panel felt they did not justify separate grids for these 2 types of clinicians. The data points of the SEG were classified in 15 zones according to their assigned level of risk, which allowed for comparisons with the classic CEG and PEG. Modeled glucose monitor data with realistic self-monitoring of blood glucose errors derived from meter testing experiments plotted on the SEG when compared to

  11. Can conditional cash transfer programs generate equality of opportunity in highly unequal societies? Evidence from Brazil

    Directory of Open Access Journals (Sweden)

    Simone Bohn

    2014-09-01

    Full Text Available This article examines whether the state, through conditional cash transfer (CCT) programs, can reduce poverty and extreme poverty in societies marred by high levels of income concentration. We focus on one of the most unequal countries in the globe, Brazil, and analyze the extent to which the country's CCT program, Bolsa Família (BF, Family Grant), is able to improve the life chances of extremely poor beneficiaries through the program's three major goals: first, to immediately end hunger; second, to create basic social rights related to healthcare and education; and finally, considering also complementary policies, to integrate adults into the job market. The analysis relies on a quantitative survey of 4,000 beneficiaries and a qualitative survey comprising in-depth interviews with 38 participants from all regions of the country in 2008; the study thus covers the first five years of the BF program. To answer the research questions, we ran four probit analyses on: (a) the determinants of the realization of prenatal care; (b) the determinants of food security among BF beneficiaries; (c) the determinants that adult BF recipients will return to school; and (d) the determinants that a BF beneficiary will obtain a job. Important results include the following. First, those who before their participation in the BF program were at the margins have been able to access healthcare services on a more regular basis. Women who were systematically excluded (black women, poorly educated women, and women from the North) now, after their participation in the CCT program, have more access to prenatal care and can count on greater availability of the public healthcare network. Second, before entering the Bolsa Família program, 50.3% of the participants faced severe food insecurity; this number went down to 36.8% within five years. Men are more likely than women; non-blacks more likely than blacks; and South and Centre

  12. Design for Error Tolerance

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1983-01-01

    An important aspect of the optimal design of computer-based operator support systems is the sensitivity of such systems to operator errors. The author discusses how a system might allow for human variability with the use of reversibility and observability.

  13. Applying a new unequally weighted feature fusion method to improve CAD performance of classifying breast lesions

    Science.gov (United States)

    Zargari Khuzani, Abolfazl; Danala, Gopichandh; Heidari, Morteza; Du, Yue; Mashhadi, Najmeh; Qiu, Yuchen; Zheng, Bin

    2018-02-01

    Higher recall rates are a major challenge in mammography screening. Thus, developing a computer-aided diagnosis (CAD) scheme to classify between malignant and benign breast lesions can play an important role in improving the efficacy of mammography screening. The objective of this study is to develop and test a unique image feature fusion framework to improve performance in classifying suspicious mass-like breast lesions depicted on mammograms. The image dataset consists of 302 suspicious masses detected on both craniocaudal and mediolateral-oblique view images. Among them, 151 were malignant and 151 were benign. The study consists of the following three image processing and feature analysis steps. First, an adaptive region growing segmentation algorithm was used to automatically segment mass regions. Second, a set of 70 image features related to the spatial and frequency characteristics of the mass regions was computed. Third, a generalized linear regression model (GLM) based machine learning classifier, combined with a bat optimization algorithm, was used to optimally fuse the selected image features based on a predefined performance assessment index. The area under the ROC curve (AUC) was used as the performance assessment index. Applying the CAD scheme to the testing dataset, the AUC was 0.75+/-0.04, which was significantly higher than using a single best feature (AUC=0.69+/-0.05) or the classifier with equally weighted features (AUC=0.73+/-0.05). This study demonstrated that, compared to the conventional equal-weighted approach, an unequal-weighted feature fusion approach has the potential to significantly improve accuracy in classifying between malignant and benign breast masses.
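
    A schematic sketch of unequal-weighted linear feature fusion with AUC as the selection criterion, using synthetic data. A plain random search stands in for the bat optimization algorithm, and the rank-sum AUC helper assumes untied continuous scores; none of this reproduces the paper's actual GLM pipeline.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(302, 70))                  # 302 masses x 70 features (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=302)) > 0   # synthetic labels

def auc(scores, labels):
    """Mann-Whitney rank-sum AUC (assumes no tied scores)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels.astype(bool)
    n1, n0 = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

# Random search over per-feature weights, standing in for bat optimization.
best_w, best_auc = None, 0.0
for _ in range(500):
    w = rng.uniform(-1, 1, size=70)
    a = auc(X @ w, y)                           # fused score = unequal-weighted sum
    if a > best_auc:
        best_w, best_auc = w, a
print("best fused AUC:", round(best_auc, 3))
```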

  14. Climate change and unequal phenological changes across four trophic levels: constraints or adaptations?

    Science.gov (United States)

    Both, Christiaan; van Asch, Margriet; Bijlsma, Rob G; van den Burg, Arnold B; Visser, Marcel E

    2009-01-01

    1. Climate change has been shown to affect the phenology of many organisms, but interestingly these shifts are often unequal across trophic levels, causing a mismatch between the phenology of organisms and their food. 2. We consider two alternative hypotheses: consumers are constrained to adjust sufficiently to the lower trophic level, or prey species react more strongly than their predators to reduce predation. We discuss both hypotheses with our analyses of changes in phenology across four trophic levels: tree budburst, peak biomass of herbivorous caterpillars, breeding phenology of four insectivorous bird species and an avian predator. 3. In our long-term study, we show that between 1988 and 2005, budburst advanced (not significantly) by 0.17 d yr^-1, while between 1985 and 2005 both caterpillars (0.75 d yr^-1) and the hatching dates of the passerine species (range for four species: 0.36-0.50 d yr^-1) advanced, whereas raptor hatching dates showed no trend. 4. The caterpillar peak date was closely correlated with budburst date, as were the passerine hatching dates with the peak caterpillar biomass date. In all these cases, however, the slopes were significantly less than unity, showing that the response of the consumers is weaker than that of their food. This was also true for the avian predator, for which hatching dates were not correlated with the peak availability of fledgling passerines. As a result, the match between food demand and availability deteriorated over time for both the passerines and the avian predators. 5. These results could equally well be explained by consumers' insufficient responses as a consequence of constraints in adapting to climate change, or by them trying to escape predation from a higher trophic level, or both. Selection on phenology could thus come both from matches of phenology with higher and lower levels, and quantifying these can shed new light on why some organisms do adjust their phenology to climate change, while

  15. Apologies and Medical Error

    Science.gov (United States)

    2008-01-01

    One way in which physicians can respond to a medical error is to apologize. Apologies—statements that acknowledge an error and its consequences, take responsibility, and communicate regret for having caused harm—can decrease blame, decrease anger, increase trust, and improve relationships. Importantly, apologies also have the potential to decrease the risk of a medical malpractice lawsuit and can help settle claims by patients. Patients indicate they want and expect explanations and apologies after medical errors and physicians indicate they want to apologize. However, in practice, physicians tend to provide minimal information to patients after medical errors and infrequently offer complete apologies. Although fears about potential litigation are the most commonly cited barrier to apologizing after medical error, the link between litigation risk and the practice of disclosure and apology is tenuous. Other barriers might include the culture of medicine and the inherent psychological difficulties in facing one’s mistakes and apologizing for them. Despite these barriers, incorporating apology into conversations between physicians and patients can address the needs of both parties and can play a role in the effective resolution of disputes related to medical error. PMID:18972177

  16. Thermodynamics of Error Correction

    Directory of Open Access Journals (Sweden)

    Pablo Sartori

    2015-12-01

    Full Text Available Information processing at the molecular scale is limited by thermal fluctuations. This can cause undesired consequences in copying information since thermal noise can lead to errors that can compromise the functionality of the copy. For example, a high error rate during DNA duplication can lead to cell death. Given the importance of accurate copying at the molecular scale, it is fundamental to understand its thermodynamic features. In this paper, we derive a universal expression for the copy error as a function of entropy production and work dissipated by the system during wrong incorporations. Its derivation is based on the second law of thermodynamics; hence, its validity is independent of the details of the molecular machinery, be it any polymerase or artificial copying device. Using this expression, we find that information can be copied in three different regimes. In two of them, work is dissipated to either increase or decrease the error. In the third regime, the protocol extracts work while correcting errors, reminiscent of a Maxwell demon. As a case study, we apply our framework to study a copy protocol assisted by kinetic proofreading, and show that it can operate in any of these three regimes. We finally show that, for any effective proofreading scheme, error reduction is limited by the chemical driving of the proofreading reaction.
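    A universal expression of the kind the record describes is not reproduced here, but the classical kinetic-proofreading limit that the case study builds on can be sketched as background; the notation below (discrimination free energy Δ, n driven proofreading rounds) is standard textbook material assumed for this sketch, not the paper's own formula.

    ```latex
    % Background sketch only -- not the paper's derivation.
    % Hopfield-style kinetic proofreading: a single near-equilibrium
    % selection step cannot beat the Boltzmann factor of the
    % discrimination free energy \Delta; each strongly driven
    % proofreading round multiplies the error by roughly that factor.
    \begin{align*}
      \eta_0 &= e^{-\Delta / k_B T}
        && \text{minimum error of a single selection step,} \\
      \eta_n &\sim \eta_0^{\,n+1}
        && \text{error after $n$ driven proofreading rounds.}
    \end{align*}
    ```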

  17. Error propagation analysis for a sensor system

    International Nuclear Information System (INIS)

    Yeater, M.L.; Hockenbury, R.W.; Hawkins, J.; Wilkinson, J.

    1976-01-01

    As part of a program to develop reliability methods for operational use with reactor sensors and protective systems, error propagation analyses are being made for each model. An example is a sensor system computer simulation model, in which the sensor system signature is convoluted with a reactor signature to show the effect of each in revealing or obscuring information contained in the other. The error propagation analysis models the system and signature uncertainties and sensitivities, whereas the simulation models the signatures and, by extensive repetitions, reveals the effect of errors in various reactor input or sensor response data. In the approach for the example presented, the errors accumulated by the signature (a set of "noise" frequencies) are successively calculated as it is propagated stepwise through a system comprised of sensor and signal processing components. Additional modeling steps include a Fourier transform calculation to produce the usual power spectral density representation of the product signature, and some form of pattern recognition algorithm
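    The final modeling step described above, a Fourier transform to a power spectral density, is easy to make concrete. The sketch below is illustrative only: the sampling rate, the two noise lines, and the first-order lag standing in for the sensor and signal-processing components are all assumptions, not values from the report.

    ```python
    import numpy as np

    # Illustrative sketch: a "signature" of noise frequencies is passed
    # through a first-order sensor response, then turned into a power
    # spectral density via the FFT (periodogram estimate).
    fs = 1000.0                      # sampling rate, Hz (assumed)
    t = np.arange(0, 4.0, 1.0 / fs)
    rng = np.random.default_rng(0)

    # Reactor signature: two noise lines plus broadband noise (assumed).
    x = (np.sin(2 * np.pi * 25 * t)
         + 0.5 * np.sin(2 * np.pi * 60 * t)
         + 0.2 * rng.standard_normal(t.size))

    # Sensor response: first-order lag with a 10 ms time constant,
    # applied as a simple discrete low-pass filter.
    tau, dt = 0.010, 1.0 / fs
    alpha = dt / (tau + dt)
    y = np.empty_like(x)
    y[0] = x[0]
    for i in range(1, x.size):
        y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])

    # Periodogram: |FFT|^2 scaled to a one-sided PSD.
    Y = np.fft.rfft(y)
    psd = (np.abs(Y) ** 2) / (fs * y.size)
    psd[1:-1] *= 2
    freqs = np.fft.rfftfreq(y.size, dt)
    print(freqs[np.argmax(psd[1:]) + 1])  # dominant noise line, ~25 Hz
    ```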

  18. Learning from Errors

    Directory of Open Access Journals (Sweden)

    MA. Lendita Kryeziu

    2015-06-01

    Full Text Available “Errare humanum est”, a well known and widespread Latin proverb which states that to err is human, and that people make mistakes all the time. However, what counts is that people must learn from mistakes. On these grounds Steve Jobs stated: “Sometimes when you innovate, you make mistakes. It is best to admit them quickly, and get on with improving your other innovations.” Similarly, in learning a new language, learners make mistakes; thus it is important to accept them, learn from them, discover the reason why they make them, improve and move on. The significance of studying errors is described by Corder as: “There have always been two justifications proposed for the study of learners' errors: the pedagogical justification, namely that a good understanding of the nature of error is necessary before a systematic means of eradicating them could be found, and the theoretical justification, which claims that a study of learners' errors is part of the systematic study of the learners' language which is itself necessary to an understanding of the process of second language acquisition” (Corder, 1982: 1). Thus the aim of this paper is to analyze errors in the process of second language acquisition and the way we teachers can benefit from mistakes to help students improve themselves while giving the proper feedback.

  19. Compact disk error measurements

    Science.gov (United States)

    Howe, D.; Harriman, K.; Tehranchi, B.

    1993-01-01

    The objectives of this project are as follows: provide hardware and software that will perform simple, real-time, high resolution (single-byte) measurement of the error burst and good data gap statistics seen by a photoCD player read channel when recorded CD write-once discs of variable quality (i.e., condition) are being read; extend the above system to enable measurement of the hard decision (i.e., 1-bit error flags) and soft decision (i.e., 2-bit error flags) decoding information that is produced/used by the Cross-Interleaved Reed-Solomon Code (CIRC) block decoder employed in the photoCD player read channel; construct a model that uses data obtained via the systems described above to produce meaningful estimates of output error rates (due to both uncorrected ECC words and misdecoded ECC words) when a CD disc having specific (measured) error statistics is read (completion date to be determined); and check the hypothesis that current adaptive CIRC block decoders are optimized for pressed (DAD/ROM) CD discs. If warranted, do a conceptual design of an adaptive CIRC decoder that is optimized for write-once CD discs.
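    The core measurement described above, error-burst and good-data-gap statistics at single-byte resolution, amounts to run-length statistics over a per-byte error-flag stream. A minimal sketch, with a made-up flag sequence standing in for the player's read-channel output:

    ```python
    from itertools import groupby

    def burst_gap_stats(error_flags):
        """Run-length statistics over a per-byte error indicator stream:
        lengths of consecutive errored bytes (bursts) and of consecutive
        good bytes (gaps)."""
        bursts, gaps = [], []
        for flag, run in groupby(error_flags):
            (bursts if flag else gaps).append(sum(1 for _ in run))
        return bursts, gaps

    # Toy stream: 1 = errored byte, 0 = good byte (illustrative only).
    flags = [0, 0, 1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1]
    bursts, gaps = burst_gap_stats(flags)
    print(bursts)  # [3, 1, 2]
    print(gaps)    # [2, 4, 2]
    ```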

  20. Errors in Neonatology

    Directory of Open Access Journals (Sweden)

    Antonio Boldrini

    2013-06-01

    Full Text Available Introduction: Danger and errors are inherent in human activities. In medical practice errors can lead to adverse events for patients. Mass media echo the whole scenario. Methods: We reviewed recently published papers in the PubMed database to focus on the evidence and management of errors in medical practice in general and in Neonatology in particular. We compared the results of the literature with our specific experience in the Nina Simulation Centre (Pisa, Italy). Results: In Neonatology the main error domains are: medication and total parenteral nutrition, resuscitation and respiratory care, invasive procedures, nosocomial infections, patient identification, and diagnostics. Risk factors include patients’ size, prematurity, vulnerability and underlying disease conditions, but also multidisciplinary teams, working conditions that promote fatigue, and the large variety of treatment and investigative modalities needed. Discussion and Conclusions: In our opinion, it is hardly possible to change human beings, but it is possible to change the conditions under which they work. Voluntary error reporting systems can help in preventing adverse events. Education and re-training by means of simulation can be an effective strategy too. In Pisa (Italy), Nina (ceNtro di FormazIone e SimulazioNe NeonAtale) is a simulation center that offers the possibility of continuous retraining in technical and non-technical skills to optimize neonatological care strategies. Furthermore, we have been working on a novel skill trainer for mechanical ventilation (MEchatronic REspiratory System SImulator for Neonatal Applications, MERESSINA). Finally, in our opinion, national health policy indirectly influences the risk of errors. Proceedings of the 9th International Workshop on Neonatology · Cagliari (Italy) · October 23rd-26th, 2013 · Learned lessons, changing practice and cutting-edge research

  1. LIBERTARISMO & ERROR CATEGORIAL

    Directory of Open Access Journals (Sweden)

    Carlos G. Patarroyo G.

    2009-01-01

    Full Text Available This article offers a defense of libertarianism against two accusations according to which it commits a category error. To this end, Gilbert Ryle's philosophy is used as a tool to explain the reasons behind these accusations and to show why, although certain versions of libertarianism that appeal to agent causation or to Cartesian dualism do commit these errors, a libertarianism that seeks the basis for the possibility of human freedom in physicalist indeterminism cannot necessarily be accused of incurring them.

  2. Libertarismo & Error Categorial

    OpenAIRE

    PATARROYO G, CARLOS G

    2009-01-01

    This article offers a defense of libertarianism against two accusations according to which it commits a category error. To this end, Gilbert Ryle's philosophy is used as a tool to explain the reasons behind these accusations and to show why, although certain versions of libertarianism that appeal to agent causation or to Cartesian dualism do commit these errors, a libertarianism that seeks in physicalist indeterminism the basis for the possibili...

  3. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  4. MEDICAL ERROR: CIVIL AND LEGAL ASPECT.

    Science.gov (United States)

    Buletsa, S; Drozd, O; Yunin, O; Mohilevskyi, L

    2018-03-01

    The scientific article is focused on research into the notion of medical error; the medical and legal aspects of this notion have been considered. The necessity of legislative consolidation of the notion of «medical error» and criteria for its legal estimation have been grounded. In the process of writing the article, we used the empirical method, general scientific methods and comparative legal methods. A comparison of the concept of medical error in its civil and legal aspects was made from the point of view of Ukrainian, European and American scientists. It has been noted that the problem of medical errors has been known since ancient times, and across the whole world, regardless of the level of development of medicine, there is no country where doctors never make errors. According to statistics, medical errors are among the top five causes of death worldwide. At the same time, the provision of medical services concerns practically all people. As a person and his life and health are acknowledged in Ukraine as the highest social value, medical services must be high-quality and effective. The provision of poor-quality medical services causes harm to health, and sometimes to the lives of people; it may result in injury or even death. The right to health protection is one of the fundamental human rights assured by the Constitution of Ukraine; therefore the issue of medical errors and liability for them is extremely relevant. The authors conclude that the definition of the notion of «medical error» must receive legal consolidation. Besides, the legal estimation of medical errors must be based on single principles enshrined in the legislation and confirmed by judicial practice.

  5. Error Correcting Codes

    Indian Academy of Sciences (India)

    Science and Automation at ... the Reed-Solomon code contained 223 bytes of data, (a byte ... then you have a data storage system with error correction, that ..... practical codes, storing such a table is infeasible, as it is generally too large.

  6. Error Correcting Codes

    Indian Academy of Sciences (India)

    Error Correcting Codes - Reed Solomon Codes. Priti Shankar. Series Article, Resonance – Journal of Science Education, Volume 2, Issue 3, March ... Author Affiliations: Priti Shankar, Department of Computer Science and Automation, Indian Institute of Science, Bangalore 560 012, India ...

  7. Challenge and Error: Critical Events and Attention-Related Errors

    Science.gov (United States)

    Cheyne, James Allan; Carriere, Jonathan S. A.; Solman, Grayden J. F.; Smilek, Daniel

    2011-01-01

    Attention lapses resulting from reactivity to task challenges and their consequences constitute a pervasive factor affecting everyday performance errors and accidents. A bidirectional model of attention lapses (error ↔ attention lapse; Cheyne, Solman, Carriere, & Smilek, 2009) argues that errors beget errors by generating attention…

  8. 19 CFR 173.1 - Authority to review for error.

    Science.gov (United States)

    2010-04-01

    19 CFR 173.1 (2010): Authority to review for error. Customs Duties; U.S. Customs and Border Protection, Department of Homeland Security; Department of the Treasury (continued); Administrative Review in General; § 173.1 Authority to review for error. Port directors...

  9. Team errors: definition and taxonomy

    International Nuclear Information System (INIS)

    Sasou, Kunihide; Reason, James

    1999-01-01

    In error analysis or error management, the focus is usually upon individuals who have made errors. In large complex systems, however, most people work in teams or groups. Considering this working environment, insufficient emphasis has been given to 'team errors'. This paper discusses the definition of team errors and their taxonomy. These notions are also applied to events that have occurred in the nuclear power industry, aviation industry and shipping industry. The paper also discusses the relations between team errors and Performance Shaping Factors (PSFs). As a result, the proposed definition and taxonomy are found to be useful in categorizing team errors. The analysis also reveals that deficiencies in communication, deficiencies in resource/task management, an excessive authority gradient, and excessive professional courtesy will cause team errors. Handling human errors as team errors provides an opportunity to reduce human errors

  10. Mixing at double-Tee junctions with unequal pipe sizes in water distribution systems

    Data.gov (United States)

    U.S. Environmental Protection Agency — Pipe flow mixing with various solute concentrations and flow rates at pipe junctions is investigated. The degree of mixing affects the spread of contaminants in a...

  11. Imagery of Errors in Typing

    Science.gov (United States)

    Rieger, Martina; Martinez, Fanny; Wenke, Dorit

    2011-01-01

    Using a typing task we investigated whether insufficient imagination of errors and error corrections is related to duration differences between execution and imagination. In Experiment 1 spontaneous error imagination was investigated, whereas in Experiment 2 participants were specifically instructed to imagine errors. Further, in Experiment 2 we…

  12. Correction of refractive errors

    Directory of Open Access Journals (Sweden)

    Vladimir Pfeifer

    2005-10-01

    Full Text Available Background: Spectacles and contact lenses are the most frequently used, the safest and the cheapest way to correct refractive errors. The development of keratorefractive surgery has brought new opportunities for correction of refractive errors in patients who need to be less dependent on spectacles or contact lenses. Until recently, RK was the most commonly performed refractive procedure for nearsighted patients. Conclusions: The introduction of the excimer laser in refractive surgery has given new opportunities for remodelling the cornea. The laser energy can be delivered on the stromal surface as in PRK, or deeper in the corneal stroma by means of lamellar surgery. In LASIK the flap is created with a microkeratome, in LASEK with ethanol, and in epi-LASIK the ultra-thin flap is created mechanically.

  13. Error-Free Software

    Science.gov (United States)

    1989-01-01

    001 is an integrated tool suite for automatically developing ultra-reliable models, simulations and software systems. Developed and marketed by Hamilton Technologies, Inc. (HTI), it has been applied in engineering, manufacturing, banking and software tools development. The software provides the ability to simplify the complex. A system developed with 001 can be a prototype or fully developed with production-quality code. It is free of interface errors, consistent, logically complete and has no data or control flow errors. Systems can be designed, developed and maintained with maximum productivity. Margaret Hamilton, President of Hamilton Technologies, also directed the research and development of USE.IT, an earlier product which was the first computer-aided software engineering product in the industry to concentrate on automatically supporting the development of an ultrareliable system throughout its life cycle. Both products originated in NASA technology developed under a Johnson Space Center contract.

  14. Minimum Tracking Error Volatility

    OpenAIRE

    Luca RICCETTI

    2010-01-01

    Investors assign part of their funds to asset managers that are given the task of beating a benchmark. The risk management department usually imposes a maximum value of the tracking error volatility (TEV) in order to keep the risk of the portfolio near to that of the selected benchmark. However, risk management does not establish a rule on TEV which enables us to understand whether the asset manager is really active or not and, in practice, asset managers sometimes follow passively the corres...

  15. Error-correction coding

    Science.gov (United States)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  16. Satellite Photometric Error Determination

    Science.gov (United States)

    2015-10-18

    Satellite Photometric Error Determination. Tamara E. Payne, Philip J. Castro, Stephen A. Gregory (Applied Optimization, 714 East Monument Ave, Suite...). ...advocate the adoption of new techniques based on in-frame photometric calibrations enabled by newly available all-sky star catalogs that contain highly... filter systems will likely be supplanted by the Sloan-based filter systems. The Johnson photometric system is a set of filters in the optical

  17. Video Error Correction Using Steganography

    Science.gov (United States)

    Robie, David L.; Mersereau, Russell M.

    2002-12-01

    The transmission of any data is always subject to corruption due to errors, but video transmission, because of its real-time nature, must deal with these errors without retransmission of the corrupted data. The errors can be handled using forward error correction in the encoder or error concealment techniques in the decoder. This MPEG-2 compliant codec uses data hiding to transmit error correction information and several error concealment techniques in the decoder. The decoder resynchronizes more quickly with fewer errors than traditional resynchronization techniques. It also allows for perfect recovery of differentially encoded DCT-DC components and motion vectors. This provides for a much higher quality picture in an error-prone environment while creating an almost imperceptible degradation of the picture in an error-free environment.

  18. Video Error Correction Using Steganography

    Directory of Open Access Journals (Sweden)

    Robie David L

    2002-01-01

    Full Text Available The transmission of any data is always subject to corruption due to errors, but video transmission, because of its real-time nature, must deal with these errors without retransmission of the corrupted data. The errors can be handled using forward error correction in the encoder or error concealment techniques in the decoder. This MPEG-2 compliant codec uses data hiding to transmit error correction information and several error concealment techniques in the decoder. The decoder resynchronizes more quickly with fewer errors than traditional resynchronization techniques. It also allows for perfect recovery of differentially encoded DCT-DC components and motion vectors. This provides for a much higher quality picture in an error-prone environment while creating an almost imperceptible degradation of the picture in an error-free environment.

  19. Experimental quantum error correction with high fidelity

    International Nuclear Information System (INIS)

    Zhang Jingfu; Gangloff, Dorian; Moussa, Osama; Laflamme, Raymond

    2011-01-01

    More than ten years ago a first step toward quantum error correction (QEC) was implemented [Phys. Rev. Lett. 81, 2152 (1998)]. The work showed there was sufficient control in nuclear magnetic resonance to implement QEC, and demonstrated that the error rate changed from ε to ∼ε². In the current work we reproduce a similar experiment using control techniques that have since been developed, such as the pulses generated by the gradient ascent pulse engineering algorithm. We show that the fidelity of the QEC gate sequence and the comparative advantage of QEC are appreciably improved. This advantage is maintained despite the errors introduced by the additional operations needed to protect the quantum states.
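    The quoted improvement from ε to ∼ε² is the generic signature of a distance-3 code. The toy classical simulation below (not the NMR experiment) shows the same scaling for the 3-bit repetition code under independent bit flips:

    ```python
    import numpy as np

    # Distance-3 repetition code: a logical error needs >= 2 physical
    # flips, so the logical error rate scales as ~3*eps^2 for small eps.
    rng = np.random.default_rng(1)

    def logical_error_rate(eps, trials=200_000):
        flips = rng.random((trials, 3)) < eps      # independent bit flips
        majority_wrong = flips.sum(axis=1) >= 2    # majority vote fails
        return majority_wrong.mean()

    for eps in (0.1, 0.05, 0.01):
        # empirical rate vs. exact value 3*eps^2 - 2*eps^3
        print(eps, logical_error_rate(eps), 3 * eps**2 - 2 * eps**3)
    ```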

  20. A Slicing Tree Representation and QCP-Model-Based Heuristic Algorithm for the Unequal-Area Block Facility Layout Problem

    Directory of Open Access Journals (Sweden)

    Mei-Shiang Chang

    2013-01-01

    Full Text Available The facility layout problem is a typical combinatorial optimization problem. In this research, a slicing tree representation and a quadratically constrained program model are combined with harmony search to develop a heuristic method for solving the unequal-area block layout problem. Because of the characteristics of the slicing tree structure, we propose a regional structure of harmony memory to memorize facility layout solutions and two kinds of harmony improvisation to enhance the global search ability of the proposed heuristic method. The proposed harmony-search-based heuristic is tested on 10 well-known unequal-area facility layout problems from the literature. The results are compared with the previously best-known solutions obtained by genetic algorithm, tabu search, and ant system as well as exact methods. For problems O7, O9, vC10Ra, M11*, and Nug12, new best solutions are found. For other problems, the proposed approach can find solutions that are very similar to previous best-known solutions.
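    Setting the slicing-tree encoding and QCP model aside, the harmony-search core that such heuristics build on can be sketched generically. The sphere objective and every parameter value below are placeholders, not the paper's:

    ```python
    import random

    def harmony_search(obj, dim, bounds, hms=10, hmcr=0.9, par=0.3,
                       bw=0.05, iters=5000, seed=0):
        """Generic harmony-search minimizer (illustrative parameters).
        hms: harmony memory size; hmcr: memory-consideration rate;
        par: pitch-adjustment rate; bw: pitch-adjustment bandwidth."""
        rng = random.Random(seed)
        lo, hi = bounds
        hm = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
        scores = [obj(h) for h in hm]
        for _ in range(iters):
            new = []
            for d in range(dim):
                if rng.random() < hmcr:              # draw from memory
                    v = hm[rng.randrange(hms)][d]
                    if rng.random() < par:           # pitch adjustment
                        v += rng.uniform(-bw, bw) * (hi - lo)
                else:                                # random improvisation
                    v = rng.uniform(lo, hi)
                new.append(min(hi, max(lo, v)))
            s = obj(new)
            worst = max(range(hms), key=scores.__getitem__)
            if s < scores[worst]:                    # replace worst harmony
                hm[worst], scores[worst] = new, s
        best = min(range(hms), key=scores.__getitem__)
        return hm[best], scores[best]

    # Toy objective: the sphere function stands in for the layout cost.
    sol, val = harmony_search(lambda x: sum(v * v for v in x), dim=5,
                              bounds=(-5.0, 5.0))
    print(round(val, 6))
    ```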

  1. Error-related brain activity and error awareness in an error classification paradigm.

    Science.gov (United States)

    Di Gregorio, Francesco; Steinhauser, Marco; Maier, Martin E

    2016-10-01

    Error-related brain activity has been linked to error detection enabling adaptive behavioral adjustments. However, it is still unclear what role error awareness plays in this process. Here, we show that the error-related negativity (Ne/ERN), an event-related potential reflecting early error monitoring, is dissociable from the degree of error awareness. Participants responded to a target while ignoring two different incongruent distractors. After responding, they indicated whether they had committed an error, and if so, whether they had responded to one or to the other distractor. This error classification paradigm allowed distinguishing partially aware errors (i.e., errors that were noticed but misclassified) and fully aware errors (i.e., errors that were correctly classified). The Ne/ERN was larger for partially aware errors than for fully aware errors. Whereas this speaks against the idea that the Ne/ERN foreshadows the degree of error awareness, it confirms the prediction of a computational model, which relates the Ne/ERN to post-response conflict. This model predicts that stronger distractor processing - a prerequisite of error classification in our paradigm - leads to lower post-response conflict and thus a smaller Ne/ERN. This implies that the relationship between Ne/ERN and error awareness depends on how error awareness is related to response conflict in a specific task. Our results further indicate that the Ne/ERN but not the degree of error awareness determines adaptive performance adjustments. Taken together, we conclude that the Ne/ERN is dissociable from error awareness and foreshadows adaptive performance adjustments. Our results suggest that the relationship between the Ne/ERN and error awareness is correlative and mediated by response conflict. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. A Public Choice Approach to the Unequal Treatment of Securities Market Participants and Home Borrowers

    Directory of Open Access Journals (Sweden)

    Jonathan Macey

    2017-01-01

    Full Text Available This article contrasts the protections provided to participants in U.S. securities markets with the protections provided to participants in the U.S. mortgage markets. Participants in securities markets purchase and sell equity and debt securities. Participants in the mortgage markets borrow money to buy homes, using those homes as collateral for the mortgage loans they receive. Even after Dodd-Frank, participants in securities markets are afforded significantly higher levels of protection than participants in mortgage markets. The doctrine of suitability is a prime example of this inequity. Exploring possible explanations for this odd asymmetry of treatment, I conclude that interest group politics is to blame for the anomaly.

  3. Diagnostic errors in pediatric radiology

    International Nuclear Information System (INIS)

    Taylor, George A.; Voss, Stephan D.; Melvin, Patrice R.; Graham, Dionne A.

    2011-01-01

    Little information is known about the frequency, types and causes of diagnostic errors in imaging children. Our goals were to describe the patterns and potential etiologies of diagnostic error in our subspecialty. We reviewed 265 cases with clinically significant diagnostic errors identified during a 10-year period. Errors were defined as a diagnosis that was delayed, wrong or missed; they were classified as perceptual, cognitive, system-related or unavoidable; and they were evaluated by imaging modality and level of training of the physician involved. We identified 484 specific errors in the 265 cases reviewed (mean: 1.8 errors/case). Most discrepancies involved staff (45.5%). Two hundred fifty-eight individual cognitive errors were identified in 151 cases (mean = 1.7 errors/case). Of these, 83 cases (55%) had additional perceptual or system-related errors. One hundred sixty-five perceptual errors were identified in 165 cases. Of these, 68 cases (41%) also had cognitive or system-related errors. Fifty-four system-related errors were identified in 46 cases (mean = 1.2 errors/case), all of which were multi-factorial. Seven cases were unavoidable. Our study defines a taxonomy of diagnostic errors in a large academic pediatric radiology practice and suggests that most are multi-factorial in etiology. Further study is needed to define effective strategies for improvement. (orig.)

  4. Minimum Error Entropy Classification

    CERN Document Server

    Marques de Sá, Joaquim P; Santos, Jorge M F; Alexandre, Luís A

    2013-01-01

    This book explains the minimum error entropy (MEE) concept applied to data classification machines. Theoretical results on the inner workings of the MEE concept, in its application to solving a variety of classification problems, are presented in the wider realm of risk functionals. Researchers and practitioners will also find in the book a detailed presentation of practical data classifiers using MEE. These include multi-layer perceptrons, recurrent neural networks, complex-valued neural networks, modular neural networks, and decision trees. A clustering algorithm using an MEE-like concept is also presented. Examples, tests, evaluation experiments and comparisons with similar machines using classic approaches complement the descriptions.

  5. TO THE SOLUTION OF PROBLEMS ABOUT THE RAILWAYS CALCULATION FOR STRENGTH TAKING INTO ACCOUNT UNEQUAL ELASTICITY OF THE SUBRAIL BASE

    Directory of Open Access Journals (Sweden)

    D. M. Kurhan

    2014-11-01

    Full Text Available Purpose. The modulus of elasticity of the subrail base is one of the main characteristics for assessing the stress-strain state of a track. The need to consider unequal elasticity of the subrail base in various cases has been addressed repeatedly; however, the results involved rather complex mathematical approaches, and the solutions obtained did not fit within the framework of the standard engineering strength calculation for a railway. The purpose of this work is therefore to obtain such a solution within that framework. Methodology. It is proposed to model the rail as a beam with a distributed loading whose profile corresponds to the value of the modulus of elasticity, giving an equivalent deflection when freely seated on supports. Findings. A method was obtained for accounting for gradual change in the modulus of elasticity of the subrail base by means of a correction coefficient in the engineering strength calculation of the track. An extension of the existing railway strength calculation was developed to account for abrupt changes in the modulus of elasticity of the subrail base (for example, at the transition from ballasted track onto a bridge). The variation of the forces transmitted from the rail to the base was characterized as a function of distance to the bridge on the approach section of ballasted track. The results of the redistribution of forces after a sudden change in the elastic modulus of the base under the rail explain the formation of vertical irregularities before the bridge. Originality. The engineering technique for railway strength calculation was improved to take into account unequal elasticity of the subrail base. Practical value. The results obtained make it possible to carry out engineering calculations to assess the strength of a railway at locations of unequal elasticity caused by the condition of the track or by design features. The solution of the inverse problem on
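    Strength calculations of this kind rest on the classical Winkler beam-on-elastic-foundation model, in which the subrail base enters through its modulus k. The sketch below uses Hetenyi's textbook solution for an infinite beam under a point load, with assumed rail and base parameters (not the paper's values), to show how a stiffer base, e.g. on a bridge, changes the deflection under a wheel load:

    ```python
    import numpy as np

    def winkler_deflection(x, P, EI, k):
        """Deflection of an infinite beam on a Winkler elastic foundation
        under a point load P (classical solution); k is the foundation
        (subrail base) modulus per unit length of track."""
        beta = (k / (4.0 * EI)) ** 0.25
        bx = beta * np.abs(x)
        return (P * beta / (2.0 * k)) * np.exp(-bx) * (np.cos(bx) + np.sin(bx))

    # Illustrative values only: a 60 kg/m rail (EI ~ 6.4 MN*m^2), a soft
    # ballast base vs. a much stiffer bridge base, 100 kN wheel load.
    x = np.linspace(-2.0, 2.0, 9)
    EI, P = 6.4e6, 100e3
    for k in (30e6, 150e6):                  # N/m^2, assumed moduli
        y = winkler_deflection(x, P, EI, k)
        print(f"k={k:.0e}: max deflection {y.max()*1e3:.2f} mm")
    ```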

  6. Standard Errors for Matrix Correlations.

    Science.gov (United States)

    Ogasawara, Haruhiko

    1999-01-01

    Derives the asymptotic standard errors and intercorrelations for several matrix correlations assuming multivariate normality for manifest variables and derives the asymptotic standard errors of the matrix correlations for two factor-loading matrices. (SLD)

  7. Creating illusions of knowledge: learning errors that contradict prior knowledge.

    Science.gov (United States)

    Fazio, Lisa K; Barber, Sarah J; Rajaram, Suparna; Ornstein, Peter A; Marsh, Elizabeth J

    2013-02-01

    Most people know that the Pacific is the largest ocean on Earth and that Edison invented the light bulb. Our question is whether this knowledge is stable, or if people will incorporate errors into their knowledge bases, even if they have the correct knowledge stored in memory. To test this, we asked participants general-knowledge questions 2 weeks before they read stories that contained errors (e.g., "Franklin invented the light bulb"). On a later general-knowledge test, participants reproduced story errors despite previously answering the questions correctly. This misinformation effect was found even for questions that were answered correctly on the initial test with the highest level of confidence. Furthermore, prior knowledge offered no protection against errors entering the knowledge base; the misinformation effect was equivalent for previously known and unknown facts. Errors can enter the knowledge base even when learners have the knowledge necessary to catch the errors. 2013 APA, all rights reserved

  8. Error forecasting schemes of error correction at receiver

    International Nuclear Information System (INIS)

    Bhunia, C.T.

    2007-08-01

    To combat errors in computer communication networks, ARQ (Automatic Repeat Request) techniques are used. Recently, Chakraborty proposed a simple technique called the packet combining scheme, in which errors are corrected at the receiver from the erroneous copies. The Packet Combining (PC) scheme fails (i) when bit error locations in the erroneous copies are the same and (ii) when multiple bit errors occur. Both of these have been addressed recently by two schemes known as the Packet Reversed Packet Combining (PRPC) Scheme and the Modified Packet Combining (MPC) Scheme, respectively. In this letter, two error forecasting correction schemes are reported, which in combination with PRPC offer higher throughput. (author)
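    The packet-combining idea referenced above can be sketched as follows: XOR the two received copies to find the disagreeing bit positions, then search flip patterns at those positions until an integrity check passes. The parity check and framing below are stand-in assumptions, not the letter's protocol:

    ```python
    from itertools import chain, combinations

    def packet_combine(copy1, copy2, check):
        """Toy packet-combining decoder: bits where the two received
        copies disagree are candidate error locations; try flipping
        subsets of them in copy1 until the integrity check passes.
        As noted in the abstract, this fails when both copies err in
        the same position (no disagreement to work with)."""
        diff = [i for i, (a, b) in enumerate(zip(copy1, copy2)) if a != b]
        for subset in chain.from_iterable(
                combinations(diff, r) for r in range(len(diff) + 1)):
            trial = list(copy1)
            for i in subset:
                trial[i] ^= 1
            if check(trial):
                return trial
        return None

    # Stand-in integrity check: even parity over the packet (assumed).
    check = lambda bits: sum(bits) % 2 == 0
    sent = [1, 0, 1, 1, 0, 1, 0, 0]              # even parity
    copy1 = sent[:]; copy1[2] ^= 1               # error at bit 2
    copy2 = sent[:]; copy2[5] ^= 1               # error at bit 5
    print(packet_combine(copy1, copy2, check))   # recovers a valid packet
    ```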

  9. Evaluating a medical error taxonomy.

    OpenAIRE

    Brixey, Juliana; Johnson, Todd R.; Zhang, Jiajie

    2002-01-01

    Healthcare has been slow in using human factors principles to reduce medical errors. The Center for Devices and Radiological Health (CDRH) recognizes that a lack of attention to human factors during product development may lead to errors that have the potential for patient injury, or even death. In response to the need for reducing medication errors, the National Coordinating Council for Medication Errors Reporting and Prevention (NCC MERP) released the NCC MERP taxonomy that provides a stand...

  10. Application of derivative spectrophotometry under orthogonal polynomial at unequal intervals: determination of metronidazole and nystatin in their pharmaceutical mixture.

    Science.gov (United States)

    Korany, Mohamed A; Abdine, Heba H; Ragab, Marwa A A; Aboras, Sara I

    2015-05-15

    This paper discusses a general method for the use of orthogonal polynomials for unequal intervals (OPUI) to eliminate interferences in two-component spectrophotometric analysis. In this paper, a new approach was developed by using the first-derivative (D1) curve instead of the absorbance curve to be convoluted using the OPUI method for the determination of metronidazole (MTR) and nystatin (NYS) in their mixture. After applying derivative treatment to the absorption data, many maxima and minima appear, giving a characteristic shape for each drug and allowing the selection of a different number of points for the OPUI method for each drug. This allows the specific and selective determination of each drug in the presence of the other and in the presence of any matrix interference. The method is particularly useful when the two absorption spectra overlap considerably. The results obtained are encouraging and suggest that the method can be widely applied to similar problems. Copyright © 2015 Elsevier B.V. All rights reserved.
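    Numerically, polynomials orthogonal over an unequally spaced set of wavelengths can be obtained by orthogonalizing the monomials evaluated at those points (QR on the Vandermonde matrix). The wavelengths and derivative values below are placeholders, not the paper's data:

    ```python
    import numpy as np

    def opui_basis(x, degree):
        """Discrete polynomials orthonormal over the (possibly unequally
        spaced) abscissae x, built by QR on the Vandermonde matrix."""
        V = np.vander(np.asarray(x, float), degree + 1, increasing=True)
        Q, _ = np.linalg.qr(V)        # columns: orthonormal on the grid
        return Q

    # Convolution of a (derivative) spectrum with one polynomial column
    # gives the coefficient used for quantitation in an OPUI-style method.
    x = np.array([230.0, 232.5, 236.0, 241.0, 247.5, 255.0])  # assumed nm
    d1 = np.array([0.12, 0.35, 0.50, 0.31, 0.05, -0.20])      # assumed D1
    Q = opui_basis(x, degree=3)
    coeffs = Q.T @ d1                 # projection onto each polynomial
    print(np.round(coeffs, 4))
    ```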

  11. Sample Size Estimation for Negative Binomial Regression Comparing Rates of Recurrent Events with Unequal Follow-Up Time.

    Science.gov (United States)

    Tang, Yongqiang

    2015-01-01

    A sample size formula is derived for negative binomial regression for the analysis of recurrent events, in which subjects can have unequal follow-up time. We obtain sharp lower and upper bounds on the required size, which are easy to compute. The upper bound is generally only slightly larger than the required size, and hence can be used to approximate the sample size. The lower and upper size bounds can be decomposed into two terms. The first term relies on the mean number of events in each group, and the second term depends on two factors that measure, respectively, the extent of between-subject variability in event rates, and follow-up time. Simulation studies are conducted to assess the performance of the proposed method. An application of our formulae to a multiple sclerosis trial is provided.
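    The closed-form bounds themselves are in the paper; as an independent illustration of the setting they cover, power for a negative binomial rate comparison with subject-specific follow-up can be estimated by gamma-Poisson simulation. All rates, dispersion, and follow-up distributions below are assumptions:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)

    def simulate_power(n, rate0=1.0, ratio=0.7, disp=0.8, sims=500):
        """Monte Carlo power for a two-group negative binomial rate
        comparison with unequal follow-up (log follow-up as offset).
        All parameter values are assumed, for illustration only."""
        group = np.repeat([0, 1], n)
        X = sm.add_constant(group)
        hits = 0
        for _ in range(sims):
            fu = rng.uniform(0.5, 2.0, 2 * n)           # unequal follow-up
            mu = rate0 * ratio**group * fu              # subject means
            # Gamma-Poisson mixture == negative binomial counts.
            y = rng.poisson(rng.gamma(1/disp, disp, 2*n) * mu)
            fit = sm.GLM(y, X,
                         family=sm.families.NegativeBinomial(alpha=disp),
                         offset=np.log(fu)).fit()
            z = fit.params[1] / fit.bse[1]              # Wald test, log RR
            hits += abs(z) > 1.96
        return hits / sims

    print(simulate_power(n=150))
    ```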

  12. Sample size for comparing negative binomial rates in noninferiority and equivalence trials with unequal follow-up times.

    Science.gov (United States)

    Tang, Yongqiang

    2017-05-25

    We derive the sample size formulae for comparing two negative binomial rates based on both the relative and absolute rate difference metrics in noninferiority and equivalence trials with unequal follow-up times, and establish an approximate relationship between the sample sizes required for the treatment comparison based on the two treatment effect metrics. The proposed method allows the dispersion parameter to vary by treatment groups. The accuracy of these methods is assessed by simulations. It is demonstrated that ignoring the between-subject variation in the follow-up time by setting the follow-up time for all individuals to be the mean follow-up time may greatly underestimate the required size, resulting in underpowered studies. Methods are provided for back-calculating the dispersion parameter based on the published summary results.

  13. Brewing Unequal Exchanges in Coffee: A Qualitative Investigation into the Consequences of the Java Trade in Rural Uganda

    Directory of Open Access Journals (Sweden)

    Kelly F. Austin

    2017-08-01

    Full Text Available This study presents a qualitative case study examining the broad impacts of coffee cultivation in a rural region of Eastern Uganda, the Bududa District. Over 20 interviews with coffee cultivators provide insights into how the coffee economy impacts gender relations, physical health, deforestation, and economic conditions. While there are some material benefits from cultivating and selling coffee beans, a lack of long-term economic stability for households and the consequences for the status of women, the health of the community, and the local environment call into question the efficacy of coffee production as a viable development scheme that significantly enhances overall community well-being. This research hopes to bring attention to the mechanisms that enable broader unequal exchange relationships by focusing on the perspectives and experiences of growers in Bududa, Uganda, where a considerable amount of world coffee is grown and supplied to consumers in core nations.

  14. Photodouble ionization studies of the Ne(2s²) state under unequal energy sharing conditions

    Energy Technology Data Exchange (ETDEWEB)

    Bolognesi, P [CNR-IMIP, Area della Ricerca di Roma 1, Monterotondo Scalo, Rome (Italy); Kheifets, A [Research School of Physical Sciences and Engineering, Australian National University, Canberra (Australia); Otranto, S [Physics Department, University of Missouri-Rolla, Rolla MO (United States); CONICET and Depto. de Fisica, Universidad Nacional del Sur, 8000 Bahia Blanca (Argentina); Coreno, M [CNR-IMIP, Area della Ricerca di Roma 1, Monterotondo Scalo, Rome (Italy); CNR-TASC, Gas Phase Photoemission Beamline at Elettra, Area Science Park, Trieste (Italy); Feyer, V [CNR-IMIP, Area della Ricerca di Roma 1, Monterotondo Scalo, Rome (Italy); Institute of Electron Physics, National Academy of Sciences, Uzhgorod (Ukraine); Colavecchia, F D [CONICET and Centro Atomico Bariloche, 8400 SC de Bariloche (Argentina); Garibotti, C R [CONICET and Centro Atomico Bariloche, 8400 SC de Bariloche (Argentina); Avaldi, L [CNR-IMIP, Area della Ricerca di Roma 1, Monterotondo Scalo, Rome (Italy); CNR-TASC, Gas Phase Photoemission Beamline at Elettra, Area Science Park, Trieste (Italy)

    2006-04-28

    The triple differential cross section (TDCS) of the He²⁺(1s⁻²) and Ne²⁺(2s⁻²) states has been studied under unequal energy sharing conditions and perpendicular geometry, for a ratio of about 3 between the energies of the two ejected electrons. The dynamical quantities which govern the photodouble ionization (PDI) process, i.e. the squared moduli of the gerade and ungerade complex amplitudes and the cosine of their relative phase, have been extracted from the experimental data. The results from the two targets have been compared between themselves as well as with the theoretical predictions of the SC3 and convergent close coupling (CCC) calculations. This work represents a joint experimental and theoretical approach to the investigation of PDI of atomic systems with more than two electrons.

  15. The sigh of the oppressed: The palliative effects of ideology are stronger for people living in highly unequal neighbourhoods.

    Science.gov (United States)

    Sengupta, Nikhil K; Greaves, Lara M; Osborne, Danny; Sibley, Chris G

    2017-09-01

    Ideologies that legitimize status hierarchies are associated with increased well-being. However, which ideologies have 'palliative effects', why they have these effects, and whether these effects extend to low-status groups remain unresolved issues. This study aimed to address these issues by testing the effects of the ideology of Symbolic Prejudice on well-being among low- and high-status ethnic groups (4,519 Europeans and 1,091 Māori) nested within 1,437 regions in New Zealand. Results showed that Symbolic Prejudice predicted increased well-being for both groups, but that this relationship was stronger for those living in highly unequal neighbourhoods. This suggests that it is precisely those who have the strongest need to justify inequality that accrue the most psychological benefit from subscribing to legitimizing ideologies. © 2017 The British Psychological Society.

  16. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  17. Error Patterns in Problem Solving.

    Science.gov (United States)

    Babbitt, Beatrice C.

    Although many common problem-solving errors within the realm of school mathematics have been previously identified, a compilation of such errors is not readily available within learning disabilities textbooks, mathematics education texts, or teacher's manuals for school mathematics texts. Using data on error frequencies drawn from both the Fourth…

  18. Performance, postmodernity and errors

    DEFF Research Database (Denmark)

    Harder, Peter

    2013-01-01

    speaker’s competency (note the –y ending!) reflects adaptation to the community langue, including variations. This reversal of perspective also reverses our understanding of the relationship between structure and deviation. In the heyday of structuralism, it was tempting to confuse the invariant system… with the prestige variety, and conflate non-standard variation with parole/performance and class both as erroneous. Nowadays the anti-structural sentiment of present-day linguistics makes it tempting to confuse the rejection of ideal abstract structure with a rejection of any distinction between grammatical… as deviant from the perspective of function-based structure and discuss to what extent the recognition of a community langue as a source of adaptive pressure may throw light on different types of deviation, including language handicaps and learner errors…

  19. Relative efficiency of unequal versus equal cluster sizes in cluster randomized trials using generalized estimating equation models.

    Science.gov (United States)

    Liu, Jingxia; Colditz, Graham A

    2018-05-01

    There is growing interest in conducting cluster randomized trials (CRTs). For simplicity in sample size calculation, the cluster sizes are often assumed to be identical across all clusters. However, equal cluster sizes are not guaranteed in practice. Therefore, the relative efficiency (RE) of unequal versus equal cluster sizes has been investigated when testing the treatment effect. One of the most important approaches to analyzing a set of correlated data is the generalized estimating equation (GEE) approach proposed by Liang and Zeger, in which a "working correlation structure" is introduced and the association pattern depends on a vector of association parameters denoted by ρ. In this paper, we utilize GEE models to test the treatment effect in a two-group comparison for continuous, binary, or count data in CRTs. The variances of the estimator of the treatment effect are derived for the different types of outcome. RE is defined as the ratio of the variance of the estimator of the treatment effect for equal to unequal cluster sizes. We discuss a correlation structure commonly used in CRTs, the exchangeable structure, and derive a simpler formula for the RE with continuous, binary, and count outcomes. Finally, REs are investigated for several scenarios of cluster size distributions through simulation studies. We propose an adjusted sample size to compensate for the efficiency loss. Additionally, we propose an optimal sample size estimation based on the GEE models under a fixed budget for known and unknown association parameter (ρ) in the working correlation structure within the cluster. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
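    Under an exchangeable working correlation with a cluster-level treatment and a continuous outcome, each cluster of size n contributes information roughly proportional to n/(1 + (n-1)ρ), so the RE of unequal versus equal cluster sizes can be computed directly. The sketch below uses this standard form with assumed cluster sizes; it is an illustration, not the paper's exact derivation:

    ```python
    import numpy as np

    def relative_efficiency(sizes, rho):
        """RE of unequal vs. equal cluster sizes for the treatment-effect
        estimator under an exchangeable working correlation: a cluster of
        size n contributes information proportional to n/(1+(n-1)*rho),
        and RE = Var(equal) / Var(unequal) = info(unequal) / info(equal)."""
        sizes = np.asarray(sizes, float)
        info_unequal = np.sum(sizes / (1 + (sizes - 1) * rho))
        nbar = sizes.mean()
        info_equal = sizes.size * nbar / (1 + (nbar - 1) * rho)
        return info_unequal / info_equal

    # Illustrative: 20 clusters with sizes varying around a mean of 30.
    rng = np.random.default_rng(3)
    sizes = rng.poisson(30, 20) + 1
    for rho in (0.01, 0.05, 0.2):
        print(rho, round(relative_efficiency(sizes, rho), 4))
    ```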

  20. Errors in causal inference: an organizational schema for systematic error and random error.

    Science.gov (United States)

    Suzuki, Etsuji; Tsuda, Toshihide; Mitsuhashi, Toshiharu; Mansournia, Mohammad Ali; Yamamoto, Eiji

    2016-11-01

    To provide an organizational schema for systematic error and random error in estimating causal measures, aimed at clarifying the concept of errors from the perspective of causal inference. We propose to divide systematic error into structural error and analytic error. With regard to random error, our schema shows its four major sources: nondeterministic counterfactuals, sampling variability, a mechanism that generates exposure events, and measurement variability. Structural error is defined from the perspective of counterfactual reasoning and divided into nonexchangeability bias (which comprises confounding bias and selection bias) and measurement bias. Directed acyclic graphs are useful to illustrate this kind of error. Nonexchangeability bias implies a lack of "exchangeability" between the selected exposed and unexposed groups. A lack of exchangeability is not a primary concern of measurement bias, justifying its separation from confounding bias and selection bias. Many forms of analytic errors result from the small-sample properties of the estimator used and vanish asymptotically. Analytic error also results from wrong (misspecified) statistical models and inappropriate statistical methods. Our organizational schema is helpful for understanding the relationship between systematic error and random error from a previously less investigated aspect, enabling us to better understand the relationship between accuracy, validity, and precision. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Boys Go Fishing, Girls Work at Home: Gender Roles, Poverty and Unequal School Access among Semi-Nomadic Fishing Communities in South Western Madagascar

    Science.gov (United States)

    Nascimento Moreira, Catarina; Rabenevanana, Man Wai; Picard, David

    2017-01-01

    Drawing from data gathered in South Western Madagascar in 2011, the work explores the combination of poverty and traditional gender roles as a critical factor in determining unequal school access among young people from semi-nomadic fishing communities. It demonstrates that from the age of early puberty, most boys go fishing with their fathers and…

  2. Recognition Memory zROC Slopes for Items with Correct versus Incorrect Source Decisions Discriminate the Dual Process and Unequal Variance Signal Detection Models

    Science.gov (United States)

    Starns, Jeffrey J.; Rotello, Caren M.; Hautus, Michael J.

    2014-01-01

    We tested the dual process and unequal variance signal detection models by jointly modeling recognition and source confidence ratings. The 2 approaches make unique predictions for the slope of the recognition memory zROC function for items with correct versus incorrect source decisions. The standard bivariate Gaussian version of the unequal…

  3. The "Hamburger Connection" as Ecologically Unequal Exchange: A Cross-National Investigation of Beef Exports and Deforestation in Less-Developed Countries

    Science.gov (United States)

    Austin, Kelly

    2010-01-01

    This study explores Norman Myers's concept of the "hamburger connection" as a form of ecologically unequal exchange, where more-developed nations are able to transfer the environmental costs of beef consumption to less-developed nations. I used ordinary least squares (OLS) regression to test whether deforestation in less-developed…

  4. Combined group ECC protection and subgroup parity protection

    Science.gov (United States)

    Gara, Alan G.; Chen, Dong; Heidelberger, Philip; Ohmacht, Martin

    2013-06-18

    A method and system are disclosed for providing combined error code protection and subgroup parity protection for a given group of n bits. The method comprises the steps of identifying a number, m, of redundant bits for said error protection; and constructing a matrix P, wherein multiplying said given group of n bits with P produces m redundant error correction code (ECC) protection bits, and two columns of P provide parity protection for subgroups of said given group of n bits. In the preferred embodiment of the invention, the matrix P is constructed by generating permutations of m bit wide vectors with three or more, but an odd number of, elements with value one and the other elements with value zero; and assigning said vectors to rows of the matrix P.
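    The construction described above can be illustrated with a toy GF(2) example: check bits are computed as c = d·P (mod 2), and two columns of P are chosen as subgroup indicator vectors so that those two check bits are plain subgroup parities. The bit-to-check assignment below is an assumed toy layout, not the patented matrix:

    ```python
    import numpy as np

    # Sketch of the idea (not the patented construction): for data d
    # (1 x n) and matrix P (n x m), the m check bits are c = d @ P mod 2.
    # Two columns of P are indicator vectors of bit subgroups, so those
    # two check bits are plain subgroup parities; the remaining columns
    # give ECC-style checks.
    n, m = 8, 5
    P = np.zeros((n, m), dtype=np.uint8)

    # Columns 0-2: toy assignment of data bits to three ECC checks,
    # encoded as 3-bit membership masks per data bit (assumed layout).
    patterns = [3, 5, 6, 7, 1, 2, 4, 7]
    for i, p in enumerate(patterns):
        for j in range(3):
            P[i, j] = (p >> j) & 1

    # Columns 3-4: parity over the low and high halves of the data word.
    P[:4, 3] = 1                                  # subgroup 0: bits 0..3
    P[4:, 4] = 1                                  # subgroup 1: bits 4..7

    d = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
    c = d @ P % 2
    print(c)  # last two entries: parity of d[:4] and of d[4:]
    ```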

  5. Controlling errors in unidosis carts

    Directory of Open Access Journals (Sweden)

    Inmaculada Díaz Fernández

    2010-01-01

    Full Text Available Objective: To identify errors in the unidosis cart system. Method: For two months, the Pharmacy Service tracked medication either returned or missing from the unidosis carts, both in the pharmacy and in the wards. Results: Unrevised unidosis carts showed a 0.9% medication error rate (264 errors) versus 0.6% (154 errors) in unidosis carts that had previously been revised. In carts not revised, 70.83% of the errors arose when setting up the unidosis carts. The rest were due to a lack of stock or unavailability (21.6%), errors in the transcription of medical orders (6.81%), or boxes that had not been emptied previously (0.76%). The errors found in the units corresponded to errors in the transcription of the treatment (3.46%), non-receipt of the unidosis copy (23.14%), the patient not taking the medication (14.36%) or being discharged without medication (12.77%), medication not given by nurses (14.09%), medication withdrawn from the stocks of the unit (14.62%), and errors of the pharmacy service (17.56%). Conclusions: The findings point to the need to revise unidosis carts and to introduce a computerized prescription system to avoid errors in transcription. Discussion: A high percentage of medication errors is caused by human error. If unidosis carts are looked over before being sent to hospitalization units, the error rate diminishes to 0.3%.

  6. Prioritising interventions against medication errors

    DEFF Research Database (Denmark)

    Lisby, Marianne; Pape-Larsen, Louise; Sørensen, Ann Lykkegaard

    Abstract Authors: Lisby M, Larsen LP, Soerensen AL, Nielsen LP, Mainz J. Title: Prioritising interventions against medication errors – the importance of a definition. Objective: To develop and test a restricted definition of medication errors across health care settings in Denmark. Methods: Medication errors constitute a major quality and safety problem in modern healthcare. However, far from all are clinically important. The prevalence of medication errors ranges from 2-75%, indicating a global problem in defining and measuring these [1]. New cut-off levels focusing on the clinical impact of medication errors are therefore needed. Development of definition: A definition of medication errors including an index of error types for each stage in the medication process was developed from existing terminology and through a modified Delphi-process in 2008. The Delphi panel consisted of 25 interdisciplinary…

  7. Social aspects of clinical errors.

    Science.gov (United States)

    Richman, Joel; Mason, Tom; Mason-Whitehead, Elizabeth; McIntosh, Annette; Mercer, Dave

    2009-08-01

    Clinical errors, whether committed by doctors, nurses or other professions allied to healthcare, remain a sensitive issue requiring open debate and policy formulation in order to reduce them. The literature suggests that the issues underpinning errors made by healthcare professionals involve concerns about patient safety, professional disclosure, apology, litigation, compensation, processes of recording and policy development to enhance quality service. Anecdotally, we are aware of narratives of minor errors, which may well have been covered up and remain officially undisclosed whilst the major errors resulting in damage and death to patients alarm both professionals and public with resultant litigation and compensation. This paper attempts to unravel some of these issues by highlighting the historical nature of clinical errors and drawing parallels to contemporary times by outlining the 'compensation culture'. We then provide an overview of what constitutes a clinical error and review the healthcare professional strategies for managing such errors.

  8. Separate and unequal: Structural racism and infant mortality in the US.

    Science.gov (United States)

    Wallace, Maeve; Crear-Perry, Joia; Richardson, Lisa; Tarver, Meshawn; Theall, Katherine

    2017-05-01

    We examined associations between state-level measures of structural racism and infant mortality among black and white populations across the US. Overall and race-specific infant mortality rates in each state were calculated from national linked birth and infant death records from 2010 to 2013. Structural racism in each state was characterized by racial inequity (ratio of black to white population estimates) in educational attainment, median household income, employment, imprisonment, and juvenile custody. Poisson regression with robust standard errors estimated infant mortality rate ratios (RR) and 95% confidence intervals (CI) associated with an IQR increase in indicators of structural racism overall and separately within black and white populations. Across all states, increasing racial inequity in unemployment was associated with a 5% increase in black infant mortality (RR=1.05, 95% CI=1.01, 1.10). Decreasing racial inequity in education was associated with an almost 10% reduction in the black infant mortality rate (RR=0.92, 95% CI=0.85, 0.99). None of the structural racism measures were significantly associated with infant mortality among whites. Structural racism may contribute to the persisting racial inequity in infant mortality. Copyright © 2017 Elsevier Ltd. All rights reserved.
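    The rate ratios above come from Poisson regression with robust standard errors and an IQR-scaled exposure; a minimal sketch of that estimator on simulated placeholder data (none of the study's actual records) might look like this:

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Sketch: Poisson regression of infant deaths with log(births) as
    # offset and a robust (HC1) covariance, mirroring the stated method.
    # All data below are simulated placeholders.
    rng = np.random.default_rng(4)
    states = 50
    inequity = rng.normal(0, 1, states)            # structural racism index
    births = rng.integers(20_000, 400_000, states)
    rate = 0.006 * np.exp(0.08 * inequity)         # assumed true effect
    deaths = rng.poisson(rate * births)

    X = sm.add_constant(inequity)
    fit = sm.GLM(deaths, X, family=sm.families.Poisson(),
                 offset=np.log(births)).fit(cov_type='HC1')
    iqr = np.subtract(*np.percentile(inequity, [75, 25]))
    rr = np.exp(fit.params[1] * iqr)               # RR per IQR increase
    lo, hi = np.exp((fit.params[1] + np.array([-1.96, 1.96])
                     * fit.bse[1]) * iqr)
    print(round(rr, 3), (round(lo, 3), round(hi, 3)))
    ```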

  9. Errors in clinical laboratories or errors in laboratory medicine?

    Science.gov (United States)

    Plebani, Mario

    2006-01-01

    Laboratory testing is a highly complex process and, although laboratory services are relatively safe, they are not as safe as they could or should be. Clinical laboratories have long focused their attention on quality control methods and quality assessment programs dealing with analytical aspects of testing. However, a growing body of evidence accumulated in recent decades demonstrates that quality in clinical laboratories cannot be assured by merely focusing on purely analytical aspects. The more recent surveys on errors in laboratory medicine conclude that in the delivery of laboratory testing, mistakes occur more frequently before (pre-analytical) and after (post-analytical) the test has been performed. Most errors are due to pre-analytical factors (46-68.2% of total errors), while a high error rate (18.5-47% of total errors) has also been found in the post-analytical phase. Errors due to analytical problems have been significantly reduced over time, but there is evidence that, particularly for immunoassays, interference may have a serious impact on patients. A description of the most frequent and risky pre-, intra- and post-analytical errors and advice on practical steps for measuring and reducing the risk of errors is therefore given in the present paper. Many mistakes in the Total Testing Process are called "laboratory errors", although these may be due to poor communication, action taken by others involved in the testing process (e.g., physicians, nurses and phlebotomists), or poorly designed processes, all of which are beyond the laboratory's control. Likewise, there is evidence that laboratory information is only partially utilized. A recent document from the International Organization for Standardization (ISO) recommends a new, broader definition of the term "laboratory error" and a classification of errors according to different criteria. In a modern approach to total quality, centered on patients' needs and satisfaction, the risk of errors and mistakes

  10. Errors in abdominal computed tomography

    International Nuclear Information System (INIS)

    Stephens, S.; Marting, I.; Dixon, A.K.

    1989-01-01

    Sixty-nine patients are presented in whom a substantial error was made on the initial abdominal computed tomography report. Certain features of these errors have been analysed. In 30 (43.5%) a lesion was simply not recognised (error of observation); in 39 (56.5%) the wrong conclusions were drawn about the nature of normal or abnormal structures (error of interpretation). The 39 errors of interpretation were more complex: in 7 patients an abnormal structure was noted but interpreted as normal, whereas in four a normal structure was thought to represent a lesion. Other interpretive errors included those where the wrong cause for a lesion had been ascribed (24 patients), and those where the abnormality was substantially under-reported (4 patients). Various features of these errors are presented and discussed. Errors were made just as often in relation to small and large lesions. Consultants made as many errors as senior registrar radiologists. It is likely that dual reporting is the best method of avoiding such errors and, indeed, this is widely practised in our unit. (Author). 9 refs.; 5 figs.; 1 tab

  11. Laboratory errors and patient safety.

    Science.gov (United States)

    Miligy, Dawlat A

    2015-01-01

    Laboratory data are extensively used in medical practice; consequently, laboratory errors have a tremendous impact on patient safety. Therefore, programs designed to identify and reduce laboratory errors, as well as specific strategies, are required to minimize these errors and improve patient safety. The purpose of this paper is to identify some of the commonly encountered laboratory errors throughout our practice in laboratory work, their hazards for patient health care, and some measures and recommendations to minimize or eliminate these errors. The laboratory errors encountered during May 2008 were recorded and statistically evaluated (using simple percent distribution) in the laboratory department of one of the private hospitals in Egypt. Errors were classified according to the laboratory phases and according to their implication for patient health. Data obtained from 1,600 testing procedures revealed that the total number of encountered errors was 14 tests (0.87 percent of total testing procedures). Most of the encountered errors lay in the pre- and post-analytic phases of the testing cycle (representing 35.7 and 50 percent, respectively, of total errors), while the number of test errors encountered in the analytic phase represented only 14.3 percent of total errors. About 85.7 percent of total errors were of non-significant implication for patients' health, being detected before test reports had been submitted to the patients. On the other hand, the number of test errors that had already been submitted to patients and reached the physician represented 14.3 percent of total errors. Only 7.1 percent of the errors could have an impact on patient diagnosis. The findings of this study were concomitant with those published from the USA and other countries. This proves that laboratory problems are universal and need general standardization and benchmarking measures. Originality/value: being the first data published from Arabic countries that

  12. Agricultural injuries in Korea and errors in systems of safety

    Directory of Open Access Journals (Sweden)

    Hyocher Kim

    2016-07-01

    It was found that most agricultural injuries were caused by a complex layer of root causes, which were classified as errors in the systems of safety. This result indicates that not only training and personal protective equipment, but also regulation of safety design, mitigation devices, inspection/maintenance of workplaces, and other factors play an important role in preventing agricultural injuries. The identification of errors will help farmers to easily implement an effective prevention programme.

  13. Dopamine reward prediction error coding.

    Science.gov (United States)

    Schultz, Wolfram

    2016-03-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware.

  14. Statistical errors in Monte Carlo estimates of systematic errors

    Science.gov (United States)

    Roe, Byron P.

    2007-01-01

    For estimating the effects of a number of systematic errors on a data sample, one can generate Monte Carlo (MC) runs with systematic parameters varied and examine the change in the desired observed result. Two methods are often used. In the unisim method, the systematic parameters are varied one at a time by one standard deviation, each parameter corresponding to an MC run. In the multisim method, each MC run has all of the parameters varied; the amount of variation is chosen from the expected distribution of each systematic parameter, usually assumed to be a normal distribution. The variance of the overall systematic error determination is derived for each of the two methods and comparisons are made between them. If one focuses not on the error in the prediction of an individual systematic error, but on the overall error due to all systematic errors in the error matrix element in data bin m, the number of events needed is strongly reduced because of the averaging effect over all of the errors. For the simple models presented here, the multisim method was far better if the statistical error in the MC samples was larger than an individual systematic error, while for the reverse case, the unisim method was better. Exact formulas, and formulas for the simple toy models, are presented so that realistic calculations can be made. The calculations in the present note are valid if the errors are in a linear region. If that region extends sufficiently far, one can have the unisims or multisims correspond to k standard deviations instead of one. This reduces the number of events required by a factor of k². The specific terms unisim and multisim were coined by Peter Meyers and Steve Brice, respectively, for the MiniBooNE experiment. However, the concepts have been developed over time and have been in general use for some time.
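    The unisim/multisim trade-off described above can be reproduced with a toy model; everything here (the linear sensitivity vector, the noise level, the sample sizes) is an illustrative assumption, not the paper's detector simulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    sens = np.array([0.5, 1.0, 2.0])   # assumed sensitivities to 3 systematics
    mc_sigma = 0.3                     # statistical error of a single MC run

    def mc_run(params):
        """One MC run: linear response to systematic shifts plus MC statistics."""
        return sens @ params + rng.normal(0.0, mc_sigma)

    nominal = mc_run(np.zeros(3))

    # unisim: one run per parameter, each shifted by +1 standard deviation.
    # Each shift also carries the MC statistical noise of two runs, which is
    # why unisim degrades when mc_sigma rivals an individual systematic.
    shifts = np.array([mc_run(np.eye(3)[i]) - nominal for i in range(3)])
    var_unisim = np.sum(shifts ** 2)

    # multisim: every run draws all parameters from their (normal) priors.
    draws = rng.normal(size=(1000, 3))
    results = np.array([mc_run(d) for d in draws])
    var_multisim = results.var(ddof=1) - mc_sigma ** 2  # remove known MC noise

    print(var_unisim, var_multisim)    # both estimate sum(sens**2) = 5.25
    ```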

  15. Statistical errors in Monte Carlo estimates of systematic errors

    Energy Technology Data Exchange (ETDEWEB)

    Roe, Byron P. [Department of Physics, University of Michigan, Ann Arbor, MI 48109 (United States)]. E-mail: byronroe@umich.edu

    2007-01-01

    For estimating the effects of a number of systematic errors on a data sample, one can generate Monte Carlo (MC) runs with systematic parameters varied and examine the change in the desired observed result. Two methods are often used. In the unisim method, the systematic parameters are varied one at a time by one standard deviation, each parameter corresponding to an MC run. In the multisim method, each MC run has all of the parameters varied; the amount of variation is chosen from the expected distribution of each systematic parameter, usually assumed to be a normal distribution. The variance of the overall systematic error determination is derived for each of the two methods and comparisons are made between them. If one focuses not on the error in the prediction of an individual systematic error, but on the overall error due to all systematic errors in the error matrix element in data bin m, the number of events needed is strongly reduced because of the averaging effect over all of the errors. For the simple models presented here, the multisim method was far better if the statistical error in the MC samples was larger than an individual systematic error, while for the reverse case, the unisim method was better. Exact formulas, and formulas for the simple toy models, are presented so that realistic calculations can be made. The calculations in the present note are valid if the errors are in a linear region. If that region extends sufficiently far, one can have the unisims or multisims correspond to k standard deviations instead of one. This reduces the number of events required by a factor of k².

  16. Statistical errors in Monte Carlo estimates of systematic errors

    International Nuclear Information System (INIS)

    Roe, Byron P.

    2007-01-01

    For estimating the effects of a number of systematic errors on a data sample, one can generate Monte Carlo (MC) runs with systematic parameters varied and examine the change in the desired observed result. Two methods are often used. In the unisim method, the systematic parameters are varied one at a time by one standard deviation, each parameter corresponding to an MC run. In the multisim method, each MC run has all of the parameters varied; the amount of variation is chosen from the expected distribution of each systematic parameter, usually assumed to be a normal distribution. The variance of the overall systematic error determination is derived for each of the two methods and comparisons are made between them. If one focuses not on the error in the prediction of an individual systematic error, but on the overall error due to all systematic errors in the error matrix element in data bin m, the number of events needed is strongly reduced because of the averaging effect over all of the errors. For the simple models presented here, the multisim method was far better if the statistical error in the MC samples was larger than an individual systematic error, while for the reverse case, the unisim method was better. Exact formulas, and formulas for the simple toy models, are presented so that realistic calculations can be made. The calculations in the present note are valid if the errors are in a linear region. If that region extends sufficiently far, one can have the unisims or multisims correspond to k standard deviations instead of one. This reduces the number of events required by a factor of k².

  17. Discovery of a Highly Unequal-mass Binary T Dwarf with Keck Laser Guide Star Adaptive Optics: A Coevality Test of Substellar Theoretical Models and Effective Temperatures

    Science.gov (United States)

    Liu, Michael C.; Dupuy, Trent J.; Leggett, S. K.

    2010-10-01

    Highly unequal-mass ratio binaries are rare among field brown dwarfs, with the mass ratio distribution of the known census described by f(q) ∝ q^(4.9±0.7). However, such systems enable a unique test of the joint accuracy of evolutionary and atmospheric models, under the constraint of coevality for the individual components (the "isochrone test"). We carry out this test using two of the most extreme field substellar binaries currently known, the T1 + T6 epsilon Ind Bab binary and a newly discovered 0.14″ T2.0 + T7.5 binary, 2MASS J12095613-1004008AB, identified with Keck laser guide star adaptive optics. The latter is the most extreme tight binary resolved to date (q ≈ 0.5). Based on the locations of the binary components on the Hertzsprung-Russell (H-R) diagram, current models successfully indicate that these two systems are coeval, with internal age differences of log(age) = -0.8 ± 1.3 (-1.0 +1.2/-1.3) dex and 0.5 +0.4/-0.3 (0.3 +0.3/-0.4) dex for 2MASS J1209-1004AB and epsilon Ind Bab, respectively, as inferred from the Lyon (Tucson) models. However, the total mass of epsilon Ind Bab derived from the H-R diagram (≈80 M_Jup using the Lyon models) is strongly discrepant with the reported dynamical mass. This problem, which is independent of the assumed age of the epsilon Ind Bab system, can be explained by a ≈50-100 K systematic error in the model atmosphere fitting, indicating slightly warmer temperatures for both components; bringing the mass determinations from the H-R diagram and the visual orbit into consistency leads to an inferred age of ≈6 Gyr for epsilon Ind Bab, older than previously assumed. Overall, the two T dwarf binaries studied here, along with recent results from T dwarfs in age and mass benchmark systems, yield evidence for small (≈100 K) errors in the evolutionary models and/or model atmospheres, but not significantly larger. Future parallax, resolved spectroscopy, and dynamical mass measurements for 2MASS J1209-1004AB will enable a more

  18. Architecture design for soft errors

    CERN Document Server

    Mukherjee, Shubu

    2008-01-01

    This book provides a comprehensive description of the architectural techniques to tackle the soft error problem. It covers new methodologies for quantitative analysis of soft errors as well as novel, cost-effective architectural techniques to mitigate them. To provide readers with a better grasp of the broader problem definition and solution space, this book also delves into the physics of soft errors and reviews current circuit and software mitigation techniques.

  19. Dopamine reward prediction error coding

    OpenAIRE

    Schultz, Wolfram

    2016-01-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards, an evolutionarily beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less...

  20. Identifying Error in AUV Communication

    National Research Council Canada - National Science Library

    Coleman, Joseph; Merrill, Kaylani; O'Rourke, Michael; Rajala, Andrew G; Edwards, Dean B

    2006-01-01

    Mine Countermeasures (MCM) involving Autonomous Underwater Vehicles (AUVs) are especially susceptible to error, given the constraints on underwater acoustic communication and the inconstancy of the underwater communication channel...

  1. Human Errors in Decision Making

    OpenAIRE

    Mohamad, Shahriari; Aliandrina, Dessy; Feng, Yan

    2005-01-01

    The aim of this paper was to identify human errors in the decision-making process. The study focused on the research question: what human errors can act as a potential source of decision failure during the evaluation of alternatives in the decision-making process? Two case studies were selected from the literature and analyzed to find the human errors that contribute to decision failure. The analysis of human errors was then linked with mental models in the alternative-evaluation step. The results o...

  2. Finding beam focus errors automatically

    International Nuclear Information System (INIS)

    Lee, M.J.; Clearwater, S.H.; Kleban, S.D.

    1987-01-01

    An automated method for finding beam focus errors uses an optimization program called COMFORT-PLUS. The steps involved in finding the correction factors with COMFORT-PLUS are described, and the method has been used to find the beam focus errors for two damping rings at the SLAC Linear Collider. The program is intended as an off-line tool to analyze actual measured data for any SLC system. One limitation on the application of this procedure is that it depends on the magnitude of the machine errors. Another is that the program is not totally automated, since the user must decide a priori where to look for errors

  3. Heuristic errors in clinical reasoning.

    Science.gov (United States)

    Rylander, Melanie; Guerrasio, Jeannette

    2016-08-01

    Errors in clinical reasoning contribute to patient morbidity and mortality. The purpose of this study was to determine the types of heuristic errors made by third-year medical students and first-year residents. This study surveyed approximately 150 clinical educators, inquiring about the types of heuristic errors they observed in third-year medical students and first-year residents. Anchoring and premature closure were the two most common errors observed amongst third-year medical students and first-year residents. There was no difference in the types of errors observed in the two groups. Errors in clinical reasoning contribute to patient morbidity and mortality. Clinical educators perceived that both third-year medical students and first-year residents committed similar heuristic errors, implying that additional medical knowledge and clinical experience do not affect the types of heuristic errors made. Further work is needed to help identify methods that can be used to reduce heuristic errors early in a clinician's education. © 2015 John Wiley & Sons Ltd.

  4. The selective power of causality on memory errors.

    Science.gov (United States)

    Marsh, Jessecae K; Kulkofsky, Sarah

    2015-01-01

    We tested the influence of causal links on the production of memory errors in a misinformation paradigm. Participants studied a set of statements about a person, which were presented as either individual statements or pairs of causally linked statements. Participants were then provided with causally plausible and causally implausible misinformation. We hypothesised that studying information connected with causal links would promote representing information in a more abstract manner. As such, we predicted that causal information would not provide an overall protection against memory errors, but rather would preferentially help in the rejection of misinformation that was causally implausible, given the learned causal links. In two experiments, we measured whether the causal linkage of information would be generally protective against all memory errors or only selectively protective against certain types of memory errors. Causal links helped participants reject implausible memory lures, but did not protect against plausible lures. Our results suggest that causal information may promote an abstract storage of information that helps prevent only specific types of memory errors.

  5. Entanglement renormalization, quantum error correction, and bulk causality

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Isaac H. [IBM T.J. Watson Research Center, 1101 Kitchawan Rd., Yorktown Heights, NY (United States); Kastoryano, Michael J. [NBIA, Niels Bohr Institute, University of Copenhagen, Blegdamsvej 17, 2100 Copenhagen (Denmark)

    2017-04-07

    Entanglement renormalization can be viewed as an encoding circuit for a family of approximate quantum error correcting codes. The logical information becomes progressively better protected against erasure errors at larger length scales. In particular, an approximate variant of the holographic quantum error correcting code emerges at low energy for critical systems. This implies that two operators that are largely separated in scales behave as if they are spatially separated operators, in the sense that they obey a Lieb-Robinson type locality bound under a time evolution generated by a local Hamiltonian.

  6. Error studies for SNS Linac. Part 1: Transverse errors

    International Nuclear Information System (INIS)

    Crandall, K.R.

    1998-01-01

    The SNS linac consists of a radio-frequency quadrupole (RFQ), a drift-tube linac (DTL), a coupled-cavity drift-tube linac (CCDTL) and a coupled-cavity linac (CCL). The RFQ and DTL are operated at 402.5 MHz; the CCDTL and CCL are operated at 805 MHz. Between the RFQ and DTL is a medium-energy beam-transport system (MEBT). This error study is concerned with the DTL, CCDTL and CCL, and each will be analyzed separately. In fact, the CCL is divided into two sections, and each of these will be analyzed separately. The types of errors considered here are those that affect the transverse characteristics of the beam. The errors that cause the beam center to be displaced from the linac axis are quad displacements and quad tilts. The errors that cause mismatches are quad gradient errors and quad rotations (roll)

  7. Error begat error: design error analysis and prevention in social infrastructure projects.

    Science.gov (United States)

    Love, Peter E D; Lopez, Robert; Edwards, David J; Goh, Yang M

    2012-09-01

    Design errors contribute significantly to cost and schedule growth in social infrastructure projects and to engineering failures, which can result in accidents and loss of life. Despite considerable research addressing error causation in construction projects, design errors remain prevalent. This paper identifies the underlying conditions that contribute to design errors in social infrastructure projects (e.g. hospitals, education, law and order type buildings). A systemic model of error causation is propagated and subsequently used to develop a learning framework for design error prevention. The research suggests that a multitude of strategies should be adopted in congruence to prevent design errors from occurring and so ensure that safety and project performance are ameliorated. Copyright © 2011. Published by Elsevier Ltd.

  8. Dual Processing and Diagnostic Errors

    Science.gov (United States)

    Norman, Geoff

    2009-01-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical,…

  9. Barriers to medical error reporting

    Directory of Open Access Journals (Sweden)

    Jalal Poorolajal

    2015-01-01

    Full Text Available Background: This study was conducted to explore the prevalence of medical error underreporting and associated barriers. Methods: This cross-sectional study was performed from September to December 2012. Five hospitals affiliated with Hamadan University of Medical Sciences in Hamedan, Iran, were investigated. A self-administered questionnaire was used for data collection. Participants consisted of physicians, nurses, midwives, residents, interns, and staff of the radiology and laboratory departments. Results: Overall, 50.26% of subjects had committed but not reported medical errors. The main reasons mentioned for underreporting were lack of an effective medical error reporting system (60.0%), lack of a proper reporting form (51.8%), lack of peer support for a person who has committed an error (56.0%), and lack of personal attention to the importance of medical errors (62.9%). The rate of committing medical errors was higher in men (71.4%), in the 40-50 years age group (67.6%), among less-experienced personnel (58.7%), at the MSc educational level (87.5%), and among staff of the radiology department (88.9%). Conclusions: This study outlined the main barriers to reporting medical errors and associated factors that may be helpful for healthcare organizations in improving medical error reporting as an essential component of patient safety enhancement.

  10. Taming of the few-The unequal distribution of greenhouse gas emissions from personal travel in the UK

    International Nuclear Information System (INIS)

    Brand, Christian; Boardman, Brenda

    2008-01-01

    Greenhouse gas emissions from personal transport have risen steadily in the UK. Yet, surprisingly little is known about who exactly is contributing to the problem and the extent to which different groups of the population will be affected by any policy responses. This paper describes an innovative methodology and evaluation tool for profiling annual greenhouse gas emissions from personal travel across all modes of travel. A case study application of the methodology involving a major survey of UK residents provides an improved understanding of the extent to which individual and household travel activity patterns, choice of transport mode, geographical location, socio-economic and other factors impact on greenhouse gas emissions. Air and car travel dominate overall emissions. Conversely, land-based public transport accounts for a very small proportion of emissions on average. There is a highly unequal distribution of emissions amongst the population, independent of the mode of travel, location and unit of analysis. The top 10% of emitters are responsible for 43% of emissions and the bottom 10% for only 1%. Income, economic activity, age, household structure and car availability significantly influence emissions levels. Key policy implications of the results are discussed. The paper concludes by suggesting potential applications of the methodology and evaluation tool
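    The reported decile shares can be illustrated with a skewed synthetic distribution; the lognormal form and its parameters are assumptions chosen to reproduce the 43%/1% split, not the survey's empirical distribution.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated per-person annual travel emissions; sigma ~ 1.1 yields roughly
    # the reported inequality (top decile ~43%, bottom decile ~1% of the total).
    emissions = np.sort(rng.lognormal(mean=0.0, sigma=1.1, size=100_000))

    total = emissions.sum()
    cut = int(0.1 * emissions.size)
    print(f"top 10% share:    {emissions[-cut:].sum() / total:.0%}")
    print(f"bottom 10% share: {emissions[:cut].sum() / total:.0%}")
    ```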

  11. Hangup effect in unequal mass binary black hole mergers and further studies of their gravitational radiation and remnant properties

    Science.gov (United States)

    Healy, James; Lousto, Carlos O.

    2018-04-01

    We present the results of 74 new simulations of nonprecessing spinning black hole binaries with mass ratios q = m1/m2 in the range 1/7 ≤ q ≤ 1 and individual spins covering the parameter space -0.95 ≤ α1,2 ≤ 0.95. We supplement those runs with 107 previous simulations to study the hangup effect in black hole mergers, i.e. the delay or prompt merger of spinning holes with respect to nonspinning binaries. We perform the numerical evolution for typically the last ten orbits before the merger and down to the formation of the final remnant black hole. This allows us to study the hangup effect for unequal mass binaries, leading us to identify the spin variable that controls the number of orbits before merger as S_hu · L̂, where S_hu = (1 + m2/(2 m1)) S_1 + (1 + m1/(2 m2)) S_2. We also combine the total results of those 181 simulations to obtain improved fitting formulas for the remnant final black hole mass, spin and recoil velocity as well as for the peak luminosity and peak frequency of the gravitational strain, and find new correlations among them. This accurate new set of simulations enhances the number of numerical relativity waveforms available for parameter estimation of gravitational wave observations.

  12. Separate and unequal: the influence of neighborhood and school characteristics on spatial proximity between fast food and schools.

    Science.gov (United States)

    Kwate, Naa Oyo A; Loh, Ji Meng

    2010-08-01

    Social science and health literature have identified residential segregation as a critical factor in exposure to health-related resources, including food environments. Differential spatial patterning of food environments surrounding schools has significant import for youth. We examined whether fast food restaurants clustered around schools in New York City, and whether any observed clustering varied as a function of school type, school racial demographics, and area racial and socioeconomic demographics. We geocoded fast food locations from 2006 (n=817) and schools from 2004-2005 (n=2096; public and private, elementary and secondary) in the five boroughs of New York City. A point process model (inhomogeneous cross-K function) examined spatial clustering. A minimum of 25% of schools had a fast food restaurant within 400 m. High schools had higher fast food clustering than elementary schools. Public elementary and high schools with large proportions of Black students or in block groups with large proportions of Black residents had higher clustering than White counterparts. Finally, public high schools had higher clustering than private counterparts, with 1.25 to 2 times as many restaurants as expected by chance. The results suggest that the geography of opportunity as it relates to school food environments is unequal in New York City. Copyright 2010 Elsevier Inc. All rights reserved.
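    The buffer statistic quoted above (share of schools with fast food within 400 m) is straightforward to compute from geocoded points; the coordinates below are random stand-ins, and the actual study used an inhomogeneous cross-K function rather than this simple count.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in projected coordinates in meters (the study geocoded 2,096
    # schools and 817 fast food restaurants across New York City).
    schools = rng.uniform(0, 20_000, size=(2096, 2))
    fastfood = rng.uniform(0, 20_000, size=(817, 2))

    # Pairwise school-to-restaurant distances, then nearest restaurant per school.
    d = np.linalg.norm(schools[:, None, :] - fastfood[None, :, :], axis=2)
    share = (d.min(axis=1) <= 400).mean()

    print(f"{share:.1%} of schools have a fast food restaurant within 400 m")
    ```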

  13. The Unequal Structure of the German Education System: Structural Reasons for Educational Failures of Turkish Youth in Germany.

    Science.gov (United States)

    Fernandez-Kelly, Patricia

    The paper examines the educational experiences of Turkish youth in Germany, with special reference to statistical data from the Educational Report and PISA surveys. The educational statistics for Germany show that, more than group characteristics such as social and cultural capital, structural and institutional factors (the multi-track system with its selective mechanism, education policy, Germany's context of negative reception, institutional discrimination, and the lack of an intercultural curriculum) can play a decisive role in hampering the educational and labor-market integration and social mobility of Turkish youth. This can be explained by a mix of factors: an education system that does not foster the educational progress of children from disadvantaged families; the high importance of school degrees for access to the vocational training system and the labor market; and direct and indirect institutional discrimination in education in Germany. Thus, this work suggests that the nature of the education system in Germany remains deeply "unequal," "hierarchical" and "exclusive." The study also demonstrates that the continued marginalized position of Turkish children in Germany means that country of origin or immigrant background is still a barrier to accessing education and the labor market in Germany.

  14. A theory of human error

    Science.gov (United States)

    Mcruer, D. T.; Clement, W. F.; Allen, R. W.

    1981-01-01

    Human errors tend to be treated in terms of clinical and anecdotal descriptions, from which remedial measures are difficult to derive. Correction of the sources of human error requires an attempt to reconstruct underlying and contributing causes of error from the circumstantial causes cited in official investigative reports. A comprehensive analytical theory of the cause-effect relationships governing propagation of human error is indispensable to a reconstruction of the underlying and contributing causes. A validated analytical theory of the input-output behavior of human operators involving manual control, communication, supervisory, and monitoring tasks which are relevant to aviation, maritime, automotive, and process control operations is highlighted. This theory of behavior, both appropriate and inappropriate, provides an insightful basis for investigating, classifying, and quantifying the needed cause-effect relationships governing propagation of human error.

  15. Correcting AUC for Measurement Error.

    Science.gov (United States)

    Rosner, Bernard; Tworoger, Shelley; Qiu, Weiliang

    2015-12-01

    Diagnostic biomarkers are used frequently in epidemiologic and clinical work. The ability of a diagnostic biomarker to discriminate between subjects who develop disease (cases) and subjects who do not (controls) is often measured by the area under the receiver operating characteristic curve (AUC). The diagnostic biomarkers are usually measured with error. Ignoring measurement error can cause biased estimation of AUC, which results in misleading interpretation of the efficacy of a diagnostic biomarker. Several methods have been proposed to correct AUC for measurement error, most of which required the normality assumption for the distributions of diagnostic biomarkers. In this article, we propose a new method to correct AUC for measurement error and derive approximate confidence limits for the corrected AUC. The proposed method does not require the normality assumption. Both real data analyses and simulation studies show good performance of the proposed measurement error correction method.
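    A small simulation shows the attenuation the abstract warns about, plus the classical normal-theory correction; note the article's own method is more general and does not require normality, so this closed-form correction is only a stand-in.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)
    n = 2000

    x_ctrl = rng.normal(0.0, 1.0, n)           # true biomarker, controls
    x_case = rng.normal(1.0, 1.0, n)           # true biomarker, cases
    sigma_e = 1.0                              # measurement error SD (assumed known)
    w_ctrl = x_ctrl + rng.normal(0, sigma_e, n)
    w_case = x_case + rng.normal(0, sigma_e, n)

    def auc(cases, controls):
        """Empirical AUC as the Mann-Whitney probability P(case > control)."""
        return (cases[:, None] > controls[None, :]).mean()

    auc_true = auc(x_case, x_ctrl)             # ~ Phi(1/sqrt(2)) ~ 0.76
    auc_naive = auc(w_case, w_ctrl)            # attenuated, ~ Phi(0.5) ~ 0.69

    # Normal-theory correction: rescale the probit of the naive AUC by the
    # square root of the reliability ratio (here var(true biomarker) = 1).
    reliability = 1.0 / (1.0 + sigma_e**2)
    auc_corrected = norm.cdf(norm.ppf(auc_naive) / np.sqrt(reliability))

    print(auc_true, auc_naive, auc_corrected)
    ```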

  16. Cognitive aspect of diagnostic errors.

    Science.gov (United States)

    Phua, Dong Haur; Tan, Nigel C K

    2013-01-01

    Diagnostic errors can result in tangible harm to patients. Despite our advances in medicine, the mental processes required to make a diagnosis exhibit shortcomings, causing diagnostic errors. Cognitive factors are found to be an important cause of diagnostic errors. With new understanding from psychology and the social sciences, clinical medicine is now beginning to appreciate that our clinical reasoning can take the form of analytical reasoning or heuristics. Different factors like cognitive biases and affective influences can also impel unwary clinicians to make diagnostic errors. Various strategies have been proposed to reduce the effect of cognitive biases and affective influences when clinicians make diagnoses; however, evidence for the efficacy of these methods is still sparse. This paper aims to introduce the reader to the cognitive aspect of diagnostic errors, in the hope that clinicians can use this knowledge to improve diagnostic accuracy and patient outcomes.

  17. Physical implementation of protected qubits

    International Nuclear Information System (INIS)

    Douçot, B; Ioffe, L B

    2012-01-01

    We review the general notion of topological protection of quantum states in spin models and its relation with the ideas of quantum error correction. We show that topological protection can be viewed as a Hamiltonian realization of error correction: for a quantum code for which the minimal number of errors that remain undetected is N, the corresponding Hamiltonian model of the effects of the environment noise appears only in the Nth order of the perturbation theory. We discuss the simplest model Hamiltonians that realize topological protection and their implementation in superconducting arrays. We focus on two dual realizations: in one the protected state is stored in the parity of the Cooper pair number, in the other, in the parity of the flux number. In both cases the superconducting arrays allow a number of fault-tolerant operations that should make the universal quantum computation possible. (key issues reviews)

  18. Physical implementation of protected qubits

    Science.gov (United States)

    Douçot, B.; Ioffe, L. B.

    2012-07-01

    We review the general notion of topological protection of quantum states in spin models and its relation with the ideas of quantum error correction. We show that topological protection can be viewed as a Hamiltonian realization of error correction: for a quantum code for which the minimal number of errors that remain undetected is N, the corresponding Hamiltonian model of the effects of the environment noise appears only in the Nth order of the perturbation theory. We discuss the simplest model Hamiltonians that realize topological protection and their implementation in superconducting arrays. We focus on two dual realizations: in one the protected state is stored in the parity of the Cooper pair number, in the other, in the parity of the flux number. In both cases the superconducting arrays allow a number of fault-tolerant operations that should make the universal quantum computation possible.

  19. Temporal clumping of prey and coexistence of unequal interferers: experiments on social forager groups of brown trout feeding on invertebrate drift

    DEFF Research Database (Denmark)

    Jonsson, Mikael; Skov, Christian; Koed, Anders

    2008-01-01

    Environmental fluctuations have been proposed to enhance the coexistence of competing phenotypes. Here, the effects of prey density and short-term temporal clumping of prey availability on the relative foraging success of unequal interferers are evaluated in social forager groups of juvenile brown trout Salmo trutta feeding on drifting invertebrate prey (frozen chironomids). Groups of three trout with established linear dominance hierarchies (dominant, intermediate and subordinate) were subjected to three different total numbers of prey, combined with three different levels

  20. Errors, error detection, error correction and hippocampal-region damage: data and theories.

    Science.gov (United States)

    MacKay, Donald G; Johnson, Laura W

    2013-11-01

    This review and perspective article outlines 15 observational constraints on theories of errors, error detection, and error correction, and their relation to hippocampal-region (HR) damage. The core observations come from 10 studies with H.M., an amnesic with cerebellar and HR damage but virtually no neocortical damage. Three studies examined the detection of errors planted in visual scenes (e.g., a bird flying in a fish bowl in a school classroom) and sentences (e.g., I helped themselves to the birthday cake). In all three experiments, H.M. detected reliably fewer errors than carefully matched memory-normal controls. Other studies examined the detection and correction of self-produced errors, with controls for comprehension of the instructions, impaired visual acuity, temporal factors, motoric slowing, forgetting, excessive memory load, lack of motivation, and deficits in visual scanning or attention. In these studies, H.M. corrected reliably fewer errors than memory-normal and cerebellar controls, and his uncorrected errors in speech, object naming, and reading aloud exhibited two consistent features: omission and anomaly. For example, in sentence production tasks, H.M. omitted one or more words in uncorrected encoding errors that rendered his sentences anomalous (incoherent, incomplete, or ungrammatical) reliably more often than controls. Besides explaining these core findings, the theoretical principles discussed here explain H.M.'s retrograde amnesia for once familiar episodic and semantic information; his anterograde amnesia for novel information; his deficits in visual cognition, sentence comprehension, sentence production, sentence reading, and object naming; and effects of aging on his ability to read isolated low frequency words aloud. These theoretical principles also explain a wide range of other data on error detection and correction and generate new predictions for future tests. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Human errors in NPP operations

    International Nuclear Information System (INIS)

    Sheng Jufang

    1993-01-01

    Based on the operational experiences of nuclear power plants (NPPs), the importance of studying human performance problems is described. Statistical analysis of the significance and frequency of various root causes and error modes from a large number of human-error-related events demonstrates that defects in operation/maintenance procedures, working place factors, communication and training practices are the primary root causes, while omission, transposition and quantitative mistakes are the most frequent error modes. Recommendations for domestic research on human performance problems in NPPs are suggested

  2. Linear network error correction coding

    CERN Document Server

    Guang, Xuan

    2014-01-01

    There are two main approaches in the theory of network error correction coding. In this SpringerBrief, the authors summarize some of the most important contributions following the classic approach, which represents messages by sequences, similar to algebraic coding, and also briefly discuss the main results following the other approach, which uses the theory of rank metric codes for network error correction, representing messages by subspaces. This book starts by establishing the basic linear network error correction (LNEC) model and then characterizes two equivalent descriptions. Distances an

  3. Error field considerations for BPX

    International Nuclear Information System (INIS)

    LaHaye, R.J.

    1992-01-01

    Irregularities in the position of poloidal and/or toroidal field coils in tokamaks produce resonant toroidal asymmetries in the vacuum magnetic fields. Otherwise stable tokamak discharges become non-linearly unstable to disruptive locked modes when subjected to low-level error fields. Because of the field errors, magnetic islands are produced which would not otherwise occur in tearing-mode stable configurations; a concomitant reduction of the total confinement can result. Poloidal and toroidal asymmetries arise in the heat flux to the divertor target. In this paper, the field errors from perturbed BPX coils are used in a field line tracing code of the BPX equilibrium to study these deleterious effects. Limits on coil irregularities for device design and fabrication are computed along with possible correcting coils for reducing such field errors

  4. The uncorrected refractive error challenge

    Directory of Open Access Journals (Sweden)

    Kovin Naidoo

    2016-11-01

    Full Text Available Refractive error affects people of all ages, socio-economic status and ethnic groups. The most recent statistics estimate that, worldwide, 32.4 million people are blind and 191 million people have vision impairment. Vision impairment has been defined based on distance visual acuity only, and uncorrected distance refractive error (mainly myopia) is the single biggest cause of worldwide vision impairment. However, when we also consider near visual impairment, it is clear that even more people are affected. Research estimated the number of people with vision impairment due to uncorrected distance refractive error at 107.8 million, and the number of people affected by uncorrected near refractive error at 517 million, giving a total of 624.8 million people.

  5. Quantile Regression With Measurement Error

    KAUST Repository

    Wei, Ying

    2009-08-27

    Regression quantiles can be substantially biased when the covariates are measured with error. In this paper we propose a new method that produces consistent linear quantile estimation in the presence of covariate measurement error. The method corrects the measurement error induced bias by constructing joint estimating equations that simultaneously hold for all the quantile levels. An iterative EM-type estimation algorithm to obtain the solutions to such joint estimation equations is provided. The finite sample performance of the proposed method is investigated in a simulation study, and compared to the standard regression calibration approach. Finally, we apply our methodology to part of the National Collaborative Perinatal Project growth data, a longitudinal study with an unusual measurement error structure. © 2009 American Statistical Association.
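    A quick simulation of the bias that motivates the paper: naive quantile regression on an error-prone covariate attenuates the slope at every quantile. The data-generating values and the statsmodels QuantReg usage are illustrative; the paper's corrected estimator (joint estimating equations across quantile levels) is not reimplemented here.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.regression.quantile_regression import QuantReg

    rng = np.random.default_rng(0)
    n = 5000

    x = rng.normal(0.0, 1.0, n)                   # true covariate
    y = 1.0 + 2.0 * x + rng.normal(0.0, 1.0, n)   # slope 2 at every quantile
    w = x + rng.normal(0.0, 1.0, n)               # error-prone measurement of x

    for tau in (0.25, 0.50, 0.75):
        b_true = QuantReg(y, sm.add_constant(x)).fit(q=tau).params[1]
        b_naive = QuantReg(y, sm.add_constant(w)).fit(q=tau).params[1]
        # naive slope ~ 1.0: attenuated by the reliability ratio 1/(1+1)
        print(f"tau={tau}: true-x slope {b_true:.2f}, naive slope {b_naive:.2f}")
    ```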

  6. Comprehensive Error Rate Testing (CERT)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Centers for Medicare and Medicaid Services (CMS) implemented the Comprehensive Error Rate Testing (CERT) program to measure improper payments in the Medicare...

  7. Numerical optimization with computational errors

    CERN Document Server

    Zaslavski, Alexander J

    2016-01-01

    This book studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking into account computational errors. The author illustrates that algorithms generate a good approximate solution if computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative. This monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods and Newton's method.
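    A sketch of the book's setting in miniature: a gradient projection method whose gradients are perturbed by a bounded computational error, converging to a neighborhood of the constrained minimizer. The objective, feasible set, and error bound are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    c = np.array([2.0, 0.0])      # f(x) = ||x - c||^2; constrained min is (1, 0)
    delta = 0.01                  # upper bound on the gradient computation error
    step = 0.1

    def project(x):
        """Projection onto the unit ball (the feasible set)."""
        n = np.linalg.norm(x)
        return x if n <= 1.0 else x / n

    x = np.zeros(2)
    for _ in range(200):
        grad = 2.0 * (x - c)
        err = rng.uniform(-1.0, 1.0, 2)
        err = delta * err / np.linalg.norm(err)   # arbitrary error, norm <= delta
        x = project(x - step * (grad + err))

    print(x)   # stays within O(delta) of the true minimizer (1, 0)
    ```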

  8. Dual processing and diagnostic errors.

    Science.gov (United States)

    Norman, Geoff

    2009-09-01

    In this paper, I review evidence from two theories in psychology relevant to diagnosis and diagnostic errors. "Dual Process" theories of thinking, frequently mentioned with respect to diagnostic error, propose that categorization decisions can be made with either a fast, unconscious, contextual process called System 1 or a slow, analytical, conscious, and conceptual process called System 2. Exemplar theories of categorization propose that many category decisions in everyday life are made by unconscious matching to a particular example in memory, and these remain available and retrievable individually. I then review studies of clinical reasoning based on these theories, and show that the two processes are equally effective; System 1, despite its reliance on idiosyncratic, individual experience, is no more prone to cognitive bias or diagnostic error than System 2. Further, I review evidence that instructions directed at encouraging the clinician to explicitly use both strategies can lead to consistent reduction in error rates.

  9. Error correcting coding for OTN

    DEFF Research Database (Denmark)

    Justesen, Jørn; Larsen, Knud J.; Pedersen, Lars A.

    2010-01-01

    Forward error correction codes for 100 Gb/s optical transmission are currently receiving much attention from transport network operators and technology providers. We discuss the performance of hard decision decoding using product type codes that cover a single OTN frame or a small number of such frames. In particular we argue that a three-error-correcting BCH code is the best choice for the component code in such systems.
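    A toy-scale sketch of hard-decision decoding with a three-error-correcting BCH code. It assumes the third-party `galois` Python package; real OTN component codes use far longer block lengths than this BCH(15, 5) example.

    ```python
    import numpy as np
    import galois

    bch = galois.BCH(15, 5)                  # n=15, k=5 binary BCH, corrects t=3

    message = galois.GF2(np.random.default_rng(0).integers(0, 2, 5))
    codeword = bch.encode(message)

    error = galois.GF2.Zeros(15)
    error[[0, 7, 14]] = 1                    # three hard-decision bit errors
    received = codeword + error              # addition over GF(2) is XOR

    decoded = bch.decode(received)
    assert np.array_equal(decoded, message)  # all three errors corrected
    print("corrected", np.count_nonzero(error), "errors")
    ```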

  10. Negligence, genuine error, and litigation

    OpenAIRE

    Sohn DH

    2013-01-01

    David H Sohn, Department of Orthopedic Surgery, University of Toledo Medical Center, Toledo, OH, USA. Abstract: Not all medical injuries are the result of negligence. In fact, most medical injuries are the result either of the inherent risk in the practice of medicine, or due to system errors, which cannot be prevented simply through fear of disciplinary action. This paper will discuss the differences between adverse events, negligence, and system errors; the current medical malpractice tort syst...

  11. Eliminating US hospital medical errors.

    Science.gov (United States)

    Kumar, Sameer; Steinebach, Marc

    2008-01-01

    Healthcare costs in the USA have continued to rise steadily since the 1980s. Medical errors are one of the major causes of deaths and injuries of thousands of patients every year, contributing to soaring healthcare costs. The purpose of this study is to examine what has been done to deal with the medical-error problem in the last two decades and present a closed-loop mistake-proof operation system for surgery processes that would likely eliminate preventable medical errors. The design method used is a combination of creating a service blueprint, implementing the six sigma DMAIC cycle, developing cause-and-effect diagrams, and devising poka-yokes in order to develop a robust surgery operation process for a typical US hospital. In the improve phase of the six sigma DMAIC cycle, a number of poka-yoke techniques are introduced to prevent typical medical errors (identified through cause-and-effect diagrams) that may occur in surgery operation processes in US hospitals. It is the authors' assertion that implementing the new service blueprint along with the poka-yokes will likely improve the current medical error rate to the six-sigma level. Additionally, designing as many redundancies as possible into the delivery of care will help reduce medical errors. Primary healthcare providers should strongly consider investing in adequate doctor and nurse staffing, and improving their education related to the quality of service delivery, to minimize clinical errors. This will lead to higher fixed costs, especially in the shorter time frame. This paper focuses the additional attention needed to make a sound technical and business case for implementing six sigma tools to eliminate medical errors, which will enable hospital managers to increase their hospital's profitability in the long run and also ensure patient safety.

  12. Approximation errors during variance propagation

    International Nuclear Information System (INIS)

    Dinsmore, Stephen

    1986-01-01

    Risk and reliability analyses are often performed by constructing and quantifying large fault trees. The inputs to these models are component failure events whose probabilities of occurring are best represented as random variables. This paper examines the errors inherent in two approximation techniques used to calculate the top event's variance from the inputs' variances. Two sample fault trees are evaluated, and several three-dimensional plots illustrating the magnitude of the error over a wide range of input means and variances are given

  13. Job mismatching, unequal opportunities and long-term sickness absence in female white-collar workers in Sweden.

    Science.gov (United States)

    Sandmark, Hélène

    2009-01-01

    To investigate associations between long-term sick-listing and factors at work and in family life. Associations were investigated in a cross-sectional case-referent study. The study base included women in white-collar jobs, aged 30-55 years, living in three urban areas in Sweden between February 2004 and October 2004. A postal questionnaire was constructed with questions on occupational and family circumstances, and sent to 513 randomly selected female white-collar workers, of whom 233 had ongoing sick-leave of 90 days or more. The response rate was 81% (n = 413). Most of the women in this study were in managerial positions. The unadjusted associations showed that sick-listed women with children had the highest estimates regarding reported long working hours, bullying, high mental strain, low control and low influence at work, and work-family imbalance. In a regression model, the strongest associations were: experiencing too high mental strain in work tasks (odds ratio (OR) = 2.57, 95% confidence interval (CI) = 2.09-3.15) and low control and influence at work (OR = 2.17, 95% CI = 1.60-2.94). Sick-listed women reported an overall higher dissatisfaction with their workplace and working life. There seems to be a greater tendency for the sick-listed women in this study to experience low control and too high mental strain at work and to live in traditional family relationships with unequal opportunities. The women who were sick-listed were probably less able to cope with work stress and to find a balance between work and family life.

  14. [Medical errors: inevitable but preventable].

    Science.gov (United States)

    Giard, R W

    2001-10-27

    Medical errors are increasingly reported in the lay press. Studies have shown dramatic error rates of 10 percent or even higher. From a methodological point of view, studying the frequency and causes of medical errors is far from simple. Clinical decisions on diagnostic or therapeutic interventions are always taken within a clinical context. Reviewing outcomes of interventions without taking into account both the intentions and the arguments for a particular action will limit the conclusions from a study on the rate and preventability of errors. The interpretation of the preventability of medical errors is fraught with difficulties and probably highly subjective. Blaming the doctor personally does not do justice to the actual situation and especially the organisational framework. Attention for and improvement of the organisational aspects of error are far more important than litigating the person. To err is and will remain human, and if we want to reduce the incidence of faults we must be able to learn from our mistakes. That requires an open attitude towards medical mistakes, a continuous effort in their detection, a sound analysis and, where feasible, the institution of preventive measures.

  15. Quantum error correction for beginners

    International Nuclear Information System (INIS)

    Devitt, Simon J; Nemoto, Kae; Munro, William J

    2013-01-01

    Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large-scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation is now a much larger field and many new codes, techniques, and methodologies have been developed to implement error correction for large-scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault-tolerance, not as a detailed guide, but rather as a basic introduction. The development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault-tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future. (review article)
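    As a flavor of the example-driven style the review advocates, here is the classical skeleton of the three-qubit bit-flip code: a repetition code with majority-vote correction. This is only the classical analogue; genuine QEC must also handle phase errors and extract syndromes without measuring the data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def encode(bit):
        """Repetition encoding: 0 -> 000, 1 -> 111."""
        return np.full(3, bit, dtype=int)

    def channel(codeword, p):
        """Independent bit flips, each with probability p."""
        return codeword ^ (rng.random(3) < p)

    def correct(received):
        """Majority vote: recovers the bit if at most one flip occurred."""
        return int(received.sum() >= 2)

    p, trials = 0.1, 100_000
    failures = sum(correct(channel(encode(0), p)) != 0 for _ in range(trials))
    print(failures / trials)   # ~ 3p^2 - 2p^3 ~ 0.028, below the raw rate p = 0.1
    ```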

  16. Medical Error and Moral Luck.

    Science.gov (United States)

    Hubbeling, Dieneke

    2016-09-01

    This paper addresses the concept of moral luck. Moral luck is discussed in the context of medical error, especially an error of omission that occurs frequently, but only rarely has adverse consequences. As an example, a failure to compare the label on a syringe with the drug chart results in the wrong medication being administered and the patient dies. However, this error may have previously occurred many times with no tragic consequences. Discussions on moral luck can highlight conflicting intuitions. Should perpetrators receive a harsher punishment because of an adverse outcome, or should they be dealt with in the same way as colleagues who have acted similarly, but with no adverse effects? An additional element to the discussion, specifically with medical errors, is that according to the evidence currently available, punishing individual practitioners does not seem to be effective in preventing future errors. The following discussion, using relevant philosophical and empirical evidence, posits a possible solution for the moral luck conundrum in the context of medical error: namely, making a distinction between the duty to make amends and assigning blame. Blame should be assigned on the basis of actual behavior, while the duty to make amends is dependent on the outcome.

  17. Software for radiation protection

    International Nuclear Information System (INIS)

    Graffunder, H.

    2002-01-01

    The software products presented are universally usable programs for radiation protection. The systems were designed in order to establish a comprehensive database specific to radiation protection and, on this basis, to model radiation protection subjects in programs. Development initially focused on the creation of the database. Each software product was to access the same nuclide-specific data; input errors and differences in spelling were to be excluded from the outset. This makes the products more compatible with each other and able to exchange data among each other. The software products are modular in design. Functions recurring in radiation protection are always treated the same way in different programs, and also represented the same way on the program surface. The recognition effect makes it easy for users to familiarize themselves with the products quickly. All software products are written in German and are tailored to the administrative needs and codes and regulations in Germany and in Switzerland. (orig.) [de]

  18. Whistleblower Protection

    Science.gov (United States)

    The Whistleblower Protection Act of 1989 (WPA), as enhanced by the Whistleblower Protection Enhancement Act of 2012 (WPEA), provides protection for Federal employees against retaliation for whistleblowing activities.

  19. Burnout, engagement and resident physicians' self-reported errors.

    Science.gov (United States)

    Prins, J T; van der Heijden, F M M A; Hoekstra-Weebers, J E H M; Bakker, A B; van de Wiel, H B M; Jacobs, B; Gazendam-Donofrio, S M

    2009-12-01

    Burnout is a work-related syndrome that may negatively affect more than just the resident physician. On the other hand, engagement has been shown to protect employees; it may also positively affect the patient care that residents provide. Little is known about the relationship between residents' self-reported errors and burnout and engagement. In our national study, which included all residents and physicians in The Netherlands, 2115 questionnaires were returned (response rate 41.1%). The residents reported on burnout (Maslach Burnout Inventory-Health and Social Services), engagement (Utrecht Work Engagement Scale) and self-assessed patient care practices (six items, two factors: errors in action/judgment, errors due to lack of time). Ninety-four percent of the residents reported making one or more mistakes without negative consequences for the patient during their training. Seventy-one percent reported performing procedures for which they did not feel properly trained. More than half (56%) of the residents stated they had made a mistake with a negative consequence. Seventy-six percent felt they had fallen short in the quality of care they provided on at least one occasion. Men reported more errors in action/judgment than women. Significant effects of specialty and clinical setting were found on both types of errors. Residents with burnout reported significantly more errors, while engaged residents reported fewer errors. It therefore seems important to prevent burnout and to keep residents engaged in their work.

  20. Human Error Assessment in Minefield Cleaning Operation Using Human Event Analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Hajiakbari

    2015-12-01

    Full Text Available Background & objective: Human error is one of the main causes of accidents. Due to the unreliability of the human element and the high-risk nature of demining operations, this study aimed to assess and manage human errors likely to occur in such operations. Methods: This study was performed at a demining site in war zones located in the west of Iran. After acquiring an initial familiarity with the operations, methods, and tools of clearing minefields, the job tasks related to clearing landmines were specified. Next, these tasks were studied using HTA, and the related possible errors were assessed using ATHEANA. Results: The demining task was composed of four main operations, including primary detection, technical identification, investigation, and neutralization. Four main causes of accidents in such operations were found: walking on mines, leaving mines with no action taken, errors in the neutralization operation, and environmental explosion. The probability of human error in mine clearance operations was calculated as 0.010. Conclusion: The main causes of human error in demining operations can be attributed to factors such as poor weather and operating conditions (e.g., outdoor work), inappropriate personal protective equipment, personality characteristics, insufficient accuracy in the work, and insufficient available time. To reduce the probability of human error in demining operations, these factors should be managed properly.

  1. Radiation protection

    International Nuclear Information System (INIS)

    Koelzer, W.

    1975-01-01

    Physical and radiological terms, quantities, and units. Basic principles of radiation protection (ICRP, IAEA, EURATOM, FRG). Biological effects of ionizing radiation. Objectives of practical radiation protection. (HP) [de

  2. Radiation protection: A correction

    International Nuclear Information System (INIS)

    1972-01-01

    An error in translation inadvertently distorted the sense of a paragraph in the article entitled 'Ecological Aspects of Radiation Protection', by Dr. P. Recht, which appeared in the Bulletin, Volume 14, No. 2 earlier this year. In the English text the error appears on Page 28, second paragraph, which reads, as published: 'An instance familiar to radiation protection specialists, which has since come to be regarded as a classic illustration of this approach, is the accidental release at the Windscale nuclear centre in the north of England.' In the French original of this text no reference was made, or intended, to the accidental release which took place in 1957; the reference was to the study of the critical population group exposed to routine releases from the centre, as the footnote made clear. A more correct translation of the relevant sentence reads: 'A classic example of this approach, well-known to radiation protection specialists, is that of releases from the Windscale nuclear centre, in the north of England.' A second error appeared in the footnote already referred to. In all languages, the critical population group studied in respect of the Windscale releases is named as that of Cornwall; the reference should be, of course, to that part of the population of Wales who eat laver bread. (author)

  3. On unequal footing

    DEFF Research Database (Denmark)

    Ounanian, Kristen; Delaney, Alyne; Raakjær, Jesper

    2012-01-01

    This article concentrates on five marine sectors active in the marine environment (fisheries, offshore renewable energy, offshore oil and gas, navigation, and coastal tourism) and on non-industry stakeholders represented by environmental Non-Governmental Organizations (eNGOs) and how they have...... waters: (a) Boundaries; (b) policy and management coordination; and (c) balancing values and user conflicts have been explored. The paper concludes that from a governance perspective it is clear that the MSFD has not been that well-thought through. The consistency of the overall legal frameworks...

  4. Different, Unequal or Unconnected

    Directory of Open Access Journals (Sweden)

    Néstor García Canclini

    2004-10-01

    Full Text Available The author proposes three key elements for dealing with the subject of interculturality and globalisation: difference, inequality and unconnectedness. He wonders not only about how to recognise the differences or correct the inequalities, but also about how to connect the majorities to the global networks. For this, in the first place, he situates inequality and difference, and he deals with the latter from the theorisations of ethnic studies. And, secondly, he takes up the articulation of differences and inequalities proposed by Pierre Bourdieu and modified by authors that developed different perspectives based on their initial collaboration with him, such as Claude Grignon, Jean-Claude Passeron and Luc Boltanski. Canclini is attracted by these authors' attempts to "open up the national horizon at a time when interculturality is globalising."

  5. Different, Unequal or Unconnected

    OpenAIRE

    Néstor García Canclini

    2004-01-01

    The author proposes three key elements for dealing with the subject of interculturality and globalisation: difference, inequality and unconnectedness. He wonders not only about how to recognise the differences or correct the inequalities, but also about how to connect the majorities to the global networks. For this, in the first place, he situates inequality and difference, and he deals with the latter from the theorisations of ethnic studies. And, secondly, he takes up the articulation of dif...

  6. Unequal Welfare States

    NARCIS (Netherlands)

    A.J. Soede; J.C. Vrooman; P.M. Ferraresi; G. Segre

    2004-01-01

    The financial sustainability of the ageing welfare states in Europe has become a key policy issue recently. This study highlights a different aspect of the ageing process, namely its potential impact on income distributions. Since older people usually live on a lower income, their growing share

  7. Unequal by Structure

    DEFF Research Database (Denmark)

    Holck, Lotte

    2018-01-01

    generates inequality. Second, inequality is sustained by inadequate integration methods that merge a formal–informal hierarchy, which results in peer competition and majority elites. The structural approach to organizational diversity developed in this article nuances the current research on diversity...

  8. Preventing Ageing Unequally

    Science.gov (United States)

    OECD Publishing, 2017

    2017-01-01

    This report examines how the two global mega-trends of population ageing and rising inequalities have been developing and interacting, both within and across generations. Taking a life-course perspective the report shows how inequalities in education, health, employment and earnings compound, resulting in large differences in lifetime earnings…

  9. An unequable race

    DEFF Research Database (Denmark)

    Günzel, Franziska; Holm, Anna B.

    In this paper we review how openness towards technological innovation and the opening of the traditional business model in the newspaper industry have led to an undermining of the industry's dominant business model and to a dismantling of the extended business model configuration. More specifically, we...... discuss how changes introduced during the on-going development of digital platforms for news production and delivery have affected key components of these business models, namely value creation, proposition, delivery and capture. By using a multiple case study approach we have examined the development...... of the three leading Danish newspaper titles' business models. The findings suggest that openness towards new technology needs to be coupled with openness towards business model experimentation, to ensure that new technologies do not solely dictate a company's future path. Instead learning experiences through......

  10. Predictors of Errors of Novice Java Programmers

    Science.gov (United States)

    Bringula, Rex P.; Manabat, Geecee Maybelline A.; Tolentino, Miguel Angelo A.; Torres, Edmon L.

    2012-01-01

    This descriptive study determined which of the sources of errors would predict the errors committed by novice Java programmers. Descriptive statistics revealed that the respondents perceived that they committed the identified eighteen errors infrequently. Thought error was perceived to be the main source of error during the laboratory programming…

  11. Learning time-dependent noise to reduce logical errors: real time error rate estimation in quantum error correction

    Science.gov (United States)

    Huo, Ming-Xia; Li, Ying

    2017-12-01

    Quantum error correction is important to quantum information processing, as it allows us to reliably process information encoded in quantum error correction codes. Efficient quantum error correction benefits from knowledge of the error rates. We propose a protocol for monitoring error rates in real time without interrupting the quantum error correction. No adaptation of the quantum error correction code or its implementation circuit is required. The protocol can be directly applied to the most advanced quantum error correction techniques, e.g. the surface code. A Gaussian process algorithm is used to estimate and predict error rates based on past error correction data. We find that using these estimated error rates, the probability of error correction failures can be significantly reduced, by a factor increasing with the code distance.
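
    As a rough numerical companion to the protocol described above (not the authors' implementation), the sketch below assumes a synthetic, slowly drifting physical error rate, treats per-round syndrome failure fractions as noisy binomial observations, and smooths and extrapolates them with Gaussian process regression via scikit-learn; all parameter values are invented.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)

        # Synthetic, slowly drifting physical error rate (hypothetical numbers).
        t = np.linspace(0.0, 100.0, 200)                       # error-correction rounds
        true_rate = 0.01 + 0.004 * np.sin(2 * np.pi * t / 80)

        # Observed failure fractions: binomial noise from finite syndrome samples.
        n_samples = 2000
        observed = rng.binomial(n_samples, true_rate) / n_samples

        # GP regression: RBF kernel captures the drift, WhiteKernel the shot noise.
        kernel = 1.0 * RBF(length_scale=20.0) + WhiteKernel(noise_level=1e-6)
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gp.fit(t.reshape(-1, 1), observed)

        # Estimate past rates and predict a few rounds ahead.
        t_query = np.linspace(0.0, 110.0, 221).reshape(-1, 1)
        rate_est, rate_std = gp.predict(t_query, return_std=True)
        print(f"predicted rate at t = 110: {rate_est[-1]:.4f} (+/- {rate_std[-1]:.4f})")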

  12. Mains protection. 4. ed.; Netzschutztechnik

    Energy Technology Data Exchange (ETDEWEB)

    Schossig, Thomas [Omicron Electronics GmbH, Klaus (Austria); Schossig, Walter

    2013-06-01

    In addition to describing the function of line protection equipment, transformer protection functions and protection devices, selective earth-fault detection, voltage regulation, and the control of detuning, information is given on selection, commissioning, and operational management. Special emphasis is put on general setting standards and testing recommendations. Transducers, auxiliary power supply, and switching-failure detection, as well as classifications for equipment and circuit documents, are also covered. Updates concern in particular fault-clearing times, pick-up reliability, protection of SF6 switchgear, power-directional frequency load shedding, protection of decentralized power plants, selective earth-fault detection, and communication in switchboards.

  13. Redundant measurements for controlling errors

    International Nuclear Information System (INIS)

    Ehinger, M.H.; Crawford, J.M.; Madeen, M.L.

    1979-07-01

    Current federal regulations for nuclear materials control require consideration of operating data as part of the quality control program and limits of error propagation. Recent work at the BNFP has revealed that operating data are subject to a number of measurement problems which are very difficult to detect and even more difficult to correct in a timely manner. Thus error estimates based on operational data reflect those problems. During the FY 1978 and FY 1979 R and D demonstration runs at the BNFP, redundant measurement techniques were shown to be effective in detecting these problems to allow corrective action. The net effect is a reduction in measurement errors and a significant increase in measurement sensitivity. Results show that normal operation process control measurements, in conjunction with routine accountability measurements, are sensitive problem indicators when incorporated in a redundant measurement program

  14. Large errors and severe conditions

    CERN Document Server

    Smith, D L; Van Wormer, L A

    2002-01-01

    Physical parameters that can assume real-number values over a continuous range are generally represented by inherently positive random variables. However, if the uncertainties in these parameters are significant (large errors), conventional means of representing and manipulating the associated variables can lead to erroneous results. Instead, all analyses involving them must be conducted in a probabilistic framework. Several issues must be considered: First, non-linear functional relations between primary and derived variables may lead to significant 'error amplification' (severe conditions). Second, the commonly used normal (Gaussian) probability distribution must be replaced by a more appropriate function that avoids the occurrence of negative sampling results. Third, both primary random variables and those derived through well-defined functions must be dealt with entirely in terms of their probability distributions. Parameter 'values' and 'errors' should be interpreted as specific moments of these probabil...
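
    Two of the points above – that a Gaussian model of an inherently positive quantity yields negative samples when the relative error is large, and that non-linear functional relations amplify the error – can be seen in a short simulation; the parameter values below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)
        mean, rel_err, n = 1.0, 0.5, 100_000   # 50% relative uncertainty: a "large error"

        # Gaussian model: negative samples appear for an inherently positive parameter.
        gauss = rng.normal(mean, mean * rel_err, n)
        print(f"Gaussian model: {100 * np.mean(gauss < 0):.1f}% of samples are negative")

        # Lognormal model with the same mean and relative error stays positive.
        s2 = np.log(1 + rel_err**2)            # lognormal shape parameter
        mu = np.log(mean) - 0.5 * s2
        x = rng.lognormal(mu, np.sqrt(s2), n)
        print(f"lognormal model: min sample = {x.min():.4f} (always > 0)")

        # Error amplification through a non-linear derived variable y = 1/x^2.
        y = 1.0 / x**2
        print(f"relative spread of x: {x.std() / x.mean():.2f}, "
              f"of y = 1/x^2: {y.std() / y.mean():.2f}")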

  15. Negligence, genuine error, and litigation

    Directory of Open Access Journals (Sweden)

    Sohn DH

    2013-02-01

    Full Text Available David H Sohn, Department of Orthopedic Surgery, University of Toledo Medical Center, Toledo, OH, USA. Abstract: Not all medical injuries are the result of negligence. In fact, most medical injuries are the result either of the inherent risk in the practice of medicine or of system errors, which cannot be prevented simply through fear of disciplinary action. This paper will discuss the differences between adverse events, negligence, and system errors; the current medical malpractice tort system in the United States; and current and future solutions, including medical malpractice reform, alternative dispute resolution, health courts, and no-fault compensation systems. The current political environment favors investigation of non-cap tort reform remedies; investment in more rational oversight systems, such as health courts or no-fault systems, may reap both quantitative and qualitative benefits for a less costly and safer health system. Keywords: medical malpractice, tort reform, no fault compensation, alternative dispute resolution, system errors

  16. Spacecraft and propulsion technician error

    Science.gov (United States)

    Schultz, Daniel Clyde

    Commercial aviation and commercial space similarly launch, fly, and land passenger vehicles. Unlike aviation, the U.S. government has not established maintenance policies for commercial space. This study conducted a mixed methods review of 610 U.S. space launches from 1984 through 2011, which included 31 failures. An analysis of the failure causal factors showed that human error accounted for 76% of those failures, which included workmanship error accounting for 29% of the failures. With the imminent future of commercial space travel, the increased potential for the loss of human life demands that changes be made to the standardized procedures, training, and certification to reduce human error and failure rates. Several recommendations were made by this study to the FAA's Office of Commercial Space Transportation, space launch vehicle operators, and maintenance technician schools in an effort to increase the safety of the space transportation passengers.

  17. Sensation seeking and error processing.

    Science.gov (United States)

    Zheng, Ya; Sheng, Wenbin; Xu, Jing; Zhang, Yuanyuan

    2014-09-01

    Sensation seeking is defined by a strong need for varied, novel, complex, and intense stimulation, and a willingness to take risks for such experience. Several theories propose that the insensitivity to negative consequences incurred by risks is one of the hallmarks of sensation-seeking behaviors. In this study, we investigated the time course of error processing in sensation seeking by recording event-related potentials (ERPs) while high and low sensation seekers performed an Eriksen flanker task. Whereas there were no group differences in ERPs to correct trials, sensation seeking was associated with a blunted error-related negativity (ERN), which was female-specific. Further, different subdimensions of sensation seeking were related to ERN amplitude differently. These findings indicate that the relationship between sensation seeking and error processing is sex-specific. Copyright © 2014 Society for Psychophysiological Research.

  18. Errors of Inference Due to Errors of Measurement.

    Science.gov (United States)

    Linn, Robert L.; Werts, Charles E.

    Failure to consider errors of measurement when using partial correlation or analysis of covariance techniques can result in erroneous conclusions. Certain aspects of this problem are discussed, and particular attention is given to issues raised in a recent article by Brewer, Campbell, and Crano. (Author)

  19. Measurement error models with uncertainty about the error variance

    NARCIS (Netherlands)

    Oberski, D.L.; Satorra, A.

    2013-01-01

    It is well known that measurement error in observable variables induces bias in estimates in standard regression analysis and that structural equation models are a typical solution to this problem. Often, multiple indicator equations are subsumed as part of the structural equation model, allowing

  20. Reward positivity: Reward prediction error or salience prediction error?

    Science.gov (United States)

    Heydari, Sepideh; Holroyd, Clay B

    2016-08-01

    The reward positivity is a component of the human ERP elicited by feedback stimuli in trial-and-error learning and guessing tasks. A prominent theory holds that the reward positivity reflects a reward prediction error signal that is sensitive to outcome valence, being larger for unexpected positive events relative to unexpected negative events (Holroyd & Coles, 2002). Although the theory has found substantial empirical support, most of these studies have utilized either monetary or performance feedback to test the hypothesis. However, in apparent contradiction to the theory, a recent study found that unexpected physical punishments also elicit the reward positivity (Talmi, Atkinson, & El-Deredy, 2013). The authors of this report argued that the reward positivity reflects a salience prediction error rather than a reward prediction error. To investigate this finding further, in the present study participants navigated a virtual T maze and received feedback on each trial under two conditions. In a reward condition, the feedback indicated that they would either receive a monetary reward or not and in a punishment condition the feedback indicated that they would receive a small shock or not. We found that the feedback stimuli elicited a typical reward positivity in the reward condition and an apparently delayed reward positivity in the punishment condition. Importantly, this signal was more positive to the stimuli that predicted the omission of a possible punishment relative to stimuli that predicted a forthcoming punishment, which is inconsistent with the salience hypothesis. © 2016 Society for Psychophysiological Research.

  1. ERROR HANDLING IN INTEGRATION WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Alexey M. Nazarenko

    2017-01-01

    Full Text Available Simulation experiments performed while solving multidisciplinary engineering and scientific problems require the joint usage of multiple software tools. Further, when following a preset plan of experiment or searching for optimum solutions, the same sequence of calculations is run multiple times with various simulation parameters, input data, or conditions, while the overall workflow does not change. Automation of simulations like these requires implementing a workflow where tool execution and data exchange are usually controlled by a special type of software, an integration environment or platform. The result is an integration workflow (a platform-dependent implementation of some computing workflow) which, in the context of automation, is a composition of weakly coupled (in terms of communication intensity) typical subtasks. These compositions can then be decomposed back into a few workflow patterns (types of subtask interaction). The patterns, in their turn, can be interpreted as higher-level subtasks. This paper considers the execution control and data exchange rules that should be imposed by the integration environment when an error is encountered by some integrated software tool. An error is defined as any abnormal behavior of a tool that invalidates its result data, thus disrupting the data flow within the integration workflow. The main requirement for the error handling mechanism implemented by the integration environment is to prevent abnormal termination of the entire workflow in case of missing intermediate result data. Error handling rules are formulated on the basic pattern level and on the level of a composite task that can combine several basic patterns as next-level subtasks. The work also notes cases where workflow behavior on error may differ depending on the user's purposes, and the possible error handling options that can be specified by the user.
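
    A minimal sketch of the kind of rule the paper formulates (not the paper's own mechanism): one subtask wrapper with a user-selectable abort/retry/skip policy. The policy names and the flaky example tool are hypothetical.

        import enum
        import time

        class OnError(enum.Enum):
            ABORT = "abort"   # terminate the whole workflow
            RETRY = "retry"   # re-run the failed subtask
            SKIP = "skip"     # drop this branch, keep the workflow alive

        def run_subtask(task, policy=OnError.RETRY, max_retries=3):
            """Run one integration-workflow subtask under an error policy.

            `task` is any callable returning result data; an exception means the
            tool misbehaved and its result data is invalid."""
            for attempt in range(1, max_retries + 1):
                try:
                    return task()
                except Exception as exc:
                    if policy is OnError.ABORT:
                        raise RuntimeError("workflow aborted") from exc
                    if policy is OnError.SKIP:
                        return None               # downstream subtasks see missing data
                    time.sleep(0.1 * attempt)     # RETRY: back off, then re-run
            raise RuntimeError(f"subtask still failing after {max_retries} retries")

        # Usage: a hypothetical simulation tool that fails on its first call.
        calls = {"n": 0}
        def flaky_tool():
            calls["n"] += 1
            if calls["n"] < 2:
                raise IOError("tool produced invalid result data")
            return {"result": 42}

        print(run_subtask(flaky_tool))            # retries once, then {'result': 42}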

  2. Analysis of Medication Error Reports

    Energy Technology Data Exchange (ETDEWEB)

    Whitney, Paul D.; Young, Jonathan; Santell, John; Hicks, Rodney; Posse, Christian; Fecht, Barbara A.

    2004-11-15

    In medicine, as in many areas of research, technological innovation and the shift from paper based information to electronic records has created a climate of ever increasing availability of raw data. There has been, however, a corresponding lag in our abilities to analyze this overwhelming mass of data, and classic forms of statistical analysis may not allow researchers to interact with data in the most productive way. This is true in the emerging area of patient safety improvement. Traditionally, a majority of the analysis of error and incident reports has been carried out based on an approach of data comparison, and starts with a specific question which needs to be answered. Newer data analysis tools have been developed which allow the researcher to not only ask specific questions but also to “mine” data: approach an area of interest without preconceived questions, and explore the information dynamically, allowing questions to be formulated based on patterns brought up by the data itself. Since 1991, United States Pharmacopeia (USP) has been collecting data on medication errors through voluntary reporting programs. USP’s MEDMARXsm reporting program is the largest national medication error database and currently contains well over 600,000 records. Traditionally, USP has conducted an annual quantitative analysis of data derived from “pick-lists” (i.e., items selected from a list of items) without an in-depth analysis of free-text fields. In this paper, the application of text analysis and data analysis tools used by Battelle to analyze the medication error reports already analyzed in the traditional way by USP is described. New insights and findings were revealed including the value of language normalization and the distribution of error incidents by day of the week. The motivation for this effort is to gain additional insight into the nature of medication errors to support improvements in medication safety.

  3. Medication errors: definitions and classification

    Science.gov (United States)

    Aronson, Jeffrey K

    2009-01-01

    To understand medication errors and to identify preventive strategies, we need to classify them and define the terms that describe them. The four main approaches to defining technical terms consider etymology, usage, previous definitions, and the Ramsey–Lewis method (based on an understanding of theory and practice). A medication error is ‘a failure in the treatment process that leads to, or has the potential to lead to, harm to the patient’. Prescribing faults, a subset of medication errors, should be distinguished from prescription errors. A prescribing fault is ‘a failure in the prescribing [decision-making] process that leads to, or has the potential to lead to, harm to the patient’. The converse of this, ‘balanced prescribing’ is ‘the use of a medicine that is appropriate to the patient's condition and, within the limits created by the uncertainty that attends therapeutic decisions, in a dosage regimen that optimizes the balance of benefit to harm’. This excludes all forms of prescribing faults, such as irrational, inappropriate, and ineffective prescribing, underprescribing and overprescribing. A prescription error is ‘a failure in the prescription writing process that results in a wrong instruction about one or more of the normal features of a prescription’. The ‘normal features’ include the identity of the recipient, the identity of the drug, the formulation, dose, route, timing, frequency, and duration of administration. Medication errors can be classified, invoking psychological theory, as knowledge-based mistakes, rule-based mistakes, action-based slips, and memory-based lapses. This classification informs preventive strategies. PMID:19594526
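
    The psychological classification at the end of the abstract maps naturally onto a small data structure. A minimal sketch (the tagged example reports are invented):

        from enum import Enum

        class MedicationError(Enum):
            KNOWLEDGE_BASED_MISTAKE = "wrong plan: missing knowledge"
            RULE_BASED_MISTAKE = "wrong plan: misapplied rule"
            ACTION_BASED_SLIP = "right plan: wrong execution"
            MEMORY_BASED_LAPSE = "right plan: step forgotten"

        # Hypothetical tagged reports; in practice the tag comes from expert review.
        reports = [
            ("gave penicillin despite documented allergy",
             MedicationError.KNOWLEDGE_BASED_MISTAKE),
            ("picked adjacent look-alike vial from shelf",
             MedicationError.ACTION_BASED_SLIP),
            ("omitted evening dose during shift change",
             MedicationError.MEMORY_BASED_LAPSE),
        ]
        for text, tag in reports:
            print(f"{tag.name:24s} {text}")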

  4. Correcting quantum errors with entanglement.

    Science.gov (United States)

    Brun, Todd; Devetak, Igor; Hsieh, Min-Hsiu

    2006-10-20

    We show how entanglement shared between encoder and decoder can simplify the theory of quantum error correction. The entanglement-assisted quantum codes we describe do not require the dual-containing constraint necessary for standard quantum error-correcting codes, thus allowing us to "quantize" all of classical linear coding theory. In particular, efficient modern classical codes that attain the Shannon capacity can be made into entanglement-assisted quantum codes attaining the hashing bound (closely related to the quantum capacity). For systems without large amounts of shared entanglement, these codes can also be used as catalytic codes, in which a small amount of initial entanglement enables quantum communication.

  5. Human Error and Organizational Management

    Directory of Open Access Journals (Sweden)

    Alecxandrina DEACONU

    2009-01-01

    Full Text Available The concern for performance is a topic that raises interest in the business environment but also in other areas that – even if they seem distant from this world – are aware of, interested in, or conditioned by economic development. As individual performance is very much influenced by the human resource, we chose to analyze in this paper the mechanisms that generate – consciously or not – human error nowadays. Moreover, the extremely tense Romanian context, where failure is rather a rule than an exception, made us investigate the phenomenon of generating a human error and the ways to diminish its effects.

  6. Preventing statistical errors in scientific journals.

    NARCIS (Netherlands)

    Nuijten, M.B.

    2016-01-01

    There is evidence for a high prevalence of statistical reporting errors in psychology and other scientific fields. These errors display a systematic preference for statistically significant results, distorting the scientific literature. There are several possible causes for this systematic error

  7. Chromosome structures: reduction of certain problems with unequal gene content and gene paralogs to integer linear programming.

    Science.gov (United States)

    Lyubetsky, Vassily; Gershgorin, Roman; Gorbunov, Konstantin

    2017-12-06

    Chromosome structure is a very limited model of the genome including the information about its chromosomes such as their linear or circular organization, the order of genes on them, and the DNA strand encoding a gene. Gene lengths, nucleotide composition, and intergenic regions are ignored. Although highly incomplete, such structure can be used in many cases, e.g., to reconstruct phylogeny and evolutionary events, to identify gene synteny, regulatory elements and promoters (considering highly conserved elements), etc. Three problems are considered; all assume unequal gene content and the presence of gene paralogs. The distance problem is to determine the minimum number of operations required to transform one chromosome structure into another and the corresponding transformation itself including the identification of paralogs in two structures. We use the DCJ model which is one of the most studied combinatorial rearrangement models. Double-, sesqui-, and single-operations as well as deletion and insertion of a chromosome region are considered in the model; the single ones comprise cut and join. In the reconstruction problem, a phylogenetic tree with chromosome structures in the leaves is given. It is necessary to assign the structures to inner nodes of the tree to minimize the sum of distances between terminal structures of each edge and to identify the mutual paralogs in a fairly large set of structures. A linear algorithm is known for the distance problem without paralogs, while the presence of paralogs makes it NP-hard. If paralogs are allowed but the insertion and deletion operations are missing (and special constraints are imposed), the reduction of the distance problem to integer linear programming is known. Apparently, the reconstruction problem is NP-hard even in the absence of paralogs. The problem of contigs is to find the optimal arrangements for each given set of contigs, which also includes the mutual identification of paralogs. We proved that these
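
    As a toy illustration of reducing a paralog-identification subproblem to integer linear programming (far simpler than the reduction in the paper), the sketch below matches three paralogous gene copies between two chromosome structures with the PuLP modeling library; the cost matrix is invented.

        import pulp

        # Toy instance: 3 copies of a gene family in structure A, 3 in structure B;
        # cost[i][j] = hypothetical rearrangement cost of matching A-copy i to B-copy j.
        cost = [[0, 2, 3],
                [2, 0, 1],
                [3, 1, 0]]
        n = len(cost)

        prob = pulp.LpProblem("paralog_assignment", pulp.LpMinimize)
        x = [[pulp.LpVariable(f"x_{i}_{j}", cat="Binary") for j in range(n)]
             for i in range(n)]

        # Objective: total matching cost.
        prob += pulp.lpSum(cost[i][j] * x[i][j] for i in range(n) for j in range(n))

        # Each paralog is matched exactly once in each structure.
        for i in range(n):
            prob += pulp.lpSum(x[i][j] for j in range(n)) == 1
        for j in range(n):
            prob += pulp.lpSum(x[i][j] for i in range(n)) == 1

        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        matching = [(i, j) for i in range(n) for j in range(n) if x[i][j].value() == 1]
        print("optimal paralog matching:", matching)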

  8. Unequal distribution of health human resource in mainland China: what are the determinants from a comprehensive perspective?

    Science.gov (United States)

    Li, Dan; Zhou, Zhongliang; Si, Yafei; Xu, Yongjian; Shen, Chi; Wang, Yiyang; Wang, Xiao

    2018-02-27

    The inequality of health human resources is a worldwide problem, and solving it is one of the major goals of China's recent health system reform. Yet there is a huge disparity among cities in mainland China. The aim of this study is to analyze the distribution inequality of the health human resource in 322 prefecture-level cities of mainland China in 2014, and to reveal the facets and causes of the inequalities. The data for this study were acquired from the provincial and municipal Health Statistics Yearbook (2014) and Statistical Yearbook (2014), the municipal National Economic Bulletin (2014), and the official websites of municipal governments, involving 322 prefecture-level cities. The concentration index was used to measure the magnitude of the unequal distribution of health human resources, and a decomposition analysis was employed to quantify the contribution of each determinant to the total inequality. The overall concentration index of doctors and nurses in mainland China in 2014 was 0.1038 (95% CI = 0.0208, 0.1865) and 0.0785 (95% CI = 0.0018, 0.1561), respectively. Decomposition of the concentration index revealed that economic status was the primary contributor (58.5% and 57%) to the inequality of doctors and nurses, followed by Southwest China (19.1% and 18.6%), urbanization level (-13.1% and -12.8%), and revenue (8.0% and 7.8%). Party secretaries with a Master degree (7.0%, 6.8%) and mayors who were 60 years old or above (6.3%, 6.1%) also proved to be major contributors to the inequality of health human resources. There was a pro-rich inequality in the distribution of health human resources in mainland China in 2014. The economic status of the cities accounted for most of the existing inequality, followed by Southwest China, urbanization level, revenue, party secretaries with a Master degree, and mayors who were 60 years old or above, in order of importance. Besides, the party secretaries and mayors also had a certain influence on the allocation
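
    The concentration index used in this study has a convenient covariance form, CI = 2·cov(h, r)/μ, where h is the resource variable, r the fractional rank by economic status, and μ the mean of h. A minimal sketch with invented city-level data:

        import numpy as np

        def concentration_index(resource, econ_status):
            """CI = 2 * cov(resource, fractional economic rank) / mean(resource)."""
            order = np.argsort(econ_status)            # rank cities poorest -> richest
            h = np.asarray(resource, float)[order]
            n = len(h)
            r = (np.arange(1, n + 1) - 0.5) / n        # fractional rank in (0, 1)
            return 2.0 * np.cov(h, r, bias=True)[0, 1] / h.mean()

        # Invented data: doctors per 1000 population rising with city GDP per capita.
        gdp = np.array([2.1, 3.5, 4.0, 5.2, 7.8, 9.9, 12.4, 15.0])
        doctors = np.array([1.2, 1.4, 1.3, 1.8, 2.0, 2.4, 2.6, 3.1])
        print(f"CI = {concentration_index(doctors, gdp):.3f}  # > 0: pro-rich")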

  9. Error Resilience in Current Distributed Video Coding Architectures

    Directory of Open Access Journals (Sweden)

    Tonoli Claudia

    2009-01-01

    Full Text Available In distributed video coding, the signal prediction is shifted to the decoder side, therefore placing most of the computational complexity burden at the receiver. Moreover, since no prediction loop exists before transmission, an intrinsic robustness to transmission errors has been claimed. This work evaluates and compares the error resilience performance of two distributed video coding architectures. In particular, we have considered a video codec based on the Stanford architecture (the DISCOVER codec) and a video codec based on the PRISM architecture. Specifically, an accurate temporal and rate/distortion based evaluation of the effects of transmission errors for both of the considered DVC architectures has been performed and discussed. These approaches have also been compared with H.264/AVC, in both cases of no error protection and simple FEC error protection. Our evaluations have highlighted in all cases a strong dependence of the behavior of the various codecs on the content of the considered video sequence. In particular, PRISM seems to be particularly well suited for low-motion sequences, whereas DISCOVER provides better performance in the other cases.

  10. Physical protection

    International Nuclear Information System (INIS)

    Myers, D.A.

    1989-01-01

    Physical protection is defined and its function in relation to other functions of a State System of Accounting for and Control of Nuclear Materials is described. The need for a uniform minimum international standard for physical protection as well as the need for international cooperation in physical protection is emphasized. The IAEA's INFCIRC/225/Rev. 1 (Annex 1) is reviewed. The Convention on the Physical Protection of Nuclear Material (Annex 2) is discussed. Photographs show examples of typical physical protection technology (Annex 3)

  11. Diplomatic Protection

    OpenAIRE

    Režná, Jana

    2006-01-01

    Final thesis. Topic: Diplomatic protection. Thesis supervisor: JUDr. Vladimír Balaš, CSc. Student: Marek Čermák. This thesis on the topic of diplomatic protection deals with the exercise of diplomatic protection granted by states and is divided into seven chapters which follow each other. The first chapter describes diplomatic protection and its historical foundations. The second chapter focuses on the possibility of the exercise of diplomatic protection in respect of natural persons and the ...

  12. Unequal distribution of RT-PCR artifacts along the E1-E2 region of Hepatitis C virus.

    Science.gov (United States)

    Domingo-Calap, Pilar; Sentandreu, Vicente; Bracho, Maria Alma; González-Candelas, Fernando; Moya, Andrés; Sanjuán, Rafael

    2009-10-01

    Although viral variability studies have focused traditionally on consensus sequences, the relevance of molecular clone sequences for studying viral evolution at the intra-host level is being increasingly recognized. However, for this approach to be reliable, RT-PCR artifacts do not have to contribute excessively to the observed variability. Molecular clone sequences were obtained from an in vitro transcript to estimate the maximum error rate associated to RT-PCR for the Hepatitis C virus (HCV) E1-E2 region. On average, the frequency of RT-PCR errors was one order of magnitude lower than the level of intra-host genetic variability observed in samples from an HCV outbreak. However, RT-PCR errors were not distributed evenly along the E1-E2 region and were concentrated heavily in the hypervariable region 2 (HVR 2). Although it is concluded that RT-PCR molecular clone sequences are reliable, these results warn against extrapolation of RT-PCR error rates to different genome regions. The data suggest that the RNA sequence context or secondary structure can determine the fidelity of in vitro transcription or reverse transcription. Potentially, these factors might also modify the fidelity of the viral polymerase.

  13. relay coordination in the protection of radially-connected power

    African Journals Online (AJOL)

    Protective relays detect intolerable or unwanted conditions within an assigned area, and then trip or open one ... time, and current transformer ratio errors.

  14. Pharyngitis – fatal infectious disease or medical error?

    Directory of Open Access Journals (Sweden)

    Marta Rorat

    2015-08-01

    Full Text Available Reporting on adverse events is essential to create a culture of safety, which focuses on protecting doctors and patients from medical errors. We present a fatal case of Streptococcus C pharyngitis in a 56-year-old man. The clinical course and the results of additional diagnostics and autopsy showed that sepsis followed by multiple organ failure was the ultimate cause of death. The clinical course appeared fatal due to a chain of adverse events, including errors made by the physicians caring for the patient for 10 days.

  15. Medication errors in pediatric inpatients

    DEFF Research Database (Denmark)

    Rishoej, Rikke Mie; Almarsdóttir, Anna Birna; Christesen, Henrik Thybo

    2017-01-01

    The aim was to describe medication errors (MEs) in hospitalized children reported to the national mandatory reporting and learning system, the Danish Patient Safety Database (DPSD). MEs were extracted from the DPSD for the 5-year period 2010–2014. We included reports from public hospitals on pati... safety in pediatric inpatients...

  16. Learner Corpora without Error Tagging

    Directory of Open Access Journals (Sweden)

    Rastelli, Stefano

    2009-01-01

    Full Text Available The article explores the possibility of adopting a form-to-function perspective when annotating learner corpora in order to gain deeper insights into systematic features of interlanguage. A split between forms and functions (or categories) is desirable in order to avoid the "comparative fallacy" and because – especially in basic varieties – forms may precede functions (e.g., what resembles a "noun" might have a different function, or a function may show up in unexpected forms). In the computer-aided error analysis tradition, all items produced by learners are traced to a grid of error tags which is based on the categories of the target language. By contrast, we believe it is possible to record and make retrievable both words and sequences of characters independently of their functional-grammatical label in the target language. For this purpose, at the University of Pavia we adapted a probabilistic POS tagger designed for L1 to L2 data. Despite the criticism that this operation can raise, we found that it is better to work with "virtual categories" rather than with errors. The article outlines the theoretical background of the project and shows some examples in which the potential of SLA-oriented (non-error-based) tagging is made clearer.

  17. Theory of Test Translation Error

    Science.gov (United States)

    Solano-Flores, Guillermo; Backhoff, Eduardo; Contreras-Nino, Luis Angel

    2009-01-01

    In this article, we present a theory of test translation whose intent is to provide the conceptual foundation for effective, systematic work in the process of test translation and test translation review. According to the theory, translation error is multidimensional; it is not simply the consequence of defective translation but an inevitable fact…

  18. and Correlated Error-Regressor

    African Journals Online (AJOL)

    Nekky Umera

    in queuing theory and econometrics, where the usual assumption of independent error terms may not be plausible in most cases. Also, when using time-series data on a number of micro-economic units, such as households and service oriented channels, where the stochastic disturbance terms in part reflect variables which ...

  19. Rank error-correcting pairs

    DEFF Research Database (Denmark)

    Martinez Peñas, Umberto; Pellikaan, Ruud

    2017-01-01

    Error-correcting pairs were introduced as a general method of decoding linear codes with respect to the Hamming metric using coordinatewise products of vectors, and are used for many well-known families of codes. In this paper, we define new types of vector products, extending the coordinatewise ...

  20. Clinical errors and medical negligence.

    Science.gov (United States)

    Oyebode, Femi

    2013-01-01

    This paper discusses the definition, nature and origins of clinical errors including their prevention. The relationship between clinical errors and medical negligence is examined as are the characteristics of litigants and events that are the source of litigation. The pattern of malpractice claims in different specialties and settings is examined. Among hospitalized patients worldwide, 3-16% suffer injury as a result of medical intervention, the most common being the adverse effects of drugs. The frequency of adverse drug effects appears superficially to be higher in intensive care units and emergency departments but once rates have been corrected for volume of patients, comorbidity of conditions and number of drugs prescribed, the difference is not significant. It is concluded that probably no more than 1 in 7 adverse events in medicine result in a malpractice claim and the factors that predict that a patient will resort to litigation include a prior poor relationship with the clinician and the feeling that the patient is not being kept informed. Methods for preventing clinical errors are still in their infancy. The most promising include new technologies such as electronic prescribing systems, diagnostic and clinical decision-making aids and error-resistant systems. Copyright © 2013 S. Karger AG, Basel.

  1. Finding errors in big data

    NARCIS (Netherlands)

    Puts, Marco; Daas, Piet; de Waal, A.G.

    No data source is perfect. Mistakes inevitably creep in. Spotting errors is hard enough when dealing with survey responses from several thousand people, but the difficulty is multiplied hugely when that mysterious beast Big Data comes into play. Statistics Netherlands is about to publish its first

  2. The Errors of Our Ways

    Science.gov (United States)

    Kane, Michael

    2011-01-01

    Errors don't exist in our data, but they serve a vital function. Reality is complicated, but our models need to be simple in order to be manageable. We assume that attributes are invariant over some conditions of observation, and once we do that we need some way of accounting for the variability in observed scores over these conditions of…

  3. Cascade Error Projection Learning Algorithm

    Science.gov (United States)

    Duong, T. A.; Stubberud, A. R.; Daud, T.

    1995-01-01

    A detailed mathematical analysis is presented for a new learning algorithm termed cascade error projection (CEP) and a general learning framework. This framework can be used to obtain the cascade correlation learning algorithm by choosing a particular set of parameters.

  4. Towards New Empirical Versions of Financial and Accounting Models Corrected for Measurement Errors

    OpenAIRE

    Francois-Éric Racicot; Raymond Théoret; Alain Coen

    2006-01-01

    In this paper, we propose a new empirical version of the Fama and French model based on the Hausman (1978) specification test and aimed at discarding measurement errors in the variables. The proposed empirical framework is general enough to be used for correcting other financial and accounting models of measurement errors. Removing measurement errors is important at many levels, such as information disclosure, corporate governance, and protection of investors.
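
    The bias such corrections address is easy to reproduce: classical measurement error in a regressor attenuates the OLS slope toward zero by the factor var(x)/(var(x) + var(noise)). A minimal simulation with invented parameter values:

        import numpy as np

        rng = np.random.default_rng(2)
        n, beta = 10_000, 1.5

        x_true = rng.normal(0, 1, n)
        y = beta * x_true + rng.normal(0, 0.5, n)   # true model
        x_obs = x_true + rng.normal(0, 1.0, n)      # regressor observed with error

        slope_clean = np.cov(x_true, y)[0, 1] / np.var(x_true, ddof=1)
        slope_noisy = np.cov(x_obs, y)[0, 1] / np.var(x_obs, ddof=1)
        print(f"OLS slope, clean regressor: {slope_clean:.2f}")   # ~ 1.50
        print(f"OLS slope, noisy regressor: {slope_noisy:.2f}")   # ~ 1.5/(1+1) = 0.75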

  5. Occupational inequalities in health expectancies in France in the early 2000s: Unequal chances of reaching and living retirement in good health

    Directory of Open Access Journals (Sweden)

    Emmanuelle Cambois

    2011-08-01

    Full Text Available Increasing life expectancy (LE) raises expectations for social participation at later ages. We computed health expectancies (HEs) to assess the (un)equal chances of social/work participation after age 50 in the context of France in 2003. We considered five HEs, covering various health situations which can jeopardize participation, and focused on both older ages and the pre-retirement period. HEs reveal large inequalities for both sexes in the chances of remaining healthy after retirement, and also of reaching retirement age in good health and without disability, especially in low-qualified occupations. These results challenge the policy expectation of an overall increase in social participation at later ages.

  6. CLIM : A cross-level workload-aware timing error prediction model for functional units

    NARCIS (Netherlands)

    Jiao, Xun; Rahimi, Abbas; Jiang, Yu; Wang, Jianguo; Fatemi, Hamed; De Gyvez, Jose Pineda; Gupta, Rajesh K.

    2018-01-01

    Timing errors that are caused by the timing violations of sensitized circuit paths, have emerged as an important threat to the reliability of synchronous digital circuits. To protect circuits from these timing errors, designers typically use a conservative timing margin, which leads to operational

  7. Error and its meaning in forensic science.

    Science.gov (United States)

    Christensen, Angi M; Crowder, Christian M; Ousley, Stephen D; Houck, Max M

    2014-01-01

    The discussion of "error" has gained momentum in forensic science in the wake of the Daubert guidelines and has intensified with the National Academy of Sciences' Report. Error has many different meanings, and too often, forensic practitioners themselves as well as the courts misunderstand scientific error and statistical error rates, often confusing them with practitioner error (or mistakes). Here, we present an overview of these concepts as they pertain to forensic science applications, discussing the difference between practitioner error (including mistakes), instrument error, statistical error, and method error. We urge forensic practitioners to ensure that potential sources of error and method limitations are understood and clearly communicated and advocate that the legal community be informed regarding the differences between interobserver errors, uncertainty, variation, and mistakes. © 2013 American Academy of Forensic Sciences.

  8. A methodology for translating positional error into measures of attribute error, and combining the two error sources

    Science.gov (United States)

    Yohay Carmel; Curtis Flather; Denis Dean

    2006-01-01

    This paper summarizes our efforts to investigate the nature, behavior, and implications of positional error and attribute error in spatiotemporal datasets. Estimating the combined influence of these errors on map analysis has been hindered by the fact that these two error types are traditionally expressed in different units (distance units, and categorical units,...

  9. Binary palmprint representation for feature template protection

    NARCIS (Netherlands)

    Mu, Meiru; Ruan, Qiuqi; Shao, X.; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.

    2012-01-01

    The major challenge of biometric template protection comes from the intraclass variations of biometric data. The helper data scheme aims to solve this problem by employing the Error Correction Codes (ECC). However, many reported biometric binary features from the same user reach bit error rate (BER)

  10. Augmented GNSS Differential Corrections Minimum Mean Square Error Estimation Sensitivity to Spatial Correlation Modeling Errors

    Directory of Open Access Journals (Sweden)

    Nazelie Kassabian

    2014-06-01

    Full Text Available Railway signaling is a safety system that has evolved over the last couple of centuries towards autonomous functionality. Recently, great effort has been devoted in this field towards the use and exploitation of Global Navigation Satellite System (GNSS) signals and GNSS augmentation systems, in view of lower railway track equipment and maintenance costs; this is a priority for sustaining the investments needed to modernize the local and regional lines, most of which lack automatic train protection systems and are still manually operated. The objective of this paper is to assess the sensitivity of the Linear Minimum Mean Square Error (LMMSE) algorithm to modeling errors in the spatial correlation function that characterizes true pseudorange Differential Corrections (DCs). This study is inspired by the railway application; however, it applies to all transportation systems, including the road sector, that need to be complemented by an augmentation system in order to deliver accurate and reliable positioning with integrity specifications. A vector of noisy pseudorange DC measurements is simulated, assuming a Gauss-Markov model with a decay rate parameter inversely proportional to the correlation distance that exists between two points of a certain environment. The LMMSE algorithm is applied to this vector to estimate the true DC, and the estimation error is compared to the noise added during simulation. The results show that for large enough values of the ratio between the correlation distance and the Reference Station (RS) separation distance, the LMMSE brings considerable advantage in terms of estimation error accuracy and precision. Conversely, the LMMSE algorithm may deteriorate the quality of the DC measurements whenever the ratio falls below a certain threshold.
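
    A compact sketch of the estimator studied, with invented correlation distance, noise level, and station geometry: pseudorange DCs drawn from an exponential (Gauss-Markov) spatial covariance, the LMMSE estimate x̂ = C(C + σ²I)⁻¹y, and a deliberately mismatched decay rate to mimic the sensitivity question.

        import numpy as np

        rng = np.random.default_rng(3)

        # Reference-station positions along a line (km) and correlation model.
        pos = np.array([0.0, 15.0, 40.0, 70.0, 120.0])
        d_corr = 50.0                              # assumed correlation distance (km)
        sigma_dc, sigma_n = 1.0, 0.3               # DC std dev and measurement noise (m)

        # Gauss-Markov (exponential) spatial covariance of the true DCs.
        dist = np.abs(pos[:, None] - pos[None, :])
        C = sigma_dc**2 * np.exp(-dist / d_corr)

        # Simulate true DCs and noisy measurements y = x + n.
        x = rng.multivariate_normal(np.zeros(len(pos)), C)
        y = x + rng.normal(0, sigma_n, len(pos))

        # LMMSE estimate with the correct covariance model.
        I = np.eye(len(pos))
        x_hat = C @ np.linalg.solve(C + sigma_n**2 * I, y)
        print("raw error   (m):", np.round(y - x, 3))
        print("LMMSE error (m):", np.round(x_hat - x, 3))

        # Sensitivity check: same estimator with a mismatched decay rate.
        C_bad = sigma_dc**2 * np.exp(-dist / (d_corr / 5))
        x_bad = C_bad @ np.linalg.solve(C_bad + sigma_n**2 * I, y)
        print("mismatched  (m):", np.round(x_bad - x, 3))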

  11. Environmental protection

    International Nuclear Information System (INIS)

    Klinda, J.; Lieskovska, Z.

    1998-01-01

    In this chapter, environmental protection in the Slovak Republic in 1997 is reviewed. The economics of environmental protection, the state budget, the Slovak state environmental fund, economic instruments, environmental laws, environmental impact assessment, environmental management systems, and environmental education are presented.

  12. Discretization vs. Rounding Error in Euler's Method

    Science.gov (United States)

    Borges, Carlos F.

    2011-01-01

    Euler's method for solving initial value problems is an excellent vehicle for observing the relationship between discretization error and rounding error in numerical computation. Reductions in stepsize, in order to decrease discretization error, necessarily increase the number of steps and so introduce additional rounding error. The problem is…
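
    The trade-off is easy to observe numerically. The sketch below (not from the article) integrates y' = y, y(0) = 1 to t = 1 with Euler's method in single precision, where rounding becomes visible at modest stepsizes: the total error first falls as h shrinks (discretization dominates), then rises again (rounding dominates).

        import numpy as np

        def euler_exp(h, dtype=np.float32):
            """Approximate y(1) for y' = y, y(0) = 1 using Euler's method."""
            n = int(round(1.0 / h))
            y = dtype(1.0)
            step = dtype(1.0) + dtype(h)    # Euler update: y <- y * (1 + h)
            for _ in range(n):
                y = y * step
            return float(y)

        exact = float(np.exp(1.0))
        for h in (1e-1, 1e-2, 1e-3, 1e-4, 1e-5, 1e-6):
            print(f"h = {h:.0e}   |error| = {abs(euler_exp(h) - exact):.3e}")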

  13. Total Survey Error for Longitudinal Surveys

    NARCIS (Netherlands)

    Lynn, Peter; Lugtig, P.J.

    2016-01-01

    This article describes the application of the total survey error paradigm to longitudinal surveys. Several aspects of survey error, and of the interactions between different types of error, are distinct in the longitudinal survey context. Furthermore, error trade-off decisions in survey design and

  14. Negligence, genuine error, and litigation

    Science.gov (United States)

    Sohn, David H

    2013-01-01

    Not all medical injuries are the result of negligence. In fact, most medical injuries are the result either of the inherent risk in the practice of medicine, or due to system errors, which cannot be prevented simply through fear of disciplinary action. This paper will discuss the differences between adverse events, negligence, and system errors; the current medical malpractice tort system in the United States; and review current and future solutions, including medical malpractice reform, alternative dispute resolution, health courts, and no-fault compensation systems. The current political environment favors investigation of non-cap tort reform remedies; investment into more rational oversight systems, such as health courts or no-fault systems may reap both quantitative and qualitative benefits for a less costly and safer health system. PMID:23426783

  15. Robot learning and error correction

    Science.gov (United States)

    Friedman, L.

    1977-01-01

    A model of robot learning is described that associates previously unknown perceptions with the sensed known consequences of robot actions. For these actions, both the categories of outcomes and the corresponding sensory patterns are incorporated in a knowledge base by the system designer. Thus the robot is able to predict the outcome of an action and compare the expectation with the experience. New knowledge about what to expect in the world may then be incorporated by the robot in a pre-existing structure whether it detects accordance or discrepancy between a predicted consequence and experience. Errors committed during plan execution are detected by the same type of comparison process and learning may be applied to avoiding the errors.

  16. Performance Analysis of Amplify-and-Forward Two-Way Relaying with Co-Channel Interference and Channel Estimation Error

    KAUST Repository

Yang, Liang

    2013-06-01

    In this paper, we consider the performance of a two-way amplify-and-forward relaying network (AF TWRN) in the presence of unequal power co-channel interferers (CCI). Specifically, we first consider AF TWRN with an interference-limited relay and two noisy-nodes with channel estimation errors and CCI. We derive the approximate signal-to-interference plus noise ratio expressions and then use them to evaluate the outage probability, error probability, and achievable rate. Subsequently, to investigate the joint effects of the channel estimation error and CCI on the system performance, we extend our analysis to a multiple-relay network and derive several asymptotic performance expressions. For comparison purposes, we also provide the analysis for the relay selection scheme under the total power constraint at the relays. For AF TWRN with channel estimation error and CCI, numerical results show that the performance of the relay selection scheme is not always better than that of the all-relay participating case. In particular, the relay selection scheme can improve the system performance in the case of high power levels at the sources and small powers at the relays.
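
    The kind of quantity analyzed above can be checked by Monte Carlo. The sketch below is deliberately much simpler than the paper's TWRN model: a single Rayleigh-faded link with unequal-power co-channel interferers and a channel-estimation-error term folded into the interference; all power levels are invented.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 1_000_000

        P_s = 10.0                          # signal power
        P_i = np.array([1.0, 0.5, 0.2])     # unequal co-channel interferer powers
        N0 = 0.1                            # noise power
        sigma_e2 = 0.05                     # channel estimation error variance
        gamma_th = 2.0                      # SINR outage threshold (~3 dB)

        # Rayleigh fading: |h|^2 is exponentially distributed with unit mean.
        sig = P_s * rng.exponential(1.0, n)
        interf = sum(p * rng.exponential(1.0, n) for p in P_i)
        est_err = P_s * sigma_e2            # estimation error acts as extra interference

        sinr = sig / (interf + est_err + N0)
        print(f"outage probability P(SINR < {gamma_th}): {np.mean(sinr < gamma_th):.4f}")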

  17. Error studies of Halbach Magnets

    Energy Technology Data Exchange (ETDEWEB)

    Brooks, S. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2017-03-02

    These error studies were done on the Halbach magnets for the CBETA “First Girder” as described in note [CBETA001]. The CBETA magnets have since changed slightly to the lattice in [CBETA009]. However, this is not a large enough change to significantly affect the results here. The QF and BD arc FFAG magnets are considered. For each assumed set of error distributions and each ideal magnet, 100 random magnets with errors are generated. These are then run through an automated version of the iron-wire multipole cancellation algorithm. The maximum wire diameter allowed is 0.063”, as in the proof-of-principle magnets. Initially, 32 wires (2 per Halbach wedge) are tried; if this does not achieve 1e-4 level accuracy in the simulation, 48 and then 64 wires are tried. By “1e-4 accuracy” is meant that the FOM defined by √(Σ_{n≥sextupole}(a_n² + b_n²)) is less than 1 unit, where the multipoles are taken at the maximum nominal beam radius, R = 23 mm for these magnets. The algorithm initially uses 20 convergence iterations. If 64 wires do not achieve 1e-4 accuracy, this is increased to 50 iterations to check for slowly converging cases. There are also classifications for magnets that do not achieve 1e-4 but do achieve 1e-3 (FOM ≤ 10 units). This is technically within the spec discussed in the Jan 30, 2017 review; however, there will be errors in practical shimming not dealt with in the simulation, so it is preferable to do much better than the spec in the simulation.
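
    The figure of merit quoted in the note reduces to a few lines of code. A minimal sketch, assuming the multipole coefficients have already been evaluated at R = 23 mm and expressed in units of 1e-4 of the main field (the coefficient values below are invented):

        import numpy as np

        def fom(a, b):
            """FOM = sqrt(sum over n >= sextupole of a_n^2 + b_n^2).

            a, b list skew/normal multipoles (a_1, a_2, a_3, ...) in units at the
            reference radius; dipole and quadrupole terms (n = 1, 2) are excluded."""
            a3 = np.asarray(a, float)[2:]
            b3 = np.asarray(b, float)[2:]
            return float(np.sqrt(np.sum(a3**2 + b3**2)))

        # Invented multipole content for one randomized magnet.
        a_n = [0.1, 0.0, 0.3, -0.2, 0.10, 0.05]
        b_n = [0.0, 0.0, 0.4,  0.1, -0.20, 0.02]
        print(f"FOM = {fom(a_n, b_n):.3f} units  (target: < 1 unit for 1e-4 accuracy)")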

  18. [Errors in laboratory daily practice].

    Science.gov (United States)

    Larrose, C; Le Carrer, D

    2007-01-01

    Legislation set by GBEA (Guide de bonne exécution des analyses) requires that, before performing analysis, laboratory directors check both the nature of the samples and the patients' identity. The data processing of requisition forms, which identifies key errors, was established in 2000 and in 2002 by the specialized biochemistry laboratory, with the contribution of the reception centre for biological samples. The laboratories follow strict acceptability criteria as a starting point at reception and then check requisition forms and biological samples. All errors are logged into the laboratory database, and analysis reports are sent to the care unit specifying the problems and the consequences they have on the analysis. The data are then assessed by the laboratory directors to produce monthly or annual statistical reports. These indicate the number of errors, which are then indexed to patient files to reveal the specific problem areas, thereby allowing the laboratory directors to instruct the nurses and enable corrective action.

  19. Technical errors in MR arthrography

    International Nuclear Information System (INIS)

    Hodler, Juerg

    2008-01-01

    This article discusses potential technical problems of MR arthrography. It starts with contraindications, followed by problems relating to injection technique, contrast material, and MR imaging technique. For some of the aspects discussed, there is little published evidence. Therefore, the article is based on the personal experience of the author and on local standards of procedure. Such standards, as well as medico-legal considerations, may vary from country to country. Contraindications for MR arthrography include pre-existing infection, reflex sympathetic dystrophy, and possibly bleeding disorders, avascular necrosis, and known allergy to contrast media. Errors in injection technique may lead to extra-articular collection of contrast agent or to contrast agent leaking from the joint space, which may cause diagnostic difficulties. Incorrect concentrations of contrast material influence image quality and may also lead to non-diagnostic examinations. Errors relating to MR imaging include delays between injection and imaging and inadequate choice of sequences. Potential solutions to the various possible errors are presented. (orig.)

  20. Technical errors in MR arthrography

    Energy Technology Data Exchange (ETDEWEB)

    Hodler, Juerg [Orthopaedic University Hospital of Balgrist, Radiology, Zurich (Switzerland)

    2008-01-15

    This article discusses potential technical problems of MR arthrography. It starts with contraindications, followed by problems relating to injection technique, contrast material, and MR imaging technique. For some of the aspects discussed, there is little published evidence. Therefore, the article is based on the personal experience of the author and on local standards of procedure. Such standards, as well as medico-legal considerations, may vary from country to country. Contraindications for MR arthrography include pre-existing infection, reflex sympathetic dystrophy, and possibly bleeding disorders, avascular necrosis, and known allergy to contrast media. Errors in injection technique may lead to extra-articular collection of contrast agent or to contrast agent leaking from the joint space, which may cause diagnostic difficulties. Incorrect concentrations of contrast material influence image quality and may also lead to non-diagnostic examinations. Errors relating to MR imaging include delays between injection and imaging and inadequate choice of sequences. Potential solutions to the various possible errors are presented. (orig.)

  1. Autonomous Quantum Error Correction with Application to Quantum Metrology

    Science.gov (United States)

    Reiter, Florentin; Sorensen, Anders S.; Zoller, Peter; Muschik, Christine A.

    2017-04-01

    We present a quantum error correction scheme that stabilizes a qubit by coupling it to an engineered environment which protects it against spin flips or phase flips. Our scheme uses always-on couplings that run continuously in time and operates in a fully autonomous fashion, without the need to perform measurements or feedback operations on the system. The correction of errors takes place entirely at the microscopic level through a built-in feedback mechanism. Our dissipative error correction scheme can be implemented in a system of trapped ions and can be used for improving high-precision sensing. We show that the enhanced coherence time that results from the coupling to the engineered environment translates into a significantly enhanced precision for measuring weak fields. In a broader context, this work constitutes a stepping stone towards the paradigm of self-correcting quantum information processing.

  2. Secure and Reliable IPTV Multimedia Transmission Using Forward Error Correction

    Directory of Open Access Journals (Sweden)

    Chi-Huang Shih

    2012-01-01

    Full Text Available With the wide deployment of Internet Protocol (IP) infrastructure and the rapid development of digital technologies, Internet Protocol Television (IPTV) has emerged as one of the major multimedia access techniques. A general IPTV transmission system employs both encryption and forward error correction (FEC) to provide the authorized subscriber with a high-quality perceptual experience. This two-layer processing, however, complicates the system design in terms of computational cost and management cost. In this paper, we propose a novel FEC scheme to ensure secure and reliable transmission of IPTV multimedia content and services. The proposed secure FEC utilizes the characteristics of FEC, including the FEC-encoded redundancies and the limitation of the error correction capacity, to protect the multimedia packets against malicious attacks and data transmission errors/losses. Experimental results demonstrate that the proposed scheme achieves performance similar to that of the joint encryption and FEC scheme.
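
    The FEC building block the scheme relies on can be illustrated with a single XOR parity packet; the keying that makes the proposed FEC "secure" is omitted here, and the packet contents are illustrative:

      def xor_parity(packets):
          """Generate one parity packet as the bytewise XOR of equal-length packets."""
          parity = bytearray(len(packets[0]))
          for pkt in packets:
              for i, byte in enumerate(pkt):
                  parity[i] ^= byte
          return bytes(parity)

      def recover(received, parity):
          """Recover a single lost packet (marked None) from the parity packet."""
          missing = [i for i, pkt in enumerate(received) if pkt is None]
          assert len(missing) == 1, "XOR parity corrects exactly one loss"
          rest = [pkt for pkt in received if pkt is not None] + [parity]
          return xor_parity(rest)

      block = [b"IPTV", b"data", b"pkts"]
      par = xor_parity(block)
      print(recover([b"IPTV", None, b"pkts"], par))  # b'data'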

  3. #2 - An Empirical Assessment of Exposure Measurement Error ...

    Science.gov (United States)

    Background: • Differing degrees of exposure error across pollutants • Previous focus on quantifying and accounting for exposure error in single-pollutant models • Examine exposure errors for multiple pollutants and provide insights on the potential for bias and attenuation of effect estimates in single- and bi-pollutant epidemiological models. The National Exposure Research Laboratory (NERL) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source to contact with humans. Our multidisciplinary research program produces Methods, Measurements, and Models to identify relationships between and characterize processes that link source emissions, environmental concentrations, human exposures, and target-tissue dose. The impact of these tools is improved regulatory programs and policies for EPA.

  4. Clock error models for simulation and estimation

    International Nuclear Information System (INIS)

    Meditch, J.S.

    1981-10-01

    Mathematical models for the simulation and estimation of errors in precision oscillators used as time references in satellite navigation systems are developed. The results, based on all currently known oscillator error sources, are directly implementable on a digital computer. The simulation formulation is sufficiently flexible to allow for the inclusion or exclusion of individual error sources as desired. The estimation algorithms, following from Kalman filter theory, provide directly for the error analysis of clock errors in both filtering and prediction
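
    The flavor of such a simulation can be conveyed with the common two-state clock model, in which a phase error integrates a random-walk frequency error; the noise intensities below are illustrative placeholders, not values from the report. Individual error sources can be included or excluded by zeroing the corresponding noise terms:

      import random

      def simulate_clock(steps=1000, dt=1.0, q_phase=1e-22, q_freq=1e-26):
          """Simulate a two-state oscillator error model: phase error x
          accumulates frequency error y; both receive white noise."""
          x, y = 0.0, 0.0
          history = []
          for _ in range(steps):
              y += random.gauss(0.0, (q_freq * dt) ** 0.5)  # frequency random walk
              x += y * dt + random.gauss(0.0, (q_phase * dt) ** 0.5)  # phase error
              history.append(x)
          return history

      phase_errors = simulate_clock()
      print(f"final phase error: {phase_errors[-1]:.3e} s")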

  5. Effects of errors and gaps in spatial data sets on assessment of conservation progress.

    Science.gov (United States)

    Visconti, P; Di Marco, M; Álvarez-Romero, J G; Januchowski-Hartley, S R; Pressey, R L; Weeks, R; Rondinini, C

    2013-10-01

    Data on the location and extent of protected areas, ecosystems, and species' distributions are essential for determining gaps in biodiversity protection and identifying future conservation priorities. However, these data sets always come with errors in the maps and associated metadata. Errors are often overlooked in conservation studies, despite their potential negative effects on the reported extent of protection of species and ecosystems. We used 3 case studies to illustrate the implications of 3 sources of errors in reporting progress toward conservation objectives: protected areas with unknown boundaries that are replaced by buffered centroids, propagation of multiple errors in spatial data, and incomplete protected-area data sets. As of 2010, the frequency of protected areas with unknown boundaries in the World Database on Protected Areas (WDPA) caused the estimated extent of protection of 37.1% of the terrestrial Neotropical mammals to be overestimated by an average 402.8% and of 62.6% of species to be underestimated by an average 10.9%. Estimated level of protection of the world's coral reefs was 25% higher when using recent finer-resolution data on coral reefs as opposed to globally available coarse-resolution data. Accounting for additional data sets not yet incorporated into WDPA contributed up to 6.7% of additional protection to marine ecosystems in the Philippines. We suggest ways for data providers to reduce the errors in spatial and ancillary data and ways for data users to mitigate the effects of these errors on biodiversity assessments. © 2013 Society for Conservation Biology.

  6. Protective relay

    International Nuclear Information System (INIS)

    Lim, Mu Ji; Jung, Hae Sang

    1974-10-01

    This book is divided into two chapters dealing with protective relays. The first chapter covers basic knowledge of relays: the development of relays, classification of protective relays, ratings of protective relays, the general structure of protective relays, detection for ground protection, points of contact, operating relays, and trip relaying. The second chapter covers the structure and explanation of relays classified by structure, such as the motor type and moving-coil type, and explains other relays: the overcurrent relay, overvoltage relay, undervoltage relay, power relay, directional relay, and tests of the overvoltage relay, undervoltage relay, and directional circuit relay.

  7. Protecting knowledge

    DEFF Research Database (Denmark)

    Sofka, Wolfgang; de Faria, Pedro; Shehu, Edlira

    2018-01-01

    Most firms use secrecy to protect their knowledge from potential imitators. However, the theoretical foundations for secrecy have not been well explored. We extend the knowledge protection literature and propose theoretical mechanisms explaining how information visibility influences the importance … of secrecy as a knowledge protection instrument. Building on mechanisms from information economics and signaling theory, we postulate that secrecy is more important for protecting knowledge for firms that have legal requirements to reveal information to shareholders. Furthermore, we argue that this effect … and a firm's investment in fixed assets. Our findings inform both academics and managers on how firms balance information disclosure requirements with the use of secrecy as a knowledge protection instrument …

  8. IPTV multicast with peer-assisted lossy error control

    Science.gov (United States)

    Li, Zhi; Zhu, Xiaoqing; Begen, Ali C.; Girod, Bernd

    2010-07-01

    Emerging IPTV technology uses source-specific IP multicast to deliver television programs to end-users. To provide reliable IPTV services over the error-prone DSL access networks, a combination of multicast forward error correction (FEC) and unicast retransmissions is employed to mitigate the impulse noise in DSL links. In existing systems, the retransmission function is provided by the Retransmission Servers sitting at the edge of the core network. In this work, we propose an alternative distributed solution where the burden of packet loss repair is partially shifted to the peer IP set-top boxes. Through the Peer-Assisted Repair (PAR) protocol, we demonstrate how the packet repairs can be delivered in a timely, reliable, and decentralized manner using the combination of server-peer coordination and redundancy of repairs. We also show that this distributed protocol can be seamlessly integrated with an application-layer source-aware error protection mechanism called forward and retransmitted Systematic Lossy Error Protection (SLEP/SLEPr). Simulations show that this joint PAR-SLEP/SLEPr framework not only effectively mitigates the bottleneck experienced by the Retransmission Servers, thus greatly enhancing the scalability of the system, but also efficiently improves the resistance to impulse noise.

  9. Design of nanophotonic circuits for autonomous subsystem quantum error correction

    Energy Technology Data Exchange (ETDEWEB)

    Kerckhoff, J; Pavlichin, D S; Chalabi, H; Mabuchi, H, E-mail: jkerc@stanford.edu [Edward L Ginzton Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2011-05-15

    We reapply our approach to designing nanophotonic quantum memories in order to formulate an optical network that autonomously protects a single logical qubit against arbitrary single-qubit errors. Emulating the nine-qubit Bacon-Shor subsystem code, the network replaces the traditionally discrete syndrome measurement and correction steps by continuous, time-independent optical interactions and coherent feedback of unitarily processed optical fields.

  10. Righting errors in writing errors: the Wing and Baddeley (1980) spelling error corpus revisited.

    Science.gov (United States)

    Wing, Alan M; Baddeley, Alan D

    2009-03-01

    We present a new analysis of our previously published corpus of handwriting errors (slips) using the proportional allocation algorithm of Machtynger and Shallice (2009). As before, the proportion of slips is greater in the middle of the word than at the ends; however, in contrast to the earlier analysis, the proportion is greater at the end than at the beginning of the word. The findings are consistent with the hypothesis of memory effects in a graphemic output buffer.

  11. Error-transparent evolution: the ability of multi-body interactions to bypass decoherence

    International Nuclear Information System (INIS)

    Vy, Os; Jacobs, Kurt; Wang Xiaoting

    2013-01-01

    We observe that multi-body interactions, unlike two-body interactions, can implement any unitary operation on an encoded system in such a way that the evolution is uninterrupted by noise that the encoding is designed to protect against. Such ‘error-transparent’ evolution is distinct from that usually considered in quantum computing, as the latter is merely correctable. We prove that the minimum body-ness required to protect (i) a qubit from a single type of Pauli error, (ii) a target qubit from a controller with such errors and (iii) a single qubit from all errors is three-body, four-body and five-body, respectively. We also discuss applications to computing, coherent feedback control and quantum metrology. Finally, we evaluate the performance of error-transparent evolution for some examples using numerical simulations. (paper)

  12. An adaptive orienting theory of error processing.

    Science.gov (United States)

    Wessel, Jan R

    2018-03-01

    The ability to detect and correct action errors is paramount to safe and efficient goal-directed behaviors. Existing work on the neural underpinnings of error processing and post-error behavioral adaptations has led to the development of several mechanistic theories of error processing. These theories can be roughly grouped into adaptive and maladaptive theories. While adaptive theories propose that errors trigger a cascade of processes that will result in improved behavior after error commission, maladaptive theories hold that error commission momentarily impairs behavior. Neither group of theories can account for all available data, as different empirical studies find both impaired and improved post-error behavior. This article attempts a synthesis between the predictions made by prominent adaptive and maladaptive theories. Specifically, it is proposed that errors invoke a nonspecific cascade of processing that will rapidly interrupt and inhibit ongoing behavior and cognition, as well as orient attention toward the source of the error. It is proposed that this cascade follows all unexpected action outcomes, not just errors. In the case of errors, this cascade is followed by error-specific, controlled processing, which is specifically aimed at (re)tuning the existing task set. This theory combines existing predictions from maladaptive orienting and bottleneck theories with specific neural mechanisms from the wider field of cognitive control, including from error-specific theories of adaptive post-error processing. The article aims to describe the proposed framework and its implications for post-error slowing and post-error accuracy, propose mechanistic neural circuitry for post-error processing, and derive specific hypotheses for future empirical investigations. © 2017 Society for Psychophysiological Research.

  13. WACC: Definition, misconceptions and errors

    OpenAIRE

    Fernandez, Pablo

    2011-01-01

    The WACC is just the rate at which the Free Cash Flows must be discounted to obtain the same result as in the valuation using Equity Cash Flows discounted at the required return to equity (Ke). The WACC is neither a cost nor a required return: it is a weighted average of a cost and a required return. To refer to the WACC as the "cost of capital" may be misleading because it is not a cost. The paper includes 7 errors due to not remembering the definition of WACC and shows the relationship betwe...
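
    A worked example makes the definition concrete; the capital structure, rates, and tax rate below are illustrative:

      def wacc(equity, debt, ke, kd, tax_rate):
          """Weighted average of the required return to equity (Ke) and the
          after-tax cost of debt (Kd), weighted by market values."""
          total = equity + debt
          return (equity / total) * ke + (debt / total) * kd * (1 - tax_rate)

      # Illustrative: 600 equity at Ke = 10%, 400 debt at Kd = 5%, 30% tax rate.
      print(f"WACC = {wacc(600, 400, 0.10, 0.05, 0.30):.2%}")  # WACC = 7.40%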

  14. Wavefront error sensing for LDR

    Science.gov (United States)

    Tubbs, Eldred F.; Glavich, T. A.

    1988-01-01

    Wavefront sensing is a significant aspect of the LDR control problem and requires attention at an early stage of the control system definition and design. A combination of a Hartmann test for wavefront slope measurement and an interference test for piston errors of the segments was examined and is presented as a point of departure for further discussion. The assumption is made that the wavefront sensor will be used for initial alignment and periodic alignment checks but that it will not be used during scientific observations. The Hartmann test and the interferometric test are briefly examined.
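
    The Hartmann measurement reduces to converting spot displacements behind each subaperture into local wavefront slopes. A minimal sketch, in which the lenslet focal length and detector pixel pitch are assumed values:

      def hartmann_slopes(spot_shifts_px, focal_length_m=0.01, pixel_pitch_m=5e-6):
          """Convert measured spot displacements (pixels) behind each subaperture
          into local wavefront slopes (radians): slope = displacement / focal length."""
          return [shift * pixel_pitch_m / focal_length_m for shift in spot_shifts_px]

      # Spot shifts measured for four subapertures (illustrative values):
      print(hartmann_slopes([0.8, -1.2, 0.1, 2.5]))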

  15. Human decision error (HUMDEE) trees

    International Nuclear Information System (INIS)

    Ostrom, L.T.

    1993-01-01

    Graphical presentations of human actions in incident and accident sequences have been used for many years. However, for the most part, human decision making has been underrepresented in these trees. This paper presents a method of incorporating the human decision process into graphical presentations of incident/accident sequences. This presentation is in the form of logic trees, called Human Decision Error Trees, or HUMDEE for short. The primary benefit of HUMDEE trees is that they graphically illustrate what else the individuals involved in the event could have done to prevent either the initiation or continuation of the event. HUMDEE trees also present the alternate paths available at the operator decision points in the incident/accident sequence. This is different from the Technique for Human Error Rate Prediction (THERP) event trees. There are many uses of these trees. They can be used for incident/accident investigations to show what other courses of action were available and for training operators. The trees also have a consequence component, so that not only the decision but also the consequence of that decision can be explored.

  16. Apology for errors: whose responsibility?

    Science.gov (United States)

    Leape, Lucian L

    2012-01-01

    When things go wrong during a medical procedure, patients' expectations are fairly straightforward: They expect an explanation of what happened, an apology if an error was made, and assurance that something will be done to prevent it from happening to another patient. Patients have a right to full disclosure; it is also therapeutic in relieving their anxiety. But if they have been harmed by our mistake, they also need an apology to maintain trust. Apology conveys respect, mutual suffering, and responsibility. Meaningful apology requires that the patient's physician and the institution both take responsibility, show remorse, and make amends. As the patient's advocate, the physician must play the lead role. However, as custodian of the systems, the hospital has primary responsibility for the mishap, for preventing that error in the future, and for compensation. The responsibility for making all this happen rests with the CEO. The hospital must have policies and practices that ensure that every injured patient is treated the way we would want to be treated ourselves--openly, honestly, with compassion, and, when indicated, with an apology and compensation. To make that happen, hospitals need to greatly expand training of physicians and others, and develop support programs for patients and caregivers.

  17. Error exponents for entanglement concentration

    International Nuclear Information System (INIS)

    Hayashi, Masahito; Koashi, Masato; Matsumoto, Keiji; Morikoshi, Fumiaki; Winter, Andreas

    2003-01-01

    Consider entanglement concentration schemes that convert n identical copies of a pure state into a maximally entangled state of a desired size with success probability being close to one in the asymptotic limit. We give the distillable entanglement, the number of Bell pairs distilled per copy, as a function of an error exponent, which represents the rate of decrease in failure probability as n tends to infinity. The formula fills the gap between the least upper bound of distillable entanglement in probabilistic concentration, which is the well-known entropy of entanglement, and the maximum attained in deterministic concentration. The method of types in information theory enables the detailed analysis of the distillable entanglement in terms of the error rate. In addition to the probabilistic argument, we consider another type of entanglement concentration scheme, where the initial state is deterministically transformed into a (possibly mixed) final state whose fidelity to a maximally entangled state of a desired size converges to one in the asymptotic limit. We show that the same formula as in the probabilistic argument is valid for the argument on fidelity by replacing the success probability with the fidelity. Furthermore, we also discuss entanglement yield when optimal success probability or optimal fidelity converges to zero in the asymptotic limit (strong converse), and give the explicit formulae for those cases

  18. Critiques of World-Systems Analysis and Alternatives: Unequal Exchange and Three Forms of Class and Struggle in the Japan–US Silk Network, 1880–1890

    Directory of Open Access Journals (Sweden)

    Elson E. Boles

    2015-08-01

    Full Text Available Sympathetic critics of world-system analysis contend that its systemic level of abstraction results in one-sided generalizations of systemic change. Unequal exchange theory and commodity chain analysis similarly reduce distinct and historical forms of labor and their interrelationships to common functional and ahistorical essences. This paper applies an incorporated comparisons method to give historical content to an understanding of unequal exchange and global inequality through a study of the Japan–US silk network’s formation and change during the mid-1880s–1890s. Analysis of unequal exchange processes requires, in this case, an examination of the mutual integration and transformation of distinct labor and value forms—peasant sericulture, filature wage-labor, and industrial silk factory wage-labor—and the infundibular market forces they structured. These relations were decisively conditioned by new landlordism and debt-peonage, class-patriarchy, state mediations, migration, and by peasant and worker struggles against deteriorating conditions. Indeed, the transitional nature of the silk network’s formation, which concluded the Tokugawa system and decisively contributed to Japan’s emergence as a nation-state of the capitalist world-economy, was signified by the very last millenarian and quasi-modern peasant uprising in 1884 among indebted sericulturists, the very first recorded factory strikes in 1885–86, by women raw silk reelers in Kōfu, and by strikes among unionizing workers in patriarchal and mechanized silk factories in Paterson, New Jersey, 1885–86 (Boles 1996, 1998). The “local” conditions of each conflict were molded by the interdependence of those conditions that constituted a formative part of the world-system and its development. In the face of struggles and intensifying world-market competition, Japanese and US manufacturers took opposite spatial strategies of regional expansion to overcome the structural constraints of

  19. Measurement error models with interactions

    Science.gov (United States)

    Midthune, Douglas; Carroll, Raymond J.; Freedman, Laurence S.; Kipnis, Victor

    2016-01-01

    An important use of measurement error models is to correct regression models for bias due to covariate measurement error. Most measurement error models assume that the observed error-prone covariate (W) is a linear function of the unobserved true covariate (X) plus other covariates (Z) in the regression model. In this paper, we consider models for W that include interactions between X and Z. We derive the conditional distribution of

  20. Game Design Principles based on Human Error

    Directory of Open Access Journals (Sweden)

    Guilherme Zaffari

    2016-03-01

    Full Text Available This paper displays the result of the authors’ research regarding the incorporation of Human Error, through design principles, into video game design. In a general way, designers must consider Human Error factors throughout video game interface development; however, when it comes to core design, adaptations are needed, since challenge is an important factor for fun, and under the perspective of Human Error, challenge can be considered a flaw in the system. The research utilized Human Error classifications, data triangulation via predictive human error analysis, and the expanded flow theory to allow the design of a set of principles in order to match the design of playful challenges with the principles of Human Error. From the results, it was possible to conclude that the application of Human Error in game design has a positive effect on player experience, allowing it to interact only with errors associated with the intended aesthetics of the game.

  1. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  2. An Error Analysis on TFL Learners’ Writings

    Directory of Open Access Journals (Sweden)

    Arif ÇERÇİ

    2016-12-01

    Full Text Available The main purpose of the present study is to identify and represent TFL learners’ writing errors through error analysis. All the learners started learning Turkish as a foreign language at the A1 (beginner) level and completed the process by obtaining the C1 (advanced) certificate at TÖMER at Gaziantep University. The data of the present study were collected from 14 students’ writings in proficiency exams for each level. The data were grouped as grammatical, syntactic, spelling, punctuation, and word choice errors. The ratio and categorical distributions of the identified errors were analyzed through error analysis. The data were analyzed through statistical procedures in an effort to determine whether error types differ according to the levels of the students. The errors in this study are limited to linguistic and intralingual developmental errors

  3. Field errors in hybrid insertion devices

    International Nuclear Information System (INIS)

    Schlueter, R.D.

    1995-02-01

    Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed

  4. Field errors in hybrid insertion devices

    Energy Technology Data Exchange (ETDEWEB)

    Schlueter, R.D. [Lawrence Berkeley Lab., CA (United States)

    1995-02-01

    Hybrid magnet theory as applied to the error analyses used in the design of Advanced Light Source (ALS) insertion devices is reviewed. Sources of field errors in hybrid insertion devices are discussed.

  5. Error Covariance Estimation of Mesoscale Data Assimilation

    National Research Council Canada - National Science Library

    Xu, Qin

    2005-01-01

    The goal of this project is to explore and develop new methods of error covariance estimation that will provide necessary statistical descriptions of prediction and observation errors for mesoscale data assimilation...

  6. The interaction of human population, food production, and biodiversity protection.

    Science.gov (United States)

    Crist, Eileen; Mora, Camilo; Engelman, Robert

    2017-04-21

    Research suggests that the scale of human population and the current pace of its growth contribute substantially to the loss of biological diversity. Although technological change and unequal consumption inextricably mingle with demographic impacts on the environment, the needs of all human beings, especially for food, imply that projected population growth will undermine protection of the natural world. Numerous solutions have been proposed to boost food production while protecting biodiversity, but alone these proposals are unlikely to staunch biodiversity loss. An important approach to sustaining biodiversity and human well-being is through actions that can slow and eventually reverse population growth: investing in universal access to reproductive health services and contraceptive technologies, advancing women's education, and achieving gender equality. Copyright © 2017, American Association for the Advancement of Science.

  7. Spectrum of diagnostic errors in radiology

    OpenAIRE

    Pinto, Antonio; Brunese, Luca

    2010-01-01

    Diagnostic errors are important in all branches of medicine because they are an indication of poor patient care. Since the early 1970s, physicians have been subjected to an increasing number of medical malpractice claims. Radiology is one of the specialties most liable to claims of medical negligence. Most often, a plaintiff’s complaint against a radiologist will focus on a failure to diagnose. The etiology of radiological error is multi-factorial. Errors fall into recurrent patterns. Errors ...

  8. Improving Type Error Messages in OCaml

    OpenAIRE

    Charguéraud , Arthur

    2015-01-01

    Cryptic type error messages are a major obstacle to learning OCaml or other ML-based languages. In many cases, error messages cannot be interpreted without a sufficiently-precise model of the type inference algorithm. The problem of improving type error messages in ML has received quite a bit of attention over the past two decades, and many different strategies have been considered. The challenge is not only to produce error messages that are both sufficiently concise ...

  9. Different grades MEMS accelerometers error characteristics

    Science.gov (United States)

    Pachwicewicz, M.; Weremczuk, J.

    2017-08-01

    The paper presents calibration effects of two MEMS accelerometers of different price and quality grades and discusses the types of errors exhibited by each. Calibration for error determination is performed using reference centrifuge measurements. The design and measurement errors of the centrifuge are discussed as well. It is shown that the error characteristics of the sensors are very different, and that the simple calibration methods presented in the literature cannot be used in both cases.
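
    The centrifuge calibration described above amounts to fitting an error model to reference accelerations. A least-squares sketch for a simple bias-plus-scale-factor model, using illustrative data rather than the paper's measurements:

      # Fit a_meas = (1 + s) * a_ref + b by ordinary least squares,
      # where s is the scale-factor error and b the bias.
      def fit_bias_scale(a_ref, a_meas):
          n = len(a_ref)
          mx = sum(a_ref) / n
          my = sum(a_meas) / n
          sxx = sum((x - mx) ** 2 for x in a_ref)
          sxy = sum((x - mx) * (y - my) for x, y in zip(a_ref, a_meas))
          slope = sxy / sxx  # equals 1 + s
          bias = my - slope * mx
          return slope - 1.0, bias

      # Centrifuge reference points (g) vs. sensor readings (g), illustrative:
      a_ref = [0.0, 1.0, 2.0, 4.0]
      a_meas = [0.02, 1.05, 2.08, 4.14]
      s, b = fit_bias_scale(a_ref, a_meas)
      print(f"scale-factor error = {s:.4f}, bias = {b:.4f} g")  # 0.0300, 0.0200 g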

  10. Naming game with learning errors in communications

    OpenAIRE

    Lou, Yang; Chen, Guanrong

    2014-01-01

    Naming game simulates the process of naming an object by a population of agents organized in a certain communication network topology. By pairwise iterative interactions, the population reaches a consensus state asymptotically. In this paper, we study the naming game with communication errors during pairwise conversations, where errors are represented by error rates drawn from a uniform probability distribution. First, a model of naming game with learning errors in communications (NGLE) is proposed…
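
    The baseline dynamics are straightforward to reproduce. The sketch below runs a fully connected naming game in which, with a fixed error rate, the hearer mislearns an uttered name as a brand-new one; this is a simplified stand-in for the error model studied in the paper:

      import random

      def naming_game(n_agents=50, error_rate=0.05, rounds=20_000):
          """Pairwise naming game: a speaker utters a name; with probability
          error_rate the hearer mislearns it as a brand-new name."""
          vocab = [set() for _ in range(n_agents)]
          next_name = 0
          for _ in range(rounds):
              s, h = random.sample(range(n_agents), 2)
              if not vocab[s]:
                  vocab[s].add(next_name)
                  next_name += 1
              name = random.choice(tuple(vocab[s]))
              if random.random() < error_rate:  # communication error
                  vocab[h].add(next_name)
                  next_name += 1
              elif name in vocab[h]:            # success: both collapse to the name
                  vocab[s] = {name}
                  vocab[h] = {name}
              else:
                  vocab[h].add(name)
          return sum(len(v) for v in vocab) / n_agents

      # Mean vocabulary size stays near 1 for small error rates (near-consensus).
      print(naming_game())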

  11. Radiation Protection

    International Nuclear Information System (INIS)

    Loos, M.

    2002-01-01

    Major achievements of SCK-CEN's Radiation Protection Department in 2001 are described. The main areas for R and D of the department are environmental remediation, emergency planning, radiation protection research, low-level radioactivity measurements, safeguards and physics measurements, decision strategy research and policy support, and social sciences in nuclear research. Main achievements for 2001 in these areas are reported

  12. Sun protection

    Science.gov (United States)

    ... sun exposure. The start of summer is when UV rays can cause the most skin damage. Use sun protection, even on cloudy days. Clouds and haze don't protect you from the sun. Avoid surfaces that reflect light, such as water, sand, concrete, snow, and areas ...

  13. How Do Simulated Error Experiences Impact Attitudes Related to Error Prevention?

    Science.gov (United States)

    Breitkreuz, Karen R; Dougal, Renae L; Wright, Melanie C

    2016-10-01

    The objective of this project was to determine whether simulated exposure to error situations changes attitudes in a way that may have a positive impact on error prevention behaviors. Using a stratified quasi-randomized experiment design, we compared risk perception attitudes of a control group of nursing students who received standard error education (reviewed medication error content and watched movies about error experiences) to an experimental group of students who reviewed medication error content and participated in simulated error experiences. Dependent measures included perceived memorability of the educational experience, perceived frequency of errors, and perceived caution with respect to preventing errors. Experienced nursing students perceived the simulated error experiences to be more memorable than movies. Less experienced students perceived both simulated error experiences and movies to be highly memorable. After the intervention, compared with movie participants, simulation participants believed errors occurred more frequently. Both types of education increased the participants' intentions to be more cautious and reported caution remained higher than baseline for medication errors 6 months after the intervention. This study provides limited evidence of an advantage of simulation over watching movies describing actual errors with respect to manipulating attitudes related to error prevention. Both interventions resulted in long-term impacts on perceived caution in medication administration. Simulated error experiences made participants more aware of how easily errors can occur, and the movie education made participants more aware of the devastating consequences of errors.

  14. Green” Technology and Ecologically Unequal Exchange: The Environmental and Social Consequences of Ecological Modernization in the World-System

    Directory of Open Access Journals (Sweden)

    Eric Bonds

    2015-08-01

    Full Text Available This paper contributes to understandings of ecologically unequal exchange within the world-systems perspective by offering a series of case studies of ecological modernization in the automobile industry. The case studies demonstrate that “green” technologies developed and instituted in core nations often require specific raw materials that are extracted from the periphery and semi-periphery. Extraction of such natural resources causes significant environmental degradation and often displaces entire communities from their land. Moreover, because states often use violence and repression to facilitate raw material extraction, the widespread commercialization of “green” technologies can result in serious human rights violations. These findings challenge ecological modernization theory, which rests on the assumption that the development and commercialization of more ecologically-efficient technologies is universally beneficial.

  15. Interpreting the change detection error matrix

    NARCIS (Netherlands)

    Oort, van P.A.J.

    2007-01-01

    Two different matrices are commonly reported in assessments of change detection accuracy: (1) single-date error matrices and (2) binary change/no-change error matrices. The third, less common form of reporting is the transition error matrix. This paper discusses the relation between these matrices.
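
    A toy computation makes the relation between the matrices concrete; the class labels and pixel counts are illustrative:

      from collections import Counter

      # Per-pixel (date-1 class, date-2 class) labels: reference vs. map.
      ref = [("forest", "forest"), ("forest", "urban"), ("water", "water"),
             ("forest", "urban"), ("urban", "urban"), ("forest", "forest")]
      mp = [("forest", "forest"), ("forest", "forest"), ("water", "water"),
            ("forest", "urban"), ("urban", "urban"), ("forest", "urban")]

      # Binary change/no-change matrix: collapse each pair to changed or unchanged.
      binary = Counter(((r1 != r2), (m1 != m2)) for (r1, r2), (m1, m2) in zip(ref, mp))
      print("binary matrix:", dict(binary))

      # Transition matrix: cross-tabulate the full (from, to) class transitions.
      transition = Counter((r, m) for r, m in zip(ref, mp))
      print("transition matrix:", dict(transition))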

  16. Human Errors and Bridge Management Systems

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, A. S.

    … on the basis of reliability profiles for bridges without human errors are extended to include bridges with human errors. The first rehabilitation distributions for bridges without and with human errors are combined into a joint first rehabilitation distribution. The methodology presented is illustrated … for reinforced concrete bridges.

  17. Error Analysis in Mathematics. Technical Report #1012

    Science.gov (United States)

    Lai, Cheng-Fei

    2012-01-01

    Error analysis is a method commonly used to identify the cause of student errors when they make consistent mistakes. It is a process of reviewing a student's work and then looking for patterns of misunderstanding. Errors in mathematics can be factual, procedural, or conceptual, and may occur for a number of reasons. Reasons why students make…

  18. On-Error Training (Book Excerpt).

    Science.gov (United States)

    Fukuda, Ryuji

    1985-01-01

    This excerpt from "Managerial Engineering: Techniques for Improving Quality and Productivity in the Workplace" describes the development, objectives, and use of On-Error Training (OET), a method which trains workers to learn from their errors. Also described is New Joharry's Window, a performance-error data analysis technique used in…

  19. Human Error Mechanisms in Complex Work Environments

    DEFF Research Database (Denmark)

    Rasmussen, Jens

    1988-01-01

    … will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations …

  20. Measurement error in a single regressor

    NARCIS (Netherlands)

    Meijer, H.J.; Wansbeek, T.J.

    2000-01-01

    For the setting of multiple regression with measurement error in a single regressor, we present some very simple formulas to assess the result that one may expect when correcting for measurement error. It is shown where the corrected estimated regression coefficients and the error variance may lie,

  1. Valuing Errors for Learning: Espouse or Enact?

    Science.gov (United States)

    Grohnert, Therese; Meuwissen, Roger H. G.; Gijselaers, Wim H.

    2017-01-01

    Purpose: This study aims to investigate how organisations can discourage covering up and instead encourage learning from errors through a supportive learning from error climate. In explaining professionals' learning from error behaviour, this study distinguishes between espoused (verbally expressed) and enacted (behaviourally expressed) values…

  2. Improved Landau gauge fixing and discretisation errors

    International Nuclear Information System (INIS)

    Bonnet, F.D.R.; Bowman, P.O.; Leinweber, D.B.; Richards, D.G.; Williams, A.G.

    2000-01-01

    Lattice discretisation errors in the Landau gauge condition are examined. An improved gauge fixing algorithm in which O(a²) errors are removed is presented. O(a²) improvement of the gauge fixing condition displays the secondary benefit of reducing the size of higher-order errors. These results emphasise the importance of implementing an improved gauge fixing condition

  3. Acoustic Evidence for Phonologically Mismatched Speech Errors

    Science.gov (United States)

    Gormley, Andrea

    2015-01-01

    Speech errors are generally said to accommodate to their new phonological context. This accommodation has been validated by several transcription studies. The transcription methodology is not the best choice for detecting errors at this level, however, as this type of error can be difficult to perceive. This paper presents an acoustic analysis of…

  4. Average beta-beating from random errors

    CERN Document Server

    Tomas Garcia, Rogelio; Langner, Andy Sven; Malina, Lukas; Franchi, Andrea; CERN. Geneva. ATS Department

    2018-01-01

    The impact of random errors on the average β-beating is studied via analytical derivations and simulations. A systematic positive β-beating is expected from random errors, quadratic in the error sources or, equivalently, in the rms β-beating. However, random errors do not have a systematic effect on the tune.

  5. Jonas Olson's Evidence for Moral Error Theory

    NARCIS (Netherlands)

    Evers, Daan

    2016-01-01

    Jonas Olson defends a moral error theory in (2014). I first argue that Olson is not justified in believing the error theory as opposed to moral nonnaturalism in his own opinion. I then argue that Olson is not justified in believing the error theory as opposed to moral contextualism either (although

  6. Unequal-thickness billet optimization in transitional region during isothermal local loading forming of Ti-alloy rib-web component using response surface method

    Directory of Open Access Journals (Sweden)

    Ke WEI

    2018-04-01

    Full Text Available Avoiding the folding defect and improving the die filling capability in the transitional region are desired in isothermal local loading forming of a large-scale Ti-alloy rib-web component (LTRC. To achieve a high-precision LTRC, the folding evolution and die filling process in the transitional region were investigated by 3D finite element simulation and experiment using an equal-thickness billet (ETB. It is found that the initial volume distribution in the second-loading region can greatly affect the amount of material transferred into the first-loading region during the second-loading step, and thus lead to the folding defect. Besides, an improper initial volume distribution results in non-concurrent die filling in the cavities of ribs after the second-loading step, and then causes die underfilling. To this end, an unequal-thickness billet (UTB was employed with the initial volume distribution optimized by the response surface method (RSM. For a certain eigenstructure, the critical value of the percentage of transferred material determined by the ETB was taken as a constraint condition for avoiding the folding defect in the UTB optimization process, and the die underfilling rate was considered as the optimization objective. Then, based on the RSM models of the percentage of transferred material and the die underfilling rate, non-folding parameter combinations and optimum die filling were achieved. Lastly, an optimized UTB was obtained and verified by the simulation and experiment. Keywords: Die filling, Folding defect, Isothermal local loading forming, Transitional region, Unequal-thickness billet optimization

  7. Interplay of Coulomb interactions and disorder in three-dimensional quadratic band crossings without time-reversal symmetry and with unequal masses for conduction and valence bands

    Science.gov (United States)

    Mandal, Ipsita; Nandkishore, Rahul M.

    2018-03-01

    Coulomb interactions famously drive three-dimensional quadratic band crossing semimetals into a non-Fermi liquid phase of matter. In a previous work [Nandkishore and Parameswaran, Phys. Rev. B 95, 205106 (2017), 10.1103/PhysRevB.95.205106], the effect of disorder on this non-Fermi liquid phase was investigated, assuming that the band structure was isotropic, assuming that the conduction and valence bands had the same band mass, and assuming that the disorder preserved exact time-reversal symmetry and statistical isotropy. It was shown that the non-Fermi liquid fixed point is unstable to disorder and that a runaway flow to strong disorder occurs. In this paper, we extend that analysis by relaxing the assumption of time-reversal symmetry and allowing the electron and hole masses to differ (but continuing to assume isotropy of the low energy band structure). We first incorporate time-reversal symmetry breaking disorder and demonstrate that there do not appear any new fixed points. Moreover, while the system continues to flow to strong disorder, time-reversal-symmetry-breaking disorder grows asymptotically more slowly than time-reversal-symmetry-preserving disorder, which we therefore expect should dominate the strong-coupling phase. We then allow for unequal electron and hole masses. We show that whereas asymmetry in the two masses is irrelevant in the clean system, it is relevant in the presence of disorder, such that the `effective masses' of the conduction and valence bands should become sharply distinct in the low-energy limit. We calculate the RG flow equations for the disordered interacting system with unequal band masses and demonstrate that the problem exhibits a runaway flow to strong disorder. Along the runaway flow, time-reversal-symmetry-preserving disorder grows asymptotically more rapidly than both time-reversal-symmetry-breaking disorder and the Coulomb interaction.

  8. List of Error-Prone Abbreviations, Symbols, and Dose Designations

    Science.gov (United States)

    A list of error-prone abbreviations, symbols, and dose designations which have been reported through the ISMP National Medication Errors Reporting Program (ISMP MERP) as being frequently misinterpreted …

  9. Analysis of error patterns in clinical radiotherapy

    International Nuclear Information System (INIS)

    Macklis, Roger; Meier, Tim; Barrett, Patricia; Weinhous, Martin

    1996-01-01

    Purpose: Until very recently, prescription errors and adverse treatment events have rarely been studied or reported systematically in oncology. We wished to understand the spectrum and severity of radiotherapy errors that take place on a day-to-day basis in a high-volume academic practice and to understand the resource needs and quality assurance challenges placed on a department by rapid upswings in contract-based clinical volumes requiring additional operating hours, procedures, and personnel. The goal was to define clinical benchmarks for operating safety and to detect error-prone treatment processes that might function as 'early warning' signs. Methods: A multi-tiered prospective and retrospective system for clinical error detection and classification was developed, with formal analysis of the antecedents and consequences of all deviations from prescribed treatment delivery, no matter how trivial. A department-wide record-and-verify system was operational during this period and was used as one method of treatment verification and error detection. Brachytherapy discrepancies were analyzed separately. Results: During the analysis year, over 2000 patients were treated with over 93,000 individual fields. A total of 59 errors affecting a total of 170 individual treated fields were reported or detected during this period. After review, all of these errors were classified as Level 1 (minor discrepancy with essentially no potential for negative clinical implications). This total treatment delivery error rate (170/93,332, or 0.18%) is significantly better than corresponding error rates reported for other hospital and oncology treatment services, perhaps reflecting the relatively sophisticated error avoidance and detection procedures used in modern clinical radiation oncology. Error rates were independent of linac model and manufacturer, time of day (normal operating hours versus late evening or early morning), and clinical machine volumes. There was some relationship to

  10. Comparison between calorimeter and HLNC errors

    International Nuclear Information System (INIS)

    Goldman, A.S.; De Ridder, P.; Laszlo, G.

    1991-01-01

    This paper summarizes an error analysis that compares systematic and random errors of total plutonium mass estimated for high-level neutron coincidence counter (HLNC) and calorimeter measurements. This task was part of an International Atomic Energy Agency (IAEA) study on the comparison of the two instruments to determine if HLNC measurement errors met IAEA standards and if the calorimeter gave “significantly” better precision. Our analysis was based on propagation-of-error models that contained all known sources of errors, including uncertainties associated with plutonium isotopic measurements. 5 refs., 2 tabs

  11. Radiation protection

    International Nuclear Information System (INIS)

    1989-01-01

    A NRPB leaflet in the 'At-a-Glance' series explains in a simple but scientifically accurate way what radiation is, the biological effects and the relative sensitivity of different parts of the human body. The leaflet then discusses radiation protection principles, radiation protection in the UK and finally the effectiveness of this radiation protection as judged by a breakdown of the total dose received by an average person in the UK, a heavy consumer of Cumbrian seafood, an average nuclear industry worker and an average person in Cornwall. (UK)

  12. Medication errors: an overview for clinicians.

    Science.gov (United States)

    Wittich, Christopher M; Burkle, Christopher M; Lanier, William L

    2014-08-01

    Medication error is an important cause of patient morbidity and mortality, yet it can be a confusing and underappreciated concept. This article provides a review for practicing physicians that focuses on medication error (1) terminology and definitions, (2) incidence, (3) risk factors, (4) avoidance strategies, and (5) disclosure and legal consequences. A medication error is any error that occurs at any point in the medication use process. It has been estimated by the Institute of Medicine that medication errors cause 1 of 131 outpatient and 1 of 854 inpatient deaths. Medication factors (eg, similar sounding names, low therapeutic index), patient factors (eg, poor renal or hepatic function, impaired cognition, polypharmacy), and health care professional factors (eg, use of abbreviations in prescriptions and other communications, cognitive biases) can precipitate medication errors. Consequences faced by physicians after medication errors can include loss of patient trust, civil actions, criminal charges, and medical board discipline. Methods to prevent medication errors from occurring (eg, use of information technology, better drug labeling, and medication reconciliation) have been used with varying success. When an error is discovered, patients expect disclosure that is timely, given in person, and accompanied with an apology and communication of efforts to prevent future errors. Learning more about medication errors may enhance health care professionals' ability to provide safe care to their patients. Copyright © 2014 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.

  13. Analysis of errors in forensic science

    Directory of Open Access Journals (Sweden)

    Mingxiao Du

    2017-01-01

    Full Text Available Reliability of expert testimony is one of the foundations of judicial justice. Both expert bias and scientific errors affect the reliability of expert opinion, which in turn affects the trustworthiness of the findings of fact in legal proceedings. Expert bias can be eliminated by replacing experts; however, it may be more difficult to eliminate scientific errors. From the perspective of statistics, errors in the operation of forensic science include systematic errors, random errors, and gross errors. In general, process repetition and abiding by the standard ISO/IEC 17025:2005, General requirements for the competence of testing and calibration laboratories, during operation are common measures used to reduce errors that originate from experts and equipment, respectively. For example, to reduce gross errors, the laboratory can ensure that a test is repeated several times by different experts. In applying forensic principles and methods, the Federal Rules of Evidence 702 mandate that judges consider factors such as peer review to ensure the reliability of the expert testimony. As the scientific principles and methods may not undergo professional review by specialists in a certain field, peer review serves as an exclusive standard. This study also examines two types of statistical errors. As false-positive errors involve a higher possibility of unfair decision-making, they should receive more attention than false-negative errors.

  14. Error management process for power stations

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Takeda, Daisuke; Fujimoto, Junzo; Nagasaka, Akihiko

    2016-01-01

    The purpose of this study is to establish an 'error management process for power stations' for systematizing activities for human error prevention and for fostering continuous improvement of these activities. The following are proposed by deriving concepts concerning the error management process from existing knowledge and realizing them through application and evaluation of their effectiveness at a power station: an entire picture of the error management process that facilitates the four functions requisite for managing human error prevention effectively (1. systematizing human error prevention tools, 2. identifying problems based on incident reports and taking corrective actions, 3. identifying good practices and potential problems for taking proactive measures, 4. prioritizing human error prevention tools based on identified problems); detailed steps for each activity (i.e. developing an annual plan for human error prevention, reporting and analyzing incidents and near misses) based on a model of human error causation; procedures and examples of items for identifying gaps between current and desired levels of execution and outputs of each activity; and stages for introducing and establishing the above proposed error management process at a power station. By giving shape to the above proposals at a power station, systematization and continuous improvement of activities for human error prevention in line with the actual situation of the power station can be expected. (author)

  15. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Full Text Available Error detection codes are mechanisms that enable robust delivery of data over unreliable, error-prone communication channels and devices: they allow such errors to be detected. There are two classes of error detecting codes - classical codes and security-oriented codes. The classical codes detect a high percentage of errors; however, they have a high probability of missing an error caused by algebraic manipulation. In contrast, security-oriented codes are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code when errors are injected into the encoding device. In turn, the complexity of the encoding function plays an important role in security-oriented codes. Encoding functions with low computational complexity and a low probability of masking are the best protection of an encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution; in particular, increasing computational complexity decreases the difference between the maximum and average values of the error masking probability. Our results show that functions with greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, with a complex encoding function the probability of successful algebraic manipulation is reduced. The paper also discusses an approach to measuring the error masking probability.
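
    As a minimal illustration of the error masking probability discussed above (not the paper's own construction), the sketch below measures, for a toy binary linear code, the fraction of codewords that mask each nonzero error pattern: an error e is masked when c xor e is again a codeword.

        # Toy (4,2) binary linear code; the generator rows are illustrative only.
        G = [0b1011, 0b0101]
        n = 4
        codewords = {a ^ b for a in (0, G[0]) for b in (0, G[1])}

        def masking_probability(error):
            """Fraction of codewords c for which c ^ error is again a codeword."""
            masked = sum((c ^ error) in codewords for c in codewords)
            return masked / len(codewords)

        for e in range(1, 2 ** n):  # every nonzero error pattern
            q = masking_probability(e)
            if q > 0:
                print(f"error {e:04b}: masking probability {q:.2f}")

        # For a linear code, any error equal to a codeword is masked with
        # probability 1; this degeneracy is exactly the weakness that
        # security-oriented (robust) codes are designed to remove.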

  16. Impact of Measurement Error on Synchrophasor Applications

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yilu [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gracia, Jose R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ewing, Paul D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Zhao, Jiecheng [Univ. of Tennessee, Knoxville, TN (United States); Tan, Jin [Univ. of Tennessee, Knoxville, TN (United States); Wu, Ling [Univ. of Tennessee, Knoxville, TN (United States); Zhan, Lingwei [Univ. of Tennessee, Knoxville, TN (United States)

    2015-07-01

    Phasor measurement units (PMUs), a type of synchrophasor, are powerful diagnostic tools that can help avert catastrophic failures in the power grid. Because of this, PMU measurement errors are particularly worrisome. This report examines the internal and external factors contributing to PMU phase angle and frequency measurement errors and gives a reasonable explanation for them. It also analyzes the impact of those measurement errors on several synchrophasor applications: event location detection, oscillation detection, islanding detection, and dynamic line rating. The primary finding is that dynamic line rating is more likely to be influenced by measurement error. Other findings include the possibility of reporting nonoscillatory activity as an oscillation as the result of error, failing to detect oscillations submerged by error, and the unlikely impact of error on event location and islanding detection.
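
    To make the sensitivity to phase-angle error concrete, the sketch below (an added illustration with assumed line parameters, not taken from the report) propagates a PMU angle error through the classic two-bus power transfer relation P = V1*V2*sin(delta)/X.

        import math

        V1 = V2 = 1.0                    # bus voltages, per unit (assumed)
        X = 0.1                          # line reactance, per unit (assumed)
        delta = math.radians(10.0)       # true angle difference between buses
        angle_error = math.radians(0.5)  # assumed PMU phase-angle error

        p_true = V1 * V2 * math.sin(delta) / X
        p_meas = V1 * V2 * math.sin(delta + angle_error) / X

        print(f"true power flow:     {p_true:.4f} pu")
        print(f"measured power flow: {p_meas:.4f} pu")
        print(f"relative error:      {abs(p_meas - p_true) / p_true:.2%}")

    A half-degree angle error shifts the inferred power flow by roughly 5% in this example, which is why applications built directly on measured quantities, such as dynamic line rating, are the most exposed.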

  17. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in the design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include the evolution of error correction techniques, industrial user needs, and architectures and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc.). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and the next generation standards; • Provides coverage of industrial user needs and advanced error correcting techniques.

  18. Approximate error conjugate gradient minimization methods

    Science.gov (United States)

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
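
    The sketch below is a loose reading of the idea in this record, not its actual claims or code: for a least-squares reconstruction problem, only a random subset of rays (rows of the system matrix) is used to evaluate the approximate error and the line-search minimum along each conjugate direction. All names and parameters are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        m, n = 500, 50
        A = rng.normal(size=(m, n))      # one row per ray (synthetic system)
        x_true = rng.normal(size=n)
        b = A @ x_true + 0.01 * rng.normal(size=m)

        def approx_cg(A, b, n_iter=30, subset=100):
            """Conjugate-gradient-style descent on ||Ax - b||^2 in which the
            error terms use only a random subset of rays per iteration."""
            x = np.zeros(A.shape[1])
            g_prev = d = None
            for _ in range(n_iter):
                rows = rng.choice(A.shape[0], size=subset, replace=False)
                As, bs = A[rows], b[rows]
                g = 2 * As.T @ (As @ x - bs)       # approximate gradient
                if d is None:
                    d = -g
                else:                              # Fletcher-Reeves update
                    d = -g + (g @ g) / (g_prev @ g_prev) * d
                Ad = As @ d
                alpha = -(g @ d) / (2 * Ad @ Ad)   # exact minimum along d
                x = x + alpha * d
                g_prev = g
            return x

        x_est = approx_cg(A, b)
        print("relative error:", np.linalg.norm(x_est - x_true) / np.linalg.norm(x_true))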

  19. Protected Areas

    Data.gov (United States)

    Kansas Data Access and Support Center — This dataset shows the boundaries of properties in Kansas in public or institutional ownership that contain ecological resources that merit some level of protection....

  20. Protective clothing

    International Nuclear Information System (INIS)

    Malet, J.C.; Regnier, J.

    1979-01-01

    The present operational and intervention suits are described. Research work is currently in progress to improve the performance of the existing suits and to develop more resistant protective clothing. (author)

  1. Radiation protection

    International Nuclear Information System (INIS)

    Ures Pantazi, M.

    1994-01-01

    This work defines procedures and controls for ionizing radiation. Among the topics covered are: radiation dose, risk, biological effects, international radioprotection bodies, worker exposure, accidental exposure, emergencies and radiation protection

  2. Employment protection

    OpenAIRE

    Stefano Scarpetta

    2014-01-01

    Laws on hiring and firing are intended to protect workers from unfair behavior by employers, to counter imperfections in financial markets that limit workers’ ability to insure themselves against job loss, and to preserve firm-specific human capital. But by imposing costs on firms’ adaptation to changes in demand and technology, employment protection legislation may reduce not only job destruction but also job creation, hindering the efficient allocation of labor and productivity growth....

  3. Environmental protection

    International Nuclear Information System (INIS)

    Martinez, A.S.

    1987-01-01

    The question of environmental protection related to the use of nuclear energy for power generation is discussed, based on the concept of harmonizing economic and industrial development with preservation of the environment. A brief study of the environmental impacts of several energy sources, including nuclear energy, is presented in order to introduce the systems of a nuclear power plant that serve environmental protection. (M.C.K.) [pt

  4. Radiation protection

    International Nuclear Information System (INIS)

    Koelzer, W.

    1976-01-01

    The lecture is divided into five sections. The introduction deals with the physical and radiological terms, quantities and units. Then the basic principles of radiological protection are discussed. In the third section attention is paid to the biological effects of ionizing radiation. The fourth section deals with the objectives of practical radiological protection. Finally the emergency measures are discussed to be taken in radiation accidents. (HP) [de

  5. Copyright protection

    OpenAIRE

    Plchotová, Gabriela

    2011-01-01

    The aim of this thesis is to offer a straightforward manual to anyone who authors their own original work or who utilises the original work of other creators. As such, it is necessary to briefly and clearly explain the historical development and essential terms of authorship as a concept and the origin of the need for copyright protection. Furthermore, this thesis includes chapters on copyright protection development specifically in the Czech Republic and the current definition of related law...

  6. Radiological protection

    International Nuclear Information System (INIS)

    Azorin N, J.; Azorin V, J. C.

    2010-01-01

    This work is directed to all those people involved in the practice of radiological protection and has the purpose of providing them a base of knowledge in this discipline so that they can make decisions grounded in technical and scientific factors for the protection of occupationally exposed personnel, the general public and the environment during work with ionizing radiations. Given the lack of a text on this matter, this work seeks to cover the specific necessities of our country, providing a solid presentation of radiological protection, including the bases of radiation physics, radiation detection and dosimetry, radiobiology, the associated standards and operational procedures, radioactive wastes, emergencies and the transport of radioactive material, through the medical and industrial applications of radiations, with emphasis on the aspects particular to radiological protection in Mexico. The book has 16 chapters and, with the purpose of supplementing the given information, four appendixes are included at the end: 1) radioactive waste management in Mexico, 2-3) the Mexican official standards related to radiological protection, and 4) a glossary of terms used in radiological protection. We hope this book will be of utility for those people who work in the investigation and application of ionizing radiations. (Author)

  7. Design Of Photovoltaic Powered Cathodic Protection System

    Directory of Open Access Journals (Sweden)

    Golina Samir Adly

    2017-07-01

    Full Text Available Corrosion is caused by a chemical reaction between metallic structures and surrounding media such as soil or water. A cathodic protection (CP) system is used to protect metallic structures against corrosion by utilizing an external source of electrical current which forces the entire structure to become a cathode. There are two types of cathodic protection system: galvanic current and impressed current. In a galvanic system, a sacrificial anode is connected to the protected structure (the cathode); current passes from the sacrificial anode to the protected structure, so the sacrificial anode corrodes rather than the protected structure. The protected structure requires a constant current to stop corrosion, determined by the area of the structure's metal and by the surrounding medium. Rain and humidity decrease soil resistivity and thereby increase the DC current, and the corrosion and over-protection resulting from this increase are harmful to the metallic structure. In a conventional cathodic protection system this problem is handled by periodic manual adjustment of the DC voltage to obtain a constant current. Manual adjustment depends on the experience of the technician and on the accuracy of the measuring equipment, so measurement errors may stem from either; moreover, corrosion of the structure may occur when the interval between two successive adjustments is long. An automatically regulated cathodic protection system is used to overcome these problems: it adjusts the DC voltage of the system automatically when it senses variations in the resistivity of the surrounding medium, so that the DC current is held constant at the required level.
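
    A minimal sketch of the regulation idea described above, assuming a purely resistive Ohm's-law model of the structure-to-soil circuit and an arbitrary proportional gain (the abstract does not specify a control law):

        # Automatically regulated cathodic protection loop (illustrative model).
        target_current = 2.0   # required protection current, A (assumed)
        voltage = 10.0         # initial DC source voltage, V (assumed)
        gain = 2.0             # proportional controller gain (assumed)

        # Soil resistance drifting as rain and humidity change resistivity (ohm).
        soil_resistance = [5.0, 4.5, 4.0, 3.2, 3.0, 3.6, 4.8, 5.5]

        for r in soil_resistance:
            current = voltage / r              # measured protection current
            error = target_current - current   # deviation from the setpoint
            voltage += gain * error            # trim the DC source voltage
            print(f"R={r:.1f} ohm  I={current:.2f} A  ->  V set to {voltage:.2f} V")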

  8. Medication errors in anesthesia: unacceptable or unavoidable?

    Directory of Open Access Journals (Sweden)

    Ira Dhawan

    Full Text Available Abstract Medication errors are common causes of patient morbidity and mortality, and they add a financial burden to the institution as well. Though the impact varies from no harm to serious adverse effects, including death, the issue needs attention on a priority basis since medication errors are preventable. In today's world, where people are aware and medical claims are on the rise, it is of utmost priority that we curb this issue. Individual effort to decrease medication errors alone might not be successful until a change in the existing protocols and systems is incorporated. Often drug errors that occur cannot be reversed. The best way to ‘treat' drug errors is to prevent them. Wrong medication (due to syringe swap), overdose (due to misunderstanding or preconception of the dose, pump misuse and dilution error), incorrect administration route, underdosing and omission are common causes of medication error that occur perioperatively. Drug omission and calculation mistakes occur commonly in the ICU. Medication errors can occur perioperatively during preparation, administration or record keeping. Numerous human and system errors can be blamed for the occurrence of medication errors. The need of the hour is to stop the blame game, accept mistakes and develop a safe and ‘just' culture in order to prevent medication errors. Newly devised systems like VEINROM, a fluid delivery system, are a novel approach to preventing drug errors due to the most commonly used medications in anesthesia. Similar developments, along with vigilant doctors, a safe workplace culture and organizational support, can together help prevent these errors.

  9. Dissipative quantum error correction and application to quantum sensing with trapped ions.

    Science.gov (United States)

    Reiter, F; Sørensen, A S; Zoller, P; Muschik, C A

    2017-11-28

    Quantum-enhanced measurements hold the promise to improve high-precision sensing ranging from the definition of time standards to the determination of fundamental constants of nature. However, quantum sensors lose their sensitivity in the presence of noise. To protect them, the use of quantum error-correcting codes has been proposed. Trapped ions are an excellent technological platform for both quantum sensing and quantum error correction. Here we present a quantum error correction scheme that harnesses dissipation to stabilize a trapped-ion qubit. In our approach, always-on couplings to an engineered environment protect the qubit against spin-flips or phase-flips. Our dissipative error correction scheme operates in a continuous manner without the need to perform measurements or feedback operations. We show that the resulting enhanced coherence time translates into a significantly enhanced precision for quantum measurements. Our work constitutes a stepping stone towards the paradigm of self-correcting quantum information processing.

  10. Radiation protection

    International Nuclear Information System (INIS)

    Jain, Aman; Sharma, Shivam; Parasher, Abhishek

    2014-01-01

    Radiation dose measurement, in the field of radiobiology, is considered a critical factor for optimizing radiation protection for health care practitioners, patients and the public. This has led to equipment with permanently installed dose-area product meters. In many countries, and even between institutions, the range of equipment is vast, and the opportunity for radiation protection and dose recording varies considerably. Practitioners must keep pace with the changing demands of radiation protection, in many cases without the assistance of modern advancements in technology. Keeping to the three basic safety measures of time, distance and shielding, we can say 'optimum dose is safe dose' instead of 'no dose is safe dose'; this purpose is enclosed within the title 'Radiation Protection'. The use of radiation is expanding widely every day around the world, crossing the boundaries of medical imaging and diagnostics. 'As low as reasonably achievable' (ALARA) can only be achieved by using the methodology of radiation protection and by bringing the concerns of the general public and practitioners to bear on the hazards of unnecessary radiation dose. The three basic principles of radiation protection are time, distance and shielding: by minimizing the exposure time, increasing the distance and including shielding we can reduce the dose to the optimum range. The ability of a shielding material to attenuate radiation is generally given as the half value layer, the thickness of the material which will reduce the amount of radiation by 50%. A lab coat and gloves must be worn when handling radioactive material or when working in a labeled radiation work area, and safety glasses or other appropriate splash shields should be used when handling radioactive material. 1. Low dose levels are reached for occupational workers and the public, as per the prescribed dose limits. 2. By means of the ALARA principle we achieve protection from radiation while using radiation for our benefit
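
    The half value layer implies simple exponential attenuation, I = I0 * (1/2)^(x/HVL). The short worked example below uses an assumed, purely illustrative HVL value; only the formula itself is standard.

        import math

        hvl = 0.6    # half value layer in cm (assumed illustrative value)
        i0 = 100.0   # unshielded dose rate, set to 100 so results read as percent

        for thickness_cm in (0.6, 1.2, 2.4):
            i = i0 * 0.5 ** (thickness_cm / hvl)
            print(f"{thickness_cm:.1f} cm shield -> {i:.1f}% of unshielded dose")

        # Thickness needed to reach 1% of the unshielded dose:
        needed = hvl * math.log2(100.0)
        print(f"1% transmission requires about {needed:.2f} cm")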

  11. Machine Protection

    International Nuclear Information System (INIS)

    Schmidt, R

    2014-01-01

    The protection of accelerator equipment is as old as accelerator technology and was for many years related to high-power equipment. Examples are the protection of powering equipment from overheating (magnets, power converters, high-current cables), of superconducting magnets from damage after a quench and of klystrons. The protection of equipment from beam accidents is more recent. It is related to the increasing beam power of high-power proton accelerators such as ISIS, SNS, ESS and the PSI cyclotron, to the emission of synchrotron light by electron–positron accelerators and FELs, and to the increase of energy stored in the beam (in particular for hadron colliders such as LHC). Designing a machine protection system requires an excellent understanding of accelerator physics and operation to anticipate possible failures that could lead to damage. Machine protection includes beam and equipment monitoring, a system to safely stop beam operation (e.g. dumping the beam or stopping the beam at low energy) and an interlock system providing the glue between these systems. The most recent accelerator, the LHC, will operate with about 3 × 10^14 protons per beam, corresponding to an energy stored in each beam of 360 MJ. This energy can cause massive damage to accelerator equipment in case of uncontrolled beam loss, and a single accident damaging vital parts of the accelerator could interrupt operation for years. This article provides an overview of the requirements for protection of accelerator equipment and introduces the various protection systems. Examples are mainly from LHC, SNS and ESS
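
    The stored-energy figure quoted above can be checked with one line of arithmetic: protons per beam times kinetic energy per proton. The 7 TeV design energy is an added assumption here, since the abstract quotes only the proton count.

        protons_per_beam = 3e14        # from the abstract
        energy_per_proton_ev = 7e12    # LHC design energy, 7 TeV (assumed)
        ev_to_joule = 1.602e-19

        stored_mj = protons_per_beam * energy_per_proton_ev * ev_to_joule / 1e6
        print(f"stored energy per beam: {stored_mj:.0f} MJ")  # roughly 340 MJ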

  12. Human errors related to maintenance and modifications

    International Nuclear Information System (INIS)

    Laakso, K.; Pyy, P.; Reiman, L.

    1998-01-01

    The focus in human reliability analysis (HRA) relating to nuclear power plants has traditionally been on human performance in disturbance conditions. On the other hand, some studies and incidents have shown that also maintenance errors, which have taken place earlier in plant history, may have an impact on the severity of a disturbance, e.g. if they disable safety related equipment. Especially common cause and other dependent failures of safety systems may significantly contribute to the core damage risk. The first aim of the study was to identify and give examples of multiple human errors which have penetrated the various error detection and inspection processes of plant safety barriers. Another objective was to generate numerical safety indicators to describe and forecast the effectiveness of maintenance. A more general objective was to identify needs for further development of maintenance quality and planning. In the first phase of this operational experience feedback analysis, human errors recognisable in connection with maintenance were looked for by reviewing about 4400 failure and repair reports and some special reports which cover two nuclear power plant units on the same site during 1992-94. A special effort was made to study dependent human errors since they are generally the most serious ones. An in-depth root cause analysis was made for 14 dependent errors by interviewing plant maintenance foremen and by thoroughly analysing the errors. A more simple treatment was given to maintenance-related single errors. The results were shown as a distribution of errors among operating states i.a. as regards the following matters: in what operational state the errors were committed and detected; in what operational and working condition the errors were detected, and what component and error type they were related to. These results were presented separately for single and dependent maintenance-related errors. As regards dependent errors, observations were also made

  13. Coherence protection by random coding

    International Nuclear Information System (INIS)

    Brion, E; Akulin, V M; Dumer, I; Harel, G; Kurizki, G

    2005-01-01

    We show that the multidimensional Zeno effect combined with non-holonomic control allows one to efficiently protect quantum systems from decoherence by a method similar to classical random coding. The method is applicable to arbitrary error-inducing Hamiltonians and general quantum systems. The quantum encoding approaches the Hamming upper bound for large dimension increases. Applicability of the method is demonstrated with a seven-qubit toy computer
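
    The Hamming upper bound mentioned above limits how many codewords a code of length n and minimum distance d can contain. The sketch below evaluates the standard sphere-packing formula for binary codes; it is an added illustration, not code from the paper.

        from math import comb

        def hamming_bound(n, d, q=2):
            """Sphere-packing bound on codewords for length n, distance d."""
            t = (d - 1) // 2  # number of correctable errors
            ball = sum(comb(n, k) * (q - 1) ** k for k in range(t + 1))
            return q ** n // ball

        for n in (7, 15, 23):
            print(f"n={n:2d}, d=3 -> at most {hamming_bound(n, 3)} codewords")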

  14. Machine Protection

    CERN Document Server

    Schmidt, R

    2014-01-01

    The protection of accelerator equipment is as old as accelerator technology and was for many years related to high-power equipment. Examples are the protection of powering equipment from overheating (magnets, power converters, high-current cables), of superconducting magnets from damage after a quench and of klystrons. The protection of equipment from beam accidents is more recent. It is related to the increasing beam power of high-power proton accelerators such as ISIS, SNS, ESS and the PSI cyclotron, to the emission of synchrotron light by electron–positron accelerators and FELs, and to the increase of energy stored in the beam (in particular for hadron colliders such as LHC). Designing a machine protection system requires an excellent understanding of accelerator physics and operation to anticipate possible failures that could lead to damage. Machine protection includes beam and equipment monitoring, a system to safely stop beam operation (e.g. dumping the beam or stopping the beam at low energy) and an ...

  15. Angular truncation errors in integrating nephelometry

    International Nuclear Information System (INIS)

    Moosmueller, Hans; Arnott, W. Patrick

    2003-01-01

    Ideal integrating nephelometers integrate light scattered by particles over all directions. However, real nephelometers truncate light scattered in near-forward and near-backward directions below a certain truncation angle (typically 7 deg.). This results in truncation errors, with the forward truncation error becoming important for large particles. Truncation errors are commonly calculated using Mie theory, which offers little physical insight and no generalization to nonspherical particles. We show that large-particle forward truncation errors can be calculated and understood using geometric optics and diffraction theory. For small truncation angles (i.e., <10 deg.) as typical for modern nephelometers, diffraction theory by itself is sufficient. Forward truncation errors are, by nearly a factor of 2, larger for absorbing particles than for nonabsorbing particles, because for large absorbing particles most of the scattered light is due to diffraction, as transmission is suppressed. Nephelometer calibration procedures are also discussed, as they influence the effective truncation error
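
    For large particles, the forward truncation error can be estimated from diffraction alone, as argued above. A minimal sketch, assuming the classical Airy encircled-energy formula E(v) = 1 - J0(v)^2 - J1(v)^2 with v = x*sin(theta_t), estimates the truncated fraction of the diffracted light; the size parameters are illustrative.

        import math
        from scipy.special import j0, j1

        def diffracted_fraction_truncated(size_parameter, truncation_deg):
            """Fraction of diffracted light inside the truncation angle
            (Airy encircled energy; valid for small angles)."""
            v = size_parameter * math.sin(math.radians(truncation_deg))
            return 1.0 - j0(v) ** 2 - j1(v) ** 2

        for x in (10, 30, 100):  # size parameter 2*pi*r/wavelength
            lost = diffracted_fraction_truncated(x, 7.0)
            print(f"x={x:3d}: ~{lost:.0%} of diffracted light truncated")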

  16. Collection of offshore human error probability data

    International Nuclear Information System (INIS)

    Basra, Gurpreet; Kirwan, Barry

    1998-01-01

    Accidents such as Piper Alpha have increased concern about the effects of human errors in complex systems. Such accidents can in theory be predicted and prevented by risk assessment, and in particular human reliability assessment (HRA), but HRA ideally requires qualitative and quantitative human error data. A research initiative at the University of Birmingham led to the development of CORE-DATA, a Computerised Human Error Data Base. This system currently contains a reasonably large number of human error data points, collected from a variety of mainly nuclear-power related sources. This article outlines a recent offshore data collection study, concerned with collecting lifeboat evacuation data. Data collection methods are outlined and a selection of human error probabilities generated as a result of the study are provided. These data give insights into the type of errors and human failure rates that could be utilised to support offshore risk analyses

  17. Error-related anterior cingulate cortex activity and the prediction of conscious error awareness

    Directory of Open Access Journals (Sweden)

    Catherine eOrr

    2012-06-01

    Full Text Available Research examining the neural mechanisms associated with error awareness has consistently identified dorsal anterior cingulate cortex (ACC) activity as necessary but not predictive of conscious error detection. Two recent studies (Steinhauser and Yeung, 2010; Wessel et al., 2011) have found a contrary pattern of greater dorsal ACC activity (in the form of the error-related negativity) during detected errors, but suggested that the greater activity may instead reflect task influences (e.g., response conflict, error probability) and/or individual variability (e.g., statistical power). We re-analyzed fMRI BOLD data from 56 healthy participants who had previously been administered the Error Awareness Task, a motor Go/No-go response inhibition task in which subjects make errors of commission of which they are aware (Aware errors) or unaware (Unaware errors). Consistent with previous data, activity in a number of cortical regions was predictive of error awareness, including bilateral inferior parietal and insula cortices; however, in contrast to previous studies, including our own smaller-sample studies using the same task, error-related dorsal ACC activity was significantly greater during aware errors when compared to unaware errors. While the significantly faster RT for aware errors (compared to unaware) was consistent with the hypothesis of higher response conflict increasing ACC activity, we could find no relationship between dorsal ACC activity and the error RT difference. The data suggest that individual variability in error awareness is associated with error-related dorsal ACC activity, and therefore this region may be important to conscious error detection, but it remains unclear what task and individual factors influence error awareness.

  18. Common patterns in 558 diagnostic radiology errors.

    Science.gov (United States)

    Donald, Jennifer J; Barnard, Stuart A

    2012-04-01

    As a Quality Improvement initiative our department has held regular discrepancy meetings since 2003. We performed a retrospective analysis of the cases presented and identified the most common pattern of error. A total of 558 cases were referred for discussion over 92 months, and errors were classified as perceptual or interpretative. The most common patterns of error for each imaging modality were analysed, and the misses were scored by consensus as subtle or non-subtle. Of 558 diagnostic errors, 447 (80%) were perceptual and 111 (20%) were interpretative errors. Plain radiography and computed tomography (CT) scans were the most frequent imaging modalities accounting for 246 (44%) and 241 (43%) of the total number of errors, respectively. In the plain radiography group 120 (49%) of the errors occurred in chest X-ray reports with perceptual miss of a lung nodule occurring in 40% of this subgroup. In the axial and appendicular skeleton missed fractures occurred most frequently, and metastatic bone disease was overlooked in 12 of 50 plain X-rays of the pelvis or spine. The majority of errors within the CT group were in reports of body scans with the commonest perceptual errors identified including 16 missed significant bone lesions, 14 cases of thromboembolic disease and 14 gastrointestinal tumours. Of the 558 errors, 312 (56%) were considered subtle and 246 (44%) non-subtle. Diagnostic errors are not uncommon and are most frequently perceptual in nature. Identification of the most common patterns of error has the potential to improve the quality of reporting by improving the search behaviour of radiologists. © 2012 The Authors. Journal of Medical Imaging and Radiation Oncology © 2012 The Royal Australian and New Zealand College of Radiologists.

  19. Group representations, error bases and quantum codes

    Energy Technology Data Exchange (ETDEWEB)

    Knill, E

    1996-01-01

    This report continues the discussion of unitary error bases and quantum codes. Nice error bases are characterized in terms of the existence of certain characters in a group. A general construction for error bases which are non-abelian over the center is given. The method for obtaining codes due to Calderbank et al. is generalized and expressed purely in representation theoretic terms. The significance of the inertia subgroup both for constructing codes and obtaining the set of transversally implementable operations is demonstrated.
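
    A concrete example of a nice (unitary) error basis is the one-qubit Pauli basis. The sketch below is an added illustration, not material from the report; it checks the defining properties of unitarity and pairwise trace-orthogonality.

        import itertools
        import numpy as np

        I = np.eye(2, dtype=complex)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)
        paulis = [I, X, Y, Z]  # nice error basis for one qubit

        for P in paulis:       # every element is unitary
            assert np.allclose(P @ P.conj().T, np.eye(2))

        for A, B in itertools.combinations(paulis, 2):
            assert abs(np.trace(A.conj().T @ B)) < 1e-12  # trace-orthogonal

        print("Pauli matrices form a unitary, trace-orthogonal error basis")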

  20. Practical, Reliable Error Bars in Quantum Tomography

    OpenAIRE

    Faist, Philippe; Renner, Renato

    2015-01-01

    Precise characterization of quantum devices is usually achieved with quantum tomography. However, most methods which are currently widely used in experiments, such as maximum likelihood estimation, lack a well-justified error analysis. Promising recent methods based on confidence regions are difficult to apply in practice or yield error bars which are unnecessarily large. Here, we propose a practical yet robust method for obtaining error bars. We do so by introducing a novel representation of...

  1. Soft errors in modern electronic systems

    CERN Document Server

    Nicolaidis, Michael

    2010-01-01

    This book provides a comprehensive presentation of the most advanced research results and technological developments enabling understanding, qualifying and mitigating the soft errors effect in advanced electronics, including the fundamental physical mechanisms of radiation induced soft errors, the various steps that lead to a system failure, the modelling and simulation of soft error at various levels (including physical, electrical, netlist, event driven, RTL, and system level modelling and simulation), hardware fault injection, accelerated radiation testing and natural environment testing, s

  2. Error calculations statistics in radioactive measurements

    International Nuclear Information System (INIS)

    Verdera, Silvia

    1994-01-01

    Basic approaches and procedures frequently used in the practice of radioactive measurements. The statistical principles applied are part of Good Radiopharmaceutical Practices and quality assurance. Topics include: the concept of error and its classification into systematic and random errors; statistical fundamentals - probability theories, population distributions, Bernoulli, Poisson, Gauss and Student's t distributions, the χ² test - and error propagation based on analysis of variance. Bibliography. Tables: z table, t-test table, Poisson index, χ² test
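
    A short worked example of the Poisson counting statistics and error propagation listed above (an added illustration): the standard uncertainty of N counts is sqrt(N), and a net count rate inherits the propagated uncertainties of the sample and background counts.

        import math

        gross_counts, t_sample = 10_000, 100.0  # counts, seconds (illustrative)
        bkg_counts, t_bkg = 2_500, 100.0

        net_rate = gross_counts / t_sample - bkg_counts / t_bkg

        # Independent Poisson errors: the variances of the two rates add.
        sigma_net = math.sqrt(gross_counts / t_sample**2 + bkg_counts / t_bkg**2)
        print(f"net rate = {net_rate:.2f} +/- {sigma_net:.2f} counts/s")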

  3. Machine Protection

    International Nuclear Information System (INIS)

    Zerlauth, Markus; Schmidt, Rüdiger; Wenninger, Jörg

    2012-01-01

    The present architecture of the machine protection system is being recalled and the performance of the associated systems during the 2011 run will be briefly summarized. An analysis of the causes of beam dumps as well as an assessment of the dependability of the machine protection systems (MPS) itself is being presented. Emphasis will be given to events that risked exposing parts of the machine to damage. Further improvements and mitigations of potential holes in the protection systems will be evaluated along with their impact on the 2012 run. The role of rMPP during the various operational phases (commissioning, intensity ramp up, MDs...) will be discussed along with a proposal for the intensity ramp up for the start of beam operation in 2012

  4. Machine Protection

    CERN Document Server

    Zerlauth, Markus; Wenninger, Jörg

    2012-01-01

    The present architecture of the machine protection system is being recalled and the performance of the associated systems during the 2011 run will be briefly summarized. An analysis of the causes of beam dumps as well as an assessment of the dependability of the machine protection systems (MPS) itself is being presented. Emphasis will be given to events that risked exposing parts of the machine to damage. Further improvements and mitigations of potential holes in the protection systems will be evaluated along with their impact on the 2012 run. The role of rMPP during the various operational phases (commissioning, intensity ramp up, MDs...) will be discussed along with a proposal for the intensity ramp up for the start of beam operation in 2012.

  5. Machine Protection

    Energy Technology Data Exchange (ETDEWEB)

    Zerlauth, Markus; Schmidt, Rüdiger; Wenninger, Jörg [European Organization for Nuclear Research, Geneva (Switzerland)

    2012-07-01

    The present architecture of the machine protection system is being recalled and the performance of the associated systems during the 2011 run will be briefly summarized. An analysis of the causes of beam dumps as well as an assessment of the dependability of the machine protection systems (MPS) itself is being presented. Emphasis will be given to events that risked exposing parts of the machine to damage. Further improvements and mitigations of potential holes in the protection systems will be evaluated along with their impact on the 2012 run. The role of rMPP during the various operational phases (commissioning, intensity ramp up, MDs...) will be discussed along with a proposal for the intensity ramp up for the start of beam operation in 2012.

  6. Error monitoring issues for common channel signaling

    Science.gov (United States)

    Hou, Victor T.; Kant, Krishna; Ramaswami, V.; Wang, Jonathan L.

    1994-04-01

    Motivated by field data which showed a large number of link changeovers and incidences of link oscillations between in-service and out-of-service states in common channel signaling (CCS) networks, a number of analyses of the link error monitoring procedures in the SS7 protocol were performed by the authors. This paper summarizes the results obtained thus far, which include the following: (1) results of an exact analysis of the performance of the error monitoring procedures under both random and bursty errors; (2) a demonstration that there exists a range of error rates within which the error monitoring procedures of SS7 may induce frequent changeovers and changebacks; (3) an analysis of the performance of the SS7 level-2 transmission protocol to determine the tolerable error rates within which the delay requirements can be met; (4) a demonstration that the tolerable error rate depends strongly on various link and traffic characteristics, thereby implying that a single set of error monitor parameters will not work well in all situations; (5) some recommendations on a customizable/adaptable scheme of error monitoring with a discussion of their implementability. These issues may be particularly relevant in the presence of anticipated increases in SS7 traffic due to widespread deployment of Advanced Intelligent Network (AIN) and Personal Communications Service (PCS) as well as for developing procedures for high-speed SS7 links currently under consideration by standards bodies.
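
    The link error monitor analyzed above behaves as a leaky-bucket counter: it increments for each signal unit received in error, leaks at a fixed rate as traffic flows, and triggers a changeover at a threshold. The sketch below uses the commonly cited SS7 signal unit error rate monitor parameters (threshold 64, leak one count per 256 signal units), which should be treated as assumptions here.

        import random

        def error_monitor(signal_units, threshold=64, leak_interval=256):
            """Leaky-bucket monitor; returns the index at which a changeover
            would trigger, or None. `signal_units` yields booleans
            (True = signal unit received in error)."""
            counter = since_leak = 0
            for i, in_error in enumerate(signal_units):
                if in_error:
                    counter += 1
                    if counter >= threshold:
                        return i  # take the link out of service
                since_leak += 1
                if since_leak == leak_interval:
                    counter = max(0, counter - 1)  # leak one count
                    since_leak = 0
            return None

        random.seed(1)
        link = (random.random() < 0.01 for _ in range(200_000))  # 1% error rate
        print("changeover at signal unit:", error_monitor(link))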

  7. Neurochemical enhancement of conscious error awareness.

    Science.gov (United States)

    Hester, Robert; Nandam, L Sanjay; O'Connell, Redmond G; Wagner, Joe; Strudwick, Mark; Nathan, Pradeep J; Mattingley, Jason B; Bellgrove, Mark A

    2012-02-22

    How the brain monitors ongoing behavior for performance errors is a central question of cognitive neuroscience. Diminished awareness of performance errors limits the extent to which humans engage in corrective behavior and has been linked to loss of insight in a number of psychiatric syndromes (e.g., attention deficit hyperactivity disorder, drug addiction). These conditions share alterations in monoamine signaling that may influence the neural mechanisms underlying error processing, but our understanding of the neurochemical drivers of these processes is limited. We conducted a randomized, double-blind, placebo-controlled, cross-over design of the influence of methylphenidate, atomoxetine, and citalopram on error awareness in 27 healthy participants. The error awareness task, a go/no-go response inhibition paradigm, was administered to assess the influence of monoaminergic agents on performance errors during fMRI data acquisition. A single dose of methylphenidate, but not atomoxetine or citalopram, significantly improved the ability of healthy volunteers to consciously detect performance errors. Furthermore, this behavioral effect was associated with a strengthening of activation differences in the dorsal anterior cingulate cortex and inferior parietal lobe during the methylphenidate condition for errors made with versus without awareness. Our results have implications for the understanding of the neurochemical underpinnings of performance monitoring and for the pharmacological treatment of a range of disparate clinical conditions that are marked by poor awareness of errors.

  8. [Analysis of intrusion errors in free recall].

    Science.gov (United States)

    Diesfeldt, H F A

    2017-06-01

    Extra-list intrusion errors during five trials of the eight-word list-learning task of the Amsterdam Dementia Screening Test (ADST) were investigated in 823 consecutive psychogeriatric patients (87.1% suffering from major neurocognitive disorder). Almost half of the participants (45.9%) produced one or more intrusion errors on the verbal recall test. Correct responses were lower when subjects made intrusion errors, but learning slopes did not differ between subjects who committed intrusion errors and those who did not. Bivariate regression analyses revealed that participants who committed intrusion errors were more deficient on measures of eight-word recognition memory, delayed visual recognition and tests of executive control (the Behavioral Dyscontrol Scale and the ADST-Graphical Sequences as measures of response inhibition). Using hierarchical multiple regression, only free recall and delayed visual recognition retained an independent effect in the association with intrusion errors, such that deficient scores on tests of episodic memory were sufficient to explain the occurrence of intrusion errors. Measures of inhibitory control did not add significantly to the explanation of intrusion errors in free recall, which makes insufficient strength of memory traces, rather than a primary deficit in inhibition, the preferred account of intrusion errors in free recall.

  9. Human error mechanisms in complex work environments

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1988-01-01

    Human error taxonomies have been developed from analysis of industrial incident reports as well as from psychological experiments. In this paper the results of the two approaches are reviewed and compared. It is found, in both cases, that a fairly small number of basic psychological mechanisms will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations. The implications for system safety are briefly mentioned, together with the implications for system design. (author)

  10. Human error mechanisms in complex work environments

    International Nuclear Information System (INIS)

    Rasmussen, Jens; Danmarks Tekniske Hoejskole, Copenhagen)

    1988-01-01

    Human error taxonomies have been developed from analysis of industrial incident reports as well as from psychological experiments. In this paper the results of the two approaches are reviewed and compared. It is found, in both cases, that a fairly small number of basic psychological mechanisms will account for most of the action errors observed. In addition, error mechanisms appear to be intimately related to the development of high skill and know-how in a complex work context. This relationship between errors and human adaptation is discussed in detail for individuals and organisations. The implications for system safety are briefly mentioned, together with the implications for system design. (author)

  11. Study of Errors among Nursing Students

    Directory of Open Access Journals (Sweden)

    Ella Koren

    2007-09-01

    Full Text Available The study of errors in the health system today is a topic of considerable interest aimed at reducing errors through analysis of the phenomenon and the conclusions reached. Errors that occur frequently among health professionals have also been observed among nursing students. True, in most cases they are actually “near errors,” but these could be a future indicator of therapeutic reality and the effect of nurses' work environment on their personal performance. There are two different approaches to such errors: (a) The EPP (error prone person) approach lays full responsibility at the door of the individual involved in the error, whether a student, nurse, doctor, or pharmacist. According to this approach, handling consists purely in identifying and penalizing the guilty party. (b) The EPE (error prone environment) approach emphasizes the environment as a primary contributory factor to errors. The environment as an abstract concept includes components and processes of interpersonal communications, work relations, human engineering, workload, pressures, technical apparatus, and new technologies. The objective of the present study was to examine the role played by factors in and components of personal performance as compared to elements and features of the environment. The study was based on both of the aforementioned approaches, which, when combined, enable a comprehensive understanding of the phenomenon of errors among the student population as well as a comparison of factors contributing to human error and to error deriving from the environment. The theoretical basis of the study was a model that combined both approaches: one focusing on the individual and his or her personal performance and the other focusing on the work environment. The findings emphasize the work environment of health professionals as an EPE. However, errors could have been avoided by means of strict adherence to practical procedures. The authors examined error events in the

  12. Learning from errors in super-resolution.

    Science.gov (United States)

    Tang, Yi; Yuan, Yuan

    2014-11-01

    A novel framework of learning-based super-resolution is proposed by employing the process of learning from estimation errors. The estimation errors generated by different learning-based super-resolution algorithms are statistically shown to be sparse and uncertain. The sparsity of the estimation errors means that most estimation errors are small; their uncertainty means that the location of a pixel with a larger estimation error is random. Exploiting this prior information about the estimation errors, a nonlinear boosting process of learning from these errors is introduced into the general framework of learning-based super-resolution. Within the novel framework, a low-rank decomposition technique is used to share the information of different super-resolution estimations and to remove the sparse estimation errors arising from different learning algorithms or training samples. The experimental results show the effectiveness and the efficiency of the proposed framework in enhancing the performance of different learning-based algorithms.
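
    A minimal sketch of the shared-information idea, using a generic robust-PCA-style alternation rather than the authors' algorithm: stack several super-resolution estimates as columns, then alternate singular-value thresholding (the low-rank part shared across estimates) with soft thresholding (the sparse estimation errors). Thresholds and data are illustrative.

        import numpy as np

        def separate(M, lam=0.1, tau=1.0, n_iter=50):
            """Split M into low-rank L (shared content) + sparse S (errors)."""
            S = np.zeros_like(M)
            for _ in range(n_iter):
                U, sv, Vt = np.linalg.svd(M - S, full_matrices=False)
                L = (U * np.maximum(sv - tau, 0.0)) @ Vt  # shrink singular values
                R = M - L
                S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)  # sparse part
            return L, S

        rng = np.random.default_rng(0)
        base = np.outer(rng.normal(size=100), rng.normal(size=5))  # shared rank-1 stack
        spikes = (rng.random(base.shape) < 0.05) * rng.normal(scale=5.0, size=base.shape)
        L, S = separate(base + spikes)
        print("recovery error:", np.linalg.norm(L - base) / np.linalg.norm(base))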

  13. Physical protection

    International Nuclear Information System (INIS)

    Myre, W.C.; DeMontmollin, J.M.

    1989-01-01

    Serious concern about physical protection of nuclear facilities began around 1972. R and D was initiated at Sandia National Laboratories which had developed techniques to protect weapons for many years. Special vehicles, convoy procedures, and a communications system previously developed for weapons shipments were improved and extended for shipments of other sensitive materials. Barriers, perimeter alarms, portal and internal control systems were developed, tested, and published in handbooks and presented at symposia. Training programs were initiated for U.S. and foreign personnel. Containment and surveillance techniques were developed for the IAEA. Presently emphasis is on computer security, active barriers, and techniques to prevent theft or sabotage by ''insiders''

  14. Random and Systematic Errors Share in Total Error of Probes for CNC Machine Tools

    Directory of Open Access Journals (Sweden)

    Adam Wozniak

    2018-03-01

    Full Text Available Probes for CNC machine tools, like every measurement device, have accuracy limited by random errors and by systematic errors. Random errors of these probes are described by a parameter called unidirectional repeatability. Manufacturers of probes for CNC machine tools usually specify only this parameter, while parameters describing systematic errors of the probes, such as pre-travel variation or triggering radius variation, are used rarely. Systematic errors of the probes, linked to the differences in pre-travel values for different measurement directions, can be corrected or compensated, but this is not a widely used procedure. In this paper, the shares of systematic errors and random errors in the total error of exemplary probes are determined. In the case of simple kinematic probes, systematic errors are much greater than random errors, so compensation would significantly reduce the probing error. Moreover, it is shown that in the case of kinematic probes the commonly specified unidirectional repeatability is significantly better than the 2D performance. However, in the case of a more precise strain-gauge probe, systematic errors are of the same order as random errors, which means that error correction or compensation in this case would not yield any significant benefits.
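
    The decomposition above can be illustrated numerically: grouping trigger measurements by approach direction, the spread of the per-direction means reflects the systematic part (pre-travel variation), while the scatter within one direction reflects the random part (unidirectional repeatability). The data and scales below are synthetic assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        directions, repeats = 12, 25  # probing directions, triggers per direction

        # Synthetic pre-travel: direction-dependent bias + random trigger noise (um).
        bias = 2.0 * np.sin(np.linspace(0, 2 * np.pi, directions, endpoint=False))
        data = bias[:, None] + rng.normal(scale=0.3, size=(directions, repeats))

        systematic = np.ptp(data.mean(axis=1))        # pre-travel variation (range)
        random_err = data.std(axis=1, ddof=1).mean()  # unidirectional repeatability

        print(f"systematic (pre-travel variation): {systematic:.2f} um")
        print(f"random (repeatability, 1 sigma):   {random_err:.2f} um")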

  15. Medication errors in chemotherapy preparation and administration: a survey conducted among oncology nurses in Turkey.

    Science.gov (United States)

    Ulas, Arife; Silay, Kamile; Akinci, Sema; Dede, Didem Sener; Akinci, Muhammed Bulent; Sendur, Mehmet Ali Nahit; Cubukcu, Erdem; Coskun, Hasan Senol; Degirmenci, Mustafa; Utkan, Gungor; Ozdemir, Nuriye; Isikdogan, Abdurrahman; Buyukcelik, Abdullah; Inanc, Mevlude; Bilici, Ahmet; Odabasi, Hatice; Cihan, Sener; Avci, Nilufer; Yalcin, Bulent

    2015-01-01

    Medication errors in oncology may cause severe clinical problems due to the low therapeutic indices and high toxicity of chemotherapeutic agents. We aimed to investigate unintentional medication errors and their underlying factors during chemotherapy preparation and administration, based on a systematic survey conducted to reflect oncology nurses' experience. This study was conducted in 18 adult chemotherapy units with the volunteer participation of 206 nurses. A survey was developed by the primary investigators; medication errors (MEs) were defined as preventable errors during prescription, ordering, preparation or administration of medication. The survey consisted of 4 parts: demographic features of nurses; workload of chemotherapy units; errors and their estimated monthly number during chemotherapy preparation and administration; and evaluation of the possible factors responsible for MEs. The survey was conducted by face-to-face interview, and data analyses were performed with descriptive statistics. Chi-square or Fisher exact tests were used for comparative analysis of categorical data. Some 83.4% of the 210 nurses reported one or more errors during chemotherapy preparation and administration. Prescribing or ordering of wrong doses by physicians (65.7%) and noncompliance with administration sequences during chemotherapy administration (50.5%) were the most common errors. The most common estimated average monthly error was not following the administration sequence of the chemotherapeutic agents (4.1 times/month, range 1-20). The most important underlying reasons for medication errors were heavy workload (49.7%) and an insufficient number of staff (36.5%). Our findings suggest that the probability of medication error is very high during chemotherapy preparation and administration, the most common involving prescribing and ordering errors. Further studies must address strategies to minimize medication error in patients receiving chemotherapy and determine sufficient protective measures

  16. Margin benefit assessment of the YGN 3 cycle 1 fxy error files for COLSS and CPC overall uncertainty analyses

    International Nuclear Information System (INIS)

    Yoon, Rae Young; In, Wang Kee; Auh, Geun Sun; Kim, Hee Cheol; Lee, Sang Keun

    1994-01-01

    Margin benefits are quantitatively assessed for the Yonggwang Unit 3 (YGN 3) Cycle 1 planar radial peaking factor (Fxy) error files for each time-in-life, i.e., BOC, IOC, MOC and EOC. The generic Fxy error file (FXYMEQO) is presently used for the Yonggwang Unit 3 Cycle 1 COLSS (Core Operating Limit Supervisory System) and CPC (Core Protection Calculator) Overall Uncertainty Analyses (OUA). However, because this file is more conservative than the plant/cycle-specific Fxy error files, the COLSS and CPC thermal margins (DNB-OPM) for the generic Fxy error file are less than those of the plant/cycle-specific Fxy error files. Therefore, the YGN 3 Cycle 1 Fxy error files were generated and analyzed by the codes modified for the Yonggwang plants. The YGN 3 Cycle 1 Fxy error files increased the thermal margin by about 1% for COLSS and CPC, respectively

  17. Negotiating Protection

    DEFF Research Database (Denmark)

    Bille, Mikkel

    efficacy. Some informants, for example, adopt an orthodox scriptural Islamic approach to protection and denounce certain material registers as un-Islamic and materialistic leftovers from an ignorant past, and rather prescribe Qur'anic remembrance. For other informants the very physicality of such contested...

  18. Protection Myopia

    DEFF Research Database (Denmark)

    Laursen, Keld; Salter, Ammon; Li, Cher

    from having an orientation towards legal appropriability, we conjecture that protection myopia may lead some firms to allocate too much attention to legal appropriability, in particular when the behavioral and structural contingencies are unfavorable. Examining a panel of three successive waves...

  19. Characteristics of pediatric chemotherapy medication errors in a national error reporting database.

    Science.gov (United States)

    Rinke, Michael L; Shore, Andrew D; Morlock, Laura; Hicks, Rodney W; Miller, Marlene R

    2007-07-01

    Little is known regarding chemotherapy medication errors in pediatrics despite studies suggesting high rates of overall pediatric medication errors. In this study, the authors examined patterns in pediatric chemotherapy errors. The authors queried the United States Pharmacopeia MEDMARX database, a national, voluntary, Internet-accessible error reporting system, for all error reports from 1999 through 2004 that involved chemotherapy medications and patients aged <18 years. Of the error reports identified, 85% reached the patient, and 15.6% required additional patient monitoring or therapeutic intervention. Forty-eight percent of errors originated in the administering phase of medication delivery, and 30% originated in the drug-dispensing phase. Of the 387 medications cited, 39.5% were antimetabolites, 14.0% were alkylating agents, 9.3% were anthracyclines, and 9.3% were topoisomerase inhibitors. The most commonly involved chemotherapeutic agents were methotrexate (15.3%), cytarabine (12.1%), and etoposide (8.3%). The most common error types were improper dose/quantity (22.9% of 327 cited error types), wrong time (22.6%), omission error (14.1%), and wrong administration technique/wrong route (12.2%). The most common error causes were performance deficit (41.3% of 547 cited error causes), equipment and medication delivery devices (12.4%), communication (8.8%), knowledge deficit (6.8%), and written order errors (5.5%). Four of the 5 most serious errors occurred at community hospitals. Pediatric chemotherapy errors often reached the patient, potentially were harmful, and differed in quality between outpatient and inpatient areas. This study indicated which chemotherapeutic agents most often were involved in errors and that administering errors were common. Investigation is needed regarding targeted medication administration safeguards for these high-risk medications. Copyright (c) 2007 American Cancer Society.

  20. ERF/ERFC, Calculation of Error Function, Complementary Error Function, Probability Integrals

    International Nuclear Information System (INIS)

    Vogel, J.E.

    1983-01-01

    1 - Description of problem or function: ERF and ERFC are used to compute values of the error function and complementary error function for any real number. They may be used to compute other related functions such as the normal probability integrals. 4. Method of solution: The error function and complementary error function are approximated by rational functions. Three such rational approximations are used, depending on the range in which the argument falls (the outermost region being |x| .GE. 4.0). In the first region the error function is computed directly and the complementary error function is computed via the identity erfc(x) = 1.0 - erf(x). In the other two regions the complementary error function is computed directly and the error function is computed from the identity erf(x) = 1.0 - erfc(x). The error function and complementary error function are real-valued functions of any real argument. The range of the error function is (-1,1). The range of the complementary error function is (0,2). 5. Restrictions on the complexity of the problem: The user is cautioned against using ERF to compute the complementary error function via the identity erfc(x) = 1.0 - erf(x). This subtraction may cause partial or total loss of significance for certain values of x
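
    The rational-approximation approach described above can be illustrated with the classical Abramowitz and Stegun formula 7.1.26, a single approximation accurate to about 1.5e-7 (the report itself uses three regions, whose exact breakpoints are not reproduced here):

        import math

        def erf_approx(x):
            """Abramowitz & Stegun 7.1.26 rational approximation of erf(x)."""
            sign = 1.0 if x >= 0 else -1.0
            x = abs(x)
            t = 1.0 / (1.0 + 0.3275911 * x)
            poly = t * (0.254829592 + t * (-0.284496736 + t * (1.421413741
                   + t * (-1.453152027 + t * 1.061405429))))
            return sign * (1.0 - poly * math.exp(-x * x))

        def erfc_approx(x):
            # For large x this subtraction loses significance, exactly the
            # hazard the report cautions about.
            return 1.0 - erf_approx(x)

        for x in (0.5, 1.0, 2.0):
            print(f"erf({x}) ~ {erf_approx(x):.7f}  (math.erf: {math.erf(x):.7f})")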

  1. Iterative optimization of quantum error correcting codes

    International Nuclear Information System (INIS)

    Reimpell, M.; Werner, R.F.

    2005-01-01

    We introduce a convergent iterative algorithm for finding the optimal coding and decoding operations for an arbitrary noisy quantum channel. This algorithm does not require any error syndrome to be corrected completely, and hence also finds codes outside the usual Knill-Laflamme definition of error correcting codes. The iteration is shown to improve the figure of merit 'channel fidelity' in every step

  2. Measurement Error in Education and Growth Regressions

    NARCIS (Netherlands)

    Portela, Miguel; Alessie, Rob; Teulings, Coen

    2010-01-01

    The use of the perpetual inventory method for the construction of education data per country leads to systematic measurement error. This paper analyzes its effect on growth regressions. We suggest a methodology for correcting this error. The standard attenuation bias suggests that using these

  3. Spectrum of diagnostic errors in radiology.

    Science.gov (United States)

    Pinto, Antonio; Brunese, Luca

    2010-10-28

    Diagnostic errors are important in all branches of medicine because they are an indication of poor patient care. Since the early 1970s, physicians have been subjected to an increasing number of medical malpractice claims. Radiology is one of the specialties most liable to claims of medical negligence. Most often, a plaintiff's complaint against a radiologist will focus on a failure to diagnose. The etiology of radiological error is multi-factorial. Errors fall into recurrent patterns. Errors arise from poor technique, failures of perception, lack of knowledge and misjudgments. The work of diagnostic radiology consists of the complete detection of all abnormalities in an imaging examination and their accurate diagnosis. Every radiologist should understand the sources of error in diagnostic radiology as well as the elements of negligence that form the basis of malpractice litigation. Error traps need to be uncovered and highlighted, in order to prevent repetition of the same mistakes. This article focuses on the spectrum of diagnostic errors in radiology, including a classification of the errors, and stresses the malpractice issues in mammography, chest radiology and obstetric sonography. Missed fractures in the emergency setting and communication issues between radiologists and physicians are also discussed.

  4. Random error in cardiovascular meta-analyses

    DEFF Research Database (Denmark)

    Albalawi, Zaina; McAlister, Finlay A; Thorlund, Kristian

    2013-01-01

    BACKGROUND: Cochrane reviews are viewed as the gold standard in meta-analyses given their efforts to identify and limit systematic error which could cause spurious conclusions. The potential for random error to cause spurious conclusions in meta-analyses is less well appreciated. METHODS: We exam...

  5. The Impact of Error-Management Climate, Error Type and Error Originator on Auditors’ Reporting Errors Discovered on Audit Work Papers

    NARCIS (Netherlands)

    A.H. Gold-Nöteberg (Anna); U. Gronewold (Ulfert); S. Salterio (Steve)

    2010-01-01

    textabstractWe examine factors affecting the auditor’s willingness to report their own or their peers’ self-discovered errors in working papers subsequent to detailed working paper review. Prior research has shown that errors in working papers are detected in the review process; however, such

  6. Error tracking in a clinical biochemistry laboratory

    DEFF Research Database (Denmark)

    Szecsi, Pal Bela; Ødum, Lars

    2009-01-01

    BACKGROUND: We report our results for the systematic recording of all errors in a standard clinical laboratory over a 1-year period. METHODS: Recording was performed using a commercial database program. All individuals in the laboratory were allowed to report errors. The testing processes were cl...

  7. Sources of Error in Satellite Navigation Positioning

    Directory of Open Access Journals (Sweden)

    Jacek Januszewski

    2017-09-01

    Full Text Available Uninterrupted information about the user's position can generally be obtained from a satellite navigation system (SNS). At the time of this writing (January 2017), two global SNSs, GPS and GLONASS, are fully operational; the next two, also global, Galileo and BeiDou, are under construction. In each SNS the accuracy of the user's position is affected by three main factors: the accuracy of each satellite position, the accuracy of the pseudorange measurement, and the satellite geometry. The user's position error is a function of both the pseudorange error, called UERE (User Equivalent Range Error), and the user/satellite geometry, expressed by the appropriate Dilution Of Precision (DOP) coefficient. This error is decomposed into two types of errors: the signal-in-space ranging error, called URE (User Range Error), and the user equipment error (UEE). Detailed analyses of URE, UEE, UERE and the DOP coefficients, and the changes of the DOP coefficients on different days, are presented in this paper.
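
    As a rough illustration of the UERE x DOP relationship described above, the sketch below (an author's illustration with an invented satellite geometry and an assumed UERE; numpy assumed) derives the DOP coefficients from receiver-to-satellite unit vectors and scales the UERE into a position error:

        import numpy as np

        def dops(unit_vectors: np.ndarray) -> dict:
            """DOP coefficients from (n, 3) line-of-sight unit vectors in a
            local east-north-up frame; the fourth column is the clock term."""
            n = unit_vectors.shape[0]
            G = np.hstack([-unit_vectors, np.ones((n, 1))])
            Q = np.linalg.inv(G.T @ G)  # cofactor matrix of the LS solution
            return {
                "GDOP": np.sqrt(np.trace(Q)),
                "PDOP": np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2]),
                "HDOP": np.sqrt(Q[0, 0] + Q[1, 1]),
                "VDOP": np.sqrt(Q[2, 2]),
            }

        # One satellite at zenith, three at ~45 deg elevation, 120 deg apart.
        los = np.array([[0.0, 0.0, 1.0], [0.7, 0.0, 0.714],
                        [-0.35, 0.61, 0.714], [-0.35, -0.61, 0.714]])
        los /= np.linalg.norm(los, axis=1, keepdims=True)
        d = dops(los)
        uere_m = 6.0  # assumed UERE, metres
        print(d, "=> approx. 3-D position error:", uere_m * d["PDOP"], "m")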

  8. Volterra Filtering for ADC Error Correction

    Directory of Open Access Journals (Sweden)

    J. Saliga

    2001-09-01

    Full Text Available Dynamic non-linearity of analog-to-digital converters (ADC) contributes significantly to the distortion of digitized signals. This paper introduces a new effective method for compensating such distortion based on the application of Volterra filtering. Considering an a-priori error model of the ADC allows finding an efficient inverse Volterra model for error correction. The efficiency of the proposed method is demonstrated on experimental results.
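
    A minimal sketch of the post-correction idea (an author's toy example; the paper's a-priori error model and filter orders differ): distort a known test signal with a weakly non-linear, one-sample-memory ADC model, then fit a second-order inverse Volterra corrector by least squares.

        import numpy as np

        x = np.sin(2 * np.pi * 0.013 * np.arange(4000))   # "true" input
        y = x + 0.05 * x**2 + 0.02 * np.roll(x, 1) * x    # distorted ADC output

        # Second-order Volterra regressors built from the distorted signal:
        # linear taps y[n], y[n-1] plus their quadratic products.
        y1 = np.roll(y, 1)
        H = np.column_stack([y, y1, y * y, y * y1, y1 * y1])

        # Least-squares fit of the inverse model H @ c ~ x, then correct.
        c, *_ = np.linalg.lstsq(H, x, rcond=None)
        x_hat = H @ c
        print("RMS error before:", np.sqrt(np.mean((y - x) ** 2)))
        print("RMS error after: ", np.sqrt(np.mean((x_hat - x) ** 2)))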

  9. Errors and untimely radiodiagnosis of occupational diseases

    International Nuclear Information System (INIS)

    Sokolik, L.I.; Shkondin, A.N.; Sergienko, N.S.; Doroshenko, A.N.; Shumakov, A.V.

    1987-01-01

    Most errors in the diagnosis of occupational diseases occur due to hyperdiagnosis (37%) and failure to consider data from dynamic clinico-roentgenological examination (23%). Defects in the organization of prophylactic fluorography result in untimely diagnosis of dust-induced occupational diseases. Errors also occurred because working conditions were not always considered, and atypical development and course were not always analyzed.

  10. Comparing classifiers for pronunciation error detection

    NARCIS (Netherlands)

    Strik, H.; Truong, K.; Wet, F. de; Cucchiarini, C.

    2007-01-01

    Providing feedback on pronunciation errors in computer assisted language learning systems requires that pronunciation errors be detected automatically. In the present study we compare four types of classifiers that can be used for this purpose: two acoustic-phonetic classifiers (one of which employs

  11. Comparison of Prediction-Error-Modelling Criteria

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2007-01-01

    Single and multi-step prediction-error-methods based on the maximum likelihood and least squares criteria are compared. The prediction-error methods studied are based on predictions using the Kalman filter and Kalman predictors for a linear discrete-time stochastic state space model, which is a r...

  12. Error and uncertainty in scientific practice

    NARCIS (Netherlands)

    Boumans, M.; Hon, G.; Petersen, A.C.

    2014-01-01

    Assessment of error and uncertainty is a vital component of both natural and social science. Empirical research involves dealing with all kinds of errors and uncertainties, yet there is significant variance in how such results are dealt with. Contributors to this volume present case studies of

  13. Measurement Error in Education and Growth Regressions

    NARCIS (Netherlands)

    Portela, M.; Teulings, C.N.; Alessie, R.

    The perpetual inventory method used for the construction of education data per country leads to systematic measurement error. This paper analyses the effect of this measurement error on GDP regressions. There is a systematic difference in the education level between census data and observations

  14. Measurement error in education and growth regressions

    NARCIS (Netherlands)

    Portela, Miguel; Teulings, Coen; Alessie, R.

    2004-01-01

    The perpetual inventory method used for the construction of education data per country leads to systematic measurement error. This paper analyses the effect of this measurement error on GDP regressions. There is a systematic difference in the education level between census data and observations

  15. Position Error Covariance Matrix Validation and Correction

    Science.gov (United States)

    Frisbee, Joe, Jr.

    2016-01-01

    In order to calculate operationally accurate collision probabilities, the position error covariance matrices predicted at times of closest approach must be sufficiently accurate representations of the position uncertainties. This presentation will discuss why the Gaussian distribution is a reasonable expectation for the position uncertainty and how this assumed distribution type is used in the validation and correction of position error covariance matrices.
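
    One standard way to make such a validation concrete (an illustrative sketch under invented values, not necessarily the method of the presentation; numpy and scipy assumed): if the claimed covariance is a faithful Gaussian description, the squared Mahalanobis distances of observed position residuals should follow a chi-square distribution with three degrees of freedom.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        P = np.diag([0.04, 0.09, 0.01])  # claimed 3x3 position covariance, km^2
        residuals = rng.multivariate_normal(np.zeros(3), P, size=500)

        # Squared Mahalanobis distances; these should be chi-square with
        # 3 degrees of freedom if P is a valid covariance for the residuals.
        m2 = np.einsum("ij,jk,ik->i", residuals, np.linalg.inv(P), residuals)
        ks = stats.kstest(m2, stats.chi2(df=3).cdf)
        print("mean m^2 (expect ~3):", m2.mean(), " KS p-value:", round(ks.pvalue, 3))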

  16. Opportunistic Error Correction for WLAN Applications

    NARCIS (Netherlands)

    Shao, X.; Schiphorst, Roelof; Slump, Cornelis H.

    2008-01-01

    The current error correction layer of IEEE 802.11a WLAN is designed for worst case scenarios, which often do not apply. In this paper, we propose a new opportunistic error correction layer based on Fountain codes and a resolution adaptive ADC. The key part in the new proposed system is that only

  17. Error Discounting in Probabilistic Category Learning

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  18. Learning mechanisms to limit medication administration errors.

    Science.gov (United States)

    Drach-Zahavy, Anat; Pud, Dorit

    2010-04-01

    This paper is a report of a study conducted to identify and test the effectiveness of learning mechanisms applied by the nursing staff of hospital wards as a means of limiting medication administration errors. Since the influential report 'To Err Is Human', research has emphasized the role of team learning in reducing medication administration errors. Nevertheless, little is known about the mechanisms underlying team learning. Thirty-two hospital wards were randomly recruited. Data were collected during 2006 in Israel by a multi-method (observations, interviews and administrative data), multi-source (head nurses, bedside nurses) approach. Medication administration error was defined as any deviation from procedures, policies and/or best practices for medication administration, and was identified using semi-structured observations of nurses administering medication. Organizational learning was measured using semi-structured interviews with head nurses, and the previous year's reported medication administration errors were assessed using administrative data. The interview data revealed four learning mechanism patterns employed in an attempt to learn from medication administration errors: integrated, non-integrated, supervisory and patchy learning. Regression analysis results demonstrated that whereas the integrated pattern of learning mechanisms was associated with decreased errors, the non-integrated pattern was associated with increased errors. Supervisory and patchy learning mechanisms were not associated with errors. Superior learning mechanisms are those that represent the whole cycle of team learning, are enacted by nurses who administer medications to patients, and emphasize a system approach to data analysis instead of analysis of individual cases.

  19. Automatic error compensation in dc amplifiers

    International Nuclear Information System (INIS)

    Longden, L.L.

    1976-01-01

    When operational amplifiers are exposed to high levels of neutron fluence or total ionizing dose, significant changes may be observed in input voltages and currents. These changes may produce large errors at the output of direct-coupled amplifier stages. Therefore, the need exists for automatic compensation techniques. However, previously introduced techniques compensate only for errors in the main amplifier and neglect the errors induced by the compensating circuitry. In this paper, the techniques introduced compensate not only for errors in the main operational amplifier, but also for errors induced by the compensation circuitry. Included in the paper is a theoretical analysis of each compensation technique, along with advantages and disadvantages of each. Important design criteria and information necessary for proper selection of semiconductor switches will also be included. Introduced in this paper will be compensation circuitry for both resistive and capacitive feedback networks

  20. Heuristics and Cognitive Error in Medical Imaging.

    Science.gov (United States)

    Itri, Jason N; Patel, Sohil H

    2018-05-01

    The field of cognitive science has provided important insights into mental processes underlying the interpretation of imaging examinations. Despite these insights, diagnostic error remains a major obstacle in the goal to improve quality in radiology. In this article, we describe several types of cognitive bias that lead to diagnostic errors in imaging and discuss approaches to mitigate cognitive biases and diagnostic error. Radiologists rely on heuristic principles to reduce complex tasks of assessing probabilities and predicting values into simpler judgmental operations. These mental shortcuts allow rapid problem solving based on assumptions and past experiences. Heuristics used in the interpretation of imaging studies are generally helpful but can sometimes result in cognitive biases that lead to significant errors. An understanding of the causes of cognitive biases can lead to the development of educational content and systematic improvements that mitigate errors and improve the quality of care provided by radiologists.

  1. El error en el delito imprudente

    Directory of Open Access Journals (Sweden)

    Miguel Angel Muñoz García

    2011-12-01

    Full Text Available The theory of error in negligent offences is a thorny and controversial topic in criminal-law dogmatics: there are in fact very few references, and no reasonable consensus has been reached. Starting from an analysis of the dogmatic structure of the negligent offence, in which the objective duty of care stands out as the element of the offence on which the error falls, and from the different doctrinal positions that defend the applicability of mistake of type (error de tipo) and mistake of prohibition (error de prohibición), the viability of the latter is argued on dogmatic and criminal-policy grounds, with the breach of the objective duty of care, as a consequence of the error, remaining a question to be analysed at the level of culpability.

  2. Error Resilient Video Compression Using Behavior Models

    Directory of Open Access Journals (Sweden)

    Jacco R. Taal

    2004-03-01

    Full Text Available Wireless and Internet video applications are inherently subjected to bit errors and packet errors, respectively. This is especially so if constraints on the end-to-end compression and transmission latencies are imposed. Therefore, it is necessary to develop methods to optimize the video compression parameters and the rate allocation of these applications that take into account residual channel bit errors. In this paper, we study the behavior of a predictive (interframe) video encoder and model the encoder's behavior using only the statistics of the original input data and of the underlying channel prone to bit errors. The resulting data-driven behavior models are then used to carry out group-of-pictures partitioning and to control the rate of the video encoder in such a way that the overall quality of the decoded video with compression and channel errors is optimized.

  3. Telemetry location error in a forested habitat

    Science.gov (United States)

    Chu, D.S.; Hoover, B.A.; Fuller, M.R.; Geissler, P.H.; Amlaner, Charles J.

    1989-01-01

    The error associated with locations estimated by radio-telemetry triangulation can be large and variable in a hardwood forest. We assessed the magnitude and cause of telemetry location errors in a mature hardwood forest by using a 4-element Yagi antenna and compass bearings toward four transmitters, from 21 receiving sites. The distance error from the azimuth intersection to known transmitter locations ranged from 0 to 9251 meters. Ninety-five percent of the estimated locations were within 16 to 1963 meters, and 50% were within 99 to 416 meters of actual locations. Angles within 20° of parallel had larger distance errors than other angles. While angle appeared most important, greater distances and the amount of vegetation between receivers and transmitters also contributed to distance error.
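
    The geometry effect is easy to reproduce. In the hedged Python sketch below (an author's illustration; coordinates and bearing errors are invented), two compass bearings are intersected, and the solution degenerates as the bearings approach parallel (the determinant goes to zero), which is where the study reports the largest errors:

        import math

        def intersect_bearings(p1, b1, p2, b2):
            """Estimate a transmitter location from two receiver positions
            (x, y in metres) and compass bearings (deg clockwise from north)."""
            d1 = (math.sin(math.radians(b1)), math.cos(math.radians(b1)))
            d2 = (math.sin(math.radians(b2)), math.cos(math.radians(b2)))
            # Solve p1 + t*d1 = p2 + s*d2 by Cramer's rule; det -> 0 as the
            # bearings become parallel, and the location error blows up.
            det = d1[0] * (-d2[1]) + d2[0] * d1[1]
            if abs(det) < 1e-9:
                raise ValueError("bearings are parallel; no usable fix")
            rx, ry = p2[0] - p1[0], p2[1] - p1[1]
            t = (rx * (-d2[1]) + d2[0] * ry) / det
            return p1[0] + t * d1[0], p1[1] + t * d1[1]

        # Receivers 500 m apart; true transmitter at (400, 900); 2-degree
        # bearing errors of opposite sign to mimic field conditions.
        true, p1, p2 = (400.0, 900.0), (0.0, 0.0), (500.0, 0.0)
        b1 = math.degrees(math.atan2(true[0] - p1[0], true[1] - p1[1])) + 2.0
        b2 = math.degrees(math.atan2(true[0] - p2[0], true[1] - p2[1])) - 2.0
        est = intersect_bearings(p1, b1, p2, b2)
        print("distance error:", math.hypot(est[0] - true[0], est[1] - true[1]), "m")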

  4. The District Nursing Clinical Error Reduction Programme.

    Science.gov (United States)

    McGraw, Caroline; Topping, Claire

    2011-01-01

    The District Nursing Clinical Error Reduction (DANCER) Programme was initiated in NHS Islington following an increase in the number of reported medication errors. The objectives were to reduce the actual degree of harm and the potential risk of harm associated with medication errors and to maintain the existing positive reporting culture, while robustly addressing performance issues. One hundred medication errors reported in 2007/08 were analysed using a framework that specifies the factors that predispose to adverse medication events in domiciliary care. Various contributory factors were identified and interventions were subsequently developed to address poor drug calculation and medication problem-solving skills and incorrectly transcribed medication administration record charts. Follow up data were obtained at 12 months and two years. The evaluation has shown that although medication errors do still occur, the programme has resulted in a marked shift towards a reduction in the associated actual degree of harm and the potential risk of harm.

  5. A Comparative Study on Error Analysis

    DEFF Research Database (Denmark)

    Wu, Xiaoli; Zhang, Chun

    2015-01-01

    Title: A Comparative Study on Error Analysis. Subtitle: Belgian (L1) and Danish (L1) learners' use of Chinese (L2) comparative sentences in written production. Xiaoli Wu, Chun Zhang. Abstract: Making errors is an inevitable and necessary part of learning. The collection, classification and analysis of errors in the written and spoken production of L2 learners has a long tradition in L2 pedagogy. Yet, in teaching and learning Chinese as a foreign language (CFL), only a handful of studies have been made either to define the 'error' in a pedagogically insightful way or to empirically investigate the occurrence of errors in linguistic or pedagogical terms. The purpose of the current study is to demonstrate the theoretical and practical relevance of the error analysis approach in CFL by investigating two cases - (1) Belgian (L1) learners' use of Chinese (L2) comparative sentences in written production...

  6. Medication Error, What Is the Reason?

    Directory of Open Access Journals (Sweden)

    Ali Banaozar Mohammadi

    2015-09-01

    Full Text Available Background: Medication errors due to different reasons may alter the outcome of all patients, especially patients with drug poisoning. We introduce one of the most common types of medication error in the present article. Case: A 48-year-old woman with suspected organophosphate poisoning died as a result of a lethal medication error. Unfortunately, these types of errors are not rare and have preventable causes, including a lack of suitable and sufficient training and practice for medical students and certain failures in the medical curriculum. Conclusion: Some important causes are discussed here because their consequences can be tremendous. We found that most of them are easily preventable. If clinicians are aware of the method of use, complications, dosage and contraindications of drugs, most of these fatal errors can be minimized.

  7. [Errors in Peruvian medical journals references].

    Science.gov (United States)

    Huamaní, Charles; Pacheco-Romero, José

    2009-01-01

    References are fundamental in our studies; an adequate selection is as important as an adequate description. Our aim was to determine the number of errors in a sample of references found in Peruvian medical journals. We reviewed 515 references from scientific papers, selected by systematic randomized sampling, and corroborated the reference information against the original document or its citation in PubMed, LILACS or SciELO-Peru. We found errors in 47.6% (245) of the references, identifying 372 errors of various types; the most frequent were errors in presentation style (120), authorship (100) and title (100), mainly due to spelling mistakes (91). The percentage of reference errors was high, and the errors were varied and multiple. We suggest systematic revision of references in the editorial process, as well as extending the discussion on this theme. Keywords: references, periodicals, research, bibliometrics.

  8. A qualitative description of human error

    International Nuclear Information System (INIS)

    Li Zhaohuan

    1992-11-01

    Human error makes an important contribution to the risk of reactor operation. Insight and an analytical model are the main parts of human reliability analysis, which covers the concept of human error, its nature, the mechanism of its generation, its classification, and human performance influence factors. On an operating reactor, human error is defined as a task-human-machine mismatch. A human error event focuses on the erroneous action and its unfavorable result. Based on the time available for performing a task, operation is divided into time-limited and time-open; the HCR (human cognitive reliability) model is suited only to the time-limited case. The basic cognitive process consists of information gathering, cognition/thinking, decision making and action, and a human erroneous action may be generated at any stage of this process. More natural ways to classify human errors are presented. Human performance influence factors, including personal, organizational and environmental factors, are also listed.

  9. A qualitative description of human error

    Energy Technology Data Exchange (ETDEWEB)

    Zhaohuan, Li [Academia Sinica, Beijing, BJ (China). Inst. of Atomic Energy

    1992-11-01

    Human error makes an important contribution to the risk of reactor operation. Insight and an analytical model are the main parts of human reliability analysis, which covers the concept of human error, its nature, the mechanism of its generation, its classification, and human performance influence factors. On an operating reactor, human error is defined as a task-human-machine mismatch. A human error event focuses on the erroneous action and its unfavorable result. Based on the time available for performing a task, operation is divided into time-limited and time-open; the HCR (human cognitive reliability) model is suited only to the time-limited case. The basic cognitive process consists of information gathering, cognition/thinking, decision making and action, and a human erroneous action may be generated at any stage of this process. More natural ways to classify human errors are presented. Human performance influence factors, including personal, organizational and environmental factors, are also listed.

  10. A memory of errors in sensorimotor learning.

    Science.gov (United States)

    Herzfeld, David J; Vaswani, Pavan A; Marko, Mollie K; Shadmehr, Reza

    2014-09-12

    The current view of motor learning suggests that when we revisit a task, the brain recalls the motor commands it previously learned. In this view, motor memory is a memory of motor commands, acquired through trial-and-error and reinforcement. Here we show that the brain controls how much it is willing to learn from the current error through a principled mechanism that depends on the history of past errors. This suggests that the brain stores a previously unknown form of memory, a memory of errors. A mathematical formulation of this idea provides insights into a host of puzzling experimental data, including savings and meta-learning, demonstrating that when we are better at a motor task, it is partly because the brain recognizes the errors it experienced before. Copyright © 2014, American Association for the Advancement of Science.

  11. Solving a mathematical model integrating unequal-area facilities layout and part scheduling in a cellular manufacturing system by a genetic algorithm.

    Science.gov (United States)

    Ebrahimi, Ahmad; Kia, Reza; Komijan, Alireza Rashidi

    2016-01-01

    In this article, a novel integrated mixed-integer nonlinear programming model is presented for designing a cellular manufacturing system (CMS), considering machine layout and part scheduling problems simultaneously as interrelated decisions. The integrated CMS model is formulated to incorporate several design features, including part due dates, material handling time, operation sequences, processing times, an intra-cell layout of unequal-area facilities, and part scheduling. The objective function is to minimize makespan, tardiness penalties, and the material handling costs of inter-cell and intra-cell movements. Two numerical examples are solved with the Lingo software to illustrate the results obtained by the incorporated features. In order to assess the effects and importance of integrating machine layout and part scheduling in designing a CMS, two approaches, sequential and concurrent, are investigated, and the improvement resulting from the concurrent approach is revealed. Also, due to the NP-hardness of the integrated model, an efficient genetic algorithm is designed. As a consequence, the computational results of this study indicate that the best solutions found by the GA are better than those found by B&B, in much less time, for both the sequential and concurrent approaches. Moreover, comparisons between the objective function values (OFVs) obtained by the sequential and concurrent approaches demonstrate that the OFV improvement averages around 17% with the GA and 14% with B&B.

  12. Unequal diffusivities case of homogeneous–heterogeneous reactions within viscoelastic fluid flow in the presence of induced magnetic-field and nonlinear thermal radiation

    Directory of Open Access Journals (Sweden)

    I.L. Animasaun

    2016-06-01

    Full Text Available This article presents the effects of nonlinear thermal radiation and an induced magnetic field on viscoelastic fluid flow toward a stagnation point. It is assumed that there exists a kind of chemical reaction between chemical species A and B. The diffusion coefficients of the two chemical species in the viscoelastic fluid flow are unequal. Since chemical species B is a catalyst at the horizontal surface, the homogeneous and heterogeneous schemes are an isothermal cubic autocatalytic reaction and a first-order reaction, respectively. The transformed governing equations are solved numerically using a Runge–Kutta integration scheme along with Newton's method. Good agreement is obtained between present and published numerical results for a limiting case. The influence of some pertinent parameters on the skin friction coefficient, the local heat transfer rate, and the velocity, induced magnetic field, temperature, and concentration profiles is illustrated graphically and discussed. Based on all of these assumptions, results indicate that the effects of the induced magnetic and viscoelastic parameters on velocity, transverse velocity and the velocity of the induced magnetic field are almost the same but opposite in nature. The strength of the heterogeneous reaction parameter is very helpful in reducing the concentration of the bulk fluid and increasing the concentration of the catalyst at the surface.

  13. Analysing the Unequal Effects of Positive and Negative Information on the Behaviour of Users of a Taiwanese On-Line Bulletin Board.

    Directory of Open Access Journals (Sweden)

    Shu-Li Cheng

    Full Text Available The impact of social influence causes people to adopt the behaviour of others when interacting with other individuals. The effects of social influence can be direct or indirect. Direct social influence is the result of an individual directly influencing the opinion of another, while indirect social influence is a process taking place when an individual's opinion and behaviour is affected by the availability of information about others' actions. Such an indirect effect may exhibit a more significant impact in the on-line community because the internet records not only positive but also negative information, for example, on-line written text comments. This study focuses on indirect social influence and examines the effect of preceding information on subsequent users' opinions by fitting statistical models to data collected from an on-line bulletin board. Specifically, the different impacts of information on approval and disapproval comments on subsequent opinions were investigated. Although in an anonymous situation where social influence is assumed to be at a minimum, our results demonstrate the tendency of on-line users to adopt both positive and negative information to conform to the neighbouring trend when expressing opinions. Moreover, our results suggest unequal effects of the local approval and disapproval comments in affecting the likelihood of expressing opinions. The impact of neighbouring disapproval densities was stronger than that of neighbouring approval densities on inducing subsequent disapproval relative to approval comments. However, our results suggest no effects of global social influence on subsequent opinion expression.

  14. Insights into secondary growth in perennial plants: its unequal spatial and temporal dynamics in the apple (Malus domestica) is driven by architectural position and fruit load.

    Science.gov (United States)

    Lauri, P E; Kelner, J J; Trottier, C; Costes, E

    2010-04-01

    Secondary growth is a main physiological sink. However, the hierarchy between the processes which compete with secondary growth is still a matter of debate, especially on fruit trees where fruit weight dramatically increases with time. It was hypothesized that tree architecture, here mediated by branch age, is likely to have a major effect on the dynamics of secondary growth within a growing season. Three variables were monitored on 6-year-old 'Golden Delicious' apple trees from flowering time to harvest: primary shoot growth, fruit volume, and cross-section area of branch portions of consecutive ages. Analyses were done through an ANOVA-type analysis in a linear mixed model framework. Secondary growth exhibited three consecutive phases characterized by unequal relative area increment over the season. The age of the branch had the strongest effect, with the highest and lowest relative area increment for the current-year shoots and the trunk, respectively. The growth phase had a lower effect, with a shift of secondary growth through the season from leafy shoots towards older branch portions. Eventually, fruit load had an effect on secondary growth mainly after primary growth had ceased. The results support the idea that relationships between production of photosynthates and allocation depend on both primary growth and branch architectural position. Fruit load mainly interacted with secondary growth later in the season, especially on old branch portions.

  15. The protection of the Cyrillic alphabet in telecommunications: Taxation aspects

    Directory of Open Access Journals (Sweden)

    Marilović Đorđe

    2016-01-01

    Full Text Available The use of Cyrillic and other specific alphabets is discouraged in some telecommunication services. In this paper, the author focuses on the unequal treatment of the Cyrillic alphabet in telecommunications (in SMS messages), which is incompatible with the interest of a multilingual society in cherishing its linguistic heritage and diversity. Referring to the Convention on the Protection and Promotion of the Diversity of Cultural Expressions (2005), the author suggests introducing measures which would lead to removing the discriminatory pricing of Cyrillic SMS messages, and introducing tax measures which would support mobile network operators and prevent possible market inequalities stemming from these measures. The suggested solution is applicable to any multicultural society facing the same problem, regardless of the languages in question.

  16. Common Errors in Ecological Data Sharing

    Directory of Open Access Journals (Sweden)

    Robert B. Cook

    2013-04-01

    Full Text Available Objectives: (1) to identify common errors in data organization and metadata completeness that would preclude a “reader” from being able to interpret and re-use the data for a new purpose; and (2) to develop a set of best practices derived from these common errors that would guide researchers in creating more usable data products that could be readily shared, interpreted, and used. Methods: We used directed qualitative content analysis to assess and categorize data and metadata errors identified by peer reviewers of data papers published in the Ecological Society of America's (ESA) Ecological Archives. Descriptive statistics provided the relative frequency of the errors identified during the peer review process. Results: There were seven overarching error categories: Collection & Organization, Assure, Description, Preserve, Discover, Integrate, and Analyze/Visualize. These categories represent errors researchers regularly make at each stage of the Data Life Cycle. Collection & Organization and Description errors were some of the most common errors, both of which occurred in over 90% of the papers. Conclusions: Publishing data for sharing and reuse is error prone, and each stage of the Data Life Cycle presents opportunities for mistakes. The most common errors occurred when the researcher did not provide adequate metadata to enable others to interpret and potentially re-use the data. Fortunately, there are ways to minimize these mistakes through carefully recording all details about study context, data collection, QA/QC, and analytical procedures from the beginning of a research project and then including this descriptive information in the metadata.

  17. Analyzing temozolomide medication errors: potentially fatal.

    Science.gov (United States)

    Letarte, Nathalie; Gabay, Michael P; Bressler, Linda R; Long, Katie E; Stachnik, Joan M; Villano, J Lee

    2014-10-01

    The EORTC-NCIC regimen for glioblastoma requires different dosing of temozolomide (TMZ) during radiation and maintenance therapy. This complexity is exacerbated by the availability of multiple TMZ capsule strengths. TMZ is an alkylating agent and the major toxicity of this class is dose-related myelosuppression. Inadvertent overdose can be fatal. The websites of the Institute for Safe Medication Practices (ISMP), and the Food and Drug Administration (FDA) MedWatch database were reviewed. We searched the MedWatch database for adverse events associated with TMZ and obtained all reports including hematologic toxicity submitted from 1st November 1997 to 30th May 2012. The ISMP describes errors with TMZ resulting from the positioning of information on the label of the commercial product. The strength and quantity of capsules on the label were in close proximity to each other, and this has been changed by the manufacturer. MedWatch identified 45 medication errors. Patient errors were the most common, accounting for 21 or 47% of errors, followed by dispensing errors, which accounted for 13 or 29%. Seven reports or 16% were errors in the prescribing of TMZ. Reported outcomes ranged from reversible hematological adverse events (13%), to hospitalization for other adverse events (13%) or death (18%). Four error reports lacked detail and could not be categorized. Although the FDA issued a warning in 2003 regarding fatal medication errors and the product label warns of overdosing, errors in TMZ dosing occur for various reasons and involve both healthcare professionals and patients. Overdosing errors can be fatal.

  18. NLO error propagation exercise: statistical results

    International Nuclear Information System (INIS)

    Pack, D.J.; Downing, D.J.

    1985-09-01

    Error propagation is the extrapolation and cumulation of uncertainty (variance) about total amounts of special nuclear material, for example, uranium or 235 U, that are present in a defined location at a given time. The uncertainty results from the inevitable inexactness of individual measurements of weight, uranium concentration, 235 U enrichment, etc. The extrapolated and cumulated uncertainty leads directly to quantified limits of error on inventory differences (LEIDs) for such material. The NLO error propagation exercise was planned as a field demonstration of the utilization of statistical error propagation methodology at the Feed Materials Production Center in Fernald, Ohio from April 1 to July 1, 1983 in a single material balance area formed specially for the exercise. Major elements of the error propagation methodology were: variance approximation by Taylor Series expansion; variance cumulation by uncorrelated primary error sources as suggested by Jaech; random effects ANOVA model estimation of variance effects (systematic error); provision for inclusion of process variance in addition to measurement variance; and exclusion of static material. The methodology was applied to material balance area transactions from the indicated time period through a FORTRAN computer code developed specifically for this purpose on the NLO HP-3000 computer. This paper contains a complete description of the error propagation methodology and a full summary of the numerical results of applying the methodology in the field demonstration. The error propagation LEIDs did encompass the actual uranium and 235 U inventory differences. Further, one can see that error propagation actually provides guidance for reducing inventory differences and LEIDs in future time periods
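
    As a toy illustration of the Taylor-series step (invented values, not data from the exercise): for a product-form equation such as mass = weight x concentration x enrichment, first-order propagation of independent, uncorrelated errors adds the relative variances in quadrature.

        import math

        w, c, e = 1000.0, 0.85, 0.93     # kg, gU/g, g235U/gU (invented values)
        sw, sc, se = 2.0, 0.005, 0.002   # 1-sigma absolute uncertainties

        mass = w * c * e
        # Relative variances of independent factors add in quadrature.
        rel_var = (sw / w) ** 2 + (sc / c) ** 2 + (se / e) ** 2
        sigma = mass * math.sqrt(rel_var)
        print(f"235U mass = {mass:.1f} +/- {sigma:.1f} (1 sigma)")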

  19. Repeated speech errors: evidence for learning.

    Science.gov (United States)

    Humphreys, Karin R; Menzies, Heather; Lake, Johanna K

    2010-11-01

    Three experiments elicited phonological speech errors using the SLIP procedure to investigate whether there is a tendency for speech errors on specific words to reoccur, and whether this effect can be attributed to implicit learning of an incorrect mapping from lemma to phonology for that word. In Experiment 1, when speakers made a phonological speech error in the study phase of the experiment (e.g. saying "beg pet" in place of "peg bet") they were over four times as likely to make an error on that same item several minutes later at test. A pseudo-error condition demonstrated that the effect is not simply due to a propensity for speakers to repeat phonological forms, regardless of whether or not they have been made in error. That is, saying "beg pet" correctly at study did not induce speakers to say "beg pet" in error instead of "peg bet" at test. Instead, the effect appeared to be due to learning of the error pathway. Experiment 2 replicated this finding, but also showed that after 48 h, errors made at study were no longer more likely to reoccur. As well as providing constraints on the longevity of the effect, this provides strong evidence that the error reoccurrences observed are not due to item-specific difficulty that leads individual speakers to make habitual mistakes on certain items. Experiment 3 showed that the diminishment of the effect 48 h later is not due to specific extra practice at the task. We discuss how these results fit in with a larger view of language as a dynamic system that is constantly adapting in response to experience. Copyright © 2010 Elsevier B.V. All rights reserved.

  20. A concatenated coding scheme for biometric template protection

    NARCIS (Netherlands)

    Shao, X.; Xu, H.; Veldhuis, Raymond N.J.; Slump, Cornelis H.

    2012-01-01

    Cryptography may mitigate the privacy problem in biometric recognition systems. However, cryptography technologies lack error-tolerance and biometric samples cannot be reproduced exactly, raising the robustness problem. The biometric template protection system needs a good feature extraction

  1. Protective articles

    International Nuclear Information System (INIS)

    Wardley, R.B.

    1983-01-01

    This patent specification describes an article affording protection against radiation, and especially against X-rays, comprising at least one flexible layer of lead-filled material in an envelope of, or sandwiched between two layers of, a knitted, woven or non-woven fabric, preferably of synthetic fibrous material, carrying on its outer surface a coating of flexible polyurethane. The outer fabric provides a resilient, extremely tough and cut-resistant covering for the relatively soft lead-filled material. (author)

  2. Eye Protection

    OpenAIRE

    Pashby, Tom

    1986-01-01

    Eye injuries frequently occur in the home, at work and at play. Many result in legally blind eyes, and most are preventable. Awareness of potential hazards is essential to preventing eye injuries, particularly in children. In addition, protective devices must be used appropriately. We have developed eye protectors that have proved effective in reducing both the overall incidence and the severity of sports eye injuries.

  3. Error and Congestion Resilient Video Streaming over Broadband Wireless

    Directory of Open Access Journals (Sweden)

    Laith Al-Jobouri

    2015-04-01

    Full Text Available In this paper, error resilience is achieved by adaptive, application-layer rateless channel coding, which is used to protect H.264/Advanced Video Coding (AVC) codec data-partitioned videos. A packetization strategy is an effective tool to control error rates and, in the paper, source-coded data partitioning serves to allocate smaller packets to more important compressed video data. The scheme for doing this is applied to real-time streaming across a broadband wireless link. The advantages of rateless code rate adaptivity are then demonstrated in the paper. Because the data partitions of a video slice are each assigned to different network packets, in congestion-prone wireless networks the increased number of packets per slice and their size disparity may increase the packet loss rate from buffer overflows. As a form of congestion resilience, this paper recommends packet-size dependent scheduling as a relatively simple way of alleviating the buffer-overflow problem arising from data-partitioned packets. The paper also contributes an analysis of data partitioning and packet sizes as a prelude to considering scheduling regimes. The combination of adaptive channel coding and prioritized packetization for error resilience with packet-size dependent packet scheduling results in a robust streaming scheme specialized for broadband wireless and real-time streaming applications such as video conferencing, video telephony, and telemedicine.
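
    A toy sketch of the packet-size dependent scheduling idea (an author's illustration, not the scheduler proposed in the paper): when the transmission budget of a scheduling round is tight, sending smaller packets first keeps the small, high-priority data-partition packets least likely to be dropped.

        import heapq

        def schedule(packets, budget_bytes):
            """Send smaller packets first within one round; drop what no
            longer fits the remaining budget (stand-in for buffer overflow)."""
            heap = [(size, seq, tag) for seq, (size, tag) in enumerate(packets)]
            heapq.heapify(heap)
            sent, dropped = [], []
            while heap:
                size, seq, tag = heapq.heappop(heap)
                if size <= budget_bytes:
                    budget_bytes -= size
                    sent.append((seq, tag))
                else:
                    dropped.append((seq, tag))
            return sent, dropped

        # Partition-A packets (headers, motion vectors) are small; C are large.
        pkts = [(120, "A"), (900, "C"), (150, "A"), (650, "B"), (1000, "C")]
        print(schedule(pkts, budget_bytes=1500))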

  4. Error budget calculations in laboratory medicine: linking the concepts of biological variation and allowable medical errors

    NARCIS (Netherlands)

    Stroobants, A. K.; Goldschmidt, H. M. J.; Plebani, M.

    2003-01-01

    Background: Random, systematic and sporadic errors, which unfortunately are not uncommon in laboratory medicine, can have a considerable impact on the well being of patients. Although somewhat difficult to attain, our main goal should be to prevent all possible errors. A good insight on error-prone

  5. Error-information in tutorial documentation: Supporting users' errors to facilitate initial skill learning

    NARCIS (Netherlands)

    Lazonder, Adrianus W.; van der Meij, Hans

    1995-01-01

    Novice users make many errors when they first try to learn how to work with a computer program like a spreadsheet or wordprocessor. No matter how user-friendly the software or the training manual, errors can and will occur. The current view on errors is that they can be helpful or disruptive,

  6. [Cerebral protection].

    Science.gov (United States)

    Cattaneo, A D

    1993-09-01

    Cerebral protection means prevention of cerebral neuronal damage. Severe brain damage extinguishes the very "human" functions such as speech, consciousness, intellectual capacity, and emotional integrity. Many pathologic conditions may inflict injuries to the brain; therefore the protection and salvage of cerebral neuronal function must be the top priorities in the care of critically ill patients. Brain tissue has unusually high energy requirements; its stores of energy metabolites are small and, as a result, the brain is totally dependent on a continuous supply of substrates and oxygen via the circulation. In complete global ischemia (cardiac arrest), reperfusion is characterized by an immediate reactive hyperemia followed within 20-30 min by a delayed hypoperfusion state. It has been postulated that the latter contributes to the ultimate neurologic outcome. In focal ischemia (stroke), the primary focus of necrosis is encircled by an area (the ischemic penumbra) that is underperfused and contains neurotoxic substances such as free radicals, prostaglandins, calcium, and excitatory neurotransmitters. The variety of therapeutic efforts that have addressed the question of protecting the brain reflects their limited success. 1) Barbiturates. After an initial enthusiastic endorsement by many clinicians and years of vigorous controversy, it can now be unequivocally stated that there is no place for barbiturate therapy following resuscitation from cardiac arrest. One presumed explanation for this negative statement is that cerebral metabolic suppression by barbiturates (and other anesthetics) is impossible in the absence of an active EEG. Conversely, in the event of incomplete ischemia, EEG activity is usually present (albeit altered), and metabolic suppression, and hence possibly protection, can be induced with barbiturates. Indeed, most of the animal studies led to a number of recommendations for barbiturate therapy in man for incomplete ischemia. 2) Isoflurane. From a cerebral

  7. Evaluation of Data with Systematic Errors

    International Nuclear Information System (INIS)

    Froehner, F. H.

    2003-01-01

    Application-oriented evaluated nuclear data libraries such as ENDF and JEFF contain not only recommended values but also uncertainty information in the form of 'covariance' or 'error files'. These can neither be constructed nor utilized properly without a thorough understanding of uncertainties and correlations. It is shown how incomplete information about errors is described by multivariate probability distributions or, more summarily, by covariance matrices, and how correlations are caused by incompletely known common errors. Parameter estimation for the practically most important case of the Gaussian distribution with common errors is developed in close analogy to the more familiar case without. The formalism shows that, contrary to widespread belief, common ('systematic') and uncorrelated ('random' or 'statistical') errors are to be added in quadrature. It also shows explicitly that repetition of a measurement reduces mainly the statistical uncertainties but not the systematic ones. While statistical uncertainties are readily estimated from the scatter of repeatedly measured data, systematic uncertainties can only be inferred from prior information about common errors and their propagation. The optimal way to handle error-affected auxiliary quantities ('nuisance parameters') in data fitting and parameter estimation is to adjust them on the same footing as the parameters of interest and to integrate (marginalize) them out of the joint posterior distribution afterward
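
    A small sketch of the record's two closing points (illustrative numbers only): common (systematic) and uncorrelated (statistical) uncertainties add in quadrature, and repeating a measurement shrinks only the statistical part.

        import math

        def combined_sigma(stat, syst, n):
            # Uncertainty of the mean of n repeats sharing a common error:
            # the statistical part falls as 1/sqrt(n); the systematic does not.
            return math.sqrt(syst**2 + stat**2 / n)

        stat, syst = 3.0, 1.0  # assumed per-measurement 1-sigma values
        for n in (1, 10, 100, 1000):
            print(n, round(combined_sigma(stat, syst, n), 3))
        # The total approaches the systematic floor of 1.0: repetition cannot
        # beat the common error, exactly as the record states.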

  8. Sources of medical error in refractive surgery.

    Science.gov (United States)

    Moshirfar, Majid; Simpson, Rachel G; Dave, Sonal B; Christiansen, Steven M; Edmonds, Jason N; Culbertson, William W; Pascucci, Stephen E; Sher, Neal A; Cano, David B; Trattler, William B

    2013-05-01

    To evaluate the causes of laser programming errors in refractive surgery and outcomes in these cases. In this multicenter, retrospective chart review, 22 eyes of 18 patients who had incorrect data entered into the refractive laser computer system at the time of treatment were evaluated. Cases were analyzed to uncover the etiology of these errors, patient follow-up treatments, and final outcomes. The results were used to identify potential methods to avoid similar errors in the future. Every patient experienced compromised uncorrected visual acuity requiring additional intervention, and 7 of 22 eyes (32%) lost corrected distance visual acuity (CDVA) of at least one line. Sixteen patients were suitable candidates for additional surgical correction to address these residual visual symptoms and six were not. Thirteen of 22 eyes (59%) received surgical follow-up treatment; nine eyes were treated with contact lenses. After follow-up treatment, six patients (27%) still had a loss of one line or more of CDVA. Three significant sources of error were identified: errors of cylinder conversion, data entry, and patient identification error. Twenty-seven percent of eyes with laser programming errors ultimately lost one or more lines of CDVA. Patients who underwent surgical revision had better outcomes than those who did not. Many of the mistakes identified were likely avoidable had preventive measures been taken, such as strict adherence to patient verification protocol or rigorous rechecking of treatment parameters. Copyright 2013, SLACK Incorporated.

  9. Research trend on human error reduction

    International Nuclear Information System (INIS)

    Miyaoka, Sadaoki

    1990-01-01

    Human error has been a problem in all industries. In 1988, the Bureau of Mines, Department of the Interior, USA, carried out a worldwide survey on human error in all industries in relation to fatal accidents in mines. The results differed according to the methods of collecting data, but the proportion of total accidents attributable to human error ranged widely, from 20∼85%, averaging 35%. The rate of occurrence of accidents and troubles in Japanese nuclear power stations is shown; the rate of occurrence of human error is 0∼0.5 cases/reactor-year, which did not vary much. The proportion attributable to human error has therefore tended to increase, and reducing human error has become important for lowering the rate of occurrence of accidents and troubles hereafter. After the TMI accident in 1979 in the USA, research on the man-machine interface became active, and after the Chernobyl accident in 1986 in the USSR, the problem of organization and management has been studied. In Japan, 'Safety 21' was drawn up by the Advisory Committee for Energy, and the annual reports on nuclear safety also pointed out the importance of human factors. The state of research on human factors in Japan and abroad and three targets to reduce human error are reported. (K.I.)

  10. Human error theory: relevance to nurse management.

    Science.gov (United States)

    Armitage, Gerry

    2009-03-01

    Describe, discuss and critically appraise human error theory and consider its relevance for nurse managers. Healthcare errors are a persistent threat to patient safety. Effective risk management and clinical governance depends on understanding the nature of error. This paper draws upon a wide literature from published works, largely from the field of cognitive psychology and human factors. Although the content of this paper is pertinent to any healthcare professional; it is written primarily for nurse managers. Error is inevitable. Causation is often attributed to individuals, yet causation in complex environments such as healthcare is predominantly multi-factorial. Individual performance is affected by the tendency to develop prepacked solutions and attention deficits, which can in turn be related to local conditions and systems or latent failures. Blame is often inappropriate. Defences should be constructed in the light of these considerations and to promote error wisdom and organizational resilience. Managing and learning from error is seen as a priority in the British National Health Service (NHS), this can be better achieved with an understanding of the roots, nature and consequences of error. Such an understanding can provide a helpful framework for a range of risk management activities.

  11. Wind power error estimation in resource assessments.

    Directory of Open Access Journals (Sweden)

    Osvaldo Rodríguez

    Full Text Available Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, the probability density function, and wind turbine power curves. This method uses the actual wind speed data, without prior statistical treatment, based on 28 wind turbine power curves, which were fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of the power produced or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.

  12. Wind power error estimation in resource assessments.

    Science.gov (United States)

    Rodríguez, Osvaldo; Del Río, Jesús A; Jaramillo, Oscar A; Martínez, Manuel

    2015-01-01

    Estimating the power output is one of the elements that determine the techno-economic feasibility of a renewable project. At present, there is a need to develop reliable methods that achieve this goal, thereby contributing to wind power penetration. In this study, we propose a method for wind power error estimation based on the wind speed measurement error, the probability density function, and wind turbine power curves. This method uses the actual wind speed data, without prior statistical treatment, based on 28 wind turbine power curves, which were fitted by Lagrange's method, to calculate the estimated wind power output and the corresponding error propagation. We found that wind speed percentage errors of 10% were propagated into the power output estimates, thereby yielding an error of 5%. The proposed error propagation complements the traditional power resource assessments. The wind power estimation error also allows us to estimate intervals for the levelized cost of the power produced or the investment return time. The implementation of this method increases the reliability of techno-economic resource assessment studies.
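
    A first-order sketch of the propagation step (an author's illustration: an invented cubic fit to made-up power-curve points stands in for the paper's 28 Lagrange-fitted manufacturer curves; numpy assumed). The propagated percentage depends strongly on where the operating point sits on the curve, which is why averaging over the wind-speed distribution can yield a smaller figure such as the 5% reported above.

        import numpy as np

        v_pts = np.array([3, 5, 7, 9, 11, 13])             # wind speed, m/s
        p_pts = np.array([0, 120, 480, 1100, 1800, 2000])  # power, kW (invented)
        curve = np.polynomial.Polynomial.fit(v_pts, p_pts, deg=3)

        def power_error(v, rel_v_err):
            # First-order propagation: sigma_P ~ |dP/dv| * sigma_v
            p = curve(v)
            sigma_p = abs(curve.deriv()(v)) * (rel_v_err * v)
            return p, sigma_p

        p, sp = power_error(8.0, 0.10)  # 10% wind-speed error at 8 m/s
        print(f"P = {p:.0f} kW +/- {sp:.0f} kW ({100 * sp / p:.0f}%)")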

  13. The benefit of generating errors during learning.

    Science.gov (United States)

    Potts, Rosalind; Shanks, David R

    2014-04-01

    Testing has been found to be a powerful learning tool, but educators might be reluctant to make full use of its benefits for fear that any errors made would be harmful to learning. We asked whether testing could be beneficial to memory even during novel learning, when nearly all responses were errors, and where errors were unlikely to be related to either cues or targets. In 4 experiments, participants learned definitions for unfamiliar English words, or translations for foreign vocabulary, by generating a response and being given corrective feedback, by reading the word and its definition or translation, or by selecting from a choice of definitions or translations followed by feedback. In a final test of all words, generating errors followed by feedback led to significantly better memory for the correct definition or translation than either reading or making incorrect choices, suggesting that the benefits of generation are not restricted to correctly generated items. Even when information to be learned is novel, errorful generation may play a powerful role in potentiating encoding of corrective feedback. Experiments 2A, 2B, and 3 revealed, via metacognitive judgments of learning, that participants are strikingly unaware of this benefit, judging errorful generation to be a less effective encoding method than reading or incorrect choosing, when in fact it was better. Predictions reflected participants' subjective experience during learning. If subjective difficulty leads to more effort at encoding, this could at least partly explain the errorful generation advantage.

  14. Influence of Current Transformer Saturation on Operation of Current Protection

    Directory of Open Access Journals (Sweden)

    F. A. Romaniouk

    2010-01-01

    Full Text Available An analysis of the influence of instrument current transformer errors on the operation of current protection for power supply scheme elements is carried out in the paper. The paper shows the influence of the aperiodic component of the transient current and of the secondary load on current transformer errors. Peculiar operational features of the measuring elements of electromechanical and microprocessor current protection devices operating jointly with electromagnetic current transformers are analyzed.
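
    A minimal sketch of the mechanism discussed above, under assumed numbers: the aperiodic (DC) component of a fully offset fault current forces a much larger core flux than the symmetric component alone, which is what drives the transformer into saturation and inflates the error seen by the protection.

```python
import numpy as np

f = 50.0        # system frequency, Hz (assumed)
w = 2 * np.pi * f
Im = 20.0       # fault current amplitude, per unit (assumed)
T = 0.05        # DC time constant, s (assumed)
R_burden = 1.0  # resistive secondary burden, ohm (assumed)

# Fully offset fault current: the exp(-t/T) term is the aperiodic component.
t = np.linspace(0.0, 0.2, 20001)
i_sym = -Im * np.cos(w * t)                     # symmetric component only
i_full = Im * (np.exp(-t / T) - np.cos(w * t))  # with aperiodic component

# Core flux demand ~ time integral of the secondary voltage (~ i * R).
dt = t[1] - t[0]
flux_sym = np.cumsum(i_sym) * dt * R_burden
flux_full = np.cumsum(i_full) * dt * R_burden

print(f"peak flux demand, symmetric only : {np.max(np.abs(flux_sym)):.3f}")
print(f"peak flux demand, with DC offset : {np.max(np.abs(flux_full)):.3f}")
# The offset waveform demands many times more flux, so the core saturates
# sooner and the current transformer error grows accordingly.
```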

  15. Medication Errors: New EU Good Practice Guide on Risk Minimisation and Error Prevention.

    Science.gov (United States)

    Goedecke, Thomas; Ord, Kathryn; Newbould, Victoria; Brosch, Sabine; Arlett, Peter

    2016-06-01

    A medication error is an unintended failure in the drug treatment process that leads to, or has the potential to lead to, harm to the patient. Reducing the risk of medication errors is a shared responsibility between patients, healthcare professionals, regulators and the pharmaceutical industry at all levels of healthcare delivery. In 2015, the EU regulatory network released a two-part good practice guide on medication errors to support both the pharmaceutical industry and regulators in the implementation of the changes introduced with the EU pharmacovigilance legislation. These changes included a modification of the 'adverse reaction' definition to include events associated with medication errors, and the requirement for national competent authorities responsible for pharmacovigilance in EU Member States to collaborate and exchange information on medication errors resulting in harm with national patient safety organisations. To facilitate reporting and learning from medication errors, a clear distinction has been made in the guidance between medication errors resulting in adverse reactions, medication errors without harm, intercepted medication errors and potential errors. This distinction is supported by an enhanced MedDRA® terminology that allows for coding all stages of the medication use process where the error occurred in addition to any clinical consequences. To better understand the causes and contributing factors, individual case safety reports involving an error should be followed up with the primary reporter to gather information relevant for the conduct of root cause analysis where this may be appropriate. Such reports should also be summarised in periodic safety update reports and addressed in risk management plans. Any risk minimisation and prevention strategy for medication errors should consider all stages of a medicinal product's life-cycle, particularly the main sources and types of medication errors during product development. This article
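
    The four-way distinction drawn in the guide maps naturally onto a small enumeration. The sketch below is illustrative only; the names and the follow-up rule are ours, not official guide or MedDRA terminology.

```python
from enum import Enum, auto

class MedicationErrorCategory(Enum):
    """Four-way distinction from the EU good practice guide
    (enum names are illustrative, not official terminology)."""
    ERROR_WITH_ADVERSE_REACTION = auto()  # error that resulted in harm
    ERROR_WITHOUT_HARM = auto()           # error reached patient, no harm
    INTERCEPTED_ERROR = auto()            # caught before reaching the patient
    POTENTIAL_ERROR = auto()              # circumstance that could cause error

def requires_root_cause_followup(cat: MedicationErrorCategory) -> bool:
    # Simplified rule: reports involving harm are the ones prioritized for
    # follow-up with the primary reporter and root cause analysis.
    return cat is MedicationErrorCategory.ERROR_WITH_ADVERSE_REACTION

print(requires_root_cause_followup(MedicationErrorCategory.INTERCEPTED_ERROR))
```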

  16. Notes on human error analysis and prediction

    International Nuclear Information System (INIS)

    Rasmussen, J.

    1978-11-01

    The notes comprise an introductory discussion of the role of human error analysis and prediction in industrial risk analysis. Following this introduction, different classes of human errors and their roles in industrial systems are discussed. Problems related to the prediction of human behaviour in reliability and safety analysis are formulated, and "criteria for analyzability", which must be met by industrial systems so that a systematic analysis can be performed, are suggested. The appendices contain illustrative case stories and a review of human error reports for the task of equipment calibration and testing, as found in the US Licensee Event Reports. (author)

  17. Error estimation in plant growth analysis

    Directory of Open Access Journals (Sweden)

    Andrzej Gregorczyk

    2014-01-01

    Full Text Available A scheme is presented for calculating the errors of dry matter values which occur during the approximation of data with growth curves, determined by the analytical method (logistic function) and by the numerical method (Richards function). Formulae are also given which describe the absolute errors of the growth characteristics: growth rate (GR), relative growth rate (RGR), unit leaf rate (ULR) and leaf area ratio (LAR). Calculation examples concerning the growth of oat and maize plants are given. A critical analysis of the estimation of the obtained results is carried out, and the value of jointly applying statistical methods and error calculus in plant growth analysis is demonstrated.
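
    As one concrete instance of the error formulae mentioned above: the mean relative growth rate between two harvests is RGR = (ln W2 - ln W1)/(t2 - t1), and first-order propagation of the dry matter errors gives an absolute error of (dW1/W1 + dW2/W2)/(t2 - t1). A sketch with invented numbers, not data from the paper:

```python
import math

def rgr_with_error(W1, W2, t1, t2, dW1, dW2):
    """Mean relative growth rate between two harvests and its absolute
    error from first-order propagation of the dry matter errors."""
    rgr = (math.log(W2) - math.log(W1)) / (t2 - t1)
    # d(ln W)/dW = 1/W; absolute errors are added in the worst-case sense.
    drgr = (dW1 / W1 + dW2 / W2) / (t2 - t1)
    return rgr, drgr

# Invented numbers: dry matter in grams, time in days.
rgr, drgr = rgr_with_error(W1=1.2, W2=3.5, t1=20, t2=34, dW1=0.05, dW2=0.12)
print(f"RGR = {rgr:.4f} +/- {drgr:.4f} per day")
```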

  18. Fixturing error measurement and analysis using CMMs

    International Nuclear Information System (INIS)

    Wang, Y; Chen, X; Gindy, N

    2005-01-01

    The influence of the fixture on the errors of a machined surface can be very significant. The machined surface errors generated during machining can be measured with a coordinate measurement machine (CMM) through the displacements of three coordinate systems on a fixture-workpiece pair in relation to the deviation of the machined surface. The surface errors consist of the component movement, the component twist, and the deviation between the actual machined surface and the defined tool path. A turbine blade fixture for a grinding operation is used as a case study.

  19. Error vs. rejection curve for the perceptron

    OpenAIRE

    PARRONDO, JMR; VAN DEN BROECK, Christian

    1993-01-01

    We calculate the generalization error ε for a perceptron J, trained by a teacher perceptron T, on input patterns S that form a fixed angle arccos(J·S) with the student. We show that the error is reduced from a power law to an exponentially fast decay by rejecting input patterns that lie within a given neighbourhood of the decision boundary J·S = 0. On the other hand, the error vs. rejection curve ε(ρ), where ρ is the fraction of rejected patterns, is shown to be independent ...
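
    A quick Monte Carlo in the same teacher-student setting shows the effect (a sketch under assumed parameters, not the paper's analytic calculation): rejecting patterns with small |J·S| lowers the error on the accepted patterns at the cost of coverage.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200  # input dimension (assumed)
R = 0.9  # fixed teacher-student overlap (assumed)

T = rng.standard_normal(N)
T /= np.linalg.norm(T)                            # teacher direction
perp = rng.standard_normal(N)
perp -= (perp @ T) * T
perp /= np.linalg.norm(perp)
J = R * T + np.sqrt(1 - R**2) * perp              # student at overlap R

S = rng.standard_normal((20000, N)) / np.sqrt(N)  # test patterns
margins = S @ J
errors = np.sign(margins) != np.sign(S @ T)       # student-teacher disagreement

for theta in (0.0, 0.02, 0.05, 0.10):
    accept = np.abs(margins) > theta   # reject patterns near J.S = 0
    rho = 1.0 - accept.mean()          # rejection fraction
    eps = errors[accept].mean()        # error on accepted patterns
    print(f"theta = {theta:4.2f}: rho = {rho:.2f}, eps = {eps:.4f}")
```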

  20. Accounting for optical errors in microtensiometry.

    Science.gov (United States)

    Hinton, Zachary R; Alvarez, Nicolas J

    2018-09-15

    Drop shape analysis (DSA) techniques measure interfacial tension subject to error in image analysis and the optical system. While considerable efforts have been made to minimize image analysis errors, very little work has treated optical errors. There are two main sources of error when considering the optical system: the angle of misalignment and the choice of focal plane. Due to the convoluted nature of these sources, small angles of misalignment can lead to large errors in measured curvature. We demonstrate using microtensiometry the contributions of these sources to measured errors in radius and, more importantly, deconvolute the effects of misalignment and focal plane. Our findings are expected to have broad implications for all optical techniques measuring interfacial curvature. A geometric model is developed to analytically determine the contributions of misalignment angle and choice of focal plane to measurement error for spherical cap interfaces. This work utilizes a microtensiometer to validate the geometric model and to quantify the effect of both sources of error. For the case of a microtensiometer, an empirical calibration is demonstrated that corrects for optical errors and drastically simplifies implementation. The combination of geometric modeling and experimental results reveals a convoluted relationship between the true and measured interfacial radius as a function of the misalignment angle and choice of focal plane. The validated geometric model produces a full operating window that is strongly dependent on the capillary radius and spherical cap height. In all cases, the contribution of optical errors is minimized when the height of the spherical cap is equivalent to the capillary radius, i.e. a hemispherical interface. Understanding these errors allows for correct measurement of interfacial curvature and interfacial tension regardless of experimental setup. For the case of microtensiometry, this greatly decreases the time for experimental setup
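
    The hemisphere result can be illustrated with the standard spherical cap relation R = (a² + h²)/(2h), where a is the cap base (capillary) radius and h the cap height: the sensitivity dR/dh = (h² - a²)/(2h²) vanishes at h = a, so an error in h (for instance from the focal plane choice) has the least effect there. The sketch below uses assumed numbers and is not the paper's full geometric model.

```python
def cap_radius(h, a=1.0):
    """Radius of curvature of a spherical cap with base radius a, height h."""
    return (a**2 + h**2) / (2 * h)

def sensitivity(h, a=1.0):
    """dR/dh = (h^2 - a^2) / (2 h^2); zero for a hemisphere (h = a)."""
    return (h**2 - a**2) / (2 * h**2)

dh = 0.02  # assumed height error, e.g. from the choice of focal plane
for h in (0.5, 0.8, 1.0, 1.2):
    dR = abs(sensitivity(h)) * dh
    print(f"h/a = {h:3.1f}: R = {cap_radius(h):.3f}, |dR| = {dR:.4f}")
```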

  1. Human Error Analysis by Fuzzy-Set

    International Nuclear Information System (INIS)

    Situmorang, Johnny

    1996-01-01

    In conventional HRA the probability of error is treated as a single, exact value obtained by constructing an event tree; here, fuzzy-set theory is used instead. Fuzzy-set theory treats the probability of error as a plausibility described by a linguistic variable. Many parameters or variables in human engineering are defined verbally (good, fairly good, worst, etc.), each describing a range of probability values. As an example, the analysis quantifies the human error in a calibration task, for which the probability of miscalibration is very low.
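
    A minimal sketch of the fuzzy representation (the triangular shape and its support are assumptions for the example, not values from the study): a linguistic probability such as "very low" becomes a membership function over probability values rather than a single point estimate.

```python
import math

def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy number with support [a, c]
    and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def very_low(p):
    """Linguistic probability 'very low' as a fuzzy number on a log10
    scale (support and peak are assumptions for the example)."""
    return triangular(math.log10(p), a=-5.0, b=-4.0, c=-3.0)

for p in (1e-5, 5e-5, 1e-4, 5e-4, 1e-3):
    print(f"P = {p:.0e}: membership in 'very low' = {very_low(p):.2f}")
```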

  2. KMRR thermal power measurement error estimation

    International Nuclear Information System (INIS)

    Rhee, B.W.; Sim, B.S.; Lim, I.C.; Oh, S.K.

    1990-01-01

    The thermal power measurement error of the Korea Multi-purpose Research Reactor has been estimated by a statistical Monte Carlo method and compared with the errors obtained by other methods, including deterministic and statistical approaches. The results show that the specified thermal power measurement error of 5% cannot be achieved if commercial RTDs are used to measure the coolant temperatures of the secondary cooling system, and that the error can be reduced below the requirement if the commercial RTDs are replaced by precision RTDs. The possible range of thermal power control operation has been identified to be from 100% to 20% of full power
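
    The statistical Monte Carlo approach can be illustrated on the secondary-side heat balance P = m_dot * cp * (T_out - T_in): sample RTD temperature errors and compare the spread of the inferred power for commercial-grade versus precision RTD accuracies. All numbers below are assumptions for the sketch, not KMRR design data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Assumed heat-balance inputs (illustrative only, not KMRR design data).
mdot = 300.0              # coolant mass flow, kg/s
cp = 4180.0               # specific heat, J/(kg K)
T_in, T_out = 35.0, 43.0  # coolant temperatures, deg C

def power_error(rtd_sigma):
    """Monte Carlo: sample RTD errors on both temperatures and return the
    relative (1-sigma) error of the inferred thermal power."""
    t_in = T_in + rng.normal(0.0, rtd_sigma, n)
    t_out = T_out + rng.normal(0.0, rtd_sigma, n)
    P = mdot * cp * (t_out - t_in)
    return P.std() / P.mean()

for sigma, label in ((0.3, "commercial RTD"), (0.03, "precision RTD")):
    print(f"{label} (sigma = {sigma} K): "
          f"thermal power error = {100 * power_error(sigma):.1f} %")
```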

  3. Magnetic field errors tolerances of Nuclotron booster

    Science.gov (United States)

    Butenko, Andrey; Kazinova, Olha; Kostromin, Sergey; Mikhaylov, Vladimir; Tuzikov, Alexey; Khodzhibagiyan, Hamlet

    2018-04-01

    Generation of the magnetic field in the units of the booster synchrotron for the NICA project is one of the most important conditions for obtaining the required parameters and high-quality accelerator operation. Studies of the linear and nonlinear dynamics of the 197Au31+ ion beam in the booster have been carried out with the MADX program. An analytical estimation of the magnetic field error tolerances and a numerical computation of the dynamic aperture of the booster DFO magnetic lattice are presented. The closed orbit distortion caused by random errors of the magnetic fields and by errors in the layout of the booster units was also evaluated.
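
    For the closed orbit evaluation, a standard estimate for random dipole kicks theta_i is x_rms(s) = sqrt(beta(s)) / (2*sqrt(2)*|sin(pi*Q)|) * sqrt(sum_i beta_i * theta_i^2). The Monte Carlo sketch below uses assumed lattice numbers, not the NICA booster parameters, and stands in for, rather than reproduces, the MADX computation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed ring parameters (illustrative, not the NICA booster values).
Q = 4.8            # betatron tune
n_dipoles = 40
beta_dip = 8.0     # beta function at the dipoles, m
beta_obs = 10.0    # beta function at the observation point, m
theta0 = 0.1       # nominal bend angle per dipole, rad
dB_over_B = 1e-3   # rms relative field error (assumed tolerance)

def orbit_rms_once():
    # Random kick from each dipole: d(theta)_i = theta0 * (dB/B)_i.
    kicks = theta0 * rng.normal(0.0, dB_over_B, n_dipoles)
    amp = np.sqrt(np.sum(beta_dip * kicks**2))
    return np.sqrt(beta_obs) / (2 * np.sqrt(2) * abs(np.sin(np.pi * Q))) * amp

samples = np.array([orbit_rms_once() for _ in range(5000)])
print(f"closed orbit distortion: mean = {1e3 * samples.mean():.2f} mm, "
      f"95th percentile = {1e3 * np.percentile(samples, 95):.2f} mm")
```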

  4. Analysis of field errors in existing undulators

    International Nuclear Information System (INIS)

    Kincaid, B.M.

    1990-01-01

    The Advanced Light Source (ALS) and other third-generation synchrotron light sources have been designed for optimum performance with undulator insertion devices. The performance requirements for these new undulators are explored, with emphasis on the effects of errors on source spectral brightness. An analysis of magnetic field data for several existing hybrid undulators is presented, decomposing the errors into systematic and random components. An attempt is made to identify the sources of these errors, and recommendations are made for designing future insertion devices. 12 refs., 16 figs.
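
    The systematic/random decomposition can be sketched on synthetic data: fit a low-order trend along the device for the systematic part and treat the residual pole-to-pole scatter as the random part (an illustration, not the analysis method used in the paper).

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic per-pole peak-field deviations from design, in percent
# (invented data: a slow taper plus random pole-to-pole scatter).
n_poles = 100
z = np.arange(n_poles)
dB = 0.05 * (z / n_poles - 0.5) + rng.normal(0.0, 0.02, n_poles)

# Systematic part: a low-order polynomial trend along the device.
coeffs = np.polyfit(z, dB, deg=2)
systematic = np.polyval(coeffs, z)
random_part = dB - systematic  # residual pole-to-pole scatter

print(f"rms total error     : {dB.std():.4f} %")
print(f"rms systematic part : {systematic.std():.4f} %")
print(f"rms random part     : {random_part.std():.4f} %")
```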

  5. Awareness of technology-induced errors and processes for identifying and preventing such errors.

    Science.gov (United States)

    Bellwood, Paule; Borycki, Elizabeth M; Kushniruk, Andre W

    2015-01-01

    There is a need to determine if organizations working with health information technology are aware of technology-induced errors and how they are addressing and preventing them. The purpose of this study was to: a) determine the degree of technology-induced error awareness in various Canadian healthcare organizations, and b) identify those processes and procedures that are currently in place to help address, manage, and prevent technology-induced errors. We identified a lack of technology-induced error awareness among participants. Participants identified there was a lack of well-defined procedures in place for reporting technology-induced errors, addressing them when they arise, and preventing them.

  6. Concepts of radiation protection

    International Nuclear Information System (INIS)

    2013-01-01

    This seventh chapter presents the concepts and principles of safety and radiation protection; emergency situations; NORM and TENORM; radiation protection care; the radiation protection plan; activities of the radiation protection service; practical rules of radiation protection; and the radiation symbol.

  7. Understanding and Confronting Our Mistakes: The Epidemiology of Error in Radiology and Strategies for Error Reduction.

    Science.gov (United States)

    Bruno, Michael A; Walker, Eric A; Abujudeh, Hani H

    2015-10-01

    Arriving at a medical diagnosis is a highly complex process that is extremely error prone. Missed or delayed diagnoses often lead to patient harm and missed opportunities for treatment. Since medical imaging is a major contributor to the overall diagnostic process, it is also a major potential source of diagnostic error. Although some diagnoses may be missed because of the technical or physical limitations of the imaging modality, including image resolution, intrinsic or extrinsic contrast, and signal-to-noise ratio, most missed radiologic diagnoses are attributable to image interpretation errors by radiologists. Radiologic interpretation cannot be mechanized or automated; it is a human enterprise based on complex psychophysiologic and cognitive processes and is itself subject to a wide variety of error types, including perceptual errors (those in which an important abnormality is simply not seen on the images) and cognitive errors (those in which the abnormality is visually detected but the meaning or importance of the finding is not correctly understood or appreciated). The overall prevalence of radiologists' errors in practice does not appear to have changed since it was first estimated in the 1960s. The authors review the epidemiology of errors in diagnostic radiology, including a recently proposed taxonomy of radiologists' errors, as well as research findings, in an attempt to elucidate possible underlying causes of these errors. The authors also propose strategies for error reduction in radiology. On the basis of current understanding, specific suggestions are offered as to how radiologists can improve their performance in practice. © RSNA, 2015.

  8. The Errors of Our Ways: Understanding Error Representations in Cerebellar-Dependent Motor Learning.

    Science.gov (United States)

    Popa, Laurentiu S; Streng, Martha L; Hewitt, Angela L; Ebner, Timothy J

    2016-04-01

    The cerebellum is essential for error-driven motor learning and is strongly implicated in detecting and correcting for motor errors. Therefore, elucidating how motor errors are represented in the cerebellum is essential in understanding cerebellar function, in general, and its role in motor learning, in particular. This review examines how motor errors are encoded in the cerebellar cortex in the context of a forward internal model that generates predictions about the upcoming movement and drives learning and adaptation. In this framework, sensory prediction errors, defined as the discrepancy between the predicted consequences of motor commands and the sensory feedback, are crucial for both on-line movement control and motor learning. While many studies support the dominant view that motor errors are encoded in the complex spike discharge of Purkinje cells, others have failed to relate complex spike activity with errors. Given these limitations, we review recent findings in the monkey showing that complex spike modulation is not necessarily required for motor learning or for simple spike adaptation. Also, new results demonstrate that the simple spike discharge provides continuous error signals that both lead and lag the actual movements in time, suggesting errors are encoded as both an internal prediction of motor commands and the actual sensory feedback. These dual error representations have opposing effects on simple spike discharge, consistent with the signals needed to generate sensory prediction errors used to update a forward internal model.

  9. Error Detection and Error Classification: Failure Awareness in Data Transfer Scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Louisiana State University; Balman, Mehmet; Kosar, Tevfik

    2010-10-27

    Data transfer in distributed environments is prone to frequent failures resulting from back-end system-level problems, such as connectivity failures that are technically untraceable by users. Error messages are not logged efficiently and are sometimes not relevant or useful from the user's point of view. Our study explores the possibility of an efficient error detection and reporting system for such environments. Prior knowledge about the environment and awareness of the actual reason behind a failure would enable higher-level planners to make better and more accurate decisions. It is necessary to have well-defined error detection and error reporting methods to increase the usability and serviceability of existing data transfer protocols and data management systems. We investigate the applicability of early error detection and error classification techniques, and propose an error reporting framework and a failure-aware data transfer life cycle to improve the arrangement of data transfer operations and to enhance the decision making of data transfer schedulers.
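
    A toy version of such an error classification layer (the classes, keywords, and actions are illustrative assumptions, not the paper's taxonomy): raw error messages are mapped to coarse classes that a failure-aware scheduler can act on.

```python
from enum import Enum, auto

class TransferErrorClass(Enum):
    """Coarse error classes a failure-aware scheduler might act on
    (classification and rules are illustrative, not the paper's)."""
    TRANSIENT_NETWORK = auto()   # connectivity drop: retry with backoff
    SOURCE_UNAVAILABLE = auto()  # missing file / dead host: requeue later
    PROTOCOL_MISMATCH = auto()   # unsupported feature: switch protocol
    FATAL = auto()               # authentication/permission: stop and report

def classify(message: str) -> TransferErrorClass:
    """Toy keyword-based classifier over raw error messages."""
    msg = message.lower()
    if "connection reset" in msg or "timed out" in msg:
        return TransferErrorClass.TRANSIENT_NETWORK
    if "no such file" in msg or "host not found" in msg:
        return TransferErrorClass.SOURCE_UNAVAILABLE
    if "unsupported" in msg:
        return TransferErrorClass.PROTOCOL_MISMATCH
    return TransferErrorClass.FATAL

def next_action(err: TransferErrorClass) -> str:
    return {
        TransferErrorClass.TRANSIENT_NETWORK: "retry with exponential backoff",
        TransferErrorClass.SOURCE_UNAVAILABLE: "requeue and probe the source",
        TransferErrorClass.PROTOCOL_MISMATCH: "fall back to alternate protocol",
        TransferErrorClass.FATAL: "fail the job and notify the user",
    }[err]

print(next_action(classify("read: connection reset by peer")))
```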

  10. Environmental protection

    International Nuclear Information System (INIS)

    Hull, A.P.

    1979-01-01

    Environmental Studies and Internal Dosimetry projects include: Environmental Protection; 1977 Environmental Monitoring Report; Sewage Sludge Disposal on the Sanitary Landfill; Radiological Analyses of Marshall Islands Environmental Samples, 1974 to 1976; External Radiation Survey and Dose Predictions for Rongelap, Utirik, Rongerik, Ailuk, and Wotje Atolls; Marshall Islands - Diet and Life Style Study; Dose Reassessment for Populations on Rongelap and Utirik Following Exposure to Fallout from BRAVO Incident (March 1, 1954); Whole Body Counting Results from 1974 to 1979 for Bikini Island Residents; Dietary Radioactivity Intake from Bioassay Data, a Model Applied to 137Cs Intake by Bikini Island Residents; and External Exposure Measurements at Bikini Atoll

  11. Radiation protection

    International Nuclear Information System (INIS)

    Kamalaksh Shenoy, K.

    2013-01-01

    Three main pillars underpin the IAEA's mission: Safety and Security - The IAEA helps countries to upgrade their infrastructure for nuclear and radiation safety and security, and to prepare for and respond to emergencies. Work is keyed to international conventions, the development of international standards and the application of these standards. The aim is to protect people and the environment from the harmful effects of exposure to ionizing radiation. Science and Technology - The IAEA is the world's focal point for mobilizing peaceful applications of nuclear science and technology for critical needs in developing countries. The work contributes to alleviating poverty, combating disease and pollution of the environment and to other goals of sustainable development. Safeguards and Verification - The IAEA is the nuclear inspectorate, with more than four decades of verification experience. Inspectors work to verify that nuclear material and activities are not diverted towards military purposes. Quantities and Units: Dose equivalent is the product of absorbed dose of radiation and quality factor (Q). For absorbed dose in rads, dose equivalent is in rems. If absorbed dose is in gray, the dose equivalent is in sievert. Quality factor is defined without reference to any particular biological end point. Quality factors are recommended by committees such as the International Commission on Radiological Protection (ICRP) or the National Council on Radiation Protection and Measurements (NCRP), based on experimental RBE values but with some judgment exercised. Effective Dose Equivalent: It is the sum of the weighted dose equivalents for all irradiated tissues, in which the weighting factors represent the different risks of each tissue to mortality from cancer and hereditary effects. Committed dose equivalent: It is the integral over 50 years of dose equivalent following the intake of a radionuclide. Collective effective dose equivalent: It is a quantity for a population and is
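
    The quantities defined above lend themselves to a small worked sketch: H = D x Q converts absorbed dose to dose equivalent, and the effective dose is the tissue-weighted sum E = sum_T w_T * H_T. The weighting factors and doses below are an illustrative subset with invented numbers, not a full ICRP set.

```python
# Illustrative subset of tissue weighting factors (not a full ICRP set).
tissue_weights = {"lung": 0.12, "thyroid": 0.04, "bone_surface": 0.01}

def dose_equivalent_sv(absorbed_dose_gy, quality_factor):
    """Dose equivalent in sievert: H = D * Q, with D in gray."""
    return absorbed_dose_gy * quality_factor

def effective_dose_sv(tissue_H):
    """Tissue-weighted sum over irradiated tissues: E = sum_T w_T * H_T."""
    return sum(tissue_weights[t] * h for t, h in tissue_H.items())

# Invented example: 2 mGy of gamma (Q = 1) to lung and thyroid,
# 1 mGy of alpha (Q = 20) to bone surfaces.
H = {
    "lung": dose_equivalent_sv(2e-3, 1),
    "thyroid": dose_equivalent_sv(2e-3, 1),
    "bone_surface": dose_equivalent_sv(1e-3, 20),
}
print(f"effective dose = {1e3 * effective_dose_sv(H):.2f} mSv")
```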

  12. Protecting Antarctica

    Science.gov (United States)

    Carlowicz, Michael

    House Science Committee Chairman Robert Walker (R-Pa.) has introduced a bill into Congress to give the United States the legislative authority to implement the 1991 Environmental Protocol to the Antarctic Treaty. That protocol established rules and principles to shield the Antarctic environment from human spoilage—placing limits on the discharge of pollutants, protecting plant and animal life, and requiring environmental impact assessments before new activities and programs are launched. The protocol also forbids prospecting or developing of mineral resources except for scientific research.

  13. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1982-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEP) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes collected, using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determines HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date

  14. Soft error mechanisms, modeling and mitigation

    CERN Document Server

    Sayil, Selahattin

    2016-01-01

    This book introduces readers to various radiation soft-error mechanisms such as soft delays, radiation induced clock jitter and pulses, and single event (SE) coupling induced effects. In addition to discussing various radiation hardening techniques for combinational logic, the author also describes new mitigation strategies targeting commercial designs. Coverage includes novel soft error mitigation techniques such as the Dynamic Threshold Technique and Soft Error Filtering based on Transmission gate with varied gate and body bias. The discussion also includes modeling of SE crosstalk noise, delay and speed-up effects. Various mitigation strategies to eliminate SE coupling effects are also introduced. Coverage also includes the reliability of low power energy-efficient designs and the impact of leakage power consumption optimizations on soft error robustness. The author presents an analysis of various power optimization techniques, enabling readers to make design choices that reduce static power consumption an...

  15. Identifying systematic DFT errors in catalytic reactions

    DEFF Research Database (Denmark)

    Christensen, Rune; Hansen, Heine Anton; Vegge, Tejs

    2015-01-01

    Using CO2 reduction reactions as examples, we present a widely applicable method for identifying the main source of errors in density functional theory (DFT) calculations. The method has broad applications for error correction in DFT calculations in general, as it relies on the dependence of the applied exchange–correlation functional on the reaction energies rather than on errors versus the experimental data. As a result, improved energy corrections can now be determined for both gas phase and adsorbed reaction species, particularly interesting within heterogeneous catalysis. We show that for the CO2 reduction reactions, the main source of error is associated with the C=O bonds and not the typically energy corrected OCO backbone.

  16. Simulator data on human error probabilities

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Guttmann, H.E.

    1981-01-01

    Analysis of operator errors on NPP simulators is being used to determine Human Error Probabilities (HEP) for task elements defined in NUREG/CR-1278. Simulator data tapes from research conducted by EPRI and ORNL are being analyzed for operator error rates. The tapes collected, using Performance Measurement System software developed for EPRI, contain a history of all operator manipulations during simulated casualties. Analysis yields a time history or Operational Sequence Diagram and a manipulation summary, both stored in computer data files. Data searches yield information on operator errors of omission and commission. This work experimentally determined HEPs for Probabilistic Risk Assessment calculations. It is the only practical experimental source of this data to date

  17. Error Sonification of a Complex Motor Task

    Directory of Open Access Journals (Sweden)

    Riener Robert

    2011-12-01

    Full Text Available Visual information is mainly used to master complex motor tasks. Thus, additional information providing augmented feedback should be displayed in modalities other than vision, e.g. hearing. The present work evaluated the potential of error sonification to enhance learning of a rowing-type motor task. In contrast to a control group receiving self-controlled terminal feedback, the experimental group could not significantly reduce spatial errors. Thus, motor learning was not enhanced by error sonification, although participants could benefit from it during training. It seems that the motor task was too slow, so that participants made immediate corrections to the movement rather than forming an internal representation of the general characteristics of the motor task. Therefore, further studies should elaborate the impact of error sonification when the general characteristics of the motor task are already known.

  18. Human error in remote Afterloading Brachytherapy

    International Nuclear Information System (INIS)

    Quinn, M.L.; Callan, J.; Schoenfeld, I.; Serig, D.

    1994-01-01

    Remote Afterloading Brachytherapy (RAB) is a medical process used in the treatment of cancer. RAB uses a computer-controlled device to remotely insert and remove radioactive sources close to a target (or tumor) in the body. Some RAB problems affecting the radiation dose to the patient have been reported and attributed to human error. To determine the root cause of human error in the RAB system, a human factors team visited 23 RAB treatment sites in the US. The team observed RAB treatment planning and delivery, interviewed RAB personnel, and performed walk-throughs, during which staff demonstrated the procedures and practices used in performing RAB tasks. Factors leading to human error in the RAB system were identified. The impact of those factors on the performance of RAB was then evaluated and prioritized in terms of safety significance. Finally, the project identified and evaluated alternative approaches for resolving the safety significant problems related to human error

  19. Error estimation and adaptivity for incompressible hyperelasticity

    KAUST Repository

    Whiteley, J.P.

    2014-04-30

    A Galerkin FEM is developed for nonlinear, incompressible (hyper)elasticity that takes account of nonlinearities in both the strain tensor and the relationship between the strain tensor and the stress tensor. By using suitably defined linearised dual problems with appropriate boundary conditions, a posteriori error estimates are then derived for both linear functionals of the solution and linear functionals of the stress on a boundary where Dirichlet boundary conditions are applied. A second, higher-order method for calculating a linear functional of the stress on a Dirichlet boundary is also presented, together with an a posteriori error estimator for this approach. An implementation for a 2D model problem with known solution, where the entries of the strain tensor exhibit large, rapid variations, demonstrates the accuracy and sharpness of the error estimators. Finally, using a selection of model problems, the a posteriori error estimate is shown to provide a basis for effective mesh adaptivity. © 2014 John Wiley & Sons, Ltd.

  20. Systematic sampling with errors in sample locations

    DEFF Research Database (Denmark)

    Ziegel, Johanna; Baddeley, Adrian; Dorph-Petersen, Karl-Anton

    2010-01-01

    Systematic sampling of points in continuous space is widely used in microscopy and spatial surveys. Classical theory provides asymptotic expressions for the variance of estimators based on systematic sampling as the grid spacing decreases. However, the classical theory assumes that the sample grid is exactly periodic; real physical sampling procedures may introduce errors in the placement of the sample points. This paper studies the effect of errors in sample positioning on the variance of estimators in the case of one-dimensional systematic sampling. First we sketch a general approach to variance analysis using point process methods. We then analyze three different models for the error process, calculate exact expressions for the variances, and derive asymptotic variances. Errors in the placement of sample points can lead to substantial inflation of the variance, dampening of zitterbewegung...
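
    A quick simulation of the setting described above (the integrand and the Gaussian placement-error model are our assumptions): one-dimensional systematic sampling with a random grid start, with and without jitter in the sample positions, showing the variance inflation.

```python
import numpy as np

rng = np.random.default_rng(4)

def f(x):
    """Test integrand on [0, 1] (assumed); its exact integral is 1.0."""
    return np.sin(8 * np.pi * x) ** 2 + x

def estimate(n, jitter_sd):
    """One systematic-sampling estimate of the integral of f, with
    Gaussian errors of sd jitter_sd in the sample positions."""
    u = rng.uniform(0.0, 1.0 / n)                  # random grid start
    x = u + np.arange(n) / n                       # exactly periodic grid
    x = (x + rng.normal(0.0, jitter_sd, n)) % 1.0  # placement errors
    return f(x).mean()

for jitter in (0.0, 0.01, 0.05):
    est = np.array([estimate(50, jitter) for _ in range(20000)])
    print(f"jitter sd = {jitter:4.2f}: "
          f"mean = {est.mean():.4f}, variance = {est.var():.2e}")
```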